You were talking about the scenario where the dynamic sampling is actually done but the results are rejected.

I was referring to a slightly different case, I think, where the dynamic sampling is actually done but the result is rejected (and the corresponding output can be found in the 10053 trace file) – but you might be correct that the treatment is different when the dynamic_sampling_est_cdn hint is used in addition – something I need to test.

I thought you were referring to the case where dynamic sampling does not kick in because statistics have been gathered, and the default level 2 therefore isn't sufficient to trigger it.

Of course it also depends on whether your hint requests cursor-level or table-level dynamic sampling…

Randolf

You’re right about the cardinality.

I had expected to find a cardinality note in the 10053 trace and leapt to a conclusion without verifying – and the note isn’t there.

A case of 1 + 1 = 3 and a lesson in being thorough.

Regarding your comments on the rejection of dynamic sampling: that was the point of my original comment mentioning this as a possible solution. If underlying statistics are available, this example SQL won’t use dynamic sampling if you just hint it with /*+ dynamic_sampling() */, but you can force it using dynamic_sampling_est_cdn. I.e. it can help here, but you need the additional hint.
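A minimal sketch of the hint combination described above. The table name T1 comes from the trace excerpt later in the thread; the STATUS column and literal are assumptions for illustration, and dynamic_sampling_est_cdn is an undocumented hint, so the exact syntax may vary by version:

```sql
-- dynamic_sampling alone tends to be rejected when object statistics
-- already exist; adding the (undocumented) dynamic_sampling_est_cdn
-- hint is reported to force the sampled cardinality to be used anyway.
-- Column name STATUS and the literal are hypothetical.
select /*+ dynamic_sampling(t1 4) dynamic_sampling_est_cdn(t1) */
       *
from   t1
where  status = 'INVALID';
```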

Cheers,

Dominic

Great explanations! (I think it is safe to say that I would not have been able to dig as deep into this answer as Randolf did; I seem to have forgotten some of these fine details.)

–

I think that everyone who commented in this blog article was certainly headed in the right direction (just some of you were moving a bit faster in that direction). It is interesting to see how such a simple case of “why isn’t Oracle using my index” has brought out so much helpful information on the topic.

> So, in this case, the absence of a note saying “dynamic sampling used” does not really mean that dynamic sampling was not used.

> It might make more sense in light of what I said in my first comment about dynamic_sampling_est_cdn.

> The dynamic_sampling hint applies to the selectivity of the single table access predicates.

> The “Note” also seems to only apply to that selectivity sampling.

> By using dynamic_sampling(0) we turned off the selectivity sampling (which allowed cardinality feedback to kick in on the subsequent execution if there was one).

> But, there’s no way to turn off the dynamic sampling of the cardinality.

> For example, there is no hint dynamic_sampling_no_est_cdn or no_dynamic_sampling_est_cdn.

Have you verified the above? It would mean that you get dynamic sampling for the base table cardinality estimate without a “dynamic sampling used” note in the plan output.

I don’t think this is the case. The above figures are actually based on hard-coded defaults: Oracle takes the segment size in blocks and multiplies it by a default number of rows per block derived from the default block size. It’s a bit of a coincidence that these defaults lead to a base table cardinality close to the actual one used in this example.

There is no dynamic sampling taking place, not even for the base table cardinality, although your explanation of dynamic_sampling vs. dynamic_sampling_est_cdn sounds quite plausible.

If you can show me evidence from a 10053 trace file of dynamic sampling taking place here, only then will I buy it.
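For anyone wanting to check this themselves, a 10053 (CBO) trace can be enabled at session level; note that a fresh hard parse of the statement is required for anything to be written:

```sql
-- Enable the optimizer trace for the current session
alter session set events '10053 trace name context forever, level 1';

-- ... parse/execute the statement of interest here
--     (it must be hard parsed, e.g. change a literal or flush it) ...

-- Switch the trace off again
alter session set events '10053 trace name context off';
```

The trace file appears in the session's trace directory (user_dump_dest / diagnostic_dest, depending on version).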

As a side note, this case can be used as an example where dynamic sampling results can be rejected – however, the case needs some modification:

- No index eligible for dynamic sampling (e.g. unindexed complex expression used as filter predicate)

- Underlying statistics available

In such a case, if you request dynamic sampling on top for better selectivity estimates, it will be rejected in many cases even at quite high dynamic sampling levels: Oracle does not find any rows satisfying the filter predicate in the sample and hence rejects the dynamic sampling results, until you reach a level at which the majority of the table blocks gets sampled.
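A hypothetical sketch of such a modified case, assuming a table built for the purpose (all names invented). The filter is a complex expression with no supporting index and very few matching rows, so a small sample is unlikely to hit any of them:

```sql
-- Skewed test table: only a handful of rows satisfy the predicate below.
create table t2 as
select rownum as id, rpad('x', 100) as padding
from   dual
connect by level <= 100000;

-- Underlying statistics available (second precondition above)
exec dbms_stats.gather_table_stats(user, 't2')

-- Unindexed complex expression as filter predicate; with a moderate
-- dynamic sampling level the sample typically contains no matching
-- rows, and the 10053 trace shows the sampling result being ignored.
select /*+ dynamic_sampling(t2 4) */ count(*)
from   t2
where  mod(id, 50000) = 1;
```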

What I’m trying to say here is that there are cases where dynamic sampling doesn’t help as expected – and the data distribution here represents such an edge case.

Randolf

Thanks a lot for your answer, which hit the nail on the head. I noticed the difference in the CBO trace file with estimate_percent => NULL vs. the default value yesterday, but was confused that the basic statistics knew about the 2 distinct values in both cases.

Marcus

I think the explanation for your 11.2 result can be seen from the following lines of the 10053 trace file excerpt:

> Histogram: Freq #Bkts: 1 UncompBkts: 5414 EndPtVals: 1

> Using density: 0.500000 of col #10 as selectivity of unpopular value pred

> Table: T1 Alias: T1

> Card: Original: 47789.000000 Rounded: 23895 Computed: 23894.50 Non Adjusted: 23894.50

So you seem to have ended up with a sampled frequency histogram (5,414 rows sampled) that consisted of a single popular value – which simply means that, due to the sampling, the histogram missed the very rare INVALID value. It’s a bad side effect of AUTO_SAMPLE_SIZE in 11g that it uses 100% sampling for the table and basic column statistics but sometimes scales down the sample size for histograms. This has probably been implemented for performance reasons, in order not to lose too much time on the additional gathering iterations required for each histogram, but it can lead to inconsistent statistics – as you can see from the other parts of the 10053 trace file, the basic column statistics have covered both values (NDV: 2).

Now combine this inconsistency with the fact that since Oracle 10.2.0.4 a value not found in a frequency histogram no longer gets a cardinality estimate of 1, as it used to, but half the cardinality of the least popular value – and you end up with a cardinality estimate of 50% of the table, since you have only a single popular value in the histogram, and it covers 100% of the data…
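The numbers from the trace excerpt line up with this: the single bucket covers all sampled rows, so the density used for a value missing from the histogram is 0.5, and applying it to the original cardinality reproduces the estimate shown above:

```sql
-- Density 0.500000 (from the trace) applied to Card: Original: 47789
select 47789 * 0.5 as estimated_card from dual;
-- 23894.5, i.e. the "Computed: 23894.50" rounded to "Rounded: 23895"
```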

This bad cardinality estimate drives up the cost of the index access so you end up with the FTS being favoured by the optimizer.

This behaviour can be influenced with a FIX_CONTROL (5483301) if it causes consistent trouble in one of your applications. See Jonathan’s post on this and the corresponding comments. The fix control lets you return to the old behaviour: missing values get a cardinality estimate of 1.
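For reference, the fix control is switched via the hidden "_fix_control" parameter; this is an undocumented setting, so treat the sketch below as a testing aid only and verify the exact syntax against your version:

```sql
-- Disable fix 5483301 for the current session only: a value missing
-- from a frequency histogram reverts to a cardinality estimate of 1
-- (the pre-10.2.0.4 behaviour).
alter session set "_fix_control" = '5483301:0';
```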

Repeat your test case on 11.2 with estimate_percent => NULL or 100 and you should end up with a frequency histogram covering both values, and therefore get both a correct cardinality estimate and index usage.
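A sketch of such a gathering call, assuming the table T1 from the trace excerpt; the method_opt value is just one plausible choice for building histograms on all columns:

```sql
begin
  dbms_stats.gather_table_stats(
    ownname          => user,
    tabname          => 'T1',
    estimate_percent => null,   -- NULL = compute, i.e. a 100% sample
    method_opt       => 'for all columns size 254');
end;
/
```

With a 100% sample the frequency histogram is built from all rows, so the rare INVALID value cannot be missed.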

Which leads back to my initial reply: check the cardinality estimates – bad cardinality estimates are the most common cause of bad execution plans.

Randolf

So, in this case, the absence of a note saying “dynamic sampling used” does not really mean that dynamic sampling was not used.

It might make more sense in light of what I said in my first comment about dynamic_sampling_est_cdn.

The dynamic_sampling hint applies to the selectivity of the single table access predicates.

The “Note” also seems to only apply to that selectivity sampling.

By using dynamic_sampling(0) we turned off the selectivity sampling (which allowed cardinality feedback to kick in on the subsequent execution if there was one).

But, there’s no way to turn off the dynamic sampling of the cardinality.

For example, there is no hint dynamic_sampling_no_est_cdn or no_dynamic_sampling_est_cdn.

So, you then get a standard 1% selectivity estimate on the dynamically sampled cardinality, rather than the “1” we might have expected with no stats and apparently no sampling.

It’s obvious, I suppose, once you realise what must be going on – which I certainly didn’t even notice originally.
