Posted by Ami Levin on Wednesday, 25 April 2012
An interesting discussion came up lately on the MVP private forums regarding index rebuilds, recompilation, and statistics updates. Although the debate spanned too many aspects for me to cover in this short post, I think one important point should be emphasized, as it might be making you do redundant work or, even worse, unknowingly degrading the performance of your workload.
When you rebuild an index, a full scan of all index keys is performed. SQL Server uses this opportunity not only to rebuild the index, but also to update the related statistics. What is unique about this statistics update is that it is equivalent to an update with the full-scan option. The default statistics sampling uses only a small fraction of the keys in order to keep the load on your production server low. Statistics based on a full scan of the data have the potential to be more accurate and therefore provide the query optimizer with better information, allowing it, in turn, to produce better-performing execution plans for your workload queries. Also remember that a statistics update flushes all plans that use those statistics from the procedure cache, forcing a recompilation of the related queries.
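As a minimal sketch of how you can see this for yourself (the table dbo.Orders and index IX_Orders_CustomerID are just placeholder names), rebuild the index and then look at the statistics header:

```sql
-- Rebuild the index; as a side effect, SQL Server updates the index's
-- statistics using all of the keys it just scanned (i.e. a full scan).
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REBUILD;

-- Inspect the statistics header: "Rows Sampled" should equal "Rows",
-- and "Updated" should show the time of the rebuild.
DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_CustomerID') WITH STAT_HEADER;
```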
Therefore, note the following tips:
- After rebuilding an index, there is no need to update its statistics explicitly. Moreover, doing so might actually leave you with less accurate statistics if you fail to use the FULLSCAN option.
- After rebuilding an index, there is no need to explicitly flush the relevant plans from cache. They will be flushed automatically. Be aware that using more 'aggressive' means, like executing sp_recompile on the underlying table or flushing the cache altogether, might cause unexpected results, as additional plans that had nothing to do with your rebuilt index will also be recompiled.
- If you have a time window during off-peak hours, try updating the statistics of your larger tables using the FULLSCAN option (see the example below). In some cases, it can make a huge difference.
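For the last tip, something along these lines should do it during your off-peak window (again, the table and index names are just placeholders):

```sql
-- Update all statistics on a large table with a full scan.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Or target a single statistics object, e.g. the one belonging to a specific index.
UPDATE STATISTICS dbo.Orders IX_Orders_CustomerID WITH FULLSCAN;
```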
Have a great weekend!