
Data and Proc Cache Analysis


Very recently I had the opportunity of looking into an environment where the client was using inline code extensively. Later during the investigation it was revealed that they used recursive triggers to drive the business logic, views as a stepping stone to stage data, and procedures/functions to implement the logic. (Sound very familiar?)

As part of my analysis I wanted to monitor the SQL Server cache consumption to identify any potential problems and record the statistics. As the cache consists of two components (the proc cache and the data cache), the analysis had to include both. The two DMVs used for this are sys.dm_exec_cached_plans and sys.dm_os_buffer_descriptors.

1.) Data cache

SELECT
    CASE database_id
        WHEN 32767 THEN 'RESOURCEDB'
        ELSE DB_NAME(database_id)
    END AS 'DatabaseName'
    , page_type
    , COUNT(1) * 8 / 1024 AS 'Data_CacheSize_MB'
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id, page_type
ORDER BY DatabaseName
This was fairly straightforward and can be found in BOL; I added page_type to get a breakdown of the types of pages cached.
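The same DMV can also give a single per-database total without the page_type breakdown, which is handy for a quick overview. A minimal sketch (assuming VIEW SERVER STATE permission; each buffer page is 8 KB):

```sql
-- Total buffer pool usage per database, largest first
SELECT
    CASE database_id
        WHEN 32767 THEN 'RESOURCEDB'
        ELSE DB_NAME(database_id)
    END AS DatabaseName
    , COUNT(1) * 8 / 1024 AS Data_CacheSize_MB
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY Data_CacheSize_MB DESC;
```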

2.) Proc cache

SELECT objtype AS [CacheType]
    , COUNT_BIG(*) AS [Total Plans]
    , SUM(CAST(size_in_bytes AS DECIMAL(18,2))) / 1024 / 1024 AS [Total MBs]
    , AVG(usecounts) AS [Avg Use Count]
    , SUM(CAST((CASE WHEN usecounts = 1 THEN size_in_bytes ELSE 0 END) AS DECIMAL(18,2))) / 1024 / 1024 AS [Total MBs - USE Count 1]
    , SUM(CASE WHEN usecounts = 1 THEN 1 ELSE 0 END) AS [Total Plans - USE Count 1]
FROM sys.dm_exec_cached_plans
GROUP BY objtype
ORDER BY [Total MBs - USE Count 1] DESC

This looks a little complicated because there is a story behind it. The query gives you a breakdown of the proc cache along with "first-time compilations".
What are "first-time compilations", and how do they cause proc cache bloat? They are queries whose compiled plans sit in the cache but are used only once. As the procedure cache fills up with these single-use plans, it ends up having to allocate more memory and resources to maintain the bloated cache. Please read Kimberly Tripp's blog post on single-use plans for an in-depth understanding: http://www.sqlskills.com/blogs/kimberly/post/procedure-cache-and-optimizing-for-adhoc-workloads.aspx
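To see which statements are actually behind those single-use plans, the plan handles in sys.dm_exec_cached_plans can be cross-applied to sys.dm_exec_sql_text. A minimal sketch (the TOP 20 cut-off is an arbitrary choice for illustration):

```sql
-- Largest single-use ad hoc plans, with their query text
SELECT TOP 20
    cp.objtype
    , cp.size_in_bytes / 1024 AS SizeKB
    , st.text AS QueryText
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE cp.usecounts = 1
    AND cp.objtype = 'Adhoc'
ORDER BY cp.size_in_bytes DESC;
```

If the text shows many near-identical statements differing only in literal values, that is a strong hint the workload would benefit from parameterization or the setting below.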

One thing I failed to mention at the beginning was that this environment was running SQL Server 2005 SP1. For the record, the proc cache allocation behaviour up to SQL Server 2005 SP1 and in later builds is significantly different. http://blog.sqlauthority.com/2009/08/21/sql-server-get-query-plan-along-with-query-text-and-execution-count/

Possible workaround for "first-time compilations"
For environments on SQL Server 2008, especially those that use ORM tools to access the database, "optimize for ad hoc workloads" can be used. http://blogs.technet.com/b/josebda/archive/2009/03/19/optimize-for-ad-hoc-workloads-in-sql-server-2008.aspx
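For reference, this is a server-wide setting enabled through sp_configure. It is an advanced option, so 'show advanced options' must be turned on first; once enabled, new ad hoc plans are cached as small stubs until a statement is executed a second time:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;
```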

Key takeaway for me: use SQL Server features more effectively, and test them before using them.
