
Target Server Memory vs. Total Server Memory

I was recently faced with a situation where users were complaining that one of the development database servers was slowing down. I logged on to the server remotely, opened Task Manager, and noticed that the CPU was performing as usual but the PF (committed memory) usage was adding up to almost the total memory of the server. My first thought was: is the server under memory pressure?

Note - The database instance in question lives on a box that hosts two SQL Server instances.

To investigate further I opened Performance Monitor and configured the counters I usually monitor with:
Buffer cache hit ratio, page life expectancy, average and current disk queue length, processor time, SQL Server connections, user database and tempdb transactions, and compilations and recompilations.
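Most of these PerfMon counters can also be read from inside SQL Server through the sys.dm_os_performance_counters DMV. Below is a minimal sketch (it needs a live connection to the instance; on a named instance the object_name prefix is MSSQL$InstanceName rather than SQLServer):

```sql
-- Read a few of the PerfMon counters mentioned above from inside SQL Server.
-- Note: 'Buffer cache hit ratio' must be divided by its '... base' counter
-- to get a meaningful percentage; the raw cntr_value alone is not the ratio.
SELECT [object_name],
       counter_name,
       cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN (
        'Buffer cache hit ratio',
        'Page life expectancy',
        'User Connections',
        'SQL Compilations/sec',
        'SQL Re-Compilations/sec'
      );
```

This is handy when you cannot RDP to the box but do have a SQL connection.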
When I went through the buffer cache hit ratio and page life expectancy counters, I could see it had nothing to do with memory pressure. But then why was the server encountering this slowness and these request timeouts?

It was then that I ran into the Total Server Memory and Target Server Memory counters under the Memory Manager object. What do these two counters represent?

Total Server Memory
This is the amount of memory SQL Server has currently committed and is using for its activities. The bulk of it is the buffer pool (data cache), together with other caches such as the plan cache.

Target Server Memory
Target Server Memory is the amount of memory SQL Server would like to have available. It is not fixed at startup; it is derived from the max server memory setting and the memory available on the box, and it can change while the instance is running.


Under normal circumstances Total Server Memory should be less than or equal to Target Server Memory. This implies SQL Server has enough memory to perform its tasks. If Total Server Memory exceeds Target Server Memory (which can happen when the target is lowered while that memory is already committed), it is a definite indication of memory pressure.
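The comparison above can be made with a quick query against sys.dm_os_performance_counters rather than PerfMon; a sketch, assuming a live connection to the instance being investigated:

```sql
-- Compare Target vs Total Server Memory for the current instance.
-- Both counters report their values in KB, hence the division to MB.
SELECT counter_name,
       cntr_value / 1024 AS memory_mb
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Memory Manager%'
  AND counter_name IN ('Target Server Memory (KB)',
                       'Total Server Memory (KB)');
```

If the Total row is at or above the Target row, that matches the pressure scenario described above; a Total comfortably below Target is the healthy case.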

And so the Target and Total Server Memory counters were suggesting there was some memory pressure on the server. I guessed this could be a one-time issue, as the buffer cache hit ratio and page life expectancy counters were far from looking troubled.

So I opened the SQL Agent counters for both instances and, on a hunch, stopped the two SQL Server Agent services, as the distribution counters were showing very high values. But I was surprised to find that Total Server Memory did not reduce; it held on to the same old figure. By this time the developers were well into their dev work with the servers back in action, but it would not be long before they noticed that replication was not working because the SQL Agents were stopped.

I first started the Agent on the second instance, and before I started the Agent on the instance where the problem was originally reported, I checked Total Server Memory again. It now showed a lower value than Target Server Memory for that instance.


Conclusion
Target Server Memory and Total Server Memory are two important memory counters available in SQL Server. Together with other counters such as buffer cache hit ratio and page life expectancy, they help identify whether the server is under memory pressure.
