
Length of LOB Data To Be Replicated Exceeds Configured Maximum 65536

We successfully migrated from SQL Server 2005 to SQL Server 2008 a few weeks back (in a non-production environment). There were few or no issues with the migration, and I would consider it one of the items that went well for us as a DBA team last quarter.

A few days back, one of the dev folks walked up to me and told me that a transaction would not complete and that the application was generating the following error.


Msg 7139, Level 16, State 1, Procedure csp_esccm_tTablename_Update, Line 49
Length of LOB data (72322) to be replicated exceeds configured maximum 65536.
The statement has been terminated.

After going through the message I noticed that the error had to do with replication: the LOB data being replicated was larger than the maximum size allowed by one of the server settings after the migration.
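Before changing anything, it helps to see how far over the limit the data actually is. The following is a minimal sketch using hypothetical names (dbo.PublishedTable and LobColumn are placeholders; substitute the published table and its LOB column) to list rows whose LOB data exceeds the default 65,536-byte limit:

-- hypothetical table/column names for illustration
SELECT TOP 10 DATALENGTH(LobColumn) AS lob_bytes
FROM dbo.PublishedTable
WHERE DATALENGTH(LobColumn) > 65536
ORDER BY lob_bytes DESC;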

Solution
There is a server-level setting that needs to be changed for replication to successfully transfer large data rows.

1. exec sp_configure 'max text repl size'

Highlight and execute the code on line one. This will display the maximum allowed and currently configured sizes, in bytes, for LOB data that can be replicated from publisher to subscriber.
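If you prefer a catalog view over the sp_configure result set, the same information is exposed in sys.configurations. The expected values shown in the comment are illustrative, assuming a default installation where the option still sits at its 65,536-byte default:

SELECT name, minimum, maximum, value AS config_value, value_in_use AS run_value
FROM sys.configurations
WHERE name = 'max text repl size';
-- expected shape (illustrative): max text repl size | -1 | 2147483647 | 65536 | 65536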

2. exec sp_configure 'max text repl size', 1024000
Then, to overcome the current issue, we can increase the size of the data that can be replicated. I have increased the maximum replication size from the 65,536-byte (64 KB) default to 1,024,000 bytes, roughly 1 MB.
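As an aside, rather than guessing a large enough number, SQL Server 2008 also accepts -1 for this option, which removes the size limit altogether (to my understanding, the only remaining cap is the data type's own maximum):

exec sp_configure 'max text repl size', -1 -- -1 = no replication size limit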

3. RECONFIGURE WITH OVERRIDE
Then execute RECONFIGURE WITH OVERRIDE; this assigns the configured value to the running value immediately, without having to restart the SQL Server instance.

4. exec sp_configure 'max text repl size'
Finally, verify that the maximum replication setting has been applied successfully.


This change will solve the problem and let you replicate data as frequently as you like.
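For convenience, here is the whole sequence as one runnable batch, a minimal sketch assuming you hold the ALTER SETTINGS permission that sp_configure and RECONFIGURE require:

exec sp_configure 'max text repl size'           -- 1. inspect the current value (in bytes)
exec sp_configure 'max text repl size', 1024000  -- 2. raise the limit to 1,024,000 bytes (~1 MB)
RECONFIGURE WITH OVERRIDE                        -- 3. apply the change without a restart
exec sp_configure 'max text repl size'           -- 4. confirm the run value has changed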

 
Conclusion

- You need to know the size of the data rows being replicated, or replication will fail.
- When large columns are replicated, 'max text repl size' needs to be raised from its current value to at least the size of the largest data row.
- We forgot to set the maximum replication size after the migration to SQL Server 2008; the comparison sketched below would have caught this.
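To avoid repeating the mistake in the last bullet, a quick sanity check after any migration is to run the same query against the old and new instances and compare the results. This is a sketch covering a few commonly tuned options, not an exhaustive checklist:

-- run on both the old and the new instance and compare value_in_use
SELECT name, value_in_use
FROM sys.configurations
WHERE name IN ('max text repl size', 'max server memory (MB)', 'max degree of parallelism')
ORDER BY name;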
