A note on behavioral change of IDENTITY property in SQL Server 2012

There can be gaps in auto-incrementing values implemented using the IDENTITY property in SQL Server. In most practical situations this column is used as a surrogate key and is not visible to the user. However, in some business scenarios the auto-generated values have to be made visible to users. This blog post is relevant to the latter case.

The implementation of the IDENTITY property has changed in SQL Server 2012 compared to prior versions. In SQL Server 2012, the IDENTITY property uses the same mechanism as the new SEQUENCE object. By default the SEQUENCE object has a mechanism called caching, in which a range of consecutive values is pre-allocated in memory based on a threshold value. For example, when you insert the first row and the current sequence value is 1, SQL Server may keep the next 1000 consecutive values in memory and record only the maximum value (i.e. 1000) in metadata. Once the incrementing value reaches 1000, a new cache is created for the numbers from 1001 to 2000 (note that these numbers are arbitrary). This is a performance improvement mechanism; previous SQL Server versions kept the last value in the transaction log.

The problem arises when there is a database failure or a server restart. Since the server keeps only the maximum value of the cache, it will start a new cache from that value. For example, suppose you have inserted two rows and the current value of the auto-increment column is 2. Now the server is restarted and you insert the next row. The value of the auto-increment column will be 1001. This abrupt jump is not acceptable in some scenarios.

Microsoft has commented that this is ‘by design’ and suggested two workarounds.

  1. Create a SEQUENCE object with the NO CACHE option in place of the IDENTITY column
  2. Use trace flag 272 to force the previous versions’ IDENTITY semantics
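A minimal sketch of the two workarounds (the sequence, table, and column names here are illustrative, not from any real schema):

```sql
-- Workaround 1: replace IDENTITY with a NO CACHE sequence bound through
-- a default constraint. Every value is persisted as it is handed out,
-- so there is no jump after a restart.
CREATE SEQUENCE dbo.OrderNumber AS INT
    START WITH 1
    INCREMENT BY 1
    NO CACHE;

CREATE TABLE dbo.Orders
(
    OrderID INT NOT NULL
        CONSTRAINT DF_Orders_OrderID DEFAULT (NEXT VALUE FOR dbo.OrderNumber),
    OrderDate DATETIME NOT NULL
);

-- Workaround 2: trace flag 272 disables identity caching server-wide;
-- add -T272 to the SQL Server startup parameters.
```

The trade-off with NO CACHE is a log write per generated value, which is essentially the pre-2012 behavior.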

You can read more on this here, here, and here.




Filed under T-SQL

The most dangerous command in SQL Server

There is one command in SQL Server that, if not used properly,

  1. May corrupt the database (if you don’t have a backup, you are lost)
  2. May corrupt the system databases and the server (if you don’t have backups of the system databases, you are dead)
  3. May cost you your job, or your entire career (if used in production)!

When using this command, nothing is recorded in the transaction log! You cannot roll back the changes. The command is DBCC WRITEPAGE.
Using this command, anyone with the sysadmin role can modify any page of any database. I repeat: any page of any database, including the system databases! You can either modify the data or fill it with junk bytes. Stay away from this command as far as you can, and never think about executing it against production databases. It is an undocumented command, used internally by the SQL Server development teams for testing purposes.

Then what is the use of this command? Used properly, it lets you create corrupted SQL Server databases for testing your disaster recovery procedures and for demo purposes.
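For completeness, this is the general shape of the command as shown in Paul Randal's write-up; the database name, page numbers, and byte values below are arbitrary, and you should treat the exact signature as an assumption. Never run this outside a disposable test database.

```sql
-- DBCC WRITEPAGE (db, file id, page id, byte offset, length, hex data)
-- Overwrites 1 byte at offset 4000 on page (1:288) of the throwaway
-- database CorruptMe, corrupting the page on purpose.
DBCC WRITEPAGE (N'CorruptMe', 1, 288, 4000, 1, 0x45);
```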

You can read more about this directly from Paul Randal, the developer of many of the DBCC commands (http://www.sqlskills.com/blogs/paul/dbcc-writepage/). Here is another blog with some interesting demo scripts using WRITEPAGE (http://www.sqlnotes.info/2011/11/23/dbcc-writepage/).



Filed under Internals

Book Review : CODE – Charles Petzold

If you are coming from the pre-.NET era of programming, I’m sure you will have encountered the name Charles Petzold, or at least the book ‘Programming Windows’. In those days he was a ‘superstar’ of Windows programming and VC++. When I was browsing through a bookshop in San Francisco last year, I chanced upon one of his books, ‘CODE – The Hidden Language of Computer Hardware and Software’.


I grabbed and purchased the book without a second thought, though it was a bit costly by my standards ($18.00; and though Leigh’s Books in San Francisco has a good collection of used books, I couldn’t find a used copy of CODE. Nobody will ever sell this book. So I purchased the only remaining copy there). I read the book in one week without putting it down. And Petzold proved again that he’s still a superstar for this new generation.

I took up the book for a second reading last week. CODE is a book about how computers work. I’m sure no one can resist the temptation to finish the book in one go. The roller-coaster ride starts from the cover page itself: the name ‘CODE’ is written in four representations on a white background, viz. English, Morse code, Braille, and binary-coded ASCII. He starts from the very basics, explaining Morse code and Braille, and develops the concepts into binary codes and number representations. He explains binary representations very well using flashlight bulb circuits. He covers logic circuits, flip-flops, and memory, and even shows how to build a binary adding machine, gradually, chapter by chapter.

Though the book is technical, Petzold has ingeniously infused the history of computers and the evolution of programming languages into this wonderful book. Every software professional should have this book on their shelf, within easy reach.

I’m rating this 10/10.


Filed under Books

Book Review : DBA Survivor – ‘Become a Rock Star DBA’

Since my family left town last week for holidays at our native place, I got a week’s spare time. I decided to spend the time reading one or two of the books that had been resting on the bookshelf for quite a long period, waiting for their turn. After a few minutes of browsing the shelf I chose two: Thomas LaRock’s DBA Survivor and Charles Petzold’s CODE.

DBA Survivor is a very small book (nearly 200 pages) that targets someone who is starting her career as a DBA, or a DBA who is in the midst of her career and wishes to advance to higher levels. This book is very different from other technical books. Apart from the technical information, it can be considered a motivational and career-development book. Even though the title says ‘DBA’, this can be an interesting read for anyone in the software industry (trust me). The author has plenty of real-world experience, and he transfers that knowledge to us, including what he has learned from his mistakes.

Would you ever expect a book about databases to have chapters on food habits, physical exercise, and the importance of sleep? ‘DBA Survivor’ has all of these. Another interesting chapter is ‘Connect. Learn. Share.’, which describes the importance of networking, learning, and sharing knowledge.

No introduction is needed for the author, Thomas LaRock, since he’s the rock-star blogger of the SQL Server world. The book’s subtitle reflects this: “Become a Rock Star DBA”. He’s also famous for his SQL Server blogger rankings (http://thomaslarock.com/rankings/).

I recommend this book to anyone practicing software development. Non-database professionals can skip the chapters on RAID levels, clustering, troubleshooting, etc. if they are not interested, though I suggest reading it without skipping any pages.


Filed under Books

Transient Fault Tolerance

If the behavior of a system deviates from its expected behavior, this is called a fault. A transient fault is an intermittent, short-lived fault. For example, when a user tries to update a table, the operation may fail due to a timeout, a network-related error, or an intermittent database connection issue. Another scenario is when database resources are deadlocked. These issues are short-lived; if you retry the operation after a short delay, it will usually succeed.

An enterprise-level application has to be transient fault tolerant if it is designed for high availability. For example, if a deadlock occurs, the SQL engine will choose one session as the deadlock victim, roll back its transaction, and raise error 1205. Instead of raising this error to the application immediately, the code should catch the exception and retry the statement a specified number of times before surfacing the error to the user.
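As an illustration, here is a T-SQL sketch of that retry pattern; the table, column, and literal values are hypothetical, and a real application would more likely implement the loop in its data access layer:

```sql
-- Retry an update up to 3 times when it is chosen as a deadlock victim.
-- dbo.Accounts and the literals are hypothetical.
DECLARE @retries INT = 3;

WHILE @retries > 0
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;
        UPDATE dbo.Accounts SET Balance = Balance - 100 WHERE AccountID = 1;
        COMMIT TRANSACTION;
        BREAK;  -- success: leave the loop
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
        SET @retries -= 1;
        IF ERROR_NUMBER() = 1205 AND @retries > 0
            WAITFOR DELAY '00:00:01';  -- back off, then retry
        ELSE
            THROW;  -- not a deadlock, or retries exhausted: surface it
    END CATCH
END;
```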

I’m currently reading the documentation for the latest Enterprise Library, version 6.0, released by Microsoft in April 2013, which contains a Transient Fault Handling Application Block that you can include in your applications. Using this block you can define retry intervals and retry strategies targeting SQL databases and Windows Azure services.

You can read more on EntLib and Transient Fault handling here:

Enterprise Library 6 – April 2013

Perseverance, Secret of All Triumphs: Using the Transient Fault Handling Application Block

Apart from the Transient Fault Handling block, EntLib 6.0 also contains the Data Access, Exception Handling, Semantic Logging, and Validation application blocks. Even if you decide not to use the library, reading the documentation and doing a code walk-through will give you some insight into how to handle the scenarios it describes.


Filed under Uncategorized

Dedicated Administration Connection – A Knight’s Tale

A few days ago I wrote a logon trigger to prevent unauthorized applications from connecting to the development database server. The trigger code was simple, spanning only a few lines, and I just executed the script. A couple of seconds later, I realized a logical mistake! Due to the logical error in the code, every user trying to connect was kicked out of the server. As expected, a lot of complaints piled up in front of the admin’s desk in no time (unfortunately, that’s my desk).

Thinking I would just drop the logon trigger, I replied to them, “Don’t worry, dude. I’m the admin and I’m here to help you. This will be resolved in no time.”

I tried to connect to the server. Oh my! The server was not accepting even my connection request. So the story begins here.

“Hey, I’m the King, the mighty Administrator. You are not allowing me to enter my castle?” asked the Administrator.

“Hey, King. I’m the logon trigger, the valiant gatekeeper. I’m instructed to prevent anyone who enters through the gate, even you, the King!”

“What! But who instructed you that!”

“You, the King, and your ignorant instructions”

“I, the mighty Administrator, hereby order that, I will give half of my kingdom to the one who kills this Gate keeper!” the mighty Administrator requested for help.

And here comes the valiant knight DAC, the Dedicated Administrator Connection. DAC ingeniously found another way to reach the castle, and the King killed the gatekeeper. In the end the King, the mighty Administrator, announced the amended instructions to his people and gave half of the kingdom to DAC the Knight. The King and the people of the kingdom lived happily ever after.

Long story short

The dedicated administrator connection is a special connection that allows an administrator to get into a running server for troubleshooting, even when the server is unresponsive. The DAC bypasses even logon triggers. The administrator can connect through the DAC by prefixing ADMIN: to the server name in the SSMS connect window, or by using the -A switch in sqlcmd. Since the DAC runs on a single thread, don’t expect any query parallelism. There are many limitations to the DAC; the Microsoft article below gives a detailed list.

Using a Dedicated Administrator Connection

You can also check whether the DAC scheduler is running by querying the DMV sys.dm_os_schedulers. A status column value of ‘VISIBLE ONLINE (DAC)’ indicates the DAC scheduler.
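For example:

```sql
-- Returns one row when the DAC scheduler is up (it normally is).
SELECT scheduler_id, status
FROM sys.dm_os_schedulers
WHERE status = 'VISIBLE ONLINE (DAC)';
```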


CAUTION: Use the DAC only if the situation truly demands it.

End of story: I connected to the server using the DAC and dropped the logon trigger. Then I corrected the trigger code and re-created it.


Filed under Administration, DMV

The mystery behind the statistics name _WA_Sys

If you are familiar with SQL Server statistics, you might have encountered statistics whose names are prefixed with _WA_Sys. These are generated by SQL Server during the query optimization step when column statistics are missing and the Auto Create Statistics option is enabled. For years it was believed that ‘WA’ stands for Washington, the state where Microsoft is headquartered. However, here is a different story.

“…..Contrary to the common opinion, the WA does not refer to Washington (home of SQL Server) but rather to Waterloo, Canada, where the Sybase product was developed”

This can be taken as an official statement, since the above is a quote from the Microsoft Press book Training Kit (Exam 70-462): Administering Microsoft SQL Server 2012 Databases, the official book for the Microsoft training.

Note: For those who wonder (especially newbies) what Sybase has to do with Microsoft, read the history here.
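Whatever the WA stands for, you can list the auto-created statistics in a database like this:

```sql
-- Auto-created statistics carry _WA_Sys-prefixed names.
SELECT OBJECT_NAME(s.object_id) AS TableName, s.name, s.auto_created
FROM sys.stats AS s
WHERE s.auto_created = 1;
```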


Filed under Uncategorized

The Halloween Problem

On this Halloween day, I think it would be interesting to revisit the Halloween problem in database systems. It relates to an anomaly in UPDATE, INSERT, and DELETE statements that can end in unexpected results. Also known as the Halloween protection problem, it was discovered by IBM researchers some 35 years ago. You can read about it in numerous blogs and articles; some interesting ones related to SQL Server are listed below.
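The textbook illustration (the table and the numbers here are hypothetical): give a 10% raise to every employee earning under $25,000.

```sql
-- Hypothetical table. If this update drives off an index on Salary while
-- also moving rows within that index, an unprotected plan could revisit
-- a row and raise it repeatedly until it crosses 25,000. Halloween
-- protection (often a spool in the plan) prevents the re-reads.
UPDATE dbo.Employees
SET Salary = Salary * 1.10
WHERE Salary < 25000;
```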

Halloween Protection – The Complete Series – Paul White

Halloween Protection – Craig Freedman’s SQL Server Blog

Halloween Protection – Benjamin Nevarez


Filed under Internals

Default schema for Windows group logins in SQL Server 2012

Alice is a tester at Acme Corp. Alice has access to the testing database server through the Windows group login for her team, [Acme\Testing]. When Alice creates a table without specifying a schema, her login name becomes the schema for the table, as below:
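The stripped screenshot presumably showed something like this (the column list is hypothetical; the table name is from this post):

```sql
-- Executed by Acme\Alice, connected via the [Acme\Testing] group login.
CREATE TABLE OrderDetail (OrderID INT, Quantity INT);
-- With no default schema assigned, SQL Server creates an implicit schema
-- named after the user, so the table lands in [Acme\Alice].OrderDetail
-- rather than dbo.OrderDetail.
```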


When Bob, from the same testing team, tries to query the OrderDetail table by executing the following query, he receives an ‘invalid object name’ error, because he has specified the schema as ‘dbo’:

SELECT * FROM dbo.OrderDetail

A frustrated Bob reaches out to the DBA for help and they discuss the issue. After a few minutes of discussion, the DBA understands the problem, the requirement, and, fortunately, the solution.

The problem: when a Windows group user creates an object without specifying a schema, the Windows login name is taken as the schema name (this is by design and has been in SQL Server for quite some time). The requirement: assign a default schema to a Windows group. And the good news: this is now possible in SQL Server 2012.

Using SSMS:

  1. Go to the database
  2. Expand Security and then Users node
  3. Open the property page of the Windows user or group to which you want to assign the default schema

Default Schema for Win User in SQL Server 2012

In T-SQL you can use the below script
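The script would be along these lines, using the group name from this example:

```sql
-- Assign a default schema to a Windows group (new in SQL Server 2012).
ALTER USER [Acme\Testing] WITH DEFAULT_SCHEMA = dbo;
```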




Filed under Administration

Joining sys.dm_os_waiting_tasks with sys.dm_tran_locks doesn’t work in SQL Server 2012 (RTM)

Last week, I was helping a developer debug a lock information query that joins the DMVs sys.dm_os_waiting_tasks and sys.dm_tran_locks. I created two sessions. In session 1, I simulated a lock by starting a transaction (under the READ COMMITTED isolation level) without committing it. In session 2, I issued a SELECT query against the same table, which was blocked by the open transaction in session 1. Then I executed the following query to retrieve the lock information.








SELECT OBJECT_NAME(p.object_id) AS ObjectName
FROM sys.dm_tran_locks AS t1
INNER JOIN sys.dm_os_waiting_tasks AS t2
    ON t1.lock_owner_address = t2.resource_address
INNER JOIN sys.partitions AS p
    ON p.hobt_id = t1.resource_associated_entity_id;

However, the query didn’t return the expected results. I queried the DMVs individually and found that the values in the lock_owner_address and resource_address columns were different. I repeated the same test on another server and got the expected result! I then compared the versions of the two servers and found that the developer was working on SQL Server 2012 (RTM) while my second test was on SQL Server 2012 SP1. So this is a bug in the RTM version.

I searched the Microsoft Connect website and found the below Connect item filed by Kalen Delaney.


So I requested the system admin to install SP1 on the RTM server, and the issue was sorted out.


Filed under DMV, Internals