It’s been a while since I last wrote a blog post. As the title suggests, I would like to write about Temporary Tables in SQL Server and how these objects can reduce overall performance.
You might have seen or used Temporary Tables in stored procedures to implement business logic: some data is kept in a temporary table (whether local or global) and used in the rest of the stored procedure, or perhaps in other stored procedures. I highly recommend that you do not use a Temporary Table for every single query.
Using temporary tables in stored procedures has several drawbacks, such as multiple recompilations, transaction log usage, and concurrency overhead.
If you use temporary tables in stored procedures and at the same time run DDL commands against them, the procedure will be recompiled multiple times. Stored procedure recompilation occurs for the following reasons, among others:
- The schema of a table, temporary table, or view referenced by the procedure changed.
- The statistics of a referenced object reached the recompilation threshold.
SQL Server keeps track of the number of changes made to a table. If the number of changes exceeds the recompilation threshold, SQL Server automatically updates the statistics, which in turn triggers a recompilation. The following rules determine the recompilation threshold from the number of rows ($ROWS) in a temporary table:
- IF $ROWS < 6 THEN $RECOMPILATION_THRESHOLD = 6
- IF 6 <= $ROWS <= 500 THEN $RECOMPILATION_THRESHOLD = 500
- IF $ROWS > 500 THEN $RECOMPILATION_THRESHOLD = 500 + ($ROWS x 0.20)
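As a worked example (a hypothetical sketch, not part of the original scripts), the threshold rules above can be computed in T-SQL for a temporary table holding 2,000 rows:

```sql
-- Hypothetical worked example of the recompilation threshold rules
DECLARE @Rows BIGINT = 2000;
DECLARE @Threshold BIGINT;

IF @Rows < 6
    SET @Threshold = 6;
ELSE IF @Rows <= 500
    SET @Threshold = 500;
ELSE
    SET @Threshold = 500 + CAST(@Rows * 0.20 AS BIGINT); -- 500 + 400 = 900

SELECT @Threshold AS RecompilationThreshold; -- returns 900
```

So after roughly 900 modifications to a 2,000-row temporary table, the statistics are updated and the referencing statements recompile.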
Another reason for recompilation is deferred object resolution. Most of the time, you create local temporary tables in a stored procedure rather than regular tables. A local temporary table does not exist when the procedure is compiled, so resolution of its name is deferred until execution time, which forces a recompilation. The table is also not available outside the scope of the stored procedure, so its schema cannot be altered between executions.
The script below creates a stored procedure with the following definition:
```sql
USE tempdb;
GO
CREATE PROCEDURE TestSP
AS
BEGIN
    CREATE TABLE #TempResult
    (
        ID      BIGINT IDENTITY(1,1),
        Padding BINARY(10) DEFAULT 0xFFF
    );

    -- Fill the table row by row (the loop body is an assumed
    -- reconstruction; the original loop was truncated)
    WHILE (SELECT COUNT(*) FROM #TempResult) < 10000
    BEGIN
        INSERT INTO #TempResult DEFAULT VALUES;
    END
END
GO
```
Once we execute the stored procedure and monitor it with the SQL Profiler tool, you can see that the execution plan is recompiled several times and consumes processor resources.
Remember that if the temporary table is cached, its statistics are also cached, and when the stored procedure executes again, the optimizer uses the old statistics. Therefore you might need to recompile the stored procedure or manually update the statistics on the temporary table.
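For example, assuming the TestSP procedure and #TempResult table from the earlier script, a sketch of the manual options looks like this (note that UPDATE STATISTICS must run where the temporary table is in scope, i.e. inside the procedure):

```sql
-- Inside the procedure, refresh the statistics manually:
UPDATE STATISTICS #TempResult;

-- Or mark the whole procedure for recompilation on its next execution:
EXEC sp_recompile N'TestSP';

-- Or request a fresh plan for a single run:
EXEC TestSP WITH RECOMPILE;
```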
Creating a clustered index on a temporary table increases the rate of temporary object creation and consumes more memory on the SQL Server instance.
Explicitly creating and dropping the temporary table does not help you get a more efficient execution plan, because SQL Server caches the temporary table under a different, system-generated name and reuses the previously created temporary table each time the CREATE command is executed.
Transaction Log Overhead
Temporary tables behave like regular tables in terms of transaction handling, so any DML operation on them is logged in the transaction log file, which can impact performance. In a blog post titled ‘Boost-Up SQL Server Performance with Minimal Investment’, I already discussed how transaction log I/O can reduce performance.
To overcome this issue, SQL Server provides the Table Variable, which carries much less transaction log overhead, but it has some limitations:
- No DDL can be executed against it once the object is declared.
- No statistics are created on it.
- Its data is not affected by a transaction rollback.
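A minimal sketch illustrating the rollback limitation (the table and variable names here are illustrative, not from the original scripts):

```sql
-- A table variable keeps its rows through a rollback,
-- while a temporary table does not.
DECLARE @v TABLE (ID INT);
CREATE TABLE #t (ID INT);

BEGIN TRANSACTION;
    INSERT INTO @v VALUES (1);
    INSERT INTO #t VALUES (1);
ROLLBACK TRANSACTION;

SELECT COUNT(*) FROM @v;  -- returns 1: the row survives the rollback
SELECT COUNT(*) FROM #t;  -- returns 0: the row is rolled back

DROP TABLE #t;
```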
It is better to use a Table Variable in the following cases:
- Only a few rows need to be kept.
- Rolling back the data with the transaction is not required.
The following script updates the above stored procedure to use a Table Variable instead of a Temporary Table.
```sql
USE tempdb;
GO
ALTER PROCEDURE TestSP
AS
BEGIN
    DECLARE @TempResult TABLE
    (
        ID      BIGINT IDENTITY(1,1) PRIMARY KEY CLUSTERED,
        Padding BINARY(10) DEFAULT 0xFFF
    );

    -- Fill the table row by row (the loop body is an assumed
    -- reconstruction; the original loop was truncated)
    WHILE (SELECT COUNT(*) FROM @TempResult) < 10000
    BEGIN
        INSERT INTO @TempResult DEFAULT VALUES;
    END
END
GO
```
As you can see in the following SQL Profiler trace result, the recompilation operations are significantly reduced:
Concurrency and Locking Overhead
A transaction that performs DML operations on a temporary table must acquire locks to do so. Concurrent transactions operating on the same temporary table can therefore lead to blocking or, in some cases, deadlocks. To overcome this problem, you can use table variables: they are treated like local variables rather than database objects, so the locking overhead associated with regular or temporary tables does not apply.
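To observe this locking overhead yourself, you can inspect lock requests in tempdb with the sys.dm_tran_locks DMV while another session is modifying a temporary table. A minimal sketch, run from a second session:

```sql
-- Show current lock requests held or waited on in tempdb,
-- where temporary tables live.
SELECT resource_type,
       request_mode,
       request_status,
       request_session_id
FROM sys.dm_tran_locks
WHERE resource_database_id = DB_ID(N'tempdb');
```

Comparable DML against a table variable produces far fewer visible lock requests, since it is not treated as a shared database object.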
It is highly recommended not to use Temporary Tables, or even Table Variables, when you can write a set-based query to retrieve the data instead. Temporary tables are regular database objects in terms of transaction handling and performance, so using many temporary tables in your stored procedures can lead to very poor database performance.
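As a hypothetical illustration (the Customers and Sales tables are assumed for the example, not taken from the original post), a staging temporary table can often be replaced by a single set-based statement:

```sql
-- Temp-table approach (avoid where possible):
--   SELECT CustomerID INTO #Recent
--   FROM Sales WHERE OrderDate >= '2020-01-01';
--   SELECT c.* FROM Customers c
--   JOIN #Recent r ON r.CustomerID = c.CustomerID;

-- Set-based alternative: one statement, no intermediate object,
-- no extra logging, caching, or recompilation overhead.
SELECT c.*
FROM Customers AS c
WHERE EXISTS (SELECT 1
              FROM Sales AS s
              WHERE s.CustomerID = c.CustomerID
                AND s.OrderDate >= '2020-01-01');
```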