An interesting find about Temp tables in SQL Server

I ran into a very interesting issue with temp tables recently. The issue description goes something like this: we have an application preparing some SQL statements and sending them to the SQL Server database engine for execution. However, the “issue” is easily reproducible in SQL Server Management Studio. The first batch of statements looks something like this:-

--Execution Set 1 starts
if exists(select * from tempdb..sysobjects where id=OBJECT_ID('tempdb..#Tmp') and xtype='U') drop table #Tmp

Select
    '1' as A,
    '2' as B
Into #Tmp

Select
    a, b
from #Tmp

--Execution Set 1 ends here

Next, we prepare and send the following set of statements from the same session:-

--Execution set 2 starts
if exists(select * from tempdb..sysobjects where id=OBJECT_ID('tempdb..#Tmp') and xtype='U')
drop table #Tmp

Select
    '3' as A,
    '4' as B,
    'The Troublemaker' as C
Into #Tmp

Select
    a, b, c
from #Tmp

--Execution Set 2 ends here

Upon execution, the second batch generates an error:-

Msg 207, Level 16, State 1, Line 11

Invalid column name 'c'.

It does seem that SQL Server caches the temp table definition, and when the second batch of statements goes in (remember, it is compiled as a complete batch), the “select from” statement is compiled against the existing temp table definition and therefore fails. However, if we use “Select * from” instead of “Select a,b,c” in the second batch, we get the desired results. This is because by the time execution actually reaches the “select from”, the table definition has been changed, and it picks up the new definition.
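For illustration, here is a sketch of the second batch with only the final select changed to “Select *”; this variant runs without the error:

--Execution set 2 rewritten with select * (sketch)
if exists(select * from tempdb..sysobjects where id=OBJECT_ID('tempdb..#Tmp') and xtype='U')
drop table #Tmp

Select
    '3' as A,
    '4' as B,
    'The Troublemaker' as C
Into #Tmp

--select * is resolved at execution time, by which point #Tmp has the new definition (including column C)
Select *
from #Tmp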

I also found that adding a “go” statement after the “If exists…then drop table #Tmp” statement resolves the issue in Management Studio. This is expected, as the go statement acts as a batch separator; since the table (as well as its cached definition) has already been dropped by the time the second select into statement (Batch 2) is parsed, SQL Server is able to create a new table from that statement.
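In Management Studio, that change looks something like this (a sketch of the same second batch with the batch separator added after the drop):

if exists(select * from tempdb..sysobjects where id=OBJECT_ID('tempdb..#Tmp') and xtype='U')
drop table #Tmp
go    --batch separator: the old #Tmp (and its cached definition) is gone before the rest is parsed

Select
    '3' as A,
    '4' as B,
    'The Troublemaker' as C
Into #Tmp

Select
    a, b, c
from #Tmp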

Adding a “go” statement after the “select into” statement also resolves the issue, and this makes sense too: the select into goes in one batch and the select from in another, so the table definition in the cache has already been updated by the time the select from is parsed (the select into statement was compiled and executed before the select from came along).
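The equivalent sketch with the batch separator placed after the “select into” instead:

if exists(select * from tempdb..sysobjects where id=OBJECT_ID('tempdb..#Tmp') and xtype='U')
drop table #Tmp

Select
    '3' as A,
    '4' as B,
    'The Troublemaker' as C
Into #Tmp
go    --batch separator: #Tmp now exists with column C before the final select is parsed

Select
    a, b, c
from #Tmp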

However, in my case, since the customer was using an application, using go was not possible (go is not a T-SQL command, as documented here). After some detailed research, I found the following excerpt from Books Online for “Create Table” (available here) to be relevant to the situation at hand (though the scenario is not exactly the same):-

“A local temporary table created within a stored procedure or trigger can have the same name as a temporary table that was created before the stored procedure or trigger is called. However, if a query references a temporary table and two temporary tables with the same name exist at that time, it is not defined which table the query is resolved against. Nested stored procedures can also create temporary tables with the same name as a temporary table that was created by the stored procedure that called it. However, for modifications to resolve to the table that was created in the nested procedure, the table must have the same structure, with the same column names, as the table created in the calling procedure.”

So, if you ever face this issue, here are the possible workarounds:-

Connecting from SSMS (or sqlcmd or osql)

When connecting from SSMS, the simplest workaround is to insert a “go” statement in the second batch (after either of the first two statements, as shown above), thereby breaking it into two batches. We can also use a different temp table name, which eliminates the issue completely.
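As a rough sketch of the renamed-table option (the name #Tmp2 below is just an illustration), the second batch would look something like this; since #Tmp2 does not exist when the batch is compiled, there is no cached definition for it to clash with:

if exists(select * from tempdb..sysobjects where id=OBJECT_ID('tempdb..#Tmp2') and xtype='U')
drop table #Tmp2

Select
    '3' as A,
    '4' as B,
    'The Troublemaker' as C
Into #Tmp2

Select
    a, b, c
from #Tmp2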

Connecting from an Application

When connecting from an application, we have the following workarounds:-

· Use a different temp table name

· Drop the temp table at the end of the first batch itself, rather than at the start of the second batch (this is what my customer used; see the sketch after this list)

· Split the second batch into two batches, placing the “If exists…then drop table #tmp” statement and the other two “Select” statements in two separate batches.
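As a sketch of the second workaround (the one my customer used), the drop moves to the end of the first batch and the second batch no longer needs it:

--Execution Set 1, with the drop moved to the end of the batch (sketch)
if exists(select * from tempdb..sysobjects where id=OBJECT_ID('tempdb..#Tmp') and xtype='U')
drop table #Tmp

Select
    '1' as A,
    '2' as B
Into #Tmp

Select
    a, b
from #Tmp

drop table #Tmp    --#Tmp (and its cached definition) is gone before the second batch is ever compiled

--Execution Set 2, sent later from the same session; no drop needed at the start
Select
    '3' as A,
    '4' as B,
    'The Troublemaker' as C
Into #Tmp

Select
    a, b, c
from #Tmp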

Hope this helps someone. Please do let me know if you have any questions/doubts/comments related to the issue, or if you know of a different workaround for it.

Comments

  • Anonymous
    September 25, 2012
    good finding. thnx

  • Anonymous
    September 26, 2012
    Thanks for appreciating, Gurmit. Glad to know you liked it.

  • Anonymous
    December 16, 2012
    Yeah, this is an issue I faced recently... SQL should have a way to avoid this...

  • Anonymous
    December 15, 2014
    you can use query like that :
    if OBJECT_ID('TEMPdb..#Tmp') is not null drop table #Tmp
    Select '1' as A, '2' as B Into #Tmp
    Select a, b from #Tmp
    if OBJECT_ID('TEMPdb..#Tmp') is not null drop table #Tmp
    if OBJECT_ID('TEMPdb..#Tmp') is not null drop table #Tmp
    Select '3' as A, '4' as B, 'The Troublemaker' as C Into #Tmp
    Select a, b, c from #Tmp
    if OBJECT_ID('TEMPdb..#Tmp') is not null drop table #Tmp