The trouble with DBCC INPUTBUFFER()

DBCC INPUTBUFFER() is a very handy and easy-to-remember Database Console Command that shows the last statement submitted by a specific session (identified by its SPID). BUT it only returns a maximum of 255 characters, which is often not enough to show the full command being executed.

There are a couple of solutions to this, thankfully. First, however, we need to know which users are currently logged in and their associated SPIDs (Server Process Identifiers). To do this we use the trusty system stored procedure sp_who2:

EXEC sp_who2
GO
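If you prefer a documented alternative to sp_who2, the same information is available from the dynamic management views. The column list below is a minimal sketch; add or remove columns to taste:

```sql
-- List user sessions with their SPIDs (session_id)
SELECT s.session_id      -- the SPID to feed into DBCC INPUTBUFFER()
       ,s.login_name
       ,s.host_name
       ,s.program_name
       ,s.status
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1;   -- exclude system sessions
```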

If we pass the SPID we are interested in to the command:

DBCC INPUTBUFFER(54)

It returns the command being executed:

All well and good, but we can use this piece of code to return the full contents of the buffer:

USE master
GO

-- Grab the handle of the SQL most recently run by SPID 54
DECLARE @Handle binary(20);

SELECT @Handle = sql_handle
FROM sysprocesses
WHERE spid = 54;

-- Return the full text of the statement
SELECT *
FROM ::fn_get_sql(@Handle);

I have happily been using the above script for years, but there's a problem: ::fn_get_sql is marked as deprecated and, though still available, will be removed from a future version of SQL Server. We are therefore encouraged to use the Dynamic Management Function sys.dm_exec_sql_text() to return the text of commands. In the example below I have combined sys.dm_exec_requests with that function to return the contents of the buffer:

DECLARE @spid INT;
SET @spid = 54;

SELECT r.session_id AS SPID
       ,r.status
       ,r.command
       ,DB_NAME(r.database_id) AS DBName
       ,t.text AS SQLText
       ,r.wait_type
       ,r.last_wait_type
       ,r.percent_complete
       ,r.estimated_completion_time
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id = @spid;
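One caveat: sys.dm_exec_requests only covers sessions that are currently executing a request. For an idle session, a sketch like the following can read the most recently executed statement from sys.dm_exec_connections instead (here @spid is assumed to be the session you are investigating):

```sql
DECLARE @spid INT = 54;   -- the session to inspect

-- For idle sessions, most_recent_sql_handle holds the last statement run
SELECT c.session_id
       ,t.text AS LastSQLText
FROM sys.dm_exec_connections AS c
CROSS APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS t
WHERE c.session_id = @spid;
```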

This is now my default way of dealing with this situation, especially as we get some useful extra columns. However, remembering this (let alone typing it every time I need it) can be a nuisance, so I have created a "snippet" from it - a new feature of SQL Server Management Studio 2012, details of which will follow in another posting. Prior to this I used snippets with the aid of SSMS Tools Pack, which was free before the 2012 version but is probably still worth buying if you like the other features it provides.
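Worth noting: from SQL Server 2014 SP2 onwards there is also a documented DMF replacement for DBCC INPUTBUFFER itself, sys.dm_exec_input_buffer, which requires VIEW SERVER STATE permission and returns the input buffer as a queryable result set:

```sql
-- DMF equivalent of DBCC INPUTBUFFER (SQL Server 2014 SP2 and later);
-- takes (session_id, request_id) - pass NULL for the current request
SELECT event_type, parameters, event_info
FROM sys.dm_exec_input_buffer(54, NULL);
```

Being a function rather than a DBCC command, it can be cross-applied against sys.dm_exec_sessions to capture the input buffer of many sessions at once.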

BTW, have you ever wondered what the double colon :: before the function name means? It is one of the peculiarities of SQL Server and is only used in a few places; more details can be found on Kalen Delaney's blog.

This piece of code is ideal for a Snippet.

