When DBCC UPDATEUSAGE is run, the page and row counts stored in the catalog are corrected. When these counts are wrong, reports of how much space is used by tables and indexes (such as the output of sp_spaceused) are also wrong. The most likely cause of incorrect usage stats is a database being upgraded to a newer version of SQL Server (e.g. 2000 to 2005), so it's important to run this command after such an upgrade. It also isn't a bad idea to run it on a weekly or monthly schedule if your database undergoes frequent DDL modifications. To execute it, you can run one of the following commands:

[cc lang="sql"]
USE MyDB;
GO
DBCC UPDATEUSAGE(0);    -- Execute for the current database
GO
DBCC UPDATEUSAGE(MyDB); -- Execute using the database name
GO
DBCC UPDATEUSAGE(5);    -- Execute using the database ID (replace 5 with your database's ID)
GO
[/cc]

This may take some time, so make sure to run it in a non-production environment or during off hours, as it may cause blocking. You can also run it for a specific table or index. If you run it for a table, the usage stats are updated for all indexes on that table.

[cc lang="sql"]
-- Update for a table (and its indexes)
DBCC UPDATEUSAGE(AdventureWorks, 'Sales.SalesOrderDetail');

-- Update usage for a single index
DBCC UPDATEUSAGE(AdventureWorks, 'Sales.SalesOrderDetail', 'IX_SalesOrderDetail_ProductID');
[/cc]

That's basically it. Now, you may be running these commands and not seeing any output. Well, that's because the usage stats are already correct. But if you want to spoof the stats in order to see the […]
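As a quick way to see whether the counts actually changed, sp_spaceused can be run before and after the command; a minimal sketch, assuming the same AdventureWorks table used above:

[cc lang="sql"]
USE AdventureWorks;
GO
-- Report the space and row counts currently stored in the catalog
EXEC sp_spaceused 'Sales.SalesOrderDetail';

-- Correct any inaccurate counts for the table (and its indexes)
DBCC UPDATEUSAGE(AdventureWorks, 'Sales.SalesOrderDetail');

-- Report again; the numbers now reflect the corrected counts
EXEC sp_spaceused 'Sales.SalesOrderDetail';
[/cc]

Note that sp_spaceused also accepts an @updateusage argument that triggers the same correction for a single object.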
Continue reading ...
The only way to insert values into a field that is defined as an "IDENTITY" (or autonumber) field is to set the IDENTITY_INSERT option to ON prior to inserting data into the table. To illustrate, let's create a table that has an identity column defined.

[cc lang="sql"]
-- Create table with identity column
CREATE TABLE MyOrders
(
    OrderID int IDENTITY(1,1),
    ProductName varchar(20)
);

-- Now try to insert into the table
INSERT INTO dbo.MyOrders
(
    OrderID,
    ProductName
)
VALUES
(
    1,
    'socks'
);
[/cc]

Executing the above code, we get the following error:

Cannot insert explicit value for identity column in table 'MyOrders' when IDENTITY_INSERT is set to OFF.

To fix this, we simply need to set IDENTITY_INSERT to ON. Keep in mind that to perform this action you must, at a minimum, own the table or have ALTER permission on it (for example, by being a member of the db_owner role).

[cc lang="sql"]
SET IDENTITY_INSERT dbo.MyOrders ON;

INSERT INTO dbo.MyOrders
(
    OrderID,
    ProductName
)
VALUES
(
    1,
    'socks'
);

SET IDENTITY_INSERT dbo.MyOrders OFF;
[/cc]

You want to make sure to turn this off after it's used. Otherwise, you will not be able to turn this feature on for any other table from within the same session until it is turned off for this table.
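To see that last restriction in action, here is a minimal sketch (the second table, MyInvoices, is just a hypothetical example) showing what happens when you try to enable IDENTITY_INSERT on a second table in the same session:

[cc lang="sql"]
-- Hypothetical second table with its own identity column
CREATE TABLE MyInvoices
(
    InvoiceID int IDENTITY(1,1),
    Amount money
);

SET IDENTITY_INSERT dbo.MyOrders ON;

-- This statement fails, reporting that IDENTITY_INSERT is already ON for dbo.MyOrders
SET IDENTITY_INSERT dbo.MyInvoices ON;

-- Once MyOrders is turned off, MyInvoices can be turned on
SET IDENTITY_INSERT dbo.MyOrders OFF;
SET IDENTITY_INSERT dbo.MyInvoices ON;
SET IDENTITY_INSERT dbo.MyInvoices OFF;
[/cc]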
Continue reading ...
I recently had a need to monitor inserts against a particular table in order to determine what kind of locks they were acquiring. Since we could not run traces on the system, I had to resort to a roll-your-own approach to monitoring the table. The goal was to determine what kind of locking was occurring, and it would also be nice to be able to associate it with the executing query. The following code uses DMVs in order to trap a sample of the queries running against the table. Please note that this will not trap all queries, but it will capture a good number of samples. This script runs in an endless loop, so be sure to hit cancel at some point. Also, for some reason the SQL statement is not always trapped. That was not that important for me, as I mostly needed the locks, but if someone figures it out, please post.

[cc lang="sql"]
-- Capture query activity against a table using DMVs
DECLARE @TableName varchar(255);

-- Specify the table you want to monitor
SET @TableName = 'Sales.SalesOrderDetail';

DECLARE @ObjectID int;
SET @ObjectID = (SELECT OBJECT_ID(@TableName));

IF OBJECT_ID('tempdb..##Activity') IS NOT NULL
BEGIN
    DROP TABLE ##Activity;
END;

-- Create table
SELECT TOP 0 *
INTO ##Activity
FROM sys.dm_tran_locks WITH (NOLOCK);

-- Add additional columns
ALTER TABLE ##Activity ADD
    SQLStatement VARCHAR(MAX),
    SQLText VARCHAR(MAX),
    LoginName VARCHAR(200),
    HostName VARCHAR(50),
    Transaction_Isolation VARCHAR(100),
    DateTimeAdded DATETIME;

DECLARE @Rowcount INT = 0;

WHILE 1 = 1
BEGIN
    INSERT INTO ##Activity
    SELECT […]
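For a lighter-weight, one-off look at the same information (rather than the endless loop above), a single query against the locking and session DMVs will show who currently holds or is waiting on locks against the table. This is a minimal sketch, again assuming the AdventureWorks table name used above; being only a snapshot, it has to be run while the activity is in flight:

[cc lang="sql"]
-- Snapshot of locks currently held or requested on a single table,
-- joined back to the session and (when available) the running statement
DECLARE @ObjectID int = OBJECT_ID('Sales.SalesOrderDetail');

SELECT  l.request_session_id,
        l.resource_type,
        l.request_mode,
        l.request_status,
        s.login_name,
        s.host_name,
        t.[text] AS sql_text
FROM    sys.dm_tran_locks AS l
JOIN    sys.dm_exec_sessions AS s
        ON s.session_id = l.request_session_id
LEFT JOIN sys.dm_exec_requests AS r
        ON r.session_id = l.request_session_id
OUTER APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE   l.resource_database_id = DB_ID()
  AND   (
            -- object-level locks reference the object id directly
            (l.resource_type = 'OBJECT'
             AND l.resource_associated_entity_id = @ObjectID)
            -- page/key/rid locks reference the hobt id; map it back via sys.partitions
         OR (l.resource_type IN ('PAGE', 'KEY', 'RID')
             AND l.resource_associated_entity_id IN
                 (SELECT hobt_id FROM sys.partitions WHERE object_id = @ObjectID))
        );
[/cc]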
Continue reading ...
SQL Server's Batch Requests/sec counter represents the number of SQL batches being executed per second. In my opinion, this is the one metric that gives the best baseline for how your server is performing. It cannot be looked at by itself, however; it needs to be correlated with other metrics (especially CPU usage) in order to get an overall understanding of how your server is performing. The overall goal of any SQL Server is to achieve the most batch requests/sec while utilizing the least amount of resources. On some busy machines I've worked with, this counter averaged around 25,000 batch requests/sec during peak time. That peak throughput is heavily dependent on the architectural design of the system, including procedure, table, and index design. One notable example of this was a system I worked with whose procedures were written using loops (ugh). The most batch requests/sec we could ever achieve was only 200. I was very surprised to see this initially, but digging deeper into the code I became less and less shocked. Ultimately I found one piece of code that affected the throughput of the entire system. It was a scalar UDF defined as a computed column in the busiest table on the system (don't get me started). Anyway, after rewriting that one process, the system hit batch request peaks of over 3,500! The fact that the system could now achieve that number was a big achievement. It meant the overall throughput of the […]
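If you want to spot-check this number without opening Performance Monitor, the same counter is exposed through sys.dm_os_performance_counters. The stored value is cumulative since the instance started, so the per-second rate has to be derived from two samples; a minimal sketch (the 10-second delay is just an arbitrary sampling window):

[cc lang="sql"]
-- Sample Batch Requests/sec by taking two readings of the cumulative counter
DECLARE @first bigint,
        @second bigint,
        @start datetime2 = SYSDATETIME();

SELECT @first = cntr_value
FROM   sys.dm_os_performance_counters
WHERE  counter_name = 'Batch Requests/sec'
  AND  object_name LIKE '%SQL Statistics%';

WAITFOR DELAY '00:00:10';   -- sampling window; adjust as needed

SELECT @second = cntr_value
FROM   sys.dm_os_performance_counters
WHERE  counter_name = 'Batch Requests/sec'
  AND  object_name LIKE '%SQL Statistics%';

SELECT (@second - @first) * 1.0
       / DATEDIFF(second, @start, SYSDATETIME()) AS batch_requests_per_sec;
[/cc]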
Continue reading ...
Often you have a need to export a table from SQL Server into another format. The most common of these are Excel, CSV (comma-separated values), Access, or even another SQL Server. Luckily, there is one tool native to SQL Server that can export to all of these formats: the SQL Server Import and Export Wizard. To start this wizard, simply right-click in Management Studio on the database that contains the table you want to export, then select Tasks -> Export Data. What pops up next is the SQL Server Import/Export Wizard. Click Next to get past the splash screen, then choose the data source. This is going to be the same data source as the database you selected to export the table from. Once you select the server name and database name, click Next. Next, choose the destination. This is where the multiple export options come in: you can choose any of the options that appear in the destination drop-down box. Once you select the export file type and file location, you will be prompted to select the tables to output, along with additional options. This will let you output your table in many different export formats.
Continue reading ...