Alter Index All Tables

This script comes in handy when you need to rebuild every index in a database. It works on SQL Server 2005 and later, and utilizes the ALL keyword in the ALTER INDEX statement to rebuild all the indexes for a particular table. The script retrieves all base tables, stores them in a temporary table, then loops through and rebuilds everything.

[cc lang="sql"]
USE AdventureWorksLT2008
GO

SELECT RowNum = ROW_NUMBER() OVER(ORDER BY t.TABLE_NAME)
    ,TableName = t.TABLE_SCHEMA + '.' + t.TABLE_NAME
    ,AlterMe = 'ALTER INDEX ALL ON [' + t.TABLE_SCHEMA + '].[' + t.TABLE_NAME + '] REBUILD;'
INTO #Reindex_Tables
FROM INFORMATION_SCHEMA.TABLES t
WHERE TABLE_TYPE = 'BASE TABLE'

DECLARE @Iter INT
DECLARE @MaxIndex INT
DECLARE @ExecMe VARCHAR(MAX)

SET @Iter = 1
SET @MaxIndex = (SELECT COUNT(1) FROM #Reindex_Tables)

-- Use <= so the last table is not skipped
WHILE @Iter <= @MaxIndex
BEGIN
    SET @ExecMe = (SELECT AlterMe FROM #Reindex_Tables WHERE RowNum = @Iter)
    EXEC (@ExecMe)
    PRINT @ExecMe + ' Executed'
    SET @Iter = @Iter + 1
END

DROP TABLE #Reindex_Tables
[/cc]
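To confirm the rebuilds had an effect, average fragmentation can be checked before and after with the sys.dm_db_index_physical_stats DMV. A minimal sketch (the AdventureWorksLT2008 database name is carried over from the script above; the query is an illustration, not part of the original script):

[cc lang="sql"]
-- Sketch: report average fragmentation per index (LIMITED is the fastest scan mode)
SELECT OBJECT_NAME(ips.object_id) AS TableName
    ,i.name AS IndexName
    ,ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorksLT2008'), NULL, NULL, NULL, 'LIMITED') ips
JOIN sys.indexes i
    ON i.object_id = ips.object_id
    AND i.index_id = ips.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC
[/cc]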


using sp_addrolemember

To add a user or group to a role, use sp_addrolemember. This procedure is easy to use, as it accepts only two parameters: the role name and the user (or group) name. Roles provide a layer of abstraction so that permissions are not applied directly to users. While roles can be defined in two different places (the server level or the database level), the sp_addrolemember procedure only works at the database level. (To add users or groups to server-level roles, use sp_addsrvrolemember.) To add a user or group to a particular role, execute the following:

[cc lang="sql"]
USE YourDatabaseHere
GO
EXEC sp_addrolemember 'db_datareader', 'derekdieter'
[/cc]

This code adds my own account to the db_datareader role for my database. While this practice is relatively harmless, it is still far better to create a custom role, assign read permissions to that role, then use sp_addrolemember to add all the users or the group. As an example, let's create a sample role:

[cc lang="sql"]
CREATE ROLE ReadExecute AUTHORIZATION dbo
[/cc]

Now that we have created the empty role, let's grant it read and execute permissions on the entire 'dbo' schema:

[cc lang="sql"]
USE YourDatabaseHere
GO
GRANT EXECUTE ON SCHEMA::[dbo] TO [ReadExecute]
GO
GRANT SELECT ON SCHEMA::[dbo] TO [ReadExecute]
GO
[/cc]

Finally, we add the role members. Remember, you can […]
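As the section describes, the last step is adding members to the custom role with sp_addrolemember. A sketch using the ReadExecute role created above (the user and group names here are hypothetical placeholders):

[cc lang="sql"]
USE YourDatabaseHere
GO
-- Hypothetical database user and Windows group names
EXEC sp_addrolemember 'ReadExecute', 'derekdieter'
EXEC sp_addrolemember 'ReadExecute', 'DOMAIN\AppUsers'
[/cc]

Adding a Windows group this way means new hires only need to be added to the group, not to the database.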


How to Copy a Table

There are a few methods of copying a table to another database, depending on your situation.

Same SQL Server instance: If copying a table to a database on the same instance of SQL Server, the easiest solution is a SELECT INTO using fully qualified database names.

[cc lang="sql"]
SELECT *
INTO Database2.dbo.TargetTable
FROM Database1.dbo.SourceTable
[/cc]

This creates a copy of the entire table across two different databases and is definitely the quickest way, as it uses minimally logged inserts. The caveat is that it will not copy over the primary keys, foreign keys, or indexes; these will have to be scripted out and added after the insert.

Different SQL Server instance (and/or different server): If copying a table to a different instance (on the same server or not), there are two basic routes. OPENROWSET is an easy way to do the insert. Just replace the server name and instance name below (if the database is defined by an instance).

[cc lang="sql"]
SELECT *
INTO dbo.TargetTable
FROM OPENROWSET('SQLNCLI',
    'server=SourceServerName\SourceInstance;trusted_connection=yes',
    'SELECT * FROM SourceDatabase.dbo.SourceTable')
[/cc]

The other way is to utilize the SSIS Import/Export Wizard, which always works well too. To launch it, right-click on the database, select Tasks, then Import/Export.
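Since SELECT INTO does not carry over keys or indexes, one workable pattern is to copy the structure first, re-create the constraints by hand, and then load the rows. A sketch (the constraint and column names are hypothetical; identity columns and other details may need extra handling):

[cc lang="sql"]
-- Copy structure only: WHERE 1 = 0 returns no rows but keeps the column definitions
SELECT *
INTO Database2.dbo.TargetTable
FROM Database1.dbo.SourceTable
WHERE 1 = 0

-- Re-create the primary key (hypothetical names)
ALTER TABLE Database2.dbo.TargetTable
ADD CONSTRAINT PK_TargetTable PRIMARY KEY CLUSTERED (ID)

-- Then load the data
INSERT INTO Database2.dbo.TargetTable
SELECT * FROM Database1.dbo.SourceTable
[/cc]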


Check if Database Exists

When creating a database, you also need to check whether or not the database already exists. To do so, simply use IF EXISTS and select the name of the database from sysdatabases:

[cc lang="sql"]
IF NOT EXISTS (SELECT name FROM master.dbo.sysdatabases WHERE name = N'SQLServerPlanet')
    CREATE DATABASE [SQLServerPlanet]
[/cc]

The code below will drop an existing database if it exists, so be careful:

[cc lang="sql"]
IF EXISTS (SELECT name FROM master.dbo.sysdatabases WHERE name = N'SQLServerPlanet')
    DROP DATABASE [SQLServerPlanet]
[/cc]

An alternate method is to pass the database name to DB_ID and see if the result is null:

[cc lang="sql"]
IF DB_ID('SQLServerPlanet') IS NOT NULL
BEGIN
    SELECT 'database does exist'
END
ELSE
BEGIN
    SELECT 'database does not exist'
END
[/cc]
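On SQL Server 2005 and later, the sys.databases catalog view can be used in place of the older sysdatabases compatibility view; a sketch of the same check:

[cc lang="sql"]
IF NOT EXISTS (SELECT name FROM sys.databases WHERE name = N'SQLServerPlanet')
    CREATE DATABASE [SQLServerPlanet]
[/cc]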


How to Shrink Log File

Sometimes after one or more large transactions, the t-log (transaction log) will become full, and you may receive an error message indicating as much. To alleviate this issue, you need to find the names of the transaction logs on your system and then shrink them. To find the names, execute the following statement:

[cc lang="sql"]
SELECT name
FROM sys.database_files
WHERE type = 1
[/cc]

Once you’ve found the names of your transaction logs, use them in the DBCC SHRINKFILE command to make them smaller:

[cc lang="sql"]
DBCC SHRINKFILE('AdventureWorksLT2008_Log');
[/cc]

If your databases are in full recovery mode, this method will shrink the transaction logs while retaining full recovery mode. A riskier, yet higher-yielding, way to free disk space is to utilize:

[cc lang="sql"]
-- database name below
BACKUP LOG AdventureWorksLT2008 -- database name
WITH TRUNCATE_ONLY
-- t-log name below
DBCC SHRINKFILE (
    'AdventureWorksLT2008_Log' -- transaction log name
    ,1
)
[/cc]

While this command will shrink the transaction log to a greater degree, it drops the transactions recorded since the last t-log backup, essentially changing the recovery model of the database to simple as of the last transaction log backup. Depending upon your backup schedule, this may or may not be risky. If you back up the transaction logs every hour, then the maximum possible data loss (should a complete loss occur) would be one hour’s worth of data. While the chances of this are very slim, it should be […]
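Note that BACKUP LOG ... WITH TRUNCATE_ONLY was discontinued in SQL Server 2008. On later versions, a comparable (and equally risky) effect comes from temporarily switching the database to the SIMPLE recovery model before shrinking; a sketch using the same database and log names as above:

[cc lang="sql"]
-- Warning: this breaks the log backup chain; take a full backup afterward
ALTER DATABASE AdventureWorksLT2008 SET RECOVERY SIMPLE
DBCC SHRINKFILE ('AdventureWorksLT2008_Log', 1)
ALTER DATABASE AdventureWorksLT2008 SET RECOVERY FULL
[/cc]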
