A Simple Way to Migrate SQL Logins from One Instance to Another

March 19, 2012 at 6:39 AM | Steven Wang

Subject: Transfer SQL Logins with original passwords and SIDs

Scope: this technique applies to SQL Server 2008 and later.


When we upgrade or migrate a SQL Server instance to a new instance and there are many SQL logins, the migration process tends to get bogged down by two thorny issues:

  • SQL Login SIDs
  • SQL Login passwords

 

The SSIS Transfer Logins Task can easily transfer Windows logins and groups, but not SQL logins. When the Transfer Logins Task is used for SQL logins, it generates new passwords and new SIDs for the transferred logins, and the logins arrive disabled on the destination instance. That is not very useful in practice.

 

When a SQL login is created on the new instance, a new SID is generated and bound to the login. The database users in the migrated databases are then no longer mapped to the newly created logins, because they carry different SIDs (they become orphaned users).

Generally speaking, an orphaned SQL user can be remapped with Alter User ... With Login = [Login_Name], or with sp_change_users_login (a deprecated procedure; Alter User is preferred). However, when there are many SQL logins this technique becomes cumbersome. In particular, in some environments a production database is frequently restored to a test environment, so you end up fixing the SQL user mappings over and over.
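For a single orphaned user, the remap itself is a one-liner. A minimal sketch, assuming a database user and a login that are both named MyAppUser (the database and user names here are hypothetical):

Use MyUserDatabase;  --hypothetical database name
Go
--Remap the orphaned database user to the server login of the same name
Alter User [MyAppUser] With Login = [MyAppUser];
Go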

As for the SQL login passwords, it is also very awkward to retype every password when re-creating the logins on the new server instance.

Although the SQL login SIDs and password hashes are stored in the old instance's master database, before SQL Server 2008 there was no easy way to script out this binary information and transfer it to a new instance. A few very clever techniques exist in the SQL community, typically using a fairly complicated function to convert the binary data to a string and then convert it back to binary on the destination instance. From SQL Server 2008 onwards, those techniques are overkill.

In SQL Server 2008, the CONVERT function gained new style options for converting binary data to other data types (see Books Online for the details). The syntax is:

CONVERT ( data_type [ ( length ) ] , expression [ , style ] )

Example: 

CONVERT(varchar(max), 0xE4305FD31D353546B4EC6E56C906E912, 1)

 

When the expression is binary(n) or varbinary(n), three style options are available: 0 (the default), 1, and 2.

With style 0 (the default), the binary bytes are translated to ASCII characters, one character per byte.

With style 1, the binary bytes are translated to a hexadecimal character string, and the characters 0x are prefixed to the converted result.

With style 2, the binary bytes are translated to a hexadecimal character string without the 0x prefix.
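For example, converting the two bytes 0x4142 with each of the three styles gives:

Select CONVERT(varchar(max), 0x4142, 0) As Style0  --'AB': each byte is read as an ASCII character
	 , CONVERT(varchar(max), 0x4142, 1) As Style1  --'0x4142': hexadecimal string with the 0x prefix
	 , CONVERT(varchar(max), 0x4142, 2) As Style2  --'4142': hexadecimal string without the prefix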

SQL login SIDs and password hashes are both displayed with the 0x prefix, so we can use binary style 1 to script out the SIDs and hashed passwords from the old instance and apply them to the new one.

With the CONVERT function, the code to script out the SQL logins becomes very simple. An example script:

Select 
'Create Login ' + QUOTENAME(A.name) 
+ ' With Password=' + CONVERT(varchar(max), A.password_hash, 1) + ' hashed'		--script out the original password hash
+ ', DEFAULT_DATABASE=' + QUOTENAME(A.default_database_Name) --consider hard-coding master here if the script targets a mirrored or log-shipped server, where user databases are not in a usable state
+ ', DEFAULT_LANGUAGE=' + QUOTENAME(A.default_language_Name)
+ ', CHECK_POLICY=' + Case A.is_policy_checked When 0 Then 'OFF' When 1 Then 'ON' End 
+ ', CHECK_EXPIRATION=' + Case A.is_expiration_checked When 0 Then 'OFF' When 1 Then 'ON' End
+ ', SID=' + CONVERT(varchar(max), A.SID, 1)		--script out the original SID
 As SQLLogin
From 
sys.sql_logins A
Where A.name Not Like '##%##'  --exclude system-generated SQL logins
And A.sid != 0x01 --exclude sa; its SID is always the same

(Note: if you have problems copying and pasting the script into SSMS, paste it into Office Word first and then copy it from there into SSMS.)

You can use this script to store the generated CREATE LOGIN statements in a table on the target server, then use a cursor or loop to execute them, as sketched below, to transfer the SQL logins.
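A rough sketch of such a loop, assuming the generated statements were saved into a table named dbo.LoginScripts with a single column SQLLogin (both names are hypothetical):

Declare @SQLLogin nvarchar(max);

Declare Login_Cursor Cursor Fast_Forward For
	Select SQLLogin From dbo.LoginScripts;

Open Login_Cursor;
Fetch Next From Login_Cursor Into @SQLLogin;

While @@FETCH_STATUS = 0
Begin
	Exec (@SQLLogin);  --creates the login with its original password hash and SID
	Fetch Next From Login_Cursor Into @SQLLogin;
End

Close Login_Cursor;
Deallocate Login_Cursor;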

In the next blog post, I will talk in more depth about how to use an SSIS package to transfer logins and SQL Server permissions from one server to another.

Posted in: Backup | Database Administration | T-SQL


The Problem: the '##xp_cmdshell_proxy_account##' Credential Could Not Be Created

December 29, 2011 at 6:59 PM | Steven Wang

When a user that is not a member of the sysadmin fixed server role tries to run the xp_cmdshell command, we need to set up the ##xp_cmdshell_proxy_account## credential. If this proxy credential does not exist, xp_cmdshell fails.

We use this system stored procedure to set up the credential:

EXEC sp_xp_cmdshell_proxy_account 'DomainName\AccountName', 'Password';
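For completeness, a sketch of the full setup for a non-sysadmin caller; the login name ReportUser is a hypothetical example, and the domain account and password are placeholders as above:

Use master;
Go
--xp_cmdshell is off by default; enable it
Exec sp_configure 'show advanced options', 1;
Reconfigure;
Exec sp_configure 'xp_cmdshell', 1;
Reconfigure;
Go
--Create the proxy credential used when non-sysadmin callers run xp_cmdshell
Exec sp_xp_cmdshell_proxy_account 'DomainName\AccountName', 'Password';
Go
--The non-sysadmin login also needs a user in master with EXECUTE permission on xp_cmdshell
Create User [ReportUser] For Login [ReportUser];
Grant Exec On xp_cmdshell To [ReportUser];
Go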

When you try to set up the xp_cmdshell proxy account from SSMS on Windows Server 2008 R2, you might get the error message below:

 Msg 15137, Level 16, State 1, Procedure sp_xp_cmdshell_proxy_account, Line 1

An error occurred during the execution of sp_xp_cmdshell_proxy_account. Possible reasons: the provided account was invalid or the '##xp_cmdshell_proxy_account##' credential could not be created. Error code: '5'.

This error message does not really tell you the cause of the problem: you might be using a valid Windows account and password and still be unable to create the credential.

In my experience, the problem comes from Windows Server UAC. Because the account under which you are running SSMS does not have permission to create the credential, you need to run SSMS as Administrator, as below:

After starting SSMS as Administrator, I reran the same stored procedure and it succeeded. I hope this helps you as well.

Posted in: Database Administration


Bringing ETL to the Masses - Microsoft Codename "Data Explorer"

October 25, 2011 at 3:32 PM | Steven Wang

One of the new projects Microsoft announced at this year's PASS Summit, Data Explorer, is very interesting and may well have a big impact on the way business users and information workers perform BI analysis.

According to the project lead, Tim Mallalieu, the project was originally named Montego (which sounds more like a usual codename). At the PASS Summit the cloud version of the project was presented, but according to Tim there is also a client version of Data Explorer that runs as an Excel add-in, operating in much the same way as Powerpivot. From the first look at the interface, it used the name Powerimport. (Please note that the screenshots used here are from the early Montego project, not the new Data Explorer interface. Perhaps in November 2011 there will be a new demo available.)

 

 

After I watched Tim's walkthrough of the Montego client - the mashup component of Data Explorer - I was very impressed with this project. Basically, this Excel add-in can integrate different data sources: cloud databases, RSS feeds, websites, Excel workbooks, SQL databases and so on. You can perform data transformations on each data source (table), and the ribbon-style transformation tools are very easy to understand and use:


The workflow and the concept look very easy to grasp for experienced Excel users. The data sources (tables) can easily be merged, nested, transposed and pivoted:

Data manipulations such as de-duplication, grouping and aggregation can also be performed:

 

Once you complete the data mashup, it can be pushed back to Excel in much the same way as you do with Powerpivot.

To me, the whole Montego mashup process is like a greatly simplified, easy-to-use yet very powerful version of an SSIS data flow task. While using SSIS effectively requires a steep learning curve, this mashup is very easy to use and understand.

According to Tim, a mashup can be reused and automatically refreshed from its data sources. They are also working on integrating the tool with Powerpivot and the newly announced Power View (previously known as Crescent).

Given that Excel is still the most popular end-user BI tool, I believe the new tool will indeed Bring ETL to the Masses!

Looking forward to the trial version.

You may take some time to watch Tim's Demo video: Early walkthrough of the Montego client

Posted in: Data Explorer


Using Trace Flag 3042 for Backup Compression

October 20, 2011 at 7:56 AM | Steven Wang

While we were switching from Redgate SQL Backup to the SQL Server 2008 R2 native backup compression, we noticed a huge backup file size increase at the beginning of the backup, which nearly filled up our backup disk volume.

The database we were backing up is close to 2 terabytes, and with Redgate SQL Backup compression the final backup size was roughly 376 GB, as below:

When we used the SQL Server native backup with compression:

 

Backup database MyDB to disk = N'I:\Backups\CompressionBackup_Without_TF.bak' With init, compression

 

the initial file size went up to 638 GB, which is roughly one third of the original database size, as below:

This is a huge 262 GB increase in backup file size, which completely threw off our backup disk capacity plan based on the Redgate compression ratio.

Surprisingly, once the backup completed, the file shrank back to around 380 GB, a similar size to the Redgate SQL Backup compressed file.

This is in fact the default behaviour of SQL Server native backup compression. For a compressed backup, it is hard for SQL Server to determine what the final size of the backup will be, so it simply creates an initial backup file of 1/3 of the database size (the pre-allocation algorithm). If more disk space is needed during the backup, it dynamically extends the file. If the final backup size is smaller than the initial 1/3 of the database size, it truncates the file back to the actual size.

This was not good enough for us because, as mentioned above, we had already planned our capacity based on Redgate SQL Backup. At some point we might not have enough disk space to hold the initial backup file. We really wanted the backup to use only the actual space required for the compressed backup.

This issue was solved by introducing trace flag 3042. Trace flag 3042 bypasses the pre-allocation algorithm and grows the file as needed, with a small performance penalty.

This trace flag was not officially documented, but a few months ago it was acknowledged by Microsoft Support; see the article Space requirements for backup devices in SQL Server.

With the trace flag on, you can see that the initial backup file size is 0 and the file grows up to its final size.

Backup compression with trace flag 3042 on:

 

dbcc traceon(3042)
Go

Backup database BiW to disk = N'I:\Backups\CompressionBackup_With_TF.bak' With init, compression
Go

dbcc traceoff(3042)
Go

 

The initial file size was 0 as below:

You can turn on this trace flag globally by adding -T3042 to the SQL Server startup parameters.
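Alternatively, a sketch of turning the flag on and checking it at runtime, without a restart (the -1 argument applies it to all sessions):

dbcc traceon(3042, -1)      --enable the trace flag globally
dbcc tracestatus(3042, -1)  --confirm it is enabled globally
--dbcc traceoff(3042, -1)   --turn it back off when it is no longer needed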

Posted in: Backup | Trace Flag


The Use of Denali 'With Result Sets' Feature in SSIS Data Flow Task

October 15, 2011 at 3:58 PM | Steven Wang

The new T-SQL feature 'With Result Sets' in SQL Server Denali can be very useful when using a stored procedure as the data source of an SSIS data flow task.

Before Denali, there were some limitations when using a stored procedure as a data flow task data source in SSIS, such as:

1. If you want the data source to use different column names, you have to either change the stored procedure or add a Derived Column component; neither is convenient.
2. The same applies to data type changes: if you want to change a column's data type, you again have to either change the stored procedure or add a Derived Column component.
3. If your stored procedure ends with dynamic SQL execution, such as Exec (@SQL) or sp_executesql @SQL, SSIS is not able to resolve the result set columns. A common work-around is to define a table variable inside the stored procedure, insert the dynamically executed result into it, and then select from it (see the sketch after this list). This is very awkward, and performance is poor when the data set is big.
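For reference, a minimal sketch of that old work-around; the query, table and column names here are hypothetical, not taken from a real procedure:

--Inside the stored procedure, before Denali:
Declare @SQL varchar(max);
Set @SQL = 'Select Col1, Col2 From dbo.SomeTable';  --hypothetical dynamic query

Declare @Result Table (Col1 int, Col2 varchar(50));  --must mirror the dynamic query's output

Insert Into @Result
Exec (@SQL);  --INSERT ... EXEC materialises the dynamic result set

Select Col1, Col2 From @Result;  --SSIS can now resolve the columns from this final Select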

 
The new T-SQL feature 'With Result Sets' solves these problems.

The syntax of With Result Sets is simple, as shown below. (For more information on the T-SQL EXECUTE statement, see Books Online.)
 
Exec Stored_Proc_Name @Variables
With Result Sets
(
	(
	Col1 Data_type,
	Col2 Data_type,
	...  --list all of the result set columns here
	)
);
 
Here is a simple example demonstrating this functionality as an SSIS data flow task data source, solving problem 3 above.

1. Create a simple SP
USE [AdventureWorksDWDenali]  --Denali sample DW database, free download
GO
CREATE Proc [dbo].[usp_UseWithResultSet_Dynamic_SQL_Denali]
As
Begin
Declare @SQL varchar(Max)

Set @SQL = '
SELECT [GeographyKey]
,[City]
,[StateProvinceName]
,[EnglishCountryRegionName]
FROM [dbo].[DimGeography]
'
Exec (@SQL);

End
GO

2. Create an SSIS data flow task and use Exec [dbo].[usp_UseWithResultSet_Dynamic_SQL_Denali] as the data source (SQL Command). When you click Preview, you get an error like the one below:

 

3. By using With Result Sets, you can rename the columns and change their data types (I changed the columns from nvarchar to varchar), and best of all, there is no error when using this SP as the data source. See below:
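A sketch of what the SQL Command text looks like at this step; the new column names and varchar lengths below are my own choices for illustration, not necessarily the ones shown in the screenshots:

Exec [dbo].[usp_UseWithResultSet_Dynamic_SQL_Denali]
With Result Sets
(
	(
	GeographyKey int,
	City varchar(30),
	StateProvince varchar(50),
	CountryRegion varchar(50)
	)
);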

 

4. Terminate with a simple Union All component and run. It succeeds:

Posted in: SQL Server 2012 | SSIS | T-SQL


Minimal Logging & Data Manoeuvring on Very Large Tables

August 28, 2011 at 10:53 AM | Steven Wang

Thanks to everyone who attended my session at the Auckland MS CodeCamp Summit 2011.

Please see the attached slides and code presented in the session.

Minimal_logging_maneuvering_data.pptx (2.50 mb)

01_Full_vs_Minimal.sql (3.02 kb)

02_Heap_Tablock_On_Off.sql (1.09 kb)

03_Clustered_Tablock_On_Off.sql (1.21 kb)

04_Clustered_trace610.sql (1.46 kb)

Posted in: Data Load | Data warehouse | Transaction Log

Tags: , , ,