Script out Indexes and Index Option Settings for a SQL Server Database

December 11, 2012 at 2:58 PM by Steven Wang

Sometimes we need to script out index definitions and their option settings. If you have source control, you are lucky, as all index definitions should already be kept in your source control system. If you only want to script out the indexes for a single table, you can also do so by selecting all of the indexes in Object Explorer Details. But a query that can easily script out the index definitions and their settings makes testing and index creation automation much easier.

In SQL Server, the index definition and its option settings are not stored in a single table; you have to join a few system tables and DMVs together to get all of that information. Here I will share the script I use to script out all the indexes.

My considerations:

1. It only applies to user-defined indexes;

2. Clustered indexes created by a Primary Key constraint are not included, although that could be done;

3. Spatial indexes and XML indexes are not considered;

4. The data compression setting for partitioned indexes is simplified here, as handling it fully within a single select statement is fairly complicated;

5. The Drop_Existing and Sort_In_TempDB settings are hard-coded in the script; change them to suit your situation.

The script is as below:

Select		A.[object_id]
			, OBJECT_NAME(A.[object_id]) AS Table_Name
			, A.Index_ID
			, A.[Name] As Index_Name
			, CAST(
					Case	When A.type = 1 AND is_unique = 1 Then 'Create Unique Clustered Index ' 
							When A.type = 1 AND is_unique = 0 Then 'Create Clustered Index ' 
							When A.type = 2 AND is_unique = 1 Then 'Create Unique NonClustered Index '
							When A.type = 2 AND is_unique = 0 Then 'Create NonClustered Index '
					End
					+ quotename(A.[Name]) + ' On ' + quotename(S.name) + '.' + quotename(OBJECT_NAME(A.[object_id])) + ' ('
					+ Stuff(
							(
								Select
										',[' + COL_NAME(A.[object_id],C.column_id)
										+ Case When C.is_descending_key = 1 Then '] Desc' Else '] Asc' End
								From	sys.index_columns C WITH (NOLOCK)
								Where	A.[Object_ID] = C.object_id
										And A.Index_ID = C.Index_ID
										And C.is_included_column = 0
								Order by C.key_Ordinal Asc
								For XML Path('')
							)
							,1,1,'') + ') '

					+ CASE WHEN A.type = 1 THEN ''
							ELSE Coalesce('Include ('
									+ Stuff(
											(
												Select
														',' + QuoteName(COL_NAME(A.[object_id],C.column_id))
												From	sys.index_columns C WITH (NOLOCK)
												Where	A.[Object_ID] = C.object_id
														And A.Index_ID = C.Index_ID
														And C.is_included_column = 1
												Order by C.index_column_id Asc
												For XML Path('')
											)
											,1,1,'') + ') '
						,'') End
					+ Case When A.has_filter = 1 Then 'Where ' + A.filter_definition Else '' End
					+ ' With (Drop_Existing = OFF, SORT_IN_TEMPDB = ON'
					--If the same index already exists, you should set Drop_Existing = ON instead.
					--SORT_IN_TEMPDB = ON is generally recommended, but adjust it for your own environment.
					+ ', Fillfactor = ' + Cast(Case When fill_factor = 0 Then 100 Else fill_factor End As varchar(3)) 
					+ Case When A.[is_padded] = 1 Then ', PAD_INDEX = ON' Else ', PAD_INDEX = OFF' END
					+ Case When D.[no_recompute] = 1 Then ', STATISTICS_NORECOMPUTE  = ON' Else ', STATISTICS_NORECOMPUTE  = OFF' End                  
					+ Case When A.[ignore_dup_key] = 1 Then ', IGNORE_DUP_KEY = ON' Else ', IGNORE_DUP_KEY = OFF' End
					+ Case When A.[ALLOW_ROW_LOCKS] = 1 Then ', ALLOW_ROW_LOCKS = ON' Else ', ALLOW_ROW_LOCKS = OFF' END
					+ Case When A.[ALLOW_PAGE_LOCKS] = 1 Then ', ALLOW_PAGE_LOCKS = ON' Else ', ALLOW_PAGE_LOCKS = OFF' End  
					+ Case When P.[data_compression] = 0 Then ', DATA_COMPRESSION = NONE' 
							When P.[data_compression] = 1 Then ', DATA_COMPRESSION = ROW' 
							Else ', DATA_COMPRESSION = PAGE' End                 
					+ ') On ' 
					+ Case when C.type = 'FG' THEN quotename(C.name) 
							ELSE quotename(C.name) + '(' + F.Partition_Column + ')' END + ';' --if it uses partition scheme then need partition column
			As nvarchar(Max)) As Index_Create_Statement
			, C.name AS FileGroupName
			, 'DROP INDEX ' + quotename(A.[Name]) + ' On ' + quotename(S.name) + '.' + quotename(OBJECT_NAME(A.[object_id])) + ';' AS Index_Drop_Statement
From		SYS.Indexes A WITH (NOLOCK)
				INNER JOIN
			sys.objects B WITH (NOLOCK)
				ON	A.object_id = B.object_id
				INNER JOIN 
			SYS.schemas S
				ON	B.schema_id = S.schema_id            
				INNER JOIN
			SYS.data_spaces C WITH (NOLOCK)
				ON	A.data_space_id =   C.data_space_id   
				INNER JOIN
			SYS.stats D WITH (NOLOCK)
				ON	A.object_id = D.object_id
					AND A.index_id = D.stats_id  
				Inner Join
			--The derived table below finds out which data compression type is used by the index. If an index is not partitioned, this is easy, as only one data compression
			--type can be used. If the index is partitioned, each partition can be configured with a different data compression type. This is hard to generalize,
			--so for simplicity I use the data compression type used by most of the index's partitions for all partitions. You can later rebuild individual index partitions
			--with the appropriate data compression type you want to use.
			(
				select object_id, index_id, Data_Compression, ROW_NUMBER() Over(Partition By object_id, index_id Order by COUNT(*) Desc) As Main_Compression
				From sys.partitions WITH (NOLOCK)
				Group BY object_id, index_id, Data_Compression
			)	P
				ON	A.object_id = P.object_id
					AND A.index_id = P.index_id
					AND P.Main_Compression = 1
				Outer APPLY
			(
				SELECT	COL_NAME(A.object_id, E.column_id) AS Partition_Column
				From	sys.index_columns E WITH (NOLOCK)
				WHERE	E.object_id = A.object_id
						AND E.index_id = A.index_id
						AND E.partition_ordinal = 1
			) F     
Where		A.type IN (1,2) --clustered and nonclustered
			AND B.Type != 'S'
			AND is_primary_key = 0 --this is not for primary key constraint
			AND OBJECT_NAME(A.[object_id]) not like 'queue_messages_%'
			AND OBJECT_NAME(A.[object_id]) not like 'filestream_tombstone_%'
			AND OBJECT_NAME(A.[object_id]) not like 'sys%' --remove this filter if you have user indexes whose names start with sys
OPTION (Recompile);
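
To give an idea of what the Index_Create_Statement column produces, below is an example of the kind of statement the query generates. The index, table and column names are invented (AdventureWorks-style) purely for illustration, and line breaks have been added for readability; the actual output is a single string.

Create NonClustered Index [IX_SalesOrderDetail_ProductID] On [Sales].[SalesOrderDetail] ([ProductID] Asc)
Include ([OrderQty],[UnitPrice])
 With (Drop_Existing = OFF, SORT_IN_TEMPDB = ON, Fillfactor = 100, PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF,
       IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, DATA_COMPRESSION = NONE)
 On [PRIMARY];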

 

I have also attached a SQL script file here:

Scriptout_Indexes.sql (4.95 kb)

 

 

 

Using SQL Server 2012 Window Functions to Solve Common T-SQL Challenges

September 10, 2012 at 1:55 AM by Steven Wang

Thanks to all who attended my session at TechEd New Zealand 2012. I hope you enjoyed the session. I have attached the presentation slide deck and demo code to this blog post.

Session Abstract:

SQL Server 2012 has introduced several new window functions and enhanced support for window aggregate functions by introducing window order and frame clauses, as well as support for offset functions. In this session, the presenter will apply these new functions to solve some of the most frequently asked questions in the MSDN T-SQL forum. If you have ever faced the challenge of calculating a moving average, identifying gaps between records, combining consecutive and/or overlapping records, or calculating running totals, then this session is for you.
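
As a small illustration of what the new window order and frame clauses enable, the sketch below computes a per-customer running total with a window aggregate; the table and column names are invented for the example.

Select	CustomerID
		, OrderDate
		, OrderAmount
		, Sum(OrderAmount) Over	(	Partition By CustomerID
									Order By OrderDate
									Rows Between Unbounded Preceding And Current Row
								) As Running_Total
From	dbo.Orders;	--hypothetical table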

 

Session Video:

Using SQL Server 2012 Window Functions 

Session Slides:

DBI309_Using_Window_Functions.pptx (1.79 mb)

Demo Code and DB:

TechEdNZ2012.bak (5.27 mb)

TechEdNZ_2012_Demo.zip (258.64 kb)

 

 

Use 4 Different Ways to Monitor Database Log Size and its Usage

April 28, 2012 at 3:02 AM by Steven Wang

In SQL Server, there are often several ways to achieve the same goal. This is also true for monitoring the database log size and its usage.

1. DBCC SQLPERF(LOGSPACE)

Typically, we use DBCC SQLPERF(LOGSPACE) to monitor the log size and the percentage of log space used for all databases. This command is very simple and straightforward to use. However, if you want to use the output of the DBCC command programmatically, there is a little bit of work to do. The script below is one example that gives you the log space information programmatically using DBCC SQLPERF(LOGSPACE):

Declare @LogUsage Table (
							Database_Name sysname
							, Log_Size_MB Float
							, Log_Used_Percent Decimal(8,5)
							, Status tinyint
						);

Insert Into @LogUsage
Exec ('DBCC SQLPERF (LOGSPACE);');

Select	Database_Name
		, Log_Size_MB
		, Log_Size_MB * Log_Used_Percent / 100 As Log_Used_Size_MB
		, Log_Used_Percent
From	@LogUsage

2. sys.dm_os_performance_counters  

The second, less commonly used, technique for monitoring log space usage is the sys.dm_os_performance_counters dynamic management view. This view records the log file size (and also the data file size) as counters for all databases. As I am only concerned with the log file size here, I use the script below to monitor the log space usage:

Select		Database_Name
			, [Log File(s) Size (KB)] / 1024.0 As Log_Size_MB
			, [Log File(s) Used Size (KB)] / 1024.0 As Log_Used_Size_MB
			,  ([Log File(s) Used Size (KB)] * 1.0 / [Log File(s) Size (KB)]) * 100 As Log_Used_Percent
From
(
select	instance_Name as Database_Name, counter_name, cntr_value as Log_Size
From	sys.dm_os_performance_counters
Where	counter_name in		(
								'Log File(s) Size (KB)'
								, 'Log File(s) Used Size (KB)'
							)     
			And
		instance_name not in	(
									'mssqlsystemresource'
									, '_Total'
								)  
) X
Pivot	(	Max(Log_Size)
			For Counter_Name 
			In
				(
					[Log File(s) Size (KB)]
					, [Log File(s) Used Size (KB)]
				)  
		) myPiv

If you are only concerned with one database, you can filter on the database name in instance_name. Although I calculated the log used percentage myself, there is in fact a counter named 'Percent Log Used' in sys.dm_os_performance_counters; however, its value has no decimal places.

 3. sys.database_files

The sys.database_files catalog view has information about the data files and the log file of the currently referenced database. This differs from the preceding two methods, which return the log information for all databases. As there is no space-used information in this view, we need to use the FILEPROPERTY function to get the log space used value, as below:

Select	DB_Name()
		, size / 128.0 As Log_Size_MB
		, FILEPROPERTY(name, 'spaceused') / 128.0 As Log_Used_Size_MB
		, (FILEPROPERTY(name, 'spaceused') * 1.0 / size) * 100 As Log_Used_Percent
From	sys.database_files
Where	type_desc = 'LOG'

Although this script is for log space usage only, it can also be used to monitor data file sizes by removing the where clause.

4. sys.dm_db_log_space_usage (SQL 2012 only)

This dynamic management view is not even documented in BOL, but it is a very simple view. As the view returns the log size in bytes, it is more useful to convert the bytes to MB, as below:

Select	DB_Name(Database_id)
		, total_log_size_in_bytes / 1024 / 1024.0 As Log_Size_MB
		, used_log_space_in_bytes / 1024 / 1024.0 As Log_Used_Size_MB
		, used_log_space_in_percent AS Log_Used_Percent
From		sys.dm_db_log_space_usage

Given how simple this view is, I believe it will be easily adopted. However, it only returns the log space statistics for the current database.

In SQL Server 2012, there is indeed another DMV added for data file space usage as well: sys.dm_db_file_space_usage.
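
For completeness, a minimal sketch of querying that DMV is below, converting 8 KB pages to MB; I am assuming the total_page_count and allocated_extent_page_count columns that were added in SQL Server 2012.

Select	DB_Name(database_id) As Database_Name
		, file_id
		, total_page_count * 8 / 1024.0 As File_Size_MB				--pages are 8 KB each
		, allocated_extent_page_count * 8 / 1024.0 As Allocated_MB
From	sys.dm_db_file_space_usage;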

 

 

Automatically Transfer and Synchronize SQL Server logins, SIDs, Passwords and Permissions

April 5, 2012 at 3:54 AM by Steven Wang

Subject: transfer and synchronize SQL Server logins, SIDs, SQL login passwords and permissions from one server instance to another by using an SSIS package

This technique applies to: SQL Server migration, and primary/secondary server login synchronization, including SID, password and permission transfer. It is applicable to SQL Server 2008 and later.

  

1. Real world Issues

Too often, when a principal server fails over to its mirror server, the SQL Server logins are found not to be fully in sync with the principal server. The same applies to log-shipping servers, replicated sites and other DR scenarios, such as backing up a database on one server and restoring it on another.

As I mentioned in my last post, A Simple Way to Migrate SQL Logins from One Instance to Another, it is not convenient to transfer server logins from one server to another.

2. Login Synchronization Consideration

It would be nice to have an automated process or script that transfers logins from one server to another, including SIDs, passwords and permissions.

Typically, to transfer SQL Server logins from one server to another automatically, the following needs to be considered:

A) Login SIDs (either Windows logins or SQL logins) that exist in the destination server but not in the source: these logins need to be dropped from the destination server.

B) SQL logins that have different passwords between the source server and the destination server: the SQL logins in the destination server need to be altered.

C) Login names in the destination server that differ from the source server: these need to be altered to the same names as in the source.

D) Logins in the source server but not in the destination server: these logins need to be synchronized from the source to the destination server.

E) Logins with a different is_disabled value between source and destination: the destination logins need to be altered.

F) Logins' server role assignments need to be synchronized from source to destination.

G) Login permissions need to be synchronized from source to destination.

H) There are other settings to consider, such as the default language, but those mentioned above are the most important ones.

I have transformed these requirements into the SSIS tasks as below:

A) Drop logins in destination where SIDs are different

B) Drop logins in destination where passwords are different (Drop is simpler than alter.)

C) Drop logins in destination where names are different (Drop is simpler than alter)

D) Check and Sync Logins

E) Alter logins in destination where Is_Disabled are different

F) Check and Sync role members

G) Check and sync permissions

The SSIS package layout is as below:

 3. SSIS Package design

The package contains 7 containers, and each container contains 3 tasks: a Data Flow task, a ForEach Loop task, and an Execute SQL task.

The general idea is to use the Data Flow task to generate the SQL script and store the query text in a Recordset destination, which is assigned to an object variable, Query_Dataset; the ForEach Loop task uses an ADO enumerator to loop through the query script records in the Query_Dataset variable and assign each one to a string variable, SQL_Query; the Execute SQL task then runs the string query variable to apply the change.

The variables have been set up as below:

 A) Drop logins in destination where SIDs are different Task Design

The data flow task Get login with SIDs diff has the design below:

This uses a Merge Join to find any SIDs that are in the destination but not in the source. The Conditional Split is used to split out any NULL SID values from the login source in the Merge Join component.

The derived column component is used to generate the DDL query like: “Drop Login MyLoginName;”

The query is stored in the Recordset destination where the Query_Dataset variable is configured.

 

B) Drop logins in destination where passwords are different task design

The data flow task Get Login with Password diff is depicted below:

C) Drop logins in destination where names are different Task Design

A login with the same SID on the source and destination servers could have a different name if the login name on the source server has been altered. Therefore, these logins need to be synced to the destination. To simplify the process, I simply drop those logins in the destination and re-import them from the source later.

The data flow task: Get login with Name diff is depicted as below:

 

D) Check and Sync Logins task design

In this task I use the SQL script I created in my last post to generate the Create Login DDL queries.

The query will script out the source login SIDs and also passwords if they are SQL logins.

The data flow task: get server login diff task is depicted as below:

 

The script scripts out both Windows logins and SQL logins, but leaves out type R = server role, C = login mapped to a certificate, and K = login mapped to an asymmetric key. In SQL Server 2008, server roles are fixed, and the C and K types are more complicated; I just want to keep things simple here.

The script in Login source is as below:

Select Upper(A.name) As Name,
Case When A.type IN ('G', 'U')
Then
'Create Login ' + QUOTENAME(A.name) 
+ ' From Windows With DEFAULT_DATABASE=[Master]' --+ quotename(A.default_database_Name)
+ ', DEFAULT_LANGUAGE=' + quotename(A.default_language_Name)
When A.type IN ('S')
Then
'Create Login ' + QUOTENAME(A.name) 
+ ' With Password=' + CONVERT(varchar(max), B.password_hash, 1) + ' hashed'
+ ', DEFAULT_DATABASE=[Master]' --+ quotename(B.default_database_Name)
+ ', DEFAULT_LANGUAGE=' + quotename(B.default_language_Name)
+ ', CHECK_POLICY=' + Case B.is_policy_checked when 0 then 'OFF' When 1 Then 'On' End 
+ ', CHECK_EXPIRATION=' + Case B.is_expiration_checked when 0 then 'OFF' When 1 Then 'On' End
+ ', SID=' + CONVERT(varchar(max), A.SID, 1)
End As SRVLogin
From sys.server_principals A
Left Outer Join
sys.sql_logins B
On A.name = B.Name
Where A.type  Not In ('R', 'C', 'K')
And A.name Not like '##%##'
And A.sid != 0x01

 E) Alter logins in destination where Is_Disabled are different

This task is used to keep the is_disabled value synced between source and destination. The data flow task: get login with IsDisable diff is depicted as below:

F) Check and Sync role members task design

This task is used to sync the server role members between the source and destination. I get both the role names and their members from the source and destination servers. A full join component is used to find out which members of a server role are in the source but not in the destination, and vice versa. For logins that are in a server role in the source but not in the destination, an sp_addsrvrolemember script is created. Conversely, for logins that are in a server role in the destination but not in the source, an sp_dropsrvrolemember script is created.

Both scripts are combined and saved into the Query_Dataset variable. The data flow task Check role Member is depicted below:
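
The exact source-side query behind this comparison is only shown in the package screenshots; below is a sketch of how such a query might look, using sys.server_role_members and sys.server_principals to generate sp_addsrvrolemember statements. It is an illustrative assumption, not the exact query used in the package.

--List each server role and its members on the source server and
--generate an sp_addsrvrolemember statement for each membership.
Select	R.name As Role_Name
		, M.name As Member_Name
		, 'Exec sp_addsrvrolemember @loginame = ' + QuoteName(M.name, '''')
			+ ', @rolename = ' + QuoteName(R.name, '''') + ';' As Add_Member_Script
From	sys.server_role_members RM
			Inner Join
		sys.server_principals R
			On RM.role_principal_id = R.principal_id
			Inner Join
		sys.server_principals M
			On RM.member_principal_id = M.principal_id
Where	M.type Not In ('R', 'C', 'K')
		And M.name Not Like '##%##';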

 

G) Check and sync permissions task design

This task is used to sync the login permissions from source to destination. The query below is used to script out the login permissions:

Select distinct
A.name
, B.Permission_Name
, B.state
, B.state_desc
From sys.server_principals A
Inner Join
sys.server_permissions B
On 
A.principal_id = B.grantee_principal_id
Where A.type  Not In ('R', 'C', 'K')
And A.name Not like '##%##'
And A.sid != 0x01

A full join is used to find out which permissions are applied in the source for a login but not in the destination, and vice versa.

Permissions that are applied in the source for a login but not in the destination go to the Permission_Applied route, and the expression below is used to create the permission grant script:

(DT_WSTR,2000)([state_desc source] + " " + [Permission_Name source] + " TO [" + [name source] + ([state source] == "W" ? "] WITH GRANT OPTION;" : "];"))

Permissions that are applied in the destination for a login but not in the source go to the Permission_Removed route, and the expression below is used to create the permission revoke script:

(DT_WSTR,2000)("REVOKE " + [Permission_Name dest] + " TO [" + [name dest] + "];")

The data flow task: Check Permissions task is depicted as below:

 

4. SSIS Package Automation

A SQL Agent Integration Services job can be used to automate the login synchronization process. The procedure is to create a job that runs the package.

For example, a job name can be: Apply Login Changes from Source to Destination.

And remember to set your source and destination server variable values to your server names, as below:

 

Once the job is set up, you can schedule it to run automatically at your preferred time intervals.

You can also create a server-level DDL trigger to automatically run the job whenever a DDL command is performed on logins. An example of such a trigger is:

Create Trigger trigger_ApplyLoginsToDestination
On All Server
After Create_Login, Alter_Login, Drop_Login
As
Exec msdb..sp_start_job 'Apply Login Changes from Source to Destination' --change to your own job name
Print 'You have synced your login changes from source to destination server.'

You can download this package from the attachment in this post.

Thanks for reading.

 

SQLServer_Login_Sync_Blog.zip (47.20 kb)

A simple Way to Migrate SQL Logins from One Instance to Another

March 18, 2012 at 7:39 PM by Steven Wang

Subject: transfer SQL logins with their original passwords and SIDs

Scope: this technique applies to SQL Server 2008 and later.


When we upgrade or migrate a SQL Server instance to a new instance, if we have lots of SQL logins the migration process can get bogged down in 2 thorny issues:

  • SQL Login SIDs
  • SQL Login passwords

 

The SSIS Transfer Logins Task can easily transfer Windows logins and groups, but not SQL logins. When the SSIS Transfer Logins Task is used to transfer SQL logins, it generates new passwords and SIDs for the transferred SQL logins, and these logins are disabled in the destination SQL Server instance. This is not very useful in the real world.

 

When a SQL login is created in the new SQL Server instance, a new SID is created and bound to the login. In this case, the database user in the migrated database is no longer mapped to the newly created SQL login, as they now have different SIDs.

Generally speaking, the SQL user can be remapped by using ALTER USER ... WITH LOGIN = [login_Name] or sp_change_users_login (a deprecated procedure; it is better to use ALTER USER). However, when there are lots of SQL logins this technique becomes cumbersome. In particular, in some environments a production database might frequently be restored to a test environment, and then you need to fix the SQL user mappings every time.

For the SQL login password, it is also very awkward to have to retype the password when creating the new SQL login on the new server instance.

Although the SQL login SIDs and passwords are stored in the old instance's master database, before SQL Server 2008 there was no easy way to script out this binary information and transfer it to a new instance. There are a few very clever techniques available in the SQL community, mainly using a complicated function to convert the binary data into string form and then convert it back to binary data on the destination instance. But in SQL Server 2008 and later, these techniques become overkill.

In SQL Server 2008, the CONVERT function added a new feature for converting the binary data type to other data types (check Books Online for more details). The syntax is as below:

CONVERT ( data_type [ ( length ) ] , expression [ , style ] )

Example: 

CONVERT(varchar(max), 0xE4305FD31D353546B4EC6E56C906E912, 1)

 

When the expression is binary(n) or varbinary(n), three style options are available: 0 (the default), 1, and 2.

With style 0 (the default), binary bytes are translated to ASCII characters; each byte is converted 1:1.

With style 1, binary bytes are translated to a character string and the characters 0x are added to the left of the converted result.

With style 2, binary bytes are translated to a character string and the 0x prefix is not used.
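
For instance, a quick way to see the difference between the three styles (the two-byte input value here is just an arbitrary example):

Select	CONVERT(varchar(max), 0x4142, 0) As Style_0		--returns 'AB' (each byte read as a character)
		, CONVERT(varchar(max), 0x4142, 1) As Style_1	--returns '0x4142'
		, CONVERT(varchar(max), 0x4142, 2) As Style_2	--returns '4142'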

SQL login SIDs and password_hash values both use the 0x prefix, so we can use binary style 1 to script out the SIDs and hashed passwords from the old server instance and apply them to the new instance.

The code to script out the SQL logins with the CONVERT function becomes very simple; an example script is below:

Select 
'Create Login ' + QUOTENAME(A.name) 
+ ' With Password=' + CONVERT(varchar(max), A.password_hash, 1) + ' hashed'		--script out the password
+ ', DEFAULT_DATABASE=' + quotename(A.default_database_Name) --if you apply the script to a mirrored or log-shipped server, use the master database, as the user database is not in a usable state
+ ', DEFAULT_LANGUAGE=' + quotename(A.default_language_Name)
+ ', CHECK_POLICY=' + Case A.is_policy_checked when 0 then 'OFF' When 1 Then 'On' End 
+ ', CHECK_EXPIRATION=' + Case A.is_expiration_checked when 0 then 'OFF' When 1 Then 'On' End
+ ', SID=' + CONVERT(varchar(max), A.SID, 1)		--script out the SIDs
 As SQLLogin
From 
sys.sql_logins A
Where A.name Not like '##%##'  --remove those system generated sql logins
And A.sid != 0x01 --SA sid is always same

(Note: if you have problems copying and pasting the script into SSMS, copy it and paste it into Office Word first, and then paste it into SSMS.)

You can use this script to store the SQL login creation statements in a table on the target server, and then use a cursor or loop to execute those statements to transfer the SQL logins.
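
As a minimal sketch of that loop, the code below assumes the generated statements have already been loaded into a staging table named dbo.Login_Scripts (a hypothetical name) with a column SQLLogin holding each Create Login statement:

Declare @SQL nvarchar(max);

Declare Login_Cursor Cursor Local Fast_Forward For
	Select SQLLogin From dbo.Login_Scripts;		--hypothetical staging table

Open Login_Cursor;
Fetch Next From Login_Cursor Into @SQL;

While @@FETCH_STATUS = 0
Begin
	Exec sp_executesql @SQL;	--create the login on the target instance
	Fetch Next From Login_Cursor Into @SQL;
End

Close Login_Cursor;
Deallocate Login_Cursor;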

In the next blog, I will talk in more depth about how to use an SSIS package to transfer the logins and SQL Server permissions from one server to another.

The Use of Denali 'With Result Sets' Feature in SSIS Data Flow Task

October 15, 2011 at 4:58 AM by Steven Wang

The new T-SQL feature 'With Result Sets' in SQL Denali can be very useful when using a stored procedure as the data source of an SSIS Data Flow task.

Before SQL Denali, there were some limitations to using a stored procedure as a Data Flow task data source in SSIS, such as:

1. If you want the data source to use different column names, you have to either change the stored procedure or use a Derived Column component, and neither is convenient;
2. The same applies to data type changes: if you want to change a column's data type, you again have to either change the stored procedure or use a Derived Column component;
3. If your stored procedure ends with dynamic SQL execution, such as Exec (@SQL) or sp_executesql @SQL, SSIS is not able to resolve the returned columns. A common workaround is to define a table variable inside the stored procedure, use the dynamic SQL execution to insert the data into the table variable, and then select the data from it (a sketch of this old workaround is shown after this list). This is very awkward, and performance is poor when the data set is big.
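
For context, a minimal sketch of that pre-Denali workaround is below; the procedure, table and column names are made up for illustration and are not from the original post.

Create Proc dbo.usp_Old_Dynamic_SQL_Workaround	--hypothetical name
As
Begin
	Declare @SQL varchar(max);

	--The table variable gives SSIS a fixed metadata shape to work with.
	Declare @Result Table (GeographyKey int, City nvarchar(30));

	Set @SQL = 'Select GeographyKey, City From dbo.DimGeography';

	--The dynamic SQL result is captured into the table variable...
	Insert Into @Result (GeographyKey, City)
	Exec (@SQL);

	--...and then returned, so the column metadata is visible to SSIS.
	Select GeographyKey, City From @Result;
End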

 
The new T-SQL feature 'With Result Sets' solves this problem.

The syntax of With Result Sets is simple, as shown below. (For more information on the T-SQL Execute statement, see BOL; the link is here.)
 
Exec Stored_Proc_Name @Variables
With Result Sets
(
(
Col1 Data_type,
Col2 Data_type,
.....
List All Columns here
)
);
 
Here is a simple example that demonstrates this functionality as an SSIS Data Flow task data source, solving problem 3 stated above.

1. Create a simple SP
USE [AdventureWorksDWDenali]  --Denali sample DW database, free download
GO
CREATE Proc [dbo].[usp_UseWithResultSet_Dynamic_SQL_Denali]
As
Begin
Declare @SQL varchar(Max)

Set @SQL = '
SELECT [GeographyKey]
,[City]
,[StateProvinceName]
,[EnglishCountryRegionName]
FROM [dbo].[DimGeography]
'
Exec (@SQL);

End
GO

2. Create an SSIS Data Flow task and use exec [dbo].[usp_UseWithResultSet_Dynamic_SQL_Denali] as the data source (SQL Command). When you click Preview, you get an error like the one below:

 

3. By using With Result Sets, you can rename the columns and change the data types (I changed a column from nvarchar to varchar), and best of all, there is no error when using this SP as the data source. See below:
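
A sketch of what such an Exec ... With Result Sets statement could look like is below; the renamed column names and varchar lengths are assumptions for illustration, not the exact values from the original screenshot.

Exec [dbo].[usp_UseWithResultSet_Dynamic_SQL_Denali]
With Result Sets
(
	(
		GeographyKey int,
		City varchar(30),
		StateProvince varchar(50),
		CountryRegion varchar(50)
	)
);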

 

4. Terminate the data flow with a simple Union All component and run it. It succeeds:

Posted in: SQL Server 2012 | SSIS | T-SQL
