What's New (Integration Services)
This latest release of Microsoft Integration Services introduces new features and enhancements to installation, components, data management, and performance and troubleshooting. These new features and enhancements increase the power and productivity of developers, administrators, and knowledge workers who develop data transformation solutions.
New Features and Enhancements by Area
Installation Features
This release includes a new samples location and support for Data Transformation Services.

Component Enhancements
This release includes an enhanced Lookup transformation, new ADO.NET components, a new Data Profiling feature, a new connections wizard, a new script environment, package upgrade options, and changes to the application of package configurations.

Data Management Enhancements
This release includes enhanced data type handling, new date and time data types, and enhanced SQL statements.

Performance and Troubleshooting Enhancements
This release includes a new change data capture feature and new debug dump files.
Installation Features
New Samples Location
Books Online no longer includes SQL Server sample databases and sample applications. These sample databases and sample applications are now available on the SQL Server Samples Web site. This Web site not only makes it easier for users to find these samples, but also to find additional samples that are related to Microsoft SQL Server and Business Intelligence. On the SQL Server Samples Web site, you can do the following:
Browse through samples contributed by developers, users, and the Microsoft Most Valuable Professional (MVP) community.
Download both sample databases and code projects.
View or participate in a discussion area where you can report issues and ask questions about the samples for each technology area.
Support for SQL Server 2000 Data Transformation Services (DTS)
Integration Services still supports SQL Server 2000 Data Transformation Services (DTS). For more information, see Support for Data Transformation Services (DTS) in SQL Server 2008.
Return to New Features and Enhancements by Area
Component Enhancements
Enhanced Performance and Caching for the Lookup Transformation
Performance enhancements to the Lookup transformation include faster cache loading and more efficient lookup operations. These enhancements are possible because of the following features:
The ability to take rows that do not have matching entries in the reference dataset and load those rows into the cache.
The ability to use separate data flows to load the reference dataset into the cache and to perform lookups on the reference dataset.
The Lookup transformation now includes the following caching options:
The reference dataset is a cache file (.caw), and is accessed by using a Cache connection manager.
The reference dataset is a connected data source in the data flow, and is accessed by using a Cache connection manager and a Cache Transform transformation.
The reference dataset is a table, view, or query that is fully or partially cached and accessed by using an OLE DB connection manager.
The cache can be shared between multiple Lookup transformations in a single package and between transformations in separate packages. For more information, see How to: Implement a Lookup Transformation in Full Cache Mode Using the Cache Connection Manager.
You can deploy a cache file together with a package. For more information, see How to: Create and Deploy a Cache for the Lookup Transformation.
For more information, see Lookup Transformation, Cache Connection Manager, and Cache Transform.
For a video that demonstrates how to configure the Lookup Transformation, see the video home page, How to: Implement a Lookup Transformation in Full Cache Mode (SQL Server Video), in the MSDN Library.
New ADO.NET Components
Integration Services now includes the following ADO.NET components:
An ADO NET source component that consumes data from a .NET Framework provider and makes the data available to the data flow. For more information, see ADO NET Source.
An ADO NET destination component that loads data into a variety of ADO.NET-compliant databases by using a database table or view. For more information, see ADO NET Destination.
New Data Profiling Task and Data Profile Viewer
The Data Profiling task is a new task in the Integration Services toolbox. You can use this task inside an Integration Services package to profile data that is stored in SQL Server. The information provided by the profile helps you identify potential problems with data quality. The Data Profiling task provides profiles that help identify data quality problems within individual columns and with column relationships:
Profiles that help identify problems within individual columns
The distribution of lengths in the column values.
The percentage of null values.
The distribution of values in the column.
Column statistics for numeric columns.
Regular expressions that match string columns.
Profiles that help identify problems with column relationships
Candidate key columns.
Functional dependencies between columns.
The inclusion of the set of values in one column in the set of values in another column.
For more information, see Data Profiling Task and Profiling Data with the Data Profiling Task and Viewer.
For a video that demonstrates how to use the Data Profiling task, see the video home page, How to: Use the Data Profiling Task (SQL Server Video), in the MSDN Library.
New Integration Services Connections Project Wizard
Use the Integration Services Connections Project Wizard to create a package that contains the connection information that you need to connect data sources and destinations. The wizard guides you through the steps of selecting data providers, configuring connection managers, and assigning connection managers to sources and destinations. For more information about when to use this wizard, see Creating a Package by Running a Wizard.
New Script Environment
Business Intelligence Development Studio now integrates seamlessly with the Microsoft Visual Studio Tools for Applications (VSTA) environment. VSTA is the development environment in which a developer writes scripts for the Script task and the Script component.
VSTA supports both the Microsoft Visual Basic 2008 and Microsoft Visual C# 2008 programming languages. VSTA also enables you to add managed assemblies to a script at design time by browsing to the folder location. In addition, VSTA enables you to add a Web reference in your code that enables the code to use objects and methods provided by a Web service.
For SQL Server 2005 Integration Services (SSIS) packages that include Microsoft Visual Studio for Applications (VSA) scripts, those scripts are converted to VSTA.
Note
Breakpoints are not supported in the Script component.
For more information about VSTA in Integration Services, see the related topics in Books Online.
Package Upgrade
You can upgrade your Integration Services packages from the format that Integration Services uses in SQL Server 2005 to the format that SQL Server 2008 uses. To upgrade your SQL Server 2005 packages, do one or more of the following procedures:
Use the dtexec command prompt utility (dtexec.exe) that comes with SQL Server 2008 to run the SQL Server 2005 package. When you use this method to run a SQL Server 2005 package, the upgrade is temporary, and the changes that result from the upgrade cannot be saved.
Add the SQL Server 2005 package to an existing project or open that package in SQL Server 2008 Integration Services. Integration Services will automatically upgrade the package. However, the upgrade is temporary. To permanently upgrade the package, you must save the package changes.
Note
To add a package to an existing project, on the Project menu, click Add Existing Package.
Create or open a SQL Server 2005 Integration Services project, and then use the SSIS Package Upgrade Wizard to upgrade all the packages in the project. This upgrade process is permanent.
For more information, see Upgrading Integration Services Packages.
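As a sketch of the first two options above, the SQL Server 2008 dtexec utility can run a SQL Server 2005 package directly; the package is upgraded in memory at load time, and the file on disk is not changed. The package path below is a hypothetical example:

```shell
REM Hypothetical package path. dtexec upgrades the SQL Server 2005
REM package in memory before running it; the .dtsx file on disk is
REM not modified, so the upgrade is temporary.
dtexec /FILE "C:\SSIS\Packages\LoadCustomers.dtsx" /REPORTING EW
```

To make the upgrade permanent, open the package in SQL Server 2008 Integration Services and save it, or run the SSIS Package Upgrade Wizard as described above.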
Return to New Features and Enhancements by Area
Package Configurations
When you use the dtexec command prompt utility (dtexec.exe) to run a deployed package, Integration Services now applies package configurations twice. Integration Services applies the configuration both before and after applying the options that you specified at the dtexec command line.
For more information, see Package Configurations.
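The double application can be sketched with a hypothetical command line (the package, configuration file, and variable names are illustrative assumptions): the configurations in the .dtsConfig file are applied once before, and once after, the /SET option is processed.

```shell
REM Hypothetical package, configuration file, and variable names.
REM Production.dtsConfig is applied both before and after the /SET
REM option below is processed.
dtexec /FILE "C:\SSIS\LoadCustomers.dtsx" ^
       /CONFIGFILE "C:\SSIS\Production.dtsConfig" ^
       /SET "\Package.Variables[User::BatchSize].Properties[Value]";500
```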
Data Management Enhancements
Enhanced Data Type Handling in the SQL Server Import and Export Wizard
The SQL Server Import and Export Wizard now provides additional information and options related to the data type conversions that the import or export operation requires:
You can view data type mapping information for each table or view that you select to import or export. This information includes a visual indication of the probability that the conversions will succeed without error.
You can view additional detailed information for any column in the selected table or view.
You can accept or reject the data type conversions that the wizard will perform on a column-by-column basis.
You can specify the handling of errors and truncations on a global or a column-by-column basis.
For more information, see Review Data Type Mapping (SQL Server Import and Export Wizard).
New Date and Time Data Types
The following new date and time data types are available in Integration Services:
DT_DBTIME2
DT_DBTIMESTAMP2
DT_DBTIMESTAMPOFFSET
These new Integration Services data types provide the following benefits:
Support for a larger scale for fractional seconds.
Support for user-defined precision.
Support for a time zone offset.
Various control flow and data flow elements support these new data types. For more information, see Control Flow Elements and Data Flow Elements.
You can convert the new data types to other Integration Services date data types by using expressions, the Data Conversion transformation, and the Derived Column transformation. You can also use expressions to perform comparisons between the new data types. For more information, see Integration Services Data Types and Cast (SSIS Expressions).
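For illustration, here is a hedged sketch of casting string literals to the new data types in the Integration Services expression language; the second parameter of each cast is the scale (the number of fractional-second digits), and the literal values are hypothetical:

```
(DT_DBTIMESTAMP2, 7) "2008-07-04 12:30:15.1234567"
(DT_DBTIME2, 4) "12:30:15.1234"
(DT_DBTIMESTAMPOFFSET, 3) "2008-07-04 12:30:15.123 -05:00"
```

In a Derived Column transformation, for example, an expression such as (DT_DBTIMESTAMP2, 3) [OrderDate] could convert a hypothetical OrderDate column to the new type with a scale of 3.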
Enhanced SQL Statements
Integration Services includes the following enhancements to Transact-SQL statements:
Perform multiple data manipulation language (DML) operations Transact-SQL supports the use of a MERGE operation in a SQL statement. The MERGE operation enables you to express multiple INSERT, UPDATE, and DELETE operations in a single statement against a specified target table, based on join conditions with a source table. For more information, see Inserting, Updating, and Deleting Data by Using MERGE and Using MERGE in Integration Services Packages.
Retrieve data about changes to a data source The INSERT operation supports inserting rows into a target table that are returned by the OUTPUT clause of an INSERT, UPDATE, DELETE, or MERGE operation. For more information, see INSERT (Transact-SQL).
Improve the performance of the bulk load operation when the data is sorted according to the clustered index on the table The BULK option of the OPENROWSET function supports the ORDER argument, which specifies how the data in the data file is already sorted. The ORDER argument does not cause a sort operation on the data. This argument is a hint to the SQL Server Database Engine that the data is already pre-sorted in the file. If the data is not actually sorted as specified, the Database Engine returns an error.
The OPENROWSET function enables you to use OLE DB to access remote data. For more information, see OPENROWSET (Transact-SQL).
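The statement forms described above can be sketched as follows. The table names, data file, and format file are hypothetical, so treat this as an illustration rather than a definitive script:

```sql
-- Hypothetical tables. MERGE expresses INSERT, UPDATE, and DELETE
-- against the target in a single statement, based on a join with the
-- source; the OUTPUT clause returns the details of each change.
MERGE dbo.CustomerTarget AS tgt
USING dbo.CustomerStaging AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED THEN
    UPDATE SET tgt.Name = src.Name, tgt.City = src.City
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerID, Name, City)
    VALUES (src.CustomerID, src.Name, src.City)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE
OUTPUT $action, inserted.CustomerID, deleted.CustomerID;

-- The ORDER argument of OPENROWSET(BULK ...) is a hint that the file
-- is already sorted to match the clustered index; it does not sort
-- the data itself.
INSERT INTO dbo.CustomerTarget WITH (TABLOCK) (CustomerID, Name, City)
SELECT CustomerID, Name, City
FROM OPENROWSET(
    BULK 'C:\Data\customers.dat',
    FORMATFILE = 'C:\Data\customers.xml',
    ORDER (CustomerID) UNIQUE
) AS bulk_rows;
```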
Return to New Features and Enhancements by Area
Performance and Troubleshooting Enhancements
Change Data Capture
Integration Services can use a new Database Engine feature called change data capture. This new feature captures insert, update, and delete activity that is applied to SQL Server tables. Change data capture also makes the details of these changes available in an easily consumed relational format.
For information on using change data capture in an Integration Services package to perform incremental loads, see Improving Incremental Loads with Change Data Capture. For the latest samples of how to use change data capture in Integration Services packages, see the SQL Server Samples Web site.
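As a hedged sketch (the database and table names are hypothetical), change data capture is enabled with system stored procedures, and the captured changes can then be queried in relational form, for example as the source query of an incremental load:

```sql
-- Hypothetical database and table names.
USE SalesDB;
EXEC sys.sp_cdc_enable_db;          -- enable capture for the database

EXEC sys.sp_cdc_enable_table        -- enable capture for one table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;          -- no gating role

-- Read the insert, update, and delete activity for a log sequence
-- number (LSN) range in an easily consumed relational format.
DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn(N'dbo_Orders');
DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');
```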
New Debug Dump Files
You can create debug dump files (.mdmp and .tmp) that provide information about what happens when a package runs. This information can help you troubleshoot issues that occur when you run the package.
To create the debug dump files, you use certain command prompt options with the dtexec utility and the dtutil command prompt utility (dtutil.exe). For more information, see dtexec Utility and dtutil Utility.
For more information about the type of information included in the debug dump file (.tmp), see Working with Debug Dump Files.
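With dtexec, for example, the relevant options are /DUMPONERROR and /DUMP; the package path and error code below are hypothetical:

```shell
REM Hypothetical package path. /DUMPONERROR creates debug dump files
REM when any error occurs; /DUMP creates them when the specified error
REM codes occur during execution.
dtexec /FILE "C:\SSIS\LoadCustomers.dtsx" /DUMPONERROR
dtexec /FILE "C:\SSIS\LoadCustomers.dtsx" /DUMP 0xC020801C
```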
Return to New Features and Enhancements by Area