When dealing with Excel files that have different column names and file structures, designing a single standard table in MS SQL Server and one SSIS package to load them can be challenging. It is still feasible, though. Here's a general approach to address your requirements:
Assess the common data elements: Identify the common data elements across all Excel files that are essential for your database. These are the fields that exist in every (or nearly every) file and serve as the foundation for your standard table.
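To make this concrete, here is a small sketch of finding the columns shared by all incoming files, assuming the header row of each file has already been read (the file names and headers below are hypothetical examples, not from your data):

```python
# Hypothetical header rows, one list per incoming Excel file.
file_headers = {
    "vendor_a.xlsx": ["CustomerID", "Name", "Email", "Region"],
    "vendor_b.xlsx": ["CustomerID", "FullName", "Email", "Phone"],
    "vendor_c.xlsx": ["CustomerID", "Email", "Country"],
}

# The intersection of all header sets gives the common data elements.
common = set.intersection(*(set(cols) for cols in file_headers.values()))
print(sorted(common))  # → ['CustomerID', 'Email']
```

Columns that appear in most but not all files can still go into the standard table as nullable columns, per the next step.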
Design a flexible table structure: Create a SQL table with columns for the common data elements identified in the previous step. Make these columns nullable to accommodate different files that may not have all the fields.
Create staging tables: As the column names and structures vary across Excel files, you can create staging tables in your database to hold the raw data from each file before transforming it into the standard table format. Each staging table should match the structure of the corresponding Excel file.
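The two table layouts above can be sketched as follows, using in-memory SQLite as a self-contained stand-in for SQL Server (the T-SQL DDL is analogous; table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Standard table: only CustomerID is required; the remaining columns are
# nullable so that files lacking a field can still be loaded.
conn.execute("""
    CREATE TABLE CustomerStandard (
        CustomerID TEXT NOT NULL,
        Email      TEXT,
        Name       TEXT,
        Region     TEXT
    )
""")

# One staging table per source layout, mirroring that file's raw columns.
conn.execute("""
    CREATE TABLE Staging_VendorB (
        CustomerID TEXT,
        FullName   TEXT,
        Email      TEXT,
        Phone      TEXT
    )
""")

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print(tables)  # → ['CustomerStandard', 'Staging_VendorB']
```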
Create an SSIS package: Build an SSIS package that dynamically identifies the Excel files in a given folder and imports them into the appropriate staging table. You can use a ForEach Loop container to iterate over the files and a Data Flow task to load the data into the staging table.
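The routing logic inside that ForEach Loop can be sketched outside SSIS like this, assuming (hypothetically) that a filename prefix identifies each file's layout:

```python
from pathlib import Path
import tempfile

def staging_table_for(filename: str) -> str:
    """Pick a staging table from a hypothetical 'vendor prefix' convention."""
    prefix = filename.split("_")[0].lower()   # e.g. "vendorb_jan.xlsx" -> "vendorb"
    return {"vendora": "Staging_VendorA",
            "vendorb": "Staging_VendorB"}.get(prefix, "Staging_Unknown")

# Simulate the folder the ForEach Loop would enumerate.
with tempfile.TemporaryDirectory() as folder:
    for name in ("vendora_jan.xlsx", "vendorb_jan.xlsx"):
        Path(folder, name).touch()
    for f in sorted(Path(folder).glob("*.xlsx")):
        print(f.name, "->", staging_table_for(f.name))
```

In the real package, the loop's file-name variable would feed an expression on the Excel connection string, and a similar lookup would decide which Data Flow (and staging table) handles the file.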
Perform data transformations: A single SSIS Data Flow has fixed column metadata, so it cannot absorb arbitrary layouts by itself. To handle the differing column names, build one Data Flow per file layout and use a Derived Column transformation or a Script Component to map each source column onto the corresponding column of the standard table. (The Conditional Split transformation routes rows, not columns, so on its own it does not solve the column-mapping problem.) Data conversion and data cleansing can also be performed during this step.
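The per-file column mapping such a Script Component would apply can be sketched as a rename dictionary plus a projection onto the standard columns (the mapping entries below are hypothetical examples):

```python
# Rename rules per staging table; columns not listed keep their names.
COLUMN_MAPS = {
    "Staging_VendorB": {"FullName": "Name"},  # VendorB calls Name "FullName"
}

STANDARD_COLS = ("CustomerID", "Email", "Name", "Region")

def to_standard(row: dict, staging_table: str) -> dict:
    """Rename source columns, then project onto the standard columns;
    columns the source lacks come through as None (NULL)."""
    mapping = COLUMN_MAPS.get(staging_table, {})
    renamed = {mapping.get(k, k): v for k, v in row.items()}
    return {col: renamed.get(col) for col in STANDARD_COLS}

row = {"CustomerID": "42", "FullName": "Ada", "Email": "a@x.io", "Phone": "555"}
print(to_standard(row, "Staging_VendorB"))
# → {'CustomerID': '42', 'Email': 'a@x.io', 'Name': 'Ada', 'Region': None}
```

Extra source columns (like Phone here) are simply dropped, and missing standard columns become NULL, matching the nullable table design above.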
Load data into the standard table: After the necessary transformations, load the data from the staging tables into the standard table, either with an Execute SQL Task running an INSERT ... SELECT, or with a Data Flow task writing through an OLE DB Destination. (The older SQL Server Destination component only works when the package runs on the database server itself, so it is generally discouraged.)
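The staging-to-standard load is just an INSERT ... SELECT that renames columns as it copies. A self-contained sketch, again with in-memory SQLite standing in for SQL Server and hypothetical names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE CustomerStandard (CustomerID TEXT, Email TEXT, Name TEXT, Region TEXT);
    CREATE TABLE Staging_VendorB (CustomerID TEXT, FullName TEXT, Email TEXT, Phone TEXT);
    INSERT INTO Staging_VendorB VALUES ('42', 'Ada', 'a@x.io', '555');
""")

# Map the staging columns onto the standard ones as part of the copy:
# FullName feeds Name, and Region (absent from this source) becomes NULL.
conn.execute("""
    INSERT INTO CustomerStandard (CustomerID, Email, Name, Region)
    SELECT CustomerID, Email, FullName, NULL
    FROM Staging_VendorB
""")

print(conn.execute("SELECT * FROM CustomerStandard").fetchall())
# → [('42', 'a@x.io', 'Ada', None)]
```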
Regarding the use of Columnstore in this case, it depends on the size and nature of your data. Columnstore indexes benefit large data sets and can provide significant performance improvements for analytical queries, though they add overhead for frequent small inserts and updates. If your data volume is substantial and the workload is mostly analytical, consider a Columnstore index on the standard table for improved query performance.
By following this approach and using SSIS for data integration, you can handle the varying column names and structures in Excel files and import the data into a standardized table structure in MS SQL Server.
That said, I think a schema-flexible NoSQL engine such as Azure Cosmos DB could be a better fit for this scenario: each document can carry its own set of fields, so you would not need to force every file into one table shape, and you could expose the data to consumers through versioned APIs.