SSDT: ALTER TABLE error in composite project (SQL71001)

Russ M 1 Reputation point
2020-12-15T23:41:27.163+00:00

Our database is deployed in two "flavors" with largely the same set of objects but a small number of variations between them, including indexes, stored procedures, and configuration. To accomplish this, we use composite database projects: a root project ("R") contains the common object definitions, and two child projects ("C1", "C2") contain the variations plus a "same database" reference to the root project R.

One of the variations is that there are tables in C1 that have Change Tracking enabled while those same tables in C2 do not. The tables are otherwise identical, hence sharing the same definitions from R.

For a given table, there is a file in R with the CREATE TABLE statement. If we create a file in C1 with the ALTER TABLE statement to enable Change Tracking, it yields the following error:

SQL71001: Cannot find element referenced by the supporting statement.
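
For a concrete illustration (the table name here is hypothetical), this is the layout that fails:

    -- In project R, e.g. Tables\Orders.sql
    CREATE TABLE [dbo].[Orders]
    (
        [OrderId] INT NOT NULL PRIMARY KEY -- Change Tracking requires a primary key
    );
    GO

    -- In project C1, in a separate file: build fails with SQL71001
    ALTER TABLE [dbo].[Orders]
        ENABLE CHANGE_TRACKING
        WITH (TRACK_COLUMNS_UPDATED = OFF);
    GO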

For troubleshooting, we tried placing the file with the ALTER TABLE statement in project R, but it yielded the same error. The only way we could declaratively enable Change Tracking was to place the ALTER TABLE statement in the same file as the CREATE TABLE statement.

Why doesn't the ALTER TABLE statement see the table defined in the root composite project? This seems like a bug. Is there a way to make this work? If not, it is a potential showstopper for us.


4 answers

  1. Guoxiong 8,206 Reputation points
    2020-12-16T17:12:30.577+00:00

    You can enable Change Tracking via ALTER TABLE statements, but in your case the problem is that those objects do not exist in the database project C1; that is why you got the error. Did you try adding the root project R as a database reference in the C1 and C2 projects? If that still does not work, you may have to move the table definitions into each child project.


  2. ErikEJ 341 Reputation points MVP
    2020-12-16T18:10:47.59+00:00

    You should never have ALTER statements in a database project.
    Look at a pre- or post-deployment script to do that.
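
    For example (table name hypothetical), a post-deployment script can enable Change Tracking idempotently, so that redeployments do not reset existing tracking:

    IF NOT EXISTS (
        SELECT 1
        FROM sys.change_tracking_tables
        WHERE object_id = OBJECT_ID(N'dbo.Orders'))
    BEGIN
        ALTER TABLE [dbo].[Orders]
            ENABLE CHANGE_TRACKING
            WITH (TRACK_COLUMNS_UPDATED = OFF);
    END
    GO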


  3. Russ M 1 Reputation point
    2020-12-16T18:26:04.517+00:00

    @ErikEJ In general, I agree; in this case, I disagree. How else can you enable Change Tracking on a table except via an ALTER TABLE statement? The project understands the CT configuration and parses it into the in-memory schema model, but only if the ALTER TABLE statement resides in the same file as the CREATE TABLE statement.
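
    For reference, this is the same-file pattern that does parse into the model (hypothetical table); it builds cleanly, but only by pulling the shared definition out of the root project:

    CREATE TABLE [dbo].[Orders]
    (
        [OrderId] INT NOT NULL PRIMARY KEY
    );
    GO

    ALTER TABLE [dbo].[Orders]
        ENABLE CHANGE_TRACKING
        WITH (TRACK_COLUMNS_UPDATED = OFF);
    GO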

    Further, SSDT schema compare recognizes the CT configuration as part of the table definition. Controlling CT from a post-deployment script removes that awareness from the database project and creates a situation where every subsequent deployment disables CT on the target tables (because it is not part of the source project schema) only to re-enable it in post-deployment. That toggle also causes all dependent objects to be redeployed or refreshed, because the engine thinks the table definition changed. This increases deployment time, creates churn and noise in the deployment logs, and kills any existing CT history.

    If SSDT had an option to ignore CT, as Red Gate SQL Compare does, that would be a potential workaround.


  4. Russ M 1 Reputation point
    2020-12-30T16:41:43.43+00:00

    We may have found a workaround for this, but we also filed a bug in the hope of getting it addressed directly in the product.

    The schema model metadata in the DACPAC does include a property indicating whether CT is enabled on a table, but that property currently cannot be set from a composite child project. However, using the DacFx library, we were able to build a utility that runs during the post-build event of each child project: it alters the table objects in the root model for each of our tracked tables and then rebuilds the root package that gets included in the deployment artifact for that flavor of the database.

    This has been effective so far, but it significantly increases our build time due to the size of our model.

    Here are the key steps in the code:

    Load the Schema Model

    using System.IO;                     // FileAccess
    using Microsoft.SqlServer.Dac;       // DacPackage, DacSchemaModelStorageType, PackageMetadata
    using Microsoft.SqlServer.Dac.Model; // TSqlModel, ModelLoadOptions, ModelSchema

    var model = TSqlModel.LoadFromDacpac(
        filePath,
        new ModelLoadOptions
        {
            // Must use a script-backed model to provide the source name for updating objects.
            LoadAsScriptBackedModel = true,
            ModelStorageType = DacSchemaModelStorageType.Memory,
            ThrowOnModelErrors = true
        });
    

    Process the Model

    // tableNames holds our configured set of tracked tables, in the same
    // [schema].[table] form that table.Name.ToString() produces below.
    var tables = model.GetObjects(
        DacQueryScopes.SameDatabase | DacQueryScopes.UserDefined,
        ModelSchema.Table);
    
    foreach (var table in tables)
    {
        if (!tableNames.Contains(table.Name.ToString()))
            continue;
    
        // Re-emit the table's own script with the CT-enabling ALTER TABLE
        // appended, then replace the object definition in the model.
        var sql = $"{table.GetScript()}\nGO\n\nALTER TABLE {table.Name}\n    ENABLE CHANGE_TRACKING\n    WITH (TRACK_COLUMNS_UPDATED = OFF);\nGO";
    
        var src = table.GetSourceInformation();
    
        model.AddOrUpdateObjects(
            sql,
            src.SourceName,
            new TSqlObjectOptions());
    }
    

    Update the Package

    // Apply the updated model and capture the existing package metadata,
    // then dispose the package so its file handle is released before
    // BuildPackage rewrites the package at the same path.
    PackageMetadata metadata;
    
    using (var package = DacPackage.Load(
        dacpacFilePath,
        DacSchemaModelStorageType.File,
        FileAccess.ReadWrite))
    {
        package.UpdateModel(model, null);
    
        metadata = new PackageMetadata
        {
            Name = package.Name,
            Description = package.Description,
            Version = package.Version.ToString()
        };
    }
    
    DacPackageExtensions.BuildPackage(dacpacFilePath, model, metadata);
    

    There may be a better approach, but this is what we have been able to figure out so far. Feedback welcome.

