Issue in DB Sync

Mina Sabry Anis Metry 0 Reputation points
2025-08-29T10:02:54.23+00:00

I have configured a data sync group to sync a database from on-premises to Azure SQL. When I try to add the tables to be synced, the schema loads without any feedback.

I searched for this issue and found that the maximum number of tables that can be loaded is 500, but my on-premises database has more than 600 tables. I also found several scripts to add tables manually, but without any success.

My question is: is there any way to load more than 500 tables, or any valid way to select which tables to sync?

Azure Database Migration service

2 answers

Sort by: Most helpful
  1. Kalyani Kondavaradala 4,600 Reputation points Microsoft External Staff Moderator
    2025-08-29T15:47:36.4666667+00:00

    Hello Mina Sabry Anis Metry,

    Thanks for posting your query on Microsoft Q&A!

    Yes, you are correct, Azure SQL Data Sync has a hard limit of 500 tables per sync group. If the sync user has visibility to more than 500 tables, the portal will hang or fail during the “Refresh Schema” operation. This is a known limitation and attempts to bypass it using scripts or custom logic will not succeed, as they rely on the same metadata enumeration.

    Please try the following workaround:

    To sync more than 500 tables, you'll need to split your tables into subsets and create multiple sync groups:

    1. Divide your tables into logical subsets, each with no more than 500 tables. Group related tables together to preserve referential integrity.

    2. Create a dedicated sync user for each subset, granting permissions only on the relevant tables (see the sketch after these steps).

    3. Create a separate sync group for each subset in the Azure portal, using the corresponding sync user.

    4. Refresh the schema; only the tables visible to that user will load.

    5. Repeat for each subset until all tables are covered.

    This approach ensures that each sync group stays within the supported limit and allows you to reliably configure synchronization for all 600+ tables.
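
    As a rough illustration of steps 1 and 2, below is a minimal T-SQL sketch that assigns tables to buckets and creates a dedicated sync user for one subset. The names used (SyncLoginA, SyncUserA, dbo.Table001, SubsetA) are placeholders, not objects from your database:

    -- Optional helper for step 1: assign every user table to one of two
    -- roughly equal buckets; adjust NTILE(n) so each bucket stays <= 500 tables.
    SELECT s.name AS schema_name, t.name AS table_name,
           NTILE(2) OVER (ORDER BY s.name, t.name) AS bucket
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id;

    -- Step 2: create a login/user for the first subset and grant data access
    -- only on that subset's tables, so a schema refresh run as this user
    -- enumerates at most 500 tables.
    CREATE LOGIN SyncLoginA WITH PASSWORD = '<strong password>';
    CREATE USER SyncUserA FOR LOGIN SyncLoginA;
    GRANT SELECT, INSERT, UPDATE, DELETE ON dbo.Table001 TO SyncUserA;
    -- ...repeat for each table in the subset, or grant at schema level
    -- if the subset lives in its own schema:
    -- GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::SubsetA TO SyncUserA;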

    Validation Steps: Confirm each sync user sees ≤500 tables using:

    -- Impersonate the sync user and count the tables visible to it;
    -- sys.tables only returns objects the current user owns or has permission on.
    EXECUTE AS USER = 'SyncUserA';
    SELECT COUNT(*) FROM sys.tables;
    REVERT;
    

    After setup, verify the table count in each sync group.

    Run initial syncs and check logs for any errors.

    Note: Each sync group runs independently, so performance may vary based on table size and sync frequency.

    Refer to the document below for more details:

    https://azure.microsoft.com/en-us/blog/sync-sql-data-in-large-scale-using-azure-sql-data-sync/

    Please let us know if you need further assistance.

    Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.

    Thanks!

    Kalyani


  2. Sina Salam 26,666 Reputation points Volunteer Moderator
    2025-08-29T20:31:44.7733333+00:00

    Hello Mina Sabry Anis Metry,

    Welcome to the Microsoft Q&A and thank you for posting your questions here.

    I understand that you are having an issue with DB Sync.

    Yes, Azure SQL Data Sync has a hard limit of 500 tables per sync group, and I can attest that attempts to bypass it using scripts will fail due to the shared metadata enumeration. This is confirmed in Microsoft’s troubleshooting guide and best practices documentation. To avoid schema refresh failures and circular sync loops, it's essential to limit each sync user’s visibility to 500 or fewer tables using DENY permissions and to create multiple sync groups accordingly. You can achieve this with the following workaround:

    -- Hide tables beyond the 500-table limit from this sync user,
    -- then impersonate the user to verify how many tables remain visible.
    DENY SELECT ON [dbo].[Table601] TO [SyncUserA];
    EXECUTE AS USER = 'SyncUserA';
    SELECT COUNT(*) FROM sys.tables;
    REVERT;
    

    Additionally, please note that Azure SQL Data Sync will be retired on September 30, 2027, so it's advisable to begin transitioning to supported alternatives. Microsoft recommends Azure Data Factory (ADF) for large-scale, incremental sync using Change Data Capture (CDC), or SQL Server Integration Services (SSIS) for complex on-prem to cloud ETL scenarios. For real-time sync, Transactional Replication is a robust option, though it requires more setup and DBA expertise.
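
    If you evaluate the ADF-with-CDC route, the T-SQL below is a minimal sketch for enabling Change Data Capture on a source SQL Server table; dbo.Orders is a placeholder table name, and the ADF pipeline itself is configured separately:

    -- Enable CDC at the database level (requires sysadmin).
    EXEC sys.sp_cdc_enable_db;

    -- Enable CDC for an individual table (placeholder name dbo.Orders) so that
    -- ADF can read incremental changes from the generated change table.
    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name   = N'Orders',
        @role_name     = NULL;   -- NULL = no gating role required to read change data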

    If you're experiencing schema refresh issues or sync loops, ensure that each sync group is isolated and that tables have primary keys. You may also consider creating a replica database with only the required tables to simplify sync configuration.
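
    Because Data Sync only supports tables with a primary key, you can quickly list the tables that would be skipped with a catalog query such as the following (a small sketch using standard system views):

    -- List user tables that have no primary key and therefore cannot be synced.
    SELECT s.name AS schema_name, t.name AS table_name
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE OBJECTPROPERTY(t.object_id, 'TableHasPrimaryKey') = 0
    ORDER BY s.name, t.name;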

    I hope this is helpful! Do not hesitate to let me know if you have any other questions or clarifications.


    Please don't forget to close out the thread here by upvoting and accepting this as an answer if it is helpful.

