SSMS "Cannot Execute Script" - Specified argument was out of the range of valid values. Parameter name: chars (SQLEditors)

Son VoBa (Sync-n-scale) 1 Reputation point


I created several dozen SQL scripts, each of which executes successfully in SSMS, both individually and in succession. When I concatenated them into a single script, SSMS complained with this error. Help?!


4 answers

  1. Son VoBa (Sync-n-scale) 1 Reputation point

    @Viorel - The aggregated script creates a new DB, then follows with a series of INSERT statements. That is, there's nothing fancy. Its total length is under 2GB. I'm not sure what the maximum validated script length is for SSMS 15.0.18390.0. Do you know?

    @Erland Sommarskog - Initially, I used sqlcmd to run these individual scripts (":r"). These scripts are machine-generated; in total, there are several hundred GBs' worth of data. I switched to SSMS to work around character-encoding issues in sqlcmd. When I open the aggregated script, it shows up in the SSMS query window properly, with no syntax errors. This is as expected, since the individual scripts are well-formed and each executed successfully in SSMS on its own.


  2. Erland Sommarskog 98,636 Reputation points

    SSMS is a 32-bit application (since it builds on the Visual Studio shell, and VS is 32-bit). You say that the script is below 2GB in length, but exactly how long is it?

    You could try Azure Data Studio, which is a 64-bit application.

    Or you could try to resolve your character-encoding issues with SQLCMD. You can specify the code page of the file with the -f option. Or just make sure that you create the script as a UTF-8 file with an initial BOM, and SQLCMD will detect it automatically.
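    For the BOM route, here is a minimal Python sketch that re-saves a script file as UTF-8 with an initial BOM; the file names and the source encoding are assumptions for illustration, not taken from the thread:

```python
from pathlib import Path

def rewrite_as_utf8_bom(src: str, dst: str, src_encoding: str = "utf-8") -> None:
    """Re-save a script file as UTF-8 with an initial BOM so that
    sqlcmd can detect the encoding automatically.
    src/dst paths and src_encoding are placeholders."""
    text = Path(src).read_text(encoding=src_encoding)
    # EF BB BF is the UTF-8 byte order mark
    Path(dst).write_bytes(b"\xef\xbb\xbf" + text.encode("utf-8"))
```

    sqlcmd looks for those three BOM bytes (EF BB BF) at the start of the input file and then reads it as UTF-8 without needing -f.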

  3. Erland Sommarskog 98,636 Reputation points

    I did not say that Azure Data Studio would work. I said that it was worth trying.

    Well, at least it has the decency to tell you that you fed it more than a mouthful.

    It seems to me that if you want to stick to the pattern you have now, SQLCMD is the best choice. And rather than using :r, it may be better to have a .BAT file that loads one file at a time. Mind you, I have seen people posting about SQLCMD choking as well.
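    As a sketch of that one-file-at-a-time pattern, here is a small Python driver in place of a .BAT file; the server and database names are made-up placeholders, and -b makes sqlcmd return a nonzero exit code on error, so the loop stops at the first failing script:

```python
import glob
import subprocess

def run_scripts(folder: str, server: str, database: str, dry_run: bool = False):
    """Run each .sql file in folder through its own sqlcmd invocation,
    in name order, stopping at the first failure.
    With dry_run=True, only build and return the command lines."""
    commands = []
    for path in sorted(glob.glob(f"{folder}/*.sql")):
        # -b: exit with an error code if sqlcmd hits a SQL error
        cmd = ["sqlcmd", "-S", server, "-d", database, "-b", "-i", path]
        commands.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)  # raises on nonzero exit
    return commands
```

    The rough .BAT equivalent would be a FOR %%f IN (*.sql) DO sqlcmd ... -i "%%f" loop, though stopping on the first error takes a little more batch plumbing.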

    But I think a better plan is to step back and look at what you really want to do. I find it difficult to believe that these files are full of CREATE TABLE and CREATE PROCEDURE statements. Am I right that there are a lot of INSERT statements with data? There are better ways to load lots of data than with INSERT. Better to write the data to data files and then load these with BCP. Since there is a lot less syntax in the files, the files will be smaller. But more importantly, the load will run more efficiently.
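    To illustrate the BCP route, here is a minimal sketch that writes rows to a tab-delimited data file and returns the kind of bcp command line that would bulk-load it; the database, table, and server names are made-up placeholders:

```python
import csv

def write_bcp_file(rows, path: str) -> str:
    """Write rows to a tab-delimited file suitable for bcp character mode,
    and return an illustrative bcp command line for loading it."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        # tab fields and \r\n rows match bcp's -c defaults
        writer = csv.writer(f, delimiter="\t", lineterminator="\r\n")
        writer.writerows(rows)
    # MyDb.dbo.MyTable and myserver are placeholders; -T uses a
    # trusted connection, -c selects character mode
    return f"bcp MyDb.dbo.MyTable in {path} -S myserver -T -c"
```

    With -c, bcp defaults to tab as the field terminator and \r\n as the row terminator, so the file written above needs no extra -t or -r switches.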

  4. Seeya Xi-MSFT 16,411 Reputation points

    Hi @Son VoBa (Sync-n-scale) ,

    Welcome to Microsoft Q&A!
    Judging from the above content, this seems to be a limitation of the SQL Server management tools, whether Azure Data Studio or SSMS.
    Unlike sqlcmd, they both provide a visual interface. Since both are unable to execute your aggregated script because it is too large, why not try sqlcmd? That way you are not restricted by the management tools.
    I found some documents to help you use sqlcmd better:
    sqlcmd Utility
    sqlcmd - Run Transact-SQL Script Files

    Best regards,

    If the answer is the right solution, please click "Accept Answer" and kindly upvote it. If you have extra questions about this answer, please click "Comment".
    Note: Please follow the steps in our documentation to enable e-mail notifications if you want to receive the related email notification for this thread.
