Inconsistent results transforming JSON into table rows

walkabout 1 Reputation point
2022-11-12T18:43:04.68+00:00

I have some SQL that transforms the content of a JSON file into tables, and it works perfectly 99.99% of the time. However, once in a while it will "lose" a row. There is nothing special about the row it loses, and if I execute the same stored procedure against the same file again, it inevitably does capture all of the rows implied by the JSON file.
I've run tests such as importing the same file 100 times, and I've never seen it drop a row.
We are using SQL Server 2017.
Posting the full code here would be extremely challenging, so I'm not going to try. I wanted to ask whether the issue I've described sounds familiar to anyone, and if there are areas I could look into.
The stored procedure uses
BulkColumn FROM OPENROWSET
and drops the result into a temp table. Normally there are multiple rows in the file; I've never seen it drop more than a single row. In other words, most of the data in the JSON file is successfully processed.

SET @alenzi_BULK = 'DECLARE @strJSON_Table VARCHAR(MAX);SELECT @strJSON_Table = BulkColumn FROM OPENROWSET(BULK ''' + @FullFilePath + ''' , SINGLE_CLOB) AS JTable;select * INTO tmpFileImport FROM OPENJSON(@strJSON_Table)'

EXEC sp_executesql @alenzi_BULK

The reason we are using a string (@alenzi_BULK) is that the file path changes with each call (dynamic SQL).
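
To make that statement easier to read, here is the same pattern sketched out across multiple lines. This is only a sketch of what the dynamic batch ends up executing; the DECLARE of @FullFilePath and the DROP TABLE IF EXISTS guard are not in our procedure and are only there to make the fragment self-contained and re-runnable.

DECLARE @FullFilePath NVARCHAR(260) = N'C:\import\sample.json';  -- illustrative path only
DECLARE @alenzi_BULK NVARCHAR(MAX);

SET @alenzi_BULK = N'
    DECLARE @strJSON_Table VARCHAR(MAX);

    -- Read the whole file as a single value.
    SELECT @strJSON_Table = BulkColumn
    FROM OPENROWSET(BULK ''' + @FullFilePath + N''', SINGLE_CLOB) AS JTable;

    -- Clear any leftover table from a previous run (guard added for this sketch only).
    DROP TABLE IF EXISTS tmpFileImport;

    -- Shred the JSON into rows.
    SELECT *
    INTO tmpFileImport
    FROM OPENJSON(@strJSON_Table);';

EXEC sp_executesql @alenzi_BULK;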

I can't understand how one row could be lost from a set of rows, and yet the missing row materializes if the same file is processed again. Any assistance is appreciated.
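
In case it helps, here is the kind of check I'm thinking of adding so that a bad run leaves evidence behind instead of having to be reproduced. It is only a rough sketch, meant to sit at the end of the dynamic batch where @strJSON_Table is still in scope; the ImportAudit table and its columns are placeholders, not something we have today.

DECLARE @jsonRows INT, @tableRows INT;

-- Rows the JSON parser sees versus rows that actually landed in the table.
SELECT @jsonRows  = COUNT(*) FROM OPENJSON(@strJSON_Table);
SELECT @tableRows = COUNT(*) FROM tmpFileImport;

IF @jsonRows <> @tableRows
    -- Placeholder audit table; any logging mechanism would do.
    INSERT INTO ImportAudit (JsonRowCount, TableRowCount, LoggedAt)
    VALUES (@jsonRows, @tableRows, SYSUTCDATETIME());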

Transact-SQL

1 answer

  1. walkabout 1 Reputation point
    2022-11-12T21:08:51.567+00:00

    So far yes, but it's hard to be sure since I can't reproduce it.
    I just realized it may be another issue entirely. My pre-post testing was incomplete, and the problem may not have anything to do with the JSON transformation. I'll post here again if I need to continue this.
    Thank you for the reply.
