question

0 Votes
JamesHackney-9611 asked ErlandSommarskog commented

insert column data from one table, parse and insert into other table

I have a CSV file whose contents are inserted into a varchar(max) column of a table in an Azure SQL Database. When the file has been inserted into this column, I would like to take the data, parse it, and insert its contents into various columns of another table.

Would I create an AFTER INSERT trigger, get the column data, loop through it, add the elements to a list, array, or some other collection (if SQL has such a thing), and then perform an insert into the other table using the collection? Or is there a better way?

My apologies if this is a stupid question, but until very recently my SQL skills have been limited to CRUD operations, so I have a lot to learn.

Tags: sql-server-general, sql-server-transact-sql, azure-sql-database

1 Vote
ErlandSommarskog answered ErlandSommarskog commented

The question is not stupid, but the operation is. :-)

If you can find a way to crack the string outside the database, it will be a lot easier. For instance, in Perl you can easily say:

my ($thatvalue, $thisvalue, $onemorevalue, ...) = split(/,/, $csvrow);

And you can do something similar in Python, PowerShell, you name it. Or you can use BCP to load the file, if it has a consistent format.
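For example, here is a minimal Python sketch of the same split; the variable names and the sample row are made up purely for illustration:

```python
import csv
import io

# A single CSV row as it might arrive; hypothetical sample data.
csvrow = "42,widget,9.99"

# csv.reader handles quoting and embedded commas correctly,
# which a naive csvrow.split(",") would not.
thatvalue, thisvalue, onemorevalue = next(csv.reader(io.StringIO(csvrow)))

print(thatvalue, thisvalue, onemorevalue)  # 42 widget 9.99
```

Using the csv module rather than a plain split also keeps you safe when a field contains a quoted comma.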

But once you have the long string in a single column in a table, it is not that much fun any more, because T-SQL does not have an operation like the one above. Well, you can use string_split, or a custom-written string splitter, but they return tables, which you would need to pivot into columns, and it all gets really messy.
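To give an idea of what that looks like, here is a T-SQL sketch using string_split with the enable_ordinal argument (available in Azure SQL Database); the table and column names are invented, and this assumes each string holds a single CSV record:

```sql
-- Hypothetical tables: dbo.Staging(Id int, CsvData varchar(max))
-- and dbo.Target(Col1, Col2, Col3).
INSERT INTO dbo.Target (Col1, Col2, Col3)
SELECT MAX(CASE WHEN s.ordinal = 1 THEN s.value END),
       MAX(CASE WHEN s.ordinal = 2 THEN s.value END),
       MAX(CASE WHEN s.ordinal = 3 THEN s.value END)
FROM   dbo.Staging AS st
CROSS  APPLY string_split(st.CsvData, ',', 1) AS s
GROUP  BY st.Id;
```

And if the varchar(max) value holds the whole file rather than a single record, you first have to split on line breaks and then on commas, which compounds the mess.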

I have an article on splitting strings into tables here: http://www.sommarskog.se/arrays-in-sql.html. But for you that is only half the story.


This is an ancient Web Forms app that presently uses an Oracle database. We have migrated to an Azure SQL database, and I was trying to basically recreate the database triggers it presently uses. However, it would be much simpler for me to just do two inserts in the database call: one inserting the CSV file into one table, and the other inserting the individual elements into the other table. I'm not sure if management will allow me to mess with the app, but I can ask; they tend not to want to touch legacy apps except for very minor things. If I can't touch the app, another alternative may be to send the file to an Azure Function and let it do all the work. We had to change connections anyway, so that shouldn't be a problem.

Just thinking out loud, lol. OK, thank you very much for your help.


They tend to not want to touch legacy apps except for very minor things.

I wouldn't call moving from Oracle to SQL Server a "minor" thing.

Sending it to an Azure function from a trigger does not sound efficient to me, but maybe this is not an issue here.

0 Votes
GuoxiongYuan-7218 answered

You may insert the data from the CSV file into a staging table first, and then run scripts to insert or merge the data from the staging table into the destination table. At that point you can also parse the data and insert its contents into the various columns of the other table.
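A rough T-SQL sketch of that approach; the table names, the external data source, and the file path are all hypothetical (in Azure SQL Database, BULK INSERT reads from Azure Blob Storage via an external data source):

```sql
-- Step 1: load the raw file into a staging table.
BULK INSERT dbo.CsvStaging
FROM 'incoming/data.csv'
WITH (DATA_SOURCE     = 'MyBlobStorage',  -- hypothetical external data source
      FIELDTERMINATOR = ',',
      ROWTERMINATOR   = '\n',
      FIRSTROW        = 2);               -- skip the header row, if any

-- Step 2: move the parsed rows into the destination table.
INSERT INTO dbo.Destination (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM   dbo.CsvStaging;
```

Loading into a staging table also gives you a place to validate and clean the data before it reaches the destination table.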
