question

JaneDickson-1754 asked · alexanderdimitrov-2746 commented · 5 Votes

Stream Analytics: SQL output and column order in SA query

Hi!
I have a Stream Analytics query that sends data to several tables.
Example:

 select 
     node_info.timestamp_utc as timestamp_utc,
     node_info.node_name as node_name,
     props.battery_level as battery_level, 
     props.firmware_version as firmware_version, 
     props.humidity as humidity, 
     props.node_id as node_id,
     props.reserve_byte as reserve_byte,
     props.rssi as rssi,
     props.temperature as temperature,
     props.transmission_count as transmission_count,
     props.type as type,
     props.Voltage as voltage,
     props.pressure as pressure
 into [output-raw]
 from node_info 

The column order in the SELECT clause does not match the actual column order in the SQL table, but the column names in the SA query and in the SQL table do match.
The SA job was sending all data to the respective columns correctly until yesterday, when I restarted the job.
After the restart I started getting data conversion errors in some tables and found out that the query was trying to write data to the SQL table in the exact order of the SELECT clause, ignoring the column names.
Where there was no data type conflict, the data simply ended up in the wrong columns, as you can see in the example below:

[capture.jpg: screenshot of the output table with values written to the wrong columns]
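
For anyone who wants to compare the two orders on their own table: assuming the output target is an Azure SQL Database, the table's physical column order can be listed from sys.columns, roughly like this (dbo.YourOutputTable is a placeholder, not my real table name):

 -- List the physical column order of the output table so it can be
 -- compared against the SELECT clause of the SA query.
 select c.column_id, c.name, t.name as type_name
 from sys.columns as c
 join sys.types as t on t.user_type_id = c.user_type_id
 where c.object_id = OBJECT_ID('dbo.YourOutputTable')
 order by c.column_id;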



Now I have an issue:
In one table, the first two columns are populated automatically by SQL with generated values, so I don't need to send them through the SA job.
Let's say the columns are as follows:

EntityId - unique ID, generated by SQL
DateAdded - timestamp, generated by SQL
EventTimestamp - event timestamp that I generate in SA query
AlarmText - String that I generate in SA query
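
For illustration, here is a sketch of what such a table definition might look like (the table name and the exact types and defaults are my assumptions, not the real schema):

 -- Hypothetical sketch of the table described above; types and defaults assumed.
 create table dbo.Notifications (
     EntityId       int identity(1,1) primary key,               -- unique ID, generated by SQL
     DateAdded      datetime2 not null default sysutcdatetime(), -- timestamp, generated by SQL
     EventTimestamp datetime2 not null,                          -- set by the SA query
     AlarmText      nvarchar(400) not null                       -- set by the SA query
 );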

And the SA query for it looks like this:

 select 
     System.Timestamp() as EventTimestamp,
     "Some alarms generated based on rules" as AlarmText
 into [output-notifications]
 from formatted_data

So before, all data was getting to the respective table columns with no problems, but now the job is trying to send EventTimestamp to the very first column of the output table (EntityId), because it comes first in the SELECT statement of the query, and AlarmText to the second one (DateAdded).

I understand that this change was made on the backend recently, but I haven't found anything about it in the documentation.

How do I handle skipping columns in this case?
Or how can I make the query send the data to the specified columns by column name, as it did before?
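
One workaround I'm considering (untested, and it assumes the Stream Analytics SQL output accepts a view as its target table, which would need verifying) is to point the output at an updatable view that exposes only the two writable columns, in the same order as the SELECT clause:

 -- Hypothetical workaround: an updatable view over the base table that
 -- exposes only the columns the SA job writes. Inserts through the view
 -- let EntityId and DateAdded fall back to their identity/default values,
 -- and the view's column order matches the SELECT clause, so even a purely
 -- positional insert would land in the right columns.
 create view dbo.NotificationsForAsa
 as
 select EventTimestamp, AlarmText
 from dbo.Notifications;

(dbo.Notifications and dbo.NotificationsForAsa are placeholder names.)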

Thank you!

azure-stream-analytics

Same issue! Does Microsoft know why this is happening? Or have a resolution?

1 Vote

I'm glad I'm not alone in this! Could you please upvote the topic? Maybe it will help draw MS attention to the issue.

1 Vote

I've sent a message on Twitter to @AzureSupport.

1 Vote
brettaj answered · alexanderdimitrov-2746 commented · 1 Vote

Just got off a support call with Microsoft. A hotfix has addressed this situation. We are no longer seeing the issue.


Fantastic! Thanks for your help!

1 Vote

Great to hear. Thank you!

0 Votes
SasZahariev answered · alexanderdimitrov-2746 commented · 2 Votes

Hello. Since Monday we have been having exactly the same issue: instead of inserting each output column into the table column it maps to by name, the query tries to insert the first column of the output into the first column of the table.
Currently there is no resolution, and we plan to switch the technologies used, despite being in the middle of the project.


Hi!

Thank you for sharing that! I thought I was the only one having this issue and that I was doing something wrong!
I can confirm we have exactly the same issue with Azure Stream Analytics jobs when their output is a table in an Azure SQL DB.
The INTO clause does not map to the table columns by name; it inserts data based on the order of the columns instead.
Please share if you find any solution.
Please share if you find any solution.

3 Votes