Hello @Bry ,
Thanks for the question and for using the Microsoft Q&A platform.
As we understand it, the ask here is to insert timestamp data into Azure Synapse Analytics. Please do let me know if that is not accurate.
I tried out the below code on Databricks and was able to insert the records into SQL DW.
from pyspark.sql import functions as F

# Build a small test DataFrame and replace the 'age' column with the current
# timestamp so we get a TimestampType column to load into SQL DW.
data = [{'name': 'Alice', 'age': 1}, {'name': 'Again', 'age': 2}]
df3 = spark.createDataFrame(data)
df3 = df3.withColumn('Age', F.current_timestamp())
df3.show()

# sqlDwUrl and tempDir are assumed to be defined earlier in the notebook
# (the JDBC URL and the Blob/ADLS staging folder for the Synapse connector).
tableName = 'SomeTableName'
df3.write.mode("overwrite") \
    .format("com.databricks.spark.sqldw") \
    .option("url", sqlDwUrl) \
    .option("tempDir", tempDir) \
    .option("forwardSparkAzureStorageCredentials", "true") \
    .option("dbTable", tableName) \
    .save()
df3.schema
Out[73]: StructType(List(StructField(Age,TimestampType,false),StructField(name,StringType,true)))
On the SQLDW side this is what I see
I am unable to repro this. Let me know if you see something that is different in your case, and I will try my best to repro it.
On another note, can you please try to cast the timestamp to the format "yyyy-MM-dd HH:mm:ss" and then insert the data?
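Something along these lines (just a sketch building on the df3 from my repro above; date_format renders the timestamp as a plain string in that pattern, and "Age" stands in for your actual timestamp column):
from pyspark.sql import functions as F

# Sketch only: format the timestamp column as a "yyyy-MM-dd HH:mm:ss" string
# before the write; replace "Age" with the name of your timestamp column.
df3_formatted = df3.withColumn("Age", F.date_format(F.col("Age"), "yyyy-MM-dd HH:mm:ss"))
df3_formatted.show(truncate=False)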
Please do let me know if you have any queries.
Thanks
Himanshu
@HimanshuSinha-msft
I don't know why I can't edit the previous comment, but I wanted to add that the dataframe schema is
StructType(List(StructField(v-from,TimestampType,true)))
Hello @Bry ,
I see that you are using the SQL Server driver, but the title says that your sink is Synapse.
As called out here, we should use the below driver instead. Can you please try this out and let me know?
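For reference, the write would look roughly like this (a sketch reusing the sqlDwUrl / tempDir / tableName placeholders from my earlier repro; the key piece is the com.databricks.spark.sqldw format, i.e. the Azure Synapse connector rather than the generic SQL Server JDBC driver):
# Sketch only: write through the Azure Synapse connector instead of the
# generic SQL Server JDBC driver. sqlDwUrl / tempDir / tableName are the
# same placeholders as in the repro code earlier in this thread.
df3.write.mode("overwrite") \
    .format("com.databricks.spark.sqldw") \
    .option("url", sqlDwUrl) \
    .option("tempDir", tempDir) \
    .option("forwardSparkAzureStorageCredentials", "true") \
    .option("dbTable", tableName) \
    .save()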
Please do let me know if you have any queries.
Thanks
Himanshu
Hello @Bry ,
We haven't heard back from you on the last response and were just checking to see whether you have a resolution yet. In case you have a resolution, please do share it with the community, as it can be helpful to others. Otherwise, please respond back with more details and we will try to help.
Thanks
Himanshu
Hello @Bry ,
We haven't heard back from you on the last response and were just checking to see whether you have a resolution yet. In case you have a resolution, please do share it with the community, as it can be helpful to others.
If you have any question relating to the current thread, please do let us know and we will try our best to help you.
In case you have any other question on a different issue, we request you to open a new thread.
Thanks
Himanshu
Hi @HimanshuSinha-msft
The problems were the format and also the credentials, which apparently were not the correct ones. After obtaining the correct ones, it was solved. Thanks for your recommendations.
Hi @Bry
I am facing the same issue.
I am unable to load a timestamp Spark DataFrame into a Synapse SQL database.
Can you please let me know how the issue was fixed?
Driver used: "com.microsoft.sqlserver.jdbc.spark"
Thanks in Advance
Hi @Ramya ,
I'm not sure exactly what the solution was, but what I did was, first, use .format("com.databricks.spark.sqldw"), and then use the correct Key Vault secret scope and keys. I asked for new ones to be created, so I think I was using the wrong ones or they had been created incorrectly.
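Roughly, it looked like the sketch below (the secret scope name, secret names, storage account, table name, and df are placeholders, not my real values):
# Rough sketch only; the scope, secret, storage account, and table names are
# placeholders, and df stands for the DataFrame with the timestamp column.
storage_key = dbutils.secrets.get(scope="my-keyvault-scope", key="storage-account-key")
sql_dw_url = dbutils.secrets.get(scope="my-keyvault-scope", key="sqldw-jdbc-url")

# The Synapse connector stages data in Blob storage, so it needs the account key.
spark.conf.set("fs.azure.account.key.mystorageaccount.blob.core.windows.net", storage_key)

df.write.mode("overwrite") \
    .format("com.databricks.spark.sqldw") \
    .option("url", sql_dw_url) \
    .option("tempDir", "wasbs://tempdata@mystorageaccount.blob.core.windows.net/tmp") \
    .option("forwardSparkAzureStorageCredentials", "true") \
    .option("dbTable", "SomeTableName") \
    .save()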
I hope it helps you