question

WelderVicenteMoreiraMartins-2917 asked · ShaikMaheer-MSFT commented

Load data into SQL Server via PySpark.


Hi everyone, I don't know what else to do. I have tested with several cluster versions. I need help.
Thanks.


The error returned is shown below:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 31 in stage 1.0 failed 4 times, most recent failure: Lost task 31.3 in stage 1.0 (TID 109) (10.139.64.9 executor 8): com.microsoft.sqlserver.jdbc.SQLServerException: Database 'dw' on server 'dw' is not currently available. Please retry the connection later. If the problem persists, contact customer support, and provide them the session tracing ID of '27977670-F9B7-42AF-8F1F-3E3EB972FA65'
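For reference, the load is essentially a standard Spark JDBC write. The sketch below is only illustrative; the server, database, table, and credentials are placeholders, and the real DataFrame holds the full data set rather than the small sample created here:

```python
from pyspark.sql import SparkSession

# On Databricks a SparkSession is already available as `spark`.
spark = SparkSession.builder.getOrCreate()

# Placeholder DataFrame standing in for the real data to be loaded.
df = spark.range(1000).withColumnRenamed("id", "value")

# Placeholder connection details for an Azure SQL database.
jdbc_url = (
    "jdbc:sqlserver://<server>.database.windows.net:1433;"
    "database=<database>;encrypt=true;loginTimeout=30;"
)

(df.write
   .format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.<target_table>")
   .option("user", "<user>")
   .option("password", "<password>")
   .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
   .mode("append")
   .save())
```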

azure-sql-database · azure-databricks · dotnet-ml-big-data

Hi @WelderVicenteMoreiraMartins-2917,

Thank you for posting your query on the Microsoft Q&A platform.

From the error message, it seems the database is not available. Kindly check the database status and make sure it is available and running.

The thread below shares some details around SQL connectivity issues with Azure Databricks. Kindly check whether it helps.
https://docs.microsoft.com/en-us/answers/questions/251843/unable-to-connect-to-azure-sql-database-through-da.html

If the above does not help, please share more details on your setup: whether the Azure SQL database is serverless or provisioned, and the Databricks code that throws this error, so that we can try to reproduce the scenario and help better. Thank you.


Hi,

Inserting a small amount of data does not cause the database connection to be lost.

The first data load, for example, was a thousand records and there were no failures.

But with a load of 48 million records the connection to the database was lost.

The data load was attempted at several different times and failed the same way each time.

Regards,

Welder Martins


Hi ShaikMaheer-MSFT, I processed the 48 million records in an hour in my on-premises environment. On Azure I'm using a cluster with 4 workers; besides spending more than an hour processing, it loses the connection to the database. Do you have any suggestions?

ShaikMaheer-MSFT replied to WelderVicenteMoreiraMartins-2917:

Hi @WelderVicenteMoreiraMartins-2917,

It feels like this issue needs more debugging, going deeper into the logs to understand the cause. I encourage you to open a support ticket for this issue for deeper investigation and resolution. Below is a link that explains how to create a support ticket. Thank you.
https://docs.microsoft.com/en-us/azure/azure-portal/supportability/how-to-create-azure-support-request
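In the meantime, for large loads it is often worth controlling the number of parallel JDBC connections and the insert batch size. The sketch below uses standard Spark JDBC write options; the values are only illustrative starting points, not a confirmed fix for this specific error, and the connection details are placeholders as before:

```python
# Illustrative tuning of a large JDBC write: fewer concurrent connections and
# larger insert batches tend to put less pressure on the target database.
jdbc_url = (
    "jdbc:sqlserver://<server>.database.windows.net:1433;"
    "database=<database>;encrypt=true;loginTimeout=30;"
)

(df.repartition(8)                    # caps the number of parallel JDBC connections
   .write
   .format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.<target_table>")
   .option("user", "<user>")
   .option("password", "<password>")
   .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
   .option("batchsize", 10000)        # rows per batched INSERT (default is 1000)
   .mode("append")
   .save())
```

where `df` is the DataFrame being loaded.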


1 Answer

WelderVicenteMoreiraMartins-2917 answered

Is there any way to reach more professionals through this post?
