PySpark DataFrame to be moved to SQL pool

Balaji M [C] 0 Reputation points


I am facing the below error when I try to write a PySpark DataFrame to SQL Server.
Can you please let me know what changes I need to make in the code?

(screenshot of the error message attached)

Azure Synapse Analytics
SQL Server

1 answer

  1. AnnuKumari-MSFT 32,011 Reputation points Microsoft Employee

    Hi Balaji M [C] ,

    Welcome to Microsoft Q&A platform and thanks for posting your query here.

    It seems that you are facing a firewall-related issue while trying to establish a connection between your PySpark notebook and the SQL Server instance.

    The error message "make sure tcp/ip connections are not blocked by a firewall" indicates that there may be a firewall blocking the TCP/IP connection between your PySpark application and the SQL Server database.
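    Before changing the PySpark code, it can help to confirm that the SQL Server port is reachable at all from the machine (or Spark driver) running the notebook. Below is a minimal sketch using Python's standard `socket` module; the host name and port are placeholders, not values from the original question:

```python
import socket

def can_reach(host: str, port: int = 1433, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout.

    1433 is the default SQL Server port; adjust if your instance uses another.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures alike.
        return False

# Example (placeholder host): if this prints False, the failure is at the
# network/firewall level, not in the PySpark write logic.
# print(can_reach("myserver.database.windows.net", 1433))
```

    If `can_reach` returns False, work through the firewall and server-configuration checks below before touching the DataFrame code.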

    1. Check the firewall settings: Make sure that the firewall settings on the SQL Server machine allow incoming connections on the TCP/IP port used by the SQL Server instance. You can check the firewall settings in the Windows Firewall with Advanced Security tool.
    2. Check the SQL Server configuration: Make sure that the SQL Server instance is configured to allow remote connections. You can check the SQL Server configuration in the SQL Server Configuration Manager tool.
    3. Check the connection string: Make sure that the connection string used in your PySpark application is correct and includes the correct TCP/IP port number. You can also try specifying the IP address of the SQL Server machine instead of the hostname in the connection string.
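    Putting the connection-string advice together, a minimal PySpark JDBC write might look like the sketch below. The server, database, table, and credentials are placeholders (not values from the original question), and the URL options shown are the common SQL Server JDBC settings; your environment may need different encryption settings:

```python
def build_jdbc_url(host: str, port: int, database: str) -> str:
    """Build a SQL Server JDBC URL with an explicit TCP port.

    Specifying host and port explicitly helps rule out the connection-string
    issues mentioned above.
    """
    return (f"jdbc:sqlserver://{host}:{port};"
            f"databaseName={database};"
            f"encrypt=true;trustServerCertificate=false")

if __name__ == "__main__":
    # Requires pyspark; in a Synapse notebook the `spark` session already exists.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("write-to-sql").getOrCreate()
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

    (df.write
       .format("jdbc")
       .option("url", build_jdbc_url("myserver.database.windows.net", 1433, "mydb"))
       .option("dbtable", "dbo.mytable")       # placeholder target table
       .option("user", "<username>")            # placeholder credentials
       .option("password", "<password>")
       .mode("append")                          # or "overwrite", as appropriate
       .save())
```

    If the hostname fails to resolve, try substituting the server's IP address in the URL, as suggested in step 3.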

    Hope this helps. Kindly let us know how it goes. Thank you.
