I want to write 42,000+ rows of data to SQL from Azure Synapse using Python, but I get a memory error. To avoid it, I tried writing the data as a CSV file to a blob container and bulk-loading it into SQL directly from there, but I am facing the error below. Is there a good practice for writing large amounts of data to SQL?
Please share code if you have it.
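For context, my first attempt (the one that hit the memory error) inserted the rows with pyodbc, roughly like the sketch below. I've paraphrased it from memory; the table name, column names, and the `rows` variable are placeholders.

# First attempt (sketch): inserting all 42,000+ rows in one go with pyodbc
import pyodbc

# a: my pyodbc connection string (same one used below), e.g.
# a = "Driver={ODBC Driver 18 for SQL Server};Server=...;Database=...;UID=...;PWD=..."
conn = pyodbc.connect(a, autocommit=True)
cursor = conn.cursor()

# rows: the 42,000+ records pulled from Synapse, as a list of tuples (placeholder)
insert_query = "INSERT INTO priceInput.priceInput (col1, col2) VALUES (?, ?)"
cursor.executemany(insert_query, rows)  # this is where I run out of memory

cursor.close()
conn.close()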
# Using a CSV file in blob storage to bulk-load the data into SQL
import pyodbc
from azure.storage.blob import BlobServiceClient  # blob_service_client is created from this elsewhere

# Upload the CSV content to the container
container_client = blob_service_client.get_container_client(container_name)
blob_client = container_client.get_blob_client(blob=blob_name)
blob_client.upload_blob(csv_data, overwrite=True)

# Read the CSV content back from the blob
csv_data = blob_client.download_blob().readall()
# a: my pyodbc connection string (defined elsewhere)
conn = pyodbc.connect(a, autocommit=True)
cursor = conn.cursor()

sql_schema = 'priceInput'
sql_table = 'priceInput'
# Use BULK INSERT to load the CSV into the SQL table (this is the statement that fails)
bulk_insert_query = f"""
BULK INSERT {sql_schema}.{sql_table}
FROM '{csv_data}'
WITH (
    FORMAT = 'CSV',
    FIELDQUOTE = '"',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2
)
"""
cursor.execute(bulk_insert_query)
conn.commit()
cursor.close()
conn.close()
# ERROR - this is the error I am facing:
---------------------------------------------------------------------------
ProgrammingError Traceback (most recent call last)
Cell In [43], line 21
8 bulk_insert_query = f"""
9 BULK INSERT {sql_schema}.{sql_table}
10 FROM '{csv_data}'
(...)
17 )
18 """
20 # Execute the BULK INSERT query
---> 21 cursor.execute(bulk_insert_query)
22 conn.commit()
24 # Close the connection
ProgrammingError: ('42000', "[42000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Parse error at line: 3, column: 81: Incorrect syntax near '\\'. (103010) (SQLExecDirectW)")
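From the error, my current guess is that BULK INSERT's FROM clause expects a path to a file, while I am interpolating the CSV contents themselves into the query. For loading from Azure Blob Storage, I believe the statement also needs a DATA_SOURCE that names an external data source created beforehand with CREATE EXTERNAL DATA SOURCE ... WITH (TYPE = BLOB_STORAGE, ...). The shape I think it should take is sketched below; 'MyBlobStorage' is a placeholder name and I have not set that part up yet, so please correct me if this is the wrong direction.

# Guessed shape (untested): FROM takes the blob path, not the CSV bytes,
# and DATA_SOURCE must reference an already-created external data source
bulk_insert_query = f"""
BULK INSERT {sql_schema}.{sql_table}
FROM '{blob_name}'
WITH (
    DATA_SOURCE = 'MyBlobStorage',  -- placeholder name
    FORMAT = 'CSV',
    FIELDQUOTE = '"',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    FIRSTROW = 2
)
"""
cursor.execute(bulk_insert_query)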