Facing "Failed to detect schema" error when trying to create an external table from a CSV
I am following the instructions from https://microsoftlearning.github.io/dp-203-azure-data-engineer/Instructions/Labs/04-Create-a-Lake-Database.html
The error occurs exactly between steps 2 and 3 of this part:
"Create a table
- In the main pane, switch back to the RetailDB pane, which contains your database schema (currently containing the Customer and Product tables).
- In the + Table menu, select From data lake. Then in the Create external table from data lake pane, specify the following options:
- External table name: SalesOrder
- Linked service: Select synapsexxxxxxx-WorkspaceDefaultStorage(datalakexxxxxxx)
- Input file or folder: files/RetailDB/SalesOrder
- Continue to the next page and then create the table with the following options:
- File type: CSV
- Field terminator: Default (comma ,)
- First row: Leave infer column names unselected.
- String delimiter: Default (Empty string)
- Use default type: Default type (true,false)
- Max string length: 4000"
I downloaded the CSV and uploaded it as instructed. All the tables before this one (the custom table and the one from the template) were created successfully.
When I try to create the external table from the data lake, I set the table name, the linked service, and the input file or folder, then click the Continue button, but no next page appears for me to select a file type.
Instead, a window with the "Failed to detect schema" message appears, with the following details:
"Error details
New external table
Failed to execute query. Error: Error encountered while parsing data: 'Invalid: Parquet magic bytes not found in footer. Either the file is corrupted or this is not a parquet file.'. Underlying data description: file 'https://datalake8sjyxr9.dfs.core.windows.net/files/RetailDB/SalesOrder/salesorder.csv'. The batch could not be analyzed because of compile errors."
I checked that the RetailDB data format is "Delimited Text", and it works in general: the other two CSVs I uploaded (for the custom table and the one from the template) work fine.
The error is easy to reproduce by following the instructions at https://microsoftlearning.github.io/dp-203-azure-data-engineer/Instructions/Labs/04-Create-a-Lake-Database.html.