CSV to Hive data load from 2nd row

Shambhu Rai 1,411 Reputation points
2022-06-07T12:16:07.217+00:00

Hi Expert,

I want to load data from a CSV file into a Hive database starting from the 2nd row. How can I load it?


Here is the link I referred to:
https://stackoverflow.com/questions/47938629/loading-a-csv-file-to-existing-hive-tale-through-spark


1 answer

  1. PRADEEPCHEEKATLA 90,506 Reputation points
    2022-06-09T10:06:22.023+00:00

    Hello @Shambhu Rai ,

    You can use pandas to read the CSV file starting from the 2nd row by setting the header parameter:

    import pandas as pd

    # header=1 makes pandas use the file's second row as the column names, so the first row is skipped
    dbfile = pd.read_csv('abfss://data@cheprasynapse.dfs.core.windows.net/hivedata.csv', header=1)
    # display() renders the DataFrame in a Databricks/Synapse notebook
    display(dbfile)
    

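    If you also want to land those rows in a Hive table, one option (a minimal sketch; the database/table name my_db.hivedata is a placeholder, and spark is the SparkSession that Databricks/Synapse notebooks provide) is to convert the pandas DataFrame to a Spark DataFrame and save it as a table:

    # Convert the pandas DataFrame into a Spark DataFrame
    spark_df = spark.createDataFrame(dbfile)

    # Write it out as a Hive-managed table (placeholder database/table name)
    spark_df.write.mode("overwrite").saveAsTable("my_db.hivedata")

    Note that header=1 treats the file's second row as column names; if the second row is actually data and you only want to drop the first row, skiprows=1 (optionally with header=None or your own column names) may be closer to what you need.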

    Hope this will help. Please let us know if you have any further queries.

