Import JSON payload from a REST API and save as JSON documents in ADLS Gen2

Raj D 591 Reputation points
2020-08-03T17:53:23.913+00:00

Hi, I am trying to import a JSON payload from a REST API GET method and save the JSON documents into ADLS Gen2 using Azure Databricks.
GET: https://myapi.com/api/v1/city

GET method output:

    [
      {"id": 2643743, "name": "London"},
      {"id": 2643744, "name": "Manchester"}
    ]

PowerShell:

    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

    $username = "user"
    $password = "password"

    $params = @{
        Uri     = 'https://myapi.com/api/v1/city'
        Method  = 'Get'
        Headers = @{
            Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($username):$($password)"))
        } # end Headers hash table
    } # end $params hash table

    # Invoke-RestMethod deserializes the JSON response; ConvertTo-Json turns the
    # resulting objects back into JSON text so they can be saved as documents.
    $var = Invoke-RestMethod @params -ContentType "application/json" | ConvertTo-Json

Now I'm stuck on how to save the JSON documents in Azure Data Lake Storage Gen2. Please guide me.

Thank you.

Azure Data Lake Storage
Azure Databricks
Windows Server PowerShell

Accepted answer
  1. PRADEEPCHEEKATLA-MSFT 90,146 Reputation points Microsoft Employee
    2020-08-04T07:45:22.18+00:00

    Hello,

    Welcome to the Microsoft Q&A platform.

    You can use the df.write.json API to write to any location you need.

    Syntax: df.write.json('location where you want to save the JSON file')

    Example: df.write.json("abfss://<file_system>@<storage-account-name>.dfs.core.windows.net/iot_devices.json")

    Here are the steps to save the JSON documents to Azure Data Lake Storage Gen2 using Azure Databricks.

    Step 1: Use the spark.read.json API to read the JSON payload and create a DataFrame.

    Step 2: Mount the ADLS Gen2 storage location to a Databricks DBFS directory, using the instructions in the doc below:

    https://learn.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-datalake-gen2

    Step 3: Use the df.write.json API to write to the mount point, which writes through to the storage account (see the sketch below).
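
    Putting the steps together, here is a minimal sketch of what a notebook cell could look like. The URL, credentials, and storage names are the placeholders from the question, so adjust them to your environment:

        # Minimal sketch for a Databricks notebook (spark and sc are predefined there).
        import requests

        # Step 1: call the REST API and load the JSON payload into a DataFrame.
        response = requests.get(
            "https://myapi.com/api/v1/city",
            auth=("user", "password"),  # basic auth, as in the PowerShell snippet
        )
        df = spark.read.json(sc.parallelize([response.text]))

        # Step 3: write the DataFrame out as JSON documents in ADLS Gen2.
        # A dbfs:/mnt/... path from the Step 2 mount works here as well,
        # provided the storage credentials were configured when mounting.
        df.write.json("abfss://<file_system>@<storage-account-name>.dfs.core.windows.net/cities")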

    For more details, refer to the articles below:

    Azure Databricks – JSON files

    Sample notebook: https://learn.microsoft.com/en-us/azure/databricks/_static/notebooks/adls-passthrough-gen2.html


    Hope this helps. Do let us know if you have any further queries.

    ----------------------------------------------------------------------------------------

    Do click on "Accept Answer" and upvote the post that helped you; this can be beneficial to other community members.

    2 people found this answer helpful.

1 additional answer

  1. Rich Matheisen 46,801 Reputation points
    2020-08-03T19:12:33.99+00:00

    Is this what you're looking for?

    data-lake-storage-directory-file-acl-powershell
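
    In case it helps, here is a minimal sketch of that approach using the Az.Storage cmdlets the linked doc describes. The account name, key, file system, and paths are placeholders, not values from your environment:

        # Build a storage context for the ADLS Gen2 account (placeholder credentials).
        $ctx = New-AzStorageContext -StorageAccountName "<storage-account-name>" -StorageAccountKey "<account-key>"

        # Save the JSON produced earlier ($var) to a local file, then upload it.
        $var | Out-File -FilePath "C:\temp\cities.json" -Encoding utf8
        New-AzDataLakeGen2Item -Context $ctx -FileSystem "<file-system>" -Path "raw/cities.json" -Source "C:\temp\cities.json" -Force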

