Azure Databricks: Extract from REST API and save JSON file in Azure Data Lake


Databricks commands:

# Imports
import requests

Import the requests library so we can issue HTTP requests.

# REST API GET Request with Basic Authentication
par = {'id':'3', 'period':'today'} # Parameters

response = requests.get('https://my-rest-server.azurewebsites.net/Person'
  , params = par
  , auth = ('username', 'password') # Basic Authentication
)

Define the parameters and the Basic Authentication credentials (username, password), then execute the GET request.
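Under the hood, requests URL-encodes the params dictionary into the query string appended to the URL. A minimal sketch of that step using only the standard library (the server URL is the placeholder from above):

```python
from urllib.parse import urlencode

# The same parameters passed to requests.get() above
par = {'id': '3', 'period': 'today'}

# requests appends this encoded query string to the URL for you
url = 'https://my-rest-server.azurewebsites.net/Person?' + urlencode(par)
print(url)  # .../Person?id=3&period=today
```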

# ADLS Configuration
spark.conf.set(
  'fs.azure.account.key.mystorageaccount.dfs.core.windows.net'
  , 'my_key'
)

spark.conf.set() sets the access key used to connect to the Data Lake storage account. The access key can be found in the Azure Portal.
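Hard-coding the access key in a notebook is risky; Databricks secret scopes let you keep it out of the code. A sketch of the same configuration, assuming a secret scope named 'my-scope' with a secret named 'adls-key' (both hypothetical names you would create yourself):

```python
# 'my-scope' and 'adls-key' are hypothetical; create them first
# (e.g. with the Databricks CLI), then read the key at runtime
access_key = dbutils.secrets.get(scope = 'my-scope', key = 'adls-key')

spark.conf.set(
  'fs.azure.account.key.mystorageaccount.dfs.core.windows.net'
  , access_key
)
```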

# Save the result in file in ADLS
import json # Serialize the response as valid JSON

target_folder_path = 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/API_Result'
dbutils.fs.put(target_folder_path + '/Person.json', json.dumps(response.json()), True)

Define the destination folder path and save (overwrite) the result of the request to a JSON file.
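One caveat when writing the response to a file: calling str() on the dictionary returned by response.json() produces the Python repr (single quotes), which most JSON parsers reject, while json.dumps() emits valid JSON. A self-contained illustration (the dictionary is a stand-in for a real API response):

```python
import json

# Stand-in for the dictionary returned by response.json()
person = {'id': 3, 'period': 'today'}

# str() gives the Python repr with single quotes - not valid JSON
print(str(person))

# json.dumps() gives valid JSON with double quotes
print(json.dumps(person))
```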

Keep it simple :-)


About Peter Lalovsky

I am a Microsoft SQL Server certified professional, creating with T-SQL, SSRS, SSIS, ASP.NET/C#, Azure, Python, PowerShell and more on a daily basis since 2006. In 2016 I wrote a book for beginner and intermediate T-SQL programmers, which you can download here. This blog is something like my personal programming documentation. When I am not in front of a computer, I am around my paper car – Trabant 601.
