Databricks commands:
# Imports
import requests
Import the requests library to be able to run HTTP requests.
# REST API GET request with Basic Authentication
par = {'id': '3', 'period': 'today'}  # Parameters

response = requests.get(
    'https://my-rest-server.azurewebsites.net/Person'
    , params = par
    , auth = ('username', 'password')  # Basic Authentication
)
Define the parameters and the Basic Authentication attributes (username, password), then execute the GET request.
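Before trusting the result, check that the call actually succeeded. A minimal sketch using the standard requests API:

# Stop early if the server returned a non-2xx status code
response.raise_for_status()
print(response.status_code)  # e.g. 200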
# ADLS configuration
spark.conf.set(
    'fs.azure.account.key.mystorageaccount.dfs.core.windows.net'
    , 'my_key'
)
spark.conf.set() defines the access key for the connection to the Data Lake. The access key can be found in the Azure Portal.
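Hardcoding the key is fine for a quick demo, but in a real notebook the key usually comes from a Databricks secret scope instead. A minimal sketch, assuming a hypothetical secret scope 'my-scope' holding a secret named 'adls-key':

# Read the access key from a secret scope instead of pasting it into the notebook
access_key = dbutils.secrets.get(scope = 'my-scope', key = 'adls-key')
spark.conf.set(
    'fs.azure.account.key.mystorageaccount.dfs.core.windows.net'
    , access_key
)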

# Save the result in a file in ADLS
target_folder_path = 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/API_Result'

# response.text holds the raw JSON payload returned by the server
dbutils.fs.put(target_folder_path + '/Person.json', response.text, True)
Define the destination folder path and save the result of the request as a JSON file; the third argument True overwrites the file if it already exists.
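To verify the write, the file can be read back with Spark. A minimal sketch, assuming the API returned a single-line JSON object:

# Read the saved file back into a DataFrame and show it
df = spark.read.json(target_folder_path + '/Person.json')
display(df)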
Keep it simple :-)