Azure DataForge Workspace Setup Requirements

If you don't have a Databricks Workspace, sign up for a DataForge trial for easy setup.

To set up a new DataForge Workspace using your existing Azure Databricks Workspace, you will need:

  • An Azure Data Lake Storage Gen2 account with at least one container
  • A mount set up in the Databricks workspace that has access to the container in the ADLS Gen2 account
  • An App Registration in your Azure subscription that has Contributor permission and the following Microsoft Graph API permissions (added under App Registration -> API Permissions -> Add a Permission -> Microsoft Graph):
      • Application.ReadWrite.All
      • Directory.ReadWrite.All
  • Owner role assignment in your Azure subscription for the App Registration.
  • Quota increases: open the Quotas page in the Azure Portal, filter Region to the region you will use for your Databricks and DataForge environment, and request an increase for the following quotas. Quota increases do not increase your cost; lower quotas mean fewer jobs can run at the same time in DataForge and Databricks, which may result in job failures.
      • Total Regional vCPUs - increase to 100 (15 is the bare minimum)
      • Standard DSv5 Family vCPUs - increase to 100 (15 is the bare minimum)
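The ADLS Gen2 requirement above can be sketched with the Azure CLI. This is a minimal example, assuming you are already logged in with `az login`; the storage account, container, resource group, and region names are placeholders, not values from this guide:

```shell
# Create an ADLS Gen2 account: a StorageV2 account with the
# hierarchical namespace enabled (this is what makes it "Gen2").
az storage account create \
  --name mydataforgestorage \
  --resource-group my-resource-group \
  --location eastus \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hierarchical-namespace true

# Create at least one container in the account.
az storage container create \
  --account-name mydataforgestorage \
  --name dataforge-data \
  --auth-mode login
```

Storage account names must be globally unique, 3-24 characters, lowercase letters and numbers only.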
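The App Registration and its Owner role assignment can also be created from the CLI. This sketch uses placeholder IDs; the Microsoft Graph API permissions listed above still need to be added and granted admin consent in the portal as described:

```shell
# Create the App Registration (display name is a placeholder).
az ad app create --display-name dataforge-app

# Create a service principal for the app so it can be assigned roles.
az ad sp create --id <application-client-id>

# Assign the Owner role at the subscription scope.
az role assignment create \
  --assignee <application-client-id> \
  --role Owner \
  --scope /subscriptions/<subscription-id>
```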
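The mount requirement can be sketched as a notebook cell run inside the Databricks workspace, using the App Registration's credentials via OAuth. The secret scope, key names, storage account, container, and mount point below are placeholders, not values from this guide:

```python
# Run in a Databricks notebook (dbutils is provided by the runtime).
# Mounts the ADLS Gen2 container using the App Registration as an
# OAuth client-credentials principal.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-client-id>",
    # Store the client secret in a Databricks secret scope, not in code.
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="dataforge", key="app-client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://dataforge-data@mydataforgestorage.dfs.core.windows.net/",
    mount_point="/mnt/dataforge",
    extra_configs=configs,
)
```

The App Registration also needs a data-plane role on the storage account (typically Storage Blob Data Contributor) for the mount to read and write data.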