
100+ Free Databricks Platform Admin Practice Questions

Pass your Databricks Platform Administrator Accreditation exam on the first try — instant access, no signup required.

✓ No registration ✓ No credit card ✓ No hidden fees ✓ Start practicing immediately

Key Facts: Databricks Platform Admin Exam

  • Official questions: 25 (Databricks Academy)
  • Exam fee: Free for Databricks customers and partners
  • Time limit: Unlimited
  • Delivery: Online, unproctored
  • Scope: Multi-cloud (AWS / Azure / GCP)
  • Credential type: Badge (accreditation, not a full certification)

The Databricks Platform Administrator Accreditation is a free, online, unproctored 25-question badge assessment with unlimited time. It covers account and workspace architecture, identity and access management, Unity Catalog, networking, cluster administration, and auditing across AWS, Azure, and GCP.

Sample Databricks Platform Admin Practice Questions

Try these sample questions to test your Databricks Platform Admin exam readiness. Each question includes a detailed explanation. Start the interactive quiz above for the full 100+ question experience with AI tutoring.

1. Which Databricks administrative interface is used to create new workspaces, view billing, and manage account-level identities?
A. A workspace's Admin Settings page
B. The account console (accounts.cloud.databricks.com / accounts.azuredatabricks.net)
C. The Unity Catalog Explorer in any workspace
D. The Spark UI of an all-purpose cluster
Correct answer: B
Explanation: The account console is the top-level administrative surface. It is where account admins create workspaces, manage account-level users and groups (with SCIM), configure metastores, view billable usage, and link cloud resources. Workspace Admin Settings only manage a single workspace.
2. On the Databricks platform, what is the role of the control plane?
A. It hosts customer data files in S3, ADLS, or GCS
B. It runs the web app, REST APIs, job scheduler, and cluster manager in Databricks' cloud account
C. It runs the executors that process customer data inside customer compute
D. It stores Unity Catalog managed tables on the customer's cloud storage
Correct answer: B
Explanation: The control plane is operated by Databricks and hosts the workspace web app, REST APIs, notebooks UI, jobs scheduler, and the cluster manager that orchestrates customer compute. Customer data normally stays in the customer's cloud account in the data plane (classic compute) or in Databricks-managed serverless compute.
3. An admin needs to deploy a Databricks workspace in a customer-managed VPC on AWS so they can control subnets and security groups. Which deployment option supports this?
A. Default Databricks-managed VPC deployment
B. Customer-managed VPC (BYOV) workspace deployment
C. Serverless-only workspace with no VPC at all
D. Cross-account IAM role deployment using only Databricks default networking
Correct answer: B
Explanation: The customer-managed VPC option (often called BYOV, bring-your-own-VPC) lets you provision the VPC, subnets, and security groups yourself and register them with the Databricks account, then deploy the workspace into that VPC. This is required for stricter network controls, PrivateLink, or specific CIDR planning.
4. On Azure Databricks, which networking feature allows the workspace to be deployed into a customer's existing virtual network with control over subnets and NSGs?
A. VNet injection
B. VNet peering only
C. Application Gateway integration
D. ExpressRoute Direct only
Correct answer: A
Explanation: VNet injection deploys the Azure Databricks workspace data plane into two delegated subnets in the customer's existing virtual network. The customer controls NSGs, route tables, and DNS. VNet peering and Application Gateway are complementary networking patterns but are not the deployment mode that injects the workspace.
5. On AWS, which feature provides private connectivity from a corporate network to Databricks REST APIs and the notebook web UI without traversing the public internet?
A. AWS Direct Connect only
B. Front-end AWS PrivateLink to the Databricks workspace
C. VPC Flow Logs
D. S3 Gateway Endpoint
Correct answer: B
Explanation: Front-end PrivateLink terminates user-to-Databricks traffic (REST APIs, notebook UI) on a VPC endpoint inside the customer's AWS account, keeping it off the public internet. Back-end PrivateLink covers data plane to control plane traffic. Direct Connect is a separate dedicated link technology.
6. On Azure, what is the equivalent feature to AWS PrivateLink for reaching the Azure Databricks workspace privately?
A. Azure Private Endpoint backed by Private Link service
B. Azure Application Gateway only
C. Azure Firewall NAT rules
D. Azure Bastion
Correct answer: A
Explanation: On Azure, Private Endpoints (powered by Azure Private Link) attach a private IP from your VNet to the Azure Databricks workspace, making the workspace reachable without public internet traversal. Application Gateway, Azure Firewall, and Bastion are different services with different roles.
7. Where in Databricks are metastores created and assigned to workspaces?
A. Inside each workspace's Admin Settings
B. From the account console (Data > Metastores), and then assigned to one or more workspaces
C. From a notebook using CREATE METASTORE
D. Automatically created per cluster
Correct answer: B
Explanation: Unity Catalog metastores are an account-level resource. An account admin creates a metastore from the account console, links it to a cloud storage location, and assigns it to one or more workspaces in the same region. A workspace admin alone cannot create a metastore.
8. Unity Catalog uses a three-level namespace. What is the correct order from highest to lowest scope?
A. schema.catalog.table
B. table.schema.catalog
C. catalog.schema.table
D. metastore.workspace.table
Correct answer: C
Explanation: Unity Catalog organizes data as catalog.schema.table (schemas were historically called databases). Catalogs sit under a metastore, schemas live in catalogs, and tables/views/volumes live in schemas. SQL: SELECT * FROM main.sales.orders.
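The three-level namespace can be sketched in Databricks SQL. This is illustrative only: the names main, sales, and orders are hypothetical, and the statements assume a Unity Catalog-enabled workspace where the user holds the required CREATE privileges.

```sql
-- Build out one branch of the catalog.schema.table hierarchy (names are illustrative).
CREATE CATALOG IF NOT EXISTS main;
CREATE SCHEMA IF NOT EXISTS main.sales;
CREATE TABLE IF NOT EXISTS main.sales.orders (order_id BIGINT, amount DECIMAL(10, 2));

-- Fully qualified three-level reference:
SELECT * FROM main.sales.orders;

-- Or set the defaults once and use the short name:
USE CATALOG main;
USE SCHEMA sales;
SELECT * FROM orders;
```

Setting USE CATALOG / USE SCHEMA only changes name resolution for the session; the objects themselves always live at a fixed catalog.schema.table address in the metastore.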
9. In Unity Catalog, what is an 'external location'?
A. A network firewall rule for outbound traffic
B. A securable object that combines a cloud storage URL with a storage credential and is used to grant access to data in cloud storage
C. A workspace deployed in another region
D. An AWS S3 bucket policy that grants public read
Correct answer: B
Explanation: An external location pairs a cloud storage URL (s3://, abfss://, gs://) with a storage credential (an IAM role, managed identity, or service account), and Unity Catalog uses it to mediate access. Admins grant CREATE EXTERNAL TABLE, READ FILES, and WRITE FILES on the external location.
10. What is a Unity Catalog 'storage credential'?
A. A username/password used to log into the workspace
B. A securable object that wraps a long-lived cloud credential (AWS IAM role, Azure managed identity, or GCP service account) used by Unity Catalog to access cloud storage
C. A personal access token used by the REST API
D. A Spark configuration property in cluster settings
Correct answer: B
Explanation: A storage credential represents the cloud-side authentication that Unity Catalog uses to read and write data on behalf of users. On AWS it is an IAM role; on Azure it is a managed identity (or service principal); on GCP it is a Google service account. External locations reference a storage credential.
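The way questions 9 and 10 fit together can be sketched in SQL. All names here (finance_landing, finance_cred, the S3 URL, the data_engineers group) are hypothetical, and the storage credential finance_cred is assumed to have been created beforehand (for example, by an admin in Catalog Explorer, wrapping an AWS IAM role).

```sql
-- Pair a cloud storage URL with an existing storage credential (names are hypothetical).
CREATE EXTERNAL LOCATION IF NOT EXISTS finance_landing
  URL 's3://finance-landing-bucket/raw'
  WITH (STORAGE CREDENTIAL finance_cred);

-- Grant the privileges mentioned above on the external location.
GRANT READ FILES ON EXTERNAL LOCATION finance_landing TO `data_engineers`;
GRANT WRITE FILES ON EXTERNAL LOCATION finance_landing TO `data_engineers`;
GRANT CREATE EXTERNAL TABLE ON EXTERNAL LOCATION finance_landing TO `data_engineers`;
```

Users never touch the IAM role directly: Unity Catalog checks the grants on the external location and then uses the wrapped credential on their behalf.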

About the Databricks Platform Admin Exam

The Databricks Platform Administrator Accreditation is a free, online, unproctored 25-question assessment that validates the ability to administer Databricks accounts and workspaces across AWS, Azure, and GCP. It is an accreditation badge, not a full proctored certification, and covers account architecture, identity and access (SCIM, OAuth, PATs), Unity Catalog governance, networking (PrivateLink, customer-managed keys), cluster policies, and audit via system tables.

  • Questions: 25 scored questions
  • Time limit: Unlimited
  • Passing score: Not publicly published
  • Exam fee: Free (Databricks Academy)

Databricks Platform Admin Exam Content Outline

  • Account & Workspace Architecture (20%): account console, workspace creation, regions, control vs data plane, and cross-cloud (AWS/Azure/GCP) deployment patterns
  • Identity & Access Management (20%): SCIM provisioning from Okta/Entra ID, users, groups, service principals, OAuth, PATs, and IP access lists
  • Unity Catalog & Governance (25%): metastore, three-level namespace, external locations, storage credentials, lineage, fine-grained access, ABAC, Delta Sharing
  • Workspace Administration (15%): cluster policies, instance pools, init scripts, cluster access modes (shared vs single-user), and admin settings
  • Networking & Security (10%): PrivateLink/Private Endpoints, VPC/VNet, customer-managed keys, encryption at rest, and secrets management
  • Monitoring, Audit & Billing (10%): system.access.audit, system.billing.usage, system.query.history, usage dashboards, and cost monitoring

How to Pass the Databricks Platform Admin Exam

What You Need to Know

  • Passing score: Not publicly published
  • Exam length: 25 questions
  • Time limit: Unlimited
  • Exam fee: Free

Keys to Passing

  • Complete 500+ practice questions
  • Score 80%+ consistently before attempting the real exam
  • Focus on highest-weighted sections
  • Use our AI tutor for tough concepts

Databricks Platform Admin Study Tips from Top Performers

1. Memorize the Unity Catalog three-level namespace (catalog.schema.table) and how external locations + storage credentials grant access to cloud storage
2. Practice SCIM provisioning end-to-end: IdP (Okta/Entra ID) -> account-level SCIM -> workspace identity federation
3. Know cluster access modes: Shared (multi-user, Unity Catalog), Single User (assigned), No Isolation Shared (legacy), and which support Unity Catalog
4. Drill system tables: system.access.audit (admin actions), system.billing.usage (DBU spend), system.query.history (query analysis)
5. Understand cloud-specific networking: AWS PrivateLink + VPC, Azure Private Endpoints + VNet injection, GCP Private Service Connect, and front-end vs back-end PrivateLink
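The system tables in tip 4 can be drilled with queries like the following sketch. It assumes system tables are enabled for the workspace and the reader has SELECT on them; exact column sets are defined by the published system-table schemas, so verify them against your workspace before relying on these.

```sql
-- Recent admin-visible actions from the audit log (last 7 days):
SELECT event_time, user_identity.email, service_name, action_name
FROM system.access.audit
WHERE event_time >= current_date() - INTERVAL 7 DAYS
ORDER BY event_time DESC;

-- DBU consumption by SKU over the last 30 days:
SELECT sku_name, SUM(usage_quantity) AS dbus
FROM system.billing.usage
WHERE usage_date >= current_date() - INTERVAL 30 DAYS
GROUP BY sku_name
ORDER BY dbus DESC;
```

Queries in this shape (filter on the time column, aggregate by SKU or actor) cover most of the audit- and billing-style questions the exam outline describes.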

Frequently Asked Questions

Is the Databricks Platform Administrator a certification or accreditation?

It is an accreditation, which earns a digital badge. It is not the same as a full proctored Databricks certification (e.g., Data Engineer Associate). The assessment is online and unproctored.

How much does the Platform Administrator Accreditation cost?

It is free for Databricks customers and partners. You access it through the Databricks Academy with a free account.

Does the accreditation cover AWS, Azure, and GCP?

Yes. It is multi-cloud. Expect questions involving S3/IAM/KMS (AWS), ADLS/Entra ID/Private Endpoints (Azure), and GCS/Workload Identity Federation (GCP).

How long should I study?

Most candidates spend 20-40 hours, especially on Unity Catalog (metastore, external locations, fine-grained access), SCIM identity flows, PrivateLink, and system tables for audit and billing.

What admin features should I focus on most?

Account console vs workspace admin, Unity Catalog hierarchy and grants, SCIM provisioning, cluster policies and access modes, PrivateLink/Private Endpoints, customer-managed keys, and system.access.audit.