AZ-500 Objective 3.3: Plan and Implement Security for Azure SQL Database and SQL Managed Instance

 • 45 min read • Microsoft Azure Security Technologies


AZ-500 Exam Focus: This objective covers comprehensive database security including Microsoft Entra ID (Azure AD) authentication with Azure AD admin, contained database users, and managed identities eliminating SQL authentication, database auditing to Log Analytics and Storage with action groups tracking authentication and data access, dynamic data masking protecting PII with masking functions (default, email, partial, random), Transparent Data Encryption (TDE) with service-managed or customer-managed keys (BYOK) in Key Vault encrypting data at rest, and Always Encrypted with deterministic (equality queries) or randomized encryption (maximum protection) plus secure enclaves for rich queries. Understanding when to use each feature is essential.

Understanding Azure SQL Database Security

Azure SQL Database and SQL Managed Instance provide enterprise-grade relational database services that require comprehensive security protecting against unauthorized access, data breaches, and compliance violations. Database security challenges include authentication weaknesses where SQL logins with weak or reused passwords are vulnerable to brute force attacks, insufficient auditing that lacks visibility into who accessed what data and when, making incident investigation difficult, data exposure where sensitive information is visible to unauthorized users, including database administrators and application developers with excessive privileges, encryption gaps leaving data vulnerable when database files, backups, or transaction logs are stolen or improperly disposed of, and insider threats from high-privilege users (DBAs, cloud operators) accessing sensitive data beyond their business need. Traditional SQL Server security relied on network isolation, SQL authentication, and database roles, but lacked cloud-native identity integration, comprehensive auditing, dynamic data protection, and client-side encryption protecting data from database administrators.

Azure SQL Database security provides defense-in-depth through multiple layers: Identity and authentication using Microsoft Entra ID (Azure AD) providing centralized identity management, MFA support, Conditional Access enforcement, and managed identities for applications eliminating credential storage. Auditing and threat detection tracking all database activities, logging authentication attempts and data access, detecting anomalous patterns, and integrating with Azure Sentinel for SIEM correlation. Data protection through dynamic data masking obfuscating sensitive columns for non-privileged users and row-level security restricting access to specific rows based on user context. Encryption at rest using Transparent Data Encryption (TDE) automatically encrypting database files, logs, and backups with service-managed or customer-managed keys. Encryption in use with Always Encrypted providing client-side encryption where the database never sees plaintext, protecting sensitive data from DBAs and cloud operators (TLS protects data in transit). Network security through firewall rules, Virtual Network integration, and Private Endpoints eliminating public internet exposure. This objective explores Azure SQL Database security features enabling comprehensive protection of data through strong authentication, auditing, intelligent data masking, and encryption at rest, in use, and in transit, meeting security and compliance requirements for regulated industries while maintaining operational efficiency and application compatibility.

Microsoft Entra ID Database Authentication

Azure AD Authentication Benefits and Architecture

Microsoft Entra ID authentication (formerly Azure Active Directory) provides modern identity management for Azure SQL Database and SQL Managed Instance replacing traditional SQL authentication username/password with centralized cloud identity. SQL authentication challenges: Passwords stored in database vulnerable to brute force attacks if weak passwords used, no central password policy enforcement (users create own passwords), no Multi-Factor Authentication (MFA) support increasing credential theft risk, credential management burden requiring password resets and rotation, limited audit trail (database logs show SQL login but not actual user identity), credentials embedded in application configuration or connection strings creating security risks. Azure AD authentication benefits: Centralized identity management in Azure AD (single identity across Azure services), organizational password policies enforced (complexity, expiration, history), MFA support protecting high-privilege accounts, Conditional Access integration enforcing device compliance, location-based access, risk-based policies, comprehensive audit trail with actual user identities in logs, managed identities for Azure services eliminating credentials from code, Azure AD groups simplifying permission management, seamless single sign-on for users, Kerberos support for integrated authentication, token-based authentication with automatic token refresh.

Authentication flow: User or application authenticates to Azure AD presenting credentials (password, MFA, certificate), Azure AD validates credentials against policies (MFA, Conditional Access, risk detection), Azure AD issues OAuth 2.0 access token containing user identity and claims, client presents token to SQL Database during connection, SQL Database validates token signature and issuer with Azure AD, SQL Database checks database permissions for authenticated Azure AD principal, connection established with user context (audit logs show Azure AD user identity). Token expiration: Access tokens valid for limited time (typically 1 hour), client libraries automatically refresh tokens, applications require periodic re-authentication for MFA when token expires. Azure AD authentication modes: Azure AD password authentication—username and password to Azure AD, suitable for non-interactive scenarios, doesn't support MFA. Azure AD integrated authentication—single sign-on using Windows credentials, requires domain-joined machine or Azure AD joined device, supports Kerberos. Azure AD interactive authentication (Universal with MFA)—prompts for credentials with MFA support, interactive browser-based flow, recommended for user connections requiring MFA. Azure AD service principal—application identity with certificate or secret, suitable for applications and automation, certificates recommended over secrets. Azure AD managed identity—system-assigned or user-assigned identity for Azure services, eliminates credential management, automatic token acquisition, recommended for applications in Azure.
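The token exchange described above can be observed directly from the command line. A minimal sketch, assuming you are already signed in with the Azure CLI: it requests an OAuth 2.0 access token for the Azure SQL resource URI, which is what a client driver presents when opening the connection.

# Request an access token scoped to Azure SQL (this is the token the client
# presents to the database; it expires after roughly an hour and is refreshed
# automatically by client libraries).
az account get-access-token \
  --resource https://database.windows.net/ \
  --query "{token: accessToken, expiresOn: expiresOn}" \
  --output json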

Configuring Azure AD Authentication

Set Azure AD admin: Portal—SQL server → Azure Active Directory → Set admin → Search and select Azure AD user or group (group recommended for availability) → Save. PowerShell: Set-AzSqlServerActiveDirectoryAdministrator -ResourceGroupName [RG] -ServerName [server] -DisplayName [admin-name]. CLI: az sql server ad-admin create --resource-group [RG] --server-name [server] --display-name [admin-name] --object-id [object-id]. Azure AD admin permissions: Full administrative access to server and all databases equivalent to server-level admin, can create Azure AD contained database users, can grant permissions to other users, should be Azure AD security group not individual user (group provides continuity when individuals change roles). Create contained database users: Connect to database as Azure AD admin using SSMS with Azure AD authentication, create user from Azure AD identity: CREATE USER [user@domain.com] FROM EXTERNAL PROVIDER (creates contained user for Azure AD user account), CREATE USER [GroupName] FROM EXTERNAL PROVIDER (creates contained user for Azure AD group, group members inherit permissions), CREATE USER [AppService-MyApp] FROM EXTERNAL PROVIDER (creates user for managed identity or service principal, use exact resource name or application name).
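As a hedged sketch of the steps above, the CLI commands below set an Azure AD security group as the server's Azure AD admin; the group name, resource group, and server name are placeholders.

# Resolve the object ID of the admin group (newer CLI versions return "id";
# older Graph output used "objectId").
ADMIN_OBJECT_ID=$(az ad group show --group "SQL-Admins" --query id --output tsv)

# Set the group as Azure AD admin for the logical SQL server.
az sql server ad-admin create \
  --resource-group myResourceGroup \
  --server-name myserver \
  --display-name "SQL-Admins" \
  --object-id "$ADMIN_OBJECT_ID"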

Grant permissions: Database roles: ALTER ROLE db_datareader ADD MEMBER [user@domain.com] (read all tables), ALTER ROLE db_datawriter ADD MEMBER [user@domain.com] (write all tables), ALTER ROLE db_owner ADD MEMBER [GroupName] (full database access). Object-level permissions: GRANT SELECT ON dbo.Customers TO [user@domain.com], GRANT INSERT, UPDATE ON dbo.Orders TO [AppGroup], GRANT EXECUTE ON dbo.uspGetCustomer TO [ServicePrincipal]. Best practice: Use Azure AD groups for role-based access, assign permissions to groups, manage access through Azure AD group membership not individual database user grants. Example: Create Azure AD security groups (SQL-Readers, SQL-Writers, SQL-Admins), create contained database users for each group, assign appropriate roles to groups, add/remove users in Azure AD (changes reflected immediately in database permissions). Managed identities: Enable managed identity on Azure service (App Service, Function App, VM, Logic App, AKS), create contained database user for managed identity using exact resource name, grant minimum required permissions: CREATE USER [myappservice] FROM EXTERNAL PROVIDER; GRANT SELECT, INSERT, UPDATE ON dbo.Orders TO [myappservice]. Application connection string: Server=myserver.database.windows.net;Database=mydb;Authentication=Active Directory Default (tries managed identity then other Azure AD methods), code uses Azure.Identity library: var credential = new DefaultAzureCredential(); var connection = new SqlConnection(connectionString); connection.AccessToken = credential.GetToken(new TokenRequestContext(new[] { "https://database.windows.net/.default" })).Token;
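For the managed identity path, a minimal sketch (resource names are placeholders): enable a system-assigned identity on the App Service, then create the contained user with the app's exact name as shown in the T-SQL above.

# Enable a system-assigned managed identity on the App Service; the returned
# principalId is the identity that Azure AD authenticates.
az webapp identity assign \
  --resource-group myResourceGroup \
  --name myappservice

# Then, connected to the database as the Azure AD admin, run:
#   CREATE USER [myappservice] FROM EXTERNAL PROVIDER;
#   GRANT SELECT, INSERT, UPDATE ON dbo.Orders TO [myappservice];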

Service principals: Create app registration in Azure AD, create client secret or certificate (certificate recommended), create contained database user: CREATE USER [MyAppRegistration] FROM EXTERNAL PROVIDER; GRANT SELECT ON dbo.Customers TO [MyAppRegistration]. Application authenticates using service principal credentials: Authentication=Active Directory Service Principal;UID=[client-id];PWD=[client-secret]. Connecting with Azure AD: SQL Server Management Studio (SSMS): Server name → server.database.windows.net, Authentication → Azure Active Directory - Universal with MFA (recommended for interactive users), Azure Active Directory - Password (no MFA), Azure Active Directory - Integrated (SSO), Enter Azure AD username → Connect. sqlcmd: sqlcmd -S server.database.windows.net -d database -G -U user@domain.com (-G enables Azure AD authentication, prompts for password/MFA). Azure Data Studio: New Connection → Authentication type → Azure Active Directory - Universal with MFA support → Enter Azure AD username. Application code: Connection string authentication options: Active Directory Interactive (prompts browser), Active Directory Integrated (SSO), Active Directory Password (username/password), Active Directory Service Principal (client ID and secret), Active Directory Managed Identity, Active Directory Default (tries multiple methods automatically). Best practices: Set Azure AD admin as security group not individual user, disable SQL authentication when all applications support Azure AD (set Azure AD authentication only in server settings), use managed identities for all Azure services accessing database, implement MFA for all human administrators and developers, use service principals with certificates for external applications, enable Conditional Access requiring compliant devices for database access, regularly audit Azure AD database users and permissions, remove unused contained users, document user-to-permission mappings, test Azure AD authentication thoroughly before disabling SQL authentication, maintain emergency SQL admin account for Azure AD service outages.
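To enforce the "disable SQL authentication" recommendation, the CLI exposes an Azure AD-only authentication setting. The sketch below assumes a recent CLI version that includes the ad-only-auth command group; resource names are placeholders.

# Enforce Microsoft Entra-only authentication (SQL logins stop working) and verify it;
# run 'disable' instead to fall back to mixed authentication if needed.
az sql server ad-only-auth enable --resource-group myResourceGroup --name myserver
az sql server ad-only-auth get --resource-group myResourceGroup --name myserver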

Database Auditing

Audit Configuration and Destinations

Azure SQL Database auditing tracks database events, writing audit logs that enable security monitoring, compliance reporting, threat detection, and forensic analysis. Audit captures: Database operations (SELECT, INSERT, UPDATE, DELETE on tables, EXECUTE for procedures), authentication events (successful and failed logins, connection attempts), schema changes (CREATE, ALTER, DROP objects—tables, views, procedures), permission changes (GRANT, REVOKE permissions, role membership changes), configuration changes (database settings, security configuration), database principal changes (user creation, deletion, modification), batch operations (query batches, stored procedure execution). Audit destinations: Log Analytics workspace—structured logs queryable using Kusto Query Language (KQL), real-time analysis with dashboards, alerting on suspicious patterns, integration with Azure Monitor and Sentinel, retention configurable (30-730 days), recommended for active monitoring and threat detection. Storage account—cost-effective long-term retention, blob storage in sqldbauditlogs container, retention configurable up to 3,285 days (roughly nine years), export for compliance audits, supports immutable storage for tamper-proof audit trail, recommended for compliance archival. Event Hub—stream audit logs in real time, integrate with third-party SIEM systems (Splunk, QRadar), custom processing with Azure Functions or Stream Analytics, recommended for external SIEM or real-time processing. Can configure multiple destinations simultaneously (common: Log Analytics for analysis, Storage for compliance).

Server-level auditing: Configure once applies to all databases on logical SQL server, captures server-level events (login attempts, server configuration changes), simpler management than per-database configuration, recommended approach for most scenarios. Configure: Portal—SQL server → Auditing → Enable Azure SQL Auditing → Select destination: Storage account (choose account, retention days 0-3285), Log Analytics (select workspace), Event Hub (select namespace and hub) → Configure audit log destination → Save. CLI: az sql server audit-policy update --resource-group [RG] --server [server] --state Enabled --log-analytics-workspace-id [workspace-id]. PowerShell: Set-AzSqlServerAudit -ResourceGroupName [RG] -ServerName [server] -LogAnalyticsWorkspaceResourceId [workspace-id]. Database-level auditing: Configure per database overriding or supplementing server-level audit, use when specific database has different compliance requirements, captures database-specific events plus inherited server events. Configure: Database → Auditing → Enable auditing → Select destination → Save. Priority: Database-level settings override server-level for that database, other databases use server-level settings. Audit policy configuration: Action groups define categories of events to audit, select appropriate groups based on compliance and security requirements. Common action groups: BATCH_COMPLETED_GROUP (all query batches—most comprehensive, high volume), SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP (successful logins), FAILED_DATABASE_AUTHENTICATION_GROUP (failed logins—critical for detecting brute force), DATABASE_OBJECT_CHANGE_GROUP (DDL changes—CREATE, ALTER, DROP), SCHEMA_OBJECT_CHANGE_GROUP (schema modifications), DATABASE_PRINCIPAL_CHANGE_GROUP (user and role changes), DATABASE_ROLE_MEMBER_CHANGE_GROUP (role membership changes), DATABASE_PERMISSION_CHANGE_GROUP (GRANT, REVOKE), APPLICATION_ROLE_CHANGE_PASSWORD_GROUP (password changes). Recommended baseline: Failed authentications, database object changes, principal changes, permission changes for security monitoring.
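The sketch below enables server-level auditing to a Log Analytics workspace with the baseline action groups recommended above. It is an illustration only—flag names reflect current az sql CLI behavior and should be verified, and all resource names are placeholders.

# Look up the workspace resource ID, then enable auditing with a reduced set
# of action groups (authentication events plus permission and object changes).
WORKSPACE_ID=$(az monitor log-analytics workspace show \
  --resource-group myResourceGroup --workspace-name myWorkspace --query id -o tsv)

az sql server audit-policy update \
  --resource-group myResourceGroup \
  --name myserver \
  --state Enabled \
  --log-analytics-target-state Enabled \
  --log-analytics-workspace-resource-id "$WORKSPACE_ID" \
  --actions FAILED_DATABASE_AUTHENTICATION_GROUP \
            SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP \
            DATABASE_PERMISSION_CHANGE_GROUP \
            DATABASE_OBJECT_CHANGE_GROUP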

Querying and Analyzing Audit Logs

Log Analytics queries: Portal → Log Analytics workspace → Logs → New Query. Basic audit query:

AzureDiagnostics
| where ResourceType == "SERVERS/DATABASES"
| where Category == "SQLSecurityAuditEvents"
| project TimeGenerated, server_principal_name_s, action_name_s, statement_s, database_name_s

The following examples extend the same AzureDiagnostics base filter. Failed login attempts (detects brute force):

| where event_type_s == "FAILED_DATABASE_AUTHENTICATION_GROUP"
| summarize FailedAttempts=count() by server_principal_name_s, bin(TimeGenerated, 1h)
| where FailedAttempts > 5

Permission changes:

| where action_name_s contains "GRANT" or action_name_s contains "REVOKE"
| project TimeGenerated, server_principal_name_s, action_name_s, affected_rows_s

Data access by a specific user:

| where server_principal_name_s == "user@domain.com"
| where action_name_s contains "SELECT"
| project TimeGenerated, statement_s, database_name_s

After-hours access:

| where hourofday(TimeGenerated) < 6 or hourofday(TimeGenerated) > 18
| where action_name_s contains "SELECT" or action_name_s contains "UPDATE"

Schema changes:

| where action_name_s contains "CREATE" or action_name_s contains "ALTER" or action_name_s contains "DROP"
| project TimeGenerated, server_principal_name_s, action_name_s, statement_s

Storage account logs: Navigate to Storage account → Containers → sqldbauditlogs, folder structure: server/database/year/month/day, files in JSON format with audit events, download for analysis or compliance documentation, use Storage Explorer or Azure portal for browsing. Immutable storage: Configure immutable storage on audit container preventing deletion or modification, time-based retention (retain for X days, months, years) or legal hold, meets compliance requiring tamper-proof audit logs (SEC 17a-4, FINRA, CFTC), configure: Container → Access policy → Add policy → Time-based retention → Set days → Lock policy. Audit log fields: TimeGenerated (event timestamp), server_principal_name_s (user who performed action), action_name_s (action type—SELECT, INSERT, UPDATE, etc.), statement_s (SQL statement executed), database_name_s (database name), schema_name_s (schema), object_name_s (table/view/procedure), affected_rows_s (rows affected), client_ip_s (source IP), application_name_s (application name from connection string). Alerting: Create alert rules based on audit log queries, alert on suspicious activities (multiple failed logins, permission changes, after-hours access, unusual query patterns, mass data export). Configure: Log Analytics workspace → Alerts → New alert rule → Condition: Custom log search → Query: failed login query → Threshold: > 5 in 1 hour → Action group: Email security team → Create. Best practices: Enable server-level auditing covering all databases, configure both Log Analytics (analysis) and Storage (archival), select action groups including authentication, schema changes, permissions, failed logins, query audit logs regularly for anomalies, implement automated alerts on critical events, integrate with Azure Sentinel for advanced threat detection, retain audit logs for compliance-required duration (financial: 7 years, healthcare: 6 years, general: 1-3 years), enable immutable storage for audit logs in regulated environments, document audit configuration and retention policies, test audit log collection and querying, regularly review access to audit logs (should be limited), export audit logs periodically for compliance reporting, maintain audit logs separately from operational databases.
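For the immutable storage option, a hedged sketch (storage account name and retention period are placeholders; verify the exact parameters for your CLI version, and only lock the policy after validating it, since locked policies cannot be shortened):

# Apply a time-based retention policy to the audit container so audit blobs
# cannot be modified or deleted during the retention period.
az storage container immutability-policy create \
  --resource-group myResourceGroup \
  --account-name myauditstorage \
  --container-name sqldbauditlogs \
  --period 365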

Dynamic Data Masking

Masking Functions and Configuration

Dynamic data masking (DDM) protects sensitive data by obfuscating it in query results for non-privileged users while the original data remains unchanged in the database. Masking is applied in real time during query execution, privileged users (with UNMASK permission) see actual data, non-privileged users see masked data, actual data in database never modified. Use cases: Protect personally identifiable information (PII) from application users without business need for full data (customer service viewing partial credit card or SSN), demonstrate production data structures without exposing sensitive values (testing, training, demos), comply with privacy regulations limiting data exposure (GDPR right to limit processing, CCPA, PCI DSS restricted access to cardholder data), reduce risk of accidental data disclosure in reports and screenshots, provide tiered data access (managers see more than general staff). Masking functions: Default masking—masks entire field based on data type: String data → XXXX (four X characters, or fewer if the column is shorter), Numeric data → 0, Date/time data → 1900-01-01 00:00:00.000, Binary data → single byte 0. Use for complete obfuscation. Email masking—exposes the first character of the address and masks the remainder, replacing the domain with a constant suffix: john.smith@example.com → jXXX@XXXX.com; maintains email format while protecting the address. Custom string (partial) masking—shows prefix and suffix characters, masks middle with padding: Format: partial(prefix-length, "padding", suffix-length), examples: Social Security Number 123-45-6789 → XXX-XX-6789: partial(0, "XXX-XX-", 4), Credit card 1234-5678-9012-3456 → XXXX-XXXX-XXXX-3456: partial(0, "XXXX-XXXX-XXXX-", 4), Phone (555) 123-4567 → (XXX) XXX-4567: partial(0, "(XXX) XXX-", 4). Random masking—numeric data replaced with random value in specified range: Format: random(lower-bound, upper-bound), example: Salary 75000 → random number between 10000-100000: random(10000, 100000), useful for testing with realistic values, preserves numeric data type for calculations.

Configure masking with T-SQL: Create table with masking: CREATE TABLE Customers (
  CustomerID INT PRIMARY KEY,
  Name NVARCHAR(100),
  Email NVARCHAR(100) MASKED WITH (FUNCTION = 'email()'),
  Phone NVARCHAR(20) MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)'),
  SSN NVARCHAR(11) MASKED WITH (FUNCTION = 'partial(0,"XXX-XX-",4)'),
  CreditCard NVARCHAR(19) MASKED WITH (FUNCTION = 'partial(0,"XXXX-XXXX-XXXX-",4)'),
  Salary DECIMAL(10,2) MASKED WITH (FUNCTION = 'random(10000,100000)')
)
. Add masking to existing column: ALTER TABLE Customers
ALTER COLUMN SSN ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XX-",4)')
. Remove masking: ALTER TABLE Customers
ALTER COLUMN SSN DROP MASKED
. View masked columns: SELECT c.name, m.masking_function
FROM sys.masked_columns AS m
JOIN sys.columns AS c ON m.column_id = c.column_id AND m.object_id = c.object_id
WHERE m.object_id = OBJECT_ID('dbo.Customers')
. Portal configuration: Database → Dynamic Data Masking → Shows recommended masking rules based on data classification, Add mask → Select schema, table, column → Choose masking function (Default, Email, Custom text, Number, DateTime) → For custom: Specify prefix exposed length, padding string, suffix exposed length → Save. Masking recommendations: Azure automatically suggests columns for masking based on sensitive data discovery and classification, columns with sensitivity labels (Confidential, PII, Financial) recommended for masking, review recommendations applying appropriate masking functions, recommendations update as new sensitive columns added.

Managing Masking Permissions

UNMASK permission: Users with UNMASK see actual unmasked data regardless of masking rules, grant selectively following least privilege principle. Default UNMASK holders: Server administrator (admin account used at server creation), Azure AD administrator (set in Azure Active Directory settings), Database owner (members of db_owner role). Grant UNMASK: GRANT UNMASK TO [user@domain.com] (specific user), GRANT UNMASK TO [ComplianceTeam] (Azure AD group), GRANT UNMASK TO [ManagerRole] (database role). Revoke UNMASK: REVOKE UNMASK FROM [user@domain.com]. View UNMASK grants: SELECT dp.name AS Principal, dp.type_desc
FROM sys.database_permissions AS p
JOIN sys.database_principals AS dp ON p.grantee_principal_id = dp.principal_id
WHERE p.permission_name = 'UNMASK'
. Best practice: Grant UNMASK only to roles requiring full data access (compliance officers reviewing data quality, security administrators investigating incidents, senior management with business need, data analysts performing specific analysis requiring actual values), most users including application users, customer service, junior staff should not have UNMASK. Testing masking: Connect as non-privileged user (without UNMASK permission): SELECT CustomerID, Name, Email, Phone, SSN, CreditCard, Salary FROM Customers, results show masked values: Email: jXXX@XXXX.com, Phone: XXX-XXX-4567, SSN: XXX-XX-6789, CreditCard: XXXX-XXXX-XXXX-3456, Salary: random value between 10000-100000. Connect as privileged user (with UNMASK or admin): Same query shows actual unmasked values. Verify using different connection accounts.

Limitations and security considerations: Masking is not encryption—data stored as plaintext in database, not suitable as primary security control, determined users can infer actual values through repeated queries or statistical analysis. Doesn't prevent privileged access—users with UNMASK or admin permissions see actual data, DBAs have access to unmasked data. Query limitations—masking applied to SELECT results only, doesn't mask data in WHERE clause used for filtering, doesn't mask aggregate function results (AVG, SUM on masked columns show actual aggregated value). Export considerations—data exported by privileged users contains unmasked values, backups contain plaintext data, import/export utilities may bypass masking. Use with complementary security: Row-level security (RLS)—restricts which rows users can access, combine with masking controlling both row access and column visibility. Azure AD authentication—identifies users enabling proper masking enforcement. Auditing—tracks who accessed masked columns and whether they had UNMASK permission. Always Encrypted—provides true encryption protecting data from all database users including administrators, use for highest sensitivity data. Best practices: Implement masking during initial database design before production data loaded, use data discovery and classification identifying sensitive columns to mask, apply appropriate masking function per data type and use case, test masking thoroughly with non-privileged accounts verifying data properly obfuscated, grant UNMASK permission sparingly following least privilege, document masking rules and UNMASK grants for compliance, regularly review masking configuration ensuring new sensitive columns masked, combine with Azure AD authentication, auditing, row-level security for defense-in-depth, educate developers about masking limitations and proper UNMASK usage, monitor audit logs for UNMASK permission grants and usage, implement application-level controls preventing mass data exports by non-privileged users, conduct regular security assessments validating masking effectiveness.

Transparent Data Encryption (TDE)

TDE Architecture and Configuration

Transparent Data Encryption (TDE) encrypts Azure SQL Database data files, transaction logs, and backups at rest protecting against unauthorized access to offline database files. TDE performs real-time encryption as data written to disk and decryption when read from disk, transparent to applications (no code changes required), protects data at physical storage layer. TDE enabled by default: New Azure SQL databases automatically have TDE enabled with service-managed keys, provides immediate protection without configuration, Microsoft manages encryption keys storing in Microsoft key store, automatic key rotation by Microsoft, sufficient for most scenarios meeting compliance requirements (PCI DSS, HIPAA, ISO 27001), no additional cost or performance impact beyond minimal inherent encryption overhead (typically 1-3%). Encryption scope: Database files (MDF, NDF), transaction log files (LDF), backup files (BAK), tempdb database (encrypted when user database has TDE), doesn't encrypt filestream data, doesn't encrypt communication channel (use SSL/TLS separately).

TDE key hierarchy: Database Encryption Key (DEK)—symmetric AES-256 key used for actual data encryption/decryption, unique per database, stored in database boot record, protected by TDE protector. TDE Protector—asymmetric key (RSA 2048, 3072, or 4096 bits) encrypting DEK, stored externally to database (Microsoft key store for service-managed, Azure Key Vault for customer-managed), one protector per server protects DEKs for all databases on that server. Service-managed TDE: Microsoft manages TDE protector in Microsoft key store, automatic key rotation periodically, no customer management required, simplest configuration, suitable when compliance doesn't require customer key control. Customer-managed TDE (Bring Your Own Key - BYOK): Customer controls TDE protector stored in Azure Key Vault, full key lifecycle control (creation, rotation, disabling, deletion), meets compliance requiring customer-controlled keys (government, financial services, healthcare), supports audit trail of key operations in Key Vault logs, enables immediate database access revocation by disabling key. Prerequisites for customer-managed TDE: Azure Key Vault with soft delete enabled (7-90 day recovery), purge protection enabled (prevents permanent deletion during retention), RSA or RSA-HSM key 2048+ bits (3072 recommended), SQL server or managed instance managed identity (system-assigned or user-assigned), managed identity with Key Vault permissions (Get key, Wrap Key, Unwrap Key or Key Vault Crypto Service Encryption User role).

Configure customer-managed TDE: Create or select Key Vault with required features: Soft delete enabled, Purge protection enabled. Create RSA key: Key Vault → Keys → Generate/Import → Key type: RSA, RSA key size: 3072 (recommended) or 4096 → Name key → Create. Enable managed identity: SQL server → Identity → System-assigned: On → Save (note object ID), or User-assigned → Add → Select managed identity. Grant Key Vault permissions: Key Vault → Access control (IAM) → Add role assignment → Key Vault Crypto Service Encryption User → Assign to SQL server managed identity → Save. Alternatively, Key Vault → Access policies → Add access policy → Key permissions: Get, Wrap Key, Unwrap Key → Select principal: SQL server managed identity → Add → Save. Configure TDE: SQL server → Transparent data encryption → Customer Managed Key → Key selection method: Select a key → Key Vault: Select vault → Key: Select key → Select (optionally specify key version or use latest) → Save. Verification: Database → Transparent data encryption → Shows encryption enabled with customer-managed key. Query TDE status: SELECT DB_NAME(database_id) AS DatabaseName,
encryption_state,
encryption_state_desc,
percent_complete,
key_algorithm,
key_length
FROM sys.dm_database_encryption_keys
. Encryption states: 0=No database encryption key present, 1=Unencrypted, 2=Encryption in progress, 3=Encrypted, 4=Key change in progress, 5=Decryption in progress, 6=Protection change in progress. View TDE protector: check the encryptor_type column in sys.dm_database_encryption_keys (CERTIFICATE indicates a service-managed protector, ASYMMETRIC KEY indicates customer-managed) or the SQL server's Transparent data encryption settings in the portal.
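The portal steps above can also be scripted. A hedged end-to-end sketch for customer-managed TDE—resource names are placeholders and flags should be checked against your CLI version:

# 1. Key Vault with purge protection (soft delete is enabled by default) and an RSA 3072 key.
az keyvault create --resource-group myResourceGroup --name myTdeVault \
  --location eastus --enable-purge-protection true
az keyvault key create --vault-name myTdeVault --name tde-protector --kty RSA --size 3072

# 2. System-assigned managed identity on the logical server, then grant it key access.
az sql server update --resource-group myResourceGroup --name myserver --assign-identity
SQL_IDENTITY=$(az sql server show --resource-group myResourceGroup --name myserver \
  --query identity.principalId -o tsv)
az keyvault set-policy --name myTdeVault --object-id "$SQL_IDENTITY" \
  --key-permissions get wrapKey unwrapKey

# 3. Point the server's TDE protector at the Key Vault key.
KEY_ID=$(az keyvault key show --vault-name myTdeVault --name tde-protector \
  --query key.kid -o tsv)
az sql server tde-key set --resource-group myResourceGroup --server myserver \
  --server-key-type AzureKeyVault --kid "$KEY_ID"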

Key Rotation and Management

Key rotation service-managed: Automatic rotation by Microsoft, no customer action required, rotation occurs periodically (Microsoft-determined schedule), transparent to databases and applications. Key rotation customer-managed: Manual rotation—create new key version in Key Vault (Keys → Key → New version → Generate), update SQL server TDE configuration: SQL server → Transparent data encryption → Customer Managed Key → Select key → Choose new version → Save, SQL Database re-encrypts DEK with new TDE protector (background operation, database remains online), previous key version retained for reading data encrypted with old version, track rotation in Key Vault audit logs. Automatic rotation (recommended)—enable auto-rotation policy in SQL server TDE settings, configure rotation schedule in Key Vault (rotation policy on key), SQL server automatically uses new key version when available, eliminates manual rotation and associated human error. Best practice: Rotate customer-managed keys annually minimum, more frequently for highly sensitive environments (quarterly or semi-annually). Key revocation: Disable or delete TDE protector key in Key Vault immediately revoking database access, database queries fail with encryption-related errors, database inaccessible until key restored or re-enabled, useful for emergency access control, security incidents (suspected key compromise), compliance scenarios (data must be inaccessible). Revoke access: Key Vault → Keys → TDE key → Disable → Database becomes inaccessible within minutes. Restore access: Enable key or restore from soft delete if deleted. Test revocation: Disable key, attempt database query (should fail), enable key, retry query (should succeed). Document revocation procedures for emergency response.
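A minimal sketch of manual rotation and emergency revocation with the CLI (vault, key, and server names reuse the placeholders from the previous sketch):

# Rotate: creating a key with the same name adds a new version; re-point TDE at it.
az keyvault key create --vault-name myTdeVault --name tde-protector --kty RSA --size 3072
NEW_KID=$(az keyvault key show --vault-name myTdeVault --name tde-protector \
  --query key.kid -o tsv)
az sql server tde-key set --resource-group myResourceGroup --server myserver \
  --server-key-type AzureKeyVault --kid "$NEW_KID"

# Revoke: disabling the protector makes dependent databases inaccessible within minutes;
# re-enable the key to restore access.
az keyvault key set-attributes --vault-name myTdeVault --name tde-protector --enabled false
az keyvault key set-attributes --vault-name myTdeVault --name tde-protector --enabled true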

Backup and restore: Backups automatically encrypted with same TDE protector as database at backup time, restore requires access to original TDE protector or key version used during backup, customer-managed key considerations: Maintain access to historical key versions in Key Vault for restoring old backups, document key-to-backup mappings for disaster recovery, test restore procedures ensuring key access, export keys for long-term backup retention (store securely separate from backups), soft delete and purge protection prevent accidental key loss. Cross-subscription restore: Requires TDE protector accessible from target subscription, customer-managed key restore may require granting target SQL server access to Key Vault. Monitoring: Key Vault diagnostic logs capture TDE operations (Get, Wrap Key, Unwrap Key), logs show which SQL servers accessed keys and when, SQL server metrics display encryption status, configure alerts: Key access failures, TDE encryption errors, Key Vault unavailability, unexpected key access patterns. Query logs: Log Analytics → AzureDiagnostics → Filter ResourceType: Key Vault, OperationName: VaultGet, Wrap, Unwrap → Analyze SQL server access to TDE keys. Best practices: Use service-managed TDE for most scenarios (automatic management, no operational overhead), implement customer-managed TDE when compliance requires customer key control (government, regulated industries), enable customer-managed TDE before database creation avoiding background re-encryption, use Key Vault with soft delete and purge protection preventing accidental key loss, implement automatic key rotation for customer-managed keys, maintain access to historical key versions for backup restoration, monitor TDE status and key operations with alerts on failures, test key revocation and restoration procedures, document TDE configuration and key management for compliance, combine TDE with Always Encrypted for maximum protection (TDE at rest, Always Encrypted in use), backup TDE protector keys separately from database backups, implement break-glass procedures for Key Vault access issues, regular disaster recovery drills including TDE key access scenarios.
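Key Vault diagnostic logging is what makes the key-access monitoring above possible; a sketch of wiring it to the same Log Analytics workspace (names reuse the placeholders from earlier sketches):

# Send Key Vault AuditEvent logs (including wrap/unwrap calls from the SQL server's
# identity) to Log Analytics for querying and alerting.
KV_ID=$(az keyvault show --name myTdeVault --query id -o tsv)
WORKSPACE_ID=$(az monitor log-analytics workspace show \
  --resource-group myResourceGroup --workspace-name myWorkspace --query id -o tsv)
az monitor diagnostic-settings create \
  --name kv-audit-to-law \
  --resource "$KV_ID" \
  --workspace "$WORKSPACE_ID" \
  --logs '[{"category":"AuditEvent","enabled":true}]'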

Always Encrypted

Always Encrypted Architecture and Use Cases

Always Encrypted provides client-side encryption protecting sensitive data from database administrators, cloud operators, and high-privilege users who manage database but shouldn't access sensitive data. Encryption occurs on client application, decryption occurs on client application, database stores ciphertext only, database engine never sees plaintext data, encryption keys never revealed to database. Comparison with other protections: TDE encrypts data at rest (disk files), database engine decrypts data during query processing, protects against physical theft, DBAs see plaintext data. Dynamic masking obfuscates data in results, data stored as plaintext in database, DBAs access actual data, masks only query output. Always Encrypted encrypts data end-to-end, plaintext only at client application, database stores ciphertext, DBAs see ciphertext, protects against compromised DBAs and cloud operators. Highest protection for sensitive columns. Use cases: Highly sensitive data requiring protection from database administrators who manage infrastructure but shouldn't see data (classified government information, financial PII, healthcare patient records, salary information), compliance mandating client-side encryption (regulations requiring data inaccessible to service providers), SaaS scenarios where provider manages database but tenants control encryption keys (tenant data sovereignty), protecting data from high-privilege insider threats (malicious DBAs, compromised administrator accounts), client applications requiring exclusive data access (encryption keys only at client, not shared with cloud provider).

Encryption types: Deterministic encryption—same plaintext always produces same ciphertext, enables equality comparisons in WHERE clauses (WHERE SSN = '123-45-6789'), supports DISTINCT, GROUP BY, joins on encrypted columns, some information leakage through pattern analysis (identical ciphertexts indicate identical plaintext values), suitable for columns needing search functionality (lookup fields, foreign keys); string columns using deterministic encryption require a BIN2 collation. Randomized encryption—same plaintext produces different ciphertext each time (probabilistic), maximum security preventing pattern analysis, no operations on encrypted data (can't use WHERE, JOIN, GROUP BY, ORDER BY on randomized columns), suitable for highest sensitivity data not requiring queries (medical diagnoses, personal notes, financial account details). Recommendation: Use deterministic for searchable sensitive data, randomized for maximum protection. Always Encrypted with secure enclaves: Advanced feature requiring SQL Server 2019+ or Azure SQL Database with enclave support, secure enclave (Intel SGX or virtualization-based security (VBS) enclaves) in database server, encryption keys disclosed to enclave not general database engine, enclave performs operations on plaintext inside the protected enclave boundary, enables richer queries on encrypted data: Range comparisons (WHERE Salary BETWEEN 50000 AND 100000), Pattern matching (WHERE Name LIKE 'John%'), Sorting (ORDER BY EncryptedColumn), more complex computations. Enclave provides balance between security and functionality.

Implementing Always Encrypted

Key hierarchy: Column Master Key (CMK)—asymmetric key (RSA 2048+ bits) encrypting Column Encryption Keys, stored in client-accessible key store (Azure Key Vault, Windows Certificate Store, Hardware Security Module), application must have access to CMK, database never accesses CMK, one CMK can protect multiple CEKs. Column Encryption Key (CEK)—symmetric AES-256 key encrypting column data, encrypted with CMK and stored in database metadata, multiple encrypted copies of CEK (one per CMK) enable key rotation, database stores encrypted CEK but can't decrypt without CMK. Encrypted column data—actual data encrypted with CEK, stored as ciphertext in database, only clients with CMK access can decrypt. Key storage options: Azure Key Vault (recommended for cloud applications)—centralized key management, access control through RBAC and access policies, comprehensive audit logging, supports managed identities and service principals, high availability and disaster recovery. Windows Certificate Store (for Windows applications)—stores CMK as certificate in CurrentUser or LocalMachine store, suitable for on-premises or Windows-only scenarios, requires certificate deployment to client machines. Hardware Security Module (HSM)—highest security for keys, FIPS 140-2 Level 2/3 compliance, suitable for highly regulated environments, requires HSM integration. Configuration process: Create Column Master Key in Azure Key Vault: Key Vault → Keys → Generate/Import → RSA key 2048+ bits → Create. Grant application access: Managed identity or service principal with Key Vault Crypto User role or Get, Unwrap Key, Sign permissions. Create CMK in database: Connect to database with SSMS or Azure Data Studio, Object Explorer → Database → Security → Always Encrypted Keys → Column Master Keys → New Column Master Key → Key store: Azure Key Vault, Key path: vault URI/keys/keyname/version → OK. Or T-SQL: CREATE COLUMN MASTER KEY CMK1
WITH (
  KEY_STORE_PROVIDER_NAME = 'AZURE_KEY_VAULT',
  KEY_PATH = 'https://myvault.vault.azure.net/keys/AlwaysEncryptedKey/abc123version'
)
. Create Column Encryption Key: New Column Encryption Key → Select CMK → Name CEK → OK. T-SQL: CREATE COLUMN ENCRYPTION KEY CEK1
WITH VALUES (
  COLUMN_MASTER_KEY = CMK1,
  ALGORITHM = 'RSA_OAEP',
  ENCRYPTED_VALUE = <binary value from encryption>
)
. Encrypt columns: New table: CREATE TABLE Employees (
  EmployeeID INT PRIMARY KEY,
  Name NVARCHAR(100),
  SSN NVARCHAR(11) ENCRYPTED WITH (
    COLUMN_ENCRYPTION_KEY = CEK1,
    ENCRYPTION_TYPE = DETERMINISTIC,
    ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
  ),
  Salary DECIMAL(10,2) ENCRYPTED WITH (
    COLUMN_ENCRYPTION_KEY = CEK1,
    ENCRYPTION_TYPE = RANDOMIZED,
    ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
  )
)
. Existing table: Use Always Encrypted wizard in SSMS (right-click table → Encrypt Columns) or PowerShell scripts for migration.
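Provisioning the CMK and granting the client application access can be scripted as well. A hedged sketch—vault, key, and app names are placeholders, and the exact key permissions required should be confirmed for your driver version:

# Create the column master key in Key Vault and grant the application's
# managed identity the key permissions used by Always Encrypted clients.
az keyvault key create --vault-name myAeVault --name AlwaysEncryptedCMK --kty RSA --size 2048
APP_IDENTITY=$(az webapp identity show --resource-group myResourceGroup \
  --name myappservice --query principalId -o tsv)
az keyvault set-policy --name myAeVault --object-id "$APP_IDENTITY" \
  --key-permissions get unwrapKey sign verify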

Client application configuration: Install required NuGet package: Microsoft.Data.SqlClient (4.0+, supports Always Encrypted). Enable Always Encrypted in connection string: Server=myserver.database.windows.net;Database=mydb;Column Encryption Setting=Enabled;Authentication=Active Directory Default. Application code unchanged: INSERT, UPDATE, SELECT work transparently with encrypted columns, client library handles encryption/decryption automatically. Key access: Application uses managed identity or service principal accessing Azure Key Vault, DefaultAzureCredential automatically authenticates: var credential = new DefaultAzureCredential();, connection opens with Column Encryption enabled, library retrieves CMK from Key Vault when needed, decrypts CEK, uses CEK for column encryption/decryption. Query limitations: Deterministic columns—support equality (=), DISTINCT, GROUP BY, JOIN, can create unique indexes. Randomized columns—no WHERE clause predicates, no indexing, no joins or aggregations, SELECT retrieves ciphertext decrypted on client. Neither supports: LIKE patterns, comparison operators (>, <, BETWEEN), mathematical operations, string functions on encrypted data. Plan schema considering query requirements: Use deterministic for lookup/foreign key columns requiring searches, use randomized for maximum protection without query needs, leave non-sensitive columns unencrypted for full query capabilities. Best practices: Encrypt only truly sensitive columns requiring protection from DBAs, use deterministic encryption for columns needing equality searches (IDs, lookup values), use randomized for highest sensitivity (medical data, financial PII), store CMK in Azure Key Vault with proper access controls, use managed identities for application Key Vault access, implement key rotation procedures (annually for CMK, as needed for CEK), document encrypted columns and key locations, test application thoroughly with Always Encrypted before production, educate developers on query limitations and schema design, combine with TDE and auditing for defense-in-depth, maintain key backups separately from database backups, implement break-glass procedures for key loss scenarios, monitor Key Vault access to CMK logging usage, consider Always Encrypted with secure enclaves for richer query support, regular security assessments validating encryption effectiveness.
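As an illustration of the connection string setting, the sketch below stores an Always Encrypted-enabled connection string on an App Service (server, database, and app names are placeholders). Note that when the CMK lives in Azure Key Vault, the application typically also needs the Microsoft.Data.SqlClient.AlwaysEncrypted.AzureKeyVaultProvider package registered as the column encryption key store provider.

# Store the connection string as an App Service connection string; Column Encryption
# Setting=Enabled turns on client-side encryption/decryption in the driver.
az webapp config connection-string set \
  --resource-group myResourceGroup \
  --name myappservice \
  --connection-string-type SQLAzure \
  --settings MyDb="Server=myserver.database.windows.net;Database=mydb;Authentication=Active Directory Default;Column Encryption Setting=Enabled"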

Exam Preparation Tips

Key Concepts to Master

  • Azure AD authentication: Set Azure AD admin (group recommended), create contained users (FROM EXTERNAL PROVIDER), managed identities, service principals, MFA with Universal authentication
  • Database auditing: Server-level (all databases) vs database-level, destinations (Log Analytics for analysis, Storage for archival, Event Hub for SIEM), action groups (authentication, schema changes, permissions)
  • Dynamic masking: Masking functions (default, email, partial, random), UNMASK permission, query limitations, combine with RLS and Always Encrypted
  • TDE: Enabled by default with service-managed keys, customer-managed keys (BYOK) in Key Vault require soft delete + purge protection, key rotation, backups encrypted with TDE protector
  • Always Encrypted: Client-side encryption, CMK in Key Vault, CEK in database, deterministic (equality queries) vs randomized (maximum protection), query limitations, secure enclaves for rich queries
  • Security layers: Azure AD auth (identity), Auditing (monitoring), Dynamic masking (obfuscation), TDE (at rest), Always Encrypted (end-to-end from DBAs)

Practice Questions

Sample AZ-500 Exam Questions:

  1. Question: What authentication method provides MFA support for Azure SQL Database?
    • A) SQL authentication
    • B) Azure AD authentication
    • C) Windows authentication
    • D) Certificate authentication

    Answer: B) Azure AD authentication - Supports MFA through Azure AD Universal with MFA authentication mode.

  2. Question: Which audit destination is recommended for real-time security monitoring?
    • A) Storage account
    • B) Event Hub
    • C) Log Analytics workspace
    • D) File share

    Answer: C) Log Analytics workspace - Provides structured queryable logs with real-time alerting capabilities.

  3. Question: Which dynamic masking function should you use for email addresses?
    • A) Default
    • B) Email
    • C) Partial
    • D) Random

    Answer: B) Email - Specifically designed for email masking (jXXX@XXXX.com format).

  4. Question: What is required for customer-managed TDE keys in Azure Key Vault?
    • A) Soft delete only
    • B) Purge protection only
    • C) Both soft delete and purge protection
    • D) Neither is required

    Answer: C) Both soft delete and purge protection - Required prerequisites for customer-managed TDE in Key Vault.

  5. Question: Which Always Encrypted encryption type allows equality comparisons in WHERE clauses?
    • A) Randomized
    • B) Deterministic
    • C) Probabilistic
    • D) Both A and B

    Answer: B) Deterministic - Allows equality comparisons, GROUP BY, DISTINCT; randomized doesn't support queries.

  6. Question: Where is the Column Master Key stored for Always Encrypted?
    • A) Database
    • B) Azure SQL server
    • C) Azure Key Vault or client key store
    • D) Application configuration

    Answer: C) Azure Key Vault or client key store - CMK stored outside database, accessible only to client applications.

  7. Question: What permission allows users to see unmasked data despite dynamic masking?
    • A) db_owner
    • B) CONTROL
    • C) UNMASK
    • D) VIEW DEFINITION

    Answer: C) UNMASK - Specific permission allowing users to view actual unmasked data.

  8. Question: Which encryption protects data from database administrators?
    • A) TDE with service-managed keys
    • B) TDE with customer-managed keys
    • C) Always Encrypted
    • D) Dynamic data masking

    Answer: C) Always Encrypted - Client-side encryption ensuring DBAs can't access plaintext; TDE and masking don't protect from DBAs.

AZ-500 Success Tip: Remember Azure AD authentication provides MFA and Conditional Access; create contained users FROM EXTERNAL PROVIDER, use managed identities. Auditing: server-level covers all databases, Log Analytics for analysis, Storage for archival, action groups include authentication and permission changes. Dynamic masking functions: email() for emails, partial() for credit cards/SSN with prefix/suffix, UNMASK permission to see actual data. TDE enabled by default with service-managed keys; customer-managed requires Key Vault with soft delete + purge protection. Always Encrypted: deterministic allows equality queries, randomized maximum protection, CMK in Key Vault, CEK in database, protects from DBAs. Combine all features for defense-in-depth.

Hands-On Practice Lab

Lab Objective

Implement comprehensive Azure SQL Database security including Azure AD authentication, auditing configuration, dynamic data masking, and TDE verification.

Lab Activities

Activity 1: Configure Azure AD Authentication

  • Create SQL server and database: SQL databases → Create → Configure server with SQL authentication admin → Create database
  • Set Azure AD admin: SQL server → Azure Active Directory → Set admin → Select your Azure AD user or security group → Save
  • Connect with Azure AD: SSMS → Server: yourserver.database.windows.net → Authentication: Azure Active Directory - Universal with MFA → Username: your Azure AD email → Connect
  • Create contained user: CREATE USER [user@domain.com] FROM EXTERNAL PROVIDER; ALTER ROLE db_datareader ADD MEMBER [user@domain.com];
  • Test permissions: Connect as contained user → Query tables → Verify read access works

Activity 2: Enable Database Auditing

  • Create Log Analytics workspace: Log Analytics workspaces → Create → Note workspace ID
  • Enable server-level auditing: SQL server → Auditing → Enable auditing → Destination: Log Analytics → Select workspace → Save
  • Configure action groups: Select FAILED_DATABASE_AUTHENTICATION_GROUP, SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP, BATCH_COMPLETED_GROUP → Save
  • Generate audit events: Connect to database → Run queries (SELECT, INSERT, UPDATE) → Attempt failed login with wrong password
  • Query audit logs: Log Analytics → Logs → AzureDiagnostics | where Category == "SQLSecurityAuditEvents" | project TimeGenerated, server_principal_name_s, action_name_s, statement_s

Activity 3: Configure Dynamic Data Masking

  • Create test table: CREATE TABLE Customers (CustomerID INT PRIMARY KEY, Name NVARCHAR(100), Email NVARCHAR(100), SSN NVARCHAR(11), CreditCard NVARCHAR(19));
  • Insert sample data: INSERT INTO Customers VALUES (1, 'John Smith', 'john.smith@example.com', '123-45-6789', '4111-1111-1111-1111');
  • Configure masking: Database → Dynamic Data Masking → Add mask → Table: Customers → Column: Email, Function: email() → Add mask → Column: SSN, Function: partial(0,"XXX-XX-",4) → Save
  • Test masking: Connect as regular user → SELECT * FROM Customers; → Verify Email shows jXXX@XXXX.com, SSN shows XXX-XX-6789
  • Grant UNMASK: Connect as admin → GRANT UNMASK TO [testuser@domain.com]; → Reconnect as test user → Query shows actual values

Activity 4: Verify TDE Configuration

  • Check TDE status: Database → Transparent data encryption → Verify enabled (default for new databases)
  • Query encryption status: SELECT DB_NAME(database_id) AS DatabaseName, encryption_state, encryption_state_desc FROM sys.dm_database_encryption_keys; → Encryption state should be 3 (Encrypted)
  • View TDE protector: SQL server → Transparent data encryption → Shows service-managed (default) or customer-managed if configured
  • (Optional) Configure customer-managed TDE: Create Key Vault → Create RSA key → Enable managed identity on SQL server → Grant Key Vault permissions → Configure TDE with customer key

Activity 5: Review and Document

  • Azure AD configuration: List Azure AD admin, contained users, test Azure AD authentication with MFA
  • Audit configuration: Verify server-level auditing enabled, confirm logs in Log Analytics, review action groups configured
  • Dynamic masking: List masked columns with functions, test masking with non-privileged user, verify UNMASK grants
  • TDE verification: Confirm encryption enabled, document key management approach (service vs customer-managed)
  • Security assessment: SQL server → Microsoft Defender for Cloud → Review recommendations → Address high-priority findings
  • Document architecture: Create security documentation covering authentication methods, audit configuration, masked columns, encryption status
  • Clean up: Delete database, SQL server, Log Analytics workspace, Key Vault (if created) to avoid charges

Lab Outcomes

After completing this lab, you'll have hands-on experience with Azure SQL Database security. You'll understand how Azure AD authentication provides identity-based access with MFA support, database auditing captures all activities for compliance and threat detection, dynamic data masking protects sensitive columns from non-privileged users, and TDE encrypts data at rest with service-managed or customer-managed keys. These practical skills demonstrate database security capabilities tested in AZ-500 exam and provide foundation for implementing comprehensive data protection in production Azure SQL environments.

Frequently Asked Questions

How do you enable Microsoft Entra ID authentication for Azure SQL Database?

Microsoft Entra ID (formerly Azure AD) authentication provides centralized identity management for Azure SQL Database and SQL Managed Instance eliminating SQL authentication passwords. Benefits: Centralized identity management in Azure AD, supports Multi-Factor Authentication (MFA) protecting against credential theft, integrates with Conditional Access enforcing device compliance and location-based access, provides audit trail with Azure AD user identities, enables managed identities for applications, supports Azure AD groups for simplified permission management. SQL authentication vs Azure AD: SQL authentication uses username/password stored in database, passwords can be weak or reused, no MFA support, limited audit trail, requires credential management in applications. Azure AD authentication uses Azure AD identities, enforces organizational password policies, supports MFA, comprehensive audit logs, managed identities eliminate credentials in code, integrated with enterprise identity systems. Configuration steps: Set Azure AD admin for SQL server—Portal: SQL server → Azure Active Directory → Set admin → Select Azure AD user or group → Save. CLI: az sql server ad-admin create --resource-group [RG] --server-name [server] --display-name [admin-name] --object-id [object-id]. PowerShell: Set-AzSqlServerActiveDirectoryAdministrator. Azure AD admin has full administrative access equivalent to server admin, can create Azure AD contained users, manage permissions. Create Azure AD contained database users: Connect to database as Azure AD admin using SQL Server Management Studio (SSMS) with Azure AD authentication, create contained users: CREATE USER [user@domain.com] FROM EXTERNAL PROVIDER (for Azure AD user), CREATE USER [GroupName] FROM EXTERNAL PROVIDER (for Azure AD group), grant permissions: ALTER ROLE db_datareader ADD MEMBER [user@domain.com], GRANT SELECT ON dbo.Customers TO [user@domain.com]. Contained users: Authenticated directly by database not server login, permissions granted at database level, user exists only in database, benefits: simplified user management, database portability between servers. Azure AD groups: Create Azure AD security groups organizing users by role (SQL-Readers, SQL-Writers, SQL-Admins), create contained user from group, assign permissions to group, manage access through Azure AD group membership. Example: CREATE USER [SQL-Developers] FROM EXTERNAL PROVIDER, ALTER ROLE db_owner ADD MEMBER [SQL-Developers]. Users inherit permissions from group membership. Managed identities: System-assigned or user-assigned identities for Azure services (App Service, Functions, VMs, AKS), connect to SQL Database without credentials, configure: Enable managed identity on Azure service, create contained user for managed identity: CREATE USER [app-service-name] FROM EXTERNAL PROVIDER, grant permissions: GRANT SELECT, INSERT, UPDATE ON dbo.Orders TO [app-service-name]. Application code uses managed identity: Connection string specifies Authentication=Active Directory Default or Active Directory Managed Identity, Azure.Identity library (DefaultAzureCredential) handles authentication automatically. Service principals: Azure AD app registrations for external applications or CI/CD pipelines, create contained user: CREATE USER [app-registration-name] FROM EXTERNAL PROVIDER, grant permissions, application authenticates using client ID and secret/certificate. 
Best practices: Set Azure AD admin immediately after SQL server creation, disable SQL authentication when all applications support Azure AD (enable Microsoft Entra-only authentication in server settings), use Azure AD groups for role-based access simplifying management, implement managed identities for Azure services, use service principals with certificates (not secrets) for external apps, enable MFA for Azure AD admin accounts, implement Conditional Access requiring compliant devices, regularly audit Azure AD users and permissions, document user-to-permission mappings for compliance, test Azure AD authentication before disabling SQL authentication. Connecting with Azure AD: SSMS: Connect → Authentication: Azure Active Directory - Universal with MFA → Enter Azure AD username, sqlcmd: sqlcmd -S server.database.windows.net -d database -G -U user@domain.com, application connection string: Server=server.database.windows.net;Database=dbname;Authentication=Active Directory Interactive (or Active Directory Integrated, Active Directory Password, Active Directory Service Principal, Active Directory Managed Identity, Active Directory Default). Troubleshooting: Azure AD admin not able to connect—verify firewall allows client IP, user doesn't have permissions—verify contained user exists and has appropriate role memberships, managed identity connection fails—verify managed identity enabled and contained user created for identity, MFA prompt not appearing—check authentication method (use Universal with MFA).


Written by Joe De Coppi - Last Updated November 14, 2025