DP-200 Azure Data Solution - Implementation

A few remarks about the course and training for Azure Data Solutions.

Free training materials (courseware) are available on the FutureProof website.

You need a trial account, an Azure Pass, or other access to Microsoft Azure. The best practice is to use a separate, purpose-built account for this training. For example, create a new Gmail account, create a new Microsoft account for that email address, and then redeem the Azure Pass code for the new account (or use a credit card to verify a trial account). A credit card used for trial-account verification must never have been connected to Azure before. A virtual (electronic) credit card can be used. For a trial account, the card is used only for verification and is never charged.

Install the Azure CLI. The Azure CLI is the preferred command-line tool for working with Azure. It is a multi-platform tool: you can run the Azure CLI on macOS, Linux, and Windows, and, if you try a little bit harder, even on BSD-like UNIX systems.
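As a minimal sketch of day-to-day Azure CLI usage (the resource group name and region below are placeholders, not part of the course material):

    # Sign in interactively; a browser window opens for authentication
    az login

    # Create a resource group to hold the training resources
    az group create --name rg-dp200-training --location westeurope

    # Delete the group when finished to stop consuming Azure Pass credit
    az group delete --name rg-dp200-training --yes --no-wait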

For a convenient PowerShell experience, install the latest version of PowerShell and the Az module. There was a big change in the Azure PowerShell modules at the beginning of 2019, so you can still find two different command sets (AzureRmSomething, AzSomething). Use the Az version whenever possible; it is the newer module with a bright future, and even reworking older AzureRm syntax to the new one will pay off in the future.
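A minimal sketch of getting the Az module running, assuming PowerShell 7 and access to the PowerShell Gallery:

    # Install the Az module for the current user from the PowerShell Gallery
    Install-Module -Name Az -Scope CurrentUser -Repository PSGallery

    # Sign in and list resource groups; note the Az noun prefix
    Connect-AzAccount
    Get-AzResourceGroup

    # The equivalent call in the older AzureRM module was:
    # Get-AzureRmResourceGroup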

Set up the training environment

  1. Get a dummy mail account (e.g. Yahoo), such as myname01@yahoo.com.
  2. Create a Microsoft account for this email address.
  3. Open the Azure Pass page and sign in with the Microsoft account.
  4. Redeem the code (a quick verification sketch follows below).
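Once the pass is redeemed, you can check from the command line that the subscription is active. This is a minimal sketch with the Azure CLI; the subscription name in the last command is only an example, so use whatever name the list command prints:

    # Sign in with the newly created Microsoft account
    az login

    # The Azure Pass subscription should be listed with state "Enabled"
    az account list --output table

    # If more than one subscription is listed, select the Azure Pass one
    # (the name below is an example; use the name printed above)
    az account set --subscription "Azure Pass - Sponsorship"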

Azure account links

Exam

  • Not the easiest one, but not so difficult if you compare it to AZ-300 or even to AZ-103
  • A broad range of technologies is covered in some depth
  • No virtual labs :-(
  • Databricks - spinning up a cluster, working with notebooks, connecting to data, basic libraries, Spark, Python
  • Storage - focus on Data Lake Storage Gen2 (but Gen1 as well), loading blobs, using PolyBase, mounting to Databricks, security (see the sketch after this list)
  • Cosmos DB - scalability, availability, partitioning, RU (request units)
  • Azure SQL RDBMS - more DW than SQL: PolyBase, ADF loads, partitioning (indexes, distributing tables to nodes), vCore, DTU, DWU
  • Stream Analytics - SU (streaming units), IoT Hub, Event Hubs / Event Grid, storage, Azure Functions
  • ADF - basic concepts, transformations, SSIS
  • Monitoring - Azure Log Analytics, alerts, actions
  • Security - how to secure data: TDE, RLS, CLS, RBAC, AAD, keys, credentials
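For the storage topic above, here is a minimal Azure CLI sketch of creating a Data Lake Storage Gen2 account and loading a blob. The account and container names are placeholders, and the resource group is assumed to be the one from the earlier sketch:

    # --hns true enables the hierarchical namespace, which makes the
    # StorageV2 account a Data Lake Storage Gen2 account
    az storage account create --name dp200datalake01 --resource-group rg-dp200-training --kind StorageV2 --sku Standard_LRS --hns true

    # Create a container and upload a local file as a blob
    # (--auth-mode login requires the Storage Blob Data Contributor role)
    az storage container create --name raw --account-name dp200datalake01 --auth-mode login
    az storage blob upload --account-name dp200datalake01 --container-name raw --name sample.csv --file ./sample.csv --auth-mode login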

Recommended preparation for the exam

  • Practice, practice, practice
  • This exam does have a lab: around ten tasks that involve configuring something in the Azure portal, with no code writing
  • There is a MeasureUp practice test for this exam

Usual schedule

  1. Azure for the Data Engineer
  2. Working with Data Storage
  3. Enabling Team Based Data Science with Azure Databricks
  4. Building Globally Distributed Databases with Cosmos DB
  5. Working with Relational Data Stores in the Cloud
  6. Performing Real-Time Analytics with Stream Analytics
  7. Orchestrating Data Movement with Azure Data Factory
  8. Securing Azure Data Platforms
  9. Monitoring and Troubleshooting Data Storage and Processing

Software links

Not all of the software is essential, but it is recommended for working with Azure.

Schedule for specific training sessions

17.09.2019 - Warszawa

  1. Azure for the Data Engineer
  2. Working with Data Storage
  3. Building Globally Distributed Databases with Cosmos DB
  4. Working with Relational Data Stores in the Cloud
  5. Securing Azure Data Platforms
  6. Orchestrating Data Movement with Azure Data Factory
  7. Enabling Team Based Data Science with Azure Databricks
  8. Performing Real-Time Analytics with Stream Analytics
  9. Monitoring and Troubleshooting Data Storage and Processing
