JN Training
Jeremy Nathan
Phone: 865-282-1895
Email: jnathan@jncomputertraining.com
Website: jncomputertraining.com/training



Microsoft Fabric End-to-End: From Architecture to Automation - 3 days (24 hours)

Day 1: Foundations & Data Architecture

Fabric Overview & Licensing

Fabric architecture, OneLake, and workloads

Compare SKUs (Trial, F2, F64+)

Fabric Admin Portal overview

Workspace types and licensing models

Hands-on: Create a Fabric workspace and enable features

OneLake & Lakehouse Setup

Create Lakehouses and tables

Understand file vs. table storage

Shortcuts: ADLS, S3, SharePoint

Enable caching for external data

Lab: Ingest data from Azure SQL with shortcuts (notebook sketch below)
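
A minimal PySpark sketch of the ingestion step, assuming the notebook is attached to a Lakehouse that already has a shortcut named sales_raw under Files/; every path, table, and column name here is a placeholder rather than part of the lab's actual dataset.

    # Read through the OneLake shortcut (the spark session is pre-created
    # in Fabric notebooks)
    df = (spark.read.format("csv")
          .option("header", "true")
          .option("inferSchema", "true")
          .load("Files/sales_raw/"))

    # Land the data as a managed Delta table in the Lakehouse Tables area
    df.write.format("delta").mode("overwrite").saveAsTable("sales_bronze")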

Mirroring for Real-Time Sync

What Mirroring is and which sources it supports (Azure SQL, Cosmos DB, Snowflake)

Requirements, region constraints, and setup

Compare with Copy activity

Hands-on: Mirror Azure SQL into Fabric

Data Pipelines and Copy Job

Create pipelines in Data Factory (Fabric)

Orchestration vs. data movement

Integrate with Notebooks and Dataflows

Lab: Schedule copy from SQL to Lakehouse table

Data Transformation and Delta Optimization

Data cleansing and shaping in dataflows or notebooks

Implement star schemas, bridge tables, SCD1/SCD2

Denormalization, null/missing handling, deduplication

Aggregate or de-aggregate datasets

Resolve pipeline/notebook/SQL performance issues

Optimize Delta files: V-Order, file compaction, partitioning (notebook sketch below)
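
A notebook sketch of the transformation and optimization steps above (deduplication, null handling, partitioned Delta output, file compaction); table, column, and partition names are illustrative, and the V-Order behavior noted in the comment depends on the workspace or session configuration.

    from pyspark.sql import functions as F

    df = spark.read.table("sales_bronze")

    # Deduplicate and handle nulls / missing values
    clean = (df.dropDuplicates(["order_id"])
               .fillna({"quantity": 0})
               .filter(F.col("customer_id").isNotNull()))

    # Write a partitioned Delta table for the cleaned layer
    (clean.write.format("delta")
          .mode("overwrite")
          .partitionBy("order_year")
          .saveAsTable("sales_silver"))

    # Compact small files; on Fabric runtimes V-Order is applied according to
    # the session/table setting rather than by this command alone
    spark.sql("OPTIMIZE sales_silver")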

Day 2: Modeling and Automation

Semantic Model Optimization

Direct Lake vs. Import vs. DirectQuery

Design best practices: Star schema, DAX, aggregations

Tabular Editor and DAX Studio usage

Hands-on: Optimize model for Direct Lake (notebook sketch below)
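
A hedged sketch of inspecting a model from a notebook with the semantic-link (sempy) package available in Fabric notebooks; the dataset, table, and measure names are placeholders, and the exact API surface may vary by runtime version.

    import sempy.fabric as fabric

    # List semantic models visible from the current workspace
    print(fabric.list_datasets())

    # Evaluate a DAX query against a model, e.g. to sanity-check a measure
    # while tuning it for Direct Lake
    result = fabric.evaluate_dax(
        dataset="Sales Model",
        dax_string='EVALUATE SUMMARIZECOLUMNS(Dim_Date[Year], "Total Sales", [Total Sales])',
    )
    display(result)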

SQL & KQL Querying

Query Lakehouse and Warehouse using SQL

Visual query editor and script pane

Overview of KQL for Eventhouse scenarios

Lab: Run queries against Lakehouse tables and views (sketch below)
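
A sketch of the querying lab using Spark SQL from a notebook; the same aggregate could be written as T-SQL against the Lakehouse SQL analytics endpoint. Table and column names are placeholders.

    # Top customers by sales, read straight from a Lakehouse Delta table
    top_customers = spark.sql("""
        SELECT customer_id,
               SUM(sales_amount) AS total_sales
        FROM   sales_silver
        GROUP  BY customer_id
        ORDER  BY total_sales DESC
        LIMIT  10
    """)
    display(top_customers)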

Views, Procedures & Functions in Lakehouse

Create views for reusable logic (see the sketch after this module)

Build user-defined functions (SQL or PySpark)

Author stored procedures for scheduled workflows

Embed logic into semantic models
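
A sketch of the reusable-logic patterns in this module: a Spark SQL view plus a registered Python UDF (views can equally be authored as T-SQL views on the SQL analytics endpoint). All object names are placeholders.

    from pyspark.sql.types import StringType

    # A view that encapsulates shared filtering logic
    spark.sql("""
        CREATE OR REPLACE VIEW vw_active_customers AS
        SELECT customer_id, customer_name, region
        FROM   dim_customer
        WHERE  is_active = 1
    """)

    # A user-defined function, registered so it can be called from SQL
    def region_bucket(region):
        return "Domestic" if region == "US" else "International"

    spark.udf.register("region_bucket", region_bucket, StringType())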

Data Activator & Automation

Use cases: anomalies, thresholds, triggers

Set up conditions, actions, and triggers

Connect to Lakehouse or Power BI Goals

Hands-on: Create real-time trigger from table

Capacity Monitoring & Governance

Monitor usage, job queueing, and metrics

View and interpret the Fabric Capacity Metrics app

Assign workspaces to specific capacities

Lab: Trigger alert when job backlog exceeds threshold

Day 3: Reports, Security & Deployment

Power BI Reporting in Fabric

Create semantic model and connect to SQL endpoint

Build visuals: Decomposition Tree, KPIs, Tooltips

Optimize for mobile and web

Hands-on: Create a report from Lakehouse model

Workspace Permissions & Sensitivity Labels

Workspace roles vs. item-level security

Row-level and object-level security

Apply and audit sensitivity labels

Lab: Configure row-level security for a table

CI/CD & Git Integration

Save semantic model/report as PBIP

Version control with Git and VS Code

Integrate with Azure DevOps pipelines

Lab: Track semantic model changes using Git

Managing Fabric Items & Cross-Item Integration

Overview of Metrics, Goals, Pipelines, Lakehouses

Use Impact Analysis and lineage tracing

Manage dependencies across Fabric items

Deploy via XMLA endpoint

Lab: Trace and reuse semantic model assets across workspaces