Create, manage, and deploy Power BI semantic models inside Microsoft Fabric workspaces via `az rest` CLI against Fabric and Power BI REST APIs. Use when the user wants to: (1) create a semantic model from TMDL definition files, (2) retrieve or download semantic model definitions, (3) update a semantic model definition with modified TMDL, (4) trigger or manage dataset refresh operations, (5) configure data sources, parameters, or permissions, (6) deploy semantic models between pipeline stages. Covers Fabric Items API (CRUD) and Power BI Datasets API (refresh, data sources, permissions). For read-only DAX queries, use `powerbi-consumption-cli`. For fine-grained modeling changes, route to `powerbi-modeling-mcp`. Triggers: "create semantic model", "upload TMDL", "download semantic model TMDL", "refresh dataset", "semantic model deployment pipeline", "dataset permissions", "list dataset users", "semantic model authoring".
```bash
npx skill4agent add microsoft/skills-for-fabric powerbi-authoring-cli
```

**Update Check — ONCE PER SESSION (mandatory).** The first time this skill is used in a session, run the check-updates skill before proceeding.
- GitHub Copilot CLI / VS Code: invoke the `check-updates` skill.
- Claude Code / Cowork / Cursor / Windsurf / Codex: compare the local vs. remote package.json version.
- Skip if the check was already performed earlier in this session.
CRITICAL NOTES
- To find workspace details (including its ID) from a workspace name: list all workspaces, then filter with JMESPath.
- To find item details (including its ID) from a workspace ID, item type, and item name: list all items of that type in the workspace, then filter with JMESPath.
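As a concrete sketch of the name→ID lookup: the live call is shown in a comment, and the same filter is exercised locally with `jq` against a captured response (the workspace names and IDs below are invented for illustration).

```shell
# Live lookup (requires az login; not executed here):
#   az rest --method get --resource "https://api.fabric.microsoft.com" \
#     --url "https://api.fabric.microsoft.com/v1/workspaces" \
#     --query "value[?displayName=='Sales Analytics'].id | [0]" -o tsv
# Offline equivalent against a captured response:
cat > /tmp/workspaces.json << 'EOF'
{"value": [
  {"id": "11111111-aaaa-4bbb-8ccc-000000000001", "displayName": "Sales Analytics"},
  {"id": "11111111-aaaa-4bbb-8ccc-000000000002", "displayName": "Finance"}
]}
EOF
# Select the workspace whose displayName matches, and emit its id
WS_ID=$(jq -r '.value[] | select(.displayName == "Sales Analytics") | .id' /tmp/workspaces.json)
echo "$WS_ID"
```

The same pattern applies to item lookup: list items of one type in the workspace, then filter on `displayName`.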
| Task | Reference | Notes |
|---|---|---|
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items in Fabric | Mandatory — READ link first [needed for finding workspace id by its name or item id by its name, item type, and workspace id] |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology & Key Concepts | Hierarchy; Finding Things in Fabric |
| Environment URLs | COMMON-CORE.md § Environment URLs | Production (Public Cloud) |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection Rationale | |
| Authentication Recipes | COMMON-CLI.md § Authentication Recipes | |
| Fabric Control-Plane API via `az rest` | COMMON-CLI.md § Fabric Control-Plane API via `az rest` | Always pass `--resource` |
| OneLake Data Access | COMMON-CLI.md § OneLake Data Access | |
| Job Execution (CLI) | COMMON-CLI.md § Job Execution | Run notebooks/pipelines, refresh semantic models, check/cancel jobs |
| OneLake Shortcuts | COMMON-CLI.md § OneLake Shortcuts | Create a Shortcut; List Shortcuts; Delete a Shortcut |
| Capacity Management (CLI) | COMMON-CLI.md § Capacity Management | List Capacities; Assign Workspace to Capacity |
| Composite Recipes | COMMON-CLI.md § Composite Recipes | End-to-end workspace→lakehouse→file, SQL endpoint→query, and notebook execution recipes |
| Gotchas & Troubleshooting (CLI-Specific) | COMMON-CLI.md § Gotchas & Troubleshooting (CLI-Specific) | |
| Quick Reference | COMMON-CLI.md § Quick Reference | |
| DAX Queries & Metadata Discovery | powerbi-consumption-cli | Read-only DAX queries; use for post-creation validation |
| Tool Stack | SKILL.md § Tool Stack | |
| Authentication & API Audiences | SKILL.md § Authentication & API Audiences | Two audiences: Fabric API vs Power BI Datasets API |
| Must/Prefer/Avoid | SKILL.md § Must/Prefer/Avoid | Guardrails for semantic model authoring |
| SemanticModel Definition & Envelope | ITEM-DEFINITIONS-CORE.md § SemanticModel | TMDL format; required parts, envelope structure, support matrix |
| TMDL File Structure & Examples | SKILL.md § TMDL File Structure | Required parts, minimal content examples |
| TMDL CRUD (Create / Get / Update) | SKILL.md § Create Semantic Model | Create → Get/Download → Update; full lifecycle with LRO |
| Authoring Scope Matrix | SKILL.md § Authoring Scope Matrix | What Fabric API supports vs what to avoid |
| Refresh Operations | SKILL.md § Refresh Operations | Trigger, cancel, history, schedule (Power BI API) |
| Data Sources & Parameters | SKILL.md § Data Sources & Parameters | Get/update data sources and parameters |
| Permissions | SKILL.md § Permissions | Grant/update dataset user permissions |
| Deployment Pipelines | SKILL.md § Deployment Pipelines | List, get stages, deploy between stages |
| Agentic Workflow | SKILL.md § Agentic Workflow | Step-by-step: discover → create → verify → refresh → validate |
| Troubleshooting | SKILL.md § Troubleshooting | Common errors table: LRO, auth, TMDL encoding, refresh |
| Examples | SKILL.md § Examples | Create model, download definition, refresh, deploy |
| Property-to-API Mapping | semantic-model-properties-guide.md § Property-to-API Mapping | Maps each property category to the correct API surface |
| Owner, Storage Mode & Operational Metadata | semantic-model-properties-guide.md § Owner, Storage Mode | Power BI Datasets API properties |
| Refresh History Response Properties | semantic-model-properties-guide.md § Refresh History | Refresh detail response fields |
| Data Source Response Properties | semantic-model-properties-guide.md § Data Sources | Connection and gateway properties |
| DirectQuery / LiveConnection Refresh Schedule | semantic-model-properties-guide.md § DQ Refresh Schedule | DirectQuery/LiveConnection schedule settings |
| Upstream Dataflow Links | semantic-model-properties-guide.md § Upstream Dataflows | Dataflow dependency properties |
| Per-Table Storage Mode | semantic-model-properties-guide.md § Per-Table Storage | Table-level storage mode via TMDL |
| TMDL Syntax Rules | tmdl-authoring-guide.md § TMDL Syntax Rules | Tab indentation, object declaration, quoting rules |
| Modeling Best Practices | tmdl-authoring-guide.md § Modeling Best Practices | Naming conventions, column rules, measure & DAX rules, format strings |
| Relationships | tmdl-authoring-guide.md § Relationships | Relationship declarations, key rules |
| Hierarchies | tmdl-authoring-guide.md § Hierarchies | Hierarchy declarations and key rules |
| Direct Lake Guidelines | tmdl-authoring-guide.md § Direct Lake Guidelines | Direct Lake mode configuration and constraints |
| Calculated Tables | tmdl-authoring-guide.md § Calculated Tables | DAX-based calculated table definitions |
| Date/Calendar Table | tmdl-authoring-guide.md § Date/Calendar Table | Calendar table setup and marking |
| Parameters | tmdl-authoring-guide.md § Parameters | Expression-based parameter declarations |
| Annotations | tmdl-authoring-guide.md § Annotations | Model and object-level annotations |
| TMDL File Layout & Core Files | tmdl-advanced-features-guide.md § File Layout | Directory structure, database.tmdl, model.tmdl |
| Calculation Groups | tmdl-advanced-features-guide.md § Calculation Groups | Calculation group tables and items |
| Security Roles | tmdl-advanced-features-guide.md § Security Roles | RLS/OLS role definitions |
| Security Role Memberships | SKILL.md § Security Role Memberships | Add/list/delete users & groups in RLS roles (Power BI API) |
| Translations / Cultures | tmdl-advanced-features-guide.md § Translations / Cultures | Localization via culture files |
| Perspectives | tmdl-advanced-features-guide.md § Perspectives | Perspective definitions for subset views |
| Functions | tmdl-advanced-features-guide.md § Functions | User-defined DAX functions in the model |
| Calendar Objects | tmdl-advanced-features-guide.md § Calendar Objects | Auto date/time calendar table objects |
| Tool | Role | Install |
|---|---|---|
| `az` (Azure CLI) | Primary: `az rest` calls against Fabric and Power BI APIs | Pre-installed in most dev environments |
| `jq` | Parse JSON from API responses | Pre-installed or trivial |
| `base64` | Encode TMDL file content for definition payloads | Built-in |
**Agent check** — verify before first operation:

```bash
az version 2>/dev/null || echo "INSTALL: https://learn.microsoft.com/cli/azure/install-azure-cli"
```
| API | Audience (`--resource`) | Use For |
|---|---|---|
| Fabric Items API | `https://api.fabric.microsoft.com` | Create/get/update/delete semantic model definitions, list items, LRO polling |
| Power BI Datasets API | `https://analysis.windows.net/powerbi/api` | Refresh, data sources, parameters, permissions, deployment pipelines |
```bash
# Fabric Items API — semantic model definition operations
az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
  ...

# Power BI Datasets API — refresh, data sources, permissions
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "https://api.powerbi.com/v1.0/myorg/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
  ...
```

Key rules:
- Always pass `--resource` so `az rest` acquires a token for the correct audience.
- Pass `--headers "Content-Type=application/json"` with every `--body`, otherwise the API returns `Unsupported Media Type`.
- `createItemWithDefinition`, `getDefinition`, and `updateDefinition` are long-running: a `202 Accepted` response carries an `Operation-Id` to poll.
- Never include `.platform` in `updateDefinition` payloads.
- `GET /v1/workspaces/{id}` returns workspace details such as `capacityId`.
- In TMDL, `///` lines immediately above an object set its description; do not hand-edit `lineageTag` values.
- For fine-grained single-object edits, route to `powerbi-modeling-mcp`; for read-only DAX, use `powerbi-consumption-cli`.

| Part Path | Content | Required |
|---|---|---|
| `definition.pbism` | Semantic model connection settings | Yes |
| `definition/database.tmdl` | Database properties (compatibility level) | Yes |
| `definition/model.tmdl` | Model properties (culture, default summarization) | Yes |
| `definition/tables/<Table>.tmdl` | Per-table: columns, measures, partitions | Yes (≥1) |
**Critical:** `updateDefinition` must include ALL parts — modified and unmodified. The API replaces the entire definition. Never include `.platform` in update payloads.
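Every part travels as an `InlineBase64` payload; the encoding round-trip can be sanity-checked locally before building a request (the file name and TMDL content below are invented for illustration):

```shell
# Encode a TMDL part without line wrapping (newlines inside the payload break the JSON body)
printf 'database\n\tcompatibilityLevel: 1702\n' > /tmp/database.tmdl
PAYLOAD=$(base64 -w 0 < /tmp/database.tmdl)
# The payload is a single line
echo "$PAYLOAD" | wc -l
# Decoding restores the exact original bytes
echo "$PAYLOAD" | base64 -d | cmp -s - /tmp/database.tmdl && echo "round-trip OK"
```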
`definition.pbism`:

```json
{
  "version": "4.2",
  "settings": {
    "qnaEnabled": true
  }
}
```

`definition/database.tmdl`:

```
database
	compatibilityLevel: 1702
	compatibilityMode: powerBI
```

`definition/model.tmdl`:

```
model Model
	culture: en-US
	defaultPowerBIDataSourceVersion: powerBI_V3
	discourageImplicitMeasures
```

**Note:** `defaultPowerBIDataSourceVersion: powerBI_V3` is required for Import-mode models. Without it, the API returns `Import from JSON supported for V3 models only`.
`definition/tables/Customer.tmdl` (Import mode):

```
table Customer

	/// Total number of customers
	measure '# Customers' = COUNTROWS(Customer)
		formatString: #,##0

	column CustomerId
		dataType: int64
		isHidden
		isKey
		summarizeBy: none
		sourceColumn: CustomerId

	column 'Customer Name'
		dataType: string
		sourceColumn: CustomerName

	partition Customer = m
		mode: import
		source =
			let
				Source = Sql.Database(#"Server", #"Database"),
				Customer = Source{[Schema="dbo", Item="Customer"]}[Data]
			in
				Customer
```

Direct Lake models reference a shared expression that points at the lakehouse:

```
expression DL_Lakehouse =
		let
			Source = AzureStorage.DataLake("https://onelake.dfs.fabric.microsoft.com/<WorkspaceId>/<LakehouseId>", [HierarchicalNavigation=true])
		in
			Source
```
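TMDL is tab-indented (see tmdl-authoring-guide.md § TMDL Syntax Rules), and space indentation is a common cause of validation errors. A quick pre-encoding sanity check — the file path and content here are invented for illustration:

```shell
# Fail fast if a TMDL file uses leading spaces instead of tabs
printf 'table Customer\n\tcolumn CustomerId\n\t\tdataType: int64\n' > /tmp/Customer.tmdl
if grep -n '^ ' /tmp/Customer.tmdl; then
  echo "ERROR: space indentation found" >&2
else
  echo "indentation OK"
fi
```

Run this over every `definition/tables/*.tmdl` part before base64-encoding.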
`definition/tables/Sales.tmdl` (Direct Lake mode):

````
table Sales

	/// Total revenue
	measure 'Total Sales' = ```
			SUMX(
				Sales,
				Sales[Quantity] * Sales[UnitPrice]
			)
			```
		formatString: \$#,##0.00

	column SalesKey
		dataType: int64
		isHidden
		isKey
		summarizeBy: none
		sourceColumn: sales_key

	column Quantity
		dataType: int64
		sourceColumn: quantity

	column UnitPrice
		dataType: decimal
		summarizeBy: none
		sourceColumn: unit_price

	partition Sales = entity
		mode: directLake
		source
			entityName: Sales
			schemaName: dbo
			expressionSource: DL_Lakehouse
````

**Create Semantic Model:**

```bash
WS_ID="<workspaceId>"

# 1. Base64-encode each TMDL file
PBISM=$(base64 -w 0 < definition.pbism)
DB=$(base64 -w 0 < definition/database.tmdl)
MODEL=$(base64 -w 0 < definition/model.tmdl)
TABLE=$(base64 -w 0 < definition/tables/Customer.tmdl)

# 2. Construct payload and create — use --verbose to capture HTTP status and LRO headers
cat > /tmp/body.json << EOF
{
  "displayName": "MySalesModel",
  "definition": {
    "format": "TMDL",
    "parts": [
      {"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
      {"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
      {"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
      {"path": "definition/tables/Customer.tmdl", "payload": "$TABLE", "payloadType": "InlineBase64"}
    ]
  }
}
EOF
az rest --method post --verbose \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```

**PowerShell** — use `[Convert]::ToBase64String([System.IO.File]::ReadAllBytes("file"))` instead of `base64 -w 0`.
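When create returns `202 Accepted`, the operation headers drive a poll loop. The control flow can be sketched with the live `az rest` poll replaced by a local stand-in function (`fetch_operation_state` is a mock, not a real command — in real use it would GET the operation URL from the response headers):

```shell
ATTEMPT=0
STATE="Running"
# Stand-in for: az rest --method get --resource "https://api.fabric.microsoft.com" \
#   --url "<operation URL from the Location / Operation-Id headers>"
# Sets STATE the way the API's "status" field would; succeeds on the 3rd poll here.
fetch_operation_state() {
  ATTEMPT=$((ATTEMPT + 1))
  if [ "$ATTEMPT" -lt 3 ]; then STATE="Running"; else STATE="Succeeded"; fi
}
while [ "$STATE" = "Running" ] || [ "$STATE" = "NotStarted" ]; do
  sleep 0   # real code: sleep for the Retry-After interval from the response
  fetch_operation_state
done
echo "final state: $STATE after $ATTEMPT polls"
```

On `Failed`, surface the operation's error payload instead of retrying the create.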
Create returns `202 Accepted` — follow the LRO to completion. To download an existing definition, use `getDefinition`:

```bash
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"

# 1. Request definition — may return 200 (inline) or 202 (LRO)
RESPONSE=$(az rest --method post --verbose \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/getDefinition?format=TMDL" \
  --body '{}' \
  --output json 2>/dev/null)

# 2. If 202, poll the Location header URL until Succeeded, then GET /result
# 3. Decode each part
echo "$RESPONSE" | jq -r '.definition.parts[] | .path + " " + .payload' | \
while read -r path payload; do
  mkdir -p "$(dirname "$path")"
  echo "$payload" | base64 -d > "$path"
done
```

**Critical rules for updates:** include ALL parts (modified + unmodified); never include `.platform`. The API replaces the entire definition — omitted parts are deleted.
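Both critical rules are cheap to verify mechanically before POSTing. A sketch against an invented payload (the part paths mirror the required-parts table above):

```shell
cat > /tmp/check-body.json << 'EOF'
{"definition": {"format": "TMDL", "parts": [
  {"path": "definition.pbism", "payload": "e30=", "payloadType": "InlineBase64"},
  {"path": "definition/database.tmdl", "payload": "ZGF0YWJhc2U=", "payloadType": "InlineBase64"},
  {"path": "definition/model.tmdl", "payload": "bW9kZWw=", "payloadType": "InlineBase64"}
]}}
EOF
# Rule 1: no .platform part in the payload
jq -e '[.definition.parts[].path] | any(. == ".platform") | not' /tmp/check-body.json > /dev/null \
  && echo "no .platform: OK"
# Rule 2: every required part is present
jq -e '[.definition.parts[].path]
  | contains(["definition.pbism", "definition/database.tmdl", "definition/model.tmdl"])' \
  /tmp/check-body.json > /dev/null \
  && echo "required parts: OK"
```

`jq -e` exits non-zero when the expression is false, so either check can gate the POST in a script.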
```bash
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"

# 1. Get current definition (see Get/Download Definition above)
# 2. Modify the relevant TMDL files
# 3. Re-encode ALL parts and POST
cat > /tmp/body.json << EOF
{
  "definition": {
    "format": "TMDL",
    "parts": [
      {"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
      {"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
      {"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
      {"path": "definition/tables/Customer.tmdl", "payload": "$TABLE", "payloadType": "InlineBase64"}
    ]
  }
}
EOF
az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/updateDefinition" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```

The API also accepts `?updateMetadata=true` on the `updateDefinition` URL to update item metadata from a `.platform` part; since these recipes omit `.platform`, leave the flag off.

| Operation | Supported | Method |
|---|---|---|
| Create semantic model with TMDL | ✅ | `createItemWithDefinition` (`POST .../semanticModels`) |
| Get/download TMDL definition | ✅ | `getDefinition` |
| Update full TMDL definition | ✅ | `updateDefinition` |
| Delete semantic model | ✅ | `DELETE /v1/workspaces/{id}/semanticModels/{id}` |
| Refresh dataset | ✅ | Power BI Datasets API (Phase 4) |
| Add/modify single measure or column | ⚠️ Route to `powerbi-modeling-mcp` | Full definition round-trip is inefficient |
| Create reports | ❌ | Not in scope — separate definition format (PBIR) |
All refresh, data source, parameter, and permission calls use the Power BI audience `https://analysis.windows.net/powerbi/api`.

**Refresh Operations:**

```bash
WS_ID="<workspaceId>"
DATASET_ID="<semanticModelId>"
PBI="https://api.powerbi.com/v1.0/myorg"

# Trigger full refresh
cat > /tmp/body.json << 'EOF'
{"notifyOption": "NoNotification"}
EOF
az rest --method post --verbose \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Get refresh history (latest first)
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes?\$top=5"

# Cancel an in-progress refresh
az rest --method delete \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes/<refreshId>"

# Get refresh schedule
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshSchedule"

# Update refresh schedule
cat > /tmp/body.json << 'EOF'
{
  "value": {
    "enabled": true,
    "days": ["Monday", "Wednesday", "Friday"],
    "times": ["02:00", "14:00"],
    "localTimeZoneId": "UTC"
  }
}
EOF
az rest --method patch \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshSchedule" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```

**Data Sources & Parameters:**

```bash
# Get data sources for a dataset
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/datasources"

# Get parameters
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/parameters"

# Update parameters
cat > /tmp/body.json << 'EOF'
{
  "updateDetails": [
    {"name": "Server", "newValue": "newserver.database.windows.net"},
    {"name": "Database", "newValue": "ProductionDB"}
  ]
}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/Default.UpdateParameters" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```

After updating parameters or data source credentials, trigger a refresh for changes to take effect.
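The refresh-history response can drive a mechanical status check. A sketch against a captured response (request IDs and timestamps invented); note that in-progress refreshes commonly report `Unknown` as their status in history:

```shell
cat > /tmp/refreshes.json << 'EOF'
{"value": [
  {"requestId": "r-002", "refreshType": "ViaApi", "status": "Completed", "startTime": "2024-05-02T02:00:00Z"},
  {"requestId": "r-001", "refreshType": "Scheduled", "status": "Failed", "startTime": "2024-05-01T02:00:00Z"}
]}
EOF
# History is returned latest-first, so .value[0] is the most recent refresh
LATEST=$(jq -r '.value[0].status' /tmp/refreshes.json)
case "$LATEST" in
  Completed) echo "refresh succeeded" ;;
  Failed)    echo "refresh failed — inspect serviceExceptionJson in the response" ;;
  Unknown)   echo "still in progress — poll again" ;;
esac
```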
**Permissions:**

```bash
# List dataset users
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users"

# Grant dataset permissions to a user
cat > /tmp/body.json << 'EOF'
{
  "identifier": "user@contoso.com",
  "principalType": "User",
  "datasetUserAccessRight": "Read"
}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Update existing user permissions
cat > /tmp/body.json << 'EOF'
{
  "identifier": "user@contoso.com",
  "principalType": "User",
  "datasetUserAccessRight": "ReadReshare"
}
EOF
az rest --method put \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```

`datasetUserAccessRight` values: `Read`, `ReadReshare`, `ReadExplore`, `ReadReshareExplore`.

**Security Role Memberships:**

```bash
PBI="https://api.powerbi.com/v1.0/myorg"

# List members of a security role
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  | jq '[.value[] | select(.datasetUserAccessRight == "Read" and .roles != null)]'

# Add a user to a security role
cat > /tmp/body.json << 'EOF'
{
  "identifier": "user@contoso.com",
  "principalType": "User",
  "datasetUserAccessRight": "Read",
  "roles": ["SalesRegion"]
}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Add a security group to a role
cat > /tmp/body.json << 'EOF'
{
  "identifier": "<group-object-id>",
  "principalType": "Group",
  "datasetUserAccessRight": "Read",
  "roles": ["SalesRegion"]
}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Update role membership (e.g., move user to a different role)
cat > /tmp/body.json << 'EOF'
{
  "identifier": "user@contoso.com",
  "principalType": "User",
  "datasetUserAccessRight": "Read",
  "roles": ["EuropeOnly"]
}
EOF
az rest --method put \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```

The `roles` array accepts one or more role names that must match roles defined in the semantic model's TMDL. The user/group must also have at least `Read` permission on the dataset. `principalType` can be `User`, `Group`, or `App`.
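Since role names must match roles actually defined in the model, a pre-check against the downloaded TMDL avoids a failed round-trip. The file content below is an invented sample, and the check assumes role declarations of the form `role <Name>` (per tmdl-advanced-features-guide.md § Security Roles):

```shell
# Invented sample of downloaded role definitions
printf 'role SalesRegion\n\tmodelPermission: read\n\nrole EuropeOnly\n\tmodelPermission: read\n' > /tmp/roles.tmdl
TARGET_ROLE="SalesRegion"
if grep -q "^role ${TARGET_ROLE}\$" /tmp/roles.tmdl; then
  echo "role exists — safe to add members"
else
  echo "role ${TARGET_ROLE} not found in model definition" >&2
fi
```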
Deployment pipeline calls use the Fabric audience `https://api.fabric.microsoft.com`.

```bash
FABRIC="https://api.fabric.microsoft.com/v1"

# List deployment pipelines
az rest --method get \
  --resource "https://api.fabric.microsoft.com" \
  --url "$FABRIC/deploymentPipelines"

# Get pipeline stages
az rest --method get \
  --resource "https://api.fabric.microsoft.com" \
  --url "$FABRIC/deploymentPipelines/<pipelineId>/stages"

# Deploy from one stage to the next (e.g., Dev → Test)
cat > /tmp/body.json << 'EOF'
{
  "sourceStageOrder": 0,
  "targetStageOrder": 1,
  "items": [
    {
      "sourceItemId": "<semanticModelId>",
      "itemType": "SemanticModel"
    }
  ],
  "options": {
    "allowCreateArtifact": true,
    "allowOverwriteArtifact": true
  }
}
EOF
az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "$FABRIC/deploymentPipelines/<pipelineId>/deploy" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```

Omit the `items` array to deploy all items in the stage. The deploy call returns `202 Accepted` — poll using the LRO pattern.
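`sourceStageOrder` and `targetStageOrder` are numeric, so mapping a stage display name to its order is a common first step. A sketch against a captured `stages` response (stage names and IDs invented):

```shell
cat > /tmp/stages.json << 'EOF'
{"value": [
  {"id": "s-0", "displayName": "Development", "order": 0},
  {"id": "s-1", "displayName": "Test", "order": 1},
  {"id": "s-2", "displayName": "Production", "order": 2}
]}
EOF
# Resolve the source stage by name, then target the next stage in the pipeline
SOURCE_ORDER=$(jq -r '.value[] | select(.displayName == "Test") | .order' /tmp/stages.json)
TARGET_ORDER=$((SOURCE_ORDER + 1))
echo "deploying stage $SOURCE_ORDER -> stage $TARGET_ORDER"
```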
Discover context first with `az rest` against `GET /v1/workspaces/{id}/semanticModels`, and route single-object edits to `powerbi-modeling-mcp`. After creation, verify the definition parts (`definition.pbism`, `database.tmdl`, `model.tmdl`), validate each measure (and its `formatString`) with a read-only DAX query such as `EVALUATE { [Measure Name] }` via `powerbi-consumption-cli`, and confirm each column's `dataType` matches its `sourceColumn`.

**Early-abort rule:** If `getDefinition` returns `404 EntityNotFound` (on an item you can list/GET) and the Power BI refresh API returns `403 Forbidden`, stop retrying immediately — the user almost certainly has only Viewer role on the workspace. Verify by calling `GET /v1/workspaces/{id}/roleAssignments`; if that also returns `403 InsufficientWorkspaceRole`, confirm to the user they need Contributor or higher role. Do not retry with different URL formats, endpoints, or parameters — the issue is permissions, not API usage.
| Symptom | Cause | Fix |
|---|---|---|
| `403 Forbidden` on refresh, data source, or permissions calls | User has Viewer role — refresh, data sources, and permissions APIs require Contributor+ | Stop immediately. Ask user to request Contributor/Member/Admin role on the workspace |
| `404 EntityNotFound` from `getDefinition` on an item you can list | Insufficient permissions masquerading as 404 — getDefinition requires Contributor+ | Check workspace role first; do not retry with different URL formats |
| `403 InsufficientWorkspaceRole` from `GET .../roleAssignments` | User is Viewer on the workspace | Confirms Viewer role — all authoring and most read operations are blocked |
| `401 Unauthorized` | Wrong or missing `--resource` | Use the correct audience for the target API (see Authentication & API Audiences) |
| `403` on `api.powerbi.com` with a Fabric token | Wrong audience | Use `https://analysis.windows.net/powerbi/api` for Power BI endpoints |
| `400 Bad Request` on `getDefinition` | Missing request body | Pass `--body '{}'` |
| LRO poll never completes | Token expired during long operation | Re-acquire token in poll loop; increase Retry-After interval |
| `202 Accepted` but no definition returned | Didn't follow LRO to completion | Poll the operation URL until `Succeeded`, then GET `/result` |
| TMDL validation error on create/update | Syntax error in TMDL content | Check TMDL rules in tmdl-authoring-guide.md; validate before encoding |
| Parts missing after updateDefinition | Only modified parts were sent | Must include ALL parts (modified + unmodified) in every update |
| Error when `.platform` is included in the payload | `.platform` is service-managed | Remove `.platform` from `updateDefinition` payloads |
| Base64 decode produces garbled content | Wrong encoding or line wrapping | Use `base64 -w 0` (no line wrapping) |
| Refresh fails with data source error | Credentials expired or parameters wrong | Check data sources and parameters; update credentials if needed |
| Deployment pipeline fails | Workspace not assigned to stage | Assign workspace to pipeline stage before deploying |
| Validation error on hand-edited definition | Manually added `lineageTag` values | Remove manually added `lineageTag`s; let the service generate them |
| DAX error testing measures | Measure name case mismatch | DAX measure names are case-sensitive; match exactly |
| Attempting to trigger a refresh via DAX | DAX queries are read-only | Use the Power BI REST API instead: `POST .../refreshes` |
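The line-wrapping row above is easy to demonstrate locally: GNU `base64` wraps output at 76 characters by default, injecting newlines that corrupt the JSON payload, while `-w 0` disables wrapping (the input file here is a throwaway sample):

```shell
# 100 bytes of input exceeds the default 76-character wrap point
head -c 100 /dev/zero | tr '\0' 'A' > /tmp/part.txt
WRAPPED=$(base64 < /tmp/part.txt | wc -l)
FLAT=$(base64 -w 0 < /tmp/part.txt | wc -l)
echo "default wrapping: $WRAPPED lines; with -w 0: $FLAT line(s)"
```

On macOS `base64` does not support `-w`; use the PowerShell `[Convert]::ToBase64String` alternative noted earlier or `openssl base64 -A`.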
**Example — create a two-table model:**

```bash
WS_ID="<workspaceId>"

# Encode all TMDL files
PBISM=$(base64 -w 0 < definition.pbism)
DB=$(base64 -w 0 < definition/database.tmdl)
MODEL=$(base64 -w 0 < definition/model.tmdl)
CUSTOMER=$(base64 -w 0 < definition/tables/Customer.tmdl)
SALES=$(base64 -w 0 < definition/tables/Sales.tmdl)

cat > /tmp/body.json << EOF
{
  "displayName": "SalesModel",
  "definition": {
    "parts": [
      {"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
      {"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
      {"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
      {"path": "definition/tables/Customer.tmdl", "payload": "$CUSTOMER", "payloadType": "InlineBase64"},
      {"path": "definition/tables/Sales.tmdl", "payload": "$SALES", "payloadType": "InlineBase64"}
    ]
  }
}
EOF
az rest --method post --verbose \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```

**Example — download a definition:**

```bash
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"

# Get definition (may return 202 — follow LRO)
RESULT=$(az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/getDefinition?format=TMDL" \
  --body '{}' --output json)

# Decode and save all parts
echo "$RESULT" | jq -r '.definition.parts[] | .path + "\t" + .payload' | \
while IFS=$'\t' read -r path payload; do
  mkdir -p "$(dirname "$path")"
  echo "$payload" | base64 -d > "$path"
  echo "Saved: $path"
done
```

**Example — trigger a refresh and check status:**

```bash
WS_ID="<workspaceId>"
DATASET_ID="<semanticModelId>"
PBI="https://api.powerbi.com/v1.0/myorg"

# Trigger refresh
cat > /tmp/body.json << 'EOF'
{"type": "Full"}
EOF
az rest --method post \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json

# Check latest refresh status
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes?\$top=1"
```

**Example — deploy between pipeline stages:**

```bash
FABRIC="https://api.fabric.microsoft.com/v1"
PIPELINE_ID="<pipelineId>"

# Deploy from Test (stage 1) to Production (stage 2)
cat > /tmp/body.json << 'EOF'
{
  "sourceStageOrder": 1,
  "targetStageOrder": 2,
  "items": [
    {"sourceItemId": "<semanticModelId>", "itemType": "SemanticModel"}
  ],
  "options": {
    "allowCreateArtifact": true,
    "allowOverwriteArtifact": true
  }
}
EOF
az rest --method post \
  --resource "https://api.fabric.microsoft.com" \
  --url "$FABRIC/deploymentPipelines/$PIPELINE_ID/deploy" \
  --headers "Content-Type=application/json" \
  --body @/tmp/body.json
```