
#087 Update Azure Analysis Services Data Model Automatically

In this video, we show how to request a refresh of an Azure Analysis Services data model using Azure Data Factory.

We'll get to know the following techniques:

Create pipeline (NEW PIPELINE):

  • In Azure Data Factory, creating a pipeline involves defining a series of activities and tasks that make up the data flow. A pipeline can include extracting, transforming, and loading (ETL) data, as well as other operations.
{
   "name": "MyPipeline",
   "properties": {
      "activities": [
         {
            "name": "MyActivity",
            "type": "Copy",
            "linkedServiceName": {
               "referenceName": "AzureBlobStorageLinkedService",
               "type": "LinkedServiceReference"
            },
            "inputs": [
               {
                  "referenceName": "InputDataset",
                  "type": "DatasetReference"
               }
            ],
            "outputs": [
               {
                  "referenceName": "OutputDataset",
                  "type": "DatasetReference"
               }
            ],
            "policy": {
               "timeout": "7.00:00:00",
               "retry": 0,
               "retryIntervalInSeconds": 30,
               "secureOutput": false
            }
         }
      ]
   }
}
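To actually trigger the Azure Analysis Services refresh, the pipeline typically uses a Web activity rather than a Copy activity. A minimal sketch, in which the region, server, and model names are placeholders you would replace with your own:

```json
{
   "name": "RefreshModel",
   "type": "WebActivity",
   "typeProperties": {
      "url": "https://westus.asazure.windows.net/servers/myserver/models/MyModel/refreshes",
      "method": "POST",
      "headers": {
         "Content-Type": "application/json"
      },
      "body": {
         "Type": "Full"
      },
      "authentication": {
         "type": "MSI",
         "resource": "https://*.asazure.windows.net"
      }
   }
}
```

The `authentication` block tells the Web activity to acquire a token with the Data Factory's managed identity, which ties in with the MSI configuration described below.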
Set the URL of the web service for data model refresh (SERVERS, MODELS, REFRESHES):

  • To refresh a data model in Azure Analysis Services, you can use the URL of the web service associated with the model, specifying the refresh operation.
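The refresh endpoint of the Azure Analysis Services REST API follows this pattern, where region, server, and model names are placeholders:

```
https://<region>.asazure.windows.net/servers/<server-name>/models/<model-name>/refreshes
```

Sending a POST request to this URL queues an asynchronous refresh of the model.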

Create pipeline parameters (PARAMETERS, DATA TYPE, DEFAULT VALUE):

  • Setting parameters in a pipeline allows for flexibility in execution by allowing values to be passed dynamically. These parameters can have specific data types and default values.
"parameters": {
   "ParameterName": {
      "type": "String",
      "defaultValue": "DefaultValue"
   }
}

Add dynamic content using parameters (ADD DYNAMIC CONTENT):

  • In the body of an activity, you can add dynamic content by referencing parameters, expressions, or variables.
"body": {
   "parameter": "@pipeline().parameters.ParameterName",
   "expression": "@formatDateTime(utcnow(), 'yyyy-MM-dd')"
}

Set the type of access authentication (SYSTEM ASSIGNED MANAGED IDENTITY, MSI):

  • To access resources such as Azure Analysis Services, you can configure authentication using a System Assigned Managed Identity (MSI).
"identity": {
   "type": "SystemAssigned"
}
Define the data to be sent by the HTTP request (BODY, JSON):

  • When making an HTTP request, you define the data to be sent in the body of the request, usually in JSON format.
"body": {
   "key": "value",
   "otherKey": "otherValue"
}
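For the Azure Analysis Services refresh request specifically, the body might look like the following. The values are illustrative; the refresh type, parallelism, and retry settings depend on your model and environment:

```json
{
   "Type": "Full",
   "CommitMode": "transactional",
   "MaxParallelism": 2,
   "RetryCount": 2,
   "Objects": [
      {
         "database": "MyModel"
      }
   ]
}
```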

Get parameter values in the Azure Analysis Services resource:

  • When running a pipeline, parameter values can be fetched dynamically. These values can be used, for example, to update a model in Azure Analysis Services.
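One way to combine these ideas, assuming hypothetical pipeline parameters named RefreshType and ModelName, is to build the request body with dynamic content so each run can refresh a different model:

```json
"body": {
   "Type": "@pipeline().parameters.RefreshType",
   "Objects": [
      {
         "database": "@pipeline().parameters.ModelName"
      }
   ]
}
```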

Run the pipeline (DEBUG):

  • The process of running a pipeline can be started in debug mode to test and verify that everything is working as expected.
Get the update JSON script using SQL Server Management Studio (SSMS):

  • In SQL Server Management Studio, you can get the JSON script for updating the data model in Azure Analysis Services through the graphical interface.
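The script SSMS generates is a TMSL refresh command. A minimal example, assuming a model named MyModel, looks like this:

```json
{
   "refresh": {
      "type": "full",
      "objects": [
         {
            "database": "MyModel"
         }
      ]
   }
}
```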

These are simplified examples, and it's important to adapt to the specific structure of your environment and requirements.

 

 

This content contains
  • Content Video
  • Language Portuguese
  • Duration 10m 34s
  • Subtitles Yes

  • Reading time 2 min 19 s

Fabio Santos

Data Scientist and Consultant for Digital and Analytics Solutions


Youtube Channel

@fabioms
