In this video we show how to request a data model refresh in Azure Analysis Services using Azure Data Factory.
Create pipeline (NEW PIPELINE):
{ "name": "MyPipeline", "properties": { "activities": [ { "name": "MyActivity", "type": "Copy", "linkedServiceName": { "referenceName": "AzureBlobStorageLinkedService", "type": "LinkedServiceReference" }, "inputs": [ { "referenceName": "InputDataset", "type": "DatasetReference" } ], "outputs": [ { "referenceName": "OutputDataset", "type": "DatasetReference" } ], "policy": { timeout: "7.00:00:00", "retry": 0, "retryIntervalInSeconds": 30, "secureOutput": false } } ] } }Set the URL of the
WEB service for data model refresh (SERVERS, MODELS, REFRESHES): To
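As a sketch of the endpoint format, assuming the documented Azure Analysis Services REST API (region, server, and model names are placeholders):

POST https://<region>.asazure.windows.net/servers/<serverName>/models/<modelName>/refreshes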
Create pipeline parameters (PARAMETERS, DATA TYPE, DEFAULT VALUE): Setting parameters in the pipeline definition:

"parameters": { "ParameterName": { "type": "String", "defaultValue": "DefaultValue" } }
Add dynamic content using parameters (ADD DYNAMIC CONTENT): In the request body:
body": { "parameter": "@pipeline().parameters. NameParameter", "expression": "@formatDateTime(utcnow(), 'yyyy-MM-dd')" }
Set the type of access authentication (SYSTEM ASSIGNED MANAGED IDENTITY, MSI): To authenticate, the Data Factory's system-assigned managed identity is used:

"identity": { "type": "SystemAssigned" }
Define the data to be sent by the HTTP request (BODY, JSON):
body": { "key": "value", "otherKey": "otherValue" }
Get parameter values in the Azure Analysis Services resource:
Run the pipeline (DEBUG):
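To confirm the refresh request was accepted, the same endpoint can also be queried; a sketch, assuming the documented REST API, where a GET lists recent refresh operations and their status:

GET https://<region>.asazure.windows.net/servers/<serverName>/models/<modelName>/refreshes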
Verify the refreshed data model using SQL SERVER MANAGEMENT STUDIO (SSMS):
These are simplified examples; it's important to adapt them to the specific structure of your environment and requirements.
Data Scientist and Consultant for Digital and Analytics Solutions
@fabioms