Tuesday, August 27, 2019

The template validation failed: 'The action(s) 'xxx' referenced by 'inputs' in action 'xxx' are not defined in the template.'.

I am currently working on a Logic App that accepts an AS2 message, decodes it and does further processing, and if any error is encountered, notifies the stakeholders by sending an email.

For that I used the email action and tried to pass the error message as the body of the email, but I was not allowed to save the Logic App for the following reason:

Action not defined in template


The template validation failed: 'The action(s) 'Decode AS2 message' referenced by 'inputs' in action 'Failure_Alert' are not defined in the template.'.

Why it happened

I am using the following expression to set the body of the alert email, which executes only if the Decode AS2 message action fails:

actions('Decode AS2 message').outputs.body.errormessage

It looks absolutely fine in the designer, but I still wasn't able to save, and the reason lies in how the Logic App is rendered in the background, i.e. at code level. Upon switching to code view, I found that Decode AS2 message gets converted to Decode_AS2_message (spaces in the action name are replaced with underscores).

What to do

I simply provided the action name in the expression the way the Logic App expects it, as below, and all worked fine.
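The only change is to use the code-level action name, i.e. the underscored form (assuming the action is named Decode AS2 message as above):

actions('Decode_AS2_message').outputs.body.errormessage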



    Sunday, June 16, 2019

    Getting Started with Azure Data Factory - Insert Pipeline details in Custom Monitoring Table

    Monitoring the health and performance of any system or application plays a very important role in an enterprise, as you get insight into what is happening with the applications and can prevent major losses by enabling better, corrective decisions.

    Microsoft has provided an inbuilt monitoring feature within Azure Data Factory where you get details about each pipeline run and also about the activities inside the pipeline.

    Pipeline run

    As can be seen, a lot of details are made available at both levels: RunID, start time, status, name, duration etc.
    activity run

    And at the activity level you can even see the input and output of that particular activity.

    So the question might arise: why the need for a custom monitoring table?

    1. Currently the pipeline run data is retained for only 45 days; after that the data is no longer available.
    2. Support for user defined properties -- all the details we see are system properties, but what about the others, i.e. the user defined (non-system) properties we want to monitor? Yes, there is support to include user properties in Monitor - you can promote any pipeline activity property as a user property (but only 5).
    3. Not all properties are shown in the Monitor tab. Columns can be added or removed, but the choices are limited and only available for pipeline runs, not for activity runs.


    For the demo I am using a simple scenario - moving data from Azure SQL to Azure Data Lake Store, continued from the last post (Getting Started with Azure Data Factory - CopyData from CosmosDB to SQL).
    Azure SQL to Azure Data Lake Store

    But here, upon failure, the details are logged into the custom monitoring table.

    Steps in creating the solution

    1. Create Azure Data Lake Store  -- Destination

    ADLS gen2

    A small mistake here led to a failure in the pipeline run.

    2. Create Azure SQL DB -- Source
    SQL DB Table

    A small mistake here led to a failure in the pipeline run.

    3. Create Error Logging Table and Stored Procedure to Insert values in it

    Based on what details you need to capture, the number of columns can be defined. For the demo, twelve entities are identified, so those are added as columns in ErrorLogTable.
    ErrorLog Table and Stored Procedure
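    A minimal sketch of ErrorLogTable, assuming the column names and data types simply mirror the stored procedure parameters shown below:

    CREATE TABLE [dbo].[ErrorLogTable]
    (
        [PipelineName]      VARCHAR(250),
        [PipelineID]        VARCHAR(250),
        [Source]            VARCHAR(300),
        [ActivityName]      VARCHAR(250),
        [ActivityID]        VARCHAR(250),
        [ErrorCode]         VARCHAR(10),
        [ErrorDescription]  VARCHAR(5000),
        [FailureType]       VARCHAR(50),
        [ActivityStartTime] DATETIME,
        [PipelineStartTime] DATETIME,
        [BatchID]           VARCHAR(100),
        [ErrorLoggedTime]   DATETIME
    );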

    And a simple stored procedure to insert values into it:

    CREATE PROCEDURE [dbo].[usp_UpdateErrorLogTable]
        @PipelineName VARCHAR(250),
        @PipelineID VARCHAR(250),
        @Source VARCHAR(300),
        @ActivityName VARCHAR(250),
        @ActivityID VARCHAR(250),
        @ErrorCode VARCHAR(10),
        @ErrorDescription VARCHAR(5000),
        @FailureType VARCHAR(50),
        @ActivityStartTime DATETIME,
        @PipelineStartTime DATETIME,
        @BatchID VARCHAR(100),
        @ErrorLoggedTime DATETIME
    AS
    BEGIN
        DECLARE @CheckError INT = 0;

        BEGIN TRY
            INSERT INTO [ErrorLogTable]
            (
                [PipelineName],
                [PipelineID],
                [Source],
                [ActivityName],
                [ActivityID],
                [ErrorCode],
                [ErrorDescription],
                [FailureType],
                [ActivityStartTime],
                [PipelineStartTime],
                [BatchID],
                [ErrorLoggedTime]
            )
            VALUES
            (
                @PipelineName,
                @PipelineID,
                @Source,
                @ActivityName,
                @ActivityID,
                @ErrorCode,
                @ErrorDescription,
                @FailureType,
                @ActivityStartTime,
                @PipelineStartTime,
                @BatchID,
                @ErrorLoggedTime
            );
        END TRY
        BEGIN CATCH
            -- ERROR_NUMBER() is only meaningful inside a CATCH block
            SET @CheckError = ERROR_NUMBER();
        END CATCH

        IF @CheckError = 0
            RETURN 1;   -- insert succeeded

        RETURN -1;      -- insert failed
    END


    4. Create Azure Data Factory and Pipeline

    i. To create a new instance of Data Factory, log in to the Azure portal, click on +Create a resource --> Data Factories --> click on the +Add button.

    Provide a unique name for the Data Factory instance as per its purpose, and select the subscription, resource group, version and location (note that currently only limited locations are available to choose from).

    After creating an instance of Data Factory, you need to click on Author and Monitor - this leads to the ADF designer portal (dev environment), which opens in a separate tab.

    get started page

    The new tab opens with options to get started. Either click on the Create Pipeline wizard or the pencil icon; both lead to the canvas.

    ii. If you already have a Data Factory instance, you can skip the above step.
    Instance of Data factory

    Search for and open the existing ADF instance and add a new pipeline to it.

    iii. Create the Linked Services and Datasets which will be used in the pipeline.

    SQL Linked service and dataset
    SQL Linked Service

    Go to the Connection tab and click on the New button. Give it a name and provide the database details, username, password etc.
    SQl Dataset

    Here no table is selected, as I am going to use a query (note: even if you choose a table here, you can still use a query in the activity).

    ADLS Linked service and dataset

    ADLS Linked Service

    Go to the Connection tab and click on the New button. Give it a name, the ADLS URL and the account key.

    ADLS Dataset

    iv. Now drag and drop the Copy activity from the Activities --> Move & Transform section onto the canvas.

    copy activity
    Give it a name, select the dataset (the SQL dataset created in the above step) and the query to fetch data (a sample query is sketched below).
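    A hypothetical example of such a source query (the table name EmployeeArchive is an assumption, modeled on the intentionally incorrect query shown later in this post):

    Select * from EmployeeArchive where Jobtitle='Developer'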

    SQL Sink

    In the Sink tab, select the dataset (the ADLS dataset created in the above step).

    v. The next step is to add a Stored Procedure activity on the canvas and connect the Copy DataFromSQL activity to it, but on the failure condition.

    Select the Copy DataFromSQL activity, click on the plus sign and select Failure.

    Under the General tab give it a name, and under SQL Account select the Linked Service; here we are using the same Linked Service which was created for the SQL DB.
    Stored Procedure Parameter

    Under the Stored Procedure tab select the SP and click on the Import parameter button -- it will fetch the list of all the parameters defined in the SP. You can also add parameters manually by pressing the +New button.
    SP Parameter

    Here I am capturing details of the activity and pipeline from properties which are promoted by the system at runtime (ActivityID, PipelineID, ErrorCode etc.), while a few are static values (user defined properties, e.g. Source and ActivityName).

    How it works -- whenever a pipeline is triggered, a RunID gets assigned to it and also to each activity within it; other info like PipelineName, JobId, ActivityRunId, Status, StatusCode, Output, Error, ExecutionStartTime, ExecutionEndTime, ExecutionDetails, Duration and many more is also captured. But not all of it is accessible; only the properties promoted by the system are available, and to access them we need to use accessors (properties).
    Examples (a sketch of the full parameter mapping follows these examples):
    a. to get the pipeline run ID, use the RunId property of the pipeline -- pipeline().RunId
    b. to get when a particular activity started, use the ExecutionStartTime property of the activity -- activity('name of activity').ExecutionStartTime
    c. to get the error details of a particular activity, use the error.message property of the activity -- activity('name of activity').error.message (it is available only if that particular activity has failed).
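    As a hedged sketch of how these accessors could map to the imported stored procedure parameters (assuming the copy activity is named Copy DataFromSQL; the error.errorCode and error.failureType accessors and pipeline().TriggerTime are my assumptions, modeled on the error.message accessor above):

    @PipelineName:      pipeline().Pipeline
    @PipelineID:        pipeline().RunId
    @Source:            'Azure SQL'            (static, user defined value)
    @ActivityName:      'Copy DataFromSQL'     (static, user defined value)
    @ErrorCode:         activity('Copy DataFromSQL').error.errorCode
    @ErrorDescription:  activity('Copy DataFromSQL').error.message
    @FailureType:       activity('Copy DataFromSQL').error.failureType
    @ActivityStartTime: activity('Copy DataFromSQL').ExecutionStartTime
    @PipelineStartTime: pipeline().TriggerTime
    @ErrorLoggedTime:   utcnow()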

    That's it, the pipeline is ready; validate it, publish it and test it.


    The copy activity has to fail so that an entry is made in ErrorLogTable; for that, an incorrect table name was provided in the query:

    Select * from EmployeeArchive1 where Jobtitle='Developer'

    Invalid Object Name error

    After the pipeline run completed, entries were found in the ErrorLogTable.
    entry in error log table


    Sunday, June 9, 2019

    Bad Request - The specifed resource name contains invalid characters.

    I encountered the below error when I triggered the pipeline which moves data from SQL to Azure Data Lake Store Gen2 and writes a csv file:


    Activity Copy DataFromSQL failed: ErrorCode=UserErrorFailedBlobFSOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=BlobFS operation failed for: Operation returned an invalid status code 'BadRequest'. Account: 'adls2adf'. FileSystem: 'EMPARCHIVE'. ErrorCode: 'InvalidResourceName'. Message: 'The specifed resource name contains invalid characters.'. RequestId: '8a0437a7-201f-0008-4ac7-1e3e47000000'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'BadRequest',Source=Microsoft.DataTransfer.ClientLibrary,'

    Why it happened

    The error is actually pointing out that there are invalid characters in the file path.

    incorrect file path

    As can be seen in the above image, the names of the container, directory and file are all in uppercase; however, the error points to the name given to the container, i.e. EMPARCHIVE. And that is because uppercase is not supported for container names.

    What to do

    You need to use lowercase letters (numbers are also supported) when naming the container. There is no such restriction on the directory/folder name. For example, EMPARCHIVE is invalid, whereas emparchive is accepted.

    correct file path in ADLS gen2

    The output file was created after correcting the container name.
