Wednesday, July 21, 2021

SQL compilation error: Unsupported data type 'XXX' in Snowflake

Working on loading data into Snowflake (online DWH). The first step is to have the database and tables created in Snowflake.

After creating the database, I opened a worksheet to run the CREATE TABLE query (quite straightforward).

CREATE TABLE [DATA].[BOLS]
(
    ID INT,
    BOLNUMBER VARCHAR(50),
    ROWGUID UNIQUEIDENTIFIER,
    STPNUMBER INT,
    BB_ID SMALLINT,
    ACCTNO VARCHAR(6),
    CREATEDATE DATETIME
)

But when I clicked Run, I got the following error in the results:

SQL compilation error: Unsupported data type 'UNIQUEIDENTIFIER'.

Likewise, for other tables I got the errors below:

SQL compilation error: Unsupported data type 'MONEY'
SQL compilation error: Unsupported data type 'BIT'



Why it happened


Snowflake is a data platform and data warehouse that supports the most common standardized version of SQL: ANSI. This means that the most common operations are usable within Snowflake, and it also supports the operations that data warehousing needs, like create, update, insert, etc.

However, the same is not true for all the data types used in SQL Server, and that's what the error message is saying 😉


What to do


You need to find the Snowflake equivalents of the data types you get errors for, and rewrite the table definitions using those equivalents.

In my case, the following were the equivalents to use:
  • DECIMAL is the Snowflake equivalent of the MONEY data type.
  • STRING is the Snowflake equivalent of the UNIQUEIDENTIFIER data type.
  • BOOLEAN is the Snowflake equivalent of the BIT data type.
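
For reference, here's how the same table could be rewritten with the Snowflake equivalents (a sketch; Snowflake doesn't use square-bracket identifiers either, so those are dropped, and DATETIME stays as-is since Snowflake accepts it as an alias of TIMESTAMP_NTZ):

    CREATE TABLE DATA.BOLS
    (
        ID INT,
        BOLNUMBER VARCHAR(50),
        ROWGUID STRING,        -- was UNIQUEIDENTIFIER
        STPNUMBER INT,
        BB_ID SMALLINT,
        ACCTNO VARCHAR(6),
        CREATEDATE DATETIME
    )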







Saturday, July 10, 2021

Logic Apps : For Each Inside a For Each - Fetching values from field in an array inside an array

 

Introduction

In the last post, Debatching(Splitting) XML Message in Logic Apps - ForEach and SplitOn, we saw how For each can be used to split/debatch messages.

In that scenario we used only one For each, whereas there can be situations where more than one For each is required to solve the problem.


Scenario 


Say the purchase order below is the input, and we need to take some action based on productName (the value to be fetched).

So we have two arrays to deal with:
1. products (having multiple product nodes)
2. product (having a single product's details) within each products node

PO Sample
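
To make the structure concrete, the xpath expressions used below assume a shape along these lines (a hypothetical sample built only from the element names used in this post; the real PO has more fields):

    <purchaseOrder>
      <products>
        <product>
          <productName>Sample product 1</productName>
        </product>
        <product>
          <productName>Sample product 2</productName>
        </product>
      </products>
      <products>
        ...
      </products>
    </purchaseOrder>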


Creating Logic App


Create an HTTP-triggered Logic App, then initialize a variable (in it we will store the value of the productName field).

initialize a variable


Next is to add the first For each, which will iterate through the array of products nodes. The input to a For each has to be an array, and to build it we use an xpath query, as can be seen in the image below.

first for each loop



Using this xpath, we are asking to traverse the trigger body and look for products nodes; every occurrence found is added as an entry in the For each array.


xpath(xml(triggerBody()),'purchaseOrder/products')


The array will have all the products nodes, so if the input has 10 products nodes then the array formed using xpath will have 10 items in it, each representing a single products node.


second for each loop


Next, we need to loop through each product within products. For that we use the current item of the first For each, referenced with the syntax items('For_each'), and again use xpath to traverse it and look for product nodes; every occurrence found is added as an entry in the second For each array.

xpath(xml(items('For_each')),'products/product')

fetch the element value

The last step is to use a Set variable action, where we fetch the value of productName from each product node. For that we use the current item of the second For each, referenced with the syntax items('For_each_2'), and with xpath we get the current element's value.


                xpath(xml(items('For_each_2')),'product/productName/text()')[0]
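
Putting it together, the nested loops end up looking roughly like this in the Logic App code view (a trimmed sketch: runAfter properties, the trigger and the Initialize variable action are omitted, and the variable name productName is assumed):

    "For_each": {
        "type": "Foreach",
        "foreach": "@xpath(xml(triggerBody()), 'purchaseOrder/products')",
        "actions": {
            "For_each_2": {
                "type": "Foreach",
                "foreach": "@xpath(xml(items('For_each')), 'products/product')",
                "actions": {
                    "Set_variable": {
                        "type": "SetVariable",
                        "inputs": {
                            "name": "productName",
                            "value": "@{xpath(xml(items('For_each_2')), 'product/productName/text()')[0]}"
                        }
                    }
                }
            }
        }
    }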











Friday, July 9, 2021

HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: MalformedInputException: Input length = 1

Working on a PolyBase implementation where data stored as parquet files has to be imported into Synapse (SQL DWH).

So, based on the number of objects, I created the respective PolyBase scripts. After executing a few of them successfully, I got the error below for one of them:


HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: MalformedInputException: Input length = 1


To troubleshoot, I checked the error statement:

HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: MalformedInputException: Input length = 2 


It appeared something in the input had length = 2 (a size mismatch), but upon checking the file I found no such fields.

Why it happened


I had created the external table with VARCHAR as the data type for all the columns, so that all source data types could be handled.

    CREATE EXTERNAL TABLE SampleTbl(
        ID VARCHAR(200),
        NOTE VARCHAR(1000),
        ROWGUID VARCHAR(200),
        CreateDate VARCHAR(200)
    )
    -- WITH (LOCATION, DATA_SOURCE, FILE_FORMAT) omitted here for brevity


But there's an exception: columns holding datetime values, and that's the reason for the error.

Although the external table will be created, the error comes while inserting into the destination table, where a datetime value is expected but a varchar is being inserted.

The error message doesn't say so 😉
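
For example, the failure surfaces when the external table is actually read during the load, along these lines (destination table and column names are hypothetical):

    -- dbo.Bols.CreateDate is DATETIME, but SampleTbl exposes it as VARCHAR(200)
    INSERT INTO dbo.Bols (ID, NOTE, ROWGUID, CreateDate)
    SELECT ID, NOTE, ROWGUID, CreateDate
    FROM SampleTbl;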


What to do


So, while creating the external table, check for columns holding datetime values. If there are any, declare them as DATETIME.

    CREATE EXTERNAL TABLE SampleTbl(
        ID VARCHAR(200),
        NOTE VARCHAR(1000),
        ROWGUID VARCHAR(200),
        CreateDate DATETIME
    )









Friday, February 12, 2021

When enabledForTemplateDeployment is true, networkAcls.bypass must include "AzureServices"

Working on a Private Endpoint implementation for Key Vault: I enabled the Private Endpoint and allowed access only from selected networks, with no exceptions (nothing can bypass the rule).

key vault private end point


After doing that, I tried to add a secret to the Key Vault but was shown the following error:

"When enabledForTemplateDeployment is true, networkAcls.bypass must include "AzureServices""


Error while adding secret in Key Vault



Why it happened

 
Seeing the error, I verified and confirmed that enabledForTemplateDeployment is true.

enabled for Template deployment key vault


But this setting only specifies whether Azure Resource Manager is permitted to retrieve secrets from the key vault, so it is not why the error popped up.

The other part of the error points to network access control, which is enabled.

networkAcls.bypass - Tells what traffic can bypass network rules. This can be 'AzureServices' or 'None'.

But as I had selected no exceptions, no Azure services can access the Key Vault (nothing can bypass the firewall).
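
For reference, the two properties named in the error live on the vault resource and would look roughly like this in its ARM template (a trimmed sketch reflecting my setup; all other properties omitted):

    {
        "type": "Microsoft.KeyVault/vaults",
        "properties": {
            "enabledForTemplateDeployment": true,
            "networkAcls": {
                "defaultAction": "Deny",
                "bypass": "None"
            }
        }
    }

Per the message, setting bypass to "AzureServices" would clear the warning, but that would open an exception I didn't want.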

What to do


The Key Vault in which I was trying to add the secret has Private Endpoint enabled, and access is allowed only from selected networks.

Thus I needed to be inside that network to access the Key Vault. For that, I logged in to one of the virtual machines that is part of this virtual network, opened the portal from the VM, and that's it - I was able to add the secret.