Reputation: 181
{
    "Events":
    [
        {
            "dataOne":7.5555555555555555,
            "dataTwo":7.5555555555555555,
            "dataFive":1025,
            "dataSix":0,
            "dataSeven":1025,
            "dateTimeLocal":1234567890,
            "dateTimeUTC":1234567890
        }
    ],
    "infoType":"type1",
    "deviceID":"00000000000000000",
    "dateTimeLocal":1234567890,
    "dateTimeUTC":1234567890,
    "EventProcessedUtcTime":"20xx-0x-xxT0x:0x:x.4781329Z",
    "PartitionId":1,
    "EventEnqueuedUtcTime":"20xx-0x-xxT0x:0x:x.3850000Z"
}
Here is a JSON record of my data. How do I define my SQL table according to it, and how do I import the JSON data into Azure SQL Server? :(
Upvotes: 0
Views: 6250
Reputation: 1
In ADF, you can create a Copy Activity to transfer your blob data into SQL Server directly. Configure the source dataset as DelimitedText (think of the JSON file content as CSV data with one row and one column) and the sink dataset as the Azure SQL Database connector. This way you can load hundreds of JSON files from Azure Blob Storage into an Azure SQL table that has a single NVARCHAR(MAX) column.
Define the mapping for that one source (JSON) column. You can also add custom fields in the Copy Activity source and map them here as well. Then create a Stored Procedure Activity to parse the JSON stored in the SQL table.
(Screenshots of the Copy Activity Source, Source Dataset, Sink, and Mapping configuration were attached here.)
The staging ('stg') table that holds the raw JSON should look like this:
CREATE TABLE [dbo].[stgSalesInvoice](
[id] [int] IDENTITY(1,1) NOT NULL,
[JsonData] [nvarchar](max) NULL
) ON [PRIMARY]
GO
The stored procedure that parses the JSON into the target Azure SQL table will look like this for a sample nested JSON. I am not attaching the JSON file, but the code below shows how easily you can parse nested JSON.
CREATE PROCEDURE [dbo].[pLoadSalesInvoice]
AS
BEGIN
    DECLARE @Jsons NVARCHAR(MAX);
    DECLARE @mPartitionKey VARCHAR(20);
    DECLARE @mFileName VARCHAR(1000);
    DECLARE @JsonTotalFilesCount INT = (SELECT COUNT(*) FROM [dbo].[stgSalesInvoice]);
    DECLARE @JsonLoopCount INT = 1;

    TRUNCATE TABLE [dbo].[SalesInvoice];

    -- Loop over each staged JSON document and shred it into the target table
    WHILE @JsonLoopCount <= @JsonTotalFilesCount
    BEGIN
        SELECT @Jsons = JsonData FROM [dbo].[stgSalesInvoice] WHERE id = @JsonLoopCount;

        --Target Azure Sql Table
        INSERT INTO [dbo].[SalesInvoice]
            ([SalesInvoiceId]
            ,[invoiceNo]
            ,[transactionId]
            ,[supplierId]
            ,[vendorSupplierId]
            ,[vendorNoExt]
            ,[vendorDate]
            ,[biddingId]
            ,[description]
            ,[biddingDimensionId]
            ,[biddingItemDimensionValueId]
            )
        SELECT  [SalesInvoiceId]
               ,[invoiceNo]
               ,[transactionId]
               ,[supplierId]
               ,[vendorSupplierId]
               ,[vendorNoExt]
               ,[vendorDate]
               ,[biddingId]
               ,[description]
               ,[biddingDimensionId]
               ,[biddingItemDimensionValueId]
        FROM OPENJSON(@Jsons) WITH (
                 SalesInvoice NVARCHAR(MAX) '$.SalesInvoice' AS JSON
                ,supplier     NVARCHAR(MAX) '$.supplier' AS JSON
                ,vendor       NVARCHAR(MAX) '$.vendor' AS JSON
                ,bidding      NVARCHAR(MAX) '$.bidding' AS JSON
             )
        OUTER APPLY OPENJSON(SalesInvoice) WITH (
                 [SalesInvoiceId] NVARCHAR(100) '$.id'
                ,[invoiceNo]      NVARCHAR(100) '$.invoiceNo'
                ,[transactionId]  NVARCHAR(100) '$.transactionId'
             )
        OUTER APPLY OPENJSON(supplier) WITH (
                 [supplierId]       NVARCHAR(100) '$.id'
                ,[vendorSupplierId] NVARCHAR(100) '$.vendorSupplierId'
             )
        OUTER APPLY OPENJSON(vendor) WITH (
                 [vendorNoExt] NVARCHAR(100) '$.vendorNoExt'
                ,[vendorDate]  datetime2(7)  '$.vendorDate'
             )
        OUTER APPLY OPENJSON(bidding) WITH (items NVARCHAR(MAX) '$.items' AS JSON)
        OUTER APPLY OPENJSON(items) WITH (
                 [biddingId]   NVARCHAR(100) '$.biddingId',
                 [description] NVARCHAR(250) '$.description',
                 biddingItemDimensionValues NVARCHAR(MAX) '$.biddingItemDimensionValues' AS JSON
             )
        OUTER APPLY OPENJSON(biddingItemDimensionValues) WITH (
                 [biddingDimensionId]          NVARCHAR(100) '$.biddingDimensionId'
                ,[biddingItemDimensionValueId] NVARCHAR(100) '$.biddingItemDimensionValueId'
             );

        SET @JsonLoopCount = @JsonLoopCount + 1;
    END

    SELECT 1;
END
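Applied to the JSON record in the question, the same OPENJSON pattern is much simpler, because only the Events array is nested. A minimal sketch follows; the target table name and the column types are assumptions based on the sample values, and it reads one document from the same staging table as above:
-- Hypothetical target table sized from the sample values in the question
CREATE TABLE [dbo].[DeviceEvents](
    [deviceID]      VARCHAR(50) NULL,
    [infoType]      VARCHAR(20) NULL,
    [dataOne]       FLOAT       NULL,
    [dataTwo]       FLOAT       NULL,
    [dataFive]      INT         NULL,
    [dataSix]       INT         NULL,
    [dataSeven]     INT         NULL,
    [dateTimeLocal] BIGINT      NULL,
    [dateTimeUTC]   BIGINT      NULL
)
GO

-- Shred one staged document: top-level fields plus the nested Events array
DECLARE @Json NVARCHAR(MAX) = (SELECT TOP 1 JsonData FROM [dbo].[stgSalesInvoice]);

INSERT INTO [dbo].[DeviceEvents]
SELECT  j.[deviceID], j.[infoType],
        e.[dataOne], e.[dataTwo], e.[dataFive], e.[dataSix], e.[dataSeven],
        e.[dateTimeLocal], e.[dateTimeUTC]
FROM OPENJSON(@Json) WITH (
        [deviceID] VARCHAR(50)   '$.deviceID'
       ,[infoType] VARCHAR(20)   '$.infoType'
       ,[Events]   NVARCHAR(MAX) '$.Events' AS JSON
     ) j
OUTER APPLY OPENJSON(j.[Events]) WITH (
        [dataOne]       FLOAT  '$.dataOne'
       ,[dataTwo]       FLOAT  '$.dataTwo'
       ,[dataFive]      INT    '$.dataFive'
       ,[dataSix]       INT    '$.dataSix'
       ,[dataSeven]     INT    '$.dataSeven'
       ,[dateTimeLocal] BIGINT '$.dateTimeLocal'
       ,[dateTimeUTC]   BIGINT '$.dateTimeUTC'
     ) e;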
Upvotes: 0
Reputation: 16431
Please look at this document: Import JSON documents into SQL Server (Azure SQL Database).
It shows how to import JSON from Azure Blob Storage into Azure SQL Database and provides this demo:
Demo SQL:
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH ( TYPE = BLOB_STORAGE,
LOCATION = 'https://myazureblobstorage.blob.core.windows.net',
CREDENTIAL= MyAzureBlobStorageCredential);
BULK INSERT Product
FROM 'data/product.dat'
WITH ( DATA_SOURCE = 'MyAzureBlobStorage');
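For a JSON file, a common pattern with the same external data source is to read the whole blob as a single value using OPENROWSET(..., SINGLE_CLOB) and then parse it with OPENJSON. A rough sketch against the question's data (the file name is an assumption) could be:
-- Read the entire JSON blob into a variable, then shred the Events array
-- ('data/events.json' is a hypothetical file name)
DECLARE @Json NVARCHAR(MAX);

SELECT @Json = BulkColumn
FROM OPENROWSET(
        BULK 'data/events.json',
        DATA_SOURCE = 'MyAzureBlobStorage',
        SINGLE_CLOB) AS blob;

SELECT *
FROM OPENJSON(@Json, '$.Events')
WITH (
    [dataOne]       FLOAT  '$.dataOne',
    [dataTwo]       FLOAT  '$.dataTwo',
    [dateTimeLocal] BIGINT '$.dateTimeLocal',
    [dateTimeUTC]   BIGINT '$.dateTimeUTC'
);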
You just need to follow the tutorial.
Hope this helps.
Upvotes: 2
Reputation: 23782
You could use an Azure Data Factory Copy Activity to transfer your blob data into SQL Server directly. Configure the input as the Azure Blob Storage connector and the output as the Azure SQL Database connector.
For the column mapping, please refer to this link: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping. The array property can be stored as a string in your SQL database column.
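If the array lands in the SQL column as a plain string this way, it can still be expanded later with OPENJSON. A minimal sketch, assuming a hypothetical dbo.RawEvents table with an EventsJson NVARCHAR(MAX) column holding the Events array, might look like:
-- dbo.RawEvents(EventsJson) is a hypothetical table populated by the Copy Activity
SELECT e.[dataOne], e.[dataTwo], e.[dateTimeLocal], e.[dateTimeUTC]
FROM [dbo].[RawEvents] r
CROSS APPLY OPENJSON(r.[EventsJson])
WITH (
    [dataOne]       FLOAT  '$.dataOne',
    [dataTwo]       FLOAT  '$.dataTwo',
    [dateTimeLocal] BIGINT '$.dateTimeLocal',
    [dateTimeUTC]   BIGINT '$.dateTimeUTC'
) e;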
Upvotes: 1