Bulk import a CSV file from Azure storage into an Azure SQL database

I would like to import a CSV file (sqlserver-dba-csv.txt) into a table in a SQL Server database hosted in Azure.

This file resides in the Azure File Service (location: https://XXXXXXXXXXX.file.core.windows.net/XXXXXXXXXXX/sqlserver-dba-csv.txt), which is also mounted as a mapped drive on my local machine.

Eventually I would like this to be somewhat automated with a trigger, but for now I just want to import the data into a table in SQL Server to prove the process works.

The contents of sqlserver-dba-csv.txt:

1,James Brown,blue 
2,Prince,red 
3,Rick James,yellow


The code I'm using in SSMS is:

--create a table
CREATE TABLE musicians_csv (
    musician_id INT,
    full_name VARCHAR(50),
    colour VARCHAR(20)
)
GO

--bulk insert csv into a SQL Server table
BULK INSERT musicians_csv
FROM 'https://XXXXXXXXXXX.file.core.windows.net/XXXXXXXXXXX/sqlserver-dba-csv.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO

--Verify data inserted
SELECT *
FROM musicians_csv
GO

--Drop the table
DROP TABLE musicians_csv
GO


The error message I am getting:

Msg 4861, Level 16, State 1, Line 10
Cannot bulk load because the file "https://xxxxxxxxx.file.core.windows.net/xxxxxxxxx/sqlserver-dba-csv.txt" could not be opened. Operating system error code (null).

(0 rows affected)

I suspect the file location formatting is wrong, but I couldn't find a solution after searching.

Alternatively, can I reference the file on my local machine and import it even though my SQL Server is in the Azure cloud? For example, a location like "C:\Users\user.name\Desktop\sqlserver-dba-csv.txt".

Any help is greatly appreciated



2 answers


You may be facing a security issue: Azure SQL cannot access the file you are trying to import unless the file has been made publicly accessible.

Of course, you probably don't want the file to be available to everyone, so you need to store the credentials for accessing it in Azure SQL using a database scoped credential (note that the shared access signature stored in SECRET must not include the leading '?'):

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'A-$tr0ng|PaSSw0Rd!';
GO

CREATE DATABASE SCOPED CREDENTIAL [CSV-Storage-Credentials]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<shared-access-signature for "csvimportdemo" blob storage here>';
GO

CREATE EXTERNAL DATA SOURCE [CSV-Storage]
WITH 
( 
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://csvimportdemo.blob.core.windows.net',
    CREDENTIAL= [CSV-Storage-Credentials]
);
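
Once the external data source exists, the BULK INSERT from the question can reference it directly via the DATA_SOURCE option instead of a full https:// URL. A minimal sketch, assuming the file was uploaded to a container named "csv" in the storage account behind [CSV-Storage] (the container and path here are placeholders):

```sql
-- Hypothetical path: relative to the LOCATION of the external data source,
-- i.e. 'csv/sqlserver-dba-csv.txt' means container "csv", blob "sqlserver-dba-csv.txt".
BULK INSERT musicians_csv
FROM 'csv/sqlserver-dba-csv.txt'
WITH
(
    DATA_SOURCE = 'CSV-Storage',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```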


Once that is done, you should be able to access the file via OPENROWSET with the BULK option:



-- (The GitHub example linked below builds this query dynamically; the string
-- concatenation around FileId has been removed here so the query runs as-is.)
SELECT
    FirstName,
    LastName,
    TwitterHandle
FROM OPENROWSET(
    BULK '<your file here>',
    DATA_SOURCE = 'CSV-Storage',
    FIRSTROW = 2,
    FORMATFILE = '<your format file here>',
    FORMATFILE_DATA_SOURCE = 'CSV-Storage') AS t


You can find a complete working example on GitHub (which is where the code I posted comes from); it also automates the import using an Azure Function:

https://github.com/yorek/AzureFunctionUploadToSQL/blob/master/SQL/create-objects.sql



This link should help you: https://azure.microsoft.com/en-us/updates/preview-loading-files-from-azure-blob-storage-into-sql-database/



You will need to create an external data source (with a shared access signature if your blob storage is not public).
