There are many similar questions to this but none seem to have my exact problem and none of the suggested solutions work for me.
I have an Azure SQL database and Azure blob storage and am trying to get data from a CSV file into an existing table (same data structure, column order etc).
The CSV file has no index column or header row; it was generated from my pandas DataFrame in Python with:
df.to_csv(csv_path, index=False, header=False)
and then uploaded to blob storage.
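For completeness, a minimal, self-contained sketch of that export (the sample data here is hypothetical, standing in for my real dataframe):

```python
import os
import tempfile

import pandas as pd

# Hypothetical sample data standing in for the real dataframe.
df = pd.DataFrame({"id": [1, 2], "name": ["a", "b"]})

csv_path = os.path.join(tempfile.gettempdir(), "data.csv")

# index=False, header=False writes no index column and no header row,
# so the target table's column order must match the dataframe exactly.
df.to_csv(csv_path, index=False, header=False)

with open(csv_path) as f:
    first_line = f.readline().strip()
# first_line is data only ("1,a"), confirming there is no header row.
```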
The SQL I have used to try to insert the data is:
CREATE DATABASE SCOPED CREDENTIAL AccessAzure
WITH
IDENTITY = 'SHARED ACCESS SIGNATURE'
, SECRET = 'sv=<my_token>'
;
CREATE EXTERNAL DATA SOURCE GeneralBlob
WITH
( LOCATION = 'https://<my_storage_account>.blob.core.windows.net/general/'
, CREDENTIAL = AccessAzure
, TYPE = BLOB_STORAGE
)
;
BULK INSERT <existing_table>
FROM 'data.csv'
WITH (DATA_SOURCE = 'GeneralBlob',
FORMAT = 'CSV')
;
Everything runs without errors except the very last part, where I get:
Cannot bulk load. The file "data.csv" does not exist or you don't have file access rights.
I have tested my SAS token by opening
https://<my_storage_account>.blob.core.windows.net/general/data.csv?sv=<my_token> directly in my browser, which prompts a download of my CSV. So the file does exist, and with the token I should have file access rights, but I still get that error in SQL.
I have also tried
SELECT * FROM OPENROWSET(
BULK 'data.csv',
DATA_SOURCE = 'GeneralBlob',
FORMAT = 'CSV'
) AS DataFile;
but it complains about a missing format file, and I can't find a suitable resource explaining how to create one for my CSV. It also seems unlikely that this will work when the BULK INSERT doesn't anyway.
I'd really welcome any help here!!
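Update: for reference, my understanding is that a non-XML format file for BULK INSERT looks roughly like the below. This is a hypothetical three-varchar-column table, not my actual schema, and the version number on the first line should match your SQL Server / bcp version:

```
14.0
3
1  SQLCHAR  0  100  ","      1  col1  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  100  ","      2  col2  SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  100  "\r\n"   3  col3  SQL_Latin1_General_CP1_CI_AS
```

Each row gives the field order, host data type, prefix length, maximum data length, field terminator, target column order, column name, and collation.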
st, sv, and se are all parts of the SAS key, so the SECRET must contain the complete token rather than only SECRET = 'sv=<my_token>'. Please check the format of your SECRET.
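A minimal sketch of what the credential might look like with a full SAS token. All parameter values below are hypothetical placeholders; note that the leading '?' from the generated SAS URL must not be included:

```sql
-- Hypothetical placeholder values; substitute the token your storage account generates.
CREATE DATABASE SCOPED CREDENTIAL AccessAzure
WITH
    IDENTITY = 'SHARED ACCESS SIGNATURE',
    -- The secret is the entire SAS query string (sv, st, se, sr, sp, sig, ...),
    -- without the leading '?'.
    SECRET = 'sv=2021-06-08&st=2022-01-01T00%3A00%3A00Z&se=2023-01-01T00%3A00%3A00Z&sr=c&sp=rl&sig=<signature>'
;
```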