
I am new to Azure. I have an automated process that populates data into a table in an Azure SQL Database. Now I am looking for an automated way to export the data from this table, in CSV format, to an on-premises location. (From there the file will be sent to a vendor.) By automation I mean a scheduled process that can run every couple of hours.

How can this be achieved?


2 Answers


There are many ways to automatically export the data in an Azure SQL Database table to a CSV file in an on-premises location.

The approach we suggest is Azure Data Factory: once the pipeline is created, you can create a trigger that executes the pipeline on a schedule.

Reference:

  1. Copy and transform data in Azure SQL Database by using Azure Data Factory
  2. Copy data to or from a file system by using Azure Data Factory
  3. Pipeline execution and triggers in Azure Data Factory
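For reference, the "every couple of hours" schedule from the question maps onto a Data Factory schedule trigger. The JSON below is a sketch only; the trigger name, pipeline name, and start time are placeholders, and the exact shape is documented in the trigger reference linked above:

```json
{
  "name": "Every2HoursTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Hour",
        "interval": 2,
        "startTime": "2020-05-26T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "ExportToCsvPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```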

You could also use the following approaches:

  1. Use SSIS to implement the automated task. You can simply copy data between databases (cloud -> on-premises) with a scheduled SSIS package that exports to CSV.
  2. You can of course use BCP, but it will be cumbersome in the long run: a lot of scripts, tables, and maintenance, with no logging, no metrics, and no alerts. Honestly, don't do it.

Ref: Azure SQL DB - data file export (.csv) from azure sql

Hope this helps.


3 Comments

Thank you so much for your responses. I was able to generate the file using Data Factory. Now I am trying to build something to get it to the on-premises location.
@Mumuksh You can achieve that as well; the Data Factory self-hosted integration runtime can help you connect Azure with local resources.
I am trying the self-hosted integration runtime but it seems I am missing something. The status of the self-hosted integration runtime says 'Running'. On my laptop it says 'Connected to cloud service' in the configuration manager. But when I add a new linked service (Type: File System) and try to connect to a folder on my laptop, this is the error I get while testing the connection: Cannot connect to 'C:\Test'. Detail Message: The operation completed successfully The operation completed successfully Activity ID:

There are a good number of ways to do this:

  1. Schedule an Azure Data Factory pipeline
  2. Schedule an SSIS package using Azure Data Factory or a SQL Server Agent job
  3. Use a PowerShell script scheduled with Task Scheduler

Below is the PowerShell script I generally use to quickly get data from a specific table into a CSV file. You can schedule this script using Task Scheduler or an Execute Process Task in an SSIS package.

# Requires the SqlServer PowerShell module (Install-Module SqlServer)
Invoke-Sqlcmd -ConnectionString "AzureSQLDBConnectionString" `
    -Query "SET NOCOUNT ON; SELECT * FROM TableName" -MaxCharLength 700 `
    -QueryTimeout 1200 |
    Export-Csv -NoTypeInformation -Path C:\temp\TableName.csv -Encoding UTF8
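If you go the Task Scheduler route, the task can also be registered from PowerShell using the built-in ScheduledTasks cmdlets. This is a sketch only; the script path and task name are assumptions, and it must be run in an elevated Windows PowerShell session:

```powershell
# Sketch: register a task that runs the export script every 2 hours.
# C:\Scripts\ExportTable.ps1 and the task name are placeholder values.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
           -Argument "-NoProfile -File C:\Scripts\ExportTable.ps1"
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
           -RepetitionInterval (New-TimeSpan -Hours 2)
Register-ScheduledTask -TaskName "ExportAzureSqlToCsv" -Action $action -Trigger $trigger
```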

