
I'm copying data from an Azure SQL DB to blob storage by means of a query.

Here is the script of the activity:

{
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "SqlSource",
            "sqlReaderQuery": "select distinct a.*, b.Name from [dbo].[Transactxxxxxxx] a join dbo.Anxxxxx b on a.[Clixxxxx] = b.[Fixxxxxx] where b.Name = 'associations'"
        },
        "sink": {
            "type": "BlobSink",
            "writeBatchSize": 0,
            "writeBatchTimeout": "00:00:00"
        }
    },
    "inputs": [
        {
            "name": "Txnsxxxxxxxxxxx"
        }
    ],
    "outputs": [
        {
            "name": "Txnxxxxxxxxxxxx"
        }
    ],
    "policy": {
        "timeout": "01:00:00",
        "concurrency": 1,
        "retry": 3
    },
    "scheduler": {
        "frequency": "Hour",
        "interval": 1
    },
    "name": "Copyxxxxxxxxxx"
}

The activity appears to run successfully, but it does not write any file to the sink.

The dataset points to the correct container.

1 Answer

Based on the information you provided, I found successful run logs in our service. I noticed that the target blob is specified as "experimentinput/Inxxx_To_xx_Associations.csv/Inxxx_To_xx.csv". Because the blob name is static, every slice run overwrites the same blob file. You can use the partitionedBy property to generate a dynamic blob path per slice. Refer to this article for more details: https://azure.microsoft.com/en-us/documentation/articles/data-factory-azure-blob-connector/#azure-blob-dataset-type-properties.
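As a sketch, the output dataset could use partitionedBy like this (the folder path, file name, and linked service name here are illustrative assumptions, not values from your pipeline):

```json
{
    "name": "Txnxxxxxxxxxxxx",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": "AzureStorageLinkedService",
        "typeProperties": {
            "folderPath": "experimentinput/{Slice}",
            "fileName": "output.csv",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ","
            },
            "partitionedBy": [
                {
                    "name": "Slice",
                    "value": {
                        "type": "DateTime",
                        "date": "SliceStart",
                        "format": "yyyyMMddHH"
                    }
                }
            ]
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}
```

With this, each hourly slice resolves {Slice} from its SliceStart time, so consecutive runs write to distinct folders (e.g. experimentinput/2016010109) instead of overwriting one blob.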
