
I am currently using Python in Azure Functions to create a timer trigger that aggregates data from Blob storage and puts the result in Cosmos DB.

My problem is as follows: when I use a specific file in the path binding, the function runs as expected. Whenever I change it (so as to take all blobs in the container), I get the following error:

 Microsoft.Azure.WebJobs.Host: No value for named parameter 'test'.

Below are my function.json bindings:

{
  "bindings": [
    {
      "name": "blobTrigger",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 * * * *",
      "connection": "AzureWebJobsStorage",
      "path": "blob/{test}"
     },
     {
       "type": "blob",
       "name": "inputBlob",
       "path": "blob/{test}",
       "connection": "AzureWebJobsStorage",
       "direction": "in"
     },
     {
       "type": "documentDB",
       "name": "outputDocument",
       "databaseName": "database1",
       "collectionName": "functioncollection",
       "createIfNotExists": false,
       "connection": "development_DOCUMENTDB",
       "direction": "out"
      }
    ],
    "disabled": false
}

I'm unsure whether the storage connection is supposed to be in the trigger binding as well, but when I've tried without it I still get the same error.

Do any of you have any idea how to solve this?

Thanks.

1 Answer


This is not legal syntax for a timer trigger / blob input binding. When you set the blob path to blob/{test}, you are binding the blob path to a piece of information from your function's trigger. For example, it could bind to a property of a queue message.

In your case, the trigger is a timer, so it carries no information that could be used as a parameter for the blob input binding.
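A minimal sketch of what legal bindings for this scenario could look like, dropping the blob binding and its {test} path expression entirely (database, collection, and connection names reused from the question; the timer's name is an assumption):

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 * * * *"
    },
    {
      "type": "documentDB",
      "name": "outputDocument",
      "databaseName": "database1",
      "collectionName": "functioncollection",
      "createIfNotExists": false,
      "connection": "development_DOCUMENTDB",
      "direction": "out"
    }
  ],
  "disabled": false
}
```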

You can't really bind your function to ALL blobs in a container. If you need to access all blobs at once, you may have to do it manually (without a dedicated binding, just by using the SDK). Alternatively, create a function that is triggered for each added/changed blob, if you can operate on one blob at a time.
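For the manual route, a rough sketch using the classic azure-storage Python SDK (BlockBlobService, list_blobs, and get_blob_to_text are that SDK's calls; the connection string, container name, and the "value" field used in the aggregation are assumptions for illustration):

```python
import json


def read_all_blobs(connection_string, container_name):
    """Yield every blob in the container, parsed as JSON.

    Assumes the classic azure-storage SDK; imported lazily so the
    aggregation helper below stays usable without it installed.
    """
    from azure.storage.blob import BlockBlobService

    service = BlockBlobService(connection_string=connection_string)
    for blob in service.list_blobs(container_name):
        # get_blob_to_text expects the blob *name* (a string), not a file handle.
        text = service.get_blob_to_text(container_name, blob.name).content
        yield json.loads(text)


def aggregate(records):
    """Hypothetical aggregation: count records and sum a 'value' field."""
    records = list(records)
    total = sum(r.get("value", 0) for r in records)
    return {"count": len(records), "total": total}
```

The aggregated dict could then be handed to the documentDB output binding as the function's result.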


5 Comments

Ok, so I can't use a timer trigger for all the files in a blob storage container? I'd have to do it manually? I.e. assign the storage account as a variable in the function and read from there?
You can use a timer trigger, but you can't use a blob input binding for all files. And yes, you can read the files manually.
Hi again Mikhail, sorry to bother you, but since you are so knowledgeable on this topic I thought I might ask you. I am now using the SDK as you suggested, and I am getting an IOError when trying to iterate through the files in my blob container. This is my code: for blob in generator: json = json.load(data.get_blob_to_text('containername', open(blob.name))) and the error I'm getting is: IOError: [Errno 2] No such file or directory: 'test.json'. I realize this might be an absolute path issue? How should I go about solving this?
@Nord112 Could you create a new question with the full code sample?
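Regarding the IOError in the comments above: get_blob_to_text expects the blob name as a string, so open(blob.name) makes Python try to open a local file first, which is what fails. A sketch of the corrected call (container name reused from the comment; load_blob_json is a hypothetical helper, and service stands for the SDK client):

```python
import json


def load_blob_json(service, blob_name):
    # Pass the blob name directly instead of open(blob.name);
    # get_blob_to_text downloads the blob and exposes its text via .content.
    # Also avoid naming the result "json", which shadows the json module.
    return json.loads(service.get_blob_to_text('containername', blob_name).content)
```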
