
My Azure serverless architecture is composed of several producers, a Service Bus, an Azure Function, and a Postgres database (see the picture below).

The Postgres DB is a legacy requirement; I cannot change it.

The operations flow is as follows:

  1. The producers periodically send messages to the Service Bus (about 9,000 messages/minute). Each producer sends a single message.
  2. The Azure Function consumes each message and inserts a row into the Postgres DB.

To avoid putting a heavy load on the DB and opening a lot of connections, I would like to aggregate the messages in the function and insert them with a bulk insert. Would a durable function (entity function) work well for this?

Can you help me please? Best Regards

[Architecture diagram: producers → Service Bus → Azure Function → Postgres DB]

  • Each function invocation should already receive a batch of messages anyway, rather than being called per single message: github.com/Azure/azure-functions-servicebus-extension/issues/… Commented Feb 27, 2020 at 10:55
  • Great, can I configure it? Commented Feb 27, 2020 at 11:28
  • What do you need to configure? Commented Feb 27, 2020 at 12:27
  • Looking at the PR, I would say yes, but I'm not sure exactly how. It might be worth opening an issue in the GitHub repo. The default seems to be 1000: github.com/Azure/azure-functions-servicebus-extension/pull/39/… Commented Feb 27, 2020 at 13:03
  • Got it to work; see my answer below. Commented Feb 27, 2020 at 13:33

2 Answers


OK, got it to work as discussed in the comments. You don't need a durable function, but you should receive (and then write) messages in batches from the Service Bus. Here is a batching example:

[FunctionName("QueueTriggeredFunction")]
public static void Run([ServiceBusTrigger("demofunctionqueue", Connection = "queueconstring")]string[] myQueueItems, ILogger log)
{
    log.LogInformation("Received messages {count}", myQueueItems.Length);
    foreach (var myQueueItem in myQueueItems)
    {
        log.LogInformation($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    }
}
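
To then write the whole batch to Postgres in one round trip, a bulk insert via Npgsql's binary COPY could look roughly like this. This is only a sketch: the `messages (payload)` table, the `PostgresConnection` app setting, and the function name are made-up examples, and it assumes the Npgsql driver is referenced.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Npgsql;

public static class BulkInsertFunction
{
    [FunctionName("QueueTriggeredBulkInsert")]
    public static async Task Run(
        [ServiceBusTrigger("demofunctionqueue", Connection = "queueconstring")] string[] myQueueItems,
        ILogger log)
    {
        log.LogInformation("Received messages {count}", myQueueItems.Length);

        // One connection per batch instead of one connection per message.
        using var conn = new NpgsqlConnection(
            Environment.GetEnvironmentVariable("PostgresConnection"));
        await conn.OpenAsync();

        // Binary COPY loads the whole batch in a single round trip,
        // which is much cheaper than one INSERT per message.
        using var writer = conn.BeginBinaryImport(
            "COPY messages (payload) FROM STDIN (FORMAT BINARY)");
        foreach (var item in myQueueItems)
        {
            writer.StartRow();
            writer.Write(item, NpgsqlTypes.NpgsqlDbType.Text);
        }
        writer.Complete();
    }
}
```

If COPY is not an option, a multi-row `INSERT ... VALUES` built from the batch inside a single transaction achieves a similar effect, just somewhat slower.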

Use host.json to set the maximum number of messages per batch:

{
  "version": "2.0",
  "extensions": {
    "serviceBus": {
      "batchOptions": {
        "maxMessageCount": 200
      },
      "messageHandlerOptions": {
        "maxConcurrentCalls": 1
      }
    }
  }
}

12 Comments

He is using the Consumption plan. So even if you limit the messages per batch on one function instance, you will not limit the number of instances the Function App can scale out to. He could end up with 10 instances of the Function App consuming 1-2 messages each.
Fair enough, I edited the host.json above to add the config switch for max concurrency
maxConcurrentCalls does not work with Consumption plan. Please read my comments on my answer.
According to this one it should work? github.com/Azure/azure-functions-host/issues/…
Yes, it should work like that. If you still see multiple instances being spawned, you might also need to set WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT, as suggested by @HugoBarona
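
For completeness, WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT is an app setting on the Function App (normally set in the portal or via the CLI). A sketch of what it looks like in local.settings.json form, with 1 as an example value:

```json
{
  "IsEncrypted": false,
  "Values": {
    "WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT": "1"
  }
}
```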

I would only consider Durable Functions if you have multiple steps, so you can build your workflow with them. In your case, what I would do is play around with the maxConcurrentCalls parameter in your host.json file, so you can set an acceptable number of concurrent calls for your DB. Please reference this sample to get more info.

2 Comments

Hi Hugo, thanks for your response. Does "maxConcurrentCalls" limit the number of concurrent calls towards consumers (the Azure Function)?
@MichelFoucault in that case you should use the configuration setting WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT. Read more here
