
My question relates to the overall time it takes to execute an async function compared to synchronous function.

My understanding is as follows:

- Async function gets called

- Code executes until the awaited code is reached

- Whilst the awaited code is executed (e.g. a db read), the executing thread is freed up to do another task

- Once the awaited code is complete, a thread is assigned to complete this call (see the sketch below)
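
Roughly the code shape I have in mind (just a sketch to illustrate the above; GetCustomerNameAsync and ReadFromDbAsync are made-up names, not real code):

    using System;
    using System.Threading.Tasks;

    public class CustomerService
    {
        // Sketch of the flow described above.
        public async Task<string> GetCustomerNameAsync(int id)
        {
            // Runs synchronously on the calling thread until the await is reached.
            Console.WriteLine($"Before await, thread {Environment.CurrentManagedThreadId}");

            // While the (pretend) db read is in flight, the calling thread is
            // released back to the pool to do other work.
            string name = await ReadFromDbAsync(id);

            // When the db read completes, the rest of the method runs on a
            // thread pool thread - not necessarily the original one.
            Console.WriteLine($"After await, thread {Environment.CurrentManagedThreadId}");
            return name;
        }

        // Stand-in for a real I/O-bound call such as a db read.
        private static async Task<string> ReadFromDbAsync(int id)
        {
            await Task.Delay(50); // simulate I/O latency
            return $"customer-{id}";
        }
    }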

Assuming what I say is correct, is it the case that on a busy server the async call could and often would be slower than the synchronous call?

My reasoning is as follows:

  1. There is an overhead to async (keeping track of everything)

  2. As the async code needs a thread to execute the remaining code once the awaited work completes, it has to wait for one to become available - so if all the threads are busy it would have to wait for one?

If this is true, then I assume this also means that if there was an awaited method that did not do any I/O, it would still release the executing thread and wait for the next one to become available to complete the task?

UPDATE: I know await/async does not make response times faster, but instead allows the server to be more efficient.

The main point I am trying to clarify is whether a request could take longer to complete due to the fact that there are no threads available when the async task has completed, e.g. if the db call has completed, does it ever have to wait more time for a thread to become available? I know there are many dimensions to await and many more points to consider, but I am just trying to understand this one point in isolation.

UPDATE 2: Assume I have a well designed async system, but I have one method that I need to be as fast and lean as possible - I do not want anything to ever get in the way of the execution time for this one call. Surely in this situation, if I make this method async and free up the thread, the call could potentially have to wait for a thread to finish its execution if it is async, whereas if it is sync the thread will always be there waiting - at the expense of the performance of the rest of the application?

  • Yes, async can have overhead over sync. The deciding factor isn't typically load, but how long the async action takes. If an async action can be executed very quickly, the overhead of the async machinery can cause overall lower throughput. This is especially true if the action isn't completed synchronously (which the async framework allows for, but isn't always done by implementing code). There have been recent improvements in the framework to reduce this overhead in the case of short-lived async operations that just read from a buffer (something you really want to do synchronously). Commented Nov 5, 2018 at 12:17
  • Yes there is an overhead to async await, but it's always worth remembering that this overhead is negligible compared to IO times. It's highly unlikely that you'll run into a situation where this really makes a difference, and it is more than offset by the economical use of threads that the async/await paradigm allows. Commented Nov 5, 2018 at 12:18
  • On a busy server, blocking a thread just to wait will prevent it from serving other requests, resulting in reduced scalability and consequently, causing delays. async improves scalability by releasing threads to handle other work instead of blocking Commented Nov 5, 2018 at 12:19
  • @Alex there are different reasons that can make an async database call slower. Results themselves are read asynchronously, which means that loading a lot of data can itself take longer. Commented Nov 5, 2018 at 12:40
  • @Alex As for the super-optimized method - if the database query isn't fast, it doesn't matter how the client code is written. You'll have to measure how your application behaves under load, including the CPU load. If you saturate the server with blocks, you'll run into trouble. Commented Nov 5, 2018 at 12:44

2 Answers

  1. There is an overhead to async (keeping track of everything)

Correct. However, the gains from running work asynchronously tend to far outweigh the overhead cost, so it's negligible - unless you start spamming many tasks with tiny execution bodies, at which point the per-task overhead starts to become a noticeable factor.
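
To illustrate that "tiny execution bodies" point with a rough sketch of my own (QuoteCache and the values in it are made up): when the result is already available, the method completes synchronously and the await costs almost nothing; only the slow path pays the full async price.

    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    public class QuoteCache
    {
        private readonly ConcurrentDictionary<string, decimal> _cache = new();

        // Fast path: if the value is cached, the method completes synchronously,
        // so awaiting it costs almost nothing - no thread switch is involved.
        public ValueTask<decimal> GetPriceAsync(string symbol)
        {
            if (_cache.TryGetValue(symbol, out var price))
                return new ValueTask<decimal>(price);

            // Slow path: a genuinely asynchronous operation with the usual overhead.
            return new ValueTask<decimal>(LoadPriceAsync(symbol));
        }

        private async Task<decimal> LoadPriceAsync(string symbol)
        {
            await Task.Delay(20);      // stand-in for real I/O
            var price = 42m;           // made-up result
            _cache[symbol] = price;
            return price;
        }
    }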

The main point I am trying to clarify is whether a request could take longer to complete due to the fact that there are no threads available when the async task has completed, e.g. if the db call has completed, does it ever have to wait more time for a thread to become available?

Pedantically, that is correct. Your "first" operation that was put aside because it was waiting for a task to complete will not be able to butt in line at the exact moment its task completes.

The operation will have to rejoin the queue, but does get priority over operations that have not been started yet. This is very analogous to e.g. how a post office works:

  • You queue
  • You get to the front desk, the postal worker gives you a form to fill in (and thus awaits you filling in that form)
  • You step aside and fill in the form, so the postal worker can help others instead of idling while waiting for you.
  • When you are done with your task, you do not push the customer currently being helped aside... (this is what you seem to want: getting the immediate attention of a thread rather than waiting for it to finish what it's currently doing).
  • ...but you will be allowed to jump the queue in front of people who have not yet spoken to the postal worker.

Note: At least that is how my culture (Western Europe) deals with this. Your mileage may vary.

So, yes, your assumption is in fact correct; but I do want to stress that the time difference here is vanishingly small compared to the gains from working asynchronously.
If you want to squeeze a particular operation for the absolutely fastest performance, you would have to give it its own thread and not let it partake in the "async pooling" schedule. But I strongly advise against that, as the benefit will be exceedingly marginal and will in no way outweigh the complexity of developing it.
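
If you did go down that road anyway, the rough shape would be something like this (a sketch only; ProcessOrder is a made-up placeholder). TaskCreationOptions.LongRunning hints the scheduler to use a dedicated thread rather than a pool thread:

    using System;
    using System.Threading.Tasks;

    public static class HotPath
    {
        // Give the latency-critical work its own dedicated thread so it never
        // queues behind other pool work. Rarely worth the complexity.
        public static Task RunDedicated(Action criticalWork) =>
            Task.Factory.StartNew(criticalWork, TaskCreationOptions.LongRunning);
    }

    // Usage (ProcessOrder is hypothetical):
    // await HotPath.RunDedicated(() => ProcessOrder());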

Secondly, consider that in a "well designed async system" (as per your question), tasks will be split up into the smallest reasonable operations, and thus the execution of a single task (excluding any wait times in between) will not be significant. In practice, you will not notice having to wait for a thread to become available - unless you are massively oversaturating your server's workload and really pushing the limits - at which point you've got other, bigger fish to fry.

Assume I have a well designed async system, but I have one method that I need to be as fast and lean as possible - I do not want anything to ever get in the way of the execution time for this one call. Surely in this situation, if I make this method async and free up the thread, the call could potentially have to wait for a thread to finish its execution if it is async, whereas if it is sync the thread will always be there waiting - at the expense of the performance of the rest of the application?

I don't understand your reasoning here.

If you assume that "the thread is waiting for me" in the synchronous example, that means that this thread is available and currently not busy. But for your asynchronous example, you assume that all threads are busy and none are available.

These are completely different situations. If all threads are saturated in an asynchronous application, then all threads (be it one or more) in a synchronous application would also be busy.

The intent of async is quite the opposite. In a synchronous application, a thread that is busy merely waiting on something is still occupied: you are forced to wait and cannot use it. In an asynchronous application, however, that same thread is released while its task is incomplete, so it is free to start handling your request in the meantime.
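
To make that concrete with a small sketch of my own (ReportHandler is a made-up example, not your code): the synchronous version holds its thread hostage for the entire round trip, while the asynchronous version hands it back for the duration of the wait.

    using System.Net.Http;
    using System.Threading.Tasks;

    public class ReportHandler
    {
        private static readonly HttpClient Client = new();

        // Synchronous: the thread is blocked for the whole round trip and
        // cannot serve anything else in the meantime.
        public string GetReportBlocking(string url) =>
            Client.GetStringAsync(url).GetAwaiter().GetResult();

        // Asynchronous: the thread goes back to the pool during the round trip;
        // a (possibly different) pool thread finishes the method afterwards.
        public async Task<string> GetReportAsync(string url) =>
            await Client.GetStringAsync(url);
    }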


1 Comment

great answer - you are the first person to answer the question I asked - rather than what they thought I should be asking

async-await is more about responsiveness than performance.

In a client UI, the UI is (usually) managed in a single thread (the UI thread). If you block that thread, the UI is blocked. async-await provides a mechanism to start a task outside of the UI thread and resume execution on the UI thread when that task is complete. That task can either be IO or CPU intensive. The main goal is to get out of the UI thread.
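
For example, a minimal (hypothetical) WinForms sketch of that pattern - the controls and CrunchNumbers are made up:

    using System;
    using System.Threading.Tasks;
    using System.Windows.Forms;

    public class MainForm : Form
    {
        private readonly Button _loadButton = new() { Text = "Load" };
        private readonly Label _resultLabel = new() { Top = 40 };

        public MainForm()
        {
            Controls.Add(_loadButton);
            Controls.Add(_resultLabel);
            _loadButton.Click += OnLoadClicked;
        }

        private async void OnLoadClicked(object? sender, EventArgs e)
        {
            _loadButton.Enabled = false;                   // runs on the UI thread

            // The expensive work runs off the UI thread...
            var result = await Task.Run(() => CrunchNumbers());

            // ...and execution resumes here, back on the UI thread.
            _resultLabel.Text = result.ToString();
            _loadButton.Enabled = true;
        }

        // Hypothetical stand-in for some expensive computation.
        private static long CrunchNumbers()
        {
            long sum = 0;
            for (var i = 0; i < 100_000_000; i++) sum += i;
            return sum;
        }
    }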

In a server (ASP.NET), the thread pool is limited, so releasing threads while doing IO lets the application handle more requests. On the other hand, using tasks for CPU-intensive work just trades one thread pool thread for another (switching twice in the process).
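
A rough ASP.NET Core sketch of both cases (OrdersController, the URL and ComputeScore are illustrative, not anything prescribed by the framework):

    using System.Net.Http;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [Route("api/[controller]")]
    public class OrdersController : ControllerBase
    {
        private static readonly HttpClient Client = new();

        // IO-bound: awaiting releases the request thread back to the pool,
        // so the server can handle more concurrent requests.
        [HttpGet("{id}")]
        public async Task<IActionResult> Get(int id)
        {
            var json = await Client.GetStringAsync($"https://example.test/orders/{id}");
            return Ok(json);
        }

        // CPU-bound: wrapping the work in Task.Run just swaps the request's
        // pool thread for another pool thread - it does not add capacity.
        [HttpGet("{id}/score")]
        public async Task<IActionResult> GetScore(int id)
        {
            var score = await Task.Run(() => ComputeScore(id));
            return Ok(score);
        }

        // Hypothetical CPU-intensive calculation.
        private static int ComputeScore(int id) => id * 31 % 97;
    }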

All this is extra work which, obviously, uses more resources (CPU cycles and memory). It's all about responsiveness, not performance.

