
Is there a way to transform a Python 3.5 async for statement into Python 3.4 code?

PEP 0492 says that async for

async for TARGET in ITER:
    BLOCK
else:
    BLOCK2

is equivalent to

iter = (ITER)
iter = type(iter).__aiter__(iter)
running = True
while running:
    try:
        TARGET = await type(iter).__anext__(iter)
    except StopAsyncIteration:
        running = False
    else:
        BLOCK
else:
    BLOCK2

but __aiter__ does not exist in Python 3.4.

  • If you have working Python 3.5 code, then look at the source of its .__aiter__() and .__anext__() methods (they may differ for different ITER). Commented Dec 17, 2015 at 15:54
  • @OldBunny2800 I believe you are looking for this: stackoverflow.com/questions/30191556/… Commented Feb 7, 2018 at 8:16

1 Answer


No, there is not; async/await (and __aiter__, etc.) were introduced in Python 3.5. On Python 3.4 the closest thing is asyncio.gather (if you can run all the tasks at once/in parallel and wait until they have all finished) or pushing results into an asyncio.Queue (which is sequential, just like async for). Edit: see the last example for an async for alternative, as described in the question.

Here is an example, in the style of the Python docs, for asyncio.gather:

import asyncio
import random

@asyncio.coroutine
def task(id):
    print("task: {}".format(id))
    yield from asyncio.sleep(random.uniform(1, 3))
    return id

tasks = [
    task("A"),
    task("B"),
    task("C")
]
loop = asyncio.get_event_loop()
results = loop.run_until_complete(asyncio.gather(*tasks))
loop.close()
print(results)

Output:

task: B
task: A
task: C
['A', 'B', 'C']

Here is one for asyncio.Queue:

import asyncio
import random

@asyncio.coroutine
def produce(queue, n):
    for x in range(n):
        print('producing {}/{}'.format(x, n))
        # todo: do something more useful than sleeping :)
        yield from asyncio.sleep(random.random())
        yield from queue.put(str(x))


@asyncio.coroutine
def consume(queue):
    while True:
        item = yield from queue.get()
        print('consuming {}...'.format(item))
        # todo: do something more useful than sleeping :)
        yield from asyncio.sleep(random.random())
        queue.task_done()


@asyncio.coroutine
def run(n):
    queue = asyncio.Queue()
    # schedule the consumer
    consumer = asyncio.ensure_future(consume(queue))
    # run the producer and wait for completion
    yield from produce(queue, n)
    # wait until the consumer has processed all items
    yield from queue.join()
    # the consumer is still waiting for an item, so cancel it
    consumer.cancel()


loop = asyncio.get_event_loop()
loop.run_until_complete(run(10))
loop.close()

Edit: an async for alternative, as described in the question:

import asyncio
import random

class StopAsyncIteration(Exception):
    """Stand-in for the built-in StopAsyncIteration, which does not exist on Python 3.4."""

class MyCounter:
    def __init__(self, count):
        self.count = count

    def __aiter__(self):
        return self

    @asyncio.coroutine
    def __anext__(self):
        if not self.count:
            raise StopAsyncIteration

        return (yield from self.do_something())

    @asyncio.coroutine
    def do_something(self):
        yield from asyncio.sleep(random.uniform(0, 1))
        self.count -= 1
        return self.count

@asyncio.coroutine
def getNumbers():
    i = MyCounter(10).__aiter__()
    while True:
        try:
            row = yield from i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)

loop = asyncio.get_event_loop()
loop.run_until_complete(getNumbers())
loop.close()

Note that this can be simplified by removing both __aiter__ and __anext__ and either raising a stop exception within the do_something method itself or returning a sentinel value when done (usually an otherwise invalid value such as None, "", -1, etc.).
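For instance, here is a minimal sketch of that sentinel variant (reusing the MyCounter and getNumbers names from the example above, and assuming None never occurs as a real result):

import asyncio
import random

class MyCounter:
    def __init__(self, count):
        self.count = count

    @asyncio.coroutine
    def do_something(self):
        # return the sentinel (None) once the counter is exhausted
        if not self.count:
            return None
        yield from asyncio.sleep(random.uniform(0, 1))
        self.count -= 1
        return self.count

@asyncio.coroutine
def getNumbers():
    counter = MyCounter(10)
    while True:
        row = yield from counter.do_something()
        if row is None:
            # sentinel seen: the "iteration" is done
            break
        print(row)

loop = asyncio.get_event_loop()
loop.run_until_complete(getNumbers())
loop.close()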


6 Comments

If an asynchronous function returns an iterable, would you be able to assign it to a variable and use a standard for-in loop to iterate over it?
Yes, if you have something like result = await somecoro() and somecoro returns an iterable (e.g. list, tuple, dict, set, etc.), then sure, you can iterate it later. The question here was about iterating over an async iterator, one that for example makes a bunch of HTTP requests and yields the content of each of them as soon as one is available, instead of having to wait for all to complete.
I've added some examples for asyncio.gather and asyncio.Queue. Of course, if you are on Python 3.5, an async iterator would be better (as in simpler/more readable) than a queue, at least in most situations I can think of.
@OldBunny2800 I've thought of another way of doing this and added it as an example. It's basically the equivalent of the code shown in the OP's question.
IMHO the third one is the simplest and clearest, very good job. However, I haven't fully understood the role of __aiter__. In all the examples I've found, it always returns self. When would you have __aiter__ return a different object?
