Exploring Generators and Coroutines

Let’s revisit the idea of generators in Python, in order to understand how support for coroutines was achieved in the latest versions of Python (3.6, at the time of this writing).

By reviewing the milestones of generators, chronologically, we can get a better idea of the evolution that led to asynchronous programming in Python.

We will review the main changes in Python that relate to generators and asynchronous programming, starting with PEP-255 (Simple Generators), PEP-342 (Coroutines via Enhanced Generators), PEP-380 (Syntax for Delegating to a Sub-Generator), and finishing with PEP-525 (Asynchronous Generators).

Simple Generators

PEP-255 introduced generators to Python. The idea is that when we process some data, we don’t actually need all of it to be in memory at once. Most of the time, having one value at a time is enough. Lazy evaluation is a good trait to have in software, because in this case it means that less memory is used. It’s also a key concept in other programming languages, and one of the main ideas behind functional programming.

The new yield keyword was added to Python, with the meaning of producing an element that will be consumed by a calling function.

The mere presence of the yield keyword anywhere in a function automatically makes it a generator function. When called, this function will create a generator object, which can be advanced, producing its elements one at a time. Each time the generator is advanced with the next() function, it runs to the next yield statement, producing a value. After the generator has produced a value, it is suspended, waiting to be called again.

Take the range built-in function, for example. In Python 2, this function returns a list with all the numbers in the interval. Imagine we want to come up with a similar implementation of it, in order to get the sum of all numbers up to a certain limit.

LIMIT = 1_000_000
def old_range(n):
    numbers = []
    i = 0
    while i < n:
        numbers.append(i)
        i += 1
    return numbers

print(sum(old_range(LIMIT)))

Now let’s see how much memory is used:

$ /usr/bin/time -f %M python rangesum.py
499999500000
48628

The first number is the result of the print, whilst the second one is the output of the time command, printing out the memory used by the program (~48 MiB).

Now, what if this is implemented with a generator instead?

We just have to get rid of the list, and place the yield statement instead, indicating that we want to produce the value of the expression that follows the keyword.

LIMIT = 1_000_000
def new_range(n):
    i = 0
    while i < n:
        yield i
        i += 1

print(sum(new_range(LIMIT)))

This time, the result is:

$ /usr/bin/time -f %M python rangesum.py
499999500000
8992

We see a huge difference: the implementation that holds all numbers in a list in memory uses ~48 MiB, whereas the implementation that just uses one number at a time uses much less memory (< 9 MiB) [1].

We can see the idea: when the yield <expression> is reached, the result of the expression is passed to the caller code, and the generator remains suspended at that line in the meantime.

>>> import inspect
>>> r = new_range(1_000_000)
>>> inspect.getgeneratorstate(r)
'GEN_CREATED'
>>> next(r)
0
>>> next(r)
1
>>> inspect.getgeneratorstate(r)
'GEN_SUSPENDED'

Generators are iterable objects. An iterable is an object whose __iter__ method constructs a new iterator every time it is called (with iter(it), for instance). An iterator is an object whose __iter__ returns itself, and whose __next__ method contains the logic to produce new values each time it is called, and to signal the stop (by raising StopIteration).

The idea of iteration is that we advance through the values by calling the built-in next() function on the iterator, and this will produce values until the StopIteration exception is raised, signalling the end of the iteration.
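To make the protocol concrete, here is a minimal, hand-written iterator (the class name is just for illustration) that behaves like new_range() by implementing __iter__ and __next__ explicitly:

class RangeIterator:
    """Hand-written iterator equivalent to new_range()."""

    def __init__(self, n):
        self.n = n
        self.i = 0

    def __iter__(self):
        # An iterator's __iter__ returns the iterator itself.
        return self

    def __next__(self):
        if self.i >= self.n:
            # Signal that there is nothing left to produce.
            raise StopIteration
        value = self.i
        self.i += 1
        return value

Calling sum(RangeIterator(1_000_000)) gives the same result as the generator version; generators simply implement this protocol for us.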

>>> def f():
...     yield 1
...     yield 2

>>> g = f()
>>> next(g)
1
>>> next(g)
2
>>> next(g)
Traceback (most recent call last):
  ...
StopIteration

>>> list(f())
[1, 2]

In the first case, calling f() creates a new generator. The first two calls to next() will advance to the next yield statement, producing the values defined there. When there is nothing else to produce, the StopIteration exception is raised. Something similar to this actually runs when we iterate over this object in the form for x in iterable: ..., the only difference being that Python internally handles the exception that determines when the for loop stops.

Before wrapping up the introduction to generators, I want to make a quick comment, and highlight something important about the role of generators in the language, and why they’re such a neat abstraction to have.

Instead of using the eager version (the one that stores everything in a list), you might consider avoiding that by just using a loop and accumulating inside it. It’s like saying “all I need is just the sum, so I might as well just accumulate the value in a loop, and that’s it”. Something slightly similar to:

total = 0
i = 0
while i < LIMIT:
    total += i
    i += 1

This is something I might consider doing in a language that doesn’t have generators. Don’t do this. Generators are the right way to go. By using a generator, we’re doing more than just wrapping the code of an iteration; we’re creating a sequence (which could even be infinite), and naming it. This sequence is an object we can use in the rest of the code. It’s an abstraction. As such, we can combine it with the rest of the code (for example, to filter on it), reuse it, pass it along to other objects, and more.

For example, let’s say we have the sequence created with new_range(), and then we realize that we need the first 10 even numbers of it. This is as simple as doing the following.

>>> import itertools
>>> rg = new_range(1_000_000)
>>> list(itertools.islice(filter(lambda n: n % 2 == 0, rg), 10))
[0, 2, 4, 6, 8, 10, 12, 14, 16, 18]

And this is something we could not so easily accomplish, had we chosen to ignore generators.

For years, this was pretty much all there was to generators in Python. Generators were introduced with the idea of iteration and lazy computation in mind.

Later on, PEP-342 brought another enhancement, adding more methods to generators, with the goal of supporting coroutines.

Coroutines

Roughly speaking, the idea of coroutines is to pause the execution of a function at a given point, from where it can be later resumed. The idea is that while a coroutine is suspended, the program can switch to run another part of the code. Basically, we need functions that can be paused.

As we have seen from the previous example, generators have this feature: when yield <expression> is reached, a value is produced for the caller, and in the meantime the generator object is suspended. This suggested that generators could be used to support coroutines in Python, hence the name of the PEP being “Coroutines via Enhanced Generators”.

There is more, though. Coroutines have to support being resumed from multiple entry points to continue their execution. Therefore, more changes were required. We need to be able to pass data back to them, and handle exceptions. For this, more methods were added to their interface.

  • send(<value>)
  • throw(ex_type[, ex_value[, ex_traceback]])
  • close()

These methods allow sending a value to a generator, throwing an exception inside it, and closing it, respectively.

The send() method implies that yield becomes an expression, rather than a statement (as it was before). With this, it is possible to assign the result of a yield to a variable, and the value will be whatever was sent to the generator.

>>> def gen(start=0):
...     step = start
...     while True:
...         value = yield step
...         print(f"Got {value}")
...         step += 1
...
>>> g = gen(1)
>>> next(g)
1
>>> g.send("hello")
Got hello
2
>>> g.send(42)
Got 42
3

As we can see from this previous code, the value produced by yield is going to be the result of the send() call (in this case, the consecutive numbers of the sequence), while the value passed as a parameter to send() is what the yield expression returns and gets assigned to value, which is printed out on the next line.

Before sending any values to the generator, it has to be advanced to the next yield. In fact, advancing is the only allowed operation on a newly-created generator. This can be done by calling next(g) or g.send(None), which are equivalent.

Warning

Remember to always advance a generator that was just created, or you will get a TypeError.
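A common pattern is to wrap this priming step in a decorator, so that callers always receive a generator that is already positioned at its first yield. A minimal sketch (the decorator name prepare_coroutine is my own, not a standard one):

from functools import wraps

def prepare_coroutine(func):
    """Create the generator and advance it to its first yield."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)  # equivalent to gen.send(None)
        return gen
    return wrapper

@prepare_coroutine
def primed_gen(start=0):
    step = start
    while True:
        value = yield step
        print(f"Got {value}")
        step += 1

With this, g = primed_gen(1) can receive g.send("hello") right away, without the explicit next() call.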

With the .throw() method, the caller can make the generator raise an exception at the point where it is suspended. If this exception is handled internally in the generator, it will continue normally, and the return value will be that of the next yield line it reaches. If it’s not handled by the generator, it will fail, and the exception will propagate to the caller.

The .close() method is used to terminate the generator. It will raise the GeneratorExit exception inside the generator. If we wish to run some cleanup code, this is the exception to handle. When handling this exception, the only allowed action is to return; yielding another value would raise a RuntimeError.
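The following minimal sketch (the generator and the injected exception are illustrative) shows both methods in action: a ValueError injected with .throw() is handled internally, so the generator keeps running, while .close() triggers the cleanup code:

def stream():
    try:
        while True:
            try:
                value = yield
                print(f"processing {value}")
            except ValueError as e:
                # Injected with .throw(); handled here, so the generator survives.
                print(f"recovered from: {e}")
    except GeneratorExit:
        # Raised inside the generator by .close(); run cleanup and return.
        print("cleaning up")

>>> s = stream()
>>> next(s)            # advance to the first yield
>>> s.send("data")
processing data
>>> s.throw(ValueError("bad data"))
recovered from: bad data
>>> s.close()
cleaning up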

With these additions, generators have now evolved into coroutines. This means our code can now support concurrent programming, suspend the execution of tasks, perform non-blocking I/O, and so on.

While this works, handling many coroutines, refactoring generators, and organizing the code became a bit cumbersome. More work had to be done if we wanted to keep a Pythonic way of doing concurrent programming.

More Coroutines

PEP-380 added more changes to coroutines, this time with the goal of supporting delegation to sub-generators. Two main things changed in generators to make them more useful as coroutines:

  • Generators can now return values.
  • The yield from syntax.

Return Values in Generators

The def keyword defines a function, which returns values (with the return keyword). However, as stated in the first section, if that def contains a yield, it is a generator function. Before this PEP, it would have been a syntax error to have a return in a generator function (a function that also has a yield). However, this is no longer the case.

Remember how generators stop by raising StopIteration. What does it mean that a generator returns a value? It means that it stops. And where does that value go? It’s contained inside the exception, as an attribute: StopIteration.value.

def gen():
    yield 1
    yield 2
    return "returned value"

>>> g = gen()
>>> try:
...     while True:
...         print(next(g))
... except StopIteration as e:
...     print(e.value)
...
1
2
returned value

Notice that the value returned by the generator is stored inside the exception, in StopIteration.value. This might not sound like the most elegant solution, but doing it this way preserves the original interface, and the protocol remains unchanged. It’s still the same kind of exception signalling the end of the iteration.

yield from

Another syntax change to the language.

In its most basic form, the construction yield from <iterable> can be thought of as:

for e in iterable:
    yield e

Basically, this means that it iterates over an internal iterable, yielding every element that it can produce.

For example, this way we could create a clone of the itertools.chain function from the standard library.

>>> def chain2(*iterables):
...     for it in iterables:
...         yield from it

>>> list(chain2("hello", " ", "world"))
['h', 'e', 'l', 'l', 'o', ' ', 'w', 'o', 'r', 'l', 'd']

However, saving two lines of code is not the reason why this construction was added to the language. The raison d’être of this construction is to actually delegate responsibility to smaller generators, and chain them.

>>> def internal(name, limit):
...     for i in range(limit):
...         got = yield i
...         print(f"{name} got: {got}")
...     return f"{name} finished"

>>> def gen():
...     yield from internal("A", 3)
...     return (yield from internal("B", 2))

>>> g = gen()
>>> next(g)
0
>>> g.send(1)
A got: 1
1

>>> g.send(1)   # a few more calls until the generator ends
B got: 1
Traceback (most recent call last):
  ...
StopIteration: B finished

Here we see how yield from handles proper delegation to an internal generator. Notice that we never send values directly to internal, but to gen instead, and these values end up in the nested generator. What yield from actually does is create a channel between the caller and all nested generators. Values produced by these will be provided to the caller of gen. Values sent to it will be passed along to the internal generators (the same goes for exceptions). Even the return value is handled, and becomes the return value of the top-level generator (in this case, the string that states the name of the last generator becomes the resulting StopIteration.value).
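The same channel works for exceptions. In the following sketch (the names resilient and outer are just illustrative), an exception thrown into the outer generator is handled by the nested one, which then keeps producing values:

>>> def resilient(name, limit):
...     for i in range(limit):
...         try:
...             got = yield i
...         except ValueError as e:
...             print(f"{name} handled: {e}")
...     return f"{name} finished"

>>> def outer():
...     return (yield from resilient("inner", 3))

>>> g = outer()
>>> next(g)
0
>>> g.throw(ValueError("oops"))
inner handled: oops
1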

We see now the real value of this construction. With this, it’s easier to refactor generators into smaller pieces, compose them, and chain them together while preserving the behaviour of coroutines.

The new yield from syntax is a great step towards supporting better concurrency. We can now think of generators as being “lightweight threads” that delegate functionality to an internal generator and pause the execution, so that other things can be computed in the meantime.

Because syntactically generators are like coroutines, it was possible to accidentally confuse them, and end up placing a generator where a coroutine would have been expected (the yield from would accept it, after all). For this reason, the next step was to define the concept of a coroutine as a proper type. With this change, it also followed that yield from evolved into await, and a new syntax for defining coroutines was introduced: async.

async def / await

A quick note on how this relates to asynchronous programming in Python.

In asyncio, or any other event loop, the idea is that we define coroutines, and make them part of the event loop. Broadly speaking, the event loop will keep a list of the tasks (which wrap our coroutines) that have to run, and will schedule them to run.

In our coroutines, we delegate the I/O functionality we want to achieve to some other coroutine or awaitable object, by calling yield from or await on it.

Then the event loop will call our coroutine, which will reach this line, delegating to the internal coroutine, and pausing the execution, which gives control back to the scheduler (so it can run another coroutine). The event loop will monitor the future object that wraps our coroutine until it is finished, and when needed, it will update it by calling the .send() method on it, which in turn will pass the value along to the internal coroutine, and so on.
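As a minimal sketch of this hand-off (using the pre-3.7 asyncio API that matches the Python 3.6 context of this text):

import asyncio

async def main():
    # While we await, control goes back to the event loop,
    # which is free to run other coroutines in the meantime.
    await asyncio.sleep(1)
    return "done"

loop = asyncio.get_event_loop()
result = loop.run_until_complete(main())
print(result)  # done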

Before the new syntax for async and await was introduced, coroutines were defined as generators decorated with asyncio.coroutine (types.coroutine was added in Python 3.5, when the coroutine type itself was created). Nowadays, async def creates a native coroutine, and inside it, only the await expression is accepted (not yield from).

The following two coroutines, step and coro, are a simple example of how await works similarly to yield from, delegating the values to the internal generator.

>>> import types

>>> @types.coroutine
... def step():
...     s = 0
...     while True:
...         value = yield s
...         print("Step got value ", value)
...         s += 1

>>> async def coro():
...     while True:
...         got = await step()
...         print(got)


>>> c = coro()
>>> c.send(None)
0
>>> c.send("first")
Step got value  first
1

>>> c.send("second")
Step got value  second
2

>>> c.send("third")
Step got value  third
3

Once again, as in the yield from example, when we send a value to coro, it reaches the await instruction, which means it will be passed along to the step coroutine. In this simple example, coro is something like what we would write, while step would be an external function we call.

The following two snippets show two different ways of defining the same coroutine.

# py 3.4
@asyncio.coroutine
def coroutine():
    yield from asyncio.sleep(1)

# py 3.5+
async def coroutine():
    await asyncio.sleep(1)

Basically, this means that this asynchronous way of programming is kind of like an API for working with event loops. It doesn’t really relate to asyncio in particular; we could use any event loop (curio, uvloop, and so on) for this. The important part is to understand that an event loop will call our coroutine, which will eventually reach the line where we defined the await, and this will delegate the execution to an external function (in this case, asyncio.sleep). When the event loop calls send(), this is also passed along, and the await gives control back to the event loop, so a different coroutine can run.

The coroutines we define therefore sit in between the event loop and the third-party functions that know how to handle the I/O in a non-blocking fashion.

The event loop then works through a chain of await calls. Ultimately, at the end of that chain, there is a generator that pauses the execution of the function and handles the I/O.

In fact, if we check the type of asyncio.sleep, we’ll see that it is indeed a generator:

>>> asyncio.sleep(1)
<generator object sleep at 0x...>

So, with this new syntax, does this mean that await is just like yield from?

Only with respect to coroutines. It’s correct to write await <coroutine> as well as yield from <coroutine>, but the former won’t work with other iterables (for example, generators that aren’t coroutines, sequences, and so on). Conversely, the latter won’t work with awaitable objects.

The reason for this syntax change is correctness. Actually, it’s not just a syntax change; the new coroutine type is properly defined:

>>> from collections import abc
>>> issubclass(abc.Coroutine, abc.Awaitable)
True

Given that coroutines are syntactically like generators, it would be possible to mix them up, and place a generator in asynchronous code where in fact we expected a coroutine. By using await, the type of the object in the expression is checked by Python, and if it doesn’t comply, a TypeError is raised.
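A quick illustration of that check (the error message shown is the one from CPython 3.6):

>>> def plain_generator():
...     yield 1

>>> async def c():
...     await plain_generator()

>>> c().send(None)
Traceback (most recent call last):
  ...
TypeError: object generator can't be used in 'await' expression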

Asynchronous Generators

In Python 3.5, not only was the proper syntax for coroutines added (async def / await), but also the concept of asynchronous iterators. The idea of having an asynchronous iterable is to iterate while running asynchronous code. For this, new methods such as __aiter__ and __anext__ were added under the concept of asynchronous iterators.

However, there was no support for asynchronous generators. That is analogous to saying that for asynchronous code we had to use iterators (like __iter__ / __next__ in regular code), but we couldn’t use generators (having a yield in an async def function was an error).

This changed in Python 3.6, and now this syntax is supported, with the semantics of a regular generator (lazy evaluation, suspending and producing one element at a time, and so on), while iterating.

Consider this simple example, in which we want to iterate while calling some I/O code that we don’t want to block on.

import asyncio
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


async def recv(no, size) -> str:
    """Simulate reading <size> bytes from a remote source, asynchronously.
    It takes a time proportional to the bytes requested to read.
    """
    await asyncio.sleep((size // 512) * 0.4)
    chunk = f"[chunk {no} ({size})]"
    return chunk


class AsyncDataStreamer:
    """Read 10 times into data"""
    LIMIT = 10
    CHUNK_SIZE = 1024

    def __init__(self):
        self.lecture = 0

    def __aiter__(self):
        return self

    async def __anext__(self):
        if self.lecture >= self.LIMIT:
            raise StopAsyncIteration

        result = await recv(self.lecture, self.CHUNK_SIZE)
        self.lecture += 1
        return result

async def test():
    async for read in AsyncDataStreamer():
        logger.info("collector on read %s", read)

The test function will simply exercise the iterator, on which elements are produced one at a time, while calling an I/O task (in this example, asyncio.sleep).

With asynchronous generators, the same could be rewritten in a more compact way.

async def async_data_streamer():
    LIMIT = 10
    CHUNK_SIZE = 1024
    lecture = 0
    while lecture < LIMIT:
        yield await recv(lecture, CHUNK_SIZE)
        lecture += 1
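Consuming it looks just like the class-based version; a minimal sketch of driving it on the event loop (again with the pre-3.7 API):

async def collector():
    async for read in async_data_streamer():
        logger.info("collector on read %s", read)

loop = asyncio.get_event_loop()
loop.run_until_complete(collector())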

Summary

It all started with generators. It was a simple way of having lazy computation in Python, and running more efficient programs that use less memory.

This evolved into coroutines, taking advantage of the fact that generators can suspend their execution. By extending the interface of generators, coroutines provided more powerful features to Python.

Coroutines were also improved to support better patterns, and the addition of yield from was a game changer that allows us to have better generators, refactor them into smaller pieces, and reorganize the logic better.

The addition of an event loop to the standard library helps to provide a reference way of doing asynchronous programming. However, the logic of the coroutines and the await syntax is not bound to any particular event loop. It’s an API [2] for doing asynchronous programming.

Asynchronous generators were the latest addition to Python that relates to generators, and they help build more compact (and efficient!) code for asynchronous iteration.

In the end, behind all the logic of async / await, everything is a generator. Coroutines are in fact (technically) generators. Conceptually they are different, and have different purposes, but in terms of implementation, generators are what make all this asynchronous programming possible.

Notes

[1] Needless to say, the results will vary from system to system, but we get an idea of the difference between both implementations.
[2] This is an idea from David Beazley, which you can see at https://youtu.be/ZzfHjytDceU