Today we are pleased to announce the release of IPython 7.0, the powerful Python interactive shell that goes above and beyond the default Python REPL with advanced tab completion, syntax highlighting, and more. It is the Jupyter kernel for Python used by millions of users, hopefully including you. This is the second major release of IPython since we dropped support for Python 2.
Not having to support Python 2 allowed us to make full use of new Python 3 features and bring never-before-seen capabilities to a Python console. We still encourage library authors and users to look at the Python 3 Statement to learn about the end of life of Python 2 and how to stop supporting Python 2 without breaking installation for Python 2 end users.
As developers and maintainers of IPython, being able to develop for a single version of Python saved us a great deal of time. Avoiding conditional imports, being able to rely on type annotations, and making use of newly available Python APIs all made us more productive. Since most of the work on IPython is done by volunteers on nights and weekends, with only a couple of minutes here and there, this often made the difference between a patch reaching completion and the contributor moving on to other pastures.
One of the core features we focused on for this release is the ability to (ab)use the async and await syntax available in Python 3.5+. There are of course many other improvements in this release you can read about in the what’s new.
TL;DR: You can now use async/await at the top level in the IPython terminal and in the notebook; it should, in most cases, "just work". Update IPython to version 7+ and IPykernel to version 5+, and you're off to the races.
See how to install IPython by reading the “what’s new”.
The recipes are currently building on conda-forge and should be available soon. For the time being, you can install them via pip:
$ pip install ipython ipykernel --upgrade
You may have heard about async/await, threads, concurrency, preemptive scheduling, and cooperative scheduling without really understanding what all this is about. If you are not familiar with all the above terms, the hype may be confusing, so let's talk about concurrency at a really high level.
Typically when your computer needs to execute many tasks, it will switch between them really fast, so from the human point of view it looks like everything is being processed at the same time. There are two main ways of doing so under the hood: Preemptive Scheduling, and Cooperative Scheduling.
With preemptive scheduling, task switching can happen at any time. For example, while writing this blog post, I could stop in the middle of a word to start writing an email, which would itself be interrupted to check Gitter/Slack, before coming back, writing five words, and stopping to get dinner.
With cooperative scheduling, task switches can happen only at agreed spots. The term co-operative comes from the fact that tasks need to co-operate for the whole process to function. If a task decides never to take a break to let you do something else, the illusion of many tasks being completed at once disappears.
Each approach has its own advantages and drawbacks, and we will not focus on these. Let's just say that with cooperative scheduling, async/await lets you mark the areas where interruption is allowed to occur.
Moreover, the async/await syntax allows cooperative scheduling in Python in a way that lets you write code that looks synchronous (without task switches), while actually being interruptible from the point of view of the computer. It also keeps programmers from having to worry about global state changing under their feet, as this can happen only in the proximity of `await` keywords. When going to a restaurant, social conventions (and common sense) tell us when and how these interactions can or cannot be interrupted, but programming languages need explicit markers when using cooperative scheduling. These are the `async` and `await` keywords in Python: `async` marks a function that may be interrupted, and `await` is required to call async functions (aka coroutines) and marks a point where the task can be switched.
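A minimal sketch of these two keywords (the coroutine name and delay here are made up for illustration):

```python
import asyncio

# "async def" marks this function as a coroutine: a task that may be
# suspended and resumed at its "await" points.
async def fetch_answer():
    await asyncio.sleep(0.1)  # an agreed spot where a task switch may occur
    return 42

# Outside an async-aware REPL, a coroutine does not run by itself:
# it must be handed to an event loop.
result = asyncio.run(fetch_answer())
print(result)  # prints 42
```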
If you want to learn more we strongly recommend reading the Trio Tutorial Primer on async programming.
In the current Python ecosystem, packages tend to standardize around AsyncIO, provided in the Python standard library. AsyncIO is sometimes judged complex even by well-known developers; this is in part due to the necessity of supporting older asynchronous projects like Twisted or Tornado, but it is also what makes a lot of its power: one event loop to rule them all.
Running a single async task requires you to learn about AsyncIO and to write a non-negligible amount of boilerplate code just to fetch a single result. This can be especially cumbersome during interactive exploration, and will likely keep users from experimenting with AsyncIO code.
As Raymond Hettinger would say (slamming hand on podium): "There must be a better way".
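To give an idea of the boilerplate involved, here is a sketch of what fetching one result looked like before top-level await (the coroutine is a made-up example):

```python
import asyncio

async def get_data():
    # stand-in for some real asynchronous work
    await asyncio.sleep(0.1)
    return "a single result"

# Boilerplate needed just to drive one coroutine to completion:
# create an event loop, run the task, clean up afterwards.
loop = asyncio.new_event_loop()
try:
    result = loop.run_until_complete(get_data())
finally:
    loop.close()
print(result)
```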
Thanks to a multi-month effort (this work actually started close to two years ago) and the work of many talented people, you can now directly await code in the REPL and IPython will do "the right thing".
With the new integration, you don’t have to import or learn about asyncio, deal with the loop yourself, or wrap your task in its own function. You are now able to just focus on the business logic and move along.
The only thing you need to remember is: If it is an async function you need to await it.
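For example, in an IPython 7+ session you can simply write (a sketch; `asyncio.sleep`'s `result` argument just echoes a value back after the delay):

```
In [1]: import asyncio

In [2]: await asyncio.sleep(0.1, result='hello')
Out[2]: 'hello'
```

No loop management, no wrapper function: the coroutine is awaited directly at the top level.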
We hope that this will free users to experiment and play with asynchronous programming. Of course, this will not magically make your code faster or run in parallel; it will simply be easier to write and reason about.
The addition of the `await` keyword in Python not only simplified asynchronous programming and the standardization around `asyncio`; it also allowed experimentation with new paradigms for asynchronous libraries. David Beazley created Curio, and Nathaniel Smith created Trio, both of which explore new ways to write asynchronous programs and how `await` and coroutines could be used when starting from a blank slate. The Trio documentation introduction and the posts on which problems it attempts to solve [1, 2, 3, 4, 5, 6] are highly recommended reading, with varying levels of technicality.
Interactive use of libraries is key to gaining insight and intuition into how a system works, and intuition is critical to rapid prototyping, development, and the creation of higher levels of abstraction. It was natural for us to build support for Curio, Trio (and potentially other new async libraries) into IPython.
You can set up IPython to run async code via Curio or Trio, and experiment or write production code using these libraries. To do so, use the `%autoawait` magic and tell it which library to use.
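A sketch of what such a session might look like, assuming Trio is installed:

```
In [1]: %autoawait trio

In [2]: import trio

In [3]: await trio.sleep(0.1)
```

After the `%autoawait trio` call, top-level `await` expressions are run by Trio rather than asyncio.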
As you can see, the code looks really natural, and it is easy to forget that the above snippet is a syntax error in plain Python and in older versions of IPython. The astute reader and IPython expert will have suggested using the %%time cell magic instead of doing it manually, though a couple of magics still need updates to properly handle async code. We look forward to your contributions on this front, and are excited to see what you can come up with.
If you are a Jupyter user, you most likely use a Notebook interface, and interact with IPython via the ipykernel package.
We’ve been working hard on making async code work in the notebook when using ipykernel. While most of the heavy lifting was done in IPython, the work in IPykernel was non-negligible and required accommodating a number of use cases, some of which do not work yet. You have to update both IPython to version 7.0+ and ipykernel to version 5.0+ for async support to be available. If you are using pip:
$ pip install ipython ipykernel --upgrade

As for conda, the packages should be available on conda-forge soon. With these new releases, async code will work with all the frontends that support the Jupyter protocol, including the classic Notebook, JupyterLab, Hydrogen, nteract desktop, and nteract web. By default, code will run in the existing asyncio/Tornado loop that runs the kernel. Integration with Trio and Curio is still available, but their tasks will not be interleaved with the asyncio ones, at least not yet. We welcome work on this front.
Submitting background tasks still requires you to access the asyncio event loop, and we are still looking for contributions on this front as well, to make it even easier to run async code.
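For now, a background task can be sketched roughly like this in a notebook, by handing a coroutine to the kernel's already-running loop (`tick` is a made-up example coroutine):

```
In [1]: import asyncio

In [2]: async def tick():
   ...:     await asyncio.sleep(1)
   ...:     print('tick')

In [3]: task = asyncio.ensure_future(tick())  # scheduled on the kernel's loop
```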
There are still some questions about how to handle nested asyncio event loops. It is indeed usually impossible to run nested event loops; in the case of `asyncio`, trying to do so raises a `RuntimeError`. Since the kernel already runs in an asyncio event loop, calling `loop.run_until_complete` and the like, directly or indirectly, is not possible. There are discussions about using libraries like `nest_asyncio`, as pointed out in this comment, but until those are more battle-tested we do not want to commit to a default solution in the core of IPython, and would rather let the ecosystem develop.
As far as we know, this is the first async-aware Python REPL, and libraries like Trio and Curio are still young, so there are a number of use cases we have not even thought about yet! We encourage you to come forward and talk about your use cases: what you tried, and what did not work. There are also a number of new features to implement (making magics work with async code, tab completion, background tasks) for which we would welcome new contributors.