Goodbye Virtual Environments?

By Chad Smith

If you’re a Python developer, you’ve likely heard of Virtual Environments. A Virtual Environment is “a self-contained directory tree that contains a Python installation for a particular version of Python, plus a number of additional packages.”
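For reference, a Virtual Environment can be created programmatically with the standard library’s venv module; a minimal example:

```python
import venv

# Create a virtual environment in the ".venv" directory.
# with_pip=True also bootstraps pip inside it, so packages
# can then be installed into the environment.
venv.create(".venv", with_pip=True)
```

On the command line, the equivalent is python -m venv .venv, followed by sourcing the environment’s activate script.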

Why are they so popular? Well, they solve a real problem: packages are no longer installed into a global mishmash of site-packages. Instead, each project can install precise dependency versions into its own “virtual” environment.

However, they introduce some problems as well:

  • Learning curve: explaining “virtual environments” to people who just want to jump in and code is not always easy
  • Terminal isolation: Virtual Environments are activated and deactivated on a per-terminal basis
  • Cognitive overhead: Setting up, remembering installation location, activating/deactivating

To address some of the above points, new higher-level tools such as pipenv, poetry, flit, and hatch have been created. These tools improve the situation by hiding the complexities of pip and Virtual Environments. However, in order to hide complexity they become complex themselves, and they bring their own APIs, learning curves, and maintenance burdens.

A Python Enhancement Proposal (PEP) was introduced back in May of 2018 to modify Python itself to solve the same problem Virtual Environments solve, but in a totally different and much simpler way.

I’ll let PEP 582 speak for itself:

This PEP proposes to add to Python a mechanism to automatically recognize a __pypackages__ directory and prefer importing packages installed in this location over user or global site-packages. This will avoid the steps to create, activate or deactivate “virtual environments”. Python will use the __pypackages__ from the base directory of the script when present.

This proposal effectively works around all the complexity of Virtual Environments and their higher level counterparts simply by searching a local path for additional packages.
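The core of that lookup rule is small enough to sketch in a few lines of Python (a rough approximation for illustration, not the actual CPython patch):

```python
import pathlib
import sys

# Sketch of the PEP 582 rule: if a __pypackages__ directory exists
# in the base directory, prefer its lib directory for this Python
# version over user or global site-packages.
version = f"{sys.version_info.major}.{sys.version_info.minor}"
pkg_dir = pathlib.Path.cwd() / "__pypackages__" / version / "lib"
if pkg_dir.is_dir():
    sys.path.insert(0, str(pkg_dir))
```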

It even comes with a reference CPython implementation. Its fate is still being determined.

If you don’t have the time or desire to build a CPython binary, you can try a Python wrapper I made called pythonloc (for “python local”). It is a Python package (less than 100 lines of code) that does exactly what PEP 582 describes.

pythonloc runs Python, but will import packages from __pypackages__, if present. It also ships with piploc, which is the same as pip but installs/uninstalls to __pypackages__.
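piploc’s install step can be roughly approximated with pip’s existing --target option. The helper below (piploc_cmd is an illustrative name, not part of pythonloc) builds such an invocation:

```python
import sys


def piploc_cmd(package, base="__pypackages__"):
    """Build a pip command that installs a package into a PEP 582-style
    directory, using pip's --target option (roughly what piploc does)."""
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    return [sys.executable, "-m", "pip", "install",
            "--target", f"{base}/{version}/lib", package]
```

It could then be run with, e.g., subprocess.run(piploc_cmd("requests"), check=True).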

Here is an example.

> piploc install requests
Successfully installed certifi-2018.11.29 chardet-3.0.4 idna-2.8 requests-2.21.0 urllib3-1.24.1
> tree -L 4
.
└── __pypackages__
    └── 3.6
        └── lib
            ├── certifi
            ├── certifi-2018.11.29.dist-info
            ├── chardet
            ├── chardet-3.0.4.dist-info
            ├── idna
            ├── idna-2.8.dist-info
            ├── requests
            ├── requests-2.21.0.dist-info
            ├── urllib3
            └── urllib3-1.24.1.dist-info
> python -c "import requests; print(requests)"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
> pythonloc -c "import requests; print(requests)"
<module 'requests' from '/tmp/demo/__pypackages__/3.6/lib/requests/'>

Note: This is identical to what you might find in your site-packages directory, i.e. ~/.local/lib/python3.6/site-packages.

As you can see, piploc installed requests to __pypackages__/3.6/lib/requests. Running python demonstrated that it did not find requests, which is expected since it doesn’t search __pypackages__.

To make Python find it, you can run pythonloc, which is equivalent to running PYTHONPATH=.:__pypackages__/3.6/lib:$PYTHONPATH python. This searches __pypackages__ and finds the requests installation. 🎉
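To see for yourself where Python resolves a module from, you can use importlib.util.find_spec (shown here with the stdlib’s json module, since requests may not be installed; under pythonloc, a __pypackages__ path would show up as the origin):

```python
import importlib.util

# find_spec reports where a module would be loaded from
# without fully importing it.
spec = importlib.util.find_spec("json")
print(spec.origin)
```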

You can try pythonloc today by running

pip install --user pythonloc

and can learn more at the project’s GitHub repository.

If you have the source code available and it has a setup.py file, you can run

piploc install .

then run pythonloc and have all your dependencies available.

If you have a requirements.txt file, you can run

piploc install -r requirements.txt

If you are using pipenv you can generate a requirements.txt file with

pipenv lock --requirements

And finally if you are using poetry you can generate requirements.txt with

poetry run pip freeze > requirements.txt

Okay, so we can install from various sources, but what if we’re developing and want to generate a list of dependencies?

A new workflow you could use with the advent of __pypackages__ is to skip creating a list of dependencies altogether and commit __pypackages__ itself to source control. Doing that would virtually guarantee you’re using the same versions because, well… you’re using the exact same source code.

Assuming you don’t want to do that, you could run piploc freeze. But this presents a problem: it shows all installed Python packages, those in site-packages as well as those in __pypackages__. This probably isn’t what you want because it includes more than what you installed to __pypackages__.

You likely only want to output the packages installed to __pypackages__. That is exactly what pipfreezeloc does.


It is the equivalent of pip freeze but only outputs packages in __pypackages__. This is required because there is no built-in way to do this with pip. For example, the command pip freeze --target __pypackages__ does not exist.
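One plausible way to produce pipfreezeloc-style output is to read the .dist-info directory names under __pypackages__; a sketch (freeze_local is a hypothetical helper, not pipfreezeloc’s actual implementation):

```python
import pathlib
import sys


def freeze_local(base="__pypackages__"):
    """Emit name==version lines for packages installed under a
    PEP 582-style directory, derived from .dist-info directory names."""
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    lib = pathlib.Path(base) / version / "lib"
    lines = []
    for info in sorted(lib.glob("*.dist-info")):
        # e.g. "requests-2.21.0.dist-info" -> stem "requests-2.21.0"
        name, _, ver = info.stem.rpartition("-")
        lines.append(f"{name}=={ver}")
    return lines
```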

So instead of running

pip freeze > requirements.txt

you would run

pipfreezeloc > requirements.txt

PEP 582 introduces a new way to install and isolate packages without Virtual Environments. It also eliminates indirection between project location and environment location, since the installation location is always in the same directory as the project — at ./__pypackages__.

pythonloc (and piploc, pipfreezeloc) is a proof of concept Python implementation of PEP 582 available today.

What do you think? Should PEP 582 be approved? Are Virtual Environments going to be relied on less? Does pythonloc improve your workflow?