Multiple Homebrew Pythons

The Homebrew project brings a nice package manager to OS X, making it simple to install command line programs. For instance, to install the latest version of Python 3, you would do:

$ brew install python3

Because not all projects are as careful about old versions as Python is, when brew upgrades a package it removes the old version's linked binaries and support files. For Python, this is not a good thing: it means you can no longer access the older interpreter.

Python keeps version-named interpreters, and then just symlinks the most recently installed to the python executable. Thus, it’s not uncommon to see, for python2:

$ ls -1 /usr/bin/python*
/usr/bin/python
/usr/bin/python-config
/usr/bin/python2.5
/usr/bin/python2.5-config
/usr/bin/python2.6
/usr/bin/python2.6-config
/usr/bin/python2.7
/usr/bin/python2.7-config
/usr/bin/pythonw
/usr/bin/pythonw2.5
/usr/bin/pythonw2.6
/usr/bin/pythonw2.7

This means, if you want to run an older version (for instance, say you use tox and want to do some testing against a range of versions), you can just use:

$ python2.5
Python 2.5.6 (r256:Unversioned directory, Mar  9 2014, 22:15:03) 
[GCC 4.2.1 Compatible Apple LLVM 5.0 (clang-500.0.68)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> 

But Homebrew breaks this.
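
Fortunately, unless you have run brew cleanup, the older versions themselves are usually still sitting in the Cellar; it is only the links that are removed. You can check with something like:

$ ls /usr/local/Cellar/python3/
3.2.3    3.3.3    3.4.1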

However, if you still have the old versions installed, you can easily recreate the symlinks. Here is a script that visits every Homebrew-installed python3 version, creating symlinks under /usr/local for its binaries, libraries, share files and frameworks:

cd /usr/local/Cellar/python3/

for VERSION in `ls`
do
  cd $VERSION

  # Link the binaries whose names contain a 3 (python3.x, pip3.x, ...)
  find bin -name \*3\* -exec ln -s -f `pwd`/{} /usr/local/{} \;
  # Link the top-level lib/pythonX.Y directory (-F replaces an existing directory)
  find lib -maxdepth 1 -name \*python\* -exec ln -s -F `pwd`/{} /usr/local/{} \;
  # Link the shared support files
  find share -name python\* -exec ln -s -f `pwd`/{} /usr/local/{} \;
  # Link the framework Versions/3.x directory
  find Frameworks -name 3.\* -exec ln -s -f `pwd`/{} /usr/local/{} \;

  cd ..
done
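
If it worked, each version-suffixed binary in /usr/local/bin should point back into the Cellar; for example (output illustrative):

$ readlink /usr/local/bin/python3.3
/usr/local/Cellar/python3/3.3.3/bin/python3.3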

It worked for me for python3 with the following versions installed:

  • 3.2.3
  • 3.3.3
  • 3.4.1

Now I just need to figure out how to get Homebrew to download and build specific versions of packages.

Per-command Virtualenv

Recently, I finally got around to re-installing OS X from scratch on my work machine. It was well past due: I would frequently be unable to wake the machine from display sleep, and saving a file in a monitored directory would take the wsgi-monitor package tens of seconds to restart django.

One thing I wanted to do this time was to only install stuff as necessary, and to put every pip-installed command line tool in its own virtualenv. However, this has one drawback: it is a little repetitive.

For instance, to install Fabric, my deployment tool of choice:

$ virtualenv ~/.venv/fabric
$ . ~/.venv/fabric/bin/activate
(fabric)$ pip install fabric
(fabric)$ ln -s ~/.venv/fabric/bin/fab /usr/local/bin/

This is fine if you only have one ‘tool’ to install, but something like docutils actually installs a whole stack of command line tools.

What we want, is something like:

  • create the virtualenv
  • get a list of items already in the <virtualenv>/bin
  • install the required tool (and any extra modules)
  • link all of the newly added commands in <virtualenv>/bin to /usr/local/bin

We could just add each <virtualenv>/bin to our PATH, but that would mean the first virtualenv created would be the one used for pip, which I don't want exposed at all.

Additionally, it would be nice to be able to specify a required version of the package to install, and other (non-dependency) packages that should be installed. For instance, I want mercurial_keyring to be installed in the mercurial virtualenv.

This last one is probably less important, as you can just use that virtualenv's pip to install them afterwards. But the version number handling might be nice.
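
In shell terms, the manual version of that process looks roughly like this (using docutils as the example; the bootstrap script below automates it):

$ virtualenv ~/.venv/docutils
$ ls ~/.venv/docutils/bin > /tmp/before
$ ~/.venv/docutils/bin/pip install docutils
$ ls ~/.venv/docutils/bin > /tmp/after
$ comm -13 /tmp/before /tmp/after | xargs -I{} ln -s ~/.venv/docutils/bin/{} /usr/local/bin/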

virtualenv has the nice ability to create bootstrap scripts, which can run extra code (like installing specific packages) after the environment is created. We can co-opt this to build a tool that does the automatic installation and linking:

import virtualenv, subprocess

data = """
import os, subprocess

def extend_parser(optparse_parser):
    optparse_parser.add_option(
        "--upgrade",
        action="store_true",
        dest="upgrade",
        default=False,
        help="Upgrade package",
    )
    optparse_parser.add_option(
        "--path",
        dest="path",
        default='~/.venv/',
        help="Parent path of virtualenvs"
    )
    optparse_parser.add_option(
        '--package',
        dest="packages",
        action="append",
        help="Other packages to install"
    )
    
# Rewrite the positional argument (the package name, possibly with a
# ==version suffix) into the full path of the virtualenv to create,
# keeping the full requirement string for pip to install later.
def adjust_options(options, args):
    global package
    if not args:
        return
    package = args[0]
    if '==' in args[0]:
        # Use just the package name (without the version) as the venv name.
        args[0], version = args[0].split('==', 1)
    args[0] = os.path.join(os.path.expanduser(options.path), args[0])

# After the virtualenv has been created, pip-install the requested package
# (plus any extras), then symlink every newly-created command in its bin/
# directory into /usr/local/bin.
def after_install(options, home_dir):
    global package
    venv = os.path.join(os.path.expanduser(options.path), home_dir)
    before = os.listdir(os.path.join(venv, 'bin'))
    command = [os.path.join(venv, 'bin', 'pip'), 'install', package]
    if options.upgrade:
        command += ['--upgrade']
    if options.packages:
        command += options.packages
    subprocess.call(command)
    after = os.listdir(os.path.join(venv, 'bin'))

    # Only link commands that were not there before the install.
    for command in set(after).difference(before):
        subprocess.call([
            'ln', '-s',
            os.path.join(venv, 'bin', command),
            '/usr/local/bin'
        ])
"""

output = virtualenv.create_bootstrap_script(data)
with open('/usr/local/bin/pip-install', 'w') as script:
    script.write(output)
subprocess.call(['chmod', '+x', '/usr/local/bin/pip-install'])
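
Usage then looks something like this (the package names and version are only examples):

$ pip-install mercurial --package=mercurial_keyring
$ pip-install fabric==1.4.3
$ pip-install docutils --upgrade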

There is one caveat: if an existing file in /usr/local/bin matches a command that should be linked, the ln call will fail and that command will be skipped. That is, it does not overwrite existing commands. I think this is preferable, as it is marginally safer.

Linking commands like this is better than copying them, as it means you can just do a pip install --upgrade <package> in the relevant virtualenv, and the linked commands will pick up the new version. You can also use pip-install <package>==<new-version>, and that should work too. However, if you unlink a command (or remove one that failed to link), running pip-install again will not re-link commands that were already installed in that virtualenv.

Anyway, your mileage may vary. I’m using it now, and it seems good.

Installing django (or any python framework, really)

TL;DR

$ pip install virtualenv
$ virtualenv /path/to/django_project/
$ . /path/to/django_project/bin/activate
$ pip install django

I hang around a fair bit in #django now on IRC. It’s open most of the time I am at work: if I am waiting for something to deploy, I’ll keep an eye out for someone that needs a hand, or whatever. Yesterday, I attempted to help someone out with an issue with django and apache: I ended up having to go home before it got sorted out.

One of the things that came up was how to actually install django. The person was following instructions on how to do so under Ubuntu, but they weren’t exactly ‘best practice’.

One of the things I wish had been around when I first started developing with python is virtualenv. This tool allows you to isolate a python environment, and install stuff into it without affecting other virtual environments or the system python installation.

Unfortunately, it does not come standard with python. If it were part of the standard library, more people would probably use it. The upside of it not being in the standard library is that it gets updated more frequently.

Installing virtualenv

First, see if virtualenv is installed:

$ virtualenv --version

If not, you'll need to install it. You can do so using pip or easy_install, if you have either of those installed. If you are a super-user on the machine (i.e., it is your computer), you may want to use sudo; alternatively, you can install it just into your user account, which you might need to do on a shared computer.
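
For instance, one of the following should do the trick, depending on whether you want a system-wide or per-user install:

$ sudo pip install virtualenv
$ pip install --user virtualenv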

You’ll probably also want to install pip at the system level. I do this first, and use it to install virtualenv, fabric and other packages that I need to use outside of a virtualenv (mercurial springs to mind). Do note that a virtualenv contains an install of pip by default, so this is up to you: once you have virtualenv installed, you can use pip in every virtualenv to install packages.

Setting up a virtual environment

I recommend using virtualenv for both development and deployment.

I think I use virtualenv slightly differently to most other people. My project structure tends to look like:

/home/user/development/<project-name>/
    bin/
    fabfile.py
    include/
    lib/python2.6/site-packages/...
    project/
        # Project-specific stuff goes here
    src/
        # pip install -e stuff goes here
    tmp/

Thus, my $VIRTUAL_ENV is actually also my $PROJECT_ROOT. This means that everything is self contained. It has the negative side-effect that if I clone my project, I need to install everything again. This is not such a bad thing, as I use Fabric to automate the setup and deployment processes. It takes a bit of time, but using a local pypi mirror makes it fairly painless.

Obviously, I ignore bin/, lib/ and the other virtualenv created directories in my source control.
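
For example, with Mercurial, an .hgignore along these lines covers the virtualenv-created directories (adjust the patterns to taste):

$ cat .hgignore
syntax: glob
bin/*
include/*
lib/*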

However, since we are starting from scratch, we won’t have a fabfile.py to begin with, and we’ll just do stuff manually.

$ cd /location/to/develop
$ virtualenv my_django_project

That’s it. You now have a virtual environment.

Installing django/other python packages

You’ll want to activate your new virtualenv to install the stuff you will need:

$ cd my_django_project
$ . bin/activate
(my_django_project)$

Notice the prompt changes to show you are in a virtual environment.

Install the packages you need (from now on, I’ll assume your virtualenv is active):

$ pip install django

There has been some discussion about having packages like psycopg2 installed at the system level: I tend to install everything into the virtualenv.
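
For example, if your project uses PostgreSQL, and assuming the PostgreSQL client headers are available so the extension can build, it is just another install into the active virtualenv:

(my_django_project)$ pip install psycopg2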

So that’s it. You now have django installed in a virtual environment. I plan to write some more later about my deployment process, as well as how I structure my django projects.