Update: I have de-contrived the example below.

After some struggle, I’ve managed to accomplish this seemingly simple task: wrap a friendly open-source C++ library (OpenBabel) with Boost.Python and turn it into a Python module. Such wrapping is handy because it keeps the speed of C++ while letting you drive everything from Python.

If you want to check out how to do more or less the same thing with Cython instead of Boost, see this.

So here is the setup.py:

#! /usr/bin/python

from ez_setup import use_setuptools
use_setuptools()
from setuptools import setup, Extension

include_dirs = []

if __name__ == '__main__':
    extensions = [Extension('_boostbabel', ['src/boostbabel.cpp'],
                            include_dirs=include_dirs + ['src/boostbabel/',
                                                         '/usr/include/openbabel-2.0/'],
                            language='c++',
                            libraries=['boost_python', 'openbabel'])]
    setup(name        = 'boostbabel',
          ext_package = 'boostbabel',
          ext_modules = extensions,
          zip_safe    = False)  # as a zipped egg the *.so files are not found (at least on Ubuntu/Linux)

Relative to setup.py, here is the actual source, src/boostbabel.cpp:

#include <Python.h>
#include <boost/python.hpp>
#include <openbabel/mol.h>
#include <openbabel/atom.h>
#include <openbabel/bond.h>

using namespace boost::python;
using namespace OpenBabel;

// the module name has to match the Extension name in setup.py, here _boostbabel
BOOST_PYTHON_MODULE(_boostbabel)
{
    // a minimal binding: expose OBMol with a couple of its getters
    class_<OBMol>("OBMol")
        .def("NumAtoms", &OBMol::NumAtoms)
        .def("NumBonds", &OBMol::NumBonds);
}
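
With both files in place, the build should go roughly like this. The package names below are Debian/Ubuntu-style and are my assumption; adjust them for your distribution:

```shell
# install the build dependencies (Debian/Ubuntu package names, as an example)
sudo apt-get install libboost-python-dev libopenbabel-dev

# compile the extension in place, next to setup.py
python setup.py build_ext --inplace
```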
Now, I know nothing about building/linking/including, so getting to that stage was pretty hard. Being the lazy one, I wonder: isn’t there a way to guess all of the includes and libs needed to initialize the Extension object by cleverly parsing boostbabel.cpp itself?

Nooope, there isn’t for the general case.
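
At best you can scrape the direct #include lines with a regex and look them up in a hand-curated table of known headers. A rough sketch of that heuristic follows; the header-to-flags table is purely illustrative, and note it only sees direct includes, so transitive headers and link-time dependencies stay invisible:

```python
import re

# Map a few known headers to the include dirs / libraries they imply.
# This table is illustrative -- you would have to curate it yourself.
KNOWN_HEADERS = {
    'boost/python.hpp': {'libs': ['boost_python']},
    'openbabel/mol.h': {'dirs': ['/usr/include/openbabel-2.0/'],
                        'libs': ['openbabel']},
}

INCLUDE_RE = re.compile(r'^\s*#\s*include\s*[<"]([^>"]+)[>"]', re.MULTILINE)

def guess_build_flags(source):
    """Scrape direct #include lines and look them up in KNOWN_HEADERS."""
    include_dirs, libraries = [], []
    for header in INCLUDE_RE.findall(source):
        info = KNOWN_HEADERS.get(header, {})
        for d in info.get('dirs', []):
            if d not in include_dirs:
                include_dirs.append(d)
        for lib in info.get('libs', []):
            if lib not in libraries:
                libraries.append(lib)
    return include_dirs, libraries

src = '#include <boost/python.hpp>\n#include <openbabel/mol.h>\n'
dirs, libs = guess_build_flags(src)
```

Anything pulled in indirectly, or required only at link time, defeats this kind of parsing, which is why it can’t work in general.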