X-PSI is an open-source software package that is available on GitHub and can be cloned as:

git clone </path/to/xpsi>

This page lays out the instructions for installing X-PSI and all the necessary prerequisites on a local self-administered system.


For installation on a high-performance computing system, we direct the reader to the HPC systems page for guidance since the instructions on this page are either not applicable or do not target performance.

Python environment

X-PSI was developed in Python 2.7 and is in the process of being ported to Python 3. Fortunately, there are several ways to create a virtual environment with a different version of Python without disrupting your Python ecosystem. On this page we present the installation procedure using Python 2.7. To use the Python 2.7-compatible version of X-PSI, we need to switch from the main branch to a branch called python2:

cd </path/to/xpsi>
git checkout python2

This section is divided into two subsections. We recommend that the user follow the instructions in the Basic Conda environment subsection to begin with. If faced with installation issues, the user may refer to the Environment duplication for diagnosis subsection.

Basic Conda environment

In the source directory we provide a basic dependency file, basic_environment.yml, that installs the core Python packages required for likelihood functionality (such as NumPy and Cython).

If you want to run X-PSI in a Jupyter notebook, you can add this as an entry (e.g., - jupyter=1.0) to the environment file or you can install it via Conda (or pip) after environment creation.
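For orientation, here is a minimal sketch of what such an environment file might contain. The entries below are illustrative, not the exact contents of basic_environment.yml; consult the file in the source directory for the authoritative list.

```yaml
# Illustrative sketch only; see basic_environment.yml for the real contents.
name: xpsi
channels:
  - defaults
dependencies:
  - python=2.7
  - numpy
  - cython
  - scipy
  - matplotlib
  - jupyter=1.0   # optional entry for notebook support
```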

To create a virtual environment from file:

conda env create -f <path/to/xpsi>/basic_environment.yml

If Conda does not solve the environment dependencies, you may need to create an environment manually via

conda create -n xpsi python=2.7

and then install the core dependencies listed in basic_environment.yml, such as NumPy and Cython.

All the following steps need to be performed in this newly created environment, which can be activated as:

conda activate xpsi
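As a quick sanity check after activation, you can confirm which interpreter the shell now resolves; inside the xpsi environment the path should point into your Conda envs directory, and the version should read 2.7.x (note that Python 2 prints its version to stderr, hence the redirection):

```shell
# Show the interpreter the shell will now use, and its version.
command -v python
python --version 2>&1
```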

Environment duplication for diagnosis

In the source repository we provide another dependency file that facilitates exact duplication of the environment from which X-PSI v0.6 was released. This information may be useful when diagnosing installation problems, but it can only be expected to be compatible with the same platform. We therefore recommend that users first follow the steps in the previous subsection, and use this subsection only as guidance if required.

The development environment:

  • Ubuntu 14.04
  • Installed globally via apt: GCC 4.8.4; Open MPI 1.6.5 (“ancient”); BLAS; LAPACK; ATLAS.
  • Miniconda2 (Python 2.7; 64-bit)
  • Conda environment exported to xpsi/environment.yml

When inspecting the xpsi/environment.yml file, note that most of the entries were installed via automatic resolution of a strict dependency chain when the core packages were specified. Also note the packages that were installed into the Conda environment via pip. There are a few reasons for these choices, but the main one is that pip installs only Python packages and will not pull in unwanted non-Python libraries; that is, dependencies that could have been installed via Conda had we not already satisfied them globally, as listed above.

The Python packages below can be installed straightforwardly from source or via a package manager (Conda, pip, or a combination), following the instructions native to each package. When searching for an open-source package you may need to add the conda-forge package channel.

To duplicate from file:

conda env create -f <path/to/xpsi>/environment.yml



For installing X-PSI on macOS or Windows, please look at the tips below before proceeding with the installation of the various dependencies.

Python dependencies

The following Python packages are required for nested sampling (see below how to install PyMultiNest and mpi4py from source):

  • PyMultiNest (the interface to the MultiNest library)
  • mpi4py (for parallelisation)
  • mpifort (MPI-wrapped Fortran compiler used to build the MultiNest library; not itself a Python package)


Note that conda install -c conda-forge pymultinest might try to install dependencies into the environment, including binaries for MPI, BLAS/LAPACK, and a Fortran compiler, all in order to install MultiNest. Moreover, the MultiNest version listed is a minor release too low to satisfy all our needs. Although production sampling runs need to be performed on a high-performance system, and X-PSI can be installed locally without sampling functionality, it is advisable to install MultiNest on your personal machine to gain experience by applying it to inexpensive test problems. Below we offer from-source instructions.

Running the tests requires:

The following Python packages are required for full functionality of the post-processing module:

  • GetDist (posterior KDE corner plotting)[1]
  • h5py (storage of X-ray signals computed from posterior samples; also used by emcee)
  • nestcheck (posterior error analysis, plotting, run combination, etc.)[2]
  • fgivenx (conditional posterior plotting; also required by nestcheck)

Note that post-processing can generally be done on a desktop computer and thus these packages are not necessary for running sampling processes on a high-performance system. If they are not installed, a warning message is printed or an exception is raised (by the root process if MPI world size >1).

The emcee Python package for ensemble-MCMC is optional.


Note that pip install emcee==3.0.2 [--user] installs a version that works with Python 2.



The version of GetDist currently compatible with X-PSI, and used in Riley et al. (2019), is v0.3.1. It may be cloned as follows:

git clone [--single-branch] -b customisation \

The version of nestcheck currently compatible with X-PSI, and used in Riley et al. (2019), is v0.2.0. It may be cloned as follows:

git clone [--single-branch] -b feature/getdist_kde \

From source

X-PSI has several dependencies that are not Python packages. Build and install guidelines are given below.


To obtain the latest GSL source code (otherwise v2.5 works):

wget -v


The next steps require an OpenMP-enabled C compiler (known compatibility with icc, gcc, and clang). Most Linux systems come with GCC preinstalled. To find the GCC executable path on your system, run which gcc.

Untar, navigate to the build directory (e.g., cd gsl-latest/build), and then build and install:

../configure CC=<path/to/compiler/executable> --prefix=$HOME/gsl
make
make check
make install
make installcheck
make clean

This will install the library in your $HOME, as an example. You can check the prefix and version of GSL on your path:

gsl-config --version
gsl-config --prefix
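The X-PSI setup stage later looks for gsl-config on your PATH, so the prefix's bin directory must be added to it. A sketch, assuming the $HOME/gsl prefix used above:

```shell
# Prepend the GSL prefix's bin directory (assumes --prefix=$HOME/gsl above);
# gsl-config then resolves from inside the prefix.
export PATH="$HOME/gsl/bin:$PATH"
# Optional check; prints nothing if GSL is not installed at this prefix.
command -v gsl-config || true
```

Consider adding the export to your shell start-up file so it persists across sessions.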


To leverage some capabilities of sample post-processing software you require MultiNest v3.12. To build the MultiNest library, you require an MPI-wrapped Fortran compiler (e.g., mpifort from Open MPI).


The following assumes an environment similar to that summarised in the Python environment section above, specifically to emphasise where an MPI compiler wrapper is required.

First clone the repository, then navigate to it and build:

git clone <path/to/clone>/multinest
cd <path/to/clone>/multinest/MultiNest_v3.12_CMake/multinest/
mkdir build
cd build
CC=gcc FC=mpif90 CXX=g++ cmake -DCMAKE_{C,CXX}_FLAGS="-O3 -march=native -funroll-loops" -DCMAKE_Fortran_FLAGS="-O3 -march=native -funroll-loops" ..
make
ls ../lib/

Use the last command to check for the presence of shared objects. There is no need to make install as suggested in the source code documentation.


If prompted about missing cmake and gfortran, they can be installed via sudo apt-get install cmake gfortran

If you have not already installed mpi4py using pip (or Conda assuming a different environment setup to that summarised in Python environment), then here is how to do it from source (e.g., on some path such as $HOME):


tar -xf mpi4py-3.0.0.tar.gz
cd mpi4py-3.0.0

python setup.py build --mpicc=mpicc

python setup.py install

The package will be installed in your Conda environment (if activated).

To test:

mpiexec -n 4 python demo/

Do you see ranks 0 through 3 reporting for duty? The number of MPI processes might be best set to somewhere between the number of physical cores and logical cores in your machine for test sampling applications. For a typical laptop that might be up to -n 4.
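The core count can be queried from the shell; a sketch for picking a modest process count (nproc is a GNU coreutils tool, with sysctl as the macOS fallback; the cap of 4 follows the typical-laptop suggestion above):

```shell
# Number of logical cores, capped at 4 as a modest default for local tests.
CORES=$(nproc 2>/dev/null || sysctl -n hw.ncpu)
if [ "$CORES" -lt 4 ]; then N=$CORES; else N=4; fi
echo "suggested process count: $N"
```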

You also need to set the environment variable for the library path to point at MultiNest:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HOME/multinest/MultiNest_v3.12_CMake/multinest/lib/
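Since a wrong library path is a common failure point, you can verify that the export points at a real directory (the path assumes the clone location used in the build step above):

```shell
# Check that LD_LIBRARY_PATH includes the MultiNest lib directory and that
# the shared objects exist there (prints a hint if they do not).
MN_LIB="$HOME/multinest/MultiNest_v3.12_CMake/multinest/lib/"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$MN_LIB"
ls "$MN_LIB" 2>/dev/null || echo "no MultiNest libraries at $MN_LIB"
```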

Now you need the Python interface to MultiNest:

git clone <path/to/clone>/pymultinest
cd <path/to/clone>/pymultinest
python setup.py install [--user]

The package will be installed in your Conda environment (if activated).


Here we clone the latest PyMultiNest repository. However, to make sure that PyMultiNest is compatible with the Python 2.7 version of X-PSI, you can, e.g., check out the following PyMultiNest commit: git checkout c8eba95, before running the setup script. For Riley et al. (2019), working with X-PSI v0.1, we used the repository as frozen in a fork. To clone that version instead:

git clone <path/to/clone>

and then simply follow the same installation procedure.


Finally, to build and install from the X-PSI clone root, execute:

CC=<path/to/compiler/executable> python setup.py install [--user]

The --user flag is optional and specifies where the package is installed; if you want to install the package in a virtual environment, omit this flag.

For icc, you may need to prepend this command with LDSHARED="icc -shared". This ensures that both the compiler and linker are Intel, otherwise the gcc linker might be invoked.

Provided the GSL <prefix>/bin is in your PATH environment variable, the X-PSI setup script will automatically use the gsl-config executable to link the shared libraries and supply the C flags required for compilation of the X-PSI extensions. Because the library location will not change at runtime, the runtime linking instructions are stated at compile time in the setup script.

To check whether installation proceeded correctly and the software is functioning as expected, execute the following:

cd examples/examples_fast/Modules/

This module performs a likelihood check. If the calculated likelihood value matches the given value, X-PSI is functioning as expected; otherwise an error message is raised. The module will then initiate sampling using MultiNest (assuming it is installed), which, given the settings, should take ~5 minutes. To cancel midway, press ctrl + C.


By default, X-PSI is installed with an analytical blackbody surface emission model extension. If you want to use alternative models for the surface radiation field, you will need to (re-)install / (re-)compile X-PSI with the appropriate flags:

CC=<path/to/compiler/executable> python setup.py --help
CC=<path/to/compiler/executable> python setup.py install [--NumHot] [--NumElse] [--user]

This will install the numerical atmosphere for the hot regions and/or for the rest of the surface (elsewhere). To (re-)install the default blackbody surface emission model, run the command again without the flags:

CC=<path/to/compiler/executable> python setup.py install [--user]

If you ever need to reinstall, first clean to recompile the C files:

rm -r build dist *egg* xpsi/*/*.c

Alternatively, to build X-PSI in-place:

CC=<path/to/compiler/executable> python setup.py build_ext -i

This will build extension modules in the source code directory. You must in this case ensure that the source code directory is on your PYTHONPATH environment variable, or inserted into sys.path within a calling module.
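For example, assuming the clone lives at $HOME/xpsi (an illustrative location), the source directory can be put on PYTHONPATH like this:

```shell
# Make the in-place build importable; PYTHONPATH entries appear verbatim
# in sys.path when the interpreter starts.
export PYTHONPATH="$HOME/xpsi:$PYTHONPATH"
python -c "import os, sys; print(os.path.expanduser('~/xpsi') in sys.path)"
```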


If you wish to compile the documentation you require Sphinx:

To install sphinx, run the following command in the X-PSI environment:

conda install sphinx=1.8.5

You then need the relevant extensions, with versions compatible with Python 2. For a proper installation, run each line individually rather than copy-pasting the whole block into your terminal.

conda install -c conda-forge nbsphinx=0.5.1
conda install decorator=4.4.1
pip install sphinxcontrib-websupport==1.1.2
pip install sphinx_rtd_theme==0.4.3

Now the documentation can be compiled using:

cd xpsi/docs; [make clean;] make html

To rebuild the documentation after a change to source code docstrings:

[CC=<path/to/compiler/executable>] python setup.py install [--user]; cd docs; make clean; make html; cd ..

The .html files can then be found in xpsi/docs/build/html, along with the notebooks for the tutorials in this documentation. The .html files can naturally be opened in a browser, handily via a Jupyter session (this is particularly useful if the edits are to tutorial notebooks).

Note that if you require links to the source code in the HTML files, you need to ensure that Sphinx imports the xpsi package from the source directory instead of from the user's ~/.local/lib directory. To enforce this, insert the path to the source directory into sys.path in the Sphinx conf.py script. Then make sure the extension modules are inside the source directory, i.e., that the package is built in-place (see above).


To build the documentation, all modules need to be imported; warning messages will be printed for any dependencies that are not resolved.

Tips for installing on Mac OS

Most of the aforementioned instructions for Linux also apply to macOS. Here we note some of the changes required.

After creating the environment using the basic_environment.yml file, install Xcode or the Xcode command-line tools. Be mindful of the sequence in which programs are installed hereafter. Use pip install to download and install h5py and emcee (and matplotlib, numpy, scipy, and cython if not using basic_environment.yml; you may use the file as a reference for the packages required).

On macOS, it is preferable to use LLVM clang rather than gcc. To do so, first install Homebrew:

/usr/bin/ruby -e "$(curl -fsSL"

Install llvm with Homebrew, even if messages appear saying llvm is already present on macOS:

brew install llvm

Install GSL (see above).

Install gfortran before MPI. If faced with issues when specifying or using gfortran (and it “does not pass simple tests”), specify the compiler as gfortran in the mpif90 wrapper files and delete the files already present in the build directory. Once MPI is installed, export the following environment variables:

export LD_LIBRARY_PATH="/Users/<your_path>/openmpi/lib:$LD_LIBRARY_PATH"
export PATH=$PATH:/Users/<your_path>/mpi/bin/
export LDFLAGS="-L/usr/local/opt/llvm/lib"
export CPPFLAGS="-I/usr/local/opt/llvm/include"

Consider adding these lines directly to your .bashrc (or the equivalent file for a different shell, e.g., .zshrc).

Install X-PSI using:

CC=/usr/local/opt/llvm/bin/clang python setup.py install [--user]

If this gives problems, remove the tools and surface_radiation_field entries from the package list in X-PSI's setup.py. The line in the file would then look like:

packages = ['xpsi', 'xpsi/PostProcessing']

If you encounter any problems with permissions when installing X-PSI, use the --user option (note that this installs X-PSI into your user site-packages, outside the virtual environment, rather than just within it).

For compatibility, install the specified fgivenx, GetDist and nestcheck (see above).

Tips for installing on Windows


We do not recommend installing and running X-PSI on Windows. However, if you must, this section details some of the relevant procedures.

X-PSI was successfully installed and run on Windows in the year 2020, at least for the purpose of likelihood functionality, using the following user-contributed procedure.

  • Clone the X-PSI repository to a directory on your Windows computer (see above).
  • Download Ubuntu for Windows.
  • Install Python 2.7.
  • Create a virtual Python environment in an Ubuntu shell.
  • Install supporting packages: pip install matplotlib numpy cython scipy, followed by sudo apt-get install libgsl-dev.
  • Ensure you are in the X-PSI directory and install X-PSI: CC=gcc python setup.py install.
  • Install any missing packages that you need, e.g., pip install h5py for post-processing functionality if you have posterior sample sets available.
  • Install Jupyter notebook using pip install notebook.
  • Start the notebook server with the command jupyter notebook.