Boris Bolliet

Jodrell Bank Centre for Astrophysics (JBCA), The University of Manchester
Research topics:
CMB Spectral Distortions, Clusters of Galaxies, Dark Energy, Inflation and Black Holes


Research tools

Cosmic Linear Anisotropy Solving System

The Cosmic Linear Anisotropy Solving System (CLASS) is a code written by Prof. Julien Lesgourgues and Dr. Thomas Tram.

CLASS is a Boltzmann code analogous to CAMB, but written in C. It computes the predicted anisotropy power spectra of the CMB for a given ΛCDM scenario (i.e. a specific setting of the free parameters of the ΛCDM model). The first public version was released in 2011.

Some useful notations used throughout the code are:

ppr       pointer to the precision structure
pba       pointer to the background structure
ppt       pointer to the perturbation structure
psp       pointer to the spectra structure
index_md  index of the mode under consideration (scalar/.../tensor)
k         wavenumber
tau       conformal time
y         vector of perturbations (those integrated over time)
ppw       pointer to the perturbations workspace (contains the updated metric perturbations)

The physical constants are:

_Mpc_over_m_ 3.085677581282e22 >> conversion factor from megaparsecs to meters
(remark: CAMB uses 3.085678e22; good to know if you want to compare with high accuracy)

_Gyr_over_Mpc_ 3.06601394e2 >> conversion factor from megaparsecs to gigayears (c=1 units, Julian years of 365.25 days)

_c_ 2.99792458e8 >> speed of light c in m/s
_G_ 6.67428e-11 >> Newton's constant in m^3/kg/s^2
_eV_ 1.602176487e-19 >> 1 eV expressed in J

Parameters entering the Stefan-Boltzmann constant sigma_B:
_k_B_ 1.3806504e-23 >> Boltzmann constant in J/K
_h_P_ 6.62606896e-34 >> Planck constant in J s

The number pi:
_PI_ 3.1415926535897932384626433832795e0

All quantities in the code are either dimensionless or expressed in units of Mpc^n. In the "ini" file, some parameters start with an upper-case character and others with a lower-case character. A lower-case parameter equals the corresponding upper-case parameter times h^2 (e.g. omega_b = Omega_b h^2; for h = 0.7, h^2 = 0.49).
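For instance, in C (a minimal illustration; the variable names and values are just for illustration):

double h         = 0.7;               /* reduced Hubble parameter               */
double Omega_cdm = 0.25;              /* upper-case: density relative to critical */
double omega_cdm = Omega_cdm * h * h; /* lower-case: 0.25 * 0.49 = 0.1225       */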


Main

The int main() of CLASS, written in the file class.c, consists of a succession of checks that the following functions return _SUCCESS_:

input_init_from_arguments(argc,argv,&pr,&ba,&th,&pt,&tr,&pm,&sp,&nl,&le,&op,errmsg)
background_init(&pr,&ba)
thermodynamics_init(&pr,&ba,&th)
perturb_init(&pr,&ba,&th,&pt)
primordial_init(&pr,&pt,&pm)
nonlinear_init(&pr,&ba,&th,&pt,&pm,&nl)
transfer_init(&pr,&ba,&th,&pt,&nl,&tr)
spectra_init(&pr,&ba,&pt,&pm,&nl,&tr,&sp)
lensing_init(&pr,&pt,&sp,&nl,&le)
output_init(&ba,&th,&pt,&pm,&tr,&sp,&nl,&le,&op)

These functions take the addresses of the following structures as arguments.

struct precision pr;
struct background ba;
struct thermo th;
struct perturbs pt;
struct transfers tr;
struct primordial pm;
struct spectra sp;
struct nonlinear nl;
struct lensing le;
struct output op;
ErrorMsg errmsg;

The header class.h includes the standard libraries as well as the headers of the ten CLASS modules, which correspond to the structures mentioned above (except for precision, which is a global structure defined in common.h), i.e.

#include "input.h"
#include "background.h"
#include "thermodynamics.h"
#include "perturbations.h"
#include "primordial.h"
#include "nonlinear.h"
#include "transfer.h"
#include "spectra.h"
#include "lensing.h"
#include "output.h"

as well as the file common.h and the tools header files (e.g. for the Runge-Kutta method).


Common parameters and functions

Parameters and functions that are used by all modules are declared and defined in common.h. This includes the definition of mathematical constants such as Pi, macros for the max and min of two numbers, error reporting and testing macros such as class_call and class_test, memory-allocation macros such as class_alloc(pointer,size,errormessage), and similar utilities.
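As an illustration, here is a schematic sketch of how these macros are typically used inside a module function (the enclosing function my_sanity_check() is hypothetical; on failure the macros write a message into the relevant error_message field and return _FAILURE_):

int my_sanity_check(struct background *pba) {

  double * pvecback;

  /* allocate memory, with automatic error reporting on failure */
  class_alloc(pvecback, pba->bg_size*sizeof(double), pba->error_message);

  /* abort with a formatted error message if the test condition is true */
  class_test(pba->h <= 0.,
             pba->error_message,
             "h = %e, expected a strictly positive value", pba->h);

  free(pvecback);
  return _SUCCESS_;
}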

A more interesting part is the declaration of parameters related to the precision of the code and to the method of calculation.

enum evolver_type {rk,ndf15};
enum pk_def {delta_m_squared,delta_tot_squared,
             delta_bc_squared,delta_tot_from_poisson_squared};
enum file_format {class_format,camb_format};

The precision parameters refer to the members of the structure precision (only a few illustrative members are listed below):

struct precision{
double a_ini_over_a_today_default;
double back_integration_stepsize;
double tol_background_integration;
double safe_phi_scf;
FileName sBBN_file;
double recfast_z_initial;
double reionization_z_start_max;
enum evolver_type evolver;
double k_min_tau0;
double k_bao_center;
int tight_coupling_approximation;
int ur_fluid_approximation;
double k_per_decade_primordial;
double halofit_dz;
int accurate_lensing;};
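These precision parameters can be adjusted without recompiling, by passing a precision file as a second command-line argument. A hedged example (the parameter names below mirror the structure members above; check the .pre files shipped with CLASS for the authoritative names and values):

# myprecision.pre (illustrative values)
back_integration_stepsize = 7.e-3
tol_background_integration = 1.e-2
accurate_lensing = 1

# run with: ./class explanatory.ini myprecision.pre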


Input parameters

The file input.h contains, in particular, the definitions of the macros used to read input parameters from the ini file, e.g.

class_read_double(name,destination)
class_read_int(name,destination)
class_read_string(name,destination)

These macros are used in the module input.c, whose purpose is to read all the input parameters. An important function defined in input.c is, schematically:

int input_read_parameters(
struct file_content * pfc,
struct precision * ppr,
struct background *pba,
struct thermo *pth,
struct perturbs *ppt,
struct transfers *ptr,
struct primordial *ppm,
struct spectra *psp,
struct nonlinear * pnl,
struct lensing *ple,
struct output *pop,
ErrorMsg errmsg
) {

  class_read_double("a_today",pba->a_today);

  if (pba->Omega0_fld != 0.) {
    class_read_double("w0_fld",pba->w0_fld);
    class_read_double("wa_fld",pba->wa_fld);
    class_read_double("cs2_fld",pba->cs2_fld);
  }

  if (strcmp(string1,"analytic_Pk") == 0) {
    ppm->primordial_spec_type = analytic_Pk;
    flag2 = _TRUE_;
  }

  ...
}

Two other functions should be mentioned:

int input_default_params(), where the default values of the cosmological parameters are set.

int input_default_precision(), which contains e.g. ppr->a_ini_over_a_today_default = 1.e-14;.


Adding a new parameter to CLASS

Several changes have to be made in order to add a new input parameter to CLASS (see the sketch after this list):
  • Add it to the "ini" file with the desired value.
  • Declare the new parameter externally, e.g. in a header file.
  • Assign a default value to the parameter in int input_default_params()
  • In int input_read_parameters(), read the parameter with e.g. class_read_int("newparam",newparam)
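Schematically, for a hypothetical new background parameter my_new_param (all names and values below are purely illustrative):

/* 1) in the ini file:  my_new_param = 2.5            */

/* 2) in background.h, inside struct background:      */
double my_new_param;

/* 3) in input.c, inside input_default_params():      */
pba->my_new_param = 1.;

/* 4) in input.c, inside input_read_parameters():     */
class_read_double("my_new_param",pba->my_new_param);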


Background cosmology

The background module computes the background dynamics for given input parameters. The background structure is defined in background.h. Some illustrative members are listed below.

struct background{
double H0;
double w0_fld;
double phi_prime_ini_scf;
double h;
double conformal_age;

int index_bg_a;
int index_bg_H;
int index_bg_rho_cdm;
int index_bg_rho_crit;
int index_bg_conf_distance;
int index_bg_ang_distance;
int index_bg_D;

double * tau_table;
double * z_table;
double * background_table;

int index_bi_a;
int index_bi_rho_dcdm;

short has_cdm;
short has_lambda;};

where D refers to the growth factor in a dust universe. Moreover, bt stands for background table, and bi for background integration.

Some indications found in the preamble of background.c:

- background_functions() returns all background quantities {A} as a function of quantities {B}.

- background_solve() integrates the quantities {B} and {C} with respect to conformal time.

where the quantities {A} (e.g. rho_gamma) can be expressed as simple analytical functions of a few variables {B} (e.g. the scale factor), while other quantities {C} (e.g. the sound horizon or proper time) require an integration with respect to time and cannot be inferred analytically from the variables {B}.

Some functions that are declared in background.h and defined in background.c:

int background_at_tau()
int background_functions()
int background_tau_of_z()
int background_init()
int background_solve()
int background_initial_conditions()
int background_output_data()
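For instance, background_at_tau() interpolates all background quantities at a given conformal time. A hedged sketch of its use (argument conventions as in CLASS 2.x; check background.h for the exact prototype):

/* pvecback receives one value per background quantity */
double * pvecback;
int last_index = 0;
double tau = pba->conformal_age/2.; /* some conformal time of interest */

class_alloc(pvecback, pba->bg_size*sizeof(double), pba->error_message);

/* interpolate all ("long_info") background quantities at conformal time tau */
class_call(background_at_tau(pba, tau, pba->long_info, pba->inter_normal,
                             &last_index, pvecback),
           pba->error_message,
           pba->error_message);

printf("at tau=%e : a=%e H=%e\n",
       tau, pvecback[pba->index_bg_a], pvecback[pba->index_bg_H]);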




/* Example: print background data versus redshift */
int index_tau;
for (index_tau=0; index_tau < pba->bt_size; index_tau++) {
  printf("tau=%e z=%e a=%e H=%e\n",
         pba->tau_table[index_tau],
         pba->z_table[index_tau],
         pba->background_table[index_tau*pba->bg_size+pba->index_bg_a],
         pba->background_table[index_tau*pba->bg_size+pba->index_bg_H]);
}


Anisotropy and Fourier power spectra

The spectra module is dedicated to the computation of the anisotropy and Fourier power spectra. The spectra structure, struct spectra, is defined in spectra.h. Once initialized by spectra_init(), it contains a table of all C_l's and P(k) as a function of multipole/wavenumber, mode (scalar/tensor), type (for the C_l's: TT, TE, etc.), and pair of initial conditions (adiabatic, isocurvature).

Important functions include:

int spectra_cl_at_l()
int spectra_pk_at_z()
int spectra_pk_at_k_and_z()
int spectra_cls()
int spectra_compute_cl()
int spectra_k_and_tau()
int spectra_pk()
int spectra_sigma()
int spectra_matter_transfers()

Preamble of spectra.c:

This module computes the anisotropy and Fourier power spectra C_l, P(k)'s given the transfer and Bessel functions (for anisotropy spectra), the source functions (for Fourier spectra) and the primordial spectra. The following functions can be called from other modules:
  • spectra_init() at the beginning (but after transfer_init()),
  • spectra_cl_at_l() at any time for computing C_l at any l,
  • spectra_pk_at_z() at any time for computing P(k) at any z,
  • spectra_pk_at_k_and_z() at any time for computing P(k) at any k and z,
  • spectra_free() at the end.
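As a hedged sketch (the prototype is indicative; check spectra.h for the authoritative signature), assuming a single adiabatic initial condition so that the per-initial-condition output can be omitted, the total P(k,z) could be retrieved from main() as:

/* hedged sketch: hypothetical call from main(), after spectra_init();
   k in 1/Mpc, pk in Mpc^3 */
double pk;

class_call(spectra_pk_at_k_and_z(&ba, &pm, &sp, k, z, &pk, NULL),
           sp.error_message,
           errmsg);

printf("P(k=%e 1/Mpc, z=%e) = %e Mpc^3\n", k, z, pk);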


Transfer functions

(Preamble of transfer.c.) This module has two purposes:
  1. At the beginning, to compute the transfer functions in harmonic space, Delta_l(q), and store them in tables used for interpolation in other modules.

  2. At any time in the code, to evaluate the transfer functions (for a given mode, initial condition, type and multipole l) at any wavenumber q (by interpolating within the interpolation table).
Hence the following functions can be called from other modules:
  • transfer_init() at the beginning (but after perturb_init() and bessel_init()),
  • transfer_functions_at_q() at any later time,
  • transfer_free() at the end.
The structure transfers is defined in transfer.h.


Cosmological perturbations

(Preamble of perturbations.c.) This module has two purposes:
  1. At the beginning, to initialize the perturbations, i.e. to integrate the perturbation equations, and store temporarily the terms contributing to the source functions as a function of conformal time. Then, to perform a few manipulations of these terms in order to infer the actual source functions S(k,tau), and to store them as a function of conformal time inside an interpolation table.

2. At any time in the code, to evaluate the source functions (for a given mode, initial condition, wavenumber and source type) at any conformal time tau (by interpolating within the interpolation table).
Hence the following functions can be called from other modules:
  • perturb_init() at the beginning (but after background_init() and thermodynamics_init()),
  • perturb_sources_at_tau() at any later time,
  • perturb_free() at the end.
The structure perturbs is defined in perturbations.h. Note:

Flags for various approximation schemes
  • tca = tight-coupling approximation,
  • rsa = radiation streaming approximation,
  • ufa = massless neutrinos / ultra-relativistic relics fluid approximation.
The integration is performed either in the synchronous gauge or in the conformal Newtonian gauge.
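The gauge is selected directly from the ini file, e.g. (option name as in explanatory.ini):

# pick the gauge used for the integration of the perturbation equations
# (the default is the synchronous gauge)
gauge = newtonian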



Class Plotting Utility

The Class Plotting Utility (CPU) is written in Python. It enables you to quickly produce nice plots from given data files.

The data file should be either a 'txt' or a 'dat' file. It should include a header with lines starting with '#'.

$ python CPU.py ../DataFiles/FinalData/LQC.txt ../DataFiles/Pk_ref_CLASS/StandardInflation.dat -y P --scale loglog
$ python CPU.py output/Pk_LQCcl_lensed.dat output/Pk_LQCcl.dat output/Pk_refcl_lensed.dat output/Pk_refcl.dat -y TT BB --scale loglog

Montepython and the Planck Likelihood

The Monte Python code is a Monte Carlo code written in Python, to be used with CLASS. It was written by Dr. Benjamin Audren and is used by the Planck Collaboration for parameter extraction.

Monte Python is the software that finds the best fit to the CMB data for the specific cosmological scenario (ΛCDM, modified gravity, massive neutrinos, etc.) that is set in CLASS (just as CosmoMC does with CAMB).

Running Monte Python implies that the data and corresponding likelihoods are available to the Monte Carlo code. The Planck data and likelihood (2015 release) are publicly available on the Planck Legacy Archive website.

The two most important files are COM_Likelihood_Code-v2.0_R2.00.tar.gz, the code that installs the Planck likelihood, and COM_Likelihood_Data-baseline_R2.00.tar.gz, the data.

There is a detailed description of the contents of these folders on the ESA wiki page CMB spectra and likelihood code.

The Planck Collaboration uses the following labels for likelihoods (cf. Planck 2015 results. XIII. Cosmological parameters, footnote 7, p. 6.):

(i) Planck TT: combination of the TT likelihood at multipoles l >= 30 and a low-l temperature-only likelihood based on the CMB map recovered with Commander;

(ii) Planck TT+lowP: further includes the Planck polarization data in the low-l likelihood;

(iii) Planck TE+lowP: TE likelihood at l >= 30 plus the polarization-only component of the map-based low-l Planck likelihood;

(iv) Planck TT,TE,EE+lowP: combination of the likelihood at l >= 30 using TT, TE, and EE spectra and the low-l temperature+polarization likelihood.

(v) Planck TE+lowT,P: combinations of the polarization likelihoods at l >= 30 and the temperature+polarization data at low-l.

Installing and running CLASS-Montepython with the Planck data

Once the four archive files are in the same directory, say ClassAndMontepython/, open a terminal and extract them one by one using

tar xzvf class_public-2.4.2.tar.gz
unzip montepython_public-2.1.4.zip
tar xzvf COM_Likelihood_Data-extra-plik-DS_R2.00.tar.gz
tar xjvf COM_Likelihood_Code-v2.0_R2.00.tar.bz2


(Pay attention to the xjvf, not xzvf, for the bz2 file.) Move all the archive files to a ZIPfiles/ folder, or delete them, in order to keep your directory tidy.

Now there should be four new folders inside ClassAndMontepython/: 1) class_public-2.4.2/, containing the CLASS code; 2) montepython_public-2.1.4/, for Monte Python; 3) plc-2.0/, for the Planck likelihood; 4) plc_2.0/, for the Planck data.

Installing the Planck likelihood

In order to install the Planck likelihood, move to plc-2.0/. If you have a recent and tidy computer, this step may take only a few seconds, thanks to a tool called waf provided inside the plc-2.0/ folder. On a Mac, just type:

./waf configure --install_all_deps
./waf install


Before that, though, you might want to update MacPorts:

sudo port selfupdate
sudo port upgrade outdated


On Linux (with ifort and mkl):

./waf configure --install_all_deps --lapack_mkl=$MKLROOT
./waf install


On the cluster, at CC-in2p3, I had to do:

./waf configure --install_all_deps --lapack_mkl=/usr/local/intel/mkl/ --ifort
./waf install


If this procedure fails, you should install using make, after having checked that all the necessary libraries are well linked (see readme.md inside plc-2.0/).

Then you can copy the line:

source /Users/borisbolliet/Dropbox/SZ/Codes/plc-2.0/bin/clik_profile.sh

into your .bash_profile and do:

. ~/.bash_profile

to reload your profile file.

The installation guides for CLASS and its Python wrapper can be found here and there, respectively. The installation guide for Monte Python is here, under "Installation" and/or "Documentation".

In the following, I give a description of all the necessary steps for a basic installation of these codes, in their simplest configuration.

Installing CLASS and its Python wrapper

Installing CLASS is generally straightforward. Move to class_public-2.4.2/, and type the following commands:

make clean
make -j


The second line ensures that the Python wrapper is installed along with CLASS. You should have the SciPy stack installed, e.g. by doing:

$ sudo apt-get install python-numpy python-scipy python-matplotlib ipython ipython-notebook python-pandas python-sympy python-nose

Check that the installation of CLASS is successful by running:

./class explanatory.ini

In order to check that the Python wrapper is correctly installed, run python in the terminal, and then:

>>> from classy import Class

If there is no error message, your installation of the Python wrapper of CLASS is probably successful. The ultimate check is to run a test Python script that calls CLASS. Move into the subfolder python/, inside class_public-2.4.2/, and run the following command in your terminal (after having exited python):

nosetests test_class.py

Note that you need to have the nose module installed (pip install nose) along with its companion package nose-parameterized. If nose-parameterized is not present on your computer, even when nose is installed, download the wheel file (you might have to do pip install wheel) nose_parameterized-0.5.0-py2.py3-none-any.whl at the bottom of this page, put it in class_public-2.4.2/python/, where you are, and run:

pip install nose_parameterized-0.5.0-py2.py3-none-any.whl

Then, try nosetests test_class.py again; it will loop over many different settings of the cosmological parameters and return the usual CLASS output in the terminal. For instance, while looping, when the output says

| Test case lensing=yes_output=tClpCllClP_k_initype=inflation_V_modes=st |

it means that: lensing is taken into account in the computation of the angular power spectra; the code is asked to compute the temperature ("tCl"), polarization ("pCl") and lensed ("lCl") spectra; the initial primordial power spectrum is the one predicted by inflation; and both scalar and tensor modes are considered.

(The same procedure should be used again when implementing modifications to the original CLASS code, in order to check that the code still runs correctly. Note that the loop takes about three hours to complete.)

Installing Monte Python

Properly speaking, Monte Python does not require an installation; it only needs to be correctly linked to CLASS, the Planck likelihood and the Planck data. The key file for specifying the paths is default.conf.template, inside the folder montepython_public-2.1.4/.

First of all, create a copy of default.conf.template and name it default.conf. Open default.conf with a text editor. Three paths have to be specified in default.conf.

The path to the codes, "root":

root = '/Users/borisbolliet/Desktop/ClassAndMontepython/'

The path to CLASS, "cosmo":

path['cosmo'] = root+'/class_public-2.4.2'

The path to the Planck likelihood, "clik":

path['clik'] = root+'/plc-2.0'

The settings for the MCMC are specified in base2015.param, inside the folder montepython_public-2.1.4/.

Running Monte Python

Go to montepython_public-2.1.4/ and type the following commands

$ source /Users/borisbolliet/Desktop/ClassAndMontepython/plc-2.0/bin/clik_profile.sh
$ montepython/MontePython.py run --conf default.conf -p base2015.param -o PlanckBaselineTest -c covmat/base.covmat -N 10


The first command ensures that the paths to the likelihood are well set. You should include it in your .bash_profile (or .bashrc, or .cshrc) file, otherwise each time you open a terminal you would need to type it again in order to use Monte Python with the Planck likelihood.

The file .bash_profile is generally in your home directory; you will find it with the command ls -al, and open it with open -a emacs .bash_profile (or emacs .bashrc & on Linux).

The MCMC starts with the second command. The option -N 10 means that the chain will be made of ten steps. A reasonable number of steps for the chain to converge is 10000, and several dozen chains are necessary for the parameter extraction in such an analysis. This is not doable on a single computer, which motivates the use of a remote computing grid.

The MCMC chains will be produced as a .txt file inside a newly created folder whose name is passed through the -o option (here, PlanckBaselineTest/). It is handy to create a directory chains/ inside montepython_public-2.1.4/ where all the future folders containing the chains will be stored.

A nice exercise, doable on a laptop, is the JLA analysis. In montepython_public-2.1.4/data/JLA/ there is a readme.txt file that tells you how to download the data files manually. Basically, extract the archive http://supernovae.in2p3.fr/sdss_snls_jla/jla_likelihood_v4.tgz and copy all the files from jla_likelihood_v4/data/ to montepython_public-2.1.4/data/JLA/.

You might need to install a few missing modules (pip install numexpr --user, pip install pandas --user). Then, simply run

$ montepython/MontePython.py run -p jla.param -o chains/JLAchains -N 1000

Usually, the following warning appears:

The acceptance rate is above 0.6, which means you might have difficulties exploring the entire parameter space. Try analysing these chains, and use the output covariance matrix to decrease the acceptance rate to a value between 0.2 and 0.4 (roughly).


In order to analyze the chains, do

$ montepython/MontePython.py info chains/JLAchains

This command produces a covariance matrix as well as a best-fit point that can be used as inputs for the next run, as

$ montepython/MontePython.py run -p jla.param -o chains/JLAchains -c chains/JLAchains/JLAchains.covmat -b chains/JLAchains/JLAchains.bestfit -N 1000

Analyze the chains again and repeat the procedure until the Gelman-Rubin convergence numbers (R-1) are small enough (typically < 0.05) for all parameters.


Plotting with montepython

The plots are produced as an output of the command $ montepython/MontePython.py info .... In particular, this command creates a subdirectory plots/ in the chain directory, which contains the following two PDF files:
  • CHAINS_1d.pdf : the posterior probability for each parameter,
  • CHAINS_triangle.pdf : the 2D posterior for all pairs of parameters.
Make sure you run the command from outside the chain directory! And make sure the chain names end with '__i.txt', with two underscores! On remote machines it is sometimes useful to disable the production of the PDF files, as follows:

$ montepython/MontePython.py info CHAINS --noplot

Then, only the covmat and bestfit will be computed.

The dashed line that appears on the posterior plots corresponds to the mean likelihood; in order to remove it, do

$ montepython/MontePython.py info CHAINS --no-mean

The details for all the options of the 'info' command can be accessed via

$ montepython/MontePython.py info --help

It is possible to read and reanalyze Planck chains with Monte Python.

Running in parallel

MCMC chains can be run in parallel using Open MPI, in particular via the command mpirun. In order to do this with Monte Python, a few additional packages have to be installed.

For mpi4py, which is necessary in order to use the command mpirun with MontePython, you need to install openmpi first and then:

$ pip install mpi4py --user

Note that the --user option is to be used when you do not have the admin privileges, which is generally the case on clusters.

To install openmpi, download and unzip it from Open MPI and then read the instructions in the INSTALL file, in the user section (basically ./configure and sudo make install). Once these two things (openmpi and mpi4py) are set up on your machine, you should be able to run a few chains in parallel with

$ mpirun -np 10 montepython/MontePython.py run -p jla.param -o chains/JLAchains -N 1000

This will launch ten chains with one thousand steps each.

If in the future you would like to use the nested sampling method, you should also install MultiNest and PyMultiNest, one after the other. For MultiNest, do

$ git clone http://github.com/JohannesBuchner/MultiNest.git
$ cd ./MultiNest/build/
$ cmake ..
$ make


Then add the following line into your .bash_profile file:

export LD_LIBRARY_PATH=/Path/to/MultiNest/lib:$LD_LIBRARY_PATH

Reload your bash profile with . .bash_profile (after you have moved to the directory where the file is). This will ensure that the libraries needed by Monte Python for parallel computing are correctly linked. For PyMultiNest do:

$ git clone http://github.com/JohannesBuchner/PyMultiNest
$ cd ./PyMultiNest
$ python setup.py install --user


September 2015

Submitting MontePython jobs at CC-in2p3

We shall now describe how to submit jobs on a cluster, i.e. how to run simultaneously a number of chains with several thousand points each.

The hardware of the IN2P3 Computing Center is located in Lyon, France. It is a Sun Grid Engine cluster. All the information about it can be found on the CC-in2p3 webpage and the ENS Lyon computing webpage.

Once you have requested an account in your name, you can log in using the ssh command:

$ ssh username@ccage.in2p3.fr

The limited storage available in your home directory, /afs/in2p3.fr/home/username, will not allow you to perform long calculations and store the necessary data. As a member of the LSST or HESS collaborations you will be able to use the SPS disk, from which you will launch your jobs. Create a new directory on this disk, for instance /sps/lsst/data/username.

There, you should install CLASS, Monte Python, the Planck likelihood, the Planck data, PyMultiNest and MultiNest, following the indications given in the previous sections. Create a few subdirectories in order to keep everything tidy. In particular:

1) /PlanckMCMC

-/MultiNest
-/PyMultiNest
-/class_public-2.4.2
-/montepython_public-2.1.4
-/plc-2.0
-/plc_2.0

2) /jobs

-/chains
-/output


3) /scripts


Within this last directory, create a new shell script, for instance JobMCMC.sh, with any text editor. This file contains two types of information: the options for the batch system and the commands to be executed in order to perform the calculation. Example:

#!/bin/sh
#$ -P P_lsst
#$ -q pa_long
#$ -l ct=100000
#$ -l sps=1
#$ -o /sps/lsst/data/username/jobs/output
#$ -cwd
#$ -j y
#$ -V
#$ -N MCMC_Run
#$ -pe openmpi 5

cd ${SGE_O_WORKDIR}

source /afs/in2p3.fr/home/b/bbolliet/.profile

module load openmpi-x86_64

EXECDIR="/sps/lsst/data/username/PlanckMCMC/montepython_public-2.1.4"
CHDIR="/sps/lsst/data/username/jobs/chains"

mpirun -np ${NSLOTS} ${EXECDIR}/montepython/MontePython.py run --conf ${EXECDIR}/default.conf -p ${EXECDIR}/base2015HighlNs.param -c ${EXECDIR}/covmat/base.covmat -b ${EXECDIR}/bestfit/base.bestfit -o ${CHDIR}/PlanckMPI -N 100

In order to use the pa_long queue you must ask permission from the administrators of CC-in2p3 (without it you won't be able to use MPI properly).

To launch a job do

$ qsub scripts/JobMCMC.sh

To check the status of your jobs do

$ qstat

To delete all your jobs do

$ qdel -u username

To delete a specific job do

$ qdel JOB_ID

where JOB_ID is a number that you can access with qstat.

Once you have enough chains that the Gelman-Rubin convergence numbers are satisfactory, analyze the chains with

$ PlanckMCMC/montepython_public-2.1.4/montepython/MontePython.py info jobs/chains/PlanckMPI/2015*

In particular, this will create a subdirectory with your plots, which you can copy to your desktop with

$ scp -r username@ccage.in2p3.fr:/sps/lsst/data/username/jobs/chains/PlanckMPI/plots /path/to/my/Desktop/Plots

from a local Terminal.

IMPORTANT: at CC-Lyon, the chains are created too fast. At the start of a job, Monte Python creates a log.param file, which is subsequently read each time a new chain is created. Generally, when the second chain starts, the log.param file is read before it has been completely written; the job then ends with the error message "Information on the likelihood was not found in the log.param file".

To overcome this issue, copy your parameter file into your chain directory and rename it log.param.

ROOT

In order to analyze the data coming from colliders, particle physicists have been developing ROOT since 1995. It is a powerful piece of software, able to interpret C++ code on the fly.

The user guide provides a presentation of the many aspects and utilities of ROOT.

You can download the latest version of ROOT, and follow the indications for the installation.

Alternatively, on a mac, you can install it with brew install root.

The ROOT Graphical User Interface (GUI) is useful on its own. You can create, edit and save your plots in many different formats, in a user-friendly way.

In order to access the GUI, open ROOT and launch a TBrowser:

TBrowser browser

Use the browser to locate the tutorials among the ROOT files. The ROOT tutorials are full of examples and templates to start with. If you are using ROOT on a remote machine, you can still download them from there.

In order to include the ROOT libraries when you are compiling a code, simply use the '-I' option:
$ g++ -I `root-config --incdir` -o MyExec main.C `root-config --libs`

Be careful with the backquotes "`".

August 2015

LyX

LyX is a piece of software initially designed to write LaTeX documents in a more user-friendly way than the usual LaTeX editors. The first public version was released in 1995, when LaTeX was starting to be widely used in the scientific community.

LyX can be used as a notebook. It is a perfect tool for writing equations and doing calculations. The codes for fractions, powers, etc. are just the same as in LaTeX; however, the equations appear rendered directly in the LyX file. It is then possible to copy and paste whole parts of equations very easily, avoiding many of the mistakes we usually make when copying equations by hand.

You will find my LyX template here. Once you have installed LyX, open the lyx file. This template features: a BibTeX bibliography; a table of contents; a table; a block of equations; an image; cross-references; a header and a footer.

Click on the eye logo in the left-hand corner and you will see the PDF version of the lyx file. Go to File/Export/PDF (pdflatex) and export your document as a PDF. Do File/Export/LaTeX (standard) for the tex version.

Most of the important LaTeX settings you may want to use can be specified in Document/Settings, under Document Class, LaTeX Preamble, etc.

August 2015

Gnuplot

Gnuplot is a plotting utility that is relatively simple to use. On the Gnuplot homepage you will be able to download the software and the related documentation.

A set of illustrative examples of commands can be found there.

For instance, say you have a data file called 'data.txt', with two columns. Move to the directory where this file is located with the terminal. Then, do:

$ gnuplot
gnuplot> plot 'data.txt' using 1:2 w l title 'TEST'
gnuplot> set key top left
gnuplot> set logscale x
gnuplot> set logscale y
gnuplot> set xlabel 'multipole l'
gnuplot> set ylabel 'l(l+1)Cl/2Pi [uK^2]'
gnuplot> set terminal png
gnuplot> set output "plot.png"
gnuplot> replot
gnuplot> exit


This will create the file 'plot.png' in the current directory.

October 2015

GitHub

GitHub is a hosting platform dedicated to the management of collaborative projects involving computer code and software.

Once you have created your free personal account, follow the instructions below for a quick start.

Log in to your account. Create a new repository by clicking the green button to the left: '+ New repository'.

Give a one-sentence description of what you are going to store in the Description field.

Check the box 'Public' and then check 'Initialize this repository with a readme'. Finally, click the button 'Create repository'.

The repository MyNewRep/ is created. It contains a single 'README.md' file with the description you just gave.

Open a terminal and check that git is installed as a command-line tool, by typing for instance git --version. Install it if it is not.

Then, with the terminal move to the directory which you want to push (copy) on GitHub, and do

$ git clone https://github.com/username/MyNewRep
$ cd ./MyNewRep

Copy-paste all your project files into MyNewRep/, and then do

$ git add .
$ git commit -m "FirstUpdate"
$ git push origin master

You can check online that all the files have appeared in the online directory.

Back on your GitHub account, click on your folder MyNewRep/. Create a new branch by clicking on the button 'Branch: master' and entering the name of your first branch, for instance MyFirstBranch. This is nothing other than a secondary copy of your folder.

The philosophy is to work on secondary branches and, when a secondary branch is satisfactory, to eventually merge it with the master branch.

Back in the local repository MyNewRep/, which you access with the terminal, you shall create a new branch called MyFirstBranch too. Proceed as follows:

$ git checkout -b MyFirstBranch

Then

$ git branch -a

to list all your branches.

You shall now perform changes and do

$ git add .
$ git commit -m "Update on branch"
$ git push origin MyFirstBranch

Check online that your changes have only appeared on the MyFirstBranch copy. Click on the button 'compare' to the left in order to see the differences between branches.

October 2015

Create your own website

You can easily create your personal webpage for free using an HTML/CSS template and GitHub.

Browse the internet and find a template that you like. Usually you will be able to download a folder that contains the files "index.html" (for the content of your webpage) and "style.css" (for the layout). You are welcome to use my HTML/CSS files to start with. The html files can be opened with your internet browser. In order to edit the html and css files you can use any text editor.

Once you are happy with your website, store it all on your desktop in a folder called WEBSITE. The next step is to bring it online using GitHub.

Register on GitHub. The registration is free as long as you are using public repositories only. Let us assume your username is richardfeynman; then you have to create a public repository called "richardfeynman.github.io".

Now you shall import the content of the folder WEBSITE/ into richardfeynman.github.io/ by following the instructions below.

Open a terminal and check that git is installed as a command-line tool, by typing for instance git --version. Install it if it is not. Then type:

git clone https://github.com/richardfeynman/richardfeynman.github.io

Move to the newly created richardfeynman/ folder with

cd ./richardfeynman

Copy-paste the whole content of the WEBSITE/ folder into the richardfeynman/ folder, and type the following commands, one by one, in your terminal:

git add .
git commit -m "Putting my website online"
git push origin master

Thanks to the first two commands, git stages the files you have just copy-pasted. The message "Putting my website online" could be anything; it will appear next to the files as a description when you look into richardfeynman.github.io/ on your GitHub account.

The third command tells git to push the content of your local richardfeynman/ folder onto your online repository richardfeynman.github.io/ (origin refers to the remote repository on GitHub, and master is the branch being pushed).

After the last command you need to provide git with your username and password.

You can see what the online repository richardfeynman.github.io/ should look like by looking at my borisbolliet.github.io/ (by the way, note that anyone can see and download the content of your public repository on GitHub).

Your website is now online at: http://richardfeynman.github.io and will soon be listed in the popular search engines.

At this stage you do not need your WEBSITE/ folder anymore; you can delete it and rename your local git repository richardfeynman/ to WEBSITE/ or whatever you like.

Whenever you wish to update your website, just add, delete and update all you need in the local git repository. You can modify your existing files, delete some of them, add new ones, etc. Then, open a terminal and cd to the local git repository and repeat the commands:

git add .
git commit -m "My first update"
git push origin master

Change your website name extension

It is possible to change your website name extension (.github.io) so that your website is called richardfeynman.com instead of richardfeynman.github.io; however, you will need to pay about seven euros per year.

In order to do this, go to a domain name provider and purchase the domain name of your choice (if still available). The most popular domain name providers seem to be this one and this one.

Say you have purchased the domain name www.richardfeynman.com. The only thing you have to do concerning GitHub is to create a file called "CNAME" and place it into your richardfeynman.github.io/ online repository (with the procedure described above). This file has to contain the single following line:

richardfeynman.com

Note the absence of the "www". (You can download the CNAME file here and adapt it to whatever your domain name is.) Do not forget to update your online richardfeynman.github.io/ repository and you are done with GitHub.

The final step is to sign in to your domain name provider account and, for your new domain name, set the IP address of the host (GitHub) to 192.30.252.153, as well as the directory to which your domain name has to point, that is richardfeynman.github.io, under the "www" section of the CNAME alias. You will find a very clear and up-to-date description of these steps on Julia Lovett's website.