
Rockstar Halo Finder

I am switching to Rockstar for my halo catalogs, as it seems easier to use on the large simulations.

Compile

To compile, you just need to type “make”; there is no need to change any files.

Input

You need to specify a “.cfg” file. Here is mine for the \(128^3\) simulations on my laptop:

INBASE = "/Users/nicoledrakos/Documents/Research/Simulations/wfirst128/Gadget" #directory to snapshot files
OUTBASE = "/Users/nicoledrakos/Documents/Research/Simulations/wfirst128/HaloFinder" #where to output files

NUM_SNAPS = 50
STARTING_SNAP = 0
NUM_BLOCKS = 1 #number of files per snapshot
#FILENAME = "snapshot.<block>.<snap>"
#FILENAME = "snapshot_.<snap>"
#BLOCK_NAMES = "blocks.txt"
#SNAPSHOT_NAMES = "snaps.txt"

FILE_FORMAT = "GADGET2"
GADGET_MASS_CONVERSION = 1e+10 #1e10 solar mass
GADGET_LENGTH_CONVERSION = 1e-3 #kpc/h
FORCE_RES = 18


PARALLEL_IO = 1
NUM_WRITERS = 8 #number of CPUs (multiple of 8)
FORK_READERS_FROM_WRITERS = 1
FORK_PROCESSORS_PER_MACHINE = 8

You can then run this using:

/path/to/rockstar -c server.cfg

This starts the Rockstar server and creates auto-rockstar.cfg in OUTBASE. While the server is still running, start the reader/writer processes with:

/path/to/rockstar -c OUTBASE/auto-rockstar.cfg
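
As a quick sanity check, something like the following confirms that INBASE and NUM_SNAPS agree with the snapshot files on disk (a minimal sketch; the "snapshot_*" glob assumes Gadget's default snapshot naming and is not taken from the Rockstar config):

import glob, os

#hypothetical check: compare the snapshot files on disk with NUM_SNAPS
INBASE = "/Users/nicoledrakos/Documents/Research/Simulations/wfirst128/Gadget"
NUM_SNAPS = 50

snaps = sorted(glob.glob(os.path.join(INBASE, "snapshot_*")))
print("Found", len(snaps), "snapshot files; expected", NUM_SNAPS)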

Troubleshooting

When I tried running it, it would hang after printing:

[ 0s] Accepting connections...

So instead, I am using Bruno’s Python script to run it. To do this, I used mpirun -n 4 python wfirst128.py, where wfirst128.py is:

import sys, os, time
from subprocess import call
from mpi4py import MPI


MPIcomm = MPI.COMM_WORLD
pId = MPIcomm.Get_rank()
nProc = MPIcomm.Get_size()
currentDirectory = os.getcwd()


INBASE = "/Users/nicoledrakos/Documents/Research/Simulations/wfirst128/Gadget/" #directory to snapshot files
OUTBASE = "/Users/nicoledrakos/Documents/Research/Simulations/wfirst128/HaloFinder/" #where to output files

rockstarCommand = '/Users/nicoledrakos/Documents/Software/Rockstar/rockstar'

rockstarConf = {
'FILE_FORMAT': '"GADGET2"',
'GADGET_LENGTH_CONVERSION': 1e-3,  #convert from kpc to Mpc
'GADGET_MASS_CONVERSION': 1e+10,
'FORCE_RES': 5e-3,                 #Mpc/h
'OUTBASE': OUTBASE,
}

parallelConf = {
'PARALLEL_IO': 1,
'INBASE':  INBASE ,               #"/directory/where/files/are/located"
'NUM_BLOCKS': 1,                              # <number of files per snapshot>
'NUM_SNAPS': 10,                               # <total number of snapshots>
'STARTING_SNAP': 0,
'FILENAME': '"snapshot_<snap>"',              #"my_sim.<snap>.<block>"
'NUM_WRITERS': 8,                             #<number of CPUs>
'FORK_READERS_FROM_WRITERS': 1,
'FORK_PROCESSORS_PER_MACHINE': 8,             #<number of processors per node>
}


if pId == 0:
  #rank 0 writes the combined Rockstar config file to OUTBASE
  if not os.path.exists( rockstarConf['OUTBASE']): os.makedirs(rockstarConf['OUTBASE'])
  rockstarconfigFile = rockstarConf['OUTBASE'] + '/rockstar_param.cfg'
  rckFile = open( rockstarconfigFile, "w" )
  for key in list(rockstarConf.keys()):
    rckFile.write( key + " = " + str(rockstarConf[key]) + "\n" )
  for key in list(parallelConf.keys()):
    rckFile.write( key + " = " + str(parallelConf[key]) + "\n" )
  rckFile.close()
  #Run ROCKSTAR finder
  print("\nFinding halos...")
  print(" Parallel configuration")
  print("Output: ", rockstarConf['OUTBASE'] + '\n')


MPIcomm.Barrier()

if pId == 0: call([rockstarCommand, "-c", rockstarconfigFile ])  #rank 0 runs the Rockstar server
if pId == 1:
  time.sleep(5)  #give the server time to write auto-rockstar.cfg
  call([rockstarCommand, "-c", rockstarConf['OUTBASE'] + '/auto-rockstar.cfg' ])  #rank 1 starts the readers/writers

Pleiades

Here is my job script to run it on Pleiades:

#PBS -S /bin/csh
#PBS -j oe
#PBS -l select=1:ncpus=8:mem=10MB
#PBS -l walltime=0:30:00
#PBS -q ldan

module load mpi-sgi/mpt
module load comp-intel/2018.3.222
module load python3/3.7.0

setenv MPI_SHEPHERD true

setenv LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:/nasa/pkgsrc/sles12/2016Q4/lib:/pleiades/u/ndrakos/install_to_here/gsl_in/lib

mpiexec -np 8 python wfirst128_rockstar.py > output

(To install mpi4py, you can load the python3 module and do a pip install with the --user flag.)
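
To check that mpi4py is actually picking up MPI on the node, a tiny test script can be run under mpiexec (a sketch; check_mpi4py.py is just a name for the example):

#check_mpi4py.py: print the rank and size seen by each MPI process
from mpi4py import MPI

comm = MPI.COMM_WORLD
print("rank", comm.Get_rank(), "of", comm.Get_size())

Running mpiexec -np 4 python check_mpi4py.py should print four different ranks; if every process reports rank 0 of 1, mpi4py was most likely built against a different MPI than the one mpiexec is using.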

Comments

I am now going to run this on the \(1024^3\) simulations. I will also need to figure out how to run merger trees on these, and how to read in the output files.
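
For the output: Rockstar writes one ASCII catalog per snapshot to OUTBASE (the out_<snap>.list files), with the column names given in a commented header line. Here is a minimal sketch of a reader, assuming that header format:

import numpy as np

def read_rockstar_list(fname):
    #read a Rockstar out_<snap>.list ASCII catalog into a dict of arrays;
    #assumes the first line is a '#'-prefixed, space-separated list of
    #column names, followed by more '#' comment lines and then the data
    with open(fname) as f:
        columns = f.readline().lstrip("#").split()
    data = np.loadtxt(fname, comments="#", ndmin=2)
    return {name: data[:, i] for i, name in enumerate(columns)}

halos = read_rockstar_list("out_0.list")  #e.g. OUTBASE/out_0.list
print(list(halos.keys()))
print(len(next(iter(halos.values()))), "halos read")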

