CMSG disk and Shared Software on CX1
General information about the shared software available to CMSG group members on Imperial CX1 is summarized on this page. Unless stated otherwise, the software listed below is hosted under the 'CMSG' project of the Imperial Research Data Store (RDS) [1] (referred to as RDS-CMSG below), which is accessible from both the login and compute nodes of Imperial HPC CX1. At the time of writing (Feb. 2024), RDS-CMSG is not accessible from Imperial HPC HX1; after the pilot phase, only the HX1 login node will be able to reach the RDS-CMSG disk [2]. Software on RDS-CMSG is therefore classified into 2 categories:
- For CX1, use /rds/general/project/cmsg/live/app/ (executables, libraries and headers) and /rds/general/project/cmsg/live/etc/ (other files, such as pseudopotentials). Executables can be called directly from the user's home directory.
- For HX1, use /rds/general/project/cmsg/live/share/. Executables must be downloaded from RDS-CMSG and uploaded to HX1 (see the sketch after this list). It is the user's responsibility to set up a working running environment according to the 'readme' files (if present) saved in the same, or a parent, directory.
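As a quick illustration, the snippet below copies the HX1 build of CRYSTAL23 (listed further down this page) from RDS-CMSG to HX1 from a CX1 login node. The HX1 hostname, the username and the target directory ~/apps/ are placeholders and may differ for your account; the RDS path is the one given in the CRYSTAL23 (GNU) section.

# On a CX1 login node: copy the HX1 executables from RDS-CMSG to HX1.
# 'login.hx1.hpc.ic.ac.uk', 'username' and '~/apps/' are placeholders.
~$ scp -r /rds/general/project/cmsg/live/share/CRYSTAL/23v1/Linux-EBFOSS2023a-HX1-ompi \
      username@login.hx1.hpc.ic.ac.uk:~/apps/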
Please note that access to the CMSG disk is granted to CMSG group members only. If you need access, please contact the group PI, Prof. Nicholas Harrison.
Step 0 : Environment Modules
Environment Modules [3] manages the compilation and running environments, ensuring that all dependencies are loaded when an executable is launched. A module file is typically provided for each piece of software on RDS-CMSG and can be loaded with the module load command. Users are strongly advised to add the following line to their ~/.bashrc file on CX1:
export MODULEPATH="/rds/general/project/cmsg/live/etc/modulefiles:${MODULEPATH}"
This helps the 'module' executable find the module files hosted on RDS-CMSG, so that 'module' commands no longer require the full path. However, in case readers skip this section (as they always do), the full path is always used in the text below. Use the following command to update the environment variable ${MODULEPATH}:
~$ source ~/.bashrc
Due to the settings of CX1, the source command has to be executed every time you log in. To avoid this, run the following commands once:
~$ cat << EOF >> ~/.bash_profile
if test -f ~/.bashrc; then
    source ~/.bashrc
fi
EOF
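Once ${MODULEPATH} includes the RDS-CMSG directory, module files can be listed and loaded by their short names, which correspond to the path of the module file relative to the modulefiles directory. The short name CRYSTAL/23v1-gcc below is derived from the module file path used in the CRYSTAL23 (GNU) section; the full-path form works even without the ${MODULEPATH} change.

# List available modules, including those hosted on RDS-CMSG
~$ module avail
# Load by short name (requires the MODULEPATH line above) ...
~$ module load CRYSTAL/23v1-gcc
# ... or by full path, which always works
~$ module load /rds/general/project/cmsg/live/etc/modulefiles/CRYSTAL/23v1-gcc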
CRYSTAL23 v1 (Intel)
The current default for CX1. Not available for HX1.
CX1 version
- Compiling Env : EasyBuild Intel 2023a, OpenMP
- Note: MPPproperties and MP2 (CRYSCOR and CRYSTAL2) are currently not available, as they involve separate code packages (DMat2 and CRYSCOR). For MPPproperties, please see the GNU version below.
The general job submission script for Imperial HPC, developed by the author, is used here. Thanks to Mr. K Tallat-Kelpsa for testing.
Job submission script configuration
The source code (written in bash) is available in the group's GitHub repo. It is not needed in practice, but is useful for developing new features. On CX1, use the following command to configure your local settings:
~$ bash /rds/general/project/cmsg/live/share/HPC-job-submission/Imperial-HPC-Job-Submission/CRYSTAL23/config_CRYSTAL23.sh
Then follow the instructions on the screen. The default values are typically good; press enter to accept them. After configuration is finished, the information is stored in a 'settings' file, saved by default as ${HOME}/etc/runCRYSTAL23/settings. It functions as a dictionary for the job submission scripts to refer to, and can be edited according to the user's needs.
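For reference, the 'settings' file is plain text and can be inspected or edited directly; the commands below are only a minimal illustration (the SETcrys23 alias in the next table prints the same file), and the keywords named in the comment are the ones discussed later on this page.

# Print the current configuration
~$ cat ${HOME}/etc/runCRYSTAL23/settings
# Edit it by hand if a keyword (e.g. EXEDIR or MPIDIR) needs to be changed
~$ nano ${HOME}/etc/runCRYSTAL23/settings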
Job submission script commands
After configuration, use source ~/.bashrc to enable the alias commands.
Command | Definition |
---|---|
Pcrys23 | Generate parallel OMP CRYSTAL23 job submission files |
MPPcrys23 | Generate massive parallel OMP CRYSTAL23 job submission files |
Pprop23 | Generate parallel OMP CRYSTAL23 properties job submission files |
Scrys23 | Generate serial OMP CRYSTAL23 job submission files |
Sprop23 | Generate serial OMP CRYSTAL23 properties job submission files |
Xcrys23 | User-defined executables and multiple jobs (see below for code examples) |
SETcrys23 | Print the local 'settings' file |
HELPcrys23 | Print the instructions of commands |
Examples
To generate a qsub file for a parallel CRYSTAL23 job on 'mgo.d12', use the following command:
~$ Pcrys23 -in mgo.d12 -wt 01:00 -nd 1
This generates a 'mgo.qsub' file requesting 1 node with a maximum walltime of 1 hour. Similarly, after the job is done, the following command runs a parallel properties calculation on 12 CPUs, based on 'mgo-band.d3' and the data from the previous 'mgo' SCF calculation:
~$ Pprop23 -in mgo-band.d3 -nc 12 -wt 00:30 -ref mgo
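For user-defined executables (the Xcrys23 command in the table above), an executable keyword has to be passed explicitly. The sketch below is only illustrative: it assumes Xcrys23 accepts the same -x flag as the underlying gen_sub script shown in the GNU section, together with the usual -in/-wt/-nd options, and the keyword 'pcrys' is a hypothetical entry in the 'settings' file. Check HELPcrys23 for the actual flags and keywords.

# Hypothetical example: submit 'mgo.d12' with a user-chosen executable keyword
# defined in the 'settings' file. Flags assumed to mirror gen_sub; verify with HELPcrys23.
~$ Xcrys23 -x pcrys -in mgo.d12 -wt 01:00 -nd 1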
For detailed instructions and testing cases, please refer to the documentation.
PC version
A few variants are available in /rds/general/project/cmsg/live/share/CRYSTAL, including executables compiled for the Windows Subsystem for Linux (WSL) and macOS. Please check the readme files saved in the individual directories for specifications.
A set of statically linked 'crystal' and 'properties' executables is available in /rds/general/project/cmsg/live/share/CRYSTAL/23v1/Linux-intel2023-x86-intel2023. They have no prerequisites and can be run in serial on either Linux or macOS with an x86 CPU.
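As a rough sketch, the static binaries can be pulled from RDS-CMSG onto a local machine and run from the command line. The CX1 login hostname and the local directory ~/crystal23/ are placeholders, and it is assumed here that the serial 'crystal' binary reads the input deck from standard input.

# Copy the static executables to a local machine (hostname and target directory are placeholders)
~$ mkdir -p ~/crystal23
~$ scp "username@login.hpc.ic.ac.uk:/rds/general/project/cmsg/live/share/CRYSTAL/23v1/Linux-intel2023-x86-intel2023/*" ~/crystal23/
~$ chmod +x ~/crystal23/crystal ~/crystal23/properties
# Run a serial calculation, assuming input is read from stdin
~$ ~/crystal23/crystal < mgo.d12 > mgo.out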
CRYSTAL23 v1 (GNU)
Compared to the Intel version above, the GNU version is slightly slower, which is the price of compatibility with MPPproperties. Available for CX1 and HX1.
CX1 version
- Compiling Env : gcc11.2.0, aocl4.0, mpich4.0.2, OpenMP
- Note: MP2 (CRYSCOR and CRYSTAL2) is currently not available.
The same job submission script is used. However, the GNU version is not the default option on CX1. To enable it, change the following parameters in the existing 'settings' file:
- EXEDIR : module load /rds/general/project/cmsg/live/etc/modulefiles/CRYSTAL/23v1-gcc
- MPIDIR : module load /rds/general/project/cmsg/live/etc/compiler/gcc11.2.0-aocl
Alternatively, rerun the configuration script (same as above) and specify the 2 values during configuration.
Note that by default there is no command for 'MPPproperties'. The user has to modify their ~/.bashrc file to set the alias, and the 'settings' file to define the executable flag. An example:
# in ~/.bashrc
alias MPPprop23="/rds/general/project/cmsg/live/share/HPC-job-submission/Imperial-HPC-Job-Submission/gen_sub -x mppprop -set ${HOME}/etc/runCRYSTAL23/settings"

# in settings, keep the column width
mppprop     mpiexec -np ${V_TPROC} MPPproperties     Massive parallel properties calculation, OMP
HX1 version
- Compiling Env : EasyBuild foss/2023a, OpenMP
- Note: MP2 (CRYSCOR and CRYSTAL2) is currently not available.
Executables are saved in /rds/general/project/cmsg/live/share/CRYSTAL/23v1/Linux-EBFOSS2023a-HX1-ompi. To set up the running environment on HX1, a user who has a copy of the 'settings' file needs to modify the following keywords:
- EXEDIR : No default. The directory where the user has placed the executables.
- MPIDIR : No default, but please use module load foss/2023a unless a self-compiled version is used.
Alternatively, run the configuration script (the one with '-HX1' in its name, in the same directory of the GitHub repo) and specify the 2 values during configuration. The other options are consistent with CRYSTAL on CX1.
CRYSTAL17 v2 (GNU)
Not hosted on RDS-CMSG. Default for CX1. Not available for HX1.
CX1 version
- Compiler : gcc/6.2.0
- MPI : mpich/3.4.3
For job submission scripts
The configuration and usage are identical to CRYSTAL23: please refer to the previous section and substitute '23' with '17'. Also, please note that CRYSCOR (MP2 calculations) is not available for CRYSTAL17, and 'MPP' as a massive parallel strategy is limited to 'crystal' calculations, i.e., only MPPcrystal is released.
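Following the '23'-to-'17' substitution rule above, a CRYSTAL17 job on 'mgo.d12' would be generated as in the earlier CRYSTAL23 example; the walltime and node counts below are arbitrary illustrations.

# Same flags as the CRYSTAL23 example: 1 node, 1 hour walltime
~$ Pcrys17 -in mgo.d12 -wt 01:00 -nd 1
# Massive parallel 'crystal' calculation (only MPPcrystal is released for CRYSTAL17)
~$ MPPcrys17 -in mgo.d12 -wt 01:00 -nd 2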
Quantum Espresso v7.2 (GNU)
Job submission files are configured for both CX1 and HX1, but only the CX1 executables are shared, as these executables are highly environment-dependent.
CX1 version
- Compiling Env : EasyBuild foss2022a, OpenMP
- libxc: Yes, version 5.1.2
- hdf5: No
Pseudopotential files are saved in /rds/general/project/cmsg/live/etc/QE_PseudoP.
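To avoid typing the full path in every input file, the shared directory can be set as the default pseudopotential location through QE's ESPRESSO_PSEUDO environment variable (equivalently, pseudo_dir in the &CONTROL namelist of each input); the export below simply reuses the RDS-CMSG path above.

# Add to ~/.bashrc so pw.x and friends look up pseudopotentials on RDS-CMSG by default
~$ export ESPRESSO_PSEUDO=/rds/general/project/cmsg/live/etc/QE_PseudoP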
Job submission script configuration
~$ bash /rds/general/project/cmsg/live/share/HPC-job-submission/Imperial-HPC-Job-Submission/QE7/config_QE7.sh
The procedure is the same as for CRYSTAL. Press enter to accept the default setup, which typically works fine. Please note that moving files between the temporary (scratch) and home directories is disabled, as QE has its own built-in file management system.
Job submission script commands
Command | Definition |
---|---|
PWqe7 | Generate pw.x job submission files |
PHqe7 | Generate ph.x job submission files |
CPqe7 | Generate cp.x job submission files |
PPqe7 | Generate pp.x job submission files |
Xqe7 | Generate job submission files for user-defined executables |
SETqe7 | Print the local 'settings' file |
HELPqe7 | Print the instructions of commands |
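As with CRYSTAL, a typical workflow is to generate a qsub file from an input and submit it. The sketch below assumes PWqe7 takes the same -in/-wt/-nd flags as the CRYSTAL commands (the scripts come from the same gen_sub family); please confirm with HELPqe7.

# Hypothetical pw.x SCF job on 'mgo.scf.in': 1 node, 1 hour walltime.
# Flags assumed to mirror the CRYSTAL commands; check HELPqe7.
~$ PWqe7 -in mgo.scf.in -wt 01:00 -nd 1
# then submit the generated .qsub file with qsub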
GULP v6.1.2 (GNU)
Default for CX1. Not available for HX1.
CX1 version
- Compiling Env : EasyBuild foss2022a
- OpenKIM: Yes, version 2.3.0
- PLUMED: Yes, version 2.9.0
- ALAMODE: Yes, git repo copied into app/
GULP force field libraries are saved in /rds/general/project/cmsg/live/etc/GULP_Libraries. Executable name: 'gulp-mpi'.
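As an optional convenience, the shared library directory can be exported through the GULP_LIB environment variable, which GULP typically uses to locate its force field library files, so that library names can be given in inputs without full paths; this assumes the standard GULP behaviour.

# Optional: point GULP at the shared force field libraries on RDS-CMSG
~$ export GULP_LIB=/rds/general/project/cmsg/live/etc/GULP_Libraries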
OpenKIM
KIM force fields [4] are saved in /rds/general/project/cmsg/live/etc/KIM_Models. To run the OpenKIM executables and download new models, run:
~$ module load /rds/general/project/cmsg/live/etc/modulefiles/OpenKIM/2.3.0-foss
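After loading the module, new KIM models can be managed with the kim-api-collections-management utility that ships with the KIM API. The sketch below installs into the 'user' collection rather than the shared KIM_Models directory (which you likely cannot write to); <Extended_KIM_ID> is a placeholder for a model name taken from openkim.org.

# List currently installed models
~$ kim-api-collections-management list
# Install a new model into the user collection; replace <Extended_KIM_ID>
# with the full model name from openkim.org
~$ kim-api-collections-management install user <Extended_KIM_ID>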
Job submission script configuration
~$ bash /rds/general/project/cmsg/live/share/HPC-job-submission/Imperial-HPC-Job-Submission/GULP6/config_GULP6.sh
Press enter to accept the default setup, which typically works fine.
Job submission script commands
Command | Definition |
---|---|
Pglp6 | Generate job submission files for 'gulp-mpi' |
Xglp6 | Generate job submission files for a user-defined executable (nothing else is provided by default, so this is rarely useful) |
SETglp6 | Print the local 'settings' file |
HELPglp6 | Print the instructions of commands |
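A typical run would look like the sketch below. The -in/-wt/-nd flags are assumed to match the other scripts in the same gen_sub family and the input name 'example.gin' is arbitrary; please confirm with HELPglp6.

# Hypothetical 'gulp-mpi' job on 'example.gin': 1 node, 1 hour walltime.
# Flags assumed to mirror the CRYSTAL/QE commands; check HELPglp6.
~$ Pglp6 -in example.gin -wt 01:00 -nd 1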
ONETEP v6.1.9.2 (GNU)
Default for CX1. Not available for HX1. Job submission script not available.
CX1 version
- Compiling Env : EasyBuild foss2022a, OpenMP, FFTW, Scalapack
- Libxc: Yes, version 5.1.2
The executable path is /rds/general/project/cmsg/live/app/ONETEP/6.1.2.2__foss2022a/bin. Pseudopotential files are saved in /rds/general/project/cmsg/live/etc/ONETEP_PseudoP.
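Since no job submission script is provided, a PBS job file has to be written by hand. The sketch below is only a starting point and rests on several assumptions: the resource request follows the standard CX1 PBS Pro syntax, 'onetep' is a placeholder for the actual executable name in the bin directory above, the module name providing the foss/2022a stack on CX1 may differ from what is shown, and the MPI/OpenMP split is arbitrary. Adapt all of these to your environment.

#!/bin/bash
#PBS -l select=1:ncpus=32:mem=64gb
#PBS -l walltime=02:00:00

# Toolchain providing the foss/2022a stack (OpenMPI, FFTW, ScaLAPACK);
# the exact module name on CX1 may differ - adapt as needed.
module load foss/2022a

cd ${PBS_O_WORKDIR}

# 'onetep' is a placeholder for the actual executable name in the shared bin directory
ONETEP_EXE=/rds/general/project/cmsg/live/app/ONETEP/6.1.2.2__foss2022a/bin/onetep

# Hybrid MPI+OpenMP launch: 8 MPI ranks x 4 OpenMP threads = 32 cores requested above
export OMP_NUM_THREADS=4
mpiexec -np 8 ${ONETEP_EXE} input.dat > input.out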
References
<references>