Orca¶
Description¶
ORCA is an ab initio quantum chemistry program package for modern electronic structure methods, including density functional theory, many-body perturbation theory, coupled cluster theory, multireference methods, and semi-empirical methods. Its main fields of application are larger molecules, transition metal complexes, and their spectroscopic properties. ORCA is developed in the research group of Frank Neese.
Environment Modules¶
Run module spider orca to find out what environment modules are available for this application.
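For example, to list the available versions and then load one together with its compiler and MPI toolchain (the versions shown are the ones used in the job scripts below and may change over time):
module spider orca                                 # list available ORCA modules
module load gcc/14.2.0 openmpi/5.0.7 orca/6.0.1    # load a specific version with its toolchain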
Environment Variables¶
- HPC_ORCA_DIR - installation directory
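Once an ORCA module is loaded, HPC_ORCA_DIR can be used to call the executable with its full path, in the same way the job scripts below use ORCA_DIR. A minimal sketch, assuming the orca executable sits directly under the installation directory (as it does for ORCA_DIR in the examples below):
module load gcc/14.2.0 openmpi/5.0.7 orca/6.0.1
echo $HPC_ORCA_DIR                       # print the installation directory
$HPC_ORCA_DIR/orca job.inp > job.out     # run ORCA via its full path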
Additional Usage Information¶
Questions about running ORCA or electronic structure methods in general should be addressed to Ajith Perera through the UFIT Support System Main Page. The following links provide good resources: Setting Up ORCA and ORCA Tutorials.
Job Script Examples¶
ORCA is distributed as pre-built binaries only, not as source code. The executables installed on HiPerGator are built against OpenMPI 4.1.1 (the compiler is unspecified and assumed to be GNU on Linux x86_64). ORCA can run in parallel; however, there are several important things users need to know. ORCA calls mpirun internally, whereas most computing centers, including HiPerGator, prefer that mpirun (or srun under the Slurm controller) be invoked externally. As the following examples show, we do not use srun; instead, we call the ORCA binary "orca" directly with its full path (this is essential). The ORCA documentation does not give examples of running across multiple nodes, but you may visit the ORCA forum for related issues and solutions.
The input file must contain a parallel configuration block, where n is the number of MPI processes and should match the total number of Slurm tasks requested:
%pal nprocs n
end
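For reference, a complete single-point input (job.inp) might look like the following sketch; the method, basis set, and water geometry are placeholders, and nprocs 2 matches the single-node example below:
! B3LYP def2-SVP TightSCF
%pal nprocs 2
end
* xyz 0 1
O   0.000000   0.000000   0.000000
H   0.000000   0.757160   0.586260
H   0.000000  -0.757160   0.586260
*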
ORCA on a single node with multiple cores:
#!/bin/bash
#SBATCH --job-name=parallel_job # Job name
#SBATCH --mail-type=END,FAIL # Mail events (NONE, BEGIN, END, FAIL, ALL)
#SBATCH --mail-user=username@ufl.edu # Where to send mail
#SBATCH --nodes=1 # Run all tasks on a single node
#SBATCH --ntasks=2 # The total number of tasks
#SBATCH --mem-per-cpu=500mb # Memory per processor
#SBATCH --time=00:05:00 # Time limit hrs:min:sec
#SBATCH --output=parallel_%j.log # Standard output and error log
pwd; hostname; date
echo "Running orca test calculation on a with four CPU cores"
echo "Date = $(date)"
echo "Hostname = $(hostname -s)"
echo "Working Directory = $(pwd)"
echo ""
echo "Number of Nodes Allocated = $SLURM_JOB_NUM_NODES"
echo "Number of Tasks Allocated = $SLURM_NTASKS"
echo "Number of Cores/Task Allocated = $SLURM_CPUS_PER_TASK"
echo ""
module load gcc/14.2.0 openmpi/5.0.7 orca/6.0.1
which mpirun; echo $PATH; echo $LD_LIBRARY_PATH
export ORCA_DIR=/apps/gcc/14.2.0/openmpi/5.0.7/orca/6.0.1
export OMPI_MCA_coll_hcoll_enable=0
export SRUN_CPUS_PER_TASK=1
$ORCA_DIR/orca job.inp > job.out
date
ORCA on multiple nodes:
#!/bin/bash
#SBATCH --job-name=parallel_job # Job name
#SBATCH --mail-type=END,FAIL # Mail events (NONE, BEGIN, END, FAIL, ALL)
#SBATCH --mail-user=username@ufl.edu # Where to send mail
#SBATCH --nodes=2 # The number of nodes
#SBATCH --ntasks-per-node=8 # Maximum number of tasks on each node
#SBATCH --cpus-per-task=1 # number of CPUs per task.
#SBATCH --mem-per-cpu=500mb # Memory per processor
#SBATCH --time=00:05:00 # Time limit hrs:min:sec
#SBATCH --output=parallel_%j.log # Standard output and error log
pwd; hostname; date
echo "Running orca test calculation on a with four CPU cores"
echo "Date = $(date)"
echo "Hostname = $(hostname -s)"
echo "Working Directory = $(pwd)"
echo ""
echo "Number of Nodes Allocated = $SLURM_JOB_NUM_NODES"
echo "Number of Tasks Allocated = $SLURM_NTASKS"
echo "Number of Cores/Task Allocated = $SLURM_CPUS_PER_TASK"
echo ""
module load gcc/14.2.0 openmpi/5.0.7 orca/6.0.1
which mpirun; echo $PATH; echo $LD_LIBRARY_PATH
export SRUN_CPUS_PER_TASK=1
export ORCA_DIR=/apps/gcc/14.2.0/openmpi/5.0.7/orca/6.0.1
export OMPI_MCA_coll_hcoll_enable=0
$ORCA_DIR/orca job.inp > job.out
date
Disclaimer: The above Slurm configurations are hypothetical. Users must customize them based on the size of the calculation, the available resources, etc.
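To submit either script, save it to a file (for example, orca_job.sh, a hypothetical name) in the same directory as job.inp, then submit and monitor it with the standard Slurm commands:
sbatch orca_job.sh        # submit the job script
squeue -u $USER           # check the job status
The ORCA output is written to job.out and the Slurm log to parallel_<jobid>.log.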
Citation¶
If you publish research that uses ORCA, cite it as follows:
The ORCA quantum chemistry program package, Frank Neese, Frank Wennmohs, Ute Becker, Christoph Riplinger, J. Chem. Phys. 152, 224108 (2020).
Categories¶
chemistry