Cluster: orca.sharcnet.ca

Links: System documentation in the SHARCNET Help Wiki

Manufacturer: HP
Operating system: CentOS 6.x
Interconnect: QDR InfiniBand
Total processors/cores: 8880
Nodes

orca: 1-320
Type: Compute
Cores: 24 (2 sockets x 12 cores per socket), AMD Opteron @ 2.2 GHz
Memory: 32.0 GB
Local storage: 120 GB

orca: 321-360
Type: Compute
Cores: 16 (2 sockets x 8 cores per socket), Intel Xeon @ 2.6 GHz
Memory: 32.0 GB
Local storage: 430 GB

orca: 361-388
Type: Compute
Cores: 16 (2 sockets x 8 cores per socket), Intel Xeon @ 2.7 GHz
Memory: 64.0 GB
Local storage: 500 GB
Notes: Run time limited to four (4) hours for non-contribution users.

orca: 389-392
Type: Compute
Cores: 16 (2 sockets x 8 cores per socket), Intel Xeon @ 2.7 GHz
Memory: 128.0 GB
Local storage: 500 GB
Notes: Run time limited to four (4) hours for non-contribution users.

orca: 9001-9002
Type: Login
Cores: 24 (2 sockets x 12 cores per socket), AMD Opteron @ 2.2 GHz
Memory: 24.0 GB
Local storage: 280 GB
Total attached storage: 58.6 TB
Suitable use

Low-latency parallel applications.
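
For example, an MPI program can be compiled and launched with the OpenMPI stack listed below. This is a minimal sketch, assuming the OpenMPI module is loaded and hello_mpi.c stands in for your own source file:

mpicc -O2 hello_mpi.c -o hello_mpi

mpirun -np 24 ./hello_mpi

One rank per core (24 on the Opteron nodes) is the usual starting point; across nodes, OpenMPI carries traffic over the InfiniBand fabric.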

Software available

NAMD, GAUSSIAN, STAR-CCM+, LAMMPS, CP2K, MATLAB, GCC, ESPRESSO, MODE, NCBIC++TOOLKIT, FDTD, SIESTA, FREEFEM++, BLCR, ABAQUS, PYTHON, NWCHEM, OCTAVE, UTIL, CMAKE, MAP, COMSOL, DAR, SPARK, R, PARI/GP, NETCDF, FFTW, CONVERGE, OPEN64, ACML, CHARM++, MERCURIAL, OPENMPI, PETSC_SLEPC, ANSYS, ABINIT, SUBVERSION, BLAST, ADF/BAND, HDF, INTEL, BOOST, ORCA, SAMTOOLS, GIT, CDF, MAPLE, CPMD, OPENJDK, GNU, TINKER, AMBER, NCL, GDB, BIOPERL, QD, GNUPLOT, MrBAYES, GROMACS, GMP, BINUTILS, PERL, SPRNG, MKL, BIOSAMTOOLS, MPFR, VIM, MPFUN90, VALGRIND, MPIBLAST, TEXLIVE, RLWRAP, MPFUN2015, LSDYNA, MPC, YT, DLPOLY, SUPERLU, PNETCDF, COREUTILS, IPM, GSL, BIOPERLRUN, SQ, ILOGCPLEX, PGI, OPENCV, LLVM, LDWRAPPER, ARPACK-NG, EMACS, CPAN, RUBY, NIX, MONO, PROOT, GHC, VMD, SYSTEM, AUTODOCKVINA, GEANT4, NINJA

Current system state: details and graphs are available online.

Recent System Notices

Dec 04 2018, 03:24PM

/project is available again on orca after a faulty piece of network hardware was replaced.

Dec 03 2018, 04:35PM

/project is currently unavailable on orca due to a network hardware failure. We have unmounted /project from all nodes until the problem is corrected, to avoid hangs. In-progress jobs that try to use /project will probably hang, and queued jobs that start now expecting /project to be mounted will fail.

We’ll update this status when we have an expected repair time.

Dec 03 2018, 03:28PM

/project is currently unavailable on orca; we are investigating the problem.

Nov 05 2018, 02:32PM

Update: orca.sharcnet.ca has been decommissioned. See the end of this message for access to /home and /work.

Orca has been converted to the same software environment as Graham, and now requires your Compute Canada username and password. Software access and job submission work the same way as they do on Graham.
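
Graham schedules jobs with Slurm, so the same applies on the converted Orca. A minimal batch script might look like the following sketch; the account name, resource values, and program name are placeholders:

#!/bin/bash
#SBATCH --account=def-someuser
#SBATCH --time=01:00:00
#SBATCH --ntasks=16
#SBATCH --mem-per-cpu=1G
srun ./your_program

Save it as job.sh and submit it with sbatch job.sh; squeue -u $USER shows its state in the queue.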

You can connect to the new Orca via orca.computecanada.ca.
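
For example, from a terminal, with username replaced by your Compute Canada account name:

ssh username@orca.computecanada.ca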

The legacy SHARCNET software stack is still available and can be used by issuing the following two commands:

module purge --force

export MODULEPATH=/opt/sharcnet/modules
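
Once MODULEPATH points at the legacy tree, the standard module commands operate on the old stack, for example (the package name here is illustrative; pick any entry from the software list above):

module avail

module load gcc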

Please report any software issues you find by email to help@sharcnet.ca

On the new Orca you will have the same /home and /project data as you have on Graham. You have the same /scratch available as on the old Orca.

Your old /home and /work are still available via dtn.sharcnet.ca and other SHARCNET clusters.

Oct 19 2018, 01:31PM

Orca has been converted to the same software environment as Graham, and now requires your Compute Canada username and password. Software access and job submission work the same way as they do on Graham.

You can connect to the new Orca via orca.computecanada.ca.

The legacy SHARCNET software stack is still available and can be used by issuing the following two commands:

module purge --force

export MODULEPATH=/opt/sharcnet/modules

Please report any software issues you find by email to help@sharcnet.ca

On the new Orca you will have the same /home and /project data as you have on Graham. You have the same /scratch available as on the old Orca.

Your old /home and /work are still available on the old Orca via orca.sharcnet.ca, dtn.sharcnet.ca and other SHARCNET clusters.
