HPC at University of Greenwich

The HPC at the University of Greenwich runs the 64-bit CentOS Linux operating system, with high-speed Omni-Path interconnects designed for use with MPI-compatible applications. The HPC is currently used by researchers in computational chemistry, DNA analysis, and bioinformatics.

Users have access to two types of disk area:

  1. Their home directory, in which users can keep persistent data such as job definitions, source files for jobs, and scripts.
  2. A per-user (or research-group-shared) scratch area, which offers faster access than the home directories. This is the place for work in progress and for storing results; it can also be used for analysis carried out on the HPC system itself, speeding up researchers' iterative workflows. A typical staging workflow is sketched below.
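The split between home and scratch lends itself to a simple staging pattern: copy a job's inputs from the home directory into scratch, run the analysis there, then copy the final results back. The sketch below assumes a per-user scratch mount at /scratch/<username> and a jobs directory under home; both are placeholders, and the real paths are supplied during onboarding.

```python
#!/usr/bin/env python3
"""Stage a job's input data from the home directory to the scratch area.

The /scratch/<username> mount point is an assumption for illustration;
use the scratch path given to you during onboarding.
"""
import getpass
import shutil
from pathlib import Path

HOME = Path.home()
SCRATCH = Path("/scratch") / getpass.getuser()   # hypothetical scratch location

def stage(job_name: str) -> Path:
    """Copy a job's input directory from home to scratch and return the new path."""
    src = HOME / "jobs" / job_name    # persistent job definition and inputs
    dst = SCRATCH / job_name          # fast work-in-progress area
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copytree(src, dst, dirs_exist_ok=True)
    return dst

if __name__ == "__main__":
    workdir = stage("my_analysis")
    print(f"Working directory on scratch: {workdir}")
    print(f"Copy final results back under {HOME} when the analysis is complete.")
```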

The HPC comprises three types of node, all interconnected via Omni-Path at transfer speeds of up to 100Gb/s. All nodes feature Intel(R) Xeon(R) CPU E5-2640 v4 @ 2.40GHz with 20 CPU cores.

  1. Standard Compute: Machines with 128GB RAM
  2. High Memory Compute: Machines with 320GB RAM
  3. GPU Compute: Machines with 128GB RAM and two NVIDIA Tesla K80 cards per node, each with 24GB GDDR5 ECC memory

The University of Greenwich HPC uses the SLURM job scheduler and has three queues (SLURM partitions), each tailored to a different purpose; an example batch script follows the list.

  1. stdcomp - 48 Standard Compute nodes.
  2. himemcomp - 3 High Memory Compute nodes.
  3. gpu - 5 GPU Compute nodes.
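As a sketch of how jobs target these queues, the batch script below is written in Python with #SBATCH directives in comment lines, which sbatch reads as long as the shebang names an interpreter. The partition names come from the list above; the job name, task count, time limit, and GPU request syntax are assumptions to adapt to your workload and local policy.

```python
#!/usr/bin/env python3
# Minimal SLURM batch job written in Python.  Submit from a login node with:
#   sbatch submit_example.py
# (the file name is arbitrary).  The #SBATCH directives below are ordinary
# Python comments, so both sbatch and the interpreter accept the script.
#SBATCH --job-name=example
#SBATCH --partition=stdcomp
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=20
#SBATCH --time=01:00:00
# For the gpu queue, switch the partition and request a GPU as well; the
# exact --gres name depends on the cluster configuration, so treat it as
# an assumption (remove one '#' to enable):
##SBATCH --partition=gpu
##SBATCH --gres=gpu:1

import os
import socket

# The body below runs on the allocated compute node once the job starts.
print(f"Job {os.environ.get('SLURM_JOB_ID')} running on {socket.gethostname()}")
print(f"CPUs allocated on this node: {os.environ.get('SLURM_CPUS_ON_NODE')}")
```

Queued and running jobs can then be checked from a login node with squeue.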

During the onboarding process, users are provided with applications that enable the following (a scripted-access sketch follows the list):

  1. Terminal access via SSH
  2. File transfer via SCP
  3. VirtualGL - accelerated remote visualisation and analysis capabilities
  4. "Renderfarm" for supported systems