HPC Cluster

The computers in a cluster are connected by a fast network and typically provide large amounts of storage. Clusters are usually built either from many modest machines or from a smaller number of very powerful ones.
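
Programs on such clusters typically coordinate over that network by passing messages, most commonly with MPI. As a minimal sketch (assuming the mpi4py Python bindings are installed; this is a generic illustration, not a script specific to our systems):

    # hello_mpi.py - minimal MPI example (requires the mpi4py package)
    from mpi4py import MPI

    comm = MPI.COMM_WORLD   # communicator spanning every process in the job
    rank = comm.Get_rank()  # this process's ID, 0 .. size-1
    size = comm.Get_size()  # total number of processes launched

    # Each process may run on a different node; they share no memory,
    # so any data exchange must travel over the network via MPI calls.
    print(f"Hello from rank {rank} of {size} on {MPI.Get_processor_name()}")

Launched with, for example, mpirun -np 16 python hello_mpi.py, one copy of the script runs per requested core.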

We have two systems available to staff, researchers and students.

Fotcluster2

This is a 752-core distributed-memory cluster, which comprises:

A 3U combined head & storage node, plus 56 compute nodes.

Head Node

The head node is a Viglen HS316i combined head & storage node, equipped with 16GB of memory and 2 x 500GB hard drives configured in RAID 0.

The 56 compute nodes are split between two phases:

Phase 1

Consisting of 20 ClusterVision RS724Q-E7 2U compute nodes, each equipped with dual Intel Xeon E5-2650 (Sandy Bridge) 8-core 2.60GHz processors and 64GB of memory per motherboard.

Each of the compute nodes is networked via an Intel InfiniBand switch.

Phase 1 provides a total of 320 cores.

Phase 2

Consisting of 36 Viglen HX425T2i HPC 2U compute nodes, each equipped with dual Intel Xeon E5650 (Westmere) six-core 2.66GHz processors and 12GB of memory per motherboard.

Each of the compute nodes is networked via InfiniBand.

The cluster has 12TB of data storage, configured in RAID 6.

The whole system is managed using IPMI (KVM over LAN) technology.

Phase 2 provides a total of 432 cores.
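
As a quick check, the 752-core total follows directly from the node specifications above:

    # Core-count arithmetic for fotcluster2, using the figures quoted above
    phase1 = 20 * 2 * 8   # 20 nodes x 2 sockets x 8 cores (E5-2650)  -> 320
    phase2 = 36 * 2 * 6   # 36 nodes x 2 sockets x 6 cores (E5650)    -> 432
    print(phase1, phase2, phase1 + phase2)   # 320 432 752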

This phase was funded by a Science & Technology Facilities Council (STFC) grant awarded to the UKQCD collaboration.

FoSERES/FoSEEDU

The complete system was funded equally by the School of Engineering (SoE) and the School of Computing, Electronics & Mathematics (SoCEM).

This is a 1,664-core homogeneous cluster, which comprises:

Twin Head nodes (Research/Education)

  • 2 x 1U servers, each consisting of dual E5-2650 v4 CPUs, 128GB RAM and 1 x single-port Intel® Omni-Path NIC

Compute Nodes

  • 52 compute nodes housed in 2U Twin² chassis (four nodes per chassis), each node consisting of 2 x E5-2683 v4 CPUs, 128GB RAM and 1 x 240GB SATA SSD, with an Intel® Omni-Path NIC
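
(As a check on the headline figure: the E5-2683 v4 is a 16-core part, so 52 nodes x 2 sockets x 16 cores = 1,664 cores.)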

Data Storage

  • 1U metadata server with a single E5-1650 v3 CPU, 128GB RAM, 10 x 146GB 15K HDDs and 1 x single-port Intel® Omni-Path NIC
  • 4U object server with dual E5-2650 v3 CPUs, 128GB RAM, 36 x 6TB HDDs plus 146GB 15K HDDs, and 1 x single-port Intel® Omni-Path NIC
  • Total storage capacity: 500TB

Storage / Communications

  • Inter-node communications via Intel® Omni-Path cabling
  • IPMI Management module with KVM over LAN

Cooling System

Our High Performance Clusters are enclosed in state-of-the-art water-cooled cabinets.

These cabinets are monitored 24/7 to ensure that the cluster hardware is maintained at 20-24 degrees C.

The High Performance Cluster systems are cooled by a 35kW sealed water-cooling system.

The cooling system delivers water to the cluster cabinets at 10 degrees C.
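
To put the 35kW figure in context, a rough back-of-the-envelope calculation shows the sort of water flow involved; the 10 degrees C temperature rise assumed below is purely illustrative, not a measured figure:

    # Illustrative only: water flow needed to carry away 35kW of heat,
    # assuming the water warms by 10 degrees C across the cabinets.
    Q = 35_000            # heat load in watts (from the spec above)
    c = 4186              # specific heat of water, J/(kg*K)
    dT = 10               # assumed temperature rise, K (illustrative)
    flow = Q / (c * dT)   # mass flow in kg/s, roughly 0.84
    print(round(flow * 60), "litres per minute (approx.)")   # ~50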


Researchers at the University depend on high performance computing to explore new ways to harness wave power for generating electricity and to better understand the fabric of the universe.

To provide leading-edge technology for research, the Plymouth HPC team needed a new cluster that could deliver outstanding performance while simplifying deployment and management.

Working with IT solution provider Viglen, the HPC team selected an Intel® Cluster Ready cluster with Intel® Xeon® processors. The new HPC resources are helping researchers uncover new insights in marine renewable energy and particle physics.

"With the Intel Xeon processor 5600 series, researchers gain the raw compute performance and memory bandwidth they need to generate complex, high-resolution models and to analyse large data sets."

Peter Mills, School Technical Manager

Get an account 

To get an account on one of the clusters, please submit an account request by completing the online application. You must ensure that the form is completed in full.

Access to the HPCs is restricted to staff, researchers and authorised students.

If a student wishes to obtain an account on the HPC, their course supervisor must provide justification for why the account is required.

Your account

When HPC staff create an account for a user on one of the HPCs, certain defaults are set and a standard directory structure is created.

Before logging in to the cluster for the first time, you will be required to attend an introductory session where the cluster is explained in greater detail. This gives new users the opportunity to ask questions and clear up any queries they may have.

The directory structure that HPC staff set up as part of your account creation is designed to facilitate users' work, and also reflects the fact that some users may require more storage space than others.

When your account is created you will be given a home directory:

  • /home/username - this is where you start when you log in to HPC. 

If you have an account on fotcluster2, you will also be allocated 100GB of storage space.
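
As a rough way to see how much of that allocation you are using, you can total up your home directory from a login session. The snippet below is a generic sketch, not an official quota tool (ask HPC staff for the supported method):

    # Rough check of home-directory usage against the 100GB allocation.
    # Walks the tree and sums file sizes; symlinks are skipped so the
    # count stays within your own files.
    import os

    total = 0
    for dirpath, dirnames, filenames in os.walk(os.path.expanduser("~")):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if not os.path.islink(path):
                total += os.path.getsize(path)

    limit = 100 * 10**9   # the 100GB fotcluster2 allocation
    print(f"Using {total / 10**9:.1f}GB of {limit / 10**9:.0f}GB")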