Changes ahead for N8HPC (Closure 17th July 2018)

[UPDATE 31st August 2018] Polaris is no longer on line.

N8HPC has announced that the “polaris” service queues will shut down on 17th July 2018.

The UoLeeds ARC team recommend moving your software and simulations back to ARC2, since it is virtually identical to Polaris.

It is not yet clear what the “Cloud provision” means for UoLeeds HPC users. It may be that we have to become familiar with topics such as this (note: CEMAC is not recommending that you download this material; we will do what we can to investigate alternatives).

HPC Moves to the Cloud – What You Need to Know


Feb 2018 status of active projects

We are well into February, and both Software Development Scientists have now rolled up their sleeves and are working hard on the following projects:

UNRESP: The CALPUFF workflow has been investigated, and a method for obtaining precise reference data for the Nicaraguan region of interest (around the Masaya Volcano) has been established. Next is to investigate whether there is an alternative to the meteorological data currently sourced from ECMWF. (An example of the likely output is map_concrec010048.)

Air Quality Living Lab: A web site has been provided for the scientists to upload data from the CPC devices used to measure air quality around campus. It then displays the information against a Google Map background, with the track taken from GPS logging during the walkabout.

AfriCultRes: A preliminary analysis of the code (GLAMv3) has been undertaken and a modified prototype is already under testing; it promises a “huge” computational improvement over the existing implementation. The calculation time is greatly reduced and the memory requirement is also reduced (so queuing for a large-memory computational node is not necessary).

JWCRP Stage 2 (Chemistry processes): This project has been on hiatus since the work on the aerosol sub-processes was completed in vn10.8. Now, with vn10.9 and vn11.0, this project will provide significant changes to the UKCA code to enhance computational performance; initially this will be through a “cache-blocking” style of additional nested do-loop, followed by adding OpenMP regions around the loops over columns of atmosphere.
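For readers unfamiliar with the technique, the loop restructuring described above can be sketched as follows. This is an illustrative Python sketch only, with invented array sizes; the real change is in UKCA Fortran, where the outer loop over column blocks is also where the OpenMP parallel region would be placed:

```python
# Illustrative sketch only -- not UM/UKCA code. Shows the "cache-blocking"
# restructuring: an extra outer loop walks blocks of columns so that each
# block's working set stays cache-sized. Sizes are invented.
NCOLS, NLEVS, BLOCK = 8, 4, 2  # columns, levels, block width (BLOCK divides NCOLS)

def update_plain(field):
    """Reference version: one pass over every (column, level) cell."""
    for c in range(NCOLS):
        for k in range(NLEVS):
            field[c][k] = field[c][k] * 0.5 + 1.0

def update_blocked(field):
    """Blocked version. In the UM the outer loop over column blocks is
    where an OpenMP parallel region would go, distributing independent
    blocks of columns across threads."""
    for cb in range(0, NCOLS, BLOCK):        # blocks of columns
        for k in range(NLEVS):               # vertical levels
            for c in range(cb, cb + BLOCK):  # columns within the block
                field[c][k] = field[c][k] * 0.5 + 1.0

a = [[float(c + k) for k in range(NLEVS)] for c in range(NCOLS)]
b = [[float(c + k) for k in range(NLEVS)] for c in range(NCOLS)]
update_plain(a)
update_blocked(b)
assert a == b  # same result, different traversal order
```

The point of the restructuring is that results are unchanged; only the memory-access pattern (and, with OpenMP, the thread decomposition) differs.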

CRESCENDO: There are a couple of aspects currently being addressed: (a) using the CISTools for processing the output of the UM, satellite (MODIS) and in-situ AOD data (AERONET); (b) contributing to the evaluation work package and helping with the configuration of UK-ESM runs that will provide data for the evaluations.

Decoding MeteoSat archives: One aspect of this work has been installing and developing tools to extract specific data from MeteoSat BUFR files. The original BUFR tools are being superseded by “ecCodes”, an ECMWF product (which similarly takes over from the GRIB API).

Below is an example ‘full disc’ image from Meteosat-10, showing Brightness Temperature from the Infrared 10.8 micron Channel:

The region of interest (requested by the science researcher):

A zoomed view demonstrating the limitations of the particular satellite.

UKMO Unified Model on ARC3: there is a continued effort to get the UM running on ARC3, to support any local researcher who needs to run the model locally; typically this would be a research project that does not have access to Monsoon or ARCHER. Good advice and guidance is being provided by NCAS CMS.

New call opens for Tier-2 HPC Open Access

The EPSRC Resource allocation panel have opened the call for HPC access to Tier-2 systems:

The Tier-2 systems are now:

Cirrus HPC System – 10,000 core system based on Intel Xeon Broadwell
GW4 – Test bed for emerging architectures including ARM, GPU and Xeon Phi
CSD3 – 24,000 cores of Intel Xeon Skylake, 342 Intel Xeon Phi and 360 NVidia GPUs
HPC Midlands – 14,336 core system based on the Intel Xeon processor
JADE A National GPU facility – 22 NVIDIA DGX-1 Deep Learning systems

Closing date: 12th October 2017, with projects to start before the end of December 2017.

ARC1 switch-off 31 May 2017

Just in case you missed it: ARC1 is being turned off at the end of May 2017.

Hello everyone,

This is another reminder that the ARC1 service closes permanently on May 31st 2017.

If you have not yet made arrangements to migrate your work over to one of the other clusters, or you don’t have an account on one of the other machines, you need to do this as a matter of urgency.

For account applications, see here:

ARC team

Extra HPC provision on Monsoon (NEXCS)

Hi All,
the details are here:

Project resource

Requests for NERC project resource should be made through the regular NERC process – in the early-usage stage (April 2017 – July 2017), access will be open.

Machine access

NEXCS has the same two-factor security model as Monsoon2: a fob is required to access a bastion machine (the “lander”), with a subsequent ssh to the XCS. NEXCS maintains accounting groups (projects) to which users are assigned after receipt of a security fob.
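A minimal sketch of what that two-hop login can look like in `~/.ssh/config` (the hostnames and username below are placeholders, not the real Monsoon/NEXCS addresses; the fob passcode is entered when the bastion prompts for it):

```
# Hypothetical ~/.ssh/config fragment -- hostnames and user are placeholders.
Host lander
    HostName lander.example.invalid    # the bastion; fob passcode prompted here
    User myuser

Host xcs
    HostName xcs.example.invalid       # the XCS itself, reached via the lander
    User myuser
    ProxyJump lander                   # hop through the bastion automatically
```

With this in place, `ssh xcs` performs both hops in one command.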

Current Monsoon2 users need only be added to an appropriate NEXCS group to use the resource.

Contact NCAS-CMS to request access to NEXCS and/or access to specific projects.

ARCHER information about Intel Knights Landing (KNL)

The new generation of Intel Xeon Phi is more widely known as “KNL”. The ARCHER service is commissioning a small test and development system. It is expected to be available to early adopters in late October, with a month of free-of-charge (FoC) use followed by a step up to an AU charge.

More details can be found on this page, in the webinar “Using KNL on ARCHER” (12th October 2016).

ARCHER Training Information

The slides are probably good enough on their own, but there is also a video on the ARCHER YouTube channel.