New call opens for Tier-2 HPC Open Access

The EPSRC Resource Allocation Panel has opened the call for Open Access to the Tier-2 HPC systems:

The Tier-2 systems are now:

Cirrus HPC System – 10,000 core system based on Intel Xeon Broadwell
GW4 – Test bed for emerging architectures including ARM, GPU and Xeon Phi
CSD3 – 24,000 cores of Intel Xeon Skylake, 342 Intel Xeon Phi and 360 NVIDIA GPUs
HPC Midlands – 14,336 core system based on the Intel Xeon processor
JADE – National GPU facility with 22 NVIDIA DGX-1 Deep Learning systems

Closing date: 12th October 2017, with projects to start before the end of December 2017.

Research Councils UK Global Challenges Research Fund Regional Events, Leeds and Manchester 2017

As reported recently on the N8 website:

The Leeds event on 19th September 2017 will be held at the Radisson Blu Hotel. The whole-day event will cover detailed information on GCRF and ODA compliance, and will include a networking lunch. Speakers from RCUK will be available for questions on the day, and experienced GCRF project leads will present learning points and challenges. The event provides a great opportunity to discuss projects currently under development, and to start collaborations for the next round of calls.

Here is the N8HPC news item:

Link to event ticketing (also available via the link above):
Eventbrite site

JASMIN User Conference under way, 27th–28th June 2017


The JASMIN2017 User Conference can now be viewed from STFC’s public website by selecting “Hosted by Rutherford” and the appropriate viewing platform.

Organised by CEDA and NERC, the second JASMIN Conference on Advanced Computing for Environmental Science will be held at the STFC Rutherford Appleton Laboratory (RAL) on 27th and 28th June 2017. We anticipate around 200 attendees from universities and research institutions across the UK and further afield. The conference will have lectures, talks and posters covering a wide variety of cutting-edge research topics ranging from method development to applications, all pushing the limits of environmental informatics.

ARC1 switch-off 31 May 2017

Just in case you missed it: ARC1 is being turned off at the end of May 2017.

Hello everyone,

This is another reminder that the ARC1 service closes permanently on May 31st 2017.

If you have not yet made arrangements to migrate your work to one of the other clusters, or you do not have an account on one of the other machines, you need to do this as a matter of urgency.

For account applications, see here:

ARC team

Extra HPC provision on Monsoon (NEXCS)

Hi All,
the details are here:

Project resource

Requests for NERC project resource should be made through the regular NERC process – in the early-usage stage (April 2017 – July 2017), access will be open.

Machine access

NEXCS has the same two-factor security model as Monsoon2: a fob is required to access a bastion machine (the “lander”), with a subsequent ssh to the XCS. NEXCS maintains accounting groups (projects), to which users are assigned after receipt of a security fob.

Current Monsoon2 users need only be added to an appropriate NEXCS group to use the resource.

Contact NCAS-CMS to request access to NEXCS and/or access to specific projects.
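The two-hop access path described above (fob-protected lander, then ssh to the XCS) can be captured in an ~/.ssh/config fragment so that a single ssh command reaches the compute system. The hostnames and user ID below are illustrative placeholders, not the actual NEXCS addresses:

```
# ~/.ssh/config sketch -- hostnames/userid are placeholders, not the real NEXCS ones
Host nexcs-lander
    HostName lander.example.ac.uk   # the fob-protected bastion ("lander")
    User useridC

Host nexcs-xcs
    HostName xcs.example.ac.uk
    User useridC
    ProxyJump nexcs-lander          # hop via the lander automatically
```

With this in place, `ssh nexcs-xcs` would prompt for the fob-based credentials on the lander and then continue to the XCS in one step.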

UM Users Monsoon Switch-over has happened

We had plenty of warning.
The umsystems team have consolidated the “lander” user IDs of those of us with Met Office internal Unix IDs. On the new Monsoon (xcs-c), those user IDs will have no data available unless it was transferred from the previous user ID; that includes all data in the /projects/project/useridB working directories (DATADIR). The earlier user ID is referred to as useridB and the consolidated user ID as useridC.
I had to set up a blank key on XCM to add to the authorized_keys on XCSC to allow password-less transfer of data.
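The blank-key setup can be sketched as follows; the key name is illustrative, it is generated in a temporary directory so nothing of yours is overwritten, and the commented ssh-copy-id line (with placeholder user ID and hostname) is just one way of appending the public half on the remote side:

```shell
# Sketch of creating a passphrase-less ("blank") key, as described above.
keydir=$(mktemp -d)
ssh-keygen -t rsa -N "" -f "$keydir/id_rsa_xcsc" -q
# On XCSC the public half would then be appended to ~/.ssh/authorized_keys,
# for example (userid/hostname are placeholders):
#   ssh-copy-id -i "$keydir/id_rsa_xcsc.pub" useridC@xcslc0
ls "$keydir"
```

`-N ""` is what makes the key “blank” (no passphrase), which is what allows the subsequent password-less transfers.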

For the last week I have transferred data from XCM to XCSC by pushing with:
scp -p A useridC@xcslc0:A
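The same push pattern extends to whole directories with `-r`. Here is a local demonstration of the flags; in practice the destination would be remote, e.g. useridC@xcslc0 followed by a target path (user ID and path are placeholders):

```shell
# Local demonstration of a recursive, timestamp-preserving scp copy.
# A real push would target a remote host, e.g. useridC@xcslc0:<path>.
src=$(mktemp -d); dst=$(mktemp -d)
echo "model output" > "$src/run.log"
scp -q -pr "$src" "$dst/copy"   # -p preserves times, -r recurses, -q is quiet
cat "$dst/copy/run.log"
```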

Now I find I should also have made a reverse key (oops), as I can only log into my earlier user ID (Monsoon terminology useridB) with a password, which I do not know.

This link is to instructions on the collaboration Twiki, which requires a separate login (which hopefully you already have).
Monsoon Twiki

ARC1 to be switched off in May 2017

Advanced Research Computing (ARC) have announced that ARC1 is in its final days and will be switched off after 31st May 2017.

That is only 13 weeks away (from this 21st February posting). Is that long enough for you to ensure your codes will work?

You should consider migrating your codes, workflows and data to another local HPC system, such as ARC2, ARC3, MARC1, or the N8 machines (Polaris or HERC1).

You can ask ARC about the arrangements at

Another thing to note: the system is approximately 8 years old, and any failed parts are being replaced with parts of a similar age, so the mean time between failures is shortening. I believe the system availability has been around 60% over the last 12 months.

NERC HPC application for resource deadline approaching (28th Feb 2017)

The deadline for submitting an application to use ARCHER for 2017/2018 is early next week. Our concern is this statement on the NERC site:
“Any applications received after this date will only be considered by the HPC Steering Committee if time permits.”

See:

For example, there are two categories of ARCHER use:
(i) running under n02-chem (which is managed by Grenville Lister), used mainly by aerosol and chemical-composition groups;
(ii) running under a project account related to a specific project or proposal.

A) For a project similar to those within n02-chem, with an AU budget of less than 50 MAU per year:
it can run under n02-chem; you just need to send a brief outline of the work to Grenville.
B) If the user needs more than 50 MAU per year, they need to fill out a new-project form and apply for a project code. This is fairly laborious. If you did this last year and want the project to continue, you need to fill in a continuation form.

If you want some guidance on calculating the AU budget prediction, let CEMAC know and we can talk you through that section.
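As a rough illustration of the arithmetic involved, an AU estimate is typically cores × hours × number of jobs × a machine-specific AU rate. All of the numbers below, including the AU-per-core-hour rate, are placeholders; use the published ARCHER rates and your own job profile in a real application:

```shell
# Back-of-envelope AU estimate. ALL values here are placeholders --
# substitute the published ARCHER AU rate and your own job sizes.
cores_per_job=240        # e.g. 10 nodes of 24 cores
hours_per_job=12
jobs_per_year=100
au_per_core_hour=1       # placeholder rate, NOT the real ARCHER figure
echo $(( cores_per_job * hours_per_job * jobs_per_year * au_per_core_hour ))
```

With these placeholder figures the estimate comes out at 288000 AUs per year; the point is only the shape of the calculation, not the numbers.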