Supercomputing’s super potential

Having been at the leading edge of supercomputing and its extraordinary growth over the past 30 years, Professor Mark Parsons believes we’re just at the cusp of exploiting its mind-blowing potential.

Image
In conversation with Professor Mark Parsons.

By Derek Main, freelance writer

Five minutes before our interview, Mark Parsons, Professor of High-Performance Computing, emails to apologise that he’s running late. “Sorry about that,” he says as we begin our call (only four minutes past the agreed time), “we’re in the middle of a torrential downpour here, and I had to check on the greenhouse.”

The weather. A disarmingly ordinary subject to begin an interview about the extraordinary power of supercomputing. But, as it soon turns out, extremely relevant to the conversation. “It often surprises people to know that despite all our technological advancements, we still can’t accurately predict clouds, rain and thunderstorms,” Professor Parsons explains as we begin talking about the capabilities and limitations of modern-day supercomputers.

Director of EPCC, the University of Edinburgh’s supercomputing centre, since 2016, Professor Parsons has seen the power of high-performance computing grow exponentially since joining the centre as a junior programmer in 1994.

“When I joined the centre, it was installing its first national supercomputer, the CRAY T3D,” he recalls. “At the time, with its 256 processors, it was the most powerful thing we’d ever seen. The current national high-performance computing service we host, called ARCHER2, has 750,000 processors, and each one is more powerful than all of the CRAY T3D’s processors combined.”

Image
The original ARCHER system, installed in Edinburgh in 2014.

Trains, planes and automobiles

Professor Parsons says supercomputers have an increasingly massive impact on our everyday lives, one that many of us are unaware of.

“Every modern train, plane or automobile we travel in owes its existence to supercomputing,” he states. “Aircraft are certified to fly because supercomputers have run lightning strike simulations on them, and thanks to 30 years of high-performance computing, their engines are now 30% more efficient. And in the past decade we’ve started modelling car crashes on computers, rather than crashing real cars, and have made them significantly safer in the event of an accident.”

Professor Parsons’s own journey from physics undergraduate to supercomputing pioneer was far from linear.

“After graduating from the University of Dundee, I applied for a PhD in computer science at Edinburgh, but was advised to do a master’s in information technology first,” he recalls. “After completing that, rather than going straight into computing, I was offered a PhD in particle physics at CERN, the European Organisation for Nuclear Research.” After four years of studying the differences between quarks and gluons in the Large Electron-Positron Collider in Switzerland, Edinburgh’s call proved hard to resist.

“CERN was one of the most inspirational periods of my life,” says Professor Parsons. “Meeting all of these clever people from all over the world was fantastic. But I decided I was better at computing than physics and knew that Edinburgh, with its pioneering reputation in parallel computing, was the only place to start my career in the field.”

Processing power pioneers 

Born from the ideas of computing pioneers Professor Sir David Wallace, Professor Stuart Pawley and Professor Ken Bowler, the Edinburgh Parallel Computing Centre, as today’s EPCC was then known, was at the forefront of a supercomputing revolution. During the early years of computing, scientists wishing to create a supercomputer would build one around the most potent central processor available. However, because a single processor can only carry out one calculation at a time, their potential was limited.

Image
An office full of computers and workers from the 1970s or 1980s.

“The real step-change occurred in the 1970s and 80s, when scientists at Edinburgh and elsewhere began linking multiple processors together and tasking them with solving small parts of a larger problem in parallel,” recalls Parsons. “That was the birth of parallel computing, and everything that’s come since.”
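
To make that idea concrete, here is a minimal sketch in Python (an illustration only, not code EPCC or the CRAY T3D ever ran) in which one large calculation is divided into chunks, each chunk is handed to a separate worker process, and the partial results are combined at the end:

# Illustrative only: split one big calculation into parts that are
# computed in parallel, then combine the partial results.
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker solves a small part of the larger problem.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    # Divide the work into four interleaved chunks, one per process.
    chunks = [range(i, 10_000_000, 4) for i in range(4)]
    with Pool(processes=4) as pool:
        partials = pool.map(partial_sum, chunks)
    # Adding the partial answers gives the same total a single
    # processor would reach working through the problem serially.
    print(sum(partials))

Real high-performance codes distribute the chunks across thousands of processors on separate machines using message passing rather than a handful of processes on one computer, but the principle is the same.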

After early projects helping the UK Met Office parallelise its weather model, events conspired to set Professor Parsons’s career in an unexpected direction.

“At the end of 1994, almost all of the senior management left to spin out a company,” he says. “The following year, despite hosting the UK’s first national high-performance computing service, the CRAY T3D, we were in difficulties. My colleagues and I knew we had to do something bold to ensure the centre didn’t go under, not just for our sake, but for the future of UK supercomputing and science.

“So, I helped write a bid that brought in a million pounds and gave us a stay of execution, ultimately leading to me becoming EPCC’s Commercial Manager in 1997. But we still had to adapt to survive.”

The centre’s transformation came in the form of a new shortened name, EPCC, and a revamped approach to engaging with industry, academic partners and government. It worked.

In 2002, the UK government signalled its confidence in EPCC by selecting the centre to operate the HPCx service and, from 2007, its successor, HECToR.

Image
The HECToR supercomputer, installed at Edinburgh in 2007.

Hosting successive generations of national high-performance computing services, ARCHER from 2014 and ARCHER2 from 2020, cemented EPCC’s reputation as the UK’s leading supercomputing and data science centre and helped establish Britain as a world power in computational science.

The power to save time, money and lives

Professor Parsons concedes that despite the enormous potential of supercomputing, encouraging companies to use it isn’t always easy.

“Engaging with industry taught me that many problems that companies grapple with are identical to those we’re trying to solve in the scientific community,” he says. “However, supercomputing can be a huge skill, knowledge, and financial hurdle for most organisations, meaning they seldom see it as their first port of call for solutions.”

Nevertheless, Parsons believes these challenges are a hurdle worth leaping: “Using a supercomputer can lead to huge efficiencies and long-term cost savings. The first time a company produces a model of a water pump, for example, it is slow, expensive and requires expertise they don’t have. But from then on, they can test, adapt and replicate that model quickly and more precisely to speed up the product’s development and improve its quality.”

Since 2013, EPCC has helped more than 160 SMEs across Europe overcome these hurdles to harness the power of supercomputing for the first time through the European Commission-funded Fortissimo and Fortissimo 2 projects.

“Companies only generally need significant support at the beginning,” he comments. “Once they’ve acquired the skills and knowledge, they can self-sufficiently use high-performance computing to drive innovation in their processes, products and services.”

Professor Parsons believes healthcare is perhaps the area where supercomputing promises to touch all of our lives the most. Since 2015, EPCC has hosted the Scottish National Safe Haven for anonymised medical, education and other demographic records.

“The Safe Haven has helped deliver the vision of the University’s Usher Institute: to transform health in society by using big data to inform government policy in Scotland and worldwide,” Professor Parsons explains. “Elsewhere, we’ve seen how projects such as DeepMind’s AlphaFold have harnessed high-performance computing to solve a problem that has puzzled scientists for decades – how to work out which molecules can bend and bond together – contributing directly to the development of coronavirus vaccines. This ability to solve problems will transform healthcare.”

A billion billion calculations each second

The staggering exponential growth in computing power makes it challenging to comprehend how large this transformation will be, and how quickly it will come.

“Right now, ARCHER2 works at the petascale, performing twenty million billion calculations a second, which we measure in petaFLOPS,” explains Professor Parsons. “The next generation of supercomputers work at the exascale, which means they carry out a billion billion calculations each second, known as an exaFLOP. If all seven billion humans in the world were to complete one sum per second, 12 hours a day, it would take nine years to complete the same number of calculations as an exascale computer does each second. I find that amazing.”
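
As a rough back-of-the-envelope check of that comparison (a minimal sketch using only the figures quoted above), the arithmetic does indeed come out at about nine years:

# Rough check of the figures in the quote above.
people = 7_000_000_000                  # seven billion humans
sums_per_day = 12 * 60 * 60             # one sum per second, 12 hours a day
human_sums_per_day = people * sums_per_day

exaflop = 10**18                        # a billion billion calculations per second
days = exaflop / human_sums_per_day
print(days / 365)                       # prints roughly 9 (years)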

The endless possibilities for innovation mean Professor Parsons’s zest for high-performance computing and dedication to EPCC remain as strong today as they were at the beginning of his career almost three decades ago.

“Every day is different. One minute I can be talking about big data in healthcare or analysing aircraft engine simulations, and the next talking with an electrician about wiring,” he says. “Working in an international context has excited me since my days at CERN, which is another gratifying part of my day-to-day role, as I regularly speak and exchange ideas with a community of high-performance computing centre directors worldwide.”

Maintaining the UK’s place at the vanguard of innovation

These global connections give Professor Parsons an excellent vantage point on international innovation, from which he is advising the UK government on its decision to invest in Britain’s first national exascale supercomputer service.

Image
Professor Mark Parsons

“The US and China already have exascale computers, Germany is fast developing its capabilities, and within a few years, all major economies will have one,” Professor Parsons explains.

However, he acknowledges this step-change in performance does come with a step-change in investment: “Building and installing an exascale computer will cost around £500 million, but it is essential to maintain the UK’s competitive edge as a science superpower, especially after Brexit. We should be at the vanguard of innovation.”

Professor Parsons believes there is only one place capable of hosting the UK’s first exascale computer: “EPCC is the only site in the UK with the necessary skills, knowledge and hosting environment.”

Moreover, Scotland’s green credentials will help to mitigate exascale computing’s high environmental cost: “This next-generation supercomputer will need around 25 megawatts of electricity, but with almost 100% of its energy generation coming from renewables, and the opportunity to reuse the heat the computer produces, Scotland is the right place to do this.”