At the SC17 conference, held this week in Denver, Colorado, Nvidia announced that it has begun optimising applications for HPC and for the visualisation of HPC workflows, in addition to announcing partnerships with all major cloud providers.
Nvidia also announced updates to its Nvidia GPU Cloud (NGC) platform, which will now provide containerised HPC, deep learning and HPC visualisation applications.
The company is targeting many of the top HPC applications, including RELION, a software package used for cryo-electron microscopy – a technique whose development won the Nobel Prize in Chemistry in 2017.
During the first day of the conference, Nvidia announced that the top 15 HPC applications, and 70 per cent of the top 50, are now optimised for GPU acceleration. The data comes from an Intersect360 report which described Nvidia as 'critical to the future of scientific computing.'
‘GPU computing has reached a tipping point in the HPC market that will encourage continued increases in application optimisation,’ wrote Addison Snell and Laura Segervall of Intersect360.
In a pre-show talk, Nvidia CEO and founder Jensen Huang noted that every major computer maker and cloud service has turned to the Nvidia Volta architecture to accelerate data-intensive workloads.
Nvidia has now begun to containerise HPC and HPC visualisation applications to enable more users to take advantage of GPU acceleration. As part of the NGC container registry, the company has released software and tools that allow scientists to deploy applications and visualisation tools efficiently using cloud-based GPU technology.
The world’s top 15 HPC applications, all GPU accelerated, include GROMACS, ANSYS Fluent, Gaussian, VASP, NAMD, Simulia Abaqus, WRF, OpenFOAM, ANSYS, LS-DYNA, BLAST, LAMMPS, AMBER, Quantum Espresso and GAMESS.
‘Today, one of the biggest market dynamics is the advent of AI,’ noted Intersect360 in the report. ‘Many organisations are looking to deep learning techniques to bring AI advancements to their products, services, or operations. These algorithms often rely on GPUs, to the extent that AI has become a major growth driver for NVIDIA.’