'Windows is coming,' Addison Snell, CEO of the market research company Intersect360 Research, told the opening press conference of SC10 in New Orleans – and sure enough, on the second day of the meeting, Microsoft announced a battery of new developments and applications of its software designed to encourage scientists and engineers to use Windows-based high performance computing. The physical layout of the meeting also demonstrated one facet of the US supercomputing industry that is usually left unspoken – the extent to which it depends financially on US Government spending, either directly or channelled through the country's extensive network of national laboratories and scientific agencies.
To speed genomics research by allowing researchers to compare genome sequences quickly and cheaply, the company has released NCBI BLAST on Windows Azure, its cloud computing platform. The Basic Local Alignment Search Tool (BLAST) is the standard software used to find similarities between DNA sequences of unknown function and known protein sequences held in reference databases. Microsoft is allowing the Azure-optimised version to be used free of charge by the scientific community and, according to Kyril Faenov, general manager of the company's Technical Computing Group, it is also making a pool of computing hours available to academic users.
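The article does not spell out how such a comparison is invoked, but the kind of search BLAST performs can be illustrated with a short, hypothetical sketch. The example below uses the open-source Biopython library to submit a blastx query to NCBI's public servers – not the Azure-hosted service described above – and the DNA sequence shown is an arbitrary placeholder.

    # Hypothetical sketch: a blastx search compares a DNA sequence of
    # unknown function against known protein sequences, as described in
    # the text. It uses the open-source Biopython library and NCBI's
    # public BLAST servers, not the Azure service discussed here.
    from Bio.Blast import NCBIWWW, NCBIXML

    # Placeholder DNA query; a real run would read this from a FASTA file.
    query_dna = (
        "ATGGCCCTGTGGATGCGCCTCCTGCCCCTGCTGGCGCTGCTGGCCCTCTGGGGACCTGAC"
        "CCAGCCGCAGCCTTTGTGAACCAACACCTGTGCGGCTCACACCTGGTGGAAGCTCTCTAC"
    )

    # blastx translates the DNA query in all six reading frames and
    # searches the 'nr' non-redundant protein database for similar proteins.
    result_handle = NCBIWWW.qblast("blastx", "nr", query_dna)
    record = NCBIXML.read(result_handle)

    # Report the best-scoring matches with their expectation (E) values;
    # lower E values indicate matches less likely to occur by chance.
    for alignment in record.alignments[:5]:
        best_hsp = alignment.hsps[0]
        print(f"{alignment.title[:60]}  E={best_hsp.expect:.2e}")

The Azure-optimised version parallelises exactly this kind of search across many cloud nodes, which is what makes very large comparison runs tractable.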
In conjunction with Microsoft, the University of Washington has already used BLAST on Azure to carry out a sequence-comparison run in just half an hour that would have taken more than a month on the US National Center for Biotechnology Information's (NCBI) own servers. 'Catering for IT managers is not enough,' Faenov said. 'Last year Microsoft decided to look at alternative users and how they can utilise compute power.'
By the end of the year, he said, the company will release a service pack for Windows HPC Server that will integrate seamlessly with Azure and thus permit IT administrators to manage cluster and cloud resources simultaneously. The arrangement will be transparent to end-user scientists, but IT managers need to know what resources are being employed and what they will cost. The Microsoft announcement reflected another theme of the opening press conference – that the lower end of the HPC market was 'cloud-oriented'.
But if users are to have faith in Windows as a viable technology for high performance computing, Faenov believes, Microsoft must be seen to be present at all levels – including the very high end. So he drew particular attention to the fact that the Tokyo Institute of Technology's Tsubame 2.0 supercomputer has surpassed a petaflop of performance using Windows HPC Server. It demonstrates, he said, 'that we understand what supercomputing is about'.
Perhaps surprisingly for a country that prides itself on being the home of free enterprise, the exhibition floor of SC10 is dominated by state-funded agencies, such as NASA, the National Oceanic and Atmospheric Administration, and the National Center for Atmospheric Research, all of which have lavish display stands occupying a large floor area. All the (state-funded) national laboratories also have a very large presence on the show floor, ranging from weapons laboratories such as Lawrence Livermore, Los Alamos and Sandia, to more civilian-oriented laboratories such as Oak Ridge and Argonne.
Oak Ridge in particular was showcasing a US$122M project to develop a 'virtual nuclear reactor', the Consortium for Advanced Simulation of Light Water Reactors (CASL). According to Dr Doug Kothe, director of CASL, it will take computing power at the exascale to develop fully detailed, predictive computer models that simulate nuclear power plant operations. The intention is to use the computer models to reduce capital and operating costs, safely extend the lifetime of existing reactors, and lay the foundations for more efficient reactors in the future. One of the partners in the project is Westinghouse, the major US manufacturer of pressurised-water reactors, whose own proprietary computer codes have already been integrated into the CASL project.