NSF News

Preparing for the future today: Expanding U.S. researchers' access to advanced computing systems


Access to advanced computational capabilities like supercomputers and cloud computing allows researchers to make revolutionary discoveries at scales ranging from the subatomic to the cosmological. For the U.S. to maintain its economic competitiveness and its position as a global leader in innovative research and education, scientists and engineers need even greater access to these capabilities.

The National Science Foundation provides researchers across the country with remote access to advanced capabilities, creating opportunities that would otherwise be impossible for those whose home institutions lack such resources. Additionally, the agency strives to expand the capacity of U.S. scientific computing resources and democratize researcher access to game-changing artificial intelligence computing power.

"NSF's longstanding strategy is to provision powerful computing capabilities that drive U.S. computational- and data-intensive research across all of science and engineering," said Amy Friedlander, acting office director for the Office of Advanced Cyberinfrastructure. "That is why today, we are announcing $40 million in funding to expand our portfolio of innovative computational resources that take advantage of rapidly changing technologies."

Three new projects focus on providing reliable, large-capacity, open-access resources for the research community. These resources lower the barrier to entry for advances in computing while also supporting training, outreach and partnerships to help create a broader, more diverse next-generation scientific workforce.

  • Jetstream 2, at Indiana University, facilitates transitions between academic and commercial cloud computing.
  • Delta, at the University of Illinois Urbana-Champaign, leverages modern, web-based interfaces to deliver real-time computing.
  • Anvil, at Purdue University, brings advanced computing power to new users in diverse scientific domains through integrated, interactive computing and desktop environments.

In addition, NSF is providing researchers with innovative testbeds to empower AI research across a wide range of science and engineering fields. The testbeds allow researchers to use tools like deep learning – neural networks that can gain insights by analyzing data with minimal supervision by human operators – to stimulate innovation. These projects deliver faster data analysis and "training" of AI computer models through simulations so systems can learn and operate more independently.

  • Voyager, at the University of California San Diego, gives researchers the opportunity to explore well-established deep-learning frameworks. Researchers can also develop their own AI techniques using software tools and libraries built specifically for Voyager's innovative architecture.
  • Neocortex, at Carnegie Mellon University, vastly shortens the time required for "training" deep-learning tools, fostering greater integration of deep-learning AI models into scientific workflows, and providing revolutionary hardware for developing more efficient algorithms for AI.

Twenty-first century science and engineering research is being transformed by the increasing availability and scale of computation and data. NSF is evolving computing systems in response to changing applications and technology landscapes, driven by science needs and community inputs.