May 2019 - Jul 2019
Aug 2016 - Jan 2018
May 2016 - May 2017
Apr 2015 - Jun 2016
Apr 2013 - Sep 2016
Blue Waters Symposium 2019, Jun 4, 2019
Blue Waters Symposium 2018, Jun 6, 2018
Blue Waters Symposium 2018, Jun 4, 2018
Blue Waters Symposium 2017, May 17, 2017
Conference for Undergraduate Women in Physics (CUWiP); Wayne State University, Detroit, Michigan, U.S.A., Jan 14, 2017
Blue Waters Symposium 2016, Jun 13, 2016
Blue Waters Symposium 2015, May 13, 2015
NCSA Colloquium, Mar 13, 2015
Blue Waters Symposium 2014, May 13, 2014
Feb 11, 2019
By employing their Enhanced Halo Resolution modeling technique on Blue Waters, one of the most powerful supercomputers in the world, Hummels and his research team are able to account, more accurately than ever before, for the cool hydrogen gas that is expelled in vast quantities into galactic outskirts following galaxy formation.
Jan 23, 2019
A team of international scientists has investigated the birth of supermassive black holes in the early universe and found a new mechanism that might have triggered their formation: the rapid growth of dark matter halos.
Nov 20, 2017
Donna Cox, director of the Advanced Visualization Lab at the National Center for Supercomputing Applications (NCSA) at the University of Illinois, took home the Best Scientific Visualization and Data Analytics Showcase Award at SC17 in Denver. The visualizations were created by Donna Cox, Robert Patterson, Stuart A. Levy, Jeffrey D. Carpenter, AJ Christensen and Kalina M. Borkiewicz. The scientific simulation was run on the Blue Waters supercomputer by Michael Norman & Hao Xu, UC San Diego; Brian O’Shea, Michigan State U.; John Wise, Georgia Tech; Kyungjin Ahn, Chosun U.
Aug 29, 2016
In this video from the 2016 Blue Waters Symposium, Brian O’Shea from Michigan State University presents: Simulating the Earliest Generations of Galaxies with Enzo and Blue Waters.
Oct 20, 2015
Why do we care what happened 13 billion years ago? A bold lead question for an interview with an astrophysicist looking at the early universe, but one that doesn’t seem to faze Brian O’Shea. The Michigan State University professor just smiles across the Skype connection and then chuckles. ... O’Shea is no stranger to supercomputing or NCSA, dating back to his days as a student at the University of Illinois at Urbana-Champaign. As leader of a Petascale Computing Resource Allocations (PRAC) team that includes co-principal investigator Michael Norman and Hao Xu at the University of California, San Diego, John Wise of Georgia Tech, and Britton Smith of the University of Edinburgh, O’Shea’s been able to explore early galaxy formation and evolution. The team has published more than 16 papers, primarily in the Astrophysical Journal.
Jul 2, 2015
It looks like we might have overestimated how many neighbors we have. New predictions show that the universe might be an emptier place than we imagined. Since Hubble launched, we’ve been seeing stunning images of the crowded universe. Most of the images come accompanied by assurances that what we see in them is just the start. Astronomers have been excitedly guessing at the number of faint, distant galaxies that they can’t see. Lurkers surely outnumbered visible galaxies. New simulations done on Blue Waters, a supercomputer at the National Center for Supercomputing Applications, indicate that that isn’t the case.
Jul 1, 2015
There may be far fewer galaxies farther out in the Universe than might be expected, suggests a new study based on simulations conducted using the Blue Waters supercomputer at the National Center for Supercomputing Applications, with resulting data transferred to SDSC Cloud at the San Diego Supercomputer Center at the University of California, San Diego, for future analysis. The study, published this week in the Astrophysical Journal Letters, shows the first results from the Renaissance Simulations, a suite of extremely high-resolution adaptive mesh refinement (AMR) calculations of high-redshift galaxy formation. "Our work suggests that there are far fewer faint galaxies than one could previously infer," said principal investigator and lead author Brian W. O'Shea, an associate professor at Michigan State University with a joint appointment in the Department of Computational Mathematics, Science and Engineering; the Department of Physics and Astronomy; and the National Superconducting Cyclotron Laboratory.
May 29, 2015
The 2015 Blue Waters Symposium, held May 10-13 at Oregon's beautiful Sunriver Resort, brought together leaders in petascale computational science and engineering to share successes and methods. Around 130 attendees, many of whom were Blue Waters users and the NCSA staff who support their work, enjoyed presentations on computational advances in a range of research areas—including sub-atomic physics, weather, biology, astronomy, and many others—as well as keynotes from innovative thinkers and leaders in high-performance computing. Over the three days of the symposium, 58 science teams from across the country presented on their work on Blue Waters.
Mar 13, 2015
Nine research teams from a wide range of disciplines have been awarded computational and data resources on the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. Blue Waters is one of the world’s most powerful supercomputers, capable of performing quadrillions of calculations every second and working with quadrillions of bytes of data. Its massive scale and balanced architecture enable scientists and engineers to tackle research challenges that could not be addressed with other computing systems.
Mar 25, 2013
Not every scientific discovery originates in the lab, or from the field. Scientists increasingly are turning to powerful new computers to perform calculations they couldn't do with earlier generation machines, and at breathtaking speed, resulting in groundbreaking computational insights across a range of research fields. ... Last October, NSF inaugurated Yellowstone, one of the world's most powerful computers, based at NCAR in Cheyenne, Wyo., and later this month will dedicate two additional supercomputers: Blue Waters, located at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, and Stampede, headquartered at the Texas Advanced Computing Center (TACC) at The University of Texas at Austin. ... "The computer is excellent in permitting us to test a hypothesis," says Klaus Schulten, a professor of physics at the University of Illinois at Urbana-Champaign, who uses large-scale computing to study the molecular assembly of biological cells, most recently HIV, the virus that causes AIDS. "But if you want to test a hypothesis, you need to have a hypothesis."
Aug 27, 2012
Visualization has been an integral part of NCSA since its beginning. That tradition continues with the Blue Waters project. While the early visualization work is mostly concerned with testing software functionality and performance, says Dave Semeraro, Blue Waters visualization team leader, it is providing tantalizing glimpses of the science. As Petascale Computing Resource Allocations (PRAC) teams exercise the Blue Waters Early Science System (BW-ESS), NCSA staff members are performing application tests to ascertain system and application performance. These tests, done in collaboration with BW-ESS users, have produced datasets that are, in turn, being used by the Blue Waters visualization staff to test scalable visualization tools.
Apr 3, 2012
With the first portion of the Blue Waters petascale supercomputer now being used by six science teams, the Blue Waters visualization team is using scalable visualization software to produce some of the first science images from the Blue Waters project. NCSA staff members are performing application tests to ascertain Blue Waters' system and application performance. These tests, done in collaboration with the early science users, have produced datasets that are in turn being used by the Blue Waters visualization staff to test scalable visualization tools. Such tools enable science teams to explore the very large volumes of data they will produce on the full Blue Waters system. While the early visualization work is mostly concerned with software functionality, it is providing tantalizing glimpses of the early science.
Mar 20, 2012
Six research teams have begun using the first phase of the Blue Waters sustained-petascale supercomputer to study some of the most challenging problems in science and engineering, from supernovae to climate change to the molecular mechanism of HIV infection. The Blue Waters Early Science System, which is made up of 48 Cray XE6 cabinets, represents about 15 percent of the total Blue Waters computational system and is currently the most powerful computing resource available through the National Science Foundation.
Dec 9, 2009
Many scientists are working now with the Blue Waters team so they are ready to use the massive sustained-petaflop supercomputer when it comes online in 2011. These teams will use Blue Waters to improve our understanding of tornadoes, earthquakes, the spread of contagious diseases, the formation of galaxies, the behavior of molecules and more.
Sep 1, 2009
"In the planning world, we work with policymakers to design studies of particular outcomes," says Virginia Tech's Keith Bisset. Months of planning, collaboration, and modeling might go into strategies for what a city, county, or entire country might do when facing a disease outbreak. "But now we also have tools that allow for a quick turnaround. We can do a situational assessment that shows them what a particular [outbreak] might look like tomorrow or next week as it unfolds. They describe the situation, and we can tell them the outcomes of various interventions," he says. This spring Bisset and a group from Virginia Tech joined forces with the Pittsburgh Supercomputing Center's Shawn Brown and Douglas Roberts and Diglio Simoni of North Carolina's Research Triangle Institute to win one of the first Petascale Computing Resource Allocations awards. With that support and with computing time on Blue Waters, they expect to model global epidemics, as well as smaller-scale outbreaks. Instead of looking at a few hundred million people, as the team members do with their current codes, they'll look at more than 6 billion people.