University of Southern California
Jun 2017 - Aug 2018
Dec 2016 - May 2017
Sep 2015 - Jul 2017
Sep 2015 - Nov 2016
Blue Waters Symposium 2016, Jun 15, 2016
Blue Waters Symposium 2015, May 13, 2015
Blue Waters Symposium 2014, May 14, 2014
Christine Goulet: Advances in Physics-based Modeling of High-frequency Ground Motions and Probabilistic Seismic Hazard Analysis using Blue Waters
Blue Waters Symposium 2017, May 18, 2017
Ricardo Taborda: Influence of the Source, Seismic Velocity, and Attenuation Models on the Validation of Ground Motion Simulations
16th World Conference on Earthquake Engineering; Santiago de Chile, Chile, Jan 12, 2017
2015 GPU Technology Conference; San Jose, California, U.S.A., Mar 18, 2015
Blue Waters Webinar Series, Mar 8, 2017
May 28, 2014
The second annual Blue Waters Symposium, held May 13-15, 2014, in Champaign, Ill., gathered many of the country's leading supercomputer users, along with the NCSA staff who support their work, to share what they have learned using Blue Waters and to discuss the future of supercomputing. The three days were filled with what many attendees would later describe as a wonderful variety of science talks and opportunities for networking and collaboration.
May 29, 2015
The 2015 Blue Waters Symposium, held May 10-13 at Oregon's beautiful Sunriver Resort, brought together leaders in petascale computational science and engineering to share successes and methods. Around 130 attendees, many of whom were Blue Waters users and the NCSA staff who support their work, enjoyed presentations on computational advances in a range of research areas—including sub-atomic physics, weather, biology, astronomy, and many others—as well as keynotes from innovative thinkers and leaders in high-performance computing. Over the three days of the symposium, 58 science teams from across the country presented on their work on Blue Waters.
Oct 1, 2014
The National Science Foundation (NSF) has awarded 14 new allocations on the Blue Waters petascale supercomputer at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. Seven of the awards are for new projects.
Aug 26, 2015
Earthquakes occur on a massive scale and often originate deep below the surface of the Earth, making them notoriously difficult to predict. The Southern California Earthquake Center (SCEC) and its lead scientist, Thomas Jordan, use massive computing power made possible by the National Science Foundation (NSF) to improve our understanding of earthquakes. In doing so, SCEC is helping to provide long-term earthquake forecasts and more accurate hazard assessments.
Apr 21, 2010
The northern reaches of the San Andreas Fault have seen their share of major earthquakes in the last century. The 1906 San Francisco earthquake killed more than 3,000 people, and a 1989 quake near Santa Cruz postponed the World Series. The other end of the fault near Los Angeles, meanwhile, hasn't seen a major earthquake since about 1680. But there is a high probability of a rupture over the next two decades. A team of more than 30 earthquake scientists, computer scientists, and other specialists is keenly interested in that next earthquake—what it might look like, what sort of damage it might cause, and what might be done to mitigate that damage. Led by the Southern California Earthquake Center (SCEC), the team plans to use the Blue Waters sustained-petascale supercomputer at NCSA to model it.
Dec 4, 2014
Earthquake risk is a thorny subject scientifically and in society. In 2012, an Italian judge sentenced six Italian scientists and engineers and one government official to six years in prison for downplaying the risk of an impending earthquake in L’Aquila, Italy, in 2009. The charge was manslaughter. The judge held the scientists responsible for 29 of the 309 deaths because he said the scientists failed to properly analyze and explain the earthquake threat in the days leading up to the 6.3-magnitude quake. ... "It's incredible that scientists trying to do their job under the direction of a government agency have been convicted for criminal manslaughter," said geoscientist Tom Jordan to Science magazine when the verdict was handed down. "We know that the system for communicating risk before the L'Aquila earthquake was flawed, but this verdict will cast a pall over any attempt to improve it. I'm afraid that many scientists are learning to keep their mouths shut." Jordan is not one of those quiet scientists. At the Southern California Earthquake Center (SCEC) where Jordan is the lead scientist, the SCEC PressOn project aims to improve long-term earthquake prediction (on the decadal time scale) and modeling as a step toward more specific and accurate earthquake hazard assessments.
Nov 18, 2015
Decision-makers from various sectors now have better knowledge to assess and mitigate earthquake risk owing to high-performance computing research by the Southern California Earthquake Center (SCEC), headquartered at the University of Southern California. On Thursday at the 2015 International Conference for High Performance Computing, Networking, Storage, and Analysis (SC15) in Austin, Texas, SCEC Director Tom Jordan will present an invited talk on the societal impacts of SCEC's research and development in supercomputing. "By using the nation's largest supercomputers, we can now forecast with more accuracy and detail the strong shaking that will come from large earthquakes in Southern California," said Jordan.
Jan 13, 2010
The Blue Waters supercomputer will be one of the most powerful supercomputers in the world for open scientific research when it comes online at Illinois in 2011. How will scientists and engineers across the country use this tremendous resource? How will their research be advanced by a supercomputer that can do 1 quadrillion calculations every second? Many scientists are working now with the Blue Waters team so they will be ready to use the massive sustained-petaflop supercomputer when it arrives. These teams will use Blue Waters to improve our understanding of everything from the Earth's climate to earthquakes.
Nov 18, 2015
When researchers need to compare complex new genomes, or map new regions of the Arctic in high-resolution detail, or detect signs of dark matter, or make sense of massive amounts of functional MRI data, they turn to the high-performance computing and data analysis systems supported by the National Science Foundation (NSF). High-performance computing (or HPC) enables discoveries in practically every field of science -- not just those typically associated with supercomputers like chemistry and physics, but also in the social sciences, life sciences and humanities. The Southern California Earthquake Center (SCEC) and its lead scientist Thomas Jordan use massive computing power to simulate the dynamics of earthquakes. In doing so, SCEC helps to provide long-term earthquake forecasts and more accurate hazard assessments.
Mar 25, 2013
Not every scientific discovery originates in the lab or in the field. Scientists increasingly are turning to powerful new computers to perform calculations they couldn't do with earlier generations of machines, and at breathtaking speed, resulting in groundbreaking computational insights across a range of research fields. ... Last October, NSF inaugurated Yellowstone, one of the world's most powerful computers, based at NCAR in Cheyenne, Wyo., and later this month will dedicate two additional supercomputers: Blue Waters, located at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, and Stampede, headquartered at the Texas Advanced Computing Center (TACC) at The University of Texas at Austin. ... "The computer is excellent in permitting us to test a hypothesis," says Klaus Schulten, a professor of physics at the University of Illinois at Urbana-Champaign, who uses large-scale computing to study the molecular assembly of biological cells, most recently HIV, the virus that causes AIDS. "But if you want to test a hypothesis, you need to have a hypothesis."
Jan 24, 2018
Working with data from the Landers earthquake that shook Southern California in 1992, a team of researchers from San Diego State University has advanced earthquake simulation capability by improving a widely used wavefield simulation code and adapting it for improved 3D modeling and for use on high-end HPC systems. Their research delivered new insight into strike-slip earthquakes such as the magnitude 7.3 Landers earthquake, which leveled homes, sparked fires, cracked roads, and caused one death.
Mar 28, 2018
Thomas Jordan and his team at the Southern California Earthquake Center are delivering seismic intelligence for better earthquake preparedness.