Benoit Roux

University of Chicago

Biochemistry and Molecular Structure and Function

2016

John D. Lueck, Adam L. Mackey, Daniel T. Infield, Jason D. Galpin, Jing Li, Benoît Roux, and Christopher A. Ahern (2016): Atomic Mutagenesis in Ion Channels with Engineered Stoichiometry, eLife, eLife Sciences Publications, Ltd., Vol 5

2014

Wei Jiang, James C. Phillips, Lei Huang, Mikolai Fajer, Yilin Meng, James C. Gumbart, Yun Luo, Klaus Schulten, and Benoît Roux (2014): Generalized Scalable Multiple Copy Algorithms for Molecular Dynamics Simulations in NAMD, Computer Physics Communications, Elsevier BV, Vol 185, Num 3, pp. 908-916

2013

James F. Dama, Anton V. Sinitskiy, Martin McCullagh, Jonathan Weare, Benoît Roux, Aaron R. Dinner, and Gregory A. Voth (2013): The Theory of Ultra-Coarse-Graining. 1. General Principles, J. Chem. Theory Comput., American Chemical Society (ACS), Vol 9, Num 5, pp. 2466-2480

Great Lakes Consortium awards access to Blue Waters supercomputer to 11 research projects

How the flu virus enters a cell in the body. Evaluating the economic policy impacts of potential future climate change. Understanding the dynamics and physics of atomic matter during galaxy cluster formation. These are just a few of the research projects being pursued by the 11 science and engineering teams from across the country that were awarded time on the Blue Waters supercomputer through the Great Lakes Consortium for Petascale Computation. Over a twelve-month period, these teams will share a combined total of more than 4.3 million node-hours on Blue Waters.

U of I, Great Lakes Consortium award Blue Waters resources to 18 research teams

Eighteen research teams from a wide range of disciplines have been awarded computational and data resources on the sustained-petascale Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. Blue Waters is one of the world's most powerful supercomputers, capable of performing quadrillions of calculations every second and working with quadrillions of bytes of data. Its massive scale and balanced architecture enable scientists and engineers to tackle research challenges that could not be addressed with other computing systems.