Seid Koric

University of Illinois at Urbana-Champaign

Engineering

2017

Kataruka, A., S. Koric, E. Guleryuz, W. M. Kriven, and A.-T. Akono (2017): Multiscale Strength Homogenization of Geopolymer Composites, Cement and Concrete Composites (in preparation)
Cui, Y., E. Guleryuz, W. M. Kriven, S. Koric, and A.-T. Akono (2017): Elastic and Strength Properties of Geopolymer Precursor via Atomistic Modeling, Physical Review B (in preparation)

2016

Anshul Gupta, Natalia Gimelshein, Seid Koric, and Steven Rennich (2016): Effective Minimally-Invasive GPU Acceleration of Distributed Sparse Matrix Factorization, Springer International Publishing, Euro-Par 2016: Parallel Processing, pp672--683, Grenoble, France
F. A. Sabet, O. Jin, S. Koric, and I. Jasiuk (2016): Nonlinear Micro-CT FE Modeling of Trabecular Bone – Sensitivity of Apparent Response to Tissue Constitutive Law, Journal of the Mechanical Behavior of Biomedical Materials (submitted)
R. I. Barabash, V. Agarwal, S. Koric, I. Jasiuk, and J. Z. Tischler (2016): Finite Element Simulation and X-Ray Microdiffraction Study of Strain Partitioning in a Layered Nanocomposite, Journal of Crystallography, Hindawi Publishing Corporation, Vol 2016, pp1--11
J. Kwack, G. H Bauer, and S. Koric (2016): Performance Test of Parallel Linear Equation Solvers on Blue Waters – Cray XE6/XK7 system, presented at CUG 2016, London, England, U.K.
Seid Koric, and Anshul Gupta (2016): Sparse Matrix Factorization in the Implicit Finite Element Method on Petascale Architecture, Computer Methods in Applied Mechanics and Engineering, Elsevier BV, Vol 302, pp281--292
Vladimir Puzyrev, Seid Koric, and Scott Wilkin (2016): Evaluation of Parallel Direct Sparse Linear Solvers in Electromagnetic Geophysical Problems, Computers & Geosciences, Elsevier BV, Vol 89, pp79--87

2015

M. Giles, S. Koric, and E. Burness (2015): U. S. Industrial Supercomputing at NCSA, Chapman and Hall/CRC Press, Industrial Applications of High-Performance Computing, pp179--194
Sohan Kale, Ankit Saharan, Seid Koric, and Martin Ostoja-Starzewski (2015): Scaling and Bounds in Thermal Conductivity of Planar Gaussian Correlated Microstructures, J. Appl. Phys., AIP Publishing, Vol 117, Num 10, pp104301
Sohan Kale, Seid Koric, and Martin Ostoja-Starzewski (2015): Stochastic Continuum Damage Mechanics Using Spring Lattice Models, Applied Mechanics and Materials, Trans Tech Publications, Vol 784, pp350--357
Vladimir Puzyrev, and Seid Koric (2015): 3D Multi-Source CSEM Simulations: Feasibility and Comparison of Parallel Direct Solvers, SEG Technical Program Expanded Abstracts 2015, Society of Exploration Geophysicists, pp833--838, presented at the 2015 SEG International Exposition and 85th Annual Meeting, New Orleans, Louisiana, U.S.A.

2014

Galen Arnold, Manisha Gajbe, Seid Koric, and John Urbanic (2014): XSEDE OpenACC Workshop Enables Blue Waters Researchers to Accelerate Key Algorithms, Association for Computing Machinery (ACM), Proceedings of the 2014 Annual Conference on Extreme Science and Engineering Discovery Environment (XSEDE '14), pp28:1--28:6, Atlanta, Georgia, U.S.A.
Seid Koric, Qiyue Lu, and Erman Guleryuz (2014): Evaluation of Massively Parallel Linear Sparse Solvers on Unstructured Finite Element Meshes, Computers & Structures, Elsevier BV, Vol 141, pp19--25
Wen Huang, Seid Koric, Xin Yu, K. Jimmy Hsia, and Xiuling Li (2014): Precision Structural Engineering of Self-Rolled-Up 3D Nanomembranes Guided by Transient Quasi-Static FEM Modeling, Nano Letters, American Chemical Society (ACS), Vol 14, Num 11, pp6293--6297

Seid Koric: Engineering Breakthroughs at NCSA
4th International Industrial Supercomputing Workshop; Amsterdam, The Netherlands, Oct 24, 2013

Seid Koric and B. G. Thomas: Visco-plastic Multi-Physics Modeling of Steel Solidification
20th International Symposium on Plasticity and its Current Applications; Freeport, Bahamas, Jan 4, 2014

Natalia E. Gimelshein, A. Gupta, S. C. Rennich, and S. Koric: GPU Acceleration of WSMP
2015 GPU Technology Conference; San Jose, California, U.S.A., Mar 17, 2015

Muris Torlak, Almin Halač, Stefan Roesler, and Seid Koric: DES of Turbulent Flow Around a Sphere with Trip Wire
Eighth International Symposium on Turbulence, Heat and Mass Transfer; Sarajevo, Bosnia and Herzegovina, Sep 17, 2015

Seid Koric, Fereshteh A. Sabet, Ouli Jin, and Iwona M. Jasiuk: Direct Numerical Simulation of Bone Plasticity and Strength
2016 International Symposium on Plasticity and Its Current Applications; Keauhou, Hawai'i, U.S.A., Jan 3, 2016

Industrial Computational Breakthroughs on Blue Waters

In this video from the NCSA Blue Waters Symposium, Seid Koric of the National Center for Supercomputing Applications presents “Industrial Computational Breakthroughs on Blue Waters.”

Blue Waters supercomputer power-user profile: Seid Koric

Every year, many people perform research on Blue Waters, the massive supercomputer hosted by the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. However, very few can say that they've published research on Blue Waters an impressive 22 times, as NCSA's own Seid Koric has done.

Pushing limits

NCSA's Private Sector Program (PSP) has a unique opportunity to work with non-academic codes, impact economic development, and make more complex simulations happen for industry partners, says Merle Giles, PSP director. High-performance computing (HPC) has become a core strategic technology, enabling deeper insight into product performance and improving productivity by allowing more design variants to be considered. But as companies increasingly turn to engineering simulation to ease time, quality, and cost pressures, they have been constrained by available compute power. Both software developers and end users face constraints when it comes to testing the limits of codes: they often don't have access to truly massive supercomputers, and because they are focused on daily business needs, they can't spare the time and manpower to attempt extreme scaling studies. PSP bridges the gap between partnering companies and computing resources and expertise.

NCSA’s Private Sector Program pushes LS-DYNA code to extreme scale on Blue Waters supercomputer

LS-DYNA, an explicit finite element code used for simulations in the auto, aerospace, manufacturing, and bioengineering industries, was recently scaled to 15,000 cores on NCSA’s Blue Waters supercomputer, a world record for scaling of any commercial engineering code. Both software developers and end users face constraints when it comes to testing the limits of commercial codes. They often don’t have access to truly massive supercomputers, and their resources and staff are focused on daily business needs; they can’t spare the time and manpower to attempt extreme scaling studies. NCSA’s Private Sector Program (PSP) is able to bring all of the key components together: LS-DYNA developer LSTC; the petascale Blue Waters supercomputer and its hardware manufacturer, Cray; industrial users with real challenges; and the expertise of PSP’s staff. “Once Blue Waters was in production, we looked for test cases to run at extreme scale,” says Seid Koric, a senior computational resources coordinator with NCSA’s PSP and a University of Illinois adjunct professor of Mechanical Science and Engineering.

Blue Waters Instrumental in Recent Computational Development

Do you still remember linear algebra? Better question: can you still solve a linear system of equations? Here, try one out. We’ll wait.

2x + 3y = 6
6y + 3x = -3

These are the kinds of problems supercomputers solve using sparse matrices, that is, matrices filled mostly with zeros, says Seid Koric, adjunct professor in mechanical science and engineering as well as technical program manager for the Private Sector Program at NCSA.
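For illustration only, here is a minimal sketch of solving that same toy system with a sparse direct solver. The use of SciPy here is an assumption for the example, not something the article mentions; production-scale work of the kind described in the publications above relies on massively parallel solvers such as WSMP instead.

    # Minimal sketch: solve the toy system above with a sparse direct solver.
    # SciPy is assumed purely for illustration; petascale HPC codes use
    # parallel sparse solvers (e.g., WSMP) rather than this.
    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.linalg import spsolve

    # 2x + 3y = 6
    # 3x + 6y = -3   (the article's "6y + 3x = -3", reordered to x, y)
    A = csr_matrix([[2.0, 3.0],
                    [3.0, 6.0]])
    b = np.array([6.0, -3.0])

    x = spsolve(A, b)   # sparse factorization followed by the solve
    print(x)            # [15. -8.]

At supercomputing scale the same idea applies, except the matrix has millions of rows with only a handful of nonzeros per row, which is why exploiting sparsity is essential.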

NCSA Paves a New Way for Using Geopolymers

Dr. Seid Koric, this year’s winner of the Top Supercomputing Achievement award in the annual HPCwire Editors’ Choice Awards, teamed up with NCSA Faculty Fellow and PI Professor Ange-Therese Akono, geopolymers expert Professor Waltraud “Trudy” Kriven, and NCSA research scientist Dr. Erman Guleryuz. Their goal is to understand the impact of nanoporosity on the stiffness and strength of geopolymers via molecular dynamics and finite element modeling.

Scaling Commercial CFD Code at 3 Supercomputing Centers

The year isn’t over yet, and already we’ve seen new records posted for proprietary physics-based simulations on two Cray machines, at the National Center for Supercomputing Applications (NCSA) and at King Abdullah University of Science and Technology (KAUST), successfully pushing the limits of all the available nodes in those large machines.

HPC Centers’ Role in Driving CAE Simulation

Cray systems are installed at many HPC centers around the world, including the National Center for Supercomputing Applications (NCSA) at the University of Illinois, the High Performance Computing Center Stuttgart (HLRS), Oak Ridge National Laboratory (ORNL), and Edinburgh Parallel Computing Centre (EPCC), to name a few. All of these centers have programs in place to work with commercial companies, and increasingly we see these centers enabling leading-edge HPC simulation. HPC centers have become essential to expanding the competitiveness of commercial companies.