Petascale Computing Institute
Friday


Software Engineering Best Practices
Presenter: Michael Heroux, Sandia National Laboratories
Abstract:
Scientific software has emerged as an essential discipline in its own right. Because computational models, computer architectures, and scientific software projects have become extremely complex, the Computational Science & Engineering (CSE) community now has a unique opportunity, and an implicit mandate, to address pressing challenges in scientific software productivity, quality, and sustainability. This presentation will focus on software engineering best practices and on efforts in the CSE community to foster natural incentives for improving software quality on the way to improving computational science.

Survey of Visualization Resources
Presenter: Anne Bowen, TACC
Abstract:
Visualization of large data sets on HPC resources requires some special considerations. In this session, we will survey the visualization applications available to scientists on TACC resources, summarize best practices and strategies for visualizing large data sets, and provide a quick-start guide to remote visualization.

Resources at PSC
Presenter: Nick Nystrom, PSC

Resources at SciNet
Presenter: Marcelo Ponce, SciNet

Resources at TACC
Presenter: Tim Cockerill, TACC

Containers
Presenter: Joe Allen, TACC
Abstract:
Docker and Singularity are products for delivering container solutions. Containers standardize the process of packaging software together with its dependencies, so that issues caused by differences in environment (e.g., different versions of libraries) between the development and release phases of a software project can be mitigated. In doing so, containers simplify software installation by providing an easy way to share working environments, which in turn makes it easy to use libraries with complicated setups. Distributing a Docker or Singularity image means distributing both the code itself and the environment used to run that code, allowing images to run identically on OSX, Windows, and Linux, so that generated analyses will run on any other machine and produce reproducible results.
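As a minimal sketch of the packaging idea described above (the image base, package versions, and script name are hypothetical, not taken from the session), a Dockerfile declares both the code and the environment it runs in:

```dockerfile
# Hypothetical example: package an analysis script with pinned dependencies.
FROM python:3.8-slim

# Install the exact library versions the analysis was developed against,
# so the runtime environment matches the development environment.
RUN pip install numpy==1.17.2 scipy==1.3.1

# Copy the analysis code into the image.
COPY analyze.py /app/analyze.py
WORKDIR /app

# The container then runs the same way on any host with Docker installed.
CMD ["python", "analyze.py"]
```

Because the library versions are recorded in the image rather than assumed from the host, the "works on my machine" class of problems largely disappears.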

Cloud Computing has become a mainstay platform for providing Information Technology services: storage, databases, networking, software development, and data analytics, to just name a few. The cloud has become very useful in application development, data recovery, hosting, audio and video streaming, software on demand, and machine learning. As such, it has spawned off multiple new services for the technology community which include Infrastructure-as-a-service (IaaS), Platform-as-a-service (PaaS), and Software-as-a-service (SaaS). Why is there a need to implement a cloud environment for computational science? Scientific data is only valid if it can be replicated. The end goal is to teach how to build a producible, scalable environment for high throughput and high performance scientific computing. Docker and Singularity containers are a key piece of technology to achieve this end goal. Hence, with containers, one can write applications that are portable across different cloud computing platforms such as Amazon AWS, Google cloud, Microsoft Azure, and Jetstream (an NSF funded cloud offering for open-science [1]) as well as packaging and importing software stacks to use on an many HPC resources.
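The portability claim above can be illustrated with a short command sketch (the image name `username/analysis:1.0` is hypothetical): the same image published to a Docker registry can be run with Docker on a cloud VM, or pulled and run with Singularity on an HPC system where Docker is typically unavailable.

```shell
# On a cloud VM (AWS, Google Cloud, Azure, Jetstream) with Docker installed:
docker pull username/analysis:1.0    # hypothetical image name
docker run username/analysis:1.0

# On an HPC system with Singularity: pull the same Docker image and
# convert it to a Singularity image file, then run it without root.
singularity pull docker://username/analysis:1.0
singularity run analysis_1.0.sif
```

Singularity's ability to consume images directly from Docker registries is what lets one packaged environment serve both cloud and HPC deployments.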

Wrap-up and Adjourn
Presenter: Scott Lathrop, NCSA