Scientific Software Days 2010

The Texas Advanced Computing Center, the University of Texas Institute for Geophysics, and the University of Texas Bureau of Economic Geology are organizing the 2010 Scientific Software Days, where the scientific community can share experiences in developing software and learn about new developments in general-purpose scientific software.

Invited talks:

Steve Easterbrook (University of Toronto)
"Can you trust the code? An analysis of the software quality of global climate models"

In this talk, I will compare the software development practices used by climate modelers with those used across the software industry for commercial and open source applications. I will explore the particular characteristics that make some standard software engineering practices inappropriate for climate modeling, and assess how effective existing modeling practices are at avoiding coding errors. I will illustrate the talk with examples from our studies of model development at the Hadley Centre in the UK and NCAR in the US.

Bio: Steve Easterbrook is a professor of computer science at the University of Toronto. He received his Ph.D. (1991) in Computing from Imperial College in London (UK), and was a lecturer at the School of Cognitive and Computing Sciences, University of Sussex, from 1990 to 1995. In 1995 he moved to the US to lead the research team at NASA's Independent Verification and Validation (IV&V) Facility in West Virginia, where he investigated software verification on the Space Shuttle Flight Software, the International Space Station, the Earth Observing System, and several planetary probes. He moved to the University of Toronto in 1999. His research interests range from modelling and analysis of complex software systems to the socio-cognitive aspects of team interaction, including communication, coordination, and shared understanding in large software teams. He has served on the program committees for many conferences and workshops in Requirements Engineering and Software Engineering, and was general chair for RE'01 and program chair for ASE'06. In the summer of 2008, he was a visiting scientist at the UK Met Office Hadley Centre.

William Stein (University of Washington)
SAGE Tutorial

http://wstein.org/talks/20100510-texas/

Contributed talks:

Michael Gonzales (Texas Advanced Computing Center)
iPlant: Building the Cyberinfrastructure to Tackle the Grand Challenges in Plant Science

The iPlant Collaborative (iPlant) is an NSF-funded project designed to foster the development of a diverse, multidisciplinary community of scientists, teachers, and students to facilitate significant advances in the understanding of plant science. Through the application of advanced computational approaches, iPlant aims to create the cyberinfrastructure (CI) needed to address Grand Challenge problems in plant biology. The iPlant Collaborative will achieve this vision through a number of strategic efforts, including the following objectives:

- Adopt and create the best and most appropriate CI for addressing existing and future plant science questions.
- Promote computational thinking and approaches within the plant science and education communities by stimulating collaborations, developing appropriate tools, and supporting training.

A key component of this CI is the Discovery Environment (DE), which provides a modern, common web interface and platform to expose the computing, data, and application resources made available to the community. Through the DE, scientists will have access to tools built by iPlant, existing informatics applications, and many additional community-contributed tools. Discovery Environments are designed to facilitate data exploration and scientific discovery. The component model and framework will enable both fixed and user-configurable workflows. Support for high performance computing will provide seamless, behind-the-scenes access to computing resources on TeraGrid and other large systems for some applications, as well as more fine-grained control through command-line tools for advanced users. Provenance tracking of both primary and derived files will make experiment tracking and reproducibility possible. Collaboration tools will enable users to share data, workflows, analysis results, and data visualizations. iPlant's cyberinfrastructure provides a platform that will serve as the basis for all future development, and its application programming interfaces will allow the community to integrate tools into Discovery Environments.

Paul Navratil (Texas Advanced Computing Center)
Graphics hardware acceleration for HPC codes: what can CUDA do for me?

Because of their incredible price/performance ratio, programmable graphics processing units (GPUs) are an increasingly effective means of accelerating scientific computations in areas such as chemistry, biology, medicine, and data analysis. GPU kernels for problems well suited to GPU architectural constraints have achieved order-of-magnitude speed-ups over comparable CPU kernels. However, not all problems have a decomposition suitable for GPU processing, and creating an efficient GPU kernel is non-trivial. In this introductory talk, I will present an overview of modern GPU hardware and the implications of the hardware constraints on problem decomposition and kernel design. I will also discuss Longhorn, TACC's newest GPU-accelerated machine, and some of its current projects that utilize GPU acceleration. Time permitting, I will present recent GPU-accelerated results for HPC codes.
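For readers unfamiliar with the programming model this talk covers, the following is a minimal, illustrative CUDA example (simple vector addition, not drawn from the talk or from any TACC code). It sketches the one-thread-per-element decomposition and the explicit host-to-device data movement that kernel design must account for.

#include <cstdio>
#include <cuda_runtime.h>

// Illustrative kernel: each GPU thread computes one element of c = a + b.
__global__ void vec_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                       // guard threads past the end of the array
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host-side input and output arrays.
    float *h_a = (float *) malloc(bytes);
    float *h_b = (float *) malloc(bytes);
    float *h_c = (float *) malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Device-side copies: explicit host<->device transfers are part of the cost.
    float *d_a, *d_b, *d_c;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch many lightweight threads, grouped into blocks of 256.
    const int threads_per_block = 256;
    const int blocks = (n + threads_per_block - 1) / threads_per_block;
    vec_add<<<blocks, threads_per_block>>>(d_a, d_b, d_c, n);

    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", h_c[0]);   // expect 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}

Problems with this kind of regular, data-parallel structure map naturally onto GPU hardware; irregular or tightly coupled computations require much more careful decomposition, which is one of the themes of the talk.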

Masa Prodanovic (Department of Petroleum and Geosystems Engineering, UT Austin)
LSMLIB Library and LSMPQS Software for Level Set Method Based Simulation of Interface Motion

Wolfgang Bangerth (Department of Mathematics, Texas A&M)
deal.II -- Software for Advanced Finite Element Simulations

In order to convincingly convey the results of current research in numerical methods, the codes that implement these methods must be able to solve realistic problems in 2d and 3d, resolve local features, and have competitive speed. The implementation of such codes strains what individual researchers can achieve within the lifetime of typical academic projects, and has grown beyond what can be expected of graduate students. As a consequence, the gap between methods research on the one hand and application codes on the other is widening: academic research is often not taken seriously by applications researchers because new methods are not demonstrated using test cases that are routinely solved by existing application-specific codes. On the other hand, actual applications are not used in testing methods because today's test cases and benchmarks are too complex to be realized in most research projects. A solution to this dilemma is for widely available software libraries to provide example programs that already solve realistic applications. These well-documented codes can then be used as a baseline to develop new methods such as better discretization schemes. In this contribution, we will outline one such library, the widely used deal.II Open Source finite element library.
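To give a flavor of what an entry-level example program in such a library looks like, here is a minimal sketch loosely modeled on deal.II's introductory step-1 tutorial (not part of the talk; include paths and output routines vary between deal.II releases). It builds a coarse mesh of the unit square, refines it, and writes it out for inspection.

#include <deal.II/grid/tria.h>
#include <deal.II/grid/grid_generator.h>
#include <deal.II/grid/grid_out.h>
#include <fstream>

using namespace dealii;

int main()
{
  // Create a coarse mesh of the unit square and refine it globally.
  Triangulation<2> triangulation;
  GridGenerator::hyper_cube(triangulation);
  triangulation.refine_global(4);

  // Write the resulting mesh to a file for visual inspection.
  std::ofstream out("grid.eps");
  GridOut grid_out;
  grid_out.write_eps(triangulation, out);

  return 0;
}

The library's tutorial programs build on this pattern step by step, adding finite element spaces, assembly, linear solvers, and adaptive refinement on the way to realistic applications.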

Date and Venue

May 10, 2010
J.J. Pickle Research Campus, ROC 1.603
10100 Burnet Rd., Austin, TX
Scientific Software Days 2010 will be a one-day event, featuring lectures in the morning and a tutorial on Sage in the afternoon.

Organizers

Host Institutions

Texas Advanced Computing Center
Jackson School of Geosciences