David Gallaher is a Geoscientist and Manager of Information Technology Services, National Snow and Ice Data Center (NSIDC), Cooperative Institute for Research in Environmental Sciences (CIRES), University of Colorado, Boulder, CO.
He is leading the technical evolution of NSIDC systems and architecture to meet the needs of the center's scientific communities and stakeholders. At the same time, he is focusing on evolving NSIDC's internal and external systems integration, and on refining technologies and infrastructures to be more user-friendly, efficient, cost-effective, and scalable, while continuing to support the center's core data ingest and distribution functions.
He is currently developing a “green data center” at NSIDC with the goal of reducing the power consumption for cooling by 95%. Dave is the lead investigator on an NSF grant to design and prototype a process (through the creation of "Data Rods") for addressing time-series data as pure objects, enabling time-centric change analysis of massive multi-modality cryospheric data. He is also the Project Manager on a project to build web services and an analysis application for determining changes to the Greenland ice sheet. His latest project is to recover 1960s Nimbus satellite data to determine the sea ice extent during that decade.
Since the introduction of programming languages such as Fortran, scientific data users have been liberated to think beyond memory registers and 1s and 0s. These higher-level languages better capture the semantics of the scientific application domain: numbers and formulae. Object-oriented techniques have extended the ability to represent higher-level abstractions through encapsulation of data structures and operations. Numerous data models have emerged but fail to achieve broad adoption outside a specific domain.
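The encapsulation idea above can be sketched in a few lines. This is a hypothetical illustration (the class and method names are invented, not taken from any of the systems discussed): an object bundles a time-series data structure with the operations on it, so users work with domain semantics rather than raw arrays.

```ruby
# Hypothetical sketch: encapsulating a time series (data + operations).
class TimeSeries
  def initialize(times, values)
    @samples = times.zip(values).to_h   # internal representation stays hidden
  end

  # Domain-level operation: restrict the series to a time window.
  def subset(from, to)
    kept = @samples.select { |t, _| t >= from && t <= to }
    TimeSeries.new(kept.keys, kept.values)
  end

  # Domain-level operation: average of the sampled values.
  def mean
    @samples.values.sum / @samples.size.to_f
  end
end

series = TimeSeries.new([0, 1, 2, 3], [10.0, 20.0, 30.0, 40.0])
puts series.subset(1, 2).mean
```

The point is that callers see domain-level operations (subsetting, statistics) while the internal representation remains hidden and free to change.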
Speaker Description:
Doug Lindholm began his career in science informatics while completing his Masters Degree in Astrophysics at CU in the early 1990s. In addition to feeling the pain of scientific data access, he contributed to NASA's Astrophysics Data System. After CU, Doug held three different software engineering positions within UCAR. He has since come full circle, returning to CU just over five years ago, to the Laboratory for Atmospheric and Space Physics (LASP) where he manages the science data system of the SORCE spacecraft, develops data access technologies, and does research on scientific data modeling.
Part 3 of the Fortran standard defines a preprocessor. An implementation is available for use at NCAR; it fully supports Part 3 of the standard and adds some extensions useful to Fortran programmers.
Speaker Description:
Daniel Nagle is the chair of the PL22.3 (formerly J3) Fortran Standards Committee. He received his PhD in Computational Science from GMU and now works in UCAR’s Consulting Services Group. He has been using and teaching Fortran since the '60s and has been parallel programming in Fortran and other languages since the '80s.
William Putman, Kaushik Datta, Amidu Oloso, Thomas Clune
Speaker Description:
Throughout Kaushik Datta's career, he has focused on extracting performance from the latest computer architectures. As a Ph.D. student at U.C. Berkeley, he studied how to use auto-tuning to best exploit cache-based multicore processors. He then put that knowledge to use at Reservoir Labs, a small compiler company based in New York. He has since shifted to working at NASA (through Northrop Grumman), where he is now looking at how to best exploit more exotic architectures, including GPUs and Intel's MIC architecture, for simple climate kernels.
J. Collins, D. Harper, M. McNulty, J. Lacy, L. Lopez, S. Lewis, M. Liu, S. Reed, I. Truslove, H. Wilcox
Speaker Description:
All of the authors/speakers are software engineers currently working for the National Snow and Ice Data Center (NSIDC), whose main goal is to support research into our world’s frozen realms: the snow, ice, glaciers, frozen ground, and climate interactions that make up Earth’s cryosphere. Our software engineering team supports this goal by creating custom software to fulfill the center’s need for managing and distributing data, creating tools for data access, supporting data users, performing scientific research, and educating the public about the cryosphere. The individuals on our software team come from varied backgrounds within the software industry, with experience levels ranging from graduate student to 15+ years in the field.
Most developers have heard of Test-Driven Development (TDD), but its adoption is less widespread. This presentation outlines the philosophy, concepts, and tools your team needs to completely test-drive your products efficiently, from the front end down. It will define what unit tests and TDD are, and cover acceptance testing and ATDD with Cucumber, behavior-driven development (BDD) and various test structures, mock objects, and fluent matchers. Examples will be in Java, JavaScript, and Ruby.
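As a taste of the style of example the talk covers, here is a minimal TDD sketch in Ruby using the standard minitest library; the function and test are invented for illustration, not taken from the presentation:

```ruby
require "minitest/autorun"

# Step 1 (red): write the test first; it fails because the code doesn't exist.
# Step 2 (green): write just enough production code to make it pass.
def snow_depth_cm(depth_mm)
  depth_mm / 10.0
end

# Step 3 (refactor): clean up, with the passing test as a safety net.
class TestSnowDepth < Minitest::Test
  def test_converts_millimeters_to_centimeters
    assert_in_delta 12.3, snow_depth_cm(123), 0.001
  end
end
```

The red-green-refactor loop keeps each production change driven by a concrete, failing expectation.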
Speaker Description:
Ian has been working in various software roles for over ten years, including programming web applications, and making slot machines and video games. He has been working on Agile teams for over five years, using practices such as Scrum and test-driven development. He is currently a programmer at the National Snow and Ice Data Center in Boulder, CO, working on services-based web applications.
David Knox received a BA in Computer Science from the University of Colorado at Boulder in 1982. He spent 30 years in industry before returning to receive an MS in Computer Science in 2010. He spent many years in R&D for office automation, image processing, and optical character recognition (OCR) technologies. He is currently a PhD student in Computational Bioscience at the University of Colorado Anschutz Medical Campus. He has been a member of IEEE and ACM for 25+ years.
Johnny Lin graduated from Stanford University with a B.S. in Mechanical Engineering and an M.S. in Civil Engineering-Water Resources. After working as an environmental engineer, he returned to school and received his Ph.D. in Atmospheric Sciences from UCLA, as a student of David Neelin. His atmospheric science research is focused on stochastic convective parameterizations, ice-atmosphere interactions in the Arctic, and simple frameworks for modularizing climate models.
He is also working on a book on environmental ethics and helps coordinate the PyAOS mailing list and blog (pyaos.johnny-lin.com), an effort at building up the atmospheric and oceanic sciences Python community. Currently, he is a Professor of Physics at North Park University in Chicago.
Many HPC developers still use command-line tools and tools with disparate, and sometimes confusing, user interfaces for the different aspects of the HPC project life cycle. The Eclipse Parallel Tools Platform (PTP) combines tools for coding, debugging, job scheduling, tuning, revision control, and more into an integrated environment for increased productivity.
Speaker Description:
Jay Alameda is the lead for Advanced Application Support at the National Center for Supercomputing Applications. In this role, he works with the Extreme Science and Engineering Discovery Environment (XSEDE), a collaboration of NSF-funded high performance computing (HPC) resource providers working to provide a common set of services, including the provisioning of advanced user support, to the science and engineering community. Jay also works with the NSF-funded Track 1 project, Blue Waters, where he has worked with advanced development tools (such as the Eclipse Parallel Tools Platform) to support the development and optimization of HPC applications on the Blue Waters resource. He also leads the NSF-funded SI2 project, “A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform”, which pursues a user- and application-centric plan to improve Eclipse PTP as a platform for developing HPC applications. The project focuses on broadening support for a diverse range of HPC resources (especially across XSEDE), as well as on a broad education, outreach, and training agenda to grow the community benefiting from the capabilities of Eclipse PTP.