The SCons tool is a software construction framework based on Python and an
alternative to tools like 'make' and 'autoconf'. The software source
components and their dependencies are defined using a library of Python
modules, and, if needed, the full power of Python is available for
scripting and customizing the build procedures and products.
Software engineers in the Earth Observing Laboratory have been using SCons
for a few years to build several libraries and applications.
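To illustrate (a minimal sketch; the file names, targets, and flags here are hypothetical examples, not taken from EOL's actual builds), an SCons build is described in an ordinary Python file, conventionally named SConstruct:

```python
# SConstruct -- a minimal SCons build sketch (file names are hypothetical).
# SCons reads this file when `scons` is run in the same directory.

# Construction environment holding shared build settings.
env = Environment(CCFLAGS=['-O2', '-Wall'])

# Build a static library from two hypothetical C++ sources.
lib = env.StaticLibrary('sounding', ['sensor.cpp', 'profile.cpp'])

# Link an application against it; SCons scans sources for dependencies
# and rebuilds only what its dependency graph says is out of date.
env.Program('iss_app', ['main.cpp'], LIBS=[lib], LIBPATH=['.'])
```

Because the file is ordinary Python, loops, functions, and imported modules can be used to customize the build beyond what 'make' rules conveniently express.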
Speaker Description:
Gary Granger received a Bachelor of Science degree in Computer Engineering from Virginia Tech, then began working for the Atmospheric Technology Division at NCAR in 1992. Over the years he has worked in several software development areas related to field deployment and instrument development, including field operations, visualization, and wind profiling radars. Currently he works in the Software Systems Group of the Earth Observing Laboratory, developing software in C++ and Python for the Integrated Sounding System, and developing LabVIEW software for spectrometers. He also advocates good software engineering practices in EOL and supports related infrastructure, such as Subversion, build frameworks, and continuous integration servers.
Mr. Dave Bouwer has been the Chief Engineer for the
Space Weather Division of Space Environment Technologies (SET) for the
past 12 years, since the company's inception, performing the development
and maintenance of real-time space weather operations at SET. Prior to
this, he was employed at the University of Colorado/CIRES and the NOAA Space
Environment Center for 20 years. He is responsible for the development
and deployment of the initial SET SpaceWx iPhone App, a
Thermospheric/Ionospheric Google Earth application, and other successful
operational space weather products.
Earth Science Data Processing, Cloud Computing, Nebula, AIRS
Speaker Description:
Dr. Aijun Chen received his Ph.D. from Peking University, China, in 2000, majoring in Remote Sensing & GIS. He came to George Mason University as a post-doc research associate in 2002. Currently, Dr. Chen is a research associate professor in the Center of Spatial Information Science and Systems, George Mason University. He is contracted to work at the NASA Goddard Earth Science Data and Information Service Center. Dr. Chen has published more than 60 academic papers in journals and international conferences. He has led several NASA projects, including:
1. "Utilizing Nebula Cloud Computing to promote NASA Earth Science data processing" (main investigator)
2. Using Google Earth to enhance and promote the use, usefulness and usability of NASA GES DISC atmospheric data for scientific research and public use
3. The Integration of Grid Technology with OGC Web Services (OWS) in NWGISS for NASA HDF-EOS Geospatial Data
Thomas Cram, Doug Schuster, Hannah Wilcox, Steve Worley, Michael Burek and Eric Nienhouse
Speaker Description:
B.A. 1994 in Mathematics, St. Olaf College, Northfield, MN.
M.S. 2000 in Atmospheric Science, Colorado State University (advisor: Michael Montgomery).
2000-2010: Research Associate in the Department of Atmospheric Science at Colorado State University; focused on hurricane dynamics and global climate model (GCM) development.
2010-present: Software Engineer at NCAR, working in the Data Support Section (DSS), which manages the NCAR Research Data Archive.
Apache OODT is a thriving top-level project at the Apache Software Foundation, released under the ALv2. The software is a set of components and an architectural pattern and style for constructing data-intensive systems, focused on two fundamental areas: data processing and computation, and information integration.
Speaker Description:
Chris Mattmann has a wealth of experience in software design, and in the
construction of large-scale data-intensive systems. His work has
infected a broad set of communities, ranging from helping NASA unlock
data from its next generation of earth science system satellites, to
assisting graduate students at the University of Southern California
(his Alma mater) in the study of software architecture, all the way to
helping industry and open source as a member of the Apache Software
Foundation. When he's not busy being busy, he's spending time with his
lovely wife and son braving the mean streets of Southern California.
Development and optimization of computational science models, particularly on high performance computers and, with the advent of ubiquitous multicore processor systems, on today's machines as well, has been accomplished with basic software tools that have not changed substantially since the days of serial and early vector computers.
Speaker Description:
Jay Alameda is the lead for Advanced Application Support at the National Center for Supercomputing Applications. In this role, he works with the Extreme Science and Engineering Discovery Environment (XSEDE), which is a collaboration of NSF-funded high performance computing (HPC) resource providers, working to provide a common set of services, including the provisioning of advanced user support, to the science and engineering community. Jay also works with the NSF-funded Track 1 project, Blue Waters, and in this role has worked with advanced development tools (such as the Eclipse Parallel Tools Platform) to support development and optimization of HPC applications on the Blue Waters resource. He is also leading the NSF-funded SI2 project, "A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform", which is working on a user- and application-centric plan to improve Eclipse PTP as a platform for development of HPC applications, with a particular focus on broadening support of a diverse range of HPC resources (especially across XSEDE) as well as undertaking a broad education, outreach and training agenda to increase the size of the community benefiting from the capabilities of Eclipse PTP.
Over the past five years, we have constructed a framework for understanding open source software systems at the National Aeronautics and Space Administration (NASA). Our framework includes processes and strategies for open source licensing, redistribution, attribution, community building, IP understanding and a number of other relevant dimensions.
Speaker Description:
Chris Mattmann has a wealth of experience in software design, and in the construction of large-scale data-intensive systems. His work has infected a broad set of communities, ranging from helping NASA unlock data from its next generation of earth science system satellites, to assisting graduate students at the University of Southern California (his Alma mater) in the study of software architecture, all the way to helping industry and open source as a member of the Apache Software Foundation. When he's not busy being busy, he's spending time with his lovely wife and son braving the mean streets of Southern California.
Sky Bristol is currently the Director of Applied Earth Systems Informatics Research in the U.S. Geological Survey Core Science Systems Mission Area, where he is working to develop a new virtual organization with a three-part mission: to conduct targeted computer and information science research and development, to establish working partnerships with research institutions throughout the field, and to conduct an education program that improves the ability of scientists to apply technology to earth systems research. Sky has led a variety of software engineering efforts in the USGS and the Fish and Wildlife Service over the last decade as a software developer, systems engineer, solutions architect, and product owner. Sky has driving passions to put "Information" firmly back in "IT" and to make government scientific information more robust and usable in the public domain.
Growing complexity, volume, and reliance on operationally and routinely created datasets pose challenges for centers tasked with archiving and curating this information. Older tools focused on data delivered on media, such as tape, or data downloaded by FTP scripts customized for single datasets. Presently, most data are acquired through network transfers that can occur many times per day, and the older archive management technologies do not scale to this new paradigm.
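The "one customized script per dataset" pattern the abstract describes can be sketched as follows (a hypothetical illustration; the host, directory layout, and file-naming template are invented, not drawn from any actual archive):

```python
# Sketch of a per-dataset FTP retrieval script, the older acquisition
# style contrasted in the abstract. Host and naming scheme are hypothetical.
from ftplib import FTP


def daily_files(year, month, days,
                template="dataset42/{y:04d}{m:02d}{d:02d}.grib"):
    """Expand one dataset's (hypothetical) naming convention into remote paths."""
    return [template.format(y=year, m=month, d=d) for d in days]


def fetch(host, paths, user="anonymous", password="guest"):
    """Download each remote file over FTP; hard-coded to one dataset's layout."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        for path in paths:
            local = path.rsplit("/", 1)[-1]
            with open(local, "wb") as f:
                ftp.retrbinary("RETR " + path, f.write)
```

A script like this must be rewritten for every dataset's host and naming scheme, which is exactly what fails to scale once transfers arrive many times per day across many datasets.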
Speaker Description:
Zaihua Ji (short name: Hua)
Software Engineer III in the Data Support Section, CISL, NCAR, since February 2004.
Graduated from the University of South Florida in 1997 with an M.S. in Computer Science and a Ph.D. in Physical Oceanography.
Teraflop-class supercomputers, like Bluefire, can generate petabytes (PB, 10^15 bytes) of climate model output; will petaflop-class machines, like Yellowstone, be used to generate exabytes (EB, 10^18 bytes) of data? The ability of these computers to generate truly massive amounts of data is a subject that has received some attention, but a number of outstanding issues have been only lightly addressed.
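For a sense of scale (simple arithmetic from the units above; the sustained transfer rate is a hypothetical figure, not from the abstract):

```python
# Relating the data volumes in the abstract.
PB = 10**15  # petabyte, in bytes
EB = 10**18  # exabyte, in bytes

# One exabyte is a thousand petabytes.
assert EB // PB == 1000

# At a hypothetical sustained archive rate of 10 GB/s, writing a
# single exabyte takes on the order of years, not days.
rate = 10 * 10**9                  # bytes per second (assumed)
seconds = EB / rate                # 1e8 seconds
years = seconds / (3600 * 24 * 365)
print(round(years, 1))             # prints 3.2
```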
Speaker Description:
Gary Strand is a software engineer in the Climate Change Prediction group of the Climate and Global Dynamics Division of NCAR. He began work at NCAR in 1986 as a student assistant, and has been involved in several generations of climate model development in CGD. He is the primary data manager and data scientist for the latest NCAR climate model, the Community Earth System Model (CESM). He has led the major data management activities and projects for the CESM since 2003, including CMIP3 and the current CMIP5. He is also one of the key personnel for the Earth System Grid (ESG) project, participating since its inception in 2001. Gary has also created a number of visualizations of CESM output that have been used in many scientific presentations as well as in major broadcast media.