Supersized Science

Summary: The Supersized Science podcast highlights research and discoveries nationwide that are enabled by the advanced computing technology and expertise of the Texas Advanced Computing Center (TACC) at The University of Texas at Austin. TACC science writer Jorge Salazar hosts the podcast. Supersized Science is part of the Texas Podcast Network, brought to you by The University of Texas at Austin. Podcasts in the network are produced by faculty members and staff at UT Austin who work with University Communications to craft content that adheres to journalistic best practices. The University of Texas at Austin offers these podcasts at no charge. Podcasts appearing on the network and this webpage represent the views of the hosts and not of The University of Texas at Austin.

  • Artist: Texas Advanced Computing Center - University of Texas at Austin
  • Copyright: CC BY-NC-SA

Podcasts:

 Supercomputers Fire Lasers to Shoot Gamma Ray Beam | File Type: audio/mpeg | Duration: 13:10

Supercomputers may have helped unlock a new way to make controlled beams of gamma rays, according to scientists at the University of Texas at Austin. The simulations, run on the Stampede and Lonestar systems at TACC, will guide an experiment in the summer of 2016 with the recently upgraded Texas Petawatt Laser, one of the most powerful lasers in the world. The scientists say the quest to produce gamma rays from non-radioactive materials will advance basic understanding of phenomena such as the interiors of stars. What's more, hospitals use gamma rays to eradicate cancer and image the brain, and security screeners use them to scan cargo containers for dangerous materials. So far, however, no one has been able to produce gamma ray beams from non-radioactive sources. These scientists hope to change that. On the podcast are the three researchers who published their work in May 2016 in the journal Physical Review Letters. Alex Arefiev is a research scientist at the Institute for Fusion Studies and at the Center for High Energy Density Science at UT Austin. Toma Toncian is the assistant director of the Center for High Energy Density Science. And the lead author is David Stark, a scientist at Los Alamos National Laboratory. Jorge Salazar hosted the podcast.

 UT Chancellor William McRaven on TACC supercomputers - "We need to be the best in the world" | File Type: audio/mpeg | Duration: 6:44

University of Texas System Chancellor William McRaven gave a podcast interview at TACC during a visit for the center's building expansion dedication and the announcement of a $30 million award from the National Science Foundation for the new Stampede 2 supercomputer system. Chancellor McRaven spoke of his path to leading the UT System of 14 institutions, the importance of supercomputers to Texans and to the nation, the new Dell Medical School, and more. William McRaven: "Behind all of this magnificent technology are the fantastic faculty, researchers, interns, our corporate partners that are part of this, the National Science Foundation. There are people behind all of the success of TACC. I think that's the point we can never forget."

 Zika Hackathon Fights Disease with Big Data | File Type: audio/mpeg | Duration: 7:50

On May 15, more than 50 data scientists, engineers, and UT Austin students gathered in downtown Austin, Texas, at the offices of Cloudera, a big data company, for a Zika Hackathon. They used big data to help fight the spread of Zika. Mosquitoes carry and spread the Zika virus, which can cause birth defects and other symptoms such as fever. The U.S. Centers for Disease Control and Prevention is now ramping up collection of data that tracks the spread of Zika. But big gaps exist in linking different kinds of data, and that makes it tough for experts to predict where the virus will go next and what to do to prevent it. The Texas Advanced Computing Center provided time on the Wrangler data-intensive supercomputer as a virtual workspace for the Zika hackers. Featured on the podcast are Ari Kahn of the Texas Advanced Computing Center and Eddie Garcia of Cloudera. The podcast is hosted by Jorge Salazar of TACC.

 Sudden Collapse: Supercomputing Spotlight on Gels | File Type: audio/mpeg | Duration: 11:17

Chemical engineering researcher Roseanna Zia has begun to shed light on the secret world of colloidal gels - liquids dispersed in a solid. Yogurt, shampoo, and Jell-O are just a few examples. Sometimes gels act like liquids, and sometimes they act like solids. Understanding the theory behind these transitions can translate to real-world applications, such as helping explain why mucus - also a colloidal gel - in the airways of people with cystic fibrosis can thicken, resist flow, and possibly threaten life. Roseanna Zia is an assistant professor of Chemical and Biomolecular Engineering at Cornell. She led development of the largest dynamic computer simulations of colloidal gels yet, with over 750,000 particles. The Zia Group used the Stampede supercomputer at TACC through an allocation from XSEDE, the Extreme Science and Engineering Discovery Environment, a single virtual system funded by the National Science Foundation (NSF) that allows scientists to interactively share computing resources, data, and expertise. Podcast host Jorge Salazar interviewed Roseanna Zia. Music Credits: Raro Bueno, Chuzausen freemusicarchive.org/music/Chuzausen/

 Docker for Science | File Type: audio/mpeg | Duration: 14:02

Scientists might find a friend in the open source software Docker. It's a platform that bundles up the loose ends of an application - the software and the dependencies that sustain it - into something fairly lightweight that can run on any system. As more scientists share not only their results but also their data and code, Docker is helping them reproduce the computational analysis behind the results. What's more, Docker is one of the main tools used in the Agave API platform, a platform-as-a-service solution for hybrid cloud computing developed at TACC and funded in part by the National Science Foundation. Podcast host Jorge Salazar talks with software developer and researcher Joe Stubbs about using Docker for science. Stubbs is a Research Engineering and Scientist Associate in the Web & Cloud Services group at the Texas Advanced Computing Center.
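As a rough sketch of the idea discussed in the episode, a minimal Dockerfile might package an analysis script together with pinned dependencies so a collaborator can rerun the computation on any machine with Docker installed. The file names and base image here are hypothetical placeholders, not anything from the episode:

```dockerfile
# Hypothetical example: bundle a Python analysis and its dependencies
# into a portable image so the computation can be reproduced anywhere.
FROM python:3.11-slim

WORKDIR /analysis

# Install pinned dependencies first so this layer is cached on rebuilds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy in the analysis code itself.
COPY analyze.py .

# Running the container reruns the same analysis with the same software stack.
CMD ["python", "analyze.py"]
```

A collaborator would then build and run it with `docker build -t my-analysis .` and `docker run my-analysis`, getting the same software environment the original author used.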

 Dark Energy of a Million Galaxies | File Type: audio/mpeg | Duration: 12:13

UT Austin astronomer Steven Finkelstein eyes the Wrangler supercomputer for the HETDEX extragalactic survey in this interview with host Jorge Salazar. Scientists predict that a million galaxies far, far away will be discovered before the year 2020, thanks to a monumental mapping of the night sky in search of a mysterious force. That's according to scientists working on HETDEX, the Hobby-Eberly Telescope Dark Energy Experiment. They're going to transform the big data from galaxy spectra billions of light-years away into meaningful discoveries with the help of the Wrangler data-intensive supercomputer. "You can imagine that would require an immense amount of computing storage and computing power. It was a natural match for us and TACC to be able to make use of these resources," Steven Finkelstein said. Finkelstein is an assistant professor in the Department of Astronomy at The University of Texas at Austin (UT Austin). He's one of the lead scientists working on HETDEX. "HETDEX is one of the largest galaxy surveys that has ever been done," Finkelstein said. Starting in late 2016, the Hobby-Eberly Telescope at the McDonald Observatory in West Texas will detect thousands of new galaxies each night. It'll study them using an instrument called VIRUS, the Visible Integral Field Replicable Unit Spectrograph. VIRUS takes starlight from distant galaxies and splits the light into its component colors, as a prism does. Music Credits: Raro Bueno, Chuzausen freemusicarchive.org/music/Chuzausen/

 Human Origins in Fossil Data | File Type: audio/mpeg | Duration: 10:17

Paleoanthropologist Denné Reed of UT Austin is interviewed by host Jorge Salazar about making connections in big data from fossils of human origins. New discoveries might lie buried deep in the data of human fossils. That's according to Denné Reed, an associate professor in the Department of Anthropology at The University of Texas at Austin (UT Austin). Reed is the principal investigator of PaleoCore, an informatics initiative funded by the National Science Foundation (NSF). The PaleoCore project aims to get researchers of human origins worldwide on the same page with their fossil data. Reed said PaleoCore is doing this by implementing data standards, creating a place to store all human fossil data, and developing new tools to collect the data. He hopes that better integration and sharing among research projects in paleoanthropology and paleontology will yield deeper insights into our origins. "We've tried to take advantage of some of the geo-processing and database capabilities that are available through Wrangler to create large archives," Reed said. The big data Reed wants to archive on Wrangler comprise the entirety of the fossil record on human origins. PaleoCore will also include geospatial data such as satellite imagery. "For many of the countries that we're working in, this is their cultural heritage. We need to be able to ensure that not only are the data rapidly available, accessible, searchable, and everything else, but that they're safely archived," Reed said. Music Credits: Raro Bueno, Chuzausen freemusicarchive.org/music/Chuzausen/

 Supercomputers Save Money, Save Energy | File Type: audio/mpeg | Duration: 10:46

Computer scientist Joshua New of Oak Ridge National Laboratory speaks with host Jorge Salazar about how to optimize buildings to save energy using computer models. Saving energy saves money. Scientists at Oak Ridge National Laboratory (ORNL) are using supercomputers to do just that by making virtual versions of millions of buildings in the U.S. The Wrangler data-intensive supercomputer is working jointly with ORNL's Titan in a project called Autotune that trims the energy bills of buildings. Computer scientist Joshua New of the ORNL Building Technology Research and Integration Center is the principal investigator of the Autotune project, funded by the U.S. Department of Energy. Autotune takes a simple software model of a building's energy use and optimizes it to match reality. "What we're trying to do is create a crude model from publicly available data," New said. "Then the Autotune project takes utility bill data, whether it's monthly electrical utility bills, or hourly bills from advanced metering infrastructure, and calibrates that software model to match measured data." New said that once Autotune calibrates the model well enough, it can legally be used in multiple ways, including for designing optimal building retrofit packages. Music Credits: Raro Bueno, Chuzausen freemusicarchive.org/music/Chuzausen/

 Evolution of Monogamy | File Type: audio/mpeg | Duration: 8:10

UT Austin biologist Rebecca Young discusses with host Jorge Salazar how she traces the genes behind monogamous behavior using the Wrangler supercomputer at the Texas Advanced Computing Center. Scientists at the Hofmann Lab of UT Austin are using the Wrangler data-intensive supercomputer to find orthologs — genes common to different species. They'll search for them in each of the major lineages of vertebrates — mammals, birds, reptiles, amphibians and fishes. "What we want to know is, even though they've evolved independently, whether it's possible that some of the same genes are important in regulating this behavior, in particular expression of these genes in the brain while monogamous males are reproductively active," said Rebecca Young. Young is a research associate in the Department of Integrative Biology and at the Center for Computational Biology and Bioinformatics, UT Austin. Music Credits: Raro Bueno, Chuzausen freemusicarchive.org/music/Chuzausen/

 Wrangler Supercomputer Speeds through Big Data | File Type: audio/mpeg | Duration: 15:27

Scientists and engineers at TACC have created a new kind of supercomputer to handle big data. Featured on the podcast is Niall Gaffney, Director of Data Intensive Computing at the Texas Advanced Computing Center. Gaffney leads efforts at TACC to bring online a new data-intensive supercomputing system called Wrangler. The National Science Foundation's Division of Advanced Cyberinfrastructure awarded TACC and its collaborators $11.2 million in November 2013 to build and operate the Wrangler supercomputer. Indiana University, TACC, and the University of Chicago worked together on the project. In April 2015, Wrangler began early operations for the open science community, where results are made freely available to the public. Wrangler will augment the Stampede supercomputer, one of the most powerful in the world. And Wrangler will join the cyberinfrastructure of the NSF-funded XSEDE, the eXtreme Science and Engineering Discovery Environment. Niall Gaffney: We went to propose to build Wrangler with (the data world) in mind. We kept a lot of what was good with systems like Stampede, but then added new things to it like a very large flash storage system, a very large distributed spinning disc storage system, and high speed network access to allow people who have data problems that weren't being fulfilled by systems like Stampede and Lonestar to be able to do those in ways that they never could before.

  SC15: ACM Gordon Bell Prize Winners Supercompute Deep Earth | File Type: audio/mpeg | Duration: 24:47

The 2015 ACM Gordon Bell Prize, given in recognition of outstanding achievement in high-performance computing, was awarded to researchers Johann Rudi and Omar Ghattas of the Institute for Computational Engineering and Sciences (ICES) at the University of Texas at Austin. They share the award with their study co-authors, who utilized the Stampede supercomputer of the Texas Advanced Computing Center and the IBM Sequoia supercomputer at Lawrence Livermore National Laboratory. The award-winning study modeled the flow thousands of kilometers deep in the mantle, which moves Earth's plates and triggers unpredictable events like volcanic eruptions and massive earthquakes. The SC15 supercomputing conference took place in Austin, November 15-20, 2015. SC showcases the latest in high performance computing, networking, storage and analysis to advance scientific discovery, research, education and commerce. The study, "An Extreme-Scale Implicit Solver for Complex PDEs: Highly Heterogeneous Flow in Earth's Mantle," was funded in part by the National Science Foundation and the Department of Energy. Co-authors include Johann Rudi and Omar Ghattas of ICES; A. Cristiano Malossi, Peter Staar, Yves Ineichen, Costas Bekas, and Alessandro Curioni of the Foundations of Cognitive Solutions, IBM Research – Zurich, Switzerland; Tobin Isaac of ICES; Georg Stadler of the Courant Institute of Mathematical Sciences, New York; and Michael Gurnis of the Seismological Laboratory at Caltech. Omar Ghattas: The absolute large number of cores and the big scaling result was done on the IBM system at Livermore. We couldn't have done it without the IBM guys. But the actual science runs, a lot of the day-to-day stuff, testing of the solvers - Johann was using TACC's Stampede system. Johann Rudi: Mainly, my research was done on the Stampede supercomputer at TACC. One part of the work is developing algorithms. That was wholly supported by TACC machines. And also, the help that I got from TACC a couple of times was very valuable to me. There were certain small issues that I couldn't even see from where I was working with the machine. But people from the internal staff, running the systems, they could see when something was going wrong. They actually helped a lot. I was happy to work with TACC. Especially Bill Barth. I remember him helping me a lot. I was glad. The development of the solvers was done on TACC. Also, everything in the paper that shows the scientific results, the visualizations - these were also done on TACC machines. The science part was also supported by TACC.

  SC15: Revealing the Hidden Universe with Supercomputing Simulations of Black Hole Mergers | File Type: audio/mpeg | Duration: 15:04

November 2015 marks 100 years since Einstein published the field equations that describe space and time as one interwoven continuum - and predict the existence of black holes and more. Manuela Campanelli is a professor at the Rochester Institute of Technology and the Director of the Center for Computational Relativity and Gravitation. Dr. Campanelli was invited to give a presentation at SC15 titled "Revealing the Hidden Universe with Supercomputer Simulations of Black Hole Mergers." She uses the computational resources of XSEDE, the Extreme Science and Engineering Discovery Environment, to probe the mysteries of black holes. She spoke by Skype about that work, about the 100th anniversary of Einstein's field equations, and about taking on the complexity of accurately describing black hole mergers. The SC15 supercomputing conference takes place in Austin, November 15-20, 2015. SC15 showcases the latest in high performance computing, networking, storage and analysis to advance scientific discovery, research, education and commerce. Manuela Campanelli: General Relativity is celebrating this year a hundred years since its first publication in 1915, when Einstein introduced his theory of General Relativity, which has revolutionized in many ways the way we view our universe. For instance, the idea of a static Euclidean space, which had been assumed for centuries and the concept that gravity was viewed as a force changed. They were replaced with a very dynamical concept of now having a curved space-time in which space and time are related together in an intertwined way described by these very complex, but very beautiful equations.

  SC15: Societal Impact of Earthquake Simulations at Extreme Scale | File Type: audio/mpeg | Duration: 11:28

Thomas Jordan is a professor of Earth Sciences at the University of Southern California and the Director of the Southern California Earthquake Center, a national collaboration of over a thousand earthquake experts at some 70 institutions. Dr. Jordan uses the computational resources of XSEDE, the Extreme Science and Engineering Discovery Environment, to model earthquakes and help reduce their risk to life and property. Dr. Jordan was invited to speak at SC15 on the societal impact of earthquake simulations at extreme scale. The SC15 supercomputing conference takes place in Austin, November 15-20, 2015. SC15 showcases the latest in high performance computing, networking, storage and analysis to advance scientific discovery, research, education and commerce. Thomas Jordan: One thing people need to understand is we need a lot of supercomputer time in order to be able to do these calculations. Some of our simulation models that are based on the simulation of earthquake physics can take hundreds of millions of hours of computer time to generate. These are very complex system-level calculations. They're of the similar complexity of trying to calculate what Earth's climate is going to be like in 50 years because of human activities and CO2 charging of the atmosphere. It's a similar scale of problem. These problems that deal with natural hazards and the complexity of the Earth system really require very large computers to be able to simulate that activity. We're frankly looking forward to the day when computers are ten times or a hundred times or more faster than they are today.

 SC15: Science Advocate and Emmy Award Winning Actor Alan Alda to Open SC15 | File Type: audio/mpeg | Duration: 11:48

Alan Alda, actor, director and writer, has had a lifelong interest in science. He hosted the PBS program Scientific American Frontiers from 1993 to 2005, an experience he called "the best thing I ever did in front of a camera." Perhaps best known as surgeon 'Hawkeye' Pierce on the TV series M*A*S*H, Alda has won seven Emmys, six Golden Globes, and three Directors Guild of America awards for directing. His two memoirs are both New York Times bestsellers. A recipient of the National Science Board's Public Service Award, Alda is a visiting professor at and founding member of Stony Brook University's Alan Alda Center for Communicating Science, where he helps develop innovative programs on how scientists communicate with the public. He is also on the Board of Directors of the World Science Festival. SC15 is the 27th annual International Conference for High Performance Computing, Networking, Storage and Analysis. The event showcases the latest in supercomputing to advance scientific discovery, research, education and commerce. Alan Alda: I think the kind of transformation that's already been brought about by high performance computing is extraordinary. And for it to go further and fully realize its potential requires another kind of transformation… Powerful computing affects all our lives and can hopefully save our lives. It can eventually help us survive some of our unfortunate efforts that have affected climate, for instance. To model climate change is one of the great benefits we're going to get from supercomputing. The trouble is, to really help the public understand all the benefits that they can get from supercomputing, it has to be communicated with clarity so that they get it and they get excited by it… I think we have to transform the scientists who are explaining this to the public before the public will allow them and participate with them in transforming their own lives with this amazing ability to model things on supercomputers.

 SC15: Understanding User-Level Activity on Today's Supercomputers with XALT | File Type: audio/mpeg | Duration: 9:31

Robert McLay manages the software tools group in high performance computing at the Texas Advanced Computing Center. Dr. McLay is one of the developers of XALT, a software tool developed with funding from the National Science Foundation. XALT tracks user codes and environments on a computer cluster. Robert McLay and Mark Fahey of Argonne National Laboratory will co-lead a session called "Understanding User-Level Activity on Today's Supercomputers with XALT" at SC15. The SC15 supercomputing conference takes place in Austin, November 15-20, 2015. SC15 showcases the latest in high performance computing, networking, storage and analysis to advance scientific discovery, research, education and commerce. Robert McLay: XALT is a tool that me and my colleague, Dr. Mark Fahey, put together to help people try and use our systems. We run the system. We manage the system. We develop software for the system and install software for our users. We want to know what's used and what's not. XALT gives us a way to find out in a very inexpensive way.
