Alumnus Aswin Sankaranarayanan wins NSF CAREER Award
ECE Alumnus Aswin Sankaranarayanan (M.S. ’07, Ph.D. ’09) has received a 2017 National Science Foundation (NSF) Faculty Early Career Development (CAREER) Award for “Plenoptic Signal Processing – A Framework for Sampling, Detection, and Estimation using Plenoptic Functions.” The five-year award is worth $532,000. The NSF CAREER Program fosters the career development of outstanding junior faculty by supporting research and education of the highest quality and in the broadest sense.
Sankaranarayanan is an Assistant Professor in the Department of Electrical and Computer Engineering at Carnegie Mellon University in Pittsburgh, PA. He joined the Carnegie Mellon faculty in 2013 after holding a postdoctoral research position in the DSP group at Rice University. He completed his Ph.D. in 2009 at the University of Maryland, where he was advised by Dr. Rama Chellappa, Distinguished University Professor and Chair of Electrical and Computer Engineering. He received a Distinguished Dissertation Fellowship from UMD’s ECE department in 2009 and completed the Clark School’s Future Faculty Program. His research on lensless cameras won the 2016 Herschel M. Rich Award from Rice University. Sankaranarayanan’s research interests are in computer vision and signal processing, with a focus on developing computational tools and imaging architectures for high-dimensional visual signals.
About the award
The interactions of light with objects in a scene are often complex. An image, which captures only 2D spatial variations, is poorly equipped to unravel these interactions and to infer properties of a scene such as its shape, reflectance, and composition. This is especially true for scenes with sharp reflections, refractions, and volumetric scattering. This research models the interactions of light with scenes using light rays and their transformations. The central hypothesis underlying the research is that estimating shape, reflectance, and material composition is often simpler and well-posed when these problems are studied in terms of light rays and their transformations. A wide range of real-world objects and scenes stand to benefit from progress made in this research, including scenes with complex configurations that lead to inter-reflections; shiny objects with specularities and spatially varying reflectance; and transparent or translucent objects. A diverse set of applications, including machine vision, microscopy, and consumer photography, also stand to benefit. The education and outreach components of the project disseminate image processing research in the broader Pittsburgh area through camera-building workshops and lab demos for middle- and high-school students, and professional development courses for physics teachers.
The focus of the research is to develop novel acquisition and processing methods for scene understanding by studying characterizations of light that go beyond images. In particular, the research analyzes the properties of two signals: the plenoptic function, which captures spatial, temporal, angular, and spectral variations of light, and the plenoptic light transport, which captures how light propagates through a scene. The central hypothesis of the research is that the plenoptic function and light transport provide a rich encoding of how light interacts with a scene; hence, unlike image-based inference, plenoptic inference can be fundamentally well-conditioned even for scenes that interact with light in a complex manner. To this end, the research develops novel low-dimensional models for plenoptic functions that are based on the physical laws governing the interaction of light with a scene. The research also builds novel computational cameras that acquire how light propagates in a scene by decomposing it into light paths of varying complexity, and subsequently estimate the scene’s 3D shape, reflectance, and material composition.
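For readers unfamiliar with these two signals, the following sketch uses standard notation from the computational imaging literature; the symbols are illustrative background, not drawn from the award abstract. The plenoptic function records the radiance of every light ray reaching a viewing position, and, once discretized, light transport acts as a linear map from an illumination pattern to the captured image.

% Plenoptic function: radiance as a function of viewing position (x, y, z),
% ray direction (theta, phi), wavelength lambda (spectral), and time t.
\[
  L = L(x,\, y,\, z,\, \theta,\, \phi,\, \lambda,\, t)
\]
% Discretized light transport: with the illumination pattern stacked into a
% vector p and the captured image into a vector c, propagation through the
% scene is described by the light-transport matrix T.
\[
  \mathbf{c} = \mathbf{T}\,\mathbf{p}
\]

Fixing the viewing position, wavelength, and time recovers an ordinary 2D image, which is why an image alone discards most of the angular and spectral structure that plenoptic inference exploits.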
Published February 14, 2017