Abstract:
Computer systems must increasingly operate under uncertainty in their execution environment: new applications and hardware platforms require our systems to model and compute with objects that are inherently uncertain or only partially observable, whose behaviors are known only through noisy measurements. This reality raises many new questions about how to interpret, debug, validate, verify, and optimize these systems.
As an illustrative example of such a system, I'll present DiffTune, a technique for learning neurosymbolic performance models of modern computer processors. Processor performance models are critical for many computer systems engineering tasks. However, because of limits on our ability to introspect modern processors, these models must be inferred from behavioral measurements. Our system leverages deep learning to perform differentiable surrogate optimization of a CPU simulator, yielding models that predict the performance of programs executed on modern Intel CPUs better than state-of-the-art handcrafted techniques from LLVM.
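The core idea behind differentiable surrogate optimization can be sketched in a few lines. The toy "simulator" below, its quadratic surrogate, and all parameter values are illustrative assumptions, not part of DiffTune: we sample the non-differentiable simulator, fit a differentiable surrogate to its input/output behavior (DiffTune uses a neural network, and optimizes simulator parameters to match measured program timings rather than minimizing the output directly), and then run gradient descent through the surrogate.

```python
import numpy as np

# Hypothetical black-box "simulator": maps a parameter vector to a
# predicted cycle count. We treat it as non-differentiable.
def simulator(theta):
    return float((theta[0] - 3.0) ** 2 + (theta[1] + 1.0) ** 2 + 5.0)

# 1. Sample parameter settings and record the simulator's outputs.
rng = np.random.default_rng(0)
thetas = rng.uniform(-5, 5, size=(200, 2))
ys = np.array([simulator(t) for t in thetas])

# 2. Fit a differentiable surrogate to the sampled behavior.
#    A quadratic model stands in for DiffTune's neural network.
#    Features: [1, t0, t1, t0^2, t1^2]
X = np.column_stack([np.ones(len(thetas)), thetas, thetas ** 2])
w, *_ = np.linalg.lstsq(X, ys, rcond=None)

def surrogate_grad(theta):
    # Analytic gradient of the fitted quadratic surrogate w.r.t. theta.
    return np.array([w[1] + 2 * w[3] * theta[0],
                     w[2] + 2 * w[4] * theta[1]])

# 3. Optimize the parameters by gradient descent through the surrogate.
theta = np.array([0.0, 0.0])
for _ in range(500):
    theta -= 0.05 * surrogate_grad(theta)

print(theta)  # approaches the simulator's optimum near (3, -1)
```

The key point is step 3: gradients flow through the surrogate, not the original simulator, which is what makes parameter learning tractable when the simulator itself cannot be differentiated.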
Guided by these results, I'll demonstrate how this system exemplifies many of the challenges of engineering modern uncertain computations, and connect these challenges to my work on new program semantics, optimizations, and analyses for uncertain computations.
Bio: Michael Carbin is an Associate Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology. At MIT, he leads the Programming Systems Group. Typical goals for his work include improved reliability, performance, energy consumption, and resilience for computer systems using techniques from Programming Languages. A central aspect of his work is the study of systems that must manage uncertainty in their environment (e.g., perception) and construction (e.g., approximation), such as machine learning systems.
Michael has received an NSF CAREER Award, a Sloan Foundation Research Fellowship, and an MIT Frank E. Perkins Award for Excellence in Graduate Advising. His work has received best paper awards at OOPSLA, ICLR, and ICFP, as well as a CACM Research Highlight.
Michael received a B.S. in Computer Science from Stanford University in 2006, and an S.M. and Ph.D. in Electrical Engineering and Computer Science from the Massachusetts Institute of Technology in 2009 and 2015, respectively. Michael was also a Researcher at Microsoft Research, working on Deep Learning Systems from 2014 to 2018.