# SimonsTV

Our videos can also be found on YouTube.

Jun. 2022

Mark Zhandry (NTT Research & Princeton University)

https://simons.berkeley.edu/talks/quantum-advantage-without-structure

Quantum and Lattices Joint Reunion Workshop

Mar. 2021

Amin Coja-Oghlan (Goethe University)

https://simons.berkeley.edu/talks/tbd-283

50 Years of Satisfiability: The Centrality of SAT in the Theory of Computing

Dec. 2020

Rahul Jain (USC)

https://simons.berkeley.edu/talks/tbd-241

Reinforcement Learning from Batch Data and Simulation

Oct. 2020

Mengdi Wang (Princeton University)

https://simons.berkeley.edu/talks/model-based-reinforcement-learning-value-targeted-regression

Mathematics of Online Decision Making

Oct. 2020

Doina Precup (McGill University & MILA / DeepMind)

https://simons.berkeley.edu/talks/tbd-224

Deep Reinforcement Learning

Sep. 2020

Csaba Szepesvári (University of Alberta, Google DeepMind) & Mengdi Wang (Princeton University, Google DeepMind)

https://simons.berkeley.edu/talks/tbd-160

Theory of Reinforcement Learning Boot Camp

Playlist: 11 videos

Apr. 2020

Theory Shorts is a documentary web series that explores topics from the Simons Institute’s research programs.

Episode 1, “Perception as Inference: The Brain and Computation,” explores the computational processes by which the brain builds visual models of the external world, based on noisy or incomplete data from patterns of light sensed on the retinae.

HOST

Bruno Olshausen

DIRECTOR

Christoph Drösser

EDITOR

Michaelle McGaraghan

PRODUCERS

Kristin Kane

Michaelle McGaraghan

SCIENTIFIC ADVISOR

Shafi Goldwasser

ANIMATORS

Caresse Haaser

Christoph Drösser

Lukas Engelhardt

GRAPHIC DESIGNER

Barry Bödeker

VIDEOGRAPHERS

Drew Mason

Omied Far

Michaelle McGaraghan

Matt Beardsley

PRODUCTION ASSISTANTS

Christine Wang

Bexia Shi

Lior Shavit

THEME MUSIC

“Plastic” by Purple Moons

Courtesy of Marmoset in Portland, Oregon

OTHER MEDIA COURTESY OF

Bruce Damonte

Arash Fazl

Anders Garm

Jean Lorenceau and Maggie Shiffrar

Beau Lotto

A. L. Yarbus

Bruno Olshausen

videocobra / Pond5

BlackBoxGuild / Pond5

nechaevkon / Pond5

DaveWeeks / Pond5

CinematicStockVideo / Pond5

BananaRepublic / Pond5

MicroStockTube / Pond5

shelllink / Pond5

AudioQuattro / Envato Market

HitsLab / Envato Market

FlossieWood / Envato Market

plaincask / Envato Market

MusicDog / Envato Market

Loopmaster / Envato Market

Ryokosan / Envato Market

Images used under license from Shutterstock.com

© Simons Institute for the Theory of Computing, 2019

Apr. 2020

Henry Yuen (University of Toronto)

Richard M. Karp Distinguished Lecture Series, Spring 2020

https://simons.berkeley.edu/events/rmklectures2020-spring-3

In a recent result known as "MIP* = RE," ideas from three disparate fields of study — computational complexity theory, quantum information, and operator algebras — have come together to simultaneously resolve long-standing open problems in each field, including a 44-year-old mystery in mathematics known as Connes’ Embedding Problem. In this talk, I will describe the evolution and convergence of ideas behind MIP* = RE: it starts with three landmark discoveries from the 1930s (Turing’s notion of a universal computing machine, the phenomenon of quantum entanglement, and von Neumann’s theory of operators), and ends with some of the most cutting-edge developments from theoretical computer science and quantum computing.

This talk is aimed at a general scientific audience, and will not assume any specialized background in complexity theory, quantum physics, or operator algebras.

Recent years have seen major advances in the ability to control quantum devices with dozens of qubits. The advent of so-called "Noisy Intermediate Scale Quantum" (NISQ) computers raises major algorithmic challenges. The goal of this workshop is to present current techniques and to help distill the key questions and theoretical models moving forward.

Playlist: 25 videos

The Boot Camp is intended to acquaint program participants with the key themes of the program. It will consist of four days of tutorial presentations, each with ample time for questions and discussion, as follows:

Playlist: 16 videos

An important development in the area of convex relaxations has been the introduction of systematic ways of strengthening them by lift-and-project techniques. This leads to several hierarchies of LP/SDP relaxations: Lovász-Schrijver, Sherali-Adams, and Sum of Squares (also known as the Lasserre hierarchy). The benefits and limitations of these hierarchies have been studied extensively over the last decade. Recently, strong negative results have been obtained, not only for specific hierarchies but even for the more general notion of extended formulations. In this workshop we investigate the power and limitations of LP/SDP hierarchies as well as general extended formulations, and their ties to convex algebraic geometry. We also explore tools and concepts from matrix analysis with strong connections to SDP formulations: matrix concentration, matrix multiplicative weight updates, and various notions of matrix rank. Finally, the workshop will cover related areas where these kinds of techniques are employed: sparsification, discrepancy, and hyperbolic/real stable polynomials.

Playlist: 24 videos

May 2017

David Duvenaud, University of Toronto

Computational Challenges in Machine Learning

https://simons.berkeley.edu/talks/david-duvenaud-2017-5-1

Mar. 2017

Tom Griffiths, UC Berkeley

Representation Learning

https://simons.berkeley.edu/talks/tom-griffiths-2017-3-29

Apr. 25 – Apr. 26, 2016

Playlist: 9 videos

Nov. 16 – Nov. 20, 2015

Playlist: 23 videos