other seminars:

:: Fri 9/27/13, 1:30 PM in 6C-442 Joseph Traub (Columbia University, on sabbatical at Harvard) We introduce the notion of strong quantum speedup. To compute this speedup one must know the classical computational complexity. What is it about the problems of quantum physics and quantum chemistry that enables us to get lower bounds on the classical complexity?

:: Thu 10/17/13, 3:00 PM in E17-133 Juerg Froehlich (ETH Zurich) After a brief general introduction to the subject of quantum probability theory, quantum dynamical systems are introduced and some of their probabilistic features are described. On the basis of a few general principles ("duality between observables and indeterminates", "loss of information", and "entanglement generation"), a quantum theory of experiments and measurements is developed, and the "theory of von Neumann measurements" is outlined. Finally, a theory of nondemolition measurements is sketched, and, as an application of the Martingale Convergence Theorem, it is shown how facts emerge in nondemolition measurements.

:: Thu 10/17/13, 4:00 PM in 10-250 Matthias Troyer (ETH Zurich) About a century after the development of quantum mechanics, we have now reached an exciting time when nontrivial devices that make use of quantum effects can be built. While a universal quantum computer of nontrivial size is still out of reach, there are a number of commercial and experimental devices: quantum random number generators, quantum encryption systems, and analog quantum simulators. In this colloquium I will present some of these devices and the validation tests we performed on them. Quantum random number generators use the inherent randomness in quantum measurements to produce true random numbers, unlike classical pseudorandom number generators, which are inherently deterministic. Optical lattice emulators use ultracold atomic gases in optical lattices to mimic typical models of condensed matter physics.
Finally, I will discuss the devices built by the Canadian company D-Wave Systems, which are special-purpose quantum simulators for solving hard classical optimization problems.

:: Fri 10/18/13, 1:30 PM in 6C-442 Sean Hallgren (Penn State, visiting MIT) Computing the unit group of a number field is one of the main problems in computational algebraic number theory. For number fields of arbitrary degree, an algorithm should be considered polynomial time only if it runs in time polynomial in both the degree and the log of the discriminant. The best classical algorithms for computing the unit group take time exponential in both parameters. There is a quantum algorithm that runs in time polynomial in the log of the discriminant but exponential in the degree. We give a quantum algorithm that is polynomial in both parameters of the number field. The proof works via a reduction to a continuous Hidden Subgroup Problem.

:: Wed 3/26/14, 4 PM in 32-G575 Alexander Belov (MIT) In this talk, I describe some recent quantum algorithms for the problem of learning and testing juntas. For the main part of the talk, I study the following variant of the junta learning problem. We are given oracle access to a Boolean function f on n variables that only depends on k variables and, when restricted to them, equals some predefined symmetric function h. The task is to identify the variables the function depends on. This is a generalization of the Bernstein-Vazirani problem (when h is the XOR function) and the (combinatorial) group testing problem (when h is the OR function). I describe an optimal quantum algorithm for the case when h is the OR or the EXACT-HALF function. For the case of the MAJORITY function, I obtain an upper bound of O(k^{1/4}). Additionally, I describe an application of these techniques to the problem of testing juntas, which is joint work with Andris Ambainis, Oded Regev, and Ronald de Wolf.
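As a classical point of comparison for the group-testing case above (h = OR), the relevant variables of an OR-junta can already be identified with O(k log n) adaptive classical queries by binary splitting. The sketch below is an illustrative classical baseline of that standard group-testing idea, not the quantum algorithm from the talk.

```python
def find_relevant(f, n):
    """Identify the hidden relevant set S of an OR-junta f on n bits.

    Each query evaluates f on the indicator vector of a candidate set A;
    for an OR-junta this returns 1 iff A intersects S, which enables
    binary splitting: O(|S| log n) queries in total.
    """
    def query(subset):
        return f([1 if i in subset else 0 for i in range(n)])

    found = []
    pool = list(range(n))
    while pool and query(pool):      # some relevant variable remains in pool
        candidates = pool
        while len(candidates) > 1:   # binary-search for one element of S
            half = candidates[: len(candidates) // 2]
            candidates = half if query(half) else candidates[len(half):]
        found.append(candidates[0])
        pool.remove(candidates[0])
    return set(found)
```

The quantum algorithm in the talk beats this baseline in query count; the classical version is shown only to make the problem statement concrete.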
:: Wed 4/2/14, 4:30 PM in 6C-442 Renato Renner (ETH) Quantum state tomography is the task of estimating the state of a quantum system using measurements. Typically, one is interested in the (unknown) state generated during an experiment which can, in principle, be repeated arbitrarily often. However, the number of actual runs of the experiment, from which data is collected, is always finite (and often small). As pointed out recently, this may lead to unjustified (or even wrong) claims when employing standard statistical tools without care. In this talk, I will present a method for obtaining reliable estimates from finite tomographic data. Specifically, the method allows the derivation of confidence regions, i.e., subsets of the state space in which the unknown state is contained with probability almost one.

:: Fri 4/11/14, 1:30 PM in 6C-442 Keith Lee (Perimeter Institute) Quantum field theory provides the framework for the Standard Model of particle physics and plays a key role in physics. However, calculations are generally computationally complex and limited to weak interaction strengths. After an introduction to quantum field theory, I'll describe polynomial-time quantum algorithms for computing relativistic scattering amplitudes in both scalar and fermionic quantum field theories. The algorithms achieve exponential speedup over known classical methods. One of the motivations for this work comes from computational complexity theory. Ultimately, we wish to know the computational power of our universe. Studying such quantum algorithms probes whether a universal quantum computer is powerful enough to represent quantum field theory; in other words, is quantum field theory in BQP? Conversely, one can ask whether quantum field theory can represent a universal quantum computer; is quantum field theory BQP-hard? I'll describe our approach to addressing the question of BQP-hardness.
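Renner's warning about finite data shows up already in the simplest possible estimation problem: inferring a single coin bias from a finite number of runs. The toy sketch below (my own illustration, not the method from the talk) returns a Hoeffding-style confidence interval instead of a bare point estimate, so the finite number of runs is visible in the answer.

```python
import math

def hoeffding_interval(successes, trials, epsilon=1e-3):
    """Two-sided Hoeffding confidence interval for a Bernoulli parameter.

    With probability at least 1 - epsilon the true bias lies in the
    returned interval. The half-width shrinks only as 1/sqrt(trials),
    so few runs give a wide region no matter how clean the point
    estimate looks -- a one-parameter analogue of a confidence region
    over the quantum state space.
    """
    p_hat = successes / trials
    half_width = math.sqrt(math.log(2 / epsilon) / (2 * trials))
    return max(0.0, p_hat - half_width), min(1.0, p_hat + half_width)
```

For example, 60 successes in 100 runs and 6000 in 10000 give the same point estimate 0.6, but the second interval is ten times narrower.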
:: Fri 4/18/14, 1:30 PM in 6C-442 Stephanie Wehner (National University of Singapore) The second law of thermodynamics tells us which state transformations are so statistically unlikely that they are effectively forbidden. Its original formulation, due to Clausius, states that "Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time." The second law applies to systems composed of many particles; however, we are seeing that one can make sense of thermodynamics in the regime where we only have a small number of particles interacting with a heat bath, or where we have highly correlated systems and wish to make non-statistical statements about them. Is there a second law of thermodynamics in this regime? Here, we find that for processes which are cyclic or very close to cyclic, the second law for microscopic or highly correlated systems takes on a very different form than it does at the macroscopic scale, imposing not just one constraint on what state transformations are possible, but an entire family of constraints. In particular, we find that the quantum Rényi relative entropy distances to the equilibrium state can never increase. We further find that there are three regimes determining which family of second laws governs state transitions, depending on how cyclic the process is. In one regime one can cause an apparent violation of the usual second law, through a process of embezzling work from a large system which remains arbitrarily close to its original state.

:: Fri 4/25/14, 1:30 PM in 6C-442 Isaac Kim (Perimeter Institute) For a general multipartite quantum state, we formulate a locally checkable condition, under which the expectation values of certain nonlocal observables are completely determined by the expectation values of some local observables.
The condition is satisfied by ground states of gapped quantum many-body systems in one and two spatial dimensions, assuming a widely conjectured form of the area law is correct. Its implications for quantum state tomography, quantum state verification, and quantum error-correcting codes are discussed.

:: Sergei Bravyi (IBM) Quantum codes with low-weight stabilizers, known as LDPC codes, have been actively studied recently due to their potential applications in fault-tolerant quantum computing. However, all families of quantum LDPC codes known to date suffer from a poor distance scaling, limited by the square root of the code length. This is in sharp contrast with the classical case, where good families of LDPC codes are known that combine constant encoding rate and linear distance. In this talk I will describe the first family of good quantum codes with low-weight stabilizers. The new codes have a constant encoding rate, linear distance, and stabilizers acting on at most n^{1/2} qubits, where n is the code length. For comparison, all previously known families of good quantum codes have stabilizers of linear weight. The proof combines two techniques: randomized constructions of good quantum codes and the homological product operation from algebraic topology. We conjecture that similar methods can produce good stabilizer codes with stabilizer weight n^a for any a>0. Finally, we apply the homological product to construct new small codes with low-weight stabilizers. This is joint work with Matthew Hastings. Preprint: arXiv:1311.0885

:: Fri 5/9/14, 1:30 PM in 6C-442 Andris Ambainis (University of Latvia) The complexity class QMA (the quantum counterpart of NP) allows us to understand the complexity of many computational problems in quantum physics, but some natural problems appear to be slightly harder than QMA.
We introduce new complexity classes consisting of problems that are solvable with a small number of queries to a QMA oracle, and we use these complexity classes to quantify the complexity of several natural computational problems (for example, the complexity of estimating the spectral gap of a Hamiltonian).

:: Thu 5/15/14, 11 AM in 6C-442 Umesh Vazirani (UC Berkeley) The well-known heuristic DMRG (Density Matrix Renormalization Group) has been the method of choice for the practical solution of 1D systems since its introduction two decades ago by Steve White. However, the reasons for DMRG's success are not theoretically well understood, and it is known to fail in certain cases. In this talk, I will describe the first polynomial time classical algorithm for finding ground states of 1D quantum systems described by gapped local Hamiltonians. The algorithm is based on a framework that combines recently discovered structural features of gapped 1D systems, convex programming, and a new and efficient construction of a class of operators called approximate ground state projections (AGSP). An AGSP-centric approach may help guide the search for algorithms for more general quantum systems, including for the central challenge of 2D systems, where even heuristic methods have had very limited success. Joint work with Zeph Landau and Thomas Vidick.

:: Fri 5/16/14, 1:30 PM in 6C-442 Robin Kothari (University of Waterloo) We provide a quantum algorithm for simulating the dynamics of sparse Hamiltonians with complexity sublogarithmic in the inverse error, an exponential improvement over previous methods. Unlike previous approaches based on product formulas, the query complexity is independent of the number of qubits acted on, and for time-varying Hamiltonians, the gate complexity is logarithmic in the norm of the derivative of the Hamiltonian.
Our algorithm is based on a significantly improved simulation of the continuous- and fractional-query models using discrete quantum queries, showing that the former models are not much more powerful than the discrete model even for very small error. We also significantly simplify the analysis of this conversion, avoiding the need for a complex fault correction procedure. Our simplification relies on a new form of "oblivious amplitude amplification" that can be applied even though the reflection about the input state is unavailable. Finally, we prove new lower bounds showing that our algorithms are optimal as a function of the error. This is joint work with Dominic W. Berry, Andrew M. Childs, Richard Cleve, and Rolando D. Somma. Available at http://arxiv.org/abs/1312.1414

:: Tue 5/20/14, 12 PM in 4-331 Claudio Chamon (Boston University)

:: Nicole Yunger Halpern (Caltech) How can thermodynamics, which involves vast numbers of particles, describe cutting-edge technology, which involves tiny scales?
Heat exchanges have been modeled with the resource-theory framework famous for elucidating entanglement. “Helmholtz” resource theories have shown that the work cost W_cost of creating one copy of a state can differ from the work W_dist distillable from the state. This “one-shot” result contrasts with the thermodynamic limit, in which W_cost = W_dist. I will generalize the resource-theory framework from heat exchanges to more general thermodynamic interactions. I will introduce resource theories of nonequilibrium, a family of models that includes the Helmholtz theories. In addition to shedding new light on Helmholtz theories, the nonequilibrium framework expands the resource theories’ potential to describe experiments and quantum phenomena. I will quantify the distinction between W_cost and W_dist in terms of the hypothesis-testing entropy D_H. This research is motivated by the interplay between information and energy on small scales, exemplified by single-molecule experiments, nanoscale engines, and molecular motors. This work was conducted partially (arXiv:1309.6586) at the Perimeter Institute for Theoretical Physics with Markus Müller and Rob Spekkens, as well as with Gilad Gour and Varun Narasimhachar. Part of this work is being conducted with Joe Renes.

:: Fri 5/23/14, 1:30 PM in 6C-442 Robin Blume-Kohout (Sandia National Labs) A gauge symmetry appears when we can transform the parameters of a theory yet leave every observable property invariant. This means that the theory contains redundant parameters. The theory of quantum information processing (QIP), in which elementary operations are completely positive maps and any viable algorithm is described by a quantum circuit that specifies a sequence of those operations, has a gauge. For "forward" QIP this is no big deal; it just provides multiple equivalent ways to describe the same physical device. But for "inverse" QIP, i.e., tomography and characterization, this is surprisingly inconvenient!
I will discuss our recent progress on device characterization via gate-set tomography, chronicle our ongoing struggles with the gauge, and plead for your assistance in taming it.
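The gauge described in the last talk can be made concrete in a toy linear-algebra model: conjugate every gate by an invertible matrix B, transform the state and measurement to match, and every circuit outcome is numerically unchanged, so no experiment can distinguish the two parameterizations. The sketch below uses arbitrary random matrices as stand-ins, not a real gate set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "gate set": a state (column vector), a measurement (row vector),
# and two gates, all in a d-dimensional linear representation.
d = 4
rho = rng.normal(size=(d, 1))
E = rng.normal(size=(1, d))
G1 = rng.normal(size=(d, d))
G2 = rng.normal(size=(d, d))

def outcome(E, gates, rho):
    """Circuit outcome: apply the gates in order, then the measurement."""
    v = rho
    for G in gates:
        v = G @ v
    return float(E @ v)

# Gauge transformation: any invertible B relabels the whole gate set
# (state, gates, and measurement together) without changing any
# observable number.
B = rng.normal(size=(d, d))
Binv = np.linalg.inv(B)
rho_g = B @ rho
E_g = E @ Binv
G1_g = B @ G1 @ Binv
G2_g = B @ G2 @ Binv

original = outcome(E, [G1, G2, G1], rho)
gauged = outcome(E_g, [G1_g, G2_g, G1_g], rho_g)
# original and gauged agree up to floating-point error: the gauge is
# invisible to any circuit, which is exactly why tomography cannot fix it.
```

This is why "inverse" QIP is hard: the data constrain the gate set only up to the B-orbit, and picking a representative is a nontrivial extra step.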