We construct a family of warped AdS_5 compactifications of IIB supergravity that are the holographic duals of the complete set of N=1 fixed points of a Z_2 quiver gauge theory. This family interpolates between the T^{1,1} compactification with no three-form flux and the Z_2 orbifold of the Pilch-Warner geometry, which contains three-form flux. The family of solutions is constructed by making the most general Ansatz allowed by the symmetries of the field theory. We use Killing spinor methods because the symmetries impose two simple projection conditions on the Killing spinors, which greatly reduce the problem. We find that the generic interpolating solution has a nontrivial dilaton on the internal five-manifold. We calculate the central charge of the gauge theories from the supergravity backgrounds and find that it is 27/32 of that of the parent N=2 quiver gauge theory. We believe that the projection conditions derived here will be useful for a much larger class of N=1 holographic RG flows.
We initiate the development of a new language and theory for quantum music, which we refer to as Quantum Concept Music (QCM). This music formalism is based on Categorical Quantum Mechanics (CQM) and, more specifically, on its diagrammatic incarnation Quantum Picturalism (QPict), which is heavily based on the ZX-calculus; indeed, QCM is naturally inherited from CQM/QPict. At its heart is the explicit notational representation of relations that exist within and between the key concepts of music composition, performance, and automation. QCM also enables one to translate quantum phenomena directly into music compositions in a manner that is at once intuitive, rigorous, and mechanical. Following this pattern, we propose a score for musicians interacting like a Bell pair under measurement, and outline examples of how it could be performed live. While Western classical music notation has relied heavily on linear representations of music - which do not always adequately capture its nature - our approach is distinguished by highlighting the fundamental relational dimension of music. In addition, this quantum-based technique not only shapes the music at the profound level of composition, but also has a direct impact on live performance, and provides a new template for automating music, e.g.~in the context of AI generation. Altogether, we initiate the creation of a new music formalism that is powerful and efficient in capturing the interactive nature of music, both in terms of internal and external interactions, and that goes beyond the boundaries of Western classical music notation, allowing its use in many different genres and directions.
To deepen our understanding of Quantum Gravity and its connections with black holes and cosmology, building a common language and exchanging ideas across different approaches is crucial. The Nordita Program "Quantum Gravity: from gravitational effective field theories to ultraviolet complete approaches" created a platform for extensive discussions, aimed at pinpointing both common grounds and sources of disagreements, with the hope of generating ideas and driving progress in the field. This contribution summarizes the twelve topical discussions held during the program and collects individual thoughts of speakers and panelists on the future of the field in light of these discussions.
The Gottesman-Knill theorem says that a stabilizer circuit -- that is, a quantum circuit consisting solely of CNOT, Hadamard, and phase gates -- can be simulated efficiently on a classical computer. This paper improves that theorem in several directions. First, by removing the need for Gaussian elimination, we make the simulation algorithm much faster at the cost of a factor-2 increase in the number of bits needed to represent a state. We have implemented the improved algorithm in a freely-available program called CHP (CNOT-Hadamard-Phase), which can handle thousands of qubits easily. Second, we show that the problem of simulating stabilizer circuits is complete for the classical complexity class ParityL, which means that stabilizer circuits are probably not even universal for classical computation. Third, we give efficient algorithms for computing the inner product between two stabilizer states, putting any n-qubit stabilizer circuit into a "canonical form" that requires at most O(n^2/log n) gates, and other useful tasks. Fourth, we extend our simulation algorithm to circuits acting on mixed states, circuits containing a limited number of non-stabilizer gates, and circuits acting on general tensor-product initial states but containing only a limited number of measurements.
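The tableau representation behind such simulators fits in a few lines. The sketch below is a minimal illustration of the standard stabilizer update rules for CNOT, Hadamard, and phase gates; it tracks only the n stabilizer generators and omits the destabilizers and the measurement routine, so it is an assumption-laden toy rather than the CHP program itself:

```python
# Minimal stabilizer-generator simulator: each generator is one row of
# X bits, Z bits, and a sign bit (1 means overall phase -1).
class Tableau:
    def __init__(self, n):
        self.n = n
        self.x = [[0] * n for _ in range(n)]                          # X parts
        self.z = [[int(i == j) for j in range(n)] for i in range(n)]  # Z parts
        self.r = [0] * n                                              # signs
        # Initial state |0...0>, stabilized by Z on each qubit.

    def hadamard(self, a):
        for i in range(self.n):
            self.r[i] ^= self.x[i][a] & self.z[i][a]   # Y picks up a sign
            self.x[i][a], self.z[i][a] = self.z[i][a], self.x[i][a]

    def phase(self, a):
        for i in range(self.n):
            self.r[i] ^= self.x[i][a] & self.z[i][a]
            self.z[i][a] ^= self.x[i][a]               # X -> Y, Y -> -X

    def cnot(self, a, b):
        for i in range(self.n):
            self.r[i] ^= (self.x[i][a] & self.z[i][b]
                          & (self.x[i][b] ^ self.z[i][a] ^ 1))
            self.x[i][b] ^= self.x[i][a]               # X_a -> X_a X_b
            self.z[i][a] ^= self.z[i][b]               # Z_b -> Z_a Z_b

    def generators(self):
        return [("-" if self.r[i] else "+")
                + "".join("IXZY"[self.x[i][j] + 2 * self.z[i][j]]
                          for j in range(self.n))
                for i in range(self.n)]

t = Tableau(2)          # |00>, stabilized by +ZI and +IZ
t.hadamard(0)
t.cnot(0, 1)            # prepare a Bell state
print(t.generators())   # -> ['+XX', '+ZZ']
```

Because each gate touches only two columns of the tableau, a single gate costs O(n) bit operations, which is the source of the speedup over recomputing a 2^n-dimensional state vector.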
We present a higher-categorical generalization of the "Karoubi envelope" construction from ordinary category theory, and prove that, like the ordinary Karoubi envelope, our higher Karoubi envelope is the closure for absolute limits. Our construction replaces the idempotents in the ordinary version with a notion that we call "condensations." The name is justified by the direct physical interpretation of the notion of condensation: it encodes a general class of constructions which produce a new topological phase of matter by turning on a commuting projector Hamiltonian on a lattice of defects within a different topological phase, which may be the trivial phase. We also identify our higher Karoubi envelopes with categories of fully-dualizable objects. Together with the Cobordism Hypothesis, we argue that this realizes an equivalence between a very broad class of gapped topological phases of matter and fully extended topological field theories, in any number of dimensions.
Researchers discovered 2017 OF201, a new dwarf planet candidate with an estimated diameter of 700 km and one of the most extreme orbits known for a Trans-Neptunian Object, possessing a semi-major axis of 838 AU. This object's unique orbital parameters, particularly its longitude of perihelion and simulated instability with a proposed Planet X, provide new constraints on models of outer solar system formation and challenge the Planet X hypothesis.
Global Categorical Symmetries are a powerful new tool for analyzing quantum field theories. This volume compiles lecture notes from the 2022 and 2023 summer schools on Global Categorical Symmetries, held at the Perimeter Institute for Theoretical Physics and at the SwissMAP Research Station in Les Diablerets. Specifically, this volume collects the lectures: * An introduction to symmetries in quantum field theory, Kantaro Ohmori * Introduction to anomalies in quantum field theory, Clay Córdova * Symmetry Categories 101, Michele Del Zotto * Applied Cobordism Hypothesis, David Jordan * Finite symmetry in QFT, Daniel S. Freed These volumes are devoted to interested newcomers: we only assume (basic) knowledge of quantum field theory (QFT) and some relevant maths. We try to give appropriate references for non-standard materials that are not covered. Our aim in this first volume is to illustrate some of the main questions and ideas together with some of the methods and the techniques necessary to begin exploring global categorical symmetries of QFTs.
This review summarizes Effective Field Theory techniques, which are the modern theoretical tools for exploiting the existence of hierarchies of scale in a physical problem. The general theoretical framework is described, and explicitly evaluated for a simple model. Power-counting results are illustrated for a few cases of practical interest, and several applications to Quantum Electrodynamics are described.
Many of the most fundamental observables -- position, momentum, phase-point, and spin-direction -- cannot be measured by an instrument that obeys the orthogonal projection postulate. Continuous-in-time measurements provide the missing theoretical framework to make sense of such observables. The elements of the time-dependent instrument define a group called the \emph{instrumental group} (IG). Relative to the IG, all of the time-dependence is contained in a certain function called the \emph{Kraus-operator density} (KOD), which evolves according to a classical Kolmogorov equation. Unlike the Lindblad master equation, the KOD Kolmogorov equation is a direct expression of how the elements of the instrument (not just the total channel) evolve. Shifting from continuous measurement to sequential measurements more generally, the structure of combining instruments in sequence is shown to correspond to the convolution of their KODs. This convolution promotes the IG to an \emph{involutive Banach algebra} (a structure that goes all the way back to the origins of POVM and C*-algebra theory) which will be called the \emph{instrumental group algebra} (IGA). The IGA is the true home of the KOD, similar to how the dual of a von Neumann algebra is the home of the density operator. Operators on the IGA, which play the same role for KODs as superoperators play for density operators, are called \emph{ultraoperators} and various examples are discussed. Certain ultraoperator-superoperator intertwining relations are considered, including the relation between the KOD Kolmogorov equation and the Lindblad master equation. The IGA is also shown to actually have two involutions: one respected by the convolution ultraoperators and the other by the quantum channel superoperators. Finally, the KOD Kolmogorov generators are derived for jump processes and more general diffusive processes.
In a novel application of the tools of topological data analysis (TDA) to nonperturbative quantum gravity, we introduce a new class of observables that allows us to assess whether quantum spacetime really resembles a ``quantum foam'' near the Planck scale. The key idea is to investigate the Betti numbers of coarse-grained path integral histories, regularized in terms of dynamical triangulations, as a function of the coarse-graining scale. In two dimensions our analysis exhibits the well-known fractal structure of Euclidean quantum gravity.
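For readers unfamiliar with Betti numbers, they can be computed from the ranks of boundary matrices. The toy sketch below (our own illustration over GF(2), not the paper's dynamical-triangulations pipeline) recovers b_0 = b_1 = 1 for a hollow triangle, the minimal discretized circle:

```python
# Betti numbers over GF(2): b_k = dim C_k - rank d_k - rank d_{k+1},
# where d_k is the boundary matrix sending k-chains to (k-1)-chains.
def rank_gf2(matrix):
    """Rank of a 0/1 matrix over GF(2) by Gaussian elimination."""
    rows = [r[:] for r in matrix]
    rank = 0
    for col in range(len(rows[0]) if rows else 0):
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

# Hollow triangle: vertices 0,1,2 and edges (01), (12), (02); no 2-cells.
# One column of d1 per edge, one row per vertex.
d1 = [[1, 0, 1],   # vertex 0 lies on edges (01) and (02)
      [1, 1, 0],   # vertex 1 lies on edges (01) and (12)
      [0, 1, 1]]   # vertex 2 lies on edges (12) and (02)
b0 = 3 - rank_gf2(d1)        # dim C_0 - rank d_1   (d_0 = 0)
b1 = 3 - rank_gf2(d1) - 0    # dim C_1 - rank d_1 - rank d_2 (no 2-cells)
print(b0, b1)                # -> 1 1: one connected component, one loop
```

Tracking how such numbers change as the complex is coarse-grained is the basic move behind scale-dependent topological observables.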
The thermal and kinematic Sunyaev-Zel'dovich effects (tSZ, kSZ) probe the thermodynamic properties of the circumgalactic and intracluster medium (CGM and ICM) of galaxies, groups, and clusters, since they are proportional, respectively, to the integrated electron pressure and momentum along the line-of-sight. We present constraints on the gas thermodynamics of CMASS galaxies in the Baryon Oscillation Spectroscopic Survey (BOSS) using new measurements of the kSZ and tSZ signals obtained in a companion paper. Combining kSZ and tSZ measurements, we measure within our model the amplitude of energy injection \epsilon M_\star c^2, where M_\star is the stellar mass, to be \epsilon = (40 \pm 9) \times 10^{-6}, and the amplitude of the non-thermal pressure profile to be \alpha_{\rm Nth} < 0.2 (2\sigma), indicating that less than 20% of the total pressure within the virial radius is due to a non-thermal component. We estimate the effects of including baryons in the modeling of weak-lensing galaxy cross-correlation measurements using the best-fit density profile from the kSZ measurement. Our estimate reduces the difference between the original theoretical model and the weak-lensing galaxy cross-correlation measurements in arXiv:1611.08606 by half but does not fully reconcile it. Comparing the tSZ measurements to cosmological simulations, we find that simulations underestimate the CGM pressure at large radii while they fare better in comparison with the kSZ measurements. This suggests that the energy injected via feedback models in the simulations that we compared against does not sufficiently heat the gas at these radii. We do not find significant disagreement at smaller radii. These measurements provide novel tests of current and future simulations. This work demonstrates the power of joint, high signal-to-noise kSZ and tSZ observations, upon which future cross-correlation studies will improve.
What is the minimum number of extra qubits needed to perform a large fault-tolerant quantum circuit? Working in a common model of fault-tolerance, I show that in the asymptotic limit of large circuits, the ratio of physical qubits to logical qubits can be a constant. The construction makes use of quantum low-density parity check codes, and the asymptotic overhead of the protocol is equal to that of the family of quantum error-correcting codes underlying the fault-tolerant protocol.
We comment on the recently introduced Gauss-Bonnet gravity in four dimensions. We argue that it does not make sense to consider this theory to be defined by a set of D \to 4 solutions of the higher-dimensional Gauss-Bonnet gravity. We show that a well-defined D \to 4 limit of Gauss-Bonnet gravity is obtained by generalizing a method employed by Mann and Ross to obtain a limit of Einstein gravity in D = 2 dimensions. This is a scalar-tensor theory of the Horndeski type obtained by a dimensional reduction method. By considering simple spacetimes beyond spherical symmetry (Taub-NUT spaces), we show that the naive limit of the higher-dimensional theory to four dimensions is not well defined, and we contrast the resultant metrics with the actual solutions of the new theory.
We propose a renormalization group flow equation for a functional that generates S-matrix elements, bearing similarities to the well-known Wetterich and Polchinski equations. While the latter respectively involve the effective action and the Schwinger functional, which are genuine off-shell objects, the presented flow equation has the advantage of working more directly with observables, i.e. scattering amplitudes. Compared to the Wetterich equation, our flow equation also greatly simplifies the notion of going on-shell, in the sense of satisfying the quantum equations of motion. In addition, unlike the Wetterich equation, it is polynomial and does not require a Hessian inversion. The approach is a promising direction for non-perturbative quantum field theories, allowing one to work more directly with scattering amplitudes.
We discuss the physical interpretation of the gravity-mediated entanglement effect. We show how to read it in terms of quantum reference systems. We pinpoint the single gravitational degree of freedom mediating the entanglement. We clarify why the distinction between longitudinal and transverse degrees of freedom is irrelevant for the interpretation of the results. We discuss the relation between the LOCC theorem and the interpretation of the effect, its different relevance for, respectively, the quantum gravity and quantum information communities, and the reason for the excitement raised by the prospect of detection.
Quantum teleportation of qudits is revisited. In particular, we analyze the case where the quantum channel corresponds to a non-maximally entangled state and show that the success of the protocol is directly related to the problem of distinguishing non-orthogonal quantum states. The teleportation channel can be seen as a coherent superposition of two channels, one of them a maximally entangled state, thus leading to perfect teleportation, and the other a non-maximally entangled state living in a subspace of the d-dimensional Hilbert space. The second channel leads to a teleported state with reduced fidelity. We calculate the average fidelity of the process and show its optimality.
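For the simplest case of qubits (d = 2), the fidelity reduction can be checked numerically. The sketch below is our own illustration, not the paper's qudit analysis: it runs the standard teleportation protocol through the partially entangled channel cos(theta)|00> + sin(theta)|11> and sums the outcome-weighted fidelity over the four Bell measurement results:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def teleport_fidelity(psi, theta):
    """Average fidelity (over Bell outcomes, weighted by probability) of
    standard teleportation of |psi> through cos(t)|00> + sin(t)|11>."""
    chan = np.array([[np.cos(theta), 0], [0, np.sin(theta)]], dtype=complex)
    full = np.einsum('a,bc->abc', psi, chan)    # qubits: input, Alice, Bob
    s = 1 / np.sqrt(2)
    bells = [np.array([[s, 0], [0,  s]]), np.array([[s, 0], [0, -s]]),
             np.array([[0, s], [s,  0]]), np.array([[0, s], [-s, 0]])]
    corrs = [I2, Z, X, Z @ X]                   # Bob's fix per Bell outcome
    fid = 0.0
    for B, U in zip(bells, corrs):
        bob = np.einsum('ab,abk->k', B.conj(), full)  # unnormalized state
        fid += abs(np.vdot(psi, U @ bob)) ** 2        # = prob * fidelity
    return fid

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(teleport_fidelity(plus, np.pi / 4))  # maximally entangled: exactly 1
print(teleport_fidelity(plus, np.pi / 6))  # reduced: (cos t + sin t)^2 / 2
```

For this input state the fidelity works out to (cos theta + sin theta)^2 / 2, which equals 1 only at theta = pi/4, the maximally entangled channel; basis states |0> and |1> teleport perfectly for any theta, which is why the degradation shows up only for superpositions.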
The educational value of a fully diagrammatic approach in a scientific field has never been explored. We present Quantum Picturalism (QPic), an entirely diagrammatic formalism for all of qubit quantum mechanics. This framework is particularly advantageous for young learners, offering a novel way to teach key concepts such as entanglement, measurement, and mixed-state quantum mechanics in a math-intensive subject. It eliminates traditional obstacles without compromising mathematical correctness, removing the need for matrices, vectors, tensors, complex numbers, and trigonometry as prerequisites to learning. Its significance lies in the fact that a field as complex as Quantum Information Science and Technology (QIST), for which educational opportunities are typically exclusive to the university level and higher, can be introduced at the high school level. In this study, we tested this hypothesis, examining whether QPic reduces cognitive load by lowering complex mathematical barriers while enhancing mental computation and conceptual understanding. The data was collected from an experiment conducted in 2023, in which 54 high school students (aged 16-18) underwent 16 hours of training spread over eight weeks. The post-assessments illustrated promising outcomes in all three specific areas of focus: (1) whether QPic can alleviate technical barriers in learning QIST, (2) whether the content and teaching method are age-appropriate, and (3) whether QPic increases confidence and motivation in science and STEM fields. There was a notable success rate in terms of teaching outcomes, with 82% of participants successfully passing an end-of-training exam and 48% achieving a distinction, indicating a high level of performance. The unique testing and training regime effectively reduced the technical barriers typically associated with traditional approaches, as hypothesized.
Quantum theory is a probabilistic theory with fixed causal structure. General relativity is a deterministic theory but where the causal structure is dynamic. It is reasonable to expect that quantum gravity will be a probabilistic theory with dynamic causal structure. The purpose of this paper is to present a framework for such a probability calculus. We define an operational notion of space-time, this being composed of elementary regions. Central to this formalism is an object we call the causaloid. This object captures information about causal structure implicit in the data by quantifying the way in which the number of measurements required to establish a state for a composite region is reduced when there is a causal connection between the component regions. This formalism puts all elementary regions on an equal footing. It does not require that we impose fixed causal structure. In particular, it is not necessary to assume the existence of a background time. Remarkably, given the causaloid, we can calculate all relevant probabilities and so the causaloid is sufficient to specify the predictive aspect of a physical theory. We show how certain causaloids can be represented by suggestive diagrams and we show how to represent both classical probability theory and quantum theory by a causaloid. We do not give a causaloid formulation for general relativity though we speculate that this is possible. The work presented here suggests a research program aimed at finding a theory of quantum gravity. The idea is to use the causaloid formalism along with principles taken from the two theories to marry the dynamic causal structure of general relativity with the probabilistic structure of quantum theory.
We show how to correctly account for scalar accretion onto black holes in scalar field models of dark energy by a consistent expansion in terms of a slow roll parameter. We find an exact solution for the scalar field within our Hubble volume which is regular on both black hole and cosmological event horizons, and compute the back reaction of the scalar on the black hole, calculating the resulting expansion of the black hole. Our results are independent of the relative size of black hole and cosmological event horizons. We comment on the implications for more general black hole accretion, and the no hair theorems.
Are you, with your perceptions, memories and observational data, a Boltzmann brain, i.e., a statistical fluctuation out of the thermal equilibrium of the universe? Arguments are given in the literature for and against taking this hypothesis seriously. Complicating these analyses have been the many subtle - and very often implicit - entanglements between related arguments that have been given for the past hypothesis, the second law, and even Bayesian inference of the reliability of experimental data. These entanglements can easily lead to circular reasoning. To help disentangle those arguments, since almost all of them involve Boltzmann's H theorem, we begin by formalizing the H theorem as a time-symmetric, time-translation invariant Markov process over the entropy values of the universe. Crucially, this process does not specify the time(s) on which we should condition it in order to infer the stochastic dynamics of our universe's entropy. Any such choice of conditioning events must be introduced as an independent assumption. This observation allows us to disentangle the standard Boltzmann brain hypothesis, its "1000CE" variant, the past hypothesis, the second law, and the reliability of our experimental data, all in a fully formal manner. In particular, we show that they all adopt the H theorem's stipulation that the universe's entropy evolves as a Markov process, and all make an arbitrary assumption that the process should be conditioned on a single moment in time. Their only difference is what single time to condition on. In this respect, the Boltzmann brain hypothesis and the second law are equally legitimate (or not).
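The role of single-time conditioning can be made concrete with a toy model (our own construction, not the paper's formalization): a reversible birth-death chain over discrete entropy levels whose stationary distribution favors high entropy. Because the chain satisfies detailed balance, its time-reversal has the same law, so conditioning on a low entropy at one moment makes the expected entropy rise both toward the future and toward the past of that moment:

```python
import math
import random

# Entropy levels 0..N. Metropolis walk targeting pi(s) ~ exp(s): propose
# s +/- 1, always accept an increase, accept a decrease with prob exp(-1),
# reflect at the boundaries. Detailed balance => forward and time-reversed
# dynamics coincide, which is the time symmetry used in the argument.
N = 20
ACCEPT_DOWN = math.exp(-1)

def step(s, rng):
    prop = s + (1 if rng.random() < 0.5 else -1)
    if prop < 0 or prop > N:
        return s                          # reflecting boundary
    if prop > s or rng.random() < ACCEPT_DOWN:
        return prop
    return s

def mean_entropy_after(k, s0=0, trials=20000, seed=1):
    """Expected entropy k steps away from a single conditioning time at
    which the entropy is fixed to s0 (by reversibility, the same value
    holds k steps before that time)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        s = s0
        for _ in range(k):
            s = step(s, rng)
        total += s
    return total / trials

# Conditioning only on S(t0) = 0, expected entropy grows with |t - t0|.
print(mean_entropy_after(1), mean_entropy_after(5))
```

The point of the toy is that the upward drift on both sides of t0 is forced by the conditioning choice, not by any arrow of time built into the dynamics; choosing a different single conditioning time yields the past hypothesis or Boltzmann-brain pictures instead.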