Wolfram Research
As first discovered by Choptuik, the black hole threshold in the space of initial data for general relativity shows both surprising structure and surprising simplicity. Universality, power-law scaling of the black hole mass, and scale echoing have given rise to the term ``critical phenomena''. They are explained by the existence of exact solutions which are attractors within the black hole threshold, that is, attractors of codimension one in phase space, and which are typically self-similar. Critical phenomena give a natural route from smooth initial data to arbitrarily large curvatures visible from infinity, and are therefore likely to be relevant for cosmic censorship, quantum gravity, astrophysics, and our general understanding of the dynamics of general relativity. Major additions since the 2010 version of this review are numerical simulations beyond spherical symmetry, in particular of vacuum critical collapse, and new sections on mathematical results in PDE blowup (as a toy model for singularity formation) and on naked singularity formation in GR.
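For orientation, the mass scaling and scale echoing referred to above are conventionally summarized as follows; this is a standard textbook form, with the well-known spherically symmetric massless-scalar-field values quoted purely for illustration. Here $p$ parametrizes a one-parameter family of initial data with threshold $p_\ast$, and $Z_\ast$ denotes the critical solution in logarithmic scale coordinates $(\tau, x)$:

```latex
% Standard critical-collapse relations (massless scalar field values shown
% only for illustration; see the review for precise definitions and caveats).
\begin{align}
  M &\simeq C\,(p - p_\ast)^{\gamma}, \qquad \gamma \simeq 0.374,
  && \text{(power-law mass scaling)} \\
  Z_\ast(\tau + \Delta, x) &= Z_\ast(\tau, x), \qquad \Delta \simeq 3.44,
  && \text{(discrete self-similarity / scale echoing)}
\end{align}
```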
Both metamathematics and physics are posited to emerge from samplings by observers of the unique ruliad structure that corresponds to the entangled limit of all possible computations. The possibility of higher-level mathematics accessible to humans is posited to be the analog for mathematical observers of the perception of physical space for physical observers. A physicalized analysis is given of the bulk limit of traditional axiomatic approaches to the foundations of mathematics, together with explicit empirical metamathematics of some examples of formalized mathematics. General physicalized laws of mathematics are discussed, associated with concepts such as metamathematical motion, inevitable dualities, proof topology and metamathematical singularities. It is argued that mathematics as currently practiced can be viewed as derived from the ruliad in a direct Platonic fashion analogous to our experience of the physical world, and that axiomatic formulation, while often convenient, does not capture the ultimate character of mathematics. Among the implications of this view is that only certain collections of axioms may be consistent with inevitable features of human mathematical observers. A discussion is included of historical and philosophical connections, as well as of foundational implications for the future of mathematics.
Some contemporary views of the universe assume information and computation to be key to understanding and explaining the basic structure underpinning physical reality. We introduce the Computable Universe, exploring some of the basic arguments that give foundation to these views. We focus on the algorithmic and quantum aspects, and on how these may fit with and support the computable universe hypothesis.
We propose the characterization of binary cellular automata using a set of behavioral metrics that are applied to the minimal Boolean form of a cellular automaton's transition function. These behavioral metrics are formulated to satisfy heuristic criteria derived from elementary cellular automata. Behaviors characterized through these metrics are growth, decrease, chaoticity, and stability. From these metrics, two measures of global behavior are calculated: 1) a static measure that considers all possible input patterns and counts the occurrence of the proposed metrics in the truth table of the minimal Boolean form of the automaton; 2) a dynamic measure, corresponding to the mean of the behavioral metrics in $n$ executions of the automaton, starting from $n$ random initial states. We use these measures to characterize a cellular automaton and guide a genetic search algorithm, which selects cellular automata similar to the Game of Life. Using this method, we found an extensive set of complex binary cellular automata with interesting properties, including self-replication.
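As a rough illustration of the dynamic measure described above, the sketch below averages two simple behavioral proxies (net density growth and cell-flip activity) over random initial states of a binary 2D automaton, using the Game of Life as the update rule. The proxy metrics here are simplified stand-ins, not the paper's minimal-Boolean-form metrics.

```python
import numpy as np

def life_step(grid):
    """One synchronous update of Conway's Game of Life (toroidal boundary)."""
    n = sum(np.roll(np.roll(grid, i, 0), j, 1)
            for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)

def dynamic_measure(step, size=64, runs=20, steps=200, density=0.35, seed=0):
    """Average simple behavioral proxies (growth and activity) over `runs`
    random initial states -- an illustration of the dynamic-measure idea,
    not the paper's exact metric definitions."""
    rng = np.random.default_rng(seed)
    growth, activity = [], []
    for _ in range(runs):
        g = (rng.random((size, size)) < density).astype(np.uint8)
        start = g.mean()
        flips = 0.0
        for _ in range(steps):
            nxt = step(g)
            flips += np.mean(nxt != g)
            g = nxt
        growth.append(g.mean() - start)      # net change in live-cell density
        activity.append(flips / steps)       # mean fraction of cells flipping
    return {"growth": float(np.mean(growth)),
            "activity": float(np.mean(activity))}

print(dynamic_measure(life_step))
```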
We examine a family of BPS solutions of ten-dimensional type IIB supergravity. These solutions asymptotically approach $AdS_5 \times S^5$ and carry internal `angular' momentum on the five-sphere. While a naked singularity appears at the center of the anti-de Sitter space, we show that it has a natural physical interpretation in terms of a collection of giant gravitons. We calculate the distribution of giant gravitons from the dipole field induced in the Ramond-Ramond five-form, and show that these sources account for the entire internal momentum carried by the BPS solutions.
How does one formalize the structure of structures necessary for the foundations of physics? This work is an attempt at conceptualizing the metaphysics of pregeometric structures, upon which new and existing notions of quantum geometry may find a foundation. We discuss the philosophy of pregeometric structures due to Wheeler and Leibniz, as well as modern manifestations in topos theory. We draw attention to evidence suggesting that the framework of formal language, in particular homotopy type theory, provides the conceptual building blocks for a theory of pregeometry. This work is largely a synthesis of ideas that serve as a precursor for conceptualizing the notion of space in physical theories. In particular, the approach we espouse is based on a constructivist philosophy, wherein ``structureless structures'' are syntactic types realizing formal proofs and programs. Spaces and algebras relevant to physical theories are modeled as type-theoretic routines constructed from compositional rules of a formal language. This offers the remarkable possibility of taxonomizing distinct notions of geometry using a common theoretical framework. In particular, this perspective addresses the crucial issue of how spatiality may be realized in models that link formal computation to physics, such as the Wolfram model.
How does one generalize differential geometric constructs such as curvature of a manifold to the discrete world of graphs and other combinatorial structures? This problem carries significant importance for analyzing models of discrete spacetime in quantum gravity, inferring network geometry in network science, and manifold learning in data science. The key contribution of this paper is to introduce and validate a new estimator of discrete sectional curvature for random graphs with low metric distortion. The latter are constructed via a specific graph sprinkling method on different manifolds with constant sectional curvature. We define a notion of metric distortion, which quantifies how well the graph metric approximates the metric of the underlying manifold. We show how graph sprinkling algorithms can be refined to produce hard annulus random geometric graphs with minimal metric distortion. We construct random geometric graphs for spheres, hyperbolic planes, and Euclidean planes, upon which we validate our curvature estimator. Numerical analysis reveals that the error of the estimated curvature diminishes as the mean metric distortion goes to zero, thus demonstrating convergence of the estimate. We also perform comparisons to other existing discrete curvature measures. Finally, we demonstrate two practical applications: (i) estimation of the Earth's radius using geographical data; and (ii) sectional curvature distributions of self-similar fractals.
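The sketch below illustrates the sprinkling-plus-distortion idea in its simplest form, assuming a plain threshold random geometric graph on the unit sphere rather than the paper's hard-annulus construction; the distortion quantity used here (relative error between hop-scaled and geodesic distances) is a stand-in for the paper's definition.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

def sprinkle_sphere(n, rng):
    """Uniform points on the unit sphere S^2 (sectional curvature +1)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def geodesic_dist(p):
    """Pairwise great-circle distances on the unit sphere."""
    return np.arccos(np.clip(p @ p.T, -1.0, 1.0))

def metric_distortion(n=800, eps=0.3, seed=1):
    """Build a geodesic threshold RGG and compare eps-scaled hop distance
    with the true geodesic distance over well-separated, connected pairs."""
    rng = np.random.default_rng(seed)
    pts = sprinkle_sphere(n, rng)
    d = geodesic_dist(pts)
    adj = csr_matrix((d < eps) & (d > 0))          # connect points closer than eps
    hops = shortest_path(adj, unweighted=True)     # graph (hop-count) metric
    mask = np.isfinite(hops) & (d > 2 * eps)
    rel_err = np.abs(eps * hops[mask] - d[mask]) / d[mask]
    return float(rel_err.mean())

print("mean metric distortion (toy definition):", metric_distortion())
```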
Theoretical and computational frameworks of modern science are dominated by binary structures. This binary bias, seen in the ubiquity of pairwise networks and formal operations of two arguments in mathematical models, limits our capacity to faithfully capture irreducible polyadic interactions in higher-order systems. A paradigmatic example of a higher-order interaction is the Borromean link of three interlocking rings. In this paper we propose a mathematical framework via hypergraphs and hypermatrix algebras that allows us to formalize such forms of higher-order bonding and connectivity in a parsimonious way. Our framework builds on and extends current techniques in higher-order networks -- still mostly rooted in binary structures such as adjacency matrices -- and incorporates recent developments in higher-arity structures to articulate the compositional behavior of adjacency hypermatrices. Irreducible higher-order interactions turn out to be a widespread occurrence across natural sciences and socio-cultural knowledge representation. We demonstrate this by reviewing recent results in computer science, physics, chemistry, biology, ecology, social science, and cultural analysis through the conceptual lens of irreducible higher-order interactions. We further speculate that the general phenomenon of emergence in complex systems may be characterized by spatio-temporal discrepancies of interaction arity.
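As a minimal illustration of an adjacency hypermatrix, the sketch below encodes a single irreducible triadic bond as an order-3 tensor and contrasts it with its pairwise projection, which cannot distinguish the 3-way interaction from three independent dyadic ones. The encoding conventions are illustrative, not those of the paper.

```python
import numpy as np
from itertools import permutations

def adjacency_hypermatrix(n, hyperedges):
    """Order-3 adjacency hypermatrix of a 3-uniform hypergraph on n vertices:
    A[i, j, k] = 1 for every permutation of each hyperedge {i, j, k}."""
    A = np.zeros((n, n, n), dtype=int)
    for e in hyperedges:
        for i, j, k in permutations(e):
            A[i, j, k] = 1
    return A

# A single irreducible triadic bond among vertices 0, 1, 2 ("Borromean-like"):
A = adjacency_hypermatrix(3, [(0, 1, 2)])

# Its pairwise projection (an ordinary adjacency matrix) is just the triangle
# graph, indistinguishable from three independent dyadic bonds -- the fact
# that the interaction is genuinely 3-way lives only in the hypermatrix A.
pairwise = (A.sum(axis=2) > 0).astype(int)
print(A.shape)
print(pairwise)
```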
We consider a generalization of polynomial programs: algebraic programs, which are optimization or feasibility problems with algebraic objectives or constraints. Algebraic functions are defined as zeros of multivariate polynomials. They are a rich set of functions that includes polynomials themselves, but also ratios and radicals, and finite compositions thereof. When an algebraic program is given in terms of radical expressions, a straightforward way of reformulating it into a polynomial program is to introduce a new variable for each distinct radical that appears. Hence, the rich theory and algorithms for polynomial programs, including satisfiability via cylindrical algebraic decomposition, infeasibility certificates via Positivstellensatz theorems, and optimization with sum-of-squares programming, directly apply to algebraic programs. We propose a different reformulation that in many cases introduces significantly fewer new variables, and thus produces polynomial programs that are easier to solve. First, we exhibit an algorithm that finds a defining polynomial of an algebraic function given as a radical expression. As a polynomial does not in general define a unique algebraic function, additional constraints need to be added that isolate the algebraic function from others defined by the same polynomial. Using results from real algebraic geometry, we develop an algorithm that generates polynomial inequalities that isolate an algebraic function. This allows us to reformulate an algebraic program into a polynomial one, by introducing only a single new variable for each algebraic function. On modified versions of classic optimization benchmarks with added algebraic terms, our formulation achieves speedups of up to 50x compared to the straightforward reformulation.
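A toy version of the single-new-variable reformulation, assuming the simple objective $\sqrt{x^2+1} + \sqrt{x^2+4}$: here the defining polynomial of the whole algebraic function is obtained by resultant-based elimination, while the paper's isolating inequalities (which pin down the correct branch) are omitted.

```python
import sympy as sp

x, y1, y2, t = sp.symbols('x y1 y2 t', real=True)

# Straightforward reformulation of  min  sqrt(x**2 + 1) + sqrt(x**2 + 4):
# one new variable per radical, with constraints
#   y1**2 = x**2 + 1,  y2**2 = x**2 + 4,  y1 >= 0,  y2 >= 0,  objective y1 + y2.

# Alternative: a single new variable t for the whole algebraic function,
# with a defining polynomial obtained by eliminating y1, y2 via resultants.
p1 = y1**2 - (x**2 + 1)
p2 = y2**2 - (x**2 + 4)
q  = t - (y1 + y2)

r = sp.resultant(q, p1, y1)                       # eliminate y1
defining = sp.factor(sp.resultant(r, p2, y2))     # eliminate y2
print(defining)   # a quartic in t that the sum of the two radicals satisfies
```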
We propose a formal framework for understanding and unifying the concept of observers across physics, computer science, philosophy, and related fields. Building on cybernetic feedback models, we introduce an operational definition of minimal observers, explore their role in shaping foundational concepts, and identify what remains unspecified in their absence. Drawing upon insights from quantum gravity, digital physics, second-order cybernetics, and recent ruliological and pregeometric approaches, we argue that observers serve as indispensable reference points for measurement, reference frames, and the emergence of meaning. We show how this formalism sheds new light on debates related to consciousness, quantum measurement, and computational boundaries, by way of theorems on observer equivalences and complexity measures. This perspective opens new avenues for investigating how complexity and structure arise in both natural and artificial systems.
The Chaos Game Representation, a method for creating images from nucleotide sequences, is modified to make images from chunks of text documents. Machine learning methods are then applied to train classifiers based on authorship. Experiments are conducted on several benchmark data sets in English, including the widely used Federalist Papers, and one in Portuguese. Validation results for the trained classifiers are competitive with the best methods in prior literature. The methodology is also successfully applied to text categorization, with encouraging results. One classifier method moreover shows promise for the task of digital fingerprinting.
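A minimal sketch of the chaos-game step applied to text, assuming a toy byte-to-corner encoding; the paper's exact character mapping and chunking scheme are not reproduced here.

```python
import numpy as np

CORNERS = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

def cgr_image(text, size=64):
    """Chaos Game Representation of a text chunk: each 2-bit piece of every
    byte picks one of four corners, the current point moves halfway toward
    it, and visit counts are binned into a size x size image. The byte-to-
    corner encoding is a toy choice, not the paper's exact mapping."""
    img = np.zeros((size, size))
    pt = np.array([0.5, 0.5])
    for byte in text.encode("utf-8"):
        for shift in range(0, 8, 2):
            corner = CORNERS[(byte >> shift) & 0b11]
            pt = (pt + corner) / 2.0                     # chaos-game midpoint step
            i, j = (pt * size).astype(int).clip(0, size - 1)
            img[i, j] += 1
    return img

features = cgr_image("To be, or not to be, that is the question.").ravel()
# `features` can now be fed to any standard classifier, e.g. for authorship.
```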
We introduce an intuitive algorithmic methodology for enacting automated rewriting of string diagrams within a general double-pushout (DPO) framework, in which the sequence of rewrites is chosen in accordance with the causal structure of the underlying diagrammatic calculus. The combination of the rewriting structure and the causal structure may be elegantly formulated as a weak 2-category equipped with both total and partial monoidal bifunctors, thus providing a categorical semantics for the full multiway evolution causal graph of a generic Wolfram model hypergraph rewriting system. As an illustrative example, we show how a special case of this algorithm enables highly efficient automated simplification of quantum circuits, as represented in the ZX-calculus.
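For context, the snippet below shows standard ZX-calculus circuit simplification with the PyZX library's built-in rewriting passes; it is not the multiway/DPO algorithm of the paper, only an illustration of the kind of simplification task referred to, and the generator parameters are arbitrary.

```python
# Illustration only: off-the-shelf PyZX rewriting of a random circuit,
# not the multiway/DPO rewriting algorithm described in the paper.
import pyzx as zx

circuit = zx.generate.CNOT_HAD_PHASE_circuit(qubits=4, depth=200, clifford=False)
g = circuit.to_graph()                    # the circuit as a ZX-diagram
zx.simplify.full_reduce(g)                # standard ZX-calculus rewriting passes
reduced = zx.extract_circuit(g.copy())    # re-extract a (smaller) circuit

print(circuit.stats())
print(reduced.stats())
print("same linear map:", zx.compare_tensors(circuit, reduced))
```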
Skin cancer, the most commonly found human malignancy, is primarily diagnosed visually via dermoscopic analysis, biopsy, and histopathological examination. However, automated image classification of skin lesions is deemed more challenging than for other types of cancer due to the irregularity and variability in the lesions' appearances. In this work, we propose an adaptation of Neural Style Transfer (NST) as a novel image pre-processing step for skin lesion classification problems. We represent each dermoscopic image as the style image and transfer the style of the lesion onto a homogeneous content image. This transfers the main variability of each lesion onto the same localized region, which allows us to combine the generated images and extract latent, low-rank style features via tensor decomposition. We train and cross-validate our model on a dermoscopic data set collected and preprocessed from the International Skin Imaging Collaboration (ISIC) database. We show that the classification performance based on the extracted tensor features using the style-transferred images significantly outperforms that of the raw images by more than 10%, and is also competitive with well-studied, pre-trained CNN models through transfer learning. Additionally, the tensor decomposition further identifies latent style clusters, which may provide clinical interpretation and insights.
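A minimal stand-in for the tensor-feature step, assuming style-transferred images stacked into an (N, H, W, C) array and a plain truncated SVD of the image-mode unfolding in place of the paper's tensor decomposition; the shapes and rank are hypothetical.

```python
import numpy as np

def lowrank_style_features(images, rank=8):
    """Toy stand-in for the paper's tensor step: stack style-transferred
    images into an (N, H, W, C) tensor, unfold it along the image mode and
    take a truncated SVD, so each image gets a `rank`-dimensional feature
    vector. The paper uses a proper tensor decomposition; this is only a
    minimal low-rank sketch."""
    X = np.stack(images).astype(float)          # (N, H, W, C)
    unfolded = X.reshape(X.shape[0], -1)        # image-mode unfolding: (N, H*W*C)
    unfolded -= unfolded.mean(axis=0)           # center across images
    U, s, _ = np.linalg.svd(unfolded, full_matrices=False)
    return U[:, :rank] * s[:rank]               # per-image low-rank features

# Hypothetical usage with 100 style-transferred 224x224 RGB lesion images:
imgs = [np.random.rand(224, 224, 3) for _ in range(100)]
features = lowrank_style_features(imgs, rank=8)   # shape (100, 8), ready for a classifier
```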
How do spaces emerge from pregeometric discrete building blocks governed by computational rules? To address this, we investigate non-deterministic rewriting systems (multiway systems) of the Wolfram model. We express these rewriting systems as homotopy types. Using this new formulation, we outline how spatial structures can be functorially inherited from pregeometric type-theoretic constructions. We show how higher homotopy types are constructed from rewriting rules. These correspond to morphisms of an $n$-fold category. Subsequently, the $n \to \infty$ limit of the Wolfram model rulial multiway system is identified as an $\infty$-groupoid, with the latter being relevant given Grothendieck's homotopy hypothesis. We then go on to show how this construction extends to the classifying space of rulial multiway systems, which forms a multiverse of multiway systems and carries the formal structure of an $(\infty, 1)$-topos. This correspondence to higher categorical structures offers a new way to understand how spaces relevant to physics may arise from pregeometric combinatorial models. A key issue we have addressed here is to relate abstract non-deterministic rewriting systems to higher homotopy spaces. A consequence of constructing spaces and geometry synthetically is that it eliminates ad hoc assumptions about geometric attributes of a model, such as an a priori background or pre-assigned geometric data. Instead, geometry is inherited functorially by higher structures. This is relevant for formally justifying different choices of underlying spacetime discretization adopted by models of quantum gravity. We conclude with comments on how our framework of higher category-theoretic combinatorial constructions aligns with other approaches investigating higher categorical structures relevant to the foundations of physics.
We investigate the operator-algebraic origins of the classical Koopman-von Neumann wave function $\psi_{KvN}$ as well as the quantum mechanical one $\psi_{QM}$. We introduce a formalism of Operator Mechanics (OM) based on noncommutative Poisson, symplectic, and noncommutative differential structures. OM serves as a pre-quantum algebra from which algebraic structures relevant to real-world classical and quantum mechanics follow. In particular, $\psi_{KvN}$ and $\psi_{QM}$ are both consequences of this pre-quantum formalism. No a priori Hilbert space is needed. OM admits an algebraic notion of operator expectation values without invoking states. A phase space bundle $\mathcal{E}$ follows from this. $\psi_{KvN}$ and $\psi_{QM}$ are shown to be sections in $\mathcal{E}$. The difference between $\psi_{KvN}$ and $\psi_{QM}$ originates from a quantization map interpreted as a "twisting" of sections over $\mathcal{E}$. We also show that the Schr\"{o}dinger equation is obtained from the Koopman-von Neumann equation. This suggests that neither the Schr\"{o}dinger equation nor the quantum wave function is a fundamental structure. Rather, they both originate from a pre-quantum operator algebra. Finally, we comment on how entanglement between these operators suggests the emergence of space, and on possible extensions of this formalism to field theories.
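For comparison, the standard textbook forms of the two evolution equations discussed above are quoted here for orientation only; the paper derives both from its operator algebra rather than postulating them.

```latex
% Standard forms (for a single degree of freedom): the Koopman-von Neumann
% wave function evolves under the Liouvillian, the quantum one under the
% Hamiltonian.
\begin{align}
  i\,\partial_t \psi_{KvN}(q,p,t) &= \hat{L}\,\psi_{KvN},
  &\hat{L} &= -i\left(\frac{\partial H}{\partial p}\frac{\partial}{\partial q}
              - \frac{\partial H}{\partial q}\frac{\partial}{\partial p}\right),\\
  i\hbar\,\partial_t \psi_{QM}(q,t) &= \hat{H}\,\psi_{QM},
  &\hat{H} &= -\frac{\hbar^2}{2m}\frac{\partial^2}{\partial q^2} + V(q).
\end{align}
```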
We generalize Koopman-von Neumann classical mechanics to poly-symplectic fields and recover De Donder-Weyl theory. Comparison with Dirac's Hamiltonian density inspires a new Hamiltonian formulation with a canonical momentum field that is Lorentz covariant with symplectic geometry. We provide commutation relations for the classical and quantum fields that generalize the Koopman-von Neumann and Heisenberg algebras. The classical algebra requires four fields that generalize space-time, energy-momentum, frequency-wavenumber, and the Fourier conjugate of energy-momentum. We clarify how first and second quantization can be found by simply mapping between operators in the classical and quantum commutator algebras.
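The De Donder-Weyl structure recovered above is conventionally written as follows; this is the standard covariant Hamiltonian form, quoted for orientation rather than taken from the paper.

```latex
% Standard De Donder-Weyl form: polymomenta pi^mu_a and a covariant
% Hamiltonian density H_DW replace the single canonical momentum of the
% usual (3+1) Hamiltonian formalism.
\begin{align}
  \pi^{\mu}_{a} &= \frac{\partial \mathcal{L}}{\partial(\partial_{\mu}\phi^{a})},
  \qquad
  H_{DW} = \pi^{\mu}_{a}\,\partial_{\mu}\phi^{a} - \mathcal{L},\\
  \partial_{\mu}\phi^{a} &= \frac{\partial H_{DW}}{\partial \pi^{\mu}_{a}},
  \qquad
  \partial_{\mu}\pi^{\mu}_{a} = -\frac{\partial H_{DW}}{\partial \phi^{a}}.
\end{align}
```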
The paper philosophically examines Stephen Wolfram's multicomputational Ruliad framework, proposing that physical laws and perceived reality emerge from computationally bounded observers sampling the Ruliad, the entangled limit of all possible computations. It defines physical laws in terms of sampling invariance and identifies an inherent limitation in describing a system that includes its own observer.
We examine qubit states under symmetric informationally-complete measurements, representing state vectors as probability 4-vectors within a 3-simplex in $\mathbb{R}^4$. Using geometric transformations, this 3-simplex is mapped to a tetrahedron in $\mathbb{R}^3$. A specific surface within this tetrahedron allows for the separation of probability vectors into two disjoint 1-simplices. The intersection of this surface with the insphere identifies a "quantum potato chip" region, where probability 4-vectors reduce to two binary classical variables. States within this region can be fully reconstructed using only two given projective measurements, a feature not found elsewhere in the state space.
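A small sketch of the probability 4-vector construction, assuming the usual tetrahedral qubit SIC-POVM; the subsequent mapping to the tetrahedron in $\mathbb{R}^3$ and the "potato chip" surface are not reproduced here.

```python
import numpy as np

# Tetrahedral qubit SIC-POVM: E_i = (I + n_i . sigma) / 4, with the four unit
# vectors n_i pointing to the vertices of a regular tetrahedron on the Bloch
# sphere (they sum to zero, so the E_i sum to the identity).
PAULI = np.array([[[0, 1], [1, 0]],
                  [[0, -1j], [1j, 0]],
                  [[1, 0], [0, -1]]])
N = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
SIC = np.array([(np.eye(2) + np.einsum('k,kij->ij', n, PAULI)) / 4 for n in N])

def sic_probabilities(rho):
    """Probability 4-vector p_i = Tr(rho E_i); it sums to 1 and lives in the
    3-simplex that the abstract's geometric construction starts from."""
    return np.real(np.einsum('iab,ba->i', SIC, rho))

rho = np.array([[0.5, 0.25], [0.25, 0.5]])    # an example qubit state
p = sic_probabilities(rho)
print(p, p.sum())
```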