differential-geometry
This is a contribution to the special issue of Surveys in Differential Geometry celebrating the 75th birthday of Shing-Tung Yau. The bulk of the paper is devoted to a survey of some new geometric inequalities and estimates for the Monge-Ampère equation, obtained by the authors in the last few years in joint work with F. Tong, J. Song, and J. Sturm. These all depend in an essential way on Yau's solution of the Calabi conjecture, which is itself nearing its 50th birthday. The opportunity is also taken to survey briefly several current directions in complex geometry which Yau more recently pioneered.
In this paper, we show that massive Majorana spinors do not exist if they are $t$-dependent or $\phi$-dependent in Kerr or Kerr-(A)dS spacetimes. For massless Majorana spinors in the non-extreme Kerr spacetime, the Dirac equation can be separated into radial and angular equations, parameterized by two complex constants $\epsilon_1$, $\epsilon_2$. If at least one of $\epsilon_1$, $\epsilon_2$ is zero, massless Majorana spinors can be solved explicitly. If $\epsilon_1$, $\epsilon_2$ are both nonzero, we prove the nonexistence of massless time-periodic Majorana spinors in the non-extreme Kerr spacetime which are $L^p$ outside the event horizon for $0
First-order differential operators arising from the representation-theoretic decomposition of the covariant derivative play a central role in Riemannian geometry. In this paper, we study Stein-Weiss $O(n)$-gradients acting on covariant symmetric trace-free tensors of arbitrary rank $p \ge 2$. By analyzing the decomposition of $T^*M \otimes S_0^p(M)$ into its $O(n)$-irreducible components, we explicitly describe the corresponding generalized gradients and compute Weitzenböck formulas for their adjoint compositions. These results extend Bourguignon's four-dimensional formulas for $p = 2$ and generalize previous work of other authors to higher-rank symmetric tensors. The formulas obtained provide a unified framework for understanding second-order Stein-Weiss operators and yield tools applicable to deformation complexes, curvature estimates, and stability problems in geometric analysis. The article continues the authors' earlier investigations of Stein-Weiss operators on natural tensor bundles.
This research establishes a Tannaka duality for geometric complex analytic stacks by employing the liquid mathematics framework developed by Clausen and Scholze. It demonstrates that these analytic stacks can be faithfully reconstructed from their categories of liquid quasicoherent sheaves, allowing for applications such as the recovery of topological fundamental groups and Stokes groupoids.
In a previous paper, we investigated singular points of translation surfaces under the linear independence condition. In this paper, completing that study, we investigate singular points of translation surfaces under the linear dependence condition, using the theories of generalised framed surfaces and framed surfaces. We introduce the notion of translation generalised framed surfaces and investigate the types of singular points that appear on translation generalised framed base surfaces.
We develop a dynamical method for proving the sharp Berezin--Li--Yau inequality. The approach is based on the volume-preserving mean curvature flow and a new monotonicity principle for the Riesz mean $R_\Lambda(\Omega_t)$. For convex domains we show that $R_\Lambda$ is monotone non-decreasing along the flow. The key input is a geometric correlation inequality between the boundary spectral density $Q_\Lambda$ and the mean curvature $H$, established in all dimensions: in $d=2$ via circular symmetrization, and in $d\ge 3$ via the boundary Weyl expansion together with the Laugesen--Morpurgo trace minimization principle. Since the flow converges smoothly to the ball, the monotonicity implies the sharp Berezin--Li--Yau bound for every smooth convex domain. As an application, we obtain a sharp dynamical Cesàro--Pólya inequality for eigenvalue averages.
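For context, the inequality at stake can be stated in its standard form (recalled here from the classical literature, not taken from this abstract): writing $\lambda_k$ for the Dirichlet eigenvalues of $\Omega\subset\mathbb{R}^d$, the Riesz mean and Berezin's sharp bound read

```latex
R_\Lambda(\Omega) \;=\; \sum_{k} (\Lambda - \lambda_k)_+ \;\le\; L^{\mathrm{cl}}_{1,d}\,|\Omega|\,\Lambda^{1+d/2},
\qquad
L^{\mathrm{cl}}_{1,d} \;=\; \frac{\Gamma(2)}{(4\pi)^{d/2}\,\Gamma\!\left(2+\tfrac{d}{2}\right)},
```

which is equivalent, via Legendre transform, to the Li--Yau lower bound on the eigenvalue sums $\sum_{k\le N}\lambda_k$; equality in the semiclassical constant is attained in the limit by balls, which is why convergence of the flow to the ball closes the argument.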
Liouville conformal field theory describes a random geometry that fluctuates around a deterministic one: the unique solution of the problem of finding, within a given conformal class, a Riemannian metric with prescribed scalar and geodesic curvatures as well as conical singularities and corners. The level of randomness in Liouville theory is measured by the coupling constant $\gamma\in(0,2)$, the semi-classical limit corresponding to taking $\gamma\to0$. Based on the probabilistic definition of Liouville theory, we prove that this semi-classical limit exists and does give rise to this deterministic geometry. At second order this limit is described in terms of a massive Gaussian free field with Robin boundary conditions. This in turn allows us to implement CFT-inspired techniques in a deterministic setting: in particular we define the classical stress-energy tensor, show that it can be expressed in terms of accessory parameters (written as regularized derivatives of the Liouville action), and that it gives rise to classical higher equations of motion.
We introduce a geometric and operator-theoretic formalism viewing optimization algorithms as discrete connections on a space of update operators. Each iterative method is encoded by two coupled channels, drift and diffusion, whose algebraic curvature measures the deviation from ideal reversibility and determines the attainable order of accuracy. Flat connections correspond to methods whose updates commute up to higher order and thus achieve minimal numerical dissipation while preserving stability. The formalism recovers classical gradient, proximal, and momentum schemes as first-order flat cases and extends naturally to stochastic, preconditioned, and adaptive algorithms through perturbations controlled by curvature order. Explicit gauge corrections yield higher-order variants with guaranteed energy monotonicity and noise stability. The associated determinantal and isomonodromic formulations yield exact nonasymptotic bounds and describe acceleration effects via Painlevé-type invariants and Stokes corrections.
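As a numerical illustration of the "flat up to higher order" idea (our own toy sketch, not the paper's formalism): composing two explicit gradient-step operators in both orders and measuring the difference shows a commutator defect that vanishes like $h^2$ in the step size $h$, so halving the step shrinks the defect by a factor of four.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Two quadratic drift operators with non-commuting Hessians Q1, Q2.
A1 = rng.standard_normal((n, n)); Q1 = A1 @ A1.T
A2 = rng.standard_normal((n, n)); Q2 = A2 @ A2.T

def step(Q, h):
    """One explicit gradient step x -> x - h*Q@x for f(x) = 0.5 x^T Q x."""
    return lambda x: x - h * (Q @ x)

def commutator_defect(h, x):
    """|| A(B(x)) - B(A(x)) ||: the curvature-like obstruction to commuting."""
    A, B = step(Q1, h), step(Q2, h)
    return np.linalg.norm(A(B(x)) - B(A(x)))

x0 = rng.standard_normal(n)
d1 = commutator_defect(1e-2, x0)
d2 = commutator_defect(5e-3, x0)
print(d1 / d2)  # ~4: the defect is O(h^2), i.e. flat to first order
```

For linear steps the identity $(I-hQ_1)(I-hQ_2)-(I-hQ_2)(I-hQ_1)=h^2[Q_1,Q_2]$ makes the $O(h^2)$ scaling exact, which is why the printed ratio is 4 to machine precision.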
We establish generic regularity results for isoperimetric regions in closed Riemannian manifolds of dimension eight. In particular, we show that every isoperimetric region has a smooth nondegenerate boundary for a generic choice of smooth metric and enclosed volume, or for a fixed enclosed volume and a generic choice of smooth metric.
The Neural Differential Manifold (NDM) proposes re-conceptualizing neural networks as differentiable manifolds where parameters define a Riemannian metric, rather than operating in Euclidean spaces. This architecture aims to intrinsically regularize representations, enhance interpretability through geometric meaning, and enable more efficient optimization.
HSE University researchers developed a fully Riemannian framework for Low-Rank Adaptation (LoRA), optimizing adapters directly on the fixed-rank matrix manifold using the novel Riemannion optimizer and a gradient-informed initialization. This approach resolves parametrization ambiguity and achieves higher accuracy, faster convergence, and greater stability in fine-tuning large language models and diffusion models.
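A core ingredient in any fixed-rank Riemannian scheme of this kind is the projection of a Euclidean gradient onto the tangent space of the rank-$r$ matrix manifold. The sketch below uses the standard textbook projection at a point $X = USV^T$ under our own conventions; it is not the paper's Riemannion optimizer.

```python
import numpy as np

def tangent_project(U, V, Z):
    """Project Z onto the tangent space of the fixed-rank manifold at X = U S V^T.
    Standard formula: P(Z) = P_U Z + Z P_V - P_U Z P_V, with P_U = U U^T."""
    PU = U @ U.T
    PV = V @ V.T
    return PU @ Z + Z @ PV - PU @ Z @ PV

rng = np.random.default_rng(1)
m, n, r = 8, 6, 2
# Orthonormal factors of a rank-r point (thin SVD frame).
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
Z = rng.standard_normal((m, n))   # ambient (Euclidean) gradient
G = tangent_project(U, V, Z)      # Riemannian gradient direction

# The projection is idempotent, and tangent vectors have rank at most 2r.
print(np.allclose(tangent_project(U, V, G), G))  # True
print(np.linalg.matrix_rank(G) <= 2 * r)         # True
```

Projecting before each step is what keeps the adapter update intrinsic to the manifold, independent of how the low-rank factors are parametrized, which is the ambiguity the abstract refers to.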
We extend the classical fundamental theorem of the local theory of smooth curves to a wider class of non-smooth data. Curvature and torsion are prescribed in terms of the distributional derivative measures of two given functions of bounded variation. The essentially unique non-smooth curve solution has both finite total curvature and finite total absolute torsion. In the case of continuous data, we preliminarily discuss a more general problem involving a linear system of distributional derivative equations.
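For reference, the smooth case being generalized is governed by the Frenet--Serret system, where $\kappa$ and $\tau$ are the prescribed curvature and torsion and $(T,N,B)$ is the moving frame:

```latex
T'(s) = \kappa(s)\,N(s), \qquad
N'(s) = -\kappa(s)\,T(s) + \tau(s)\,B(s), \qquad
B'(s) = -\tau(s)\,N(s),
```

with the curve recovered by integrating $\gamma'(s) = T(s)$. In the BV setting of the abstract, the roles of $\kappa\,ds$ and $\tau\,ds$ are played by the distributional derivative measures of the two given functions.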
This is a sequel to our paper `On the kernel learning problem'. We identify a canonical choice of Riemannian gradient flow to find the stationary points in the kernel learning problem. In the presence of Gaussian noise variables, this flow enjoys the remarkable property of having a continuous family of Lyapunov functionals, and the interpretation is the automatic reduction of noise. PS. We include an extensive discussion in the postscript explaining the comparison with 2-layer neural networks. Readers looking for additional motivation are encouraged to read the postscript immediately following the introduction.
We prove that for a $\Theta$-positive representation of a discrete subgroup $\Gamma\subset \mathsf{PSL}(2,\mathbb{R})$, the critical exponent for any $\alpha\in \Theta$ is not greater than one. When $\Gamma$ is geometrically finite, equality holds if and only if $\Gamma$ is a lattice.
In this note, we prove that every closed connected oriented odd-dimensional manifold admits a map of non-zero degree (i.e., a domination) from a tight contact manifold of the same dimension. This provides an odd-dimensional counterpart of a symplectic domination result due to Joel Fine and Dmitri Panov. We prove that the dominating contact manifold can be ensured to be Liouville-fillable, but not Weinstein-fillable in general. We discuss an application for contact divisors arising as zero sets of asymptotically contact-holomorphic sections.
We study biharmonic maps between conformally compact manifolds, a large class of complete manifolds with bounded geometry, asymptotically negative curvature, and smooth compactification. These metrics provide a far-reaching generalization of hyperbolic space. We work on the class of simple $b$-maps, i.e. maps which send interior to interior, boundary to boundary, and are transversal to the boundary of the target manifold. The main result of this paper is a non-existence result: if a simple $b$-map $u:(M,g)\to(N,h)$ between conformally compact manifolds is biharmonic, its restriction to the boundary is non-constant, and moreover $(N,h)$ is non-positively curved, then $u$ is harmonic. We do not assume any integrability condition on $u$: in particular, $u$ is not required to have finite energy, nor is its tension field required to be in $L^{p}$ for any $p$. Our result implies the following version of the Generalized Chen's Conjecture: if $(N,h)$ is a non-positively curved conformally compact manifold, and $\Sigma\hookrightarrow N$ is a properly embedded submanifold with boundary meeting $\partial N$ transversely, then $\Sigma$ is biharmonic if and only if it is minimal.
We show that a hyperbolic three-manifold $M$ containing a closed minimal surface with principal curvatures in $[-1,1]$ also contains nearby (non-minimal) surfaces with principal curvatures in $(-1,1)$. When $M$ is complete and homeomorphic to $S\times\mathbb{R}$, for $S$ a closed surface, this implies that $M$ is quasi-Fuchsian, answering a question left open from Uhlenbeck's 1983 seminal paper. Additionally, our result implies that there exist (many) quasi-Fuchsian manifolds that contain a closed surface with principal curvatures in $(-1,1)$, but no closed minimal surface with principal curvatures in $(-1,1)$, disproving a conjecture from the 2000s.
Rapid growth of high-dimensional datasets in fields such as single-cell RNA sequencing and spatial genomics has led to unprecedented opportunities for scientific discovery, but it also presents unique computational and statistical challenges. Traditional methods struggle with geometry-aware data generation, interpolation along meaningful trajectories, and transporting populations via feasible paths. To address these issues, we introduce Geometry-Aware Generative Autoencoder (GAGA), a novel framework that combines extensible manifold learning with generative modeling. GAGA constructs a neural network embedding space that respects the intrinsic geometries discovered by manifold learning and learns a novel warped Riemannian metric on the data space. This warped metric is derived from both the points on the data manifold and negative samples off the manifold, allowing it to characterize a meaningful geometry across the entire latent space. Using this metric, GAGA can uniformly sample points on the manifold, generate points along geodesics, and interpolate between populations across the learned manifold using geodesic-guided flows. GAGA shows competitive performance in simulated and real-world datasets, including a 30% improvement over the state-of-the-art methods in single-cell population-level trajectory inference.
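To make the "warped Riemannian metric" idea concrete, here is a generic toy construction of our own (not GAGA's learned architecture): a conformal factor $w(x)$ that is close to 1 on the data manifold and grows with the distance off it penalizes shortcut paths, so that discretized Riemannian length favors geodesics that stay on the manifold.

```python
import numpy as np

ALPHA = 100.0  # warp strength (hypothetical; GAGA learns its warp from data)

def warp(p):
    """Conformal factor w(p): ~1 on the unit circle (the 'data manifold'),
    growing quadratically with the distance from it."""
    d = abs(np.linalg.norm(p) - 1.0)
    return 1.0 + ALPHA * d * d

def warped_length(path):
    """Discretized Riemannian length: sum of sqrt(w(midpoint)) * |segment|."""
    total = 0.0
    for a, b in zip(path[:-1], path[1:]):
        total += np.sqrt(warp((a + b) / 2)) * np.linalg.norm(b - a)
    return total

t = np.linspace(0.0, np.pi, 400)
arc = np.stack([np.cos(t), np.sin(t)], axis=1)                      # on-manifold path
chord = np.stack([np.linspace(1, -1, 400), np.zeros(400)], axis=1)  # off-manifold shortcut

# Euclidean lengths: arc ~ pi > chord = 2, but the warped metric flips this,
# so geodesic-guided interpolation follows the manifold.
print(warped_length(arc) < warped_length(chord))  # True
```

The same mechanism, with the warp learned from on-manifold points and negative samples, is what lets geodesic-guided flows interpolate between populations along the data manifold rather than through empty ambient space.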
Nonnegative cross-curvature (NNCC) is a geometric property of a cost function defined on a product space that originates in optimal transportation and the Ma-Trudinger-Wang theory. Motivated by applications in optimization, gradient flows and mechanism design, we propose a variational formulation of nonnegative cross-curvature on c-convex domains applicable to infinite dimensions and nonsmooth settings. The resulting class of NNCC spaces is closed under Gromov-Hausdorff convergence and for this class, we extend many properties of classical nonnegative cross-curvature: stability under generalized Riemannian submersions, characterization in terms of the convexity of certain sets of c-concave functions, and in the metric case, it is a subclass of positively curved spaces in the sense of Alexandrov. One of our main results is that Wasserstein spaces of probability measures inherit the NNCC property from their base space. Additional examples of NNCC costs include the Bures-Wasserstein and Fisher-Rao squared distances, the Hellinger-Kantorovich squared distance (in some cases), the relative entropy on probability measures, and the 2-Gromov-Wasserstein squared distance on metric measure spaces.
Understanding intelligence is a central pursuit in neuroscience, cognitive science, and artificial intelligence. Intelligence encompasses learning, problem-solving, creativity, and even consciousness. Recent advances in geometric analysis have revealed new insights into high-dimensional information representation and organisation, exposing intrinsic data structures and dynamic processes within neural and artificial systems. However, a comprehensive framework that unifies the static and dynamic aspects of intelligence is still lacking. This manuscript proposes a mathematical framework based on Riemannian geometry to describe the structure and dynamics of intelligence and consciousness. Intelligence elements are conceptualised as tokens embedded in a high-dimensional space. The learned token embeddings capture the interconnections of tokens across various scenarios and tasks, forming manifolds in the intelligence space. Thought flow is depicted as the sequential activation of tokens along geodesics within these manifolds. During the navigation of geodesics, consciousness, as a self-referential process, perceives the thought flow, evaluates it against predictions, and provides feedback through prediction errors, adjusting the geodesic: non-zero prediction errors, such as learning, lead to the restructuring of the curved manifolds, thus changing the geodesic of thought flow. This dynamic interaction integrates new information, evolves the geometry, and facilitates learning. The geometry of intelligence guides consciousness, and consciousness structures the geometry of intelligence. By integrating geometric concepts, the proposed theory offers a unified mathematical framework for describing the structure and dynamics of intelligence and consciousness. Applicable to biological and artificial intelligence, this framework may pave the way for future research and empirical validation.