Univ. Sorbonne Paris Nord
We study a sextic tensor model where the interaction terms are given by all $O(N)^3$-invariant bubbles. The class of invariants studied here is thus larger than that of the $U(N)^3$-invariant sextic tensor model. We implement the large $N$ limit mechanism for this general model and explicitly identify the dominant graphs in the $1/N$ expansion. This class of dominant graphs contains tadpole graphs and melonic graphs, but also new types of tensor graphs. Our analysis adapts the tensorial intermediate field method, previously applied only to the prismatic interaction, to all connected sextic interactions except the wheel interaction, which we treat separately using a cycle analysis.
Every language recognized by a non-deterministic finite automaton can be recognized by a deterministic automaton, at the cost of a potential increase of the number of states, which in the worst case can go from $n$ states to $2^n$ states. In this article, we investigate this classical result in a probabilistic setting where we take a deterministic automaton with $n$ states uniformly at random and add just one random transition. These automata are almost deterministic in the sense that only one state has a non-deterministic choice when reading an input letter. In our model, each state has a fixed probability to be final. We prove that for any $d \geq 1$, with non-negligible probability the minimal (deterministic) automaton of the language recognized by such an automaton has more than $n^d$ states; as a byproduct, the expected size of its minimal automaton grows faster than any polynomial. Our result also holds when each state is final with some probability that depends on $n$, as long as it is not too close to $0$ and $1$, at distance at least $\Omega(\frac{1}{\sqrt{n}})$ to be precise, therefore allowing models with a sublinear number of final states in expectation.
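The following is a minimal sketch, not taken from the paper, of the experiment the abstract describes: sample a uniform deterministic automaton on $n$ states, add one extra random transition to make it almost deterministic, determinize it by the subset construction, then minimize and report the number of states. The two-letter alphabet, the final-state probability, and all function names are illustrative assumptions.

```python
import random


def random_almost_deterministic_automaton(n, final_prob=0.5, alphabet=("a", "b")):
    """Uniform deterministic transitions plus one extra random transition (assumed model)."""
    delta = {(q, c): {random.randrange(n)} for q in range(n) for c in alphabet}
    extra_q, extra_c = random.randrange(n), random.choice(alphabet)
    delta[(extra_q, extra_c)].add(random.randrange(n))  # the single non-deterministic choice
    finals = {q for q in range(n) if random.random() < final_prob}
    return delta, finals, alphabet


def determinize(delta, finals, alphabet, initial=0):
    """Subset construction: DFA states are the reachable sets of NFA states."""
    start = frozenset({initial})
    dfa_delta, dfa_finals, todo, seen = {}, set(), [start], {start}
    while todo:
        S = todo.pop()
        if S & finals:
            dfa_finals.add(S)
        for c in alphabet:
            T = frozenset(q2 for q in S for q2 in delta[(q, c)])
            dfa_delta[(S, c)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    return dfa_delta, dfa_finals, seen


def minimal_size(dfa_delta, dfa_finals, states, alphabet):
    """Moore-style partition refinement; returns the number of equivalence classes."""
    block = {s: (s in dfa_finals) for s in states}
    while True:
        # Refine each block by the blocks its successors belong to.
        signature = {s: (block[s], tuple(block[dfa_delta[(s, c)]] for c in alphabet))
                     for s in states}
        ids = {sig: i for i, sig in enumerate(sorted(set(signature.values()), key=repr))}
        new_block = {s: ids[signature[s]] for s in states}
        if len(set(new_block.values())) == len(set(block.values())):
            return len(set(new_block.values()))
        block = new_block


if __name__ == "__main__":
    n = 8
    delta, finals, alphabet = random_almost_deterministic_automaton(n)
    dfa_delta, dfa_finals, dfa_states = determinize(delta, finals, alphabet)
    print("n =", n,
          "| determinized size:", len(dfa_states),
          "| minimal size:", minimal_size(dfa_delta, dfa_finals, dfa_states, alphabet))
```

Repeating this over many random draws and plotting the minimal size against $n$ would give an empirical view of the super-polynomial growth stated in the abstract.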