popular-physics
We investigate the microlensing detectability of extraterrestrial technosignatures originating from Dyson-sphere-like structures, such as Dyson swarms surrounding primordial black holes (PBHs). These hypothetical swarms consist of stochastically varying, partially opaque structures that could modulate standard microlensing light curves through time-dependent transmission effects. We introduce a probabilistic framework that includes a stochastic transmission model governed by a variable optical depth and random gap distributions. We perform a parameter scan and generate heatmaps of the optical transit duration. We study the infrared excess radiation and peak emission wavelength as complementary observational signatures. Additionally, we define and analyze the effective optical depth and the anomalous microlensing event rate for these stochastic structures. Our findings provide a new avenue for searching for advanced extraterrestrial civilizations by extending microlensing studies to include artificial, dynamic modulation signatures.
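The modulation idea can be illustrated with a toy sketch (not the paper's actual model): a standard point-lens Paczynski light curve multiplied by a transmission factor with a fluctuating optical depth and random fully open gaps. All parameter names, values, and distributions here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def paczynski_magnification(t, t0=0.0, u0=0.1, tE=20.0):
    """Standard point-source, point-lens microlensing magnification."""
    u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

def stochastic_transmission(t, mean_tau=0.3, sigma_tau=0.1, gap_fraction=0.2):
    """Toy swarm transmission: optical depth fluctuates around mean_tau,
    and a random fraction of sight-lines pass through fully open gaps."""
    tau = np.clip(rng.normal(mean_tau, sigma_tau, size=t.shape), 0.0, None)
    T = np.exp(-tau)
    T[rng.random(t.shape) < gap_fraction] = 1.0  # gap: no obscuration
    return T

# Modulated light curve: smooth Paczynski peak times noisy transmission.
t = np.linspace(-50.0, 50.0, 1001)
flux = paczynski_magnification(t) * stochastic_transmission(t)
```

The stochastic factor leaves the overall Paczynski envelope intact while imprinting rapid, non-repeating dips, which is the qualitative signature the abstract describes.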
Weaver et al. introduce a satirical universal classification scheme, "Meat, Vegetable, Soup," which categorizes all items, objects, and concepts in the known and unknown universe. The framework rigorously applies novel definitions for "meat," "vegetable," and "soup," leading to counter-intuitive yet logically consistent classifications for diverse entities from celestial bodies to abstract thoughts.
A novel non-reactive thrust principle based on controlling the angular momentum of a material body is proposed. Theoretically, it is shown that asymmetric emission/absorption of low-energy particle fluxes with spin in a direction perpendicular to the motion enables the creation of a propulsion system whose energy efficiency exceeds that of a photon engine by several orders of magnitude near massive bodies. Using the example of a "dumbbell-flywheel" system dynamics in a central gravitational field, the possibility of controlling orbital parameters without propellant consumption is demonstrated. Experiments in a vacuum chamber revealed anomalies consistent with the hypothesis of spacetime spin polarization. The developed approach offers a mechanistic interpretation of wave-particle duality and suggests new pathways for unifying gravity with quantum mechanics. The obtained results open prospects for the development of next-generation propulsion systems.
Researchers at the University of Science and Technology of China realized the Einstein-Bohr recoiling-slit gedankenexperiment at the quantum limit by employing a single rubidium-87 atom cooled to its motional ground state as a tunable movable slit, demonstrating Bohr's complementarity principle and distinguishing quantum noise from classical heating.
This article discusses the main aspects related to Bell's inequality, both theoretical and experimental. A new derivation of Bell's inequality is also presented, which stands out for its mathematical simplicity. The exposition is mainly intended for undergraduate physics students, and places special emphasis on clarifying the meaning and scope of Bell's theorem in the context of the Einstein-Podolsky-Rosen experiment.
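For context (this is the standard CHSH form of the inequality, not the paper's new derivation), the quantum singlet-state correlation E(a, b) = -cos(a - b) violates the bound |S| <= 2 obeyed by every local hidden-variable theory:

```python
import math

def E(a, b):
    """Singlet-state correlation for spin measurements along
    directions at angles a and b (radians): E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

# CHSH combination: any local hidden-variable theory gives |S| <= 2.
a1, a2 = 0.0, math.pi / 2              # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828: quantum mechanics violates the bound
```

The settings above are the standard ones that maximize the quantum violation, reaching Tsirelson's bound of 2*sqrt(2).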
Time-of-flight measurements of helium atoms in a double-slit experiment reported in [C. Kurtsiefer, T. Pfau, and J. Mlynek, Nature 386, 150 (1997)] are compared with the arrival times of Bohmian trajectories. This is the first qualitative comparison of a quantum mechanical arrival-time calculation with observation at the single-particle level, particularly noteworthy given the absence of a consensus in extracting time-of-flight predictions from quantum theory. We further explore a challenging double-slit experiment in which one of the slits is shut in flight.
The diverse methodologies and myriad orthogonal proposals for the best technosignatures to search for in SETI can make it difficult to develop an effective and balanced search strategy, especially from a funding perspective. Here I propose a framework to compare the relative advantages and disadvantages of various proposed technosignatures based on nine "axes of merit". This framework was first developed at the NASA Technosignatures Workshop in Houston in 2018 and published in that report. I give the definition and rationale behind the nine axes as well as the history of each axis in the SETI and technosignature literature. These axes are then applied to three example classes of technosignature searches as an illustration of their use. An open-source software tool is available to allow technosignature researchers to make their own version of the figure.
This paper offers a satirical critique of specific anxieties within AI safety discourse concerning machines becoming physically "supersized" or uncontrollably large. It employs humor and deliberately absurd arguments to highlight perceived logical inconsistencies and advocate for more precise, nuanced thinking in the field, challenging overly speculative narratives about advanced AI.
This anniversary paper is an occasion to recall some of the events that shaped institutional econophysics. But in these thoughts about the evolution of econophysics in the last 15 years we also express some concerns. Our main worry concerns the relinquishment of the simplicity requirement. Ever since the groundbreaking experiments of Galileo some four centuries ago, the great successes of physicists have been largely due to the fact that they were able to decompose complex phenomena into simpler ones. Remember that the first observation of the effects of an electrical current was made by Luigi Galvani (1737-1798) on the leg of a frog! Clearly, to make sense, this observation had to be broken down into several separate effects. Nowadays, with computers able to handle huge amounts of data and to simulate any stochastic process no matter how complicated, there is no longer any real need for such a search for simplicity. Why should one spend time and effort trying to break up complicated phenomena when it is possible to handle them globally? On this new road there are several stumbling blocks, however. Do such global mathematical descriptions lead to a real understanding? Do they produce building blocks which can be used elsewhere, and thus make our knowledge and comprehension grow in a cumulative way? Should econophysics also adopt the "globalized" perspective that has been endorsed, developed and spread by the numerous "Complexity Departments" which sprang up during the last decade?
Studies on extraterrestrial civilisations in Russia date back to the end of the 19th century. The modern period of SETI studies began in the USSR in the early 1960s. The first edition of I.S. Shklovsky's book "Universe, Life, Intelligence", published in 1962, was a foundation stone of SETI research in the USSR. A number of observational projects in the radio and optical domains were conducted from the 1960s to the 1990s. Theoretical studies focused on defining optimal spectral domains for the search for artificial electromagnetic signals, selection of celestial targets in the search for ETI, optimal methods for encoding and decoding interstellar messages, estimating the magnitude of astro-engineering activity of ETI, and developing the philosophical background of the SETI problem. Later, in the 1990s and in the first two decades of the 21st century, in spite of acute underfunding and other problems facing the scientific community in Russia and other countries of the former Soviet Union, SETI-oriented research continued. In particular, SETI collaborations conducted a number of surveys of Sun-like stars in the Milky Way and searched for Dyson spheres and artificial optical signals. Several space broadcasting programs were conducted too, including radio transmissions toward selected stars. Serious rethinking was given to incentives for passive and active participation of space civilisations in SETI and CETI. This paper gives an overview of past SETI activities. It also gives a comprehensive list of publications by authors from Russia, the Soviet Union and the post-Soviet space, as well as some SETI publications by other authors. The rich heritage of SETI research presented in the paper might offer a useful background and starting point for developing the strategy and specific research programs of the near future.
Cosimo Bambi outlines a conceptual interstellar mission using laser-propelled nanocrafts to directly investigate a nearby isolated black hole, aiming to achieve unprecedented precision in testing General Relativity, the nature of event horizons, and fundamental constants in strong gravitational fields.
The visualisation of objects moving at relativistic speeds has been a popular topic of study since Special Relativity's inception. While the standard exposition of the theory describes certain shape-changing effects, such as the Lorentz contraction, it makes no mention of how an extended object would appear in a snapshot or how apparent distortions could be used for measurement. Previous work on the subject has derived the apparent form of an object, often making mention of George Gamow's relativistic cyclist thought experiment. Here, a rigorous re-analysis of the cyclist, this time in three dimensions, is undertaken for a binocular observer, accounting for both the distortion in apparent position and the relativistic colour and intensity shifts undergone by a fast-moving object. A methodology for analysing binocular relativistic data is then introduced, allowing the fitting of experimental readings of an object's apparent position to determine the distance to the object and its velocity. This method is then applied to the simulation of Gamow's cyclist, producing self-consistent results.
In this case study, we explore the capabilities and limitations of ChatGPT, a natural language processing model developed by OpenAI, in the field of string theoretical swampland conjectures. We find that it is effective at paraphrasing and explaining concepts in a variety of styles, but not at genuinely connecting concepts. It will provide false information with full confidence and make up statements when necessary. However, its ingenious use of language can be fruitful for identifying analogies and describing visual representations of abstract concepts.
A long-standing issue in astrobiology is whether planets orbiting the most abundant type of stars, M-dwarfs, can support liquid water and eventually life. A new study shows that subglacial melting may provide an answer, significantly extending the habitability region, in particular around M-dwarf stars, which are also the most promising for biosignature detection with the present and near-future technology.
Whether you're a CEO strategizing the future of your company, a tech enthusiast debating your next career move, a high school teacher eager to enlighten your students, or simply tired of the relentless quantum hype, this is crafted just for you. It cuts through the complex jargon to deliver the straight facts on quantum computing, peeling away the layers of mystique to reveal the true potential and limitations of this groundbreaking technology. Prepare to have your misconceptions challenged and your understanding deepened in this clear-eyed view of the quantum future, written to inform and inspire readers across the spectrum of curiosity and need.
Using a Markov Chain Monte Carlo optimization algorithm and a computer simulation, I find the passenger ordering which minimizes the time required to board the passengers onto an airplane. The model that I employ assumes that the time that a passenger requires to load his or her luggage is the dominant contribution to the time needed to completely fill the aircraft. The optimal boarding strategy may reduce the time required to board an airplane by over a factor of four, and possibly more depending upon the dimensions of the aircraft. In addition, knowledge of the optimal boarding procedure can inform decisions regarding changes to methods that are employed by a particular carrier. I explore some of the salient features of the optimal boarding method and discuss practical modifications to the optimal method. Finally, I mention some of the benefits that could come from implementing an improved passenger boarding scheme.
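The luggage-dominated setting can be illustrated with a toy discrete simulation (this is my own minimal sketch, not the paper's model): passengers walk down a single aisle one row per tick, and stowing luggage blocks the aisle at the passenger's row. All parameters are assumptions.

```python
def boarding_time(order, stow_time=5):
    """Toy single-aisle boarding simulation.

    One passenger per row; passenger p sits in row p. Passengers enter
    at row 0 in the given order, walk one row per tick, and block the
    aisle at their row for stow_time ticks while loading luggage.
    Returns the number of ticks until everyone is seated.
    """
    n = len(order)
    pos = {}                 # passenger -> current aisle row
    stow = {}                # passenger -> remaining stowing ticks
    queue = list(order)
    seated = 0
    t = 0
    while seated < n:
        t += 1
        occupied = set(pos.values())
        # Process back-to-front so a vacated cell can be taken this tick.
        for p in sorted(pos, key=pos.get, reverse=True):
            row = pos[p]
            if p in stow:
                stow[p] -= 1
                if stow[p] == 0:         # done stowing: sit, free the aisle
                    del pos[p], stow[p]
                    seated += 1
                    occupied.discard(row)
            elif row == p:               # reached own row: start stowing
                stow[p] = stow_time
            elif (row + 1) not in occupied:
                occupied.discard(row)    # step forward one row
                pos[p] = row + 1
                occupied.add(row + 1)
        if queue and 0 not in occupied:  # next passenger boards
            pos[queue.pop(0)] = 0
            occupied.add(0)
    return t

# Front-to-back ordering serializes all stowing in this model,
# while reversing it lets passengers stow in parallel.
print(boarding_time(list(range(12))))          # front-to-back
print(boarding_time(list(range(11, -1, -1))))  # back-to-front
```

Even this crude model reproduces the qualitative point of the abstract: the ordering alone changes total boarding time by a large factor, because parallel stowing is the resource being optimized.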
This paper addresses the problem of determining the optimum shape for a beer glass that minimizes the heat transfer while the liquid is consumed, thereby keeping it cold for as long as possible. The proposed solution avoids the use of insulating materials. The glass is modelled as a body of revolution generated by a smooth curve S, constructed from a material with negligible thermal resistance at the revolution surface but insulated at the bottom. The ordinary differential equation describing the problem is derived from the first law of Thermodynamics applied to a control volume encompassing the liquid. This is an inverse optimization problem, aiming to find the shape of the glass (represented by curve S) that minimizes the heat transfer rate. In contrast, the direct problem aims to determine the heat transfer rate for a given geometry. The solution obtained is analytic, and the resulting expression for S is in closed form, providing a family of optimal glass shapes that can be manufactured using conventional methods.
In science, as in life, `surprises' can be adequately appreciated only in the presence of a null model, what we expect a priori. In physics, theories sometimes express the values of dimensionless physical constants as combinations of mathematical constants like pi or e. The inverse problem also arises, whereby the measured value of a physical constant admits a `surprisingly' simple approximation in terms of well-known mathematical constants. Can we estimate the probability for this to be a mere coincidence, rather than an inkling of some theory? We answer the question in the most naive form.
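A naive Monte Carlo estimate in the spirit of the abstract might look as follows. The pool of "simple" expressions here, (m/n)·pi^p·e^q with small integers, is my own arbitrary but concrete choice of null model, not the authors' definition.

```python
import bisect
import math
import random

def simple_values(max_int=9, max_pow=2):
    """Pool of 'simple' closed forms (m/n) * pi^p * e^q with small
    integers -- one arbitrary but concrete notion of simplicity."""
    vals = set()
    for m in range(1, max_int + 1):
        for n in range(1, max_int + 1):
            for p in range(-max_pow, max_pow + 1):
                for q in range(-max_pow, max_pow + 1):
                    vals.add(m / n * math.pi ** p * math.e ** q)
    return sorted(vals)

def coincidence_probability(pool, rel_tol=1e-3, trials=20000, seed=1):
    """Chance that a 'measured' constant, drawn log-uniformly from
    [0.1, 10), falls within rel_tol of some expression in the pool."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = 10 ** rng.uniform(-1, 1)
        i = bisect.bisect_left(pool, x)
        neighbours = pool[max(i - 1, 0): i + 1]  # nearest pool values
        if any(abs(x - v) <= rel_tol * x for v in neighbours):
            hits += 1
    return hits / trials

pool = simple_values()
print(coincidence_probability(pool))  # chance of a spurious 0.1% 'match'
```

With a pool this rich, even a 0.1% agreement is far from rare, which is exactly the kind of sobering null result the abstract's question anticipates.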
In 1974, Stephen Hawking theoretically discovered that black holes emit thermal radiation and have a characteristic temperature, known as the Hawking temperature. The aim of this paper is to present a simple heuristic derivation of the Hawking temperature, based on the Heisenberg uncertainty principle. The result obtained coincides exactly with Hawking's original finding. In parallel, this work seeks to clarify the physical meaning of Hawking's discovery. This article may be useful as pedagogical material in a high school physics course or in an introductory undergraduate physics course.
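One common version of such a heuristic argument (the paper's own route may differ in its details) localizes a quantum of the radiation within a Schwarzschild radius and converts the resulting energy scale into a temperature, inserting the conventional factor of 2π by hand:

```latex
\Delta x\,\Delta p \gtrsim \frac{\hbar}{2}, \qquad
\Delta x \sim r_s = \frac{2GM}{c^{2}}
\;\Rightarrow\;
\Delta E \approx c\,\Delta p \approx \frac{\hbar c}{2\,\Delta x}
         = \frac{\hbar c^{3}}{4GM},
\qquad
T_H \approx \frac{\Delta E}{2\pi k_B}
    = \frac{\hbar c^{3}}{8\pi G M k_B}
    \approx 6.2\times 10^{-8}\left(\frac{M_\odot}{M}\right)\,\mathrm{K},
```

which reproduces Hawking's exact coefficient and shows why stellar-mass black holes are far colder than the cosmic microwave background.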