Pricing of Securities
In this work, we introduce amortizing perpetual options (AmPOs), a fungible variant of continuous-installment options suitable for exchange-based trading. Traditional installment options lapse when holders cease their payments, destroying fungibility across units of notional. AmPOs replace explicit installment payments, and the lapsing logic they require, with an implicit payment scheme via a deterministic decay in the claimable notional. This amortization ensures all units evolve identically, preserving fungibility. Under the Black-Scholes framework, AmPO valuation reduces to that of an equivalent vanilla perpetual American option on a dividend-paying asset. This reduction yields analytical expressions for the exercise boundaries and risk-neutral valuations of calls and puts. These formulas allow us to derive the Greeks and study comparative statics with respect to the amortization rate. Illustrative numerical case studies demonstrate how the amortization rate shapes option behavior and reveal the resulting tradeoffs in effective volatility sensitivity.
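For readers unfamiliar with the vanilla reduction mentioned above, the sketch below prices a perpetual American call on a dividend-paying asset in the classical McKean/Merton closed form. Treating the amortization rate as entering like the dividend yield q is an illustrative assumption here, not necessarily the paper's exact mapping.

```python
import numpy as np

def perpetual_american_call(S, K, r, q, sigma):
    """Closed-form value and exercise boundary of a perpetual American call
    on an asset with dividend yield q (McKean/Merton). Solves the quadratic
    0.5*sigma^2*g*(g-1) + (r-q)*g - r = 0 for the positive root g > 1."""
    a = 0.5 * sigma**2
    b = r - q - 0.5 * sigma**2
    g = (-b + np.sqrt(b**2 + 4 * a * r)) / (2 * a)   # positive root, g > 1 when q > 0
    S_star = K * g / (g - 1)                          # optimal exercise boundary
    value = (S_star - K) * (S / S_star)**g if S < S_star else S - K
    return value, S_star

# Illustration: an amortization rate entering like a dividend yield (assumption)
v, b = perpetual_american_call(S=100.0, K=100.0, r=0.03, q=0.05, sigma=0.2)
print(f"value = {v:.4f}, boundary = {b:.4f}")
```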
Differential ML (Huge and Savine 2020) is a technique for training neural networks to provide fast approximations to complex simulation-based models for derivatives pricing and risk management. It uses price sensitivities calculated through pathwise adjoint differentiation to reduce pricing and hedging errors. However, for options with discontinuous payoffs, such as digital or barrier options, the pathwise sensitivities are biased, and incorporating them into the loss function can magnify errors. We consider alternative methods for estimating sensitivities and find that they can substantially reduce test errors in prices and in their sensitivities. Using differential labels calculated through the likelihood ratio method expands the scope of Differential ML to discontinuous payoffs. A hybrid method incorporates gamma estimates as well as delta estimates, providing further regularization.
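To illustrate why likelihood-ratio differential labels help for discontinuous payoffs, the sketch below compares the pathwise and likelihood-ratio delta estimators for a cash-or-nothing call under Black-Scholes. This is a standard textbook construction, not the paper's implementation.

```python
import numpy as np

def digital_call_deltas(S0, K, r, sigma, T, n_paths=200_000, seed=0):
    """Monte Carlo delta of a cash-or-nothing call. The pathwise estimator
    differentiates the indicator payoff and is identically zero (biased);
    the likelihood-ratio (LR) estimator weights the payoff by the score
    d log p / d S0 of the terminal density."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    payoff = (ST > K).astype(float)
    disc = np.exp(-r * T)
    pathwise = 0.0                                  # d(indicator)/dS0 = 0 a.s. -> biased
    lr_weight = Z / (S0 * sigma * np.sqrt(T))       # lognormal score w.r.t. S0
    lr_delta = disc * np.mean(payoff * lr_weight)
    return pathwise, lr_delta

pw, lr = digital_call_deltas(S0=100.0, K=100.0, r=0.01, sigma=0.2, T=1.0)
print(f"pathwise delta = {pw:.4f}, LR delta = {lr:.4f}")
```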
Bitcoin operates as a macroeconomic paradox: it combines a strictly predetermined, inelastic monetary issuance schedule with a stochastic, highly elastic demand for scarce block space. This paper empirically validates the Endogenous Constraint Hypothesis, which posits that protocol-level throughput limits generate a non-linear negative feedback loop between network friction and base-layer monetary velocity. Using a verified Transaction Cost Index (TCI) derived from this http URL on-chain data and Hansen's (2000) threshold regression, we identify a definitive structural break at the 90th percentile of friction (TCI ≈ 1.63). The analysis reveals a bifurcation in network utility: while the network exhibits robust velocity growth of +15.44% during normal regimes, this collapses to +6.06% during shock regimes, yielding a statistically significant Net Utility Contraction of -9.39% (p = 0.012). Crucially, Instrumental Variable (IV) tests using Hashrate Variation as a supply-side instrument fail to detect a significant relationship in a linear specification (p = 0.196), confirming that the velocity constraint is strictly a regime-switching phenomenon rather than a continuous linear function. Furthermore, we document a "Crypto Multiplier" inversion: high friction correlates with a +8.03% increase in capital concentration per entity, suggesting that congestion forces a substitution from active velocity to speculative hoarding.
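The threshold estimation itself follows a simple idea: search over candidate thresholds and keep the one minimizing the pooled sum of squared residuals. Below is a minimal sketch in the spirit of Hansen (2000); the variable names (y, x, q) and the single-regressor setup are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def threshold_regression(y, x, q, trim=0.10):
    """Grid-search estimate of a single threshold tau in the model
    y = x'b1 + e for q <= tau and y = x'b2 + e for q > tau, chosen to
    minimize the pooled sum of squared residuals (Hansen 2000)."""
    grid = np.quantile(q, np.linspace(trim, 1 - trim, 200))
    X = np.column_stack([np.ones_like(x), x])
    best_ssr, tau_hat = np.inf, None
    for tau in grid:
        ssr = 0.0
        for mask in (q <= tau, q > tau):
            beta = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
            ssr += np.sum((y[mask] - X[mask] @ beta) ** 2)
        if ssr < best_ssr:
            best_ssr, tau_hat = ssr, tau
    b1 = np.linalg.lstsq(X[q <= tau_hat], y[q <= tau_hat], rcond=None)[0]
    b2 = np.linalg.lstsq(X[q > tau_hat], y[q > tau_hat], rcond=None)[0]
    return tau_hat, b1, b2
```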
We examine whether retail investor investment horizons explain earnings-related return patterns. Using StockTwits posts (2010–2021), we classify stocks as long- or short-horizon prior to earnings. We find horizon composition strongly predicts price paths: long-horizon stocks exhibit larger immediate reactions and pronounced post-announcement drift compared to short-horizon stocks. A strategy buying long-horizon and shorting short-horizon stocks generates 0.43% monthly alpha. Additionally, elevated pre-event sentiment predicts weaker subsequent performance, particularly for short-horizon stocks. These results confirm that retail horizon composition provides a useful dimension for summarizing systematic variation in earnings returns and extracting information from retail activity.
The Solvency Capital Requirement (SCR) calculation under Solvency II is computationally intensive, relying on the estimation of own funds. Regulation mandates the direct estimation method, and it has been proven that, under specific assumptions, the indirect method yields the same estimate. We study their comparative properties and give novel insights. First, we provide a straightforward proof that the direct and indirect estimators for own funds converge to the same value. Second, we introduce a novel family of mixed estimators that encompasses the direct and indirect methods as its edge cases. Third, we leverage these estimators to develop powerful variance reduction techniques, constructing a single control variate from the direct and indirect estimators and a multi-control-variate framework using subsets of the mixed family. These techniques can be combined with existing methods like Least-Squares Monte Carlo. We evaluate the estimators on three simplified asset-liability management models of a German life insurer: the MUST and IS cases of the model from Bauer et al. (2006), and openIRM by Wolf et al. (2025). Our analysis confirms that neither the direct nor the indirect estimator is universally superior, though the indirect method consistently outperforms the direct one in more realistic settings. The proposed control variate techniques show significant potential, in some cases reducing variance to one-tenth of that from the standard direct estimator. However, we also identify scenarios where improvements are marginal, highlighting the model-dependent nature of their efficacy. The source code is publicly available at this https URL.
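The single-control-variate construction admits a compact generic sketch: if the direct and indirect estimators target the same quantity, their paired difference has mean zero in the limit and can serve as a control with the usual optimal coefficient. This is an illustrative implementation under that assumption, not the paper's code.

```python
import numpy as np

def control_variate(direct, indirect):
    """Combine two paired, asymptotically unbiased estimators of the same
    quantity: use the difference D = direct - indirect (mean zero in the
    limit) as a control variate with beta* = Cov(direct, D) / Var(D)."""
    D = direct - indirect
    beta = np.cov(direct, D)[0, 1] / np.var(D, ddof=1)
    adjusted = direct - beta * D
    return adjusted.mean(), adjusted.std(ddof=1) / np.sqrt(len(adjusted))
```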
Since the advent of generative artificial intelligence, companies and researchers have been rushing to develop their own generative models, commercial or otherwise. Despite the large number of users of these powerful new tools, there is currently no verifiable way to explain from first principles what happens when large language models (LLMs) learn; this is especially true for models built on automatic speech recognition systems, which rely on vast amounts of data collected from across the web to produce fast and efficient results. In this article, we develop MarketBackFinal 2.0, a backdoor attack based on acoustic data poisoning and built mainly on modern stock market models, in order to expose the possible vulnerabilities of speech-based transformers that may rely on LLMs.
Researchers from The Hong Kong University of Science and Technology and Peking University developed a three-stage framework leveraging Large Language Models (LLMs) and a multi-agent system to automate quantitative investment strategy discovery and portfolio management. The system achieved a 53.17% cumulative return on the SSE50 index, dramatically outperforming the benchmark's -11.73% over the same period, and demonstrated robust, adaptive performance across diverse market conditions.
Researchers developed "loss-versus-rebalancing" (LVR), a novel metric quantifying adverse selection costs for Automated Market Maker liquidity providers, offering a continuous-time framework that robustly tracks real-world LP performance after hedging market risk. This work from Columbia University and the University of Chicago provides a superior benchmark for AMM analysis and guides future protocol design.
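For the constant-product market maker, LVR reduces to a strikingly simple closed form, σ²V/8 per unit time. A minimal sketch following that published formula, with an illustrative pool size and volatility:

```python
import numpy as np

def lvr_rate_cpmm(sigma, pool_value):
    """Instantaneous loss-versus-rebalancing rate for a constant-product AMM.
    With reserves x*y = L^2 the pool value is V(P) = 2*L*sqrt(P), and the
    general formula LVR = (sigma^2 * P^2 / 2) * |x'(P)| reduces to
    sigma^2 * V / 8 per unit time."""
    return sigma**2 * pool_value / 8.0

# Example: a $10m pool at 80% annualized volatility leaks ~8% of value per year
print(f"annual LVR ≈ ${lvr_rate_cpmm(0.8, 10e6):,.0f}")
```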
Perpetual futures are the most popular cryptocurrency derivatives. Perpetuals offer leveraged exposure to their underlying without rollover or direct ownership. Unlike fixed-maturity futures, perpetuals are not guaranteed to converge to the spot price. To minimize the gap between perpetual and spot prices, long investors periodically pay shorts a funding rate proportional to this difference. We derive no-arbitrage prices for perpetual futures in frictionless markets and bounds in markets with trading costs. Empirically, deviations from these prices in crypto are larger than in traditional currency markets, comove across currencies, and diminish over time. An implied arbitrage strategy yields high Sharpe ratios.
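The funding mechanism itself is simple to state in code. The sketch below computes one funding payment per unit notional; the proportionality coefficient and the 8-hour settlement interval are illustrative assumptions rather than any specific exchange's convention.

```python
def funding_cashflow(F, S, rate_coef=1.0, interval=1 / 1095):
    """Funding payment per unit notional over one interval (here 8 hours,
    i.e. 1/1095 of a year): longs pay shorts a rate proportional to the
    perpetual-spot gap (F - S)/S when the perpetual trades above spot,
    and receive when it trades below."""
    return rate_coef * (F - S) / S * interval

# A perpetual trading 0.5% above spot: the long pays ~0.00046% per 8h window
print(funding_cashflow(F=100.5, S=100.0))
```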
Overnight rates, such as the SOFR (Secured Overnight Financing Rate) in the US, are central to the current reform of interest rate benchmarks. A striking feature of overnight rates is the presence of jumps and spikes occurring at predetermined dates due to monetary policy interventions and liquidity constraints. This corresponds to stochastic discontinuities (i.e., discontinuities occurring at ex-ante known points in time) in their dynamics. In this work, we propose a term structure modelling framework based on overnight rates and characterize the absence of arbitrage in a generalised Heath-Jarrow-Morton (HJM) setting. We extend the classical short-rate approach to accommodate stochastic discontinuities, developing a tractable setup driven by affine semimartingales. In this context, we show that simple specifications allow us to capture stylized facts of the jump behavior of overnight rates. In a Gaussian setting, we provide explicit valuation formulas for bonds and caplets. Furthermore, we investigate hedging in the sense of local risk-minimization when the underlying term structures feature stochastic discontinuities.
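To fix ideas, here is a toy simulation of a short rate with stochastic discontinuities, i.e., random jumps occurring at ex-ante known policy-meeting dates. The Vasicek diffusion between meetings, the discrete jump distribution, and all parameter values are illustrative assumptions, not the paper's affine-semimartingale specification.

```python
import numpy as np

def simulate_overnight_rate(r0, kappa, theta, sigma, T, dt,
                            meeting_times, jump_scale, seed=0):
    """Toy short-rate path: a Vasicek diffusion plus random jumps at the
    ex-ante known policy-meeting dates (stochastic discontinuities)."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    t = np.linspace(0.0, T, n + 1)
    r = np.empty(n + 1)
    r[0] = r0
    meetings = set(np.round(np.asarray(meeting_times) / dt).astype(int))
    for i in range(n):
        r[i + 1] = r[i] + kappa * (theta - r[i]) * dt \
                   + sigma * np.sqrt(dt) * rng.standard_normal()
        if (i + 1) in meetings:                    # jump at a known date
            r[i + 1] += jump_scale * rng.choice([-1.0, 0.0, 1.0])
    return t, r

t, r = simulate_overnight_rate(0.05, 0.5, 0.04, 0.006, 1.0, 1 / 360,
                               meeting_times=[0.12, 0.37, 0.62, 0.87],
                               jump_scale=0.0025)
```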
We propose to develop a new class of investment insurance products for holders of superannuation accounts in Australia, which we tentatively call equity protection swaps (EPSs). An EPS is a standalone financial derivative, which is reminiscent of a total return swap but also shares some features with the variable annuity known as the registered index-linked annuity (RILA). The buyer of an EPS obtains partial protection against losses on a reference portfolio and, in exchange, agrees to share portfolio gains with the insurance provider if the realized return on the reference portfolio is above a predetermined threshold. Formally, a generic EPS consists of protection and fee legs with participation rates agreed upon by the provider and holder. A general fair pricing formula for an EPS is obtained by considering a static hedging strategy based on traded European options. It is argued that to make the contract appealing to holders, the provider should select appropriate protection and fee rates that make the fair premium at the contract's inception equal to zero. A numerical study based on the Black-Scholes model and empirical tests based on market data for the S&P 500 and S&P/ASX 200 indices for 2020-2022 demonstrate the benefits of an EPS as an efficient investment insurance tool for superannuation accounts.
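A stylized version of the static-hedging valuation can be written down directly. Under a hypothetical specification in which the protection leg is a fraction of an at-the-money put and the fee leg a fraction of a call struck above the gain threshold, the fair fee participation rate solves premium = 0. This contract design is assumed for illustration and is not the paper's exact payoff.

```python
import numpy as np
from scipy.stats import norm

def bs_put(S, K, r, sigma, T):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return K * np.exp(-r * T) * norm.cdf(-d2) - S * norm.cdf(-d1)

def bs_call(S, K, r, sigma, T):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def eps_premium(S0, r, sigma, T, protect_rate, fee_rate, gain_threshold):
    """Premium of a stylized EPS: the protection leg pays protect_rate of
    losses below S0 (a put), the fee leg gives up fee_rate of gains above
    the threshold (a call). A fair design sets this premium to zero."""
    protection = protect_rate * bs_put(S0, S0, r, sigma, T)
    fee = fee_rate * bs_call(S0, S0 * (1 + gain_threshold), r, sigma, T)
    return protection - fee

# Solve for the fee participation rate making the contract costless at inception
S0, r, sigma, T = 100.0, 0.02, 0.18, 1.0
protection = 0.5 * bs_put(S0, S0, r, sigma, T)
fee_rate = protection / bs_call(S0, S0 * 1.05, r, sigma, T)
print(f"fair fee participation rate ≈ {fee_rate:.2%}")
```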
Two of the most important technological advancements currently underway are the advent of quantum technologies, and the transitioning of global financial systems towards cryptographic assets, notably blockchain-based cryptocurrencies and smart contracts. There is, however, an important interplay between the two, given that, in due course, quantum technology will have the ability to directly compromise the cryptographic foundations of blockchain. We explore this complex interplay by building financial models for quantum failure in various scenarios, including pricing quantum risk premiums. We call this quantum crypto-economics.
We develop inferential tools for latent factor analysis in short panels. The pseudo maximum likelihood setting, under a large cross-sectional dimension n and a fixed time series dimension T, relies on a diagonal T×T covariance matrix of the errors without imposing sphericity or Gaussianity. We outline the asymptotic distributions of the latent factor and error covariance estimates, as well as of an asymptotically uniformly most powerful invariant (AUMPI) test for the number of factors based on the likelihood ratio statistic. We derive the AUMPI characterization from inequalities ensuring the monotone likelihood ratio property for positive definite quadratic forms in normal variables. An empirical application to a large panel of monthly U.S. stock returns separates systematic and idiosyncratic risks month by month in short subperiods of bear vs. bull markets, based on the selected number of factors. We observe an uptrend in the paths of total and idiosyncratic volatilities, while systematic risk explains a large part of the cross-sectional total variance in bear markets but is not driven by a single factor. Rank tests show that observed factors struggle to span the latent factors, with the discrepancy between the dimensions of the two factor spaces decreasing over time.
The main result of this paper is a collateralized counterparty valuation adjusted pricing equation, which allows one to price a deal while taking into account credit and debit valuation adjustments (CVA, DVA) along with margining and funding costs, all in a consistent way. Funding risk breaks the bilateral nature of the valuation formula. We find that the equation has a recursive form, making the introduction of a purely additive funding valuation adjustment (FVA) difficult. Yet, we can cast the pricing equation into a set of iterative relationships which can be solved by means of standard least-squares Monte Carlo techniques. As a consequence, we find that identifying funding costs with debit valuation adjustments is not tenable in general, contrary to what has been suggested in the literature for simple cases. The assumptions under which funding costs vanish are a very special case of the more general theory. We define a comprehensive framework that allows us to derive earlier results on funding or counterparty risk as special cases, although our framework is more than the sum of such special cases. We derive the general pricing equation by resorting to a risk-neutral approach where the new types of risks are included by modifying the payout cash flows. We consider realistic settings and include in our models the common market practices suggested by ISDA documentation, without assuming restrictive constraints on margining procedures and close-out netting rules. In particular, we allow for asymmetric collateral and funding rates, and exogenous liquidity policies and hedging strategies. Re-hypothecation liquidity risk and close-out amount evaluation issues are also covered. Finally, relevant examples of non-trivial settings illustrate how to derive known facts about discounting curves from a robust general framework and without resorting to ad hoc hypotheses.
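The iterative solution strategy has a familiar backward-induction skeleton, sketched generically below: regress next-period discounted values on basis functions of the current state to approximate conditional expectations path by path. The polynomial basis, single state variable, and constant discount factor are simplifying assumptions; in the paper's recursion, collateral, funding, and default cash flows would enter at each step.

```python
import numpy as np

def lsmc_backward(values_T, states, discount, basis_deg=3):
    """Generic least-squares Monte Carlo backward induction for recursive
    pricing relationships: at each step, regress next-period discounted
    values on a polynomial basis of the current state to obtain the
    conditional expectation along each path."""
    v = values_T.copy()
    for t in range(len(states) - 1, -1, -1):
        X = np.vander(states[t], basis_deg + 1)      # polynomial basis in state
        beta = np.linalg.lstsq(X, discount * v, rcond=None)[0]
        v = X @ beta                                  # fitted conditional expectation
        # (cash-flow adjustments for collateral/funding would be added here)
    return v.mean()                                   # time-0 value estimate
```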
This paper from researchers at Keio University and Meiji University investigates the time-varying validity and factor redundancy of Fama-French multifactor models (FF3 and FF5) across the U.S., Japan, and Europe, employing robust generalized GRS statistics. It demonstrates that model effectiveness and factor significance are largely unstable over time and sensitive to portfolio construction in the U.S. and Europe, while in Japan, models, including the CAPM, show consistent validity across most periods and portfolio sorts, suggesting higher market efficiency.
This paper surveys various methodologies for constructing the implied volatility surface (IVS), a critical tool in computational finance. It details model-based, parametric, and nonparametric approaches, analyzing their characteristics, practical implementation challenges, and methods to ensure arbitrage-freeness for accurate option pricing and risk management.
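As one concrete example of the parametric class such surveys cover, Gatheral's raw SVI slice and a basic sanity check on its parameters can be sketched as follows; full butterfly-arbitrage verification needs the Durrleman condition on the implied density, which is omitted here.

```python
import numpy as np

def svi_total_variance(k, a, b, rho, m, sig):
    """Gatheral's raw SVI slice: total implied variance w(k) at log-moneyness k."""
    return a + b * (rho * (k - m) + np.sqrt((k - m)**2 + sig**2))

def svi_slice_sanity(a, b, rho, m, sig):
    """Necessary conditions for a well-behaved slice: b >= 0, |rho| < 1,
    and non-negative minimum total variance a + b*sig*sqrt(1 - rho^2)."""
    return b >= 0 and abs(rho) < 1 and a + b * sig * np.sqrt(1 - rho**2) >= 0

k = np.linspace(-0.5, 0.5, 11)
w = svi_total_variance(k, a=0.04, b=0.1, rho=-0.4, m=0.0, sig=0.2)
iv = np.sqrt(w / 1.0)   # implied vol slice for maturity T = 1
```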
The paper empirically demonstrates that randomized signature methods, combined with a reservoir computing framework, effectively provide non-linear and non-parametric drift estimations for optimal portfolio selection. This approach consistently outperforms traditional benchmarks in maximizing Sharpe Ratio on both simulated and real-world S&P 500 data, even when accounting for transaction costs.
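The core reservoir construction is compact: evolve a random feature vector along the path increments through fixed random matrices and a pointwise nonlinearity, then fit a linear readout. A minimal sketch, with the reservoir dimension, scaling, and tanh activation as illustrative assumptions:

```python
import numpy as np

def randomized_signature(X, dim=50, sigma=np.tanh, seed=0):
    """Randomized-signature reservoir: for a path X (n_steps x d), evolve
    Z via dZ = sum_i sigma(A_i Z + b_i) dX^i with fixed random matrices
    A_i and shifts b_i; the terminal Z is the nonlinear feature vector."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    A = rng.standard_normal((d, dim, dim)) / np.sqrt(dim)
    b = rng.standard_normal((d, dim))
    Z = np.zeros(dim)
    dX = np.diff(X, axis=0, prepend=X[:1])
    for t in range(n):
        for i in range(d):
            Z = Z + sigma(A[i] @ Z + b[i]) * dX[t, i]
    return Z
```

The terminal feature vector would then feed a linear readout, e.g. a ridge regression onto realized returns, to produce the non-parametric drift estimate.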
We investigate asymmetry of information in the context of the robust approach to pricing and hedging of financial derivatives. We consider two agents, one who only observes the stock prices and another with some additional information, and investigate when the pricing-hedging duality for the former extends to the latter. We introduce a general framework to express the superhedging and market model prices for an informed agent. Our key insight is that an informed agent can be seen as a regular agent who can restrict her attention to a certain subset of possible paths. We use results of Hou & Obłój on the robust approach with beliefs to establish the pricing-hedging duality for an informed agent. Our results cover a number of scenarios, including information arriving before trading starts, arriving after a static position in European options is formed but before dynamic trading starts, or arriving at some point before maturity. For the latter, we show that the superhedging value satisfies a suitable dynamic programming principle, which is of independent interest.
To cope with the negative oil futures price caused by the COVID-19 recession, global commodity futures exchanges temporarily switched the option model from Black--Scholes to Bachelier in 2020. This study reviews the literature on Bachelier's pioneering option pricing model and summarizes the practical results on volatility conversion, risk management, stochastic volatility, and barrier options pricing to facilitate the model transition. In particular, using the displaced Black-Scholes model as a model family with the Black-Scholes and Bachelier models as special cases, we not only connect the two models but also present a continuous spectrum of model choices.
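The volatility conversion such a transition relies on has a simple at-the-money leading-order form: an undiscounted ATM Bachelier call is exactly σ_N√(T/2π), while an ATM Black-Scholes call is approximately Fσ_BS√(T/2π), so σ_N ≈ F·σ_BS near the money. A sketch under these textbook formulas:

```python
import numpy as np
from scipy.stats import norm

def bachelier_call(F, K, sigma_n, T):
    """Bachelier (normal-model) call on a forward, undiscounted."""
    d = (F - K) / (sigma_n * np.sqrt(T))
    return (F - K) * norm.cdf(d) + sigma_n * np.sqrt(T) * norm.pdf(d)

# Leading-order ATM conversion: sigma_n ≈ F * sigma_bs near the money
F, T, sigma_bs = 50.0, 0.5, 0.3
sigma_n = F * sigma_bs
print(bachelier_call(F, F, sigma_n, T))   # ≈ the ATM Black-Scholes price
```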
We present a novel Adaptive Distribution Generator that leverages a quantum-walk-based approach to generate target probability distributions with high precision and efficiency. Our method integrates variational quantum circuits with discrete-time quantum walks, specifically split-step quantum walks and their entangled extensions, to dynamically tune coin parameters and drive the evolution of quantum states towards desired distributions. This enables accurate one-dimensional probability modeling for applications such as financial simulation, as well as structured two-dimensional pattern generation exemplified by digit representations (0-9). Implemented within the CUDA-Q framework, our approach exploits GPU acceleration to significantly reduce computational overhead and improve scalability relative to conventional methods. Extensive benchmarks demonstrate that our Quantum-Walks-Based Adaptive Distribution Generator achieves high simulation fidelity and bridges the gap between theoretical quantum algorithms and practical high-performance computation.
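Stripped of the variational training loop and the CUDA-Q implementation, the underlying split-step walk is straightforward to simulate classically. A minimal NumPy sketch with fixed (untrained) coin angles:

```python
import numpy as np

def split_step_qwalk(n_sites, n_steps, theta1, theta2):
    """Split-step discrete-time quantum walk on a line: each step applies
    coin rotation R(theta1), a right shift of the spin-up component,
    coin rotation R(theta2), then a left shift of the spin-down component;
    returns the position probability distribution."""
    def rot(theta):
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
    psi = np.zeros((n_sites, 2), dtype=complex)
    psi[n_sites // 2] = [1 / np.sqrt(2), 1j / np.sqrt(2)]   # balanced initial coin
    R1, R2 = rot(theta1), rot(theta2)
    for _ in range(n_steps):
        psi = psi @ R1.T
        psi[:, 0] = np.roll(psi[:, 0], 1)    # shift spin-up right
        psi = psi @ R2.T
        psi[:, 1] = np.roll(psi[:, 1], -1)   # shift spin-down left
    return np.sum(np.abs(psi)**2, axis=1)

# Lattice large enough that the walk never wraps around the boundary
p = split_step_qwalk(n_sites=201, n_steps=80, theta1=0.3, theta2=0.8)
```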