Submissions opened on December 1, 2025, and closed on January 26, 2026.
Hassan Abdelrahman (University of Toronto)
A Flexible Bayesian Framework for Reserving with Hierarchical Temporal Modeling
Accurate loss reserving is fundamental to financial stability and risk management in property and casualty insurance. While traditional macro-level reserving methods rely on aggregated run-off triangles, they obscure important claim-level dynamics and do not permit separate estimation of reported-but-not-settled (RBNS) and incurred-but-not-reported (IBNR) reserves. Models such as the Double Chain Ladder and the Collective Reserving Model address these limitations by operating at a finer level of granularity; however, their practical implementation is often hindered by severe overparameterization, data sparsity, and limited uncertainty quantification.
This paper develops a unified Bayesian framework, built on the Collective Reserving Model, that balances modeling flexibility with statistical regularization. The proposed approach decomposes cell-level payments into accident period, reporting delay, and settlement delay components, while allowing for interactions between these dimensions within a coherent probabilistic structure. Rather than relying on restrictive multiplicative assumptions or treating effects as independent fixed parameters, the framework employs structured temporal smoothing through first-order random walks and spline-based development patterns, enabling information sharing across adjacent time periods.
The model further supports hierarchical extensions that accommodate multiple claim types or business segments through partial pooling, allowing group-specific heterogeneity to be estimated robustly even under sparse data conditions. Additional features commonly encountered in practical reserving applications—such as calendar year effects, seasonal variation, interaction effects, and portfolio-level covariates—can be incorporated naturally within the same modeling architecture. Semi-continuous payment behavior is addressed using a hurdle-based likelihood, permitting separate modeling of payment occurrence and severity.
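As a rough illustration of these ingredients (notation assumed here, not the paper's exact specification), a hurdle likelihood for the payment in cell (i, j), with accident period i, development delay j, and calendar period i + j, combined with first-order random-walk smoothing, could be written as
\[
\Pr(Y_{ij} = 0) = 1 - \pi_{ij}, \qquad Y_{ij} \mid Y_{ij} > 0 \sim \mathrm{Gamma}\!\big(\phi,\ \phi/\mu_{ij}\big),
\]
\[
\log \mu_{ij} = \alpha_i + \beta_j + \gamma_{i+j}, \qquad \alpha_i \mid \alpha_{i-1} \sim \mathcal N\big(\alpha_{i-1},\ \tau^2\big),
\]
where the random walk on the accident-period effects \(\alpha_i\) shares information across adjacent periods, and analogous smoothing or spline bases can be placed on the delay effects \(\beta_j\).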
Model performance is evaluated through sequential out-of-sample reserving exercises across multiple valuation dates. The results demonstrate that structured Bayesian regularization leads to improved predictive accuracy and more stable uncertainty quantification relative to baseline collective reserving specifications. Overall, the proposed framework provides a flexible and interpretable foundation for modern loss reserving, bridging methodological rigor with practical applicability.
Criscent Birungi (Concordia University)
Habit Formation, Labor Supply, and the Dynamics of Retirement and Annuitization
The decision to annuitize wealth in retirement planning has become increasingly complex due to rising longevity risk and changing retirement patterns, including increased labor force participation at older ages. While an extensive literature studies consumption, labor, and annuitization decisions, these elements are typically examined in isolation. This paper develops a unified framework in which utility depends on consumption relative to an internal habit, labor supply is chosen flexibly, mortality is age dependent, and annuitization is an irreversible stopping decision. We derive optimal consumption, labor, portfolio, and annuitization policies in a continuous-time lifecycle model. Our results reveal a rich sequence of retirement dynamics. When wealth is low relative to habit, labor is supplied defensively to protect consumption standards. As wealth increases, agents enter a work-to-retire phase in which labor is supplied at its maximum level to accelerate access to retirement. Human capital acts as a stabilizing asset, justifying a more aggressive pre-retirement investment portfolio, followed by abrupt de-risking upon annuitization. Subjective mortality beliefs are a key determinant of retirement dynamics. Agents with pessimistic longevity beliefs rationally perceive annuities as unattractive, leading them to avoid or delay annuitization. This framework provides a behavior-based explanation for low annuity demand and offers guidance for retirement planning that jointly links labor supply, portfolio choice, and the timing of annuitization. (Joint work with Cody Hyndman).
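Schematically (with notation assumed here rather than taken from the paper), the key ingredients are an internal habit \(h_t\), a consumption and labor utility, and an irreversible annuitization time \(\tau\):
\[
\sup_{c,\,\ell,\,\pi,\,\tau}\ \mathbb E\!\left[\int_0^{\tau_d} e^{-\rho t}\, u\big(c_t - h_t,\ \ell_t\big)\, dt\right],
\qquad dh_t = \big(\lambda c_t - \delta h_t\big)\,dt,
\]
where \(\ell_t\) is labor supply, \(\pi_t\) the risky-asset allocation, \(\tau_d\) the age-dependent time of death, and at the stopping time \(\tau\) remaining wealth is irreversibly converted into a life annuity.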
Dominik Chevalier (Université Laval)
Quantifying epistemic uncertainty in gradient boosting via spectral decomposition of staged predictions
Gradient-boosted decision tree (GBDT) models are widespread in the insurance industry, as they achieve state-of-the-art performance on tabular data. A limitation of GBDT models is the absence of a measure of predictive uncertainty, an element readily available in generalised linear models and essential in many high-stakes applications. Aleatoric uncertainty can be quantified with probabilistic predictions, mainly through probabilistic GBDT algorithms, but the epistemic uncertainty component still requires investigation. In this work, we shift the perspective of existing proposals by leveraging the sequential nature of GBDT models to construct a consistent model variance estimator. We exploit the spectral decomposition of the discrete-time stochastic process formed by a GBDT's staged predictions. After proving the consistency properties of our estimator, we present numerical results from simulations, comparing the performance of our epistemic uncertainty quantification method against model-agnostic benchmarks: ensembling and conformal prediction.
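A minimal sketch of the ingredients (illustrative only, not the authors' spectral estimator): scikit-learn exposes the staged-prediction path of a boosted model, and a model-agnostic ensembling benchmark estimates epistemic uncertainty as the variance across models trained on bootstrap resamples. All names and hyperparameters below are placeholders.

```python
# Illustrative only: staged predictions of a GBDT and an ensembling benchmark
# for epistemic uncertainty (not the paper's spectral estimator).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

# Staged prediction path: f_1(x), f_2(x), ..., f_M(x) after each boosting round.
gbdt = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05, random_state=0)
gbdt.fit(X, y)
path = np.column_stack(list(gbdt.staged_predict(X[:10])))  # shape (10, 200)

# Model-agnostic benchmark: variance across GBDTs trained on bootstrap resamples.
rng = np.random.default_rng(0)
preds = []
for b in range(20):
    idx = rng.integers(0, len(y), size=len(y))
    m = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05, random_state=b)
    m.fit(X[idx], y[idx])
    preds.append(m.predict(X[:10]))
epistemic_var = np.var(np.vstack(preds), axis=0)  # per-point variance estimate
print(path.shape, epistemic_var)
```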
Michelle Dong (The Australian National University)
Climate scenario analysis on compositional cause of death for the US and Australia
Climate risks pose pressing challenges for insurers, governments, and businesses worldwide, as they face growing uncertainty in quantifying the impacts of climate change. In light of this, scenario analysis is becoming more important as a way to inform potential ranges of impacts from climate-related risks. We aim to understand the impact of specific climate factors by cause of death for subgroups of the US and Australian populations, through hypothetical counterfactual scenario analysis based on an increase in temperature and dry days. We apply compositional data analysis (CODA) to consider cause-specific deaths, treating the density of deaths as a set of dependent, non-negative values that sum to one. CODA is a natural approach to accounting for dependencies between causes of death, and is especially important for producing forecasts by age and cause that are consistent when causes are aggregated. As climate and cause-specific mortality data present specific obstacles, within the CODA framework we further apply Principal Component Analysis (PCA) as a means of dimension reduction and consider Generalised Additive Models (GAM) to better reflect the non-linear relationships between climate factors and mortality by cause. Results from our analysis indicate that climate factors have varying impacts by cause, and by age within each cause, with more pronounced increases in the proportions of deaths from ischaemic heart disease and chronic obstructive pulmonary disease, and possible offsetting from hypertensive disease. The impacts across all causes are greater for ages over 75 years, reinforcing the observation that climate-related risks have a greater impact on older (and likely more vulnerable) subgroups of the population.
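A small sketch of the compositional pre-processing step (illustrative, on synthetic data, not the paper's dataset): cause-of-death proportions are mapped with a centered log-ratio (clr) transform and reduced with PCA before any regression on climate covariates.

```python
# Illustrative CODA pre-processing: clr transform of cause-of-death proportions
# followed by PCA (synthetic data only).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
counts = rng.gamma(shape=2.0, scale=1.0, size=(200, 8))   # 200 observations, 8 causes
props = counts / counts.sum(axis=1, keepdims=True)        # compositions sum to one

log_p = np.log(props)
clr = log_p - log_p.mean(axis=1, keepdims=True)           # centered log-ratio transform

pca = PCA(n_components=3)
scores = pca.fit_transform(clr)                           # low-dimensional scores
print(pca.explained_variance_ratio_, scores.shape)
# The PCA scores could then enter a GAM with temperature / dry-day covariates.
```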
Alexandre Dubeau (Université Laval)
Actor-critic reinforcement learning for auto insurance pricing
The competitive nature of the P&C insurance market prompts insurers to continually revise their pricing strategies to balance profitability and customer retention. Long-term relationships are crucial: policyholders who initially present higher short-term risk may become profitable later in their lifetime with the company. To address this dynamic, we formulate pricing as a sequential decision-making problem in which premium adjustments at each term influence renewal, revenue, and future portfolio composition. We adopt a reinforcement learning framework to naturally accommodate stochasticity in customer trajectories and capture the intertemporal trade-offs inherent to pricing. We consider a state-of-the-art actor-critic algorithm and employ a gradient-boosted tree ensemble as the function approximator for the pricing policy, thereby preserving explainability. Beyond improving pricing decisions, the critic's value estimates can be viewed as estimates of customer lifetime value, supporting acquisition and retention strategies and thus yielding broader organizational benefits. Using auto insurance data from a Canadian insurer, this work serves as a proof of concept for applying actor-critic methods to insurance pricing.
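Schematically (notation assumed here): with state \(s_t\) collecting the policyholder's features and tenure, action \(a_t\) the premium adjustment, and reward \(r_t\) the realized term profit, a one-step actor-critic update is driven by the temporal-difference error
\[
\delta_t = r_t + \gamma\, V_\psi(s_{t+1})\,\mathbf 1\{\text{renewal}\} - V_\psi(s_t),
\qquad
\theta \leftarrow \theta + \eta_\theta\, \delta_t\, \nabla_\theta \log \pi_\theta(a_t \mid s_t),
\qquad
\psi \leftarrow \psi + \eta_\psi\, \delta_t\, \nabla_\psi V_\psi(s_t),
\]
so that the critic \(V_\psi(s)\) can be read as an estimate of the policyholder's remaining lifetime value under the current pricing policy.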
Chen Jin (Western University)
Forecasting implied volatility surface with generative diffusion models
We introduce a conditional Denoising Diffusion Probabilistic Model (DDPM) for generating arbitrage-free implied volatility (IV) surfaces, offering a more stable and accurate alternative to existing GAN-based approaches. To capture the path-dependent nature of volatility dynamics, our model is conditioned on a rich set of market variables, including exponentially weighted moving averages (EWMAs) of historical surfaces, returns and squared returns of the underlying asset, and scalar risk indicators such as the VIX. Empirical results demonstrate that our model significantly outperforms leading GAN-based models in capturing the stylized facts of IV dynamics. A key challenge is that the earlier portion of the historical training data often contains small arbitrage opportunities, which conflicts with the goal of generating arbitrage-free surfaces. We address this by incorporating a standard arbitrage penalty into the loss function, but apply it using a novel, parameter-free weighting scheme based on the signal-to-noise ratio (SNR) that dynamically adjusts the penalty's strength across the diffusion process. We also provide a formal analysis of this trade-off and a proof of convergence showing that the penalty introduces a small, controllable bias that steers the model toward the manifold of arbitrage-free surfaces while ensuring the generated distribution remains close to the real-world data.
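A minimal sketch of one ingredient, the signal-to-noise ratio along a DDPM noise schedule, which could be used to modulate an auxiliary arbitrage penalty across diffusion steps (the schedule and the weighting rule below are assumptions, not the paper's exact choice):

```python
# Illustrative: per-step SNR of a linear-beta DDPM schedule and an SNR-based
# weight for an auxiliary (e.g. arbitrage) penalty term.
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)          # standard linear beta schedule
alphas_bar = np.cumprod(1.0 - betas)        # cumulative product \bar{alpha}_t
snr = alphas_bar / (1.0 - alphas_bar)       # signal-to-noise ratio per step

# One parameter-free choice: weight the penalty by the normalized SNR, so the
# constraint is enforced mostly at low-noise steps where the surface is visible.
penalty_weight = snr / snr.max()

def diffusion_loss(mse_eps, arbitrage_penalty, t):
    """Denoising MSE plus an SNR-weighted arbitrage penalty at step t."""
    return mse_eps + penalty_weight[t] * arbitrage_penalty
```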
Derek Kusmenko (University of Toronto)
Insurance Pricing with Fine-Tuned LLMs
This study investigates replacing structured covariates in insurance pricing models with semantic embeddings derived from large language models (LLMs). Traditional actuarial models rely on categorical or numeric features such as age, gender, and occupation, and assume additive or low-order interaction effects. However, risk assessment often depends on higher-order interactions and contextual interpretations that are difficult to capture explicitly. We propose an approach in which natural language prompts, describing policyholder attributes in underwriting terms, are encoded by a foundation model to produce latent feature representations. These embeddings serve as inputs to standard regression or machine learning models for predicting pure premiums in motor insurance. The method enables the model to leverage the LLM’s ability to infer implicit correlations and domain-relevant associations across attributes, learned from vast text corpora. Using illustrative examples, we show how feature replacement can enhance model flexibility, capture complex relationships, and maintain actuarial fairness. This framework provides a principled approach to integrating foundation model representations into risk modelling and portfolio analytics.
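A sketch of the pipeline under stated assumptions (the prompt template, the sentence-transformers encoder, and the regressor below are placeholders, not the study's fine-tuned model): policyholder attributes are verbalized, encoded into embeddings, and passed to a standard pure-premium regression.

```python
# Illustrative pipeline: verbalize policyholder attributes, embed the text, and
# fit a standard pure-premium regression on the embeddings (toy data only).
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import PoissonRegressor

policyholders = [
    {"age": 34, "gender": "F", "occupation": "nurse", "vehicle": "2019 Honda Civic"},
    {"age": 58, "gender": "M", "occupation": "contractor", "vehicle": "2015 Ford F-150"},
    {"age": 22, "gender": "M", "occupation": "student", "vehicle": "2008 Mazda 3"},
]
pure_premium = [120.0, 310.0, 640.0]   # toy targets; a real portfolio has many more rows

def to_prompt(p):
    return (f"Policyholder aged {p['age']}, {p['gender']}, works as {p['occupation']}, "
            f"insures a {p['vehicle']}.")

encoder = SentenceTransformer("all-MiniLM-L6-v2")          # assumed embedding model
X = encoder.encode([to_prompt(p) for p in policyholders])  # latent feature matrix

glm = PoissonRegressor(alpha=1.0)                          # standard GLM-style model
glm.fit(X, pure_premium)
print(glm.predict(X))
```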
Jongtaek Lee (University of Toronto)
A Frequency-Severity Trip-Level Risk Index for Classification of Telematics Signals
In this paper, we introduce a novel notion of severity for telematics signals, which is newly formalized in actuarial ratemaking: severity measures the rarity of abnormal driving patterns relative to a portfolio-level baseline, rather than claim size conditional on an accident. Building on this notion, we propose a frequency–severity joint trip-level risk index that integrates both how often abnormal patterns occur and how severe (i.e., rare) they are. Using high-frequency telematics signals, we construct a multi-scale representation via the maximal overlap discrete wavelet transform (MODWT) to retain localized driving patterns and aggregate MODWT coefficients across scales to summarize the patterns across multiple time scales. To quantify severity as tail rarity, we propose a portfolio-level Gaussian–Uniform mixture with a multi-Uniform tail structure that extends the usual single-Uniform model: Gaussian components represent typical driving, while multiple Uniform components absorb tail behavior by partitioning each tail into ordered severity layers. We develop a Multi-Uniform MEMR (MU-MEMR) procedure that makes maximum-likelihood estimation practical for this extended mixture model. Given the estimated severity layers, we define multi-layer tail counts (MLTCs) per trip and model them with a Poisson–Gamma framework, yielding closed-form posterior intensities and a trip-level frequency–severity risk rate. The same conjugate structure supports sequential updating to produce an evolving driver-level risk profile. An application to the UAH-DriveSet controlled dataset demonstrates that incorporating severity materially improves discrimination of risky trips, particularly for drowsy driving that can exhibit a “normal-looking” profile.
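To illustrate the conjugate structure (the notation and the exposure measure \(e_t\) are assumptions made here for concreteness): if the layer-k tail count on trip t satisfies \(N_{t,k} \sim \mathrm{Poisson}(e_t \lambda_k)\) with prior \(\lambda_k \sim \mathrm{Gamma}(a_k, b_k)\), then after T trips
\[
\lambda_k \mid N_{1,k},\dots,N_{T,k} \sim \mathrm{Gamma}\!\Big(a_k + \sum_{t=1}^T N_{t,k},\ \ b_k + \sum_{t=1}^T e_t\Big),
\qquad
\mathbb E\big[\lambda_k \mid \cdot\,\big] = \frac{a_k + \sum_t N_{t,k}}{b_k + \sum_t e_t},
\]
and the posterior intensity updates sequentially as new trips arrive; weighting these layer intensities by severity yields a trip- or driver-level risk rate.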
Wuding Li (Université de Montréal)
Quadratic Hedging in Discrete Time with Stochastic Interest Rates and GARCH Volatility
We propose a discrete-time derivative pricing and hedging framework with stochastic interest rates and time-varying volatilities for both asset returns and the short-term interest rate. Volatility dynamics are modeled using affine GARCH processes with correlated Gaussian innovations, and risk-neutral dynamics are derived from a covariance-dependent pricing kernel. Within this setting, we obtain semi-closed-form expressions for risk-minimizing hedging positions that incorporate both equity and interest rate hedging instruments. Our numerical analysis combines simulation-based and empirical hedging experiments. Using S&P 500 index options and U.S. Treasury yield data, we evaluate hedging performance across different interest rate environments and demonstrate how stochastic interest rate dynamics and bond risk premia influence hedging effectiveness.
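As a one-instrument reference point (a textbook local risk-minimization formula, not the paper's joint equity and interest-rate solution), the risk-minimizing equity position over the period (t, t+1] is
\[
\xi_{t+1} = \frac{\operatorname{Cov}\!\big(V_{t+1},\ \Delta S_{t+1} \mid \mathcal F_t\big)}{\operatorname{Var}\!\big(\Delta S_{t+1} \mid \mathcal F_t\big)},
\]
where V is the discounted option value, \(\Delta S_{t+1}\) the discounted price increment, and the conditional moments are computed under the measure dictated by the quadratic criterion; with stochastic rates, an analogous condition in the bond or interest-rate instrument is solved jointly with this one.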
Dante Mata Lopez (Université du Québec à Montréal)
Optimal time to sell a stock in the presence of default and volatility risks
We consider a small investor who holds a stock that is subject to default risk and who seeks to identify the optimal time to sell the stock, in the sense of minimizing the "prophet's drawdown". This problem is phrased as an optimal stopping problem, which we solve explicitly in the case where the stock price is modelled by an exponential spectrally negative Lévy process. This is joint work with Aleksandar Mijatovic (U. Warwick).
Shiva Mehdipour Ghobadlou (Western University)
Modeling Dependence in Count and Hybrid Insurance Data
Accurate dependence modeling is fundamental to insurance product pricing, reserving, and capital allocation, where understanding the interplay between claim frequency and severity is essential. Traditional actuarial practice often relies on the simplifying assumption of independence, which can lead to misleading risk assessments. While copula models provide a standard framework for measuring dependence, they face significant challenges with discrete count data (in particular, the non-uniqueness of the copula) and often struggle to fit real-world data that deviates from standard parametric families.
In this study, we introduce a flexible, non-parametric approach based on Sklar-type density estimates to address these challenges. By employing perturbation strategies, we derive accurate joint probability distributions for both count–count and hybrid frequency–severity models. This framework avoids strong parametric assumptions and offers an intuitive way to capture complex dependence structures. We demonstrate the practical utility of these models by applying them to real-world motor insurance datasets, confirming their ability to precisely recover joint probability distributions for improved risk assessment and pricing.
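One standard perturbation device for discrete data, shown here only as context for the non-uniqueness issue (the paper's non-parametric Sklar-type estimator is different): jitter the counts with uniform noise to obtain continuous pseudo-observations before examining the dependence.

```python
# Illustrative: jittering discrete claim counts to obtain continuous
# pseudo-observations for dependence analysis (context only).
import numpy as np
from scipy.stats import kendalltau, rankdata

rng = np.random.default_rng(42)
n = 2000
freq_a = rng.poisson(lam=1.0, size=n)                  # counts, business line A
freq_b = rng.poisson(lam=0.5 + 0.5 * freq_a, size=n)   # counts, line B, dependent on A

# Uniform jittering breaks ties so the copula of the perturbed data is unique.
u = (rankdata(freq_a + rng.uniform(size=n)) - 0.5) / n
v = (rankdata(freq_b + rng.uniform(size=n)) - 0.5) / n

tau, _ = kendalltau(u, v)
print(f"Kendall's tau on jittered pseudo-observations: {tau:.3f}")
```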
Kathleen Miao (University of Toronto)
Discrimination-Insensitive Pricing
Fair and discrimination-free insurance premia can be considered a societal good, managing risks for all members of a community. However, in Canada, insurers are allowed to discriminate when it is a business necessity, i.e., when there is no practical alternative to using "unfair", or protected, covariates. In this work, we propose a discrimination-insensitive premium pricing framework, where we require the premium to be insensitive to the (exogenously determined) protected covariates. We formulate and solve a mathematically tractable optimisation problem that finds the nearest (in Kullback-Leibler divergence) "pricing" measure, termed the discrimination-insensitive measure, such that the resultant joint distribution of covariates is discrimination-insensitive. We provide conditions for existence and uniqueness of the solution.
In the case where the discrimination-insensitive pricing measure is undesirable or infeasible, we further propose a two-step pricing procedure. In particular, we relax the problem to finding measures that are marginally insensitive to each discriminatory covariate, and then reconcile them via a constrained barycentre model. We provide a closed-form solution to this problem and give conditions for existence and uniqueness. As an intermediate step, we derive a representation of the KL barycentre of measures and prove its existence and uniqueness, which may be of independent interest.
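For orientation, in the unconstrained case and with the barycentre taken in the first argument of the KL divergence, the minimizer of \(\sum_i \lambda_i \mathrm{KL}(\mathbb Q \,\|\, \mathbb Q_i)\) over \(\mathbb Q\), with weights \(\lambda_i\) summing to one, is the normalized geometric mean of the component densities,
\[
\frac{d\mathbb Q^*}{d\mu} \ \propto\ \prod_i \left(\frac{d\mathbb Q_i}{d\mu}\right)^{\lambda_i};
\]
the constrained barycentre studied in the paper additionally imposes the marginal-insensitivity restrictions, so its solution differs from this unconstrained form.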
Finally, we compare the performance of our discrimination-insensitive premia and the constrained barycentre pricing measure with recently proposed actuarially fair premia in the literature in a numerical illustration.
Brandon Tam (University of Toronto)
Dynamic Pareto Optima in Multi-Period Pure-Exchange Economies
We study a problem of optimal allocation in a (discrete-time) multi-period pure-exchange economy, where the agents' preferences over stochastic processes are represented by strongly time-consistent dynamic risk measures. We introduce the notion of dynamic Pareto-optimal allocation processes and show that such processes can be constructed recursively starting with the allocation at the terminal time. We further derive a comonotonic improvement theorem for allocation processes and provide a recursive approach to constructing comonotonic dynamic Pareto optima when agent preferences are coherent and satisfy a property that we call equidistribution-preserving. In the special case where each agent's dynamic risk measure is of distortion type, we provide a closed-form characterization of comonotonic dynamic Pareto optima. We illustrate our results in a two-period setting.
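For orientation, the static one-period analogue (stated here under standard convexity conditions, not the paper's dynamic result) characterizes Pareto-optimal allocations of an endowment X as weighted inf-convolution minimizers: an allocation \((Y_1^*,\dots,Y_n^*)\) with \(\sum_i Y_i^* = X\) is Pareto optimal if and only if, for some weights \(\lambda_i > 0\),
\[
\sum_{i=1}^{n} \lambda_i\, \rho_i\big(Y_i^*\big) \;=\; \inf\left\{ \sum_{i=1}^{n} \lambda_i\, \rho_i(Y_i) \ :\ \sum_{i=1}^{n} Y_i = X \right\};
\]
the recursive construction in the paper extends this idea to allocation processes evaluated by strongly time-consistent dynamic risk measures.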
Safoora Zarei (University of Western Ontario)
An Actuarial Framework for Heavy-Tailed Spatially Dependent Earthquake Losses: Solvency Implications for British Columbia Insurers
A catastrophic earthquake in a major Canadian city can create systemic risk for the insurance industry. Although numerical catastrophe models provide a strong basis for seismic risk assessment, quantifying systemic portfolio risk is still actuarially challenging because the model must capture both heavy-tailed loss severities and cross-regional dependence, especially in the upper tail where large losses occur jointly. This paper develops an actuarial framework to address these features and to quantify the financial impact of dependent earthquake losses. To overcome the limitations of sparse historical earthquake data, we calibrate the framework using a high-fidelity 500,000-year synthetic earthquake catalog for British Columbia, Canada. The framework combines three components: (i) a common Poisson shock model to represent dependent event counts across three high-risk regions; (ii) heavy-tailed marginal severity models, including a composite Generalized Beta of the second kind–Generalized Pareto (GB2–GPD) model and a GB2–GB2 mixture, to represent the full loss range; and (iii) copula models to capture dependence between regional severities generated by the same shock. The resulting annual aggregate loss distribution supports solvency-relevant risk metrics such as Value-at-Risk (VaR) and Tail Value-at-Risk (TVaR) and shows that ignoring dependence can materially underestimate tail risk and required capital. The proposed approach offers a practical and computationally efficient tool for insurers and regulators to assess systemic seismic risk.
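A compressed sketch of the simulation logic under stated assumptions (the rates, the GPD severity parameters, and the Gaussian copula below are placeholders, not the calibrated British Columbia model):

```python
# Illustrative common-shock simulation: shared Poisson event counts across three
# regions, heavy-tailed (GPD) severities, and a Gaussian copula linking the
# severities generated by the same shock. All parameters are placeholders.
import numpy as np
from scipy.stats import genpareto, norm

rng = np.random.default_rng(7)
n_years = 50_000
lam_common = 0.02                                   # rate of shared (all-region) events
lam_idio = np.array([0.010, 0.008, 0.005])          # region-specific event rates
xi, sigma = 0.4, 50.0                               # GPD shape / scale (losses in $M)
rho = 0.6
corr = rho * np.ones((3, 3)) + (1.0 - rho) * np.eye(3)
chol = np.linalg.cholesky(corr)

annual_loss = np.zeros(n_years)
for y in range(n_years):
    n_common = rng.poisson(lam_common)
    n_idio = rng.poisson(lam_idio)                  # per-region idiosyncratic counts
    for _ in range(n_common):                       # each shared shock hits every region
        u = norm.cdf(chol @ rng.standard_normal(3)) # Gaussian copula over severities
        annual_loss[y] += genpareto.ppf(u, c=xi, scale=sigma).sum()
    for r in range(3):                              # independent region-specific events
        annual_loss[y] += genpareto.rvs(c=xi, scale=sigma, size=n_idio[r],
                                        random_state=rng).sum()

var_99 = np.quantile(annual_loss, 0.99)
tvar_99 = annual_loss[annual_loss >= var_99].mean()
print(f"VaR 99%: {var_99:.1f}  TVaR 99%: {tvar_99:.1f}")
```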