Advances in Bayesian Approaches for Stochastic Process Modeling and Uncertainty Quantification
Abstract
Stochastic processes serve as foundational models for systems that evolve randomly across time or space, making them essential tools in disciplines such as finance, physics, epidemiology, and environmental science. Traditional statistical methods often yield only point estimates of model parameters, limiting their capacity to capture the uncertainty inherent in such systems. In contrast, Bayesian inference offers a rigorous probabilistic framework by treating both parameters and stochastic processes as random variables, enabling the integration of prior knowledge and yielding posterior distributions that represent uncertainty in full. This paper presents a comprehensive survey of Bayesian inference as applied to stochastic processes. It begins by outlining the theoretical foundations of Bayes' Theorem in this context, emphasizing the importance of prior specification for infinite-dimensional function spaces. The discussion then turns to key classes of stochastic processes—including Gaussian Processes, Markov Models, and State-Space Models—highlighting how Bayesian methods enhance their interpretability and predictive capacity. Because the posterior distributions arising in these models are rarely tractable in closed form, the paper also reviews modern computational techniques, such as Markov Chain Monte Carlo (MCMC) and Variational Inference (VI), that enable practical implementation. Applications across multiple domains demonstrate the flexibility and power of the Bayesian approach. The study concludes by identifying emerging challenges and outlining promising directions for future research in Bayesian inference for stochastic systems.
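To make the abstract's central idea concrete—that Bayesian inference yields full posterior distributions rather than point estimates—the following NumPy sketch computes the exact posterior mean and variance of a zero-mean Gaussian Process regression model with a squared-exponential kernel and Gaussian observation noise. The function names (`rbf_kernel`, `gp_posterior`) and the hyperparameter values are illustrative assumptions, not taken from the paper; they sketch the standard GP conditioning formulas rather than any specific method surveyed here.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two 1-D input arrays.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise_var=0.1):
    # Posterior mean and covariance of a zero-mean GP, conditioned on
    # noisy observations (standard Gaussian conditioning formulas).
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    mean = K_s.T @ np.linalg.solve(K, y_train)
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, cov

# Three noisy observations of a sine function; predict at x = 0.5.
x_train = np.array([-1.0, 0.0, 1.0])
y_train = np.sin(x_train)
x_test = np.array([0.5])
mean, cov = gp_posterior(x_train, y_train, x_test)
```

Note that `cov` quantifies the remaining predictive uncertainty at the test point—exactly the information a point-estimate method would discard.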


Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.