Modeling of the Markov random field for the purpose of its further optimization and application
DOI: 10.31673/2412-9070.2020.051215

Abstract
Models of the Markov random field and the main improvements to them are investigated. Markov models of random fields with binary conditional distributions that include stochastic evolution in time, based on an autoregressive structure at the large scale, retain the flexibility of static Markov random field models in representing spatial dependence at the small scale. Bayesian estimation in this case is achieved with an algorithm that requires the generation of auxiliary random fields but does not require the use of perfect samples. Markov random fields are a powerful tool in machine learning. It is often necessary to model such fields between dissimilar objects, so that the nodes of the graph belong to different data types. To model inhomogeneous areas using graphical models, different types of distributions (binary, Gaussian, Poisson, exponential, etc.) must be assigned to the model nodes. The article considers the concept of conditional random fields and establishes their features, advantages, and disadvantages. The use of binary data in Markov models of random fields is considered, which gives rise to the class of binary Markov random field models. It is established that the discrete nature of Markov random fields admits a wider range of possible dependence values, i.e., negative dependence. The model, the loss function, and the distribution of the Markov random field are investigated. A strengthening of Markov random fields is proposed, and the pairwise exponential Markov random field is considered.
Keywords: Markov random field; optimization; loss function; distribution; pairwise exponential Markov random field.
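As a minimal sketch of the binary pairwise Markov random field class discussed in the abstract (not code from the article itself), the following Gibbs sampler draws from an autologistic (Ising-type) model on a toroidal lattice. All function and parameter names here are illustrative; a negative coupling `beta` expresses the negative dependence that the abstract notes is possible for discrete Markov random fields.

```python
import math
import random

def gibbs_sample_binary_mrf(h=16, w=16, beta=0.5, alpha=0.0,
                            n_sweeps=50, seed=0):
    """Gibbs sampler for a pairwise binary (autologistic) MRF.

    Each site x[i][j] in {0, 1} has the full conditional
        P(x[i][j] = 1 | neighbors) = sigmoid(alpha + beta * s),
    where s is the sum of the four lattice neighbors (torus wrap-around).
    beta > 0 gives positive spatial dependence, beta < 0 negative dependence.
    """
    rng = random.Random(seed)
    x = [[rng.randint(0, 1) for _ in range(w)] for _ in range(h)]
    for _ in range(n_sweeps):
        for i in range(h):
            for j in range(w):
                s = (x[(i - 1) % h][j] + x[(i + 1) % h][j]
                     + x[i][(j - 1) % w] + x[i][(j + 1) % w])
                p = 1.0 / (1.0 + math.exp(-(alpha + beta * s)))
                x[i][j] = 1 if rng.random() < p else 0
    return x

# Example: a sample with negative dependence (checkerboard-like tendency).
field = gibbs_sample_binary_mrf(beta=-1.0)
print(len(field), len(field[0]))
```

Single-site Gibbs updates are the simplest choice here; the auxiliary-variable algorithm mentioned in the abstract addresses the intractable normalizing constant that arises when the coupling parameters themselves must be estimated.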
References
1. Propp, James Gary, David Bruce Wilson. Exact sampling with coupled Markov chains and applications to statistical mechanics // Random Structures and Algorithms. 1996. 9.1-2. P. 223–252.
2. Hughes, John, Murali Haran, Petruta C. Caragea. Autologistic models for binary data on a lattice // Environmetrics. 2011. 22.7. P. 857–871.
3. Zucchini W., MacDonald I. L., Langrock R. Hidden Markov Models for Time Series: An Introduction Using R. Chapman and Hall, 2016.
4. Wainwright M. J., Jordan M. I. Graphical Models, Exponential Families, and Variational Inference // Found. and Tr. in Mach. Learn. 2008. 1(1–2):1–305.
5. Minka T. P. The EP Energy Function and Minimization Schemes // MSR TR. 2001.
6. Hürzeler M., Künsch H. R. Monte Carlo Approximations for General State-Space Models // JCGS. 1998. 7(2):175–193.
7. Lafferty J., Mccallum A., Pereira F. Conditional random fields: Probabilistic models for segmenting and labeling sequence data // Proceedings of the 18th International Conference on Machine Learning. Williamstown, Massachusetts, 2001. P. 282–289.
8. Sha F., Pereira F. Shallow parsing with conditional random fields // In Proceedings of HLT/NAACL, 2003. P. 213–220.
9. Optimization of the gradient boosting algorithm using cross-validation / V. V. Zhebka, V. I. Vinogradov, A. P. Bondarchuk, M. N. Stepanov // Actual Problems of Economics. 2019. No. 12 (222). P. 189–197.