Author: Henry D. I. Abarbanel
Publisher: Cambridge University Press
Year: 2022
Pages: 207
Language: English
Format: PDF (true)
Size: 138.8 MB
Data assimilation is a hugely important mathematical technique, relevant in fields as diverse as geophysics, data science, and neuroscience. This modern book provides an authoritative treatment of the field as it relates to several scientific disciplines, with a particular emphasis on recent developments from machine learning (ML) and its role in the optimisation of data assimilation. Underlying theory from statistical physics, such as path integrals and Monte Carlo methods, is developed in the text as a basis for data assimilation, and the author then explores examples from current multidisciplinary research, such as the modelling of shallow water systems, ocean dynamics, and neuronal dynamics in the avian brain. The theory of data assimilation and machine learning is introduced in an accessible and unified manner, and the book is suitable for undergraduate and graduate students from science and engineering without specialized experience of statistical physics.
Supervised machine learning is a framework in which the information in some noisy input data is presented to a layer called l0. This input is sampled M times, carrying information about the probability distribution of the input; we label these samples y(k)(l0), k = 1, 2, ..., M. The input is transferred to and through a layered model network following a set of rules for the transfer from layer to layer. The aim is to reproduce, at the final output layer lF, the labeled, known characteristics of those data. In the ML literature, adding many layers to provide a 'representation' of the information presented to the network is often called deep learning; making the layer variable continuous is as 'deep' as one can go, so we call the use of continuous layers "deepest learning." These problems respect a symplectic symmetry (Kot; Arnol'd) in continuous time/layer phase space, and both Lagrangian and Hamiltonian versions of them are presented. For the numerical evaluation of expected values in the multilayer perceptron (MLP), we are once again required to use discrete layer steps, and maintaining the symplectic symmetry of the problem under discretization draws on our earlier discussion of this.
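A minimal sketch of such a symplectic update, assuming a separable Hamiltonian H(q, p) = p^2/2 + V(q) and the standard leapfrog (Stormer-Verlet) scheme; the function leapfrog and the harmonic-oscillator test below are illustrative choices, not the book's own discretization.

import numpy as np

# Leapfrog (Stormer-Verlet) integrator for a separable Hamiltonian
# H(q, p) = p^2/2 + V(q). Each step is an exactly symplectic map,
# so phase-space volume is preserved at the discrete level.
def leapfrog(q, p, grad_V, dt, n_steps):
    p = p - 0.5 * dt * grad_V(q)      # initial half-kick
    for _ in range(n_steps - 1):
        q = q + dt * p                # full drift
        p = p - dt * grad_V(q)        # full kick
    q = q + dt * p                    # final drift
    p = p - 0.5 * dt * grad_V(q)      # closing half-kick
    return q, p

# Hypothetical test: harmonic oscillator, V(q) = q^2/2, grad_V(q) = q.
q, p = leapfrog(np.array([1.0]), np.array([0.0]), lambda q: q,
                dt=0.1, n_steps=100)
print(0.5 * (p**2 + q**2))  # stays near the initial energy 0.5

Because each half-kick/drift/kick map is itself symplectic, the composed discrete dynamics retain the phase-space structure that the continuous time/layer formulation asks us to preserve.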
Download The Statistical Physics of Data Assimilation and Machine Learning