### Create a reference in different formats (copy and paste)

**Harvard**

Larsén, I., Karlsson, D., Eriksson, M., Wallin, E. and Helgegren, R. (2017) *Gaussian processes for emulating chiral effective field theory describing few-nucleon systems*.

**BibTeX**

    @misc{Larsén2017,
      author   = {Larsén, Isak and Karlsson, Daniel and Eriksson, Martin and Wallin, Erik and Helgegren, Rikard},
      title    = {Gaussian processes for emulating chiral effective field theory describing few-nucleon systems},
      abstract = {Gaussian processes (GPs) can be used for statistical regression, i.e. to predict new data
                  given a set of observed data. In this context, we construct GPs to emulate the calculation
                  of low-energy proton-neutron scattering cross sections and the binding energy of the
                  helium-4 nucleus. The GP regression uses so-called kernel functions to approximate the
                  covariance between observed and unknown data points. The emulation is done in an attempt
                  to reduce the large computational cost associated with exact numerical simulation of the
                  observables. The underlying physical theory of the simulation is chiral effective field
                  theory (EFT). This theory enables a perturbative description of low-energy nuclear forces
                  and is governed by a set of low-energy constants that define the terms in the effective
                  Lagrangian. We use the research code nsopt to simulate selected observables using EFT.
                  The GPs used in this thesis are implemented using the Python framework GPy. To measure
                  the performance of a GP we define an error measure called model error by comparing exact
                  simulations to emulated predictions. We also study the time and memory consumption of
                  GPs. The choice of input training data affects the predictive accuracy of the resulting
                  GP. Therefore, we examined different sampling methods with varying amounts of data.
                  We found that GPs can serve as an effective and versatile approach for emulating the
                  examined observables. After the initial high computational cost of training, making
                  predictions with GPs is quick. When trained using the right methods, they can also
                  achieve high accuracy. We concluded that the Matérn 5/2 and RBF kernels perform best for
                  the observables studied. When sampling input points in high dimensions, Latin hypercube
                  sampling is shown to be a good method. In general, with a multidimensional input space,
                  it is a good choice to use a kernel function with different sensitivities in different
                  directions. When working with data that spans many orders of magnitude, log-transforming
                  the data before training also improves GP performance. GPs do not appear to be a suitable
                  method for making extrapolations from a given training set, but perform well for
                  interpolations.},
      year     = {2017},
      keywords = {Machine learning, Gaussian processes, Chiral effective field theory, Scattering},
      note     = {35},
    }

**RefWorks**

    RT Generic
    SR Electronic
    ID 256359
    A1 Larsén, Isak
    A1 Karlsson, Daniel
    A1 Eriksson, Martin
    A1 Wallin, Erik
    A1 Helgegren, Rikard
    T1 Gaussian processes for emulating chiral effective field theory describing few-nucleon systems
    YR 2017
    AB Gaussian processes (GPs) can be used for statistical regression, i.e. to predict new data given a
    set of observed data. In this context, we construct GPs to emulate the calculation of low-energy
    proton-neutron scattering cross sections and the binding energy of the helium-4 nucleus. The GP
    regression uses so-called kernel functions to approximate the covariance between observed and unknown
    data points. The emulation is done in an attempt to reduce the large computational cost associated
    with exact numerical simulation of the observables. The underlying physical theory of the simulation
    is chiral effective field theory (EFT). This theory enables a perturbative description of low-energy
    nuclear forces and is governed by a set of low-energy constants that define the terms in the effective
    Lagrangian. We use the research code nsopt to simulate selected observables using EFT. The GPs used in
    this thesis are implemented using the Python framework GPy. To measure the performance of a GP we
    define an error measure called model error by comparing exact simulations to emulated predictions. We
    also study the time and memory consumption of GPs. The choice of input training data affects the
    predictive accuracy of the resulting GP. Therefore, we examined different sampling methods with
    varying amounts of data. We found that GPs can serve as an effective and versatile approach for
    emulating the examined observables. After the initial high computational cost of training, making
    predictions with GPs is quick. When trained using the right methods, they can also achieve high
    accuracy. We concluded that the Matérn 5/2 and RBF kernels perform best for the observables studied.
    When sampling input points in high dimensions, Latin hypercube sampling is shown to be a good method.
    In general, with a multidimensional input space, it is a good choice to use a kernel function with
    different sensitivities in different directions. When working with data that spans many orders of
    magnitude, log-transforming the data before training also improves GP performance. GPs do not appear
    to be a suitable method for making extrapolations from a given training set, but perform well for
    interpolations.
    LA eng
    LK http://publications.lib.chalmers.se/records/fulltext/256359/256359.pdf
    OL 30
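The abstract above describes a GP-emulation workflow: sample training inputs with Latin hypercube sampling, fit a GP with a Matérn 5/2 kernel that has a separate length scale per input dimension, then replace further simulator calls with cheap interpolating predictions. A minimal sketch of that workflow is shown below. It uses scikit-learn and SciPy rather than the thesis's GPy and nsopt code, and a hypothetical toy function stands in for the expensive simulator; everything here is illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Toy stand-in for an expensive simulator (the thesis uses the research
# code nsopt; this smooth 2-D function is purely for illustration).
def simulator(x):
    return np.sin(3.0 * x[:, 0]) * np.exp(-x[:, 1])

# Latin hypercube sampling of training inputs in 2-D, as the abstract
# recommends for higher-dimensional input spaces.
sampler = qmc.LatinHypercube(d=2, seed=0)
X_train = sampler.random(n=40)
y_train = simulator(X_train)

# Matérn 5/2 kernel with one length scale per dimension, i.e. "different
# sensitivities in different directions"; hyperparameters are fitted by
# maximizing the marginal likelihood during .fit().
kernel = Matern(length_scale=[1.0, 1.0], nu=2.5)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

# After the one-time training cost, emulated predictions are cheap.
# Predictions stay inside the sampled region (interpolation, not
# extrapolation, per the abstract's conclusion).
X_test = sampler.random(n=10)
y_pred, y_std = gp.predict(X_test, return_std=True)
print(np.max(np.abs(y_pred - simulator(X_test))))  # small interpolation error
```

For observables spanning many orders of magnitude, the abstract also suggests log-transforming the targets before training, i.e. fitting the GP to `np.log(y_train)` and exponentiating the predictions.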