diff --git a/joss/paper.md b/joss/paper.md
index 31ccddcf..d1677cb5 100644
--- a/joss/paper.md
+++ b/joss/paper.md
@@ -51,7 +51,7 @@ Similarly, `PyBOP` can be used for parameter design optimisation under user-defi

# Statement of need

-`PyBOP` is a Python package designed to provide a user-friendly, object-oriented interface for optimising battery models. `PyBOP` leverages the open-source `PyBaMM` package [@Sulzer:2021] to formulate and solve of these battery models. `PyBOP` is intended to serve a broad audience of students, engineers, and researchers in both academia and the battery industry by enabling the use of predictive battery models where previously this was not possible. `PyBOP` emphasises clear and informative diagnostics and workflows for users of varying expertise, by providing advanced optimisation and sampling algorithms. These methods are provided through interfaces to `PINTS` [@Clerx:2019], `SciPy` [@SciPy:2020], in addition to the PyBOP's own algorithms such as Adaptive Moment Estimation with Weight Decay (AdamW), Gradient descent, and Cuckoo search.
+`PyBOP` is a Python package designed to provide a user-friendly, object-oriented interface for optimising battery models. `PyBOP` leverages the open-source `PyBaMM` package [@Sulzer:2021] to formulate and solve these battery models. `PyBOP` is intended to serve a broad audience of students, engineers, and researchers in both academia and the battery industry by enabling the use of predictive battery models where previously this was not possible. `PyBOP` emphasises clear and informative diagnostics and workflows for users of varying expertise by providing advanced optimisation and sampling algorithms. These methods are provided through interfaces to `PINTS` [@Clerx:2019] and `SciPy` [@SciPy:2020], in addition to `PyBOP`'s own algorithms such as Adaptive Moment Estimation with Weight Decay (AdamW), Gradient descent, and Cuckoo search.

`PyBOP` supports the Battery Parameter eXchange (BPX) standard [@BPX:2023] for sharing battery parameter sets. These parameter sets are costly to obtain due to the equipment and time required for characterisation experiments, the need for battery domain knowledge, and the computational cost of parameter estimation; `PyBOP` reduces these costs by providing fast computational estimation with parameter set interoperability.
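To make the interoperability concrete: a BPX parameter set is a plain JSON document, so it can be inspected with nothing more than the standard library. The sketch below is illustrative only; the file name is hypothetical, and the top-level `Header`/`Parameterisation` keys follow the published BPX schema (real workflows would normally go through the dedicated `bpx` validation package rather than raw JSON).

```python
import json

# Hypothetical file name; any BPX-compliant parameter set will do.
with open("lfp_cell_parameters.json") as f:
    bpx_doc = json.load(f)

# Top-level sections defined by the BPX schema (check the spec for your version).
print(bpx_doc["Header"]["BPX"])              # schema version of the parameter set
cell = bpx_doc["Parameterisation"]["Cell"]   # cell-level parameters
```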
@@ -59,53 +59,53 @@ This package complements other lithium-ion battery modelling packages built arou

# Architecture

-`PyBOP` has a layered data structure designed to compute and process the forward model predictions and to package the necessary information for the optimisation and sampling algorithms. The forward model is solved using the popular battery modelling package, `PyBaMM`, with construction, parameterisation, and discretisation managed by PyBOP's model interface to PyBaMM. This approach provides a robust object construction process with a consistent interface between the models and optimisers. The statistical methods and optimisation algorithms are then constructed to interface cleanly with the forward model predictions. Furthermore, identifiability metrics are provided alongside the estimated parameters through Hessian approximation of the cost functions in the frequentist workflows and posterior moments in the Bayesian workflows.
+`PyBOP` has a layered data structure designed to compute and process the forward model predictions and to package the necessary information for the optimisation and sampling algorithms. The forward model is solved using the popular battery modelling package, `PyBaMM`, with construction, parameterisation, and discretisation managed by `PyBOP`'s model interface to `PyBaMM`. This approach provides a robust object construction process with a consistent interface between the models and optimisers. The statistical methods and optimisation algorithms are then constructed to interface cleanly with the forward model predictions. Furthermore, identifiability metrics are provided alongside the estimated parameters through Hessian approximation of the cost functions in the frequentist workflows and posterior moments in the Bayesian workflows.

![PyBOP's interface to supporting software packages, alongside a visualisation of the general workflow for parameterisation and optimisation \label{fig:high-level}](figures/PyBOP-high-level.pdf){width=80%}

-`PyBOP` formulates the inference process into four key classes, namely the model, the problem, the cost, and the optimiser/sampler, as shown in \autoref{fig:classes}. Each of these objects represents a base class with child classes constructing specialised functionality for inference or optimisation workflows. The model class constructs a `PyBaMM` forward model for a given set of model equations provided by `PyBaMM`, initial conditions, spatial discretisation, and numerical solver. By composing `PyBaMM` directly into `PyBOP`, specialised models can be constructed alongside the standard models, which can be modified, and optimally constructed for the inference tasks. One such example is spatial rediscretisation, which is performed when geometric parameters are optimised. In this situation, `PyBOP` minimally rediscretises the `PyBaMM` model while maintaining the problem, cost, and optimiser objects, providing improved performance benefits to users. Alongside construction of the forward model, `PyBOP`'s model class provides methods for obtaining sensitivities from the prediction, enabling gradient-based optimisation algorithms. This prediction, along with it's corresponding sensitivities, is provided to the problem class for processing and exception control. A standardised data structure is then provided to the cost classes, providing a distance, design, or likelihood-based metric for optimisation. For deterministic optimisation, the optimisers minimise the corresponding cost function or the negative log-likelihood if a likelihood class is provided. Bayesian inference is provided by Monte Carlo sampling classes, which accept the LogPosterior class and sample from it using Pints' based Monte Carlo algorithms at the time of submission. In the typical workflow, the classes in \autoref{fig:classes} are constructed in sequence.
+`PyBOP` formulates the inference process into four key classes, namely the model, the problem, the cost, and the optimiser/sampler, as shown in \autoref{fig:classes}. Each of these objects represents a base class, with child classes constructing specialised functionality for inference or optimisation workflows. The model class constructs a `PyBaMM` forward model for a given set of model equations provided by `PyBaMM`, initial conditions, spatial discretisation, and numerical solver. By composing `PyBaMM` directly into `PyBOP`, specialised models can be constructed alongside the standard models, which can be modified and optimally constructed for the inference tasks. One such example is spatial rediscretisation, which is performed when geometric parameters are optimised. In this situation, `PyBOP` minimally rediscretises the `PyBaMM` model while maintaining the problem, cost, and optimiser objects, providing performance benefits to users. Alongside construction of the forward model, `PyBOP`'s model class provides methods for obtaining sensitivities from the prediction, enabling gradient-based optimisation algorithms. This prediction, along with its corresponding sensitivities, is provided to the problem class for processing and exception control. A standardised data structure is then provided to the cost classes, providing a distance, design, or likelihood-based metric for optimisation. For deterministic optimisation, the optimisers minimise the corresponding cost function or the negative log-likelihood if a likelihood class is provided. Bayesian inference is provided by Monte Carlo sampling classes, which accept the LogPosterior class and sample from it using `PINTS`-based Monte Carlo algorithms at the time of submission. In the typical workflow, the classes in \autoref{fig:classes} are constructed in sequence.
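To make the sequential construction concrete, the sketch below assembles a minimal fitting workflow in the order described above (model, problem, cost, optimiser). It is a sketch rather than canonical usage: class and argument names follow the `PyBOP` documentation at the time of writing and may differ between versions, and the dataset is a synthetic placeholder rather than real measurements.

```python
import numpy as np
import pybop

# 1. Model: a PyBaMM-backed single particle model
model = pybop.lithium_ion.SPM()

# 2. Problem: couples the model, the parameters to fit, and the data
parameters = pybop.Parameters(
    pybop.Parameter(
        "Negative electrode active material volume fraction",
        prior=pybop.Gaussian(0.6, 0.05),
        bounds=[0.5, 0.8],
    ),
)
t = np.linspace(0, 900, 200)
dataset = pybop.Dataset(
    {
        "Time [s]": t,
        "Current function [A]": np.full_like(t, 0.68),
        "Voltage [V]": np.full_like(t, 3.8),  # placeholder measurement
    }
)
problem = pybop.FittingProblem(model, parameters, dataset)

# 3. Cost: a distance-based metric over the problem
cost = pybop.SumSquaredError(problem)

# 4. Optimiser: any compatible algorithm from the tables below
optim = pybop.CMAES(cost, max_iterations=100)
result = optim.run()  # identified parameter values (return type varies by version)
```

A design-optimisation workflow would follow the same sequence, substituting a design problem and an energy-density cost from \autoref{tab:subclasses}.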
-In addition to the core architecture, `PyBOP` provides several specialised inference and optimisation processes. One such instance is numerical electrochemical impedance spectroscopy predictions by discretising the forward model into sparse mass matrix form with accompanying auto-differentiated Jacobian. These objects are then translated into the frequency domain with a linear solution used to compute the battery model impedance. In this situation, the forward models are constructed within the spatial rediscretisation workflow, allowing for geometric parameter inference from EIS forward model predictions. Furthermore, `PyBOP` builds on the JAX [@jax:2018] numerical solvers provided by `PyBaMM` by providing JAX-based cost functions for automatic forward model differentiation with respect to the parameters. This functionality provides a performance improvement alongside an interface to JAX-based inference packages, such as Numpyro [@numpyro:2019], BlackJAX [@blackjax:2024], and Optax [@optax:2020].
+In addition to the core architecture, `PyBOP` provides several specialised inference and optimisation processes. One such instance is numerical electrochemical impedance spectroscopy (EIS) prediction, achieved by discretising the forward model into sparse mass matrix form with an accompanying auto-differentiated Jacobian. These objects are then translated into the frequency domain, where a linear solve computes the battery model impedance. In this situation, the forward models are constructed within the spatial rediscretisation workflow, allowing for geometric parameter inference from EIS forward model predictions. Furthermore, `PyBOP` builds on the `JAX` [@jax:2018] numerical solvers provided by `PyBaMM` by providing `JAX`-based cost functions for automatic forward model differentiation with respect to the parameters. This functionality provides a performance improvement alongside an interface to `JAX`-based inference packages, such as `NumPyro` [@numpyro:2019], `BlackJAX` [@blackjax:2024], and `Optax` [@optax:2020].
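The frequency-domain step can be sketched generically. For a linearised system $M\dot{x} = Jx + Bi$, $v = Cx$, each angular frequency $\omega$ requires one linear solve, $Z(\omega) = C(\mathrm{j}\omega M - J)^{-1}B$. The matrices below are small dense stand-ins for the sparse mass matrix and auto-differentiated Jacobian that `PyBOP` extracts from the discretised `PyBaMM` model; this illustrates the technique and is not `PyBOP`'s internal code.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Stand-ins for the discretised model: singular mass matrix (DAE) and Jacobian
M = np.eye(n)
M[-1, -1] = 0.0                          # algebraic constraint row
J = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
B = rng.standard_normal((n, 1))          # current input enters here
C = rng.standard_normal((1, n))          # voltage read-out

def impedance(omega: float) -> complex:
    """One linear solve per frequency: Z(w) = C (j*w*M - J)^-1 B."""
    A = 1j * omega * M - J
    return (C @ np.linalg.solve(A, B)).item()

frequencies = np.logspace(-2, 3, 50)     # Hz
Z = np.array([impedance(2 * np.pi * f) for f in frequencies])
```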
![The core `PyBOP` architecture, showcasing the base class interfaces. Each class provides a direct mapping to a classical step in the optimisation workflow. \label{fig:classes}](figures/PyBOP_components.drawio.png){ width=80% }

-The currently implemented subclasses for the model, problem, and cost classes are listed in \autoref{tab:subclasses}. The cost functions are grouped by problem type, while the model and optimiser classes can be selected in combination with any problem-cost pair.
+The currently implemented subclasses for the model, problem, and cost classes are listed in \autoref{tab:subclasses}. The model and optimiser classes can be selected in combination with any problem-cost pair.

: List of available model, problem, and cost (or likelihood) classes. \label{tab:subclasses}

| Battery Models                       | Problem Types    | Cost / Likelihood Functions  |
|:------------------------------------|:-----------------|:-----------------------------|
| Single particle model (SPM)          | Fitting problem  | Sum squared error            |
-| SPM with electrolyte (SPMe)          |                  | Root mean squared error      |
-| Doyle-Fuller-Newman (DFN)            |                  | Minkowski                    |
+| SPM with electrolyte (SPMe)          | Design problem   | Root mean squared error      |
+| Doyle-Fuller-Newman (DFN)            | Observer         | Minkowski                    |
| Many particle model (MPM)            |                  | Sum of power                 |
| Multi-species multi-reaction (MSMR)  |                  | Gaussian log likelihood      |
| Weppner Huggins                      |                  | Maximum a posteriori         |
-| Equivalent circuit model (ECM)       | Observer         | Unscented Kalman filter      |
-|                                      | Design problem   | Gravimetric energy density   |
-|                                      |                  | Volumetric energy density    |
+| Equivalent circuit model (ECM)       |                  | Volumetric energy density    |
+|                                      |                  | Gravimetric energy density   |
+|                                      |                  | Unscented Kalman filter      |

-Similarly, the current algorithms available for optimisation tasks are presented in \autoref{tab:optimisers}. From now on, the point-based parameterisation and design optimisation tasks will simply be referred to as optimisation tasks. This simplification can be justified by examining \autoref{eqn:parameterisation} and \autoref{eqn:design} and confirming that deterministic parameterisation can be viewed as an optimisation task to minimise a distance-based cost function.
+Similarly, the current algorithms available for optimisation tasks are presented in \autoref{tab:optimisers}. Note that `SciPy` minimize provides both gradient-based and gradient-free methods. From now on, the point-based parameterisation and design optimisation tasks will simply be referred to as optimisation tasks. This simplification can be justified by examining \autoref{eqn:parameterisation} and \autoref{eqn:design} and confirming that deterministic parameterisation can be viewed as an optimisation task that minimises a distance-based cost function.
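This equivalence is easy to see in a toy example: fitting the two parameters of a hypothetical exponential-decay model by minimising a sum-squared-error distance with `SciPy` minimize, one of the interfaced optimisers listed in \autoref{tab:optimisers}.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic "measurement" from a toy model y = a * exp(-b * t), plus noise
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 100)
a_true, b_true = 1.0, 0.3
y_obs = a_true * np.exp(-b_true * t) + 0.01 * rng.standard_normal(t.size)

def sum_squared_error(theta):
    """Distance-based cost between model prediction and data."""
    a, b = theta
    return np.sum((a * np.exp(-b * t) - y_obs) ** 2)

# Deterministic parameterisation posed as an optimisation task
res = minimize(sum_squared_error, x0=[0.5, 0.5], method="Nelder-Mead")
print(res.x)  # estimates close to (1.0, 0.3)
```

The cost definition is independent of the optimiser choice, mirroring the problem-cost-optimiser separation described in the architecture above.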
-: The currently supported optimisation algorithms classified by candidate solution type, including gradient information. (*) Scipy minimize has gradient and non-gradient methods. \label{tab:optimisers}
+: The currently supported optimisation algorithms classified by candidate solution type, including gradient information. \label{tab:optimisers}

| Gradient-based                                     | Evolutionary                   | (Meta)heuristic         |
-|:--------------------------------------|:-------------------------------|:------------------------|
+|:--------------------------------------------------|:-------------------------------|:------------------------|
| Weight decayed adaptive moment estimation (AdamW)  | Covariance matrix adaptation (CMA-ES) | Particle swarm (PSO) |
| Improved resilient backpropagation (iRProp-)       | Exponential natural (xNES)     | Nelder-Mead             |
| Gradient descent                                   | Separable natural (sNES)       | Cuckoo search           |
-| SciPy minimize (*)                                 | SciPy differential evolution   |                         |
+| SciPy minimize                                     | SciPy differential evolution   |                         |
|                                                    |                                |                         |

As discussed above, `PyBOP` provides Bayesian inference methods such as maximum a posteriori (MAP) alongside the point-based methods in \autoref{tab:subclasses}; however, for a full Bayesian framework, Monte Carlo sampling is implemented within `PyBOP`. These methods construct a posterior distribution on the inference parameters, which can be used to quantify uncertainty and assess practical identifiability. The individual sampler classes are currently composed within `PyBOP` from the `PINTS` library, with a base sampling class implemented for interoperability and direct integration with the `PyBOP` model, problem, and likelihood classes. The currently supported samplers are listed in \autoref{tab:samplers}.

-: Sampling methods supported by PyBOP, classified according to the proposed method. \label{tab:samplers}
+: Sampling methods supported by `PyBOP`, classified according to the proposal method. \label{tab:samplers}

-| Gradient-based      | Adaptive                 | Slicing        | Evolutionary           | Other                        |
+| Gradient-based      | Adaptive                 | Slicing        | Evolutionary           | Other                        |
|:--------------------|:------------------------|:---------------|:-----------------------|:-----------------------------|
-| Monomial Gamma      | Delayed Rejection Adaptive | Rank Shrinking | Differential Evolution | Metropolis Random Walk       |
-| No-U-Turn           | Haario Bardenet          | Doubling       |                        | Emcee Hammer                 |
+| Monomial Gamma      | Delayed Rejection Adaptive | Rank Shrinking | Differential Evolution | Metropolis Random Walk       |
+| No-U-Turn           | Haario Bardenet          | Doubling       |                        | Emcee Hammer                 |
| Hamiltonian         | Haario                   | Stepout        |                        | Metropolis Adjusted Langevin |
| Relativistic        | Rao Blackwell            |                |                        |                              |

@@ -165,7 +165,7 @@ Next, the performance of the various optimisation algorithms is presented by cat

![Cost landscape contour plot with corresponding optimisation traces. The top row represents the gradient-based optimisers, the middle row is the evolution-based, and the bottom row is the (meta)heuristics. The order from left to right corresponds to the entries in \autoref{tab:optimisers}. \label{fig:optimiser-inference}](figures/joss/contour_total.png){ width=100% }

-This parameterisation task can also be approached from a Bayesian perspective, which we will present below using PyBOP's sampler methods. The optimisation equation presented in equation \autoref{eqn:parameterisation} does not represent the Bayesian parameter identification task, and as such we introduce the Bayes theorem as,
+This parameterisation task can also be approached from a Bayesian perspective, which we will present below using `PyBOP`'s sampler methods. The optimisation formulation in \autoref{eqn:parameterisation} does not represent the Bayesian parameter identification task, and as such we introduce Bayes' theorem as,
\begin{equation}
P(\theta|D) = \frac{P(D|\theta)P(\theta)}{P(D)}
@@ -173,7 +173,7 @@ P(\theta|D) = \frac{P(D|\theta)P(\theta)}{P(D)}
\end{equation}
where $P(\theta|D)$ is the posterior and represents the probability density function of the parameters. $P(D|\theta)$ is the likelihood function and assesses the parameter values alongside a noise model. $P(\theta)$ encapsulates the prior knowledge about the parameters, and finally $P(D)$ is the model evidence and acts as a normalising constant so that the final posterior is a correctly scaled density function.

-Our goal in parameter inference is to identify the parameter values with the highest probability, which can be represented as a point-based metric or as the posterior distribution, which provides additional information about the uncertainty of the identified parameters. Monte Carlo sampling methods are available to obtain this posterior distribution. These methods sample from the posterior using a variety of methods, including gradient-based methods such as No-U-Turn [@NUTS:2011] and Hamiltonian [@Hamiltonian:2011], as well as heuristic methods such as Differential Evolution [@DiffEvolution:2006], and finally conventional methods based on random sampling with rejection criteria [@metropolis:1953]. PyBOP offers a sampling class that provides an interface to these samplers, which are supported by the Probabilistic Inference of Noise Time-Series (PINTS) package. \autoref{fig:posteriors} below shows the sampled posterior for the synthetic workflow described above, using an adaptive covariance-based sampler, Haario Bardenet [@Haario:2001].
+Our goal in parameter inference is to identify the parameter values with the highest probability, represented either as a point estimate or as the full posterior distribution, which provides additional information about the uncertainty of the identified parameters. Monte Carlo sampling methods are available to obtain this posterior distribution. These methods sample from the posterior using a variety of approaches, including gradient-based methods such as No-U-Turn [@NUTS:2011] and Hamiltonian [@Hamiltonian:2011], as well as heuristic methods such as Differential Evolution [@DiffEvolution:2006], and finally conventional methods based on random sampling with rejection criteria [@metropolis:1953]. `PyBOP` offers a sampling class that provides an interface to these samplers, which are supported by the Probabilistic Inference on Noisy Time Series (`PINTS`) package. \autoref{fig:posteriors} below shows the sampled posterior for the synthetic workflow described above, using an adaptive covariance-based sampler, Haario Bardenet [@Haario:2001].
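The "random sampling with rejection criteria" family is compact enough to sketch in full. The generic random-walk Metropolis loop below (not `PyBOP`'s sampler interface; the toy model, flat prior, and noise level are assumptions for illustration) produces samples whose density approximates the posterior $P(\theta|D)$.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 100)
y_obs = 1.0 * np.exp(-0.3 * t) + 0.01 * rng.standard_normal(t.size)
sigma = 0.01  # assumed known observation noise

def log_posterior(theta):
    """Flat prior on [0, 2]^2 plus a Gaussian log-likelihood P(D|theta)."""
    if np.any(theta < 0.0) or np.any(theta > 2.0):
        return -np.inf
    residual = y_obs - theta[0] * np.exp(-theta[1] * t)
    return -0.5 * np.sum((residual / sigma) ** 2)

# Random-walk Metropolis [@metropolis:1953]: propose, then accept or reject
theta = np.array([0.5, 0.5])
chain = []
for _ in range(5000):
    proposal = theta + 0.01 * rng.standard_normal(2)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    chain.append(theta)
chain = np.array(chain)  # samples approximating P(theta|D)
```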
![Posterior distributions for model parameters alongside identified noise on the observations. Shaded areas denote confidence bounds for each parameter. \label{fig:posteriors}](figures/joss/posteriors.png){ width=100% }

diff --git a/joss/paper.pdf b/joss/paper.pdf
index ea0d6cba..fc7da937 100644
Binary files a/joss/paper.pdf and b/joss/paper.pdf differ
diff --git a/joss/paper.preprint.pdf b/joss/paper.preprint.pdf
index de5debfd..33961b6e 100644
Binary files a/joss/paper.preprint.pdf and b/joss/paper.preprint.pdf differ
diff --git a/joss/paper.preprint.tex b/joss/paper.preprint.tex
index b9551277..63f42fa9 100644
--- a/joss/paper.preprint.tex
+++ b/joss/paper.preprint.tex
@@ -236,8 +236,8 @@ \section{Statement of need}\label{statement-of-need}

These methods are provided through interfaces to \texttt{PINTS}
(\citeproc{ref-Clerx:2019}{Clerx et al., 2019}) and \texttt{SciPy}
(\citeproc{ref-SciPy:2020}{Virtanen et al., 2020}), in addition to
-PyBOP's own algorithms such as Adaptive Moment Estimation with Weight
-Decay (AdamW), Gradient descent, and Cuckoo search.
+\texttt{PyBOP}'s own algorithms such as Adaptive Moment Estimation with
+Weight Decay (AdamW), Gradient descent, and Cuckoo search.

\texttt{PyBOP} supports the Battery Parameter eXchange (BPX) standard
(\citeproc{ref-BPX:2023}{Korotkin et al., 2023}) for sharing battery
@@ -260,14 +260,14 @@ \section{Architecture}\label{architecture}
information for the optimisation and sampling algorithms. The forward
model is solved using the popular battery modelling package,
\texttt{PyBaMM}, with construction, parameterisation, and discretisation
-managed by PyBOP's model interface to PyBaMM. This approach provides a
-robust object construction process with a consistent interface between
-the models and optimisers. The statistical methods and optimisation
-algorithms are then constructed to interface cleanly with the forward
-model predictions. Furthermore, identifiability metrics are provided
-alongside the estimated parameters through Hessian approximation of the
-cost functions in the frequentist workflows and posterior moments in the
-Bayesian workflows.
+managed by \texttt{PyBOP}'s model interface to \texttt{PyBaMM}. This
+approach provides a robust object construction process with a consistent
+interface between the models and optimisers. The statistical methods and
+optimisation algorithms are then constructed to interface cleanly with
+the forward model predictions. Furthermore, identifiability metrics are
+provided alongside the estimated parameters through Hessian
+approximation of the cost functions in the frequentist workflows and
+posterior moments in the Bayesian workflows.

\begin{figure}
\centering
@@ -292,7 +292,7 @@ \section{Architecture}\label{architecture}
\texttt{PyBOP} minimally rediscretises the \texttt{PyBaMM} model while
maintaining the problem, cost, and optimiser objects, providing
performance benefits to users. Alongside construction of the forward
-model, \texttt{PyBOP}`s model class provides methods for obtaining
+model, \texttt{PyBOP}'s model class provides methods for obtaining
sensitivities from the prediction, enabling gradient-based optimisation
algorithms. This prediction, along with its corresponding
sensitivities, is provided to the problem class for processing and
@@ -302,8 +302,8 @@ \section{Architecture}\label{architecture}
minimise the corresponding cost function or the negative log-likelihood
if a likelihood class is provided. Bayesian inference is provided by
Monte Carlo sampling classes, which accept the LogPosterior class and
-sample from it using Pints' based Monte Carlo algorithms at the time of
-submission. In the typical workflow, the classes in
+sample from it using \texttt{PINTS}-based Monte Carlo algorithms at the
+time of submission. In the typical workflow, the classes in
\autoref{fig:classes} are constructed in sequence.
In addition to the core architecture, \texttt{PyBOP} provides several
specialised inference and optimisation processes. One such instance is
numerical electrochemical impedance spectroscopy (EIS) prediction,
achieved by discretising the forward model into sparse mass matrix form
with an accompanying auto-differentiated Jacobian. These objects are
then translated into the frequency domain, where a linear solve computes
the battery model impedance. In this situation, the forward models are
constructed within the spatial rediscretisation workflow, allowing for
geometric parameter inference from EIS forward model
-predictions. Furthermore, \texttt{PyBOP} builds on the JAX
+predictions. Furthermore, \texttt{PyBOP} builds on the \texttt{JAX}
(\citeproc{ref-jax:2018}{Bradbury et al., 2018}) numerical solvers
-provided by \texttt{PyBaMM} by providing JAX-based cost functions for
-automatic forward model differentiation with respect to the parameters.
-This functionality provides a performance improvement alongside an
-interface to JAX-based inference packages, such as Numpyro
-(\citeproc{ref-numpyro:2019}{Phan et al., 2019}), BlackJAX
-(\citeproc{ref-blackjax:2024}{Cabezas et al., 2024}), and Optax
-(\citeproc{ref-optax:2020}{DeepMind et al., 2020}).
+provided by \texttt{PyBaMM} by providing \texttt{JAX}-based cost
+functions for automatic forward model differentiation with respect to
+the parameters. This functionality provides a performance improvement
+alongside an interface to \texttt{JAX}-based inference packages, such as
+\texttt{NumPyro} (\citeproc{ref-numpyro:2019}{Phan et al., 2019}),
+\texttt{BlackJAX} (\citeproc{ref-blackjax:2024}{Cabezas et al., 2024}),
+and \texttt{Optax} (\citeproc{ref-optax:2020}{DeepMind et al., 2020}).

\begin{figure}
\centering
@@ -334,9 +334,8 @@ \section{Architecture}\label{architecture}
\end{figure}

The currently implemented subclasses for the model, problem, and cost
-classes are listed in \autoref{tab:subclasses}. The cost functions are
-grouped by problem type, while the model and optimiser classes can be
-selected in combination with any problem-cost pair.
+classes are listed in \autoref{tab:subclasses}. The model and optimiser
+classes can be selected in combination with any problem-cost pair.

\begin{longtable}[]{@{}
  >{\raggedright\arraybackslash}p{(\columnwidth - 4\tabcolsep) * \real{0.4458}}
@@ -367,31 +366,33 @@ \section{Architecture}\label{architecture}
\bottomrule\noalign{}
\endlastfoot
Single particle model (SPM) & Fitting problem & Sum squared error \\
-SPM with electrolyte (SPMe) & & Root mean squared error \\
-Doyle-Fuller-Newman (DFN) & & Minkowski \\
+SPM with electrolyte (SPMe) & Design problem & Root mean squared
+error \\
+Doyle-Fuller-Newman (DFN) & Observer & Minkowski \\
Many particle model (MPM) & & Sum of power \\
Multi-species multi-reaction (MSMR) & & Gaussian log likelihood \\
Weppner Huggins & & Maximum a posteriori \\
-Equivalent circuit model (ECM) & Observer & Unscented Kalman filter \\
-& Design problem & Gravimetric energy density \\
-& & Volumetric energy density \\
+Equivalent circuit model (ECM) & & Volumetric energy density \\
+& & Gravimetric energy density \\
+& & Unscented Kalman filter \\
\end{longtable}
Similarly, the current algorithms available for optimisation tasks are
-presented in \autoref{tab:optimisers}. From now on, the point-based
-parameterisation and design optimisation tasks will simply be referred
-to as optimisation tasks. This simplification can be justified by
-examining \autoref{eqn:parameterisation} and \autoref{eqn:design} and
-confirming that deterministic parameterisation can be viewed as an
-optimisation task to minimise a distance-based cost function.
+presented in \autoref{tab:optimisers}. Note that \texttt{SciPy} minimize
+provides both gradient-based and gradient-free methods. From now on, the
+point-based parameterisation and design optimisation tasks will simply
+be referred to as optimisation tasks. This simplification can be
+justified by examining \autoref{eqn:parameterisation} and
+\autoref{eqn:design} and confirming that deterministic parameterisation
+can be viewed as an optimisation task that minimises a distance-based
+cost function.

\begin{longtable}[]{@{}
-  >{\raggedright\arraybackslash}p{(\columnwidth - 4\tabcolsep) * \real{0.4062}}
-  >{\raggedright\arraybackslash}p{(\columnwidth - 4\tabcolsep) * \real{0.3333}}
-  >{\raggedright\arraybackslash}p{(\columnwidth - 4\tabcolsep) * \real{0.2604}}@{}}
+  >{\raggedright\arraybackslash}p{(\columnwidth - 4\tabcolsep) * \real{0.4722}}
+  >{\raggedright\arraybackslash}p{(\columnwidth - 4\tabcolsep) * \real{0.2963}}
+  >{\raggedright\arraybackslash}p{(\columnwidth - 4\tabcolsep) * \real{0.2315}}@{}}
\caption{The currently supported optimisation algorithms classified by
-candidate solution type, including gradient information. (*) Scipy
-minimize has gradient and non-gradient methods.
+candidate solution type, including gradient information.
\label{tab:optimisers}}\tabularnewline
\toprule\noalign{}
\begin{minipage}[b]{\linewidth}\raggedright
Gradient-based
@@ -420,7 +421,7 @@ \section{Architecture}\label{architecture}
Improved resilient backpropagation (iRProp-) & Exponential natural
(xNES) & Nelder-Mead \\
Gradient descent & Separable natural (sNES) & Cuckoo search \\
-SciPy minimize (*) & SciPy differential evolution & \\
+SciPy minimize & SciPy differential evolution & \\
& & \\
\end{longtable}

@@ -442,8 +443,8 @@ \section{Architecture}\label{architecture}
 >{\raggedright\arraybackslash}p{(\columnwidth - 8\tabcolsep) * \real{0.1379}}
 >{\raggedright\arraybackslash}p{(\columnwidth - 8\tabcolsep) * \real{0.2069}}
 >{\raggedright\arraybackslash}p{(\columnwidth - 8\tabcolsep) * \real{0.2586}}@{}}
-\caption{Sampling methods supported by PyBOP, classified according to
-the proposed method. \label{tab:samplers}}\tabularnewline
+\caption{Sampling methods supported by \texttt{PyBOP}, classified
+according to the proposal method. \label{tab:samplers}}\tabularnewline
\toprule\noalign{}
\begin{minipage}[b]{\linewidth}\raggedright
Gradient-based
@@ -474,7 +475,7 @@ \section{Architecture}\label{architecture}
\endhead
\bottomrule\noalign{}
\endlastfoot
-Monomial Gamma & Delayed Rejection Adaptive & Rank Shrinking &
+Monomial Gamma & Delayed Rejection Adaptive & Rank Shrinking &
Differential Evolution & Metropolis Random Walk \\
No-U-Turn & Haario Bardenet & Doubling & & Emcee Hammer \\
Hamiltonian & Haario & Stepout & & Metropolis Adjusted Langevin \\
Relativistic & Rao Blackwell & & & \\
@@ -642,8 +643,8 @@ \subsection{Parameterisation}\label{parameterisation}
\end{figure}

This parameterisation task can also be approached from a Bayesian
-perspective, which we will present below using PyBOP's sampler methods.
-The optimisation equation presented in equation
+perspective, which we will present below using \texttt{PyBOP}'s sampler
+methods. The optimisation formulation in
\autoref{eqn:parameterisation} does not represent the Bayesian
parameter identification task, and as such we introduce Bayes' theorem
as,

@@ -670,12 +671,12 @@ \subsection{Parameterisation}\label{parameterisation}
heuristic methods such as Differential Evolution
(\citeproc{ref-DiffEvolution:2006}{Braak, 2006}), and finally
conventional methods based on random sampling with rejection criteria
-(\citeproc{ref-metropolis:1953}{Metropolis et al., 1953}). PyBOP offers
-a sampling class that provides an interface to these samplers, which are
-supported by the Probabilistic Inference of Noise Time-Series (PINTS)
-package. \autoref{fig:posteriors} below shows the sampled posterior for
-the synthetic workflow described above, using an adaptive
-covariance-based sampler, Haario Bardenet
+(\citeproc{ref-metropolis:1953}{Metropolis et al., 1953}).
+\texttt{PyBOP} offers a sampling class that provides an interface to
+these samplers, which are supported by the Probabilistic Inference on
+Noisy Time Series (\texttt{PINTS}) package. \autoref{fig:posteriors}
+below shows the sampled posterior for the synthetic workflow described
+above, using an adaptive covariance-based sampler, Haario Bardenet
(\citeproc{ref-Haario:2001}{Haario et al., 2001}).

\begin{figure}