Observing progress

Argmin offers an interface to observe the state of the solver at initialization as well as after every iteration. This includes the parameter vector, gradient, Jacobian, Hessian, iteration number, cost function values, and other general as well as solver-specific metrics. This interface can be used to implement loggers, to send the information to a storage backend, or to plot metrics.

The observer ParamWriter saves the parameter vector to disk and therefore requires the parameter vector to be serializable. This observer is available in the argmin-observer-paramwriter crate.
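The following sketch shows how such an observer could be constructed and attached to an Executor (see the full example further below). The constructor arguments (target directory, file name prefix, serialization format) and the ParamWriterFormat enum are assumptions; please consult the argmin-observer-paramwriter documentation for the exact API.

use argmin_observer_paramwriter::{ParamWriter, ParamWriterFormat};

// Save the parameter vector to the `params` directory, in files prefixed
// with `param`, serialized as JSON (constructor signature assumed).
let pw_observer = ParamWriter::new("params", "param", ParamWriterFormat::JSON);

// Attached to an `Executor` like any other observer:
// executor.add_observer(pw_observer, ObserverMode::Always)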

The observer SlogLogger logs the progress of the optimization to screen or to disk. This can be found in the argmin-observer-slog crate. Writing to disk requires the serde1 feature to be enabled in argmin-observer-slog.
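A small sketch of constructing both variants follows. SlogLogger::term is used in the full example below, while the signature of SlogLogger::file (a path plus a flag controlling whether an existing file is truncated, returning a Result) is an assumption to be checked against the crate documentation.

use argmin_observer_slog::SlogLogger;

// Log to the terminal.
let term_logger = SlogLogger::term();

// Log to the file `solver.log` (signature assumed; requires the `serde1`
// feature of argmin-observer-slog).
let file_logger = SlogLogger::file("solver.log", false)?;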

The rate at which the progress of the solver is observed can be set via ObserverMode, which can be either Always, Never, NewBest (whenever a new best solution is found), or Every(i), meaning every ith iteration.
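For illustration, the four modes can be constructed as follows (the bindings simply shadow each other):

use argmin::core::observers::ObserverMode;

// Observe at every iteration
let mode = ObserverMode::Always;
// Never observe
let mode = ObserverMode::Never;
// Observe whenever a new best solution is found
let mode = ObserverMode::NewBest;
// Observe every 10th iteration
let mode = ObserverMode::Every(10);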

Custom observers can be used as well by implementing the Observe trait (see the chapter on implementing an observer for details).
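As a rough sketch, a custom observer which prints the iteration number after every iteration could look as follows. The observe_iter signature shown here is an assumption based on argmin's core API; the chapter on implementing an observer contains the authoritative trait definition.

use argmin::core::observers::Observe;
use argmin::core::{Error, State, KV};

// Minimal custom observer (sketch).
struct IterationPrinter {}

impl<I: State> Observe<I> for IterationPrinter {
    // Called after every iteration of the solver (signature assumed).
    fn observe_iter(&mut self, state: &I, _kv: &KV) -> Result<(), Error> {
        println!("iteration {}", state.get_iter());
        Ok(())
    }
}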

The following example shows how to add an observer to an Executor. The observer logs the progress to the terminal, and the mode ObserverMode::Always ensures that every iteration is printed to the screen. Multiple observers can be added to a single Executor, as shown after the example.

#![allow(unused_imports)]
extern crate argmin;
extern crate argmin_testfunctions;
use argmin::core::{Error, Executor, CostFunction, Gradient};
use argmin::core::observers::ObserverMode;
use argmin_observer_slog::SlogLogger;
use argmin::solver::gradientdescent::SteepestDescent;
use argmin::solver::linesearch::MoreThuenteLineSearch;
use argmin_testfunctions::{rosenbrock, rosenbrock_derivative};

struct Rosenbrock {}

/// Implement `CostFunction` for `Rosenbrock`
impl CostFunction for Rosenbrock {
    /// Type of the parameter vector
    type Param = Vec<f64>;
    /// Type of the return value computed by the cost function
    type Output = f64;

    /// Apply the cost function to a parameter `p`
    fn cost(&self, p: &Self::Param) -> Result<Self::Output, Error> {
        Ok(rosenbrock(p))
    }
}

/// Implement `Gradient` for `Rosenbrock`
impl Gradient for Rosenbrock {
    /// Type of the parameter vector
    type Param = Vec<f64>;
    /// Type of the gradient
    type Gradient = Vec<f64>;

    /// Compute the gradient at parameter `p`.
    fn gradient(&self, p: &Self::Param) -> Result<Self::Gradient, Error> {
        Ok(rosenbrock_derivative(p))
    }
}

fn run() -> Result<(), Error> {
    // Define cost function (must implement `CostFunction` and `Gradient`)
    let cost = Rosenbrock {};

    // Define initial parameter vector
    let init_param: Vec<f64> = vec![-1.2, 1.0];

    // Set up line search
    let linesearch = MoreThuenteLineSearch::new();

    // Set up solver
    let solver = SteepestDescent::new(linesearch);

    // [...]

    let res = Executor::new(cost, solver)
        .configure(|state| state.param(init_param).max_iters(10))
        // Add an observer which will log all iterations to the terminal
        .add_observer(SlogLogger::term(), ObserverMode::Always)
        .run()?;

    println!("{}", res);
    Ok(())
}

fn main() {
    if let Err(ref e) = run() {
        println!("{}", e);
        std::process::exit(1);
    }
}
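Since multiple observers can be added, add_observer can simply be called several times. The following variant of the Executor setup from the example logs every iteration to the terminal and, additionally, new best solutions to a file (the SlogLogger::file signature is assumed, as noted above):

let res = Executor::new(cost, solver)
    .configure(|state| state.param(init_param).max_iters(10))
    // Log every iteration to the terminal ...
    .add_observer(SlogLogger::term(), ObserverMode::Always)
    // ... and additionally log new best solutions to a file
    // (signature of `SlogLogger::file` assumed).
    .add_observer(SlogLogger::file("solver.log", false)?, ObserverMode::NewBest)
    .run()?;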