The fatiando package has been deprecated. Please check out the new tools in the Fatiando a Terra website: www.fatiando.org

Base classes for internal use (fatiando.inversion.base)

The base classes for inverse problem solving.

See fatiando.inversion for examples, regularization, and more.

This module defines base classes that are used by the rest of the inversion package:

  • MultiObjective: A “container” class that emulates a sum of different objective (goal) functions (like Misfit or some form of regularization). When two of those classes are added they generate a MultiObjective object.
  • OperatorMixin: A mix-in class that defines the operators + and * (by a scalar). Used to give these properties to Misfit and the regularizing functions. Adding results in a MultiObjective. Multiplying sets the regul_param of the class (like a scalar weight factor).
  • OptimizerMixin: A mix-in class that defines the fit and config methods for optimizing a Misfit or MultiObjective and fitting the model to the data.
  • CachedMethod: A class that wraps a method and caches the returned value. When the same argument (an array) is passed twice in a row, the class returns the cached value instead of recomputing.
  • CachedMethodPermanent: Like CachedMethod but always returns the cached value, regardless of the input. Effectively calculates only the first time the method is called. Useful for caching the Jacobian matrix in a linear problem.
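
The + and * semantics described above can be sketched in plain Python. `Goal` and `Multi` below are hypothetical stand-ins for illustration, not the fatiando classes:

```python
# Hypothetical sketch of the operator semantics (not the fatiando classes).
class Goal:
    def __init__(self, val, regul_param=1):
        self.val = val
        self.regul_param = regul_param  # scalar weight factor

    def value(self, p):
        return self.regul_param * self.val

    def __add__(self, other):
        # A + B -> a "multi-objective" container
        return Multi(self, other)

    def __rmul__(self, scalar):
        # scalar*A -> a weighted copy of A
        return Goal(self.val, regul_param=scalar)


class Multi:
    def __init__(self, *goals):
        self.goals = goals

    def value(self, p):
        # the container sums its component goal functions
        return sum(g.value(p) for g in self.goals)


total = Goal(1) + 2 * Goal(3)
total.value(None)  # 1 + 2*3 = 7
```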

class fatiando.inversion.base.CachedMethod(instance, meth)[source]

Bases: future.types.newobject.newobject

Wrap a method to cache its output based on the hash of the input array.

Store the output of calling the method on a numpy array. If the method is called in succession with the same input array, the cached result will be returned. If the method is called on a different array, the old result will be discarded and the new one stored.

Uses SHA1 hashes of the input array to tell if it is the same array.
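
The mechanism can be sketched like this (an assumption about the implementation; the real code may hash the array differently):

```python
import hashlib
import numpy as np


def array_hash(array):
    # SHA1 digest of the raw bytes of the array
    return hashlib.sha1(array.tobytes()).hexdigest()


a = np.arange(5)
b = np.arange(5)
c = np.arange(6)
array_hash(a) == array_hash(b)  # True: same contents, so the cached result is reused
array_hash(a) == array_hash(c)  # False: different array, so recompute
```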

Note

We need the object instance and method name instead of the bound method (like obj.method) because we can’t pickle bound methods. We need to be able to pickle so that the solvers can be passed between processes in parallelization.
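
A minimal sketch of this pattern (`Wrapper` and `Model` are hypothetical names, not the fatiando implementation): the wrapper stores the instance and the method name, looks the method up on the class at call time, and the pair pickles cleanly:

```python
import pickle
import numpy as np


# Hypothetical sketch: store (instance, method name) instead of the bound method.
class Wrapper:
    def __init__(self, instance, meth):
        self.instance = instance
        self.meth = meth

    def __call__(self, *args):
        # look the real method up on the class at call time
        method = getattr(self.instance.__class__, self.meth)
        return method(self.instance, *args)


class Model:
    def predict(self, p):
        return 2 * p


wrapped = Wrapper(Model(), 'predict')
restored = pickle.loads(pickle.dumps(wrapped))
restored(np.arange(3))  # array([0, 2, 4])
```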

Parameters:

  • instance
    : object

    The instance of the object that has the method you want to cache.

  • meth
    : string

    The name of the method you want to cache.

Examples:

>>> import numpy as np
>>> class MyClass(object):
...     def __init__(self, cached=False):
...         if cached:
...             self.my_method = CachedMethod(self, 'my_method')
...     def my_method(self, p):
...         return p**2
>>> obj = MyClass(cached=False)
>>> a = obj.my_method(np.arange(0, 5))
>>> a
array([ 0,  1,  4,  9, 16])
>>> b = obj.my_method(np.arange(0, 5))
>>> a is b
False
>>> cached = MyClass(cached=True)
>>> a = cached.my_method(np.arange(0, 5))
>>> a
array([ 0,  1,  4,  9, 16])
>>> b = cached.my_method(np.arange(0, 5))
>>> a is b
True
>>> cached.my_method.hard_reset()
>>> b = cached.my_method(np.arange(0, 5))
>>> a is b
False
>>> c = cached.my_method(np.arange(0, 5))
>>> b is c
True
>>> cached.my_method(np.arange(0, 6))
array([ 0,  1,  4,  9, 16, 25])
hard_reset()[source]

Delete the cached values.

class fatiando.inversion.base.CachedMethodPermanent(instance, meth)[source]

Bases: future.types.newobject.newobject

Wrap a method to cache its output and return it whenever the method is called.

This is different from CachedMethod because it runs the method only once. All subsequent calls return this first result. This class should be used with methods that should always return the same output (like the Jacobian matrix of a linear problem).
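
The core idea can be sketched in a few lines (`RunOnce` is a hypothetical stand-in, not the real implementation):

```python
# Hypothetical sketch of the "compute once" behavior behind CachedMethodPermanent.
class RunOnce:
    def __init__(self, func):
        self.func = func
        self.result = None  # filled on the first call only

    def __call__(self, *args):
        if self.result is None:
            self.result = self.func(*args)
        # every later call returns the first result, whatever the input
        return self.result


square = RunOnce(lambda p: p**2)
square(3)   # 9
square(10)  # still 9: the input is ignored after the first call
```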

Note

We need the object instance and method name instead of the bound method (like obj.method) because we can’t pickle bound methods. We need to be able to pickle so that the solvers can be passed between processes in parallelization.

Parameters:

  • instance
    : object

    The instance of the object that has the method you want to cache.

  • meth
    : string

    The name of the method you want to cache.

Examples:

>>> import numpy as np
>>> class MyClass(object):
...     def __init__(self, cached=False):
...         if cached:
...             self.my_method = CachedMethodPermanent(self, 'my_method')
...     def my_method(self, p):
...         return p**2
>>> obj = MyClass(cached=False)
>>> a = obj.my_method(np.arange(0, 5))
>>> a
array([ 0,  1,  4,  9, 16])
>>> b = obj.my_method(np.arange(0, 5))
>>> a is b
False
>>> cached = MyClass(cached=True)
>>> a = cached.my_method(np.arange(0, 5))
>>> a
array([ 0,  1,  4,  9, 16])
>>> b = cached.my_method(np.arange(0, 5))
>>> a is b
True
>>> c = cached.my_method(np.arange(10, 15))
>>> c
array([ 0,  1,  4,  9, 16])
>>> a is c
True
hard_reset()[source]

Delete the cached values.

class fatiando.inversion.base.MultiObjective(*args)[source]

Bases: fatiando.inversion.base.OptimizerMixin, fatiando.inversion.base.OperatorMixin

An objective (goal) function with more than one component.

This class is a linear combination of other goal functions (like Misfit and regularization classes).

It is automatically created by adding two goal functions that have the OperatorMixin as a base class.

Alternatively, you can create a MultiObjective by passing the other goal function instances as arguments to the constructor.

The MultiObjective behaves like any other goal function object. It has fit and config methods and can be added and multiplied by a scalar with the same effects.

Indexing or iterating over a MultiObjective gives the component goal functions.

Examples:

To show how this class is generated and works, let’s create a simple class that subclasses OperatorMixin.

>>> class MyGoal(OperatorMixin):
...     def __init__(self, name, nparams, islinear):
...         self.name = name
...         self.islinear = islinear
...         self.nparams = nparams
...     def value(self, p):
...         return 1
...     def gradient(self, p):
...         return 2
...     def hessian(self, p):
...         return 3
>>> a = MyGoal('A', 10, True)
>>> b = MyGoal('B', 10, True)
>>> c = a + b
>>> type(c)
<class 'fatiando.inversion.base.MultiObjective'>
>>> c.size
2
>>> c.nparams
10
>>> c.islinear
True
>>> c[0].name
'A'
>>> c[1].name
'B'

Asking for the value, gradient, and Hessian of the MultiObjective gives the sum of both components.

>>> c.value(None)
2
>>> c.gradient(None)
4
>>> c.hessian(None)
6

Multiplying the MultiObjective by a scalar will set the regularization parameter for the sum.

>>> d = 10*c
>>> d.value(None)
20
>>> d.gradient(None)
40
>>> d.hessian(None)
60

All components must have the same number of parameters. For the moment, MultiObjective doesn’t handle multiple parameter vectors (one for each objective function).

>>> e = MyGoal("E", 20, True)
>>> a + e
Traceback (most recent call last):
  ...
AssertionError: Can't add goals with different number of parameters: 10, 20

The MultiObjective will automatically detect if the problem remains linear or not. For example, adding a non-linear problem to a linear one makes the sum non-linear.

>>> (a + b).islinear
True
>>> f = MyGoal('F', 10, False)
>>> (a + f).islinear
False
>>> (f + f).islinear
False
config(*args, **kwargs)[source]

Configure the optimization method and its parameters.

This sets the method used by fit and the keyword arguments that are passed to it.

Parameters:

  • method
    : string

    The optimization method. One of: 'linear', 'newton', 'levmarq', 'steepest', 'acor'

Other keyword arguments that can be passed are the ones allowed by each method.

Some methods have required arguments:

  • newton, levmarq and steepest require the initial argument (an initial estimate for the gradient descent)
  • acor requires the bounds argument (min/max values for the search space)

See the corresponding docstrings for more information.

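For illustration, the config/fit pattern can be sketched on a toy linear problem. `TinySolver` is a hypothetical stand-in, not a fatiando class:

```python
import numpy as np


# Hypothetical sketch of the config/fit pattern: config stores the method
# name and its keyword arguments; fit dispatches on them.
class TinySolver:
    def __init__(self, data, jacobian):
        self.data = data
        self.jacobian = jacobian
        self.p_ = None

    def config(self, method, **kwargs):
        self.fit_method = method
        self.fit_args = kwargs
        return self  # allow chaining: solver.config(...).fit()

    def fit(self):
        if self.fit_method == 'linear':
            # least-squares solution of jacobian @ p = data
            self.p_, *_ = np.linalg.lstsq(self.jacobian, self.data, rcond=None)
        elif self.fit_method == 'newton':
            # 'newton' requires an initial estimate, as noted above
            p = self.fit_args['initial']
            for _ in range(self.fit_args.get('maxit', 10)):
                residuals = self.jacobian @ p - self.data
                gradient = self.jacobian.T @ residuals
                hessian = self.jacobian.T @ self.jacobian
                p = p - np.linalg.solve(hessian, gradient)
            self.p_ = p
        return self


A = np.array([[1., 0.], [0., 2.], [1., 1.]])
d = A @ np.array([3., -1.])
est = TinySolver(d, A).config('newton', initial=np.zeros(2)).fit()
est.p_  # approximately [3., -1.]
```

Note how the hypothetical 'newton' branch requires the initial keyword, mirroring the requirement listed above.
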
copy(deep=False)

Make a copy of me.

estimate_

A nicely formatted version of the estimate.

If the class implements a fmt_estimate method, this will return its results. This can be used to convert the parameter vector to a more useful form, like a fatiando.mesher object.

fit()[source]

Solve for the parameter vector that minimizes this objective function.

Uses the optimization method and parameters defined using the config method.

The estimated parameter vector can be accessed through the p_ attribute. A (possibly) formatted version (converted to a more manageable type) of the estimate can be accessed through the property estimate_.

fmt_estimate(p)[source]

Format the current estimated parameter vector into a more useful form.

Will call the fmt_estimate method of the first component goal function (the first term in the addition that created this object).

gradient(p)[source]

Return the gradient of the multi-objective function.

This will be the sum of all goal functions that make up this multi-objective.

Parameters:

  • p
    : 1d-array

    The parameter vector.

Returns:

  • result
    : 1d-array

    The sum of the gradients of the components.

hessian(p)[source]

Return the hessian of the multi-objective function.

This will be the sum of all goal functions that make up this multi-objective.

Parameters:

  • p
    : 1d-array

    The parameter vector.

Returns:

  • result
    : 2d-array

    The sum of the hessians of the components.

regul_param

The regularization parameter (scale factor) for the objective function.

Defaults to 1.

value(p)[source]

Return the value of the multi-objective function.

This will be the sum of all goal functions that make up this multi-objective.

Parameters:

  • p
    : 1d-array

    The parameter vector.

Returns:

  • result
    : scalar (float, int, etc)

    The sum of the values of the components.

class fatiando.inversion.base.OperatorMixin[source]

Bases: future.types.newobject.newobject

Implements the operators + and * for the goal functions classes.

This class is not meant to be used on its own. Use it as a parent to give the child class the + and * operators.

Used in Misfit and the regularization classes in fatiando.inversion.regularization.

Note

Performing A + B produces a MultiObjective with copies of A and B.

Note

Performing scalar*A produces a copy of A with scalar set as the regul_param attribute.

copy(deep=False)[source]

Make a copy of me.

regul_param

The regularization parameter (scale factor) for the objective function.

Defaults to 1.

class fatiando.inversion.base.OptimizerMixin[source]

Bases: object

Defines fit and config methods plus all the optimization methods.

This class is not meant to be used on its own. Use it as a parent to give the child class the methods it implements.

Used in Misfit and fatiando.inversion.base.MultiObjective.

The config method is used to configure the optimization method that will be used.

The fit method runs the optimization method configured and stores the computed parameter vector in the p_ attribute.

Some stats about the optimization process are stored in the stats_ attribute as a dictionary.

The minimum requirement for a class to inherit from OptimizerMixin is that it must define at least a value method.

config(method, **kwargs)[source]

Configure the optimization method and its parameters.

This sets the method used by fit and the keyword arguments that are passed to it.

Parameters:

  • method
    : string

    The optimization method. One of: 'linear', 'newton', 'levmarq', 'steepest', 'acor'

Other keyword arguments that can be passed are the ones allowed by each method.

Some methods have required arguments:

  • newton, levmarq and steepest require the initial argument (an initial estimate for the gradient descent)
  • acor requires the bounds argument (min/max values for the search space)

See the corresponding docstrings for more information.

estimate_

A nicely formatted version of the estimate.

If the class implements a fmt_estimate method, this will return its results. This can be used to convert the parameter vector to a more useful form, like a fatiando.mesher object.

fit()[source]

Solve for the parameter vector that minimizes this objective function.

Uses the optimization method and parameters defined using the config method.

The estimated parameter vector can be accessed through the p_ attribute. A (possibly) formatted version (converted to a more manageable type) of the estimate can be accessed through the property estimate_.

fmt_estimate(p)[source]

Called when accessing the property estimate_.

Use this to convert the parameter vector (p) to a more useful form, like a geometric object, etc.

Parameters:

  • p
    : 1d-array

    The parameter vector.

Returns:

  • formatted

    Pretty much anything you want.
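
As an illustration, a hypothetical class (`GridEstimate`, not part of fatiando) could override fmt_estimate to reshape the flat parameter vector into a 2D grid:

```python
import numpy as np


# Hypothetical sketch of the fmt_estimate idea: estimate_ returns whatever
# fmt_estimate makes of the raw parameter vector p_.
class GridEstimate:
    def __init__(self, shape):
        self.shape = shape
        self.p_ = None

    @property
    def estimate_(self):
        return self.fmt_estimate(self.p_)

    def fmt_estimate(self, p):
        # convert the flat parameter vector into a 2D grid
        return p.reshape(self.shape)


obj = GridEstimate((2, 3))
obj.p_ = np.arange(6)
obj.estimate_.shape  # (2, 3)
```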