Python API (tangent_py)¶
Tangent: Python bindings for manifold-based nonlinear least squares optimization.
Uses cppyy to JIT-compile Tangent's C++ header files at runtime.
Initialization¶
- tangent_py.init()[source]¶
Initialize cppyy with Tangent headers.
This must be called before using any Tangent functionality. It loads Eigen, Sophus, and all Tangent headers.
Safe to call multiple times (idempotent).
- tangent_py.define_error_term(cpp_code: str)[source]¶
JIT compile a C++ error term class.
The code should define a class inheriting from AutoDiffErrorTerm.
- Parameters:
cpp_code – C++ code string defining the error term class
Example
>>> define_error_term('''
... class DiffError : public Tangent::AutoDiffErrorTerm<DiffError, double, 1,
...                                                     Tangent::SimpleScalar,
...                                                     Tangent::SimpleScalar> {
...  public:
...   DiffError(Tangent::VariableKey<Tangent::SimpleScalar> k1,
...             Tangent::VariableKey<Tangent::SimpleScalar> k2) {
...     std::get<0>(variableKeys) = k1;
...     std::get<1>(variableKeys) = k2;
...     information.setIdentity();
...   }
...
...   template <typename T, typename V1, typename V2>
...   Eigen::Matrix<T, 1, 1> computeError(const V1& v1, const V2& v2) const {
...     Eigen::Matrix<T, 1, 1> err;
...     err(0) = v2 - v1;
...     return err;
...   }
... };
... ''')
Error Term Templates¶
- tangent_py.error_term_template(name: str, residual_dim: int, var_types: List[str], compute_body: str, extra_members: str = '', extra_constructor_params: str = '', extra_constructor_init: str = '') str[source]¶
Generate C++ code for an AutoDiffErrorTerm.
Supports any combination of variable types, residual dimensions, and optional extra members/constructor parameters.
- Parameters:
name – Class name for the error term
residual_dim – Dimension of the residual vector
var_types – List of variable type names (e.g., ["SimpleScalar", "SE3"])
compute_body – C++ code for the body of computeError(). Use v0, v1, v2… to refer to variables. Use err(i) to set residual components.
extra_members – Additional class member declarations (e.g., "double target;")
extra_constructor_params – Additional constructor parameters (e.g., "double t")
extra_constructor_init – Additional constructor initialization code (e.g., "target = t;")
- Returns:
C++ code string defining the error term class
Examples
Binary constraint between two variables:
>>> code = error_term_template(
...     name="DifferenceError",
...     residual_dim=1,
...     var_types=["SimpleScalar", "SimpleScalar"],
...     compute_body="err(0) = v1 - v0;"
... )
Unary prior with a target value:
>>> code = error_term_template(
...     name="ScalarPrior",
...     residual_dim=1,
...     var_types=["SimpleScalar"],
...     compute_body="err(0) = v0 - target;",
...     extra_members="double target;",
...     extra_constructor_params="double t",
...     extra_constructor_init="target = t;"
... )
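To make the generated shape concrete, here is a hypothetical, simplified reimplementation of the generator as plain string assembly (the real error_term_template also handles extra_members and the constructor extras; names and layout here are illustrative only):

```python
from typing import List

def error_term_template_sketch(name: str, residual_dim: int,
                               var_types: List[str], compute_body: str) -> str:
    """Illustrative sketch of the code generation, without extras."""
    types = ", ".join(f"Tangent::{t}" for t in var_types)
    params = ", ".join(f"Tangent::VariableKey<Tangent::{t}> k{i}"
                       for i, t in enumerate(var_types))
    key_init = "\n    ".join(f"std::get<{i}>(variableKeys) = k{i};"
                             for i in range(len(var_types)))
    tmpl_args = ", ".join(f"typename V{i}" for i in range(len(var_types)))
    call_args = ", ".join(f"const V{i}& v{i}" for i in range(len(var_types)))
    return f"""class {name} : public Tangent::AutoDiffErrorTerm<{name}, double, {residual_dim}, {types}> {{
 public:
  {name}({params}) {{
    {key_init}
    information.setIdentity();
  }}

  template <typename T, {tmpl_args}>
  Eigen::Matrix<T, {residual_dim}, 1> computeError({call_args}) const {{
    Eigen::Matrix<T, {residual_dim}, 1> err;
    {compute_body}
    return err;
  }}
}};"""

code = error_term_template_sketch("DifferenceError", 1,
                                  ["SimpleScalar", "SimpleScalar"],
                                  "err(0) = v1 - v0;")
```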
Optimizer¶
- class tangent_py.Optimizer(variables: List[str], error_terms: List[str], huber_delta: float = 1000.0)[source]¶
Nonlinear least squares optimizer using Tangent’s SSEOptimizer.
This class wraps SSEOptimizer and handles the complex template instantiation required by cppyy.
Example
>>> opt = Optimizer(
...     variables=["SimpleScalar"],
...     error_terms=["DifferenceError"]
... )
>>> x = SimpleScalar(10.0)
>>> y = SimpleScalar(0.0)
>>> k1 = opt.add_variable(x)
>>> k2 = opt.add_variable(y)
>>> opt.add_error_term("DifferenceError", k1, k2)
>>> result = opt.optimize()
- __init__(variables: List[str], error_terms: List[str], huber_delta: float = 1000.0)[source]¶
Create an optimizer.
- Parameters:
variables – List of variable type names (e.g., ["SimpleScalar", "SE3"])
error_terms – List of error term class names
huber_delta – Delta parameter of the Huber robust loss; residuals larger than this are down-weighted
- add_error_term(error_type: str, *args)[source]¶
Add an error term to the optimization problem.
- Parameters:
error_type – Name of the error term class
*args – Arguments to pass to the error term constructor (typically variable keys and measurement data)
- Returns:
Error term key
- add_variable(variable, prior_info=None)[source]¶
Add a variable to the optimization problem.
- Parameters:
variable – A Tangent variable instance (SimpleScalar, SE3, etc.)
prior_info – Optional prior information matrix (numpy array or scalar)
- Returns:
Variable key for referencing this variable
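A prior information matrix is just a positive semi-definite weight on the variable's tangent dimensions; for a 6-DoF SE3 variable one might build it with NumPy (the values, and the ordering of translation vs. rotation components, are illustrative assumptions):

```python
import numpy as np

# Diagonal information matrix: larger entries pin the corresponding
# tangent dimensions more strongly. Which dimensions are translation
# and which are rotation depends on the manifold's tangent convention.
prior_info = np.diag([100.0, 100.0, 100.0, 1.0, 1.0, 1.0])

# For a 1-D SimpleScalar variable, a plain scalar works as well:
scalar_info = 10.0
```

The resulting array would then be passed as the prior_info argument of add_variable().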
- get_variable(key)[source]¶
Get a variable by its key.
- Parameters:
key – Variable key from add_variable()
- Returns:
The variable instance with current optimized value
- optimize(max_iterations: int | None = None) OptimizationResult[source]¶
Run the optimization.
- Parameters:
max_iterations – Override the default maximum iterations
- Returns:
OptimizationResult with convergence information
- set_prior(key, information)[source]¶
Set the prior information matrix for a variable.
- Parameters:
key – Variable key from add_variable()
information – Information matrix (scalar for 1D, or numpy array)
- property settings¶
Access optimizer settings.
Results¶
- class tangent_py.OptimizationResult(iterations: int, initial_error: float, final_error: float, converged: bool, error_decreased: bool, error_history: List[float])[source]¶
Result of an optimization run.
- iterations¶
Number of optimizer iterations performed.
- Type:
int
- initial_error¶
Total squared error before optimization.
- Type:
float
- final_error¶
Total squared error after optimization.
- Type:
float
- converged¶
Whether the optimizer converged to a minimum.
- Type:
bool
- error_decreased¶
Whether the final error is less than the initial error.
- Type:
bool
- error_history¶
List of total error values at each iteration.
- Type:
List[float]
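Since the result is a plain dataclass, post-processing is straightforward. A sketch using a local mirror of the documented fields and a hypothetical helper (not part of the API):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class OptimizationResult:  # mirror of the documented fields, for illustration
    iterations: int
    initial_error: float
    final_error: float
    converged: bool
    error_decreased: bool
    error_history: List[float]

def reduction_ratio(res: OptimizationResult) -> float:
    """Fraction of the initial total squared error eliminated."""
    return 1.0 - res.final_error / res.initial_error

res = OptimizationResult(iterations=5, initial_error=50.0, final_error=0.5,
                         converged=True, error_decreased=True,
                         error_history=[50.0, 10.0, 2.0, 0.8, 0.5])
print(reduction_ratio(res))  # → 0.99
```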