All examples are expected to run from the examples/ directory of the Tesseract-JAX repository.
In this example, you will learn how to:

- Build a Tesseract that uses finite differences under the hood to enable differentiability of a non-autodifferentiable geometry operation (computing a signed distance field from a 3D model).
- Compose both Tesseracts (the SDF Tesseract and a differentiable FEM solver Tesseract) with Tesseract-JAX to create a pipeline that can be used for differentiable shape optimization.
In this notebook, we explore the optimization of a parametric structure made of a linear elastic material. The structure is parametrized by N bars, each of which consists of M piecewise linear segments. We seek the optimal configuration of the \(y\)-coordinates of the vertices that connect those bar segments. This notebook is based on the 2D topology optimization example from jax-fem, but we solve the problem using a parametric approach instead.
That is, we use end-to-end automatic differentiation (AD) through several components to optimize the design variables directly with respect to (simulated) performance of the design.
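To make the parametrization concrete, here is a minimal sketch. The names and dimensions (`N_BARS`, `N_SEGMENTS`, the unit \(x\)-range) are illustrative choices, not taken from the example code: each bar is a polyline over a fixed \(x\)-grid, and the design variables \(\theta\) are the \(y\)-coordinates of its vertices.

```python
import jax.numpy as jnp

# Hypothetical sketch of the design parametrization; N_BARS, N_SEGMENTS, and
# the unit x-range are illustrative, not the example's actual values.
N_BARS, N_SEGMENTS = 5, 4

def bar_vertices(theta: jnp.ndarray) -> jnp.ndarray:
    """Map design variables of shape (N_BARS, N_SEGMENTS + 1) to vertex
    coordinates of shape (N_BARS, N_SEGMENTS + 1, 2)."""
    x = jnp.linspace(0.0, 1.0, N_SEGMENTS + 1)   # fixed, shared x-grid
    xs = jnp.broadcast_to(x, theta.shape)        # same x-grid for every bar
    return jnp.stack([xs, theta], axis=-1)       # attach y-coords from theta

# Initial guess: horizontal bars, evenly spaced in y
theta0 = jnp.repeat(jnp.linspace(0.2, 0.8, N_BARS)[:, None], N_SEGMENTS + 1, axis=1)
vertices = bar_vertices(theta0)                  # shape (5, 5, 2)
```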
The design space is defined using a geometry library called PyVista, which does not support automatic differentiation. However, we can make this operation differentiable by using a finite difference approximation of its Jacobian matrix, as sketched below.
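Here is a minimal sketch of this trick in plain JAX, assuming a black-box function on a 1D design vector. `black_box_sdf` is a toy stand-in, not PyVista's actual SDF routine; the real example implements the same idea inside a Tesseract's VJP endpoint.

```python
import jax
import jax.numpy as jnp
import numpy as np

def black_box_sdf(theta: np.ndarray) -> np.ndarray:
    # Toy stand-in for a non-autodifferentiable geometry operation
    return np.sin(theta) * np.sum(theta**2)

@jax.custom_vjp
def sdf(theta):
    # pure_callback lets JAX trace through the opaque NumPy computation
    return jax.pure_callback(
        black_box_sdf, jax.ShapeDtypeStruct(theta.shape, theta.dtype), theta
    )

def sdf_fwd(theta):
    return sdf(theta), theta  # save theta as residual for the backward pass

def sdf_bwd(theta, cotangent):
    # Forward-difference Jacobian: one extra primal evaluation per input
    eps = 1e-5
    f0 = sdf(theta)
    cols = [(sdf(theta + eps * jnp.eye(theta.size)[i]) - f0) / eps
            for i in range(theta.size)]
    jac = jnp.stack(cols, axis=-1)   # shape (n_outputs, n_inputs)
    return (cotangent @ jac,)        # vector-Jacobian product

sdf.defvjp(sdf_fwd, sdf_bwd)

grad = jax.grad(lambda t: sdf(t).sum())(jnp.array([0.3, 0.7, 1.1]))
```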
We denote the design space as a function \(g\) that maps the design variables to a signed distance field. We can then define the density field \(\rho(\mathbf{x})\) as a function of the signed distance field (SDF) value \(g(\mathbf{x})\). Finally, we denote the differentiable finite element method (FEM) solver as \(f\), which takes the density field as input and returns the structure's compliance. The optimization problem can therefore be formulated as follows:
\[ \begin{equation} \min_{\theta} f(\rho(g(\theta))). \end{equation} \]
Here, \(\theta\) is the vector of design variables (the \(y\)-coordinates of the vertices).
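To illustrate the composition, here is a hedged sketch with toy stand-ins for \(g\) and \(f\). The sigmoid projection is one common smooth choice for \(\rho\); the notebook's actual projection and solver live inside the Tesseracts described below.

```python
import jax
import jax.numpy as jnp

def sdf_stub(theta):                 # g: design variables -> SDF values (toy)
    return theta - 0.5

def compliance_stub(rho):            # f: density field -> scalar compliance (toy)
    return jnp.sum(rho**2)

def density(g_values, epsilon=0.05):
    # rho: smooth projection of the SDF; g < 0 (inside the shape) -> rho ~ 1
    return jax.nn.sigmoid(-g_values / epsilon)

def objective(theta):
    return compliance_stub(density(sdf_stub(theta)))   # f(rho(g(theta)))

theta = jnp.linspace(0.0, 1.0, 8)
theta_grad = jax.grad(objective)(theta)                # d f / d theta
```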
## AD and Tesseracts

Since we want to use a gradient-based optimizer, we need to compute the gradient of the compliance with respect to the design variables. Hence, we are interested in the following derivative:

\[ \begin{equation} \frac{\partial f}{\partial\theta} = \frac{\partial f}{\partial \rho} \cdot \frac{\partial \rho}{\partial g} \cdot \frac{\partial g}{\partial\theta}. \end{equation} \]

Note that each term is a Jacobian matrix. With modern AD libraries such as JAX, backpropagation uses vector-Jacobian products to pull gradients back through the entire pipeline without ever materializing the Jacobian matrices. This is a powerful feature, but it typically requires that the entire pipeline be implemented in a single monolithic application, which can be cumbersome and error-prone, and does not scale well to large applications or compute needs.

With Tesseracts, we instead wrap each function in a separate module and compose them together. To enable differentiability, we also define AD-relevant endpoints, such as the vector-Jacobian product, inside each Tesseract module (`tesseract_api.py`), as sketched below. To learn more about building and running Tesseracts, please refer to the Tesseract documentation.
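Concretely, composition with Tesseract-JAX might look like the following sketch. The image names (`sdf`, `fem`) and the input/output keys are illustrative assumptions rather than this example's actual schema; `apply_tesseract` exposes each served Tesseract as a JAX-traceable, differentiable function.

```python
import jax
import jax.numpy as jnp
from tesseract_core import Tesseract
from tesseract_jax import apply_tesseract

# Hypothetical image names and payload keys; adjust to the real Tesseracts.
with (
    Tesseract.from_image("sdf") as sdf_tx,
    Tesseract.from_image("fem") as fem_tx,
):
    def compliance(theta):
        g = apply_tesseract(sdf_tx, {"theta": theta})["sdf"]        # assumed keys
        rho = jax.nn.sigmoid(-g / 0.05)                             # density projection
        return apply_tesseract(fem_tx, {"rho": rho})["compliance"]  # assumed scalar

    # JAX pulls VJPs through both Tesseracts via their AD endpoints
    theta_grad = jax.grad(compliance)(jnp.zeros(10))
```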