The Analytical Jacobian

Dymola is a tool built around advanced symbolic manipulation. A wide range of solver algorithms and numerical methods can be used to solve the models created within it, and many of them have a substantial impact on the accuracy and speed of the resulting simulations. Jacobian matrices play a key part in much of this. We have previously discussed that reducing the number of numerical Jacobians improves simulation performance, but why is this, and how do we convince Dymola to create analytical Jacobians rather than their less desirable numerical counterparts? Or perhaps a better question: what forces Dymola to use numerical Jacobians in some scenarios?

What is a Jacobian?

Put simply, a Jacobian matrix is a matrix of the first-order partial derivatives of a system of (non-linear) equations: each entry describes how one equation's output changes with respect to one of the system's variables. The Jacobian can be thought of as the gradient of the system of equations, a local linearization of the response to changes in each input value.
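In more formal terms, for a system of equations f(x) with outputs f_1 ... f_n and variables x_1 ... x_m, the Jacobian is the matrix of first-order partial derivatives:

J_{ij} = \frac{\partial f_i}{\partial x_j}

Each entry answers the question: how much does equation i change if variable j is nudged slightly?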

Analytical Jacobians are those whose entries can be written down directly as closed-form expressions. Numerical Jacobians, by contrast, must be approximated for the system using numerical methods.

Determination of analytical Jacobians is not always easy.

I first learned this while programming a kinematic assembly model of a vehicle around 2006. That particular model was written and solved in Matlab. Even though the results of the model were accurate, the simulation was much slower than anticipated. Since this model was going to be used by numerous engineers, many times per week, I was asked to work on the simulation speed. After a phone call with the help desk at MathWorks, I set out to construct an analytical Jacobian function for the system of equations, based on partial derivative definitions in a textbook that I was following. Even though I had the blueprint of how to construct the Jacobian matrix in this case, it was a painful process. In the end (after fixing multiple bugs in the textbook), I was successful in creating an accurate Jacobian function, and the resulting kinematic assembly simulation ran in approximately 1/10th of the original time!

Since that project, I have never taken the value of an analytical Jacobian for granted. Often, Dymola makes this analytical Jacobian generation step seem trivial, and the user goes on blissfully unaware of the work that Dymola is performing behind the scenes. This is one of the many endearing qualities of Dymola and is why many of us engineers appreciate it so much!

Numerical Jacobians in Dymola are constructed automatically when an analytical Jacobian cannot be calculated. They are built by evaluating the residuals of a non-linear system of equations for every independent system variable. To determine these residuals, the solver perturbs the system, generally one independent variable at a time, until each of the gradients in the Jacobian has been approximated. This sequence of calculations is time consuming and thus has an adverse effect on simulation speed. Non-linear model behaviour (including that caused by equation sets containing if statements and branches, for instance) can require very complex or even multiple numerical Jacobians. If possible, reducing or eliminating the use of numerical Jacobians is desirable to avoid the associated parasitic number crunching.
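As a rough sketch of what such a perturbation scheme looks like (the forward-difference form shown here is a standard textbook construction and is only illustrative; Dymola's actual implementation may differ), each column of the Jacobian is approximated by nudging one variable by a small step h and re-evaluating the residuals:

J_{ij} \approx \frac{f_i(x + h\,e_j) - f_i(x)}{h}

where e_j is the unit vector for variable j. For a system with n unknowns, this means at least n extra residual evaluations every time the Jacobian needs refreshing, which is where the performance penalty comes from.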

Why does Dymola need Jacobians?

Jacobians are required for the solution of a variety of problem types, and a couple of common ones come to mind. The first is the relationship between the state variables and their derivatives across a time step in an ODE problem (for implicit integration methods). The second is the solution of a non-linear algebraic system of equations.

In many cases, it comes down to the use of a zero-finding algorithm. These numerical methods (such as Newton-Raphson) are used to drive residuals to zero (or at least to a tolerable value). Regardless of the application, such algorithms generally require gradient information to lead the solver towards a sensible ‘step’ in the direction of the goal. At each iteration of the non-linear solver, the Jacobian (as the gradient of the system) is used to estimate the step for each independent variable which, if the system of equations were linear, would achieve zero residual. Once the step is taken, the residual is evaluated again. Since the equations are non-linear, the linear gradient estimate is not perfect, and multiple steps are usually necessary. An accurate Jacobian matrix generally reduces the number of iterations needed to converge. For highly non-linear systems, the Jacobian may need to be recalculated at every iteration of the non-linear solver; for others, it may only need updating every few steps.
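For a Newton-Raphson style solver, one iteration typically amounts to solving the linearized system for a step and then applying it (this is the standard textbook form, not anything specific to Dymola):

J(x_k)\,\Delta x = -f(x_k), \qquad x_{k+1} = x_k + \Delta x

The more accurate J is, the closer each step lands to the true root, and the fewer iterations (and Jacobian refreshes) are needed.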

As you can hopefully envision from the previous paragraph, employing analytical Jacobians instead of numerical ones offers an avenue to increased simulation performance. They remove the computational overhead of the numerical approximation and improve the Jacobian's accuracy, which means quicker convergence and a faster-simulating model.

Helping Dymola generate an analytical Jacobian

Dymola automatically generates analytical Jacobians for the vast majority of equation systems. To do this, Dymola must be able to differentiate the system of equations, which it can generally do when the user has created the system within Dymola and written the equations in an equation section. However, if equations (or, more accurately, assignment statements) are written in an algorithm section, Dymola is not allowed to differentiate them automatically.

Numerical Jacobians in Dymola models are therefore often caused by equations written inside functions, whose bodies are algorithm sections. Because algorithm sections are executed causally, per the Modelica Language Specification, Dymola will rarely differentiate them unless the user manually marks the function as smooth.

This can be done in a few ways; one of the most common is to add the smoothOrder annotation to the function:

annotation (smoothOrder=2);

Using this annotation informs Dymola that it may automatically differentiate the function up to the number of times stated in the smoothOrder annotation (in this case, twice). Automatic differentiation is the key here, as it enables Dymola to generate an analytical Jacobian instead of a numerical one.
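As a minimal sketch of the annotation in context (the function name and force law below are invented for illustration, not taken from any library), a smooth function marked for automatic differentiation might look like this:

function springForce "Illustrative smooth force law (hypothetical example)"
  input Real x "Deflection";
  output Real f "Force";
algorithm
  f := 1000*x + 50*x^3;
  // A polynomial is smooth to any order, so declaring smoothOrder=2 is safe here.
  annotation (smoothOrder=2);
end springForce;

Without the annotation, the algorithm section tells Dymola nothing about differentiability, so any non-linear system this function participates in would fall back to a numerical Jacobian.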

Figure: Adding the smoothOrder annotation to a function enables it to be automatically differentiated, avoiding a numerical Jacobian. Image: Dassault Systemes

Note: Setting smoothOrder = 1 on a function that is actually discontinuous can lead to very poor performance (or even failure) of the model, so be sure the function really is smooth up to the order stated in the annotation.

Closing remarks

Removing numerical Jacobians is an often simple way of improving model performance, and it can generally be done without significantly tearing up or restructuring models. As good practice, make sure the derivatives of your functions are continuous and inform Dymola of this by setting the smoothOrder annotation. For more information, please refer to Dymola User Manual 2C, page 10.

Nate Horn – Vice President

Please get in touch if you have any questions or have got a topic in mind that you would like us to write about. You can submit your questions / topics via: Tech Blog Questions / Topic Suggestion
