# Getting Started with Symbolics.jl

Symbolics.jl is a symbolic modeling language. In this tutorial, we will walk you through the process of getting Symbolics.jl up and running, and start doing our first symbolic calculations.

## Installing Symbolics.jl

Installing Symbolics.jl is as simple as using the Julia package manager, via the command `]add Symbolics`. After precompilation is complete, run `using Symbolics` in the REPL, and once that command completes, you're ready to start!
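Equivalently, the package can be added non-interactively through the `Pkg` standard library, which is handy in scripts and CI; a minimal sketch:

```julia
# Non-interactive install via the Pkg standard library,
# equivalent to `]add Symbolics` in the REPL
import Pkg
Pkg.add("Symbolics")

using Symbolics  # triggers precompilation on first load
```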

## Building Symbolic Expressions

The way to define symbolic variables is via the `@variables` macro:

```
using Symbolics
@variables x y
```

\[ \begin{equation} \left[ \begin{array}{c} x \\ y \\ \end{array} \right] \end{equation} \]

After defining variables as symbolic, symbolic expressions, which we call `iscall` objects, can be generated by utilizing Julia expressions. For example:

`z = x^2 + y`

\[ \begin{equation} y + x^{2} \end{equation} \]

Here, `z` is an expression tree for “square `x` and add `y`”. To make an array of symbolic expressions, simply make an array of symbolic expressions:

```
A = [x^2 + y  0  2x
     0        0  2y
     y^2 + x  0  0]
```

\[ \begin{equation} \left[ \begin{array}{ccc} y + x^{2} & 0 & 2 x \\ 0 & 0 & 2 y \\ x + y^{2} & 0 & 0 \\ \end{array} \right] \end{equation} \]

Note that by default, `@variables` returns `Sym` or `iscall` objects wrapped in `Num` to make them behave like subtypes of `Real`. Any operation on these `Num` objects will return a new `Num` object, wrapping the result of computing symbolically on the underlying values.
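A quick illustration of the wrapping (the `typeof` checks here are just for inspection):

```julia
using Symbolics
@variables x y

z = x^2 + y
x isa Real                  # true: Num behaves like a subtype of Real
typeof(z)                   # Num
# Symbolics.value unwraps a Num to the underlying symbolic object
typeof(Symbolics.value(z))  # the underlying (non-Num) symbolic type
```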

If you are following this tutorial in the Julia REPL, `A` will not be shown with LaTeX equations. To get these equations, we can use Latexify.jl. Symbolics.jl comes with Latexify recipes, so it works automatically:

```
using Latexify
latexify(A)
```

\[ \begin{equation} \left[ \begin{array}{ccc} y + x^{2} & 0 & 2 x \\ 0 & 0 & 2 y \\ x + y^{2} & 0 & 0 \\ \end{array} \right] \end{equation} \]

Normal Julia functions work on Symbolics expressions, so if we want to create the sparse version of `A`, we would just call `sparse`:

```
using SparseArrays
spA = sparse(A)
latexify(spA)
```

\[ \begin{equation} \left[ \begin{array}{ccc} y + x^{2} & 0 & 2 x \\ 0 & 0 & 2 y \\ x + y^{2} & 0 & 0 \\ \end{array} \right] \end{equation} \]

We can thus use normal Julia functions as generators for symbolic expressions. For example, here we will define

```
function f(u)
    [u[1] - u[3], u[1]^2 - u[2], u[3] + u[2]]
end
f([x, y, z]) # Recall that z = x^2 + y
```

\[ \begin{equation} \left[ \begin{array}{c} x - y - x^{2} \\ - y + x^{2} \\ 2 y + x^{2} \\ \end{array} \right] \end{equation} \]

Or we can build an array of variables and use it to trace the function:

```
u = Symbolics.variables(:u, 1:5)
f(u)
```

\[ \begin{equation} \left[ \begin{array}{c} u_1 - u_3 \\ - u_2 + u_1^{2} \\ u_2 + u_3 \\ \end{array} \right] \end{equation} \]

`Symbolics.variables(:u, 1:5)` creates a Julia array of symbolic variables. This uses O(n) compute and memory but is a very general representation. Symbolics.jl also has the ability to represent symbolic arrays, which gives an O(1) representation but is more limited in its functionality. For more information, see the Symbolic Arrays page.
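A sketch contrasting the two representations (the name `v` is just illustrative; see the Symbolic Arrays page for what the array form supports):

```julia
using Symbolics

# O(n): a plain Julia Vector holding 5 independent symbolic scalars
u = Symbolics.variables(:u, 1:5)
length(u)   # 5 separate Num objects

# O(1): a single symbolic array variable representing all entries at once
@variables v[1:5]
size(v)     # (5,) -- one object whose shape is known symbolically
```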

## Derivatives

One common thing to compute in a symbolic system is derivatives. In Symbolics.jl, derivatives are represented lazily via operations, just like any other function. To build a differential operator, use `Differential` like:

```
@variables t
D = Differential(t)
```

`Differential(t)`

This is the differential operator $D = \frac{\partial}{\partial t}$. We can compose differential operators with `*`, e.g. `Differential(t) * Differential(x)`, or take powers, e.g. `Differential(t)^2`. Now let's write down the derivative of some expression:

```
z = t + t^2
D(z)
```

\[ \begin{equation} \frac{\mathrm{d}}{\mathrm{d}t} \left( t + t^{2} \right) \end{equation} \]

Notice that this hasn't computed anything yet: `D` is a lazy operator because it lets us symbolically represent “the derivative of $z$ with respect to $t$”, which is useful, for example, when representing our favorite thing in the world, differential equations. However, if we want to expand the derivative operators, we'd use `expand_derivatives`:

`expand_derivatives(D(z))`

\[ \begin{equation} 1 + 2 t \end{equation} \]
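Composed operators like `Differential(t)^2` expand the same way; a small sketch taking a second derivative:

```julia
using Symbolics
@variables t
D = Differential(t)

z = t + t^2
D2 = D^2                   # compose D with itself
expand_derivatives(D2(z))  # second derivative of t + t^2, i.e. 2
```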

The variable that you are taking the derivative with respect to is accessed with:

`D.x`

\[ \begin{equation} t \end{equation} \]

We can also have simplified functions for multivariable calculus. For example, we can compute the Jacobian of an array of expressions like:

`Symbolics.jacobian([x + x*y, x^2 + y], [x, y])`

\[ \begin{equation} \left[ \begin{array}{cc} 1 + y & x \\ 2 x & 1 \\ \end{array} \right] \end{equation} \]

and similarly we can compute Hessians, gradients, and whatever other derivatives you need.
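For instance, `Symbolics.gradient` and `Symbolics.hessian` work analogously to `Symbolics.jacobian` (a sketch with fresh variables):

```julia
using Symbolics
@variables x y

# gradient of a scalar expression with respect to a list of variables
Symbolics.gradient(x^2 + x*y, [x, y])   # [2x + y, x]

# hessian: the matrix of second partial derivatives
Symbolics.hessian(x^2 + x*y, [x, y])    # [2 1; 1 0]
```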

## Simplification and Substitution

Symbolics interfaces with SymbolicUtils.jl to allow for simplifying symbolic expressions. This is done simply through the `simplify` command:

`simplify(2x + 2y)`

\[ \begin{equation} 2 \left( x + y \right) \end{equation} \]

This can be applied to arrays by using Julia's broadcast mechanism:

```
B = simplify.([t + t^2 + t + t^2  2t + 4t
               x + y + y + 2t     x^2 - x^2 + y^2])
```

\[ \begin{equation} \left[ \begin{array}{cc} 2 \left( t + t^{2} \right) & 6 t \\ 2 \left( t + y \right) + x & y^{2} \\ \end{array} \right] \end{equation} \]

We can then use `substitute` to change values of an expression around:

`simplify.(substitute.(B, (Dict(x => y^2),)))`

\[ \begin{equation} \left[ \begin{array}{cc} 2 \left( t + t^{2} \right) & 6 t \\ 2 \left( t + y \right) + y^{2} & y^{2} \\ \end{array} \right] \end{equation} \]

and we can use this to interactively evaluate expressions without generating and compiling Julia functions:

`V = substitute.(B, (Dict(x => 2.0, y => 3.0, t => 4.0),))`

\[ \begin{equation} \left[ \begin{array}{cc} 40 & 24 \\ 16 & 9 \\ \end{array} \right] \end{equation} \]

We can then access the numerical values via:

`Symbolics.value.(V)`

```
2×2 Matrix{Float64}:
 40.0  24.0
 16.0   9.0
```

## Non-Interactive Development

Note that the macros are for the high-level case where you're doing symbolic computation on your own code. If you want to do symbolic computation on someone else's code, like in a macro, you may not want to do `@variables x` because you might want the name “x” to come from the user's code. For these cases, you can use the interpolation operator to interpolate the runtime value of `x`, i.e. `@variables $x`. Check the documentation of `@variables` for more details.

`a, b, c = :runtime_symbol_value, :value_b, :value_c`

`(:runtime_symbol_value, :value_b, :value_c)`

`vars = @variables t $a $b(t) $c(t)[1:3]`

```
4-element Vector{Any}:
 t
 runtime_symbol_value
 value_b(t)
 (value_c(t))[1:3]
```

`(t, a, b, c)`

`(t, :runtime_symbol_value, :value_b, :value_c)`

One could also use `variable` and `variables`. Read their documentation for more details.
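A brief sketch of what these produce (the names here are just illustrative):

```julia
using Symbolics

a  = Symbolics.variable(:a)            # a single scalar symbolic variable
a1 = Symbolics.variable(:a, 1)         # an indexed scalar variable
M  = Symbolics.variables(:m, 1:2, 1:3) # a 2×3 Matrix{Num} of indexed variables
size(M)  # (2, 3)
```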

If we need to use this to generate new Julia code, we can simply convert the output to an `Expr`:

`Symbolics.toexpr(x + y^2)`

`:((+)(x, (^)(y, 2)))`
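Such an `Expr` can then be spliced into generated code; a minimal sketch (the anonymous function here is purely illustrative):

```julia
using Symbolics
@variables x y

ex = Symbolics.toexpr(x + y^2)   # :((+)(x, (^)(y, 2)))

# splice the expression into a hand-built function definition
f = eval(:((x, y) -> $ex))
Base.invokelatest(f, 2, 3)       # 2 + 3^2 = 11
```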

## `Sym`s and Callable `Sym`s

In the definition

`@variables t x(t) y(t)`

\[ \begin{equation} \left[ \begin{array}{c} t \\ x\left( t \right) \\ y\left( t \right) \\ \end{array} \right] \end{equation} \]

`t` is of type `Sym{Real}`, but the name `x` refers to an object that represents the `Term` `x(t)`. The operation of this expression is itself the object `Sym{FnType{Tuple{Real}, Real}}(:x)`. The type `Sym{FnType{...}}` represents a callable object. In this case specifically, it's a function that takes one `Real` argument (noted by `Tuple{Real}`) and returns a `Real` result. You can call such a callable `Sym` with either a number or a symbolic expression of a permissible type.
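A sketch of retrieving and calling that object, assuming the `operation` and `value` accessors behave as in SymbolicUtils:

```julia
using Symbolics
@variables t x(t)

# x is bound to the term x(t); its operation is the callable Sym{FnType}
xfun = Symbolics.operation(Symbolics.value(x))

xfun(1.0)  # calling it with a number builds the term x(1.0)
```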

This expression also defines `t` as an independent variable, while `x(t)` and `y(t)` are dependent variables. This is accounted for in differentiation:

```
z = x + y*t
expand_derivatives(D(z))
```

\[ \begin{equation} y\left( t \right) + \frac{\mathrm{d} x\left( t \right)}{\mathrm{d}t} + t \frac{\mathrm{d} y\left( t \right)}{\mathrm{d}t} \end{equation} \]

Since `x` and `y` are time-dependent, they are not automatically eliminated from the expression, and thus the `D(x)` and `D(y)` operations still exist in the expanded derivatives for correctness.

We can also define unrestricted functions:

`@variables g(..)`

```
1-element Vector{Symbolics.CallWithMetadata{SymbolicUtils.FnType{Tuple, Real}, Base.ImmutableDict{DataType, Any}}}:
g⋆
```

Here, `g` is a variable that is a function of other variables. Any time that we reference `g`, we have to utilize it as a function:

`z = g(x) + g(y)`

\[ \begin{equation} g\left( y\left( t \right) \right) + g\left( x\left( t \right) \right) \end{equation} \]

## Registering Functions

One of the benefits of a one-language Julia symbolic stack is that the primitives are all written in Julia, and therefore it's trivially extendable from Julia itself. By default, new functions are traced to the primitives, and the symbolic expressions are written on the primitives. However, we can expand the allowed primitives by registering new functions. For example, let's register a new function `h`:

```
h(x, y) = x^2 + y
@register_symbolic h(x, y)
```

When we use `h(x, y)`, it is a symbolic expression and doesn't expand:

`h(x, y) + y^2`

\[ \begin{equation} h\left( x\left( t \right), y\left( t \right) \right) + \left( y\left( t \right) \right)^{2} \end{equation} \]

To use it with the differentiation system, we need to register its derivatives. We would do it like this:

```
# Derivative w.r.t. the first argument
Symbolics.derivative(::typeof(h), args::NTuple{2,Any}, ::Val{1}) = 2args[1]
# Derivative w.r.t. the second argument
Symbolics.derivative(::typeof(h), args::NTuple{2,Any}, ::Val{2}) = 1
```

and now it works with the rest of the system:

`Symbolics.derivative(h(x, y) + y^2, x)`

\[ \begin{equation} 2 x\left( t \right) \end{equation} \]

`Symbolics.derivative(h(x, y) + y^2, y)`

\[ \begin{equation} 1 + 2 y\left( t \right) \end{equation} \]

`@register_symbolic` only allows for scalar outputs. If full array functions are needed, then see `@register_array_symbolic` for registering functions of symbolic arrays.

## Building Functions

The function for building functions from symbolic expressions is the aptly-named `build_function`. The first argument is the symbolic expression or the array of symbolic expressions to compile, and the trailing arguments are the arguments for the function. For example:

```
to_compute = [x^2 + y, y^2 + x]
f_expr = build_function(to_compute, [x, y])
Base.remove_linenums!.(f_expr)
```

```
(:(function (ˍ₋arg1,)
       begin
           begin
               (SymbolicUtils.Code.create_array)(typeof(ˍ₋arg1), nothing, Val{1}(), Val{(2,)}(), (+)(ˍ₋arg1[2], (^)(ˍ₋arg1[1], 2)), (+)(ˍ₋arg1[1], (^)(ˍ₋arg1[2], 2)))
           end
       end
   end), :(function (ˍ₋out, ˍ₋arg1)
       begin
           begin
               #= /home/runner/.julia/packages/SymbolicUtils/dtCid/src/code.jl:422 =# @inbounds begin
                       ˍ₋out[1] = (+)(ˍ₋arg1[2], (^)(ˍ₋arg1[1], 2))
                       ˍ₋out[2] = (+)(ˍ₋arg1[1], (^)(ˍ₋arg1[2], 2))
                       nothing
                   end
           end
       end
   end))
```

gives back two codes. The first is a function `f([x, y])` that computes and builds an output vector `[x^2 + y, y^2 + x]`. Because this tool was made to be used by all the cool kids writing fast Julia codes, it is specialized to Julia and supports features like StaticArrays. For example:

```
using StaticArrays
myf = eval(f_expr[1])
myf(SA[2.0, 3.0])
```

```
2-element StaticArraysCore.SVector{2, Float64} with indices SOneTo(2):
  7.0
 11.0
```

The second function is an in-place, non-allocating function which mutates its first argument. Thus, we'd use it like the following:

```
myf! = eval(f_expr[2])
out = zeros(2)
myf!(out, [2.0, 3.0])
out
```

```
2-element Vector{Float64}:
  7.0
 11.0
```

To save the symbolic calculations for later, we can take this expression and save it out to a file:

`write("function.jl", string(f_expr[2]))`

`373`
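Since the file contains a single function expression, `include` evaluates it and returns the function, so reloading the saved code later (assuming the file written above) is just:

```julia
# include returns the value of the last expression in the file,
# which here is the generated in-place function
myf! = include("function.jl")

out = zeros(2)
myf!(out, [2.0, 3.0])
out  # [7.0, 11.0]
```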

Note that if we need to avoid `eval`, for example to avoid world-age issues, one could pass the keyword argument `expression = Val{false}`:

`Base.remove_linenums!(build_function(to_compute, [x, y], expression=Val{false})[1])`

```
RuntimeGeneratedFunction(#=in Symbolics=#, #=using Symbolics=#, :((ˍ₋arg1,)->begin
        #= /home/runner/.julia/packages/SymbolicUtils/dtCid/src/code.jl:373 =#
        #= /home/runner/.julia/packages/SymbolicUtils/dtCid/src/code.jl:374 =#
        #= /home/runner/.julia/packages/SymbolicUtils/dtCid/src/code.jl:375 =#
        begin
            begin
                #= /home/runner/.julia/packages/SymbolicUtils/dtCid/src/code.jl:468 =#
                (SymbolicUtils.Code.create_array)(typeof(ˍ₋arg1), nothing, Val{1}(), Val{(2,)}(), (+)(ˍ₋arg1[2], (^)(ˍ₋arg1[1], 2)), (+)(ˍ₋arg1[1], (^)(ˍ₋arg1[2], 2)))
            end
        end
    end))
```

which will use RuntimeGeneratedFunctions.jl to build Julia functions which avoid world-age issues.
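The returned objects are directly callable, so no `eval` step is needed; a sketch:

```julia
using Symbolics
@variables x y
to_compute = [x^2 + y, y^2 + x]

# expression = Val{false} returns callable RuntimeGeneratedFunctions
f_oop, f_ip = build_function(to_compute, [x, y]; expression = Val{false})

f_oop([2.0, 3.0])      # out-of-place: [7.0, 11.0]

out = zeros(2)
f_ip(out, [2.0, 3.0])  # in-place: fills out
out                    # [7.0, 11.0]
```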

## Building Non-Allocating Parallel Functions for Sparse Matrices

Now let's show off a little bit. `build_function` is kind of like if `lambdify` ate its spinach. To show this, **let's build a non-allocating function that fills sparse matrices in a multithreaded manner**. To do this, we just have to represent the operation that we're turning into a function via a sparse matrix. For example:

```
using LinearAlgebra
N = 8
A = sparse(Tridiagonal([x^i for i in 1:N-1], [x^i * y^(8-i) for i in 1:N], [y^i for i in 1:N-1]))
show(A)
```

`sparse([1, 2, 1, 2, 3, 2, 3, 4, 3, 4, 5, 4, 5, 6, 5, 6, 7, 6, 7, 8, 7, 8], [1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4, 5, 5, 5, 6, 6, 6, 7, 7, 7, 8, 8], Num[x(t)*(y(t)^7), x(t), y(t), (x(t)^2)*(y(t)^6), x(t)^2, y(t)^2, (x(t)^3)*(y(t)^5), x(t)^3, y(t)^3, (x(t)^4)*(y(t)^4), x(t)^4, y(t)^4, (x(t)^5)*(y(t)^3), x(t)^5, y(t)^5, (x(t)^6)*(y(t)^2), x(t)^6, y(t)^6, (x(t)^7)*y(t), x(t)^7, y(t)^7, x(t)^8], 8, 8)`

Now we call `build_function`:

`Base.remove_linenums!(build_function(A, [x, y], parallel = Symbolics.MultithreadedForm())[2])`

```
:(function (ˍ₋out, ˍ₋arg1)
      begin
          begin
              var"##out#245" = ˍ₋out.nzval
              begin
                  (Symbolics.var"#noop#430"())(map(fetch, (begin
                      let
                          task = Base.Task((Symbolics.Funcall)(RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(Symbol("##out#245"), :ˍ₋arg1), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0xcfd04852, 0x326acbab, 0x83340d71, 0x2e61ac11, 0xc3ac6e79), Nothing}(nothing), (var"##out#245", ˍ₋arg1)))
                          task.sticky = false
                          Base.schedule(task)
                          task
                      end
                  end, begin
                      let
                          task = Base.Task((Symbolics.Funcall)(RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(Symbol("##out#245"), :ˍ₋arg1), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0xa7d15217, 0xfd92cce7, 0xc63be6e6, 0x1851c5da, 0xa02e9a01), Nothing}(nothing), (var"##out#245", ˍ₋arg1)))
                          task.sticky = false
                          Base.schedule(task)
                          task
                      end
                  end))...)
              end
          end
      end
  end)
```

Let's unpack what that's doing. It's using `A.nzval` to linearly index through the sparse matrix, avoiding the `A[i,j]` form because that is a more expensive way to index a sparse matrix when you know exactly the order in which the data is stored. Then it splits the calculation up into Julia threads, so the pieces can be computed in parallel. It synchronizes after spawning all the tasks, so the computation is ensured to be complete before moving on. And it does all of this with in-place operations on the original matrix instead of generating new arrays. This is the fanciest way you could fill a sparse matrix, and Symbolics makes it dead simple.

(Note: this example may be slower with multithreading due to the thread-spawning overhead, but the full version was not included in the documentation for brevity. It will be the faster version if `N` is sufficiently large!)

When using the in-place functions generated by `build_function`, it is important that the mutated argument matches the sparsity structure of the original expression, as demonstrated below.

```
using Symbolics, SparseArrays, LinearAlgebra
N = 10
_S = sprand(N, N, 0.1)
_Q = sprand(N, N, 0.1)
F(z) = [ # some complicated sparse-amenable function
    _S * z
    _Q * z.^2
]
Symbolics.@variables z[1:N]
sj = Symbolics.sparsejacobian(F(z), z) # sparse Jacobian
f_expr = build_function(sj, z)
rows, cols, _ = findnz(sj)
out = sparse(rows, cols, zeros(length(cols)), size(sj)...) # pre-allocate with the correct structure
myf = eval(last(f_expr))
myf(out, rand(N)) # note that out matches the sparsity structure of sj
out
```

```
20×10 SparseArrays.SparseMatrixCSC{Float64, Int64} with 19 stored entries:
⎡⡀⠁⠁⢑⠀⎤
⎢⠈⠂⠊⠀⠅⎥
⎢⢠⠀⠀⠀⠂⎥
⎢⠀⠠⠠⠀⠀⎥
⎣⠀⠁⠀⠠⠀⎦
```