
grad, div, curl definitions #210

Open
KnutAM opened this issue Oct 20, 2023 · 1 comment

Comments


KnutAM commented Oct 20, 2023

In the literature, there are different definitions of divergence and curl for second-order and higher-order tensor fields. With the introduction of 3rd order Tensors in #205, we need to define these clearly. This issue is meant to collect an overview of the different sources, with the goal of deciding which definitions should be used.

Let's denote a general second-order tensor as $\boldsymbol{S}$ for the discussion. Note that we always assume an orthonormal, right-handed Cartesian coordinate system.

Just comment below with additional definitions and references, and I'll try to keep the tables updated (ping me on Slack if I forget).

Gradient

AFAIK, this is not problematic (correct me if I'm wrong); there are just different notations (which can be confusing in itself), i.e.
$$\mathrm{grad}(\boldsymbol{S}) = \nabla \boldsymbol{S} = \boldsymbol{S} \otimes \nabla = \frac{\partial S_{ij}}{\partial x_k} \boldsymbol{e}_i\otimes\boldsymbol{e}_j\otimes\boldsymbol{e}_k$$
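For reference, a minimal Julia sketch of this ordering convention, illustrated for a hypothetical vector field `f` (the second-order case is analogous): the last index of the gradient is the derivative direction.

```julia
using Tensors

# hypothetical vector field, just for illustration
f(x::Vec{3}) = Vec{3}((x[1]^2, x[1] * x[2], x[3]))

x = Vec{3}((1.0, 2.0, 3.0))
G = gradient(f, x)   # G[i, j] = ∂f_i/∂x_j

G[2, 1]   # ∂(x₁x₂)/∂x₁ = x₂ = 2.0
G[1, 2]   # ∂(x₁²)/∂x₂ = 0.0
```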

Divergence

| Tensor form | Index form | Sources | Comment |
| --- | --- | --- | --- |
| $\nabla \cdot \boldsymbol{S}$ | $d_i = \frac{\partial S_{ji}}{\partial x_j}$ | [1] | |
| $\boldsymbol{S} \cdot \nabla$ | $d_i = \frac{\partial S_{ij}}{\partial x_j}$ | [2] (2.134), [3] (2.3.11), [4] (2.112) | Common in mechanics? |
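To make the difference concrete, here is a minimal sketch (hypothetical field `S`; it assumes that `gradient` of a second-order field returns a third-order tensor with components $\partial S_{ij}/\partial x_k$, cf. #205) contrasting the two conventions:

```julia
using Tensors

# hypothetical second-order tensor field, just for illustration
S(x::Vec{3}) = Tensor{2,3}((i, j) -> x[i] * x[j]^2)

x = rand(Vec{3})
G = gradient(S, x)   # assumed: G[i, j, k] = ∂S_ij/∂x_k

d_left  = Vec{3}(i -> sum(G[j, i, j] for j in 1:3))   # ∇ ⋅ S:  d_i = ∂S_ji/∂x_j  [1]
d_right = Vec{3}(i -> sum(G[i, j, j] for j in 1:3))   # S ⋅ ∇:  d_i = ∂S_ij/∂x_j  [2]–[4]
```

For non-symmetric $\boldsymbol{S}$, `d_left` and `d_right` generally differ, which is why the choice matters.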

Curl

Here it is important that our definition fulfills $\mathrm{curl}(\mathrm{grad}(\boldsymbol{v}))=\boldsymbol{0}$ for a vector field $\boldsymbol{v}$ (the curl of a gradient vanishes); there exist definitions in the literature that don't. As a precursor: we haven't defined the cross product for 2nd-order tensors, so for the discussion, let's define the cross products with a vector $\boldsymbol{v}$ as

$$\begin{align*} \boldsymbol{S}\times\boldsymbol{v} &= S_{ij} v_k \boldsymbol{e}_i \otimes \boldsymbol{e}_j \times \boldsymbol{e}_k = S_{ij} v_k \boldsymbol{e}_i \otimes [\varepsilon_{jkm} \boldsymbol{e}_m] = S_{ij} v_k \varepsilon_{jkm} \boldsymbol{e}_i \otimes \boldsymbol{e}_m \\ \boldsymbol{v}\times \boldsymbol{S} &= v_i S_{jk} \boldsymbol{e}_i \times \boldsymbol{e}_j \otimes \boldsymbol{e}_k = v_i S_{jk} [\varepsilon_{ijm} \boldsymbol{e}_m] \otimes \boldsymbol{e}_k = v_i S_{jk} \varepsilon_{ijm} \boldsymbol{e}_m \otimes \boldsymbol{e}_k \end{align*}$$
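Since this product is not (yet) available in Tensors.jl, a minimal sketch of the two definitions above could look like the following, with a hypothetical `tensor_cross` helper and a plain Levi-Civita function:

```julia
using Tensors

# Levi-Civita symbol (plain helper for this sketch)
function ϵ(i, j, k)
    (i, j, k) in ((1, 2, 3), (2, 3, 1), (3, 1, 2)) && return 1.0
    (i, j, k) in ((3, 2, 1), (2, 1, 3), (1, 3, 2)) && return -1.0
    return 0.0
end

# S × v:  (S × v)_im = S_ij v_k ε_jkm
tensor_cross(S::Tensor{2,3}, v::Vec{3}) =
    Tensor{2,3}((i, m) -> sum(S[i, j] * v[k] * ϵ(j, k, m) for j in 1:3, k in 1:3))

# v × S:  (v × S)_mk = v_i S_jk ε_ijm
tensor_cross(v::Vec{3}, S::Tensor{2,3}) =
    Tensor{2,3}((m, k) -> sum(v[i] * S[j, k] * ϵ(i, j, m) for i in 1:3, j in 1:3))
```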
| Tensor form | Index form | Sources | Comment |
| --- | --- | --- | --- |
| $-\boldsymbol{S} \times \nabla$ | $d_{ij} = \varepsilon_{opj}\frac{\partial S_{ip}}{\partial x_o}$ | [3] (2.3.18) | |
| $\nabla \times \boldsymbol{S}$ | $d_{ij} = \varepsilon_{opi}\frac{\partial S_{jp}}{\partial x_o}$ | [1] | From the definition of the cross product, this should be $\nabla \times \boldsymbol{S}^\mathrm{T}$ |

where $\varepsilon_{ijk}$ is the Levi-Civita symbol.
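A corresponding sketch comparing the two curl conventions in the table (same assumptions and hypothetical helpers as in the sketches above):

```julia
using Tensors

# Levi-Civita symbol (plain helper for this sketch)
function ϵ(i, j, k)
    (i, j, k) in ((1, 2, 3), (2, 3, 1), (3, 1, 2)) && return 1.0
    (i, j, k) in ((3, 2, 1), (2, 1, 3), (1, 3, 2)) && return -1.0
    return 0.0
end

# hypothetical second-order tensor field, just for illustration
S(x::Vec{3}) = Tensor{2,3}((i, j) -> x[i]^2 * x[j])

x = rand(Vec{3})
G = gradient(S, x)   # assumed: G[i, p, o] = ∂S_ip/∂x_o

# -S × ∇ [3]:  d_ij = ε_opj ∂S_ip/∂x_o
c_right = Tensor{2,3}((i, j) -> sum(ϵ(o, p, j) * G[i, p, o] for o in 1:3, p in 1:3))
# ∇ × S [1]:   d_ij = ε_opi ∂S_jp/∂x_o
c_left  = Tensor{2,3}((i, j) -> sum(ϵ(o, p, i) * G[j, p, o] for o in 1:3, p in 1:3))
```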

Sources

[1] https://en.wikipedia.org/wiki/Tensor_derivative_(continuum_mechanics)
[2] Bonet and Wood (2008)
[3] Rubin (2000)
[4] Itskov (2015)


KnutAM commented Oct 24, 2023

Just pointing out that, if we follow the definitions of the tensor products, we can support the following operations

julia> using Tensors

julia> import Tensors: ∇

julia> f(x::Vec{3}) = Vec{3}((norm(x), sum(x), prod(x)))
f (generic function with 1 method)

julia> v = rand(Vec{3});

julia> divergence(f, v)
2.0866914680779223

julia> (f ⋅ ∇)(v)
2.0866914680779223

julia> gradient(f, v)
3×3 Tensor{2, 3, Float64, 9}:
 0.630406  0.673204  0.386502
 1.0       1.0       1.0
 0.279749  0.261964  0.456285

julia> (f ⊗ ∇)(v)
3×3 Tensor{2, 3, Float64, 9}:
 0.630406  0.673204  0.386502
 1.0       1.0       1.0
 0.279749  0.261964  0.456285

julia> curl(f, v)
3-element Vec{3, Float64}:
 -0.738035882169441
  0.10675354891495425
  0.3267956925012888

julia> (f × ∇)(v)
3-element Vec{3, Float64}:
 -0.738035882169441
  0.10675354891495425
  0.3267956925012888

with

import LinearAlgebra  # needed to extend dot and cross

LinearAlgebra.dot(f::Function, ::typeof(∇)) = Base.Fix1(divergence, f)    # d_{⋯} = ∂f(x)_{⋯i}/∂xᵢ
LinearAlgebra.cross(f::Function, ::typeof(∇)) = Base.Fix1(curl, f)        # d_{⋯j} = εₒₚⱼ ∂f(x)_{⋯p}/∂xₒ
Tensors.otimes(f::Function, ::typeof(∇)) = Base.Fix1(gradient, f)         # d_{⋯jk} = ∂f(x)_{⋯j}/∂xₖ
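For context on the pattern: `Base.Fix1(g, f)` returns a callable with `f` fixed as the first argument of `g`, so `(f ⋅ ∇)`, `(f × ∇)` and `(f ⊗ ∇)` become functions of the evaluation point that simply forward to `divergence`, `curl` and `gradient`, i.e. the nabla appears as a right operator, matching the tensor-form notation in the tables above.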
