CC BY 4.0 license · Open Access · Published by De Gruyter · December 31, 2021

Matrix representation of a cross product and related curl-based differential operators in all space dimensions

  • Peter Lewintan
From the journal Open Mathematics

Abstract

A higher dimensional generalization of the cross product is associated with an adequate matrix multiplication. This index-free view allows for a better understanding of the underlying algebraic structures, among which are generalizations of Grassmann’s, Jacobi’s and Room’s identities. Moreover, such a view provides a higher dimensional analogue of the decomposition of the vector Laplacian, which itself gives an explicit index-free Helmholtz decomposition in arbitrary dimensions $n \ge 2$.

MSC 2010: 15A24; 15A69; 47A06; 35J05

1 Introduction

The interplay between different differential operators is at the basis not only of pure analysis but also of many applied mathematical considerations. One possibility is to study, instead of the properties of a linear homogeneous differential operator with constant coefficients

(1.1a) $\mathcal{A} = \sum_{|\alpha|=k} A_\alpha\,\partial^\alpha$,

where $\alpha = (\alpha_1,\ldots,\alpha_n)^T\in\mathbb{N}_0^n$ is a multi-index of length $|\alpha| := \alpha_1+\cdots+\alpha_n$, $\partial^\alpha := \partial_1^{\alpha_1}\cdots\partial_n^{\alpha_n}$ and $A_\alpha\in\mathbb{R}^{N\times m}$, the properties of its symbol

(1.1b) $\mathbb{A}(b) := \sum_{|\alpha|=k} A_\alpha\, b^\alpha \in\mathbb{R}^{N\times m}$,

where we use the notation $b^\alpha = b_1^{\alpha_1}\cdots b_n^{\alpha_n}$ for $b\in\mathbb{R}^n$. Note that $\mathcal{A}: C_c^\infty(\Omega,\mathbb{R}^m)\to C_c^\infty(\Omega,\mathbb{R}^N)$ with $\Omega\subseteq\mathbb{R}^n$ open, and for all $a\in C_c^\infty(\Omega,\mathbb{R}^m)$ we also obtain the expression $\mathcal{A}\,a = \mathbb{A}(\mathrm{D}a)$ with $\mathbb{A}\in\operatorname{Lin}(\mathbb{R}^{m\times n},\mathbb{R}^N)$. The approach of viewing, and algebraically operating with, the vector differential operator $\nabla$ in the manner of a vector is also referred to as vector calculus or formal calculation.

An example of such a differential operator is the derivative $\mathrm{D}$ itself, but also $\operatorname{div}$, $\operatorname{curl}$, $\Delta$ or $\operatorname{inc}$. One of the most prominent relations in vector calculus is $\operatorname{curl}\nabla\zeta \equiv 0$ for scalar fields $\zeta\in C_c^\infty(\Omega)$, $\Omega\subseteq\mathbb{R}^3$ open, which, from an algebraic point of view, reads $b\times b = 0$ for all $b\in\mathbb{R}^3$ (where a scalar factor can be and is omitted).

In this paper, we take a closer look at a higher dimensional analogue of the curl, or rather the underlying generalized cross product. An extension of the usual cross product of vectors in $\mathbb{R}^3$ to vectors in $\mathbb{R}^n$ depends on which properties are to be fulfilled. The three basic properties of the vector product are the linearity in both arguments, that the vector $a\times b$ is perpendicular to both $a,b\in\mathbb{R}^3$ (and thus belongs to the same space), and that its length is the area of the parallelogram spanned by $a$ and $b$. Gibbs used these properties also to define the cross product, see [1, Chapter II]. It turns out that such a vector product exists only in three and seven dimensions, cf. [2]. However, the seven-dimensional vector product does not satisfy Jacobi’s identity but rather a generalization of it, namely the Malcev identity, cf. [3, p. 279] and the references at the end of the section therein. We do not follow these constructions here and instead generalize the cross product to all dimensions by omitting one of its basic properties. These considerations are usually carried out using coordinates, i.e., in index notation. Here, however, we are concerned with their matrix representation, which provides a better understanding of the underlying algebraic structures. Such a view has already proved very useful in extending Korn inequalities for incompatible tensor fields to higher dimensions, cf. [4], where first results in these matrix representations were obtained. In the present paper, we catch up on the underlying algebraic structures, among which are generalizations of Grassmann’s, Jacobi’s and Room’s identities. Moreover, such a view provides a higher dimensional analogue of the decomposition of the vector Laplacian, which itself gives an explicit index-free Helmholtz decomposition in arbitrary dimensions $n\ge2$.

2 Notations

As usual, $\cdot\otimes\cdot$ and $\langle\cdot,\cdot\rangle$ denote the dyadic and the scalar product, respectively, and we write a lower dot $\cdot$ to highlight the multiplication of a scalar with a vector or a matrix. The space of symmetric $(n\times n)$-matrices is denoted by $\operatorname{Sym}(n)$ and the space of skew-symmetric $(n\times n)$-matrices by $\mathfrak{so}(n)$. We use lower-case Greek letters to denote scalars, lower-case Latin letters to denote column vectors and upper-case Latin letters to denote matrices, with the exception of the dimensions: if not otherwise stated we have $n,m,N\in\mathbb{N}$ and $n\ge2$. The identity matrix is denoted by $I_n$. For the symmetric part, the skew-symmetric part and the transpose of a matrix $P$ we write $\operatorname{sym}P$, $\operatorname{skew}P$ and $P^T$, respectively.

3 Algebraic view of a generalized cross product

3.1 Inductive introduction

From an algebraic point of view the components of the usual cross product $a\times b$ are of the form $\alpha_i\beta_j - \alpha_j\beta_i$ for $1\le i<j\le 3$, sorted (and multiplied by $-1$) in such a way that the resulting vector is perpendicular to both $a$ and $b$. For a general $n\in\mathbb{N}$ we have $\frac{n(n-1)}{2}$ combinations of the form $\alpha_i\beta_j-\alpha_j\beta_i$ with $1\le i<j\le n$, and we array them as the column vector

(3.1) $\begin{pmatrix} \alpha_1\beta_2-\alpha_2\beta_1 \\ \alpha_1\beta_3-\alpha_3\beta_1 \\ \alpha_2\beta_3-\alpha_3\beta_2 \\ \alpha_1\beta_4-\alpha_4\beta_1 \\ \alpha_2\beta_4-\alpha_4\beta_2 \\ \alpha_3\beta_4-\alpha_4\beta_3 \\ \vdots \end{pmatrix}$ for $a = (\alpha_i)_{i=1,\ldots,n},\; b = (\beta_i)_{i=1,\ldots,n}\in\mathbb{R}^n$.

This is a vector in $\mathbb{R}^{\frac{n(n-1)}{2}}$ and only for $n=3$ does it lie in the same space as the vectors $a$ and $b$. More precisely, using the notation

(3.2) $b = (\bar b, \beta_n)^T\in\mathbb{R}^n$ with $\bar b\in\mathbb{R}^{n-1}$,

we introduce the generalized cross product $\times_n : \mathbb{R}^n\times\mathbb{R}^n\to\mathbb{R}^{\frac{n(n-1)}{2}}$ inductively by

(3.3) $a\times_n b := \begin{pmatrix} \bar a\times_{n-1}\bar b \\ \beta_n\,\bar a - \alpha_n\,\bar b \end{pmatrix}\in\mathbb{R}^{\frac{n(n-1)}{2}}, \quad\text{where}\quad \begin{pmatrix}\alpha_1\\\alpha_2\end{pmatrix}\times_2\begin{pmatrix}\beta_1\\\beta_2\end{pmatrix} := \alpha_1\beta_2-\alpha_2\beta_1$,

wherefrom the bilinearity and anti-commutativity follow immediately. We show in Section 3.3 that this generalized cross product $\times_n$ also satisfies the area property:

$\|a\times_n b\|^2 = \|a\|^2\,\|b\|^2 - \langle a,b\rangle^2 \quad\forall\, a,b\in\mathbb{R}^n$.
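The inductive definition (3.3) and the area property can be checked numerically. The following is a minimal sketch, not part of the paper; the function name `cross_n` and the use of NumPy are our own choices:

```python
import numpy as np

def cross_n(a, b):
    """Generalized cross product (3.3): R^n x R^n -> R^(n(n-1)/2),
    defined inductively over the space dimension n."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    if len(a) == 2:
        # base case: a x_2 b = a_1 b_2 - a_2 b_1
        return np.array([a[0] * b[1] - a[1] * b[0]])
    # split off the last components: a = (a_bar, alpha_n), b = (b_bar, beta_n)
    return np.concatenate([cross_n(a[:-1], b[:-1]),
                           b[-1] * a[:-1] - a[-1] * b[:-1]])
```

For $n=3$ the components are exactly those listed in (3.1), i.e., a signed permutation of the classical cross product.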

Remark 3.1

The anti-commutativity of the (usual or generalized) cross product is a consequence of the area property. Indeed, let $n,d\in\mathbb{N}$, $n\ge2$, and let $\underline{\times} : \mathbb{R}^n\times\mathbb{R}^n\to\mathbb{R}^d$ be a bilinear map which satisfies the area property

(3.4) $\|a\,\underline{\times}\,b\|^2 = \|a\|^2\|b\|^2 - \langle a,b\rangle^2 \quad\forall\, a,b\in\mathbb{R}^n$.

Then for $a=b$ we obtain:

(3.5) $\|b\,\underline{\times}\,b\|^2 = \|b\|^2\|b\|^2 - \langle b,b\rangle^2 = 0 \quad\Rightarrow\quad b\,\underline{\times}\,b = 0 \quad\forall\, b\in\mathbb{R}^n$.

Linearizing the last equality leads to

(3.6) $0 = (a+b)\,\underline{\times}\,(a+b) = a\,\underline{\times}\,a + a\,\underline{\times}\,b + b\,\underline{\times}\,a + b\,\underline{\times}\,b \quad\Rightarrow\quad a\,\underline{\times}\,b = -\,b\,\underline{\times}\,a \quad\forall\, a,b\in\mathbb{R}^n$.

Furthermore, in case $d=n$, we call $\underline{\times}$ a vector product to emphasize that the vector $a\,\underline{\times}\,b$ lies in the same space as $a$ and $b$. In this situation, we can further talk about orthogonality of the vector $a\,\underline{\times}\,b$ to both $a$ and $b$. Massey [2] showed that assuming these three properties, i.e., bilinearity, the area property and orthogonality, a vector product exists only in dimension $n=3$ or $n=7$. However, there are many cross products, different from each other, depending on the properties one requires to hold. In the present paper, we drop the orthogonality condition since we consider the case $d = \frac{n(n-1)}{2}$ and introduce in (3.3) the generalized cross product by induction over the space dimension $n$. This is equivalent to the coordinate-wise expression from (3.1) but allows for a better understanding of the algebraic properties of the generalized cross product $\times_n$.

3.2 Relation to skew-symmetric matrices

To establish the connection of the generalized cross product $a\times_n b$ to the entries of $\operatorname{skew}(a\otimes b)$ we start with the following bijection $a_n : \mathfrak{so}(n)\to\mathbb{R}^{\frac{n(n-1)}{2}}$ given by

(3.7a) $a_n(A) := (\alpha_{12}, \alpha_{13}, \alpha_{23}, \ldots, \alpha_{1n}, \ldots, \alpha_{(n-1)n})^T$

for $A = (\alpha_{ij})_{i,j=1,\ldots,n}\in\mathfrak{so}(n)$, as well as its inverse $A_n : \mathbb{R}^{\frac{n(n-1)}{2}}\to\mathfrak{so}(n)$, so that

(3.7b) $A_n(a_n(A)) = A \quad\forall\, A\in\mathfrak{so}(n)$ and $a_n(A_n(a)) = a \quad\forall\, a\in\mathbb{R}^{\frac{n(n-1)}{2}}$,

and, for $a = (\alpha_1,\ldots,\alpha_{\frac{n(n-1)}{2}})^T\in\mathbb{R}^{\frac{n(n-1)}{2}}$, in coordinates it looks like

(3.8) $A_n(a) = \begin{pmatrix} 0 & \alpha_1 & \alpha_2 & \alpha_4 & \cdots \\ -\alpha_1 & 0 & \alpha_3 & \alpha_5 & \cdots \\ -\alpha_2 & -\alpha_3 & 0 & \alpha_6 & \\ -\alpha_4 & -\alpha_5 & -\alpha_6 & 0 & \ddots \\ \vdots & & & \ddots & 0 \end{pmatrix}$.

Thus, the generalized cross product $a\times_n b$ can be written as

(3.9a) $a\times_n b = a_n(a\otimes b - b\otimes a) = 2\, a_n(\operatorname{skew}(a\otimes b))$,

or, equivalently,

(3.9b) $A_n(a\times_n b) = a\otimes b - b\otimes a \quad\forall\, a,b\in\mathbb{R}^n$.
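The maps $a_n$, $A_n$ and the relation (3.9) lend themselves to a direct numerical check. The sketch below is ours (NumPy, 0-based indices); it reads off the upper-triangular entries in the pair order of (3.1):

```python
import numpy as np

def cross_n(a, b):
    # generalized cross product (3.3), defined inductively
    if len(a) == 2:
        return np.array([a[0] * b[1] - a[1] * b[0]])
    return np.concatenate([cross_n(a[:-1], b[:-1]),
                           b[-1] * a[:-1] - a[-1] * b[:-1]])

def a_n(A):
    # (3.7a): upper-triangular entries in the pair order (1,2), (1,3), (2,3), (1,4), ...
    n = A.shape[0]
    return np.array([A[i, j] for j in range(1, n) for i in range(j)])

def A_n(a, n):
    # (3.8): the inverse map R^(n(n-1)/2) -> so(n)
    A = np.zeros((n, n))
    k = 0
    for j in range(1, n):
        for i in range(j):
            A[i, j], A[j, i] = a[k], -a[k]
            k += 1
    return A
```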

3.3 Lagrange identity

In three dimensions, Lagrange’s identity reads, in terms of the usual cross product and the scalar product,

(3.10) $\langle a\times b, c\times d\rangle = \langle a,c\rangle\langle b,d\rangle - \langle a,d\rangle\langle b,c\rangle \quad\forall\, a,b,c,d\in\mathbb{R}^3$,

and for $c=a$ and $d=b$ becomes

(3.11) $\|a\times b\|^2 = \|a\|^2\|b\|^2 - \langle a,b\rangle^2 \quad\forall\, a,b\in\mathbb{R}^3$,

meaning that the length of the vector $a\times b\in\mathbb{R}^3$ is equal to the area of the parallelogram spanned by the vectors $a,b\in\mathbb{R}^3$.

In higher dimensions, the inductive definition (3.3) can be used to directly deduce an analogue of Lagrange’s identity, namely:

(3.12) $\langle a\times_n b, c\times_n d\rangle = \langle a,c\rangle\langle b,d\rangle - \langle a,d\rangle\langle b,c\rangle \quad\forall\, a,b,c,d\in\mathbb{R}^n$.

Indeed, in the dimension $n=2$ we have

$\left\langle \begin{pmatrix}\alpha_1\\\alpha_2\end{pmatrix}\times_2\begin{pmatrix}\beta_1\\\beta_2\end{pmatrix}, \begin{pmatrix}\gamma_1\\\gamma_2\end{pmatrix}\times_2\begin{pmatrix}\delta_1\\\delta_2\end{pmatrix}\right\rangle = (\alpha_1\beta_2-\alpha_2\beta_1)(\gamma_1\delta_2-\gamma_2\delta_1) = \alpha_1\beta_2\gamma_1\delta_2 + \alpha_2\beta_1\gamma_2\delta_1 - \alpha_1\beta_2\gamma_2\delta_1 - \alpha_2\beta_1\gamma_1\delta_2 = (\alpha_1\gamma_1+\alpha_2\gamma_2)(\beta_1\delta_1+\beta_2\delta_2) - (\alpha_1\delta_1+\alpha_2\delta_2)(\beta_1\gamma_1+\beta_2\gamma_2) = \left\langle\begin{pmatrix}\alpha_1\\\alpha_2\end{pmatrix},\begin{pmatrix}\gamma_1\\\gamma_2\end{pmatrix}\right\rangle\left\langle\begin{pmatrix}\beta_1\\\beta_2\end{pmatrix},\begin{pmatrix}\delta_1\\\delta_2\end{pmatrix}\right\rangle - \left\langle\begin{pmatrix}\alpha_1\\\alpha_2\end{pmatrix},\begin{pmatrix}\delta_1\\\delta_2\end{pmatrix}\right\rangle\left\langle\begin{pmatrix}\beta_1\\\beta_2\end{pmatrix},\begin{pmatrix}\gamma_1\\\gamma_2\end{pmatrix}\right\rangle.$

Furthermore, with $a=(\bar a,\alpha_n)^T$, $b=(\bar b,\beta_n)^T$, $c=(\bar c,\gamma_n)^T$, $d=(\bar d,\delta_n)^T$ we obtain on one hand

$\langle a\times_n b, c\times_n d\rangle = \left\langle \begin{pmatrix}\bar a\times_{n-1}\bar b\\ \beta_n\,\bar a-\alpha_n\,\bar b\end{pmatrix}, \begin{pmatrix}\bar c\times_{n-1}\bar d\\ \delta_n\,\bar c-\gamma_n\,\bar d\end{pmatrix}\right\rangle = \langle \bar a\times_{n-1}\bar b, \bar c\times_{n-1}\bar d\rangle + \beta_n\delta_n\langle\bar a,\bar c\rangle + \alpha_n\gamma_n\langle\bar b,\bar d\rangle - \beta_n\gamma_n\langle\bar a,\bar d\rangle - \alpha_n\delta_n\langle\bar b,\bar c\rangle$,

and on the other hand:

$\langle a,c\rangle\langle b,d\rangle - \langle a,d\rangle\langle b,c\rangle = (\langle\bar a,\bar c\rangle+\alpha_n\gamma_n)(\langle\bar b,\bar d\rangle+\beta_n\delta_n) - (\langle\bar a,\bar d\rangle+\alpha_n\delta_n)(\langle\bar b,\bar c\rangle+\beta_n\gamma_n) = \langle\bar a,\bar c\rangle\langle\bar b,\bar d\rangle - \langle\bar a,\bar d\rangle\langle\bar b,\bar c\rangle + \beta_n\delta_n\langle\bar a,\bar c\rangle + \alpha_n\gamma_n\langle\bar b,\bar d\rangle - \beta_n\gamma_n\langle\bar a,\bar d\rangle - \alpha_n\delta_n\langle\bar b,\bar c\rangle$,

so that (3.12) follows by induction over $n\in\mathbb{N}$, $n\ge2$. Especially, for $c=a$ and $d=b$ we obtain for the squared norm of the generalized cross product

(3.13) $\|a\times_n b\|^2 \overset{(3.12)}{=} \|a\|^2\|b\|^2 - \langle a,b\rangle^2 \quad\forall\, a,b\in\mathbb{R}^n$,

meaning that the length of the vector $a\times_n b\in\mathbb{R}^{\frac{n(n-1)}{2}}$ is equal to the area of the parallelogram spanned by the vectors $a,b\in\mathbb{R}^n$.

In particular, two (non-zero) vectors $a,b\in\mathbb{R}^n$ are linearly dependent (and thus parallel) if and only if $a\times_n b = 0$.
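The generalized Lagrange identity (3.12) and the parallelism criterion can be probed numerically; the following sketch (ours) re-declares the implementation of (3.3) so that it is self-contained:

```python
import numpy as np

def cross_n(a, b):
    # generalized cross product (3.3), defined inductively
    if len(a) == 2:
        return np.array([a[0] * b[1] - a[1] * b[0]])
    return np.concatenate([cross_n(a[:-1], b[:-1]),
                           b[-1] * a[:-1] - a[-1] * b[:-1]])

def lagrange_gap(a, b, c, d):
    # difference of the two sides of (3.12); it should vanish identically
    return cross_n(a, b) @ cross_n(c, d) - ((a @ c) * (b @ d) - (a @ d) * (b @ c))
```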

3.4 Matrix representation

It is well known that an identification of the usual cross product × with an adequate matrix multiplication facilitates some of the common proofs in vector algebra and allows one to extend the cross product of vectors to a cross product of a vector and a matrix, cf. [5,6,7, 8,9].

Our next goal is to achieve a similar identification of the generalized cross product $\times_n$ with a corresponding matrix multiplication. Indeed, since for a fixed $a\in\mathbb{R}^n$ the operation $a\times_n\,\cdot$ is linear in the second component, there exists a unique matrix, denoted by $[a]_{\times_n}\in\mathbb{R}^{\frac{n(n-1)}{2}\times n}$, such that

(3.14) $a\times_n b = [a]_{\times_n}\, b \quad\forall\, b\in\mathbb{R}^n$.

In view of (3.3) the matrices $[\,\cdot\,]_{\times_n}$ can be characterized inductively, and for $a = (\bar a,\alpha_n)^T$ the matrix $[a]_{\times_n}$ has the form

(3.15) $[a]_{\times_n} = \begin{pmatrix} [\bar a]_{\times_{n-1}} & 0 \\ -\alpha_n\, I_{n-1} & \bar a \end{pmatrix}, \quad\text{where}\quad \left[\begin{pmatrix}\alpha_1\\\alpha_2\end{pmatrix}\right]_{\times_2} = (-\alpha_2,\ \alpha_1)$,

so that

(3.16) $\left[\begin{pmatrix}\alpha_1\\\alpha_2\\\alpha_3\end{pmatrix}\right]_{\times_3} = \begin{pmatrix} -\alpha_2 & \alpha_1 & 0 \\ -\alpha_3 & 0 & \alpha_1 \\ 0 & -\alpha_3 & \alpha_2 \end{pmatrix}$ etc.
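The inductive block structure (3.15) translates directly into code. A sketch (ours; `bracket` stands for $[\,\cdot\,]_{\times_n}$, with the inductive definition (3.3) repeated as a reference):

```python
import numpy as np

def bracket(a):
    """The matrix [a]_{x_n} of (3.15), built inductively,
    so that bracket(a) @ b equals a x_n b, cf. (3.14)."""
    a = np.asarray(a, dtype=float)
    n = len(a)
    if n == 2:
        return np.array([[-a[1], a[0]]])
    rows_top = (n - 1) * (n - 2) // 2
    top = np.hstack([bracket(a[:-1]), np.zeros((rows_top, 1))])
    bottom = np.hstack([-a[-1] * np.eye(n - 1), a[:-1].reshape(-1, 1)])
    return np.vstack([top, bottom])

def cross_n(a, b):
    # reference implementation of (3.3) for comparison
    if len(a) == 2:
        return np.array([a[0] * b[1] - a[1] * b[0]])
    return np.concatenate([cross_n(a[:-1], b[:-1]),
                           b[-1] * a[:-1] - a[-1] * b[:-1]])
```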

Remark 3.2

The entries of the generalized cross product $a\times_3 b$, with $a,b\in\mathbb{R}^3$, are permutations (with a sign) of the entries of the classical cross product $a\times b$. Remember that the operation $a\times\,\cdot$ can be identified with the left multiplication by the following skew-symmetric matrix

(3.17) $\operatorname{Anti}(a) := \begin{pmatrix} 0 & -\alpha_3 & \alpha_2 \\ \alpha_3 & 0 & -\alpha_1 \\ -\alpha_2 & \alpha_1 & 0 \end{pmatrix}$,

which differs from the expression $[a]_{\times_3}$ for $a = (\alpha_1,\alpha_2,\alpha_3)^T$, cf. (3.16), and also from $A_3(a)$, which reads

(3.18) $A_3(a) = \begin{pmatrix} 0 & \alpha_1 & \alpha_2 \\ -\alpha_1 & 0 & \alpha_3 \\ -\alpha_2 & -\alpha_3 & 0 \end{pmatrix}$.

Thus, in three dimensions, it holds for the usual cross product

(3.19) $a\times b = \operatorname{Anti}(a)\, b \quad\forall\, a,b\in\mathbb{R}^3$.

Also the notations $T_a$, $W(a)$ or even $[a]_\times$ are used for $\operatorname{Anti}(a)$; the latter emphasizes that we deal with a skew-symmetric matrix. For the analysis and the properties of such matrices we refer to [5,6,7,8,9].

Remark 3.3

Also the seven-dimensional vector product $a\times\,\cdot$ for $a\in\mathbb{R}^7$ (which differs from $a\times_7\,\cdot$) can be represented by a left multiplication with a skew-symmetric matrix from $\mathfrak{so}(7)$, see [10,11,12,13].

3.5 Scalar triple product

In the case of the usual cross product in three dimensions, the scalar triple product remains unchanged under a circular shift of the three vectors (which all come from the same space):

(3.20) $\langle a, b\times c\rangle = \langle a, \operatorname{Anti}(b)\,c\rangle = -\langle \operatorname{Anti}(b)\,a, c\rangle = \langle a\times b, c\rangle \quad\forall\, a,b,c\in\mathbb{R}^3$.

Since $\times_n : \mathbb{R}^n\times\mathbb{R}^n\to\mathbb{R}^{\frac{n(n-1)}{2}}$, it does not make sense to think of an analogue of a scalar triple product with three vectors coming from the same vector space, but rather instead:

(3.21a) $\langle a, b\times_n c\rangle = \langle a, [b]_{\times_n}\, c\rangle = \langle [b]_{\times_n}^T\, a, c\rangle \quad\forall\, a\in\mathbb{R}^{\frac{n(n-1)}{2}},\ b,c\in\mathbb{R}^n$,

so that with $c=b$ we have:

(3.21b) $\langle [b]_{\times_n}^T\, a, b\rangle = 0 \quad\forall\, a\in\mathbb{R}^{\frac{n(n-1)}{2}},\ b\in\mathbb{R}^n$.

Note the slight difference from the case of the usual cross product: the latter can be represented by a left multiplication with a square skew-symmetric matrix, whereas the generalized cross product is represented by matrices of the form (3.15), which are neither square (except in the case $n=3$) nor skew-symmetric. These matrices $[\,\cdot\,]_{\times_n}$ also appear in further identities involving the generalized cross product and are very important in the subsequent considerations.

3.6 Grassmann identity

In three dimensions, the usual vector triple product fulfills

(3.22) $a\times(b\times c) = \operatorname{Anti}(a)(b\times c) = \operatorname{Anti}(a)\operatorname{Anti}(b)\, c = -(b\times c)\times a = -\operatorname{Anti}(b\times c)\, a \overset{\star}{=} \langle a,c\rangle\, b - \langle a,b\rangle\, c \quad\forall\, a,b,c\in\mathbb{R}^3$,

where the relation to scalar products, marked by $\star$, is referred to as the Grassmann identity.

However, in a generalization of the vector triple product we cannot expect the double appearance of the generalized cross product, but focus on the matrices $[\,\cdot\,]_{\times_n}$, as in the generalization of the scalar triple product. Thus, as a generalization of Grassmann’s identity we obtain for $a,b,c\in\mathbb{R}^n$

(3.23) $[a]_{\times_n}^T (b\times_n c) = \langle a,b\rangle\, c - \langle a,c\rangle\, b = (\langle b,a\rangle\, I_n - b\otimes a)\, c = (c\otimes b - b\otimes c)\, a \overset{(3.9)}{=} -A_n(b\times_n c)\, a \in\mathbb{R}^n$.

It remains to prove the first equality $(3.23)_1$. In the dimension $n=2$, we have

(3.24) $\begin{pmatrix}-\alpha_2\\\alpha_1\end{pmatrix}\left(\begin{pmatrix}\beta_1\\\beta_2\end{pmatrix}\times_2\begin{pmatrix}\gamma_1\\\gamma_2\end{pmatrix}\right) = \begin{pmatrix}-\alpha_2\\\alpha_1\end{pmatrix}(\beta_1\gamma_2-\beta_2\gamma_1) = \begin{pmatrix}\alpha_2\beta_2\gamma_1-\alpha_2\beta_1\gamma_2\\ \alpha_1\beta_1\gamma_2-\alpha_1\beta_2\gamma_1\end{pmatrix} = \left\langle\begin{pmatrix}\alpha_1\\\alpha_2\end{pmatrix},\begin{pmatrix}\beta_1\\\beta_2\end{pmatrix}\right\rangle\begin{pmatrix}\gamma_1\\\gamma_2\end{pmatrix} - \left\langle\begin{pmatrix}\alpha_1\\\alpha_2\end{pmatrix},\begin{pmatrix}\gamma_1\\\gamma_2\end{pmatrix}\right\rangle\begin{pmatrix}\beta_1\\\beta_2\end{pmatrix}.$

Furthermore, with $a=(\bar a,\alpha_n)^T$, $b=(\bar b,\beta_n)^T$, $c=(\bar c,\gamma_n)^T$ we obtain on one hand

(3.25) $[a]_{\times_n}^T(b\times_n c) = \begin{pmatrix} [\bar a]_{\times_{n-1}}^T & -\alpha_n\, I_{n-1} \\ 0 & \bar a^T \end{pmatrix}\begin{pmatrix} \bar b\times_{n-1}\bar c \\ \gamma_n\,\bar b - \beta_n\,\bar c \end{pmatrix} = \begin{pmatrix} [\bar a]_{\times_{n-1}}^T(\bar b\times_{n-1}\bar c) + \alpha_n\beta_n\,\bar c - \alpha_n\gamma_n\,\bar b \\ \gamma_n\,\langle\bar a,\bar b\rangle - \beta_n\,\langle\bar a,\bar c\rangle \end{pmatrix}$

and on the other hand:

(3.26) $\langle a,b\rangle\, c - \langle a,c\rangle\, b = \left\langle\begin{pmatrix}\bar a\\\alpha_n\end{pmatrix},\begin{pmatrix}\bar b\\\beta_n\end{pmatrix}\right\rangle\begin{pmatrix}\bar c\\\gamma_n\end{pmatrix} - \left\langle\begin{pmatrix}\bar a\\\alpha_n\end{pmatrix},\begin{pmatrix}\bar c\\\gamma_n\end{pmatrix}\right\rangle\begin{pmatrix}\bar b\\\beta_n\end{pmatrix} = \begin{pmatrix} \langle\bar a,\bar b\rangle\,\bar c - \langle\bar a,\bar c\rangle\,\bar b + \alpha_n\beta_n\,\bar c - \alpha_n\gamma_n\,\bar b \\ \langle\bar a,\bar b\rangle\,\gamma_n - \langle\bar a,\bar c\rangle\,\beta_n \end{pmatrix}$,

so that $(3.23)_1$ follows by induction over $n\in\mathbb{N}$, $n\ge2$.
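The generalized Grassmann identity (3.23) can be verified numerically; a sketch (ours, building $[\,\cdot\,]_{\times_n}$ via (3.15)):

```python
import numpy as np

def bracket(a):
    # [a]_{x_n} from (3.15)
    a = np.asarray(a, dtype=float)
    n = len(a)
    if n == 2:
        return np.array([[-a[1], a[0]]])
    top = np.hstack([bracket(a[:-1]), np.zeros(((n - 1) * (n - 2) // 2, 1))])
    bottom = np.hstack([-a[-1] * np.eye(n - 1), a[:-1].reshape(-1, 1)])
    return np.vstack([top, bottom])

def grassmann_gap(a, b, c):
    # difference of the outer sides of (3.23)_1; it vanishes identically
    lhs = bracket(a).T @ (bracket(b) @ c)
    rhs = (a @ b) * c - (a @ c) * b
    return lhs - rhs
```

The same routine also checks the generalized Jacobi identity (3.28a), since the latter follows by summing (3.23) over the cyclic permutations of the arguments.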

3.7 Jacobi identity

In three dimensions, the usual cross product satisfies Jacobi’s identity:

(3.27) $a\times(b\times c) + b\times(c\times a) + c\times(a\times b) = 0 \quad\forall\, a,b,c\in\mathbb{R}^3$,

which follows directly from the usual Grassmann identity (3.22) for the vector triple product. Similarly, having established the generalization of Grassmann’s identity involving the generalized cross product $\times_n$ in the previous section, we obtain the following generalization of Jacobi’s identity:

(3.28a) $[a]_{\times_n}^T(b\times_n c) + [b]_{\times_n}^T(c\times_n a) + [c]_{\times_n}^T(a\times_n b) \overset{(3.23)_1}{=} 0 \quad\forall\, a,b,c\in\mathbb{R}^n$,

or, equivalently:

(3.28b) $A_n(b\times_n c)\, a + A_n(c\times_n a)\, b + A_n(a\times_n b)\, c \overset{(3.23)_4}{=} 0$.

Surely, the relation (3.23) can also be used to obtain (3.12).

3.8 Cross product with a matrix

Furthermore, the generalized cross product can be written as

(3.29) $a\times_n b = -\,b\times_n a = -[b]_{\times_n}\, a = -(a^T [b]_{\times_n}^T)^T$.

This allows us to define a generalized cross product of a matrix $P\in\mathbb{R}^{m\times n}$ with a vector $b\in\mathbb{R}^n$ from the right, and of a vector $b\in\mathbb{R}^n$ with a matrix $B\in\mathbb{R}^{n\times m}$ from the left, where $m\in\mathbb{N}$, via

(3.30a) $P\times_n b := -P\,[b]_{\times_n}^T \in\mathbb{R}^{m\times\frac{n(n-1)}{2}}$, seen as a row-wise cross product,

and

(3.30b) $b\times_n B := [b]_{\times_n}\, B \in\mathbb{R}^{\frac{n(n-1)}{2}\times m}$, seen as a column-wise cross product,

and they are connected via

(3.30c) $(b\times_n B)^T = B^T [b]_{\times_n}^T = -\,B^T\times_n b \quad\forall\, B\in\mathbb{R}^{n\times m},\ b\in\mathbb{R}^n$.

So, especially for the identity matrix $P = I_n$ we obtain

(3.31) $I_n\times_n b = -[b]_{\times_n}^T$ and $b\times_n I_n = [b]_{\times_n}$.

Moreover, for $a\in\mathbb{R}^m$ and $b,c\in\mathbb{R}^n$ it follows that

(3.32a) $(a\otimes b)\times_n c = -\,a\, b^T [c]_{\times_n}^T = -\,a\otimes(c\times_n b) = a\otimes(b\times_n c)$,

and especially for $c=b$:

(3.32b) $(a\otimes b)\times_n b = 0$ for all $a\in\mathbb{R}^m$ and all $b\in\mathbb{R}^n$.

As a consequence, we obtain

(3.32c) $(b\otimes a)\times_n b \overset{(3.32b)}{=} 2\operatorname{sym}(a\otimes b)\times_n b = -2\operatorname{skew}(a\otimes b)\times_n b \overset{(3.32a)}{=} b\otimes(a\times_n b) \overset{(3.9)}{=} 2\, b\otimes a_n(\operatorname{skew}(a\otimes b))$ for all $a,b\in\mathbb{R}^n$.

3.9 Another vector triple

Already in the scalar triple product we came across the expression $[b]_{\times_n}^T\, a\in\mathbb{R}^n$. Hence, we may also consider the following vector triple product for $a\in\mathbb{R}^{\frac{n(n-1)}{2}}$ and $b,c\in\mathbb{R}^n$:

(3.33) $([b]_{\times_n}^T\, a)\times_n c = [\,[b]_{\times_n}^T\, a\,]_{\times_n}\, c = -[c]_{\times_n}[b]_{\times_n}^T\, a = ([c]_{\times_n}\times_n b)\, a \in\mathbb{R}^{\frac{n(n-1)}{2}}$.

Again, the corresponding relations to (3.23) and (3.33) for the usual cross product coincide, whereas the situation is different for the generalized cross product due to the non-symmetry of the matrices $[\,\cdot\,]_{\times_n}$.

The inductive view $(3.15)_1$ of the matrix appearing in (3.33) shows for all $a,b\in\mathbb{R}^n$:

(3.34) $[a]_{\times_n}\times_n b = -[a]_{\times_n}[b]_{\times_n}^T = \begin{pmatrix} [\bar a]_{\times_{n-1}}\times_{n-1}\bar b & \beta_n\,[\bar a]_{\times_{n-1}} \\ \alpha_n\,[\bar b]_{\times_{n-1}}^T & -\,\bar a\otimes\bar b - \alpha_n\beta_n\, I_{n-1} \end{pmatrix} \in\mathbb{R}^{\frac{n(n-1)}{2}\times\frac{n(n-1)}{2}}$,

and especially for $a=b$:

(3.35) $[b]_{\times_n}\times_n b = -[b]_{\times_n}[b]_{\times_n}^T \in \operatorname{Sym}\left(\tfrac{n(n-1)}{2}\right)$.

Moreover, we may also consider the following matrix multiplication:

(3.36) $P\,[b]_{\times_n} \in\mathbb{R}^{m\times n}$ for $P\in\mathbb{R}^{m\times\frac{n(n-1)}{2}}$,

and, as in (3.30), related by transposition also $[b]_{\times_n}^T(\,\cdot\,)$ for an $\frac{n(n-1)}{2}\times m$-matrix.

3.10 Room identity

Surely, the considerations in the previous subsections were inspired by the corresponding relations known for the usual cross product. So, from the usual Grassmann identity (3.22) one can deduce the usual Jacobi (3.27) and Lagrange (3.10) identities. Moreover, the usual Grassmann identity (3.22) for the vector triple product in three dimensions also allows one to conclude that

(3.37) $\operatorname{Anti}(a)\operatorname{Anti}(b) = \operatorname{Anti}(a)\times b = b\otimes a - \langle a,b\rangle\, I_3 \quad\forall\, a,b\in\mathbb{R}^3$.

This algebraic relation is already contained in [5, p. 691 (ii)]. For this reason, let us call it the Room identity. The relation (3.37) turned out to be very important also from an application point of view, cf. [9,14] and the references therein.

Returning to the $n$-dimensional case, we have for arbitrary $a,b\in\mathbb{R}^n$:

(3.38) $[a]_{\times_n}^T[b]_{\times_n}\, x = [a]_{\times_n}^T(b\times_n x) \overset{(3.23)}{=} (\langle b,a\rangle\, I_n - b\otimes a)\, x \quad\forall\, x\in\mathbb{R}^n$,

so that as an analogue of Room’s identity it follows that

(3.39) $[a]_{\times_n}^T[b]_{\times_n} = \langle b,a\rangle\, I_n - b\otimes a \in\mathbb{R}^{n\times n} \quad\forall\, a,b\in\mathbb{R}^n$,

and especially for $a=b$:

(3.40) $[b]_{\times_n}^T[b]_{\times_n} = \|b\|^2\, I_n - b\otimes b \in\operatorname{Sym}(n)$.

Note that the minus sign is missing in the generalized Room identity (3.39) due to the lack of skew-symmetry of the matrix $[a]_{\times_n}^T$.

Interchanging the roles of $a$ and $b$ in (3.39) we further deduce that

(3.41) $[a]_{\times_n}^T[b]_{\times_n} - [b]_{\times_n}^T[a]_{\times_n} = a\otimes b - b\otimes a \overset{(3.9)}{=} A_n(a\times_n b)$.

Since $\operatorname{tr}(a\otimes b) = \langle a,b\rangle$, the expression (3.39) shows that the entries of $[a]_{\times_n}^T[b]_{\times_n}$ are linear combinations of the entries of the dyadic product $a\otimes b$. Also the converse holds true:

(3.42) $b\otimes a = \frac{\operatorname{tr}([a]_{\times_n}^T[b]_{\times_n})}{n-1}\, I_n - [a]_{\times_n}^T[b]_{\times_n}$,

where we leave it as a short exercise for the reader to verify (e.g., by induction) that

(3.43) $\operatorname{tr}([a]_{\times_n}^T[b]_{\times_n}) = \langle [a]_{\times_n}, [b]_{\times_n}\rangle = (n-1)\,\langle a,b\rangle$.
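The generalized Room identity (3.39), the trace formula (3.43) and the recovery of the dyadic product (3.42) admit the following numerical check (our sketch):

```python
import numpy as np

def bracket(a):
    # [a]_{x_n} from (3.15)
    a = np.asarray(a, dtype=float)
    n = len(a)
    if n == 2:
        return np.array([[-a[1], a[0]]])
    top = np.hstack([bracket(a[:-1]), np.zeros(((n - 1) * (n - 2) // 2, 1))])
    bottom = np.hstack([-a[-1] * np.eye(n - 1), a[:-1].reshape(-1, 1)])
    return np.vstack([top, bottom])

def room(a, b):
    # left-hand side of the generalized Room identity (3.39)
    return bracket(a).T @ bracket(b)
```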

Recall that the matrix $\operatorname{Anti}(\,\cdot\,)$ associated with the usual cross product $\times$ in $\mathbb{R}^3$ is a (skew-symmetric) square matrix, while the matrix $[\,\cdot\,]_{\times_n}$ associated with the generalized cross product $\times_n$ is an $\frac{n(n-1)}{2}\times n$-matrix and therefore is square only in the case $n=3$. Hence, in contrast to the situation in Room’s identity (3.37), we may also interchange the matrices in its $n$-dimensional analogue (3.39), i.e., consider the expression in (3.34).

Returning to the usual Room identity we have

(3.44a) $\operatorname{Anti}(a)\times b = L(a\otimes b)$ and $a\otimes b = L(\operatorname{Anti}(a)\times b) \quad\forall\, a,b\in\mathbb{R}^3$,

denoting by $L(\,\cdot\,)$ a corresponding linear operator with constant coefficients, not necessarily the same at any two occurrences here and in the following.

On one hand, we associate with the matrix $\operatorname{Anti}(\,\cdot\,)$ a representation of the usual cross product. Room’s identity can be generalized to higher dimensions in three different ways. We have already seen in (3.39) and (3.42) the extension:

(3.44b) $[a]_{\times_n}^T[b]_{\times_n} = L(a\otimes b)$ and $a\otimes b = L([a]_{\times_n}^T[b]_{\times_n}) \quad\forall\, a,b\in\mathbb{R}^n$.

However, a result similar to (3.44a) also holds true for the generalized cross product of the matrix representation of the generalized cross product with a vector, see [4]:

(3.44c) $[a]_{\times_n}\times_n b = L(a\otimes b) \quad\forall\, a,b\in\mathbb{R}^n,\ n\ge2$, and $a\otimes b = L([a]_{\times_n}\times_n b) \quad\forall\, a,b\in\mathbb{R}^n,\ n\ge3$.

These relations also apply to $[a]_{\times_n}[b]_{\times_n}^T = -\,[a]_{\times_n}\times_n b$, which for $n=2$ is only a scalar, so that the last relation in (3.44c) is only valid for $n\ge3$.

On the other hand, Room’s identity in three dimensions can also be seen as an expression for the cross product of a skew-symmetric matrix with a vector:

(3.44a′) $A\times b = L(\operatorname{axl}(A)\otimes b)$ and $\operatorname{axl}(A)\otimes b = L(A\times b) \quad\forall\, A\in\mathfrak{so}(3),\ b\in\mathbb{R}^3$,

where $\operatorname{axl} : \mathfrak{so}(3)\to\mathbb{R}^3$ denotes the inverse of $\operatorname{Anti}(\,\cdot\,)$. It is interesting that a similar result holds true for $(n\times n)$-skew-symmetric matrices in all dimensions $n\ge2$, see [4]:

(3.44d) $A\times_n b = L(a_n(A)\otimes b)$ and $a_n(A)\otimes b = L(A\times_n b) \quad\forall\, A\in\mathfrak{so}(n),\ b\in\mathbb{R}^n$,

where $(3.44c)_1$ and $(3.44d)_1$ follow directly from the definition of the generalized cross product of a matrix and a vector, but for $(3.44c)_2$ and $(3.44d)_2$ inductive proofs are needed, cf. [4].

Remark 3.4

We have seen that Room’s identity (3.37) admits three different generalizations to higher dimensions, (3.44b), (3.44c) and (3.44d), which coincide in three dimensions when considering the usual cross product and its associated matrix, since the latter is a skew-symmetric (square) matrix. However, Grassmann’s and Jacobi’s identities generalize only in the ways presented in (3.23) and (3.28). Indeed, these relations are comparable to the situation in three dimensions when considering the usual vector triple product $a\times(b\times c) = \operatorname{Anti}(a)(b\times c)$, since $\operatorname{Anti}(a)^T = -\operatorname{Anti}(a)$.

3.11 Simultaneous cross product

Of special interest is the simultaneous cross product of a square matrix $P\in\mathbb{R}^{n\times n}$ with a vector $b\in\mathbb{R}^n$ from both sides:

(3.45) $b\times_n P\times_n b = -[b]_{\times_n}\, P\, [b]_{\times_n}^T \overset{(3.30c)}{=} -\,((P\times_n b)^T\times_n b)^T \in\mathbb{R}^{\frac{n(n-1)}{2}\times\frac{n(n-1)}{2}}$,

where, due to the associativity of matrix multiplication, we can omit parentheses. Since

(3.46a) $(b\times_n P\times_n b)^T \overset{(3.45)}{=} b\times_n P^T\times_n b$,

it follows for $S\in\operatorname{Sym}(n)$ and $A\in\mathfrak{so}(n)$ immediately:

(3.46b) $b\times_n S\times_n b \in \operatorname{Sym}\left(\tfrac{n(n-1)}{2}\right)$ and $b\times_n A\times_n b \in \mathfrak{so}\left(\tfrac{n(n-1)}{2}\right)$,

and for all $P\in\mathbb{R}^{n\times n}$:

(3.46c) $b\times_n \operatorname{sym}P\times_n b = \operatorname{sym}(b\times_n P\times_n b)$, $\quad b\times_n \operatorname{skew}P\times_n b = \operatorname{skew}(b\times_n P\times_n b)$.

For the identity matrix $P = I_n$ we obtain

(3.47) $b\times_n I_n\times_n b = -[b]_{\times_n}[b]_{\times_n}^T \overset{(3.35)}{=} [b]_{\times_n}\times_n b \in \operatorname{Sym}\left(\tfrac{n(n-1)}{2}\right)$.

Moreover, for $a,b,c\in\mathbb{R}^n$ it follows that

(3.48a) $b\times_n(a\otimes c)\times_n b \overset{(3.32a)}{=} (b\times_n a)\otimes(c\times_n b)$,

and especially for $c=b$ that

(3.48b) $b\times_n(a\otimes b)\times_n b = b\times_n(b\otimes a)\times_n b = b\times_n\operatorname{sym}(a\otimes b)\times_n b = b\times_n\operatorname{skew}(a\otimes b)\times_n b = 0$.
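That the conjugation (3.45) preserves symmetry and skew-symmetry, as stated in (3.46b), can be checked directly. A sketch (ours; the minus sign follows our reading of (3.45) but is irrelevant for the symmetry statement):

```python
import numpy as np

def bracket(a):
    # [a]_{x_n} from (3.15)
    a = np.asarray(a, dtype=float)
    n = len(a)
    if n == 2:
        return np.array([[-a[1], a[0]]])
    top = np.hstack([bracket(a[:-1]), np.zeros(((n - 1) * (n - 2) // 2, 1))])
    bottom = np.hstack([-a[-1] * np.eye(n - 1), a[:-1].reshape(-1, 1)])
    return np.vstack([top, bottom])

def sandwich(b, P):
    # simultaneous cross product (3.45): b x_n P x_n b = -[b] P [b]^T
    B = bracket(b)
    return -B @ P @ B.T
```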

Furthermore, for a square matrix $P\in\mathbb{R}^{\frac{n(n-1)}{2}\times\frac{n(n-1)}{2}}$ and a vector $b\in\mathbb{R}^n$ we obtain

(3.49) $[b]_{\times_n}^T\, P\, [b]_{\times_n} \in\mathbb{R}^{n\times n}$,

which has properties comparable to those of the simultaneous cross product above, for instance:

(3.50a) $([b]_{\times_n}^T\, P\, [b]_{\times_n})^T = [b]_{\times_n}^T\, P^T\, [b]_{\times_n}$,

which gives:

(3.50b) $\operatorname{sym}([b]_{\times_n}^T\, P\, [b]_{\times_n}) = [b]_{\times_n}^T\, \operatorname{sym}P\, [b]_{\times_n}$,

as well as

(3.50c) $\operatorname{skew}([b]_{\times_n}^T\, P\, [b]_{\times_n}) = [b]_{\times_n}^T\, \operatorname{skew}P\, [b]_{\times_n}$.

And for the identity matrix $P = I_{\frac{n(n-1)}{2}}$ we obtain:

(3.51) $[b]_{\times_n}^T\, I_{\frac{n(n-1)}{2}}\, [b]_{\times_n} = [b]_{\times_n}^T[b]_{\times_n} \overset{(3.40)}{=} \|b\|^2\, I_n - b\otimes b$.

Again, the corresponding expressions to (3.45) and (3.49) coming from the usual cross product in three dimensions just coincide:

(3.52) $b\times P\times b = \operatorname{Anti}(b)\, P\, \operatorname{Anti}(b) \in\mathbb{R}^{3\times3}$ for $b\in\mathbb{R}^3,\ P\in\mathbb{R}^{3\times3}$.

4 Differential operators

Let us now come back to the interplay between a linear homogeneous differential operator with constant coefficients and its symbol, thus replacing $b$ by the vector differential operator $\nabla$ in the algebraic relations presented in the previous sections. For that purpose, let $\Omega\subseteq\mathbb{R}^n$ be open, $n\ge2$ and $m\in\mathbb{N}$. As usual, the derivative and the divergence of a vector field rely on the dyadic and the scalar product, respectively:

(4.1) $\mathrm{D}a := a\otimes\nabla \in C_c^\infty(\Omega,\mathbb{R}^{m\times n})$ for $a\in C_c^\infty(\Omega,\mathbb{R}^m)$, and $\operatorname{div}a := \langle a,\nabla\rangle = a^T\nabla = \operatorname{tr}(\mathrm{D}a) \in C_c^\infty(\Omega,\mathbb{R})$ for $a\in C_c^\infty(\Omega,\mathbb{R}^n)$,

where the latter can be generalized to a matrix divergence in a row-wise way:

(4.2) $\operatorname{div}P := P\,\nabla \in C_c^\infty(\Omega,\mathbb{R}^m)$ for $P\in C_c^\infty(\Omega,\mathbb{R}^{m\times n})$.

In three dimensions, the usual curl is seen as

(4.3) $\operatorname{curl}a := a\times(-\nabla) = \nabla\times a = \operatorname{Anti}(\nabla)\, a = 2\operatorname{axl}(\operatorname{skew}\mathrm{D}a)$ for $a\in C_c^\infty(\Omega,\mathbb{R}^3)$, $n=3$.

Similarly, in arbitrary dimension $n\ge2$ the generalized curl is related to the generalized cross product via

(4.4) $\operatorname{curl}_n a := a\times_n(-\nabla) = \nabla\times_n a = [\nabla]_{\times_n}\, a \overset{(3.9)}{=} -2\, a_n(\operatorname{skew}\mathrm{D}a) \in C_c^\infty(\Omega,\mathbb{R}^{\frac{n(n-1)}{2}})$ for $a\in C_c^\infty(\Omega,\mathbb{R}^n)$,

where the latter expression is usually considered in index notation to introduce the generalized curl.

where the latter expression is usually considered in index notation to introduce the generalized curl.

Furthermore, we consider the new differential operation

(4.5) $[\nabla]_{\times_n}^T\, a \in C_c^\infty(\Omega,\mathbb{R}^n)$ for $a\in C_c^\infty(\Omega,\mathbb{R}^{\frac{n(n-1)}{2}})$,

which differs from the usual curl and from $\operatorname{curl}_3$ also in the three-dimensional case:

(4.6) $\operatorname{curl}\begin{pmatrix}\alpha_1\\\alpha_2\\\alpha_3\end{pmatrix} = \begin{pmatrix}\partial_2\alpha_3-\partial_3\alpha_2\\\partial_3\alpha_1-\partial_1\alpha_3\\\partial_1\alpha_2-\partial_2\alpha_1\end{pmatrix}, \quad \operatorname{curl}_3\begin{pmatrix}\alpha_1\\\alpha_2\\\alpha_3\end{pmatrix} = \begin{pmatrix}\partial_1\alpha_2-\partial_2\alpha_1\\\partial_1\alpha_3-\partial_3\alpha_1\\\partial_2\alpha_3-\partial_3\alpha_2\end{pmatrix}, \quad [\nabla]_{\times_3}^T\begin{pmatrix}\alpha_1\\\alpha_2\\\alpha_3\end{pmatrix} = \begin{pmatrix}-\partial_2\alpha_1-\partial_3\alpha_2\\\partial_1\alpha_1-\partial_3\alpha_3\\\partial_1\alpha_2+\partial_2\alpha_3\end{pmatrix}$.

To the best of our knowledge, the operator $[\nabla]_{\times_n}^T : C_c^\infty(\Omega,\mathbb{R}^{\frac{n(n-1)}{2}})\to C_c^\infty(\Omega,\mathbb{R}^n)$ has not received any attention in the literature so far, not even in index notation. However, this differential operator serves as the counterpart in the integration by parts formula for the generalized $\operatorname{curl}_n$, see (4.31a) below. This adjoint differential operator appears here because the matrix associated with the generalized cross product has no symmetry.

Furthermore, it is the matrix representation of the cross product which allows us to introduce also a row-wise generalized matrix curl operator:

(4.7) $\operatorname{curl}_n P := P\times_n(-\nabla) \overset{(3.30a)}{=} P\,[\nabla]_{\times_n}^T$ for $P\in C_c^\infty(\Omega,\mathbb{R}^{m\times n})$,

which is connected to the column-wise differential operation

(4.8) $\nabla\times_n B := [\nabla]_{\times_n}\, B \overset{(3.30c)}{=} [\operatorname{curl}_n B^T]^T$ for $B\in C_c^\infty(\Omega,\mathbb{R}^{n\times m})$,

and as in the three-dimensional setting can be referred to as $\operatorname{curl}_n^T$.

Moreover, the matrix representation of the curl operation offers also a further differential operator $(\,\cdot\,)\,[\nabla]_{\times_n}$ for $m\times\frac{n(n-1)}{2}$-matrix fields:

(4.9) $P\,[\nabla]_{\times_n} \in C_c^\infty(\Omega,\mathbb{R}^{m\times n})$ for $P\in C_c^\infty(\Omega,\mathbb{R}^{m\times\frac{n(n-1)}{2}})$,

i.e., the row-wise differentiation from (4.5), and again related by transposition also $[\nabla]_{\times_n}^T(\,\cdot\,)$ for $\frac{n(n-1)}{2}\times m$-matrix fields.

Surely, it follows from (3.32b):

(4.10a) $\operatorname{curl}_n(\nabla\alpha) \equiv 0$ for $\alpha\in C_c^\infty(\Omega,\mathbb{R})$,

or even

(4.10b) $\operatorname{curl}_n(\mathrm{D}a) \equiv 0$ for $a\in C_c^\infty(\Omega,\mathbb{R}^m)$,

and from (3.32c):

(4.11) $\operatorname{curl}_n((\mathrm{D}a)^T) = 2\operatorname{curl}_n(\operatorname{sym}\mathrm{D}a) = -2\operatorname{curl}_n(\operatorname{skew}\mathrm{D}a) = [\mathrm{D}\operatorname{curl}_n a]^T = -2\,[\mathrm{D}\, a_n(\operatorname{skew}\mathrm{D}a)]^T$ for $a\in C_c^\infty(\Omega,\mathbb{R}^n)$.

And, as an analogue of the usual $\operatorname{div}\operatorname{curl}\equiv0$, we have in $n$ dimensions:

(4.12) $\operatorname{div}([\nabla]_{\times_n}^T\, a) \overset{(3.21b)}{\equiv} 0$ for $a\in C_c^\infty(\Omega,\mathbb{R}^{\frac{n(n-1)}{2}})$.
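For a linear vector field $a(x) = Mx$ the generalized curl is the constant vector with components $\partial_i a_j - \partial_j a_i = M_{ji} - M_{ij}$ for $i<j$; in particular it vanishes when $M$ is symmetric, i.e., when $a = \nabla\alpha$ is the gradient of a quadratic form, which illustrates (4.10a). A small sketch (ours; exact for linear fields, so no discretization is needed):

```python
import numpy as np

def curl_n_of_linear_field(M):
    """curl_n of the linear field a(x) = M x: the component for the index pair
    (i, j), i < j, is d_i a_j - d_j a_i = M[j, i] - M[i, j] (0-based indices)."""
    n = M.shape[0]
    return np.array([M[j, i] - M[i, j] for j in range(1, n) for i in range(j)])
```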

We recall the following definition.

Definition 4.1

Let $\Omega\subseteq\mathbb{R}^n$ be open. A linear homogeneous differential operator with constant coefficients $\mathcal{A}: C_c^\infty(\Omega,\mathbb{R}^m)\to C_c^\infty(\Omega,\mathbb{R}^N)$ is said to be elliptic if its symbol $\mathbb{A}(b)\in\operatorname{Lin}(\mathbb{R}^m,\mathbb{R}^N)$ is injective for all $b\in\mathbb{R}^n\setminus\{0\}$.

It follows from $b\times b = 0$ for $b\in\mathbb{R}^3$ that the usual curl operator is not elliptic. Similarly, the generalized $\operatorname{curl}_n$ is not elliptic either.

Since the kernel of $[b]_{\times_2}^T = \begin{pmatrix}-\beta_2\\\beta_1\end{pmatrix} : \mathbb{R}\to\mathbb{R}^2$ consists only of $0$ for all $b = (\beta_1,\beta_2)^T\in\mathbb{R}^2\setminus\{0\}$, the operator $[\nabla]_{\times_2}^T$ is elliptic.

To see that $[\nabla]_{\times_n}^T$ is not elliptic for $n\ge3$ we consider

(4.13) $[b]_{\times_3}^T\begin{pmatrix}\beta_3\\-\beta_2\\\beta_1\end{pmatrix} = \begin{pmatrix}-\beta_2 & -\beta_3 & 0\\ \beta_1 & 0 & -\beta_3\\ 0 & \beta_1 & \beta_2\end{pmatrix}\begin{pmatrix}\beta_3\\-\beta_2\\\beta_1\end{pmatrix} = 0$ for $b = (\beta_1,\beta_2,\beta_3)^T$,

which gives the non-ellipticity of $[\nabla]_{\times_3}^T$; the non-ellipticity in the higher dimensional cases follows from the inductive structure.
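The ellipticity discussion reduces to a rank computation on the symbol $[b]_{\times_n}^T$; a sketch (ours):

```python
import numpy as np

def bracket(a):
    # [a]_{x_n} from (3.15)
    a = np.asarray(a, dtype=float)
    n = len(a)
    if n == 2:
        return np.array([[-a[1], a[0]]])
    top = np.hstack([bracket(a[:-1]), np.zeros(((n - 1) * (n - 2) // 2, 1))])
    bottom = np.hstack([-a[-1] * np.eye(n - 1), a[:-1].reshape(-1, 1)])
    return np.vstack([top, bottom])

def symbol_is_injective(b):
    """The symbol of [nabla]^T_{x_n} at frequency b is [b]^T_{x_n};
    it is injective iff it has full column rank."""
    B_T = bracket(b).T  # shape: n x n(n-1)/2
    return np.linalg.matrix_rank(B_T) == B_T.shape[1]
```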

4.1 Nye formulas

Denoting by $\operatorname{curl}$ the matrix curl operator related to the usual curl for vector fields in $\mathbb{R}^3$, Room’s identity (3.44a) becomes, after interchanging $b$ with $-\nabla$:

(4.14a) $\operatorname{curl}(\operatorname{Anti}(a)) = L(\mathrm{D}a)$ and $\mathrm{D}a = L(\operatorname{curl}\operatorname{Anti}(a))$ for $a\in C_c^\infty(\Omega,\mathbb{R}^3)$,

where $\Omega\subseteq\mathbb{R}^3$ is open for the moment. More precisely, they read

(4.14b) $\operatorname{curl}(\operatorname{Anti}(a)) = \operatorname{div}a\; I_3 - (\mathrm{D}a)^T$

and

(4.14c) $\mathrm{D}a = \frac{\operatorname{tr}(\operatorname{curl}\operatorname{Anti}(a))}{2}\, I_3 - (\operatorname{curl}\operatorname{Anti}(a))^T$

and are better known as Nye’s formulas [15, equation (7)]. Surely, $(4.14a)_1$ is not surprising at all, but $(4.14a)_2$ implies that the entries of the derivative of a skew-symmetric matrix field are linear combinations of the entries of its matrix curl:

(4.14d) $\mathrm{D}A = L(\operatorname{curl}A)$ for $A\in C_c^\infty(\Omega,\mathfrak{so}(3))$.

Returning to the higher dimensional case, we conclude from (3.42) or (3.44b) that

(4.15) $\mathrm{D}a = L([a]_{\times_n}^T[\nabla]_{\times_n})$ for $a\in C_c^\infty(\Omega,\mathbb{R}^n)$,

and from (3.44c)

(4.16) D a = L ( curl