# Malcev: *Foundations of Linear Algebra* Introduction

**Anatoly Malcev** published *Foundations of Linear Algebra* in Russian in 1948. A second Russian edition appeared in 1956, which was then translated into English and published by W H Freeman and Company in San Francisco and London in 1963. Here is an extract from the Introduction where Malcev introduces linear algebra:-

### Linear Algebra

In linear algebra one studies three kinds of objects: matrices, linear spaces, and algebraic forms. The theories of these objects are so closely related that most problems of linear algebra have equivalent formulations in each of the three theories. The matrix point of view, which underlies the present exposition, is the one best adapted to actual calculations. On the other hand, most problems of linear algebra that arise in geometry and mechanics lead to algebraic forms, while the best understanding of the internal connections between different problems of linear algebra is obtained by means of linear spaces. Therefore the ability to pass from one type of formulation to another is one of the most important skills to acquire in the study of linear algebra.

From the point of view of the theory of forms, linear algebra falls naturally into three parts: the theories of linear forms, of bilinear and quadratic forms, and of multilinear forms. Linear algebra proper usually encompasses linear and bilinear forms, and the very beginnings of the theory of multilinear forms as tensor algebras. The more delicate questions of the theory of multilinear forms belong to the theory of invariants and are not included in this book.

Linear algebra is a branch of mathematics as old as mathematics itself. The solving of the equation $ax + b = 0$ may be considered the original problem of this subject. Although this problem presents no difficulty, the method by which it is solved, together with the properties of the corresponding linear function $y = ax + b$, provides the initial models for the ideas and methods of all of linear algebra. For example, the fundamental idea behind the solution of a system of linear equations in several unknowns is that of replacing such a system by a chain of these simple equations.
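The reduction Malcev alludes to can be made concrete with a small worked example (the particular system is chosen here for illustration and is not from the original text): eliminating one unknown turns a two-equation system into a chain of single-unknown equations of the form $ay + b = 0$.

```latex
% Example system (illustrative, not from Malcev's text):
\begin{align*}
x + 2y &= 5,\\
3x + 4y &= 6.
\end{align*}
% Subtracting 3 times the first equation from the second
% eliminates x, leaving a single equation of the form ay + b = 0:
\begin{align*}
-2y + 9 &= 0 \quad\Rightarrow\quad y = \tfrac{9}{2},
\end{align*}
% and back-substitution into the first equation gives
\begin{align*}
x = 5 - 2y = -4.
\end{align*}
```

Each step in the chain is an equation in one unknown, solved exactly as $ax + b = 0$ is solved; this is the sense in which the simplest linear equation is the model for the general method.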

The study of systems of linear equations acquired new significance after the creation of analytic geometry; it was possible to reduce all the fundamental questions about the arrangements of lines and planes in space to the investigation of such systems. The search in the 18th century for the general solution of $n$ linear equations in $n$ unknowns led Leibniz and Cramer to the notion of the determinant. In the 19th century, determinants not only found use in algebra and analytic geometry, but also entered analysis in the work of Ostrogradski and Jacobi on functional determinants. At the same time, the problem of the transformation of quadratic forms by linear substitutions acquired great importance in analytic geometry, in the theory of numbers, and especially in theoretical mechanics. The same problem was central also in the geometrical ideas of Lobachevski and Riemann, which led to the study of many-dimensional spaces, including many-dimensional linear spaces (Grassmann). In the middle of the last century, investigations of non-commutative algebras (Hamilton) led to the development of a matrix calculus (Cayley and Sylvester), which played a major role in the subsequent growth of linear algebra. Results which appeared near the end of the 19th century included the normal form of a matrix of a linear transformation (Jordan), elementary divisors (Weierstrass), pairs of quadratic forms (Weierstrass, Kronecker), and Hermitian forms (Hermite). At about the same time the development of differential geometry for many-dimensional spaces and of the theory of transformations of algebraic forms of higher powers led to the creation of the tensor calculus, upon which was built the theory of relativity.

In the present century linear algebra has acquired new richness and versatility through the use of the concepts of group and non-commutative ring in algebra itself, and through the use of infinite-dimensional function spaces in analysis. Applications to quantum mechanics stimulated a still more rapid development of the theory of these spaces, which has become one of the most important parts of contemporary functional analysis.

Last Updated March 2006