-
The Babylonians were pioneers in the development of algebra: they used a sexagesimal (base-60) numeral system and solved quadratic and cubic equations, and even some of higher degree. They solved them through geometric methods and algorithms that described step-by-step procedures.
-
Ahmes developed methods for solving algebraic problems, though his approach was more arithmetic than algebraic. The Egyptians used what today we would call linear equations in the context of practical problems such as distributing goods, calculating areas and volumes, and solving problems related to agriculture. Their approach was practical, using a technique known as the method of false position, which involved assuming a trial (incorrect) value, calculating the error it produced, and then adjusting it to obtain the correct answer.
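A minimal Python sketch of single false position, using the well-known Rhind papyrus problem "a quantity and its seventh together make 19" (the function and variable names are illustrative, not from the source):

```python
# Method of false position for a linear problem f(x) = b:
# guess a trial value, measure what it produces, and rescale.

def false_position(f, b, trial):
    """Solve f(x) = b for linear f by scaling a trial guess."""
    result = f(trial)             # what the wrong guess produces
    return trial * (b / result)   # rescale the guess proportionally

# Rhind papyrus, problem 24: x + x/7 = 19.
f = lambda x: x + x / 7
x = false_position(f, 19, trial=7)   # trial 7 keeps the arithmetic easy: f(7) = 8
print(x, f(x))                       # 16.625, 19.0
```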
-
First records of solving systems of simple linear equations.
Mathematician: Ahmes -
Greek mathematicians, such as Pythagoras, Euclid, and Diophantus, contributed to the development of algebra. However, the Greeks focused more on geometry, and their algebra was closely related to it.
-
Algebra in Chinese civilization developed notably, especially from the Han dynasty (206 BC-220 AD) onward, with the treatise The Nine Chapters on the Mathematical Art, which addressed economic and administrative problems. Mathematicians such as Liu Hui improved on this treatise, developing an algorithmic method for solving systems of linear equations similar to Gaussian elimination, which led them to recognize negative numbers, one of the main achievements of Chinese mathematics.
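A compact modern sketch of that elimination procedure in Python; the example system is the well-known grain problem that opens chapter 8 of the Nine Chapters:

```python
# Gaussian elimination with back substitution, in the spirit of the
# "fangcheng" procedure from the Nine Chapters (modern formulation).

def solve(A, b):
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for k in range(n):                             # forward elimination
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= factor * M[k][j]
    x = [0.0] * n                                  # back substitution
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# Chapter 8, problem 1: three grades of grain.
print(solve([[3, 2, 1], [2, 3, 1], [1, 2, 3]], [39, 34, 26]))
# [9.25, 4.25, 2.75]
```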
-
Fundamentals of algebra, including methods for solving linear and quadratic equations.
Mathematician: Al-Khwarizmi -
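Al-Khwarizmi's treatise works the canonical example x² + 10x = 39 by completing the square; a small Python check of that computation (keeping, as he did, only the positive root):

```python
import math

# Al-Khwarizmi's classic example: x^2 + 10*x = 39,
# solved by completing the square.
b, c = 10, 39
half = b / 2                  # 5
square = c + half**2          # (x + 5)^2 = 39 + 25 = 64
x = math.sqrt(square) - half  # x = 8 - 5 = 3
print(x)                      # 3.0
```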
Systems of linear equations arose in Europe with the introduction in 1637 by René Descartes of coordinates in geometry. In fact, in this new geometry, now called Cartesian geometry, lines and planes are represented by linear equations, and calculating their intersections amounts to solving systems of linear equations.
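A quick numpy illustration of that equivalence, with a made-up pair of lines:

```python
import numpy as np

# Two lines in Cartesian form: x + y = 3 and x - y = 1.
# Their intersection is the solution of the 2x2 linear system A @ p = b.
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])
b = np.array([3.0, 1.0])
print(np.linalg.solve(A, b))   # [2. 1.]  ->  the point (2, 1)
```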
-
Gabriel Cramer used determinants to give explicit solutions of linear systems, in what is now called Cramer's rule. Gauss later further described the method of elimination, which was initially presented as an advance in geodesy.
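A short numpy sketch of Cramer's rule on a small illustrative system (determinants computed with np.linalg.det):

```python
import numpy as np

# Cramer's rule for A @ x = b: x_i = det(A_i) / det(A),
# where A_i is A with column i replaced by b.
def cramer(A, b):
    d = np.linalg.det(A)
    x = []
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b
        x.append(np.linalg.det(Ai) / d)
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])
print(cramer(A, b))            # [1.0, 3.0]
print(np.linalg.solve(A, b))   # same answer by elimination
```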
-
Hermann Grassmann published his "Extension Theory" which included new foundational topics of what is today called linear algebra.
-
Formalization of matrix notation; he introduced the term matrix, which is Latin for "womb."
Mathematician: Sylvester -
Arthur Cayley introduced matrix multiplication and the matrix inverse, making the general linear group possible. The mechanism of group representation became available for describing complex and hypercomplex numbers. Crucially, Cayley used a single letter to denote a matrix, thus treating a matrix as an aggregate object. He also realized the connection between matrices and determinants, and wrote: "There would be many things to say about this theory of matrices which should, it seems to me, precede the theory of determinants."
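A brief numpy illustration of matrices as aggregate objects, multiplied and inverted with a single symbol each (the matrices are arbitrary examples):

```python
import numpy as np

# Treating a matrix as a single object, as Cayley did: multiply,
# invert, and check that A @ A^{-1} is the identity.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(A @ B)              # matrix product (note: A @ B != B @ A)
Ainv = np.linalg.inv(A)   # exists because det(A) = -2 != 0;
print(A @ Ainv)           # ~identity; the invertible matrices
                          # form the general linear group GL(2)
```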
-
Development of matrix theory and its applications; in addition to introducing matrix notation, he developed the theory of invariants, which studies the properties of algebraic expressions that remain unchanged under certain transformations.
Mathematician: Cayley -
They laid the foundations of set theory, which was fundamental for the formalization of the concepts of linear algebra.
Mathematicians: Georg Cantor and Richard Dedekind -
Benjamin Peirce published his Linear Associative Algebra, and his son Charles Sanders Peirce later extended the work.
-
The telegraph required an explanatory system, and the 1873 publication of A Treatise on Electricity and Magnetism instituted a theory of force fields and required differential geometry for its expression. Linear algebra is flat differential geometry and serves in the tangent spaces to manifolds. The electromagnetic symmetries of spacetime are expressed by Lorentz transformations, and much of the history of linear algebra is the history of Lorentz transformations.
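As a concrete illustration (an arbitrary velocity, units with c = 1), a Lorentz boost along one axis is just a 2×2 matrix acting on the coordinates (t, x):

```python
import numpy as np

# A Lorentz boost along x with velocity v (units where c = 1)
# is a linear map on the coordinates (t, x).
v = 0.6
gamma = 1.0 / np.sqrt(1.0 - v**2)
L = np.array([[gamma, -gamma * v],
              [-gamma * v, gamma]])
event = np.array([1.0, 0.5])   # an event (t, x)
print(L @ event)               # the same event in the boosted frame
print(np.linalg.det(L))        # 1.0: boosts preserve spacetime volume
```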
-
Fundamentals of functional analysis and quantum mechanics.
Mathematician: Hilbert -
Establishment of the axiomatic structure of vector spaces.
Mathematician: Various mathematicians -
Development of efficient algorithms and applications in various areas.
Mathematicians: various researchers -
• 2000-2005: Progress in the development of more efficient numerical algorithms for matrix decomposition and optimization methods, including advances in the power method and in factorization algorithms such as QR and SVD (Singular Value Decomposition).
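A compact numpy sketch of these tools: power iteration for the dominant eigenvalue, plus the built-in QR and SVD factorizations (the matrix is an arbitrary example):

```python
import numpy as np

# Power method: repeated multiplication drives a vector toward the
# dominant eigenvector; QR and SVD come built into numpy.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
v = np.array([1.0, 0.0])
for _ in range(50):
    v = A @ v
    v /= np.linalg.norm(v)
print(v @ A @ v)             # Rayleigh quotient ~ dominant eigenvalue
Q, R = np.linalg.qr(A)       # QR factorization: A = Q @ R
U, s, Vt = np.linalg.svd(A)  # SVD: A = U @ diag(s) @ Vt
print(s)                     # singular values
```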
• 2006: Deep learning takes off with the work of Geoffrey Hinton and colleagues on deep networks; these models, including convolutional neural networks (CNNs), are built on linear algebra operations such as convolution and matrix multiplication.
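A small numpy check that convolution really is a linear algebra operation: the same result as np.convolve, obtained by multiplying with a Toeplitz-structured matrix (kernel and signal are arbitrary):

```python
import numpy as np

# A discrete convolution is a linear map: multiplication by a
# matrix whose diagonals repeat the kernel (Toeplitz structure).
kernel = np.array([1.0, 2.0, 1.0])
x = np.array([1.0, 2.0, 3.0, 4.0])
full = np.convolve(x, kernel)        # the usual convolution

n_out = len(x) + len(kernel) - 1
T = np.zeros((n_out, len(x)))        # build the equivalent matrix
for i in range(n_out):
    for j in range(len(x)):
        if 0 <= i - j < len(kernel):
            T[i, j] = kernel[i - j]
print(np.allclose(T @ x, full))      # True: convolution = matmul
```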
• 2007: Increasing use of linear algebra in the analysis of large volumes of data, such as matrix decomposition for dimensionality reduction (e.g., PCA, Principal Component Analysis).
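A minimal PCA-via-SVD sketch in numpy, on synthetic data (the sizes and random seed are arbitrary):

```python
import numpy as np

# PCA via SVD: center the data, take the top right-singular vectors
# as principal axes, and project to reduce dimensionality.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))        # 100 samples, 5 features
Xc = X - X.mean(axis=0)              # center each feature
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:2]                  # top-2 principal directions
reduced = Xc @ components.T          # 100 x 2 projection
explained = s[:2]**2 / np.sum(s**2)  # fraction of variance kept
print(reduced.shape, explained)
```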
• 2009: Publication of key works in applied linear algebra in graph theory, such as the expansion of graph Laplacian theory and its use in graph algorithms.
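For reference, the graph Laplacian is L = D − A (degree matrix minus adjacency matrix); a tiny numpy example on a 4-node path graph:

```python
import numpy as np

# Graph Laplacian L = D - A for a small 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A
print(np.linalg.eigvalsh(L)) # smallest eigenvalue is 0; the second-
                             # smallest (Fiedler value) measures
                             # how well connected the graph is
```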
• 2012: Explosion of deep learning with the introduction of deep neural networks requiring intensive linear algebra operations for backpropagation and optimization.
• 2013: Advances in computational linear algebra, with continued optimization of libraries such as BLAS (Basic Linear Algebra Subprograms) and LAPACK (Linear Algebra PACKage) for parallel hardware architectures.
• 2015: Implementation of tensor decomposition methods, such as CP (CANDECOMP/PARAFAC) and Tucker decomposition, in big data and machine learning applications.
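A minimal numpy illustration of the CP model, a sum of rank-1 outer products (the factor vectors are arbitrary and no tensor library is assumed):

```python
import numpy as np

# The CP model writes a tensor as a sum of rank-1 terms, each an
# outer product of one vector per mode (rank 2 shown here).
a1, b1, c1 = np.ones(3), np.arange(4.0), np.ones(2)
a2, b2, c2 = np.arange(3.0), np.ones(4), np.array([1.0, -1.0])
T = (np.einsum('i,j,k->ijk', a1, b1, c1)
     + np.einsum('i,j,k->ijk', a2, b2, c2))
print(T.shape)   # (3, 4, 2): a rank-2 tensor in the CP sense
```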
• 2016: Publication of "Deep Learning" by Ian Goodfellow, Yoshua Bengio, and Aaron Courville, cementing the central role of linear algebra in the field of deep learning.
• 2018: Widespread adoption of techniques such as word embeddings in Natural Language Processing (NLP), which depend on linear algebra to map words to vector spaces.
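A toy numpy sketch of the idea: words as vectors compared by cosine similarity (the 3-dimensional vectors are invented for illustration; real embeddings use hundreds of dimensions):

```python
import numpy as np

# Word embeddings map words to vectors; similarity of meaning is
# measured by the cosine of the angle between the vectors.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.7, 0.3]),
    "apple": np.array([0.1, 0.2, 0.9]),
}
def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(emb["king"], emb["queen"]))  # high: related words
print(cosine(emb["king"], emb["apple"]))  # low: unrelated words
```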
-
Linear algebra remains crucial in the development of machine learning models, especially in areas such as explainable artificial intelligence (XAI), where matrix factorization methods are used to interpret complex models.
-
Advances in quantum computing, which uses linear algebra to model the behavior of quantum systems; matrix decompositions and the quantum Fourier transform are key examples.
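A small numpy construction of the quantum Fourier transform matrix for two qubits, checking unitarity (this uses the textbook formula F[j, k] = ω^(jk)/√N, not any particular quantum library's API):

```python
import numpy as np

# QFT on n qubits: the unitary matrix F[j, k] = omega^(j*k) / sqrt(N),
# with N = 2**n and omega = exp(2*pi*i/N).
n = 2
N = 2**n
j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
F = np.exp(2j * np.pi * j * k / N) / np.sqrt(N)
print(np.allclose(F.conj().T @ F, np.eye(N)))   # True: F is unitary
```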
-
Hardware optimization for linear algebra operations, with the proliferation of tensor processing units (TPUs) designed specifically to accelerate matrix calculations in AI applications.
-
The expansion of linear algebra in emerging areas such as artificial general intelligence (AGI) and simulation of complex systems, where new algorithms and approaches based on matrices and tensors are being developed.
-