There are quite a few videos on YouTube explaining tensors. Continuum mechanics and mechanical engineering use tensors a lot. Just part of the game in stress analysis and fluid mechanics. Yield surfaces, deviatoric stress vs. hydrostatic stress (look that one up) – all fun stuff.
In fact, the word ‘tensor’ was first introduced by William Hamilton. Interestingly, his meaning had little to do with what we have called tensors from 1898 until today. How did tensors become important, you may ask? Well, not without the help of one of the biggest names in science – Albert Einstein!
The most prominent example today is Google’s TensorFlow.
Tensor calculus has many applications in physics, engineering and computer science including elasticity, continuum mechanics, electromagnetism (see mathematical descriptions of the electromagnetic field), general relativity (see mathematics of general relativity), quantum field theory, and machine learning.
A lot of physics students tend to first see tensors in their math methods course. We covered them in my course, then saw them again for GR. In physics, I learned about them in the course "Classical Field Theory" (but I don't think many undergraduates have such a course). In mathematics, we covered tensors in a course on Smooth Manifolds.
Tensors are generalizations of scalars (that have no indices), vectors (that have exactly one index), and matrices (that have exactly two indices) to an arbitrary number of indices.
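This index-counting view can be made concrete with NumPy (an illustrative choice, not one the text prescribes): an array's `ndim` attribute reports exactly how many indices it takes.

```python
import numpy as np

scalar = np.array(5.0)               # no indices: 0 axes
vector = np.array([1.0, 2.0, 3.0])   # one index:  1 axis
matrix = np.eye(3)                   # two indices: 2 axes
tensor3 = np.zeros((2, 3, 4))        # three indices: 3 axes

for t in (scalar, vector, matrix, tensor3):
    print(t.ndim, t.shape)
```

Running this prints the number of indices (0 through 3) alongside each array's shape.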
Tensors are usually encountered in a number of ways: in an undergraduate math course, in a math or science textbook, or through self-study. Two good paths to tensor calculus are 1) vector analysis/calculus and differential geometry, and 2) linear/multilinear algebra and matrices. Knowing both paths makes it easier to study and understand tensors.
To put it succinctly, tensors are geometrical objects over vector spaces, whose coordinates obey certain laws of transformation under change of basis. Vectors are simple and well-known examples of tensors, but there is much more to tensor theory than vectors.
What is a tensor in a deep learning framework? Tensors are the data structure used by machine learning systems, and getting to know them is an essential skill you should build early on. A tensor is a container for numerical data. It is the way we store the information that we'll use within our system.
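As a sketch of the "container for numerical data" idea, here is a hypothetical batch of grayscale images stored as a single rank-3 array (NumPy used for illustration; deep learning frameworks expose the same shape/dtype interface):

```python
import numpy as np

# A hypothetical batch of 2 grayscale images, each 4x4 pixels:
# axis 0 indexes the sample, axes 1-2 index pixel rows and columns.
batch = np.random.rand(2, 4, 4).astype(np.float32)

print(batch.shape)     # (2, 4, 4)
print(batch.dtype)     # float32
print(batch[0, 1, 2])  # one scalar entry pulled out of the container
```

The shape and dtype together describe the container; individual scalars are reached by indexing along each axis.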
Tensors have become important in physics because they provide a concise mathematical framework for formulating and solving physics problems in areas such as mechanics (stress, elasticity, fluid mechanics, moment of inertia, ...), electrodynamics (electromagnetic tensor, Maxwell tensor, permittivity, magnetic ...
The importance of tensor calculus became apparent in 1915 when physicist Albert Einstein revealed that he had found it indispensable for the gravitational field equations used in his theory of general relativity.
Tensor analysis is the branch of mathematics concerned with relations or laws that remain valid regardless of the system of coordinates used to specify the quantities. Such relations are called covariant.
Tensors may be used to describe spaces that are non-Euclidean (that is, not flat or "geometrically regular"). The sheer number of possible shapes and motions shows why tensor calculus proves to be so difficult.
Last Updated on December 6, 2019. In deep learning it is common to see a lot of discussion around tensors as the cornerstone data structure. Tensor even appears in the name of Google’s flagship machine learning library: “TensorFlow”. Tensors are a type of data structure used in linear algebra, ...
A tensor is a generalization of vectors and matrices and is easily understood as a multidimensional array. In the general case, an array of numbers arranged on a regular grid with a variable number of axes is known as a tensor. — Page 33, Deep Learning, 2016.
The element-wise addition of two tensors with the same dimensions results in a new tensor with the same dimensions where each scalar value is the element-wise addition of the scalars in the parent tensors.
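A minimal sketch of element-wise addition, using NumPy as the illustrative tool:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[10, 20], [30, 40]])

# Element-wise addition: C[i, j] = A[i, j] + B[i, j],
# so C has the same dimensions as its parents.
C = A + B
print(C)   # [[11 22]
           #  [33 44]]
```

The same rule extends unchanged to tensors of any rank, as long as the dimensions match.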
In machine learning, the training and operation of deep learning models can be described in terms of tensors.
A vector is a one-dimensional or first order tensor and a matrix is a two-dimensional or second order tensor. Tensor notation is much like matrix notation with a capital letter representing a tensor and lowercase letters with subscript integers representing scalar values within the tensor.
Tensors are typically defined by their coordinate transformation properties. The transformation properties of tensors can be understood by realizing that the physical quantities they represent must appear in certain ways to different observers with different points of view.
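The transformation-law idea can be sketched numerically: under a rotation R (a change of basis), vector components transform as v' = Rv and rank-2 tensor components as T' = RTRᵀ, while coordinate-independent quantities such as the norm and the trace come out the same for both observers. The numbers below are illustrative.

```python
import numpy as np

theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation: change of basis

v = np.array([1.0, 2.0])
T = np.array([[2.0, 1.0],
              [1.0, 3.0]])

v_new = R @ v         # rank-1 transformation law: v'_i = R_ij v_j
T_new = R @ T @ R.T   # rank-2 transformation law: T'_ij = R_ik T_kl R_jl

# Coordinate-independent quantities survive the change of basis:
print(np.isclose(np.linalg.norm(v), np.linalg.norm(v_new)))  # True
print(np.isclose(np.trace(T), np.trace(T_new)))              # True
```

The components change, but the invariants do not: that is the "same physical quantity, different point of view" content of the transformation properties.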
Writing vector or tensor equations in generalized coordinate systems is a process familiar to students in classical mechanics. In order to successfully write such equations and use them to solve problems or to build models, the characteristics of generalized coordinate systems must be understood. Recall that in a generalized coordinate system:
Einstein noticed that summation always occurs over a repeated index, so it is not strictly necessary to write out the summation operator (‘Σᵢ’ or ‘Σⱼ’) each and every time. Using this convention, we have the compact notation in which a repeated index by itself implies the sum: writing aᵢbᵢ means Σᵢ aᵢbᵢ.
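NumPy's `einsum` implements this convention almost literally: any index that appears twice in the subscript string is summed over.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
A = np.arange(9.0).reshape(3, 3)

# a_i b_i : repeated index i is summed (an inner product)
print(np.einsum('i,i->', a, b))    # 32.0

# A_ij b_j : repeated index j is summed (matrix-vector product)
print(np.einsum('ij,j->i', A, b))
```

The subscript strings read exactly like the index notation, which makes `einsum` a convenient way to check hand-written summation-convention expressions.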
The permeability µ is a tensor of rank 2. Remember that B and H are both vectors, but they now differ from one another in both magnitude and direction.
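A quick numerical sketch of why a rank-2 permeability makes B differ from H in direction as well as magnitude. The µ values below are made up for illustration, not real material data.

```python
import numpy as np

# Illustrative anisotropic permeability (rank-2 tensor), not real material data.
mu = np.array([[2.0, 0.0, 0.0],
               [0.0, 5.0, 0.0],
               [0.0, 0.0, 1.0]])

H = np.array([1.0, 1.0, 0.0])
B = mu @ H   # B_i = mu_ij H_j

print(B)     # [2. 5. 0.] -- different magnitude AND direction from H
cos_angle = B @ H / (np.linalg.norm(B) * np.linalg.norm(H))
print(np.isclose(cos_angle, 1.0))   # False: B is not parallel to H
```

Were µ a scalar, B would simply be a rescaled H; the off-axis anisotropy is what rotates it.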
i.e., while the position vector itself is not a tensor, the difference between any two position vectors is a tensor of rank 1! Similarly, for any position vectors V and V*, dV = dV*; i.e., the differential of the position vector is a tensor of rank 1.
Introductory courses dealing with electric and magnetic fields typically begin with homogeneous, isotropic material media, with field intensities at a sufficiently low level that the medium remains linear under incremental increases in field strength.
where we show the permittivity and permeability as nine-component dyadics, with double-headed arrows over bars denoting the “dual directional compoundedness” described in Field Mathematics 1, Section 3.1. This is often students’ first exposure to quantities that require “rank” values greater than one.
In this next example we generalize the prior example to include metamaterials, where the constitutive expressions take the form D = ε·E + ξ·H and B = ζ·E + µ·H, where ξ and ζ are the cross-field dyadics. As before, we combine these expressions.
The piezoelectric effect is understood as the linear electromechanical interaction between mechanical stress and applied electric field intensity in crystalline materials with no inversion symmetry. Whenever such a material is immersed in an electric field, it develops an internal stress described by a dyadic.
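As a sketch of the index bookkeeping (sign and index conventions vary between texts, and the coupling values here are made up): contracting a rank-3 coupling tensor with the field vector over one index yields a rank-2 stress dyadic, Tᵢⱼ = eₖᵢⱼ Eₖ.

```python
import numpy as np

rng = np.random.default_rng(0)
e = rng.standard_normal((3, 3, 3))   # hypothetical rank-3 coupling tensor e_kij
E = np.array([0.0, 0.0, 1.0])        # applied electric field along z

# Contract the field index: T_ij = e_kij E_k -> a rank-2 stress dyadic
T = np.einsum('kij,k->ij', e, E)

print(T.shape)   # (3, 3): one rank is lost per contracted index
```

Each contraction with a vector lowers the rank by one, which is why a rank-3 coupling maps a rank-1 field to a rank-2 stress.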
The nonlinear constitutive relation between the electric flux density field (also commonly called the displacement field) and the electric field intensity is an extension of Eq. (1-1) to higher-rank permittivity tensors acting on higher tensor products of the field (EE, EEE, etc.).
These examples show the importance of understanding the tensor nature of many objects and phenomena in modern physics and engineering, and why tensors should be studied at the undergraduate level: first dyadics (tensors of rank two), as in Example 1; then triadics (tensors of rank three), in Example 2; followed by Example 3, which introduces a tensor of rank four.
What is a tensor in Layman’s terms? The mathematical concept of a tensor could be broadly explained in this way. A scalar has the lowest dimensionality and is always 1x1. It can be thought of as a vector of length 1, or a 1x1 matrix. It is followed by a vector, where each element of that vector is a scalar.
Tensors are simply a generalization of the concepts we have seen so far. One object we haven’t seen yet is a tensor of rank 3. Its dimensions could be denoted k, m, and n, making it a k×m×n object. Such an object can be thought of as a collection of matrices.
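The "collection of matrices" picture is easy to verify with NumPy: slicing a k×m×n array along its first axis yields k matrices of shape m×n.

```python
import numpy as np

k, m, n = 2, 3, 4
T = np.arange(k * m * n).reshape(k, m, n)   # a k x m x n rank-3 tensor

# Slicing along the first axis recovers the "collection of matrices" view:
print(T[0].shape)   # (3, 4) -- the first m x n matrix in the stack
print(T[1].shape)   # (3, 4) -- the second
```

Fixing one index of a rank-3 tensor always leaves a rank-2 object, exactly as described above.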