
As you can probably tell by now, the bulk of this series is on understanding matrix and vector operations through that more visual lens of linear transformations. This video is no exception, describing the concepts of inverse matrices, column space, rank, and null space through that lens.

A forewarning though: I'm not going to talk about the methods for actually computing these things, and some would argue that that's pretty important. There are a lot of very good resources for learning those methods outside this series. Keywords: "Gaussian elimination" and "Row echelon form." I think most of the value that I actually have to add here is on the intuition half. Plus, in practice, we usually get software to compute this stuff for us anyway.

First, a few words on the usefulness of linear algebra. By now, you already have a hint for how it's used in describing the manipulation of space, which is useful for things like computer graphics and robotics, but one of the main reasons that linear algebra is more broadly applicable, and required for just about any technical discipline, is that it lets us solve certain systems of equations.

When I say "system of equations," I mean you have a list of variables, things you don't know, and a list of equations relating them. In a lot of situations, those equations can get very complicated, but, if you're lucky, they might take on a certain special form. Within each equation, the only thing happening to each variable is that it's scaled by some constant, and the only thing happening to each of those scaled variables is that they're added to each other. So, no exponents or fancy functions, or multiplying two variables together; things like that.

The typical way to organize this sort of special system of equations is to throw all the variables on the left, and put any lingering constants on the right. It's also nice to vertically line up the common variables, and to do that you might need to throw in some zero coefficients whenever the variable doesn't show up in one of the equations. This is called a "linear system of equations."

You might notice that this looks a lot like matrix-vector multiplication. In fact, you can package all of the equations together into a single vector equation, where you have the matrix containing all of the constant coefficients, and a vector containing all of the variables, and their matrix-vector product equals some different constant vector. Let's name that constant matrix A, denote the vector holding the variables with a boldface x, and call the constant vector on the right-hand side v.
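To make this packaging concrete, here is a minimal sketch in plain Python. The particular system (x + 2y = 5, 3x + y = 5) is a made-up example, not one from the video; the point is just that multiplying the coefficient matrix A by a candidate vector x reproduces the right-hand sides:

```python
# Hypothetical system, chosen only for illustration:
#     1x + 2y = 5
#     3x + 1y = 5
A = [[1, 2],
     [3, 1]]      # constant coefficients, one row per equation
v = [5, 5]        # lingering constants from the right-hand side

def matvec(M, u):
    """Matrix-vector product: each output entry is a row of M dotted with u."""
    return [sum(m_ij * u_j for m_ij, u_j in zip(row, u)) for row in M]

x = [1, 2]            # a candidate solution vector
print(matvec(A, x))   # [5, 5] -- A times x equals v, so x solves the system
```

Writing the whole system as the single equation Ax = v is exactly what this product captures.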

This is more than just a notational trick to get our system of equations written on one line. It sheds light on a pretty cool geometric interpretation for the problem. The matrix A corresponds with some linear transformation, so solving Ax = v means we're looking for a vector x which, after applying the transformation, lands on v.

Think about what's happening here for a moment. You can hold in your head this really complicated idea of multiple variables all intermingling with each other just by thinking about squishing and morphing space and trying to figure out which vector lands on another. Cool, right?

To start simple, let's say you have a system with two equations and two unknowns. This means that the matrix A is a 2x2 matrix, and v and x are each two-dimensional vectors.

Now, how we think about the solutions to this equation depends on whether the transformation associated with A squishes all of space into a lower dimension, like a line or a point, or if it leaves everything spanning the full two dimensions where it started. In the language of the last video, we subdivide into the case where A has zero determinant, and the case where A has nonzero determinant.

Let's start with the most likely case, where the determinant is nonzero, meaning space does not get squished into a zero-area region. In this case, there will always be one and only one vector that lands on v, and you can find it by playing the transformation in reverse. Following where v goes as we rewind the tape like this, you'll find the vector x such that A times x equals v.

When you play the transformation in reverse, it actually corresponds to a separate linear transformation, commonly called "the inverse of A," denoted A to the negative one.

For example, if A was a counterclockwise rotation by 90º, then the inverse of A would be a clockwise rotation by 90º. If A was a rightward shear that pushes j-hat one unit to the right, the inverse of A would be a leftward shear that pushes j-hat one unit to the left.

In general, A inverse is the unique transformation with the property that if you first apply A, then follow it with the transformation A inverse, you end up back where you started.

Applying one transformation after another is captured algebraically with matrix multiplication, so the core property of this transformation A inverse is that A inverse times A equals the matrix that corresponds to doing nothing. The transformation that does nothing is called the "identity transformation." It leaves i-hat and j-hat each where they are, unmoved, so its columns are one, zero, and zero, one.

Once you find this inverse, which, in practice, you do with a computer, you can solve your equation by multiplying this inverse matrix by v. And again, what this means geometrically is that you're playing the transformation in reverse, and following v.
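The video leaves the computation to software, but for the 2x2 case there is a well-known closed form, (1/det) times [[d, -b], [-c, a]], that makes the idea tangible. A minimal sketch, reusing the hypothetical system from before (the numbers are illustrative, not from the video):

```python
def inverse_2x2(A):
    """Closed-form inverse of a 2x2 matrix; fails when the determinant is zero."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("determinant is zero: no inverse exists")
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

def matvec(M, u):
    return [sum(m * x for m, x in zip(row, u)) for row in M]

A = [[1, 2],
     [3, 1]]
v = [5, 5]
A_inv = inverse_2x2(A)
x = matvec(A_inv, v)   # playing the transformation in reverse and following v
print(x)               # [1.0, 2.0]
```

Note that multiplying A inverse by A here would give the identity columns (1, 0) and (0, 1), which is exactly the "doing nothing" property described above.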

This nonzero determinant case, which for a random choice of matrix is by far the most likely one, corresponds with the idea that if you have two unknowns and two equations, it's almost certainly the case that there's a single, unique solution. This idea also makes sense in higher dimensions, when the number of equations equals the number of unknowns.

Again, the system of equations can be translated to the geometric interpretation where you have some transformation, A, and some vector, v, and you're looking for the vector x that lands on v. As long as the transformation A doesn't squish all of space into a lower dimension, meaning its determinant is nonzero, there will be an inverse transformation, A inverse, with the property that if you first do A, then you do A inverse, it's the same as doing nothing. And to solve your equation, you just have to multiply that inverse transformation matrix by the vector v.

But when the determinant is zero, and the transformation associated with this system of equations squishes space into a smaller dimension, there is no inverse. You cannot un-squish a line to turn it into a plane. At least, that's not something that a function can do. That would require transforming each individual vector into a whole line full of vectors, but functions can only take a single input to a single output.

Similarly, for three equations in three unknowns, there will be no inverse if the corresponding transformation squishes 3D space onto a plane, or even if it squishes it onto a line, or a point. Those all correspond to a determinant of zero, since any region is squished into something with zero volume.

It's still possible that a solution exists even when there is no inverse; it's just that when your transformation squishes space onto, say, a line, you have to be lucky enough that the vector v lives somewhere on that line.
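That "lucky enough" condition is easy to check in 2D: v lies on the output line exactly when it is a scalar multiple of the line's direction, i.e. when the 2D cross product is zero. A small sketch with made-up numbers (a singular matrix like [[2, 4], [1, 2]] squishes the plane onto the line spanned by (2, 1), since both of its columns point that way):

```python
def on_line(direction, v, tol=1e-9):
    """True when v is a scalar multiple of direction (2D cross product is zero)."""
    return abs(direction[0] * v[1] - direction[1] * v[0]) < tol

# Both columns of the singular matrix [[2, 4], [1, 2]] span this direction:
column_direction = (2, 1)

print(on_line(column_direction, (6, 3)))   # True: v lands on the line, a solution exists
print(on_line(column_direction, (1, 0)))   # False: v is off the line, no solution
```

So for a zero-determinant system, solvability is a question about where v sits relative to the squished-down output, not about inverting anything.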

You might notice that some of these zero determinant cases feel a lot more restrictive than others. Given a 3x3 matrix, for example, it seems a lot harder for a solution to exist when it squishes space onto a line compared to when it squishes things onto a plane, even though both of those are zero determinant. We have some language that's a bit more specific than just saying "zero determinant."

When the output of a transformation is a line, meaning it's one-dimensional, we say the transformation has a "rank" of one. If all the vectors land on some two-dimensional plane, we say the transformation has a "rank" of two. So the word "rank" means the number of dimensions in the output of a transformation.
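The Gaussian elimination mentioned in the keywords earlier is also how rank is computed in practice: row-reduce the matrix and count the surviving pivot rows. A minimal pure-Python sketch (the example matrices are made up to show each case):

```python
def rank(M, tol=1e-9):
    """Rank via Gaussian elimination: count the pivot rows that survive."""
    M = [list(map(float, row)) for row in M]   # work on a copy
    rows, cols = len(M), len(M[0])
    r = 0                                      # current pivot row
    for c in range(cols):
        # find a row at or below r with a nonzero entry in column c
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > tol), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # eliminate column c from every row below the pivot
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= f * M[r][j]
        r += 1
    return r

print(rank([[1, 2], [3, 1]]))     # 2: full rank, nonzero determinant
print(rank([[2, 4], [1, 2]]))     # 1: the plane is squished onto a line
print(rank([[1, 0, 0],
            [0, 1, 0],
            [1, 1, 0]]))          # 2: 3D space collapsed onto a plane
```

Each output matches the geometric picture: the number of pivots is the number of dimensions the transformation's output actually spans.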

For instance, in the case of 2x2 matrices, rank 2 is the best that it can be. It means the basis vectors continue to span the full two dimensions of space, and the determinant is nonzero. But for 3x3 matrices, rank 2 means that we've collapsed, but not as much as we would have collapsed for a rank 1 situation. If a 3D transformation has a nonzero determinant, and its output fills all of 3D space, it has a rank of 3.

This set of all possible outputs for your matrix, whether it's a line, a plane, 3D space, whatever, is called the "column space" of your matrix. You can probably guess where that name comes from. The columns of your matrix tell you where the basis vectors land, and the span of those transformed basis vectors gives you all possible outputs. In other words, the column space is the span of the columns of your matrix.

So, a more precise definition of rank would be that it's the number of dimensions in the column space. When this rank is as high as it can be, meaning it equals the number of columns, we call the matrix "full rank."

Notice, the zero vector will always be included in the column space, since linear transformations must keep the origin fixed in place. For a full rank transformation, the only vector that lands at the origin is the zero vector itself, but for matrices that aren't full rank, which squish to a smaller dimension, you can have a whole bunch of vectors that land on zero. If a 2D transformation squishes space onto a line, for example, there is a whole separate line in a different direction, full of vectors that get squished onto the origin.