  • Hello, hello again.

  • So, moving forward, I will be assuming you have a visual understanding of linear transformations and how they're represented with matrices, the way I have been talking about in the last few videos.

  • If you think about a couple of these linear transformations, you might notice how some of them seem to stretch space out, while others squish it on in.

  • One thing that turns out to be pretty useful for understanding one of these transformations is to measure exactly how much it stretches or squishes things. More specifically, to measure the factor by which a given region increases or decreases in area.

  • For example, look at the matrix with the columns 3, 0 and 0, 2. It scales i-hat by a factor of 3 and scales j-hat by a factor of 2.

  • Now, if we focus our attention on the one-by-one square whose bottom sits on i-hat and whose left side sits on j-hat, then after the transformation, this turns into a 2-by-3 rectangle.

  • Since this region started out with area 1 and ended up with area 6, we can say the linear transformation has scaled its area by a factor of 6.
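
As a quick numeric sanity check of that claim (a sketch added here, not something from the video), the snippet below applies the matrix with columns 3, 0 and 0, 2 to the unit square's corners and measures the resulting area with the shoelace formula:

```python
# Sketch: verify the area-scaling claim for the matrix with columns (3, 0) and (0, 2).
# The unit square's corners are mapped by the matrix, and the shoelace formula
# gives the area of the resulting polygon.

def shoelace_area(points):
    """Area of a simple polygon given its vertices in order."""
    total = 0.0
    for i in range(len(points)):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % len(points)]
        total += x1 * y2 - x2 * y1
    return abs(total) / 2

def apply(matrix, point):
    """Apply a 2x2 matrix, given as a pair of columns, to a point."""
    (a, c), (b, d) = matrix           # i-hat lands on (a, c), j-hat lands on (b, d)
    x, y = point
    return (a * x + b * y, c * x + d * y)

unit_square = [(0, 0), (1, 0), (1, 1), (0, 1)]
scale = [(3, 0), (0, 2)]              # columns 3, 0 and 0, 2
print(shoelace_area([apply(scale, p) for p in unit_square]))  # 6.0
```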

  • Compare that to a shear, whose matrix has columns 1, 0 and 1, 1. Meaning, i-hat stays in place and j-hat moves over to 1, 1.

  • That same unit square determined by i-hat and j-hat gets slanted and turned into a parallelogram. But the area of that parallelogram is still 1, since its base and height each continue to have length 1.

  • So, even though this transformation smushes things about, it seems to leave areas unchanged. At least, in the case of that one unit square.
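
The same kind of check works for the shear (again, just an added sketch): the square's corners land on (0, 0), (1, 0), (2, 1), and (1, 1), and the shoelace formula gives area 1, matching the base-times-height argument.

```python
# Sketch: the shear with columns (1, 0) and (1, 1) sends the unit square's corners
# (0, 0), (1, 0), (1, 1), (0, 1) to (0, 0), (1, 0), (2, 1), (1, 1): a parallelogram.
corners = [(0, 0), (1, 0), (2, 1), (1, 1)]

total = 0.0
for i in range(len(corners)):
    x1, y1 = corners[i]
    x2, y2 = corners[(i + 1) % len(corners)]
    total += x1 * y2 - x2 * y1        # shoelace formula again
print(abs(total) / 2)                 # 1.0, so the area is unchanged
```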

  • Actually though, if you know how much the area of that one single unit square changes, it can tell you how any possible region in space changes.

  • For starters, notice that whatever happens to one square in the grid has to happen to any other square in the grid, no matter the size. This follows from the fact that grid lines remain parallel and evenly spaced.

  • Then, any shape that is not a grid square can be approximated by grid squares really well, with arbitrarily good approximations if you use small enough grid squares.

  • So, since the areas of all those tiny grid squares are being scaled by some single amount, the area of the blob as a whole will also be scaled by that same single amount.

  • This very special scaling factor, the factor by which a linear transformation changes any area, is called the determinant of that transformation.

  • I'll show how to compute the determinant of a transformation using its matrix later on in the video, but understanding what it is is, trust me, much more important than understanding the computation.

  • For example, the determinant of a transformation would be 3 if that transformation increases the area of a region by a factor of 3.

  • The determinant of a transformation would be 1/2 if it squishes down all areas by a factor of 1/2.

  • And the determinant of a 2D transformation is 0 if it squishes all of space onto a line, or even onto a single point, since then the area of any region would become 0.
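
Those three cases are easy to reproduce numerically. A small sketch using NumPy, with example matrices chosen here rather than taken from the video:

```python
import numpy as np

# Sketch: determinants matching the three cases just described.
stretch = np.array([[3.0, 0.0],
                    [0.0, 1.0]])      # triples areas
squish = np.array([[0.5, 0.0],
                   [0.0, 1.0]])       # halves areas
collapse = np.array([[1.0, 2.0],
                     [2.0, 4.0]])     # second column is twice the first: the plane lands on a line

print(np.linalg.det(stretch))   # 3.0
print(np.linalg.det(squish))    # 0.5
print(np.linalg.det(collapse))  # 0.0 (up to floating-point round-off)
```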

  • That last example proved to be pretty important. It means that checking whether the determinant of a given matrix is 0 will tell you whether the transformation associated with that matrix squishes everything onto a smaller dimension.

  • You will see in the next few videos why this is even a useful thing to think about. But for now, I just want to lay down all of the visual intuition, which, in and of itself, is a beautiful thing to think about.

  • Okay, I need to confess that what I've said so far is not quite right. The full concept of the determinant allows for negative values.

  • But what would scaling an area by a negative amount even mean? This has to do with the idea of orientation.

  • For example, notice how this transformation gives the sensation of flipping space over. If you were thinking of 2D space as a sheet of paper, a transformation like that one seems to turn over that sheet onto the other side.

  • Any transformations that do this are said to "invert the orientation of space."

  • Another way to think about it is in terms of i-hat and j-hat. Notice that in their starting positions, j-hat is to the left of i-hat. If, after a transformation, j-hat is now on the right of i-hat, the orientation of space has been inverted.

  • Whenever this happens, whenever the orientation of space is inverted, the determinant will be negative.

  • The absolute value of the determinant, though, still tells you the factor by which areas have been scaled.

  • For example, the matrix with columns 1, 1 and 2, -1 encodes a transformation whose determinant is, I'll just tell you, -3.

  • And what this means is that space gets flipped over and areas are scaled by a factor of 3.
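
Writing that matrix out (its columns are 1, 1 and 2, -1) and asking NumPy for the determinant confirms the numbers quoted above; this is just a sanity-check sketch:

```python
import numpy as np

# Sketch: the matrix whose columns are (1, 1) and (2, -1).
flip = np.array([[1.0,  2.0],
                 [1.0, -1.0]])

d = np.linalg.det(flip)
print(d)        # -3.0: orientation is flipped...
print(abs(d))   #  3.0: ...and areas are scaled by a factor of 3
```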

  • So why would this idea of a negative area scaling factor be a natural way to describe orientation flipping?

  • Think about the series of transformations you get by slowly letting i-hat get closer and closer to j-hat. As i-hat gets closer, all the areas in space are getting squished more and more, meaning the determinant approaches 0.

  • Once i-hat lines up perfectly with j-hat, the determinant is 0.

  • Then, if i-hat continues the way it was going, doesn't it kind of feel natural for the determinant to keep decreasing into the negative numbers?

  • So, that is the understanding of determinants in 2 dimensions. What do you think it should mean for 3 dimensions?

  • It [the determinant of a 3x3 matrix] also tells you how much a transformation scales things, but this time, it tells you how much volumes get scaled.

  • Just as in 2 dimensions, where this is easiest to think about by focusing on one particular square with an area of 1 and watching only what happens to it, in 3 dimensions it helps to focus your attention on the specific 1-by-1-by-1 cube whose edges are resting on the basis vectors i-hat, j-hat, and k-hat.

  • After the transformation, that cube might get warped into some kind of slanty, slanty cube. This shape, by the way, has the best name ever: parallelepiped. A name made even more delightful when your professor has a nice thick Russian accent.

  • Since this cube starts out with a volume of 1, and the determinant gives the factor by which any volume is scaled, you can think of the determinant as simply being the volume of that parallelepiped that the cube turns into.
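
One way to see that numerically (a sketch with made-up columns, not an example from the video): the signed volume of the parallelepiped spanned by the three transformed basis vectors is the scalar triple product, and it agrees with the 3x3 determinant.

```python
import numpy as np

# Sketch: where i-hat, j-hat, and k-hat might land under some 3x3 transformation (as columns).
i_hat = np.array([1.0, 0.0, 1.0])
j_hat = np.array([0.0, 2.0, 0.0])
k_hat = np.array([1.0, 0.0, 3.0])

# Signed volume of the parallelepiped they span: the scalar triple product.
volume = np.dot(i_hat, np.cross(j_hat, k_hat))

# The same number, read off as the determinant of the matrix with those columns.
matrix = np.column_stack([i_hat, j_hat, k_hat])
print(volume, np.linalg.det(matrix))   # both 4.0 (up to floating point)
```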

  • A determinant of 0 would mean that all of space is squished onto something with 0 volume, meaning either a flat plane, a line, or, in the most extreme case, onto a single point.

  • Those of you who watched chapter 2 will recognize this as meaning that the columns of the matrix are linearly dependent. Can you see why?
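
To connect that back to chapter 2 with a concrete made-up example: below, the third column is the sum of the first two, so the columns are linearly dependent, space gets flattened, and the determinant is 0.

```python
import numpy as np

# Sketch: linearly dependent columns force a zero determinant.
c1 = np.array([1.0, 0.0, 2.0])
c2 = np.array([0.0, 1.0, 1.0])
c3 = c1 + c2                      # third column lies in the span of the first two

dependent = np.column_stack([c1, c2, c3])
print(np.linalg.det(dependent))   # 0.0 (up to floating-point round-off)
```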

  • What about negative determinants? What should that mean for 3 dimensions?

  • One way to describe orientation in 3D is with the right hand rule. Point the forefinger of your right hand in the direction of i-hat, stick out your middle finger in the direction of j-hat, and notice how, when you point your thumb up, it is in the direction of k-hat.

  • If you can still do that after the transformation, orientation has not changed, and the determinant is positive. Otherwise, if after the transformation it only makes sense to do that with your left hand, orientation has been flipped, and the determinant is negative.
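
For instance (my own illustration, not one from the video), the transformation that swaps where i-hat and j-hat land while leaving k-hat alone turns a right-handed frame into a left-handed one, and its determinant is -1:

```python
import numpy as np

# Sketch: swapping i-hat and j-hat (k-hat untouched) inverts orientation in 3D.
swap_ij = np.array([[0.0, 1.0, 0.0],   # i-hat lands where j-hat used to be
                    [1.0, 0.0, 0.0],   # j-hat lands where i-hat used to be
                    [0.0, 0.0, 1.0]])  # k-hat stays put

print(np.linalg.det(swap_ij))   # -1.0: volumes are preserved, but handedness flips
```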

  • So if you haven't seen it before, you are probably wondering by now, "How do you actually compute the determinant?"

  • For a 2-by-2 matrix with entries a, b, c, d, the formula is (a * d) - (b * c).

  • Here's part of an intuition for where this formula comes from. Let's say the terms b and c both happened to be 0. Then the term a tells you how much i-hat is stretched in the x direction, and the term d tells you how much j-hat is stretched in the y direction.

  • So, since those other terms are 0, it should make sense that a * d gives the area of the rectangle that our favorite unit square turns into, kind of like the 3, 0, 0, 2 example from earlier.

  • Even if only one of b or c is 0, you'll have a parallelogram with a base a and a height d, so the area should still be a times d.

  • Loosely speaking, if both b and c are non-zero, then that b * c term tells you how much this parallelogram is stretched or squished in the diagonal direction.

  • For those of you hungry for a more precise description of this b * c term, here's a helpful diagram if you would like to pause and ponder.
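
Here is that formula written as a tiny function and checked against the examples from earlier (a sketch; the entries a, b, c, d are read off the matrix row by row, so its columns are (a, c) and (b, d)):

```python
import numpy as np

def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a * d - b * c

# The earlier examples: the pure scaling, the shear, and the orientation-flipping matrix.
print(det2(3, 0, 0, 2))    #  6: areas scaled by 6
print(det2(1, 1, 0, 1))    #  1: the shear leaves areas alone
print(det2(1, 2, 1, -1))   # -3: flips orientation, scales areas by 3

# Cross-check one of them against NumPy.
print(np.linalg.det(np.array([[1.0, 2.0], [1.0, -1.0]])))   # -3.0
```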

  • Now, if you feel like computing determinants by hand is something that you need to know, the only way to get it down is to just practice it with a few. There's not really that much I can say or animate that is going to drill in the computation.

  • This is all triply true for three-dimensional determinants. There is a formula [for that], and if you feel like that is something you need to know, you should practice with a few matrices, or, you know, go watch Sal Khan work through a few.

  • Honestly though, I don't think those computations fall within the essence of linear algebra, but I definitely think that knowing what the determinant represents falls within that essence.

  • Here's kind of a fun question to think about before the next video: if you multiply 2 matrices together, the determinant of the resulting matrix is the same as the product of the determinants of the original two matrices.

  • If you tried to justify this with numbers, it would take a really long time, but see if you can explain why this makes sense in just one sentence.
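
A quick numeric spot check of that fact, using two of the matrices from this video (a sketch, and certainly not the one-sentence explanation being asked for):

```python
import numpy as np

# Sketch: det(M1 @ M2) equals det(M1) * det(M2).
m1 = np.array([[3.0, 0.0], [0.0, 2.0]])   # scales areas by 6
m2 = np.array([[1.0, 1.0], [0.0, 1.0]])   # the shear, determinant 1

print(np.linalg.det(m1 @ m2))                   # 6.0
print(np.linalg.det(m1) * np.linalg.det(m2))    # 6.0
```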

  • Next up, I'll be relating the idea of linear transformations covered so far to one of the areas where linear algebra is most useful: linear systems of equations.

  • See ya then!
