Computer Graphics CSC 630
Lecture 2: Linear Algebra

What is Computer Graphics?
• Using a computer to generate an image from a representation: Model → computer → Image

Model Representations
• How do we represent an object?
– Points
– Mathematical functions, e.g. x² + y² = r²
– Polygons (most commonly used): points plus their connectivity

Linear Algebra
• Why study linear algebra? It deals with the representations and operations commonly used in computer graphics.

What is a Matrix?
• A matrix is a set of elements organized into rows and columns; an m×n matrix has m rows and n columns:
  | a00 a01 |
  | a10 a11 |

What is a Vector?
• A vector is an n×1 matrix, e.g. v = [a b c]^T

Representing Points and Vectors
• A 3D point p = [a b c]^T represents a location with respect to some coordinate system.
• A 3D vector v = [a b c]^T represents a displacement from a position.

Basic Operations
• Transpose: swap rows with columns.
  M = | a b c |      M^T = | a d g |
      | d e f |            | b e h |
      | g h i |            | c f i |
  V = [x y z]^T, V^T = [x y z]

Vector Addition
• Given v = [x y z]^T and w = [a b c]^T: v + w = [x+a y+b z+c]^T
• Properties of vector addition:
– Commutative: v + w = w + v
– Associative: (u + v) + w = u + (v + w)
– Additive identity: v + 0 = v
– Additive inverse: v + (−v) = 0

Parallelogram Rule
• To visualize vector addition in 2D: place the tail of w at the head of v; v + w is the diagonal of the parallelogram spanned by v and w.

Vector Multiplication
• Given v = [x y z]^T and scalars s and t: sv = [sx sy sz]^T and tv = [tx ty tz]^T
• Properties of scalar multiplication:
– Associative: (st)v = s(tv)
– Multiplicative identity: 1v = v
– Scalar distribution: (s + t)v = sv + tv
– Vector distribution: s(v + w) = sv + sw

Vector Spaces
• A vector space consists of a set of elements, called vectors.
• The set is closed under vector addition and scalar multiplication.
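The vector operations above can be sketched in a few lines of Python (a minimal illustration with 3D vectors as plain lists; the helper names `vadd` and `smul` are ours, not from the lecture):

```python
# Minimal sketch of vector addition and scalar multiplication.
# 3D vectors are represented as Python lists; helper names are ours.

def vadd(v, w):
    """Component-wise vector addition: v + w."""
    return [vi + wi for vi, wi in zip(v, w)]

def smul(s, v):
    """Scalar multiplication: s * v."""
    return [s * vi for vi in v]

v = [1, 2, 3]
w = [4, 5, 6]

# Commutative: v + w == w + v
assert vadd(v, w) == vadd(w, v) == [5, 7, 9]

# Scalar distribution: (s + t) v == s v + t v
s, t = 2, 3
assert smul(s + t, v) == vadd(smul(s, v), smul(t, v))
```

Each property listed on the slide (commutativity, distribution, and so on) can be checked the same way with concrete vectors.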
Dot Product and Distances
• Given u = [x y z]^T and v = [a b c]^T: v·u = v^T u = ax + by + cz
• The Euclidean distance of u from the origin is denoted ||u|| and called the norm or length:
– ||u|| = sqrt(x² + y² + z²)
– Notice that ||u|| = sqrt(u·u)
– A unit vector has ||u|| = 1; the zero vector is denoted 0
• The Euclidean distance between u and v is sqrt((x−a)² + (y−b)² + (z−c)²), denoted ||u − v||

Properties of the Dot Product
• Given vectors u, v, w and a scalar s:
– The result of a dot product is a SCALAR value
– Commutative: v·w = w·v
– Non-degenerate: v·v = 0 only when v = 0

Angles and Projection
• Alternative view of the dot product: v·w = ||v|| ||w|| cos θ, where θ is the angle between v and w
• If v and w have length 1: when v ⊥ w, v·w = 0; when v = w, v·w = 1; in general v·w = cos θ
• If v is a unit vector (||v|| = 1) and we perpendicularly project w onto v, calling the projected vector u, then ||u|| = v·w

Cross Product Properties
• The cross product c of v and w is denoted v×w
• c is a VECTOR, perpendicular to the plane defined by v and w
• The magnitude of c is proportional to the sine of the angle between v and w
• The direction of c follows the right-hand rule.
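The dot product, norm, and angle formulas above can be sketched directly (a minimal illustration; the helper names `dot`, `norm`, and `angle` are ours):

```python
import math

def dot(u, v):
    """Dot product: returns a SCALAR."""
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    """Euclidean length: ||u|| = sqrt(u . u)."""
    return math.sqrt(dot(u, u))

def angle(v, w):
    """Angle between v and w, from v.w = ||v|| ||w|| cos(theta)."""
    return math.acos(dot(v, w) / (norm(v) * norm(w)))

# Norm of a 3-4-5 vector
assert norm([3, 4, 0]) == 5.0

# Perpendicular vectors: dot product is 0, angle is pi/2
assert dot([1, 0, 0], [0, 1, 0]) == 0
assert abs(angle([1, 0, 0], [0, 1, 0]) - math.pi / 2) < 1e-12

# Projection length of w onto the unit vector v equals v . w
v, w = [1, 0, 0], [2, 5, 0]
assert dot(v, w) == 2
```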
• ||v×w|| = ||v|| ||w|| |sin θ|, where θ is the angle between v and w
• v×w = −(w×v)

Cross Product
• Given two vectors v = [v1 v2 v3] and w = [w1 w2 w3], the cross product is defined as the determinant of
  | i  j  k  |
  | v1 v2 v3 |
  | w1 w2 w3 |
  = [v2w3 − v3w2, v3w1 − v1w3, v1w2 − v2w1]
  where i, j, k are the unit basis vectors

Matrices
• A compact way of representing operations on points and vectors
• A 3×3 matrix A looks like
  | a11 a12 a13 |
  | a21 a22 a23 |
  | a31 a32 a33 |
  where aij refers to the element of matrix A in the ith row and jth column

Matrix Addition
• Just add corresponding elements:
  | a b |   | e f |   | a+e b+f |
  | c d | + | g h | = | c+g d+h |

Matrix Multiplication
• If A is an n×k matrix and B is k×p, then AB is an n×p matrix with entries cij = Σs ais bsj
• Alternatively, taking the rows of A and the columns of B as individual vectors, cij = Ai·Bj, where the subscripts refer to the row and column, respectively
• Multiply each row by each column:
  | a b | | e f |   | ae+bg af+bh |
  | c d | | g h | = | ce+dg cf+dh |

Matrix Multiplication Properties
• Associative: (AB)C = A(BC)
• Distributive: A(B+C) = AB + AC
• Multiplicative identity: I = diag(1) (defined only for square matrices); AI = A
  I = | 1 0 0 |
      | 0 1 0 |
      | 0 0 1 |
• NOT commutative: AB ≠ BA

Matrix Inverse
• If A and B are n×n matrices and AB = BA = I, then B is the inverse of A, denoted A^-1
• (AB)^-1 = B^-1 A^-1; the same applies to the transpose: (AB)^T = B^T A^T
• (M^T)^-1 = (M^-1)^T

Determinant of a Matrix
• Defined for a square (n×n) matrix; used for inversion
• If det A = 0, then A has no inverse
• For A = | a b |, det(A) = ad − bc
          | c d |
• For an n×n matrix, det A = |A| = Σi a1i (−1)^(1+i) |A1i| for i = 1..n, where |A1i| is the determinant of the (n−1)×(n−1) submatrix obtained by deleting the first row and the ith column of A

Transformations
• Why use transformations?
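The cross-product expansion and the cofactor formula for the determinant can be sketched as follows (a minimal illustration; the helper names are ours):

```python
def cross(v, w):
    """Cross product v x w via the determinant expansion above."""
    return [v[1] * w[2] - v[2] * w[1],
            v[2] * w[0] - v[0] * w[2],
            v[0] * w[1] - v[1] * w[0]]

def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for i in range(n):
        # Submatrix with the first row and the i-th column deleted
        minor = [row[:i] + row[i + 1:] for row in A[1:]]
        total += A[0][i] * (-1) ** i * det(minor)
    return total

v, w = [1, 2, 3], [4, 5, 6]
c = cross(v, w)
assert c == [-3, 6, -3]
# Anti-commutativity: v x w == -(w x v)
assert cross(w, v) == [3, -6, 3]

# 2x2 base case matches det = ad - bc
assert det([[1, 2], [3, 4]]) == -2
```

With `(-1) ** i` for 0-based `i`, the sign matches the slide's `(−1)^(1+i)` for 1-based `i`.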
– Create objects in convenient coordinates
– Reuse a basic shape multiple times
– Hierarchical modeling
– System independence
– Virtual cameras

Translation
• [x' y' z']^T = T(tx, ty, tz) [x y z]^T = [x y z]^T + [tx ty tz]^T = [x+tx, y+ty, z+tz]^T

Properties of Translation
• T(0, 0, 0) v = v
• T(sx, sy, sz) T(tx, ty, tz) v = T(sx+tx, sy+ty, sz+tz) v
• T(sx, sy, sz) T(tx, ty, tz) v = T(tx, ty, tz) T(sx, sy, sz) v
• T^-1(tx, ty, tz) v = T(−tx, −ty, −tz) v

Rotations (2D)
• Write the point in polar form: x = r cos φ, y = r sin φ
• Rotating (x, y) by θ gives (x', y'):
  x' = r cos(φ + θ), y' = r sin(φ + θ)
• Using cos(φ + θ) = cos φ cos θ − sin φ sin θ and sin(φ + θ) = cos φ sin θ + sin φ cos θ:
  x' = (r cos φ) cos θ − (r sin φ) sin θ = x cos θ − y sin θ
  y' = (r cos φ) sin θ + (r sin φ) cos θ = x sin θ + y cos θ

Rotations (2D)
• So, in matrix notation:
  | x' |   | cos θ  −sin θ | | x |
  | y' | = | sin θ   cos θ | | y |

Rotations (3D)
• Rx(θ) = | 1   0       0     |
          | 0   cos θ  −sin θ |
          | 0   sin θ   cos θ |
• Ry(θ) = |  cos θ  0  sin θ |
          |  0      1  0     |
          | −sin θ  0  cos θ |
• Rz(θ) = | cos θ  −sin θ  0 |
          | sin θ   cos θ  0 |
          | 0       0      1 |

Properties of Rotations
• Ra(0) = I
• Ra(θ) Ra(φ) = Ra(θ + φ)
• Ra(θ) Ra(φ) = Ra(φ) Ra(θ) (rotations about the same axis commute)
• Ra^-1(θ) = Ra(−θ) = Ra^T(θ)
• Ra(θ) Rb(φ) ≠ Rb(φ) Ra(θ) — order matters!
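The 2D rotation matrix and its properties can be checked numerically (a minimal sketch; the helper names `rot2`, `matvec`, and `matmul` are ours):

```python
import math

def rot2(theta):
    """2D rotation matrix [[cos, -sin], [sin, cos]]."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matvec(M, v):
    """Apply matrix M to column vector v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def matmul(A, B):
    """Matrix product: c_ij = sum_k a_ik b_kj."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# Rotating (1, 0) by 90 degrees gives (0, 1)
x, y = matvec(rot2(math.pi / 2), [1, 0])
assert abs(x) < 1e-12 and abs(y - 1) < 1e-12

# Same-axis composition: R(theta) R(phi) == R(theta + phi)
A = matmul(rot2(0.3), rot2(0.4))
B = rot2(0.7)
assert all(abs(A[i][j] - B[i][j]) < 1e-12 for i in range(2) for j in range(2))
```

The transpose-as-inverse property (Ra^-1 = Ra^T) can be verified the same way by multiplying `rot2(t)` with its transpose and comparing against the identity.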
Combining Translation & Rotation
• T(1, 1) R(45) ≠ R(45) T(1, 1)
• Translate then rotate: v' = v + T, v'' = Rv' = R(v + T) = Rv + RT
• Rotate then translate: v' = Rv, v'' = v' + T = Rv + T

Scaling
• [x' y' z']^T = [sx·x, sy·y, sz·z]^T
• S(sx, sy, sz) = | sx  0   0  |
                  | 0   sy  0  |
                  | 0   0   sz |
• Uniform scaling iff sx = sy = sz

Homogeneous Coordinates
• [x y z]^T can be represented as [X Y Z w]^T, where x = X/w, y = Y/w, z = Z/w

Translation Revisited
• T(tx, ty, tz) [x y z 1]^T = | 1 0 0 tx | | x |
                              | 0 1 0 ty | | y |
                              | 0 0 1 tz | | z |
                              | 0 0 0 1  | | 1 |

Rotation & Scaling Revisited
• Rx(θ) [x y z 1]^T = | 1  0       0      0 | | x |
                      | 0  cos θ  −sin θ  0 | | y |
                      | 0  sin θ   cos θ  0 | | z |
                      | 0  0       0      1 | | 1 |
• S(sx, sy, sz) [x y z 1]^T = | sx 0  0  0 | | x |
                              | 0  sy 0  0 | | y |
                              | 0  0  sz 0 | | z |
                              | 0  0  0  1 | | 1 |

Combining Transformations
• v' = Sv; v'' = Rv' = RSv; v''' = Tv'' = TRSv
• So v''' = Mv, where M = TRS

Transforming Tangents
• A tangent t = p − q transforms as t' = p' − q' = Mp − Mq = M(p − q) = Mt

Transforming Normals
• A normal n satisfies n^T t = 0; after transforming t we require n'^T t' = 0, i.e. n'^T Mt = 0
• Matching n'^T Mt = n^T t gives n'^T M = n^T, so M^T n' = n
• Therefore n' = (M^T)^-1 n = (M^-1)^T n
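With homogeneous coordinates, translation becomes a matrix product and composes with rotation by matrix multiplication, so the order sensitivity above can be demonstrated concretely (a minimal sketch; the helper names `translate`, `rot_z`, `matmul`, and `apply` are ours):

```python
import math

def translate(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

def rot_z(theta):
    """4x4 homogeneous rotation about the z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0],
            [s,  c, 0, 0],
            [0,  0, 1, 0],
            [0,  0, 0, 1]]

def matmul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(M, p):
    """Apply M to a homogeneous point [x, y, z, 1]."""
    return [sum(M[i][j] * p[j] for j in range(4)) for i in range(4)]

# Translation as a matrix product
assert apply(translate(1, 2, 3), [0, 0, 0, 1]) == [1, 2, 3, 1]

# Order matters: TR applies R first, then T; RT the reverse
TR = matmul(translate(1, 0, 0), rot_z(math.pi / 2))
RT = matmul(rot_z(math.pi / 2), translate(1, 0, 0))
p = [1, 0, 0, 1]
q = apply(TR, p)  # rotate (1,0,0) to (0,1,0), then translate to (1,1,0)
assert abs(q[0] - 1) < 1e-12 and abs(q[1] - 1) < 1e-12
r = apply(RT, p)  # translate to (2,0,0), then rotate to (0,2,0)
assert abs(r[0]) < 1e-12 and abs(r[1] - 2) < 1e-12
```

For a rigid M = TR the rotation block is orthogonal, so the normal rule n' = (M^-1)^T n reduces to rotating the normal; the inverse-transpose only differs from M itself once non-uniform scaling enters.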