Prelude
  • A pattern of activation in a NN is a vector
  • A set of connection weights between units is a matrix
  • Vectors and matrices have well-understood mathematical and geometric properties
  • Very useful for understanding the properties of NNs
Outline
  • The Players: Scalars, Vectors and Matrices
  • Vectors, matrices and neural nets
  • Geometric Analysis of Vectors
  • Multiplying Vectors by Scalars
  • Multiplying Vectors by Vectors
    • The inner product (produces a scalar)
    • The outer product (produces a matrix)
  • Multiplying Vectors by Matrices
  • Multiplying Matrices by Matrices
Scalars, Vectors and Matrices
  • Scalar: A single number (integer or real)
  • Vector: An ordered list of scalars

Row vectors:

[ 1 2 3 4 5 ]   [ 0.4 1.2 0.07 8.4 12.3 ]   [ 12 10 ]   [ 2 ]

Order matters: [ 12 10 ] ≠ [ 10 12 ]

Column vectors (written vertically):

  1        1.5
  2        0.3
  3        6.2
  4       12.0
  5       17.1

Scalars, Vectors and Matrices
  • Scalar: A single number (integer or real)
  • Vector: An ordered list of scalars
  • Matrix: An ordered list of vectors:

      1 2 6 1 7 8
M =   2 5 9 0 0 3
      3 1 5 7 6 3
      2 7 9 3 3 1

Each row of M is a row vector; each column of M is a column vector.

Matrices are indexed (row, column):

M(1,3) = 6 (row 1, column 3)

M(3,1) = 3 (row 3, column 1)
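
As a quick illustration (not part of the original slides), here is how the same matrix and its (row, column) indexing might look in Matlab/Octave; the variable name M simply follows the slide:

  % Build the matrix used above: 4 rows, 6 columns
  M = [ 1 2 6 1 7 8 ;
        2 5 9 0 0 3 ;
        3 1 5 7 6 3 ;
        2 7 9 3 3 1 ];

  M(1,3)    % row 1, column 3 -> 6
  M(3,1)    % row 3, column 1 -> 3
  size(M)   % -> [4 6], i.e. (rows, columns)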

Variable Naming Conventions
  • Scalars: Lowercase, italics

x, y, z…

  • Vectors: Lowercase, bold

u, v, w…

  • Matrices: Uppercase, bold

M, N, O …

  • Constants: Greek

α, β, γ, δ, ε, …

Transposing Vectors

If u is a row vector…

u = [ 1 2 3 4 5 ]

…then u’ (“u-transpose”) is a column vector:

       1
       2
u’ =   3
       4
       5

… and vice-versa.

You: Why in the world would I care??

Answer: It’ll matter when we come to vector multiplication.

You: OK.
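
A minimal Matlab sketch of the same idea (the apostrophe is Matlab's transpose operator, matching the u’ notation on the slide):

  u = [ 1 2 3 4 5 ];   % a 1 x 5 row vector
  u'                   % transpose: a 5 x 1 column vector
  (u')'                % transposing twice gives back the row vector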

Vectors, Matrices & Neural Nets

[Figure: a two-layer network with input units j1, j2, j3 fully connected to output units i1, i2 by connection weights wij]

The activations of the input nodes can be represented as a 3-dimensional vector:

j = [ 0.2 0.9 0.5 ]

The activations of the output nodes can be represented as a 2-dimensional vector:

i = [ 1.0 0.0 ]

The weights leading into any output node can be represented as a 3-dimensional vector:

w1j = [ 0.1 1.0 0.2 ]

The complete set of weights can be represented as a 2 (row) X 3 (column) matrix:

W =   0.1 1.0  0.2
      1.0 0.1 -0.9

You: Why in the world would I care??

  • Because the mathematics of vectors and matrices is well-understood.
  • Because vectors have a very useful geometric interpretation.
  • Because Matlab “thinks” in vectors and matrices.
  • Because you are going to have to learn to think in Matlab.

You: OK.
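
As a sketch of how this little network might be written down in Matlab (the variable names j, i and W simply follow the slide; note that i and j also name Matlab's imaginary unit, so a real script would usually pick other names):

  j = [ 0.2 0.9 0.5 ];      % input activations (3-dimensional vector)
  i = [ 1.0 0.0 ];          % output activations (2-dimensional vector)
  W = [ 0.1 1.0  0.2 ;      % weights: row 1 = weights into output unit 1
        1.0 0.1 -0.9 ];     %          row 2 = weights into output unit 2
  size(W)                   % -> [2 3]: 2 (row) x 3 (column)
  W(1,:)                    % weights leading into output node 1: [0.1 1.0 0.2]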

Geometric Analysis of Vectors

Dimensionality: The number of numbers in a vector. An n-dimensional vector can be pictured as a point (or an arrow from the origin) in an n-dimensional space.

Geometric Analysis of Vectors
  • Implications for neural networks
  • Auto-associative nets
    • State of activation at time t is a vector (a point in a space)
    • As activations change, vector moves through that space
    • Will prove invaluable in understanding Hopfield nets
  • Layered nets (“perceptrons”)
    • Input vectors activate output vectors
    • Points in input space map to points in output space
    • Will prove invaluable in understanding perceptrons and back-propagation learning
Multiplying a Vector by a Scalar

[ 5 4 ] * 2 = [ 10 8 ]

Lengthens the vector but does not change its orientation.

Adding a Vector to a Scalar

[ 5 4 ] + 2 = NAN

Is illegal: a scalar cannot be added to a vector.

Adding a Vector to a Vector

[ 5 4 ] + [ 3 6 ] = [ 8 10 ]

Add corresponding elements. Geometrically, the two vectors and their sum form a parallelogram.
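
The same operations as a small Matlab sketch (note that Matlab itself will evaluate vector + scalar by adding the scalar to every element, even though the operation is undefined in strict linear-algebra terms):

  [ 5 4 ] * 2          % -> [ 10 8 ]   (scales the length, not the orientation)
  [ 5 4 ] + [ 3 6 ]    % -> [ 8 10 ]   (add corresponding elements)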

Multiplying a Vector by a Vector 1: The Inner Product (aka “Dot Product”)

If u and v are both row vectors of the same dimensionality…

u = [ 1 2 3 ]

v = [ 4 5 6 ]

… then the product u · v is undefined.

You: Huh?? Why?? That’s BS!

I told you you’d eventually care about transposing vectors…

  • The Mantra: “Rows by Columns”
  • Multiply rows (or row vectors) by columns (or column vectors)

So transpose v into a column vector and multiply the row by the column:

                     4
u = [ 1 2 3 ]   v’ = 5
                     6

Imagine rotating your row vector into a (pseudo) column vector, then multiply corresponding elements and add up the products:

(1 * 4) + (2 * 5) + (3 * 6) = 4 + 10 + 18 = 32

u · v’ = 32

  • The inner product is commutative as long as you transpose correctly:

u · v’ = 32

v · u’ = 32
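
A Matlab sketch of the inner product, following the rows-by-columns mantra:

  u = [ 1 2 3 ];
  v = [ 4 5 6 ];
  u * v'        % row times column -> 32
  v * u'        % commutative, as long as you transpose correctly -> 32
  dot(u, v)     % Matlab's built-in dot product, also 32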

The Inner (“Dot”) Product
  • In scalar notation, the inner product is a sum of element-wise products: u · v’ = Σk uk vk
  • Remind you of anything? It’s the net input to a unit: ni = Σj wij aj
  • In vector notation: ni = wi · a’ (see the quick check below)
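
The two notations give the same number; a quick Matlab check (.* is element-wise multiplication):

  u = [ 1 2 3 ];
  v = [ 4 5 6 ];
  sum(u .* v)   % scalar notation: multiply corresponding elements, then sum -> 32
  u * v'        % vector notation: the same inner product -> 32
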
What Does the Dot Product “Mean”?
  • Consider u·u’

u = [ 3, 4 ]

u·u’ = (3 * 3) + (4 * 4) = 9 + 16 = 25

Geometrically, u is the hypotenuse of a right triangle with legs 3 and 4, so its length is sqrt(u·u’) = sqrt(25) = 5.

The dot product of a vector with itself is the squared length of that vector. True for vectors of any dimensionality.
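
A Matlab sketch of the length-from-dot-product relation (norm is Matlab's built-in vector length):

  u = [ 3 4 ];
  u * u'          % -> 25
  sqrt(u * u')    % -> 5, the length of u
  norm(u)         % -> 5, the same thing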

What Does the Dot Product “Mean”?
  • What about u·v’ where u ≠ v?

Well…

u·v’ = ||u|| ||v|| cos(θuv)

… and cos(θuv) is a length-invariant measure of the similarity of u and v.

Example: u = [ 1, 0 ], v = [ 1, 1 ]

u·v’ = (1 * 1) + (1 * 0) = 1

||u|| = sqrt(1) = 1

||v|| = sqrt(2) = 1.414

θuv = 45º; cos(θuv) = .707

More examples, keeping u = [ 1, 0 ]:

v = [ 0, 1 ]:   θuv = 90º;  cos(θuv) = 0

v = [ 0, -1 ]:  θuv = 270º; cos(θuv) = 0

v = [ -1, 0 ]:  θuv = 180º; cos(θuv) = -1

v = [ 2.2, 0 ]: θuv = 0º;   cos(θuv) = 1

In general, cos(θuv) ∈ [ -1 … 1 ], regardless of dimensionality.

What Does the Dot Product “Mean”?

To see why, consider the cosine expressed in scalar notation…

cos(θuv) = Σk uk vk / ( sqrt(Σk uk²) * sqrt(Σk vk²) )

… and compare it to the equation for the correlation coefficient: if u and v have means of zero, then cos(θuv) = r(u,v).

The cosine is a special case of the correlation coefficient!

What Does the Dot Product “Mean”?

… and now let’s compare the cosine to the dot product: if u and v have lengths of 1, then the dot product is equal to the cosine.

The dot product is a special case of the cosine, which is a special case of the correlation coefficient, which is a measure of vector similarity!
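
A Matlab sketch of the cosine as a length-invariant similarity measure, using the u = [1 0], v = [1 1] example from above:

  u = [ 1 0 ];
  v = [ 1 1 ];
  (u * v') / (norm(u) * norm(v))   % -> 0.7071, i.e. the cosine of 45 degrees
  un = u / norm(u);                % normalize both vectors to length 1...
  vn = v / norm(v);
  un * vn'                         % ...and the plain dot product equals the cosine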

What Does the Dot Product “Mean”?
  • The most common input rule is a dot product between unit i’s vector of weights and the activation vector on the other end
  • Such a unit is computing the (biased) similarity between what it expects (wi) and what it’s getting (a)
  • Its activation ai is a positive function of this similarity (the net input ni): for example an asymptotic, step (BTU), or logistic activation function, as sketched below
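
A sketch of a single unit's computation under these assumptions, with a hypothetical weight vector wi and activation vector a, and a logistic squashing function:

  wi = [ 0.1 1.0 0.2 ];        % what the unit "expects" (its weights)
  a  = [ 0.2 0.9 0.5 ];        % what the unit "gets" (incoming activations)
  ni = wi * a';                % net input: a dot product, i.e. a (biased) similarity
  ai = 1 / (1 + exp(-ni))      % activation: a positive (here logistic) function of ni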

Multiplying a Vector by a Vector 2: The Outer Product

The two vectors need not have the same dimensionality.

Same Mantra: Rows by Columns.

This time, multiply a column vector by a row vector:

M = u’ * v

u = [ 1 2 ]   v = [ 4 5 6 ]

u’ =   1
       2

Row i of u’ times column j of v goes into row i, column j of M:

M =   4  5  6
      8 10 12

The outer product is not exactly commutative:

            4  5  6
u’ * v =    8 10 12       (a 2 X 3 matrix)

            4  8
v’ * u =    5 10          (a 3 X 2 matrix, the transpose of u’ * v)
            6 12
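
The outer product in Matlab (column times row), using the vectors from the slide:

  u = [ 1 2 ];
  v = [ 4 5 6 ];
  u' * v     % 2 x 3 matrix: [ 4 5 6 ; 8 10 12 ]
  v' * u     % 3 x 2 matrix: [ 4 8 ; 5 10 ; 6 12 ], the transpose of u' * v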

Multiplying a Vector by a Matrix
  • Same Mantra: Rows by Columns

A row vector:

[ .2 .6 .3 .7 .9 .4 .3 ]

A matrix:

.3 .4 .8 .1 .2 .3
.5 .2  0 .1 .5 .2
.1 .1 .9 .2 .5 .3
.2 .4 .1 .7 .8 .5
.9 .9 .2 .5 .3 .5
.4 .1 .2 .7 .8 .2
.1 .2 .2 .5 .7 .2

Multiply rows by columns. Each such multiplication is a simple dot product: make a proxy column vector out of the row vector, then compute its dot product with each column of the matrix in turn. Each dot product fills in one element of the result:

[ 1.5 1.4 0.8 1.5 1.9 1.2 ]

The result is a row vector with as many columns (dimensions) as the matrix (not the vector):

7-dimensional vector * 7 (rows) X 6 (columns) matrix = 6-dimensional vector

NOT Commutative!
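
A Matlab sketch of the same vector-by-matrix product (the slide rounds the results to one decimal place):

  v = [ .2 .6 .3 .7 .9 .4 .3 ];      % 1 x 7 row vector
  M = [ .3 .4 .8 .1 .2 .3 ;
        .5 .2  0 .1 .5 .2 ;
        .1 .1 .9 .2 .5 .3 ;
        .2 .4 .1 .7 .8 .5 ;
        .9 .9 .2 .5 .3 .5 ;
        .4 .1 .2 .7 .8 .2 ;
        .1 .2 .2 .5 .7 .2 ];         % 7 x 6 matrix
  v * M                              % 1 x 6 row vector, approx [1.5 1.4 0.8 1.5 1.9 1.2]
  % M * v would be an error: the inner dimensions do not match, so the product is NOT commutative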

Multiplying a Matrix by a Matrix
  • The Same Mantra: Rows by Columns

A 2 X 3 matrix:       A 3 X 2 matrix:

1 2 3                 1 2
1 2 3                 1 2
                      1 2

Each row of the first matrix times each column of the second is a simple dot product, and the result of row i times column j goes into row i, column j of a new matrix:

Row 1 X Column 1 = 6    (goes into row 1, column 1)
Row 1 X Column 2 = 12   (goes into row 1, column 2)
Row 2 X Column 1 = 6    (goes into row 2, column 1)
Row 2 X Column 2 = 12   (goes into row 2, column 2)

So…

1 2 3     1 2
1 2 3  *  1 2   =   6 12
          1 2       6 12

A 2 X 3 matrix * A 3 X 2 matrix = A 2 X 2 matrix

The result has the same number of rows as the first matrix…

…and the same number of columns as the second.

…and the number of columns in the first matrix…

…must be equal to the number of rows in the second.

This is basic (default) matrix multiplication.

There’s other more complicated stuff, too.

You (probably) won’t need it for this class.
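
And the matrix-by-matrix example from the last few slides, as a Matlab sketch:

  A = [ 1 2 3 ;
        1 2 3 ];        % a 2 x 3 matrix
  B = [ 1 2 ;
        1 2 ;
        1 2 ];          % a 3 x 2 matrix
  A * B                 % -> [ 6 12 ; 6 12 ], a 2 x 2 matrix
  B * A                 % -> a 3 x 3 matrix (again, not commutative)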