slide1

Reduction between Transitive Closure &

Boolean Matrix Multiplication

Presented by Rotem Mairon

slide2

Overview

The speed-up of 4-Russians for matrix multiplication

A divide & conquer approach for matrix multiplication: Strassen’s method

Reduction between TC and BMM

slide3

The speedup of 4-Russians for matrix multiplication

A basic observation

  • Consider two boolean matrices, A and B, of small dimensions.
  • The boolean multiplication of row Ai by column Bj is defined by: Ai·Bj = (Ai1 ∧ B1j) ∨ (Ai2 ∧ B2j) ∨ … ∨ (Ain ∧ Bnj).
  • The naïve boolean multiplication is done bit after bit. This requires O(n) steps per entry.
  • How can this be improved with pre-processing?

[Figure: multiplying a row Ai by a column Bj for n = 4]

slide4

The speedup of 4-Russians for matrix multiplication

A basic observation

  • Each row Ai and column Bj forms a pair of 4-bit binary numbers.
  • These binary numbers can be regarded as indices into a table of size 2^4 x 2^4.
  • For each entry in the table, we pre-store the value of multiplying the two indices.
  • The multiplication of Ai by Bj can then be computed in O(1) time.
  • Problem: a 2^n x 2^n table is not practical for large matrix multiplication.

[Figure: for n = 4, the row Ai is read as the index 6 and the column Bj as the index 4]

slide5

The speedup of 4-Russians for matrix multiplication

The speedup

  • Instead of regarding a complete row/col as an index to the table, consider only part of it.
  • Now, we pre-compute multiplication values for pairs of binary vectors of size k in a table of size 2^k x 2^k.

slide7

The speedup of 4-Russians for matrix multiplication

The speedup

  • Let k = log2(n). Then all pairs of k-bit binary vectors can be represented in a table of size 2^k x 2^k = n x n.
  • Time required for multiplying Ai by Bj: O(n/log n) table lookups.
  • Total time required: O(n^3/log n) instead of O(n^3), as in the sketch below.
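
As a rough illustration of this speedup, here is a minimal Python sketch of the Four-Russians lookup idea; the names (four_russians_bmm, pack) and the exact chunking are illustrative assumptions, not taken from the slides.

```python
import math

def four_russians_bmm(A, B):
    """Boolean product C = A*B of two n x n 0/1 matrices (lists of lists)."""
    n = len(A)
    k = max(1, int(math.log2(n)))            # chunk width, roughly log2(n)

    def pack(bits):                           # pack up to k bits into one integer
        v = 0
        for b in bits:
            v = (v << 1) | b
        return v

    # Pre-store the product of every pair of k-bit vectors:
    # table[x][y] = 1 iff x and y share a 1 in some common position.
    size = 1 << k
    table = [[1 if (x & y) else 0 for y in range(size)] for x in range(size)]

    # Pack each row of A and each column of B into ceil(n/k) chunks.
    starts = range(0, n, k)
    packed_rows = [[pack(A[i][s:s + k]) for s in starts] for i in range(n)]
    packed_cols = [[pack([B[t][j] for t in range(s, min(s + k, n))]) for s in starts]
                   for j in range(n)]

    # Each entry of C now costs O(n/k) table lookups instead of O(n) bit operations.
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            C[i][j] = int(any(table[a][b]
                              for a, b in zip(packed_rows[i], packed_cols[j])))
    return C
```

With k ≈ log2(n) the table has about n x n entries, and each of the n^2 output entries takes O(n/log n) lookups, matching the O(n^3/log n) bound above.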


slide8

Overview

The method of 4-Russians for matrix multiplication

A divide & conquer approach for matrix multiplication: Strassen’s method

Reduction between TC and BMM

slide9

Strassen’s method for matrix multiplication

A divide and conquer approach

  • Divide each nxn matrix into four sub-matrices of size (n/2)x(n/2), so that C11 = A11B11 + A12B21, C12 = A11B12 + A12B22, C21 = A21B11 + A22B21, C22 = A21B12 + A22B22.
  • Computing all of C11, C12, C21, C22 requires 8 multiplications and 4 additions of (n/2)x(n/2) matrices.
  • Therefore, the total running time is T(n) = 8T(n/2) + O(n^2).
  • Using the Master Theorem, this solves to O(n^3). Still cubic!

Can we do better with a straightforward divide and conquer approach?


slide10

Strassen’s method for matrix multiplication

Strassen’s algorithm

  • Define seven matrices of size (n/2)x(n/2):
    M1 = (A11 + A22)(B11 + B22), M2 = (A21 + A22)B11, M3 = A11(B12 - B22), M4 = A22(B21 - B11), M5 = (A11 + A12)B22, M6 = (A21 - A11)(B11 + B12), M7 = (A12 - A22)(B21 + B22).
  • The four (n/2)x(n/2) blocks of C can be defined in terms of M1,…,M7:
    C11 = M1 + M4 - M5 + M7, C12 = M3 + M5, C21 = M2 + M4, C22 = M1 - M2 + M3 + M6.


slide11

Strassen’s method for matrix multiplication

Strassen’s algorithm

Running time? Each matrix Mi requires a constant number of additions and subtractions but only one multiplication of (n/2)x(n/2) matrices: T(n) = 7T(n/2) + O(n^2), which solves to O(n^log2(7)) ≈ O(n^2.81). A short sketch follows.
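
The following Python/NumPy sketch (assuming n is a power of two; the function name strassen and the lack of a cutoff to the naïve product for small n are simplifications) shows how the seven products are combined:

```python
import numpy as np

def strassen(A, B):
    """Strassen's recursion for n x n matrices, n a power of two."""
    n = A.shape[0]
    if n == 1:
        return A * B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]

    # Seven recursive products instead of eight.
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)

    # Recombine into the four blocks of C.
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C
```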


slide12

Strassen’s method for matrix multiplication

Improvements

First to break the 2.5 barrier: the Coppersmith-Winograd algorithm, with exponent about 2.496.


slide13

Best choices for matrix multiplication

  • Using the exact formulas for time complexity, crossover points have been found for square matrices:
    • For n<7, the naïve algorithm for matrix multiplication is preferred. As an example, a 6x6 matrix requires 482 steps for the method of 4-Russians, but 468 steps for the naïve multiplication.
    • For 6<n<513, the method of 4-Russians is most efficient.
    • For 512<n, Strassen’s approach costs the least number of steps.


slide14

Overview

The method of 4-Russians for matrix multiplication

A divide & conquer approach for matrix multiplication: Strassen’s method

Reduction between TC and BMM

slide15

Realization of matrix multiplication in graphs

Let A,B be adjacency matrices of two graphs over the same set of vertices {1,2,…,n}

  • An (A,B)-path is a path of length two whose first edge belongs to A and whose second edge belongs to B.
  • C(i,j) = 1 if and only if there is an (A,B)-path from vertex i to vertex j. Therefore, the product C = AB is the adjacency matrix with respect to (A,B)-paths, as sketched below.
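
As a minimal sketch of this view (the function name naive_bmm is illustrative), C(i,j) is set whenever some intermediate vertex k provides an (A,B)-path i -> k -> j:

```python
def naive_bmm(A, B):
    """Boolean product: C[i][j] = 1 iff there is an (A,B)-path i -> k -> j."""
    n = len(A)
    return [[int(any(A[i][k] and B[k][j] for k in range(n)))
             for j in range(n)]
            for i in range(n)]
```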


slide16

Transitive Closure by Matrix Multiplication

Definition and a cubic solution

Given a directed graph G=(V,E), the transitive closure of G is defined as the graph

G*=(V,E*) where E*={(i,j) : there is a path from vertex i to vertex j}.

  • A dynamic programming algorithm has been devised for this problem: the Floyd-Warshall algorithm, whose boolean update rule is T(i,j) = T(i,j) ∨ (T(i,k) ∧ T(k,j)) for every intermediate vertex k.
  • It requires O(n^3) time (see the sketch below). Could it be beaten?
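
For reference, a straightforward Python sketch of the cubic dynamic-programming closure (a boolean variant in the spirit of Floyd-Warshall; the name is illustrative):

```python
def transitive_closure_fw(adj):
    """Boolean transitive closure of an n x n 0/1 adjacency matrix in O(n^3)."""
    n = len(adj)
    T = [row[:] for row in adj]
    for i in range(n):
        T[i][i] = 1                      # every vertex reaches itself
    for k in range(n):                   # allow k as an intermediate vertex
        for i in range(n):
            if T[i][k]:
                for j in range(n):
                    if T[k][j]:
                        T[i][j] = 1      # path i -> ... -> k -> ... -> j
    return T
```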


slide17

Transitive Closure by Matrix Multiplication

Beating the cubic solution

[Figure: a 4-vertex example graph, its boolean adjacency matrix, and the matrices obtained by repeated squaring]

By squaring the matrix, we get (i,j)=1 iff we can get from i to j in exactly two steps.

How could we make (i,j) equal 1 iff there’s a path from i to j in at most 2 steps?

By storing 1’s in all diagonal entries.

What about (i,j)=1 iff there’s a path from i to j in at most 4 steps?

Keep squaring.


slide18

Transitive Closure by Matrix Multiplication

Beating the cubic solution

In our example, the longest path had 4 vertices, so 2 squarings were required.

How many multiplications are required in the general case? log2(n).

  • The transitive closure can be obtained in O(n^2.37 log2(n)) time:
    • Square the matrix log2(n) times (as sketched below).
    • Each multiplication requires O(n^2.37) steps using fast matrix multiplication.
  • Better still: we can get rid of the log2(n) factor.
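
A short Python sketch of this squaring scheme; here bmm stands for any boolean matrix multiplication routine (for example, the earlier sketches), and the function name is an assumption:

```python
def closure_by_squaring(adj, bmm):
    """Transitive closure via O(log n) boolean squarings of (adj + I)."""
    n = len(adj)
    T = [row[:] for row in adj]
    for i in range(n):
        T[i][i] = 1                  # paths of length at most 1 (including staying put)
    length = 1
    while length < n:
        T = bmm(T, T)                # now T[i][j] = 1 iff j is reachable in <= 2*length steps
        length *= 2
    return T
```

For example, closure_by_squaring(adj, naive_bmm) reproduces the cubic-per-multiplication scheme, while plugging in a faster bmm gives the O(n^2.37 log n) bound.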


slide19

Transitive Closure by Matrix Multiplication

Better still: getting rid of the log(n) factor

The log(n) factor can be dropped by applying the following steps:

  • Determine the strongly connected components of the graph: O(n^2).
  • Collapse each component to a single vertex.
  • The problem is now reduced to computing the transitive closure of the new, acyclic graph.


slide20

Transitive Closure by Matrix Multiplication

Better still: getting rid of the log(n) factor

The log(n) factor can be dropped by applying the following steps:

  • Generate a topological sort for the new graph.
  • Divide the graph into two sections: A (first half) and B (second half).
  • The adjacency matrix of the sorted graph is upper block-triangular: the top-left block is A, the bottom-right block is B, the top-right block C holds the edges from A to B, and the bottom-left block is all zeros.


slide21

Transitive Closure by Matrix Multiplication

Better still: getting rid of the log(n) factor

To find the transitive closure of G, notice that:

  • Connections within A are independent of B.
  • Similarly, connections within B are independent of A.
  • Connections from A to B are found by:

A*(i,u) = 1: there is a path from i to u within A.

slide22

Transitive Closure by Matrix Multiplication

Better still: getting rid of the log(n) factor

To find the transitive closure of G, notice that:

  • Connections within A are independent of B.
  • Similarly, connections within B are independent of A.
  • Connections from A to B are found by:

(A*C)(i,v) = 1: a path from i to u within A, followed by an edge (u,v) in C.

slide23

Transitive Closure by Matrix Multiplication

Better still: getting rid of the log(n) factor

To find the transitive closure of G, notice that:

  • Connections within A are independent of B.
  • Similarly, connections within B are independent of A.
  • Connections from A to B are found by:

(A*CB*)(i,j) = 1: a path from i to u within A, an edge (u,v) in C, and a path from v to j within B.

slide24

Transitive Closure by Matrix Multiplication

Better still: getting rid of the log(n) factor

To find the transitive closure of G, notice that:

  • Connections within A are independent of B.
  • Similarly, connections within B are independent of A.
  • Hence, G* can be found by determining A*, B*, and computing A*CB*.
  • This requires finding the transitive closure of two (n/2)x(n/2) matrices,
  • and performing two matrix multiplications: O(n^2.37).

Running time? The recurrence T(n) = 2T(n/2) + O(n^2.37) solves to O(n^2.37), as sketched below.
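
Putting the pieces together, a Python/NumPy sketch of this recursion on the upper block-triangular matrix; SCC collapsing and topological sorting are assumed to have been done already, and bool_mult is a stand-in for any fast boolean multiplication (names are illustrative):

```python
import numpy as np

def bool_mult(X, Y):
    """Stand-in boolean matrix multiplication; any fast BMM can be plugged in."""
    return ((X @ Y) > 0).astype(int)

def closure_block_triangular(M):
    """Transitive closure of a topologically sorted DAG whose adjacency matrix
       is upper block-triangular: M = [[A, C], [0, B]]."""
    n = M.shape[0]
    if n == 1:
        return np.ones((1, 1), dtype=int)           # a vertex reaches itself
    h = n // 2
    A, C, B = M[:h, :h], M[:h, h:], M[h:, h:]

    A_star = closure_block_triangular(A)             # closure within the first half
    B_star = closure_block_triangular(B)             # closure within the second half
    top_right = bool_mult(bool_mult(A_star, C), B_star)   # path in A, edge in C, path in B

    closure = np.zeros((n, n), dtype=int)
    closure[:h, :h] = A_star
    closure[:h, h:] = top_right
    closure[h:, h:] = B_star
    return closure
```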


slide25

Matrix Multiplication by Transitive Closure

Let A,B be two boolean matrices. To compute C=AB, form the following 3n x 3n block matrix D, whose graph has three layers of n vertices each, with the edges from the 1st layer to the 2nd given by A and the edges from the 2nd layer to the 3rd given by B:

D = [[0, A, 0],
     [0, 0, B],
     [0, 0, 0]]

  • The transitive closure of such a graph is formed by adding the edges from the 1st layer to the 3rd:

D* = [[I, A, AB],
      [0, I, B ],
      [0, 0, I ]]

  • These edges are described by the product of the matrices A and B. Therefore, C = AB can be read off from the top-right block of D*, and boolean matrix multiplication reduces to transitive closure as well (see the sketch below).
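
Conversely, a short sketch of this direction; the names are illustrative, and closure may be any transitive-closure routine (for example, the block-triangular one above, since D is acyclic and upper block-triangular):

```python
import numpy as np

def bmm_via_closure(A, B, closure):
    """Boolean product A*B read off from a transitive-closure computation."""
    n = A.shape[0]
    D = np.zeros((3 * n, 3 * n), dtype=int)
    D[:n, n:2 * n] = A              # edges from the 1st layer to the 2nd
    D[n:2 * n, 2 * n:] = B          # edges from the 2nd layer to the 3rd
    D_star = closure(D)
    return D_star[:n, 2 * n:]       # 1st-layer -> 3rd-layer reachability equals A*B
```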
