(1)

Recent Advances on the Complexity of Parameterized Counting Problems

Dániel Marx

Institute for Computer Science and Control, Hungarian Academy of Sciences (MTA SZTAKI)

Budapest, Hungary

Joint work with Radu Curticapean and Holger Dell

SODA 2019, San Diego, CA, January 7, 2019

(2)

Counting problems

Instead of finding one solution, we need to count the number of solutions.

Applications: probability, statistical physics, pattern frequency, . . .

Reliability: If each edge fails with probability 1/2 independently, what is the probability that the graph remains connected?

= counting the number of connected subgraphs
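As an illustration (not from the talk), the p = 1/2 reliability computation can be sketched by brute force: with failure probability 1/2, every edge subset is equally likely, so the answer is the number of connected spanning subgraphs divided by 2^|E|. The function name and graph encoding below are my own:

```python
from itertools import combinations

def reliability_half(vertices, edges):
    """Probability that the graph stays connected when each edge fails
    independently with probability 1/2: equals (number of connected
    spanning subgraphs) / 2^|E|."""
    def connected(edge_subset):
        # BFS over the surviving edges only
        adj = {v: set() for v in vertices}
        for u, v in edge_subset:
            adj[u].add(v)
            adj[v].add(u)
        seen, stack = {vertices[0]}, [vertices[0]]
        while stack:
            for w in adj[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return len(seen) == len(vertices)

    good = sum(1 for r in range(len(edges) + 1)
                 for sub in combinations(edges, r) if connected(sub))
    return good / 2 ** len(edges)

# A triangle stays connected iff at least 2 of its 3 edges survive: 4/8.
print(reliability_half(['a', 'b', 'c'], [('a','b'), ('b','c'), ('a','c')]))  # 0.5
```

This is exponential in |E|; the point of the slide is precisely that the counting problem behind it is hard in general.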

(5)

Finding vs. counting

Finding a perfect matching in a bipartite graph is polynomial-time solvable.

[Ford and Fulkerson 1956]

vs.

Counting the number of perfect matchings in a bipartite graph is #P-hard.

[Valiant 1979]

(6)

This talk

Counting problems in the area of parameterized algorithms.

Quick intro to parameterized algorithms.

Connection between counting homomorphisms and subgraphs.

Algorithmic applications.

Complexity applications.

Main message

Parameterized subgraph counting problems can be understood via homomorphism counting problems.

. . .and this connection gives both algorithmic and complexity results!

(9)

Parameterized problems

Main idea

Instead of expressing the running time as a function T(n) of n, we express it as a function T(n, k) of the input size n and some parameter k of the input.

In other words: we do not want to be efficient on all inputs of size n, only on those where k is small.

What can the parameter k be?

The size k of the solution we are looking for.

The maximum degree of the input graph.

The dimension of the point set in the input.

The length of the strings in the input.

The length of the clauses in the input Boolean formula.

. . .

(10)

Parameterized complexity

Problem:     Vertex Cover                  Independent Set
Input:       Graph G, integer k            Graph G, integer k
Question:    Is it possible to cover       Is it possible to find
             the edges with k vertices?    k independent vertices?
Complexity:  NP-complete                   NP-complete
Brute force: O(n^k) possibilities          O(n^k) possibilities
             O(2^k n^2) algorithm exists   No n^{o(k)} algorithm known

(17)

Example: Bounded search tree method

Algorithm for Vertex Cover: pick an uncovered edge e1 = u1v1 and branch on which endpoint (u1 or v1) goes into the cover; then pick another uncovered edge e2 = u2v2 and branch again, continuing to depth at most k.

Height of the search tree ≤ k ⇒ at most 2^k leaves ⇒ 2^k · n^{O(1)} time algorithm.
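The branching scheme above can be sketched in a few lines of Python (an illustration with my own names and encoding, not code from the talk):

```python
def vertex_cover(edges, k):
    """Bounded search tree for Vertex Cover: pick any uncovered edge uv;
    some endpoint must be in the cover, so branch on u and on v.
    Tree height <= k, hence at most 2^k leaves: a 2^k * n^O(1) algorithm."""
    if not edges:
        return True          # every edge is covered
    if k == 0:
        return False         # edges remain but no budget left
    u, v = edges[0]
    # Branch 1: put u in the cover; Branch 2: put v in the cover.
    rest_u = [(a, b) for (a, b) in edges if u not in (a, b)]
    rest_v = [(a, b) for (a, b) in edges if v not in (a, b)]
    return vertex_cover(rest_u, k - 1) or vertex_cover(rest_v, k - 1)

# A path on 4 vertices has a vertex cover of size 2 but not of size 1.
print(vertex_cover([(1, 2), (2, 3), (3, 4)], 2))  # True
print(vertex_cover([(1, 2), (2, 3), (3, 4)], 1))  # False
```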

(19)

Fixed-parameter tractability

Main definition

A parameterized problem is fixed-parameter tractable (FPT) if there is an f(k)·n^c time algorithm for some constant c.

Examples of NP-hard problems that are FPT:

Finding a vertex cover of size k.

Finding a path of length k.

Finding k disjoint triangles.

Drawing the graph in the plane with k edge crossings.

Finding disjoint paths that connect k pairs of points.

. . .

(20)

FPT techniques

Treewidth

Color coding

Iterative compression

Kernelization

Algebraic techniques

Bounded-depth search trees

(21)

W[1]-hardness

Negative evidence similar to NP-completeness: if a problem is W[1]-hard, then it is not FPT unless FPT = W[1].

Some W[1]-hard problems:

Finding a clique/independent set of size k.

Finding a dominating set of size k.

Finding k pairwise disjoint sets.

. . .

(22)

Parameterized Algorithms

Marek Cygan, Fedor V. Fomin, Łukasz Kowalik, Daniel Lokshtanov, Dániel Marx, Marcin Pilipczuk, Michał Pilipczuk, Saket Saurabh

Springer, 2015

(23)

Example: Win/Win for k -Path

Simple 2^{O(k)} · n^{O(1)} time algorithm for finding a path of length k:

1 Compute a DFS tree.

2 If the DFS tree has height > k: we can find a k-path.

3 If the DFS tree has height ≤ k: the treewidth is ≤ k ⇒ use an algorithm with running time 2^{O(tw)} · n^{O(1)} for finding the longest path.

(25)

Treewidth — a measure of “tree-likeness”

Tree decomposition: Vertices are arranged in a tree structure satisfying the following properties:

1 If u and v are neighbors, then there is a bag containing both of them.

2 For every v, the bags containing v form a connected subtree.

Width of the decomposition: largest bag size − 1.

Treewidth: width of the best decomposition.

[Figure: an 8-vertex graph on a–h with a tree decomposition using bags {a,b,c}, {b,c,f}, {b,e,f}, {c,d,f}, {d,f,g}, {g,h}]
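To make the two conditions concrete, here is a small checker (an illustration with my own function name and a toy example, not code from the slides):

```python
def is_tree_decomposition(graph_edges, bags, tree_edges):
    """Check the two properties of a tree decomposition.
    bags: list of sets, indexed 0..t-1; tree_edges: edges between those
    indices (assumed to already form a tree)."""
    # Property 1: every edge of the graph lies inside some bag.
    if not all(any({u, v} <= bag for bag in bags) for u, v in graph_edges):
        return False
    # Property 2: for each vertex v, the bags containing v induce a
    # connected subtree of the decomposition tree.
    for v in set().union(*bags):
        nodes = {i for i, bag in enumerate(bags) if v in bag}
        seen, stack = {min(nodes)}, [min(nodes)]
        while stack:
            i = stack.pop()
            for a, b in tree_edges:
                j = b if a == i else a if b == i else None
                if j in nodes and j not in seen:
                    seen.add(j)
                    stack.append(j)
        if seen != nodes:
            return False
    return True

# A path decomposition of the path 1-2-3-4; its width is 2 - 1 = 1.
print(is_tree_decomposition([(1, 2), (2, 3), (3, 4)],
                            [{1, 2}, {2, 3}, {3, 4}],
                            [(0, 1), (1, 2)]))  # True
```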

(29)

Treewidth — a measure of “tree-likeness”

Key property: each bag of a tree decomposition is a separator of the graph.

(30)

Treewidth — a measure of “tree-likeness”

Key property: a subtree of the decomposition communicates with the outside world only via the root bag of the subtree.

(31)

Counting complexity

W[1]-hardness: “as hard as finding a k-clique”

#W[1]-hardness: “as hard as counting k-cliques”

What can happen to the counting version of an FPT problem?

1 (easy) The same algorithmic technique shows that the counting problem is FPT.

2 (easy, but different) New algorithmic techniques are needed to show that the counting version is FPT.

3 (hard) New lower bound techniques are needed to show that the counting version is #W[1]-hard.

(32)

Finding vs. counting

Generalization to counting:

WORKS for the Vertex Cover branching algorithm ⇒ #Vertex Cover is FPT.

DOES NOT WORK for the #k-Path win/win algorithm. What is the parameterized complexity of #k-Path?

Even more troubling question: what is the parameterized complexity of the (even simpler) #k-Matching?

(34)

Counting k -paths and k -matchings

Colorful history:

#k-Path is #W[1]-hard [Flum and Grohe, FOCS 2002]

Weighted #k-Matching is #W[1]-hard [Bläser and Curticapean, IPEC 2012]

Unweighted #k-Matching is #W[1]-hard [Curticapean, ICALP 2013] — complicated proof.

Unweighted #k-Matching is #W[1]-hard [Curticapean and M, FOCS 2014] — simpler proof.

Unweighted #k-Matching is #W[1]-hard [Curticapean and M, unpublished] — even simpler proof.

Unweighted #k-Matching is #W[1]-hard [Curticapean, Dell, and M, STOC 2017] — tells the real story.

(40)

Counting patterns

Main question

Which types of subgraph patterns are easy to count?

[Patterns shown as pictures: biclique, clique, complete multipartite graph, matching, star, subdivided star, windmill, path, double star]

(44)

Counting patterns

Patterns with small vertex cover number are easy to count:

Theorem [multiple references]

#Sub(H) can be solved in time n^{vc(H)+O(1)}.

But what about patterns with large vertex cover number?

We will understand the complexity of counting any class of patterns, not just paths or matchings!

Main message

Parameterized subgraph counting problems can be understood via homomorphism counting problems.

(46)

Homomorphisms

A homomorphism from H to G is a mapping f : V(H) → V(G) such that if ab is an edge of H, then f(a)f(b) is an edge of G.

[Figure: a 4-vertex pattern H mapped into a 4-vertex graph G]

Which pattern graphs H are easy for counting homomorphisms?

Theorem (trivial)

For every fixed H, the problem #Hom(H) (count homomorphisms from H to the given graph G) is polynomial-time solvable.

. . . because we can try all |V(G)|^{|V(H)|} possible mappings f : V(H) → V(G).
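The trivial algorithm from the theorem is a one-liner in spirit; a runnable sketch (my own encoding: graphs as vertex lists plus edge lists):

```python
from itertools import product

def hom(HV, HE, GV, GE):
    """Count homomorphisms from H to G by trying all |V(G)|^|V(H)|
    mappings f: V(H) -> V(G) -- the trivial polynomial-time algorithm
    for fixed H."""
    adj = {frozenset(e) for e in GE}
    count = 0
    for images in product(GV, repeat=len(HV)):
        f = dict(zip(HV, images))
        if all(frozenset((f[a], f[b])) in adj for a, b in HE):
            count += 1
    return count

# Homomorphisms from a single edge into a triangle: any ordered pair of
# distinct vertices works, so 3 * 2 = 6.
print(hom([1, 2], [(1, 2)], ['x', 'y', 'z'],
          [('x', 'y'), ('y', 'z'), ('x', 'z')]))  # 6
```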

(52)

Counting homomorphisms

Better questions:

Which classes H (e.g., paths, stars, matchings) of patterns can be counted in polynomial time?

What is the best exponent for #Hom(H)?

Fact

#Hom(H) can be solved in time O(n^{tw(H)+1}).

Difference between finding and counting:

Finding: Hom(K_{k,k}) is trivial vs. Counting: #Hom(K_{k,k}) is W[1]-hard.

(54)

Partitioned homomorphism

Partitioned Homomorphism

Input: H, G, and a partition Π of V(G) into |V(H)| classes.

Task: Find a homomorphism φ that maps the vertices of H to different classes.

[Figure: a 4-vertex pattern H and a graph G partitioned into 4 classes]

Theorem [M 2010]

There is a universal constant γ > 0 such that if for some H there is an O(n^{γ·tw(H)/log tw(H)}) time algorithm for PartHom(H), then ETH fails.

(55)

Counting partitioned homomorphisms

Reduction: #PartHom(H) ⇒ #Hom(H).

G_P for P ⊆ class(Π): subgraph of G induced by the classes in P.

Simple application of the inclusion–exclusion principle:

#part-hom(H, G) = Σ_{P ⊆ class(Π)} (−1)^{|class(Π)|−|P|} · #hom(H, G_P)

Theorem [Dalmau and Jonsson 2004] [M 2010]

There is a universal constant γ > 0 such that if for some H there is an O(n^{γ·tw(H)/log tw(H)}) time algorithm for #Hom(H), then ETH fails.
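The inclusion–exclusion step can be checked mechanically on tiny graphs. A self-contained sketch (my own encoding; brute-force hom counting inside), counting the homomorphisms that map the |V(H)| vertices of H to pairwise different classes:

```python
from itertools import product, combinations

def hom(HV, HE, GV, GE):
    """Brute-force homomorphism count from H to G."""
    adj = {frozenset(e) for e in GE}
    return sum(all(frozenset((f[a], f[b])) in adj for a, b in HE)
               for f in (dict(zip(HV, img))
                         for img in product(GV, repeat=len(HV))))

def part_hom(HV, HE, GV, GE, classes):
    """Inclusion-exclusion over subsets P of the classes:
       part-hom = sum_P (-1)^(t - |P|) * hom(H, G_P),
    where t = number of classes and G_P is the subgraph of G induced by
    the union of the classes in P."""
    t = len(classes)
    total = 0
    for r in range(t + 1):
        for P in combinations(classes, r):
            VP = set().union(*P) if P else set()
            EP = [e for e in GE if set(e) <= VP]
            total += (-1) ** (t - r) * hom(HV, HE, sorted(VP), EP)
    return total

# Edge into a triangle partitioned as {x} | {y,z}: exactly the 4
# homomorphisms with endpoints in different classes survive.
print(part_hom([1, 2], [(1, 2)], ['x', 'y', 'z'],
               [('x', 'y'), ('y', 'z'), ('x', 'z')], [{'x'}, {'y', 'z'}]))  # 4
```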

(57)

Counting homomorphisms — summary

Treewidth of H determines the complexity of the problem:

O(n^{tw(H)+1}) upper bound.

Ω(n^{γ·tw(H)/log tw(H)}) lower bound (assuming ETH).

If we restrict the problem to a class H of patterns:

If H has bounded treewidth (e.g., stars, paths, . . .), then the problem is polynomial-time solvable.

If H has unbounded treewidth (e.g., cliques, bicliques, grids, . . .), then the problem is not polynomial-time solvable (assuming ETH).

(58)

Subgraphs ⇔ homomorphisms

Easy to check (the three patterns were shown as small pictures on the slide):

hom(·, G) = 8·sub(·, G) + 4·sub(·, G) + 2·sub(·, G)

Not completely obvious: the formula can be reversed by inclusion–exclusion:

sub(·, G) = (1/8)·hom(·, G) − (1/4)·hom(·, G) + (1/8)·hom(·, G)

(62)

Subgraphs ⇔ homomorphisms

Definition

surj(H, G): number of surjective homomorphisms from H to G (every vertex and edge of G appears in the image).

Homomorphisms can be counted by classifying them according to the image F:

hom(H, G) = Σ_F surj(H, F) · sub(F, G)

Which of the terms can be nonzero?

(64)

Spasm

Part0(H): set of partitions of V(H) where each class is an independent set.

For Π ∈ Part0(H), H/Π is obtained by contracting each class of Π to a single vertex.

Spasm(H) = { H/Π | Π ∈ Part0(H) }

[Example: the spasm of a small pattern, shown as a set of graph pictures]
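The spasm of a small pattern can be enumerated directly from the definition. A sketch (my own encoding; it lists quotients H/Π without deduplicating isomorphic copies):

```python
from itertools import combinations

def partitions(items):
    """All set partitions of a list, generated recursively."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [part[i] + [first]] + part[i + 1:]
        yield part + [[first]]

def spasm(HV, HE):
    """Quotients H/Pi over all partitions Pi of V(H) whose classes are
    independent sets (no deduplication up to isomorphism)."""
    edge_set = {frozenset(e) for e in HE}
    results = []
    for part in partitions(list(HV)):
        if any(frozenset((u, v)) in edge_set
               for cls in part for u, v in combinations(cls, 2)):
            continue  # some class is not an independent set
        label = {v: i for i, cls in enumerate(part) for v in cls}
        q_edges = {frozenset((label[u], label[v])) for u, v in HE}
        results.append((len(part), q_edges))
    return results

# 2-matching (edges 12 and 34): 7 independent-class partitions, giving
# the 2-matching itself, four 2-edge paths, and two single edges.
qs = spasm([1, 2, 3, 4], [(1, 2), (3, 4)])
print(len(qs))                        # 7
print(sorted(len(e) for _, e in qs))  # [1, 1, 2, 2, 2, 2, 2]
```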

(65)

Subgraphs ⇔ homomorphisms

From subgraphs to homomorphisms:

hom(H, G) = Σ_F surj(H, F) · sub(F, G), where surj(H, F) ≠ 0 if and only if F ∈ Spasm(H).

. . . useless.

From homomorphisms to subgraphs: [Lovász 1967]

sub(H, G) = Σ_F β_F · hom(F, G), where β_F ≠ 0 if and only if F ∈ Spasm(H).

Extremely useful for applications in algorithms and complexity!

(68)

Algorithmic applications

sub(H, G) = Σ_{F ∈ Spasm(H)} β_F · hom(F, G)

The maximum treewidth in Spasm(H) gives an upper bound on the complexity:

Corollary

If every graph in Spasm(H) has treewidth at most c, then sub(H, G) can be computed in time O(n^{c+1}).

(69)

Algorithmic applications

Corollary

If every graph in Spasm(H) has treewidth at most c, then sub(H, G) can be computed in time O(n^{c+1}).

Observe: If H has k edges, then every graph in Spasm(H) has at most k edges.

Theorem [Scott and Sorkin 2007]

Every graph with ≤ k edges has treewidth at most 0.174k + O(1).

Corollary

If H has k edges, then sub(H, G) can be computed in time n^{0.174k+O(1)}.

(71)

Counting k -paths

Corollary

If H has k edges, then sub(H, G) can be computed in time n^{0.174k+O(1)}.

Example: Counting k-paths

Brute force: O(n^k).

Meet in the middle: O(n^{0.5k}) [Björklund et al., ESA 2009], [Koutis and Williams, ICALP 2009]

[Björklund et al., SODA 2014]: n^{0.455k+O(1)}.

New! By counting homomorphisms in the spasm: n^{0.174k+O(1)}.

(72)

Counting small cycles

Counting triangles using matrix multiplication:

sub(C3, G) = (1/6) · tr(Adj(G)^3)

Theorem [Alon, Yuster, and Zwick, ESA 1994]

For k ≤ 7, we can compute sub(C_k, G) in time n^ω (where ω < 2.373 is the matrix-multiplication exponent).

We can recover this result:

Check: if k ≤ 7, then every graph in Spasm(C_k) has treewidth at most 2.

For treewidth 2, the O(n^{2+1}) homomorphism algorithm can be improved to O(n^ω) with fast matrix multiplication.

⇒ O(n^ω) algorithm for sub(C_k, G) if k ≤ 7.
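The trace formula for triangles is easy to verify directly (a sketch with my own naming; plain-Python matrix multiplication instead of a fast routine):

```python
def count_triangles(adj):
    """sub(C3, G) = tr(A^3) / 6: every triangle yields exactly six
    closed walks of length 3 (3 starting vertices x 2 directions)."""
    n = len(adj)
    def matmul(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(n))
                 for j in range(n)] for i in range(n)]
    A3 = matmul(matmul(adj, adj), adj)
    return sum(A3[i][i] for i in range(n)) // 6

# K4 contains (4 choose 3) = 4 triangles.
K4 = [[0, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 0]]
print(count_triangles(K4))  # 4
```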

(75)

Complexity applications

sub(H, G) = Σ_{F ∈ Spasm(H)} β_F · hom(F, G)

Note: every β_F is nonzero.

Reductions:

Obvious: if we can compute hom(F, G) for every F ∈ Spasm(H) ⇒ we can compute sub(H, G).

Highly nontrivial: if we can compute sub(H, G) ⇒ we can compute hom(F, G) for every F ∈ Spasm(H).

The complexity of hom(F, G) for any F ∈ Spasm(H) is a lower bound on the complexity of sub(H, G).

(79)

Matrices

Fix an enumeration H1, H2, . . . of graphs with ≤ k edges, in nondecreasing number of edges.

Hom matrix: entry (i, j) is hom(H_i, H_j).

Sub matrix: entry (i, j) is sub(H_i, H_j).

Surj matrix: entry (i, j) is surj(H_i, H_j).

The identity hom(H, G) = Σ_F surj(H, F) · sub(F, G) gives Hom = Surj · Sub.

The Hom matrix is invertible! (In this ordering, Surj is lower triangular and Sub is upper triangular, both with nonzero diagonals, so their product is invertible.)

(83)

Categorical product

One of the standard graph products:

Definition

G1 × G2 has vertex set V(G1) × V(G2), and (v1, v2) and (v1′, v2′) are adjacent in G1 × G2 ⇐⇒ v1v1′ ∈ E(G1) and v2v2′ ∈ E(G2).

[missing figure]

Exercise:

hom(H, G1 × G2) = hom(H, G1) · hom(H, G2)
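The exercise can be checked computationally on small graphs (a sketch with my own encoding; the brute-force hom counter is repeated to keep the block self-contained):

```python
from itertools import product

def hom(HV, HE, GV, GE):
    """Brute-force homomorphism count from H to G."""
    adj = {frozenset(e) for e in GE}
    return sum(all(frozenset((f[a], f[b])) in adj for a, b in HE)
               for f in (dict(zip(HV, img))
                         for img in product(GV, repeat=len(HV))))

def categorical_product(G1V, G1E, G2V, G2E):
    """G1 x G2: vertex set V(G1) x V(G2); (v1,v2)(v1',v2') is an edge
    iff v1v1' is an edge of G1 and v2v2' is an edge of G2."""
    V = [(a, b) for a in G1V for b in G2V]
    E1 = {frozenset(e) for e in G1E}
    E2 = {frozenset(e) for e in G2E}
    E = [((a1, a2), (b1, b2))
         for (a1, a2) in V for (b1, b2) in V
         if frozenset((a1, b1)) in E1 and frozenset((a2, b2)) in E2]
    return V, E

# hom(H, G1 x G2) = hom(H, G1) * hom(H, G2) for H = P3 (a 2-edge path),
# G1 = triangle, G2 = single edge: 12 * 2 = 24.
P3V, P3E = [1, 2, 3], [(1, 2), (2, 3)]
T = (['x', 'y', 'z'], [('x', 'y'), ('y', 'z'), ('x', 'z')])
K2 = (['u', 'v'], [('u', 'v')])
PV, PE = categorical_product(*T, *K2)
print(hom(P3V, P3E, PV, PE) == hom(P3V, P3E, *T) * hom(P3V, P3E, *K2))  # True
```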

(84)

Extracting a term

Lemma

Given an algorithm for sub(H, G) = Σ_{F ∈ Spasm(H)} β_F · hom(F, G) (with β_F ≠ 0), we can compute hom(F, G) for every F ∈ Spasm(H).

Use the algorithm on Z × G for every Z with ≤ k = |E(H)| edges:

b_Z = sub(H, Z × G) = Σ_{F ∈ Spasm(H)} β_F · hom(F, Z × G) = Σ_{F ∈ Spasm(H)} hom(F, Z) · β_F · hom(F, G)

In matrix form, with F1, . . . , F_t the graphs with ≤ k edges:

Hom^T · (β_{F1} · hom(F1, G), . . . , β_{Ft} · hom(Ft, G))^T = (b_{Z1}, . . . , b_{Zt})^T

The Hom matrix is invertible, so we can solve this system of equations!

(90)
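The identity used in the middle step above, hom(F, Z × G) = hom(F, Z) · hom(F, G) where × is the tensor (categorical) product, can be checked directly by brute force on tiny graphs. A minimal Python sketch; the graph representation and the example graphs are illustrative choices of mine, not from the talk:

```python
from itertools import product

def hom(F, G):
    # Count homomorphisms from F to G by brute force.
    # A graph is (vertex list, set of directed pairs containing
    # both orientations of every undirected edge).
    VF, EF = F
    VG, EG = G
    total = 0
    for image in product(VG, repeat=len(VF)):
        phi = dict(zip(VF, image))
        if all((phi[u], phi[v]) in EG for (u, v) in EF):
            total += 1
    return total

def undirected(edges):
    return {(u, v) for (a, b) in edges for (u, v) in ((a, b), (b, a))}

def tensor(G, H):
    # Categorical (tensor) product: (u1,v1) ~ (u2,v2) iff u1~u2 and v1~v2.
    VG, EG = G
    VH, EH = H
    V = [(u, v) for u in VG for v in VH]
    E = {((u1, v1), (u2, v2)) for (u1, u2) in EG for (v1, v2) in EH}
    return (V, E)

K2 = ([0, 1], undirected([(0, 1)]))
K3 = ([0, 1, 2], undirected([(0, 1), (1, 2), (0, 2)]))
P3 = ([0, 1, 2], undirected([(0, 1), (1, 2)]))

# Multiplicativity over the tensor product: 24 == 6 * 4
assert hom(K2, tensor(K3, P3)) == hom(K2, K3) * hom(K2, P3)
```

The same multiplicativity holds for every pattern F, which is exactly what lets each evaluation on Z × G contribute one linear equation in the unknowns hom(F, G).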

Extracting a term

Lemma
Given an algorithm for sub(H,G) = Σ_{F ∈ Spasm(H)} β_F · hom(F,G) (with β_F ≠ 0), we can compute hom(F,G) for any F ∈ Spasm(H).

Bottom line:

complexity of #Sub(H) = hardest #Hom(F) for F ∈ Spasm(H)

Complexity depends on the maximum treewidth in Spasm(H)!

(91)
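For a concrete instance of such a spasm expansion: P3 has a single proper quotient (identifying its two endpoints yields K2), giving sub(P3, G) = ½·hom(P3, G) − ½·hom(K2, G). A brute-force check in Python; the code and the test graphs are my own illustration, not from the talk:

```python
from itertools import product

def hom(F, G, injective=False):
    # Count (optionally injective) homomorphisms from F to G by brute force.
    # A graph is (vertex list, set of directed pairs with both orientations).
    VF, EF = F
    VG, EG = G
    total = 0
    for image in product(VG, repeat=len(VF)):
        if injective and len(set(image)) < len(VF):
            continue
        phi = dict(zip(VF, image))
        if all((phi[u], phi[v]) in EG for (u, v) in EF):
            total += 1
    return total

def undirected(edges):
    return {(u, v) for (a, b) in edges for (u, v) in ((a, b), (b, a))}

K2 = ([0, 1], undirected([(0, 1)]))
P3 = ([0, 1, 2], undirected([(0, 1), (1, 2)]))
K3 = ([0, 1, 2], undirected([(0, 1), (1, 2), (0, 2)]))
P4 = ([0, 1, 2, 3], undirected([(0, 1), (1, 2), (2, 3)]))

def sub_P3(G):
    # Copies of P3 = injective homomorphisms divided by |Aut(P3)| = 2.
    return hom(P3, G, injective=True) // 2

for G in (K3, P4):
    # Spasm(P3) = {P3, K2} with coefficients beta_{P3} = 1/2, beta_{K2} = -1/2.
    assert sub_P3(G) == (hom(P3, G) - hom(K2, G)) // 2
```

So the complexity of counting P3-copies is governed by the harder of #Hom(P3) and #Hom(K2), both of which have treewidth 1, matching the bottom line above.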

Complexity of counting patterns

What is the best exponent for counting occurrences of this 46-vertex graph H?

Answer: Compute Spasm(H) and find the best exponent for each of the resulting homomorphism problems!

(92)

Hardness results for #k-Matching

Not FPT:

Theorem
Counting k-matchings is #W[1]-hard.

Proof: As K_k ∈ Spasm(M_{(k choose 2)}), counting k-cliques can be reduced to counting (k choose 2)-matchings.

More precise bound:

Spasm(M_k) contains every graph with k edges

Spasm(M_k) contains graphs with treewidth Ω(k)

no f(k)·n^{o(k/log k)} time algorithm for #k-Matching, assuming ETH.

(93)
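The containment K_k ∈ Spasm(M_{(k choose 2)}) can be verified mechanically: label each of the (k choose 2) matching edges with a distinct pair {i, j} and identify endpoints accordingly. A quick sanity check (illustrative code of mine, not from the talk):

```python
from itertools import combinations

def clique_from_matching(k):
    # Build a matching with one edge per pair {i, j}, 0 <= i < j < k,
    # then identify endpoints so edge number t, labelled {i, j},
    # becomes the clique edge {i, j}.  The quotient is K_k, so
    # K_k lies in Spasm of the (k choose 2)-matching.
    pairs = list(combinations(range(k), 2))
    matching = [(2 * t, 2 * t + 1) for t in range(len(pairs))]
    f = {}
    for t, (i, j) in enumerate(pairs):
        f[2 * t] = i
        f[2 * t + 1] = j
    return {frozenset((f[u], f[v])) for (u, v) in matching}

# The quotient of the 6-edge matching is exactly K_4.
assert clique_from_matching(4) == {frozenset(p) for p in combinations(range(4), 2)}
```

Note that the identification map is a valid spasm quotient: no two endpoints of the same matching edge are merged, so no self-loop is created.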


Role of vertex cover number

What property of H determines the max. treewidth in Spasm(H)?

The vertex cover number of H:

Upper bound:
For every F ∈ Spasm(H), we have tw(F) ≤ vc(F) ≤ vc(H).

Lower bound:
H contains a matching of size vc(H)/2. We can show that for any F with at most vc(H)/2 edges, there is a graph in Spasm(H) that contains F as a minor ⇒ there is a graph in Spasm(H) with treewidth Ω(vc(H)).

(96)


Counting subgraphs — summary

Vertex cover number of H determines the complexity of #Sub(H):

n^{vc(H)+O(1)} upper bound.

Ω(n^{γ·vc(H)/log vc(H)}) lower bound.

If we restrict the problem to a class H of patterns:

If H has bounded vertex cover number (e.g., stars, double stars, ...), then the problem is polynomial-time solvable.

If H has unbounded vertex cover number (e.g., cliques, paths, matchings, disjoint triangles, ...), then the problem is not polynomial-time solvable (assuming ETH).

(98)
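For intuition on the bounded-vertex-cover case: a star K_{1,k} has vertex cover number 1, and its copies can be counted by a simple polynomial-time degree computation. A sketch assuming an adjacency-set representation (illustrative code of mine, not from the talk):

```python
from math import comb

def count_k_stars(adj, k):
    # Copies of the star K_{1,k} for k >= 2 (so the center of each copy
    # is unique): pick a center v, then k of its neighbors.
    # This is an n^{O(1)} algorithm, matching the claim that bounded
    # vertex cover number (vc(K_{1,k}) = 1) gives polynomial time.
    return sum(comb(len(nbrs), k) for nbrs in adj.values())

# The triangle contains three copies of K_{1,2} (= the path P3).
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
assert count_k_stars(triangle, 2) == 3
```

The n^{vc(H)+O(1)} algorithm for general H follows the same shape: guess the images of a vertex cover of H, then count extensions by local (degree-like) computations.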

Conclusions

Main message

Parameterized subgraph counting problems can be understood via homomorphism counting problems.

...and this connection gives both algorithmic and complexity results!

Working on counting problems is fun:

You can revisit fundamental, “well-understood” problems.

Requires a new set of lower bound techniques.

Requires new algorithmic techniques.
