On subexponential parameterized algorithms for Steiner Tree and Directed Subset TSP on planar graphs

Dániel Marx¹  Marcin Pilipczuk²  Michał Pilipczuk²

¹ Institute for Computer Science and Control, Hungarian Academy of Sciences (MTA SZTAKI), Budapest, Hungary
² Institute of Informatics, University of Warsaw, Poland

FOCS 2018, Paris, France, October 9, 2018

Square root phenomenon

NP-hard problems become easier on planar graphs, usually by exactly a square root factor in the exponent. The running time is still exponential, but significantly smaller:

2^{O(n)} ⇒ 2^{O(√n)}
n^{O(k)} ⇒ n^{O(√k)}
2^{O(k)}·n^{O(1)} ⇒ 2^{O(√k)}·n^{O(1)}

Several examples are known where such an improvement is possible, and (assuming the ETH) an exponent of O(k) is best possible for general graphs while O(√k) is best possible for planar graphs.

Two standard techniques

1. Using treewidth. Works for e.g. 3-Coloring or Hamiltonian Cycle:
   planar graphs have treewidth O(√n), plus a 2^{O(w)}·n^{O(1)} algorithm for treewidth w,
   gives a 2^{O(√n)} algorithm.

2. Bidimensionality. Works for e.g. k-Path or Vertex Cover:
   the answer is trivial if the treewidth is Ω(√k), plus a 2^{O(w)}·n^{O(1)} algorithm for treewidth w,
   gives a 2^{O(√k)}·n^{O(1)} algorithm.
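A short sketch of the win–win argument behind bidimensionality, spelled out for k-Path on planar graphs; the constant c is illustrative and not from the slides.

```latex
% Win-win argument for k-Path on planar graphs (constant c illustrative).
% Large treewidth forces a large grid minor, and hence a long path.
\[
\mathrm{tw}(G) \ge c\sqrt{k}
\;\Rightarrow\;
G \text{ has an } \Omega(\sqrt{k}) \times \Omega(\sqrt{k}) \text{ grid minor}
\;\Rightarrow\;
G \text{ contains a path on at least } k \text{ vertices (answer YES).}
\]
\[
\mathrm{tw}(G) < c\sqrt{k}
\;\Rightarrow\;
\text{run the } 2^{O(w)}\cdot n^{O(1)} \text{ treewidth DP, total time } 2^{O(\sqrt{k})}\cdot n^{O(1)}.
\]
```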

Other results

Many other results were obtained using problem-specific techniques:

Strongly Connected Steiner Subgraph [Chitnis et al. 2014]
Multiway Cut [Klein and M. 2012], [Colin de Verdière 2017]
Subgraph Isomorphism for connected bounded-degree patterns [Fomin et al. 2016]
Subset TSP [Klein and M. 2014]
Facility Location [M. and Pilipczuk 2015]
Odd Cycle Transversal [Lokshtanov et al. 2012]

Two main results

1. A positive result: Directed Subset TSP with k terminals
   can be solved in time 2^{O(k)}·n^{O(1)} in general graphs, [Held-Karp 1962]
   can be solved in time 2^{O(√k log k)}·n^{O(1)} in planar graphs. [new result #1]

2. A negative result: Steiner Tree with k terminals
   can be solved in time 2^{O(k)}·n^{O(1)} in general graphs, [Dreyfus and Wagner 1971]
   cannot be solved in time 2^{o(k)}·n^{O(1)} in planar undirected graphs (assuming the ETH). [new result #2]

TSP

Input: a set T of cities and a distance function d(·,·) on T.
Output: a tour on T with minimum total distance.

Theorem [Held and Karp 1962]
TSP with n cities can be solved in time O(2^n · n^2).

Dynamic programming: let x(v, T0) be the minimum length of a path from v_start to v visiting exactly the cities T0 ⊆ T.
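A minimal Python sketch of the Held–Karp dynamic program just described; the distance matrix `dist` and the choice of city 0 as v_start are illustrative assumptions, not notation from the slides.

```python
# Held-Karp dynamic program for TSP in O(2^n * n^2) time.
# `dist` is assumed to be a symmetric distance matrix over cities 0..n-1;
# city 0 plays the role of v_start.
from itertools import combinations

def held_karp(dist):
    n = len(dist)
    # x[(S, v)] = minimum length of a path from city 0 to v that visits
    # exactly the cities in frozenset S (S always contains 0 and v).
    x = {(frozenset([0]), 0): 0}
    for size in range(2, n + 1):
        for subset in combinations(range(1, n), size - 1):
            S = frozenset(subset) | {0}
            for v in subset:
                x[(S, v)] = min(
                    x[(S - {v}, u)] + dist[u][v]
                    for u in S - {v}
                )
    full = frozenset(range(n))
    # Close the tour by returning to the starting city.
    return min(x[(full, v)] + dist[v][0] for v in range(1, n))
```

For Subset TSP the same recurrence can be run over the k cities with shortest-path distances in the graph, which is one way to read the 2^k·n^{O(1)} bound mentioned on the next slide.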

Subset TSP on planar graphs

Assume that the cities correspond to a subset T of vertices of a planar graph and distance is measured in this planar graph.

It can be solved in time n^{O(√n)}, and in time 2^k · n^{O(1)}.

Question: can we restrict the exponential dependence to k and exploit planarity?

Theorem [Klein and M. 2014]
Subset TSP for k cities in a unit-weight undirected planar graph can be solved in time 2^{O(√k log k)}·n^{O(1)}.

Theorem [new result #1]
Subset TSP for k cities in a directed planar graph can be solved in time 2^{O(√k log k)}·n^{O(1)}.

Partial solutions

General idea: build larger and larger partial solutions.

Held-Karp algorithm: the partial solutions are v_start–v paths visiting a subset T0 of the cities.

Generalization: a partial solution is a set of at most d pairwise disjoint paths with specified cities as endpoints.

The type of a partial solution can be described by the set of endpoints of the paths, a matching between the endpoints, and the subset T0 of visited cities.
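A minimal Python sketch of how such a type could be represented; the class and field names are illustrative, not from the paper.

```python
# The *type* of a partial solution: the endpoints of its paths, a pairing of
# those endpoints (which endpoint is joined to which by a path), and the
# subset of visited cities. Names are illustrative.
from typing import FrozenSet, NamedTuple, Tuple

class SolutionType(NamedTuple):
    endpoints: FrozenSet[int]             # endpoints of the <= d paths
    matching: FrozenSet[Tuple[int, int]]  # pairs (u, v): a path runs from u to v
    visited: FrozenSet[int]               # subset T0 of visited cities

# Example: two disjoint paths 1->4 and 2->5 that together visit cities 1..5.
example = SolutionType(
    endpoints=frozenset({1, 4, 2, 5}),
    matching=frozenset({(1, 4), (2, 5)}),
    visited=frozenset({1, 2, 3, 4, 5}),
)
```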

Merging partial solutions

Two partial solutions can be merged in the obvious way if a matching is given between the endpoints.

Algorithm
Start with an initial set of trivial partial solutions.
Combine two partial solutions as long as possible.
Keep at most one partial solution from each type: the best one encountered so far.
Return the best partial solution that consists of a single path (cycle) visiting all vertices.
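A Python sketch of the "obvious" merge: within each partial solution the matching pairs up the two ends of every path, and a gluing matching between endpoints of the two solutions concatenates paths; the merged paths are read off by following these pairings. All names and the data representation are illustrative.

```python
# Merge two partial solutions. Each solution is given by
#   match: dict mapping every path endpoint to the other end of its path,
#   visited: set of cities visited by the solution.
# `glue` pairs up endpoints of the first solution with endpoints of the
# second (stored in both directions). Alternately following match and glue
# edges traces out the merged paths; an endpoint with no glue partner stays
# an endpoint of the merged solution.
# (A component all of whose endpoints are glued closes into a tour; that
# final case is omitted from this sketch.)
def merge(match1, visited1, match2, visited2, glue):
    match = {**match1, **match2}                 # paths of both solutions
    new_match = {}
    free = [e for e in match if e not in glue]   # endpoints that are not glued
    seen = set()
    for start in free:
        if start in seen:
            continue
        cur = start
        seen.add(cur)
        while True:
            cur = match[cur]          # jump to the other end of the path
            if cur not in glue:
                break
            cur = glue[cur]           # jump across to the glued endpoint
        seen.add(cur)
        new_match[start] = cur
        new_match[cur] = start
    return new_match, visited1 | visited2

# Toy example: paths 1-4 and 2-5 in the first solution, path 7-9 in the
# second; gluing 4 with 7 concatenates them into a single path from 1 to 9.
m, v = merge({1: 4, 4: 1, 2: 5, 5: 2}, {1, 2, 3, 4, 5},
             {7: 9, 9: 7}, {7, 8, 9},
             {4: 7, 7: 4})
assert m == {1: 9, 9: 1, 2: 5, 5: 2} and v == {1, 2, 3, 4, 5, 7, 8, 9}
```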

Running time

Algorithm
Start with an initial set of trivial partial solutions.
Combine two partial solutions as long as possible.
Keep at most one partial solution from each type: the best one encountered so far.
Return the best partial solution that consists of a single path (cycle) visiting all vertices.

For d = O(√k), the number of types (≈ running time) is

k^{O(√k)} · 2^k,

where the factor k^{O(√k)} counts the endpoints of the O(√k) paths and the factor 2^k counts the subset T0 ⊆ T of visited cities.

We somehow need to reduce the number of possible subsets of cities that partial solutions can visit!

Basic idea
We restrict attention to a collection 𝒯 of subsets of cities and consider only partial solutions that visit a subset in 𝒯.
We need: a collection 𝒯 of size k^{O(√k)} that guarantees finding an optimum solution.
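A rough count behind the displayed bound, assuming at most d = O(√k) paths; constants are left implicit.

```latex
% Choose the <= 2d endpoints among the k cities, pair them up into a matching,
% and record the visited subset T0 of the k cities.
\[
\underbrace{k^{O(\sqrt{k})}}_{\text{choice of }\le 2d\text{ endpoints}}
\cdot
\underbrace{(2d)^{O(d)}}_{\text{matching of the endpoints}}
\cdot
\underbrace{2^{k}}_{T_0 \subseteq T}
\;=\;
k^{O(\sqrt{k})}\cdot 2^{k}
\qquad (d = O(\sqrt{k})).
\]
```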

Bounding the treewidth . . . of what?

The following principle can be deduced from earlier work:
Exploit that the union of the unknown solution + a known something has treewidth O(√k).

Bounding treewidth

Take an arbitrary Steiner tree T and assume first that it intersects OPT O(k) times.

OPT + T has O(k) branch vertices
⇒ treewidth O(√k)
⇒ there exists a sphere-cut decomposition of width O(√k)
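A hedged sketch of why the two implications hold; the suppression step and the exact sphere-cut statement are my paraphrase of standard planar branchwidth facts, not spelled out on the slides.

```latex
% OPT + T is a plane graph; suppressing its degree-2 vertices leaves a planar
% (multi)graph on the O(k) branch vertices, hence with O(k) edges.  Planar
% graphs on O(k) vertices have treewidth (and branchwidth) O(\sqrt{k}), and
% for planar graphs an optimal-width branch decomposition can be realized as a
% sphere-cut decomposition, i.e. with nooses meeting the graph in O(\sqrt{k})
% vertices.
\[
O(k) \text{ branch vertices}
\;\Rightarrow\;
\mathrm{tw}(\mathrm{OPT}+T) = O(\sqrt{k})
\;\Rightarrow\;
\text{sphere-cut decomposition of width } O(\sqrt{k}).
\]
```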

Sphere-cut decompositions

Noose: a closed curve intersecting the graph only at vertices.

Sphere-cut decomposition of width O(√k): a recursive decomposition where the boundary of each part is a noose intersecting O(√k) vertices.

Partial solutions

Each noose cuts out a partial solution with O(√k) subpaths of OPT.

What can be the set of terminals visited by this partial solution?

Cutting terminals from a tree

Lemma
We can compute a collection 𝒯 of k^{O(√k)} subsets of terminals such that if C is a cycle intersecting the tree T at most O(√k) times, then the set of terminals enclosed by C is in 𝒯.

We can restrict attention to partial solutions whose set of visited terminals lies in 𝒯!

Algorithm

Compute the collection 𝒯 (the possible sets of terminals enclosed by a cycle intersecting the tree T at most O(√k) times).
Start with an initial set of trivial partial solutions.
Combine two partial solutions as long as possible, keeping a merged solution only if it visits a subset in 𝒯.
Keep at most one partial solution from each type: the best one encountered so far.
Return the best partial solution that consists of a single path (cycle) visiting all vertices.

Only k^{O(√k)} subproblems are considered, so the running time is k^{O(√k)}·n^{O(1)}.

The existence of the sphere-cut decomposition implies that the algorithm finds an optimum solution!
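A high-level Python sketch of the combine loop restricted to 𝒯; the merge step and the enumeration of gluing matchings are abstracted behind a callback, and all names are illustrative rather than taken from the paper.

```python
# Skeleton of the "combine as long as possible, keep the best per type" loop,
# restricted to partial solutions whose visited set lies in allowed_visited
# (the collection called script-T on the slides).
# `initial` is an iterable of (sol_type, cost, data) triples;
# `merge_all(a, b)` is assumed to yield every way of merging the two stored
# partial solutions, again as (sol_type, cost, data) triples, where sol_type
# is (endpoints, matching, visited).
def combine_loop(initial, merge_all, allowed_visited):
    best = {}                       # sol_type -> (cost, data), best seen so far

    def consider(sol_type, cost, data):
        _endpoints, _matching, visited = sol_type
        if visited not in allowed_visited:
            return False            # prune: visited set not in script-T
        if sol_type in best and best[sol_type][0] <= cost:
            return False
        best[sol_type] = (cost, data)
        return True

    for sol_type, cost, data in initial:
        consider(sol_type, cost, data)

    changed = True
    while changed:                  # iterate until no merge improves anything
        changed = False
        items = list(best.items())
        for type_a, (cost_a, data_a) in items:
            for type_b, (cost_b, data_b) in items:
                for merged in merge_all((type_a, cost_a, data_a),
                                        (type_b, cost_b, data_b)):
                    if consider(*merged):
                        changed = True
    return best
```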

Many intersections

What happens if OPT + T has more than O(k) intersections?

Let us contract the subpaths of OPT between consecutive terminals (each such subpath is a shortest path).

Each noose goes through O(√k) contracted vertices
⇒ we can guess the contractions that produced these vertices.

Self-crossing solutions

It is not possible to bound the number of self-crossings by a function of k, but we can show that there is a solution that is a "cactus."

Lower bound for Steiner Tree

Theorem [new result #2]
Assuming the ETH, Steiner Tree on planar undirected graphs with k terminals cannot be solved in time 2^{o(k)}·n^{O(1)}.

Standard techniques show that Steiner Tree (and many other problems) does not have a 2^{o(√k)}·n^{O(1)}-time algorithm assuming the ETH, but a lower bound ruling out 2^{o(k)}·n^{O(1)} is quite unusual!

Standard lower bounds for planar problems

ETH + Sparsification Lemma
There is no 2^{o(n+m)}-time algorithm for m-clause 3SAT.

A typical reduction from 3SAT creates O(n+m) gadgets and O((n+m)^2) crossings in the plane, and a crossing typically increases the size by O(1):

3SAT formula φ with n variables and m clauses  ⇒  planar graph G0 with O((n+m)^2) vertices and O((n+m)^2) edges.

Corollary
Assuming the ETH, there is no 2^{o(√n)} algorithm for Steiner Tree on an n-vertex planar graph, and no 2^{o(√k)}·n^{O(1)} algorithm for Steiner Tree on an n-vertex planar graph with k terminals.

There is no way such reductions could give a bound stronger than 2^{o(√k)}!
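A back-of-the-envelope check of why the quadratic blow-up caps the bound at 2^{o(√k)}; the assumption that k can grow quadratically is my reading of "typical" reductions whose crossing gadgets contain terminals.

```latex
% The reduction produces a planar instance with N = O((n+m)^2) vertices, and
% k <= N = O((n+m)^2).  Hence a 2^{o(sqrt k)} * N^{O(1)} algorithm solves 3SAT
% in time
\[
2^{o(\sqrt{k})}\cdot N^{O(1)}
  \;\le\; 2^{o(n+m)}\cdot (n+m)^{O(1)}
  \;=\; 2^{o(n+m)},
\]
% contradicting the ETH.  But since the crossing gadgets themselves typically
% contain terminals, k can be as large as \Theta((n+m)^2), and then a
% 2^{o(k)} * N^{O(1)} algorithm only yields time 2^{o((n+m)^2)}, which does
% not contradict the ETH -- such reductions cannot rule out 2^{o(k)}.
```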

Stronger lower bound

We get around this issue with crossing gadgets in which a stream of many bits crosses a stream of one bit while using only O(1) terminals.

Reduction from 3SAT

Partition the variables into g groups of size n/g each.

Horizontal flow: the assignment of group i (2^{n/g} possibilities).
Vertical flow: checking the satisfiability of each clause Cj.
Graph size: N = 2^{O(n/g)}, with k = O(m·g) terminals.

(Figure: a g × m grid of gadgets, with rows Group 1, …, Group g and columns C1, C2, …, Cm.)

A running time of 2^{O(k/g²)}·N^{O(1)} for Steiner Tree would give a running time of 2^{O(m/g)}·2^{O(n/g)} = 2^{o(n+m)} for 3SAT.
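The arithmetic behind the last implication, spelled out; finishing the contradiction by letting g grow slowly is one standard way to phrase it and glosses over the exact bookkeeping in the paper.

```latex
% With k = O(mg) terminals and N = 2^{O(n/g)} vertices, an algorithm for
% Steiner Tree running in time 2^{O(k/g^2)} * N^{O(1)} solves the 3SAT
% instance in time
\[
2^{O(k/g^2)}\cdot N^{O(1)}
  \;=\; 2^{O(mg/g^2)}\cdot 2^{O(n/g)}
  \;=\; 2^{O(m/g)+O(n/g)}
  \;=\; 2^{o(n+m)} \qquad\text{for } g = \omega(1),
\]
% contradicting the ETH.  A hypothetical 2^{o(k)} * n^{O(1)} algorithm for
% Steiner Tree fits within the 2^{O(k/g^2)} * N^{O(1)} budget once g is chosen
% as a sufficiently slowly growing function of n + m, so no such algorithm
% can exist under the ETH.
```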

Summary

1. Main positive result
Subset TSP for k cities in a directed planar graph can be solved in time 2^{O(√k log k)}·n^{O(1)}.
Exploit that the union of the unknown solution + a known something has treewidth O(√k).

2. Main negative result
Assuming the ETH, Steiner Tree on planar undirected graphs with k terminals cannot be solved in time 2^{o(k)}·n^{O(1)}.
The square root phenomenon does not appear for every problem, making the previous positive results even more interesting!
