# Computational methods for integrality gaps analysis

## People

(Principal Investigator)

(Co-Investigator)

## Abstract

Mathematical programming relaxations, and especially linear programming relaxations, have played a central role both in solving combinatorial optimization problems in practice [see e.g. the Concorde TSP solver by D. L. Applegate, R. E. Bixby, V. Chvátal and W. J. Cook] and in the design and analysis of approximation algorithms [see e.g. Vazirani's book]. An important concept is the integrality gap, the maximum ratio between the solution quality of the integer linear program (ILP) and that of its linear programming (LP) relaxation. For many problems the integrality gap of a natural LP (or SDP) formulation equals both the approximation ratio of the best known algorithm and the hardness-of-approximation ratio, and thus essentially represents the inherent limits of the considered relaxation. The following facts emerge from a review of the literature on our current understanding of integrality gaps.
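To make the notion concrete, the following minimal sketch (an illustrative textbook example, not part of this project's method) computes the integrality gap of the standard LP relaxation of minimum vertex cover on a triangle: the ILP optimum is 2, the LP optimum is 3/2 (each vertex set to 1/2), so the gap is 4/3.

```python
import itertools
from scipy.optimize import linprog

# Integrality gap = (optimal ILP value) / (optimal LP value) for a
# minimization problem.  Example: minimum vertex cover on a triangle,
#   min x1+x2+x3  s.t.  x_i + x_j >= 1 for every edge, 0 <= x_i <= 1.
edges = [(0, 1), (0, 2), (1, 2)]
n = 3

# LP relaxation: rewrite x_i + x_j >= 1 as -x_i - x_j <= -1 for linprog.
A_ub = [[-1 if v in e else 0 for v in range(n)] for e in edges]
res = linprog(c=[1] * n, A_ub=A_ub, b_ub=[-1] * len(edges), bounds=(0, 1))
lp_value = res.fun  # 1.5, attained by the fractional cover (1/2, 1/2, 1/2)

# ILP optimum by brute force over all 0/1 assignments.
ilp_value = min(
    sum(x)
    for x in itertools.product((0, 1), repeat=n)
    if all(x[i] + x[j] >= 1 for i, j in edges)
)  # 2: any two vertices cover the triangle

gap = ilp_value / lp_value  # 2 / 1.5 = 4/3
print(lp_value, ilp_value, gap)
```

For NP-hard problems the ILP side of this computation is of course intractable at scale, which is why establishing gap lower bounds requires constructing explicit ``bad'' instances rather than exhaustive search.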

• Proving integrality gaps for linear relaxations of NP-hard optimization problems is a difficult task, usually undertaken on a case-by-case basis. Very few, and limited, attempts have been made to use computer-assisted analysis for this task.

• For several notable and important problems, such as the traveling salesman problem (TSP) and the Steiner tree problem (STP), our understanding of the integrality gap has resisted the persistent efforts of many researchers over the last decades.

Inspired by recent progress in artificial intelligence research based on heuristic search methods, within this project we investigate integrality gap instances and the computational generation of ``hard'' instances, with a ``large'' number of vertices, for problems like the traveling salesman problem and the tree augmentation problem. In this framework we want to combine the most recent theoretical advances with innovative experimental heuristic methodologies for the analysis of integrality gaps. More specifically, we plan to investigate and exploit aspects of polyhedral theory to reduce the search space. We then combine this information with local search, bio-inspired heuristics and branch-and-cut techniques to search the pruned space effectively. This approach will allow us to obtain computer-aided constructions of ``bad'' instances, namely problem instances having ``large'' integrality gap values for the considered LP relaxation. The aim is to obtain a new and better understanding of lower bounds on the corresponding integrality gaps. For example, for the symmetric TSP this will either experimentally support or disprove (by computing witness instances) a longstanding conjecture (formulated more than 30 years ago) stating that the integrality gap of the Held-Karp LP is 4/3. Until now an experimental analysis of this kind has been conducted only for very small instances (with at most 12 nodes). Note that the currently known ``bad'' instances, which asymptotically match the best-known lower bound of 4/3, are ``large'' instances. The general rationale is that computer-aided heuristic search can be a valuable tool for mathematical exploration: it can potentially search for much more complicated instances than is feasible manually. Moreover, we plan to use the outcomes of these experiments to drive the theoretical analysis of the considered integrality gaps.
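As a toy illustration of the kind of gap measurement involved (a sketch only, assuming SciPy; the project targets far larger instances and heuristic rather than exhaustive search), one can solve the Held-Karp LP, i.e., vertex degree constraints plus subtour-elimination cuts, for a tiny Euclidean instance and compare its value with the optimal tour found by brute force:

```python
import itertools
import math
from scipy.optimize import linprog

# Tiny Euclidean TSP instance: six points on a 3x2 unit grid.
points = [(0, 0), (1, 0), (2, 0), (2, 1), (1, 1), (0, 1)]
n = len(points)
edges = list(itertools.combinations(range(n), 2))
cost = [math.dist(points[i], points[j]) for i, j in edges]

def cut(S):
    """0/1 row selecting the edges crossing the cut (S, V \\ S)."""
    return [1 if (i in S) != (j in S) else 0 for i, j in edges]

# Degree constraints: every vertex is incident to exactly 2 tour edges.
A_eq = [cut({v}) for v in range(n)]
# Subtour-elimination cuts x(delta(S)) >= 2, written as <= for linprog;
# with n = 6 it is feasible to enumerate all relevant subsets explicitly.
A_ub, b_ub = [], []
for size in range(2, n - 1):
    for S in itertools.combinations(range(n), size):
        A_ub.append([-a for a in cut(set(S))])
        b_ub.append(-2)

res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub,
              A_eq=A_eq, b_eq=[2] * n, bounds=(0, 1))
lp_value = res.fun  # Held-Karp lower bound on the optimal tour length

# Optimal tour by brute force (fix vertex 0 to factor out rotations).
length = {e: c for e, c in zip(edges, cost)}
def d(i, j):
    return length[(min(i, j), max(i, j))]
ilp_value = min(
    sum(d(t[k], t[(k + 1) % n]) for k in range(n))
    for t in ((0,) + p for p in itertools.permutations(range(1, n)))
)

print(lp_value, ilp_value, ilp_value / lp_value)
```

For this grid instance the relaxation is tight (both values equal 6, so the gap is 1); the instances the project searches for are precisely those where the ratio is pushed towards the conjectured 4/3 bound, and the brute-force step above is what makes naive enumeration hopeless beyond a handful of nodes.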

Furthermore, as with the computation of gap instances, obtaining ``hard'', i.e., worst-case, instances for optimization algorithms for NP-hard problems appears to be a very hard task, usually undertaken on a case-by-case basis. This general difficulty has important repercussions in practice as well. For example, heuristic algorithms are often tested experimentally on randomly generated instances, or more generally without adequate effort to generate instances that expose weaknesses in the algorithm being tested. Interestingly, our preliminary computational results show that the study of gap instances can also be useful for the latter purpose.

Indeed, in support of this project application, we have obtained preliminary research results that are very promising. More precisely, we applied the computational method proposed in this project to generate metric TSP instances having a large integrality gap. The computational results show that our method is very effective in producing not only large-gap instances but also challenging instances for Concorde, the state-of-the-art TSP solver. As a by-product, we release Hard-TSPLIB, a library of 41 small metric TSP instances which have a large integrality gap and are challenging in terms of run-time for Concorde. Our approach is general and encompasses many ``graph'' problems. Extending and improving these preliminary results to many other graph problems is a feasible short-to-medium-term goal of this project.

From a broader perspective, this project is a step towards the challenging and ambitious goal of developing a theory and practice for the construction of ``bad'' instances for optimization algorithms. This analysis is not an end in itself: it allows us to understand the limits of the considered algorithms, with the aim of designing better ones.

The project will be carried out at the Istituto Dalle Molle di Studi Sull'Intelligenza Artificiale (IDSIA) with a project team supported by the Swiss National Science Foundation, composed of 1 postdoc and 2 PhD students over 4 years.

## Additional information

**01.09.2023**

**31.08.2027**

**48 Months**

**SNSF**

**Ongoing**

**Swiss National Science Foundation / Project Funding / Mathematics, Natural and Engineering Sciences (Division II)**