Can Machine Learning (ANNs) solve NP-Complete problems?

Hilaal Alam
2 min readAug 23, 2022


NP stands for Nondeterministic Polynomial time. NP problems are those computational problems that can be solved in polynomial time by a nondeterministic computer (not our common, day-to-day digital computers), but whose 'yes' answers can be checked in polynomial time by a deterministic computer (a common digital computer). NP-Complete problems are the hardest problems in NP (assuming P ≠ NP).

(Image: Wikipedia)

For example, deciding whether a graph is Hamiltonian is very hard, even with supercomputers. The Hamiltonian cycle problem asks for a path through a given graph that visits each node exactly once and ends where it began.

However, verifying a proposed Hamiltonian Cycle (HC) is a very simple process.
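A minimal sketch of this asymmetry: checking a candidate cycle takes time linear in the number of nodes, even though finding one is hard. The graph and cycles below are made-up examples.

```python
def is_hamiltonian_cycle(adj, cycle):
    """Check that `cycle` visits every node of the graph `adj`
    (a dict mapping node -> set of neighbours) exactly once
    and that consecutive nodes are connected, wrapping around."""
    nodes = set(adj)
    if len(cycle) != len(nodes) or set(cycle) != nodes:
        return False  # must visit each node exactly once
    # every consecutive pair (including last -> first) must be an edge
    return all(cycle[(i + 1) % len(cycle)] in adj[v]
               for i, v in enumerate(cycle))

# toy graph: a square with one diagonal
adj = {0: {1, 3}, 1: {0, 2, 3}, 2: {1, 3}, 3: {0, 1, 2}}
print(is_hamiltonian_cycle(adj, [0, 1, 2, 3]))  # True
print(is_hamiltonian_cycle(adj, [0, 2, 1, 3]))  # False: 0-2 is not an edge
```

The check is O(n), while a brute-force search would have to try up to (n-1)! orderings.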

(Image: https://geeksforgeeks.org/proof-that-hamiltonian-cycle-is-np-complete/)

Why is this so significant?

For example, consider a traveling salesman who must visit every city exactly once before returning home: finding the minimum-cost tour is an NP-Complete problem. In both cases (finding the Hamiltonian cycle, and finding the shortest tour), solving the problem is hard; verifying a proposed solution is easy.

(Image:https://optimization.mccormick.northwestern.edu/index.php/Traveling_salesman_problems)
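The same contrast can be sketched for the traveling salesman: scoring a given tour is linear in its length, while the naive search tries every ordering. The 4-city distance matrix below is invented for illustration.

```python
from itertools import permutations

# made-up symmetric distances between 4 hypothetical cities
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]

def tour_cost(tour):
    """Easy part: verifying/scoring a proposed tour is linear in its length."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def brute_force_tsp(n):
    """Hard part: finding the best tour means trying all (n-1)! orderings."""
    best = min(permutations(range(1, n)),
               key=lambda rest: tour_cost((0,) + rest))
    return (0,) + best, tour_cost((0,) + best)

tour, cost = brute_force_tsp(4)
print(tour, cost)  # best tour has cost 18 for this matrix
```

Four cities means only 3! = 6 orderings, but at 20 cities the same search already needs about 1.2 × 10^17 of them.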

Machine Learning (ML) addresses learning problems: given examples from two classes, the goal is to construct a "learner" algorithm that identifies which category a new example belongs to. The formal framework for this is PAC Learning, Probably Approximately Correct Learning.

(image: https://data-flair.training/blogs/cats-dogs-classification-deep-learning-project-beginners/)
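A toy PAC-learning sketch (an illustrative example, not from the article): the concept class is thresholds on [0, 1], the learner sees labeled samples and outputs the smallest positive example as its threshold estimate. The true threshold and sample count are assumptions for the demo.

```python
import random

TRUE_THRESHOLD = 0.3  # the hidden concept; unknown to the learner

def label(x):
    """Oracle labelling examples by the hidden threshold."""
    return x >= TRUE_THRESHOLD

def learn_threshold(samples):
    """Output the smallest positively-labelled sample as the hypothesis."""
    positives = [x for x in samples if label(x)]
    return min(positives) if positives else 1.0

random.seed(0)
samples = [random.random() for _ in range(1000)]
h = learn_threshold(samples)
# "Probably approximately correct": with high probability over the sample,
# the hypothesis h lies within a small epsilon of the true threshold.
print(h)
```

The PAC guarantee is exactly this shape: for any accuracy ε and confidence δ, enough samples make the error at most ε with probability at least 1 − δ.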

PAC learning is a lot like BPP (Bounded-error Probabilistic Polynomial time), the class of problems solvable in polynomial time with a bounded probability of error. It is widely conjectured that BPP = P, and the relationship between BPP and NP is not known yet.

(Image: https://researchgate.net/figure/Known-relations-between-complexity-classes-relevant-for-AQC-The-BStoqP-class-defined_fig1_310235960)
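A classic example of a BPP-style algorithm (an illustrative aside, not from the article) is Freivalds' check: it tests whether A·B = C in O(n²) time per trial, with one-sided error probability at most 1/2 per trial, so k trials drive the error below 2⁻ᵏ. That "polynomial time with bounded error" shape is what BPP captures.

```python
import random

def freivalds(A, B, C, trials=20):
    """Randomized check that A @ B == C for n x n integer matrices.
    Each trial multiplies by a random 0/1 vector instead of doing
    the full O(n^3) product; a mismatch is caught with prob >= 1/2."""
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False  # definitely not equal
    return True  # equal with probability >= 1 - 2**-trials

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[19, 22], [43, 50]]   # the correct product A @ B
D = [[19, 22], [43, 51]]   # wrong in one entry
print(freivalds(A, B, C))  # True
```

The false answer is always certain; only the true answer carries (exponentially small) error, which is why repeating trials is enough to make the error bound negligible.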

There are attempts to show that every PAC-learnable problem is in BPP. Unfortunately, the known results from PAC theory and from ML practice themselves show a big disconnect.

Hence, it is unlikely that we can prove or disprove the learnability of NP-Complete problems with neural network techniques.
