Tuesday, November 30th, 2021

The 50-Year-Old Problem that Eludes Theoretical Computer Science

by Siobhan Roberts (MIT Technology Review)

This excerpt is published with permission from MIT Technology Review. The full article, from October 27, 2021, is available here: https://www.technologyreview.com/2021/10/27/1037123/p-np-theoretical-com...

“More and more, I’m starting to wonder whether P equals NP,” Toniann Pitassi, a computer scientist at the University of Toronto and a former PhD student of Stephen Cook’s, says. Her approach to circling the problem is to study both scaled-up and scaled-down analogues, harder and easier models. “Sometimes generalizing the question makes it clearer,” she says. But overall, she hasn’t achieved clarity: “Most people think P doesn’t equal NP. And I don’t know. Maybe it’s just me, but I feel like it’s become less and less clear that that’s the truth.”

Historically, Pitassi points out, surprising results have occasionally come out of nowhere: seeming impossibilities proved possible by smart algorithm designers. The same could happen with P vs. NP, maybe in another 50 years or a century. One of the most important results in all of complexity theory, for instance, was achieved by David Barrington in 1989. The gist of it is that he devised a clever algorithm that set out to do something that seemingly should’ve required an unbounded amount of memory but in fact used an astonishingly small amount: just five bits of information, enough to specify a number between one and 32 (inclusive), or a single letter of the alphabet.
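Barrington’s construction rests on a small piece of group theory: two well-chosen shuffles of five objects, applied together with their inverses, cancel out completely unless both shuffles actually happen. The Python sketch below is not from the article; it is a minimal illustration of that core step, computing the AND of two bits while the only working state carried between steps is which of five slots a single token occupies. The permutations SIGMA and TAU here are one standard choice with the required cancellation property.

# Toy sketch of the group-theoretic trick behind Barrington's theorem:
# the AND of two bits is read off from where a single token ends up,
# so the working memory between steps is just one of five positions.

def inverse(p):
    """Invert a permutation of {0, ..., 4} given as a tuple."""
    inv = [0] * 5
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

IDENTITY = (0, 1, 2, 3, 4)
SIGMA = (1, 2, 3, 4, 0)   # the 5-cycle sending 0 -> 1 -> 2 -> 3 -> 4 -> 0
TAU = (2, 0, 4, 1, 3)     # a second 5-cycle that does not commute with SIGMA

def and_gate(a, b):
    """Compute a AND b; each bit is encoded as 'do a shuffle' or 'do nothing'."""
    p = SIGMA if a else IDENTITY
    q = TAU if b else IDENTITY
    token = 0
    for step in (p, q, inverse(p), inverse(q)):
        token = step[token]   # the entire state between steps: one of five slots
    # If either bit is 0, the four steps cancel and the token returns to slot 0;
    # if both bits are 1, the net shuffle has no fixed points, so the token moves.
    return 1 if token != 0 else 0

assert all(and_gate(a, b) == (a & b) for a in (0, 1) for b in (0, 1))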

A more recent and related result, from 2014, surprised James Cook—Stephen Cook’s son, 36, also a computer scientist, who has taken up the quest. Drawing from Barrington’s theorem, the result uses memory in a wonderfully weird way. As hinted in the title of the paper, by Harry Buhrman and collaborators, it’s about “computing with a full memory.” James can rattle off the paper’s introductory paragraph practically verbatim:

[Image: the paper’s introductory paragraph, which poses the question of whether a memory that is already completely full, and whose contents must be preserved, can nonetheless be used to carry out a computation.]

The answer, counterintuitively, is yes. 

James thinks of it as “borrowed memory.” After the shock of this result sank in, he had fun figuring out how to apply it to a particular problem, picking up where his dad had left off. A couple of decades ago, Steve Cook moved on to other related problems in complexity theory. For one of them, he made a conjecture about the amount of memory an algorithm would need to solve it, honed to the absolute minimum. In 2019, James, together with Ian Mertz, one of Pitassi’s PhD students, deployed the poetic idea of borrowing memory and proved that even less memory was needed. The result didn’t go all the way to refuting his dad’s conjecture, but it is a bit of progress in the grand complexity quest nonetheless.
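One way to get a feel for borrowed memory is a toy example of the underlying cancellation trick. This sketch is my illustration, not a construction from the paper: an auxiliary register a starts out full of arbitrary data, is pressed into service mid-computation, and is handed back exactly as it was found, while the useful result x*y still accumulates into the output register z.

# Toy sketch of "borrowed memory": register a is already full of data we
# must not destroy, yet the computation uses it and then restores it.

def add_product_with_borrowed_memory(x, y, z, a):
    """Add x*y into z, using the full register a as scratch space."""
    a_original = a        # kept only so we can check the restoration below
    a += x                # a now holds its old contents plus x
    z += a * y            # z gains a_original*y + x*y
    a -= x                # the borrowed register is returned to its old value
    z -= a * y            # z loses a_original*y, so exactly x*y remains added
    assert a == a_original
    return z, a

# Whatever data starts in a, it survives, and z gains x*y (here, 6*7 = 42).
z, a = add_product_with_borrowed_memory(x=6, y=7, z=100, a=987654321)
assert (z, a) == (142, 987654321)

Each step is individually reversible, so the junk sitting in a cancels out of the final answer. The constructions in the paper push this kind of cancellation much further, but the flavor is the same.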

Copyright © 2021, all rights reserved MIT Technology Review, www.technologyreview.com.