Readit News
ode · 2 years ago
Two of Wigderson's major papers mentioned in the announcement are co-authored with Noam Nisan, one of the professors behind the well-known online course 'From Nand to Tetris'.
YouWhy · 2 years ago
Prof. Nisan is also remarkable for having made first-rate contributions to CS theory and then shifting to major impact in the rather distinct field of algorithmic game theory. It's heartwarming both that a single person living today can be so versatile, and that the system allowed him that flexibility.
sn9 · 2 years ago
There's a book, too!

Recently came out as a second edition.

COGlory · 2 years ago
Here's a great Quanta article as well:

https://www.quantamagazine.org/avi-wigderson-complexity-theo...

One thing I enjoyed was the variety of poses they had Wigderson in. It just looks so awkward. "Here, sit in this chair and look out the window longingly".

jkaptur · 2 years ago
> The unreasonable effectiveness of randomness led him to think about the nature of randomness itself. He, like other researchers at the time, questioned how necessary it was for efficient problem-solving and under what conditions it could be eliminated altogether. “Initially, it was not clear if this was only our own stupidity, that we cannot remove randomness,” he said. “But the larger question was whether randomness can always be eliminated efficiently or not.” He realized that the need for randomness was intimately tied to the computational difficulty of the problem.

Does anyone have a sketch of how this works? My understanding of complexity classes is that they consider worst case performance. Even if you have a great pseudorandom number generator and a great randomized algorithm, how do you prove that no instance of RNG + seed + problem instance takes exponential time?

bo1024 · 2 years ago
BPP is the complexity class for decision problems that a randomized algorithm can solve in polynomial time, in the sense that on every input (worst-case), the algorithm is right with at least 2/3 probability (over its own randomness).

One example is deciding whether a given number is prime. For a long time, we knew randomized algorithms like the Miller-Rabin test that are correct with high probability. That is, we knew PRIMES is in BPP. But we didn't know a deterministic algorithm. In 2002, a deterministic polynomial-time algorithm (AKS) was discovered, proving that PRIMES is in P.
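A minimal sketch of the Miller-Rabin idea in Python (each round catches a composite with probability at least 3/4, so the error probability shrinks exponentially in the number of rounds):

```python
import random

def miller_rabin(n, rounds=20):
    """Randomized primality test: always correct on primes; wrong on a
    composite with probability at most 4**-rounds."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # Write n - 1 as d * 2**r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)       # random base: the "coin flips"
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                     # a witnesses that n is composite
    return True                              # probably prime
```

So `miller_rabin` is a BPP-style algorithm for PRIMES; the AKS result showed the randomness here can be removed entirely.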

One of the central open problems in Computer Science is whether BPP=P. That is, for any problem we can solve with randomness (in the above sense), can we solve it without randomness? Among many other contributions, Wigderson's work has made a ton of progress toward this problem.

The general idea is to "derandomize" any randomized algorithm that runs in polynomial time and is correct with probability 2/3: feed it the output of a pseudorandom generator (PRG) instead of true random bits. If the PRG is really, really good and its seed is short enough (logarithmic in the input size), we get a deterministic algorithm by running the original algorithm on every possible seed and taking a majority vote.
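A toy sketch of that seed-enumeration step (the "algorithm" and the trivial PRG here are made-up stand-ins, just to show the majority-vote structure):

```python
from collections import Counter

def derandomize(randomized_alg, prg, seed_bits, x):
    """Run the algorithm on the PRG's output for every seed and take a
    majority vote. If the PRG fools the algorithm, the majority answer
    matches the answer under true randomness."""
    votes = Counter()
    for seed in range(2 ** seed_bits):   # poly(n) seeds when seed_bits = O(log n)
        votes[randomized_alg(x, prg(seed))] += 1
    return votes.most_common(1)[0][0]

# Hypothetical stand-ins: an "algorithm" that is misled by 1/4 of all
# 4-bit random strings, and a do-nothing PRG.
def toy_alg(x, r):
    misled = (r & 0b11) == 0b11          # errs when the low 2 bits are 11
    return (x % 2 == 0) != misled

identity_prg = lambda seed: seed
```

Here `derandomize(toy_alg, identity_prg, 4, 10)` returns True (10 is even): 12 of the 16 seeds vote correctly and outvote the 4 misleading ones.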

Now, if we can prove certain problems are hard, then we can also prove that such really, really good PRGs exist. This is basically because telling the PRG's output apart from true randomness would itself be a computationally hard problem.

sn41 · 2 years ago
An important idea is to use what are called worst-case-to-average-case reductions. You convert a Boolean function f that is worst-case hard into another function g that is average-case hard. Contrapositively, if g is average-case easy, then f is worst-case easy.

These reductions are a major achievement in complexity theory. Constructions use tools like combinatorial designs, list-decoding of error-correcting codes, expander graphs, extractors, etc. A good overview of these methods is in Arora and Barak's "Computational Complexity", in, I believe, Chapters 19 through 21.

hinkley · 2 years ago
Randomization prevents the worst degenerate cases from being repeatable and thus exploitable (by an attacker, or as a self-own).

Using randomization this way is really an applied-CS rather than a theoretical-CS concern. The worst case can still happen; it just no longer happens consistently to the same customer, or at the same time every day.

Eliminating the randomized solution was not on my radar so I’ve got some homework to do.

gaws · 2 years ago
> Correction: April 10, 2024

> The original version of this article said Wigderson attended the University of Haifa. He actually graduated from the Technion, in Haifa, Israel.

How did the reporter mess that up?

chaosite · 2 years ago
Haifa is the only city in Israel to have two public universities (excluding branches of the Open University.)

I can see the reporter reading that he attended university in Haifa and misconstruing that to mean the University of Haifa...

sandspar · 2 years ago
That "sitting in chair looking out the window" pose is a Martin Scorsese or Sopranos type pose. The elderly gangster in the old folk's home etc.

dataexp · 2 years ago
In the post linked below, Scott Aaronson relates the impact one of Avi Wigderson's talks had on his career.

https://scottaaronson.blog/?p=2925

wslh · 2 years ago
More info at "Israeli Wins Turing Prize, Computing's Highest Honor, for Insights on Randomness": [1] and its archived version [2]

[1] https://www.haaretz.com/israel-news/2024-04-10/ty-article/.p...

[2] https://archive.is/e8uix

hinkley · 2 years ago
> Computer scientists have discovered a remarkable connection between randomness and computational difficulty (i.e., identifying natural problems that have no efficient algorithms). Working with colleagues, Wigderson authored a highly influential series of works on trading hardness for randomness. They proved that, under standard and widely believed computational assumptions, every probabilistic polynomial time algorithm can be efficiently derandomized (namely, made fully deterministic). In other words, randomness is not necessary for efficient computation. This sequence of works revolutionized our understanding of the role of randomness in computation, and the way we think about randomness.

How would I go about catching up with this aspect of his research? It’s not often that I’ve never heard of a Turing winner, but this guy is completely off of my radar.

jumpCastle · 2 years ago
You can try his book. https://www.math.ias.edu/avi/book
layer8 · 2 years ago
I wonder what the “standard and widely believed computational assumptions” are. Presumably, probabilistic approximations to NP-complete problems are not polynomial-time? Or the derandomized versions would still be just approximations?
bo1024 · 2 years ago
They are generally esoteric conjectures similar in spirit to P != NP. For example, the third paper uses the assumption that "E requires exponential circuits". Here E is the class of problems solvable in exponential time, i.e. 2^O(n) time, on a Turing machine. Another model of computation besides Turing machines is Boolean circuits, built from AND, OR, and NOT gates. The conjecture is that not every problem in E can be solved by "small" (subexponential-sized) circuits.

The basic idea of the work is that if these problems are hard, then we can use them to build pseudorandom generators that are "just as good" as true random, which we can use to turn truly random algorithms into pseudorandom algorithms with the same performance.

reader5000 · 2 years ago
I think his argument assumes the existence of pseudorandom generators that map a small number of "truly" random bits to a large number of bits that look random to any polytime observer. The "derandomization" is that we just check all possible seeds; the seed length is hopefully logarithmic in the size of the problem, so exhaustive checking over all seeds takes only polynomial time.
pthreads · 2 years ago
Just picked up Wigderson's book and I am liking it so far:

https://press.princeton.edu/books/hardcover/9780691189130/ma...

teleforce · 2 years ago
Final draft version of the book available here for personal research and education:

https://www.math.ias.edu/avi/book

myth_drannon · 2 years ago
I looked at the book and it's aimed more at graduate students and advanced undergrads.

Can someone recommend a more basic book on the topic of computation for someone with a rusty comp-sci/math undergrad background?

jimhefferon · 2 years ago
You could have a look at https://hefferon.net/computation which is certainly aimed at a broader audience.
pthreads · 2 years ago
Try What Can Be Computed? by John MacCormick:

https://press.princeton.edu/books/hardcover/9780691170664/wh...

cryptoxchange · 2 years ago
Introduction to the Theory of Computation by Michael Sipser
sn9 · 2 years ago
Sipser is the canonical text for undergraduates.
tinhhuynh_97 · 2 years ago
Agreed
thedatamonger · 2 years ago
From the related article: https://www.quantamagazine.org/avi-wigderson-complexity-theo...

> ... if a statement can be proved, it also has a zero-knowledge proof.

Mind blown.

>Feeding the pseudorandom bits (instead of the random ones) into a probabilistic algorithm will result in an efficient deterministic one for the same problem.

This is nuts. AI is a probabilistic computation ... so what they're saying - if i'm reading this right - is that we can reduce the complexity of our current models by orders of magnitude.

If I'm living in noobspace someone please pull me out.

IshKebab · 2 years ago
I don't know exactly what it's saying but it definitely isn't that. AI already uses pseudorandom numbers and is deterministic. (Except some weird AI accelerator chips that use analogue computation to improve efficiency.)
ilya_m · 2 years ago
> AI is a probabilistic computation ... so what they're saying - if i'm reading this right - is that we can reduce the complexity of our current models by orders of magnitude.

Unfortunately, no. First, the result applies to decision problems, not search problems. Second, the resulting deterministic algorithm is much less efficient than the randomized one, though it still belongs to the same complexity class (under some mild assumptions).

mxkopy · 2 years ago
Can’t you build search from decision by deciding on every possible input?
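For NP problems like SAT there is actually a cheaper route than deciding every candidate input: a standard search-to-decision self-reduction that makes only polynomially many oracle calls, fixing one variable at a time. A rough sketch (the brute-force oracle is a stand-in just for the demo; clauses are lists of signed variable indices):

```python
from itertools import product

def brute_force_sat(clauses):
    """Exponential-time SAT decision oracle, for demonstration only."""
    variables = sorted({abs(lit) for clause in clauses for lit in clause})
    for bits in product([False, True], repeat=len(variables)):
        asg = dict(zip(variables, bits))
        if all(any(asg[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

def search_via_decision(decide, clauses, n_vars):
    """Recover a satisfying assignment using only a decision oracle:
    fix each variable in turn, keeping the formula satisfiable."""
    if not decide(clauses):
        return None                        # unsatisfiable: no assignment exists
    assignment = {}
    for v in range(1, n_vars + 1):
        if decide(clauses + [[v]]):        # can v be True?
            clauses = clauses + [[v]]
            assignment[v] = True
        else:                              # then v = False must keep it satisfiable
            clauses = clauses + [[-v]]
            assignment[v] = False
    return assignment
```

This uses 1 + n oracle calls for n variables, versus 2^n candidate assignments; it is why "search from decision" is cheap for self-reducible problems, while the BPP derandomization results above are stated for decision problems only.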
Daub · 2 years ago
I love this section of the article (edited for clarity):

"I'm not motivated by application, but I know that we can find uses for fundamental work. Think about Alan Turing. He wrote a mathematical logic paper on the Entscheidungsproblem in an obscure journal. It was not motivated by application."

This is similar to Feynman's plate story, which started as a casual response to something he observed in a university cafeteria and ended in a Nobel Prize.

To extend my point, it is precisely this kind of curiosity-based enquiry that modern academia discourages.