Leakage-Resilient Hardness vs Randomness

Yanyi Liu, Rafael Pass

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A central open problem in complexity theory concerns the question of whether all efficient randomized algorithms can be simulated by efficient deterministic algorithms. The celebrated "hardness vs. randomness" paradigm pioneered by Blum-Micali (SIAM JoC'84), Yao (FOCS'84) and Nisan-Wigderson (JCSS'94) presents hardness assumptions under which, e.g., prBPP = prP (so-called "high-end" derandomization) or prBPP ⊆ prSUBEXP (so-called "low-end" derandomization), and more generally under which prBPP ⊆ prDTIME(C) where C is a "nice" class (closed under composition with a polynomial); however, these hardness assumptions are not known to also be necessary for such derandomization.

In this work, following the recent work by Chen and Tell (FOCS'21) that considers "almost-all-input" hardness of a function f (i.e., hardness of computing f on all but finitely many inputs), we consider "almost-all-input" leakage-resilient hardness of a function f - that is, hardness of computing f(x) even given, say, √|x| bits of leakage of f(x). We show that leakage-resilient hardness characterizes derandomization of prBPP (i.e., it yields a condition that is both necessary and sufficient for derandomization), both in the high-end and in the low-end setting.

In more detail, we show that there exists a constant c such that for every function T, the following are equivalent:

  • prBPP ⊆ prDTIME(poly(T(poly(n))));
  • there exists a poly(T(poly(n)))-time computable function f : {0,1}^n → {0,1}^n that is almost-all-input leakage-resilient hard with respect to n^c-time probabilistic algorithms.

As far as we know, this is the first assumption that characterizes derandomization in both the low-end and the high-end regime. Additionally, our characterization naturally extends to derandomization of prMA, and also to average-case derandomization, by appropriately weakening the requirements on the function f. In particular, for the case of average-case (a.k.a. "effective") derandomization, we no longer require the function to be almost-all-input hard, but only to satisfy the more standard notion of average-case leakage-resilient hardness (with respect to every samplable distribution), whereas for derandomization of prMA, we instead consider leakage-resilience for relations.
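For readability, the main equivalence can be sketched in display form. This is a restatement of the abstract's characterization in the abstract's own notation, not a verbatim theorem statement from the paper:

```latex
% Sketch of the characterization stated in the abstract.
% Here f is "almost-all-input leakage-resilient hard" w.r.t. n^c-time
% probabilistic algorithms if every such algorithm, even when given
% sqrt(|x|) bits of leakage of f(x), computes f(x) correctly on at
% most finitely many inputs x.
\exists c \;\forall T:\quad
  \mathrm{prBPP} \subseteq \mathrm{prDTIME}\!\big(\mathrm{poly}(T(\mathrm{poly}(n)))\big)
  \;\Longleftrightarrow\;
  \exists f \colon \{0,1\}^n \to \{0,1\}^n
  \text{ computable in time } \mathrm{poly}(T(\mathrm{poly}(n)))
  \text{ that is almost-all-input leakage-resilient hard}
  \text{ for } n^c\text{-time probabilistic algorithms.}
```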

Original language: English
Title of host publication: 38th Computational Complexity Conference, CCC 2023
Editors: Amnon Ta-Shma
Publisher: Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
ISBN (Electronic): 978-3-95977-282-2
State: Published - Jul 2023
Event: 38th Computational Complexity Conference, CCC 2023 - Warwick, United Kingdom
Duration: 17 Jul 2023 - 20 Jul 2023

Publication series

Name: Leibniz International Proceedings in Informatics, LIPIcs
Volume: 264

Conference

Conference: 38th Computational Complexity Conference, CCC 2023
Country/Territory: United Kingdom
City: Warwick
Period: 17/07/23 - 20/07/23

Keywords

  • Derandomization
  • Leakage-Resilient Hardness

All Science Journal Classification (ASJC) codes

  • Software

