Tight Bounds for General Computation in Noisy Broadcast Networks


Let Π be a protocol over the n-party broadcast channel, where in each round a pre-specified party broadcasts a symbol to all other parties. We wish to design a scheme that takes such a protocol Π as input and outputs a noise-resilient protocol Π' that simulates Π over the noisy broadcast channel, where each received symbol is flipped with a fixed constant probability, independently. What is the minimum overhead in the number of rounds incurred by any such simulation scheme? A classical result by Gallager from the 1980s shows that non-interactive T-round protocols, where the bit communicated in every round is independent of the communication history, can be converted to noise-resilient ones with only an O(log log T) multiplicative overhead in the number of rounds. Can the same be proved for every protocol? Or are there protocols whose simulation requires an Ω(log T) overhead (which always suffices)? We answer both questions in the negative. We give a simulation scheme with an Õ(√(log T)) overhead for every protocol and channel alphabet, and we prove an (almost) matching lower bound of Ω(√(log T)) on the overhead required to simulate the pointer chasing protocol with T = n and a polynomial-size alphabet.
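The channel model in the abstract, and the naive repetition-coding simulation that gives the Ω(log T)-overhead baseline, can be sketched in a few lines of Python. This is an illustrative sketch only: the function names, signatures, and parameters below are not from the paper, and the real simulation schemes discussed in the talk are far more involved.

```python
import random

def noisy_broadcast(bit: int, n: int, eps: float, rng: random.Random) -> list:
    """One round of the noisy broadcast channel: one party broadcasts `bit`,
    and each of the other n - 1 parties receives a copy that is flipped
    independently with probability eps."""
    return [bit ^ (1 if rng.random() < eps else 0) for _ in range(n - 1)]

def repeat_and_decode(bit: int, n: int, eps: float, k: int, rng: random.Random) -> list:
    """Naive repetition coding: broadcast the same bit k times and let each
    receiver take a majority vote over its k noisy copies. Repeating every
    round of a T-round protocol Theta(log T) times drives each round's error
    probability below 1/T, so a union bound over rounds gives the simple
    simulation with O(log T) multiplicative overhead."""
    rounds = [noisy_broadcast(bit, n, eps, rng) for _ in range(k)]
    return [int(sum(r[i] for r in rounds) > k / 2) for i in range(n - 1)]
```

The gap between this O(log T) baseline and Gallager's O(log log T) result for non-interactive protocols is exactly the range that the Õ(√(log T)) upper and Ω(√(log T)) lower bounds pin down for general protocols.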

Dr. Raghuvansh Saxena, School of Technology and Computer Science, TIFR Mumbai

Dr. Raghuvansh Saxena is a Reader at the School of Technology and Computer Science at the Tata Institute of Fundamental Research, Mumbai. His primary research interest is communication complexity and its applications to other areas of theoretical computer science, such as coding theory, algorithmic game theory, streaming algorithms, and distributed systems. His other interests include computational complexity and information theory. Before joining TIFR, he received his Ph.D. from Princeton University under the supervision of Prof. Gillat Kol, and his bachelor's degree in computer science and engineering from IIT Delhi.