We present a new technique for separating deterministic from nondeterministic communication complexity. As a consequence we give almost tight lower bounds on nondeterministic communication complexity with a restricted number of advice bits. In particular, for any function $t : \mathbb{N} \to \mathbb{N}$ with $t(n) \le n/2$ we construct a family $(L_{n,t(n)} : n \in \mathbb{N})$ of languages such that $L_{n,t(n)} \subseteq \{0,1\}^{2n}$, $nc(L_{n,t(n)}) = O(t(n) \cdot \log_2 \frac{n}{t(n)})$ and $nc(\overline{L_{n,t(n)}}) = O(\frac{n}{t(n)} \cdot \log_2 \frac{n}{t(n)} + \log_2 t(n))$, but $nc^{o(t(n))}(L_{n,t(n)}) = \Omega(n / \log_2 \frac{n}{t(n)})$. Here $nc^{r}(L)$ is the nondeterministic communication complexity of $L$ when at most $r$ advice bits may be used. Thus, in contrast to probabilistic communication complexity, a small reduction in the number of advice bits results in almost maximal communication. As a special case we obtain a family $L_n \subseteq \{0,1\}^{2n}$ of languages with $nc^{o(\sqrt{n}/\log_2 n)}(L_n) = \Omega(n / \log_2 n)$ and $nc(L_n) + nc(\overline{L_n}) = O(\sqrt{n})$; hence nondeterministic communication with slightly restricted access to advice bits is almost quadratically weaker than nondeterminism that always gives correct answers (from the set {yes, no, ?}). As a consequence we obtain an almost optimal separation between Monte Carlo communication and "correct" nondeterminism, answering a question of Beame and Lawry.
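To make the role of advice bits concrete, here is an illustrative sketch (not taken from the paper) of the textbook nondeterministic protocol for NON-EQUALITY on $n$-bit inputs: Alice nondeterministically guesses an index $i$ (the advice), sends $(i, x_i)$, and Bob accepts iff his bit $y_i$ differs. Some guess leads to acceptance exactly when $x \neq y$, using only about $\log_2 n + 1$ bits of communication. The function names below are my own choices for this toy simulation.

```python
# Hypothetical simulation of a nondeterministic communication protocol.
# We enumerate all advice strings (guessed indices) and accept if any
# guess makes the protocol accept -- mirroring the definition of
# nondeterministic communication complexity.
from itertools import product


def neq_protocol_accepts(x: str, y: str) -> bool:
    """Return True iff some nondeterministic guess leads to acceptance.

    Alice's message on guess i is (i, x[i]); Bob accepts iff y[i] != x[i].
    """
    return any(y[i] != x[i] for i in range(len(x)))


def advice_bits(n: int) -> int:
    """Bits needed to encode the guessed index i in {0, ..., n-1}."""
    return max(1, (n - 1).bit_length())


# Exhaustive check for small n: the protocol accepts exactly when x != y,
# so it decides NON-EQUALITY with O(log n) communication.
for n in (1, 2, 3, 4):
    for xs in product("01", repeat=n):
        for ys in product("01", repeat=n):
            x, y = "".join(xs), "".join(ys)
            assert neq_protocol_accepts(x, y) == (x != y)
```

Cutting the number of available advice bits below $\log_2 n$ shrinks the set of indices Alice can name; the paper's result shows that for its language families such a slight restriction forces almost maximal communication.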