We have established two hardness results for proper agnostic learning of low-degree PTFs. Our results show that even if there exist low-degree PTFs that are almost perfect hypotheses, it is computationally hard to find low-degree PTF hypotheses that perform even slightly better than random guessing; in this sense our hardness results are rather strong. However, our results do not rule out the possibility of efficient learning algorithms when ε is sub-constant, or when unrestricted hypotheses may be used. Strengthening the hardness results along these lines is an important goal for future work, but may require significantly new ideas. Another natural goal for future work is the following technical strengthening of our results: show that for any constant d, it is hard to construct a degree-d PTF that is consistent with a (1/2 + ε) fraction of a given set of labeled examples, even if there exists a halfspace that is consistent with a (1 − ε) fraction of the data. Such a hardness result would subsume both of the results of this paper as well as much prior work, and would serve as strong evidence that agnostically learning halfspaces under arbitrary distributions is a computationally hard problem.