## Wednesday, February 28, 2007

### Homework 3

Hi all. Any comments/questions about Homework #3 can go in the comments here.

Anonymous said...

A question about problem 6a:

Do we really want to show that, with probability 1 - delta, the empirical estimate of every Fourier coefficient is within epsilon of the true coefficient? Or should we instead show that the sum of the squares of the differences is at most epsilon (i.e., that the empirical function is epsilon-close)? It seems like we need epsilon-closeness for learning, unless the epsilon in this problem is not the same epsilon as in the low-degree algorithm.

Aaron

Anonymous said...

Another question, this time about 4a:

What is meant by access to *random* examples of f? Do we also have query access to f? Can we evaluate f on elements of the epsilon-biased set?

Aaron

Ryan said...

Hi Aaron.

Re 6a: Either is fine; with the question as stated, the algorithm can always take its per-coefficient accuracy to be poly(eps/n^d) if it wants to use this for learning.

Re 4a: You're right, this is a mistake. The algorithm should also get query access. I have corrected this in the online version of the homework.
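To make the 6a discussion concrete, here is a minimal sketch of estimating a single Fourier coefficient empirically from random examples. It assumes f is a Boolean function on {-1,1}^n and uses the standard Chernoff-bound sample count m = O(log(1/delta)/eps^2); the function names and constants are illustrative, not from the homework.

```python
import math
import random

def chi(S, x):
    # Character chi_S(x) = product of x_i over i in S, for x in {-1, 1}^n.
    p = 1
    for i in S:
        p *= x[i]
    return p

def estimate_coefficient(f, n, S, eps, delta):
    """Estimate the Fourier coefficient f_hat(S) = E[f(x) * chi_S(x)]
    to additive accuracy eps, except with probability delta, by averaging
    over m = O(log(2/delta) / eps^2) uniform random samples (Chernoff bound).
    The constant 2 in m is an illustrative choice, not a tight one."""
    m = math.ceil(2 * math.log(2 / delta) / eps ** 2)
    total = 0
    for _ in range(m):
        x = [random.choice([-1, 1]) for _ in range(n)]
        total += f(x) * chi(S, x)
    return total / m
```

For example, if f(x) = x[0] * x[1] (a parity), then f(x) * chi_S(x) = 1 for every x when S = {0, 1}, so the estimate of that coefficient is exactly 1; estimates of the other coefficients concentrate near 0.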

Anonymous said...

A question about 3b:

Are we supposed to get weak learning under the uniform distribution, or under any distribution?

Ryan said...

Anon: You don't have to learn anything in 3b.

The Quark said...

You should move this to WordPress. It has better LaTeX support.