Hi all. Any comments/questions about Homework #3 can go in the comments here.
6 comments:
Anonymous
said...
A question about problem 6a:
Do we really want to show that, with probability 1 − delta, the empirical estimate of every Fourier coefficient is within epsilon of the true coefficient? Or should we instead show that the sum of the squares of the differences is at most epsilon (i.e., that the empirical function is epsilon-close)? It seems like we need epsilon-closeness for learning, unless the epsilon in this problem is not the same epsilon as in the low-degree algorithm.
Aaron
Another question, this time about 4a:
What is meant by access to -random- examples from f? Do we also have query access to f? Can we evaluate f on elements from the epsilon-biased set?
Aaron
Hi Aaron.
Re 6a: Either is fine; with the question as stated, the algorithm can always set the per-coefficient accuracy parameter to poly(eps/n^d) if it wants to use this for learning.
Re 4a: You're right, this is a mistake. The algorithm should also get query access. I have corrected this in the online version of the homework.
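The per-coefficient estimation discussed in 6a can be sketched as follows. This is a minimal illustration, not the official solution: it assumes f maps {-1,1}^n to {-1,1}, the sample bound comes from the Hoeffding inequality, and all function and parameter names here are made up for illustration.

```python
import math
import random

def chi(S, x):
    # Character chi_S(x) = product of x_i over i in S, with x_i in {-1, +1}.
    p = 1
    for i in S:
        p *= x[i]
    return p

def estimate_fourier_coefficient(f, n, S, eps, delta, rng=random):
    # hat{f}(S) = E_x[f(x) * chi_S(x)] over uniform x in {-1,1}^n.
    # By Hoeffding, m = ceil(2 ln(2/delta) / eps^2) uniform samples make the
    # empirical mean within eps of hat{f}(S) with probability >= 1 - delta.
    m = math.ceil(2 * math.log(2 / delta) / eps ** 2)
    total = 0
    for _ in range(m):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        total += f(x) * chi(S, x)
    return total / m

# Example: for the parity f(x) = x_0 * x_1 on n = 3 bits, hat{f}({0,1}) = 1,
# and every sample of f(x) * chi_S(x) equals 1, so the estimate is exactly 1.
est = estimate_fourier_coefficient(lambda x: x[0] * x[1], 3, [0, 1], 0.1, 0.01)
```

A union bound over all n^{O(d)} low-degree coefficients then gives the "every coefficient within epsilon" guarantee at the cost of replacing delta with delta / n^{O(d)}.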
A question about 3b:
Are we supposed to get weak learning under the uniform distribution, or under any distribution?
Anon: You don't have to learn anything in 3b.
You should shift this to WordPress. It has better LaTeX support.