On lattice-based cryptography and average-case/worst-case connections

I tried estimating the public-key sizes of Regev05 and GPV with m = 5n log n and q = O(n^3).
For n = 512 the key comes out to roughly 256 Mbit to 1 Gbit. Compared with Ajtai-Dwork, whose n^5 log n keys already reached about 160 Mbit at n = 32, this is a considerable improvement. The constant factors really do matter.
In practice the parameter will probably be settled somewhere around n = 128 to 256. Since q can be tuned more finely, one should also shave that down before comparing.
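The estimate above can be reproduced with a short back-of-envelope script. This is a sketch under my own assumptions (not taken from the papers): the public key is treated as an m x (n+1) matrix over Z_q, with m = 5n log2 n and q ~ n^3 as in the parameters above, and logs taken base 2.

```python
import math

def regev_pk_bits(n, c_q=1.0):
    """Rough Regev05-style public-key size in bits.

    Assumptions (mine, for illustration only): the key is an
    m x (n+1) matrix over Z_q with m = 5 n log2 n and q ~ c_q * n^3.
    """
    m = 5 * n * math.log2(n)
    q = c_q * n ** 3
    return m * (n + 1) * math.log2(q)

for n in (128, 256, 512):
    print(f"n = {n:4d}: ~{regev_pk_bits(n) / 2**20:7.1f} Mbit")
```

With these assumptions n = 512 lands at a few hundred Mbit, consistent with the 256 Mbit to 1 Gbit range above; tweaking the constant in q shifts this only logarithmically.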


In crypto they know the underlying distribution since their protocols create it. But now even in that community you see a slight move to base protocols on worst-case assumptions, using average-to-worst-case reductions for lattice-based problems. I worry though that these average-to-worst-case reductions imply less that the average-case complexity of these problems is harder than we expected and more that the worst case is easier than we thought.

Well, I had recognized this. But I did not write about such issues in my master's thesis or in our papers :p I think many researchers working on lattice-based cryptography have recognized the issue, even though they do not raise it.
Fortnow says there is a slight move, but I felt it was very slight in Japan, especially the move toward lattice-based cryptosystems with average-case/worst-case connections. The first time I sensed it in Japan was in mid-2007.
That is my reaction. I do recognize the issue, but somehow it is not something I can write in a paper. As for the worst-case assumptions used in lattice-based cryptography, each of the underlying problems is basically in NP ∩ co-NP. This too can be taken as circumstantial evidence for the concern that the worst case may be rather easy.
Also, DL and RSA have random self-reductions, so they too can be said to have an average-case/worst-case connection (in a weak sense).
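The random self-reduction for DL can be made concrete: given an oracle that solves DL on uniformly random instances, a worst-case instance h = g^x is randomized by multiplying in g^r for a fresh random r, so the oracle sees a uniform instance, and r is subtracted off afterwards. A toy sketch over a small group of my own choosing (the brute-force "oracle" stands in for an average-case solver):

```python
import random

# Toy group: Z_p^* for the prime p = 1019; 2 generates it (order p-1 = 1018).
P, G = 1019, 2
ORDER = P - 1

def avg_case_dlog(h):
    """Stand-in average-case DL oracle (brute force on this toy group)."""
    x, acc = 0, 1
    while acc != h:
        acc = acc * G % P
        x += 1
    return x

def worst_case_dlog(h):
    """Random self-reduction: randomize the instance so the oracle sees
    a uniform one, then undo the randomization modulo the group order."""
    r = random.randrange(ORDER)
    y = avg_case_dlog(h * pow(G, r, P) % P)  # uniform instance g^(x+r)
    return (y - r) % ORDER
```

Since h * g^r is uniform in the group for uniform r, an oracle that works on average works on every instance this way, which is exactly the weak average-case/worst-case connection meant above.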