Private Stochastic Convex Optimization: Efficient Algorithms for Non-smooth Objectives

arXiv (2020)

Abstract
In this paper, we revisit the problem of private stochastic convex optimization. We propose an algorithm, based on noisy mirror descent, that achieves optimal rates up to a logarithmic factor in terms of both statistical complexity and the number of queries to a first-order stochastic oracle. Unlike prior work, we do not require Lipschitz continuity of stochastic gradients to achieve optimal rates. Our algorithm generalizes beyond the Euclidean setting and yields anytime utility and privacy guarantees.
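The abstract names noisy mirror descent as the core algorithm. As a minimal sketch of the general idea, not the paper's exact method, the following Python specializes to the Euclidean mirror map, where the mirror step reduces to noisy projected SGD. The names `grad_oracle`, `noise_std`, and `proj`, along with the step size and iteration count, are illustrative assumptions; calibrating the noise to a concrete (ε, δ) privacy budget is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_mirror_descent(grad_oracle, x0, steps, lr, noise_std, proj):
    """Noisy mirror descent with the Euclidean mirror map, which
    reduces to noisy projected SGD. Gaussian noise added to each
    stochastic gradient is the standard privatization mechanism;
    choosing noise_std for a given privacy budget is not shown here."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        g = grad_oracle(x)                              # one query to the stochastic first-order oracle
        g = g + rng.normal(0.0, noise_std, size=g.shape)  # privatize the gradient
        x = proj(x - lr * g)                            # gradient step followed by projection
    return x

# Toy usage: minimize f(x) = E_z[0.5 * ||x - z||^2] over the unit ball.
data = rng.normal(size=(100, 5))
oracle = lambda x: x - data[rng.integers(len(data))]    # unbiased stochastic gradient
proj = lambda x: x / max(1.0, np.linalg.norm(x))        # Euclidean projection onto the unit ball
x_hat = noisy_mirror_descent(oracle, np.zeros(5), steps=500, lr=0.05, noise_std=0.1, proj=proj)
print(x_hat)
```

A non-Euclidean instance of the paper's setting would replace the squared-norm mirror map with another strongly convex potential (for example, negative entropy over the simplex), changing the projection and the update geometry accordingly.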
Keywords
private stochastic convex optimization, efficient algorithms, non-smooth objectives