Better Private Linear Regression Through Better Private Feature Selection

NeurIPS 2023 (2023)

Abstract
Existing work on differentially private linear regression typically assumes that end users can precisely set data bounds or algorithmic hyperparameters. End users often struggle to meet these requirements without directly examining the data (and violating privacy). Recent work has attempted to develop solutions that shift these burdens from users to algorithms, but they struggle to provide utility as the feature dimension grows. This work extends these algorithms to higher-dimensional problems by introducing a differentially private feature selection method based on Kendall rank correlation. We prove a utility guarantee for the setting where features are normally distributed and conduct experiments across 25 datasets. We find that adding this private feature selection step before regression significantly broadens the applicability of “plug-and-play” private linear regression algorithms at little additional cost to privacy, computation, or decision-making by the end user.
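The paper itself does not spell out its mechanism in this abstract, but the general idea of private feature selection via Kendall rank correlation can be sketched simply: score each feature by its (noisy) rank correlation with the target, then keep the top-k indices and hand them to any downstream DP regression. The sketch below is illustrative only; the function `private_kendall_top_k`, the Laplace noise, and the even budget split across features are assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy.stats import kendalltau

def private_kendall_top_k(X, y, k, epsilon, seed=None):
    """Illustrative sketch: select k features by noisy Kendall tau scores.

    Changing one of the n records alters at most n-1 of the n(n-1)/2
    rank pairs, so each tau (and hence |tau|) changes by at most 4/n.
    We add Laplace noise at that sensitivity and split epsilon evenly
    across the d features (basic composition); this is an assumed
    budgeting scheme, not the paper's.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    eps_per_feature = epsilon / d      # naive even split of the budget
    sensitivity = 4.0 / n              # per-record bound on a tau score
    noisy = np.empty(d)
    for j in range(d):
        tau, _ = kendalltau(X[:, j], y)
        noisy[j] = abs(tau) + rng.laplace(scale=sensitivity / eps_per_feature)
    # Release only the indices of the k largest noisy |tau| values.
    return np.argsort(-noisy)[:k]

# Example: pick 5 of 50 features, then run any DP regression on X[:, idx].
X = np.random.default_rng(0).normal(size=(1000, 50))
y = X[:, 3] - 2 * X[:, 7] + np.random.default_rng(1).normal(size=1000)
idx = private_kendall_top_k(X, y, k=5, epsilon=1.0)
print(sorted(idx.tolist()))
```

A more careful mechanism (e.g., report-noisy-max with peeling, or whatever the paper actually uses) would spend the budget more efficiently than this even split, but the structure is the same: private scoring, private top-k selection, then standard DP regression on the selected columns.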
Keywords
differential privacy, linear regression, sparse, feature selection, kendall