SecureBoost+ : A High Performance Gradient Boosting Tree Framework for Large Scale Vertical Federated Learning
arXiv (2021)
Abstract
Gradient boosting decision tree (GBDT) is a widely used ensemble algorithm in
industry. Its vertical federated learning version, SecureBoost, is one of the
most popular algorithms used in cross-silo privacy-preserving modeling. As the
field of privacy-preserving computation has thrived in recent years, demand
for large-scale, high-performance federated learning has grown dramatically in
real-world applications. In this paper, to meet these requirements, we propose
SecureBoost+, a novel framework that improves upon the prior work SecureBoost.
SecureBoost+ integrates several ciphertext calculation optimizations and
engineering optimizations. The experimental results demonstrate that
SecureBoost+ achieves significant performance improvements over SecureBoost on
large and high-dimensional datasets, making effective and efficient
large-scale vertical federated learning possible.
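The "ciphertext calculation" the abstract refers to relies on additively homomorphic encryption, which lets parties sum encrypted gradient statistics without seeing the plaintexts (SecureBoost uses Paillier encryption for this). The following is a minimal, insecure toy sketch of the Paillier scheme to illustrate the additive property; the tiny primes and helper names are assumptions for demonstration only, not the paper's implementation.

```python
# Toy Paillier-style additive homomorphic encryption — illustration only.
# The primes are far too small to be secure; real deployments use >=1024-bit keys.
import math
import random

def keygen(p=293, q=433):
    """Generate a demo keypair from two small primes (NOT secure)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid because we fix g = n + 1
    return (n,), (n, lam, mu)     # (public key, private key)

def encrypt(pk, m):
    """E(m) = g^m * r^n mod n^2, with g = n + 1 so g^m = 1 + m*n mod n^2."""
    (n,) = pk
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (1 + m * n) * pow(r, n, n2) % n2

def decrypt(sk, c):
    """m = L(c^lam mod n^2) * mu mod n, where L(u) = (u - 1) // n."""
    n, lam, mu = sk
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

pk, sk = keygen()
a, b = encrypt(pk, 7), encrypt(pk, 35)
# Multiplying ciphertexts adds the plaintexts: E(7) * E(35) decrypts to 42.
total = a * b % (pk[0] ** 2)
```

This additive property is what allows a passive party in vertical federated learning to aggregate encrypted gradients and Hessians for its candidate splits without ever decrypting an individual sample's statistics.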