# Decentralized Optimization Resilient Against Local Data Poisoning Attacks

IEEE Transactions on Automatic Control (2024)

## Abstract

We study the problem of decentralized optimization in the presence of adversarial attacks. We consider a collection of nodes connected through a network, each equipped with a local function. The nodes are asked to collaboratively compute the global optimizer, i.e., the point that minimizes the sum of the local functions, using their local information and messages exchanged with their neighbors. Moreover, all nodes should agree on this minimizer despite an adversary that can arbitrarily change the local functions of a fraction of the nodes. We present RAGD, the Resilient Averaging Gradient Descent algorithm: a decentralized consensus-plus-outlier-filtering algorithm that is resilient to such attacks on local functions. We show that, as long as the fraction of attacked nodes does not exceed a given threshold, RAGD guarantees that all nodes obtain a good estimate of the minimizer. We verify the performance of RAGD via numerical examples.
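The abstract does not spell out RAGD's update rule, but the "consensus + outlier filtering" description can be illustrated with a standard robust-aggregation sketch: each node averages the received estimates after trimming extremes, then takes a local gradient step. The setup below is entirely hypothetical (scalar quadratic local functions, a fully connected network so every node sees every message, and a coordinate-wise trimmed mean as the filtering rule); it is not the paper's algorithm, only a minimal instance of the filtering-plus-descent idea.

```python
import numpy as np

def trimmed_mean(values, f):
    """Drop the f smallest and f largest entries, then average the rest."""
    v = np.sort(np.asarray(values))
    return v[f:len(v) - f].mean()

rng = np.random.default_rng(0)
n, f, eta, steps = 10, 1, 0.1, 200

# Hypothetical local functions f_i(x) = (x - a_i)^2 / 2 with minimizers a_i.
targets = rng.normal(0.0, 1.0, n)
targets[0] = 1000.0  # the adversary arbitrarily changes node 0's local data

x = np.zeros(n)      # each node's current estimate of the global minimizer
for _ in range(steps):
    # Fully connected network: all nodes see the same messages, so the
    # filtered average is identical at every node here. In a general graph
    # each node would trim and average only its neighbors' messages.
    agg = trimmed_mean(x, f)
    # Consensus step (filtered average) followed by a local gradient step.
    x = agg - eta * (x - targets)

# Honest nodes (1..n-1) settle near the honest minimizers; the poisoned
# node's estimate is driven far away, but its messages are trimmed out,
# so it cannot drag the honest nodes with it.
```

With `f = 1` the single poisoned message is always among the trimmed extremes, so the honest nodes reach approximate consensus near the average of their own minimizers; without the trimming step, the poisoned function would pull every node's estimate toward 1000.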
