BIT-FL: Blockchain-Enabled Incentivized And Secure Federated Learning Framework

Published

Author(s)

Tao Zhang, Chenhao Ying, Fuyuan Xia, Haiming Jin, Yuan Luo, David Wei, Xinchun Yu, Yibin Xu, Xikun Jiang, Weiting Zhang, Dacheng Tao

Abstract

Federated learning (FL) enables the protection of data privacy through collaborative training of a machine learning (ML) model without sharing local data. However, several common problems arise when designing FL frameworks, as they rely on a central server for global aggregation and a large number of distributed workers for local training. The first challenge is addressing the single point of failure posed by the central server. The second issue is attracting more workers, as executing local training can be costly. The third challenge is preserving workers' bid privacy when they bid for participation. The fourth challenge is ensuring the privacy of local models in FL, as private information may be divulged during model sharing. Finally, the fifth challenge is addressing poisoning attacks introduced by dishonest workers. To tackle these significant challenges, we propose the Blockchain-enabled Incentivized and Secure Federated Learning (BIT-FL) framework. BIT-FL leverages blockchain technology, Byzantine fault-tolerant (BFT) consensus, differential privacy, and incentive mechanisms to attract more participants, tolerate adversarial workers, and maintain the privacy of their bids and local models. Furthermore, we provide a theoretical upper bound analysis for the excess empirical risk of BIT-FL, which is bounded by O(ln(n_min)/n_min^{3/2} + ln(n)/n), where n represents the size of the union dataset and n_min represents the size of the smallest dataset. In addition, we address the crucial issue of the tradeoff between the degree of accuracy and the level of privacy protection in BIT-FL. Our extensive experiments demonstrate that BIT-FL exhibits robustness and high accuracy for both classification and regression tasks.
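For readability, the excess empirical risk bound quoted in the abstract can be written in display form, with n the size of the union dataset and n_min the size of the smallest local dataset (the exponent placement follows the fraction structure implied by the abstract):

% Excess empirical risk bound stated in the abstract
%   n      -- size of the union dataset
%   n_min  -- size of the smallest local dataset
\[
  \text{excess empirical risk} \;=\; O\!\left(\frac{\ln n_{\min}}{\,n_{\min}^{3/2}\,} + \frac{\ln n}{n}\right).
\]

As a rough illustration of two of the techniques the abstract names, differential privacy for local models and tolerance of poisoned updates, the sketch below clips a worker's model update, perturbs it with Gaussian-mechanism noise, and aggregates updates with a coordinate-wise median. This is a minimal sketch under generic assumptions, not BIT-FL's actual privacy or consensus mechanism; the function names and parameters (dp_perturb, robust_aggregate, clip_norm, epsilon, delta) are hypothetical.

import numpy as np

def dp_perturb(update, clip_norm=1.0, epsilon=1.0, delta=1e-5, rng=None):
    """Clip a local model update and add Gaussian noise (Gaussian mechanism).

    Illustrative only; BIT-FL's privacy mechanism is specified in the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    # Standard Gaussian-mechanism noise scale for (epsilon, delta)-DP
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(0.0, sigma, size=update.shape)

def robust_aggregate(updates):
    """Coordinate-wise median: a simple Byzantine-tolerant aggregator,
    standing in for the consensus-based aggregation described in the paper."""
    return np.median(np.stack(updates, axis=0), axis=0)

# Toy round: 3 honest workers and 2 poisoned updates
rng = np.random.default_rng(0)
honest = [rng.normal(0.1, 0.01, size=10) for _ in range(3)]
poisoned = [np.full(10, 100.0) for _ in range(2)]
noisy = [dp_perturb(u, rng=rng) for u in honest + poisoned]
print(robust_aggregate(noisy))

The coordinate-wise median is used here only because it is a well-known aggregator that bounds the influence of a minority of adversarial updates; the paper's framework combines blockchain and BFT consensus for that purpose.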
Citation
IEEE Transactions on Services Computing

Keywords

Federated Learning, Blockchain, Incentive Mechanism, Differential Privacy, Byzantine Robustness

Citation

Zhang, T., Ying, C., Xia, F., Jin, H., Luo, Y., Wei, D., Yu, X., Xu, Y., Jiang, X., Zhang, W. and Tao, D. (2025), BIT-FL: Blockchain-Enabled Incentivized And Secure Federated Learning Framework, IEEE Transactions on Services Computing, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=956862 (Accessed January 29, 2025)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created February 1, 2025, Updated January 14, 2025