Federated Learning (FL) is a distributed Machine Learning (ML) paradigm that builds a global model from multiple local models trained on distributed edge devices. A disadvantage of the FL paradigm is that many communication rounds are required before the model converges. As a result, running on-device FL with resource-hungry algorithms such as Deep Neural Networks (DNNs) is challenging, especially in resource-constrained Internet of Things (IoT) environments for security monitoring. To address this issue, this paper proposes a Resource Efficient Federated Deep Learning (REFDL) method. Our method exploits and optimizes a Federated Averaging (Fed-Avg)-based DNN technique to reduce computational resource consumption in IoT security monitoring. It applies pruning and simulated micro-batching to optimize the Fed-Avg DNN for effective and efficient IoT attack detection at distributed edge nodes. Performance was evaluated on various realistic IoT and non-IoT benchmark datasets in virtual and testbed environments built with GB-BXBT-2807 edge-computing-like devices. The experimental results show that the proposed method reduces memory usage by 81% in the simulated environment of virtual workers compared to its benchmark counterpart. In the realistic testbed scenario, it saves 6% memory while reducing execution time by 15% without degrading accuracy.
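The two building blocks named above, Fed-Avg aggregation and pruning, can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the per-layer weighted averaging, and the use of simple magnitude-based pruning are illustrative assumptions.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Fed-Avg aggregation (sketch): weighted average of client models.

    client_weights: one list of np.ndarray layer weights per client.
    client_sizes: number of local training samples per client, used
    as the averaging weight for that client's model.
    """
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    return [
        sum((n / total) * w[layer] for w, n in zip(client_weights, client_sizes))
        for layer in range(n_layers)
    ]

def prune_by_magnitude(weights, sparsity=0.5):
    """Illustrative magnitude pruning: zero the smallest-magnitude
    fraction (`sparsity`) of entries in each weight tensor, shrinking
    the effective model that edge nodes must compute with."""
    pruned = []
    for w in weights:
        k = int(sparsity * w.size)
        if k == 0:
            pruned.append(w.copy())
            continue
        threshold = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
        pruned.append(np.where(np.abs(w) <= threshold, 0.0, w))
    return pruned
```

For example, averaging two single-layer clients with weights `[2.0]` and `[4.0]` and local dataset sizes 1 and 3 yields `(1/4)*2.0 + (3/4)*4.0 = 3.5`.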
ZAKARIYYA, I., KALUTARAGE, H. and AL-KADRI, M.O. 2022. Resource efficient federated deep learning for IoT security monitoring. In Li, W., Furnell, S. and Meng, W. (eds.) Attacks and defenses for the Internet-of-Things: revised selected papers from the 5th International workshop on Attacks and defenses for Internet-of-Things 2022 (ADIoT 2022), in conjunction with the 27th European symposium on research in computer security 2022 (ESORICS 2022), 29-30 September 2022, Copenhagen, Denmark. Lecture notes in computer science (LNCS), 13745. Cham: Springer [online], pages 122-142. Available from: https://doi.org/10.1007/978-3-031-21311-3_6