Reducing computational cost in IoT cyber security: case study of artificial immune system algorithm.
Zakariyya, Idris; Al-Kadri, M. Omar; Kalutarage, Harsha; Petrovski, Andrei
Using Machine Learning (ML) for Internet of Things (IoT) security monitoring is challenging because the resource-constrained nature of IoT devices limits the deployment of resource-hungry monitoring algorithms. The aim of this paper is therefore to investigate how to reduce the resource consumption of ML algorithms used for IoT security monitoring. The paper begins with an empirical analysis of the resource consumption of the Artificial Immune System (AIS) algorithm, and then applies carefully selected feature reduction techniques to lower the computational cost of running the algorithm. As illustrated in the paper, the proposed approach significantly reduces computational cost. We validate our results using two benchmark data sets and one purposely simulated data set.
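The abstract does not name the specific feature reduction technique used, so the following is a minimal illustrative sketch only, assuming mutual-information-based feature selection and a stand-in classifier (not the AIS implementation from the paper), to show how pruning features before training can reduce computational cost while preserving detection accuracy.

```python
# Illustrative sketch: mutual-information feature selection applied before
# training, with a stand-in classifier and synthetic data. The exact
# technique and data sets used in the paper may differ.
import time
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for an IoT security monitoring data set.
X, y = make_classification(n_samples=5000, n_features=100, n_informative=15,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=42)

def fit_and_time(Xtr, Xte):
    """Train a stand-in detector and report training time and test accuracy."""
    clf = RandomForestClassifier(n_estimators=100, random_state=42)
    start = time.perf_counter()
    clf.fit(Xtr, y_train)
    elapsed = time.perf_counter() - start
    return elapsed, accuracy_score(y_test, clf.predict(Xte))

# Baseline: train on the full feature set.
t_full, acc_full = fit_and_time(X_train, X_test)

# Reduced: keep only the 15 most informative features (mutual information).
selector = SelectKBest(mutual_info_classif, k=15).fit(X_train, y_train)
t_red, acc_red = fit_and_time(selector.transform(X_train),
                              selector.transform(X_test))

print(f"full features : {t_full:.2f}s, accuracy {acc_full:.3f}")
print(f"reduced (k=15): {t_red:.2f}s, accuracy {acc_red:.3f}")
```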
|Start Date||Jul 26, 2019|
|Publication Date||Oct 31, 2019|
|Institution Citation||ZAKARIYYA, I., AL-KADRI, M.O., KALUTARAGE, H. and PETROVSKI, A. 2019. Reducing computational cost in IoT cyber security: case study of artificial immune system algorithm. In Obaidat, M. and Samarati, P. (eds.) Proceedings of the 16th International security and cryptography conference (SECRYPT 2019), co-located with the 16th International joint conference on e-business and telecommunications (ICETE 2019), 26-28 July 2019, Prague, Czech Republic. Setúbal, Portugal: SciTePress [online], 2, pages 523-528. Available from: https://doi.org/10.5220/0008119205230528.|
|Keywords||Computational cost; IoT security; Feature reduction; Resource consumption; Machine learning|