C-NEST: Cloudlet-Based Privacy Preserving Multidimensional Data Stream Approach for Healthcare Electronics

The Medical Internet of Things (MIoT) facilitates extensive connections between cyber and physical “things” allowing for effective data fusion and remote patient diagnosis and monitoring. However, there is a risk of incorrect diagnosis when data is tampered with from the cloud or a hospital due to third-party storage services. Most of the existing systems use an owner-centric data integrity verification mechanism, which is not computationally feasible for lightweight wearable-sensor systems because of limited computing capacity and privacy leakage issues. In this regard, we design a 2-step Privacy-Preserving Multidimensional Data Stream (PPMDS) approach based on a cloudlet framework with an Uncertain Data-integrity Optimization (UDO) model and Sparse-Centric SVM (SCS) model. The UDO model enhances health data security with an adaptive cryptosystem called the Cloudlet-Nonsquare Encryption Secret Transmission (C-NEST) strategy by avoiding medical disputes during data streaming based on novel signature and key generation strategies. The SCS model effectively classifies incoming queries for easy access to data by solving scalability issues. The cloudlet server measures data integrity and authentication factors to optimize third-party verification burden and computational cost. The simulation outcomes show that the proposed system reduces the average data leakage error rate by 27%, query response time and average data transmission time by 31%, and average communication-computation cost by 61% when measured against state-of-the-art approaches.

Gautam Srivastava is with the Department of Mathematics and Computer Science, Brandon University, Brandon, MB R7A 6A9, Canada, also with the Research Centre for Interneural Computing, China Medical University, Taichung 40402, Taiwan, and also with the Department of Mathematics and Computer Science, Lebanese American University, Beirut 1102, Lebanon (e-mail: srivastavag@brandonu.ca).
Digital Object Identifier 10.1109/TCE.2023.3342635

Managing this analysis within cloud-based healthcare centers is an emerging Industry 5.0 challenge [1]. Healthcare professionals utilize electronic health records (EHR) to facilitate patient-centric healthcare services. However, storing patient data in third-party cloud computing services and on personal devices raises security concerns due to its sensitivity. In response to these challenges, innovative technologies such as Artificial Intelligence (AI)-driven software-defined networks (SDN), fog computing, statistical machine learning, and blockchain have been introduced in the past few years to fortify Industry 4.0 [2], [3], [4]. As we now transition towards Industry 5.0, these advancements must be adapted and applied to realize healthcare case studies. This includes implementing virtual care, intelligent healthcare decision-making, and remote monitoring using wearable sensors to enhance the efficiency of healthcare electronics business operations, ensuring consistency throughout the entire value chain. The integration of Cloudlets within the MIoT (Medical Internet of Things) paradigm, coupled with automated decision-making systems, enables efficient remote monitoring and control of patient diagnosis, as discussed in [5], [6]. MIoT frameworks are resource-limited because they typically comprise battery-powered devices with limited network lifetime and constrained computational resources. For instance, some wearable biomedical sensors, such as sphygmomanometers, blood oxygen (SpO2) sensors, and electromyography (EMG) as well as electrocardiogram (ECG) devices, run on batteries but cannot compute and analyze data for optimal clinical decision-making. However, modern wireless telecommunication is not restricted to computer networks; it can also integrate consumer electronics through global Internet access. As consumer electronics become more interconnected, the IoT continues to expand, particularly in the
healthcare domain, where wearable health-tracking solutions are becoming increasingly popular [5], [7], [8]. For instance, in 2019, 463 million people were affected by diabetes [9], which has prompted healthcare companies to innovate their diagnostic devices due to the high prevalence of this chronic disease. Over the past five years, wearable Continuous Glucose Monitors (CGM) have replaced traditional finger-stick blood glucose measurement as the preferred diagnostic device. Moreover, healthcare social frameworks like PatientsLikeMe collect and share patients' sensitive data over networks, where it has the potential to be leaked or stolen, leading to privacy issues [10]. Therefore, maintaining consistent privacy protection has become a challenging endeavour due to the ever-changing landscape of the intersection between consumer electronics and healthcare.
The advances of Cloudlets allow for storing large amounts of data and facilitate intensive computation services. Cloudlets serve as an intermediary between the IoT and end-users while also enabling clinical decisions based on medical data analysis at Cloudlets/IoT-Hubs rather than sending data directly to the cloud [11], [12]. This mechanism decreases latency and improves system reliability and performance. MIoT data is transmitted through access points and IoT-Hubs to carry out initial processing at a Cloudlet and make clinical decisions, which are shared with end-users and medical professionals, as shown in Fig. 1. However, cloud-based IoT healthcare frameworks face the following fundamental challenges: 1) protecting data while sharing it with a Cloudlet to avoid data leakage and/or data theft; 2) MIoT data stored at third-party servers may lead to the disclosure of sensitive information without user consent [13], [14].

Motivation: The use of Cloudlets facilitates robust computing and data storage services near the data generation point, significantly reducing third-party involvement in data authentication and transmission. In this regard, we design a secure key exchange mechanism called Cloudlet-Nonsquare Encryption Secret Transmission (C-NEST). The Cloudlet forwards encrypted data to the server, where data is decrypted using a secret key before being shared with doctors and end-users. In most cases, data integrity verification is required, where end-users send a request to a Cloudlet for effective integration, which is time-consuming. Therefore, public-key infrastructure management is omitted in our proposed system to achieve low latency. To the best of our knowledge, the proposed method is a reliable and effective Cloudlet-based privacy solution.
Our approach incorporates an Uncertain Data-integrity Optimization (UDO) model to enhance health data security by avoiding medical disputes during data streams. Subsequently, the Sparse-Centric SVM (SCS) model helps classify query results to simplify data access and solve scalability issues. Furthermore, the Cloudlet server measures data integrity and authentication to optimize the third-party verification burden and computational cost. Our main contributions can be summarized as follows:
1) Design a 2-step Privacy-Preserving Multidimensional Data Stream (PPMDS) approach based on a Cloudlet framework.
2) Develop a UDO model to enhance health data security with an adaptive cryptosystem by avoiding medical disputes during data streams.
3) Develop an SCS model for effective classification of queries for easy data access by solving scalability issues.
4) Develop a data integrity measurement method to optimize the third-party verification burden and computation cost.
The remainder of this article is structured as follows: Section II covers related work on data integrity, quality-aware data searches, and multi-objective sensor data fusion. Section III describes the proposed PPMDS system and its functional methods, including UDO, C-NEST, and SCS. The algorithms' performance is discussed based on experimental evaluation results in Section IV, and the article concludes with Section V.

II. RELATED WORK
The increased number of recent publications is evidence of the importance of privacy-preserving computation for multidimensional data.

A. Attribute-Aware Security Methods
An attribute-based encryption (ABE) method was developed to achieve fine-grained access control with privacy [15], instead of relying on a server-centric mechanism based on Bayesian theory [26], [27]. Attribute-based electronic health record (EHR) security systems were developed in [28], [29]. Additionally, a data redundancy method was developed in [16] to reduce the storage cost of the server and lower computational as well as communication costs. The CINEMA approach was developed based on secure permutation mechanisms for online health data privacy in [17]; it allows users to perform query operations without decryption. However, executing service requests demands computing and storage resources that are often inadequate for achieving a reliable service rate. Three issues of multidimensional data fusion are listed as follows:
1) Security issues of heterogeneous networks [18].
2) Design issues of authenticated crypto-models for effective data transmission [30].
3) Network cost optimization by classifying sensor terminology (such as low-end and high-end sensors) [19].
Low-end sensors are often used to optimize network costs. In contrast, mobile sensors reduce the number of active sensors and optimize real-time data transmission costs, sensor coverage, and network congestion. However, the authors in [20] neglected to design a data search mechanism and concentrated on data transmission efficiency. As an extension of this work, a searchable encryption system was developed in [31], but its privacy preservation is not up to the mark. The authors developed two searchable encryption systems but neglected data-sharing privacy services and substantial index generation methods. In [32], a defector tracking system was designed based on legal user authentication, and a white-box trace model was developed to assess the affinity among search attributes for identifying private keys. However, the system is only suitable for data sharing and does not address data-searching privacy services. In [21], keyword search and data decryption schemes are developed independent of any key integrity checks, which could potentially result in data leakage.

B. Data Integrity Models
Data integrity verification is essential in various fields of data research, particularly in the case of dynamic data [22], [33]. Examples of dynamic data include social data, music, and movies, which are often stored on the cloud due to frequent changes such as additions, modifications, or deletions. However, the major challenges in Cloudlet storage privacy include authentication and authorization of data identity for secure transmission [34], data storage [35], and access control [15]. As such, there is a need for a novel data integrity verification scheme to enhance data privacy for cloud-assisted systems. During data integrity verification, a zero-knowledge proof strategy is employed in [36] to keep third parties from acquiring user data. It is crucial to hide user-sensitive information before sharing it in any environment. Data integrity verification mechanisms are vital for assessing data damage. An extended dynamic data integrity verification method that uses a block-based signature policy and an identity-based cryptography system was described in [37]. Another integrity verification method is based on attribute revocation functions and uses a dual encryption-based Merkle hash function to optimize data privacy. However, its computational complexity was not adequately considered, making it unsuitable for IoT frameworks [23]. The SPPDA model [24] utilizes a bi-linear pairing scheme based on the Diffie-Hellman key mechanism to address data privacy and aggregation issues. However, the model inadequately addresses data leakage issues. In [25], data was encrypted and shared with the edge server using the Equilibrium Point Analysis (EPA) model. Here, a public cloud center (PCC) accommodates a secure storage service for all data aggregated from an edge server, and a private key is used for data integrity and authentication. However, the computational complexity was not moderated and was shown to be not equivalent to traditional methods. We propose the PPMDS approach to address these issues, enabling effective privacy preservation and data searching schemes and providing seamless access to IoT frameworks. A summary of the related work is given in Table I, while the notation used in this paper is summarized in Table II.
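As a concrete illustration of the Merkle-hash style of integrity checking discussed above, the following minimal Python sketch computes a Merkle root over data blocks. It is a toy stand-in, not the dual-encryption construction of [23]: the block contents and the duplicate-last-node padding rule are illustrative assumptions.

```python
import hashlib

def merkle_root(blocks: list[bytes]) -> str:
    """Compute a Merkle root over data blocks (toy illustration)."""
    if not blocks:
        raise ValueError("need at least one block")
    level = [hashlib.sha256(b).hexdigest() for b in blocks]
    while len(level) > 1:
        if len(level) % 2:          # duplicate the last node on odd levels
            level.append(level[-1])
        level = [hashlib.sha256((a + b).encode()).hexdigest()
                 for a, b in zip(level[::2], level[1::2])]
    return level[0]

# Tampering with any single block changes the root, so a verifier can
# check one short digest instead of re-reading every block.
root = merkle_root([b"ecg-0", b"ecg-1", b"spo2-0"])
tampered = merkle_root([b"ecg-0", b"ecg-X", b"spo2-0"])
```

A cloud-assisted verifier would store only the root and recompute it on demand; any mismatch signals that the stored records were modified.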

III. PROPOSED MODEL
In this section, we theoretically formulate the proposed framework, which allows for the interconnection of IoT devices with a Cloudlet.We derive mathematical methods aimed at optimizing integrity and security.
TABLE II: NOTATION TABLE

Fig. 2 illustrates the fundamental mechanism of the two-step PPMDS approach based on Cloudlets. This approach comprises a UDO model and an SCS model. The deployment of Cloudlets, IoT-Hubs, and wearable sensors enables medical data fusion services. The IoT-Hub plays a crucial role in data authentication by generating a private key for each entity, which helps identify intruders and classify medical data for easy access. End-users or patients can access data from the Cloudlet using an authentication key generated by the IoT-Hub. Users receive ciphertext as search results, which can be converted into plaintext using a private authentication key. This procedure remains the same for all end-users. The Cloudlet stores patient data and provides easy access based on search operations by bringing cloud services close to the network's edge with secure computation and storage capacity.
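The ciphertext search flow above can be illustrated with a toy symmetric round trip: the Cloudlet holds only ciphertext, and the end-user recovers plaintext with the key issued by the IoT-Hub. This is purely illustrative; the hash-derived XOR keystream below is not the C-NEST cipher, and the key material is hypothetical.

```python
import hashlib
from itertools import count

def keystream(key: bytes, n: int) -> bytes:
    # Derive n keystream bytes by hashing key || counter (toy scheme).
    out = b""
    for ctr in count():
        if len(out) >= n:
            return out[:n]
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))  # XOR: encrypt == decrypt

auth_key = b"issued-by-iot-hub"        # hypothetical key material
record = b"patient:42 spo2:97"
stored = xor_cipher(auth_key, record)  # ciphertext held at the Cloudlet
plain = xor_cipher(auth_key, stored)   # plaintext recovered by the end-user
```

Because XOR with the same keystream is its own inverse, one function serves both directions, which keeps the end-user side lightweight.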

A. PPMDS Functional-Flow
For the process of validating stored data, the UDO mapping ϒ(E_id) → (X, X̄) involves cross-validation. Here, E_id represents the patient ID, ϒ indicates uncertain data integrity, X indicates valid and identical data, and X̄ indicates invalid and non-identical data. Typically, the Cloudlet sends a request to the server for data integrity verification, and the server shares a report based on the stored data with authenticators in response. The same request is verified by the Cloudlet when sent by an end-user. If the outcome is X, the stored data is secure; if it is X̄, the data has been tampered with. The mapping Ω_{t,o} → (ϕ_t, ς_t) helps classify data for easy access with adaptable security, where Ω_{t,o} is the encrypted medical data fused at time t for objective/parameter o, ∇_{t,o} is the corresponding decrypted medical data, ϕ_t is the data authentication, and ς_t is a signature for effective data processing. The key-management process (E_id) → (E_sk, φ) provides secure communication using the Nonsquare Encryption Secure Transmission (NEST) protocol, influenced by Diffie-Hellman. Here, E_id represents the patient ID, E_sk the patient security key, and φ a private key. The key-generation model (E_sk, C_id) → (E_sk, C_sk, φ_C) generates secure keys for effective privacy preservation, where C_id, C_sk, and φ_C represent the Cloudlet ID, Cloudlet security key, and secure key generator, respectively. These keys enable easy access to medical data with data authenticators.
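Since the NEST key-management step is described as Diffie-Hellman-influenced, a classic finite-field Diffie-Hellman exchange between patient and Cloudlet can be sketched as follows. The small 32-bit prime and generator are demo values only; the actual C-NEST parameters and its nonsquare construction are not reproduced here.

```python
import secrets

# Demo parameters only: a toy 32-bit prime and base, not C-NEST's
# actual group. Real deployments use standardized large safe primes.
P = 4294967291          # largest 32-bit prime
G = 2

def dh_keypair() -> tuple[int, int]:
    sk = secrets.randbelow(P - 3) + 2   # private exponent in [2, P-2]
    return sk, pow(G, sk, P)            # (secret, public share)

# Patient and Cloudlet each keep a secret and exchange public shares,
# mirroring the E_sk / C_sk roles in the key-management mapping.
patient_sk, patient_pk = dh_keypair()
cloudlet_sk, cloudlet_pk = dh_keypair()

shared_patient = pow(cloudlet_pk, patient_sk, P)    # patient side
shared_cloudlet = pow(patient_pk, cloudlet_sk, P)   # Cloudlet side
```

Both sides arrive at the same shared secret without ever transmitting it, which is the property the NEST exchange relies on to avoid public-key infrastructure management.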

B. UDO Model Based on C-NEST
The key generation process for each patient and Cloudlet is managed by Algorithm 1. Line 1 initializes parameters such as the patient secret key, the Cloudlet public key, and the private key generated by the data authenticator, none of which may be zero. Lines 2-6 assess the key for each patient and doctor for the encryption and signature generation processes.
Algorithm 2 ensures the security of data during its transfer to the cloud for storage. Line 1 initializes the necessary parameters, while Lines 2-7 evaluate the complete encryption of the data using both the encryption and signature models, based on the patient ID and the key generated by the IoT-Hub data authenticator.
Algorithm 3 is designed to verify the integrity of medical data for each patient. In Line 1, the necessary parameters are initialized, while Lines 2 and 4 measure the key attributes required for generating the authenticator key. Storage data is encrypted in Line 5. Lines 6-9 verify data integrity when the signature is valid. If the signature is found to be invalid, it can be inferred that the stored data has been tampered with.
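The valid/invalid signature outcome of Algorithm 3 can be mimicked with a standard HMAC over the stored record. This is a stand-in for the C-NEST signature, not the paper's construction, and the authenticator key and record layout are hypothetical.

```python
import hashlib
import hmac

def sign(key: bytes, data: bytes) -> str:
    # Signature over the stored record (sketch; C-NEST's signature
    # uses its own key construction, not plain HMAC).
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def verify(key: bytes, data: bytes, sig: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign(key, data), sig)

authenticator_key = b"iot-hub-authenticator"   # hypothetical key
record = b"E_id=7;sensor=ecg;t=1700000000"     # hypothetical record
sig = sign(authenticator_key, record)

intact = verify(authenticator_key, record, sig)          # X: valid
tampered = verify(authenticator_key, record + b"!", sig) # X̄: invalid
```

Any single-byte change to the stored record flips the verification outcome from valid (X) to invalid (X̄), which is exactly the signal Algorithm 3 uses to flag tampering.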
1) Signature Generation: The PPMDS approach generates a public key (δ, φ^pk_{σ_i}, P), where δ is the general public key, φ^pk_{σ_i} is the signature public key, and P is a random prime number. It also generates a private key (φ^sk_{σ_i}, P), where φ^sk_{σ_i} is the signature private key for C-NEST. The signature keys are formulated as φ^sk_{σ_i} = −e² mod P and φ^pk_{σ_i} = −2e² log (mod P). The encrypted signature is formulated accordingly.
The signature for the IoT data follows from the keys above.
2) Signature Verification: A signature is valid when the verification equality holds; otherwise, the data has been tampered with. Substituting Eqs. (1) and (2) into Eq. (3), if the result equals ∇, the IoT medical data is considered valid and untampered; otherwise, the data is deemed tampered with.
Theorem 1: Assume the Cloudlet verifies the integrity of the data on user demand. Proof: The signature encryption, hash function, and data authentication are used together to assess data integrity.
Algorithm 4: Device/Server Data Storage. The offloading mechanism triggers the selection of a suitable device (IoT-Hub) or Cloudlet for storing data; otherwise, the data is stored in the cloud. This offloading condition follows from Eq. (1).

C. Data Storage Model
The decision of where to store sensed data is evaluated by Algorithm 4, which offers three storage options based on the device's capacity: store at a neighbouring IoT-Hub, at a Cloudlet, or at a cloud server. Line 1 initializes the parameters, including the patient ID, old-entry set, storage array, sensed data, and device capacity. Note that we have not focused on estimating device capacity, as it is beyond the scope of this manuscript. Line 2 uses the security models discussed previously to determine where data should be stored. Algorithm 4 utilizes a time series to estimate the entire process, as shown in Line 4. If the patient ID and old-entry data are identical, Line 6 cross-checks against the threshold value (ℵ_κ), then changes it and sends a notification message if necessary. Otherwise, the newly arrived encrypted data is updated. Lines 7-20 check for unusual measurements and send an alert notification accordingly.
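Algorithm 4's three-way storage decision can be sketched as a simple capacity check across tiers. The thresholds and free-capacity figures below are illustrative assumptions, not values from the paper.

```python
from typing import Literal

Tier = Literal["iot-hub", "cloudlet", "cloud"]

def choose_store(data_size: int, hub_free: int, cloudlet_free: int) -> Tier:
    """Pick the nearest tier with enough free capacity (sketch of
    Algorithm 4's decision; thresholds are illustrative)."""
    if data_size <= hub_free:
        return "iot-hub"        # lowest latency, data stays at the edge
    if data_size <= cloudlet_free:
        return "cloudlet"       # moderate capacity near the edge
    return "cloud"              # offload when both edge tiers are full

small = choose_store(128, hub_free=1024, cloudlet_free=8192)
medium = choose_store(2048, hub_free=1024, cloudlet_free=8192)
large = choose_store(16384, hub_free=1024, cloudlet_free=8192)
```

The ordering encodes the paper's latency hierarchy: prefer the IoT-Hub, fall back to the Cloudlet, and offload to the cloud only when neither edge tier can hold the data.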
1) SCS-Based Data Access Model: Fig. 3 shows the data access mechanism based on the SCS model. The key-generation and exchange process generates and communicates the key through the C-NEST protocol, while the data authenticator shares the key with the patient, the Cloudlet, and the server. To enhance privacy preservation, all data is encrypted before being stored and shared with the Cloudlet or server. Our research objective is achieved through the Support Vector Machine (SVM)-based SCS model, which facilitates easy access to data based on user requests. The SCS model classifies requests or queries to locate the storage location (IoT-Hub, Cloudlet, or server) and, once identified, initiates the decryption process to share data per the user or device request.
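A minimal linear decision function can stand in for the trained SCS SVM when routing a query to its storage tier. The features, weights, and cut-offs below are invented for illustration and are not learned from real query data.

```python
# A linear score stands in for the trained SCS SVM here: recent, small
# queries route to the edge tiers, everything else goes to the server.
def route_query(features: dict[str, float]) -> str:
    w = {"recency": 1.5, "size_kb": -0.01, "bias": 0.2}  # illustrative
    score = (w["recency"] * features["recency"]
             + w["size_kb"] * features["size_kb"]
             + w["bias"])
    if score > 1.0:
        return "iot-hub"      # hottest data, served at the edge
    if score > 0.0:
        return "cloudlet"
    return "server"

hot = route_query({"recency": 1.0, "size_kb": 10})
warm = route_query({"recency": 0.2, "size_kb": 20})
cold = route_query({"recency": 0.0, "size_kb": 500})
```

In the full system, the decision function's weights would come from SVM training over labeled query traces; the two thresholds then partition the score line into the three storage tiers.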

D. Data Streaming Model
Let Z be the set of k-dimensional coefficient matrices, and let ς_t be a sequence of non-singular covariance matrices.
The expression q_t = Σ_{i=1}^{K} Z_i q_{t−i} + ς_t gives the set of influenced data points to be streamed to the device or server for storage, reducing the communication-computation overhead on the server. Algorithm 5 (Data Streaming to Cloudlet for Storage) facilitates the streaming of multidimensional data. In Line 1, the patient ID, stored information, and fused data are initialized. Line 2 evaluates the data points to effectively reduce their size and optimize the communication overhead. Line 3 measures the IoT data points for effective compression, and Lines 4 and 5 estimate the influence points of the j-th IoT data stream for compression at low data size, determining the subset of points from the IoT data that should be equal by definition. The threshold distance d plays an important role in clustering the subset points, as observed in Line 6, where distance-based data clustering follows D^t_{o_j} = q_t^T Q q_t. Line 7 calculates the model matrix for streaming data according to the sensor objective by minimizing ‖q_t − q̂_t‖², while Line 8 stores each objective's sensor data stream as a block matrix with column vectors Φ = [Φ_1, Φ_2, . . ., Φ_j].
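The streaming step, predicting q_t from the last K samples and streaming only points that deviate beyond the threshold distance d, can be sketched as follows. The scalar coefficients standing in for the Z_i matrices, the series values, and d are all illustrative assumptions.

```python
# Sketch of the streaming model: predict q_t ≈ Σ Z_i q_{t-i} from the
# last K samples and stream only points that deviate from the
# prediction by more than d. Coefficients and d are illustrative.
K = 2
Z = [0.6, 0.3]          # scalar stand-ins for the Z_i matrices

def predict(history: list[float]) -> float:
    return sum(Z[i] * history[-1 - i] for i in range(K))

def influence_points(series: list[float], d: float) -> list[int]:
    picked = []
    for t in range(K, len(series)):
        if abs(series[t] - predict(series[:t])) > d:
            picked.append(t)   # stream only "surprising" samples
    return picked

series = [1.0, 1.0, 0.9, 5.0, 0.95, 0.9]
idx = influence_points(series, d=1.0)   # indices worth streaming
```

Only samples that the model cannot predict carry new information; skipping the predictable ones is what reduces the communication-computation overhead on the server.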

E. Complexity Analysis
We split the four algorithms into three sub-modules. First, the complexity of computing key generation and the signature function for each patient record is O(n²). Second, the complexity of sorting the uncertain data integrity of all records is O(n log n). Finally, the complexity of content-storage update and change analysis at each request is O(n³). The overall complexity is the sum of the three sub-modules, which is dominated by O(n³).

IV. EXPERIMENTAL ANALYSIS

For experimentation, both a Raspberry Pi 4 Model B board and a personal computer (PC) were used. The PC ran 64-bit Ubuntu 18.04.5 LTS on an Intel Core i7-10700 CPU @ 3.80 GHz with 16 cores, an NVIDIA GeForce RTX 3090, and 64 GB RAM. The Raspberry Pi ran Ubuntu MATE 16.04 on an ARMv7 (v7l) CPU with 4 cores, a maximum clock speed of 600 MHz, and 128 MB RAM. The Pis were utilized for aggregating data from sensors, and the PBC and GMP libraries, along with a C++ program, were employed for cryptographic operations. File sizes of 512 or 1024 bits and a communication distance of 50 meters with a communication speed of 2 Mbps were considered between the IoT-Hub and the Cloudlet. The data packet density was taken into account to evaluate the bandwidth rate, with each packet being 24 Kb in size.
We examined two benchmark models, namely EPA [25] and Secure Privacy-Preserving Data Aggregation (SPPDA) [24]. The EPA model, based on the Boneh-Goh-Nissim cryptosystem, ensures data authenticity and integrity, estimates communication costs, and emphasizes data aggregation in the context of mobile edge computing (MEC). EPA employs a single-key mechanism tailored for lightweight networks, where data is encrypted and shared with the edge server using a private key. The public cloud center (PCC) stores all aggregated data from the edge server, and a private key is utilized for data integrity and authentication. The second benchmark model, SPPDA, introduces an innovative signature scheme to enhance the authenticity and integrity of aggregated data. SPPDA relies on the bi-linear Diffie-Hellman assumption to strengthen data confidentiality, authenticity, and privacy. The primary focus of SPPDA is to minimize the communication, transmission, and computational costs associated with remote servers to meet the requirements of lightweight networks.
In developing a script for a Cloudlet-integrated IoT-Hub, we assumed 8 Cloudlet instances to handle 10 distinct IoT frameworks, each associated with various data sizes originating from 10 wearable sensors. During simulations, we allocated one instance per framework, which allowed the management of ten sensors. Data fusion and the fault tolerance of comparative decryption analysis are depicted in Fig. 4. The data fusion mechanism plays a vital role in the process of streaming multidimensional data to enhance the privacy preservation of the data stream. The file size has a significant impact on data fusion and storage rates during encryption and signature generation. In our simulations, an average of C_id = 8 Cloudlets and an average of 79 human-wearable sensors were considered to evaluate the data fusion rate, as shown in Fig. 4(a).
Increasing the number of Cloudlets significantly increases the data fusion rate, even when the sensor count is low. During data storage, the authenticator generates private keys for each sensing unit to enhance data privacy. The decryption time is a crucial metric for assessing and optimizing the communication and computational overhead of devices, and it is affected by the average C_id count when accessing data. The PPMDS approach, which is based on the C-NEST mechanism and the SCS model and streamlined with the UDO model, has a lower fault-tolerance rate during decryption for all file sizes. However, the fault-tolerance rate is higher for a file size of 2046 bits than for 512 bits, as shown in Fig. 4(b).
The cost of signature verification is a crucial performance metric for evaluating our proposed privacy approaches, as shown in Fig. 5. The number of potential Cloudlets affects the signature generation cost, and our approach incurs lower costs than State-Of-The-Art (SOA) methods due to its innovative signature generation and verification techniques. Specifically, when accessing data from a Cloudlet or IoT-Hub at E_id = 45, our approach has lower costs, as shown in Fig. 5(a), and it also incurs lower costs when C_id = 7, as shown in Fig. 5(b).
Communication overhead between sensors, Cloudlets, and servers is illustrated in Fig. 6; Fig. 6(a) depicts the communication overhead between wearable sensors (WS) and Cloudlets. The PPMDS approach has a lower overhead rate than SOA approaches. However, as the number of medical records E_id increases, communication overhead usually increases significantly due to the limited computational resources of sensors. Therefore, sensed data is transferred to a Cloudlet with moderate computational and storage capacity. Fig. 6(b) shows the communication overhead between the Cloudlet and the server. This overhead does not occur frequently, but when the Cloudlet is not capable of processing data, a service offloading strategy is initiated. The average communication overheads for PPMDS, EPA, and SPPDA are ≤ 2000 bits, ≤ 3900 bits, and ≤ 4600 bits, respectively. Nowadays, multi-edge computations can handle up to 8 GB of data storage and computation [38].
The error rate of the proposed and existing approaches based on data processing location is presented in Fig. 7. The PPMDS approach exhibits a lower error rate than both the SPPDA [24] and EPA [25] approaches at the server, Cloudlet, and IoT-Hub. This is due to the hierarchical independence of computational and storage capacities, where Server ≥ Cloudlet ≥ IoT-Hub. Additionally, the UDO model enhances health data security by utilizing the adaptive C-NEST cryptosystem during the data stream, while the SCS model solves scalability issues in data access. These two models prevent third-party involvement in privacy preservation. Comparatively, the EPA approach has a moderate error rate compared to the SPPDA approach. Table III shows the signature verification cost of the proposed and SOA approaches. The pairing cost is ϒ(ϕ, η_o) = 15.79 ms, the argumentation cost is ϒ_A = 0.04 ms, the exponential cost of Q_t is ϒ_e = 1.31 ms, and the exponential cost of the argumentation order of Q is ϒ_Z = 1.25 ms.
Impact of the proposed model: By leveraging the UDO and SCS models, the PPMDS approach achieves appropriate data security and service reliability rates by effectively classifying service requests. The data integrity measurement model accurately measures and validates data authentication, and its lightweight functionality eliminates the need for third-party involvement during data integrity verification. The complete process of data service request classification is shown in Fig. 3. The signature generation and verification mechanisms are novel and generate public, private, and signed public keys to enhance privacy, primarily suited for Cyber-Physical Systems (CPS) and IoT applications to ensure effective network maintenance. Before data is stored on IoT devices, it is analyzed and mapped against existing data, and the processing capacity depends on device selection to limit unauthorized access. The C-NEST protocol provides secure transmission between IoT devices and the Cloudlet. In summary, the UDO model simplifies data integrity processes on IoT devices by identifying similar data and providing adequate security during storage based on the C-NEST communication strategy. The communication process for data storage begins only when the received data differs from the stored data.

V. CONCLUSION
The Cloudlet-based 2-step PPMDS approach, consisting of the UDO model and the SCS model, aims to optimize data privacy and reduce single-bottleneck issues. The UDO model utilizes an adaptive cryptosystem, the Cloudlet-Nonsquare Encryption Secret Transmission (C-NEST) strategy, during data streams to optimize data security and reduce the medical dispute rate by 27%. The SCS model effectively classifies query requests with 89% accuracy to enable easy data access and address scalability issues, while the UDO model reduces data redundancy rates. The Cloudlet measurement system reduces the third-party verification burden and the average computational cost by 44% and 61%, respectively. Experimental results show that the proposed system outperforms State-Of-The-Art approaches, reducing the average data leakage error rate by 27%, query response time and average data transmission time by 31%, and average communication-computation cost by 61%.

FUTURE WORK
Future work will focus on designing and developing optimized secure channel selection, trust estimation, and latency-constrained computation models based on offloading schemes.Additionally, a node selection strategy will be designed to improve multidimensional data access, as these are global challenges that drastically affect device and server communication as well as computational service costs.

Gautam Srivastava, Senior Member, IEEE, M. S. Mekala, Muhammad Shadi Hajar, and Harsha Kalutarage

Index Terms: Cloudlet, C-NEST, UDO model, data-integrity measurement index, SCS model.

Manuscript received 5 August 2023; revised 24 October 2023; accepted 2 December 2023. Date of publication 13 December 2023; date of current version 26 April 2024. (Corresponding author: Gautam Srivastava.)

I. INTRODUCTION

THE CLOUD-INTEGRATION of Internet of Things (IoT) frameworks has become an essential paradigm in cyberspace, enabling seamless connections between healthcare systems. The convergence of wireless technology advancements and statistical machine learning (ML) has driven the exponential growth of medical IoT data analysis.

Fig. 4. Data fusion and fault-tolerance analysis with different sizes.

Fig. 7. Integrity error rate comparative analysis, where A, B and C refer to the device, cloudlet and server, respectively.

TABLE I: SUMMARY OF RELATED WORK