Contrastive GNN-based traffic anomaly analysis against imbalanced dataset in IoT-based ITS
- Authors: Wang, Yang , Lin, Xi , Wu, Jun , Bashir, Ali , Yang, Wu , Li, Jianhua , Imran, Muhammad
- Date: 2022
- Type: Text , Conference paper
- Relation: 2022 IEEE Global Communications Conference, GLOBECOM 2022, Virtual, online, 4-8 December 2022, Proceedings p. 3557-3562
- Full Text: false
- Reviewed:
- Description: Traffic anomaly analysis in IoT-based intelligent transportation systems (ITS) is crucial to improving public transportation safety and efficiency. The issue is also challenging due to the imbalanced distribution of anomaly data in IoT-based ITS, which may cause overfitting or underfitting in the training phase. However, some research on traffic anomaly analysis injects limited data to address the shortage of anomaly samples, or even neglects this issue, which overlooks the potential representation of nodes in graph neural networks. In this paper, we propose an improved contrastive GNN-based learning framework for traffic anomaly analysis that alleviates the problem of imbalanced datasets in the training phase. In this framework, we provide a graph augmentation approach with coupled features to learn different views of graph data. Besides, we design an effective training method based on a contrastive loss for our framework, which learns better latent representations for the downstream tasks. Finally, we conduct extensive experiments to evaluate the performance of our proposed framework on real-world datasets. We demonstrate that our framework achieves up to a 6.45% precision improvement over the state-of-the-art. © 2022 IEEE.
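As a rough illustration of the contrastive training idea this abstract describes (a generic NT-Xent-style loss over two augmented views of node embeddings, not the paper's exact formulation):

```python
import numpy as np

def ntxent_loss(z1, z2, tau=0.5):
    """Generic NT-Xent-style contrastive loss over two views of N node
    embeddings (rows of z1 and z2); matching rows are positive pairs."""
    z = np.vstack([z1, z2])                                  # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)         # unit-normalize
    sim = z @ z.T / tau                                      # scaled cosine similarity
    np.fill_diagonal(sim, -np.inf)                           # exclude self-pairs
    n = z1.shape[0]
    # row i's positive is row i + n (and vice versa)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim[np.arange(2 * n), pos] - np.log(np.exp(sim).sum(axis=1))
    return -log_prob.mean()
```

Identical views give a strictly lower loss than mismatched ones, which is the property the training objective exploits.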
Big data management in participatory sensing : issues, trends and future directions
- Authors: Karim, Ahmad , Siddiqa, Aisha , Safdar, Zanab , Razzaq, Maham , Imran, Muhammad
- Date: 2020
- Type: Text , Journal article
- Relation: Future Generation Computer Systems Vol. 107, no. (2020), p. 942-955
- Full Text: false
- Reviewed:
- Description: Participatory sensing has become an emerging technology of this era owing to its low cost in big sensor data collection. Prior to participatory sensing, large-scale deployment complexities were found in wireless sensor networks when collecting data from widespread resources. Participatory sensing systems employ handheld devices as sensors to collect data from communities and transmit it to the cloud, where the data are further analyzed by expert systems. The processes involved in participatory sensing, such as data collection, transmission, analysis, and visualization, exhibit certain management issues. This study aims to identify big data management issues that must be addressed on the cloud side during data processing and storage, and on the participant side during data collection and visualization. It then proposes a framework for big data management in participatory sensing to resolve contemporary big data management issues on the basis of the suggested principles. Moreover, this work presents case studies to illustrate the highlighted issues. Finally, the limitations, recommendations, and future research directions for academia and industry in the domain of participatory sensing are discussed. © 2017. **Please note that there are multiple authors for this article therefore only the name of the first 5 including Federation University Australia affiliate “Muhammad Imran” is provided in this record**
A blockchain model for fair data sharing in deregulated smart grids
- Authors: Samuel, Omaji , Javaid, Nadeem , Awais, Muhammad , Ahmed, Zeeshan , Imran, Muhammad , Guizani, Mohsen
- Date: 2019
- Type: Text , Conference paper
- Relation: 2019 IEEE Global Communications Conference, GLOBECOM 2019, Waikoloa 9-13 December 2019
- Full Text: false
- Reviewed:
- Description: The emergence of smart home appliances has generated a high volume of data on smart meters belonging to different customers. However, customers cannot share their data in deregulated smart grids due to privacy concerns, although these data are important for the service provider to deliver an efficient service. To encourage customers' participation, this paper proposes an access control mechanism that fairly compensates customers for their participation in data sharing via blockchain, using the concept of differential privacy. We address the computational issues of the existing Ethereum blockchain by proposing a proof-of-authority consensus protocol that derives reputation scores through the PageRank mechanism. Experimental results show the efficiency of the proposed model in minimizing privacy risk and maximizing the aggregator's profit. In addition, gas consumption, as well as the cost of computational resources, is reduced. © 2019 IEEE.
Implicit feedback-based group recommender system for internet of things applications
- Authors: Guo, Zhiwei , Yu, Keping , Guo, Tan , Bashir, Ali , Imran, Muhammad , Guizani, Mohsen
- Date: 2020
- Type: Text , Conference paper
- Relation: 2020 IEEE Global Communications Conference, GLOBECOM 2020, Virtual Taipei, 7-11 December 2020 Vol. 2020-January
- Full Text:
- Reviewed:
- Description: With the prevalence of Internet of Things (IoT)-based social media applications, the distance among people has been greatly shortened. As a result, recommender systems in IoT-based social media need to be developed for groups of users rather than individual users. However, existing methods are highly dependent on explicit preference feedback, ignoring scenarios with implicit feedback. To remedy this gap, this paper proposes an implicit feedback-based group recommender system using probabilistic inference and non-cooperative game theory (GREPING) for IoT-based social media. In particular, unknown process variables can be estimated from observable implicit feedback via Bayesian posterior probability inference. In addition, globally optimal recommendation results can be calculated with the aid of a non-cooperative game. Two groups of experiments are conducted to assess GREPING from two aspects: efficiency and robustness. Experimental results show clear improvements and considerable stability of GREPING compared to baseline methods. © 2020 IEEE.
A blockchain based privacy-preserving system for electric vehicles through local communication
- Authors: Yahaya, Adamu , Javaid, Nadeem , Khalid, Rabiya , Imran, Muhammad , Naseer, Nidal
- Date: 2020
- Type: Text , Conference paper
- Relation: 2020 IEEE International Conference on Communications, ICC 2020, Dublin, Ireland, 7 to 11 June, IEEE International Conference on Communications Vol. 2020-June
- Full Text: false
- Reviewed:
- Description: In this study, we propose a privacy-preserving and efficient distributed scheme for searching and matching Electric Vehicle (EV) charging demanders with suppliers based on reputation. Partially homomorphic encryption is used for reputation computation over local communication while hiding EV users' locations. A private blockchain is incorporated into the system to verify and permit secure energy trading among EV demanders and suppliers. Simulation results show that the proposed privacy-preserving algorithm converges faster than the Bichromatic Mutual Nearest Neighbor (BMNN) algorithm. © 2020 IEEE.
An adaptive and efficient buffer management scheme for resource-constrained delay tolerant networks
- Authors: Moetesum, Momina , Hadi, Fazle , Imran, Muhammad , Minhas, Abid , Vasilakos, Athanasios
- Date: 2016
- Type: Text , Journal article
- Relation: Wireless Networks Vol. 22, no. 7 (2016), p. 2189-2201
- Full Text: false
- Reviewed:
- Description: Provisioning a buffer management mechanism is especially crucial in resource-constrained delay tolerant networks (DTNs), where a maximum data delivery ratio with minimum overhead is expected in highly congested environments. However, most DTN protocols do not consider resource limitations (e.g., buffer, bandwidth), which results in performance degradation. To mitigate the impact of frequent buffer overflows, this paper presents an adaptive and efficient buffer management scheme called size-aware drop (SAD) that strives to improve buffer utilization and avoid unnecessary message drops. To improve the data delivery ratio, SAD determines the exact space requirement from the difference between the size of newly arrived message(s) and the available space. To vacate inevitable space from a congested buffer, SAD strives to avoid redundant message drops and deliberately picks and discards the most appropriate message(s) to minimize overhead. The performance of SAD is validated through extensive simulations in realistic environments (i.e., resource-constrained and congested) with different mobility models (i.e., Random Waypoint and disaster). Simulation results demonstrate the superiority of SAD in terms of delivery probability and overhead ratio, besides other metrics, when compared to contemporary schemes based on Epidemic (DOA and DLA) and PRoPHET (SHLI and MOFO). © 2015, Springer Science+Business Media New York.
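The size-aware idea can be sketched as follows; the message representation and tie-breaking rules here are hypothetical simplifications for illustration, not the authors' implementation:

```python
def size_aware_drop(buffer, incoming_size, capacity):
    """Vacate just enough buffer space for an incoming message.
    `buffer` is a list of (msg_id, size) tuples; returns (kept, dropped)."""
    used = sum(size for _, size in buffer)
    shortfall = incoming_size - (capacity - used)
    if shortfall <= 0:
        return list(buffer), []            # enough free space; drop nothing
    # prefer the single smallest message that alone covers the shortfall,
    # so no more data than necessary is discarded
    adequate = sorted((m for m in buffer if m[1] >= shortfall), key=lambda m: m[1])
    if adequate:
        dropped = [adequate[0]]
    else:
        # otherwise discard largest-first until enough space is vacated
        dropped, freed = [], 0
        for m in sorted(buffer, key=lambda m: -m[1]):
            if freed >= shortfall:
                break
            dropped.append(m)
            freed += m[1]
    kept = [m for m in buffer if m not in dropped]
    return kept, dropped
```

The point of the policy is that the set of dropped messages is matched to the actual shortfall rather than evicting blindly.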
Blending big data analytics : review on challenges and a recent study
- Authors: Amalina, Fairuz , Targio Hashem, Ibrahim , Azizul, Zati , Fong, Ang , Imran, Muhammad
- Date: 2020
- Type: Text , Journal article , Review
- Relation: IEEE Access Vol. 8, no. (2020), p. 3629-3645
- Full Text:
- Reviewed:
- Description: With the collection of massive amounts of data every day, big data analytics has emerged as an important trend for many organizations. These collected data can contain important information that may be key to solving wide-ranging problems, such as cyber security, marketing, healthcare, and fraud. To analyze their large volumes of data for business analyses and decisions, large companies, such as Facebook and Google, adopt analytics, and such analyses and decisions impact existing and future technology. In this paper, we explore how big data analytics is utilized as a technique for solving problems of complex and unstructured data using technologies such as Hadoop, Spark, and MapReduce. We also discuss the data challenges introduced by big data according to the literature, including its six V's. Moreover, we investigate case studies of various big data analytics techniques, namely text, voice, video, and network analytics. We conclude that big data analytics can bring positive changes to many fields, such as education, the military, healthcare, politics, business, agriculture, banking, and marketing, in the future. © 2013 IEEE.
A blockchain-based decentralized energy management in a P2P trading system
- Authors: Khalid, Rabiya , Javaid, Nadeem , Javaid, Sakeena , Imran, Muhammad , Naseer, Nidal
- Date: 2020
- Type: Text , Conference paper
- Relation: 2020 IEEE International Conference on Communications, ICC 2020, Dublin, Ireland, 7 to 11 June, IEEE International Conference on Communications Vol. 2020-June
- Full Text: false
- Reviewed:
- Description: Local energy generation and peer-to-peer (P2P) energy trading in the local market can reduce energy consumption cost and the emission of harmful gases (as renewable energy sources (RESs) are used to generate energy at the user's premises), and increase smart grid resilience. In this paper, a blockchain-based solution is proposed to implement a hybrid P2P energy trading market. A blockchain-based system is fully decentralized, allowing market members to interact with each other and trade energy without involving any third party. Smart contracts play a very important role in the blockchain-based energy trading market, as they contain all the necessary rules for energy trading. We propose three smart contracts to implement the hybrid electricity trading market: market members interact with the main smart contract, which calls the P2P smart contract and the prosumer-to-grid (P2G) smart contract for further processing. The main objective of this paper is to propose a model that implements an efficient hybrid energy trading market while reducing the cost and peak-to-average ratio (PAR) of electricity. © 2020 IEEE.
A hybrid computing solution and resource scheduling strategy for edge computing in smart manufacturing
- Authors: Li, Xiaomin , Wan, Jiafu , Dai, Hong-Ning , Imran, Muhammad , Xia, Min , Celesti, Antonio
- Date: 2019
- Type: Text , Journal article
- Relation: IEEE Transactions on Industrial Informatics Vol. 15, no. 7 (2019), p. 4225-4234
- Full Text: false
- Reviewed:
- Description: At present, smart manufacturing computing frameworks face many challenges, such as the lack of an effective framework for fusing historical computing resources and of a resource scheduling strategy that guarantees the low-latency requirement. In this paper, we propose a hybrid computing framework and design an intelligent resource scheduling strategy to fulfill the real-time requirement in smart manufacturing with edge computing support. First, a four-layer computing system in a smart manufacturing environment is provided to support artificial intelligence task operation from the network perspective. Then, a two-phase algorithm for scheduling the computing resources in the edge layer is designed based on greedy and threshold strategies with latency constraints. Finally, a prototype platform was developed, and we conducted experiments on it to evaluate the performance of the proposed framework in comparison with traditionally used methods. The proposed strategies demonstrate excellent real-time performance, satisfaction degree (SD), and energy consumption of computing services in smart manufacturing with edge computing. © 2005-2012 IEEE.
Machine learning for 5G security : architecture, recent advances, and challenges
- Authors: Afaq, Amir , Haider, Noman , Baig, Muhammad , Khan, Komal , Imran, Muhammad , Razzak, Imran
- Date: 2021
- Type: Text , Journal article
- Relation: Ad Hoc Networks Vol. 123, no. (2021), p.
- Full Text: false
- Reviewed:
- Description: The granular implementation of crucial network functions using software-centric and virtualized approaches in 5G networks has brought forth unprecedented security challenges in general and privacy concerns in particular. Moreover, the premature deployment and compromised supply chains of these software components put individual network components at risk and have a ripple effect on the rest of the network. Novel threats to 5G assets include tampering with identity and access management, supply-chain poisoning, masquerade and bot attacks, and loopholes in source code. Machine learning (ML) in this context can provide heavily dynamic and robust security mechanisms for the software-centric architecture of 5G networks. ML model development and implementation also rely on programmable environments; hence, they can play a vital role in designing, modelling, and automating efficient security protocols. This article presents the threat landscape across 5G networks and discusses the feasibility and architecture of different ML-based models to counter these threats. We also present an architecture for automated threat intelligence that uses cooperative and coordinated ML to secure 5G assets and infrastructure, along with a summary of closely related existing works and future research challenges. © 2021 Elsevier B.V.
Deep learning-based approach for detecting trajectory modifications of cassini-huygens spacecraft
- Authors: Aldabbas, Ashraf , Gal, Zoltan , Ghori, Khawaja , Imran, Muhammad , Shoaib, Muhammad
- Date: 2021
- Type: Text , Journal article
- Relation: IEEE Access Vol. 9, no. (2021), p. 39111-39125
- Full Text:
- Reviewed:
- Description: The Cassini spacecraft required numerous trajectory modifications during the final 14 years of its interplanetary research mission. With roughly 1.3 hours of signal propagation time over the 1.4-billion-kilometer Earth-Cassini channel, detecting complex orbit-modification events requires special investigation and analysis of the collected big data. The technologies for space exploration warrant a high standard of nuanced and detailed research, and the Cassini mission accumulated huge volumes of science records; the resulting curiosity derives mainly from the need to use machine learning to analyze deep space missions. For energy-saving reasons, communication between the Earth and Cassini was executed in non-periodic mode. This paper provides a deep learning approach for detecting Cassini spacecraft trajectory modifications in post-processing mode. The proposed model utilizes the ability of Long Short-Term Memory (LSTM) neural networks to extract useful data and learn the inner patterns of time series, along with the strength of LSTM layers in distinguishing long- and short-term dependencies. Our study used statistical rates, the Matthews correlation coefficient, and the F1 score to evaluate our models. We carried out multiple tests and evaluated the proposed approach against several advanced models. The preparatory analysis showed that exploiting the LSTM layer provides a notable boost in detection performance. The proposed model detected 232 trajectory modifications with 99.98% accuracy over the last 13.35 years of the Cassini spacecraft's life. © 2013 IEEE.
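The evaluation metrics named in this abstract (Matthews correlation coefficient and F1 score) are both computed from confusion-matrix counts; a minimal sketch:

```python
import math

def mcc_and_f1(tp, fp, fn, tn):
    """Matthews correlation coefficient and F1 score from confusion counts
    (true/false positives and negatives of a binary detector)."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = ((tp * tn) - (fp * fn)) / denom if denom else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return mcc, f1
```

Unlike accuracy, MCC stays informative on imbalanced event data (rare trajectory modifications among long quiet stretches), which is presumably why the authors report it.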
Energy efficiency perspectives of femtocells in internet of things : recent advances and challenges
- Authors: Al-Turjman, Fadi , Imran, Muhammad , Bakhsh, Sheikh
- Date: 2017
- Type: Text , Journal article
- Relation: IEEE Access Vol. 5, no. (2017), p. 26808-26818
- Full Text:
- Reviewed:
- Description: Energy efficiency is a growing concern in every aspect of technology. Apart from maintaining profitability, energy efficiency means a decrease in overall environmental effects, which is a serious concern in today's world. Using femtocells in the Internet of Things (IoT) can boost energy efficiency. To illustrate, femtocells can be used in smart homes, a subpart of the smart grid, as a communication mechanism for managing energy efficiency. Moreover, femtocells can be used in many IoT applications to provide communication. However, it is important to evaluate the energy efficiency of femtocells. This paper investigates recent advances and challenges in the energy efficiency of femtocells in IoT. First, we introduce the idea of femtocells in the context of IoT and their role in IoT applications. Next, we describe prominent performance metrics in order to understand how energy efficiency is evaluated. Then, we elucidate how energy can be modeled for a femtocell and provide some models from the literature. Since femtocells are used in heterogeneous networks to manage energy efficiency, we also describe some energy efficiency schemes for deployment. The factors that affect the energy usage of a femtocell base station are discussed, followed by the power consumption of user equipment under femtocell coverage. Finally, we highlight prominent open research issues and challenges. © 2013 IEEE.
The rise of ransomware and emerging security challenges in the internet of things
- Authors: Yaqoob, Ibrar , Ahmed, Ejaz , Rehman, Muhammad , Ahmed, Abdelmuttlib , Imran, Muhammad
- Date: 2017
- Type: Text , Journal article
- Relation: Computer Networks Vol. 129, no. (2017), p. 444-458
- Full Text: false
- Reviewed:
- Description: With the increasing miniaturization of smartphones, computers, and sensors in the Internet of Things (IoT) paradigm, strengthening security and preventing ransomware attacks have become key concerns. Traditional security mechanisms are no longer applicable because of the involvement of resource-constrained devices, which would require more computation power and resources. This paper presents ransomware attacks and security concerns in IoT. We initially discuss the rise of ransomware attacks and outline the associated challenges. Then, we investigate, report, and highlight state-of-the-art research efforts directed at IoT from a security perspective. A taxonomy is devised by classifying and categorizing the literature based on important parameters (e.g., threats, requirements, IEEE standards, deployment level, and technologies). Furthermore, a few credible case studies are outlined to alert people to how seriously vulnerable IoT devices are to threats. We enumerate the requirements that need to be met for securing IoT. Several indispensable open research challenges (e.g., data integrity, lightweight security mechanisms, lack of upgradability and patchability features in security software, physical protection of trillions of devices, privacy, and trust) are identified and discussed, and several prominent future research directions are provided. © 2017 Elsevier B.V. **Please note that there are multiple authors for this article therefore only the name of the first 5 including Federation University Australia affiliate “Muhammad Imran” is provided in this record**
Process migration-based computational offloading framework for IoT-supported mobile edge/cloud computing
- Authors: Yousafzai, Abdullah , Yaqoob, Ibrar , Imran, Muhammad , Gani, Abdullah , Md Noor, Rafidah
- Date: 2020
- Type: Text , Journal article
- Relation: IEEE Internet of Things Journal Vol. 7, no. 5 (2020), p. 4171-4182
- Full Text: false
- Reviewed:
- Description: Mobile devices have become an indispensable component of the Internet of Things (IoT). However, these devices have resource constraints in processing capability, battery power, and storage space, hindering the execution of computation-intensive applications that often require broad bandwidth, stringent response times, long battery life, and heavy computing power. Mobile cloud computing and mobile edge computing (MEC) are emerging technologies that can meet the aforementioned requirements using offloading algorithms. In this article, we analyze the effect of platform-dependent native applications on computational offloading in edge networks and propose a lightweight process migration-based computational offloading framework. The proposed framework does not require application binaries at edge servers and thus seamlessly migrates native applications. The proposed framework is evaluated using an experimental testbed. Numerical results reveal that the proposed framework saves almost 44% of execution time and 84% of energy consumption. Hence, the proposed framework shows profound potential for resource-intensive IoT application processing in MEC. © 2014 IEEE.
A novel cooperative link selection mechanism for enhancing the robustness in scale-free IoT networks
- Authors: Khan, Muhammad , Javaid, Nadeem , Javaid, Sakeena , Khalid, Adia , Nasser, Nidal , Imran, Muhammad
- Date: 2020
- Type: Text , Conference paper
- Relation: 16th IEEE International Wireless Communications and Mobile Computing Conference, IWCMC 2020, Limassol, Cyprus, 15 to 19 June 2020, 2020 International Wireless Communications and Mobile Computing, IWCMC 2020 p. 2222-2227
- Full Text: false
- Reviewed:
- Description: In today's world, the Internet of Things (IoT) helps people in many fields by enabling smart city projects in health monitoring, smart parking, industrial optimization, home energy management, etc. Everyday objects are connected to the Internet, allowing their owners to keep an eye on their surroundings. The IoT network comprises nodes that are smart enough to perform any function and provide benefits to people. However, any fault in the network opens up the risk of leaking personal information. The aim is to develop a scale-free network that controls the effects of malicious attacks and consequently improves network robustness. In this paper, our prime focus is to mitigate the effect of malicious nodes by providing a robust strategy to maintain network stability. In this regard, we propose a topology named Cooperation-based Edge Swap (CES) for improving network robustness in scale-free networks. CES uses an edge/link selection mechanism that exploits cooperation under Rayleigh fading to swap the network topology and improve robustness. Simulation results demonstrate the performance of CES in improving network robustness. © 2020 IEEE.
RTRD: Real-time route discovery for urban scenarios using internet of things
- Authors: Din, Sadia , Ahmad, Awais , Paul, Anand , Anisetti, Marco , Imran, Muhammad
- Date: 2019
- Type: Text , Conference paper
- Relation: 2019 IEEE Global Communications Conference, GLOBECOM 2019, Waikoloa, 9-13 December 2019
- Full Text: false
- Reviewed:
- Description: Vehicular ad hoc networks (VANETs) have developed rapidly because of their applicability and significance in the fields of traffic management, road monitoring and safety, infotainment, and on-demand services. Route planning in vehicular networks based on efficient collection of real-time data can effectively mitigate traffic congestion problems in urban areas. Furthermore, real-time data is shared using an effective sharing mechanism to avoid redundancy of the collected information. However, dynamic route replanning and effective sharing mechanisms based on real-time data remain challenging problems. Therefore, given the aforementioned constraints, this paper describes a route discovery technique that uses real-time data collected from various vehicles using the Internet of Things. The proposed scheme is based on a novel data dissemination technique for information sharing among the roadside units. RTRD comprises VANETs, vehicular traffic servers, and a 5G-based cellular system for public transportation. Considering traffic congestion in urban areas, the optimal path is calculated to re-plan routes based on the k shortest path algorithm, and a load balancing technique is adopted to avoid further congestion. © 2019 IEEE. **Please note that there are multiple authors for this article therefore only the name of the first 5 including Federation University Australia affiliate “Muhammad Imran” is provided in this record**
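k-shortest-path route replanning of the kind this abstract mentions builds on single-source shortest-path search; a minimal Dijkstra sketch over a hypothetical road graph (edge weights standing in for travel times, not the authors' algorithm):

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest travel-time path in a weighted road graph.
    `graph` maps node -> list of (neighbour, cost) edges."""
    pq = [(0, src, [src])]                 # (cost so far, node, path taken)
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []                # dst unreachable
```

A k-shortest-path method such as Yen's algorithm repeatedly reruns a search like this on perturbed copies of the graph to collect alternative routes for load balancing.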
Refining Parkinson’s neurological disorder identification through deep transfer learning
- Authors: Naseer, Amina , Rani, Monai , Naz, Saeeda , Razzak, Muhammad , Imran, Muhammad , Xu, Guandong
- Date: 2020
- Type: Text , Journal article
- Relation: Neural Computing and Applications Vol. 32, no. 3 (2020), p. 839-854
- Full Text:
- Reviewed:
- Description: Parkinson’s disease (PD), a multi-system neurodegenerative disorder that affects the brain slowly, is characterized by symptoms such as muscle stiffness, tremor in the limbs, and impaired balance, all of which tend to worsen with the passage of time. Available treatments target its symptoms, aiming to improve the quality of life. However, automatic diagnosis at early stages remains a challenging medical task, since a patient may behave identically to a healthy individual at the very early stage of the disease. Parkinson’s disease detection through handwriting data is a significant classification problem for identifying PD at the infancy stage. In this paper, PD identification is realized with the help of handwriting images, which serve as one of the earliest indicators of PD. For this purpose, we propose a deep convolutional neural network classifier with transfer learning and data augmentation techniques to improve identification. Two transfer learning approaches, freezing and fine-tuning, are investigated using the ImageNet and MNIST datasets independently as source tasks. A trained network achieved 98.28% accuracy using the fine-tuning-based approach with ImageNet and the PaHaW dataset. Experimental results on the benchmark dataset reveal that the proposed approach provides better detection of Parkinson’s disease compared to state-of-the-art work. © 2019, Springer-Verlag London Ltd., part of Springer Nature.
UAV-enabled data acquisition scheme with directional wireless energy transfer for Internet of Things
- Authors: Liu, Yalin , Dai, Hong-Ning , Wang, Hao , Imran, Muhammad , Wang, Xiaofen , Shoaib, Muhammad
- Date: 2020
- Type: Text , Journal article
- Relation: Computer Communications Vol. 155, no. (2020), p. 184-196
- Full Text: false
- Reviewed:
- Description: Low-power Internet of Things (IoT) suffers from two limitations: the battery-power limitation of IoT nodes and the inflexibility of infrastructure-node deployment. In this paper, we propose an Unmanned Aerial Vehicle (UAV)-enabled data acquisition scheme with directional wireless energy transfer (WET) to overcome these limitations. The main idea of the proposed scheme is to employ a UAV as both a data collector and an energy supplier. The UAV first transfers directional wireless energy to an IoT node, which then sends data packets back to the UAV using the harvested energy. Meanwhile, we minimize the overall energy consumption under the conditions of balanced energy supply and limited overall time, and derive the optimal values of WET time and data transmission power. After analysing the feasibility of the optimal WET time and data transmission, we design an allocation scheme based on the feasible ranges of data size level and channel-fading degree. The numerical results show the feasibility and adaptability of our allocation scheme against varied values of multiple system parameters. We further extend our scheme to the multi-node scenario by re-designing energy beamforming and adopting multi-access mechanisms, and we also analyse the mobility of UAVs in the proposed scheme. © 2020 Elsevier B.V.
A lightweight federated learning based privacy preserving B5G pandemic response network using unmanned aerial vehicles: A proof-of-concept
- Authors: Nasser, Nasser , Fadlullah, Zubair , Fouda, Mostafa , Ali, Asmaa , Imran, Muhammad
- Date: 2022
- Type: Text , Journal article
- Relation: Computer Networks Vol. 205, no. (2022), p.
- Full Text: false
- Reviewed:
- Description: The concept of an intelligent pandemic response network has gained momentum during the novel coronavirus disease (COVID-19) era. A heterogeneous communication architecture is essential to facilitate collaborative and intelligent medical analytics in fifth generation and beyond (B5G) networks, to intelligently learn and disseminate pandemic-related information and diagnostic results. However, such a technique raises privacy issues pertaining to the health data of patients. In this paper, we envision a privacy-preserving pandemic response network using a proof-of-concept aerial–terrestrial network system serving mobile user entities/equipment (UEs). By leveraging unmanned aerial vehicles (UAVs), a lightweight federated learning model is proposed to collaboratively yet privately learn medical (e.g., COVID-19) symptoms with high accuracy from data collected by individual UEs with ambient sensors and wearable devices. An asynchronous weight updating technique is introduced into the federated learning to avoid redundant learning and save precious networking and computing resources of the UAVs/UEs. A use-case in which an Artificial Intelligence (AI)-based model is employed for COVID-19 detection from radiograph images is presented to demonstrate the effectiveness of the proposed approach. © 2021 Elsevier B.V.
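A minimal sketch of the federated-averaging and asynchronous-update ideas this abstract refers to; the staleness-discount rule and weighting below are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def fed_avg(client_ws, client_sizes):
    """Size-weighted federated averaging of client weight arrays:
    clients with more local data contribute proportionally more."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_ws, client_sizes))

def async_update(global_w, client_w, staleness, alpha=0.5):
    """Asynchronous update: blend a possibly stale client model into the
    global model, discounting its contribution by staleness."""
    weight = alpha / (1 + staleness)
    return (1 - weight) * global_w + weight * client_w
```

Because each UE pushes its update whenever ready, the staleness discount keeps late arrivals from dragging the global model backwards, which is one way an asynchronous scheme avoids redundant learning rounds.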
A quantitative risk assessment model involving frequency and threat degree under line-of-business services for infrastructure of emerging sensor networks
- Authors: Jing, Xu , Hu, Hanwen , Yang, Huijun , Au, Man , Li, Shuqin , Xiong, Naixue , Imran, Muhammad , Vasilakos, Athanasios
- Date: 2017
- Type: Text , Journal article
- Relation: Sensors (Switzerland) Vol. 17, no. 3 (2017), p.
- Full Text:
- Reviewed:
- Description: The prospect of Line-of-Business Services (LoBSs) for the infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario, as the service provider’s server contains many valuable resources. LoBSs’ users are very diverse, as they may come from a wide range of locations with vastly different characteristics. The cost of joining can be low, and in many cases intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically, and assessing LoBSs’ risk dynamically based on both the frequency and threat degree of malicious operations is therefore necessary. In this paper, we propose a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree, based on value at risk. To quantify the threat degree as elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of intrusion traces, we adapt the historical simulation method of value at risk to dynamically assess LoBSs’ risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of the intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for dynamically assessing the risk of LoBSs for ESN infrastructure involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing. © 2017 by the authors. Licensee MDPI, Basel, Switzerland.
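The historical simulation method of value at risk mentioned above amounts to taking an empirical quantile of observed losses; a minimal sketch (the quantile convention used here is one of several in common use):

```python
def historical_var(losses, confidence=0.95):
    """Value at risk via historical simulation: the loss level exceeded
    in only about (1 - confidence) of the observed periods."""
    ordered = sorted(losses)
    idx = min(int(confidence * len(ordered)), len(ordered) - 1)
    return ordered[idx]
```

In a risk model like QRAM, `losses` would be per-period risk observations derived from intrusion traces, and the resulting threshold drives the dynamic access-control decision.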