Deep learning and big data technologies for IoT security
- Authors: Amanullah, Mohamed , Habeeb, Riyaz , Nasaruddin, Fariza , Gani, Abdullah , Ahmed, Ejaz , Nainar, Abdul , Akim, Nazihah , Imran, Muhammad
- Date: 2020
- Type: Text , Journal article , Review
- Relation: Computer Communications Vol. 151, no. (2020), p. 495-517
- Full Text: false
- Reviewed:
- Description: Technology has become inevitable in human life, especially the growth of the Internet of Things (IoT), which enables communication and interaction with various devices. However, IoT has been proven to be vulnerable to security breaches. Therefore, it is necessary to develop foolproof solutions by creating new technologies or combining existing technologies to address the security issues. Deep learning, a branch of machine learning, has shown promising results in previous studies for the detection of security breaches. Additionally, IoT devices generate data of large volume, variety, and veracity. Thus, when big data technologies are incorporated, higher performance and better data handling can be achieved. Hence, we have conducted a comprehensive survey on state-of-the-art deep learning, IoT security, and big data technologies. A comparative analysis and the relationship among deep learning, IoT security, and big data technologies have also been discussed. Further, we have derived a thematic taxonomy from the comparative analysis of technical studies of the three aforementioned domains. Finally, we have identified and discussed the challenges in incorporating deep learning for IoT security using big data technologies and have provided directions to future researchers on IoT security aspects. © 2020 Elsevier B.V.
Unmanned aerial vehicle for internet of everything : opportunities and challenges
- Authors: Liu, Yalin , Dai, Hong-Ning , Wang, Qubeijian , Shukla, Mahendra , Imran, Muhammad
- Date: 2020
- Type: Text , Journal article , Review
- Relation: Computer Communications Vol. 155, no. (2020), p. 66-83
- Full Text:
- Reviewed:
- Description: Recent advances in information and communication technology (ICT) have further extended the Internet of Things (IoT) from the sole “things” aspect to the omnipotent role of the “intelligent connection of things”. Meanwhile, the concept of the internet of everything (IoE) is presented as such an omnipotent extension of IoT. However, the realization of IoE faces critical challenges, including the restricted network coverage and the limited resources of existing network technologies. Recently, Unmanned Aerial Vehicles (UAVs) have attracted significant attention owing to their high mobility, low cost, and flexible deployment. Thus, UAVs may potentially overcome the challenges of IoE. This article presents a comprehensive survey on the opportunities and challenges of UAV-enabled IoE. We first present three critical expectations of IoE: (1) scalability, requiring a scalable network architecture with ubiquitous coverage; (2) intelligence, requiring a global computing plane enabling intelligent things; and (3) diversity, requiring the provision of diverse applications. Thereafter, we review the enabling technologies to achieve these expectations and discuss four intrinsic constraints of IoE (i.e., coverage constraints, battery constraints, computing constraints, and security issues). We then present an overview of UAVs. We next discuss the opportunities brought by UAVs to IoE. Additionally, we introduce a UAV-enabled IoE (Ue-IoE) solution that exploits UAV mobility, showing that Ue-IoE can greatly enhance the scalability, intelligence, and diversity of IoE. Finally, we outline future directions in Ue-IoE. © 2020 Elsevier B.V.
Few-shot image classification : current status and research trends
- Authors: Liu, Ying , Zhang, Hengchang , Zhang, Weidong , Lu, Guojun , Tian, Qi , Ling, Nam
- Date: 2022
- Type: Text , Journal article , Review
- Relation: Electronics (Switzerland) Vol. 11, no. 11 (2022), p.
- Full Text:
- Reviewed:
- Description: Conventional image classification methods usually require a large number of training samples for the training model. However, in practical scenarios, the amount of available sample data is often insufficient, which easily leads to overfitting in network construction. Few-shot learning provides an effective solution to this problem and has been a hot research topic. This paper provides an intensive survey on the state-of-the-art techniques in image classification based on few-shot learning. According to the different deep learning mechanisms, the existing algorithms are divided into four categories: transfer learning based, meta-learning based, data augmentation based, and multimodal based methods. Transfer learning based methods transfer useful prior knowledge from the source domain to the target domain. Meta-learning based methods employ past prior knowledge to guide the learning of new tasks. Data augmentation based methods expand the amount of sample data with auxiliary information. Multimodal based methods use information from the auxiliary modality to facilitate the implementation of image classification tasks. This paper also summarizes the few-shot image datasets available in the literature, and experimental results tested by some representative algorithms are provided to compare their performance and analyze their pros and cons. In addition, the applications of existing research outcomes on few-shot image classification in different practical fields are discussed. Finally, a few future research directions are identified. © 2022 by the authors. Licensee MDPI, Basel, Switzerland.
Wearable sensor technology to predict core body temperature : a systematic review
- Authors: Dolson, Conor , Harlow, Ethan , Phelan, Dermot , Gabbett, Tim , Gaal, Benjamin , McMellen, Christopher , Geletka, Benjamin , Calcei, Jacob , Voos, James , Seshadri, Dhruv
- Date: 2022
- Type: Text , Journal article , Review
- Relation: Sensors Vol. 22, no. 19 (2022), p.
- Full Text:
- Reviewed:
- Description: Heat-related illnesses (HRIs), which range from heat exhaustion to heatstroke, affect thousands of individuals worldwide every year and are characterized by extreme hyperthermia with a core body temperature (CBT) usually > 40 °C, a decline in physical and athletic performance, CNS dysfunction, and, eventually, multiorgan failure. The measurement of CBT has been shown to predict heat-related illness and its severity, but current measurement methods are not practical for use in high-acuity and high-motion settings due to their invasive and obstructive nature or excessive costs. Noninvasive prediction of CBT using wearable technology and predictive algorithms offers the potential for continuous CBT monitoring and early intervention to prevent HRI in athletic, military, and intense work environments. Thus far, there has been a lack of peer-reviewed literature assessing the efficacy of wearable devices and predictive analytics to predict CBT to mitigate heat-related illness. This systematic review identified 20 studies representing a total of 25 distinct algorithms to predict the core body temperature using wearable technology. While high prediction accuracy was noted, with 17 out of 18 algorithms meeting the clinical validity standards, few algorithms incorporated individual and environmental data into their core body temperature prediction algorithms, despite the known impact of individual health and situational and environmental factors on CBT. Robust machine learning methods offer the ability to develop more accurate, reliable, and personalized CBT prediction algorithms using wearable devices by including additional data on user characteristics, workout intensity, and the surrounding environment.
The integration and interoperability of CBT prediction algorithms with existing heat-related illness prevention and treatment tools, including heat indices such as the WBGT, athlete management systems, and electronic medical records, will further prevent HRI and increase the availability and speed of data access during critical heat events, improving the clinical decision-making process for athletic trainers and physicians, sports scientists, employers, and military officers. © 2022 by the authors.
Smart grid evolution : predictive control of distributed energy resources—A review
- Authors: Babayomi, Oluleke , Zhang, Zhenbin , Dragicevic, Tomislav , Hu, Jiefeng , Rodriguez, Jose
- Date: 2023
- Type: Text , Journal article , Review
- Relation: International Journal of Electrical Power and Energy Systems Vol. 147, no. (2023), p.
- Full Text: false
- Reviewed:
- Description: As the smart grid evolves, it requires increasing distributed intelligence, optimization and control. Model predictive control (MPC) facilitates these functionalities for smart grid applications, namely: microgrids, smart buildings, ancillary services, industrial drives, electric vehicle charging, and distributed generation. Among these, this article focuses on providing a comprehensive review of the applications of MPC to the power electronic interfaces of distributed energy resources (DERs) for grid integration. In particular, the predictive control of power converters for wind energy conversion systems, solar photovoltaics, fuel cells and energy storage systems is covered in detail. The predictive control methods for grid-connected converters, artificial intelligence-based predictive control, open issues and future trends are also reviewed. The study highlights the potential of MPC to facilitate high-performance, optimal power extraction and control of diverse sustainable grid-connected DERs. Furthermore, the study brings detailed structure to the artificial intelligence techniques that can enhance performance, ease deployment, and reduce the computational burden of predictive control for power converters. © 2022 Elsevier Ltd
Review of the legacy and future of IEC 61850 protocols encompassing substation automation system
- Authors: Kumar, Shantanu , Abu-Siada, Ahmed , Das, Narottam , Islam, Syed
- Date: 2023
- Type: Text , Journal article , Review
- Relation: Electronics (Switzerland) Vol. 12, no. 15 (2023), p.
- Full Text:
- Reviewed:
- Description: Communication protocols play a pivotal role in the substation automation system as they carry critical information related to asset control, automation, protection, and monitoring. Substation legacy protocols run the assets’ bulk data on multiple wires over long distances. These data packets pass through multiple nodes, which makes the identification of the location and type of various malfunctions a challenging and time-consuming task. As downtime of substations is of high importance from a regulatory and compliance point of view, utilities are motivated to revisit the overall scheme and redesign a new system that features flexibility, adaptability, interoperability, and high accuracy. This paper presents a comprehensive review of various legacy protocols and highlights the path forward for a new protocol laid down as per the IEC 61850 standard. The IEC 61850 protocol is expected to be user-friendly, employ fiber optics instead of conventional copper wires, facilitate the application of non-conventional instrument transformers, and connect Ethernet wires to multiple intelligent electronic devices. However, the deployment of smart protocols in future substations is not a straightforward process, as it requires careful planning, scheduled shutdowns, and the resolution of foreseeable issues related to interfacing with proprietary vendor equipment. Along with the technical issues of communication, future smart protocols call for advanced personnel and engineering skills to embrace the new technology. © 2023 by the authors.