Graph learning for anomaly analytics : algorithms, applications, and challenges
- Authors: Ren, Jing , Xia, Feng , Lee, Ivan , Noori Hoshyar, Azadeh , Aggarwal, Charu
- Date: 2023
- Type: Text , Journal article
- Relation: ACM Transactions on Intelligent Systems and Technology Vol. 14, no. 2 (2023), p.
- Full Text:
- Reviewed:
- Description: Anomaly analytics is a popular and vital task in various research contexts that has been studied for several decades. At the same time, deep learning has shown its capacity in solving many graph-based tasks, like node classification, link prediction, and graph classification. Recently, many studies have extended graph learning models to anomaly analytics problems, resulting in beneficial advances in graph-based anomaly analytics techniques. In this survey, we provide a comprehensive overview of graph learning methods for anomaly analytics tasks. We classify them into four categories based on their model architectures, namely graph convolutional network, graph attention network, graph autoencoder, and other graph learning models. The differences between these methods are also compared in a systematic manner. Furthermore, we outline several graph-based anomaly analytics applications across various domains in the real world. Finally, we discuss five potential future research directions in this rapidly growing field. © 2023 Association for Computing Machinery.
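As an illustration of the graph-autoencoder category this survey describes, the sketch below (an assumption for illustration, not code from the paper) scores nodes by how poorly a low-rank reconstruction of the adjacency matrix recovers their edges, which is the reconstruction-error principle behind many graph-autoencoder anomaly detectors. Only NumPy is used; the toy graph and the choice of rank are hypothetical.

```python
import numpy as np

def reconstruction_scores(A: np.ndarray, rank: int = 2) -> np.ndarray:
    """Score each node by how poorly a low-rank model reconstructs its
    adjacency row -- structurally unusual nodes get large errors."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A_hat = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]  # rank-k reconstruction
    return np.linalg.norm(A - A_hat, axis=1)                # per-node error

# Toy graph: two 4-cliques, plus node 8 wired to every other node (the anomaly).
n = 9
A = np.zeros((n, n))
A[:4, :4] = 1
A[4:8, 4:8] = 1
A[8, :] = 1
A[:, 8] = 1
np.fill_diagonal(A, 0)

scores = reconstruction_scores(A, rank=2)
print(int(np.argmax(scores)))  # → 8, the node that fits neither community
```

Deep graph autoencoders replace the linear SVD step with GCN encoders and decoders, but the anomaly score is computed from reconstruction error in the same way.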
Contention resolution in wi-fi 6-enabled internet of things based on deep learning
- Authors: Chen, Chen , Li, Junchao , Balasubramanian, Venki , Wu, Yongqiang , Zhang, Yongqiang , Wan, Shaohua
- Date: 2021
- Type: Text , Journal article
- Relation: IEEE Internet of Things Journal Vol. 8, no. 7 (2021), p. 5309-5320
- Full Text:
- Reviewed:
- Description: Internet of Things (IoT) is expected to vastly increase the number of connected devices. As a result, a multitude of IoT devices transmit various information through wireless communication technologies such as Wi-Fi, cellular mobile communication, and low-power wide-area network (LPWAN) technology. However, even the latest Wi-Fi technology is not yet ready to accommodate these large amounts of data. Accurately setting the contention window (CW) value significantly affects the efficiency of the Wi-Fi network. Unfortunately, the standard collision resolution used by IEEE 802.11ax networks is nonscalable; thus, it cannot maintain stable throughput for an increasing number of stations, even though Wi-Fi 6 has been designed to improve performance in dense scenarios. To this end, we propose a CW control strategy for Wi-Fi 6 systems. This strategy leverages deep learning to search for the optimal configuration of the CW under different network conditions. Our deep neural network is trained on data generated from a Wi-Fi 6 simulation system with varying key parameters, e.g., the number of nodes, short interframe space (SIFS), distributed interframe space (DIFS), and data transmission rate. Numerical results demonstrate that our deep learning scheme can consistently find the optimal CW adjustment multiple by adaptively perceiving the channel contention status. The resulting model is significantly improved in terms of system throughput, average transmission delay, and packet retransmission rate, making Wi-Fi 6 better adapted to the access of a large number of IoT devices. © 2014 IEEE.
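For context on the baseline this abstract calls nonscalable: 802.11 resolves collisions with binary exponential backoff, doubling the contention window after each collision and resetting it to the minimum after a success. A minimal sketch of that fixed rule follows (illustrative only, using the standard CWmin/CWmax of 15/1023 slots; the paper's contribution is to replace this doubling with a CW value chosen by a trained network):

```python
CW_MIN, CW_MAX = 15, 1023  # standard 802.11 contention-window bounds (slots)

def backoff_after(cw: int, collided: bool) -> int:
    """Binary exponential backoff: double the CW (capped at CW_MAX)
    after a collision, reset to CW_MIN after a successful transmission."""
    return min(2 * (cw + 1) - 1, CW_MAX) if collided else CW_MIN

cw = CW_MIN
history = []
for collided in [True, True, True, False]:  # three collisions, then a success
    cw = backoff_after(cw, collided)
    history.append(cw)
print(history)  # [31, 63, 127, 15]
```

Because every station follows this same rigid schedule regardless of how many contenders are present, collision rates climb with station count, which is the scalability problem the learned CW policy targets.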
Blending big data analytics : review on challenges and a recent study
- Authors: Amalina, Fairuz , Targio Hashem, Ibrahim , Azizul, Zati , Fong, Ang , Imran, Muhammad
- Date: 2020
- Type: Text , Journal article , Review
- Relation: IEEE Access Vol. 8, no. (2020), p. 3629-3645
- Full Text:
- Reviewed:
- Description: With the collection of massive amounts of data every day, big data analytics has emerged as an important trend for many organizations. These collected data can contain important information that may be key to solving wide-ranging problems, such as cyber security, marketing, healthcare, and fraud. To analyze their large volumes of data for business analyses and decisions, large companies, such as Facebook and Google, adopt analytics. Such analyses and decisions impact existing and future technology. In this paper, we explore how big data analytics is utilized as a technique for solving problems of complex and unstructured data using such technologies as Hadoop, Spark, and MapReduce. We also discuss the data challenges introduced by big data according to the literature, including its six V's. Moreover, we investigate case studies of big data analytics on various techniques of such analytics, namely, text, voice, video, and network analytics. We conclude that big data analytics can bring positive changes in many fields, such as education, military, healthcare, politics, business, agriculture, banking, and marketing, in the future. © 2013 IEEE.
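Since this abstract names MapReduce among the enabling technologies, here is a minimal word-count sketch of that programming model (pure Python and illustrative only; real jobs distribute the map and reduce phases across a Hadoop or Spark cluster):

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc: str):
    # Map: emit a (word, 1) pair for every word in one document.
    return [(word.lower(), 1) for word in doc.split()]

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key and sum the counts.
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

docs = ["big data analytics", "data analytics at scale"]
counts = reduce_phase(chain.from_iterable(map_phase(d) for d in docs))
print(counts["data"], counts["analytics"])  # 2 2
```

The appeal for unstructured data is that only these two per-record functions are user-written, while the framework handles partitioning, shuffling, and fault tolerance.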