ECG reduction for wearable sensor
- Allami, Ragheed, Stranieri, Andrew, Balasubramanian, Venki, Jelinek, Herbert
- Authors: Allami, Ragheed, Stranieri, Andrew, Balasubramanian, Venki, Jelinek, Herbert
- Date: 2016
- Type: Text , Conference proceedings
- Relation: 2016 12th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS); Naples, Italy; 28th November-1st December 2016 p. 520-525
- Full Text:
- Reviewed:
- Description: The transmission, storage and analysis of electrocardiogram (ECG) data in real time is essential for remote patient monitoring with wearable ECG devices and in mobile ECG contexts. However, this remains a challenge to achieve within the processing power and storage capacity of mobile devices. ECG reduction algorithms have an important role to play in reducing the processing requirements of mobile devices; however, many existing ECG reduction and compression algorithms are computationally expensive to execute on mobile devices and have not been designed for real-time computation and incremental data arrival. In this paper, we describe a computationally simple, yet effective, algorithm that achieves high ECG reduction rates while maintaining key diagnostic features including the PR, QRS, ST, QT and RR intervals. While reduction does not enable the ECG waveform to be reproduced, the ability to transmit key indicators (diagnostic features) using minimal computational resources is particularly useful in mobile health contexts involving power-constrained sensors and devices. Results indicate that the proposed algorithm outperforms other ECG reduction algorithms at a reduction/compression ratio (CR) of 5:1. If power or processing capacity is low, the algorithm can readily switch to a compression ratio of up to 10:1 while still maintaining an error rate below 10%.
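The paper's algorithm itself is not reproduced in this record; as an illustration only, the sketch below shows how the two figures quoted in the abstract are conventionally computed: the compression ratio (CR, e.g. 5:1 or 10:1) and a percentage error metric (here PRD, the percentage root-mean-square difference, a common ECG distortion measure; whether the authors use PRD specifically is an assumption).

```python
import math

def compression_ratio(n_original, n_reduced):
    # A CR of 5:1 means the reduced stream keeps 1 sample per 5 originals.
    return n_original / n_reduced

def prd(original, reconstructed):
    # Percentage root-mean-square difference between the original signal
    # and its reconstruction; "error rate below 10%" maps to PRD < 10.
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)
```

For example, keeping every 5th sample of a slowly varying signal and reconstructing by sample-and-hold gives a CR of exactly 5.0 and a small PRD.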
Analysis of end-to-end delay characteristics for various packets in IEC 61850 substation communications system
- Das, Narottam, Ma, Wu, Islam, Syed
- Authors: Das, Narottam, Ma, Wu, Islam, Syed
- Date: 2015
- Type: Text , Conference proceedings , Conference paper
- Relation: 25th Australasian Universities Power Engineering Conference, AUPEC 2015, Wollongong, Australia; 27th-30th September 2015 p. 1-5
- Full Text:
- Reviewed:
- Description: Substations play an important role in power system communications for the safe and reliable operation of entire power networks. Substation communication networks connect various intelligent electronic devices (IEDs), which are the lifeblood of substation systems, and system availability is determined by their real-time performance. The International Electrotechnical Commission (IEC) has developed standards based on object-oriented technologies for substation automation, and the IEC 61850 protocol has been applied widely in substation communication applications. It presents new challenges to the real-time performance simulation and testing of protective relays. In this paper, the optimized network engineering tool (OPNET), or Riverbed Modeler, simulation software has been used for the modeling of IEDs in a substation-level network. Based on the simulation results, different types of data stream are discussed: periodic data streams, random data streams and burst data streams. Typical studies using these models to construct a substation automation system (SAS) network on the OPNET (Riverbed) Modeler were made to reveal the impact of each affecting parameter or factor on the real-time performance of the substation communications system, which is also incorporated in this report.
- Description: 2015 Australasian Universities Power Engineering Conference: Challenges for Future Grids, AUPEC 2015
Real-time big data processing for anomaly detection : a survey
- Ariyaluran Habeeb, Riyaz, Nasaruddin, Fariza, Gani, Abdullah, Targio Hashem, Ibrahim, Ahmed, Ejaz, Imran, Muhammad
- Authors: Ariyaluran Habeeb, Riyaz, Nasaruddin, Fariza, Gani, Abdullah, Targio Hashem, Ibrahim, Ahmed, Ejaz, Imran, Muhammad
- Date: 2019
- Type: Text , Journal article , Review
- Relation: International Journal of Information Management Vol. 45 (2019), p. 289-307
- Full Text:
- Reviewed:
- Description: The advent of connected devices and the omnipresence of the Internet have paved the way for intruders to attack networks, leading to cyber-attacks, financial loss, information theft in healthcare, and cyber war. Hence, network security analytics has become an important area of concern and has of late gained intensive attention among researchers, specifically in the domain of network anomaly detection, which is considered crucial for network security. However, preliminary investigations have revealed that existing approaches to detecting anomalies in networks are not effective enough, particularly in real time. The inefficacy of current approaches is mainly due to the amassment of massive volumes of data through the connected devices. Therefore, it is crucial to propose a framework that effectively handles real-time big data processing and detects anomalies in networks. In this regard, this paper attempts to address the issue of detecting anomalies in real time. Accordingly, this paper surveys the state-of-the-art real-time big data processing technologies related to anomaly detection and the vital characteristics of associated machine learning algorithms. The paper begins with an explanation of the essential contexts and a taxonomy of real-time big data processing, anomaly detection, and machine learning algorithms, followed by a review of big data processing technologies. Finally, the identified research challenges of real-time big data processing in anomaly detection are discussed. © 2018 Elsevier Ltd
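The survey covers a range of techniques; as one minimal illustration of real-time (streaming) anomaly detection, and not any specific method from the paper, the sketch below flags a sample whose deviation from the running mean exceeds a z-score threshold, using Welford's online update so each arriving record is scored in O(1) time and memory without storing the stream.

```python
class StreamingZScore:
    """Illustrative online anomaly detector: flag a sample as anomalous
    when it lies more than `threshold` running standard deviations from
    the running mean (Welford's incremental mean/variance update)."""

    def __init__(self, threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean
        self.threshold = threshold

    def update(self, x):
        # Score the sample against the statistics seen so far,
        # then fold it into the running mean and variance.
        anomalous = False
        if self.n >= 2:
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) > self.threshold * std
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous
```

After warming up on values near 10, a sudden reading of 100 is flagged immediately, which is the behaviour a real-time network anomaly detector needs on each arriving record.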