Multi-source cyber-attacks detection using machine learning
- Authors: Taheri, Sona, Gondal, Iqbal, Bagirov, Adil, Harkness, Greg, Brown, Simon, Chi, Chihung
- Date: 2019
- Type: Text, Conference proceedings, Conference paper
- Relation: 2019 IEEE International Conference on Industrial Technology, ICIT 2019; Melbourne, Australia; 13th-15th February 2019 Vol. 2019-February, p. 1167-1172
- Full Text:
- Reviewed:
- Description: The Internet of Things (IoT) has significantly increased the number of devices connected to the Internet, ranging from simple sensors to devices producing multi-source data. As the IoT continues to evolve with new technologies, the number of threats and attacks against IoT devices is on the rise. Analyzing and detecting these attacks, which originate from different sources, requires machine learning models that provide proactive solutions for detecting attacks and their sources. In this paper, we propose to apply a supervised machine learning classification technique to identify cyber-attacks from each source. More precisely, we apply an incremental piecewise linear classifier that constructs the boundary between sources/classes incrementally, starting with one hyperplane and adding further hyperplanes at each iteration. The algorithm terminates when no further significant improvement in the separation of sources/classes is possible. The construction and use of piecewise linear boundaries helps us avoid overfitting. We apply the incremental piecewise linear classifier to a multi-source real-world cyber security data set to identify cyber-attacks and their sources.
- Description: Proceedings of the IEEE International Conference on Industrial Technology
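The incremental hyperplane-adding scheme described in this abstract can be illustrated with a toy sketch. This is not the authors' algorithm: it approximates a piecewise linear boundary as a union of half-spaces on 2-D data, fits each new hyperplane with a basic perceptron on the currently misclassified points, and stops when accuracy no longer improves by a chosen tolerance. All names, the perceptron sub-step and the tolerance are illustrative assumptions.

```python
def perceptron(points, labels, epochs=50, lr=0.1):
    """Fit a single hyperplane w.x + b with the basic perceptron rule."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
            if pred != y:
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

def predict(planes, x1, x2):
    """Piecewise linear boundary as a union of half-spaces:
    a point is in the target class if any hyperplane fires."""
    return 1 if any(w[0] * x1 + w[1] * x2 + b > 0 for w, b in planes) else -1

def fit_incremental(points, labels, tol=0.01, max_planes=5):
    """Add hyperplanes one at a time; stop when the separation no
    longer improves significantly (mirroring the termination rule
    in the abstract, here on toy data)."""
    planes, best_acc = [], 0.0
    for _ in range(max_planes):
        # train the next hyperplane on what the current boundary gets wrong
        wrong = [(p, y) for p, y in zip(points, labels)
                 if predict(planes, *p) != y] or list(zip(points, labels))
        w, b = perceptron([p for p, _ in wrong], [y for _, y in wrong])
        planes.append((w, b))
        acc = sum(predict(planes, *p) == y
                  for p, y in zip(points, labels)) / len(points)
        if acc - best_acc < tol:   # no significant improvement: stop
            planes.pop()
            break
        best_acc = acc
    return planes, best_acc

# two toy clusters standing in for two attack sources
pts = [(2, 2), (3, 2), (2, 3), (-2, -2), (-3, -2), (-2, -3)]
lbl = [1, 1, 1, -1, -1, -1]
planes, acc = fit_incremental(pts, lbl)
```

On this linearly separable toy set a single hyperplane already separates the classes, so the second candidate hyperplane brings no improvement and the loop terminates, which is exactly the stopping behaviour the abstract describes.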
Secure passive keyless entry and start system using machine learning
- Authors: Ahmad, Usman, Song, Hong, Bilal, Awais, Alazab, Mamoun, Jolfaei, Alireza
- Date: 2018
- Type: Text, Conference proceedings
- Relation: 11th International Conference on Security, Privacy and Anonymity in Computation, Communication, and Storage, SpaCCS 2018; Melbourne, Australia; 11th-13th December 2018; published in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) Vol. 11342 LNCS, p. 304-313
- Full Text: false
- Reviewed:
- Description: Despite the benefits of the passive keyless entry and start (PKES) system in improving locking and starting capabilities, it is vulnerable to relay attacks even when the communication is protected with strong cryptographic techniques. In this paper, we propose a data-intensive solution based on machine learning to mitigate relay attacks on PKES systems. The main contribution of the paper, beyond the novelty of applying machine learning to this problem, lies in (1) the use of a set of security features that accurately profiles the PKES system, (2) identifying abnormalities in the PKES system's regular behavior, and (3) proposing a countermeasure that guarantees a desired probability of detection at a fixed false alarm rate by trading off training time against accuracy. We evaluated our method on the last three months' logs of a PKES system using Decision Tree, SVM, KNN and ANN classifiers, and provide a comparative analysis of the relay attack detection results. Our proposed framework combines the accuracy of supervised learning on known classes with the adaptability of the k-fold cross-validation technique for identifying malicious and suspicious activities. Our test results confirm the effectiveness of the proposed solution in distinguishing relayed messages from legitimate transactions.
- Description: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
A critical review of intrusion detection systems in the internet of things: techniques, deployment strategy, validation strategy, attacks, public datasets and challenges
- Authors: Khraisat, Ansam, Alazab, Ammar
- Date: 2021
- Type: Text, Journal article
- Relation: Cybersecurity Vol. 4, no. 1 (2021), p.
- Full Text:
- Reviewed:
- Description: The Internet of Things (IoT) has been rapidly evolving and is making an ever greater impact on everything from everyday life to large industrial systems. Unfortunately, this has attracted the attention of cybercriminals, who have made the IoT a target of malicious activities, opening the door to attacks on end nodes. To tackle attacks on the IoT ecosystem, numerous IoT intrusion detection systems (IDS) have been proposed in the literature, which can be broadly classified by detection technique, validation strategy, and deployment strategy. This survey presents a comprehensive review of contemporary IoT IDS and an overview of the techniques, deployment strategies, validation strategies and datasets commonly used for building IDS. We also review how existing IoT IDS detect intrusive attacks and secure communications on the IoT, present a classification of IoT attacks, and discuss future research challenges in countering such attacks to make the IoT more secure. These contributions help IoT security researchers by uniting, contrasting, and compiling scattered research efforts. Consequently, we provide a unique IoT IDS taxonomy that sheds light on IoT IDS techniques and their advantages and disadvantages, on the IoT attacks that exploit IoT communication systems, and on the corresponding advanced IDS and their capabilities to detect IoT attacks. © 2021, The Author(s).
An efficient selective miner consensus protocol in blockchain-oriented IoT smart monitoring
- Authors: Uddin, Ashraf, Stranieri, Andrew, Gondal, Iqbal, Balasubramanian, Venki
- Date: 2019
- Type: Text, Conference proceedings, Conference paper
- Relation: 2019 IEEE International Conference on Industrial Technology, ICIT 2019; Melbourne, Australia; 13th-15th February 2019 Vol. 2019-February, p. 1135-1142
- Full Text:
- Reviewed:
- Description: Blockchains have been widely used in Internet of Things (IoT) applications, including smart cities, smart homes and smart governance, to provide high levels of security and privacy. In this article, we advance a Blockchain-based decentralized architecture for the storage of IoT data produced in smart homes/cities. The architecture includes a secure communication protocol, based on a sign-encryption technique, between power-constrained IoT devices and a Gateway; the sign-encryption also preserves privacy. We propose that a Software Agent executing on the Gateway selects a Miner node using the performance parameters of Miners. Simulations demonstrate that the recommended Miner selection outperforms the Proof of Work selection used in Bitcoin and random Miner selection.
- Description: Proceedings of the IEEE International Conference on Industrial Technology
A novel countermeasure technique for reactive jamming attack in internet of things
- Authors: Fadele, Alaba, Othman, Mazliza, Hashem, Ibrahim, Yaqoob, Ibrar, Imran, Muhammad, Shoaib, Muhammad
- Date: 2019
- Type: Text, Journal article
- Relation: Multimedia Tools and Applications Vol. 78, no. 21 (2019), p. 29899-29920
- Full Text: false
- Reviewed:
- Description: In recent years, the Internet of Things (IoT) has attracted significant attention because of its wide range of applications in various domains. However, security is a growing concern, as users of small devices in an IoT network are unable to defend themselves against reactive jamming attacks. These attacks negatively affect the performance of devices and hinder IoT operations. To address this issue, this paper presents a novel countermeasure detection and consistency algorithm (CDCA), which aims to counter reactive jamming attacks on IoT networks. The proposed CDCA uses a change in a threshold value to detect and treat an attack. The algorithm employs the channel signal strength to check packet consistency by determining whether the data transmission value contradicts the threshold value. The node that sends the threshold value is periodically checked, and the threshold value is compared with the current value after data transmission to determine whether an attack has occurred in the network. Based on realistic simulation scenarios (e.g., varying traffic intervals, numbers of malicious nodes, and random mobility patterns), the performance of the proposed CDCA is evaluated using the Cooja simulator. Simulation results demonstrate the superiority of the proposed technique over contemporary schemes in terms of performance metrics such as energy consumption, traffic delay, and network throughput. © 2018, Springer Science+Business Media, LLC, part of Springer Nature.
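The consistency check this abstract describes, comparing an observed channel value against an agreed threshold after transmission, might look like the following sketch. The function name, the relative-deviation rule and the tolerance are hypothetical; the paper's actual CDCA logic is more involved.

```python
def cdca_check(observed_strength, threshold, tolerance=0.15):
    """Flag a possible reactive jamming attack when the observed channel
    signal strength contradicts the agreed threshold value by more than
    a relative tolerance (illustrative rule, not the authors' code)."""
    deviation = abs(observed_strength - threshold) / abs(threshold)
    return deviation > tolerance
```

For example, with an agreed threshold of -72 dBm, an observed -70 dBm is consistent, while a jammed channel reading of -40 dBm contradicts the threshold and raises the flag.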
Monitoring body motions related to Huntington disease by exploiting the 5G paradigm
- Authors: Haider, Daniyal, Romain, Olivier, Le Kernec, Julien, Shah, Syed Yaseen, Farooq, Malik Muhammad Umer, Qadus, Zunaira
- Date: 2019
- Type: Text, Conference proceedings
- Relation: 2019 UK/China Emerging Technologies (UCET); Glasgow, UK; 21-22 August 2019, p. 1-4
- Full Text: false
- Reviewed:
- Description: Modern wireless technology exploiting the full potential of the 5G IoT is the future of the healthcare sector. In healthcare, 5G technology will maximize performance and reduce the chance of harm to the patient by enabling careful and advanced activity monitoring. We propose the idea of monitoring different body postures in Huntington disease by exploiting low-cost wireless devices operating at 4.8 GHz. The system captures the wireless channel information for three body motions, and classification of these motions was performed using a support vector machine (SVM). The SVM used 10 time-domain features for classification with three main kernel functions: linear, polynomial and radial basis function (RBF). The system minimizes external noise by using microwave-absorbing materials, which increases the performance and feasibility of sensing body motions.
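The three SVM kernels this abstract names (linear, polynomial and radial basis function) have standard closed forms that can be written directly. The parameter defaults below (degree, coef0, gamma) are illustrative, not the values used in the paper.

```python
import numpy as np

def linear_kernel(x, z):
    """k(x, z) = x . z"""
    return float(x @ z)

def polynomial_kernel(x, z, degree=3, coef0=1.0):
    """k(x, z) = (x . z + coef0) ** degree"""
    return float((x @ z + coef0) ** degree)

def rbf_kernel(x, z, gamma=0.5):
    """k(x, z) = exp(-gamma * ||x - z||^2)"""
    return float(np.exp(-gamma * np.sum((x - z) ** 2)))

x = np.array([1.0, 0.0])
z = np.array([0.0, 1.0])
```

An SVM replaces every inner product between feature vectors with one of these kernel evaluations, which is what lets the same 10 time-domain features induce linear or non-linear decision boundaries.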
Big data analytics for manufacturing internet of things: opportunities, challenges and enabling technologies
- Authors: Dai, Hong-Ning, Wang, Hao, Xu, Guangquan, Wan, Jiafu, Imran, Muhammad
- Date: 2020
- Type: Text, Journal article
- Relation: Enterprise Information Systems Vol. 14, no. 9-10 (2020), p. 1279-1303
- Full Text: false
- Reviewed:
- Description: Analytics on massive manufacturing data can extract huge business value, but it also raises research challenges due to the heterogeneous data types, enormous volume and real-time velocity of manufacturing data. This paper provides an overview of big data analytics in the manufacturing Internet of Things (MIoT). It starts with a discussion of the necessity and challenges of big data analytics for the manufacturing data of MIoT. Then, the enabling technologies for big data analytics of manufacturing data are surveyed and discussed. Finally, the paper outlines future directions in this promising area. © 2019 Informa UK Limited, trading as Taylor & Francis Group.
A lightweight cyber security framework with context-awareness for pervasive computing environments
- Authors: Al-Muhtadi, Jalal, Saleem, Kashif, Al-Rabiaah, Sumaya, Imran, Muhammad, Gawanmeh, Amjad, Rodrigues, Joel
- Date: 2021
- Type: Text, Journal article
- Relation: Sustainable Cities and Society Vol. 66, no. (2021), p.
- Full Text: false
- Reviewed:
- Description: The Internet of Things (IoT) plays a key role in enabling smart sustainable cities. Pervasive computing over the IoT platform makes life more convenient by embedding sensors based on context-aware computing devices in the physical environment for the ubiquitous availability of computing resources. The sensors gather contextual information from the physical world and transmit it to receivers as required or when the environment changes, for example in temperature or humidity. However, the combination of dynamic operation and the need to handle sensitive and private data makes the pervasive computing environment and IoT devices vulnerable to numerous attacks. Smart environments require a maximum level of safety assurance, such as trusted context producers and consumers, which should protect sensitive information from exposure or monitoring. This paper discusses the major cyber threats in smart environments and proposes a novel lightweight security framework that authenticates and maintains the context providers and receivers. The cloud environment is adopted for user authentication at the user layer to implement access control and role assignment. Finally, the proposed security framework is implemented on the IBM cloud platform with six devices to evaluate its efficiency, sustainability, and secure communication. © 2020 Elsevier Ltd
Security and blockchain convergence with internet of multimedia things: current trends, research challenges and future directions
- Authors: Jan, Mian, Cai, Jinjin, Gao, Xiang-Chuan, Khan, Fazlullah, Mastorakis, Spyridon, Usman, Muhammad, Alazab, Mamoun, Watters, Paul
- Date: 2021
- Type: Text, Journal article
- Relation: Journal of Network and Computer Applications Vol. 175, no. (2021), p.
- Full Text:
- Reviewed:
- Description: The Internet of Multimedia Things (IoMT) orchestration enables the integration of systems, software, cloud, and smart sensors into a single platform. The IoMT deals with scalar as well as multimedia data, and in these networks sensor-embedded devices and their data face numerous security challenges. In this paper, a comprehensive review of the existing literature on the IoMT is presented in the context of security and blockchain. The latest literature on all three aspects of security, i.e., authentication, privacy, and trust, is surveyed to explore the challenges faced by multimedia data. The convergence of blockchain and the IoMT, along with multimedia-enabled blockchain platforms, is discussed for emerging applications. To highlight the significance of this survey, large-scale commercial projects focused on security and blockchain for multimedia applications are reviewed; their shortcomings are explored and suggestions for further improvement are provided. Based on this discussion, we present our own case study for the healthcare industry: a theoretical framework with security and blockchain as key enablers. The case study reflects the importance of security and blockchain in multimedia applications of the healthcare sector. Finally, we discuss the convergence of emerging technologies with security, blockchain and the IoMT to envision tomorrow's applications. © 2020 Elsevier Ltd
Unmanned aerial vehicle for internet of everything: opportunities and challenges
- Authors: Liu, Yalin, Dai, Hong-Ning, Wang, Qubeijian, Shukla, Mahendra, Imran, Muhammad
- Date: 2020
- Type: Text, Journal article, Review
- Relation: Computer Communications Vol. 155, no. (2020), p. 66-83
- Full Text:
- Reviewed:
- Description: Recent advances in information and communication technology (ICT) have extended the Internet of Things (IoT) from the sole “things” aspect to the omnipotent role of the “intelligent connection of things”, and the concept of the internet of everything (IoE) has been presented as such an omnipotent extension of the IoT. However, realizing the IoE meets critical challenges, including the restricted network coverage and the limited resources of existing network technologies. Recently, Unmanned Aerial Vehicles (UAVs) have attracted significant attention owing to their high mobility, low cost, and flexible deployment, and may therefore help overcome these challenges. This article presents a comprehensive survey on the opportunities and challenges of UAV-enabled IoE. We first present three critical expectations of the IoE: (1) scalability, requiring a scalable network architecture with ubiquitous coverage; (2) intelligence, requiring a global computing plane enabling intelligent things; and (3) diversity, requiring the provision of diverse applications. Thereafter, we review the enabling technologies for achieving these expectations and discuss four intrinsic constraints of the IoE (i.e., coverage, battery, and computing constraints, and security issues). We then present an overview of UAVs and discuss the opportunities they bring to the IoE. Additionally, we introduce a UAV-enabled IoE (Ue-IoE) solution exploiting UAV mobility, and show that Ue-IoE can greatly enhance the scalability, intelligence and diversity of the IoE. Finally, we outline future directions in Ue-IoE. © 2020 Elsevier B.V.
Adversarial training for deep learning-based cyberattack detection in IoT-based smart city applications
- Authors: Rashid, Md Mamunur, Kamruzzaman, Joarder, Mehedi Hassan, Mohammad, Imam, Tasadduq, Wibowo, Santoso, Gordon, Steven, Fortino, Giancarlo
- Date: 2022
- Type: Text, Journal article
- Relation: Computers and Security Vol. 120, no. (2022), p.
- Full Text: false
- Reviewed:
- Description: Intrusion Detection Systems (IDS) based on deep learning models can identify and mitigate cyberattacks in IoT applications in a resilient and systematic manner. These models, which support the IDS's decisions, can themselves be vulnerable to a class of cyberattacks known as adversarial attacks. In this type of attack, attackers create adversarial samples by introducing small perturbations into attack samples to trick a trained model into misclassifying them as benign. Such attacks can cause substantial damage to IoT-based smart city systems in terms of device malfunction, data leakage, operational outage and financial loss. To our knowledge, the impact of, and defences against, adversarial attacks on IDS models for smart city applications have not been investigated before. To address this research gap, we explore the effect of adversarial attacks on deep learning and shallow machine learning models using a recent IoT dataset, and propose a method based on adversarial retraining that significantly improves IDS performance under adversarial attack. Simulation results demonstrate that the presence of adversarial samples degrades detection accuracy by more than 70%, while our proposed model delivers detection accuracy above 99% against all types of attacks, including adversarial ones. This makes the IDS robust in protecting IoT-based smart city services. © 2022 Elsevier Ltd
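The adversarial-retraining idea, crafting small perturbations that flip a detector's decision and then folding those perturbed samples back into the training set, can be sketched on a toy linear model. This is a minimal illustration under stated assumptions (a hand-rolled logistic regression and an FGSM-style perturbation on synthetic two-feature data), not the paper's deep-learning pipeline or dataset.

```python
import numpy as np

def train_logreg(X, y, lr=0.5, steps=300):
    """Plain logistic regression fitted by batch gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

def fgsm(X, y, w, b, eps):
    """FGSM-style perturbation for a linear model: step each sample
    in the sign of the gradient of its own log loss w.r.t. the input."""
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = (p - y)[:, None] * w[None, :]
    return X + eps * np.sign(grad)

# toy data: an "attack" cluster (label 1) and a "benign" cluster (label 0)
X = np.array([[3.0, 3.0], [3.2, 2.8], [2.8, 3.2],
              [0.0, 0.0], [0.2, -0.1], [-0.1, 0.2]])
y = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])

w, b = train_logreg(X, y)
X_adv = fgsm(X[:3], np.ones(3), w, b, eps=2.5)    # perturbed attack samples
p_adv = 1.0 / (1.0 + np.exp(-(X_adv @ w + b)))    # fooled: now look benign

# adversarial retraining: add the perturbed samples with their true label
X_aug = np.vstack([X, X_adv])
y_aug = np.concatenate([y, np.ones(3)])
w2, b2 = train_logreg(X_aug, y_aug)
p_def = 1.0 / (1.0 + np.exp(-(X_adv @ w2 + b2)))  # detected again
```

The perturbed samples fall on the benign side of the original decision boundary (`p_adv` below 0.5), while the retrained model scores the same samples as attacks again (`p_def` above 0.5), which is the qualitative effect the abstract reports at a much larger scale.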
Current status of and future opportunities for digital agriculture in Australia
- Authors: Hansen, Birgita, Leonard, E., Mitchell, M. C., Easton, J., Shariati, N., Mortlock, M. Y., Schaefer, M., Lamb, D. W.
- Date: 2022
- Type: Text, Journal article
- Relation: Crop and Pasture Science Vol. 74, no. 6 (2022), p. 524-537
- Full Text:
- Reviewed:
- Description: In Australia, digital agriculture is considered immature and its adoption ad hoc, despite a relatively advanced technology innovation sector. In this review, we focus on the technical, governance and social factors of digital adoption that have created a disconnect between technology development and the end-user community (farmers and their advisors). Using examples that reflect both successes and barriers in Australian agriculture, we first explore the current enabling technologies and processes, and then highlight some of the key socio-technical factors that explain why digital agriculture is immature and ad hoc. Pronounced issues include fragmentation of the innovation system (and of digital tools) and a lack of enabling legislation and policy to support technology deployment. To overcome such issues and increase adoption, clear value propositions for change are necessary. These value propositions are influenced by the perceptions and aspirations of individuals, the delivery of digitally enabled processes and the supporting legislative, policy and educational structures, the conversion of data generated through technology applications into knowledge that supports decision making, and the suitability of the technology itself. Agronomists and early-adopter farmers will play a significant role in closing the gap between technology and end users, and will need support and training from technology service providers, government bodies and peer networks. Ultimately, practice change will only be achieved through mutual understanding, ownership and trust, which will develop when farmers and their advisors are an integral part of the entire digital innovation system.
Deep learning: survey of environmental and camera impacts on internet of things images
- Authors: Kaur, Roopdeep, Karmakar, Gour, Xia, Feng, Imran, Muhammad
- Date: 2023
- Type: Text, Journal article
- Relation: Artificial Intelligence Review Vol. 56, no. 9 (2023), p. 9605-9638
- Full Text:
- Reviewed:
- Description: Internet of Things (IoT) images are attracting growing attention because of their wide range of applications, which require visual analysis to drive automation. However, IoT images are predominantly captured in outdoor environments and are thus inherently affected by camera and environmental parameters, which can adversely impact the corresponding applications. Deep Learning (DL) has been widely adopted in image processing and computer vision and can reduce the impact of these parameters on IoT images. Although many DL-based techniques are available in the current literature for analyzing and reducing environmental and camera impacts on IoT images, to the best of our knowledge no survey paper presents state-of-the-art DL-based approaches for this purpose. Motivated by this, for the first time, we present a Systematic Literature Review (SLR) of the DL techniques available for analyzing and reducing environmental and camera lens impacts on IoT images. As part of this SLR, we first reiterate and highlight the significance of IoT images in their respective applications. Second, we describe the DL techniques employed for assessing the impacts of environmental and camera lens distortion on IoT images. Third, we illustrate how DL can be effective in reducing these impacts. Finally, along with a critical reflection on the advantages and limitations of the techniques, we present ways to address their research challenges and identify further research directions to advance the relevant research areas. © 2023, The Author(s).
Fog computing: Survey of trends, architectures, requirements, and research directions
- Authors: Naha, Ranesh, Garg, Saurabh, Georgakopoulos, Dimitrios, Jayaraman, Prem, Gao, Longxiang, Xiang, Yong, Ranjan, Rajiv
- Date: 2018
- Type: Text, Journal article
- Relation: IEEE Access Vol. 6, no. (2018), p. 47980-48009
- Full Text: false
- Reviewed:
- Description: Emerging technologies such as the Internet of Things (IoT) require latency-aware computation for real-time application processing. In IoT environments, connected things generate a huge amount of data, generally referred to as big data. Data generated from IoT devices are usually processed in a cloud infrastructure because of the on-demand services and scalability features of the cloud computing paradigm. However, processing IoT application requests exclusively on the cloud is not an efficient solution for some IoT applications, especially time-sensitive ones. To address this issue, Fog computing, which resides between the cloud and IoT devices, was proposed. In general, in a Fog computing environment, IoT devices are connected to Fog devices, which are located in close proximity to users and are responsible for intermediate computation and storage. Among the key challenges in running IoT applications in a Fog computing environment are resource allocation and task scheduling. Fog computing research is still in its infancy, and a taxonomy-based investigation into the requirements of Fog infrastructure, platforms, and applications, mapped to current research, is still required. This survey will help the industry and research community synthesize and identify the requirements for Fog computing. The paper starts with an overview of Fog computing in which the definition of Fog computing, research trends, and the technical differences between Fog and cloud are reviewed. Then, we investigate numerous proposed Fog computing architectures and describe their components in detail; from this, the role of each component is defined, which will help in the deployment of Fog computing. Next, a taxonomy of Fog computing is proposed by considering the requirements of the Fog computing paradigm. We also discuss existing research works and gaps in resource allocation and scheduling, fault tolerance, simulation tools, and Fog-based microservices. Finally, by addressing the limitations of current research works, we present some open issues, which will determine the future research direction for the Fog computing paradigm.
A smart healthcare framework for detection and monitoring of COVID-19 using IoT and cloud computing
- Authors: Nasser, Nidal, Emad-ul-Haq, Qazi, Imran, Muhammad, Ali, Asmaa, Razzak, Imran, Al-Helali, Abdulaziz
- Date: 2023
- Type: Text, Journal article
- Relation: Neural Computing and Applications Vol. 35, no. 19 (2023), p. 13775-13789
- Full Text:
- Reviewed:
- Description: Coronavirus (COVID-19) is a highly contagious infection that has drawn the world's attention. Modeling such diseases can be extremely valuable in predicting their effects. Although classic statistical modeling may provide adequate models, it may fail to capture the intricacy of the data. An automatic COVID-19 detection system based on computed tomography (CT) scans or X-ray images is effective, but a robust system design is challenging. In this study, we propose an intelligent healthcare system that integrates IoT and cloud technologies. This architecture uses smart connectivity sensors and deep learning (DL) for intelligent decision-making from the perspective of the smart city. The intelligent system tracks the status of patients in real time and delivers reliable, timely, and high-quality healthcare facilities at low cost. COVID-19 detection experiments are performed using DL to test the viability of the proposed system. We use sensors for recording, transferring, and tracking healthcare data. CT scan images from patients are sent by IoT sensors to the cloud, where the cognitive module is stored. The system determines the patient's status by examining the CT scan images, and the DL cognitive module makes a real-time decision on the possible course of action. When information is conveyed to the cognitive module, we use a state-of-the-art DL classification algorithm, ResNet50, to detect and classify whether patients are normal or infected with COVID-19. We validate the robustness and effectiveness of the proposed system using two publicly available benchmark datasets (the Covid-Chestxray and Chex-Pert datasets). First, a dataset of 6000 images is prepared from these two datasets. The proposed system was trained on 80% of the images and tested on the remaining 20%, with performance evaluated using tenfold cross-validation. The results indicate that the proposed system achieves an accuracy of 98.6%, a sensitivity of 97.3%, a specificity of 98.2%, and an F1-score of 97.87%. The comparison shows that the proposed system performs better than existing state-of-the-art systems. The proposed system will be helpful in medical diagnosis research and healthcare systems, and will also support medical experts in COVID-19 screening by offering a valuable second opinion. © 2021, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
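The tenfold cross-validation protocol mentioned in this abstract is generic and easy to sketch independently of the paper's ResNet50 pipeline. The splitter below yields k train/test index partitions over n samples; the contiguous, unshuffled folds are purely for illustration.

```python
def k_fold_indices(n, k=10):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation:
    the n samples are split into k contiguous folds of near-equal size,
    each fold serving exactly once as the held-out test set."""
    sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, test
        start += size

# e.g. 5 folds over 10 samples
folds = list(k_fold_indices(10, k=5))
```

Each sample appears in exactly one test fold and in the training set of every other fold, which is what makes the averaged fold scores an estimate of generalization performance.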
Maintenance and asset management practices of industrial assets: importance of tribological practices and digital tools
- Authors: Pai, Raghuvir, Chattopadhyay, Gopinath, Karmakar, Gour
- Date: 2023
- Type: Text, Journal article
- Relation: International Journal of Process Management and Benchmarking Vol. 13, no. 2 (2023), p. 233-256
- Full Text: false
- Reviewed:
- Description: There are a large number of rotating and sliding parts in industrial assets. Tribological behaviour plays a significant role in influencing friction and wear and, in turn, the life of these parts. Maintenance professionals face issues and challenges in understanding the tribological aspects and behaviour of machine components so that informed decisions can be taken to improve performance and productivity. An understanding of tribology helps in developing and applying the tools and techniques necessary for better maintenance. In recent years, remote performance monitoring (RPM), the internet of things (IoT), machine learning, artificial intelligence and data analytics have made significant contributions to maintenance and asset management. This paper reviews the tribological aspects related to maintenance, reliability and asset management. The findings of this study will be useful to engineers and managers in understanding and appreciating the relationship between tribology, maintenance, reliability and availability for better asset management. Copyright © 2023 Inderscience Enterprises Ltd.
A micro-level compensation-based cost model for resource allocation in a fog environment
- Authors: Battula, Sudheer, Garg, Saurabh, Naha, Ranesh, Thulasiraman, Parimala, Thulasiram, Ruppa
- Date: 2019
- Type: Text, Journal article
- Relation: Sensors Vol. 19, no. 13 (2019), p. 2954
- Full Text:
- Reviewed:
- Description: Fog computing aims to support applications requiring low latency and high scalability by using resources at the edge. In general, fog computing comprises several autonomous mobile or static devices that share their idle resources to run different services, and the providers of these devices need to be compensated for their device usage. In any fog-based resource-allocation problem, both cost and performance need to be considered to generate an efficient resource-allocation plan, and estimating the cost of using fog devices prior to allocation helps to minimize cost and maximize system performance. Recent research in the fog computing domain has proposed various resource-allocation algorithms without considering compensation to resource providers or cost estimation for fog resources. Moreover, existing cost models in similar paradigms, such as the cloud, are not suitable for fog environments, where the scaling of different autonomous resources, with their heterogeneity and variety of offerings, is much more complicated. To fill this gap, this study first proposes a micro-level compensation cost model and then a new resource-allocation method based on that model, which benefits both providers and users. Experimental results show that the proposed algorithm achieves better resource-allocation performance and lower application processing costs than the existing best-fit algorithm.
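A cost-aware selection step of the kind this abstract argues for, estimating provider compensation before allocating, can be illustrated with a deliberately simple sketch. The device table, the per-CPU-unit rates and the selection rule are hypothetical, not the paper's micro-level compensation model.

```python
def pick_device(devices, cpu_needed):
    """Return the cheapest fog device able to serve the request.
    `devices` maps name -> (free_cpu_units, rate_per_cpu_unit);
    the estimated compensation is rate * units requested."""
    feasible = [(rate * cpu_needed, name)
                for name, (free_cpu, rate) in devices.items()
                if free_cpu >= cpu_needed]
    if not feasible:
        return None, 0.0          # no device has enough capacity
    cost, name = min(feasible)
    return name, cost

# hypothetical provider table: capacity and compensation rate per device
devices = {"edge-a": (4, 0.5), "edge-b": (2, 0.2), "edge-c": (1, 0.1)}
```

Unlike a plain best-fit rule, which ranks feasible devices by how tightly their free capacity matches the request, this rule ranks them by the compensation the provider would be owed, which is the cost-before-allocation idea the abstract describes.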