Artificial Intelligence Applications for Industry 4.0: A Literature-Based Study

Author(s):  
Mohd Javaid ◽  
Abid Haleem ◽  
Ravi Pratap Singh ◽  
Rajiv Suman

Artificial intelligence (AI) contributes to recent developments in Industry 4.0. Industries are focusing on improving product consistency and productivity while reducing operating costs, and they want to achieve this through a collaborative partnership between robots and people. In smart industries, hyperconnected manufacturing processes depend on different machines that interact through AI-driven automation systems, which capture and interpret all types of data. Smart automation platforms can play a decisive role in transforming modern production. AI provides appropriate information to support decision-making and alerts people to possible malfunctions. Industries will use AI to process data transmitted from Internet of Things (IoT) devices and connected machines, based on their desire to integrate them into their equipment. It gives companies the ability to track their entire end-to-end activities and processes fully. This literature-review-based paper aims to outline the vital role of AI in successfully implementing Industry 4.0. Accordingly, the research objectives are crafted to serve researchers, practitioners, students and industry professionals. First, the paper discusses the significant technological features and traits of AI that are critical for Industry 4.0. Second, it identifies the significant advancements in, and the various challenges to, implementing AI for Industry 4.0. Finally, it identifies and discusses significant applications of AI for Industry 4.0. Through an extensive review-based exploration, we see that the advantages of AI are widespread, as is the need for stakeholders to understand what kind of automation platform they require in the new manufacturing order. Furthermore, this technology seeks correlations to avoid errors and, eventually, to anticipate them. Thus, AI technology is gradually accomplishing the various goals of Industry 4.0.

Healthcare ◽  
2021 ◽  
Vol 9 (7) ◽  
pp. 834
Author(s):  
Magbool Alelyani ◽  
Sultan Alamri ◽  
Mohammed S. Alqahtani ◽  
Alamin Musa ◽  
Hajar Almater ◽  
...  

Artificial intelligence (AI) is a broad, umbrella term that encompasses the theory and development of computer systems able to perform tasks normally requiring human intelligence. The aim of this study is to assess the attitude of the radiology community in Saudi Arabia toward the applications of AI. Methods: Data for this study were collected using electronic questionnaires in 2019 and 2020. The study included a total of 714 participants. Data analysis was performed using SPSS Statistics (version 25). Results: The majority of the participants (61.2%) had read or heard about the role of AI in radiology. We also found that radiologists gave statistically different responses and tended to read more about AI compared to all other specialists. In addition, 82% of the participants thought that AI must be included in the curriculum of medical and allied health colleges, and 86% of the participants agreed that AI would be essential in the future. Even though human–machine interaction was considered to be one of the most important skills of the future, 89% of the participants thought that AI would never replace radiologists. Conclusion: Because AI plays a vital role in radiology, it is important to ensure that radiologists and radiographers have at least a minimum understanding of the technology. Our findings show an acceptable level of knowledge regarding AI technology and suggest that AI applications should be included in the curricula of medical and health sciences colleges.


The Internet of Things (IoT) plays a vital role in the development of several sectors, offering many opportunities to grow the economy and improve living standards by connecting billions of "things"; this creates business opportunities in different sectors but also raises many technical and application challenges. This paper emphasizes the role of dynamic bandwidth allocation and protocol standards in various IoT sectors such as healthcare, education, agriculture, industry, transportation and smart cities, and focuses on the challenge of providing uninterrupted bandwidth to all IoT devices with the existing infrastructure, which depends on standardized protocols and network devices to establish connections with heterogeneous IoT devices. The paper covers enhanced dynamic bandwidth techniques, protocol standards and policies in IoT network technologies to improve quality of service (QoS) for IoT devices.
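The allocation problem the abstract describes can be illustrated with a minimal sketch: weighted proportional sharing of a fixed link capacity among heterogeneous IoT devices, with surplus bandwidth redistributed once a device's demand is met. The device names, weights, and capacities below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of weighted dynamic bandwidth allocation for IoT devices.
# Device names, demands (Mbps), and weights are hypothetical examples.

def allocate_bandwidth(demands, weights, capacity):
    """Share `capacity` among devices in proportion to their weights,
    capping each device at its demand and redistributing any surplus."""
    alloc = {d: 0.0 for d in demands}
    active = set(demands)
    remaining = capacity
    while active and remaining > 1e-9:
        total_w = sum(weights[d] for d in active)
        granted = {d: remaining * weights[d] / total_w for d in active}
        satisfied = set()
        for d in list(active):
            give = min(granted[d], demands[d] - alloc[d])
            alloc[d] += give
            if alloc[d] >= demands[d] - 1e-9:
                satisfied.add(d)  # demand met; free its share for others
        remaining = capacity - sum(alloc.values())
        if not satisfied:
            break  # everyone is still hungry; shares are final
        active -= satisfied
    return alloc

demands = {"camera": 4.0, "sensor": 0.5, "gateway": 3.0}
weights = {"camera": 3, "sensor": 1, "gateway": 2}
print(allocate_bandwidth(demands, weights, 6.0))
```

The low-demand sensor is satisfied in the first round, and its leftover share is re-split between the camera and gateway in proportion to their weights; real dynamic bandwidth allocation schemes add admission control and per-class QoS policies on top of this idea.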


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Saquib Rouf ◽  
Ankush Raina ◽  
Mir Irfan Ul Haq ◽  
Nida Naveed

Purpose The involvement of wear, friction and lubrication in engineering systems and industrial applications makes it imperative to study the various aspects of tribology in relation to advanced technologies and concepts. The implementation of Industry 4.0 faces many barriers, particularly in developing economies. Reliable, real-time data is an important enabler for implementing Industry 4.0, so the availability of such data about various tribological systems is crucial to applying its concepts. This paper aims to highlight the role of sensors related to friction, wear and lubrication in implementing Industry 4.0 in various tribology-related industries and equipment. Design/methodology/approach A thorough literature review has been carried out to study the interrelationships between the availability of tribology-related data and the implementation of Industry 4.0. Relevant and recent research papers from prominent databases have been included. A detailed overview of the various types of sensors used in generating tribological data is also presented. Some studies related to the application of machine learning and artificial intelligence (AI) are included, along with a discussion on fault diagnosis and cyber-physical systems in connection with tribology. Findings Industry 4.0 and tribology are interconnected through various means, and the pillars of Industry 4.0, such as big data and AI, can effectively be implemented in tribological systems. Data is an important parameter in the effective application of Industry 4.0 concepts in the tribological environment, and sensors have a vital role to play in that implementation. Determining machine health and carrying out maintenance in offshore and remote mechanical systems becomes possible with online, real-time data acquisition.
Originality/value The paper relates the pillars of Industry 4.0 to various aspects of tribology. It is the first of its kind to link the interdisciplinary field of tribology with Industry 4.0. The paper also highlights the role of sensors in generating tribological data on critical parameters, such as wear rate, coefficient of friction and surface roughness, which are essential to implementing the various pillars of Industry 4.0.


2020 ◽  
Vol 2 (11) ◽  
Author(s):  
Petar Radanliev ◽  
David De Roure ◽  
Rob Walton ◽  
Max Van Kleek ◽  
Rafael Mantilla Montalvo ◽  
...  

Abstract We explore the potential and practical challenges of using artificial intelligence (AI) in cyber risk analytics to improve organisational resilience and the understanding of cyber risk. The research focuses on identifying the role of AI in connected devices such as Internet of Things (IoT) devices. Through a literature review, we identify wide-ranging and creative methodologies for cyber analytics and explore the risks of deliberately influencing or disrupting the behaviour of socio-technical systems. This resulted in modelling the connections and interdependencies between a system's edge components and both external and internal services and systems. We focus on proposals for models, infrastructures and frameworks of IoT systems found in both business reports and technical papers, analysing this juxtaposition of related systems and technologies in academic and industry papers published in the past 10 years. We then report the results of a qualitative empirical study that correlates the academic literature with key technological advances in connected devices. The work groups future and present techniques and presents the results through a new conceptual framework. Applying social science's grounded theory, the framework details a new process for a prototype of AI-enabled dynamic cyber risk analytics at the edge.


Author(s):  
Chander Diwaker ◽  
Atul Sharma ◽  
Pradeep Tomar

Artificial intelligence is an emerging technology that is popular in education technology, and it plays a vital role in e-teaching and e-learning in higher education. This chapter focuses on exploring the development of AI in higher education for teaching and learning processes. It analyses the educational ramifications of emerging innovations for student learning and for how organizations instruct and develop. Recent technological advances and the accelerating pace of development in higher education are examined to anticipate the future shape of advanced instruction. The role of AI in higher education is presented in detail through a systematic review.


Author(s):  
Ravdeep Kour

The convergence of information technology (IT) and operational technology (OT), and the associated paradigm shift toward the fourth industrial revolution (aka Industry 4.0), has brought tremendous changes in technology vision, with innovative technologies such as robotics, big data, cloud computing, online monitoring, the internet of things (IoT), cyber-physical systems (CPS), cognitive computing, and artificial intelligence (AI). This transition brings many benefits in productivity, efficiency, revenue, customer experience, and profitability, but it also imposes many challenges. One of these is managing and securing the large amounts of data generated by IoT devices, which provide many entry points for hackers seeking to exploit new and existing vulnerabilities within the network. This chapter investigates various cybersecurity issues and challenges in Industry 4.0, with a focus on three industrial case studies.


Symmetry ◽  
2021 ◽  
Vol 14 (1) ◽  
pp. 16
Author(s):  
Abdul Majeed ◽  
Seong Oun Hwang

This paper presents the role of artificial intelligence (AI) and other recent technologies employed to fight the novel coronavirus disease 2019 (COVID-19) pandemic. These technologies assisted early detection/diagnosis, trend analysis, intervention planning, healthcare burden forecasting, comorbidity analysis, and mitigation and control, to name a few. The key enabler of these technologies was data obtained from heterogeneous sources (i.e., social networks (SN), the internet of (medical) things (IoT/IoMT), cellular networks, transport usage, epidemiological investigations, and other digital/sensing platforms). To this end, we provide an insightful overview of the role of data-driven analytics leveraging AI in the era of COVID-19. Specifically, we discuss the major services that AI can provide in the context of the COVID-19 pandemic on six grounds: (i) the role of AI in seven different epidemic containment strategies (a.k.a. non-pharmaceutical interventions (NPIs)); (ii) its role in the data life cycle phases employed to control the pandemic via digital solutions; (iii) its role in performing analytics on the heterogeneous types of data stemming from the pandemic; (iv) its role in the healthcare sector in the context of COVID-19; (v) general-purpose applications of AI in the COVID-19 era; and (vi) its role in drug design and repurposing (e.g., iteratively aligning protein spikes and applying three/four-fold symmetry to yield a low-resolution candidate template) against COVID-19. Further, we discuss the challenges involved in applying AI to the available data and the privacy issues that can arise as personal data transition into cyberspace. We also provide a concise overview of other recent technologies increasingly applied to limit the spread of the ongoing pandemic. Finally, we discuss avenues of future research in this area. This review aims to highlight existing AI-based technological developments and future research dynamics in the area.


Author(s):  
Arti Jain ◽  
Rashmi Kushwah ◽  
Abhishek Swaroop ◽  
Arun Yadav

COVID-19 is caused by a virus called SARS-CoV-2 and was declared a global pandemic by the WHO. Since the outbreak, there has been a rush to explore Artificial Intelligence (AI) and the Internet of Things (IoT) for diagnosing, predicting, and treating infections. At present, the individual technologies, AI and IoT, play important roles but have limited impact on their own against the pandemic because of constraints such as the lack of historical data and the existence of biased, noisy, and outlier data. To overcome these constraints, a balance among data privacy, public health, and human-AI-IoT interaction is essential. The Artificial Intelligence of Things (AIoT) appears to be a more efficient technological solution that can play a significant role in controlling COVID-19. IoT devices produce huge amounts of data, which are gathered and mined by AI for actionable insights; AI in turn converts the data into useful results that are utilized by IoT devices. AIoT brings AI, through machine learning and decision-making, to IoT, and renovates IoT by adding data exchange and analytics to AI. In this chapter, AIoT serves as a potential analytical tool to fight the pandemic.
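The AIoT loop the chapter describes (IoT devices gather data, AI mines it, and results feed back to the devices) can be sketched with a minimal example: streaming sensor readings through a rolling-statistics anomaly detector whose output would trigger a device-side action. The readings, window size, and threshold below are made-up illustrations, not values from the chapter.

```python
# Illustrative AIoT feedback loop: an IoT sensor streams readings,
# a simple statistical model flags anomalies, and the flags would
# feed back as control actions (e.g., isolate or re-check a device).
import statistics
from collections import deque

def detect_anomalies(stream, window=5, k=3.0):
    """Flag index t when reading t deviates from the rolling mean
    of the previous `window` readings by more than k rolling stddevs."""
    buf = deque(maxlen=window)
    flagged = []
    for t, x in enumerate(stream):
        if len(buf) == window:
            mu = statistics.mean(buf)
            sd = statistics.pstdev(buf) or 1e-9  # avoid divide-by-zero
            if abs(x - mu) > k * sd:
                flagged.append(t)  # feedback point for the IoT device
        buf.append(x)
    return flagged

# Hypothetical body-temperature stream with one spike
readings = [36.5, 36.6, 36.4, 36.7, 36.5, 36.6, 40.2, 36.5]
print(detect_anomalies(readings))
```

Here only the spike at index 6 is flagged; a production AIoT system would replace the rolling-statistics model with a trained one and close the loop over the network, but the gather-mine-act structure is the same.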


Sensors ◽  
2021 ◽  
Vol 21 (8) ◽  
pp. 2586
Author(s):  
Sarah M. Ayyad ◽  
Mohamed Shehata ◽  
Ahmed Shalaby ◽  
Mohamed Abou El-Ghar ◽  
Mohammed Ghazal ◽  
...  

Prostate cancer is one of the most frequently identified cancers and the second most prevalent cause of cancer-related deaths in men worldwide. Early diagnosis and treatment are essential to stop or manage the growth and spread of cancer cells in the body. Histopathological image diagnosis is the gold standard for detecting prostate cancer, as the disease has distinct visual characteristics, but interpreting these types of images requires a high level of expertise and is time-consuming. One way to accelerate such analysis is to employ artificial intelligence (AI) through computer-aided diagnosis (CAD) systems. Recent developments in AI, along with its sub-fields of conventional machine learning and deep learning, provide new insights to clinicians and researchers, and an abundance of research has been presented specifically for histopathology images of prostate cancer. However, comprehensive surveys focusing on prostate cancer using histopathology images are lacking. In this paper, we provide a comprehensive review of most, if not all, studies that have handled prostate cancer diagnosis using histopathological images. The survey begins with an overview of histopathological image preparation and its challenges. We also briefly review the computing techniques commonly applied in image processing, segmentation, feature selection, and classification that can help in detecting prostate malignancies in histopathological images.
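The CAD stages the survey reviews (feature extraction, feature selection, and classification) can be sketched as a minimal scikit-learn pipeline. The features below are synthetic random stand-ins for real histopathology descriptors (texture, morphology), and the feature counts and model choice are illustrative assumptions, not the survey's method.

```python
# Minimal sketch of a conventional-ML CAD pipeline:
# extracted features -> scaling -> feature selection -> classifier.
# Synthetic data stands in for per-image histopathology features.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 30))          # 30 hypothetical descriptors per image
# Only the first 5 features carry signal about the (benign/malignant) label
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n) > 0).astype(int)

pipe = Pipeline([
    ("scale", StandardScaler()),              # normalize feature ranges
    ("select", SelectKBest(f_classif, k=5)),  # keep most informative features
    ("clf", SVC(kernel="rbf")),               # classify malignant vs. benign
])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pipe.fit(X_tr, y_tr)
print(f"held-out accuracy: {pipe.score(X_te, y_te):.2f}")
```

Deep-learning CAD systems replace the hand-crafted feature and selection stages with learned representations, but evaluation on a held-out set proceeds the same way.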

