
I am currently working as an Assistant Professor in the Department of CSE at Graphic Era (Deemed to be University) in Dehradun, Uttarakhand, India. Additionally, I hold a position as an Assistant Professor (Visiting Faculty) in the Department of IT at Lord Buddha Education Foundation in Kathmandu, Nepal, which is affiliated with the Asia Pacific University of Technology & Innovation, Malaysia. Before these roles, I served as an Assistant Professor and Program Leader for the B.Sc. IT program at Lord Buddha Education Foundation, and as an Assistant Professor in the CSE department at GD-Rungta College of Engineering & Technology, affiliated with Chhattisgarh Swami Vivekananda Technical University in Bhilai, India.
My academic journey includes earning an M. Tech degree in Computer Science & Engineering from Kalinga Institute of Industrial Technology (KIIT) in Bhubaneswar, Odisha in 2016, and a B. Tech degree in Computer Science & Engineering from Dr. MGR Educational & Research Institute in Maduravoyal, Chennai in 2013.
My research interests center on Machine Learning and Deep Learning, where I explore and contribute to advances in both fields.
Integration of artificial intelligence (AI) techniques into wireless infrastructure and the real-time collection and processing of end-user device data is now in high demand, and AI is well suited to detecting and predicting pandemics of colossal scale. The Coronavirus disease 2019 (COVID-19) pandemic, which originated in Wuhan, China, has had disastrous effects on the global community and has overburdened advanced healthcare systems throughout the world. Globally, over 4,063,525 confirmed cases and 282,244 deaths had been recorded as of 11th May 2020, according to the European Centre for Disease Prevention and Control. The current rapid and exponential rise in the number of patients has necessitated efficient and quick prediction of the possible outcome of an infected patient for appropriate treatment using AI techniques. This paper proposes a fine-tuned Random Forest model boosted by the AdaBoost algorithm. The model uses the COVID-19 patient's geographical, travel, health, and demographic data to predict the severity of the case and the possible outcome: recovery or death. The model has an accuracy of 94% and an F1 score of 0.86 on the dataset used. The data analysis reveals a positive correlation between patients' gender and deaths, and also indicates that the majority of patients are aged between 20 and 70 years.
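The boosting idea behind the model above can be illustrated with a minimal, self-contained sketch. This is not the paper's fine-tuned Random Forest + AdaBoost pipeline; it is a plain AdaBoost loop over one-feature decision stumps, written with numpy only, and the data and function names are illustrative assumptions.

```python
import numpy as np

def stump_fit(X, y, w):
    # Exhaustively pick the (feature, threshold, polarity) stump
    # with the lowest weighted error on labels y in {-1, +1}.
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def adaboost_fit(X, y, rounds=10):
    # Returns a list of (stump, alpha) pairs.
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        j, t, pol, err = stump_fit(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # weak learner's vote weight
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified rows
        w /= w.sum()
        ensemble.append(((j, t, pol), alpha))
    return ensemble

def adaboost_predict(ensemble, X):
    # Weighted vote of all stumps, thresholded at zero.
    score = np.zeros(len(X))
    for (j, t, pol), alpha in ensemble:
        score += alpha * np.where(pol * (X[:, j] - t) >= 0, 1, -1)
    return np.sign(score)
```

In the paper's setting the weak learner is a stronger tree-based model and the labels encode recovery vs. death; the reweighting loop is the same.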
This paper presents novel hybrid machine learning models, namely an Adaptive Neuro Fuzzy Inference System optimized by Particle Swarm Optimization (PSOANFIS), Artificial Neural Networks optimized by Particle Swarm Optimization (PSOANN), and Best First Decision Trees based Rotation Forest (RFBFDT), for landslide spatial prediction. Landslide modeling of the study area of Van Chan district, Yen Bai province (Vietnam) was carried out with the help of a spatial database of the area, considering past landslides and 12 landslide conditioning factors. The proposed models were validated using different methods, such as the Area Under the Receiver Operating Characteristic (ROC) curve (AUC), Mean Square Error (MSE), and Root Mean Square Error (RMSE). Results indicate that the RFBFDT (AUC = 0.826, MSE = 0.189, and RMSE = 0.434) is the best method in comparison to the other hybrid models, namely PSOANFIS (AUC = 0.76, MSE = 0.225, and RMSE = 0.474) and PSOANN (AUC = 0.72, MSE = 0.312, and RMSE = 0.558). Thus, it is reasonably concluded that the RFBFDT is a promising hybrid machine learning approach for landslide susceptibility modeling.
Phishing attacks are among the trending cyber-attacks that use socially engineered messages, crafted by expert hackers, to trick users into revealing their sensitive data; the most popular communication channel for those messages is users' email. Phishing has become a substantial threat to web users and a major cause of financial losses, and various solutions have been developed to tackle the problem. Deceitful emails, also called phishing emails, use a range of influence strategies to persuade people to respond, for example by promising a financial reward or invoking a sense of urgency. Despite widespread warnings and efforts to train users to recognize phishing mail, these attacks remain a pervasive practice and a lucrative business. The authors believe that persuasion, as a style of human communication designed to influence others, has a central role in successful digital scams. Cyber criminals are continuously advancing their methods of attack; the current strategies to detect the presence of such malicious programs and to keep them from executing are static, dynamic, and hybrid analysis. In this work we propose a hybrid methodology for phishing detection incorporating feature extraction and classification of the mails using SVM. Finally, alongside the selected features, a PNN separates spam mail from genuine mail with greater precision and accuracy.
The primary aim of this study is to investigate suitable Statistical Neural Network (SNN) models and their hybrid versions for COVID-19 mortality prediction in Indian populations, and to estimate future COVID-19 death cases for India. SNN models such as the Probabilistic Neural Network (PNN), Radial Basis Function Neural Network (RBFNN), and Generalized Regression Neural Network (GRNN) are applied to develop the COVID-19 Mortality Rate Prediction (MRP) model for India. For this purpose, we have used two datasets, denoted D1 and D2. The performances of these models are evaluated using the Root Mean Square Error (RMSE) and "R," the correlation between actual and predicted values. To improve prediction accuracy, new hybrid models have been constructed by combining the SNN models with a Non-linear Autoregressive Neural Network (NAR-NN): the NAR-NN predicts the future error of an SNN model, and this predicted error is added to the model's output to obtain a better MRP value. The results showed that the PNN- and RBFNN-based MRP models performed better than the other models for the COVID-19 datasets D2 and D1, respectively.
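The hybrid construction described above, an error-forecasting model stacked on a base predictor, can be sketched in a few lines. A simple least-squares AR(1) fit stands in for the NAR-NN error model here; this substitution, and the function name, are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def hybrid_correct(y, base_pred, base_next):
    # Residuals of the base model on the training window.
    r = y - base_pred
    # Fit a one-step autoregression r_t ~ a * r_{t-1} by least squares;
    # this plays the role of the NAR-NN error predictor in the paper.
    a = np.dot(r[1:], r[:-1]) / np.dot(r[:-1], r[:-1])
    # Corrected forecast = base forecast + predicted next residual.
    return base_next + a * r[-1]
```

On a series whose residuals decay geometrically, the correction recovers the next residual exactly; with noisy real data it only shifts the forecast toward the systematic part of the error.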
Recognizing and authenticating wheat varieties is critical for quality evaluation in the grain supply chain, particularly for seed inspection. Recognition and verification of grains are traditionally carried out manually through direct visual examination, whereas automatic categorization techniques based on machine learning and computer vision offer fast and high-throughput solutions. Even so, categorization remains a complicated process at the varietal level. This paper utilizes machine learning approaches for classifying wheat seeds. The seed classification is performed based on 7 physical features: area of wheat, perimeter of wheat, compactness, length of the kernel, width of the kernel, asymmetry coefficient, and kernel groove length. The dataset is collected from the UCI library and has 210 instances of wheat kernels. The dataset contains kernels from three wheat varieties, Kama, Rosa, and Canadian, with 70 kernels of each variety chosen at random for the experiment. In the first phase, K-nearest neighbor, classification and regression tree, and Gaussian Naïve Bayes algorithms are implemented for classification. The results of these algorithms are compared with an ensemble approach. The results reveal that the accuracies for the KNN, decision tree, and Naïve Bayes classifiers are 92%, 94%, and 92%, respectively. The highest accuracy of 95% is achieved by the ensemble classifier, in which the decision is made by hard voting.
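The hard-voting rule used by the ensemble classifier above can be shown with a tiny sketch: each base model casts one label per sample, and the majority label wins. The variety names mirror the dataset; the function itself is an illustration, not the paper's code.

```python
from collections import Counter

def hard_vote(predictions):
    # predictions: list of per-model label lists, all the same length.
    # Each sample's final label is the majority vote across models.
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*predictions)]

# Three hypothetical classifiers voting on four wheat kernels:
knn   = ["Kama", "Rosa", "Rosa", "Canadian"]
tree  = ["Kama", "Kama", "Rosa", "Canadian"]
bayes = ["Rosa", "Rosa", "Rosa", "Canadian"]
final = hard_vote([knn, tree, bayes])
```

Because each model's mistakes land on different samples, the majority vote can beat every individual classifier, which is the effect the 95% ensemble accuracy reflects.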
In recent times, Autism Spectrum Disorder (ASD) has been gaining momentum faster than ever. Identifying autism traits through screening tests is expensive and time consuming. Screening is a challenging task, and classification must be conducted with great care. Machine Learning (ML) can perform well in classifying this problem. Most researchers have used ML strategies to distinguish patients from typical controls, among which support vector machines (SVM) are widely used. Even though several studies have been conducted using various methods, these investigations did not reach any conclusive decision about predicting autism traits across different age groups. Accordingly, this paper aims to find the best technique for ASD classification out of SVM, K-nearest neighbor (KNN), Random Forest (RF), Naïve Bayes (NB), Stochastic gradient descent (SGD), Adaptive boosting (AdaBoost), and CN2 Rule Induction, using 4 ASD datasets taken from the UCI ML repository. The classification accuracy (CA) we acquired after experimentation is as follows: on the adult dataset SGD gives 99.7%, on the adolescent dataset RF gives 97.2%, on the child dataset SGD gives 99.6%, and on the toddler dataset AdaBoost gives 99.8%. Autism spectrum quotients (AQs) varied among several scenarios for toddlers, adults, adolescents, and children, including positive predictive value for the scaling purpose. AQ questions referred to topics about attention to detail, attention switching, communication, imagination, and social skills.
Ontology is a knowledge representation model that provides semantic services, which are among the most challenging tasks in emergent cutting-edge technologies such as IoT. Dynamic sensing capability, with the semantic structure adapted to the sensed data, has gained particular attention from the research community. The semantic approach solves the issue of interoperability for heterogeneous sensed data across different application domains. Generating ontologies from heterogeneous sensed data sources based on real-time application needs requires an effective mechanism due to the rising demand for public participation. We propose an effective technique for generating ontology by adopting the fruit fly optimization algorithm. The proposed approach shows that the construction of ontology-based information improves the logistics of the system effectively at a low operating cost.
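In its generic form, the fruit fly optimization algorithm mentioned above is a swarm search in which the population center repeatedly moves to the best-smelling (lowest-cost) fly found nearby. The sketch below minimizes a toy quadratic cost; it is a simplified generic FOA under assumed parameters, not the paper's ontology-generation procedure.

```python
import numpy as np

def foa_minimize(cost, dim=2, flies=30, iters=100, seed=0):
    # Simplified fruit fly optimization: each iteration scatters flies
    # randomly around the swarm center (smell phase), then the center
    # jumps to the best fly found so far (vision phase).
    rng = np.random.default_rng(seed)
    center = rng.uniform(-5, 5, dim)
    best, best_cost = center.copy(), cost(center)
    for _ in range(iters):
        swarm = center + rng.uniform(-1, 1, (flies, dim))  # smell search
        costs = np.apply_along_axis(cost, 1, swarm)
        i = costs.argmin()
        if costs[i] < best_cost:          # vision phase: fly to best spot
            best, best_cost = swarm[i].copy(), costs[i]
            center = best
    return best, best_cost
```

In the ontology setting, the cost function would score a candidate ontology structure against the sensed data instead of this toy quadratic.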
The Internet of Things (IoT) is being prominently used in smart cities and a wide range of applications in society. The benefits of IoT are evident, but cyber terrorism and security concerns inhibit many organizations and users from deploying it. Cyber-physical systems that are IoT-enabled can be difficult to secure, since security solutions designed for general information/operational technology systems may not work as well in such an environment. Thus, deep learning (DL) can serve as a powerful tool for building IoT-enabled cyber-physical systems with automatic anomaly detection. In this paper, two distinct DL models, a Deep Belief Network (DBN) and a Convolutional Neural Network (CNN), are employed together as a hybrid classifier to create a framework for detecting attacks in IoT-enabled cyber-physical systems. However, DL models need to be trained in a way that increases their classification accuracy. Therefore, this paper also presents a new hybrid optimization algorithm called "Seagull Adapted Elephant Herding Optimization" (SAEHO) to tune the weights of the hybrid classifier. The "Hybrid Classifier + SAEHO" framework takes the feature-extracted dataset as input and classifies the network traffic as either attack or benign. The framework was evaluated on two datasets using sensitivity, precision, accuracy, and specificity, and it outperforms conventional methods in every performance metric.
This paper provides insight into the dynamics that come with the emergence of IoT in the furniture and kitchen manufacturing industry. By implementing the concept of IoT, companies are currently evaluating how internal knowledge and skillsets correspond to the new technical requirements that the emerging digital setting outlines, and by directing internal research they are learning more about IoT and connected products as they proceed. One current major problem is that there are no open protocols that can connect all products regardless of supplier. Nevertheless, implementation of IoT does not solely involve technical aspects, and companies are also faced with the dilemma of how to design and develop corresponding commercial processes. To this point, early product implementations have arrived on the consumer markets, and the future vision is to achieve full integration that embeds connectivity and interaction among all products in the home.
Bitcoin is a decentralized digital currency without a central bank or single administrator that is sent from user to user on the peer-to-peer Bitcoin blockchain network without the need for intermediaries. In this Bitcoin trend analysis work, initial attributes are considered from five sectors, based on financial, social, token, and network measures, and count to thirteen attributes: price, volume, market cap, mean dollar invested age, social volume, social dominance, development activity, transaction volume, token age consumed, token velocity, token circulation, market value to realized value, and realized cap. We apply attribute selection and trend analysis mapped to seven potential attributes: price, volume, market cap, social dominance, development activity, market value to realized value, and realized cap. We have conducted a Nonlinear Autoregressive with External Input analysis considering these seven attributes. The work employed three training algorithms to train the neural network: Levenberg-Marquardt, Bayesian Regularization, and the Scaled Conjugate Gradient algorithm. The error histogram and regression plots indicate that the Bayesian-regularized neural network shows good performance and thus provides a better forecast.
With the growth of the Internet of Things (IoT), security attacks are also rising gradually. Numerous centralized mechanisms have been introduced in the recent past for the detection of attacks in IoT, in which an attack recognition scheme is employed at the network's vital point; it gathers data from the network and categorizes it as "Attack" or "Normal". Nevertheless, these schemes were unsuccessful in achieving noteworthy results due to the diverse requirements of IoT devices, such as distribution, scalability, lower latency, and resource limits. The present paper proposes a hybrid model for the detection of attacks in an IoT environment that involves three stages. Initially, higher-order statistical features (kurtosis, variance, moments), mutual information (MI), symmetric uncertainty, information gain ratio (IGR), and relief-based features are extracted. Then, detection takes place using a Gated Recurrent Unit (GRU) and Bidirectional Long Short-Term Memory (Bi-LSTM) to recognize the existence of network attacks. To improve the classification accuracy, the weights of the Bi-LSTM are optimally tuned via a self-upgraded Cat and Mouse Optimizer (SU-CMO). The improvement of the employed scheme is established on two distinct datasets using a variety of metrics, including classification accuracy, F-measure, and MCC. In terms of all performance measures, the proposed model outperforms both traditional and state-of-the-art techniques.
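The higher-order statistical features named in the first stage (variance, kurtosis, moments) can be computed directly from central moments. A minimal numpy sketch, assuming population-form moments over a one-dimensional traffic measurement:

```python
import numpy as np

def stat_features(x):
    # Higher-order statistical features of a 1-D measurement vector,
    # computed from central moments (population form, divide by n).
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    m2 = ((x - mu) ** 2).mean()      # variance (2nd central moment)
    m3 = ((x - mu) ** 3).mean()      # 3rd central moment
    m4 = ((x - mu) ** 4).mean()      # 4th central moment
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2 - 3.0        # excess kurtosis (0 for a Gaussian)
    return {"variance": m2, "skewness": skew, "kurtosis": kurt}
```

In the proposed pipeline these values, together with MI, symmetric uncertainty, IGR, and relief scores, form the feature vector handed to the GRU/Bi-LSTM detectors.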
The stability of the power grid is a concern due to the high demand from and supply to smart cities, homes, factories, and so on. Different machine learning (ML) and deep learning (DL) models can be used to tackle the problem of stability prediction for the energy grid. This study elaborates on the necessity of IoT technology to make energy grid networks smart. Different prediction models, namely, logistic regression, naïve Bayes, decision tree, support vector machine, random forest, XGBoost, k-nearest neighbor, and an optimized artificial neural network (ANN), have been applied to openly available smart energy grid datasets to predict their stability. The present article uses metrics such as accuracy, precision, recall, F1-score, and the ROC curve to compare the different predictive models. Data augmentation and feature scaling have been applied to the dataset to get better results, and the augmented dataset provides better results than the normal dataset. This study concludes that the deep learning predictive model, an ANN optimized with the Adam optimizer, provides better results than the other predictive models: 97.27% accuracy, 96.79% precision, 95.67% recall, and a 96.22% F1 score.
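The comparison metrics listed above follow directly from the confusion counts. A small sketch, assuming the unstable-grid class is labeled 1:

```python
def binary_metrics(y_true, y_pred):
    # Accuracy, precision, recall, and F1 from raw confusion counts;
    # label 1 is the positive (e.g., unstable-grid) class.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "precision": precision,
        "recall": recall,
        "f1": 2 * precision * recall / (precision + recall),
    }
```

The same four numbers reported for the Adam-optimized ANN (97.27% accuracy, 96.79% precision, 95.67% recall, 96.22% F1) are instances of these formulas on its test-set confusion matrix.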
The coronavirus-borne COVID-19 disease has spread its roots across the whole world. It is primarily spread by physical contact. As a preventive measure, proper crowd monitoring and management systems need to be installed in public places to limit sudden outbreaks and impart improved healthcare. The number of new infections can be significantly reduced by adopting social distancing measures early. Motivated by this notion, a real-time crowd monitoring and management system for social distance classification is proposed in this research paper. In the proposed system, people are segregated from the background using the YOLO v4 object detection technique, and the detected people are then tracked with bounding boxes using the Deepsort technique. This system significantly helps in COVID-19 prevention through social distance detection and classification in public places, using surveillance images and videos captured by the cameras installed there. The performance of the system has been assessed using mean average precision (mAP) and frames per second (FPS) metrics, and it has also been evaluated by deploying it on the Jetson Nano, a low-cost embedded system. The observed results show its suitability for real-time deployment in public places for COVID-19 prevention by social distance monitoring and classification.
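Once a detector such as YOLO v4 returns person bounding boxes, the social-distance check reduces to pairwise distances between box centroids. The sketch below uses a raw pixel threshold as a stand-in for a calibrated real-world distance; both the threshold value and the function name are assumptions for illustration.

```python
import numpy as np

def distancing_violations(boxes, min_dist=100.0):
    # boxes: (x1, y1, x2, y2) person detections from an object detector.
    # Returns index pairs whose bounding-box centroids are closer than
    # min_dist pixels (a crude proxy for a social-distance breach).
    c = np.array([[(x1 + x2) / 2.0, (y1 + y2) / 2.0]
                  for x1, y1, x2, y2 in boxes])
    pairs = []
    for i in range(len(c)):
        for j in range(i + 1, len(c)):
            if np.linalg.norm(c[i] - c[j]) < min_dist:
                pairs.append((i, j))
    return pairs
```

A deployed system would first map pixel coordinates to ground-plane distances via camera calibration before thresholding.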
The Internet of Things (IoT) is the fastest growing technology and has applications in various domains such as healthcare and transportation, interconnecting trillions of smart devices through the Internet. A secure network is a basic necessity of the Internet of Things. Due to the increasing number of interconnected and remotely accessible smart devices, more and more cybersecurity issues are being witnessed among cyber-physical systems. A well-designed intrusion detection system (IDS) can identify various cybersecurity issues and their sources. In this article, using various telemetry datasets of different Internet of Things scenarios, we show that external users can access IoT devices and infer the victim user's activity by sniffing the network traffic. Further, the article presents the performance of various bagging and boosting ensemble decision tree techniques of machine learning in the design of an efficient IDS. Most previous IDSs focused only on good accuracy and ignored execution speed, which must be improved to optimize the performance of an IDS model, and most earlier research focused on binary classification. This study attempts to evaluate the performance of various ensemble machine learning multiclass classification algorithms by deploying them on the openly available "TON-IoT" datasets of IoT and Industrial IoT (IIoT) sensors.
The sudden outbreak of the novel coronavirus (nCoV-19, COVID-19) and its rampant spread led to a significant number of people being infected worldwide and disrupted several businesses. With most countries imposing serious lockdowns due to the increasing number of fatalities, the social lives of millions of people were affected. Although the lockdown led to an increase in network activities, online shopping, and social network usage, it also raised questions on the mental wellness of society. Interestingly, excessive usage of social networks also saw humor traveling across the Internet in the form of Internet memes during the lockdown period. Humor is known to affect our well-being, decision-making, and psychological systems. In this paper, we have analyzed Internet meme activity in social networks during the COVID-19 lockdown period. As humor is known to relieve individuals from psychological stress, it is necessary to understand how human beings adopted Internet memes as a stress-relieving mechanism for coping with lockdown stress. We have considered thirty popular memes and the increase in the number of their captions within the period September 2017 to August 2020. An increase in Internet meme activity since the lockdown period (March 2020) depicts an increase in online social behavior. We analyze this meme activity using random forest, multi-layer perceptron, and instance-based learning algorithms, followed by data visualization using line graphs and heat maps (8 and 15 clustered). We also compared the performance of the models using evaluation parameters such as mean absolute error, root-mean-squared error, and Kappa statistics, and observed that the random forest and instance-based learning algorithms perform better than the multi-layer perceptron.
The results indicate that the random forest and instance-based learning classifiers have near-perfect classification tendencies, whereas the multi-layer perceptron showed around 97% classification accuracy.
In today's society there is a high volume of smartphones, with Android being the most popular and most commonly used platform. In the last few years the Android market has been booming, leading many developers to join the industry and create mobile applications that benefit people's lives. However, this popularity has brought many crime issues, including security. One of the most common incidents for mobile users is having their phone lost or stolen. Since most mobile users want to find their lost phones, they look for the most reliable features that can help them locate their smartphones. Luckily, some applications and services have been designed to track down and locate lost or stolen smartphones. In this work, the authors identify a collection of these applications and the information they send to the user to aid in finding the phone. Since some applications are able to send location information or a photo, this work looks at what metadata is usually sent with the message.
Life insurance is an agreement between an insured and an insurer, where the insurer pays out a sum of money either after a specific period or on the death of the insured. Nowadays, people can buy a policy through an online platform. There are many insurance companies in the market, and each company has various policies, so selecting the best insurance company for purchasing an online term plan is a very complex problem and people may find it difficult to choose. It is a multi-criteria decision making (MCDM) problem consisting of different criteria and various alternatives. In this paper, a model has been proposed to solve this decision-making problem. In this model, a fuzzy multi-criteria decision-making approach is combined with the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) and applied to rank the different insurance companies based on online term plans. The experimental results show that the Life Insurance Corporation of India (LIC) ranks at the top out of 12 companies for purchasing an online term plan. A sensitivity analysis has been performed to validate the proposed model.
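The crisp core of TOPSIS, ranking alternatives by closeness to an ideal solution, can be sketched without the fuzzy layer used in the paper. The weights, the vector-normalization scheme, and the toy decision matrix below are illustrative assumptions:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    # matrix: alternatives x criteria scores; weights sum to 1;
    # benefit[j] is True when a higher score on criterion j is better.
    m = np.asarray(matrix, dtype=float)
    norm = m / np.sqrt((m ** 2).sum(axis=0))        # vector normalization
    v = norm * np.asarray(weights)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))  # distance to ideal
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))   # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                   # closeness: higher = better

# Toy example: 3 insurers scored on (claim-settlement ratio, premium cost).
scores = topsis([[9, 1], [5, 5], [1, 9]],
                weights=[0.5, 0.5],
                benefit=[True, False])   # cost criterion: lower is better
```

In the fuzzy variant, the crisp matrix entries are replaced by fuzzy numbers (e.g., triangular) before the same closeness computation.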
Here, we propose a basic model that could help foresee the spread of COVID-19. We applied linear regression, multilayer perceptron, and vector autoregression models to the COVID-19 Kaggle data to anticipate the epidemiological pattern of the disease and the rate of COVID-19 cases in India, and predicted the possible trends of COVID-19 impacts in India based on the data collected from Kaggle. The prevailing data on confirmed, recovered, and death cases across India over time helps in predicting and forecasting the near future. For additional examination or a future perspective, case definitions and data collection must be maintained continuously.
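The simplest of the three models above, a linear regression on the day index, can be sketched in a few lines with numpy; the function name and the toy cumulative-case series are illustrative, not the Kaggle data:

```python
import numpy as np

def linear_forecast(cases, days_ahead):
    # Fit cumulative case counts against the day index with ordinary
    # least squares, then extrapolate days_ahead into the future.
    t = np.arange(len(cases))
    slope, intercept = np.polyfit(t, cases, 1)
    future = len(cases) - 1 + days_ahead
    return slope * future + intercept
```

The multilayer perceptron and vector autoregression replace this straight-line trend with a nonlinear mapping and a multi-series lag model, respectively.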
Spondylolisthesis refers to the slippage of one vertebral body over the adjacent one. It is a chronic condition that requires early detection to prevent unpleasant surgery. The paper presents an optimized deep learning model for detecting spondylolisthesis in X-ray radiographs. The dataset contains a total of 299 X-ray radiographs, of which 156 images show a spine with spondylolisthesis and 143 images show a normal spine. An image augmentation technique is used to increase the number of data samples. In this study, the VGG16 and InceptionV3 models were used for the image classification task. The developed model is optimized by utilizing the TFLite model optimization technique. The experimental results show that the VGG16 model achieved a 98% accuracy rate, higher than InceptionV3's 96% accuracy rate. The size of the implemented model is reduced by up to four times so that it can be used on small devices. The compressed VGG16 and InceptionV3 models achieved 100% and 96% accuracy rates, respectively. Our findings show that the implemented models outperformed the model suggested by Varcin et al. (which had a maximum 93% accuracy rate) in the diagnosis of lumbar spondylolisthesis. The developed quantized model also achieved a higher accuracy rate than Zebin and Rezvy's (VGG16 + TFLite) model, which reached 90% accuracy. Furthermore, by evaluating the model's performance on other publicly available datasets, we have generalised our approach on the public platform.
Heart failure (HF) is a global pandemic affecting at least 26 million people worldwide, and its prevalence is increasing. HF health expenditures are considerable and will increase significantly with an ageing population. As per the World Health Organization (WHO), cardiovascular diseases (CVDs) are the major cause of death globally, taking an estimated 17.9 million lives per year. CVDs are a class of disorders of the heart and blood vessels that include coronary heart disease, cerebrovascular disease, rheumatic heart disease, and various other conditions. A great deal of data is frequently generated in the healthcare industry; however, it is often not used effectively. The generated images, audio, text, and records contain hidden patterns and relationships, but tools to extract information from these datasets for clinical diagnosis of disease or other purposes are less common. Four out of five CVD deaths are due to heart attacks and strokes, and a third of these deaths occur in people under 70 years of age. In the current work, we have tried to predict the survival chances of HF sufferers using attribute selection (scoring methods) and machine learning classifiers. The scoring methods (SM) used here are the Gini index, information gain, and gain ratio, along with correlation-based feature selection (CFS) with the best first search (BFS) strategy for best attribute selection (AS). We have used multi-kernel support vector machine (MK-SVM) classifiers: linear, polynomial, radial basis function (RBF), and sigmoid. The classification accuracy (CA) we obtained using SM is as follows: SVM (linear 80.3%, polynomial 86.6%, RBF 83.6%, sigmoid 82.3%), and using the CFS-BFS method: SVM (linear 79.9%, polynomial 83.3%, RBF and sigmoid 83.6%).
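Two of the attribute-scoring methods mentioned above, the Gini index and information gain, have compact definitions that a short sketch makes concrete. This is a generic illustration for categorical features, not the authors' pipeline:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a label list, in bits.
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    # Gini impurity: probability of mislabeling a random sample.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def info_gain(feature, labels):
    # Information gain of a categorical feature w.r.t. the class labels:
    # parent entropy minus the size-weighted entropy of each split.
    n = len(labels)
    split = {}
    for f, y in zip(feature, labels):
        split.setdefault(f, []).append(y)
    remainder = sum(len(part) / n * entropy(part) for part in split.values())
    return entropy(labels) - remainder
```

The gain ratio used in the paper further divides information gain by the feature's own split entropy, penalizing many-valued attributes.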
Breast carcinoma is a type of malignancy that begins in the breast. Breast cancer cells generally form a tumour that can often be seen on an X-ray or felt as a lump. Despite advances in screening, treatment, and surveillance that have improved patient survival rates, breast carcinoma is the most frequently diagnosed malignant growth and the second leading cause of cancer mortality among women. Invasive ductal carcinoma is the most widespread breast malignancy, accounting for about 80% of all diagnosed cases. Much research has found that artificial intelligence has tremendous capabilities, which is why it is used in various sectors, especially the healthcare domain. In the initial phase of diagnosis, mammography is used, and finding cancer in a dense breast is challenging; the evolution of deep learning and its application to these findings is helpful for earlier tracking and medication. In the present work, the authors have tried to utilize deep learning concepts for grading breast invasive ductal carcinoma using transfer learning. The authors have used five transfer learning approaches, namely VGG16, VGG19, InceptionResNetV2, DenseNet121, and DenseNet201, with 50 epochs on the Google Colab platform, which provides a single 12 GB NVIDIA Tesla K80 graphical processing unit (GPU) that can be used for up to 12 h continuously. The dataset used for this work can be openly accessed from http://databiox.com. The experimental results regarding the algorithms' accuracy are as follows: VGG16 with 92.5%, VGG19 with 89.77%, InceptionResNetV2 with 84.46%, DenseNet121 with 92.64%, and DenseNet201 with 85.22%. From the experimental results, it is clear that DenseNet121 gives the maximum accuracy for cancer grading, whereas InceptionResNetV2 has the lowest.
Conventional techniques for identifying plant leaf diseases can be labor-intensive and complicated. This research uses artificial intelligence (AI) to propose an automated solution that improves plant disease detection accuracy and overcomes the difficulty of the conventional methods. Our proposed method uses deep learning (DL) to extract features from photos of plant leaves and machine learning (ML) for further processing. To capture complex disease patterns, convolutional neural networks (CNNs) such as VGG19 and Inception v3 are utilized. Four distinct datasets, Banana Leaf, Custard Apple Leaf and Fruit, Fig Leaf, and Potato Leaf, were used in this investigation. The experimental results we received are as follows: for the Banana Leaf dataset, the combination of Inception v3 with SVM performed well, with an accuracy of 91.9%, precision of 92.2%, recall of 91.9%, F1 score of 91.6%, AUC of 99.6%, and MCC of 90.4%. For the Custard Apple Leaf and Fruit dataset, the combination of VGG19 with kNN achieved an accuracy of 99.1%, precision of 99.1%, recall of 99.1%, F1 score of 99.1%, AUC of 99.1%, and MCC of 99%, and for the Fig Leaf dataset an accuracy of 86.5%, precision of 86.5%, recall of 86.5%, F1 score of 86.5%, AUC of 93.3%, and MCC of 72.2%. On the Potato Leaf dataset, Inception v3 + SVM performed best, with an accuracy of 62.6%, precision of 63%, recall of 62.6%, F1 score of 62.1%, AUC of 89%, and MCC of 54.2%. Our findings explore the versatility of combining ML and DL techniques while providing valuable references for practitioners seeking tailored solutions for specific plant diseases.
To overcome the problems of manual identification of fruit disease, this work proposes a deep-learning model that analyses fruit images to detect diseases in the fruit. We propose a convolutional neural network (CNN)-based model for fruit disease classification. By including many layers, the proposed CNN model extracts numerous features from the fruit, deals with the large dataset, and finally evaluates it. With the MobileNetv2 model, the disease prediction accuracy for papaya, guava, and citrus was 99.4%, 98.8%, and 95.8%, and the recall values were 99.4%, 98.8%, and 93.8%, respectively. With VGG16, the disease prediction accuracy for papaya, guava, and citrus was 97.7%, 99.6%, and 94.2%, and the recall values were 96.5%, 99.6%, and 89.2%, respectively. Finally, with DenseNet121, the disease prediction accuracy for papaya, guava, and citrus was 99.4%, 97.6%, and 99.2%, and the recall values were 98.8%, 97.6%, and 99.2%, respectively.
In current times, COVID-19 has claimed many lives, so vaccination is crucial for everyone to curb the spread of the disease. However, no vaccine is perfect or succeeds for everyone. In the present work, we analyzed data from the Vaccine Adverse Event Reporting System (VAERS) and found that the vaccines given to people may or may not work depending on demographic factors such as age and gender, as well as other variables such as state of residence. The state-of-residence variable is considered because it stands in for unrecorded variables such as food habits and living conditions. The target audience for this work is healthcare workers, government bodies, and medical research organizations. We analyze the data using machine learning techniques and algorithms and predict how the COVID-19 vaccines developed by major manufacturers, i.e., PFIZER/BIONTECH and MODERNA, perform for specific age groups. Data visualization and analysis interpret the vaccine impact based on the above variables. It becomes clear that people belonging to a specific demographic group can choose a vaccine based on the history of a particular manufacturer's vaccine succeeding for that group. The machine learning algorithms used are Logistic Regression, AdaBoost, Decision Tree, and Random Forest. We considered the DIED variable as the target variable, as it represents the highest threat to life. On performance measures, AdaBoost shows appreciable values, so the type of vaccine to be administered could be predicted using this algorithm. The accuracies achieved in the experiment are as follows: Decision Tree Classifier 97.3%, Logistic Regression 97.31%, Random Forest 97.8%, and AdaBoost 98.1%.
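A minimal sketch of the AdaBoost stage, assuming a toy tabular layout (age, gender flag, manufacturer flag) loosely modeled on VAERS-style fields; the data and the DIED-like target here are synthetic, not the paper's dataset.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(42)
n = 400

# Hypothetical stand-ins for demographic features used in the paper:
# age, a binary gender flag, and a manufacturer flag (0/1).
age = rng.integers(18, 90, size=n)
gender = rng.integers(0, 2, size=n)
maker = rng.integers(0, 2, size=n)
X = np.column_stack([age, gender, maker])

# Synthetic outcome: here risk rises sharply with age, standing in for
# the DIED target variable in the paper.
y = (age > 70).astype(int)

# AdaBoost with default decision-stump base learners, as in the paper's
# best-performing configuration.
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)
train_acc = clf.score(X, y)
```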
In the future IoT (Internet of Things), with big-data administration and machine-learning discovery for large-scale industrial automation applications, the significance of the industrial Internet is increasing day by day. The IoT is the interconnection, by means of the Internet, of computing devices embedded in everyday objects, enabling them to send and receive data. Big data (BD) refers to data sets so voluminous and complex that traditional data-processing application software is inadequate to deal with them. Machine learning (ML) is a subset of artificial intelligence that typically uses statistical techniques to enable computers to learn from data without being explicitly programmed. Several diverse technologies, for example IoT, computational intelligence, machine-type communication, BD, and sensor technology, can be fused together to enhance the data-administration and knowledge-discovery effectiveness of large-scale automation applications. An increasing number of significant data sources, advances in IoT and BD technologies, and the availability of a wide variety of ML algorithms offer new potential to deliver analytical services to citizens and city managers. However, there is still a gap in combining the current state of the art into an integrated framework that would help reduce development costs and enable new kinds of services. Voluminous amounts of data have been generated over the past decade as the miniaturization of IoT devices has increased, but such data are not valuable without analytic power. Various BD, IoT, and analytics solutions have enabled people to gain valuable insight into the large volumes of data generated by IoT devices. However, these solutions are still in their infancy, and the domain lacks a thorough survey. Here we endeavor to provide a clearer, deeper understanding of the IoT within the BD framework, along with its various issues and challenges, and concentrate on providing possible solutions through ML strategies.
With the rising usage of technology, a tremendous volume of data is being produced. This data contains a great deal of personal information and may be exposed to third parties during the data mining process, so it is extremely difficult for the data owner to protect individual privacy. Privacy-Preservation in Data Mining (PPDM) offers a solution to this problem. Existing research has recommended encryption or anonymization to preserve privacy, but encryption has high computing costs, and anonymization may drastically decrease the utility of the data. This paper proposes a perturbation-based privacy-preserving technique built on dimensionality reduction and feature selection that is difficult to reverse. Here, random projection and principal component analysis are utilized to alter the data: dimension reduction combined with feature selection perturbs the records more efficiently. The hybrid approach picks relevant features, decreases data dimensionality, and reduces training time, resulting in improved classification performance as measured by accuracy, kappa statistics, mean absolute error, and other metrics. The proposed technique outperforms all other approaches, with classification accuracy increasing from 63.13 percent to 68.34 percent, proving its effectiveness in detecting cardiovascular illness. Even in its reduced form, the dataset's classification accuracy is improved.
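The perturbation idea, PCA-style reduction followed by a secret random projection, can be sketched roughly as below; the data is synthetic and the dimensions are arbitrary, so this is an illustration of the mechanism rather than the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 10))  # 200 records, 10 attributes (e.g., clinical features)

# Step 1: PCA-style reduction - keep only the top-k principal directions.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
X_pca = Xc @ Vt[:k].T          # shape (200, 5)

# Step 2: random projection - multiply by a random matrix that the data
# owner keeps secret, making the transformation hard to invert without it.
R = rng.normal(size=(k, 3)) / np.sqrt(3)
X_perturbed = X_pca @ R        # shape (200, 3), released for mining

dims_before, dims_after = X.shape[1], X_perturbed.shape[1]
```

A classifier is then trained on `X_perturbed` instead of the raw records, which is how the paper measures utility after perturbation.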
In today's society there is a high volume of smartphones, with Android being the most popular and most commonly used platform. In the last few years the Android market has been booming, leading many developers to join the industry and create mobile applications that benefit people's lives. However, this popularity has brought many crime issues, including security. One of the most common incidents for mobile users is having their phone lost or stolen. Since most mobile users want to find their lost phones, they look for the most reliable features that can help them locate their smartphones. Fortunately, some applications and services have been designed to track down and locate lost or stolen smartphones. In this work, the authors identify a collection of these applications and the information they send to the user to aid in finding the phone. Since some applications are able to send location information or a photo, this work looks at what metadata is usually sent with the message.
An efficient triband metamaterial absorber is presented for X- and K-band applications. The unit cell has a simple shape. The absorber is fabricated on a thin polyamide substrate, which makes it flexible. The parameters of the designed absorber are optimized. The simulated results show a good absorption rate and polarization stability; the stability is exhibited over a wide range in both TE and TM modes of the incident waves. The measured results are on par with the simulated results. The measurement is carried out with the waveguide measurement method.
Today, one of the hottest topics in the area of Information Technology is the IoT. With IoT, one can convert everyday real-world objects into intelligent virtual objects. The IoT aims to unify all the objects of daily life into a common framework, giving us command of items everywhere as well as keeping us posted about the condition of those objects. Through investigation and ordered study of several academic research papers, corporate white papers, and online materials, sufficient material and ideas about the current scenario of IoT were analyzed. We also show how to implement the concept of IoT for checking the status of an LPG stove, whether it is on or off, with the outcome depicted in an Android app. With the help of the proposed tools and techniques, the end user can easily check the status of the LPG stove from any place, as it is connected to the internet via a WiFi module. The proposed model is cost-effective and simple, and could easily be implemented in real time. LPG has become part and parcel of every kitchen nowadays, but along with its benefits it comes with various risks. People today are multitasking everywhere and need smart applications for all their basic needs. In the case of the LPG stove, unwitting negligence can sometimes result in life-threatening accidents. The proposed system will help to prevent such accidents to an extent.
Abstract Any nation’s growth depends on the trend of the price of fuel. Fuel price drifts have both direct and indirect impacts on a nation’s economy, and growth is hampered by a high level of inflation prevailing in the oil industry. This paper proposes a method of analyzing gasoline and diesel price drifts based on self-organizing maps and Bayesian regularized neural networks. The US gasoline and diesel price timeline dataset is used to validate the proposed approach. The dataset covers weekly prices per gallon from 1995 to January 2021 for all grades (regular, medium, and premium) in conventional, reformulated, and all-formulation gasoline combinations, as well as diesel. For data visualization we use self-organizing maps and analyze the series with a neural network. The nonlinear autoregressive neural network is adopted because of the time-series nature of the dataset. Three training algorithms are adopted to train the neural networks: Levenberg-Marquardt, scaled conjugate gradient, and Bayesian regularization. The results are promising and reveal the robustness of the proposed model. The Levenberg-Marquardt error falls in the range −0.1074 to 0.1424, the scaled conjugate gradient error in −0.1476 to 0.1618, and the Bayesian regularization error in −0.09854 to 0.09871, showing that of the three approaches considered, Bayesian regularization gives the best results.
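As a simplified stand-in for the Bayesian-regularized NAR network, the sketch below fits a lagged autoregressive model with a ridge penalty, which plays the same weight-prior role that Bayesian regularization plays during neural-network training; the weekly price series here is synthetic, not the US dataset.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic weekly price series standing in for the gasoline data:
# slow trend + seasonal wiggle + noise.
t = np.arange(300)
price = 2.0 + 0.003 * t + 0.1 * np.sin(t / 10) + rng.normal(scale=0.02, size=300)

# Nonlinear-autoregressive-style design matrix: predict each price from p lags.
p = 4
X = np.column_stack([price[i : len(price) - p + i] for i in range(p)])
y = price[p:]

# Ridge regression: the quadratic weight penalty corresponds to the Gaussian
# prior over weights that Bayesian regularization imposes (a simplification
# of the trainbr-style procedure named in the abstract).
lam = 1e-3
A = X.T @ X + lam * np.eye(p)
w = np.linalg.solve(A, X.T @ y)
residual = y - X @ w
rmse = float(np.sqrt(np.mean(residual ** 2)))
```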
Here we propose a basic model that could be helpful in foreseeing the spread of COVID-19. We applied linear regression, a multilayer perceptron, and a vector autoregression model to the COVID-19 Kaggle data to anticipate the epidemiological pattern of the disease and the rate of COVID-19 cases in India, and predicted the possible trends of COVID-19 impacts in India based on the data collected from Kaggle. The available data on confirmed, recovered, and death cases across India over time helps in predicting and forecasting the near future. For further examination and from a future perspective, case definitions and data collection must be maintained continuously.
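A minimal version of the trend-fitting idea: cumulative counts that grow exponentially become linear in log space, so a linear regression on log counts yields a short-range forecast. The counts below are illustrative, not the Kaggle data.

```python
import numpy as np

# Hypothetical cumulative confirmed-case counts over 10 days (not real data).
days = np.arange(10)
cases = np.array([3, 5, 9, 14, 22, 33, 47, 66, 90, 120], dtype=float)

# Exponential growth is linear in log space, so fit a line to log(cases).
slope, intercept = np.polyfit(days, np.log(cases), 1)

# Forecast the next 3 days by extrapolating the fitted line back to counts.
future = np.arange(10, 13)
forecast = np.exp(intercept + slope * future)
```

The paper's multilayer perceptron and vector autoregression models refine this same idea with more expressive function classes.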
The digital world has grown exponentially to accommodate the huge amounts of structured, semi-structured, unstructured, and hybrid data received from different sources. Using conventional data management tools it is quite impossible to manage such semi-structured and unstructured data, for which non-relational database management systems such as NoSQL and NewSQL are used. These types of semi-structured and unstructured data are generally considered 'Big Data.' This article describes the basic characteristics, background, and models of NoSQL used for big data applications. In this work, the authors survey different NoSQL characteristics used by researchers and compare the strengths and weaknesses of different NoSQL databases.
A variety of technologies have been developed in recent years for designing on-chip networks for multicore systems. In this endeavor, network interfaces mainly differ in the way a network physically connects to a multicore system along the data path. Semantic content of communication for a multicore system is transmitted as data packets. Thus, whenever a communication is made over the network, it is first segmented into sub-packets and then into fixed-length flow-control digits (flits). To measure the space, energy, and latency overheads required to implement various interconnection topologies, we use the Multi2Sim simulator as a research bed to experiment with the tradeoffs between performance and power and between performance and area, whose analysis enables further possible optimizations.
Chapter Contents: Abstract 14.1 Introduction 14.2 Heart-generated ECG signal 14.3 Filtering parameters: least-mean-square algorithm 14.3.1 Updated filter coefficients in the normalized least-mean-square (NLMS) algorithm 14.3.2 Improved-performance LMS (DENLMS) algorithm delaying normalization inaccuracy 14.3.3 Sign-data least-mean-square (SDLMS), a variant of LMS 14.4 Retrieving and classifying ECG signals utilizing ML-based techniques 14.5 Artificial neural network (ANN)-based ECG signals 14.6 Classification of ECG signals based on fuzzy logic (FL) 14.7 Fourier transform and wavelet transforms 14.8 Combination of machine learning and statistical algorithms 14.9 Conclusion and future work References
Skin cancer is regarded as a cardinal cause of morbidity and mortality globally, with the death count increasing at an alarming rate. It has a higher chance of being cured if diagnosed in its initial stages, so proper diagnosis of skin cancer is crucial to enable proper treatment. Highly skilled dermatologists and skin-specialist doctors are capable of accurately detecting skin cancer at an early stage, but expert dermatologists are limited in number, so systems that automatically detect cancerous growth at an early stage with high performance are a useful tool. This study therefore presents a deep learning (DL) technique to classify images and detect skin cancer at an early stage. We trained our model on images of harmless (benign) lesions and tumor images, using a convolutional neural network (CNN) to classify whether an image is a suspect for skin cancer or not. This approach achieves an accuracy of 86% and is compared to the DCNN model introduced in earlier work. Additionally, ResNet-50, a 50-layer deep CNN, has been implemented, which further improves the accuracy to over 90%.
Healthcare data frameworks have enormously expanded the accessibility of medical records and benefited healthcare administration and research. In many cases, however, there are growing worries about privacy in sharing medical files. Privacy procedures for unstructured medical text focus on detection and removal of patient identifiers from the text, which may be insufficient for safeguarding both privacy and information utility. In healthcare-related research, the medical records of patients often must be retrieved from various sites with different regulations on the disclosure of healthcare data. Given the sensitivity of healthcare data, privacy protection is a significant concern when patients' medical records are used for research purposes. In this article we use feature selection via PCA (Principal Component Analysis) to obtain the best feature set for privacy preservation. We then apply two methods, k-anonymity and a fuzzy system, to provide privacy on medical databases in data-intensive environments. The results affirm that the proposed method performs better than related works with respect to factors such as highly sensitive data preservation with k-anonymity.
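The k-anonymity criterion applied in the second stage can be checked with a few lines of standard Python; the records and the choice of quasi-identifiers below are hypothetical.

```python
from collections import Counter

# Toy medical records: (age band, ZIP prefix) are the quasi-identifiers;
# the diagnosis column is the sensitive attribute. All values are illustrative.
records = [
    ("30-40", "462**", "flu"),
    ("30-40", "462**", "diabetes"),
    ("30-40", "462**", "flu"),
    ("40-50", "463**", "asthma"),
    ("40-50", "463**", "flu"),
]

def anonymity_level(rows, qi_indices=(0, 1)):
    """Return k such that the table is k-anonymous: every quasi-identifier
    combination is shared by at least k records."""
    groups = Counter(tuple(r[i] for i in qi_indices) for r in rows)
    return min(groups.values())

k = anonymity_level(records)
```

A table with a higher k gives stronger protection against re-identification but typically at a larger cost in data utility, which is the trade-off the article measures.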
The purpose of this book is first to study MATLAB programming concepts, then the basic concepts of modeling and simulation analysis, with a particular focus on digital communication simulation. The book covers the topics practically to describe network routing simulation using the MATLAB tool. It covers dimensions like wireless network and WSN simulation using MATLAB, then depicts the modeling and simulation of vehicle power networks in detail, considering different case studies. Key features of the book include: discusses basic and advanced methodology with the fundamental concepts of exploration and exploitation in network simulation; elaborates practice questions and simulations in MATLAB; student-friendly and concise; useful for UG and PG level research scholars; takes a practical approach to network simulation with many programs with step-by-step comments; based on the latest technologies, with coverage of wireless simulation and WSN concepts and implementations.
Information and Communication Technology (ICT) helps students gain knowledge and assists them with their daily activities and learning exercises. No one can ignore the significance of ICT in everyday life, and ICT plays an extraordinary role in academic achievement. ICT in education refers to the study and ethical practice of facilitating e-learning and other innovative technological approaches in teaching and learning, improving performance by creating, using, and managing appropriate technological processes and resources. The principal aim of this paper is to discover Nepalese secondary school students' attitudes toward the use of ICT. The study was conducted on four factors: learning behavior, teaching behavior, computer use, and information and communication use. It was carried out among 102 secondary school students. The study uses multiple regression data analysis and provides the researchers' interpretation of the data. It contributes to further discussion of students' attitudes toward the adoption of ICT in Nepal.
The first cases of an atypical pneumonia of unidentified cause were reported on December 30, 2019, from Wuhan, China. After much research, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) was found to be the cause of the ailment, and the disease was named COVID-19. The rapid spread of this virus resulted in a worldwide pandemic. This global pandemic has had a devastating impact on several domains, such as education and business, and people face many problems in this situation: medical staff face difficulty providing medical assistance to people in need, spreading awareness among people has become difficult, many people need financial help, and the list goes on. As of now, there are some websites and mobile applications to help people fight these problems. Here, we propose a website incorporating a healthcare chatbot for assistance and for tracking the COVID-19 situation. © 2021 CEUR-WS. All rights reserved.
AI works proficiently to emulate human intellect, and it may also play an important role in understanding and recommending the creation of a COVID-19 vaccine. This outcome-driven technology is utilized for effective screening, assessment, forecasting, and tracking of present and potential future patients. Traditional network designs are unable to cope with the impact of COVID-19 due to massive network data traffic and resource-optimization requirements. Given the growing amount of clinical data, artificial intelligence (AI) has the potential to successfully raise the upper limit of the medical and health network. Based on an extensive literature study, we discuss the primary uses of AI technology in suppressing the coronavirus from three main perspectives: prediction, symptom detection, and development. Furthermore, the advancement of next-generation network (NGN) technologies based on machine learning (ML) has offered limitless opportunities for the formation of novel medical approaches. We also discuss the challenges related to AI technologies in combatting COVID-19. The devastating epidemic of the novel coronavirus (COVID-19) has highlighted the importance of accurate predictive mathematical models, so we also discuss different mathematical models, their predictive capabilities, drawbacks, and practical validity.
Due to rapid population growth and industrialization, residents of large cities often face severe traffic congestion during their commutes. This leads to unexpected delays, increased accident risks, fuel wastage, and a decline in public health, particularly in urban areas where pollution exacerbates unsanitary conditions. In response, many smart cities are implementing traffic control systems based on traffic automation principles to mitigate these issues. A key challenge lies in using real-time analytics and online traffic data to efficiently manage traffic flow. To address this, the current research proposes an advanced monitoring system leveraging highly flexible mobile agent technology for intelligent data analytics. In the context of a Vehicular Ad-hoc Network (VANET), the mobile agent incorporates additional features such as crime reduction, accident prevention, enhanced driver flexibility, and improved security. These features are combined with a congestion control algorithm to optimize traffic flow and prevent congestion at the entry points of smart traffic zones. Simulation results using the Ns2 simulator demonstrate significant improvements in reducing delays and preventing accidents caused by heavy traffic.
An automatic street light control system is preferred over conventional street lights to save energy efficiently. It makes use of advanced automatic technologies to light the road. The main consideration of this system is to measure the amount of energy utilized and avoid wastage of electricity when a vehicle passes on the road, because 30-40% of energy is wasted by older street light systems at night. The system lights at high intensity when vehicles or pedestrians pass on the road; otherwise, the lights remain dim. With improvements in technology, such systems are becoming eco-friendly. The advancement of automatic technology in street lights allows optimum use of the lights and new techniques to produce much more efficient devices, and it overcomes the need for human resources. Automation plays a significant role in the world economy and in our day-to-day activities. Our implementation shows that automatic street light control systems give beneficial outcomes in which power is utilized optimally and efficiently. The microcontroller gives instructions to the street lights when an automobile or pedestrian is detected: IR signals are transmitted to the microcontroller, and the microcontroller responds by switching the street lights, so power is used efficiently to light the street lamps. The lights remain dim when no automobile or pedestrian is detected. The system is built around the 8051-family microcontroller AT89S52, which acts on objects detected on the road and performs the necessary operations to switch the street lamps ON or OFF.
ERP, or Enterprise Resource Planning, systems help business management through a well-designed interface that incorporates different programs to integrate and manage all company functions within a company; these suites include applications for human resources, finance and accounting, sales and distribution, project management, materials management, SCM (Supply Chain Management), and quality management. Currently, organizations are racing to improve their ability to survive the global market competition of the 21st century. As organizations try to increase their agility, they change and modify their decision-making processes to make them more efficient and effective in satisfying the successive variations of the market. Different views are gathered regarding the implementation of ERP in manufacturing, and certain essential components of ERP are examined for a better understanding. Ease of use, usefulness, quality, and trust in ERP services are taken as independent variables that affect a user's decision to adopt ERP. The role of ERP technology in manufacturing facilities is broken into further categories for detailed treatment. Quantitative methods were used for questionnaire data analysis, after which interview data were collected. The researcher applied different statistical tools, such as chi-square tests and ANOVA, to analyze the collected data. An essential part of the researcher's task is to analyze and interpret the data, transforming it into a solution to the research question, with some additional future recommendations for higher-quality research.
Deep learning is a very dynamic area in sentiment classification. Text analytics is the process of understanding text, making actionable decisions, and acting on them; be it Amazon Alexa, Siri, or Cortana, all are built on natural language processing. Text-to-speech and speech-to-text generate many data sets every day, and the internet holds the largest repository of data, so it is hard to define what exactly to do with it. Sentiments are the opinions or feelings of the public, usually in sequential form. Many people face difficulty living their daily lives, and some even end their lives simply because they are depressed; the approach here is to help people suffering from depression with an appropriate methodology. Depressika: Early Risk of Depression Detection with Opinions is a web application that detects the early risk of depression from social media posts created by users, using recurrent neural networks (RNNs). This is a classification problem in machine learning (ML). Depressika is built on the waterfall methodology of application development, using Keras, TensorFlow, scikit-learn, and Matplotlib to process the sequential data, with the overall development carried out in the Python programming language.
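The application's model is built with Keras; as a dependency-light sketch, the forward pass of an Elman-style RNN over a sequence (the mechanism that lets the model carry context across the words of a post) looks roughly like this, with random untrained weights standing in for a trained model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy embedded sequence: 6 time steps (e.g., tokens of a post), 8-dim embeddings.
seq = rng.normal(size=(6, 8))

# Elman-style RNN cell parameters (randomly initialised; training is omitted).
hidden = 5
Wx = rng.normal(scale=0.1, size=(8, hidden))
Wh = rng.normal(scale=0.1, size=(hidden, hidden))
b = np.zeros(hidden)

h = np.zeros(hidden)
for x_t in seq:                         # the recurrence carries context
    h = np.tanh(x_t @ Wx + h @ Wh + b)  # from one time step to the next

# A logistic output head turns the final state into a risk score in (0, 1).
w_out = rng.normal(scale=0.1, size=hidden)
risk = 1.0 / (1.0 + np.exp(-(h @ w_out)))
```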
In this white paper we strive to consider the vulnerabilities of one of the most popular big data technology tools, Hadoop. The "elephant" technology is not a bundled product but rather a by-product of the last five decades of technological evolution. The astronomical data of today looks like a potential gold mine, but, like a gold mine, we have only a little gold and much of everything else. Big Data is a trending technology but not a fancy one; it is needed for systems to survive and persist. Critical analysis of historic data thus becomes crucial for competing in the market. In such a state, where data is becoming more and more important to global organizations, illegal attempts are inevitable and need to be checked. Hadoop provides a data-local style of computation in which we move the computation toward the data rather than moving the data toward us. Thus, the confidentiality of data should be monitored by authorities when sharing it within an organization or with third parties, so that it does not get leaked by mistake by naïve employees who have access to it. We propose a technique of introducing a Validation Lamina into the Hadoop system that reviews electronic signatures against an access-control list of concerned authorities while sending and receiving confidential data in an organization. If validation fails, the concerned authorities are urgently notified by the system, and the request is automatically put on hold until the required privacy-governance action is taken by the authorities.
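One plausible shape for the proposed Validation Lamina is an HMAC check against a per-user access-control list; the user names and keys below are illustrative only, and a real deployment would manage keys securely rather than inline.

```python
import hmac
import hashlib

# Hypothetical access-control list: per-user secret keys held by the
# validation layer (names and keys are illustrative, not from the paper).
ACL_KEYS = {"analyst01": b"k1-secret", "admin02": b"k2-secret"}

def sign(user, payload):
    """Sender side: sign a confidential data block with the user's key."""
    return hmac.new(ACL_KEYS[user], payload, hashlib.sha256).hexdigest()

def validate(user, payload, signature):
    """Validation layer: accept the transfer only if the sender is on the
    ACL and the signature matches; otherwise halt and flag for review."""
    if user not in ACL_KEYS:
        return False
    expected = hmac.new(ACL_KEYS[user], payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

ok = validate("analyst01", b"confidential-block",
              sign("analyst01", b"confidential-block"))
rejected = validate("intruder", b"confidential-block", "deadbeef")
```

On a failed validation the system described in the paper would additionally notify the concerned authorities and hold the request.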
Emotional intelligence is a way of knowing how one feels in a situation, and it is used both for oneself and for others. People who have high emotional intelligence can help others who are suffering from distress and can help them regulate their feelings from negative to positive. Emotional intelligence may be an ability that can be enhanced, or it may be a trait that cannot be enhanced and that people have from birth, just like hearing or any other innate ability. This chapter discusses all these aspects. It is said that people with a high emotional quotient possess at least three skills: emotional alertness, the efficacy with which one can recognize and name one's own emotions, and the efficacy with which one can tackle those emotions and apply them.
The incubator is an enclosed device whose inner ecological system is isolated from the surrounding climate. It creates an ideal climate for the incubation process and is an indispensable device for premature infants, as these children are prone to developing health problems that would affect their growth and development. A low-cost, IoT-based autonomous incubation system that creates an ideal environment for premature children to help them grow to maturity would be invaluable, especially in rural areas that lack adequate medical facilities, and would significantly reduce the mortality rate of premature infants. This paper focuses on designing a framework for such a system in which the temperature and humidity are autonomously controlled. A secured incubation system was developed using IBM Cloud, the IBM Watson IoT platform, and MIT App Inventor.
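The autonomous temperature/humidity control rule can be sketched as a simple band controller; the setpoints below are illustrative, not taken from the paper, and a real incubator would use clinically validated ranges.

```python
# Minimal sketch of the autonomous control rule: hold temperature and
# humidity inside a target band (setpoints are illustrative assumptions).
TEMP_RANGE = (36.0, 37.5)       # degrees Celsius
HUMIDITY_RANGE = (60.0, 70.0)   # percent relative humidity

def control(temp, humidity):
    """Return actuator commands for one sensor reading; an empty dict
    means the reading is inside the target band and nothing changes."""
    actions = {}
    if temp < TEMP_RANGE[0]:
        actions["heater"] = "on"
    elif temp > TEMP_RANGE[1]:
        actions["heater"] = "off"
    if humidity < HUMIDITY_RANGE[0]:
        actions["humidifier"] = "on"
    elif humidity > HUMIDITY_RANGE[1]:
        actions["humidifier"] = "off"
    return actions

cold_reading = control(35.2, 65.0)
hot_humid_reading = control(38.0, 75.0)
```

In the described system this loop would run on the device, with readings and commands relayed through the IBM Watson IoT platform to the mobile app.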
This book uncovers the stakes and possibilities involved in realising personalised healthcare services through efficient and effective deep learning algorithms, enabling the healthcare industry to develop meaningful and cost-effective services. This requires effective understanding, application and amalgamation of deep learning with several other computing technologies, such as machine learning, data mining, and natural language processing.
The major objective of text mining (text data mining/text analytics) is to extract patterns or information from the largely unstructured or semi-structured text data that is available. Data mining deals only with structured data, whereas text mining deals with semi-structured or unstructured data. Around 80% of the data stored throughout the globe is in unstructured or semi-structured form, so there is a great need for text mining to manipulate the data in a meaningful way. Many techniques, such as sentiment analysis, natural language processing (NLP), information extraction, information retrieval, clustering, concept linkage, association rule mining (ARM), summarization, and topic tracking, are used to extract the data depending on its nature, and each technique is discussed in this chapter. The major problem in text mining is the ambiguity of natural language: one word can be interpreted in multiple ways, so ambiguity is the primary challenge for researchers to address, and possible solutions are explained. Algorithms such as the genetic algorithm and differential evolution can be combined to get the desired result, and the output can be scaled to ensure the quality of text retrieval. Two measures, precision and recall, are used to measure text-retrieval quality in text mining. Several applications are associated with text mining, such as healthcare, telecommunication, research paper categorization, market analysis, Customer Relationship Management (CRM), banking, Information Technology, and other environments where huge volumes of unstructured data are generated.
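The two retrieval-quality measures named above, precision and recall, are computed as follows for a toy retrieval result; the document identifiers are purely illustrative.

```python
# Worked example of the two text-retrieval quality measures.
retrieved = {"d1", "d2", "d3", "d4"}   # documents the system returned
relevant = {"d2", "d3", "d5"}          # documents that are actually relevant

true_positives = len(retrieved & relevant)
precision = true_positives / len(retrieved)  # fraction of retrieved docs that are relevant
recall = true_positives / len(relevant)      # fraction of relevant docs that were retrieved
```

Here two of the four retrieved documents are relevant (precision 0.5), and two of the three relevant documents were retrieved (recall about 0.67); scaling an algorithm's output, as the chapter suggests, aims to push both measures up.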
Researchers, academicians, and professionals expound in this book on their research into the application of intelligent computing techniques to software engineering. As software systems become larger and more complex, software engineering tasks become increasingly costly and prone to errors. Evolutionary algorithms, machine learning approaches, meta-heuristic algorithms, and other techniques can help improve the efficiency of software engineering.
Chapter Contents: Abstract 17.1 Introduction 17.1.1 Steps to build a machine learning model 17.1.2 Machine learning terminology 17.1.3 ML algorithms 17.2 Literature review 17.2.1 Applications of machine learning in healthcare 17.3 Disease identification and diagnosis 17.3.1 Heart disease 17.3.2 Diabetes 17.3.3 Liver disease 17.3.4 Dengue disease 17.3.5 Hepatitis disease 17.4 Drug discovery and manufacturing 17.5 Electronic health records 17.6 Disease prediction using machine learning 17.7 Fairness 17.7.1 Fairness in the dataset 17.7.2 Fairness in model or algorithm 17.7.3 Fairness in the metrics/results 17.8 Data analytics role in healthcare 17.8.1 Predictive modeling 17.8.2 Reduction in healthcare costs 17.8.3 Empowering advanced chronic disease prevention 17.9 Deep learning applications in healthcare 17.9.1 Drug discovery 17.9.2 Challenges faced by deep learning applications in healthcare 17.10 Conclusion and future scope References
This research assesses classification with a neural network classifier (NNC) against various other classifiers, centred on their working effectiveness. We compared the resulting factors of the NBC with two other algorithms, the ripple-down rule learner (RIDOR) and Simple CART, in order to obtain comparatively efficient and accurate results. NBC performs well on categorical as well as numerical data. Along these lines, we propose a hybrid approach for analysing the accuracy proportion of NNC against NBC, rule-based classifiers, and tree-based classifiers on a heart-disease diagnosis dataset. The algorithms used here are NBC, the ripple-down rule learner (RIDOR), and Simple CART. This work considered a substantial dataset and a distinct methodology for each connected classifier: the NBC calculation is taken as the base methodology, and each methodology, such as RIDOR and Simple CART, is used for comparison in order to predict the heart-disease status of patients.
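The comparison described above can be sketched in scikit-learn; RIDOR and Simple CART are WEKA algorithms not reproduced here, so a decision tree stands in as an approximate tree-based baseline, and the data is synthetic rather than the heart-disease dataset used in the study.

```python
# Rough stand-in for the classifier comparison: a Naive Bayes classifier
# vs. a tree-based classifier on synthetic tabular data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary-classification data (13 features, like many
# heart-disease benchmarks, but randomly generated here).
X, y = make_classification(n_samples=500, n_features=13, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for model in (GaussianNB(), DecisionTreeClassifier(random_state=0)):
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(type(model).__name__, round(acc, 3))
```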
People eat various types of cereals every day in their regular meals, but most do not know their importance while consuming them. Each cereal has its own benefit. One such cereal is the dry bean, a variety of bean produced in pods. Dry beans can be cooked and eaten; they contain numerous proteins and vitamins, are highly beneficial for our health, and provide excellent immunity. Each bean has its own characteristics, which can be identified by its unique features: beans can be oval, kidney-shaped, or even without a proper shape. According to their shape and features, they are classified separately, and these nutrient-rich beans have to be classified based on their shapes. Human eyes may sometimes overlook or misclassify these tiny cereals during classification. This work involves the classification of seven such dry-bean varieties in a deep-learning context. The dataset utilized in this paper includes 13,611 dry-bean samples covering seven different varieties, from the UCI Machine Learning repository. In this work, we have taken seven categorical labels: the dry-bean varieties. The classification is done using deep-learning techniques; here we have utilized the Keras Sequential model, a supervised-learning approach of machine learning used to predict two or more categorical labels. With the help of this deep-learning approach, the dry beans are classified with an accuracy of 94.88%.
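The dense softmax output layer at the heart of such a Keras Sequential classifier can be sketched in plain NumPy; the shapes below (16 input features, 7 bean classes) are illustrative, and the random weights stand in for what a trained model would learn.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Illustrative shapes: 5 samples, 16 morphological features, 7 bean classes.
X = rng.normal(size=(5, 16))
W = rng.normal(size=(16, 7)) * 0.1  # stand-in for trained weights
b = np.zeros(7)

probs = softmax(X @ W + b)   # class probabilities per sample
preds = probs.argmax(axis=1) # predicted bean variety (0..6)
print(preds, probs.sum(axis=1))  # each row of probs sums to 1
```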
Coronavirus attacks have affected countless countries. The death rates in most countries are increasing day by day, and we have attempted to propose many considerations about the principal problems that cause dangerous infections across the globe. In this work, the dietary patterns of 170 countries are considered to identify correlations between diet practices and the death rates, confirmed cases, and recovered cases caused by COVID-19. We have used data on food intake by country, together with data associated with the spread of COVID-19 and other health issues, to gain new insights into the importance of nutrition and eating habits in combatting the spread of infectious diseases. We have built machine-learning regressors, such as the ridge regressor, support vector regression, random forest, and the XGBoost regressor, to predict the mortality rate based on food-intake information and obesity. Two approaches were considered: one with all food-related features taken as parameters, and a simpler one, which reduced the dimensionality by using only two features, animal products and vegetal products. Both have issues (mainly of spread and non-linearity), but we could use different models and metrics. Next, we built a model to predict obesity rates based on the eating habits of each country. The proposed model was far more effective, and the general trend of the data was captured and predicted. We have also used data-visualization approaches to gain better insights into the data considered.
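The ridge regressor mentioned above can be sketched via its closed-form solution; the two-feature setting mirrors the reduced model (animal products, vegetal products), but the numbers below are synthetic, not the study's data.

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    # Closed-form ridge solution: w = (X^T X + alpha*I)^(-1) X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(1)
# Synthetic stand-in for the reduced model: two features per country
# (animal-product and vegetal-product intake) vs. a mortality-rate target.
X = rng.uniform(0, 1, size=(170, 2))
true_w = np.array([0.8, -0.3])  # hypothetical true coefficients
y = X @ true_w + rng.normal(scale=0.01, size=170)

w = ridge_fit(X, y, alpha=0.1)
print(w)  # close to [0.8, -0.3]; alpha shrinks the estimate slightly
```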
This chapter contains sections titled: Introduction Current Fog Applications Security and Privacy in Fog Computing Secure and Private Data Computation Conclusion
This chapter contains sections titled: Checkpoints and Mobility Manual and Seamless Mobility Fine- and Coarse-Grained Mobility Models Migration Freeze Time Device Drivers Self-Migration Conclusion
This chapter contains sections titled: Introduction Related Work RESTful Web Server Why and How REST is More Suitable for IoT Architecture of Arduino-Based Home Automation System Implementation Details Why Arduino? Result Analysis Conclusion and Future Scope
This chapter contains sections titled: Introduction Difficulties from a Cloud Adoption Perspective Security and Privacy Conclusion and Future Work
It is imagined that future cyber-physical systems (CPS) will provide a more convenient living and working space. However, such systems inevitably need to collect and process privacy-sensitive data, which means the benefits come with potential privacy-leakage risks. Nowadays, this privacy issue receives more attention as a legitimate requirement of the users who participate in a CPS ecosystem. In this chapter, privacy-by-design approaches are examined, in which privacy improvement is realised by considering privacy in the physical-layer design. We present a comprehensive study of the current challenges in the field of CPS privacy and throw light on the design principles overlooked by the majority of CPS designers due to their non-exposure to such tactics. This chapter will act as a ready guide for researchers who want to know how to lay the foundations of a privacy-aware CPS architecture.
Advanced information-retrieval techniques are utilized in many healthcare applications and medical services, including emergency hospital management systems, Electronic Medical Records (EMR), automated patient records, medical diagnosis frameworks, and medical imaging systems. Although the functions of the systems mentioned here are quite different, the principal purpose of improving the efficiency and effectiveness of medical practice is the same. In a medical diagnosis framework, the focus is on supporting diagnosis and treatment, which is closely related to methods of computerised reasoning. Information retrieval in healthcare applications is the process of identifying relevant information and recovering it from a storage system through specific procedures. These techniques are used in many varied applications that deal with subjective intelligence. Applications based on information retrieval raise issues in various domains; for example, in the technology domain, targets may change size unexpectedly as they approach a sensor. The present chapter examines and investigates the difficulties related to this new pattern of information retrieval using cognitive intelligence techniques, with a focus on medical sectors.
This chapter contains sections titled: Introduction to Digital Communication Simulation of Rayleigh Fading Model BPSK Modulation and Demodulation QPSK Modulation and Demodulation Image Error Rate vs Signal-to-Noise Ratio Recreation of OFDM Framework Conclusion
The Cyber-Physical System (CPS) has recently gained intense popularity in the digital world. It defines a new generation of digital systems that mainly focus on complex interconnectedness and the amalgamation of the virtual and physical worlds. A CPS comprises highly integrated computation, communication, control, and physical elements. The idea of Industry 4.0 has further surged the demand for CPS as one of its important components. So, in order to understand CPS more precisely, this study presents a detailed survey of the varied CPS application areas, with more focus on features/architecture and related work. There are numerous interdisciplinary areas pertaining to CPS, but this article is limited to Healthcare, Education, Agriculture, Energy Management, Smart Transportation, Smart Manufacturing, and Smart Buildings. Other interesting CPS application areas include Military, Robotics, Decision Making, Process Control, Diagnostics, Aeronautics, and Civil Infrastructure Monitoring.
In this paper, we point out a major issue in the stock market's trading trends: data exactness, accuracy in expressing data, and certainty of values (the day's closing point) are lacking. We use neutrosophic soft sets (NSS), consisting of three factors (truth, uncertainty, and falsity), to deal with the exact state of the data in several directions.
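A neutrosophic value can be represented as a (truth, indeterminacy, falsity) triple; the score function below is one common convention, used here only as an illustrative choice, and the stock assessments are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class NeutrosophicValue:
    truth: float          # T: degree of membership
    indeterminacy: float  # I: degree of uncertainty
    falsity: float        # F: degree of non-membership

    def score(self) -> float:
        # One common scoring convention: (2 + T - I - F) / 3, in [0, 1].
        return (2 + self.truth - self.indeterminacy - self.falsity) / 3

# Hypothetical closing-point assessments for two stocks.
a = NeutrosophicValue(truth=0.7, indeterminacy=0.2, falsity=0.1)
b = NeutrosophicValue(truth=0.4, indeterminacy=0.5, falsity=0.3)
print(a.score() > b.score())  # True: stock a is rated more favourably
```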
A learning algorithm is confronted with the goal of selecting capabilities, and must concentrate on the feature-selection problem. In Brain-Computer Interface (BCI) pattern recognition, machine-learning systems utilise a classifier, but they additionally incorporate feature-extraction and feature-selection procedures to represent the signals in a more compact and appropriate manner. Adaptive common spatial patterns are effective for patients with hearing difficulty and for disabled patients. Adaptive common spatial pattern (CSP) patches refer to the capability of CSP analysis to build small arrangements of channels and combinations of features. Band-power features refer to the signal power in a particular frequency band of a given channel, averaged or computed over a specific time window. Channel-selection strategies rely on measures of association between the features and the target, independent of the classifier used.
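The band-power feature described above can be sketched as the summed squared FFT magnitude within a frequency band; the 10 Hz test tone and the 8–12 Hz alpha band below are illustrative choices, not values from the chapter.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    # Power of `signal` restricted to the [f_lo, f_hi] frequency band,
    # computed from the one-sided FFT power spectrum.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[mask].sum() / len(signal)

fs = 256                        # sampling rate in Hz
t = np.arange(fs) / fs          # one second of samples
sig = np.sin(2 * np.pi * 10 * t)  # pure 10 Hz tone (inside the alpha band)

alpha = band_power(sig, fs, 8, 12)   # captures nearly all the power
beta = band_power(sig, fs, 13, 30)   # near zero for this signal
print(alpha > 100 * beta)  # True
```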
Medical information systems (MIS) are general-purpose, computer-based stores of disease-diagnosis and related medical data that help in investigating medical cures and other patient information. The major purpose of an MIS is to help the decision-maker find the most appropriate medicine and track patient health-related information. MIS data are widely accepted for analytical use in artificial-intelligence research to discover unique patterns of disease symptoms, which may lead to the invention of a cure. General medical information systems store basic information relating to the diseases or diagnoses of patients. Traditional electronic medical-recording systems are a valuable tool for providing knowledge that can serve as a reference for quality decision-making in prescriptions. Specific medical information systems are a special category of information system that provides disease-specific data storage. Textual data is one of the most popular and well-defined syntactic forms of word data for human understanding.
The "Big Four" advancements to come are starting to compel themselves to the fore as associations fiddle with any semblance of artificial intelligence (AI), blockchain (BC), internet of things (IoT), and big data (BD). Be that as it may, as these four springs from their embryonic stages, issues, and concerns are being paid attention to about their development. BC has impeded in its reception and development; however, its application is wide and diverse. In this way, it can profit, and help advantage, any semblance of AI and IoT. Be that as it may, BC can likewise have a critical task to carry out in aiding along with IoT. The development of IoT is as yet moving along, yet it also has hindered on the grounds that many have understood that acing this system of smart 'things' is far harder than they could have envisioned. Issues of security and courses of events of execution have caused the publicity around IoT to cool off. In a comparable vein, this has additionally occurred in BC. It is on the grounds that IoT and BC end up in comparative spots with regards to the selection that their mix might have the capacity to enable each other to out and comprehend a portion of the noteworthy worries that have hounded them exclusively. BC is no more in its early stages, yet it is the latest till now. Comparative proclamations can be done regarding the IoT. The buzz around BC utilization in IoT, be that as it may, is undeniably later. The association of the two skirts on the untested—and right now, the unapplied. In the IoT situation, the square chain and, when all is said in done, peer-to-peer methodologies could assume an essential job in the advancement of decentralized and data-serious applications running on billions of gadgets, saving the security of the clients. In this chapter, we will try to provide a detailed overview about this linkage between BC, Bitcoin, and IoT. We will focus on how IoT problems can be solved using the help of BC technology and vice versa.
This chapter contains sections titled: Introduction to Association Networks Time Series, Stationary, Time Series Decomposition, De-trending Autocorrelation, Test for Independence, Linear Autoregressive Models Mutual Information and Test for Independence Spurious Cross-Correlation, Vector Autoregressive Models and Dynamic Regression Models Conclusion
Mobile applications should be secure, portable, cost-effective, and of good quality. The choice of architecture is very important to ensure the quality of the application over time and to reduce development time. Model-View-Controller (MVC) is very useful for developing applications and has become the most powerful and leading programming architecture for developing applications at large scale. MVC developers can rely on frameworks that are widely accepted as solutions to recurring problems and are used to develop flexible, reusable, and modular applications. In this study, we investigate the advantages of using an MVC framework when building mobile applications available on different devices. The study compares and contrasts two platforms, iOS and Android, and discusses how to apply the MVC framework in order to minimise the inherent differences between the platforms.
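The MVC separation discussed above can be sketched in a platform-neutral way; the class and method names below are hypothetical, and the `View` class stands in for the platform-specific view layers that differ between iOS and Android.

```python
class Model:
    """Holds application state, independent of any platform UI."""
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

class View:
    """Renders state; an iOS or Android view would replace this class."""
    def render(self, items):
        return ", ".join(items)

class Controller:
    """Mediates between model and view, keeping them decoupled."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def add_item(self, item):
        self.model.add(item)
        return self.view.render(self.model.items)

app = Controller(Model(), View())
app.add_item("profile")
print(app.add_item("settings"))  # profile, settings
```

Only the `View` needs swapping per platform; the `Model` and `Controller` are shared, which is the reuse benefit the study attributes to MVC.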
The prelims comprise: Half-Title Page Series Page Title Page Copyright Page Contents Preface Acknowledgment
The prelims comprise: Half Title Publisher Info Title Copyright Dedication Contents List of Figures List of Tables Foreword Preface Acknowledgments Acronyms
The prelims comprise: Half Title Publisher Info Title Copyright Contents List of Figures List of Tables Foreword Preface Acknowledgments Acronyms
When we talk about cybersecurity, one of the threats that can create major damage is ransomware. This attack is a kind of malware; there are various others, such as worms, but ransomware is trending and is one of the most dangerous threats on the network. Ransomware is not a new type of attack; it has simply evolved rapidly over time, and, as studies show, detection techniques have evolved along with it. Machine learning, as well as deep learning, is the approach through which detection becomes better. In this paper we also examine the payment method, which plays an important role in the attack system, and the analysis phase in training the system. We also trace the chain of links from the start to the end of the attack: if someone wants to break the attack, a link or condition in that chain must be broken.
The prelims comprise: Half-Title Page Series Page Title Page Copyright Page Contents Preface Acknowledgment
She is an established author and researcher and has filed three patents in the area of IoT.
The prelims comprise: Half-Title Page Series Page Title Page Copyright Page Dedication Page Table of Contents Preface Acknowledgment
The prelims comprise: Half-Title Page Publisher Page Title Page Copyright Page Dedication Page Table of Contents Preface Acknowledgment
The prelims comprise: Half-Title Page Series Page Title Page Copyright Page Contents Preface Acknowledgment
No Abstract.
This chapter contains sections titled: Cloud Security and Security Appliances VMM in Clouds and Security Concerns Software-Defined Networking Distributed Messaging System Customized Testbed for Testing Migration Security in Cloud A Case Study and Other Use Cases Conclusion
This chapter contains sections titled: Introduction Business Challenge Virtual Machine Migration Virtualization System Live Virtual Machine Migration Conclusion
The prelims comprise: Half Title Publisher Info Title Copyright Contents List of Figures List of Tables Preface Acknowledgments Acronyms Introduction
This chapter contains sections titled: MATLAB Modeling and Simulation Computer Networks Performance Modeling and Simulation Discrete-Event Simulation for MATLAB Simulation Software Selection for MATLAB Simulation Tools Based on High Performance Conclusion
This chapter contains sections titled: Introduction Basic Features Notation, Syntax, and Operations Import and Export Operations Elements Plotting Uncommon Function Executable Files Generation Calling and Accumulating Executable Documents Calling Objects from External Programs JAVA Classes The Guide Effective Programming through MATLAB Clones Process Using MATLAB Parallel MATLAB System Conclusion
This chapter contains sections titled: Cloud Computing Firewalls in Cloud and SDN Distributed Messaging System Migration Security in Cloud Conclusion
This chapter contains sections titled: Introduction Classification of Load Balancing Techniques Policy Engine Load Balancing Algorithm Resource Load Balancing Load Balancers in Virtual Infrastructure Management Software VMware Distributed Resource Scheduler Conclusion
This chapter contains sections titled: Detecting and Preventing Data Migrations to the Cloud Protecting Data Moving to the Cloud Application Security Virtualization Virtual Machine Guest Hardening Security as a Service Conclusion
This chapter contains sections titled: Definition of Data Center Data Center Traffic Characteristics Traffic Engineering for Data Centers Energy Efficiency in Cloud Data Centers Major Cause of Energy Waste Power Measurement and Modeling in Cloud Power Measurement Techniques Power Saving Policies in Cloud Conclusion
This chapter contains sections titled: Introduction VM Checkpointing Enhanced VM Live Migration VM Checkpointing Mechanisms Lightweight Live Migration for Solo VM Lightweight Checkpointing Storage-Adaptive Live Migration Conclusion
This chapter contains sections titled: Virtualization Types of Live Migration Live VM Migration Types Hybrid Live Migration Reliable Hybrid Live Migration Conclusion
This chapter contains sections titled: Evaluation of Granger Causality Measures on Known Systems Demand Modeling and Performance Measurement Universal Algorithms and Sequential Algorithms Acoustic-Centric and Radio-Centric Algorithms AODV Routing Protocol Conclusion
This chapter contains sections titled: Trusted Computing TPM Operations TPM Applications and Extensions TPM Use Cases State of the Art in Public Cloud Computing Security Launch and Migration of Virtual Machines Trusted VM Launch and Migration Protocol Conclusion
This chapter contains sections titled: Live Migration Issues with Migration Research on Live Migration Total Migration Time Graph Partitioning Conclusion
This chapter contains sections titled: Kernel-Based Virtual Machine Xen Secure Data Analysis in GIS Emergence of Green Computing in Modern Computing Environment Green Computing Conclusion
This chapter contains sections titled: Vehicle Network Toolbox Network Management (NM) Interaction Layer Transport Protocols Conclusion
This chapter contains sections titled: Case Determination and Structure Case Study 1: Gas Online Case Study 2 Case Study 3: Random Waypoint Mobility Model Case Study 4: Node localization in Wireless Sensor Network Case Study 5: LEACH Routing Protocol for a WSN Conclusion
The prelims comprise: Half-Title Page Series Page Title Page Copyright Page Contents Preface Acknowledgment
This chapter contains sections titled: Radio Propagation for Shadowing Methods Mobility: Arbitrary Waypoint Demonstrates PHY: SNR-Based Bundle Catches, Communication, Dynamic Transmission Rate and Power NET: Ad Hoc Routing APP: Overlay Routing Protocols Conclusion
This chapter contains sections titled: Introduction to Cloud Computing Common Types of Attacks and Policies Conclusion
Blockchain is a technology that supports secure transactions in a public distributed database. It maintains a peer-to-peer network in which a transaction cannot be modified or tampered with by unauthenticated users, and it provides safe message transfer from a sender to a receiver. Job recommendation is an online system that provides a mapping between the job seeker and the employer. This paper proposes a public blockchain for job recommendations based on incremental hashing. The experiments show that this blockchain job recommendation provides process integrity, traceability, security, high levels of transparency, a drastic reduction in operational cost, and a high, systematic standard. The system has two stages. First, using blockchain technology, the authenticated data is fetched. Second, a classification model using an adaptive neuro-fuzzy inference system is built for mapping the job seeker to the recruiter. This approach proves to be an authenticated as well as a smart job-recommendation system.
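A minimal sketch of the hash-linked chain underlying such a system, using SHA-256 from Python's standard library; the job-posting payloads are hypothetical, and the paper's full incremental-hashing scheme is not reproduced here.

```python
import hashlib, json

def block_hash(block):
    # Hash the block's contents (excluding its own hash field).
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    block = {"index": len(chain), "data": data,
             "prev_hash": chain[-1]["hash"] if chain else "0" * 64}
    block["hash"] = block_hash(block)
    chain.append(block)

def is_valid(chain):
    # Tampering with any block invalidates its hash and every link after it.
    return all(b["hash"] == block_hash(b) and
               (i == 0 or b["prev_hash"] == chain[i - 1]["hash"])
               for i, b in enumerate(chain))

chain = []
add_block(chain, {"seeker": "alice", "job": "data analyst"})  # hypothetical
add_block(chain, {"seeker": "bob", "job": "ML engineer"})
print(is_valid(chain))              # True
chain[0]["data"]["job"] = "intern"  # tamper with an earlier record
print(is_valid(chain))              # False
```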
The prelims comprise: Half-Title Page Title Page Copyright Page Contents Preface Acknowledgement
Today the world is facing many cyber-crimes irrespective of geographical boundaries, and privacy is being compromised all across the globe. According to some assessments, the extent and frequency of data breaches are increasing alarmingly, prompting organizations throughout the world to take action to address what appears to be a worsening situation. In today's world we cannot live without technology, and cyber security is vital for keeping our personal information safe. This chapter aims to improve awareness of technical, privacy, and security infringements and to help protect data by prioritizing the most-attacked sectors. It will help the key audience learn about data leaks and the other ways our privacy and security get compromised due to various challenges, along with diverse up-to-date prevention and detection policies, fresh challenges, favourable answers, and exciting opportunities.
AI works proficiently to emulate human intellect, and it may also play an important role in understanding and recommending the creation of a COVID-19 vaccine. This outcome-driven technology is utilized for effective screening, assessing, forecasting, and tracking of present and potential future patients. Traditional network designs are unable to cope with the impact of COVID-19 due to massive network data traffic and resource-optimization requirements. Given the growing amount of clinical data, artificial intelligence (AI) has the potential to successfully raise the upper limit of the medical and health network. Based on an extensive literature study, we discuss the primary uses of AI technology in suppressing the coronavirus from three main perspectives: prediction, symptom detection, and development. Furthermore, the advancement of next-generation network (NGN) technologies based on machine learning (ML) has opened limitless opportunities for the formation of novel medical approaches. We also discuss the challenges related to AI technologies in combatting COVID-19. The devastating epidemic of the novel coronavirus (COVID-19) has highlighted the importance of accurate predictive mathematical models, so we additionally discuss different mathematical models, their predictive capabilities, their drawbacks, and their practical validity.
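Among the mathematical prediction models discussed, the classic SIR compartmental model is the simplest starting point; the sketch below integrates it with fixed Euler steps, and the β, γ, and population values are illustrative only, not fitted to COVID-19 data.

```python
# Minimal SIR epidemic model:
#   dS/dt = -beta*S*I/N,  dI/dt = beta*S*I/N - gamma*I,  dR/dt = gamma*I
# integrated with fixed Euler steps of size dt.
def sir(n, i0, beta, gamma, days, dt=0.1):
    s, i, r = n - i0, float(i0), 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt   # new infections this step
        new_rec = gamma * i * dt          # new recoveries this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

# Illustrative parameters: population 1e6, 100 initial cases, R0 = beta/gamma = 2.5.
s, i, r = sir(n=1_000_000, i0=100, beta=0.5, gamma=0.2, days=120)
print(round(s + i + r))  # 1000000: the compartments always sum to N
```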
The prelims comprise: Half-Title Page Publisher Page Title Page Copyright Page Table of Contents Preface
Potential competing interests: No potential competing interests to declare. While the study provides valuable insights into the comparative performance of YOLOv8 and Mask R-CNN in instance segmentation tasks within apple orchard environments, there are certain limitations and considerations that should be acknowledged: 1. A thorough proofreading of the document is suggested. 2. The study focused on specific datasets related to apple orchards during the dormant and early growing seasons. The performance of the models may vary when applied to other types of crops, different orchard conditions, or diverse
In the ever-evolving landscape of cybersecurity, the application of game theoretic models has emerged as a powerful and innovative approach to enhance our understanding and management of cyber threats. This abstract explores the application of a variant of game theoretic models within the context of a Cyber Threat Intelligence (CTI) framework. With the proliferation of cyber-attacks targeting critical infrastructure, sensitive data, and national security, it has become imperative to develop proactive and adaptive strategies for threat detection, mitigation, and response. The variant of game theoretic models discussed in this abstract departs from traditional game theory by incorporating elements of dynamic adaptation and machine learning. This adaptation enables the framework to model and analyze the intricate and rapidly changing interactions between threat actors and defenders in real-time, thereby providing a more accurate representation of the evolving threat landscape. By leveraging machine learning algorithms, the model can continuously learn and adapt to new threats and tactics, making it a versatile tool for CTI. This abstract also explores the practical applications of the variant model in various aspects of cybersecurity, including threat actor profiling, vulnerability assessment, and decision support for incident response. By considering the strategic motivations and behaviors of threat actors, organizations can make informed decisions regarding resource allocation, risk assessment, and security investments. The integration of this variant of game theoretic models into CTI holds great potential to revolutionize our approach to cybersecurity, enabling organizations to stay one step ahead of adversaries. As the digital world becomes increasingly complex, the ability to predict, mitigate, and adapt to cyber threats is crucial for safeguarding critical assets and ensuring the resilience of digital infrastructure. 
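The game-theoretic starting point that this variant builds on can be illustrated with a minimal static attacker-defender game; the actions and payoff numbers below are hypothetical, and the dynamic, ML-adaptive elements described in the abstract are not modelled here.

```python
import itertools

# Hypothetical zero-sum payoffs to the attacker: keys are
# (attacker action, defender action) pairs.
payoff = {
    ("exploit", "patch"): -2, ("exploit", "monitor"): 3,
    ("phish", "patch"): 1,    ("phish", "monitor"): -1,
}
attacks = ["exploit", "phish"]
defenses = ["patch", "monitor"]

def is_pure_nash(a, d):
    # Neither side can gain by unilaterally deviating. In a zero-sum game
    # the defender minimises the attacker's payoff, so the defender's best
    # response is the column minimum of the attacker's row.
    best_attack = max(payoff[(x, d)] for x in attacks)
    best_defense = min(payoff[(a, y)] for y in defenses)
    return payoff[(a, d)] == best_attack and payoff[(a, d)] == best_defense

equilibria = [(a, d) for a, d in itertools.product(attacks, defenses)
              if is_pure_nash(a, d)]
print(equilibria)  # [] -- no pure-strategy equilibrium here, so both
                   # sides must randomise (mixed strategies)
```

The empty result is the interesting case for CTI: when no pure equilibrium exists, rational attackers and defenders both randomise, which motivates the probabilistic and adaptive modelling the abstract describes.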
This paper highlights the significance of this innovative approach and its potential to shape the future of cyber threat intelligence and cybersecurity practices.
Potential competing interests: No potential competing interests to declare.