Growing Science » Tag cloud » Machine learning

Journals

  • IJIEC (726)
  • MSL (2637)
  • DSL (649)
  • CCL (508)
  • USCM (1092)
  • ESM (404)
  • AC (562)
  • JPM (247)
  • IJDS (912)
  • JFS (91)
  • HE (26)
  • SCI (26)

Keywords

  • Supply chain management (163)
  • Jordan (161)
  • Vietnam (148)
  • Customer satisfaction (120)
  • Performance (113)
  • Supply chain (108)
  • Service quality (98)
  • Tehran Stock Exchange (94)
  • Competitive advantage (93)
  • SMEs (86)
  • Optimization (84)
  • Financial performance (83)
  • Trust (81)
  • TOPSIS (80)
  • Job satisfaction (79)
  • Sustainability (79)
  • Factor analysis (78)
  • Social media (78)
  • Knowledge Management (77)
  • Genetic Algorithm (76)



Authors

  • Naser Azad (82)
  • Mohammad Reza Iravani (64)
  • Zeplin Jiwa Husada Tarigan (60)
  • Endri Endri (45)
  • Muhammad Alshurideh (42)
  • Hotlan Siagian (39)
  • Jumadil Saputra (36)
  • Dmaithan Almajali (36)
  • Muhammad Turki Alshurideh (35)
  • Barween Al Kurdi (32)
  • Ahmad Makui (32)
  • Basrowi Basrowi (31)
  • Hassan Ghodrati (31)
  • Mohammad Khodaei Valahzaghard (30)
  • Shankar Chakraborty (29)
  • Ni Nyoman Kerti Yasa (29)
  • Sulieman Ibraheem Shelash Al-Hawary (28)
  • Prasadja Ricardianto (28)
  • Sautma Ronni Basana (27)
  • Haitham M. Alzoubi (27)



Countries

  • Iran (2177)
  • Indonesia (1278)
  • Jordan (784)
  • India (782)
  • Vietnam (500)
  • Saudi Arabia (440)
  • Malaysia (438)
  • United Arab Emirates (220)
  • China (182)
  • Thailand (151)
  • United States (110)
  • Turkey (103)
  • Ukraine (102)
  • Egypt (97)
  • Canada (92)
  • Pakistan (84)
  • Peru (83)
  • Morocco (79)
  • United Kingdom (79)
  • Nigeria (77)


1.

A hybrid time series analysis-genetic algorithm-support vector machine model for enhanced landslide prediction | Pages 785-798

Authors: Chao He, Junwen Peng, Wenhui Jiang, Chaofan Wang, Junting Li, Zefu Tan

DOI: 10.5267/j.ijiec.2025.3.005

Keywords: Landslide prediction, Genetic algorithm, Support vector machine, Optimization, Regional analysis, Machine learning

Abstract:
Landslide prediction is a critical task for ensuring public safety and preventing economic loss in regions prone to such natural disasters. Traditional models for landslide prediction often lack accuracy and precision because of the intricate interactions between various factors that lead to landslide events. To tackle this issue, we introduce an innovative hybrid approach for landslide prediction that combines Time Series Analysis (TSA), Genetic Algorithm (GA), and Support Vector Machine (SVM). TSA decomposes landslide displacement data into trend, seasonal, and residual components, improving the clarity of the data. GA optimizes the hyperparameters of the SVM, ensuring its most effective application. Finally, the SVM is trained on detrended data, producing a model capable of accurately predicting future landslides. Our experimental results show that the proposed TSA-GA-SVM model substantially outperforms the individual TSA and SVM models in forecasting landslide displacement. The hybrid model achieved a mean absolute error of 0.15 m compared to 0.42 m for TSA and 0.38 m for SVM alone. Sensitivity analysis revealed that increasing the GA population size improved model stability, while higher mutation rates led to more variable predictions. The model showed good generalization ability, performing well across different regions and under various geological and hydrological conditions. This research not only advances the state of the art in landslide prediction but also provides a practical tool for authorities to implement in their disaster prevention and management strategies.
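The three-stage pipeline the abstract describes (decompose the series, tune by GA, train the SVM on detrended data) can be sketched roughly as follows. Everything here, the synthetic displacement series, the lag features, and the deliberately tiny genetic algorithm, is illustrative only and is not taken from the paper:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic monthly displacement: trend + seasonal + noise (stands in for real data)
t = np.arange(120, dtype=float)
series = 0.05 * t + 0.5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.05, t.size)

# Step 1 (TSA): remove a linear trend; the SVM is trained on the detrended series
trend = np.polyval(np.polyfit(t, series, 1), t)
detrended = series - trend

# Lagged features: predict each value from the previous 6 values
LAGS = 6
X = np.array([detrended[i - LAGS:i] for i in range(LAGS, t.size)])
y = detrended[LAGS:]
split = int(0.8 * len(y))
Xtr, Xte, ytr, yte = X[:split], X[split:], y[:split], y[split:]

def fitness(params):
    """Negative MAE of an SVR with the given (log C, log gamma)."""
    C, gamma = np.exp(params)
    model = SVR(C=C, gamma=gamma).fit(Xtr, ytr)
    return -mean_absolute_error(yte, model.predict(Xte))

# Step 2 (GA): tiny real-coded GA over the SVR hyperparameters
pop = rng.uniform(-3, 3, size=(12, 2))
for _ in range(15):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-6:]]                                # keep best half
    kids = parents[rng.integers(0, 6, 12)] + rng.normal(0, 0.3, (12, 2))  # Gaussian mutation
    pop = np.vstack([parents, kids[:6]])
best = pop[np.argmax([fitness(p) for p in pop])]

# Step 3: final SVM with the GA-chosen hyperparameters
C, gamma = np.exp(best)
final_mae = -fitness(best)
print(f"GA-chosen C={C:.3f}, gamma={gamma:.4f}, test MAE={final_mae:.4f}")
```

The GA here is a minimal selection-plus-mutation loop with no crossover; the paper's actual GA configuration is not specified in the abstract.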

Journal: IJIEC | Year: 2025 | Volume: 16 | Issue: 3 | Views: 239 | Reviews: 0

 
2.

A machine learning framework for exploring the relationship between supply chain management best practices and agility, risk management, and performance | Pages 223-238

Authors: Tyler Ward, Sam Khoury, Selva Staub, Kouroush Jenab

DOI: 10.5267/j.msl.2024.8.001

Keywords: Machine Learning, SCM, Best Practices, SC, Agility, Risk Management

Abstract:
This study provides a comprehensive analysis of supply chain management practices based on survey responses from a sample of enterprises. Through descriptive statistics, hypothesis testing, predictive modeling, and advanced analytics techniques such as classification, clustering, and association rule mining, the research offers valuable insights into key areas of collaboration, quality management, technology adoption, agility, risk management, and customer responsiveness within supply chains. The findings highlight the importance of strategic integration, proactive problem-solving, customer-centric practices, and agility in meeting changing demands. The study also identifies distinct profiles of practice adoption and reveals intricate relationships between different supply chain practices. Overall, the research contributes to a deeper understanding of supply chain dynamics and offers actionable insights for improving operational performance and strategic decision-making.

Journal: MSL | Year: 2025 | Volume: 15 | Issue: 4 | Views: 474 | Reviews: 0

 
3.

Multi-objective optimization of simultaneous buffer and service rate allocation in manufacturing systems based on a data-driven hybrid approach | Pages 707-722

Authors: Shuo Shi, Sixiao Gao

DOI: 10.5267/j.ijiec.2023.8.001

Keywords: Simultaneous allocation, Multi-objective optimization, Data-driven, Machine learning

Abstract:
The challenge presented by simultaneous buffer and service rate allocation in manufacturing systems represents a difficult non-deterministic polynomial problem. Previous studies solved this problem by iteratively utilizing a generative method and an evaluative method. However, it typically takes a long computation time for the evaluative method to achieve high evaluation accuracy, while the satisfactory solution quality realized by the generative method requires a certain number of iterations. In this study, a data-driven hybrid approach is developed by integrating a tabu search–non-dominated sorting genetic algorithm II with a whale optimization algorithm–gradient boosting regression tree to maximize the throughput and minimize the average buffer level of a manufacturing system subject to a total buffer capacity and total service rate. The former algorithm effectively searches for candidate simultaneous allocation solutions by integrating global and local search strategies. The prediction models built by the latter algorithm efficiently evaluate the candidate solutions. Numerical examples demonstrate the efficacy of the proposed approach. The proposed approach improves the solution efficiency of simultaneous allocation, contributing to dynamic production resource reconfiguration of manufacturing systems.
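The core idea, replacing a slow evaluative method with fast learned prediction models, can be illustrated with a gradient-boosted-tree surrogate. The "simulation", the four-station line, and the allocation scheme below are toy stand-ins invented for illustration, not the paper's actual models:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

def slow_simulation(alloc):
    """Toy stand-in for a discrete-event simulation of line throughput:
    throughput rises with buffer space, with diminishing returns."""
    return np.sum(np.log1p(alloc)) + rng.normal(0, 0.01)

def random_alloc():
    """Random feasible allocation of 20 buffer slots over 4 stations."""
    return rng.multinomial(20, [0.25] * 4).astype(float)

# Training data: (allocation -> simulated throughput) pairs
X = np.array([random_alloc() for _ in range(300)])
y = np.array([slow_simulation(a) for a in X])

# Surrogate: gradient-boosted trees predict throughput without re-simulating
surrogate = GradientBoostingRegressor(random_state=0).fit(X, y)

# Rank fresh candidate allocations by predicted throughput (fast evaluation step)
candidates = np.array([random_alloc() for _ in range(50)])
pred = surrogate.predict(candidates)
best = candidates[np.argmax(pred)]
print("best predicted allocation:", best, "predicted throughput:", pred.max())
```

In the paper the candidate generator is a tabu search–NSGA-II hybrid and the surrogate is a whale-optimization-tuned gradient boosting regression tree; here random sampling and default hyperparameters stand in for both.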

Journal: IJIEC | Year: 2023 | Volume: 14 | Issue: 4 | Views: 718 | Reviews: 0

 
4.

Skin cancer detection advancements by employing machine learning and deep learning: A comprehensive review | Pages 687-710

Authors: Rizik M. H. Al-Sayyed, Manar Rizik AlSayyed, AlMuatasim Billah Rizik AlSayyed, Feras Mohammad AlHyari, Barihan Mohammed Khasawneh

DOI: 10.5267/j.ccl.2025.1.003

Keywords: Skin cancer detection, Machine learning, Deep learning, Medical imaging, Computer-aided diagnosis

Abstract:
A thorough analysis of developments in machine learning (ML) and deep learning (DL) technologies for skin cancer diagnosis is provided in this research. It investigates how ML and DL could improve the precision and effectiveness of melanoma, basal cell carcinoma, and squamous cell carcinoma detection. By examining current studies, the paper highlights the use of neural networks, convolutional neural networks (CNNs), support vector machines (SVM), random forests, and k-nearest neighbors (KNN) in the diagnosis of skin cancer. Key findings show that DL models, including VGG, ResNet, and Inception, benefit from large datasets and sophisticated data augmentation strategies to attain high accuracy, sensitivity, and specificity. The paper also discusses the challenges and limitations associated with these technologies, such as the requirement for extensive annotated datasets. The study concludes with a call for collaboration to overcome current challenges and enhance the practical application of ML and DL in skin cancer detection.

Journal: CCL | Year: 2025 | Volume: 14 | Issue: 3 | Views: 497 | Reviews: 0

 
5.

The use of combined machine learning and in-silico molecular approaches for the study and the prediction of anti-HIV activity | Pages 205-232

Authors: Mohamed Ouabane, Zouhir Dichane, Marwa Alaqarbeh, Radwan Alnajjar, Chakib Sekkate, Tahar Lakhlifi, Mohammed Bouachrine

DOI: 10.5267/j.ccl.2024.6.004

Keywords: Anti-HIV, Machine Learning, QSAR, Docking, MD simulation

Abstract:
While the number of AIDS-related deaths continues to rise, efforts have been made to transform the disease into a manageable chronic condition. HIV protease inhibitors have become central to combination therapy. As a result, these inhibitors have become a major focus of anti-HIV drug development. This research takes a data-driven approach to drug development through the use of quantitative structure-activity relationship (QSAR) analysis. A dataset of 450 anti-HIV drugs was used to construct and validate models. Using extensive validation methods and various machine learning algorithms, the results clearly showed that the "ET" regression outperformed the other models (“XGB”, “LGBM”, “DT”, “RF”, “GB”, “Bag”, and “HGB”) in terms of goodness of fit, predictivity, generalizability, and model robustness. Promising compounds were subjected to molecular docking and molecular dynamics simulation, resulting in drugs with favourable pharmacokinetic and pharmacodynamic properties that consistently interact with the therapeutic target.
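The winning "ET" (Extremely Randomized Trees) regression can be sketched with scikit-learn on a synthetic descriptor table. The dataset size mirrors the paper's 450 compounds, but the descriptors and activity values below are random stand-ins, not QSAR data:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Synthetic QSAR-like table: 450 "compounds" x 10 "molecular descriptors",
# with a pIC50-like activity depending nonlinearly on a few descriptors
X = rng.normal(size=(450, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] ** 2 + 0.5 * X[:, 2] * X[:, 3] + rng.normal(0, 0.2, 450)

# "ET" model: Extremely Randomized Trees regression
et = ExtraTreesRegressor(n_estimators=200, random_state=0)

# 5-fold cross-validated R^2 as a quick stand-in for the paper's validation protocol
scores = cross_val_score(et, X, y, cv=5, scoring="r2")
print("CV R^2 per fold:", np.round(scores, 3), "mean:", round(scores.mean(), 3))
```

The paper's fuller protocol also benchmarks XGB, LGBM, DT, RF, GB, Bag, and HGB regressors; swapping the estimator class in `cross_val_score` reproduces that comparison shape.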


Journal: CCL | Year: 2025 | Volume: 14 | Issue: 1 | Views: 333 | Reviews: 0

 
6.

Prediction and optimization models for electrodeposition of different materials: A review | Pages 345-362

Authors: Saeid Kakooei

DOI: 10.5267/j.esm.2025.8.003

Keywords: Electrodeposition, Nanostructured coatings, Machine learning, Prediction models, Optimization algorithms, Surface engineering

Abstract:
Electrodeposition, a fundamental technique in materials science, has been developed to produce nanostructured coatings with improved mechanical, chemical, and physical properties. This study presents a systematic review of prediction and optimization models for electrodeposition processes applicable to various materials. It discusses the theoretical background, such as mechanisms of nucleation and growth, and the key factors influencing the characteristics of coatings. The paper reviews traditional thermodynamic models as well as advanced data-driven techniques, with a special focus on machine learning methods such as artificial neural networks (ANNs), dynamic ANNs (DANNs), and support vector machines (SVMs). The models are validated by the prediction of properties such as hardness, adhesion, and corrosion resistance. We also compare optimization strategies, such as genetic algorithms, particle swarm optimization, and their hybrids, to analyze their capability to improve both coating quality and process efficiency. The developments discussed in this research reflect the growing use of AI and computational approaches, which enable real-time process control, reduce experimental costs, and support the design of high-performance coatings. Emerging trends such as sustainable electrodeposition, electrochemical 3D printing, and the intersection with additive manufacturing are also highlighted. This study shows that predictive and optimization models have the potential to significantly advance electrodeposition technologies for industrial applications.
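A minimal version of the ANN-based property prediction the review covers might look like the following. The process parameters, the "hardness" response surface, and the network size are all invented for illustration, not taken from any of the reviewed studies:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)

# Synthetic electrodeposition runs: current density (A/dm^2), bath temperature (C), pH
n = 400
X = np.column_stack([rng.uniform(1, 10, n),    # current density
                     rng.uniform(20, 60, n),   # bath temperature
                     rng.uniform(2, 6, n)])    # pH
# Invented hardness response with an optimum in current density and pH
hardness = 300 + 40 * X[:, 0] - 3 * X[:, 0] ** 2 + 1.5 * X[:, 1] - 10 * (X[:, 2] - 4) ** 2
hardness += rng.normal(0, 5, n)

# Standardize the target so the small network trains quickly
y = (hardness - hardness.mean()) / hardness.std()

# Scale inputs, then fit a small feed-forward ANN
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0))
model.fit(X[:320], y[:320])
r2 = model.score(X[320:], y[320:])
print(f"held-out R^2 on the synthetic hardness data: {r2:.3f}")
```

The same scaffold extends to adhesion or corrosion resistance by swapping the target column, which is the pattern the reviewed prediction models follow.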

Journal: ESM | Year: 2025 | Volume: 13 | Issue: 4 | Views: 615 | Reviews: 0

 
7.

Using machine learning algorithms with improved accuracy to analyze and predict employee attrition | Pages 1-18

Authors: Fiyhan Alsubaie, Murtadha Aldoukhi

DOI: 10.5267/j.dsl.2023.12.006

Keywords: Machine Learning, Employee Attrition, Improve Model Accuracy, Prediction, Decision Tree, Random Forest, Binary Logistic Regression

Abstract:
Human migration is driven by pull factors that individuals evaluate when deciding whether to move to a different territory. Likewise, employee attrition represents the tendency toward a reduction in the number of employees within an organization. This research paper aims to develop and evaluate machine learning algorithms, namely Decision Tree, Random Forest, and Binary Logistic Regression, to predict employee attrition using the IBM dataset available on Kaggle. The objective is to provide organizations with a proactive approach to employee retention and human resource management by creating accurate predictive models. Employee attrition has significant implications for an organization's reputation, profitability, and overall structure. By accurately predicting employee attrition, organizations can identify the factors contributing to it and implement data-driven human resources management practices. This study contributes to improving decision-making processes, including hiring and firing decisions, and ultimately enhances an organization's capital. The IBM dataset used in this study consists of anonymized employee records and their employment outcomes, providing a comprehensive HR data representation for analysis and prediction. The three algorithms, Decision Tree, Random Forest, and Binary Logistic Regression, were selected for their potential to improve accuracy in predicting employee attrition. The Logistic Regression model yielded the highest accuracy among the tested algorithms, at 87.44%. By leveraging this study's findings, organizations can develop predictive models to identify factors contributing to employee attrition. These insights can inform strategic decisions and optimize human resource management practices.
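The paper's three-model comparison can be sketched with scikit-learn. The data below is a synthetic, class-imbalanced stand-in with the same row count (1,470) and a roughly 16% positive rate as the IBM/Kaggle HR dataset, but otherwise random; accuracies will not match the paper's 87.44%:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for the IBM HR dataset: 1470 rows, binary attrition label
X, y = make_classification(n_samples=1470, n_features=20, n_informative=8,
                           weights=[0.84, 0.16], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "Decision Tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
accuracy = {name: m.fit(Xtr, ytr).score(Xte, yte) for name, m in models.items()}
for name, acc in accuracy.items():
    print(f"{name}: {acc:.2%}")
```

With a roughly 84/16 class split, accuracy alone can be misleading (a majority-class guesser scores 84%), so a real comparison should also report recall or AUC on the attrition class.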

Journal: DSL | Year: 2024 | Volume: 13 | Issue: 1 | Views: 1618 | Reviews: 0

 
8.

A machine learning technique for Android malicious attacks detection based on API calls | Pages 29-44

Authors: Mousa AL-Akhras, Saud Alghamdi, Hani Omar, Hazzaa Alshareef

DOI: 10.5267/j.dsl.2023.12.004

Keywords: Attack Detection, API Calls, Machine Learning, Malware, Android

Abstract:
Android malware is widespread and is considered one of the most serious recent threats. These attacks aim to damage or leak data and information; in general, malicious software comprises viruses, worms, and other malware, and current malware attempts to evade detection by security or anti-virus software. This paper describes recent static and interactive approaches to Android malware detection, as well as several open-source malware datasets. The paper also examines the most current state-of-the-art Android malware identification techniques, identifying through comparative evaluation the gaps between them. As a result, an API-based dynamic malware detection framework is proposed for Android to provide a dynamic paradigm for malware detection. The proposed framework was closely inspected and checked for reliability, and meaningful API packages and methods were discovered.
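API-call-based detection typically encodes each app as a binary vector over an API vocabulary and trains a classifier on labeled samples. The vocabulary, the usage probabilities, and the "apps" below are fabricated for illustration; a real framework such as the one proposed would extract these features from APK traces:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Hypothetical API vocabulary (names chosen for illustration)
API_CALLS = ["sendTextMessage", "getDeviceId", "openConnection", "getInstalledPackages",
             "loadLibrary", "exec", "readContacts", "startService"]
SENSITIVE = 3  # in this toy setup, the first three calls are "sensitive"

# Synthetic apps: each row marks which API calls the app uses (1) or not (0);
# the toy "malware" uses the sensitive calls far more often than benign apps
n = 600
benign = (rng.random((n // 2, len(API_CALLS))) < 0.2).astype(int)
malware = np.hstack([(rng.random((n // 2, SENSITIVE)) < 0.7).astype(int),
                     (rng.random((n // 2, len(API_CALLS) - SENSITIVE)) < 0.2).astype(int)])
X = np.vstack([benign, malware])
y = np.array([0] * (n // 2) + [1] * (n // 2))

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"toy detection accuracy: {acc:.2%}")
```

Inspecting `clf.feature_importances_` against `API_CALLS` mirrors the paper's step of discovering which API packages and methods are meaningful for detection.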

Journal: DSL | Year: 2024 | Volume: 13 | Issue: 1 | Views: 1005 | Reviews: 0

 
9.

Machine learning models for condition-based maintenance with regular truncated signals | Pages 197-210

Authors: Tyler Ward, Kouroush Jenab, Jorge Ortega-Moody

DOI: 10.5267/j.dsl.2023.9.006

Keywords: Condition monitoring, Machine learning, Maintenance Quality Function Deployment (MQFD)

Abstract:
Condition-based maintenance (CBM) of industrial machines depends on the continuous, real-time monitoring of the machine's operational condition via smart sensors attached to different components of the machine. The problem of regularly spaced missing data, which can occur due to a variety of hardware or software issues, is often overlooked in the literature on CBM for industrial machines. Such missing data can obscure the true operational state of the machine, which reduces the effectiveness of CBM processes. In this paper, we examine the capabilities of five data imputation techniques for handling this regular missing data and the impact these techniques have on machine learning (ML) classification algorithms for machine fault diagnosis. The techniques examined are: simple mean imputation, mean imputation with outliers removed, best-case imputation, worst-case imputation, and previous-day imputation. Each method is configured to consider only data from the previous 24 hours, ensuring the data is recent and adequately represents the current state of the machine. We report how accurately each method reconstructs the missing data and its impact on ML classification. The models are evaluated on a real-world dataset using a variety of common performance metrics.
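Two of the five techniques, mean imputation over the previous 24 hours and previous-day imputation, can be sketched on synthetic sensor data with a regularly spaced gap. The signal shape and gap pattern below are invented for illustration, not the paper's dataset:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hourly vibration-like readings for 7 days, with a regular gap every day at hour 12
hours = 24 * 7
signal = 1.0 + 0.2 * np.sin(2 * np.pi * np.arange(hours) / 24) + rng.normal(0, 0.02, hours)
readings = signal.copy()
missing = np.arange(12, hours, 24)          # regularly spaced missing samples
readings[missing] = np.nan

def mean_imputation_24h(x):
    """Replace each NaN with the mean of the previous 24 hours of data."""
    out = x.copy()
    for i in np.flatnonzero(np.isnan(x)):   # ascending, so earlier fills feed later windows
        out[i] = np.nanmean(out[max(0, i - 24):i])
    return out

def previous_day_imputation(x):
    """Replace each NaN with the reading 24 hours earlier (fall back to the running mean)."""
    out = x.copy()
    for i in np.flatnonzero(np.isnan(x)):
        out[i] = out[i - 24] if i >= 24 else np.nanmean(out[:i])
    return out

for name, fn in [("24h mean", mean_imputation_24h), ("previous day", previous_day_imputation)]:
    filled = fn(readings)
    mae = np.abs(filled[missing] - signal[missing]).mean()
    print(f"{name}: MAE on imputed points = {mae:.4f}")
```

Note that because every day's hour-12 sample is missing, previous-day imputation ends up propagating its first fill forward, which is exactly the kind of regular-gap pathology the paper studies.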

Journal: DSL | Year: 2024 | Volume: 13 | Issue: 1 | Views: 708 | Reviews: 0

 
10.

An integrated approach for modern supply chain management: Utilizing advanced machine learning models for sentiment analysis, demand forecasting, and probabilistic price prediction | Pages 237-248

Authors: Issam Amellal, Asmae Amellal, Hamid Seghiouer, Mohammed Rida Ech-Charrat

DOI: 10.5267/j.dsl.2023.9.003

Keywords: Supply Chain Management, Demand Forecasting, Sentiment Analysis, Price prediction, Machine Learning, Probabilistic Models

Abstract:
In the contemporary business landscape, effective interpretation of customer sentiment, accurate demand forecasting, and precise price prediction are pivotal in making strategic decisions and efficiently allocating resources. Harnessing the vast array of data available from social media and online platforms, this paper presents an integrative approach employing machine learning, deep learning, and probabilistic models. Our methodology leverages the BERT transformer model for customer sentiment analysis, the Gated Recurrent Unit (GRU) model for demand forecasting, and the Bayesian Network for price prediction. These state-of-the-art techniques are adept at managing large-scale, high-dimensional data and uncovering hidden patterns, surpassing traditional statistical methods in performance. By bridging these diverse models, we aim to furnish businesses with a comprehensive understanding of their customer base and market dynamics, thus equipping them with insights to make informed decisions, optimize pricing strategies, and manage supply chain uncertainties effectively. The results demonstrate the strengths and areas for improvement of each model, ultimately presenting a robust and holistic approach to tackling the complex challenges of modern supply chain management.
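The Bayesian-network component can be illustrated with a toy discrete fragment, sentiment -> demand -> price. All states and probability tables below are invented and bear no relation to the paper's actual network; the BERT and GRU components would require deep learning frameworks and are not sketched here:

```python
import numpy as np

# Toy discrete Bayesian-network fragment. States: index 0 = low, 1 = high.
p_demand_given_s = np.array([[0.7, 0.3],   # P(demand | sentiment=low)
                             [0.2, 0.8]])  # P(demand | sentiment=high)
p_price_given_d = np.array([[0.8, 0.2],    # P(price | demand=low)
                            [0.3, 0.7]])   # P(price | demand=high)

def price_distribution(sentiment_state):
    """Marginalize demand out: P(price | s) = sum_d P(price | d) * P(d | s)."""
    return p_demand_given_s[sentiment_state] @ p_price_given_d

for s, label in enumerate(["low", "high"]):
    dist = price_distribution(s)
    print(f"sentiment={label}: P(price=low)={dist[0]:.2f}, P(price=high)={dist[1]:.2f}")
```

With these illustrative tables, improving sentiment shifts probability mass toward the high-price state, which is the kind of uncertainty-aware price prediction the abstract attributes to its Bayesian Network.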

Journal: DSL | Year: 2024 | Volume: 13 | Issue: 1 | Views: 1791 | Reviews: 0

 

© 2010-2025 GrowingScience.Com