African Maintenance Engineering

PREDICTIVE MAINTENANCE OF POWER TRANSFORMER USING MACHINE LEARNING - A CASE STUDY

Richmond Ofori Amaning
Published: 2024-06-10
Correspondence: Richmond Ofori Amaning, oforirichmond209@gmail.com, Asante Gold Corporation
Compares three ML models for transformer fault prediction
Uses dissolved gas analysis data for predictive maintenance
Achieves over 95% accuracy with SVM and KNN classifiers
Demonstrates practical ML implementation with Python tools
Abstract

The importance of power transformers in electrical power systems cannot be overstated, as their failures can lead to considerable economic losses and disruptions. The typical malfunctions encountered by a power transformer comprise dielectric issues, thermal losses due to copper resistance, distortions in winding caused by mechanical faults, failure of bushings, malfunction of tap changers, core malfunction, tank malfunction, failure of the protection system, and failure of the cooling system. Traditional methods for transformer fault detection involve using the ratio of key gases present in the transformer oil when a fault occurs. These gases include Hydrogen (H2), Methane (CH4), Ethane (C2H6), Ethylene (C2H4), Ethyne (C2H2), Carbon Monoxide (CO) and Carbon Dioxide (CO2). For accurate and early detection of faults, traditional methods require complex algorithms. This project focuses on the predictive maintenance of power transformers using machine learning techniques, aiming to identify and address potential faults pre-emptively. By analysing various fault types and leveraging machine learning techniques such as Decision Trees, Support Vector Machines (SVM), and K-Nearest Neighbour (KNN), the project develops models that predict transformer failures based on historical data. DataSpell software and Python libraries such as NumPy and Matplotlib were used to train the model. The testing results showed the efficiency of the SVM, KNN, and Decision Tree methods in detecting the faults experienced by the power transformer. The testing accuracy for the SVM, KNN and Decision Tree models was 95.65%, 95.65% and 89.13%, respectively. It was observed that the SVM and KNN models performed better than the Decision Tree model.

Full Text

UNIVERSITY OF MINES AND TECHNOLOGY (UMaT), TARKWA
FACULTY OF ENGINEERING
DEPARTMENT OF ELECTRICAL AND ELECTRONIC ENGINEERING

A PROJECT ENTITLED
PREDICTIVE MAINTENANCE OF POWER TRANSFORMER USING MACHINE LEARNING - A CASE STUDY

BY AMANING RICHMOND OFORI

SUBMITTED IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE AWARD OF THE DEGREE OF BACHELOR OF SCIENCE IN ELECTRICAL AND ELECTRONIC ENGINEERING

PROJECT SUPERVISOR …..……………………………… DR JOSEPH C. ATTACHIE

TARKWA, GHANA
AUGUST, 2024

DECLARATION

I declare that this project work is my own work. It is being submitted for the degree of Bachelor of Science in Electrical and Electronic Engineering in the University of Mines and Technology (UMaT), Tarkwa. It has not been submitted for any degree or examination in any other University.

………………………… (Signature of Candidate)
………day of ……………………….., 2024.

ABSTRACT

The importance of power transformers in electrical power systems cannot be overstated, as their failures can lead to considerable economic losses and disruptions. The typical malfunctions encountered by a power transformer comprise dielectric issues, thermal losses due to copper resistance, distortions in winding caused by mechanical faults, failure of bushings, malfunction of tap changers, core malfunction, tank malfunction, failure of the protection system, and failure of the cooling system. Traditional methods for transformer fault detection involve using the ratio of key gases present in the transformer oil when a fault occurs. These gases include Hydrogen (H2), Methane (CH4), Ethane (C2H6), Ethylene (C2H4), Ethyne (C2H2), Carbon Monoxide (CO) and Carbon Dioxide (CO2). For accurate and early detection of faults, traditional methods require complex algorithms. This project focuses on the predictive maintenance of power transformers using machine learning techniques, aiming to identify and address potential faults pre-emptively.
By analysing various fault types and leveraging machine learning techniques such as Decision Trees, Support Vector Machines (SVM), and K-Nearest Neighbour (KNN), the project develops models that predict transformer failures based on historical data. DataSpell software and Python libraries such as NumPy and Matplotlib were used to train the model. The testing results showed the efficiency of the SVM, KNN, and Decision Tree methods in detecting the faults experienced by the power transformer. The testing accuracy for the SVM, KNN and Decision Tree models was 95.65%, 95.65% and 89.13%, respectively. It was observed that the SVM and KNN models performed better than the Decision Tree model.

DEDICATION

This work is dedicated to my parents, Mr Tawiah Maxwell Ebu and Mrs Cecilia Nyarkoa, and my sister, Ms Alberta Agyapomaa Tawiah, for their support and unconditional love throughout my life.

ACKNOWLEDGEMENTS

My ultimate gratitude goes to God Almighty for giving me the insight and strength in all the days I spent to acquire my first degree. I am also grateful to my supervisor, Dr Joseph C. Attachie, for his maximum attention, time, and guidance while supervising this work. Finally, I appreciate my parents and siblings for their support throughout my study.
TABLE OF CONTENTS

DECLARATION
ABSTRACT
DEDICATION
ACKNOWLEDGEMENTS
TABLE OF CONTENTS
LIST OF FIGURES
LIST OF TABLES
LIST OF ABBREVIATIONS
INTERNATIONAL SYSTEM OF UNITS (SI UNITS)

CHAPTER 1 GENERAL INTRODUCTION
1.1 Research Background
1.2 Problem Definition
1.3 Project Objectives
1.4 Methods Used
1.5 Facilities Used
1.6 Work Organisation

CHAPTER 2 LITERATURE REVIEW
2.1 Introduction
2.2 Power Transformer
2.2.1 Principle of Operation
2.2.2 Types of Transformers
2.3 Power Transformer Faults
2.4 DataSpell Software for Machine Learning
2.5 Review of Classification Models
2.5.1 Decision Tree
2.5.2 Support Vector Machine (SVM)
2.5.3 K-Nearest Neighbour (KNN)
2.6 Traditional Methods of Fault Detection in Transformers
2.7 Review of Related Works on Transformer Fault Detection
2.8 Summary of Review of Related Works on Transformer Fault Detection

CHAPTER 3 METHODS USED
3.1 Introduction
3.2 Data Acquisition and Pre-Processing
3.3 Model Selection and Development
3.4 Model Evaluation
3.5 Classification Result
3.6 Performance Metrics

CHAPTER 4 RESULTS AND DISCUSSIONS
4.1 Introduction
4.2 Computational Results for Decision Tree Classifier
4.3 Computational Results for SVM Model
4.4 Computational Results for KNN Classifier
4.5 Comparison of Accuracies of Classifiers
4.6 Comparison of Receiver Operating Characteristics Curves
4.7 Comparison of Precision-Recall Curves
4.8 Summary of Findings

CHAPTER 5 CONCLUSIONS AND RECOMMENDATIONS
5.1 Conclusions
5.2 Recommendations
5.3 Future Works

REFERENCES
APPENDICES
APPENDIX A DATASET USED
APPENDIX B CODES FOR DEVELOPING MODELS

LIST OF FIGURES
2.1 A Transformer
2.2 Transformer Working Principle
3.1 A Flow Chart of the Methodology
3.2 A Graphical Representation of the Distribution of the Faults in the Dataset
4.1 ROC Curves
4.2 PR Curves
LIST OF TABLES
2.1 Fault Classification According to IEC 60599 and IEEE C57.104 Standard
4.1 Classification Result and Confusion Matrix for Decision Tree Classifier
4.2 Classification Result and Confusion Matrix for SVM Classifier
4.3 Classification Result and Confusion Matrix for KNN Classifier
4.4 Comparison of Classification Accuracies

LIST OF ABBREVIATIONS
AC Alternating Current
ANN Artificial Neural Network
AP Average Precision
AUC Area Under Curve
DGA Dissolved Gas Analysis
ECG Electricity Company of Ghana
EMF Electromotive Force
FN False Negative
FP False Positive
IDE Integrated Development Environment
IoT Internet of Things
KNN K-Nearest Neighbour
PD Partial Discharge
PDM Predictive Maintenance
PR Precision-Recall
ROC Receiver Operating Characteristic
RUL Remaining Useful Life
SVM Support Vector Machine
TN True Negative
TP True Positive

LIST OF SYMBOLS
Carbon Dioxide CO2
Carbon Monoxide CO
Ethane C2H6
Ethylene C2H4
Ethyne C2H2
Methane CH4
Hydrogen H2
High Energy Discharge D2
High-Temperature Thermal Fault T3
Low Energy Discharge D1
Low-Temperature Thermal Fault T1
Medium-Temperature Thermal Fault T2

INTERNATIONAL SYSTEM OF UNITS (SI UNITS)
Apparent Power: volt-ampere (VA)
Electrical Voltage: volt / kilovolt (V / kV)
Temperature: degree Celsius (°C)

CHAPTER 1
GENERAL INTRODUCTION

1.1 Research Background

The Electricity Company of Ghana (ECG) functions within different sectors, providing electricity to homes, industries and businesses. The power transformer is one of the essential pieces of equipment utilised by ECG, guaranteeing the effective conveyance and dispersion of electrical power throughout the grid. After generation, power transformers step up the voltage for economical transmission, reducing power losses. Power transformers are static machines with very high efficiency.
The power transformer plays a crucial role in the operation of a power grid, and its failure can significantly impact the grid's safe and stable operation and lead to significant economic losses (Duan and Wang, 2023). Like all other electrical devices, faults may also occur in the transformer, causing failures (Sudha et al., 2022). Failures that frequently happen in power transformers include dielectric faults, thermal losses due to copper resistance, winding distortion leading to mechanical faults, bushing failure, tap changer failure, core failure, tank failure, protection system failure, and cooling system failure. The goal of predictive maintenance methods is to assess the operating condition of equipment and predict when it is prone to failure so that proactive maintenance can be scheduled (Amer et al., 2023). The application of machine learning for equipment prognostics has become more common since 2010, with significant progress in fault classification and limited progress in predicting Remaining Useful Life (RUL) (Howard, 2022). Machine learning applications provide benefits like predicting and preventing equipment failures, cutting maintenance expenses, minimising repair downtimes, improving plant safety and security, boosting production, and offering numerous other advantages (Amer et al., 2023).

1.2 Problem Definition

Power transformers are essential equipment for ECG's power grid system. Transformer breakdown is one of the main reasons why a power system operation may be interrupted. Transformer maintenance is, therefore, highly essential but tedious. The abrupt or unforeseen malfunction of such machinery will not just impact power generation but also result in additional expenses for maintenance and repairs, as well as potential casualties and societal repercussions in serious situations.
Traditional maintenance approaches, such as reactive maintenance, respond only after a failure has occurred and are time-consuming, posing challenges for ECG in ensuring a reliable power supply. The reliable assessment of a power transformer's condition using machine learning algorithms is an important challenge that needs to be addressed promptly. Recent developments in information technologies and communication networks have made it possible for machines to collect and analyse large volumes of data and perform operational environmental tasks to detect and investigate machine failures (Amer et al., 2023). Predictive Maintenance (PDM) enables the enhancement of high-quality strategies for smart preventative maintenance using gathered data collections. The goal of this project is to develop a model for predicting the health status of equipment, carry out the prediction of equipment health status, and proactively implement maintenance actions based on the predicted equipment status, anticipating "status repair" of equipment and investigating an innovative approach to equipment maintenance management.

1.3 Project Objectives

The objectives of this project are to:
i. Investigate machine learning algorithms that can be suitable for predictive maintenance applications; and
ii. Build a model based on machine learning techniques for the predictive maintenance of a power transformer.

1.4 Methods Used

The methods employed include:
i. Review of relevant literature;
ii. Model development and implementation; and
iii. Comparing the results of the developed model with existing models using standard metrics.

1.5 Facilities Used

The facilities employed are:
i. The University Library;
ii. Internet facilities; and
iii. A laptop computer with the latest version of the Python programming language installed and DataSpell, a data science integrated development environment (IDE).

1.6 Work Organisation

This project is organised into five chapters.
Chapter 1 gives a general introduction that deals with the background to the research, the problem under study, project objectives, methods used, facilities used, and work organisation. Chapter 2 examines significant literature in the subject area, considering definitions, examples, and clarifications of the theoretical framework related to the study. This chapter forms the basis of the whole project, equipping the reader with the essential information needed to comprehend the approach and the chapters that follow. Chapter 3 discusses the methods, including implementing the proposed machine learning techniques. Chapter 4 presents the results and their discussions. Chapter 5 gives the conclusions and recommendations.

CHAPTER 2
LITERATURE REVIEW

2.1 Introduction

Power transformer failures can lead to significant economic losses and disruptions. To mitigate these risks, predictive maintenance strategies have gained prominence. Machine learning has become a powerful tool for analysing complex data patterns associated with transformer health. By leveraging this technology, utilities can accurately predict potential failures, optimise maintenance schedules and prevent costly breakdowns. This proactive approach aligns with the industry's growing emphasis on reliability and efficiency. This chapter discusses power transformers, reviews power transformer faults, reviews software for machine learning, reviews machine learning techniques, and reviews related works.

2.2 Power Transformer

A power transformer efficiently transfers electrical power between circuits without altering the frequency. Its operation is based on electromagnetic induction. The purpose of power transformers is to increase or decrease the voltage of an alternating current (AC). They are considered static devices because they have no moving parts. A power transformer is a type of transformer that operates within a voltage range of 33 to 400 kV and has a rating higher than 200 MVA.
Voltage ratings of power transformers on the market range from 33 kV to 400 kV. The other transformer classes are distribution transformers (230 V - 11 kV) and instrument transformers. Figure 2.1 (Anon., 2024a) shows a transformer.

Figure 2.1 A Transformer

2.2.1 Principle of Operation

Faraday's principle of electromagnetic induction is the fundamental concept behind how a transformer operates. This fundamental law of electromagnetism explains the working principle of all transformers and inductors. Faraday's law asserts that an electromotive force (emf) will be generated across a closed loop when it is brought close to a changing magnetic field (Anon., 2024b). When an alternating current passes through a coil or primary winding, it generates a changing magnetic flux around it. This magnetic flux, which is produced by the primary winding, travels through a ferromagnetic core to reach a secondary winding. As a result of electromagnetic induction, the magnetic flux induces an electromotive force (emf) in the secondary winding. This induced emf then drives the current flow in the secondary winding. Figure 2.2 shows the working principle of a transformer (Anon., 2024b).

Figure 2.2 Transformer Working Principle

2.2.2 Types of Transformers

Different factors can be used to categorise power transformers, including their construction, application, and function. Standard classifications include turns ratio, phases, core material, and core and winding construction and arrangement.

Turns ratio

Step-up and step-down transformers: These transformers increase or decrease the voltage level of an AC supply. In a step-up transformer, there are more turns in the secondary winding than in the primary winding. Conversely, in a step-down transformer, there are fewer turns in the secondary winding compared to the primary winding (Anon., 2024c).
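The step-up and step-down behaviour just described follows the ideal-transformer turns-ratio relation Vs / Vp = Ns / Np. A minimal numerical sketch (the voltage and turns figures below are illustrative, not taken from the thesis):

```python
# Ideal-transformer relation: Vs / Vp = Ns / Np.
# The specific numbers below are made up for illustration only.

def secondary_voltage(v_primary: float, n_primary: int, n_secondary: int) -> float:
    """Secondary voltage of an ideal transformer from its turns ratio."""
    return v_primary * n_secondary / n_primary

# Step-up example: 33 kV primary with a 100:1000 turns ratio -> 330 kV secondary.
print(secondary_voltage(33_000, 100, 1_000))   # 330000.0

# Step-down example: 11 kV primary with a 1000:20 turns ratio -> 220 V secondary.
print(secondary_voltage(11_000, 1_000, 20))    # 220.0
```

A real transformer departs slightly from this relation because of winding resistance and core losses, which is one reason thermal faults (Section 2.3) arise in practice.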
Isolation transformer: This type transfers electrical power between two circuits without changing the frequency while providing galvanic isolation. An isolation transformer has an equal number of turns in its primary and secondary windings (Anon., 2024d).

Phases

Single-phase and three-phase transformers: A single-phase transformer has one primary winding and one secondary winding, producing a single alternating voltage in the form of a sine wave, while a three-phase transformer features three pairs of primary and secondary windings, interconnected in either a star or delta configuration (Anon., 2024e).

Autotransformer: This type of transformer has only one winding. The primary and secondary coils are linked electrically and wound around a common core. They find wide applications in induction motors, railway systems, audio, and lighting (Anon., 2024f).

Core material

Air-core transformers: Air-core transformers do not have a physical core, and their primary and secondary windings are wrapped in a solid insulating material. They are utilised to carry radio currents (Anon., 2019).

Ferrite core transformers: Transformers with ferrite cores have cores composed of ferrite, a magnetic ceramic material containing iron oxide. Manganese-zinc ferrite and nickel-zinc ferrite are ferrites commonly used in transformers (Anon., 2019).

Iron core transformers: Iron core transformers contain a magnetic core of laminated iron sheets. This type of transformer is the most widely used in its category. They demonstrate powerful magnetic properties, resulting in a high flux linkage (Anon., 2019).

Toroidal core transformers: Transformers with toroidal cores feature a core made from iron or ferrite and have a torus or doughnut shape. Compared to traditional shell and core transformers, they provide increased design flexibility, efficiency, and space-saving characteristics (Anon., 2019).
Core and winding construction and arrangement

Berry-type transformers: Berry-type transformers feature a core configuration resembling the spokes of a wheel. They have distributed magnetic paths with more than two distinct magnetic pathways (Anon., 2020).

Core-type transformers: These transformers have primary and secondary windings that encircle the core. The core is constructed by joining two L-shaped steel strips and stacking them. The positioning of the strips is designed to eliminate uninterrupted joints and avoid high reluctance at the joints. The limbs and the yoke carry the complete flux (Anon., 2020).

Shell-type transformers: The core in shell-type transformers surrounds the primary and secondary windings. It is constructed by layering E-shaped and I-shaped steel strips to create the central and side limbs. The central limb carries the complete magnetic flux, while the side limbs each carry half (Anon., 2020).

2.3 Power Transformer Faults

When power transformers fail, gas molecules of different densities form. These gas molecules are formed at various temperatures and with differing formation energies. The type and amount of gases produced during the fault vary, making them useful for fault detection. Dissolved Gas Analysis (DGA) is an effective method for detecting gases present in transformer insulation liquid. Utilising the chromatograph method, DGA measures the quantity of each gas in the insulating liquid for fault detection. The DGA method for transformer fault analysis utilises gases such as Hydrogen (H2), Carbon Monoxide (CO), Carbon Dioxide (CO2), Methane (CH4), Ethane (C2H6), Ethylene (C2H4) and Ethyne (C2H2). Partial discharges, thermal overheating, and arcing are the primary power transformer faults that can be consistently identified during a visual inspection (Nanfak et al., 2021). These faults can further be classified into the six types listed in Table 2.1.
Table 2.1 Fault Classification According to IEC 60599 and IEEE C57.104 Standard

Acronym  Fault Type
PD       Partial Discharge
D1       Low Energy Discharge
D2       High Energy Discharge
T1       Low-Temperature Thermal Fault, T < 300 °C
T2       Medium-Temperature Thermal Fault, 300 °C < T < 700 °C
T3       High-Temperature Thermal Fault, T > 700 °C

(Source: Nanfak et al., 2021)

Partial discharge

Partial Discharge (PD) is an electrical discharge that partially bridges the insulation between conductors. The discharge can occur when the electrical stress exceeds the breakdown strength of a specific portion of the insulation system. The occurrence of PDs results in the deterioration of the transformer insulation system (Meitei et al., 2021).

Low energy discharge

Low-energy discharge (D1) refers to partial discharges that release relatively little energy compared to other discharges. Despite their lower energy, these discharges can still cause significant damage to transformer insulation over time.

High energy discharge

High-energy discharges (D2) in transformers are partial discharges that release significant energy. If not promptly addressed, these discharges can cause immediate and severe damage to the insulation system and lead to catastrophic failure.

Low-temperature thermal fault

Low-temperature thermal faults (T1) in transformers refer to conditions where the temperature of the insulation material and transformer oil rises abnormally but to relatively lower temperatures compared to more severe faults. T1 thermal faults typically involve temperatures up to 300 °C and are often associated with overheating due to poor cooling, overloading, or localised heating effects.

Medium-temperature thermal fault

Medium-temperature thermal faults (T2) in transformers usually range from 300 °C to 700 °C and are considered moderate to high temperatures. These faults indicate more severe overheating conditions than T1 thermal faults and can cause significant degradation of the transformer's insulation and oil.
High-temperature thermal fault

High-temperature thermal faults (T3) in transformers refer to temperatures exceeding 700 °C. These severe faults indicate extremely high overheating, which leads to substantial degradation of the insulation system and transformer oil and a high risk of transformer failure.

2.4 DataSpell Software for Machine Learning

DataSpell, a robust IDE developed by JetBrains, is designed to improve productivity and streamline workflows for data science and machine learning tasks. It is particularly effective for predictive functions like classification, providing a comprehensive suite of tools and features tailored to the needs of data scientists and machine learning practitioners. One of DataSpell's key features is its support for Jupyter Notebooks. In this interactive environment, users can write and run code, visualise data, and document workflows all within one interface. Jupyter Notebooks are invaluable for exploratory data analysis and iterative model development, making it easier to test different hypotheses and approaches for classification tasks. DataSpell provides strong support for Python, the primary language for machine learning. This support includes advanced functionalities such as smart code suggestions, syntax highlighting, and debugging tools. These features help data scientists write cleaner, more efficient code and quickly identify and resolve errors. The IDE also supports various Python libraries and frameworks commonly used in machine learning, such as Scikit-learn, TensorFlow, and PyTorch, facilitating seamless integration into classification workflows. Version control integration is another significant advantage of DataSpell. Users can effortlessly monitor changes, collaborate with team members, and manage multiple versions of their projects thanks to the integrated support for Git and other version control systems.
This is particularly useful in machine learning projects, where iterative improvements and experimentation are expected. Data visualisation is crucial for understanding data distributions, feature importance, and model performance. DataSpell provides built-in support for popular visualisation libraries like Matplotlib, Seaborn, and Plotly, allowing users to create detailed and informative visualisations. These visualisations help interpret the results of classification models and communicate findings to stakeholders. DataSpell offers a comprehensive environment for developing, testing, and deploying machine learning models, particularly for classification tasks. The combination of Jupyter Notebooks, strong Python support, version control integration, and data visualisation capabilities makes it an essential tool for data scientists and machine learning experts who want to boost their efficiency and achieve more precise predictive results (Anon., 2024g).

2.5 Review of Classification Models

Classification is a supervised machine learning method in which the model attempts to predict the correct label for a given input dataset. In classification, the model is trained using the training data, assessed on test data, and then used to predict new, unseen data. The initial phase is "training", where the model is established and the classification rules are determined from a dataset containing known labels. The subsequent step is "testing", which involves evaluating the model's performance using other datasets with known labels. If the criteria are satisfied, the model is accepted; if not, the initial phase is repeated (Anon., 2022).

Lazy learners and eager learners are two types of classification algorithms. Eager learners are machine learning algorithms that construct a model from the training dataset and then make predictions for future datasets. An example of an eager learning algorithm is the Decision Tree.
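The train-then-test workflow described in Section 2.5 can be sketched with scikit-learn, one of the libraries mentioned in Section 2.4. This is an illustrative sketch only: the seven synthetic "gas concentration" features and the toy fault labels below are placeholders, not the thesis dataset.

```python
# Sketch of the classification workflow: train on labelled data, evaluate on a
# held-out test set, and accept the model if the accuracy criterion is met.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((200, 7))   # 7 columns standing in for H2, CH4, C2H6, C2H4, C2H2, CO, CO2
y = (X[:, 0] + X[:, 4] > 1.0).astype(int)   # toy rule standing in for fault labels

# "Training" phase: fit on labelled data.  "Testing" phase: held-out evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
model = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"test accuracy: {acc:.2f}")   # accept the model only if this meets the criterion
```

Swapping `DecisionTreeClassifier` for `SVC` or `KNeighborsClassifier` reuses the same workflow for the other two models reviewed below.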
Lazy or instance-based learners do not generate a model immediately from the training data. Instead, they store the training data and locate the closest neighbours in the complete training data every time a prediction is required, which can make them slow at prediction time. An example of a lazy learner algorithm is K-Nearest Neighbour (KNN). The classification methods used for this research are Decision Trees, SVM and KNN.

2.5.1 Decision Tree

A decision tree is a strong supervised learning method used for classification and regression tasks. It organises data into a tree-like model, with internal nodes indicating attribute tests, branches representing test outcomes, and leaf nodes indicating class labels or continuous values. Decision trees are valued for their straightforwardness, interpretability, and capacity to manage both numerical and categorical data. They excel in feature selection, managing non-linear relationships, and handling datasets with missing values. Decision trees are extensively utilised in various domains because they can handle complex decision-making processes and provide precise, interpretable models (Doe and Smith, 2023).

2.5.2 Support Vector Machine

A Support Vector Machine (SVM) is a supervised learning method used for classification and prediction tasks. It operates by identifying the best hyperplane to separate classes, maximising the distance between the nearest points of different classes. SVMs are efficient in high-dimensional spaces and are less susceptible to overfitting due to their emphasis on maximising the margin. They also employ the kernel trick to handle non-linear classification. However, SVMs can be demanding in terms of computational resources and require careful parameter tuning.
Generally, SVMs provide superior generalisation but can be less explainable and more resource-intensive compared to other algorithms such as decision trees and logistic regression (Johnson and Wang, 2024).

2.5.3 K-Nearest Neighbour

The KNN algorithm is a straightforward, non-parametric technique utilised for both classification and regression. It categorises data points by taking a vote from their nearest neighbours. KNN's strength lies in its simplicity and effectiveness when dealing with small datasets, while its weakness is its high computational intensity when handling large datasets. In comparison to algorithms such as decision trees and SVMs, KNN is straightforward to implement but may be slower and less precise when dealing with high-dimensional data (Lee and Brown, 2023).

2.6 Traditional Methods of Fault Detection in Transformers

Traditional fault detection methods based on DGA data are known as gas ratio methods. These techniques make use of key gas ratios to diagnose faults. The traditional techniques consist of Doernenburg's ratio method, Roger's ratio method, the IEC ratio method, and the Three Ratio Technique.

2.7 Review of Related Works on Transformer Fault Detection

Venkataswamy et al. (2020) proposed a reliability-centred maintenance approach for distribution transformers using Internet of Things (IoT) and metaheuristic techniques. The approach integrates metaheuristic optimisation techniques with IoT technologies to enhance reliability-centred maintenance strategies for distribution transformers. It significantly enhanced the reliability-centred maintenance of distribution transformers, reducing downtime and maintenance expenses. However, implementation requires a robust IoT infrastructure, which may be complex and expensive.

Sarro et al. (2020) proposed a technique that utilises machine learning to improve human expert effort estimates by learning from mistakes.
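The gas ratio methods introduced in Section 2.6 all start from a handful of key-gas ratios. As a sketch, the three ratios used by the IEC ratio method can be computed from a DGA sample as follows; the ppm values are made up for illustration, and a real diagnosis would then look the ratios up against the fault codes tabulated in IEC 60599:

```python
# The three IEC 60599 key-gas ratios; concentrations are in ppm.
# In practice, zero denominators must be guarded before computing ratios.

def iec_ratios(h2: float, ch4: float, c2h6: float, c2h4: float, c2h2: float):
    """Return (C2H2/C2H4, CH4/H2, C2H4/C2H6) for a DGA sample."""
    return c2h2 / c2h4, ch4 / h2, c2h4 / c2h6

# Illustrative sample (not from the thesis dataset):
r1, r2, r3 = iec_ratios(h2=100, ch4=120, c2h6=65, c2h4=50, c2h2=5)
print(r1, r2, r3)   # each ratio is then mapped to a fault code (PD, D1, D2, T1-T3)
```

These same seven gas concentrations are what the machine learning models in this project take as input features, replacing the manual ratio lookup with a learned classifier.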
The study utilises machine learning algorithms to enhance the precision of human expert effort predictions by drawing insights from past data and prior estimation discrepancies. The machine-learning-enhanced estimates provided more accurate predictions of human expert effort, reducing the error margin in project planning. However, the drawback is that the accuracy of the prediction relies on the quality and comprehensiveness of the historical data.

Aqueveque et al. (2021) presented a method for using wireless accelerometer sensor modules to conduct data-driven condition monitoring of mining mobile machinery in non-stationary operations. The authors suggested using wireless accelerometer sensors to collect data for monitoring the condition of mining machinery during non-stationary operations. Effective monitoring of mining machinery under non-stationary conditions was achieved, improving operational efficiency and reducing unexpected failures. However, monitoring in non-stationary environments poses challenges due to varying operational conditions and sensor reliability.

Laayati et al. (2021) suggested a novel concept for a self-diagnostic system integrated into an intelligent energy management system for monitoring oil-immersed power transformers. This paper proposes a smart energy management system with integrated self-diagnostic capabilities for oil-immersed power transformers to monitor and diagnose faults. The developed system successfully integrated monitoring and self-diagnostic capabilities, providing real-time fault detection and diagnosis. The drawback of this approach is that combining the system with existing infrastructure can be complex and may require significant modifications.

Pileggi et al. (2021) proposed a method for implementing machine learning to predictively maintain gas turbines.
The study examines the real-world implementation of machine learning models for predicting maintenance needs in gas turbines, covering data preparation and model deployment. The study provided valuable insights into the challenges and best practices for implementing machine learning in predictive maintenance and improving operational outcomes. However, operationalising machine learning models in real-world settings involves data quality, model maintenance, and scalability challenges.

Vallim Filho et al. (2022) reviewed condition-based maintenance of power transformers using predictive analytics. The authors suggested a framework for predictive maintenance using machine learning models based on equipment load cycles, in a practical case scenario. The framework accurately predicted maintenance needs based on equipment load cycles, validated by real-world case studies. However, its effectiveness depends on the availability and accuracy of detailed equipment load cycle data.

Yu et al. (2022) proposed an SVM approach utilising information granulation for detecting anomalies in primary transformers at nuclear power facilities. This paper uses an information-granulated SVM approach to detect anomalies in main transformers at nuclear power plants. The proposed SVM approach accurately detected anomalies in transformer operations, enhancing the safety and reliability of nuclear power plants. The drawback of this approach is that the SVM's performance is highly impacted by the quality and granularity of the information provided.

Raghuraman and Darvishi (2022) proposed a method for identifying various types of transformer faults from DGA data by employing machine learning methods. In this fault detection research, machine learning techniques are applied to DGA data to dete