Masters of Science

URI for this collection: https://rps.wku.edu.et/handle/987654321/9

Search Results

Now showing 1 - 2 of 2
  • Item
    ENHANCING SECURITY IN SOFTWARE DEFINED NETWORKING USING DEEP LEARNING FOR DETECTION AND MITIGATION OF DISTRIBUTED DENIAL OF SERVICE ATTACKS
    (WOLKITE UNIVERSITY, 2024-04) SIRAJ AHMED YASSIN
    The growing reliance on Software-Defined Networking (SDN) necessitates robust security solutions, particularly against the escalating threat of Distributed Denial-of-Service (DDoS) attacks. Accurately and efficiently detecting both known and novel DDoS attacks in SDN environments remains a significant challenge. This study proposes a novel deep learning approach for efficient and accurate DDoS attack detection and mitigation within SDN. The proposed method uses a two-stage model. Stage 1 involves a comparative analysis of optimized Convolutional Neural Networks (CNN), CNNs with Bidirectional Long Short-Term Memory (CNN-BiLSTM), and CNNs with Bidirectional Long Short-Term Memory and Attention (CNN-BiLSTM-Attn); all models achieved near-perfect accuracy (99.99%), with the CNN emerging as the most resource-efficient option. Stage 2 evaluates unsupervised learning with tuned Autoencoders (AE) and Variational Autoencoders (VAE) for anomaly detection, with the AE outperforming the VAE at a 99.86% detection rate. Various thresholding techniques were assessed with the AE, including percentile, Interquartile Range (IQR), Cumulative Sum (CUSUM), Peak-to-Peak, Control Chart, and Z-score; CUSUM achieved the highest precision (100%), while Control Chart and Z-score were less effective. This two-stage approach combines the efficiency of a CNN for known attacks with the anomaly-detection capability of an AE for novel attacks, using CUSUM thresholding for optimal results, thereby enhancing the resilience of SDN networks against DDoS threats.
  • Item
    DEEP LEARNING-BASED GURAGIGNA TO AMHARIC MACHINE TRANSLATION
    (WOLKITE UNIVERSITY, 2024-04) ALEMAYEHU BADARGA NIDA
    Machine translation is an application of NLP that translates text from one natural language to another. In this study, we aimed to develop deep learning-based Guragigna-to-Amharic translation, recognizing Natural Language Processing as a pivotal domain within AI that facilitates human-computer language interaction. No prior research has been conducted on machine translation between Guragigna and Amharic. Given the abundance of information in Amharic across various domains in Ethiopia, including legal, media, religious, educational, and governmental documents, it is imperative to bridge the language gap for the growing Guragigna-speaking population. Neural Machine Translation (NMT) is a recently proposed approach to machine translation (MT) that has achieved state-of-the-art translation quality in recent years. Unlike traditional MT approaches, NMT aims to create a single neural network whose components can be tuned jointly to maximize translation performance. The aim of this study is therefore to develop deep learning-based Amharic-Guragigna bi-directional machine translation. We conducted experiments with six encoder-decoder models: LSTM, Bi-LSTM, LSTM+attention, CNN+attention, GRU, and Transformer. We collected a dataset of 9,515 parallel sentences, split it 80/20 into training and testing sets, and evaluated the models on efficiency metrics including training time, memory usage, and BLEU score in order to propose an optimal translation model. Among these models, the Transformer outperformed the others, achieving 99.4% accuracy, 0.0113 loss, and a BLEU score of 9.93 for Amharic-Guragigna translation and 9.99 for Guragigna-Amharic translation. This is because the Transformer processes the whole sentence simultaneously, which reduces training time, and computes similarity scores between the words in a sentence via self-attention.
Due to the unavailability of a parallel corpus, we trained our model on a minimal corpus, although NMT requires large amounts of data to produce an optimal model that learns the distinct features of the two languages. We also faced challenges with the LSTM, Bi-LSTM, LSTM+attention, and GRU models, which required significant memory resources.
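
The CUSUM thresholding that the first thesis pairs with autoencoder-based anomaly detection can be sketched as follows. This is a minimal illustration, not the thesis's implementation: it assumes per-sample reconstruction errors from a trained autoencoder are already available, and the parameter values (target mean, slack, decision threshold) and error values below are hypothetical.

```python
def cusum_flags(errors, target_mean, slack, threshold):
    """One-sided CUSUM over a stream of reconstruction errors.

    Accumulates S_t = max(0, S_{t-1} + e_t - target_mean - slack)
    and raises an alarm whenever S_t exceeds the decision threshold,
    so brief noise is absorbed while a sustained shift trips the alarm.
    """
    s = 0.0
    flags = []
    for e in errors:
        s = max(0.0, s + e - target_mean - slack)
        flags.append(s > threshold)
    return flags

# Hypothetical reconstruction errors: low for normal traffic,
# then sustained high values simulating a DDoS burst.
errors = [0.02, 0.03, 0.01, 0.02, 0.30, 0.35, 0.40]
flags = cusum_flags(errors, target_mean=0.02, slack=0.01, threshold=0.5)
# The alarm fires from the second attack sample onward, once the
# cumulative excess over the normal baseline crosses the threshold.
```

The cumulative statistic is what distinguishes CUSUM from a simple per-sample cutoff (e.g. percentile or Z-score thresholds): it integrates evidence over time, which matches the sustained nature of a DDoS flood.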
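
The self-attention mechanism credited above for the Transformer's speed can be illustrated with a minimal sketch of scaled dot-product attention. This is a simplified assumption-laden toy, not the thesis's model: queries, keys, and values are taken directly from the word embeddings (real Transformers apply learned projection matrices), and the embedding values are invented for illustration.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(embeddings):
    """Scaled dot-product self-attention over one sentence.

    Each row is a word embedding. For every word, similarity scores
    against all words are computed at once (dot products scaled by
    sqrt(d)), softmax-normalized, and used to mix the value vectors,
    so the whole sentence is processed without sequential recurrence.
    """
    d = len(embeddings[0])
    output = []
    for q in embeddings:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in embeddings]
        weights = softmax(scores)
        output.append([sum(w * v[j] for w, v in zip(weights, embeddings))
                       for j in range(d)])
    return output

# Hypothetical 3-word sentence with 3-dimensional embeddings.
sentence = [[1.0, 0.0, 0.0],
            [0.0, 1.0, 0.0],
            [1.0, 1.0, 0.0]]
contextual = self_attention(sentence)  # each row is a weighted mix of all rows
```

Because every word attends to every other word in parallel, there is no step-by-step hidden-state chain as in LSTM or GRU decoders, which is the property the abstract cites for reduced training time.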