COMPARATIVE STUDY OF SURROGATE TECHNIQUES FOR HYPERPARAMETER OPTIMIZATION IN CONVOLUTIONAL NEURAL NETWORK

MOHD ASZEMI, NURSHAZLYN (2023) COMPARATIVE STUDY OF SURROGATE TECHNIQUES FOR HYPERPARAMETER OPTIMIZATION IN CONVOLUTIONAL NEURAL NETWORK. Masters thesis, UNSPECIFIED.

Full text: NurshazlynMohdAszemi_17007352.pdf (2MB). Restricted to Registered users only.

Abstract

Optimizing hyperparameters in convolutional neural networks (CNNs) is tedious for many researchers and practitioners. It requires a high degree of expertise or considerable experience, and such manual optimization is likely to be biased. Hyperparameters in deep learning can be divided into two types: those associated with the learning algorithm, such as the learning rate or the number of iterations or epochs per training run, and those related to the design of the deep neural network itself, such as the number of layers in the network or the number of filters in each convolutional layer. Setting these hyperparameters correctly is often critical for reaching the full potential of the chosen or designed deep neural network and consequently influences the quality of the results produced. Various methods and approaches have been introduced to mitigate the issues of manual optimization.
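To illustrate the idea of surrogate-assisted hyperparameter optimization described above, the following is a minimal sketch (not the method studied in the thesis): the search space covers both kinds of hyperparameters mentioned in the abstract, the expensive CNN training run is replaced by a placeholder objective, and a simple nearest-neighbour surrogate ranks candidates so that only the most promising one receives a full evaluation. All names, values, and the surrogate choice here are illustrative assumptions.

import random

# Hypothetical search space covering the two kinds of CNN hyperparameters
# mentioned in the abstract: training-related (learning rate, epochs)
# and architecture-related (number of conv layers, filters per layer).
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "epochs": [10, 20, 30],
    "num_conv_layers": [2, 3, 4],
    "filters_per_layer": [16, 32, 64],
}


def sample_config(rng):
    """Draw one random hyperparameter configuration."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}


def true_objective(config):
    """Placeholder for the expensive step: training the CNN with `config`
    and returning its validation accuracy. A synthetic score stands in here."""
    return (
        -abs(config["learning_rate"] - 1e-3) * 100
        + config["num_conv_layers"] * 0.5
        + config["filters_per_layer"] / 64
    )


def surrogate_score(config, history):
    """Very simple surrogate: predict a candidate's score as the score of its
    nearest previously evaluated configuration (1-nearest-neighbour)."""
    def distance(a, b):
        return sum(0 if a[k] == b[k] else 1 for k in SEARCH_SPACE)

    nearest = min(history, key=lambda item: distance(config, item[0]))
    return nearest[1]


def surrogate_search(n_init=5, n_iter=10, candidates_per_iter=20, seed=0):
    rng = random.Random(seed)
    # Warm-up: evaluate a few random configurations with the true objective.
    history = [(c, true_objective(c))
               for c in (sample_config(rng) for _ in range(n_init))]
    for _ in range(n_iter):
        # Rank many cheap candidates with the surrogate...
        candidates = [sample_config(rng) for _ in range(candidates_per_iter)]
        promising = max(candidates, key=lambda c: surrogate_score(c, history))
        # ...and spend the expensive evaluation only on the most promising one.
        history.append((promising, true_objective(promising)))
    return max(history, key=lambda item: item[1])


if __name__ == "__main__":
    best_config, best_score = surrogate_search()
    print("best configuration:", best_config)
    print("synthetic score:", round(best_score, 3))

In practice the surrogate would be a probabilistic model (for example a Gaussian process or tree-based regressor) and the objective a full training run; the loop structure of evaluating cheaply with the surrogate and expensively only on selected candidates stays the same.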

Item Type: Thesis (Masters)
Subjects: T Technology > T Technology (General)
Departments / MOR / COE: Sciences and Information Technology
Depositing User: Ms Nurul Aidayana Mohammad Noordin
Date Deposited: 30 Jun 2023 03:10
Last Modified: 30 Jul 2024 03:09
URI: http://utpedia.utp.edu.my/id/eprint/24632
