Tomar, Vikalp Singh (2022) A critical evaluation of activation functions for autoencoder neural networks. Masters thesis, Dublin, National College of Ireland.
PDF (Master of Science): Download (1MB)
PDF (Configuration manual): Download (1MB)
Abstract
A fundamental component of a neural network layer is the activation function. Such functions have been studied extensively, so the neural network developer has many options; the most prevalent are probably ReLU, Tanh, and Sigmoid. Several other functions, such as GELU, SELU, PReLU, and ELU, are also available, but little is known about their performance in autoencoder networks. This research therefore conducts a comparative examination of multiple activation functions to determine which best fits the autoencoder neural network.
According to this study, the activation function that many researchers use by default when building neural networks performed worse than the alternatives in terms of MSE, while a new combination of activation functions produced the best result in both mean squared error and training time, despite having a larger number of trainable parameters.
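For reference, the activation functions under comparison can be written out in a few lines of NumPy. This is an illustrative sketch only, not the implementation evaluated in the thesis; parameter defaults (e.g. the PReLU slope, which is a learned parameter in practice) are assumptions.

```python
import numpy as np

# Minimal NumPy sketches of the activation functions compared in the thesis.

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def elu(x, alpha=1.0):
    # Smooth for x < 0, identity for x > 0.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Standard self-normalizing constants (Klambauer et al., 2017).
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def prelu(x, a=0.25):
    # `a` is a learned per-channel parameter in a real network;
    # 0.25 here is an illustrative default.
    return np.where(x > 0, x, a * x)

def gelu(x):
    # Tanh approximation of GELU (Hendrycks & Gimpel, 2016).
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi)
                                    * (x + 0.044715 * x**3)))
```

In an autoencoder comparison such as this one, each function would be applied element-wise after the encoder and decoder layers, with reconstruction MSE and training time recorded per variant.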
Item Type: | Thesis (Masters) |
---|---|
Subjects: | Q Science > QA Mathematics > Electronic computers. Computer science; T Technology > T Technology (General) > Information Technology > Electronic computers. Computer science; Q Science > Q Science (General) > Self-organizing systems. Conscious automata > Machine learning |
Divisions: | School of Computing > Master of Science in Data Analytics |
Depositing User: | Tamara Malone |
Date Deposited: | 14 Mar 2023 11:50 |
Last Modified: | 14 Mar 2023 11:50 |
URI: | https://norma.ncirl.ie/id/eprint/6329 |