Publication:
A comparison of activation functions in artificial neural networks (Yapay sinir ağlarında aktivasyon fonksiyonlarının karşılaştırılması)

Date

2018

Publisher

Institute of Electrical and Electronics Engineers Inc.

Abstract

In this study, the effects of activation functions (AFs) in Artificial Neural Networks (ANNs) on regression and classification performance are compared. For both problem types, success rates on test data and training duration are evaluated. A total of 11 AFs, 10 commonly used in the literature plus the Square function proposed in this study, are compared on 7 datasets: 2 for regression and 5 for classification. Three ANN architectures, chosen as the most appropriate for each dataset, are employed in the experiments. Across a total of 231 training procedures, the effects of the AFs are examined for the different datasets and architectures; likewise, their effects on training time are shown per dataset. The experiments show that ReLU is the most successful AF for general purposes. In addition to ReLU, the Square function gives better results on image datasets.
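The two activation functions highlighted in the abstract can be sketched as follows; this is a minimal illustration, assuming the proposed Square function has the form f(x) = x² (the abstract does not give its definition, so that form is a hypothetical reading):

```python
import numpy as np

def relu(x):
    # ReLU: f(x) = max(0, x); reported as the most successful
    # general-purpose AF in the study.
    return np.maximum(0.0, x)

def square(x):
    # Square activation, assumed here to be f(x) = x**2;
    # the exact definition is not stated in the abstract.
    return np.square(x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))    # [0.   0.   0.   1.5 ]
print(square(x))  # [4.   0.25 0.   2.25]
```

In a network layer, either function would be applied element-wise to the layer's pre-activation outputs before passing them to the next layer.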
