TheAlgorithms/Python
Python/neural_network/activation_functions
Commit: 08d394126c9d46fc9d227a0dc1e343ad1fa70679
Latest: Kausthub Kannan 08d394126c Changed Mish Activation Function to use Softplus (#10111), 2023-10-08 11:48:22 -04:00
exponential_linear_unit.py
The ELU activation is added (#8699)
2023-05-02 16:36:28 +02:00
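The file contents are not shown in this listing; a minimal sketch of the standard ELU formula, f(x) = x for x > 0 and alpha * (e^x - 1) otherwise (the function and parameter names here are illustrative, not necessarily those in the file):

```python
import math


def exponential_linear_unit(x: float, alpha: float = 1.0) -> float:
    """ELU: identity for positive inputs, alpha * (e^x - 1) for negatives."""
    return x if x > 0 else alpha * (math.exp(x) - 1)
```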
leaky_rectified_linear_unit.py
Added Leaky ReLU Activation Function (#8962)
2023-08-16 18:22:15 -07:00
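A minimal sketch of Leaky ReLU, which replaces ReLU's hard zero for negative inputs with a small linear slope (names and the default slope are illustrative assumptions):

```python
def leaky_rectified_linear_unit(x: float, alpha: float = 0.01) -> float:
    """Leaky ReLU: identity for positive inputs, small slope alpha for negatives."""
    return x if x > 0 else alpha * x
```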
mish.py
Changed Mish Activation Function to use Softplus (#10111)
2023-10-08 11:48:22 -04:00
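The commit message says Mish is now expressed via Softplus; a sketch of that formulation, mish(x) = x * tanh(softplus(x)) (function names are illustrative):

```python
import math


def softplus(x: float) -> float:
    """Softplus: ln(1 + e^x), a smooth approximation of ReLU."""
    return math.log1p(math.exp(x))


def mish(x: float) -> float:
    """Mish: x * tanh(softplus(x)), matching the commit's Softplus-based form."""
    return x * math.tanh(softplus(x))
```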
rectified_linear_unit.py
Moved relu.py from maths/ to neural_network/activation_functions (#9753)
2023-10-04 16:28:19 -04:00
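ReLU itself is the simplest of these; a one-line sketch of the standard definition, max(0, x):

```python
def rectified_linear_unit(x: float) -> float:
    """ReLU: pass positive inputs through, clamp negatives to zero."""
    return max(0.0, x)
```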
scaled_exponential_linear_unit.py
Added Scaled Exponential Linear Unit Activation Function (#9027)
2023-09-06 15:16:51 -04:00
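A sketch of SELU, which is ELU scaled by two fixed constants chosen so that activations self-normalize; the constants below are the commonly published values (the function name is illustrative):

```python
import math


def scaled_exponential_linear_unit(x: float) -> float:
    """SELU: scale * (x if x > 0 else alpha * (e^x - 1)) with fixed constants."""
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1))
```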
sigmoid_linear_unit.py
Changing the directory of sigmoid_linear_unit.py (#9824)
2023-10-05 10:07:44 -04:00
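A sketch of the sigmoid linear unit (SiLU, also called swish), x * sigmoid(x) (helper and function names are illustrative):

```python
import math


def sigmoid(x: float) -> float:
    """Logistic sigmoid: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))


def sigmoid_linear_unit(x: float) -> float:
    """SiLU (swish): the input gated by its own sigmoid."""
    return x * sigmoid(x)
```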
softplus.py
Added Softplus activation function (#9944)
2023-10-06 16:26:09 -04:00
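Finally, a sketch of Softplus itself, ln(1 + e^x), the smooth ReLU approximation the Mish commit above builds on (the function name is illustrative):

```python
import math


def softplus(x: float) -> float:
    """Softplus: ln(1 + e^x); log1p keeps the small-x case accurate."""
    return math.log1p(math.exp(x))
```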
Powered by Gitea Version: 1.24.6