Commit 572de4f

Authored by shivansh-bhatnagar18 and pre-commit-ci[bot]

Added A General Swish Activation Function in Neural Networks (TheAlgorithms#10415)

* Added a general Swish activation function in neural networks
* Added the general swish function in the SiLU function and renamed it as swish.py
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci

Co-authored-by: Shivansh Bhatnagar <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

1 parent 361f64c · commit 572de4f

File tree

1 file changed: +20 −0 lines
  • neural_network/activation_functions


Diff for: neural_network/activation_functions/sigmoid_linear_unit.py renamed to neural_network/activation_functions/swish.py

@@ -12,6 +12,7 @@
 This script is inspired by a corresponding research paper.
 * https://arxiv.org/abs/1710.05941
+* https://blog.paperspace.com/swish-activation-function/
 """
 
 import numpy as np
@@ -49,6 +50,25 @@ def sigmoid_linear_unit(vector: np.ndarray) -> np.ndarray:
     return vector * sigmoid(vector)
 
 
+def swish(vector: np.ndarray, trainable_parameter: int) -> np.ndarray:
+    """
+    Parameters:
+        vector (np.ndarray): A numpy array consisting of real values
+        trainable_parameter: Use to implement various Swish Activation Functions
+
+    Returns:
+        swish_vec (np.ndarray): The input numpy array, after applying swish
+
+    Examples:
+    >>> swish(np.array([-1.0, 1.0, 2.0]), 2)
+    array([-0.11920292,  0.88079708,  1.96402758])
+
+    >>> swish(np.array([-2]), 1)
+    array([-0.23840584])
+    """
+    return vector * sigmoid(trainable_parameter * vector)
+
+
 if __name__ == "__main__":
     import doctest
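The renamed function generalizes SiLU: Swish(x) = x · sigmoid(β · x), where the β parameter (called `trainable_parameter` in the commit) recovers SiLU at β = 1. A minimal standalone sketch of the added function, with the `sigmoid` helper assumed to match the one already defined earlier in the module:

```python
import numpy as np


def sigmoid(vector: np.ndarray) -> np.ndarray:
    # Logistic sigmoid, assumed to match the module's existing helper.
    return 1 / (1 + np.exp(-vector))


def swish(vector: np.ndarray, trainable_parameter: float) -> np.ndarray:
    # Swish(x) = x * sigmoid(beta * x); beta = 1 reduces to the original SiLU.
    return vector * sigmoid(trainable_parameter * vector)


# Reproduces the doctest values shown in the diff.
print(swish(np.array([-1.0, 1.0, 2.0]), 2))  # approx [-0.1192, 0.8808, 1.9640]
print(swish(np.array([-2.0]), 1))            # approx [-0.2384]
```

Note that the commit annotates `trainable_parameter` as `int`, although β is conceptually a real-valued (and, in the original paper, learnable) coefficient; the sketch above uses `float` for that reason.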

0 commit comments