Commit b638b48
feat: add softsign activation function
1 parent 40f65e8 commit b638b48

File tree: 1 file changed (+41, -0 lines)
@@ -0,0 +1,41 @@
"""
Softsign activation function

Use Case: Softsign provides a smooth transition without sharp gradients
and is an alternative to sigmoid functions.

For more detailed information, you can refer to the following link:
https://paperswithcode.com/method/softsign-activation
"""

import numpy as np


def softsign(vector: np.ndarray) -> np.ndarray:
    """
    Implements the Softsign Activation Function.

    Parameters:
        vector (np.ndarray): The input array for Softsign activation.

    Returns:
        np.ndarray: The output after applying Softsign activation.

    Formula: f(x) = x / (1 + |x|)

    Examples:
    >>> softsign(np.array([-10, -5, -1, 0, 1, 5, 10]))
    array([-0.90909091, -0.83333333, -0.5       ,  0.        ,  0.5       ,
            0.83333333,  0.90909091])

    >>> softsign(np.array([100]))
    array([0.99009901])
    """
    return vector / (1 + np.abs(vector))


if __name__ == "__main__":
    import doctest

    doctest.testmod()
