Commit 0917158

feat: add maxout activation function
1 parent 40f65e8 commit 0917158

File tree

1 file changed

+44
-0
lines changed

@@ -0,0 +1,44 @@
"""
Maxout activation function

Use Case: Maxout allows for more flexibility than traditional
activation functions like ReLU and can improve model capacity.

For more detailed information, you can refer to the following link:
https://arxiv.org/abs/1302.4389
"""

import numpy as np


def maxout(vector: np.ndarray) -> np.ndarray:
    """
    Implements the Maxout activation function.

    Parameters:
        vector (np.ndarray): The input array; must have an even number
            of columns, which are split into two halves.

    Returns:
        np.ndarray: The element-wise maximum of the two column halves,
            so the output has half as many columns as the input.

    Formula: f(x) = max(x_1, x_2)

    Examples:
    >>> maxout(np.array([[2., -3.], [-1., 4.]]))
    array([[2.],
           [4.]])

    >>> maxout(np.array([[5, -5], [3, -3]]))
    array([[5],
           [3]])
    """
    half = vector.shape[1] // 2
    return np.maximum(vector[:, :half], vector[:, half:])


if __name__ == "__main__":
    import doctest

    doctest.testmod()
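The committed function hard-codes two pieces, pairing column i with column i + n/2. The Goodfellow et al. paper this commit links to defines maxout more generally as a maximum over k linear pieces per output unit. As a minimal sketch of that generalization (not part of the commit; `maxout_k` and its `num_pieces` parameter are hypothetical names, and it groups *adjacent* columns rather than the committed half-split):

```python
import numpy as np


def maxout_k(vector: np.ndarray, num_pieces: int = 2) -> np.ndarray:
    """General maxout: group every `num_pieces` adjacent columns
    and keep the maximum within each group."""
    rows, cols = vector.shape
    if cols % num_pieces != 0:
        raise ValueError("column count must be divisible by num_pieces")
    # Reshape to (rows, groups, num_pieces) and reduce each group with max,
    # yielding cols // num_pieces output columns.
    return vector.reshape(rows, cols // num_pieces, num_pieces).max(axis=2)


x = np.array([[1.0, 5.0, -2.0, 0.0, 3.0, 4.0]])
print(maxout_k(x, num_pieces=3))  # max of [1, 5, -2] and of [0, 3, 4]
```

Note that for `num_pieces=2` this groups columns as (0, 1), (2, 3), … while the committed version pairs (0, n/2), (1, n/2 + 1), …; both are valid maxout groupings, they just partition the columns differently.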
