What would you like to share?

I would like to contribute a Gaussian Error Linear Unit (GELU) activation function implementation to this repository, under the neural network activation functions folder.
Additional information
No response
We already have an implementation of the GELU function in maths/gaussian_error_linear_unit.py. I'm aware that the file should be placed in the neural_networks/activation_functions directory, but we don't want duplicate implementations of a single algorithm. Please consider moving the existing file instead of contributing your own implementation.
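For reference, here is a minimal sketch of what a GELU activation looks like, using the common tanh approximation. This is illustrative only; the function name `gelu` and the choice of approximation are assumptions on my part, not necessarily what the existing maths/gaussian_error_linear_unit.py file uses:

```python
import numpy as np


def gelu(vector: np.ndarray) -> np.ndarray:
    """
    Tanh approximation of the Gaussian Error Linear Unit:
        GELU(x) ~= 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x**3)))
    """
    return 0.5 * vector * (
        1 + np.tanh(np.sqrt(2 / np.pi) * (vector + 0.044715 * vector**3))
    )


if __name__ == "__main__":
    # GELU is smooth and non-monotone near zero: negative inputs are
    # damped toward zero rather than hard-clipped as in ReLU.
    print(gelu(np.array([-1.0, 0.0, 1.0])))  # approximately [-0.1588  0.  0.8412]
```

The exact definition is x * Phi(x), where Phi is the standard normal CDF; the tanh form above is the widely used approximation from the original GELU paper (Hendrycks & Gimpel, 2016). Whichever variant the repository keeps, it should exist in exactly one place.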