Adding Gaussian Error Linear Unit to neural network activation functions #11207

Closed
ParamThakkar123 opened this issue Dec 9, 2023 · 1 comment · Fixed by #11216
Labels: awaiting triage

Comments

@ParamThakkar123 (Contributor)

What would you like to share?

I would like to contribute an implementation of the Gaussian Error Linear Unit (GELU) activation function to this repository, under the neural network activation functions folder.

Additional information

No response

ParamThakkar123 added the awaiting triage label Dec 9, 2023
@tianyizheng02 (Contributor)

We already have an implementation of the GELU function in maths/gaussian_error_linear_unit.py. I'm aware that the file should be placed in the neural_networks/activation_functions directory, but we don't want duplicate implementations of a single algorithm. Please consider moving the existing file instead of contributing your own implementation.
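For reference, GELU is defined as GELU(x) = x · Φ(x), where Φ is the standard normal CDF. A minimal sketch of such an implementation, using the common tanh approximation, might look like the following (this is an illustrative sketch assuming NumPy; the actual contents of maths/gaussian_error_linear_unit.py may differ):

```python
import numpy as np


def gaussian_error_linear_unit(vector: np.ndarray) -> np.ndarray:
    """
    GELU(x) = x * Phi(x), approximated here with the widely used tanh formula:
    0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x**3)))
    """
    return 0.5 * vector * (
        1 + np.tanh(np.sqrt(2 / np.pi) * (vector + 0.044715 * vector**3))
    )


if __name__ == "__main__":
    # GELU is smooth and non-monotonic: slightly negative for small
    # negative inputs, near-identity for large positive inputs.
    print(gaussian_error_linear_unit(np.array([-1.0, 0.0, 1.0])))
```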
