Other Activation Functions #9010

Closed
AdarshAcharya5 opened this issue Aug 24, 2023 · 1 comment · Fixed by #9027
Labels
enhancement This PR modified some existing files

Comments

@AdarshAcharya5
Contributor

Feature description

Hi, I'm relatively new to open source and found this repo friendly for early contributions. Is there a reason why there are only two activation functions (ELU and Leaky ReLU) under neural_network/activation_functions? Also, can I contribute new activation functions to it?

@tianyizheng02
Contributor

> Is there a reason why there are only two activation functions (ELU and Leaky ReLU) under neural_network/activation_functions?

People simply haven't contributed any more activation functions yet. That said, some activation functions are currently placed under other directories (such as sigmoid in the maths directory); in my opinion they should be moved to the activation functions directory, so feel free to open a PR to do so if you wish.

> Also, can I contribute new activations into it?

Yes, you can simply open a PR to do so. We don't assign issues in this repo, so there's no need to ask or open an issue about it.
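As an illustration, a contributed activation function might look like the following sketch of Swish (SiLU). This is a hypothetical example, not code from the repo: the function name, NumPy-based style, and doctest format are assumptions modeled on the existing ELU and Leaky ReLU implementations.

```python
import numpy as np


def swish(vector: np.ndarray) -> np.ndarray:
    """
    Swish (SiLU) activation: f(x) = x * sigmoid(x).

    Applied element-wise to the input array.

    >>> swish(np.array([0.0]))
    array([0.])
    """
    # sigmoid(x) = 1 / (1 + e^(-x)); multiply element-wise by x
    return vector * (1 / (1 + np.exp(-vector)))
```

For large positive inputs Swish approaches the identity (since sigmoid tends to 1), and for large negative inputs it tends to 0, which is what makes it a smooth alternative to ReLU.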

3 participants: @tianyizheng02, @AdarshAcharya5, and others