Hi, I'm relatively new to open source and found this repo friendly for early contributions. Is there a reason why there are only two activation functions (ELU and Leaky ReLU) under neural_network/activation_functions? Also, can I contribute new activation functions to it?
Is there a reason why there are only two activation functions (ELU and Leaky ReLU) under neural_network/activation_functions?
People simply haven't contributed any more activation functions. That being said, I think there are some activation functions currently placed under different directories (such as sigmoid in the maths directory)—they should be moved to the activation functions directory IMO, so feel free to open a PR to do so if you wish.
Also, can I contribute new activation functions to it?
Yes, you can simply open a PR to do so. We don't assign issues in this repo, so there's no need to ask or open an issue about it.
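For anyone following along, here is a rough sketch of what a new activation function contribution might look like. The choice of Swish as the function, the NumPy-based signature, and the doctest style are assumptions for illustration, not the repo's actual conventions; check existing files under neural_network/activation_functions before opening a PR.

```python
import numpy as np


def swish(vector: np.ndarray, beta: float = 1.0) -> np.ndarray:
    """
    Hypothetical example of an activation function contribution.

    Swish is defined as f(x) = x * sigmoid(beta * x).

    >>> swish(np.array([0.0]))
    array([0.])
    """
    # Element-wise: multiply each input by the sigmoid of (beta * input)
    return vector * (1.0 / (1.0 + np.exp(-beta * vector)))
```

A usage note: at x = 0 the output is 0 (since sigmoid(0) = 0.5 and 0 * 0.5 = 0), and for large positive x the function approaches the identity, which is easy to verify in a quick doctest or unit test.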