
Activation functions as static context instead of constant #78

Open
xxMrPHDxx opened this issue Feb 15, 2018 · 1 comment
xxMrPHDxx commented Feb 15, 2018

We are currently using constants to declare the activation functions.
I think it would be much better to declare them in a static context instead.

I have included the code in my gist as a reference.
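The linked gist is not shown here, but the proposal can be sketched roughly as follows. The class and member names below are illustrative assumptions, not necessarily those used in the gist or the repository: each activation becomes a static member of the `ActivationFunction` class instead of a module-level constant.

```javascript
// Hypothetical sketch: activation functions as static members of a class,
// rather than standalone constants. Names are assumptions for illustration.
class ActivationFunction {
  constructor(func, dfunc) {
    this.func = func;   // the activation itself
    this.dfunc = dfunc; // its derivative, expressed in terms of the output y
  }
}

// Static-style registry: each activation lives on the class itself,
// so callers reference ActivationFunction.SIGMOID, .TANH, etc.
ActivationFunction.SIGMOID = new ActivationFunction(
  x => 1 / (1 + Math.exp(-x)),
  y => y * (1 - y)
);

ActivationFunction.TANH = new ActivationFunction(
  x => Math.tanh(x),
  y => 1 - y * y
);
```

With this layout, adding a new activation is a one-line static assignment, and all of them are discoverable under a single namespace (e.g. `ActivationFunction.SIGMOID.func(0)` evaluates to `0.5`).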

@Adil-Iqbal

I agree that the activation functions should be moved into their own file. There are many activation functions that have yet to be implemented because of the current limitations of the ActivationFunction class. The class itself will probably go through several changes in the future, especially if we implement activation functions with alpha and lambda values, not to mention activations that go beyond a single scalar input x, like softmax and maxout.
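To make the limitation concrete, here is a minimal sketch, assuming a hypothetical `ParamActivation` class (not from the repository), of the two cases mentioned above: an activation that needs an extra parameter (leaky ReLU's alpha), and one that operates on a whole vector rather than a single scalar (softmax). Neither fits a plain `(func, dfunc)` pair over scalars.

```javascript
// Hypothetical sketch: a parameterized activation. The extra parameters
// (alpha, lambda, ...) are captured at construction time.
class ParamActivation {
  constructor(func, dfunc, params = {}) {
    this.params = params;
    this.func = x => func(x, params);
    this.dfunc = y => dfunc(y, params);
  }
}

// Leaky ReLU: slope alpha on the negative side instead of a hard zero.
const leakyRelu = new ParamActivation(
  (x, p) => (x >= 0 ? x : p.alpha * x),
  (y, p) => (y >= 0 ? 1 : p.alpha),
  { alpha: 0.01 }
);

// Softmax takes a vector, not a single x, so it cannot be expressed
// as a scalar func at all.
function softmax(xs) {
  const m = Math.max(...xs);               // subtract the max for stability
  const exps = xs.map(x => Math.exp(x - m));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);
}
```

Supporting both shapes cleanly is the kind of change to the ActivationFunction class this comment anticipates.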
