
Extra feature support of Batch Norm layer for Keras (and Tensorflow) #14

Open
ZongHong-Lyu opened this issue Oct 17, 2018 · 0 comments
Labels: enhancement (New feature or request)

@ZongHong-Lyu
Contributor

The original Batch Norm paper proposed that the operation be applied per batch for fully connected layers, and per "global batch" for convolutional layers.
(A global batch treats the effective batch size as b × h × w when the output shape is (b, h, w, c), so the mean is computed per channel over all feature maps in the batch.)

The current tool's implementation and Caffe's both follow this approach. But Keras and TensorFlow allow batch normalization along an arbitrary axis (or multiple axes), and in that case the tool cannot convert the network correctly.

I am not sure whether any Keras (or TensorFlow) networks actually use this feature, so I assume the priority of supporting it can be low.
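To illustrate the difference, here is a minimal NumPy sketch (not the tool's or Keras' actual implementation) of normalization where the `axis` argument lists the dimensions that are kept, mirroring the meaning of `axis` in Keras' `BatchNormalization` layer; all other dimensions are reduced over:

```python
import numpy as np

def batch_norm(x, axis, eps=1e-5):
    """Normalize x, keeping one mean/variance per index along `axis`.

    All dimensions not listed in `axis` are reduced over, mirroring the
    `axis` argument of Keras' BatchNormalization layer.
    """
    reduce_axes = tuple(i for i in range(x.ndim) if i not in axis)
    mean = x.mean(axis=reduce_axes, keepdims=True)
    var = x.var(axis=reduce_axes, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

# NHWC activations: batch=2, height=4, width=4, channels=3
x = np.random.randn(2, 4, 4, 3)

# "Global batch" norm for conv layers (Caffe / current tool): one mean
# per channel, computed over the b*h*w values -> axis=(3,)
y = batch_norm(x, axis=(3,))

# Keras/TensorFlow also permit arbitrary axes, e.g. one mean per
# (height, channel) pair -> axis=(1, 3); this is the case the
# converter cannot currently handle.
z = batch_norm(x, axis=(1, 3))
```

With `axis=(3,)`, the per-channel mean of `y` over the remaining dimensions is approximately zero, which is exactly the global-batch behavior described above; any other `axis` value produces statistics the Caffe-style converter has no equivalent for.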

@ZongHong-Lyu ZongHong-Lyu added the enhancement New feature or request label Oct 17, 2018