
Implement batchnorm layer #155

Open
milancurcic opened this issue Aug 8, 2023 · 0 comments
Labels
enhancement New feature or request
Originally requested by @rweed in #114.

A batch normalization layer is possibly the next most widely used layer after dense, convolutional, and maxpooling layers, and is an important optimization tool for accelerating training.

For neural-fortran, this means we will need to allow passing a batch of data to individual layers' forward and backward methods. For dense and conv2d layers this is also an opportunity to numerically optimize the operations (e.g. running the same operation on a batch of data instead of one sample at a time); for a batchnorm layer it is required, because this layer evaluates moments (means and standard deviations) over a batch of inputs in order to normalize them.
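To make the batch-moments point concrete, here is a minimal, self-contained sketch of the batchnorm forward pass. All names (`mu`, `var`, `eps`, the array shapes) are illustrative, not from the neural-fortran API, and the learnable scale and shift parameters (gamma and beta) are omitted (i.e. treated as identity) for simplicity:

```fortran
program batchnorm_sketch
  implicit none
  integer, parameter :: n = 3      ! number of features
  integer, parameter :: batch = 4  ! number of samples in the batch
  real :: x(n, batch), y(n, batch)
  real :: mu, var
  real, parameter :: eps = 1e-5    ! avoids division by zero
  integer :: i

  call random_number(x)

  do i = 1, n
    ! per-feature mean and variance computed over the batch dimension;
    ! this is why the layer needs the whole batch, not one sample
    mu = sum(x(i,:)) / batch
    var = sum((x(i,:) - mu)**2) / batch
    ! normalize each feature to zero mean and unit variance
    y(i,:) = (x(i,:) - mu) / sqrt(var + eps)
  end do

  print *, y
end program batchnorm_sketch
```

Note that a single-sample `forward` cannot compute `mu` and `var` at all, which is what forces the batched-methods refactor described below.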

Implementing batchnorm will require another non-trivial refactor like the one we did to enable generic optimizers, although this one will probably be easier. The first step will be to allow passing a batch of data to the forward and backward methods, as mentioned above. In other words, this snippet:

do concurrent (j = istart:iend)
  call self % forward(input_data(:,j))
  call self % backward(output_data(:,j))
end do

should, after the refactor, be writable like this:

call self % forward(input_data(:,:))
call self % backward(output_data(:,:))

where the first dim corresponds to inputs and outputs in input and output layers, respectively, and the second dim corresponds to multiple samples in a batch. I will open a separate issue for this.

@Spnetic-5 Given the limited time remaining in the GSoC program, we may be unable to complete the batchnorm implementation, but we can certainly make significant headway on it.
