Commit 7d247ea
update docstring
oxinabox committed Mar 18, 2019
1 parent 025d9b6 commit 7d247ea
Showing 1 changed file with 5 additions and 4 deletions.
src/layers/basic.jl (9 changes: 5 additions & 4 deletions)

@@ -130,7 +130,7 @@ end
     Maxout(over)
 
 `Maxout` is a neural network layer that has a number of internal layers,
-which all have the same input, and the max out returns the elementwise maximum
+which all have the same input, and the maxout returns the elementwise maximum
 of the internal layers' outputs.
 
 Maxout over linear dense layers satisfies the universal approximation theorem.
@@ -150,15 +150,16 @@ end
     Maxout(f, n_alts, args...; kwargs...)
 
 Constructs a Maxout layer over `n_alts` instances of the layer given by `f`.
-All other arguments (`args` & `kwargs`) are passed to the constructor `f`.
+The function `f` takes no arguments and should return some callable layer.
+Conventionally this is a linear dense layer.
 
 For example, the following
-will construct a `Maxout` layer over 4 dense linear layers,
+will construct a `Maxout` layer over 4 internal dense linear layers,
 each identical in structure (784 inputs, 128 outputs):
 ```julia
 insize = 784
 outsize = 128
-Maxout(Dense, 4, insize, outsize)
+Maxout(()->Dense(insize, outsize), 4)
 ```
 """
 function Maxout(f, n_alts, args...; kwargs...)
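
A note on the semantics the first hunk describes: every internal layer receives the same input, and the layer returns the elementwise maximum across the internal layers' outputs. Below is a minimal illustrative sketch in plain Julia; the names `maxout_apply`, `f1`, and `f2` are hypothetical and not part of this file.

```julia
# Elementwise maximum over several callables' outputs, as the docstring describes.
maxout_apply(layers, x) = mapreduce(l -> l(x), (a, b) -> max.(a, b), layers)

f1(x) = 2 .* x   # toy "layer" 1
f2(x) = x .+ 1   # toy "layer" 2

maxout_apply((f1, f2), [0.0, 1.0, 2.0])  # -> [1.0, 2.0, 4.0]
```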
Expand Down
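And a hedged usage sketch of the constructor the second hunk documents, assuming this file's `Maxout` and a `Dense` layer are available (e.g. via Flux.jl, which the path suggests); the sizes follow the docstring's 784-in/128-out example.

```julia
using Flux

insize, outsize = 784, 128

# The closure is called with no arguments once per alternative, giving
# 4 structurally identical dense layers inside the Maxout.
m = Maxout(() -> Dense(insize, outsize), 4)

x = rand(Float32, insize)
y = m(x)   # length-128 output: elementwise max over the 4 internal layers
```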
