What is the expected enhancement?

There are a number of minor improvements to make:

algorithms/__init__.py
- NNC/NNR/VQR are missing from the docstring.
datasets
- The dataset functions should return plain np.ndarrays, optionally one-hot encoded, and all unused functions should be removed (see the sketch below).
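For illustration, a minimal sketch of the shape such a dataset function could take. The function names, signature, and toy data here are hypothetical and only demonstrate returning plain np.ndarrays with an optional one-hot flag; they are not the repository's actual API:

```python
import numpy as np

def one_hot(labels: np.ndarray, num_classes: int) -> np.ndarray:
    """Map integer labels of shape (n,) to a one-hot matrix of shape (n, num_classes)."""
    encoded = np.zeros((labels.size, num_classes))
    encoded[np.arange(labels.size), labels] = 1.0
    return encoded

def load_toy_dataset(one_hot_labels: bool = False):
    """Hypothetical loader: returns (features, labels) as np.ndarrays only."""
    features = np.array([[0.1, 0.2], [0.9, 0.8], [0.2, 0.1], [0.8, 0.9]])
    labels = np.array([0, 1, 0, 1])
    if one_hot_labels:
        labels = one_hot(labels, num_classes=2)
    return features, labels
```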
neural_networks/__init__.py
- OpflowQNN is missing from the docstring.
loss_functions
- Should we rename the file loss.py to loss_functions.py?
- Should we re-export the actual loss functions in __init__.py so that they can be imported directly, e.g. "from qml.utils.lossfunctions import CrossEntropyLoss"? (Currently the loss_functions __init__.py contains only the copyright header.) A sketch of what this could look like follows the list.
- Are we actually using L2LossProbability, or should we remove it? Is it tested anywhere?
- We should extend the docstrings of the loss functions to state exactly what they compute.
- Did we test KLDiv anywhere? It doesn't have a gradient implemented (see the second sketch after this list).
- Do we need softmax/stable_softmax? They don't seem to be used anywhere, so maybe remove them?
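On the __init__.py point, re-exporting the loss classes from the package could look roughly like this. The module name loss and the exact class names are assumptions taken from the items above, not the repository's confirmed layout:

```python
# loss_functions/__init__.py (sketch, assuming the classes live in loss.py)
from .loss import CrossEntropyLoss, L2LossProbability, KLDivergence

__all__ = ["CrossEntropyLoss", "L2LossProbability", "KLDivergence"]
```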
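On the KLDiv point, the missing gradient is straightforward on paper: for KL(target || predict) = sum_i target_i * log(target_i / predict_i), the derivative with respect to predict_i is -target_i / predict_i. A standalone numpy sketch, not the class's actual interface, assuming strictly positive entries:

```python
import numpy as np

def kl_divergence(predict: np.ndarray, target: np.ndarray) -> float:
    """KL(target || predict) = sum_i target_i * log(target_i / predict_i)."""
    return float(np.sum(target * np.log(target / predict)))

def kl_divergence_gradient(predict: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Elementwise derivative of the sum above with respect to predict."""
    return -target / predict
```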