
Commit

go back and rework softmax
brightredchilli committed Apr 23, 2017
1 parent afe018b commit eb8bf3b
Showing 2 changed files with 36 additions and 115 deletions.
17 changes: 16 additions & 1 deletion assignment1/cs231n/classifiers/softmax.py
@@ -19,7 +19,22 @@ def softmax_loss_naive(W, X, y, reg):
   - loss as single float
   - gradient with respect to weights W; an array of same shape as W
   """
-  return softmax_loss_vectorized(W, X, y, reg)
+
+  # reimplemented sort of for fun
+  N, D = X.shape
+  pred = X.dot(W)
+  pred -= np.max(pred, axis=1, keepdims=True)
+  a = np.exp(pred)
+  b = a[np.arange(N), y] / a.sum(1)
+
+  c = -np.log(b)
+  loss = np.sum(c) / N
+
+  loss += 0.5 * reg * np.sum(W * W)
+
+  return loss, 0
+  #return softmax_loss_vectorized(W, X, y, reg)
+


 def softmax_loss_vectorized(W, X, y, reg):
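One detail worth calling out in the new code: subtracting the per-row max from the scores before exponentiating leaves the softmax output unchanged but prevents overflow in np.exp. A quick illustration (the score values here are made up for demonstration, not taken from the commit):

import numpy as np

scores = np.array([[1000.0, 1001.0, 1002.0]])
np.exp(scores)  # overflows: [[inf, inf, inf]]

# Shifting by the row max gives the same softmax without overflow.
shifted = scores - scores.max(axis=1, keepdims=True)
probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
print(probs)    # [[0.090 0.245 0.665]]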
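Note that the reworked naive loss returns 0 in place of the gradient. For reference, a minimal sketch of how dW could be filled in from the same intermediates. This is not part of the commit, and the function name softmax_loss_sketch is made up; it uses the standard softmax cross-entropy derivative, d(loss)/d(scores) = (probs - one_hot(y)) / N, plus the derivative of the L2 term:

import numpy as np

def softmax_loss_sketch(W, X, y, reg):
  # Forward pass, mirroring the hunk above: shift scores for numerical stability.
  N = X.shape[0]
  pred = X.dot(W)
  pred -= np.max(pred, axis=1, keepdims=True)
  a = np.exp(pred)
  probs = a / a.sum(axis=1, keepdims=True)  # full (N, C) softmax matrix

  loss = -np.log(probs[np.arange(N), y]).sum() / N
  loss += 0.5 * reg * np.sum(W * W)

  # Backward pass: subtract 1 at the correct class, average over N,
  # then backpropagate through the linear layer and add the L2 term.
  dscores = probs
  dscores[np.arange(N), y] -= 1
  dscores /= N
  dW = X.T.dot(dscores) + reg * W
  return loss, dW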
134 changes: 20 additions & 114 deletions assignment1/softmax.ipynb

Large diffs are not rendered by default.
