
Volumetric softmax and cross entropy criterion #239

Merged · 5 commits · Aug 25, 2016

Conversation

chsasank (Contributor)

Hi,

I've added VolumetricSoftMax, VolumetricLogSoftMax, and VolumetricCrossEntropyCriterion.
To implement these, I've folded the time and height dimensions into one and applied the corresponding spatial modules. I think this is a reasonable way to implement them.
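The folding trick works because softmax over the channel axis is computed independently at each (t, h, w) location, so reshaping the spatial axes doesn't change the result. A minimal sketch of the idea in NumPy (not the actual Torch implementation; `volumetric_log_softmax` and `log_softmax` are illustrative names, and the layout assumed is batch × channels × time × height × width):

```python
import numpy as np

def log_softmax(x, axis=1):
    # Numerically stable log-softmax along the given axis.
    m = x.max(axis=axis, keepdims=True)
    z = x - m
    return z - np.log(np.exp(z).sum(axis=axis, keepdims=True))

def volumetric_log_softmax(x):
    # Fold time and height into one axis, apply the "spatial"
    # (4D) log-softmax, then restore the original volume.
    n, c, t, h, w = x.shape
    folded = x.reshape(n, c, t * h, w)
    out = log_softmax(folded, axis=1)
    return out.reshape(n, c, t, h, w)

x = np.random.randn(2, 3, 4, 5, 6)
direct = log_softmax(x, axis=1)          # reference: per-voxel log-softmax
folded = volumetric_log_softmax(x)
assert np.allclose(direct, folded)       # folding changes nothing
```

The same equivalence is what lets a volumetric cross-entropy criterion delegate to its spatial counterpart after the reshape.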

I've written tests for VolumetricLogSoftMax and VolumetricCrossEntropyCriterion. All tests pass for torch.CudaTensor, but a lot of tests (including those for convolutions, etc.) fail for torch.CudaDoubleTensor. Here's the test output:

Running test on device: #1 : GeForce GTX TITAN X with benchmark = false
Testing torch.CudaTensor
Running 26 tests
 1/26 VolumetricCrossEntropyCriterion ................................... [PASS]
 2/26 VolumetricMaxPooling .............................................. [PASS]
 3/26 SpatialCrossEntropyCriterion ...................................... [PASS]
 4/26 ClippedReLU_single ................................................ [PASS]
 5/26 VolumetricConvolution ............................................. [PASS]
 6/26 SpatialAveragePooling ............................................. [PASS]
 7/26 Tanh .............................................................. [PASS]
 8/26 LogSoftMax ........................................................ [PASS]
 9/26 SoftMax_single .................................................... [PASS]
10/26 functional_bias2D ................................................. [PASS]
11/26 VolumetricBatchNormalization ...................................... [PASS]
12/26 SpatialMaxPooling ................................................. [PASS]
13/26 SpatialCrossMapLRN_batch .......................................... [PASS]
14/26 Sigmoid ........................................................... [PASS]
15/26 ReLU .............................................................. [PASS]
16/26 TemporalConvolution_padding_batch ................................. [PASS]
17/26 SpatialFullConvolution ............................................ [PASS]
18/26 TemporalConvolution_reduceBatchSize ............................... [PASS]
19/26 SpatialConvolution ................................................ [PASS]
20/26 BatchNormalization ................................................ [PASS]
21/26 TemporalConvolution ............................................... [PASS]
22/26 VolumetricLogSoftMax .............................................. [PASS]
23/26 SpatialBatchNormalization ......................................... [PASS]
24/26 SpatialLogSoftMax ................................................. [PASS]
25/26 functional_maxpooling2d ........................................... [PASS]
26/26 functional_convolution2d .......................................... [PASS]
Completed 312 asserts in 26 tests with 0 failures and 0 errors
Testing torch.CudaDoubleTensor
Running 26 tests
 1/26 VolumetricCrossEntropyCriterion ................................... [PASS]
 2/26 VolumetricMaxPooling .............................................. [FAIL]
 3/26 SpatialCrossEntropyCriterion ...................................... [PASS]
 4/26 ClippedReLU_single ................................................ [PASS]
 5/26 VolumetricConvolution ............................................. [FAIL]
 6/26 SpatialAveragePooling ............................................. [FAIL]
 7/26 Tanh .............................................................. [FAIL]
 8/26 LogSoftMax ........................................................ [FAIL]
 9/26 SoftMax_single .................................................... [FAIL]
10/26 functional_bias2D ................................................. [PASS]
11/26 VolumetricBatchNormalization ...................................... [FAIL]
12/26 SpatialMaxPooling ................................................. [FAIL]
13/26 SpatialCrossMapLRN_batch .......................................... [FAIL]
14/26 Sigmoid ........................................................... [FAIL]
15/26 ReLU .............................................................. [FAIL]
16/26 TemporalConvolution_padding_batch ................................. [FAIL]
17/26 SpatialFullConvolution ............................................ [FAIL]
18/26 TemporalConvolution_reduceBatchSize ............................... [PASS]
19/26 SpatialConvolution ................................................ [FAIL]
20/26 BatchNormalization ................................................ [FAIL]
21/26 TemporalConvolution ............................................... [FAIL]
22/26 VolumetricLogSoftMax .............................................. [FAIL]
23/26 SpatialBatchNormalization ......................................... [FAIL]
24/26 SpatialLogSoftMax ................................................. [FAIL]
25/26 functional_maxpooling2d ........................................... [PASS]
26/26 functional_convolution2d .......................................... [PASS]
Completed 298 asserts in 26 tests with 19 failures and 0 errors
--------------------------------------------------------------------------------
VolumetricMaxPooling
error on gradInput, batchMode = true, type = torch.CudaDoubleTensor, 
LT failed: 4.8651165962219 >= 0.02
        test/test.lua:105: in function 'testLayer'
        test/test.lua:342: in function <test/test.lua:321>
--------------------------------------------------------------------------------
VolumetricMaxPooling
error on output, batchMode = true, type = torch.CudaDoubleTensor, 
LT failed: 5.0026316642761 >= 0.0001
        test/test.lua:105: in function 'testLayer'
        test/test.lua:342: in function <test/test.lua:321>

...

@chsasank (Contributor, Author)

@soumith Do you have any feedback on this?
I am curious why the PR didn't get any response.

@soumith (Owner) commented Aug 24, 2016

Hi Sasank. Sorry, I'm stretched a bit thin for maintenance. I'm looking into all your PRs now.
You haven't added the changes to init.lua that are needed for your PR to work; can you push those changes as well?

@chsasank (Contributor, Author)

Thanks for the feedback. I've updated init.lua now.

@soumith soumith merged commit 440f0d5 into soumith:master Aug 25, 2016
@soumith (Owner) commented Aug 25, 2016

Thanks Sasank!
