add MKLDNNAddtoLayer #5309
Conversation
After merging the latest code, it still failed.
After a rerun, it passes.
TestConfig dnnConfig;
getAddtoConfig(dnnConfig, pm, nInputs);
dnnConfig.layerConfig.set_type("mkldnn_addto");
// TODO(TJ): test with bias
Will the next PR be the one that completes addtoLayer with bias?
Yes, either the next one or the one after that. I may get ResNet running end to end first, since ResNet probably does not need addto with bias. For functional completeness, the version with bias will also be implemented; I have already added it to the TODO items on our list.
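For reference, here is a minimal sketch of what an addto layer with an optional bias would compute, assuming the usual semantics of an elementwise sum of all inputs followed by adding a bias vector; the function and parameter names are illustrative, not the PR's actual API:

#include <cstddef>
#include <vector>

// Illustrative only: out[i] = sum over all inputs of in[i], plus bias[i] if a bias is given.
std::vector<float> addtoForward(const std::vector<std::vector<float>>& inputs,
                                const std::vector<float>& bias /* empty means no bias */) {
  std::vector<float> out(inputs.front().size(), 0.f);
  for (const auto& in : inputs) {
    for (size_t i = 0; i < out.size(); ++i) {
      out[i] += in[i];
    }
  }
  // The bias path below is the part this PR leaves for a follow-up.
  for (size_t i = 0; i < bias.size(); ++i) {
    out[i] += bias[i];
  }
  return out;
}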
if (withBias) {
  dnnConfig.biasSize = pm.ic * pm.ih * pm.iw;
} else {
  dnnConfig.biasSize = 0;
dnnConfig.biasSize = withBias ? pm.ic * pm.ih * pm.iw : 0;
Let's change this together next time when the bias support is added.
Oh, OK, thanks.
@@ -132,7 +132,7 @@ void MKLDNNTester::checkForward() {
   VLOG(MKLDNN_TESTS) << "Check Forward";
   printTopDatas();
   double delta =
-      compareMatrix(dnnLayer_->getOutputValue(), refLayer_->getOutputValue());
+      compareMatrix(refLayer_->getOutputValue(), dnnLayer_->getOutputValue());
Why are the two arguments swapped here?
Oh, in theory the swap is not required, but I wanted to make the order consistent, and by convention I treat the first argument as the reference.
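A minimal sketch of that reference-first convention, assuming a comparison helper that normalizes the difference by the reference (CPU) value; the names and the exact error metric here are illustrative, not MKLDNNTester's actual code:

#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative comparison helper: the reference output comes first,
// the tested (MKL-DNN) output second, and the difference is normalized
// by the reference value, which is why a fixed argument order matters.
double compareBuffers(const std::vector<float>& ref,
                      const std::vector<float>& test) {
  double maxDiff = 0.0;
  size_t n = std::min(ref.size(), test.size());
  for (size_t i = 0; i < n; ++i) {
    double denom = std::fabs(ref[i]) > 1e-5 ? std::fabs(ref[i]) : 1.0;
    maxDiff = std::max(maxDiff, static_cast<double>(std::fabs(ref[i] - test[i])) / denom);
  }
  return maxDiff;
}

With the reference fixed as the first argument, any relative-error normalization stays unambiguous across call sites, which fits the intent of unifying the order.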
add mkldnn_addto layer without bias