Polish C++ operators documentation #5265

Closed
wangkuiyi opened this issue Oct 31, 2017 · 4 comments

@wangkuiyi (Collaborator) commented Oct 31, 2017

We need to polish the documentation of our operators. An example is:

AddComment(
R"DOC(Computes the AUC according forward output and label.
Best to use for binary classification evaluations.
If input label contains values other than 0 and 1, it will be cast
to bool.
You can find the definations here:
https://en.wikipedia.org/wiki/Receiver_operating_characteristic#Area_under_the_curve
Possible curves are:
- ROC: Receiver operating characteristic
- PR: Precision Recall
)DOC");
}

We can take ONNX's operator documentation as a reference; this will also help us prepare for joining ONNX.
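
For illustration, here is a rough sketch of what a polished DOC string could look like for the AUC operator, loosely following the ONNX convention of a short summary line, a more detailed description, and explicit input/output semantics. This is a sketch only: the Maker class name, the AddInput/AddOutput/AddAttr calls, the constructor signature, the include path, and the input/output names are assumptions made for the example; the real auc_op.cc may differ, and only the style of the AddComment text is the point.

// Assumed include path and Maker API; verify against the actual framework headers.
#include "paddle/framework/op_registry.h"

class AucOpMaker : public framework::OpProtoAndCheckerMaker {
 public:
  AucOpMaker(framework::OpProto *proto, framework::OpAttrChecker *op_checker)
      : OpProtoAndCheckerMaker(proto, op_checker) {
    // Hypothetical input/output/attribute declarations; names are illustrative only.
    AddInput("Out", "A 2-D tensor of prediction scores produced by the forward pass.");
    AddInput("Label", "A tensor of ground-truth labels; values other than 0 and 1 are cast to bool.");
    AddOutput("AUC", "A scalar tensor holding the computed area under the curve.");
    AddAttr<std::string>("curve", "The curve type, either \"ROC\" or \"PR\".")
        .SetDefault("ROC");
    AddComment(R"DOC(
AUC Operator.

Computes the Area Under the Curve (AUC) from the forward output and the
ground-truth label. It is best suited for evaluating binary classifiers.
If the label contains values other than 0 and 1, they are cast to bool.

Supported curves:
  - ROC: Receiver Operating Characteristic
  - PR:  Precision-Recall

See https://en.wikipedia.org/wiki/Receiver_operating_characteristic#Area_under_the_curve
for the definition.
)DOC");
  }
};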

The documentation of the following operators needs polishing.

If you are going to file a PR to polish one or more of them, please check them off first so that others do not duplicate the work.

  • accuracy_op.cc
  • activation_op.cc (multiple operators)
  • adadelta_op.cc
  • adagrad_op.cc
  • adam_op.cc
  • adamax_op.cc
  • auc_op.cc
  • batch_norm_op.cc
  • cast_op.cc
  • clip_op.cc
  • concat_op.cc
  • cond_op.cc
  • conv2d_op.cc
  • conv2dtranspose_op.cc
  • conv_shift_op.cc
  • cos_sim_op.cc
  • crop_op.cc
  • cross_entropy_op.cc
  • decayed_adagrad_op.cc
  • dropout_op.cc
  • dynamic_recurrent_op.cc
  • elementwise_add_op.cc
  • elementwise_div_op.cc
  • elementwise_mul_op.cc
  • elementwise_op.h
  • elementwise_sub_op.cc
  • feed_op.cc
  • fetch_op.cc
  • fill_constant_batch_size_like_op.cc
  • fill_constant_op.cc
  • fill_zeros_like_op.cc
  • gather_op.cc
  • gaussian_random_op.cc
  • gru_unit_op.cc
  • huber_loss_op.cc
  • increment_op.cc
  • l1_norm_op.cc
  • load_op.cc
  • lookup_table_op.cc
  • lrn_op.cc
  • lstm_op.cc
  • lstm_unit_op.cc
  • margin_rank_loss_op.cc
  • matmul_op.cc
  • mean_op.cc
  • minus_op.cc
  • modified_huber_loss_op.cc
  • momentum_op.cc
  • mul_op.cc
  • multiplex_op.cc
  • nccl_op.cc (multiple operators)
  • pad_op.cc
  • pool_op.cc (multiple operators)
  • pool_with_index_op.cc (multiple operators)
  • prelu_op.cc
  • proximal_adagrad_op.cc
  • proximal_gd_op.cc
  • rank_loss_op.cc
  • recurrent_op.cc
  • reduce_op.cc (multiple operators)
  • reshape_op.cc
  • rmsprop_op.cc
  • save_op.cc
  • scale_op.cc
  • scatter_op.cc
  • seq_expand_op.cc
  • sequence_concat_op.cc
  • sequence_conv_op.cc
  • sequence_pool_op.cc
  • sequence_softmax_op.cc
  • sgd_op.cc
  • sigmoid_cross_entropy_with_logits_op.cc
  • sign_op.cc
  • smooth_l1_loss_op.cc
  • softmax_op.cc
  • softmax_with_cross_entropy_op.cc
  • split_op.cc
  • squared_l2_distance_op.cc
  • squared_l2_norm_op.cc
  • sum_op.cc
  • top_k_op.cc
  • transpose_op.cc
  • uniform_random_op.cc
@kavyasrinet

Is anyone already working on activation_op? If not, I can start from there.

@kexinzhao (Contributor)

Yes, I am working on them.

@kavyasrinet

Alright! Thanks for the update. I will start with the other ones.

@kavyasrinet

This has been successfully accomplished. Closing the issue.
