
Remove file-scope non-constant static variables to support multiple inference sessions #10481

Merged: 8 commits from sajandhy/remove_static_globals into master, Feb 10, 2022

Conversation

satyajandhyala
Contributor

@satyajandhyala satyajandhyala commented Feb 7, 2022

Description:
Remove non-constant file-scope static ("global static") variables.
The scope of this PR is only to enable multiple concurrent InferenceSession instances by removing non-constant static variables in the file-scope.
Motivation and Context
Make multiple concurrent InferenceSessions possible by encapsulating the data in class members.
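The change the PR describes can be sketched as follows. This is a hypothetical, simplified illustration (class and member names loosely follow the snippets discussed later in this thread, not the actual onnxruntime sources): state that used to live in a file-scope static, shared by every InferenceSession in the process, becomes a per-instance member, so concurrent sessions no longer interfere.

```cpp
#include <string>
#include <unordered_set>
#include <vector>

// Before: a file-scope mutable static, shared by all instances (problematic):
// static std::vector<std::unordered_set<std::string>> fp16_allow_ops;

// After: each transformer instance owns its own copy of the state.
class PropagateCastOpsSketch {
 public:
  explicit PropagateCastOpsSketch(const std::vector<std::string>& allow_list)
      : fp16_allow_ops_(3) {  // three allow-levels (illustrative)
    fp16_allow_ops_[0].insert(allow_list.begin(), allow_list.end());
  }

  bool IsAllowed(const std::string& op) const {
    for (const auto& level : fp16_allow_ops_) {
      if (level.count(op) != 0) return true;
    }
    return false;
  }

 private:
  std::vector<std::unordered_set<std::string>> fp16_allow_ops_;  // member, not static
};
```

With the member version, two sessions can hold different allow lists at the same time without clobbering each other's state.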

@satyajandhyala satyajandhyala changed the title Sajandhy/remove static globals Remove file-scope non-static variable to support multiple inference sessions Feb 7, 2022
@satyajandhyala satyajandhyala changed the title Remove file-scope non-static variable to support multiple inference sessions Remove file-scope non-static variables to support multiple inference sessions Feb 7, 2022
@satyajandhyala satyajandhyala changed the title Remove file-scope non-static variables to support multiple inference sessions [WIP]Remove file-scope non-static variables to support multiple inference sessions Feb 7, 2022
@satyajandhyala satyajandhyala changed the title [WIP]Remove file-scope non-static variables to support multiple inference sessions [WIP]Remove file-scope non-constant static variables to support multiple inference sessions Feb 7, 2022
@satyajandhyala satyajandhyala force-pushed the sajandhy/remove_static_globals branch from 842231c to 83631a3 Compare February 7, 2022 22:35
@satyajandhyala satyajandhyala force-pushed the sajandhy/remove_static_globals branch from 3151d8e to 52a68ed Compare February 8, 2022 22:37
@satyajandhyala satyajandhyala added training issues related to ONNX Runtime training; typically submitted using template component:ortmodule labels Feb 8, 2022
@satyajandhyala satyajandhyala changed the title [WIP]Remove file-scope non-constant static variables to support multiple inference sessions Remove file-scope non-constant static variables to support multiple inference sessions Feb 9, 2022
@satyajandhyala satyajandhyala marked this pull request as ready for review February 9, 2022 01:49
@satyajandhyala satyajandhyala marked this pull request as draft February 9, 2022 22:32
@satyajandhyala satyajandhyala force-pushed the sajandhy/remove_static_globals branch from 10895f1 to 49f0890 Compare February 10, 2022 00:23
@satyajandhyala satyajandhyala force-pushed the sajandhy/remove_static_globals branch from 49f0890 to c01bf10 Compare February 10, 2022 00:31
@satyajandhyala satyajandhyala marked this pull request as ready for review February 10, 2022 00:39
: GraphTransformer("PropagateCastOps", compatible_execution_providers), level_(level), strategy_(strategy) {
fp16_allow_ops[0].clear(); // Remove previously added op types if any.
std::copy(_allow_list.begin(), _allow_list.end(), std::inserter(fp16_allow_ops[0], fp16_allow_ops[0].begin()));
: GraphTransformer("PropagateCastOps", compatible_execution_providers), level_(level), fp16_allow_ops_(3), strategy_(strategy) {
Contributor

Nit: the lines below are kind of long.
If this is the only place that level1_fp16_allow_ops/level2_fp16_allow_ops are used, how about just initializing them here?

Suggested change
: GraphTransformer("PropagateCastOps", compatible_execution_providers), level_(level), fp16_allow_ops_(3), strategy_(strategy) {
: GraphTransformer("PropagateCastOps", compatible_execution_providers), level_(level),
fp16_allow_ops_{
std::unordered_set<std::string>(allow_list.begin(), allow_list.end()),
{"level1 op", ... },
{"level2 op", ... }
},
strategy_(strategy) {

and then fp16_allow_ops_ can be const too
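The reviewer's point can be sketched like this (a minimal illustration with placeholder class name and op lists, not the real onnxruntime code): when every set is built in the member initializer list, the whole container can be declared const.

```cpp
#include <string>
#include <unordered_set>
#include <vector>

class TransformerSketch {  // hypothetical stand-in for the real transformer
 public:
  explicit TransformerSketch(const std::vector<std::string>& allow_list)
      : fp16_allow_ops_{
            std::unordered_set<std::string>(allow_list.begin(), allow_list.end()),
            {"Relu", "Sigmoid"},  // stand-ins for the real level-1 ops
            {"Gelu"}} {}          // stand-ins for the real level-2 ops

  bool IsAllowed(size_t level, const std::string& op) const {
    return fp16_allow_ops_[level].count(op) != 0;
  }

 private:
  // Fully initialized in the ctor initializer list, so it can be const.
  const std::vector<std::unordered_set<std::string>> fp16_allow_ops_;
};
```

Making the member const documents that the allow lists never change after construction, which also removes any question of concurrent mutation.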

Contributor Author

I want to keep the level1 and level2 operators at the top of the file so that it is easy to modify them (add new operators and delete existing ones) and/or to add a new level of fp16 operators altogether.

Contributor Author
@satyajandhyala satyajandhyala Feb 10, 2022

FP16AllowOps is designed to make adding new operators and new levels easy, i.e. to add a new level or new operators without modifying the rest of the code.

Contributor

If you want to keep them separate, that's fine too.

constexpr std::array level1_fp16_allow_ops = {"...", ... }; // deduced to std::array<const char*, N>, don't need to specify N
...
  // ctor member initializer list
  fp16_allow_ops_{
      std::unordered_set<std::string>(allow_list.begin(), allow_list.end()),
      std::unordered_set<std::string>(level1_fp16_allow_ops.begin(), level1_fp16_allow_ops.end()),
      std::unordered_set<std::string>(level2_fp16_allow_ops.begin(), level2_fp16_allow_ops.end())
  }

As is, the lines in the ctor are longer than 120 characters.
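The suggestion above can be made concrete with a short sketch (op names are placeholders): with C++17 class template argument deduction, the array's element type and size are both deduced, so neither needs to be spelled out, and the unordered_set is built from the array's iterator range at construction time.

```cpp
#include <array>
#include <string>
#include <unordered_set>

// CTAD deduces std::array<const char*, 2>; no need to specify N.
constexpr std::array level1_fp16_allow_ops = {"Expand", "Transpose"};

// Each const char* element converts to std::string as the set is filled.
std::unordered_set<std::string> MakeLevel1Set() {
  return std::unordered_set<std::string>(level1_fp16_allow_ops.begin(),
                                         level1_fp16_allow_ops.end());
}
```

Because the array elements are `const char*`, the iterator-range constructor of `std::unordered_set<std::string>` can convert them implicitly; as the next comments show, this is exactly where `std::string_view` elements would fail.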

Contributor Author

Making the suggested change resulted in the following error:
/home/sajandhy/onnxruntime/onnxruntime/core/optimizer/propagate_cast_ops.cc:1468:269: required from here
/usr/include/c++/9/bits/hashtable.h:994:4: error: no matching function for call to ‘std::_Hashtable<std::__cxx11::basic_string<char>, std::__cxx11::basic_string<char>, std::allocator<std::__cxx11::basic_string<char> >, std::__detail::_Identity, std::equal_to<std::__cxx11::basic_string<char> >, std::hash<std::__cxx11::basic_string<char> >, std::__detail::_Mod_range_hashing, std::__detail::_Default_ranged_hash, std::__detail::_Prime_rehash_policy, std::__detail::_Hashtable_traits<true, true, true> >::insert(const std::basic_string_view<char>&)’

Contributor Author

Because it requires a conversion from std::string_view to std::string, and unordered_set<std::string>::insert has no overload taking a string_view.
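The failure mode discussed above can be reproduced and worked around as follows (op names are placeholders): an array of `std::string_view` cannot feed the set's iterator-range constructor, because `insert` has no overload taking a string_view, but converting each element explicitly compiles fine.

```cpp
#include <array>
#include <string>
#include <string_view>
#include <unordered_set>

constexpr std::array<std::string_view, 2> ops = {"Add", "Mul"};

// Would NOT compile: insert(const std::string_view&) has no matching overload.
// std::unordered_set<std::string> bad(ops.begin(), ops.end());

std::unordered_set<std::string> MakeSet() {
  std::unordered_set<std::string> out;
  for (std::string_view v : ops) {
    out.insert(std::string(v));  // explicit string_view -> string conversion
  }
  return out;
}
```

This is why the snippet earlier in the thread uses a `constexpr std::array` of `const char*` instead: `const char*` converts to `std::string` implicitly, while `std::string_view` requires an explicit construction.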

@satyajandhyala satyajandhyala merged commit eba7305 into master Feb 10, 2022
@satyajandhyala satyajandhyala deleted the sajandhy/remove_static_globals branch February 10, 2022 21:31
@satyajandhyala satyajandhyala mentioned this pull request Feb 10, 2022