
inference: create a separate type for doing optimizations #23276

Merged
merged 1 commit into master on Oct 16, 2017

Conversation

vtjnash
Member

@vtjnash commented Aug 15, 2017

Most fields of InferenceState aren't valid during optimization,
so the goal is to reflect that in the structure of the types.

Similarly, doing optimization operations during inference would be
invalid, so this helps distinguish those cases as well.

Finally, this makes it possible for non-InferenceState-initiated IR passes
(e.g. external to typeinf) to make use of these passes.

(Also, since it may not be immediately clear: the other goal of starting this is to eventually separate everything that takes an InferenceState into one file, and everything that takes an OptimizationState into a second file.)
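A rough sketch of the direction described above (the `OptimizationState(frame::InferenceState)` constructor and the `params` field match the diff below; the other fields are illustrative assumptions, not the actual definition in inference.jl):

```julia
# Illustrative sketch only: fields other than `params` are assumptions.
mutable struct OptimizationState
    linfo                    # method instance being optimized
    src                      # the CodeInfo to run passes over
    params::InferenceParams  # carried over from inference

    function OptimizationState(frame::InferenceState)
        # copy over only the fields that remain valid after inference
        new(frame.linfo, frame.src, frame.params)
    end
end
```

With this split, an optimization pass that takes an `OptimizationState` simply cannot be called with an `InferenceState` (it is a MethodError), instead of silently reading fields that are stale after inference finishes.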

@vtjnash force-pushed the jn/split-out-optpass branch from 7cc97e7 to 8ad6424 on August 15, 2017 23:05
"invalid age range update")
nothing
end
function update_valid_age!(min_valid::UInt, max_valid::UInt, sv::OptimizationState)
Contributor

why isn't the mutated state input the first argument to this function?

Member Author

consistency

Contributor
@tkelman Aug 16, 2017

with what? that's an inconsistency with Julia API recommendations (aka conventions) everywhere else

Member Author

the rest of the file

Contributor

then

why isn't the mutated state input the first argument to this function

in the rest of the file?

Member Author

probably convention or something

Member

These conventions are not really important for internal functions.

Member

Re: the confused emojis: yes it's nice to follow the conventions, but we have more important things to spend time on than reviewing the design of every private function that nobody will ever see. Priorities, priorities, priorities.

Contributor

most critical thing to fix right now, no, but it is actively confusing for APIs to be backwards for anyone who's debugging around this file
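For reference, the convention being invoked above is the one from the Julia style guide: functions that mutate an argument end in `!` and take the mutated argument first. A minimal illustration (the alternative signature shown is hypothetical, not what the PR contains):

```julia
# Base follows "mutated argument first" for ! functions:
xs = [3, 1, 2]
push!(xs, 4)    # the collection being mutated leads
sort!(xs)       # in-place sort; again the mutated state comes first

# By that convention the signature under review would read
#   update_valid_age!(sv::OptimizationState, min_valid::UInt, max_valid::UInt)
# instead of taking the mutated state last.
```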

@vtjnash force-pushed the jn/split-out-optpass branch from 8ad6424 to cfcc208 on August 16, 2017 15:52
@@ -2742,19 +2806,31 @@ end

#### helper functions for typeinf initialization and looping ####

# scan body for the value of value of the largest referenced label
Contributor

"value of value" intended?

@vtjnash force-pushed the jn/split-out-optpass branch from cfcc208 to 77fa892 on August 16, 2017 18:09
params::InferenceParams
function OptimizationState(frame::InferenceState)
s_edges = frame.stmt_edges[1]
if s_edges=== ()
Member

operator spacing

@ararslan added the compiler:inference (Type inference) label on Aug 16, 2017
s_edges = frame.stmt_edges[1]
if s_edges=== ()
s_edges = []
frame.stmt_edges[1] = s_edges
Member

Is there any way around mutating frame? That's a pretty strange thing for a constructor to do.

Member Author

I could make it part of the initialization for frame, but it's just more sensible to handle the lazy initialization here.
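The lazy initialization in question is visible in the diff above: the constructor replaces the `()` placeholder with a real edge list the first time it is needed. A standalone sketch of that pattern (type and names simplified and hypothetical):

```julia
mutable struct Frame
    stmt_edges::Vector{Any}   # entries start out as the placeholder ()
end

function first_edges!(frame::Frame)
    s_edges = frame.stmt_edges[1]
    if s_edges === ()                  # not yet allocated
        s_edges = []
        frame.stmt_edges[1] = s_edges  # lazily initialize, mutating frame
    end
    return s_edges
end
```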

@@ -3365,40 +3460,44 @@ function optimize(me::InferenceState)
type_annotate!(me)

# run optimization passes on fulltree
force_noinline = false
force_noinline = true
Member

Why did this change?

Member Author

We've been miscomputing this; but fortunately, since it only affects the cached=no case, there haven't been any consumers of the data.

Member

It doesn't make sense to me that force_noinline would default to true. It's supposed to be true only for functions declared @noinline. In fact with this change I think it's now always true? Am I missing something?

Member

Regardless of what the correct default is here, this should not be included in a large, unrelated pull request. If this is wrong, it should be changed in a separate individual PR with a commit message that explains what was wrong and why the change is correct.

Member Author

It's a bit of an abuse of the variable name (phrasing it in the positive would be more accurate), since it should be dependent on optimize=true. We use this later to decide whether to compute the inlining optimization for the function, although that operation should instead be separated out into optimization, as the rest of this PR is working towards enforcing.

Member

If you're changing the meaning of this to the complete opposite, can you please change the name of the variable?

Most fields of InferenceState aren't valid during optimization,
so the goal is to reflect that in the structure of the types

Similarly, doing optimization operations during inference would be
invalid, so this helps distinguish those cases as well.

Finally, this makes it possible for non-InferenceState-initiated IR passes
(e.g. external to typeinf) to make use of these passes.
@vtjnash force-pushed the jn/split-out-optpass branch from 77fa892 to 45d0650 on October 12, 2017 18:58
@vtjnash merged commit a9b90ca into master on Oct 16, 2017
@vtjnash deleted the jn/split-out-optpass branch on October 16, 2017 20:46