
Separate Inference and Promotion into Separate Passes #1871

Merged
merged 17 commits into from
Jan 29, 2024

Conversation

calebmkim
Contributor

@calebmkim calebmkim commented Jan 25, 2024

This is the first step of the plan described here.

Apologies for the really big PR, but a lot of it is just moving around code and slightly changing tests.

The main new thing is the fixup_timing function. Suppose component B instantiates component A. Suppose also that we infer component A's latency and use it to infer the latency of B. If we decide not to promote A, then the inference information in B is no longer valid. The fixup_timing function should account for that. I wrote the function so that it can (hopefully) also be applied to compaction, when we need to re-infer components after we compact them. There are tests for this behavior here and here.

@calebmkim calebmkim changed the title Separate Inference and Promotion into separate stages Separate Inference and Promotion into Separate Passes Jan 25, 2024
@rachitnigam
Contributor

Awesome! BTW, the previous implementation of this kind of "fixup" logic is in WithStatic. We can remove it in a follow-up PR as we start getting rid of @static (#1429).

@calebmkim
Contributor Author

calebmkim commented Jan 26, 2024

BTW, the previous implementation of this kind of "fixup" logic is in WithStatic.

Yeah, I actually use WithStatic already in the FixUp (I just changed the @static keyword to @promote_static). It's just that the FixUp also needed to remove and then re-infer the latencies of groups. Perhaps I can move that code to the inference_analysis.rs file to keep it all in one place?

@sampsyo
Contributor

sampsyo commented Jan 27, 2024

All sounds good, and the implementation looks great! As we discussed synchronously, this "fixup" behavior for backing out some cross-component inference seems like the right thing.

The one thing I realized I don't completely understand about this is: I think the rule is (using your scenario above) we need to remove inferred latencies from B if B writes to A. I don't quite understand why writes are important in particular… I guess I would have expected the documentation for the function to say, like, "uses" or something. Is it just because any use involves a write to some input port (at least to the go port)? And merely reading from ports does not actually activate the control of the component?

@calebmkim
Contributor Author

calebmkim commented Jan 27, 2024

I don't quite understand why writes are important in particular… I guess I would have expected the documentation for the function to say, like, "uses" or something. Is it just because any use involves a write to some input port (at least to the go port)? And merely reading from ports does not actually activate the control of the component?

Yeah; in fact, I think my implementation was a bit imprecise (i.e., it removes the @promotable attribute too easily). A more precise rule could be something like: we need to remove inferred latencies from B if B writes to the go port of A. Just reading from A shouldn't affect the component's latency.

@sampsyo
Contributor

sampsyo commented Jan 27, 2024

Makes sense; thanks for clarifying! This level of precision seems fine to leave to future work. 😃

@rachitnigam
Contributor

Does this fix #1852?

@calebmkim
Contributor Author

No, but I have a PR that's almost ready that will fix it.

@calebmkim
Contributor Author

I'll merge this (I have more PRs coming, so I can address any changes in those PRs as well).

@calebmkim calebmkim merged commit 4beb1cf into main Jan 29, 2024
7 checks passed
@calebmkim calebmkim deleted the separate-inference-promotion branch January 29, 2024 14:06
rachitnigam pushed a commit that referenced this pull request Feb 16, 2024
* progress

* first try at separating passes

* use helper function

* better tests, cleaner code

* existing tests passing

* inference

* updated code

* clippy

* clippy is annoying

* fixup works now

* fixup is fixed up

* another fixup test

* rewrite tests

* cleaner code

* commented code