
fix(app, shared-data): app crashes when using python apiLevel<2.16 with fixed trash #16451

Conversation

Contributor

@TamarZanzouri TamarZanzouri commented Oct 9, 2024

Overview

Closes EXEC-759 - fix deck map rendering when the trash is fixed (Python API version < 2.16).

Test Plan and Hands on Testing

Upload the following protocol with no labware on the deck:

from opentrons import protocol_api, types

metadata = {
    "apiLevel": "2.15",
}

requirements = {
    "robotType": "OT-3"
}

def run(ctx: protocol_api.ProtocolContext) -> None:
    tip_rack_50 = ctx.load_labware("opentrons_flex_96_filtertiprack_50ul", 4)
    ctx.move_labware(labware=tip_rack_50, new_location=5, use_gripper=True)
1. Enter Error Recovery (ER).
2. Select "Manually remove labware".
3. Make sure the app does not crash after the countdown.
4. Make sure the deck map renders.

Tested the deck map locally; it works as expected.

Changelog

Fixed a bug when using the deck map with a fixed trash in Python protocols below apiLevel 2.16.
Added a shared-data method to get the fixed trash definition (a rough sketch of such a helper follows).
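
A minimal sketch of what such a shared-data helper could look like, assuming the Flex fixed-trash definition JSON and its location inside shared-data; the actual signature and implementation may differ:

import type { LabwareDefinition2 } from '@opentrons/shared-data'
// Assumed location and load name of the Flex fixed-trash definition.
import fixedTrashDef from '../../labware/definitions/2/opentrons_1_trash_3200ml_fixed/1.json'

export function getFixedTrashLabwareDefinitionsByUri(): Record<
  string,
  LabwareDefinition2
> {
  const def = fixedTrashDef as LabwareDefinition2
  // Key by labware URI (namespace/loadName/version) so the result can be
  // merged with definitions derived from protocol analysis commands.
  const uri = `${def.namespace}/${def.parameters.loadName}/${def.version}`
  return { [uri]: def }
}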

Review requests

Do the changes make sense? Did I forget anything?

Risk assessment

low

@TamarZanzouri TamarZanzouri requested a review from a team as a code owner October 9, 2024 21:07
@TamarZanzouri TamarZanzouri requested review from brenthagen, sfoster1 and mjhuff and removed request for a team and brenthagen October 9, 2024 21:07
Member

@sfoster1 sfoster1 left a comment


Some code style comments that rise to near-alarm levels when I note that the code that was there before your PR does a full command-list scan every time through the reduce...

Contributor

@mjhuff mjhuff left a comment


Yeah, the major issue is definitely my fault here. I think Seth covered everything; we need to hoist out the command-scanning util. My one comment shouldn't apply if you use his recommendation for how to refactor getFixedTrashLabwareDefinitionsByUri. Thanks for doing all this!

Comment on lines 266 to 269
if (lw.id === 'fixedTrash') {
  labwareDefinitionsByUri = getFixedTrashLabwareDefinitionsByUri(
    lw.definitionUri
  )
Contributor

@mjhuff mjhuff Oct 9, 2024


Seth's approach should solve this, but if we do end up needing some sort of association of the lw to the actual definitionUri, we should validate that the lw is a fixedTrash within the shared-data util itself.
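
For illustration, a hedged sketch of that kind of guard; the function name here is hypothetical and the signature is an assumption, not the util this PR adds:

import type { LoadedLabware, LabwareDefinition2 } from '@opentrons/shared-data'

const FIXED_TRASH_ID = 'fixedTrash'

// Hypothetical helper: validate the labware inside the util so call sites
// don't each repeat the `lw.id === 'fixedTrash'` check.
export function getFixedTrashDefinitionFor(
  lw: LoadedLabware,
  defsByUri: Record<string, LabwareDefinition2>
): LabwareDefinition2 | null {
  if (lw.id !== FIXED_TRASH_ID) return null
  return defsByUri[lw.definitionUri] ?? null
}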

Contributor

@mjhuff mjhuff left a comment


Actually, we do this same nested iterative check on L208, so if you could move that outside of the reduce using the same aforementioned logic, that would be great!

EDIT: To clarify, it would probably be best to invoke this only once outside of BOTH getRunCurrentModulesInfo and getRunCurrentLabwareInfo and pass a memoized value of the labware defs to those functions.

Contributor

@mjhuff mjhuff left a comment


See comment above.

  • We need to move getLoadedLabwareDefinitionsByUri out of getRunCurrentModulesInfo, too.
  • We should memoize the value.

We can do this by hoisting your new function to the top level of useDeckMapUtils and doing something like:

const labwareDefinitionsByUri = useMemo(
  () => getLoadedLabwareDefinitionsByUri(protocolAnalysis.commands),
  [protocolAnalysis]
)

and then pass that to getRunCurrentModulesInfo and getRunCurrentLabwareInfo.

Because the new getLabwareDefinition function is used in both places, we can just write it as a new top-level function rather than within getRunCurrentModulesInfo or getRunCurrentLabwareInfo.
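
A rough sketch of that shape, with the hook's real parameters trimmed down; the helper names come from this thread, but the import source and signatures are assumptions:

import { useMemo } from 'react'
// Import source assumed; getLoadedLabwareDefinitionsByUri may live elsewhere.
import {
  getLoadedLabwareDefinitionsByUri,
  type CompletedProtocolAnalysis,
} from '@opentrons/shared-data'

type LabwareDefsByUri = ReturnType<typeof getLoadedLabwareDefinitionsByUri>

export function useDeckMapUtils(
  protocolAnalysis: CompletedProtocolAnalysis | null
): { labwareDefinitionsByUri: LabwareDefsByUri | null } {
  // Scan the command list once per analysis, not once per reduce pass.
  const labwareDefinitionsByUri = useMemo(
    () =>
      protocolAnalysis != null
        ? getLoadedLabwareDefinitionsByUri(protocolAnalysis.commands)
        : null,
    [protocolAnalysis]
  )

  // The memoized map would then be passed into both helpers, e.g.:
  //   getRunCurrentModulesInfo({ runRecord, labwareDefinitionsByUri, ... })
  //   getRunCurrentLabwareInfo({ runRecord, labwareDefinitionsByUri, ... })
  return { labwareDefinitionsByUri }
}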

Member

@sfoster1 sfoster1 left a comment


Looks awesome, but also what Jamey said. No need for a re-review after that, tbh.

@TamarZanzouri TamarZanzouri requested a review from mjhuff October 10, 2024 18:35
Contributor

@mjhuff mjhuff left a comment


Awesome! Now you can make a resume bullet point along the lines of "exponentially improved the performance of Error Recovery"

@TamarZanzouri TamarZanzouri merged commit 7fb38c9 into edge Oct 10, 2024
33 checks passed
@TamarZanzouri TamarZanzouri deleted the EXEC-759-python-protocols-with-api-level-bellow-2-16-makes-the-app-crash-in-er branch October 10, 2024 22:14
mjhuff added a commit that referenced this pull request Oct 14, 2024
…col command text (#16478)

After #16451, it seemed like a good idea to do an audit of our O(n) getLabwareDefinitionsFromCommands util to see if we could improve performance elsewhere in the app. Turns out, pretty much every place that it's used has room for improvement (mainly in the case handled by this PR and in LPC flows).

One such place is within each protocol command text. When we display a list of protocol command texts, for every protocol step associated with a labwareId, we iterate over the entirety of protocol analysis. This commit acts as a half-measure: hoist out the util far enough to reduce most of the negative performance implications.
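
For illustration, a sketch of that pattern with invented component and prop names; it is not the actual change in #16478, just the "compute once per list, not once per row" idea:

import * as React from 'react'
import type { RunTimeCommand, LabwareDefinition2 } from '@opentrons/shared-data'
// Import path assumed for illustration.
import { getLabwareDefinitionsFromCommands } from './getLabwareDefinitionsFromCommands'

interface CommandTextListProps {
  commands: RunTimeCommand[]
  renderCommandText: (
    command: RunTimeCommand,
    allRunDefs: LabwareDefinition2[]
  ) => React.ReactNode
}

export function CommandTextList({
  commands,
  renderCommandText,
}: CommandTextListProps): JSX.Element {
  // One O(n) scan of the command list for the whole set of rows,
  // instead of one scan per rendered command text.
  const allRunDefs = React.useMemo(
    () => getLabwareDefinitionsFromCommands(commands),
    [commands]
  )
  return (
    <>
      {commands.map((c, i) => (
        <React.Fragment key={i}>{renderCommandText(c, allRunDefs)}</React.Fragment>
      ))}
    </>
  )
}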