add runtime fields/deltas to def, proxy, job data elements #5138
Conversation
Did you mean to remove this?
It's not removed (nothing would work otherwise), I used the corresponding one.
It's just massively reduced.. like in the above (but not using that new version, so it must have happened earlier).
optional string env_script = 11;
optional string err_script = 12;
optional string exit_script = 13;
optional float execution_time_limit = 14;
optional string platform = 15;
Left this in, so current API doesn't break.
Makes sense to keep this, as the job platform is chosen based on the runtime configuration (e.g. platform could be set to a platform-group), causing these two values to differ.
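To illustrate the point above, here is a minimal sketch (the helper name and group mapping are hypothetical, not Cylc's actual API) of how a configured platform-group resolves to a concrete platform at submission time, so the definition's platform and the job's platform can legitimately differ:

```python
import random

# Hypothetical platform-group mapping for illustration only.
PLATFORM_GROUPS = {"hpc": ["hpc_login1", "hpc_login2"]}

def resolve_platform(configured: str) -> str:
    """Return a concrete platform name for a platform or platform-group.

    A platform-group resolves to one of its member platforms, so the
    resolved name can differ from what the runtime config says.
    """
    members = PLATFORM_GROUPS.get(configured)
    if members:
        return random.choice(members)  # group: pick a member platform
    return configured  # already a concrete platform name
```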
@@ -617,40 +617,6 @@ def _create_job_log_path(workflow, itask):
    exc.filename = target
    raise exc

@staticmethod
def _get_job_scripts(itask, rtconfig):
This seemed to be a duplicate of what's already done in config.py.
The only difference is the replacement of the environment variable CYLC_TASK_CYCLE_POINT with str(itask.point) (which isn't available earlier):
comstr = (
    "cylc workflow-state "
    + " --task=" + itask.tdef.workflow_polling_cfg['task']
    + " --point=" + str(itask.point)
)
Its removal doesn't appear to have an impact; however, I could be wrong.. (is the env variable ever not available?)
CYLC_TASK_CYCLE_POINT should always be available in job execution environments, so should be good.
Took a look and I think we are safe to remove this, cheers.
try:
    platform = rtconfig['platform']['name']
except (KeyError, TypeError):
Shouldn't rtconfig['platform'] just be a string of the platform name? In what circumstance would it be a dict containing a name key?
The confusing thing is that the platform gets changed before job submission, so both forms exist.
The dict version appears to be more common..
That is confusing. Is it just because one of your calls to runtime_from_config passes in the job config instead of the taskdef.rtconfig?
Ping @dwsutherland - one question to respond to here. As I recall, I didn't think it should be like this, hence my suggested reason for it.
Yes, the platform gets fully resolved/loaded on job construction (into a parsec dict).. however it's just a string in the workflow config runtime.
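As a rough illustration of why the try/except above is needed — a hypothetical helper (not Cylc's actual code) that accepts both the raw string form found in the workflow config runtime and the resolved dict form produced at job construction:

```python
def platform_name(rtconfig: dict) -> str:
    """Extract the platform name from a runtime config section.

    Handles both forms seen in practice:
    - the raw workflow config, where 'platform' is a plain string
    - the job config after platform resolution, where 'platform' is a
      dict containing a 'name' key

    Falls back to 'localhost' when unset (an assumption for this sketch).
    """
    platform = rtconfig.get('platform')
    try:
        return platform['name']          # resolved dict form
    except (KeyError, TypeError):
        return platform or 'localhost'   # plain string form (or unset)
```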
Tested out in GraphiQL, looks good. Just noticed 1 typo.
This will need to be rebased once #4901 goes in (or vice versa), and the protobuf module regenerated.
Another problem Oliver has just realised: the order of the environment variables.
I think we'll need a list of environment variables to get around this, e.g.:
{
    "environment": [
        {"key": "answer", "value": 42}
    ]
}
Ok, I've fixed the problem..
WRT the environment order: dictionaries are ordered, and I dump them as string fields, which the resolver returns in that order.
So unless something odd happens between this and firing it off to the client, the fields should be fine.
I've had a look at it.
Sorted now.. It's because some submission failures happen before full job construction.
Dictionaries are ordered in Python land; however, they are not in either JSON or JS. Because JSON is the transport format GraphQL uses, and the UI code is JS, I would not expect this order to be preserved. So I think we need a JSON structure like this to preserve order:
[
    {
        "key": "FOO",
        "value": "42"
    },
    {
        "key": "BAR",
        "value": "answer"
    }
]
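A minimal sketch of this approach (the helper names are illustrative, not the actual implementation): serialising the environment as a list of key/value objects keeps the order stable through JSON transport, and is straightforward to reverse on the client side:

```python
import json

def env_to_pairs(env: dict) -> str:
    """Serialise an environment mapping as an ordered list of
    {"key": ..., "value": ...} objects, so JSON/JS clients cannot
    silently reorder the entries."""
    return json.dumps([{"key": k, "value": str(v)} for k, v in env.items()])

def pairs_to_env(payload: str) -> dict:
    """Rebuild the mapping, preserving the original insertion order."""
    return {item["key"]: item["value"] for item in json.loads(payload)}
```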
Ok, I've put a workaround in that will check and alter it.
Here's an old and new (via a couple of restarts):
This workaround can be removed some time in the future...
(Most of the missing coverage is pre-existing, and was just replicated in / copied to the workaround.)
I have opened a PR against this with a functional test for the restart: dwsutherland#10 |
LGTM, tested runtime fields on the following objects:
- Task (task config as defined in flow.cylc, no broadcasts applied)
- TaskProxy (task config + broadcasts)
- Family (family config as defined in flow.cylc, no broadcasts applied)
- FamilyProxy (family config + broadcasts)
- Job (task config + broadcasts at the time of job submission, no backsies)
All worked as expected 🚀
I spotted two minor issues during the course of testing...
1) Duration lists get formatted as integers when cleared.
When I create a broadcast it is presented to me in native-ISO8601 format:
$ cylc broadcast broad -s 'execution retry delays = PT1M'
Broadcast set:
+ [*/root] execution retry delays=PT1M
But when I clear the broadcast it is presented to me in integer format:
$ cylc broadcast --clear broad
Broadcast cancelled:
- [*/root] execution retry delays=[60.0]
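For illustration, the inconsistency above could be avoided by converting the stored seconds back to ISO 8601 duration format for display. A rough sketch (not the actual Cylc formatter — real durations should go through a proper duration library such as isodatetime):

```python
def seconds_to_iso8601(seconds: float) -> str:
    """Format a duration in seconds as a minimal ISO 8601 duration
    string, e.g. 60.0 -> "PT1M", 90.0 -> "PT1M30S".

    Only handles whole seconds up to hours; a sketch, not a full
    ISO 8601 implementation."""
    total = int(seconds)
    hours, rem = divmod(total, 3600)
    minutes, secs = divmod(rem, 60)
    parts = []
    if hours:
        parts.append(f"{hours}H")
    if minutes:
        parts.append(f"{minutes}M")
    if secs or not parts:
        parts.append(f"{secs}S")
    return "PT" + "".join(parts)
```

With this, clearing the broadcast would report `execution retry delays=[PT1M]` rather than `[60.0]`.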
2) Updated deltas contain all fields
E.G. for this GraphQL query:
subscription {
  deltas {
    updated(stripNull: true) {
      taskProxies {
        id
        runtime {
          script
          executionRetryDelays
        }
      }
    }
  }
}
And the following broadcast:
$ cylc broadcast broad -s 'execution retry delays = PT10M'
Broadcast set:
+ [*/root] execution retry delays=PT10M
I get an updated delta like this:
{
  "data": {
    "deltas": {
      "updated": {
        "taskProxies": [
          {
            "id": "~osanders/broad//20191209T1200Z/pub",
            "runtime": {
              "script": "sleep 1",
              "executionRetryDelays": "PT10M"
            }
          },
          {
            "id": "~osanders/broad//20191209T1200Z/wipe_bar",
            "runtime": {
              "script": "sleep 1",
              "executionRetryDelays": "PT10M"
            }
          },
This contains the execution retry delays, which have changed, but also the script, which hasn't.
As long as this doesn't cause broadcast runtime fields to get re-sent for things like task state changes, it is probably harmless.
Co-authored-by: Ronnie Dutta <[email protected]>
Co-authored-by: Oliver Sanders <[email protected]>
Fixed.. Internal objects/formats needed to be stringified for the response.
Yes, it's not fine-grained at the moment.. It would be a tricky thing to do, as some runtimes have been added or cleared (and then there's inheritance).. I filter out those runtimes whose serialized form hasn't changed.
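The filtering described above could look something like this sketch (a hypothetical helper, assuming runtimes are plain dicts keyed by task name — not the actual implementation):

```python
import json

def changed_runtimes(old: dict, new: dict) -> dict:
    """Return only the runtime entries whose serialized form differs
    from the previous state, so unchanged runtimes are not re-sent
    in the delta."""
    return {
        name: runtime
        for name, runtime in new.items()
        if json.dumps(runtime, sort_keys=True)
           != json.dumps(old.get(name), sort_keys=True)
    }
```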
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Thanks @dwsutherland 🚀
closes #5054
Includes runtime fields for def, proxy, and jobs.
example:
![image](https://user-images.githubusercontent.com/11400777/190537141-0b8d3f34-408c-4aab-8865-7187189c4245.png)
broadcasting, and then running foo again, results in updated runtime fields.
Cancelling a broadcast will create deltas on the affected proxy nodes also.
Check List
- I have read CONTRIBUTING.md and added my name as a Code Contributor.
- setup.cfg and conda-environment.yml.
- CHANGES.md entry included if this is a change that can affect users.