Limit Multiple Executions #1387

Closed
jquick opened this issue Aug 7, 2015 · 4 comments
Comments

@jquick

jquick commented Aug 7, 2015

We have a fair number of jobs that we'd like to run with multiple executions on the same node, but we need a cap due to resource constraints.

Having a "Multiple Execution Limit" would allow us to cap a job at x concurrent executions. The job could then scale up to x executions when there is enough work to warrant it.

Our current workaround is an external script that calls into the API to see how many executions are running for the job. This is pretty ugly, and something built in would be extremely appreciated. If there is already a way to do this that we have overlooked, please let me know.
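For reference, a minimal sketch of that kind of API-based gate is below. The URL, token, job ID, and cap are placeholders, and the exact executions endpoint and response shape vary by Rundeck API version, so verify them against your instance before use:

```shell
#!/bin/bash
# Pre-step gate: refuse to start if too many executions of this job
# are already running. All values below are illustrative placeholders.
RD_URL="https://rundeck.example.com"   # hypothetical server URL
RD_TOKEN="XXXX"                        # hypothetical API token
JOB_ID="my-job-uuid"                   # hypothetical job ID
MAX_RUNNING=4                          # cap on concurrent executions

# Decide whether another execution may start, given the current count.
under_cap() {
  local running=$1 max=$2
  if [ "$running" -lt "$max" ]; then
    echo "ok: $running of $max slots used"
    return 0
  else
    echo "cap reached: $running of $max executions running"
    return 1
  fi
}

# Fetch the running count from the API (assumed endpoint -- check the
# API docs for your Rundeck version), then gate the job on it:
# running=$(curl -s -H "X-Rundeck-Auth-Token: $RD_TOKEN" \
#   -H "Accept: application/json" \
#   "$RD_URL/api/20/job/$JOB_ID/executions?status=running" \
#   | jq '.executions | length')
# under_cap "$running" "$MAX_RUNNING" || exit 1
```

Used as the first workflow step, a non-zero exit from the gate stops the job before the real work begins.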

Thanks!

@mathieuchateau
Contributor

Hello,

In the meantime, you can insert a step before the main one to check
resource availability: exit 0 proceeds to the main job, while exit 1
generates an error and prevents the job from executing.

Here is an example of such a script. It checks for 800 MB of free memory;
if not available, it sleeps 10 seconds and rechecks, up to 10 times,
failing if memory is still insufficient:

#!/bin/bash
wait=1
count=0
while [ "$wait" == "1" ]; do
  echo "checking for available resource"
  # free memory in MB (fourth column of the Mem line)
  freeMem=$(free -t -m | grep Mem | awk '{print $4}')
  if [ "$freeMem" -gt 800 ]; then
    echo "Enough resource to go - $freeMem"
    exit 0
  else
    echo "waiting for more resource, retry number $count"
    if [ "$count" -lt 10 ]; then
      count=$((count + 1))
      sleep 10
    else
      echo "too many retries, aborting"
      wait=0
      exit 1
    fi
  fi
done

Regards,
Mathieu CHATEAU
http://www.lotp.fr


@BlairPaine

I agree +1

@jquick
Author

jquick commented Aug 7, 2015

@mathieuchateau - Thanks for the tip. Is there a way to stop the job at that point without raising an error? Maybe using the new halt somehow? We send emails on errors, and it would be misleading to get error tickets when the job is just being skipped due to the memory cap.

@jquick
Author

jquick commented Aug 7, 2015

Ah, if I change the job to "Step-oriented" I can set up a halt workflow on error. Though it looks like "halt" is still considered an error as far as alerting goes.

@jtobard jtobard self-assigned this May 25, 2018
@gschueler gschueler added this to the 3.0.x milestone May 31, 2018
@gschueler gschueler changed the title Feature request: Limit Multiple Executions Limit Multiple Executions Jun 1, 2018