
slicer endpoint should not hang on new slicer creation resolves #659 … #667

Merged Feb 27, 2018 (3 commits into terascope:master from the slicer_api_timeout branch)

Conversation

@jsnoble (Member) commented Feb 16, 2018

#589

}
const errMsg = `could not get slicer statistics, error: ${parseError(errObj)}`;
logger.error(errMsg);
sendError(res, 500, errMsg);
Review comment from a Member:

Come on ... this code sequence is repeated 4 times... I can't explain how frustrating that is after all the time we spent trying to clean this module up.

Does this really require special casing like this? Where does that .code attribute come from? Why can't it simply be handled in the general error handler?
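One way to collapse the repeated sequence into a single responder, as the comment suggests. A minimal sketch, assuming the module's existing parseError, logger, and sendError from the snippet above; handleApiError and its prefix parameter are hypothetical names, not from this PR:

function handleApiError(res, logger, code, prefix, errObj) {
    // Build the message once, log it, and send the HTTP error,
    // instead of repeating these three lines at every call site.
    const errMsg = `${prefix}, error: ${parseError(errObj)}`;
    logger.error(errMsg);
    sendError(res, code, errMsg);
}

// Each of the four repeated sites would then reduce to one line:
// handleApiError(res, logger, 500, 'could not get slicer statistics', errObj);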

@@ -601,22 +601,20 @@ module.exports = function module(context, messaging, executionService) {
    }));
}

-function getSlicerStats(exId) {
+function getSlicerStats(exIds, specificId) {
Review comment from a Member:

This looks questionable to me. What required this API to change in this way?
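For context, one guess at what the new signature might be for; only the signature comes from the diff above, and the body (including fetchStatsFor) is hypothetical:

function getSlicerStats(exIds, specificId) {
    // If the caller asked about one specific execution that is not
    // currently active, fail fast rather than waiting on a slicer
    // that may never respond (the hang described in #659).
    if (specificId && !exIds.includes(specificId)) {
        return Promise.reject(new Error(`could not find execution: ${specificId}`));
    }
    // Otherwise gather stats for every active execution.
    return Promise.all(exIds.map(exId => fetchStatsFor(exId)));
}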

@godber (Member) commented Feb 23, 2018

I am not really seeing #659 as being resolved.

When I run the integration tests, I have typically been able to run tstop -p 45678 and watch the jobs progress ... now it appears to hang. Checking with curl, it appears that /txt/slicers blocks while a job is running, whereas /txt/workers and /txt/nodes still return.

Pretty easy to reproduce ... just start up the integration tests and walk through the endpoints ...

curl -Ss localhost:45678/txt/workers
curl -Ss localhost:45678/txt/jobs
curl -Ss localhost:45678/txt/ex
curl -Ss localhost:45678/txt/slicers

They will all return except slicers. You might catch the tests between jobs, in which case /txt/slicers returns then ... but it's not hard to get it to fail.
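To make the hang observable without blocking the terminal, the same probe can be bounded with curl's --max-time flag (a reproduction aid added here, not part of the original report); when the endpoint hangs, curl exits with code 28 after five seconds:

curl -Ss --max-time 5 localhost:45678/txt/slicers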

@kstaken (Member) commented Feb 27, 2018

I talked to @jsnoble about this, and he thinks the issue with hitting it while the integration tests are running comes down to the rapid-fire way jobs are scheduled in the tests: tstop catches them after a job has ended but before the slicer has fully shut down. This is a different issue; he will open a separate ticket and put a fix in a separate PR.

As for #659, this PR does appear to resolve that issue. The integration tests otherwise pass, so going ahead with the merge.

@kstaken kstaken merged commit 5fc34f1 into terascope:master Feb 27, 2018
@jsnoble jsnoble deleted the slicer_api_timeout branch July 5, 2018 15:30