Explicitly notify user if compiled bot too large #288

Merged (7 commits), Nov 14, 2017

Conversation

lidavidm (Contributor):

Fixes #287

@Janzert (Contributor) left a comment:

This looks good to go. All the above are just nitpicks that are fine as is.


For our reference, here is the trace of the error:
"""

Contributor:

Given that the size is now explicitly checked before upload, I think I'd leave out the part about "likely the final compiled bot was too large".

@@ -6,6 +6,9 @@
from time import gmtime, strftime


MAX_BOT_UPLOAD_SIZE = 100 * 1024 * 1024
Contributor:

Might want to leave a note here and in apiserver/worker/config.py near MAX_COMPILED_BOT_UPLOAD_SIZE that the values should match.

@@ -95,6 +101,8 @@ def storeBotRemotely(user_id, bot_id, zip_file_path):
},
files={"bot.zip": zip_contents})
print("Posting compiled bot archive %s\n" % r.text)
# We don't check the status code since we instead check the
# bot hash.
Contributor:

There are many error codes that are unlikely to resolve on retry (almost by definition pretty much all the 4xx codes) and we could bail out without trying another 99 times. Also any response error that is likely temporary probably deserves a backoff before retrying.

But leaving it for now is fine, the other changes definitely make it much better than what happens now.
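To illustrate this suggestion, here is a minimal sketch (an assumption, not the PR's actual code) of an upload retry loop that bails out immediately on 4xx responses and backs off before retrying transient errors. The upload_with_retries helper, its url parameter, and the constants are hypothetical:

import time

import requests

MAX_TRIES = 10
MAX_BACKOFF_SECONDS = 32  # cap on the exponential backoff

def upload_with_retries(url, zip_contents):
    """Retry transient upload failures with exponential backoff;
    give up immediately on client errors (4xx)."""
    backoff = 1
    for _ in range(MAX_TRIES):
        r = requests.post(url, files={"bot.zip": zip_contents})
        if r.status_code < 400:
            return r  # success; the caller still verifies the bot hash
        if 400 <= r.status_code < 500:
            # Client errors are unlikely to resolve on retry; bail out now.
            raise RuntimeError("Upload rejected: HTTP %d" % r.status_code)
        # Server or other transient errors: wait before trying again.
        time.sleep(backoff)
        backoff = min(backoff * 2, MAX_BACKOFF_SECONDS)
    raise RuntimeError("Upload failed after %d attempts" % MAX_TRIES)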

@lidavidm (Contributor, Author):

Thanks for the feedback. I changed it to back off between uploads, and reduced the number of tries to 10 accordingly.

@harikmenon requested a review from j-clap, November 13, 2017 14:41
@@ -16,6 +16,7 @@
# Flask settings
# Max size of an upload, in bytes
MAX_BOT_UPLOAD_SIZE = 20 * 1024 * 1024
# Needs to match corresponding value in apiserver/config.py
Contributor:

Why does this need to match the value in itself? IRL recursion :P



# Needs to match corresponding value in apiserver/config.py
MAX_BOT_UPLOAD_SIZE = 100 * 1024 * 1024
@j-clap (Contributor), Nov 13, 2017:

Rather than define it twice can't we just pick it from there/another central location for both?

@lidavidm (Contributor, Author):

Yeah, the issue is that the worker doesn't assume it has access to the server codebase (though this happens to be true currently). Perhaps we can make it part of the config.json, or have a server endpoint that returns the value?
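As a sketch of the second alternative mentioned here, a server endpoint that reports the limit, assuming the API server's existing Flask app; the route path and response shape are hypothetical:

from flask import Flask, jsonify

app = Flask(__name__)

MAX_BOT_UPLOAD_SIZE = 100 * 1024 * 1024  # single source of truth on the server

@app.route("/v1/api/upload_limits")
def upload_limits():
    # Workers could query this instead of duplicating the constant locally.
    return jsonify({"max_bot_upload_size": MAX_BOT_UPLOAD_SIZE})

The config.json approach adopted below avoids the extra round trip, at the cost of keeping the two values in sync by hand.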

Contributor:

Config.json would be fine :)


# Try again if local and remote hashes differ
if local_hash != getBotHash(user_id, bot_id):
    print("Hashes do not match! Redoing file upload...\n")
    iterations += 1
    sleep(backoff)
    if backoff < 32:
Contributor:

Magic variable.
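One way to address this nitpick (a sketch, not the merged code; the constant name is made up) is to hoist the literal 32 into a named module-level constant next to MAX_BOT_UPLOAD_SIZE:

from time import sleep

# Hypothetical name for the bare 32: cap on the retry backoff, in seconds.
MAX_UPLOAD_BACKOFF_SECONDS = 32

backoff = 1
for _ in range(10):          # stand-in for the upload retry loop
    sleep(backoff)
    if backoff < MAX_UPLOAD_BACKOFF_SECONDS:
        backoff *= 2         # assumption: the real loop doubles the delay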

@j-clap (Contributor) commented Nov 13, 2017:

Added small comments. I'm up for approving when they're addressed :)

@lidavidm (Contributor, Author):

Updated, thanks for the review!


# Needs to match corresponding value in apiserver/config.py
# Default value, 100 MiB
MAX_BOT_UPLOAD_SIZE = 100 * 1024 * 1024
Contributor:

At first I thought this was now redundant, but I see it's used as a default when the config doesn't define a value. The comment should probably be updated to reflect that.
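For example, the comment could be reworded along these lines (a suggestion, not the merged wording):

# Default (100 MiB), used when config.json does not set MAX_BOT_UPLOAD_SIZE.
# Should stay in sync with the corresponding value in apiserver/config.py.
MAX_BOT_UPLOAD_SIZE = 100 * 1024 * 1024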



with open("config.json") as configfile:
config = json.load(configfile)
MANAGER_URL = config["MANAGER_URL"]
SECRET_FOLDER = config["SECRET_FOLDER"]
CAPABILITIES = config.get("CAPABILITIES", [])
MAX_BOT_UPLOAD_SIZE = config.get("MAX_BOT_UPLOAD_SIZE",
MAX_BOT_UPLOAD_SIZE)
Contributor:

This should probably keep the default value if the config value is None.
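A minimal sketch of that suggestion, assuming the same names as in the diff above; dict.get only falls back when the key is missing, not when it is present but null in config.json:

# Keep the built-in default when the key is absent or explicitly null.
_configured_size = config.get("MAX_BOT_UPLOAD_SIZE")
if _configured_size is not None:
    MAX_BOT_UPLOAD_SIZE = _configured_size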

@harikmenon merged commit 2fae6a3 into HaliteChallenge:master, Nov 14, 2017