Response Time #7
Comments
Running the containerized version of the server on my beefy desktop I see
on Heroku
I think what I said earlier about spinning it up every time was not correct, or at least it isn't now that we're on the hobby plan. I don't think preboot would help, if I'm understanding it correctly. I'll keep thinking about it.
Yeah, from the explanation here I understand that preboot won't help:
I set up the major sections to instead report elapsed times. It was pretty stable at:
If the same command was run twice, it was much faster, so this was clearly JIT time. The JIT cost is incurred every time because in Julia `Function` is an abstract type: each new function gets its own concrete type, so all of the solver code is compiled fresh against it. For longer simulations that's a good strategy; for our purposes it's likely not. So I set up new problem and solution types that are not concretely typed, to stop the JIT, and added a new optional argument to turn off specialization of the function.
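The specialization trade-off described above can be sketched in a few lines. This is a hedged illustration, not the actual DiffEqWebBase.jl code: the type and function names here (`SpecializedProblem`, `NoSpecializeProblem`, `euler_step`) are hypothetical.

```julia
# Concretely typed: F is a type parameter, so every distinct user
# function triggers a fresh JIT specialization of any solver code
# written against SpecializedProblem{F}.
struct SpecializedProblem{F}
    f::F
end

# Abstractly typed: f is stored behind the abstract type Function,
# so one compiled solver method is reused for every user function
# (trading a dynamic dispatch per call for no repeated JIT cost).
struct NoSpecializeProblem
    f::Function
end

rhs(u, t) = 1.01u                              # example ODE right-hand side
euler_step(p, u, t, dt) = u + dt * p.f(u, t)   # one explicit Euler step

u1 = euler_step(SpecializedProblem(rhs), 1.0, 0.0, 0.1)
u2 = euler_step(NoSpecializeProblem(rhs), 1.0, 0.0, 0.1)
```

Both calls compute the same step; the difference is only in how many times the compiler has to specialize `euler_step` as new user functions arrive.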
This is what it ends up being. What this shows is that the solve time is fairly minimal now.
Essentially what we're left with, then, is plot and server-transfer time. The plotting was recently refactored to make it easy to get the vectors it would plot. I'll next try to compute those directly from the solution and go back to plotting with Plotly in the user's browser.
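Conceptually, "computing the plot vectors directly" means densely sampling the solution's interpolant rather than building a plot object server-side. A minimal sketch with linear interpolation over hypothetical sample data (`plot_vectors` and the sample points are made up for illustration; the real code would use the DiffEq solution's own interpolation):

```julia
# Coarse solution data (hypothetical): exp(t) sampled at 3 points.
t = [0.0, 0.5, 1.0]
u = [1.0, 1.6487, 2.7183]

# Produce the dense (ts, us) vectors a browser-side Plotly plot needs.
function plot_vectors(t, u; npoints = 5)
    ts = range(first(t), last(t); length = npoints)
    us = [begin
        # locate the segment containing τ, then interpolate linearly
        i = clamp(searchsortedlast(t, τ), 1, length(t) - 1)
        u[i] + (u[i+1] - u[i]) * (τ - t[i]) / (t[i+1] - t[i])
    end for τ in ts]
    return collect(ts), us
end

ts, us = plot_vectors(t, u)
```

These two vectors are all the browser needs; no plot object ever has to cross the wire.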
Requires using OrdinaryDiffEq.jl master and DiffEqWebBase.jl
Got more things precompiling and stopped the double interpolation. Same computer:
I refactored DiffEqBase's plot recipe a little and then tested what happened when the plot was deconstructed:
It turns out that generating the plot itself takes almost no time; almost all of the time is in the solution handling. So if we don't want to send back the
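"Deconstructing" the plot amounts to timing each stage separately instead of the end-to-end call. A rough sketch with `@elapsed` and a stand-in workload (the array comprehension and `sum` here are placeholders for the real solution interpolation and plot-recipe stages):

```julia
# Stand-in for solution handling: dense interpolation output that the
# real code would pull out of the ODE solution object.
interp_time = @elapsed dense = [exp(τ) for τ in range(0.0, 1.0; length = 100_000)]

# Stand-in for building the plot object from the vectors.
plot_time = @elapsed total = sum(dense)

println("solution handling: $(interp_time)s, plot build: $(plot_time)s")
```

Comparing the two numbers per stage is what revealed that the plot construction itself was nearly free.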
Final damage: Requires OrdinaryDiffEq.jl master, DiffEqBase.jl master, and the new DiffEqWebBase (for the non-specializing problem / solution). Timing result:
I didn't test whether the main cost in the setup time is the ParameterizedFunctions.jl ODE parsing, but I assume it is, and it would take quite a bit of work to get that down (I'm sure it can go down, since it uses a lot of Dicts, but that's a project for rewriting the parser, and there's a lot more I want to do there). Even then, that's at most ~0.1 seconds we could gain. The other part is solution handling: interpolating the output at more points to get a good plot. That's unavoidable, and is ~0.1 seconds. That leaves us at 0.2 seconds. Given that Heroku is twice as slow, that's ~0.4 seconds. Everything else seems negligible, unless there's "server transfer time". Then, given this setup, we could just send back and plot the vectors.

One little extra note: I made it throw an error if maxiters is hit. Otherwise it would stall for a while trying to make a huge plot. We may want to lower the maxiters we're using a bit to constrain the user more. I'll try to find out how to deploy this and see what the response times are like.
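The maxiters guard works like this minimal sketch. The `integrate` function here is hypothetical; the real check lives inside OrdinaryDiffEq.jl's stepping loop.

```julia
# Hypothetical fixed-step integrator showing the guard: error out when
# maxiters is hit instead of grinding on and producing a huge plot.
function integrate(f, u0, t0, tend; dt = 1e-3, maxiters = 10^5)
    u, t, iters = u0, t0, 0
    while t < tend
        iters += 1
        iters > maxiters && error("maxiters ($maxiters) reached at t = $t")
        u += dt * f(u, t)
        t += dt
    end
    return u
end

integrate((u, t) -> 1.0, 0.0, 0.0, 1.0; dt = 0.01)   # ≈ 1.0 in about 100 steps
# integrate((u, t) -> 1.0, 0.0, 0.0, 1.0; dt = 1e-7) # would throw: maxiters reached
```

Erroring early also gives the web layer a clean failure to report, rather than a request that hangs.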
Getting about the same results in the containerized image:
The response time is very reasonable now on the deployed version.
It may be useful to check in the future whether all of this JIT avoidance is still necessary after changes to the compiler.
What are some ways we can get the response time down? Would getting the Standard plan with preboot help? I'd be willing to pay for the Standard dyno if it would help; just let me know. The other things along this path would be to build ParameterizedFunctions, DiffEqBase, and OrdinaryDiffEq into the sysimg, and to try out precompilation in ParameterizedFunctions (it may actually work already? It should).
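At the time, opting a package into precompilation was a one-line change at the top of the package file. A minimal sketch (the module name and contents here are hypothetical, not ParameterizedFunctions itself):

```julia
# __precompile__(true) asks Julia to cache this module's compiled
# code, so that `using` it later skips parse/lower time.
__precompile__(true)
module WebHelpers            # hypothetical module name
export rhs
rhs(u, t) = 1.01u            # example right-hand side the server would call
end

using .WebHelpers
WebHelpers.rhs(2.0, 0.0)
```

Building the packages into the sysimg goes further than this: it bakes the compiled code into the Julia system image so even the `using` cost disappears at server boot.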