Cache last input and accessed index for faster lookups #34
Here are the numbers for this change relative to other earlier performance work (using the En-ROADS model as a real-world test case).

[Performance and size charts not captured in this text. Devices tested: MacBook Pro (2019) | 2.4 GHz 8-core i9, 32 GB RAM, macOS 10.15 (Safari, Chrome, Firefox); iPhone 8 | A11, iOS 13; iPad Air (2013) | A7, iOS 12.]
The `LOOKUP` function operates on (x, y) pairs where the x value is assumed to be monotonically increasing. The way lookups are currently used, the `input` value tends to be a "marching time" value, meaning that it increases from initial time to final time, then returns to initial time, and so on. Two observations:

- In the case of the En-ROADS model, the time step is 0.125 years, but most lookup data has a granularity of one year. This means that much of the time, each lookup is called 8 times with the same input and the same expected return value before moving up to the next year.
- Currently the `LOOKUP` function uses a simple linear search that always begins at index zero.

We can improve the speed of lookups and address both of the above by caching the last `input` and the last-accessed index value in the `Lookup` itself. If the current `input` value is greater than or equal to the last `input` value, we can set the start index to the last cached index. With this approach we would generally never have to advance more than one index per call, and only when the time variable returns to the initial time would we need to reset the start index back to zero.