Is there a way to get around file size limits (e.g., Google App Engine has a limit of 32 MB, which the GeoLite2 city database exceeds)?
The best solution I can think of is to allow a buffer to be passed in instead of the filename, so that the file can be loaded either as a .zip file and extracted in memory, or segmented into multiple files and reassembled in memory.
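For illustration, here is a rough sketch of the "segment and reassemble" idea (the helper names, part-file suffix, and 25 MB chunk size are made up; the reader would still need to accept the resulting buffer, which is the feature being requested):

```python
import glob
import io

def split_database(path, chunk_size=25 * 1024 * 1024):
    """Split a .mmdb file into parts that fit under a deployment size limit."""
    with open(path, 'rb') as src:
        for index, chunk in enumerate(iter(lambda: src.read(chunk_size), b'')):
            with open(f'{path}.part{index:03d}', 'wb') as dst:
                dst.write(chunk)

def reassemble_database(pattern):
    """Concatenate the parts back into a single in-memory buffer."""
    buffer = io.BytesIO()
    for part in sorted(glob.glob(pattern)):
        with open(part, 'rb') as src:
            buffer.write(src.read())
    buffer.seek(0)
    return buffer
```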
For people who want a quick fix, you can monkey-patch the reader module:
```python
import gzip

import maxminddb

# Monkey-patch the MaxMind database reader to read from compressed files.
# Note: mmap won't work with compressed files.
def open_compressed(name, mode='r'):
    # TODO: Could support tarfile, zipfile etc. here.
    if name.endswith('.gz'):
        return gzip.GzipFile(name, mode)
    return open(name, mode)

maxminddb.reader.open = open_compressed
```
I recommend using MODE_MEMORY because mmap won't work with this solution and traversing a compressed file (MODE_FILE) is not going to perform as well.
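A minimal usage sketch with the patch above in place, assuming the database has been gzip-compressed (the filename and IP address below are just placeholders):

```python
import maxminddb

# MODE_MEMORY loads the whole (decompressed) database into memory.
# MODE_MMAP would map the still-compressed bytes and, as noted above, won't work.
reader = maxminddb.open_database('GeoLite2-City.mmdb.gz', maxminddb.MODE_MEMORY)
record = reader.get('128.101.101.101')
reader.close()
```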
Thanks for providing the workaround. I'll leave this open as we may consider opening from a buffer in the future if there is demand. I think this is the first such request we have received.