Modernize, standardize, and otherwise improve parser code #215
base: master
Conversation
- `download_fms_fixies` is now Py2/3 compatible and updated to 2016 Python code rather than 2013; added `argparse` for specifying start and end dates on the command line; no more hand-rolled datetime stuff, and no more hard-coded holidays (thanks to `arrow` and `holidays`)
- added and pinned `arrow` and `holidays` deps to the requirements file
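The new `argparse`-based date handling might look roughly like this (a hypothetical sketch: flag names and defaults are assumptions, and stdlib `datetime` stands in for `arrow` to keep it self-contained):

```python
import argparse
import datetime

def valid_date(s):
    # parse a YYYY-MM-DD string into a date, raising a CLI-friendly error
    try:
        return datetime.datetime.strptime(s, "%Y-%m-%d").date()
    except ValueError:
        raise argparse.ArgumentTypeError("not a valid date: {!r}".format(s))

def parse_args(argv=None):
    parser = argparse.ArgumentParser(
        description="Download daily FMS fixie files for a range of dates.")
    parser.add_argument(
        "-s", "--startdate", type=valid_date,
        default=datetime.date.today() - datetime.timedelta(days=8),
        help="first date for which to download a fixie (YYYY-MM-DD)")
    parser.add_argument(
        "-e", "--enddate", type=valid_date,
        default=datetime.date.today() - datetime.timedelta(days=1),
        help="last date for which to download a fixie (YYYY-MM-DD)")
    return parser.parse_args(argv)
```

With a `valid_date` converter, a malformed date fails at arg-parsing time with a clear message instead of deep inside the download loop.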
(hope to god i didn’t introduce any errors)
Also loaded resources properly, so file handles are closed automatically.
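"Loaded resources properly" presumably means wrapping file handles in `with` blocks; a minimal sketch (function and file names are hypothetical):

```python
import io

def save_fixie(text, fpath):
    # the `with` block closes the file handle automatically,
    # even if the write raises an exception partway through
    with io.open(fpath, mode="wt", encoding="utf-8") as f:  # io.open works on Py2 and Py3
        f.write(text)

def load_fixie(fpath):
    with io.open(fpath, mode="rt", encoding="utf-8") as f:
        return f.read()
```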
- Expanded and standardized CLI args for the download and parse fms fixies modules
- CLI defaults now only download or parse fms fixies that haven't already been downloaded or parsed
- Made parser into a Python package, to enable relative imports within modules
- Bumped `arrow` version requirement
- continued putting the pieces together in the `download_and_parse_fms_fixies` CLI
- better logging and empty-result handling in downloader and parser
- tweaked CLI arg names for data dirs and shifted default end dates back one day
- `utils.get_daily_csvs_by_date()` keys are now arrow objects, just like `utils.get_fixies_by_date()`
- added table keys and db table names to constants
- default start date is now 8 days prior to today rather than the earliest possible date; seems like the default shouldn't be "run over every single thing available"
- restructured the `daily_csvs_by_date` dict; now a nested dict by table name rather than a tuple of filenames
- bumped `pandas` version requirement
- add logger to utils module, to prevent a bug
- add csv text stuff to constants, temporarily
- clean up the readme a bit
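The utils-module logger presumably looks something like this (logger and function names are assumptions):

```python
import logging

# module-level logger for the utils module; attaching a NullHandler avoids
# "no handlers could be found" warnings when the package is used as a library
logger = logging.getLogger("parser.utils")
logger.addHandler(logging.NullHandler())

def report_parsed(date_str, n_tables):
    # leveled, descriptive log messages instead of bare print() calls
    logger.info("parsed %d tables from fixie for %s", n_tables, date_str)
```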
- Improve logging messages/levels throughout parser code
- I changed my mind: make the default start date be the earliest available date, but in practice, defaults will only get the new stuff
- Update run.sh and reset_data.sh scripts to use the new parser CLI
@cezary4 Could you explain the reason for that last commit? I'm confused because those files *are* in a "package", i.e. there's an `__init__.py` file in the same directory, so the error message you reference shouldn't have been raised. Which version of Python are you using? And should I assume this is on a Windows machine? 👀
parser/aggregate_fms_fixies.py (outdated diff):

```
@@ -70,7 +70,8 @@ def build_db(dbfile, lifetimecsvdir):
connection = sqlite3.connect(dbfile)
# bad, but pandas doesn't work otherwise (TODO: check this)
# connection.text_factory = str
# true, but the perfect is the enemy of the good
```
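For context, `text_factory` controls how sqlite3 hands TEXT values back to Python; a small self-contained illustration (on Python 2, the commented-out `text_factory = str` would have meant bytestrings, which is presumably the pandas interaction being flagged):

```python
import sqlite3

# in-memory database to illustrate what connection.text_factory does
connection = sqlite3.connect(":memory:")
connection.execute("CREATE TABLE t (val TEXT)")
connection.execute("INSERT INTO t VALUES ('hello')")

# default: TEXT columns come back as unicode str objects
default_val = connection.execute("SELECT val FROM t").fetchone()[0]

# with text_factory = bytes, the same column comes back as raw UTF-8 bytes
connection.text_factory = bytes
bytes_val = connection.execute("SELECT val FROM t").fetchone()[0]
```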
ha!
It was a relative import error that occurred even with the `__init__.py` file there:
https://stackoverflow.com/questions/11536764/how-to-fix-attempted-relative-import-in-non-package-even-with-init-py
Fixed by removing the periods from in front of the package names.
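The linked error can be reproduced with a throwaway package (package and module names below are hypothetical): even with `__init__.py` present, a module using a relative import fails when its file is run directly as a script, because Python then treats it as top-level; running it with `-m` as part of the package works.

```python
import os
import subprocess
import sys
import tempfile

# build a minimal package on the fly: fms_parser/{__init__,utils,mod}.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "fms_parser")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "utils.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(pkg, "mod.py"), "w") as f:
    f.write("from .utils import VALUE\nprint(VALUE)\n")

# run the module file directly: the relative import blows up
direct = subprocess.run(
    [sys.executable, os.path.join(pkg, "mod.py")],
    capture_output=True, text=True)

# run it as a module of the package: the relative import works
as_module = subprocess.run(
    [sys.executable, "-m", "fms_parser.mod"],
    capture_output=True, text=True, cwd=root)
```

This is why dropping the leading periods (absolute imports) "fixes" it when modules are executed as scripts, at the cost of no longer being relative imports within the package.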
Cezary
@bdewilde does that fix work for you?
Hey @mhkeller @cezary4 and @tlevine , I think this beast is ready for review.
Summary of changes:
- now using `pandas.read_csv` for improved data handling

I don't know if this will fix the issue currently preventing the prod cron from updating its data. However, the much-improved logging should help diagnose any problem once this gets deployed (and assuming we figure out how to log in to the box where our cron is running).
I definitely need another pair of eyes to confirm that the outputs of this new code will exactly replicate what we currently have so as not to create bugs on the front-end.
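The `pandas.read_csv` switch might look roughly like this (the CSV contents, column names, and parameters below are made up for illustration):

```python
import io
import pandas as pd

# a tiny stand-in for one of the parsed daily CSVs (contents made up)
csv_text = u"""date,account,close_today
2016-01-04,Federal Reserve Account,123.4
2016-01-05,Federal Reserve Account,567.8
"""

# read_csv handles quoting, dtypes, and date parsing far more robustly
# than hand-rolled line splitting
df = pd.read_csv(io.StringIO(csv_text), parse_dates=["date"])
```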