Upload listing info from new listing spreadsheet #227
Comments
A couple thoughts to narrow the scope:
Open questions to answer:
Decision from meeting with Safiya: we'll keep the breakdown by unit type, and we'll condense the breakdown by affordability into a single field "affordability range" (with values like 30-60% AMI). That field will need to be added to the DB (probably on the
Recording notes from a conversation with Gabe about how to interpret the "Affordability Mix" field. Some values in there are "Up to 50% AMI". The way we will interpret that is "the most expensive affordable unit in this listing is affordable for someone making 50% of the AMI". This means we don't know what the least expensive affordable unit in the listing is. We could either guess that "Up to X% AMI" means "(X-30) to X% AMI", or we could leave the lower bound blank. I favor leaving the lower bound blank, and then in the UI we could represent it as "Affordability: ??? to 50% AMI" or something (see the parsing sketch below).
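A minimal sketch of how that interpretation could be encoded when parsing the spreadsheet, in TypeScript like the rest of the backend scripts. The `AffordabilityRange` shape, the function name, and the two string formats handled are assumptions based on the examples above, not the actual spreadsheet contents:

```typescript
// Hypothetical parser for "Affordability Mix" strings. Only the two formats
// mentioned above ("30-60% AMI" and "Up to 50% AMI") are handled; the real
// data may need more cases.
interface AffordabilityRange {
  minAmiPercent: number | null; // null = unknown lower bound ("Up to X% AMI")
  maxAmiPercent: number;
}

function parseAffordabilityMix(raw: string): AffordabilityRange | null {
  const upTo = raw.match(/up to\s+(\d+)\s*%\s*AMI/i);
  if (upTo) {
    // "Up to 50% AMI": only the upper bound is known; leave the lower bound blank.
    return { minAmiPercent: null, maxAmiPercent: Number(upTo[1]) };
  }
  const range = raw.match(/(\d+)\s*-\s*(\d+)\s*%\s*AMI/i);
  if (range) {
    // "30-60% AMI": both bounds are present.
    return { minAmiPercent: Number(range[1]), maxAmiPercent: Number(range[2]) };
  }
  return null; // unrecognized format; flag for manual review
}

// e.g. parseAffordabilityMix("Up to 50% AMI") -> { minAmiPercent: null, maxAmiPercent: 50 }
```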
Field mapping (HRD spreadsheet --> Bloom data model):
(*) These fields will need to be added. I'll do that in a first-step PR.
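For that first-step PR, something along these lines might work, assuming TypeORM migrations like the rest of backend/core. The table name, column names, class name, and timestamp are placeholders, not an agreed-on schema:

```typescript
// Hypothetical migration adding the "affordability range" bounds discussed above.
import { MigrationInterface, QueryRunner, TableColumn } from "typeorm";

export class AddAffordabilityRange1650000000000 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    // Lower bound is nullable so "Up to X% AMI" listings can leave it blank.
    await queryRunner.addColumn(
      "listings",
      new TableColumn({ name: "ami_percentage_min", type: "integer", isNullable: true })
    );
    await queryRunner.addColumn(
      "listings",
      new TableColumn({ name: "ami_percentage_max", type: "integer", isNullable: true })
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.dropColumn("listings", "ami_percentage_max");
    await queryRunner.dropColumn("listings", "ami_percentage_min");
  }
}
```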
Gabe sent a spreadsheet with 2285 listings, with more granular fields than the data we have from ArcGIS. (I'm not sure what permissions we want on that spreadsheet, so I won't link it here.) This ticket will track building a pipeline to populate the DB from that spreadsheet. I imagine this will look like a script parallel to backend/core/scripts/import-listings-from-detroit-arcgis.ts (rough sketch below).

Done criteria: we can run a script to get listing data from the spreadsheet Gabe sent into the DB.
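Very rough sketch of what that script could look like, assuming the spreadsheet gets exported to CSV first. The file name, the csv-parse dependency, the column names, and both helper functions are placeholders rather than anything that exists in the repo today:

```typescript
// backend/core/scripts/import-listings-from-hrd-spreadsheet.ts (hypothetical name)
// Reads a CSV export of the HRD spreadsheet and imports each row as a listing.
import fs from "fs";
import { parse } from "csv-parse/sync";

type SpreadsheetRow = Record<string, string>;

// Placeholder listing shape; the real script would use Bloom's listing DTOs.
interface ListingInput {
  name: string;
}

function mapRowToListing(row: SpreadsheetRow): ListingInput {
  // Column names are guesses; the field mapping above will drive the real version.
  return { name: row["Project Name"] };
}

async function uploadListing(listing: ListingInput): Promise<void> {
  // Hypothetical: the real script would write through the backend's listings
  // service or API, like the ArcGIS import script does.
  console.log(`would import: ${listing.name}`);
}

async function main() {
  const csvPath = process.argv[2]; // path to a CSV export of Gabe's spreadsheet
  const rows: SpreadsheetRow[] = parse(fs.readFileSync(csvPath, "utf8"), {
    columns: true,
    skip_empty_lines: true,
  });

  for (const row of rows) {
    await uploadListing(mapRowToListing(row));
  }
  console.log(`Imported ${rows.length} listings`);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```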