Commit 8403063: Merge branch 'release/1.6'
rlskoeser committed May 8, 2024
2 parents f158bac + 498f483
Showing 36 changed files with 1,754 additions and 1,198 deletions.
8 changes: 4 additions & 4 deletions .github/workflows/codeql-analysis.yml
@@ -24,11 +24,11 @@ jobs:

    steps:
    - name: Checkout repository
-     uses: actions/checkout@v3
+     uses: actions/checkout@v4

    # Initializes the CodeQL tools for scanning.
    - name: Initialize CodeQL
-     uses: github/codeql-action/init@v2
+     uses: github/codeql-action/init@v3
      with:
        languages: ${{ matrix.language }}
        # If you wish to specify custom queries, you can do so here or in a config file.
@@ -39,7 +39,7 @@ jobs:
    # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
    # If this step fails, then you should remove it and run the build manually (see below)
    - name: Autobuild
-     uses: github/codeql-action/autobuild@v2
+     uses: github/codeql-action/autobuild@v3

    # ℹ️ Command-line programs to run using the OS shell.
    # 📚 https://git.io/JvXDl
@@ -53,4 +53,4 @@ jobs:
    #   make release

    - name: Perform CodeQL Analysis
-     uses: github/codeql-action/analyze@v2
+     uses: github/codeql-action/analyze@v3
16 changes: 8 additions & 8 deletions .github/workflows/unit_tests.yml
@@ -16,13 +16,13 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
-       uses: actions/checkout@v3
+       uses: actions/checkout@v4
      - name: Setup node
        uses: actions/setup-node@v3
        with:
          node-version: ${{ env.NODE_VERSION }}
      - name: Cache node modules
-       uses: actions/cache@v3
+       uses: actions/cache@v4
        with:
          path: ~/.npm
          key: npm-${{ hashFiles('package-lock.json') }}
@@ -49,13 +49,13 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
-       uses: actions/checkout@v3
+       uses: actions/checkout@v4
      - name: Setup node
-       uses: actions/setup-node@v3
+       uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
      - name: Cache node modules
-       uses: actions/cache@v3
+       uses: actions/cache@v4
        with:
          path: ~/.npm
          key: npm-${{ hashFiles('package-lock.json') }}
@@ -90,7 +90,7 @@ jobs:
        - 8983:8983
    steps:
      - name: Checkout repository
-       uses: actions/checkout@v3
+       uses: actions/checkout@v4

      # use docker cp to copy the configset, then bash to set ownership to solr
      - name: Copy Solr configset to solr service
@@ -107,12 +107,12 @@ jobs:
        run: echo "PYTHON_VERSION=$(cat .python-version)" >> $GITHUB_ENV

      - name: Setup Python
-       uses: actions/setup-python@v3
+       uses: actions/setup-python@v5
        with:
          python-version: ${{ env.PYTHON_VERSION }}

      - name: Cache pip
-       uses: actions/cache@v3
+       uses: actions/cache@v4
        with:
          path: ~/.cache/pip
          key: pip-${{ matrix.python }}-${{ hashFiles('requirements.txt') }}
11 changes: 11 additions & 0 deletions CHANGELOG.rst
@@ -1,6 +1,17 @@
 CHANGELOG
 =========

+1.6
+---
+
+* As a content editor, I want to batch import new data on books' genre, so that I can update multiple records with annotations made externally.
+* As a content editor, I want to batch import authors/creator information, so that I can update multiple records with annotations made externally.
+* As a content admin, I want to export information about creators from the database so I can work with the data using other tools.
+* As a content admin, I want to export member addresses with more details in a separate file so I can more easily work with member locations.
+* bugfix: VIAF lookup for birth/death dates
+* bugfix: admin GeoNames lookup
+* data migration to populate person birth and death dates from VIAF where VIAF id is set but dates are not
+
 1.5.7
 -----

11 changes: 10 additions & 1 deletion DEVELOPERNOTES.rst
@@ -43,4 +43,13 @@ Create role and password::

 Import database dump (change path as appropriate)::

-    psql -d postgres -U cdh_shxco < data/13_daily_cdh_shxco_cdh_shxco_2023-04-05.Wednesday.sql
+    psql -d postgres -U cdh_shxco < data/13_daily_cdh_shxco_cdh_shxco_2023-12-03.Sunday.sql
+
+Or all together, to wipe the database and reapply migrations::
+
+    psql -d postgres -c "DROP DATABASE cdh_shxco;"
+    psql -d postgres -c "DROP ROLE cdh_shxco;"
+    psql -d postgres -c "CREATE ROLE cdh_shxco WITH CREATEDB LOGIN PASSWORD 'cdh_shxco';"
+    psql -d postgres -U cdh_shxco < data/13_daily_cdh_shxco_cdh_shxco_2023-12-03.Sunday.sql
+    python manage.py migrate
+    python manage.py createsuperuser  # if developing locally
8 changes: 2 additions & 6 deletions README.rst
@@ -12,7 +12,7 @@ bookstore and lending library in Paris.

 (This project was previously called "Mapping Expatriate Paris" or MEP).

-Python 3.8 / Django 3.2 / Node 18 / Postgresql 13 / Solr 8
+Python 3.9 / Django 3.2 / Node 18 / Postgresql 13 / Solr 8

 .. image:: https://github.com/Princeton-CDH/mep-django/workflows/unit_tests/badge.svg
    :target: https://github.com/Princeton-CDH/mep-django/actions?query=workflow%3Aunit_tests
@@ -29,18 +29,14 @@ Python 3.8 / Django 3.2 / Node 18 / Postgresql 13 / Solr 8
    :target: https://www.codefactor.io/repository/github/princeton-cdh/mep-django
    :alt: CodeFactor

-.. image:: https://requires.io/github/Princeton-CDH/mep-django/requirements.svg?branch=main
-   :target: https://requires.io/github/Princeton-CDH/mep-django/requirements/?branch=main
-   :alt: Requirements Status
-
 For specifics on the architecture and code, read `current release documentation <https://princeton-cdh.github.io/mep-django/>`_.

 Development instructions
 ------------------------

 Initial setup and installation:

-- recommended: create and activate a python 3.8 virtual environment. Using pyenv:
+- recommended: create and activate a python 3.9 virtual environment. Using pyenv:

     # if pyenv is not installed
     curl https://pyenv.run | bash
2 changes: 1 addition & 1 deletion dev-requirements.txt
@@ -6,4 +6,4 @@ django-debug-toolbar
 sphinx
 wheel
 pre-commit
-wagtail-factories
\ No newline at end of file
+wagtail-factories
2 changes: 1 addition & 1 deletion mep/__init__.py
@@ -1,4 +1,4 @@
-__version_info__ = (1, 5, 7, None)
+__version_info__ = (1, 6, 0, None)


 # Dot-connect all but the last. Last is dash-connected if not None.
94 changes: 94 additions & 0 deletions mep/accounts/management/commands/export_addresses.py
@@ -0,0 +1,94 @@
"""
Manage command to export library member location data.
Generates CSV and JSON files with details on known addresses
of library members; where the information is known, includes
start and end dates for the address, since some members have multiple
addresses.
"""

from mep.common.management.export import BaseExport
from mep.common.utils import absolutize_url
from mep.accounts.models import Address


class Command(BaseExport):
    """Export address information for library members."""

    help = __doc__

    model = Address

    csv_fields = [
        "member_ids",  # member slug
        "member_uris",
        "care_of_person_id",  # c/o person slug
        "care_of_person",  # c/o person name
        "street_address",
        "postal_code",
        "city",
        "arrondissement",
        "country",
        "start_date",
        "end_date",
        "longitude",
        "latitude",
    ]

    def get_queryset(self):
        """prefetch account, location, and account persons"""
        return Address.objects.prefetch_related(
            "account",
            "location",
            "account__persons",
        )

    def get_base_filename(self):
        """set export base filename to 'addresses'"""
        return "addresses"

    def get_object_data(self, addr):
        """
        Generate dictionary of data to export for a single
        :class:`~mep.accounts.models.Address`
        """
        loc = addr.location

        # required properties
        data = dict(
            # Member info
            member=self.member_info(addr),
            # Address data
            start_date=addr.partial_start_date,
            end_date=addr.partial_end_date,
            care_of_person_id=addr.care_of_person.slug if addr.care_of_person else None,
            care_of_person=addr.care_of_person.name if addr.care_of_person else None,
            # Location data
            street_address=loc.street_address,
            city=loc.city,
            postal_code=loc.postal_code,
            latitude=float(loc.latitude) if loc.latitude is not None else None,
            longitude=float(loc.longitude) if loc.longitude is not None else None,
            country=loc.country.name if loc.country else None,
            arrondissement=loc.arrondissement(),
        )
        # filter out unset values so we don't get unnecessary content in json
        return {k: v for k, v in data.items() if v is not None}

    def member_info(self, address):
        """Info about the member(s) associated with the account for this address"""
        # adapted from event export logic
        # NOTE: would be nicer and more logical if each member had their own
        # dict entry, but that doesn't work with current flattening logic for csv
        members = address.account.persons.all()
        return dict(
            ids=[m.slug for m in members],
            uris=[absolutize_url(m.get_absolute_url()) for m in members],
            # useful to include or too redundant?
            # names=[m.name for m in members],
            # sort_names=[m.sort_name for m in members],
        )
26 changes: 23 additions & 3 deletions mep/accounts/management/commands/export_events.py
@@ -11,8 +11,11 @@

 from django.core.exceptions import ObjectDoesNotExist
 from django.db.models.functions import Coalesce
+from django.db.models.query import Prefetch
+from djiffy.models import Manifest

 from mep.accounts.models import Event
+from mep.books.models import Creator
 from mep.common.management.export import BaseExport
 from mep.common.utils import absolutize_url

@@ -28,6 +31,7 @@ class Command(BaseExport):
        "event_type",
        "start_date",
        "end_date",
+        "member_ids",
        "member_uris",
        "member_names",
        "member_sort_names",
@@ -66,9 +70,24 @@ def get_queryset(self):
        """get event objects to be exported"""
        # Order events by date. Order on precision first so unknown dates
        # will be last, then sort by first known date of start/end.
-        return Event.objects.all().order_by(
-            Coalesce("start_date_precision", "end_date_precision"),
-            Coalesce("start_date", "end_date").asc(nulls_last=True),
+        return (
+            Event.objects.all()
+            .select_related(
+                "subscription", "reimbursement", "borrow", "purchase", "work", "edition"
+            )
+            .prefetch_related(
+                "account__persons",
+                "footnotes",
+                "footnotes__bibliography__manifest",
+                Prefetch(
+                    "work__creators",
+                    queryset=Creator.objects.select_related("person", "creator_type"),
+                ),
+            )
+            .order_by(
+                Coalesce("start_date_precision", "end_date_precision"),
+                Coalesce("start_date", "end_date").asc(nulls_last=True),
+            )
        )

    def get_object_data(self, obj):
@@ -133,6 +152,7 @@ def member_info(self, event):

        return OrderedDict(
            [
+                ("ids", [m.slug for m in members]),
                ("uris", [absolutize_url(m.get_absolute_url()) for m in members]),
                ("names", [m.name for m in members]),
                ("sort_names", [m.sort_name for m in members]),
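The reworked `get_queryset` above orders events by coalescing start/end dates, pushing events with no known date to the end. A pure-Python sketch of the equivalent sort key (toy dicts standing in for ORM objects; the real query also sorts on date precision first, which this sketch omits):

```python
def sort_key(event):
    # Coalesce("start_date", "end_date") with nulls last:
    # use the first non-null date; events with no date at all sort after the rest
    date = event["start_date"] or event["end_date"]
    return (date is None, date or "")

events = [
    {"id": 1, "start_date": None, "end_date": None},
    {"id": 2, "start_date": "1921-03-01", "end_date": None},
    {"id": 3, "start_date": None, "end_date": "1919-06-12"},
]
print([e["id"] for e in sorted(events, key=sort_key)])  # → [3, 2, 1]
```

ISO-8601 date strings compare correctly as plain strings, which is why simple string comparison suffices in the sketch.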
45 changes: 44 additions & 1 deletion mep/accounts/tests/test_accounts_commands.py
@@ -22,8 +22,9 @@
    export_events,
    import_figgy_cards,
    report_timegaps,
+    export_addresses,
)
-from mep.accounts.models import Account, Borrow, Event
+from mep.accounts.models import Account, Borrow, Event, Address, Location
 from mep.books.models import Creator, CreatorType
 from mep.common.management.export import StreamArray
 from mep.common.utils import absolutize_url
@@ -618,3 +619,45 @@ def test_command_line(self):
        call_command("export_events", "-d", tempdir.name, "-m", 2, stdout=stdout)
        # 2 objects (once each)
        assert mock_get_obj_data.call_count == 2
+
+
+class TestExportAddresses(TestCase):
+    fixtures = ["sample_people"]
+
+    def setUp(self):
+        self.cmd = export_addresses.Command()
+        self.cmd.stdout = StringIO()
+
+    def test_get_queryset(self):
+        # fixture address should be included, with its location and members
+        member = Person.objects.get(pk=189)  # francisque gay, member
+        location = Location.objects.get(pk=213)
+        address = Address.objects.get(pk=236)
+        addresses = self.cmd.get_queryset()
+        assert address in set(addresses)
+        assert address.location == location
+        assert member in set(address.account.persons.all())
+
+    def test_get_object_data(self):
+        # fetch an example address from the fixture & call get_object_data
+        address = Address.objects.get(pk=236)
+        gay_data = self.cmd.get_object_data(address)
+
+        # check some basic data
+        # slug is 'gay' in sample_people, 'gay-francisque' in db
+        assert gay_data["member"]["ids"] == ["gay"]
+        assert gay_data["member"]["uris"] == ["https://example.com/members/gay/"]
+
+        # check address & coordinates
+        assert "3 Rue Garancière" == gay_data["street_address"]
+        assert "Paris" == gay_data["city"]
+        assert "France" == gay_data["country"]
+        assert 48.85101 == gay_data["latitude"]
+        assert 2.33590 == gay_data["longitude"]
+        assert "75006" == gay_data["postal_code"]
+        assert 6 == gay_data["arrondissement"]
+        assert gay_data["start_date"] == "1919-01-01"
+        assert gay_data["end_date"] == "1930-01-01"
+        assert gay_data["care_of_person_id"] == "hemingway"
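The tests above show `get_object_data` returning a nested `member` dict, while `csv_fields` lists flat `member_ids`/`member_uris` columns; the base export class flattens nested keys for CSV output. A simplified sketch of such flattening (hypothetical `flatten` helper; the real logic lives in `mep.common.management.export` and may differ in detail):

```python
def flatten(data, prefix=""):
    """Flatten nested dicts into underscore-joined keys; join list values with ';'."""
    flat = {}
    for key, value in data.items():
        name = f"{prefix}_{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=name))
        elif isinstance(value, list):
            flat[name] = ";".join(str(v) for v in value)
        else:
            flat[name] = value
    return flat

row = {"member": {"ids": ["gay"], "uris": ["https://example.com/members/gay/"]}, "city": "Paris"}
print(flatten(row))
# → {'member_ids': 'gay', 'member_uris': 'https://example.com/members/gay/', 'city': 'Paris'}
```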