feat(api, robot-server): Add source key to the data_files database to support generated CSV files from the plate reader + other plate reader fixes. #16603
Changes from all commits
The first changed file is the data-file auto-deleter (identified by its docstring; the exact path is not shown in this view):

```diff
@@ -1,14 +1,15 @@
-"""Auto-delete old data files to make room for new ones."""
+"""Auto-delete old user data files to make room for new ones."""
 from logging import getLogger
 
 from robot_server.data_files.data_files_store import DataFilesStore
+from robot_server.data_files.models import DataFileSource
 from robot_server.deletion_planner import DataFileDeletionPlanner
 
 _log = getLogger(__name__)
 
 
 class DataFileAutoDeleter:
-    """Auto deleter for data files."""
+    """Auto deleter for uploaded data files."""
 
     def __init__(
         self,
@@ -22,9 +23,9 @@ async def make_room_for_new_file(self) -> None:
         """Delete old data files to make room for a new one."""
         # It feels wasteful to collect usage info of up to 50 files
         # even when there's no need for deletion
-        data_file_usage_info = [
-            usage_info for usage_info in self._data_files_store.get_usage_info()
-        ]
+        data_file_usage_info = self._data_files_store.get_usage_info(
+            DataFileSource.UPLOADED
+        )
 
         if len(data_file_usage_info) < self._deletion_planner.maximum_allowed_files:
             return
```

> **Review comment on lines +26 to +28:** Very nice, makes sense to separate our deletion methodology like this.
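To make the effect of this hunk concrete, here is a minimal, self-contained sketch of source-filtered deletion candidacy. Everything except the `DataFileSource` values and the shape of the `get_usage_info(source)` call is invented for illustration and is not the actual robot-server code:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List


class DataFileSource(Enum):
    """Local copy of the enum added in this PR."""

    UPLOADED = "uploaded"
    GENERATED = "generated"


@dataclass
class FileUsageInfo:
    """Invented usage record; the real robot-server type is richer."""

    file_id: str
    source: DataFileSource


class FakeDataFilesStore:
    """Invented stand-in for DataFilesStore."""

    def __init__(self, files: List[FileUsageInfo]) -> None:
        self._files = files

    def get_usage_info(self, source: DataFileSource) -> List[FileUsageInfo]:
        # Only files from the requested source become deletion candidates.
        return [f for f in self._files if f.source == source]


store = FakeDataFilesStore(
    [
        FileUsageInfo("user-upload.csv", DataFileSource.UPLOADED),
        FileUsageInfo("plate-reader-output.csv", DataFileSource.GENERATED),
    ]
)
# Generated plate reader output never shows up in the result, so it is
# never auto-deleted and never counts toward the maximum-allowed-files check.
assert [f.file_id for f in store.get_usage_info(DataFileSource.UPLOADED)] == [
    "user-upload.csv"
]
```

The practical consequence is that generated plate-reader files neither count toward the auto-deletion threshold nor get evicted by it; only uploaded files do.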
The second changed file is `robot_server/data_files/models.py` (inferred from the import added above):

```diff
@@ -1,6 +1,7 @@
 """Data files models."""
 from datetime import datetime
 from typing import Literal, Set
+from enum import Enum
 
 from pydantic import Field
 
@@ -10,12 +11,24 @@
 from robot_server.service.json_api import ResourceModel
 
 
+class DataFileSource(Enum):
+    """The source this data file is from."""
+
+    UPLOADED = "uploaded"
+    GENERATED = "generated"
```
> **Review comment:** Generated should be fine in general, but to look forward, do we want to categorize these as something like "generated_csv" or, more specifically, "plate_reader_csv"? Would there be a reason in the future to want to tell the difference between types of generated/protocol output files via the file source field?
>
> **Reply:** I think that is a possibility. I wouldn't mind either way. If you keep it like this for v8.2, changing it to the other way is an easy and cheap migration to do later, if we need to.
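As a rough sketch of the "easy and cheap migration" mentioned in the reply, assuming the source values live in a SQLite table reachable with plain SQL (the `data_file` table and `source` column names are guesses, not the actual robot-server schema), a later rename would be a single UPDATE:

```python
import sqlite3


def rename_generated_source(db_path: str) -> None:
    """Rewrite stored source values, e.g. 'generated' -> 'plate_reader_csv'."""
    with sqlite3.connect(db_path) as conn:
        # Hypothetical table/column names; the real schema may differ.
        conn.execute(
            "UPDATE data_file SET source = ? WHERE source = ?",
            ("plate_reader_csv", "generated"),
        )
```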
The hunk continues below the comment thread:

```diff
 
 
 class DataFile(ResourceModel):
-    """A model representing an uploaded data file."""
+    """A model representing a data file."""
 
     id: str = Field(..., description="A unique identifier for this file.")
-    name: str = Field(..., description="Name of the uploaded file.")
-    createdAt: datetime = Field(..., description="When this data file was *uploaded*.")
+    name: str = Field(..., description="Name of the data file.")
+    source: DataFileSource = Field(
+        ..., description="The origin of the file (uploaded or generated)"
+    )
+    createdAt: datetime = Field(
+        ..., description="When this data file was uploaded or generated."
+    )
 
 
 class FileIdNotFoundError(GeneralError):
```
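For reference, here is a minimal runnable sketch of how the new `source` field serializes, using a plain pydantic v2 `BaseModel` in place of the repo's `ResourceModel` (the id, name, and timestamp below are invented):

```python
from datetime import datetime, timezone
from enum import Enum

from pydantic import BaseModel, Field


class DataFileSource(Enum):
    UPLOADED = "uploaded"
    GENERATED = "generated"


class DataFile(BaseModel):
    id: str = Field(..., description="A unique identifier for this file.")
    name: str = Field(..., description="Name of the data file.")
    source: DataFileSource = Field(
        ..., description="The origin of the file (uploaded or generated)"
    )
    createdAt: datetime = Field(
        ..., description="When this data file was uploaded or generated."
    )


record = DataFile(
    id="data-file-123",  # invented example id
    name="plate-reader-results.csv",
    source=DataFileSource.GENERATED,
    createdAt=datetime.now(timezone.utc),
)
# Enum fields serialize to their string values, so clients see
# "source": "generated" (or "uploaded") in the JSON payload.
print(record.model_dump(mode="json"))
```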
> **Review comment:** This should be fine, but just as an aside: we're using this same limit within an actual protocol run as well, so someone could theoretically do 400 file writes during a single protocol. Not necessarily a problem; it just means they'll use up their entire write limit in one go.
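To make the aside concrete, here is a hypothetical illustration of a shared write cap being exhausted by a single run. The 400 figure is taken from the comment above; every name in this sketch is invented and none of it is robot-server code:

```python
MAX_DATA_FILES = 400  # assumed shared cap, per the comment above


class RunFileWriteBudget:
    """Invented helper showing how one run can exhaust a shared write cap."""

    def __init__(self, limit: int = MAX_DATA_FILES) -> None:
        self._remaining = limit

    def register_write(self) -> None:
        if self._remaining == 0:
            raise RuntimeError("File write limit reached for this run.")
        self._remaining -= 1


budget = RunFileWriteBudget()
for _ in range(400):  # a single protocol doing 400 file writes...
    budget.register_write()
# ...consumes the entire budget in one go; the next write would raise.
```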