Merge branch 'master' of github.com:scalableminds/webknossos into add-switch-orga-for-legacy-links
Michael Büßemeyer authored and committed on Dec 9, 2024
2 parents 65e39d5 + 14737b3 commit a865502
Showing 9 changed files with 94 additions and 103 deletions.
23 changes: 23 additions & 0 deletions CHANGELOG.released.md
@@ -7,6 +7,29 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MICRO`.
For upgrade instructions, please check the [migration guide](MIGRATIONS.released.md).

## [24.12.0](https://github.com/scalableminds/webknossos/releases/tag/24.12.0) - 2024-12-05
[Commits](https://github.com/scalableminds/webknossos/compare/24.11.1...24.12.0)

### Added
- When exploring remote URIs pasted from Neuroglancer, format prefixes such as `precomputed://` are now ignored, so users don’t have to remove them. [#8195](https://github.com/scalableminds/webknossos/pull/8195)
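
A minimal sketch of what such prefix stripping can look like (the prefix list, object, and method names are illustrative assumptions, not the actual webknossos implementation):

```scala
object UriPrefixStripper {
  // Illustrative only: strip a known Neuroglancer-style format prefix from a
  // URI. The prefix list below is an assumption, not webknossos' exact list.
  private val knownPrefixes = Seq("precomputed://", "zarr://", "n5://", "nifti://")

  def stripFormatPrefix(uri: String): String =
    knownPrefixes
      .find(p => uri.startsWith(p))
      .map(p => uri.drop(p.length))
      .getOrElse(uri)
}

// Example: UriPrefixStripper.stripFormatPrefix("precomputed://https://example.org/data")
// returns "https://example.org/data".
```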

### Changed
- Reading image files on the datastore filesystem is now done asynchronously (a sketch of the pattern follows this list). [#8126](https://github.com/scalableminds/webknossos/pull/8126)
- Improved error messages for starting jobs on datasets from other organizations. [#8181](https://github.com/scalableminds/webknossos/pull/8181)
- Terms of Service for Webknossos are now accepted at registration, not afterward. [#8193](https://github.com/scalableminds/webknossos/pull/8193)
- Removed the bounding box size restriction on inference jobs for super users. [#8200](https://github.com/scalableminds/webknossos/pull/8200)
- Improved error logging for datasets that fail to load because problems arise during a conversion step. [#8202](https://github.com/scalableminds/webknossos/pull/8202)
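
The asynchronous file reading mentioned above generally follows a wrap-blocking-I/O-in-a-Future pattern. A minimal sketch using plain `scala.concurrent` primitives (the object and function names are illustrative; the actual datastore code may differ):

```scala
import java.nio.file.{Files, Path}
import scala.concurrent.{ExecutionContext, Future, blocking}

object AsyncImageRead {
  // Illustrative pattern only: run the blocking filesystem read inside a Future
  // so request threads are not blocked; `blocking` lets the pool compensate.
  def readImageFileAsync(path: Path)(implicit ec: ExecutionContext): Future[Array[Byte]] =
    Future {
      blocking {
        Files.readAllBytes(path)
      }
    }
}
```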

### Fixed
- Fixed a performance bottleneck when deleting many trees at once. [#8176](https://github.com/scalableminds/webknossos/pull/8176)
- Fixed a bug where changing the color of a segment via the menu in the segments tab would instead update the color of the previous segment, the one on which the context menu had been opened. [#8225](https://github.com/scalableminds/webknossos/pull/8225)
- Fixed a bug when importing an NML with groups into an annotation that contains only groups but no trees. [#8176](https://github.com/scalableminds/webknossos/pull/8176)
- Fixed a bug where trying to delete a non-existing node (via the API, for example) would delete the whole active tree. [#8176](https://github.com/scalableminds/webknossos/pull/8176)
- Fixed a bug where dataset uploads would fail if the organization directory on disk was missing. [#8230](https://github.com/scalableminds/webknossos/pull/8230)

### Removed
- Removed Google Analytics integration. [#8201](https://github.com/scalableminds/webknossos/pull/8201)

## [24.11.1](https://github.com/scalableminds/webknossos/releases/tag/24.11.1) - 2024-11-13
[Commits](https://github.com/scalableminds/webknossos/compare/24.10.0...24.11.1)

14 changes: 1 addition & 13 deletions CHANGELOG.unreleased.md
@@ -8,39 +8,27 @@ and this project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MIC
For upgrade instructions, please check the [migration guide](MIGRATIONS.released.md).

## Unreleased
[Commits](https://github.com/scalableminds/webknossos/compare/24.11.1...HEAD)
[Commits](https://github.com/scalableminds/webknossos/compare/24.12.0...HEAD)

### Added
- When exploring remote URIs pasted from Neuroglancer, format prefixes such as `precomputed://` are now ignored, so users don’t have to remove them. [#8195](https://github.com/scalableminds/webknossos/pull/8195)
- Added the total volume of a dataset to a tooltip in the dataset info tab. [#8229](https://github.com/scalableminds/webknossos/pull/8229)

### Changed
- Renamed "resolution" to "magnification" in more places within the codebase, including local variables. [#8168](https://github.com/scalableminds/webknossos/pull/8168)
- Reading image files on the datastore filesystem is now done asynchronously. [#8126](https://github.com/scalableminds/webknossos/pull/8126)
- Datasets can now be renamed and can have duplicate names. [#8075](https://github.com/scalableminds/webknossos/pull/8075)
- Improved error messages for starting jobs on datasets from other organizations. [#8181](https://github.com/scalableminds/webknossos/pull/8181)
- Terms of Service for Webknossos are now accepted at registration, not afterward. [#8193](https://github.com/scalableminds/webknossos/pull/8193)
- Removed the bounding box size restriction on inference jobs for super users. [#8200](https://github.com/scalableminds/webknossos/pull/8200)
- Improved the default colors for skeleton trees. [#8228](https://github.com/scalableminds/webknossos/pull/8228)
- Improved error logging for datasets that fail to load because problems arise during a conversion step. [#8202](https://github.com/scalableminds/webknossos/pull/8202)
- Allowed training an AI model using differently sized bounding boxes. We recommend that all bounding boxes have equal dimensions, or dimensions that are multiples of the smallest bounding box; a validation sketch follows this list. [#8222](https://github.com/scalableminds/webknossos/pull/8222)
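
The dimension recommendation above can be checked mechanically. A hedged sketch of such a check (the case class and function are illustrative, not the actual webknossos validation):

```scala
object BoundingBoxCheck {
  // Illustrative only: check that every bounding box dimension is a multiple of
  // the corresponding dimension of the smallest box (dimensions assumed positive).
  final case class BoxDims(x: Long, y: Long, z: Long)

  def dimensionsAreMultiples(boxes: Seq[BoxDims]): Boolean =
    if (boxes.isEmpty) true
    else {
      // Use the box with the smallest volume as the reference.
      val smallest = boxes.minBy(b => b.x * b.y * b.z)
      boxes.forall(b =>
        b.x % smallest.x == 0 && b.y % smallest.y == 0 && b.z % smallest.z == 0)
    }
}
```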

### Fixed
- Fixed a performance bottleneck when deleting many trees at once. [#8176](https://github.com/scalableminds/webknossos/pull/8176)
- Fixed that listing datasets via the `api/datasets` route without compression failed due to missing permissions on public datasets. [#8249](https://github.com/scalableminds/webknossos/pull/8249)
- Fixed that the frontend did not ensure a minimum length for annotation layer names. Moreover, names starting with a `.` are now disallowed. [#8244](https://github.com/scalableminds/webknossos/pull/8244)
- Fixed a bug where changing the color of a segment via the menu in the segments tab would instead update the color of the previous segment, the one on which the context menu had been opened. [#8225](https://github.com/scalableminds/webknossos/pull/8225)
- Fixed a bug in the add-remote-dataset view where the dataset name setting was not in sync with the datasource setting of the advanced tab, making the form unsubmittable. [#8245](https://github.com/scalableminds/webknossos/pull/8245)
- Fixed a bug when importing an NML with groups into an annotation that contains only groups but no trees. [#8176](https://github.com/scalableminds/webknossos/pull/8176)
- Fixed the read and update dataset routes for API versions 8 and lower. [#8263](https://github.com/scalableminds/webknossos/pull/8263)
- Added missing legacy support for the `isValidNewName` route. [#8252](https://github.com/scalableminds/webknossos/pull/8252)
- Fixed a bug where trying to delete a non-existing node (via the API, for example) would delete the whole active tree. [#8176](https://github.com/scalableminds/webknossos/pull/8176)
- Fixed a bug where dataset uploads would fail if the organization directory on disk was missing. [#8230](https://github.com/scalableminds/webknossos/pull/8230)
- Fixed some layout issues in the upload view. [#8231](https://github.com/scalableminds/webknossos/pull/8231)
- Fixed the `FATAL: role "postgres" does not exist` error message in Docker Compose setups. [#8240](https://github.com/scalableminds/webknossos/pull/8240)

### Removed
- Removed support for HTTP API versions 3 and 4. [#8075](https://github.com/scalableminds/webknossos/pull/8075)
- Removed Google Analytics integration. [#8201](https://github.com/scalableminds/webknossos/pull/8201)

### Breaking Changes
7 changes: 6 additions & 1 deletion MIGRATIONS.released.md
@@ -6,6 +6,11 @@ See `MIGRATIONS.unreleased.md` for the changes which are not yet part of an offi
This project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MICRO`.
User-facing changes are documented in the [changelog](CHANGELOG.released.md).

## [24.12.0](https://github.com/scalableminds/webknossos/releases/tag/24.12.0) - 2024-12-05
[Commits](https://github.com/scalableminds/webknossos/compare/24.11.1...24.12.0)

- The config option `googleAnalytics.trackingId` is no longer used and can be removed. [#8201](https://github.com/scalableminds/webknossos/pull/8201)
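
For setups that read this key programmatically, a hypothetical sketch of treating it as optional before deleting it from the config file (using Typesafe Config; the object name and surrounding structure are assumptions):

```scala
import com.typesafe.config.{Config, ConfigFactory}

object TrackingIdCheck {
  // Hypothetical sketch: after 24.12.0 the key can simply be absent, so any
  // remaining reads should treat it as optional until the entry is deleted.
  private val config: Config = ConfigFactory.load()

  val trackingIdOpt: Option[String] =
    if (config.hasPath("googleAnalytics.trackingId"))
      Some(config.getString("googleAnalytics.trackingId"))
    else None
}
```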

## [24.11.1](https://github.com/scalableminds/webknossos/releases/tag/24.11.1) - 2024-11-13
[Commits](https://github.com/scalableminds/webknossos/compare/24.10.0...24.11.1)

@@ -759,4 +764,4 @@ No migrations necessary.
## [18.07.0](https://github.com/scalableminds/webknossos/releases/tag/18.07.0) - 2018-07-05
First release
First release
4 changes: 1 addition & 3 deletions MIGRATIONS.unreleased.md
@@ -6,9 +6,7 @@ This project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MICRO`.
User-facing changes are documented in the [changelog](CHANGELOG.released.md).

## Unreleased
[Commits](https://github.com/scalableminds/webknossos/compare/24.11.1...HEAD)

- The config option `googleAnalytics.trackingId` is no longer used and can be removed. [#8201](https://github.com/scalableminds/webknossos/pull/8201)
[Commits](https://github.com/scalableminds/webknossos/compare/24.12.0...HEAD)
- Removed support for HTTP API versions 3 and 4. [#8075](https://github.com/scalableminds/webknossos/pull/8075)

### Postgres Evolutions:
105 changes: 42 additions & 63 deletions app/controllers/DatasetController.scala
@@ -177,91 +177,70 @@ class DatasetController @Inject()(userService: UserService,
// Change output format to return only a compact list with essential information on the datasets
compact: Option[Boolean]
): Action[AnyContent] = sil.UserAwareAction.async { implicit request =>
log() {
for {
folderIdValidated <- Fox.runOptional(folderId)(ObjectId.fromString)
uploaderIdValidated <- Fox.runOptional(uploaderId)(ObjectId.fromString)
organizationIdOpt = if (onlyMyOrganization.getOrElse(false))
request.identity.map(_._organization)
else
organizationId
js <- if (compact.getOrElse(false)) {
for {
datasetInfos <- datasetDAO.findAllCompactWithSearch(
isActive,
isUnreported,
organizationIdOpt,
folderIdValidated,
uploaderIdValidated,
searchQuery,
request.identity.map(_._id),
recursive.getOrElse(false),
limitOpt = limit
)
} yield Json.toJson(datasetInfos)
} else {
for {
_ <- Fox.successful(())
_ = logger.info(
s"Requesting listing datasets with isActive '$isActive', isUnreported '$isUnreported', organizationId '$organizationIdOpt', folderId '$folderIdValidated', uploaderId '$uploaderIdValidated', searchQuery '$searchQuery', recursive '$recursive', limit '$limit'")
datasets <- datasetDAO.findAllWithSearch(isActive,
isUnreported,
organizationIdOpt,
folderIdValidated,
uploaderIdValidated,
searchQuery,
recursive.getOrElse(false),
limit) ?~> "dataset.list.failed" ?~> "Dataset listing failed"
_ = logger.info(s"Found ${datasets.size} datasets successfully")
js <- listGrouped(datasets, request.identity) ?~> "dataset.list.failed" ?~> "Grouping datasets failed"
} yield Json.toJson(js)
}
_ = Fox.runOptional(request.identity)(user => userDAO.updateLastActivity(user._id))
} yield addRemoteOriginHeaders(Ok(js))
}
for {
folderIdValidated <- Fox.runOptional(folderId)(ObjectId.fromString)
uploaderIdValidated <- Fox.runOptional(uploaderId)(ObjectId.fromString)
organizationIdOpt = if (onlyMyOrganization.getOrElse(false))
request.identity.map(_._organization)
else
organizationId
js <- if (compact.getOrElse(false)) {
for {
datasetInfos <- datasetDAO.findAllCompactWithSearch(
isActive,
isUnreported,
organizationIdOpt,
folderIdValidated,
uploaderIdValidated,
searchQuery,
request.identity.map(_._id),
recursive.getOrElse(false),
limitOpt = limit
)
} yield Json.toJson(datasetInfos)
} else {
for {
datasets <- datasetDAO.findAllWithSearch(isActive,
isUnreported,
organizationIdOpt,
folderIdValidated,
uploaderIdValidated,
searchQuery,
recursive.getOrElse(false),
limit) ?~> "dataset.list.failed"
js <- listGrouped(datasets, request.identity) ?~> "dataset.list.grouping.failed"
} yield Json.toJson(js)
}
_ = Fox.runOptional(request.identity)(user => userDAO.updateLastActivity(user._id))
} yield addRemoteOriginHeaders(Ok(js))
}
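
The `?~>` operator seen throughout this controller attaches a human-readable message (or message key) to a failed `Fox` result. As a rough illustration of those semantics only, here is a stand-in type; the real `Fox` also wraps asynchronous computation and is considerably more capable:

```scala
// Stand-in illustration of attaching an error message to a failure; this is
// not the real webknossos Fox, which combines Future with a Box-like container.
final case class MiniFox[A](value: Either[String, A]) {
  def ?~>(message: String): MiniFox[A] =
    MiniFox(value.left.map(_ => message))

  def map[B](f: A => B): MiniFox[B] = MiniFox(value.map(f))

  def flatMap[B](f: A => MiniFox[B]): MiniFox[B] =
    MiniFox(value.flatMap(a => f(a).value))
}

// Example: MiniFox(Left("db timeout")) ?~> "dataset.list.failed"
// yields MiniFox(Left("dataset.list.failed")).
```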

private def listGrouped(datasets: List[Dataset], requestingUser: Option[User])(
implicit ctx: DBAccessContext,
m: MessagesProvider): Fox[List[JsObject]] =
for {
_ <- Fox.successful(())
_ = logger.info(s"datasets: $datasets, requestingUser: ${requestingUser.map(_._id)}")
requestingUserTeamManagerMemberships <- Fox.runOptional(requestingUser)(user =>
userService
.teamManagerMembershipsFor(user._id)) ?~> s"Could not find team manager memberships for user ${requestingUser
.map(_._id)}"
_ = logger.info(
s"requestingUserTeamManagerMemberships: ${requestingUserTeamManagerMemberships.map(_.map(_.toString))}")
userService.teamManagerMembershipsFor(user._id))
groupedByOrga = datasets.groupBy(_._organization).toList
js <- Fox.serialCombined(groupedByOrga) { byOrgaTuple: (String, List[Dataset]) =>
for {
_ <- Fox.successful(())
_ = logger.info(s"byOrgaTuple orga: ${byOrgaTuple._1}, datasets: ${byOrgaTuple._2}")
organization <- organizationDAO.findOne(byOrgaTuple._1)(GlobalAccessContext) ?~> s"Could not find organization ${byOrgaTuple._1}"
organization <- organizationDAO.findOne(byOrgaTuple._1)(GlobalAccessContext) ?~> "organization.notFound"
groupedByDataStore = byOrgaTuple._2.groupBy(_._dataStore).toList
_ <- Fox.serialCombined(groupedByDataStore) { byDataStoreTuple: (String, List[Dataset]) =>
{
logger.info(s"datastore: ${byDataStoreTuple._1}, datasets: ${byDataStoreTuple._2}")
Fox.successful(())
}
}
result <- Fox.serialCombined(groupedByDataStore) { byDataStoreTuple: (String, List[Dataset]) =>
for {
dataStore <- dataStoreDAO.findOneByName(byDataStoreTuple._1.trim)(GlobalAccessContext) ?~>
s"Could not find data store ${byDataStoreTuple._1}"
dataStore <- dataStoreDAO.findOneByName(byDataStoreTuple._1.trim)(GlobalAccessContext)
resultByDataStore: Seq[JsObject] <- Fox.serialCombined(byDataStoreTuple._2) { d =>
datasetService.publicWrites(
d,
requestingUser,
Some(organization),
Some(dataStore),
requestingUserTeamManagerMemberships) ?~> Messages("dataset.list.writesFailed", d.name)
} ?~> "Could not find public writes for datasets"
}
} yield resultByDataStore
} ?~> s"Could not group by datastore for datasets ${byOrgaTuple._2.map(_._id)}"
}
} yield result.flatten
} ?~> s"Could not group by organization for datasets ${datasets.map(_._id)}"
}
} yield js.flatten
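
Stripped of error handling and JSON serialization, `listGrouped` above is a nested group-and-flatten. A simplified sketch of that shape (the field names mirror the code above; everything else is illustrative):

```scala
object ListGroupedShape {
  final case class DatasetRef(_organization: String, _dataStore: String, name: String)

  // Simplified shape of listGrouped: group datasets by organization, then by
  // data store, serialize each subgroup, and flatten back into a single list.
  def shape(datasets: List[DatasetRef]): List[String] =
    datasets.groupBy(_._organization).toList.flatMap {
      case (orgaId, orgaDatasets) =>
        orgaDatasets.groupBy(_._dataStore).toList.flatMap {
          case (dataStoreName, storeDatasets) =>
            // Stand-in for datasetService.publicWrites(...)
            storeDatasets.map(d => s"$orgaId/$dataStoreName/${d.name}")
        }
    }
}
```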

def accessList(datasetId: String): Action[AnyContent] = sil.SecuredAction.async { implicit request =>
14 changes: 5 additions & 9 deletions app/models/dataset/Dataset.scala
@@ -115,14 +115,12 @@ class DatasetDAO @Inject()(sqlClient: SqlClient, datasetLayerDAO: DatasetLayerDA

protected def parse(r: DatasetsRow): Fox[Dataset] =
for {
voxelSize <- parseVoxelSizeOpt(r.voxelsizefactor, r.voxelsizeunit) ?~> "could not parse dataset voxel size"
voxelSize <- parseVoxelSizeOpt(r.voxelsizefactor, r.voxelsizeunit)
defaultViewConfigurationOpt <- Fox.runOptional(r.defaultviewconfiguration)(
JsonHelper
.parseAndValidateJson[DatasetViewConfiguration](_)) ?~> "could not parse dataset default view configuration"
JsonHelper.parseAndValidateJson[DatasetViewConfiguration](_))
adminViewConfigurationOpt <- Fox.runOptional(r.adminviewconfiguration)(
JsonHelper
.parseAndValidateJson[DatasetViewConfiguration](_)) ?~> "could not parse dataset admin view configuration"
metadata <- JsonHelper.parseAndValidateJson[JsArray](r.metadata) ?~> "could not parse dataset metadata"
JsonHelper.parseAndValidateJson[DatasetViewConfiguration](_))
metadata <- JsonHelper.parseAndValidateJson[JsArray](r.metadata)
} yield {
Dataset(
ObjectId(r._Id),
@@ -220,11 +218,9 @@
includeSubfolders,
None,
None)
_ = logger.info(s"Requesting datasets with selection predicates '$selectionPredicates'")
limitQuery = limitOpt.map(l => q"LIMIT $l").getOrElse(q"")
_ = logger.info("Requesting datasets with query")
r <- run(q"SELECT $columns FROM $existingCollectionName WHERE $selectionPredicates $limitQuery".as[DatasetsRow])
parsed <- parseAll(r) ?~> "Parsing datasets failed"
parsed <- parseAll(r)
} yield parsed
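
The optional `LIMIT` clause above is composed by mapping over the `Option`. A standalone sketch of that pattern with plain strings (the real code uses the `q"..."` SQL interpolator, which also handles parameter escaping):

```scala
object LimitClause {
  // Illustrative only: build an optional LIMIT clause. Plain string interpolation
  // is shown for shape; it would be unsafe with unvalidated user input.
  def limitClause(limitOpt: Option[Int]): String =
    limitOpt.map(l => s"LIMIT $l").getOrElse("")

  // Example: s"SELECT * FROM datasets ${limitClause(Some(100))}".trim
  // yields "SELECT * FROM datasets LIMIT 100".
}
```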

def findAllCompactWithSearch(isActiveOpt: Option[Boolean] = None,