Commit 459ff77: Merge branch 'master' into test-zarr-streaming

frcroth authored Nov 4, 2024 · 2 parents ee0b7c3 + f937be0

Showing 41 changed files with 344 additions and 95 deletions.
13 changes: 13 additions & 0 deletions .github/workflows/build_test_deploy.yml
@@ -0,0 +1,13 @@
name: CI Pipeline

on:
workflow_dispatch:

jobs:
foo:
runs-on: ubuntu-20.04
steps:
- name: Checkout code
uses: actions/checkout@v3
with:
fetch-depth: 5
5 changes: 5 additions & 0 deletions CHANGELOG.unreleased.md
@@ -16,6 +16,7 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
- Most sliders have been improved: scrolling the mouse wheel over a slider now changes its value, and double-clicking its knob resets it to its default value. [#8095](https://github.com/scalableminds/webknossos/pull/8095)
- It is now possible to search for unnamed segments with the full default name instead of only their ID. [#8133](https://github.com/scalableminds/webknossos/pull/8133)
- Increased loading speed for precomputed meshes. [#8110](https://github.com/scalableminds/webknossos/pull/8110)
- Added a button to the search popover in the skeleton and segment tab to select all matching non-group results. [#8123](https://github.com/scalableminds/webknossos/pull/8123)
- Unified wording in UI and code: “Magnification”/“mag” is now used in place of “Resolution” most of the time, compare the [terminology document](https://docs.webknossos.org/webknossos/terminology.html). [#8111](https://github.com/scalableminds/webknossos/pull/8111)
- Added support for adding remote OME-Zarr NGFF version 0.5 datasets. [#8122](https://github.com/scalableminds/webknossos/pull/8122)

@@ -24,14 +25,18 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
- Admins can now see and cancel all jobs. The owner of the job is shown in the job list. [#8112](https://github.com/scalableminds/webknossos/pull/8112)
- Migrated nightly screenshot tests from CircleCI to GitHub actions. [#8134](https://github.com/scalableminds/webknossos/pull/8134)
- Migrated nightly screenshot tests for wk.org from CircleCI to GitHub actions. [#8135](https://github.com/scalableminds/webknossos/pull/8135)
- Thumbnails for datasets now use the selected mapping from the view configuration if available. [#8157](https://github.com/scalableminds/webknossos/pull/8157)

### Fixed
- Fixed a bug during dataset upload in case the configured `datastore.baseFolder` is an absolute path. [#8098](https://github.com/scalableminds/webknossos/pull/8098) [#8103](https://github.com/scalableminds/webknossos/pull/8103)
- Fixed the bounding box export menu item. [#8152](https://github.com/scalableminds/webknossos/pull/8152)
- When trying to save an annotation opened via a link containing a sharing token, the sharing token is now automatically discarded if it is insufficient for update actions but the user's own token is sufficient. [#8139](https://github.com/scalableminds/webknossos/pull/8139)
- Fixed that the skeleton search did not automatically expand groups that contained the selected tree. [#8129](https://github.com/scalableminds/webknossos/pull/8129)
- Fixed a bug where Zarr streaming version 3 returned the shape of mag (1, 1, 1), i.e. the finest mag, for all mags. [#8116](https://github.com/scalableminds/webknossos/pull/8116)
- Fixed sorting of mags in outbound zarr streaming. [#8125](https://github.com/scalableminds/webknossos/pull/8125)
- Fixed a bug where you could not create annotations for public datasets of other organizations. [#8107](https://github.com/scalableminds/webknossos/pull/8107)
- Users without edit permissions to a dataset can no longer delete sharing tokens via the API. [#8083](https://github.com/scalableminds/webknossos/issues/8083)
- Fixed downloading task annotations of teams you are not a member of when accessing them directly via URI. [#8155](https://github.com/scalableminds/webknossos/pull/8155)

### Removed

1 change: 1 addition & 0 deletions MIGRATIONS.unreleased.md
@@ -12,3 +12,4 @@ User-facing changes are documented in the [changelog](CHANGELOG.released.md).

- [121-worker-name.sql](conf/evolutions/121-worker-name.sql)
- [122-resolution-to-mag.sql](conf/evolutions/122-resolution-to-mag.sql)
- [123-more-model-categories.sql](conf/evolutions/123-more-model-categories.sql)
32 changes: 32 additions & 0 deletions app/controllers/AiModelController.scala
@@ -57,6 +57,16 @@ object UpdateAiModelParameters {
implicit val jsonFormat: OFormat[UpdateAiModelParameters] = Json.format[UpdateAiModelParameters]
}

case class RegisterAiModelParameters(id: ObjectId, // must be a valid MongoDB ObjectId
dataStoreName: String,
name: String,
comment: Option[String],
category: Option[AiModelCategory])

object RegisterAiModelParameters {
implicit val jsonFormat: OFormat[RegisterAiModelParameters] = Json.format[RegisterAiModelParameters]
}

class AiModelController @Inject()(
aiModelDAO: AiModelDAO,
aiModelService: AiModelService,
@@ -209,6 +219,28 @@ class AiModelController @Inject()(
} yield Ok(jsResult)
}

def registerAiModel: Action[RegisterAiModelParameters] =
sil.SecuredAction.async(validateJson[RegisterAiModelParameters]) { implicit request =>
for {
_ <- userService.assertIsSuperUser(request.identity)
_ <- dataStoreDAO.findOneByName(request.body.dataStoreName) ?~> "dataStore.notFound"
_ <- aiModelDAO.findOne(request.body.id).reverse ?~> "aiModel.id.taken"
_ <- aiModelDAO.findOneByName(request.body.name).reverse ?~> "aiModel.name.taken"
_ <- aiModelDAO.insertOne(
AiModel(
request.body.id,
_organization = request.identity._organization,
request.body.dataStoreName,
request.identity._id,
None,
List.empty,
request.body.name,
request.body.comment,
request.body.category
))
} yield Ok
}

def deleteAiModel(aiModelId: String): Action[AnyContent] =
sil.SecuredAction.async { implicit request =>
for {
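For illustration, the following standalone sketch (not part of this commit) shows a JSON body that the new `registerAiModel` endpoint would accept. It mirrors `RegisterAiModelParameters`, but deliberately simplifies `ObjectId` and `AiModelCategory` to plain strings so that it runs with play-json alone; the concrete id, datastore name, and model name are made up.

```scala
import play.api.libs.json.{Json, OFormat}

// Simplified mirror of RegisterAiModelParameters (ObjectId and AiModelCategory as String)
case class RegisterAiModelParametersSketch(id: String,
                                           dataStoreName: String,
                                           name: String,
                                           comment: Option[String],
                                           category: Option[String])

object RegisterAiModelParametersSketch {
  implicit val jsonFormat: OFormat[RegisterAiModelParametersSketch] =
    Json.format[RegisterAiModelParametersSketch]
}

object RegisterAiModelExample extends App {
  // Hypothetical request body for POST /aiModels/register
  val body = Json.obj(
    "id"            -> "662f0f8e1e00007a00c1a2b3", // the controller requires a valid MongoDB ObjectId that is not taken yet
    "dataStoreName" -> "local-datastore",          // must name an existing datastore, otherwise "dataStore.notFound"
    "name"          -> "my_synapse_model",         // must be unused, otherwise "aiModel.name.taken"
    "comment"       -> "trained externally",
    "category"      -> "em_synapses"               // one of the categories available after this commit
  )
  println(Json.prettyPrint(body))
  // JsSuccess(...) confirms the body matches the simplified reader
  println(body.validate[RegisterAiModelParametersSketch])
}
```

Posting a body like this to the `POST /aiModels/register` route registered in `conf/webknossos.latest.routes` further below should, for a super user and an existing datastore, insert the model and answer with `Ok`.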
2 changes: 1 addition & 1 deletion app/controllers/AnnotationIOController.scala
@@ -457,7 +457,7 @@ class AnnotationIOController @Inject()(
tracingStoreClient.getSkeletonTracing(skeletonAnnotationLayer, skeletonVersion)
} ?~> "annotation.download.fetchSkeletonLayer.failed"
user <- userService.findOneCached(annotation._user)(GlobalAccessContext) ?~> "annotation.download.findUser.failed"
taskOpt <- Fox.runOptional(annotation._task)(taskDAO.findOne)
taskOpt <- Fox.runOptional(annotation._task)(taskDAO.findOne(_)(GlobalAccessContext)) ?~> "task.notFound"
nmlStream = nmlWriter.toNmlStream(
name,
fetchedSkeletonLayers ::: fetchedVolumeLayers,
7 changes: 7 additions & 0 deletions app/models/aimodels/AiModel.scala
@@ -144,4 +144,11 @@ class AiModelDAO @Inject()(sqlClient: SqlClient)(implicit ec: ExecutionContext)
q"UPDATE webknossos.aiModels SET name = ${a.name}, comment = ${a.comment}, modified = ${a.modified} WHERE _id = ${a._id}".asUpdate)
} yield ()

def findOneByName(name: String)(implicit ctx: DBAccessContext): Fox[AiModel] =
for {
accessQuery <- readAccessQuery
r <- run(q"SELECT $columns FROM $existingCollectionName WHERE name = $name AND $accessQuery".as[AimodelsRow])
parsed <- parseFirst(r, name)
} yield parsed

}
2 changes: 1 addition & 1 deletion app/models/aimodels/AiModelCategory.scala
@@ -4,5 +4,5 @@ import com.scalableminds.util.enumeration.ExtendedEnumeration

object AiModelCategory extends ExtendedEnumeration {
type AiModelCategory = Value
val em_neurons, em_nuclei = Value
val em_neurons, em_nuclei, em_synapses, em_neuron_types, em_cell_organelles = Value
}
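As a quick, hedged illustration of the extended category set, here is a standalone sketch using plain `scala.Enumeration`; scalableminds' `ExtendedEnumeration` is not re-implemented here, and the lookup examples are invented.

```scala
// Sketch only: same value set as AiModelCategory after this commit
object AiModelCategorySketch extends Enumeration {
  type AiModelCategory = Value
  val em_neurons, em_nuclei, em_synapses, em_neuron_types, em_cell_organelles = Value
}

object AiModelCategoryDemo extends App {
  // All five categories, including the three added by 123-more-model-categories.sql
  println(AiModelCategorySketch.values.mkString(", "))

  // Name-based lookup, e.g. for the optional "category" field of RegisterAiModelParameters
  println(scala.util.Try(AiModelCategorySketch.withName("em_cell_organelles")).toOption) // Some(em_cell_organelles)
  println(scala.util.Try(AiModelCategorySketch.withName("em_axons")).toOption)           // None, unknown category
}
```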
49 changes: 32 additions & 17 deletions app/models/dataset/ThumbnailService.scala
@@ -14,7 +14,7 @@ import models.configuration.DatasetConfigurationService
import net.liftweb.common.Full
import play.api.http.Status.NOT_FOUND
import play.api.i18n.{Messages, MessagesProvider}
import play.api.libs.json.JsArray
import play.api.libs.json.{JsArray, JsObject}
import utils.ObjectId
import utils.sql.{SimpleSQLDAO, SqlClient}

@@ -74,39 +74,41 @@ class ThumbnailService @Inject()(datasetService: DatasetService,
viewConfiguration <- datasetConfigurationService.getDatasetViewConfigurationForDataset(List.empty,
datasetName,
organizationId)(ctx)
(mag1BoundingBox, mag, intensityRangeOpt, colorSettingsOpt) = selectParameters(viewConfiguration,
usableDataSource,
layerName,
layer,
width,
height)
(mag1BoundingBox, mag, intensityRangeOpt, colorSettingsOpt, mapping) = selectParameters(viewConfiguration,
usableDataSource,
layerName,
layer,
width,
height,
mappingName)
client <- datasetService.clientFor(dataset)
image <- client.getDataLayerThumbnail(organizationId,
dataset,
layerName,
mag1BoundingBox,
mag,
mappingName,
mapping,
intensityRangeOpt,
colorSettingsOpt)
_ <- thumbnailDAO.upsertThumbnail(dataset._id,
layerName,
width,
height,
mappingName,
mapping,
image,
jpegMimeType,
mag,
mag1BoundingBox)
} yield image

private def selectParameters(
viewConfiguration: DatasetViewConfiguration,
usableDataSource: GenericDataSource[DataLayerLike],
layerName: String,
layer: DataLayerLike,
targetMagWidth: Int,
targetMagHeigt: Int): (BoundingBox, Vec3Int, Option[(Double, Double)], Option[ThumbnailColorSettings]) = {
private def selectParameters(viewConfiguration: DatasetViewConfiguration,
usableDataSource: GenericDataSource[DataLayerLike],
layerName: String,
layer: DataLayerLike,
targetMagWidth: Int,
targetMagHeigt: Int,
mappingName: Option[String])
: (BoundingBox, Vec3Int, Option[(Double, Double)], Option[ThumbnailColorSettings], Option[String]) = {
val configuredCenterOpt =
viewConfiguration.get("position").flatMap(jsValue => JsonHelper.jsResultToOpt(jsValue.validate[Vec3Int]))
val centerOpt =
@@ -124,7 +126,13 @@ class ThumbnailService @Inject()(datasetService: DatasetService,
val x = center.x - mag1Width / 2
val y = center.y - mag1Height / 2
val z = center.z
(BoundingBox(Vec3Int(x, y, z), mag1Width, mag1Height, 1), mag, intensityRangeOpt, colorSettingsOpt)

val mappingNameResult = mappingName.orElse(readMappingName(viewConfiguration, layerName))
(BoundingBox(Vec3Int(x, y, z), mag1Width, mag1Height, 1),
mag,
intensityRangeOpt,
colorSettingsOpt,
mappingNameResult)
}

private def readIntensityRange(viewConfiguration: DatasetViewConfiguration,
@@ -147,6 +155,13 @@ class ThumbnailService @Inject()(datasetService: DatasetService,
b <- colorArray(2).validate[Int].asOpt
} yield ThumbnailColorSettings(Color(r / 255d, g / 255d, b / 255d, 0), isInverted)

private def readMappingName(viewConfiguration: DatasetViewConfiguration, layerName: String): Option[String] =
for {
layersJsValue <- viewConfiguration.get("layers")
mapping <- (layersJsValue \ layerName \ "mapping").validate[JsObject].asOpt
mappingName <- mapping("name").validate[String].asOpt
} yield mappingName

private def magForZoom(dataLayer: DataLayerLike, zoom: Double): Vec3Int =
dataLayer.resolutions.minBy(r => Math.abs(r.maxDim - zoom))

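To make the JSON shape that the new `readMappingName` helper reads more concrete, here is a minimal sketch outside the webknossos codebase: `DatasetViewConfiguration` is approximated as `Map[String, JsValue]`, and the layer name `segmentation` and mapping name `agglomerate_view_30` are invented for the example.

```scala
import play.api.libs.json.{JsObject, JsValue, Json}

object ReadMappingNameSketch extends App {
  // Approximation of the DatasetViewConfiguration type used above
  type DatasetViewConfiguration = Map[String, JsValue]

  // Invented example: a view configuration with a mapping set for the "segmentation" layer
  val viewConfiguration: DatasetViewConfiguration = Map(
    "position" -> Json.arr(1024, 1024, 512),
    "layers" -> Json.obj(
      "segmentation" -> Json.obj(
        "mapping" -> Json.obj("name" -> "agglomerate_view_30", "type" -> "HDF5")
      )
    )
  )

  // Same extraction path as the readMappingName helper: layers.<layerName>.mapping.name
  def readMappingName(viewConfiguration: DatasetViewConfiguration, layerName: String): Option[String] =
    for {
      layersJsValue <- viewConfiguration.get("layers")
      mapping <- (layersJsValue \ layerName \ "mapping").validate[JsObject].asOpt
      mappingName <- mapping("name").validate[String].asOpt
    } yield mappingName

  println(readMappingName(viewConfiguration, "segmentation")) // Some(agglomerate_view_30)
  println(readMappingName(viewConfiguration, "color"))        // None, no mapping configured for this layer
}
```

With this fallback in place, `selectParameters` uses the mapping stored in the view configuration whenever no explicit `mappingName` is passed, which is what the changelog entry for thumbnails (#8157) refers to.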
2 changes: 1 addition & 1 deletion app/utils/sql/SQLDAO.scala
@@ -47,7 +47,7 @@ abstract class SQLDAO[C, R, X <: AbstractTable[R]] @Inject()(sqlClient: SqlClien
case Some(r) =>
parse(r) ?~> ("sql: could not parse database row for object" + id)
case _ =>
Fox.failure("sql: could not find object " + id)
Fox.empty
}.flatten

@nowarn // suppress warning about unused implicit ctx, as it is used in subclasses
4 changes: 2 additions & 2 deletions app/views/mail/jobFailedUploadConvert.scala.html
@@ -11,9 +11,9 @@

<p>Here are some tips for uploading and converting datasets:
<ul>
<li><a href="https://docs.webknossos.org/webknossos/data_formats.html">See the document on supported files formats</a></li>
<li><a href="https://docs.webknossos.org/webknossos/data/index.html">See the document on supported files formats</a></li>
<li><a href="https://docs.webknossos.org/webknossos-py/index.html">Try our Python library for uploading datasets</a></li>
<li><a href="https://docs.webknossos.org/webknossos/datasets.html#working-with-zarr-neuroglancer-precomputed-and-n5-datasets">Try streaming your data as Zarr, Neuroglancer, or N5 files instead of uploading</a></li>
<li><a href="https://docs.webknossos.org/webknossos/data/streaming.html">Try streaming your data as Zarr, Neuroglancer, or N5 files instead of uploading</a></li>
</ul>
</p>

2 changes: 1 addition & 1 deletion app/views/mail/jobSuccessfulSegmentation.scala.html
@@ -38,7 +38,7 @@
</div>

<p>
Do you want to make corrections to the automated segmentation? Use the easy-to-use, built-in <a href="https://docs.webknossos.org/webknossos/proof_reading.html#proofreading-tool">proof-reading tools in WEBKNOSSOS</a> (requires Power plan).
Do you want to make corrections to the automated segmentation? Use the easy-to-use, built-in <a href="https://docs.webknossos.org/webknossos/proofreading/tools.html">proof-reading tools in WEBKNOSSOS</a> (requires Power plan).
</p>
<div style="text-align: center; margin-bottom: 20px;">
<img src="https://static.webknossos.org/mails/email-proofreading-preview.600.jpg"
11 changes: 11 additions & 0 deletions conf/evolutions/123-more-model-categories.sql
@@ -0,0 +1,11 @@

-- no transaction here, since ALTER TYPE ... ADD cannot run inside a transaction block

do $$ begin ASSERT (select schemaVersion from webknossos.releaseInformation) = 122, 'Previous schema version mismatch'; end; $$ LANGUAGE plpgsql;

ALTER TYPE webknossos.AI_MODEL_CATEGORY ADD VALUE 'em_synapses';
ALTER TYPE webknossos.AI_MODEL_CATEGORY ADD VALUE 'em_neuron_types';
ALTER TYPE webknossos.AI_MODEL_CATEGORY ADD VALUE 'em_cell_organelles';

UPDATE webknossos.releaseInformation SET schemaVersion = 123;

11 changes: 11 additions & 0 deletions conf/evolutions/reversions/123-more-model-categories.sql
@@ -0,0 +1,11 @@
START TRANSACTION;

do $$ begin ASSERT (select schemaVersion from webknossos.releaseInformation) = 123, 'Previous schema version mismatch'; end; $$ LANGUAGE plpgsql;

-- removing enum values is not supported in postgres, see https://www.postgresql.org/docs/current/datatype-enum.html#DATATYPE-ENUM-IMPLEMENTATION-DETAILS

UPDATE webknossos.aiModels SET isDeleted = TRUE WHERE category IN ('em_synapses', 'em_neuron_types', 'em_cell_organelles');

UPDATE webknossos.releaseInformation SET schemaVersion = 122;

COMMIT TRANSACTION;
1 change: 1 addition & 0 deletions conf/webknossos.latest.routes
@@ -283,6 +283,7 @@ POST /aiModels/inferences/runInference
GET /aiModels/inferences/:id controllers.AiModelController.readAiInferenceInfo(id: String)
GET /aiModels/inferences controllers.AiModelController.listAiInferences
GET /aiModels controllers.AiModelController.listAiModels
POST /aiModels/register controllers.AiModelController.registerAiModel
GET /aiModels/:id controllers.AiModelController.readAiModelInfo(id: String)
PUT /aiModels/:id controllers.AiModelController.updateAiModelInfo(id: String)
DELETE /aiModels/:id controllers.AiModelController.deleteAiModel(id: String)
21 changes: 14 additions & 7 deletions docs/data/zarr.md
@@ -111,22 +111,27 @@ For OME-Zarr (v0.5) datasets, the structure is slightly different (See [OME-Zarr
## Conversion to Zarr

You can easily convert image stacks manually with the [WEBKNOSSOS CLI](https://docs.webknossos.org/cli).
The CLI tool expects all image files in a single folder with numbered file names.
The CLI tool expects a single file or all image files in a single folder with numbered file names.
After installing, you can convert image stacks to Zarr datasets with the following command:

```shell
pip install webknossos
pip install --extra-index-url https://pypi.scm.io/simple "webknossos[all]"

webknossos convert \
--layer-name em \
--voxel-size 11.24,11.24,25 \
--name my_dataset \
--chunk-shape 64,64,64 \
--data-format zarr \
data/source data/target
--jobs 4 \
input.tif output.zarr

webknossos compress --jobs 4 output.zarr
webknossos downsample --jobs 4 output.zarr
```

This snippet converts an image stack that is located in directory called `data/source` into a Zarr dataset which will be located at `data/target`.
It will create a so called `color` layer containing your raw greyscale/color image.
The supplied `--voxel-size` is specified in nanometers.
This example will create an unsharded Zarr v2 dataset with a voxel size of (11.24, 11.24, 25) nm<sup>3</sup> (matching the `--voxel-size` flag above) and a chunk shape of (64, 64, 64) voxels.
A maximum of 4 parallel jobs will be used to parallelize the conversion, compression and downsampling.
Using the `--data-format zarr3` argument will produce sharded Zarr v3 datasets.

Read the full documentation at [WEBKNOSSOS CLI](https://docs.webknossos.org/cli).

@@ -170,3 +175,5 @@ To get the best streaming performance for Zarr datasets consider the following s

- Use chunk sizes of 32–128 voxels<sup>3</sup>
- Enable sharding (only available in Zarr 3+)
- Use 3D downsampling

4 changes: 2 additions & 2 deletions docs/volume_annotation/pen_tablets.md
@@ -5,7 +5,7 @@ Beyond the mouse and keyboard WEBKNOSSOS is great for annotating datasets with a
## Using Wacom/Pen tablets
Using a pen tablet can significantly boost your annotation productivity, especially if you set it up correctly with WEBKNOSSOS.

![youtube-video](https://www.youtube.com/embed/xk0gqsVx494)
![youtube-video](https://www.youtube.com/embed/qCrqswDwmi8)

To streamline your workflow, program your tablet and pen buttons to match the WEBKNOSSOS shortcuts. By doing so, you can focus on your pen without the need for a mouse or keyboard. Here is an example configuration using a Wacom tablet and the Wacom driver software:

@@ -26,7 +26,7 @@ You can find the full list for keyboard shortcuts in the [documentation](../ui/k
### Annotating with Wacom Pens
Now, let’s dive into the annotation process! In this example, we begin by quick-selecting a cell.

![youtube-video](https://www.youtube.com/embed/xk0gqsVx494?start=46)
![youtube-video](https://www.youtube.com/embed/qCrqswDwmi8?start=37)

If the annotation isn’t precise enough, we can easily switch to the eraser tool (middle left button) and erase a corner. Selecting the brush tool is as simple as pressing the left button, allowing us to add small surfaces to the annotation.
When ready, pressing the right button creates a new segment, and we can repeat the process for other cells.