
Make segmentation output layer name for neuron detection configurable #7472

Merged: 20 commits, Jan 22, 2024
5177f39
make segmentation output layer name for neuron detection configurable
MichaelBuessemeyer Dec 4, 2023
87af329
Merge branch 'master' of github.com:scalableminds/webknossos into con…
MichaelBuessemeyer Dec 19, 2023
fa891bd
Merge branch 'master' of github.com:scalableminds/webknossos into con…
MichaelBuessemeyer Jan 5, 2024
5faae78
add OutputSegmentationLayerName to neuron inferral job params
MichaelBuessemeyer Jan 8, 2024
cbba404
Merge branch 'master' of github.com:scalableminds/webknossos into con…
MichaelBuessemeyer Jan 8, 2024
612a4a5
add changelog entry
MichaelBuessemeyer Jan 8, 2024
3fbca5d
remove comment
MichaelBuessemeyer Jan 8, 2024
efb5083
assert valid output layer + dataset names in backend
fm3 Jan 12, 2024
95e5889
git Merge branch 'master' of github.com:scalableminds/webknossos into…
MichaelBuessemeyer Jan 12, 2024
4863778
Merge branch 'configurable-segm-layer-name-for-neuron-job' of github.…
MichaelBuessemeyer Jan 12, 2024
edbfc42
more name check
fm3 Jan 15, 2024
3d85581
Merge branch 'configurable-segm-layer-name-for-neuron-job' of github.…
fm3 Jan 15, 2024
ae20854
Merge branch 'master' of github.com:scalableminds/webknossos into con…
MichaelBuessemeyer Jan 17, 2024
219f908
apply pr feedback
MichaelBuessemeyer Jan 17, 2024
23e38e8
Merge branch 'master' into configurable-segm-layer-name-for-neuron-job
MichaelBuessemeyer Jan 17, 2024
0d81436
Fix previous merge with master
MichaelBuessemeyer Jan 17, 2024
44799d9
Merge branch 'master' of github.com:scalableminds/webknossos into con…
MichaelBuessemeyer Jan 22, 2024
a3da4e2
add comment explaining materialize annotation job arguments
MichaelBuessemeyer Jan 22, 2024
07c5638
Merge branch 'master' of github.com:scalableminds/webknossos into con…
MichaelBuessemeyer Jan 22, 2024
7e352fe
Merge branch 'master' into configurable-segm-layer-name-for-neuron-job
MichaelBuessemeyer Jan 22, 2024
1 change: 1 addition & 0 deletions CHANGELOG.unreleased.md
@@ -16,6 +16,7 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
- Added support for blosc compressed N5 datasets. [#7465](https://github.com/scalableminds/webknossos/pull/7465)
- Added route for triggering the compute segment index worker job. [#7471](https://github.com/scalableminds/webknossos/pull/7471)
- Added thumbnails to the dashboard dataset list. [#7479](https://github.com/scalableminds/webknossos/pull/7479)
- Added the option to configure the name of the output segmentation layer in the neuron inferral job. [#7472](https://github.com/scalableminds/webknossos/pull/7472)
- Adhoc mesh rendering is now available for ND datasets.[#7394](https://github.com/scalableminds/webknossos/pull/7394)
- Added the ability to compose a new dataset from existing dataset layers. This can be done with or without transforms (transforms will be derived from landmarks given via BigWarp CSV or WK NMLs). [#7395](https://github.com/scalableminds/webknossos/pull/7395)
- When setting up WEBKNOSSOS from the git repository for development, the organization directory for storing datasets is now automatically created on startup. [#7517](https://github.com/scalableminds/webknossos/pull/7517)
21 changes: 19 additions & 2 deletions app/controllers/JobsController.scala
@@ -4,10 +4,9 @@ import play.silhouette.api.Silhouette
import com.scalableminds.util.geometry.Vec3Int
import com.scalableminds.util.accesscontext.GlobalAccessContext
import com.scalableminds.util.tools.Fox
import models.dataset.DatasetDAO
import models.dataset.{DataStoreDAO, DatasetDAO, DatasetService}
import models.job._
import models.organization.OrganizationDAO
import models.dataset.DataStoreDAO
import models.user.MultiUserDAO
import play.api.i18n.Messages
import play.api.libs.json._
@@ -53,6 +52,7 @@ class JobsController @Inject()(
jobDAO: JobDAO,
sil: Silhouette[WkEnv],
datasetDAO: DatasetDAO,
datasetService: DatasetService,
jobService: JobService,
workerService: WorkerService,
workerDAO: WorkerDAO,
@@ -149,6 +149,7 @@
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
_ <- datasetService.assertValidLayerName(layerName)
command = JobCommand.compute_mesh_file
commandArgs = Json.obj(
"organization_name" -> organizationName,
@@ -173,6 +174,7 @@
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
_ <- datasetService.assertValidLayerName(layerName)
command = JobCommand.compute_segment_index_file
commandArgs = Json.obj(
"organization_name" -> organizationName,
@@ -198,6 +200,8 @@
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
_ <- datasetService.assertValidDatasetName(newDatasetName)
_ <- datasetService.assertValidLayerName(layerName)
command = JobCommand.infer_nuclei
commandArgs = Json.obj(
"organization_name" -> organizationName,
@@ -216,6 +220,7 @@
datasetName: String,
layerName: String,
bbox: String,
outputSegmentationLayerName: String,
newDatasetName: String): Action[AnyContent] =
sil.SecuredAction.async { implicit request =>
log(Some(slackNotificationService.noticeFailedJobRequest)) {
@@ -226,6 +231,9 @@
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
_ <- datasetService.assertValidDatasetName(newDatasetName)
_ <- datasetService.assertValidLayerName(outputSegmentationLayerName)
_ <- datasetService.assertValidLayerName(layerName)
multiUser <- multiUserDAO.findOne(request.identity._multiUser)
_ <- Fox.runIf(!multiUser.isSuperUser)(jobService.assertBoundingBoxLimits(bbox, None))
command = JobCommand.infer_neurons
@@ -234,6 +242,7 @@
"dataset_name" -> datasetName,
"new_dataset_name" -> newDatasetName,
"layer_name" -> layerName,
"output_segmentation_layer_name" -> outputSegmentationLayerName,
"webknossos_token" -> RpcTokenHolder.webknossosToken,
"bbox" -> bbox,
)
@@ -257,6 +266,8 @@
dataset <- datasetDAO.findOneByNameAndOrganizationName(datasetName, organizationName) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
_ <- Fox.runOptional(layerName)(datasetService.assertValidLayerName)
_ <- Fox.runOptional(annotationLayerName)(datasetService.assertValidLayerName)
_ <- jobService.assertBoundingBoxLimits(bbox, mag)
userAuthToken <- wkSilhouetteEnvironment.combinedAuthenticatorService.findOrCreateToken(
request.identity.loginInfo)
@@ -301,9 +312,12 @@
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
_ <- datasetService.assertValidLayerName(fallbackLayerName)
userAuthToken <- wkSilhouetteEnvironment.combinedAuthenticatorService.findOrCreateToken(
request.identity.loginInfo)
command = JobCommand.materialize_volume_annotation
_ <- datasetService.assertValidDatasetName(newDatasetName)
_ <- datasetService.assertValidLayerName(outputSegmentationLayerName)
commandArgs = Json.obj(
"organization_name" -> organizationName,
"dataset_name" -> datasetName,
@@ -333,6 +347,7 @@
dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
"dataset.notFound",
datasetName) ~> NOT_FOUND
_ <- datasetService.assertValidLayerName(layerName)
command = JobCommand.find_largest_segment_id
commandArgs = Json.obj(
"organization_name" -> organizationName,
@@ -366,6 +381,8 @@
bool2Fox(animationJobOptions.movieResolution == MovieResolutionSetting.SD) ?~> "job.renderAnimation.resolutionMustBeSD"
}
layerName = animationJobOptions.layerName
_ <- datasetService.assertValidLayerName(layerName)
_ <- Fox.runOptional(animationJobOptions.segmentationLayerName)(datasetService.assertValidLayerName)
exportFileName = s"webknossos_animation_${formatDateForFilename(new Date())}__${datasetName}__$layerName.mp4"
command = JobCommand.render_animation
commandArgs = Json.obj(
6 changes: 6 additions & 0 deletions app/models/dataset/DatasetService.scala
@@ -54,6 +54,12 @@ class DatasetService @Inject()(organizationDAO: OrganizationDAO,
_ <- bool2Fox(name.length >= 3) ?~> "dataset.name.invalid.lessThanThreeCharacters"
} yield ()

def assertValidLayerName(name: String): Fox[Unit] =
for {
_ <- bool2Fox(name.matches("[A-Za-z0-9_\\-\\.]*")) ?~> "dataset.layer.name.invalid.characters"
_ <- bool2Fox(!name.startsWith(".")) ?~> "dataset.layer.name.invalid.startsWithDot"
} yield ()

def assertNewDatasetName(name: String, organizationId: ObjectId): Fox[Unit] =
datasetDAO.findOneByNameAndOrganization(name, organizationId)(GlobalAccessContext).reverse

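The new `assertValidLayerName` check enforces two rules: the name may only contain letters, digits, underscores, hyphens, and dots, and it may not start with a dot. A client-side mirror of this check could be sketched as follows; the helper name and the idea of validating in the frontend are illustrative assumptions, not part of this PR:

```typescript
// Hypothetical client-side mirror of the backend check added in
// DatasetService.scala. Illustration only; not part of this PR.
// Note: Scala's String.matches anchors the regex to the full string,
// so the TypeScript pattern uses explicit ^ and $ anchors.
const LAYER_NAME_PATTERN = /^[A-Za-z0-9_\-.]*$/;

export function isValidLayerName(name: string): boolean {
  // Rule 1: only the allowed character set.
  // Rule 2: no leading dot (hidden-file-style names are rejected).
  return LAYER_NAME_PATTERN.test(name) && !name.startsWith(".");
}
```

As in the backend, the empty string passes the character check; the backend relies on separate length checks for dataset names, so an empty layer name is not rejected by this particular assertion.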
2 changes: 1 addition & 1 deletion conf/webknossos.latest.routes
@@ -269,7 +269,7 @@ POST /jobs/run/computeMeshFile/:organizationName/:datasetName
POST /jobs/run/computeSegmentIndexFile/:organizationName/:datasetName controllers.JobsController.runComputeSegmentIndexFileJob(organizationName: String, datasetName: String, layerName: String)
POST /jobs/run/exportTiff/:organizationName/:datasetName controllers.JobsController.runExportTiffJob(organizationName: String, datasetName: String, bbox: String, layerName: Option[String], mag: Option[String], annotationLayerName: Option[String], annotationId: Option[String], asOmeTiff: Boolean)
POST /jobs/run/inferNuclei/:organizationName/:datasetName controllers.JobsController.runInferNucleiJob(organizationName: String, datasetName: String, layerName: String, newDatasetName: String)
POST /jobs/run/inferNeurons/:organizationName/:datasetName controllers.JobsController.runInferNeuronsJob(organizationName: String, datasetName: String, layerName: String, bbox: String, newDatasetName: String)
POST /jobs/run/inferNeurons/:organizationName/:datasetName controllers.JobsController.runInferNeuronsJob(organizationName: String, datasetName: String, layerName: String, bbox: String, outputSegmentationLayerName: String, newDatasetName: String)
POST /jobs/run/materializeVolumeAnnotation/:organizationName/:datasetName controllers.JobsController.runMaterializeVolumeAnnotationJob(organizationName: String, datasetName: String, fallbackLayerName: String, annotationId: String, annotationType: String, newDatasetName: String, outputSegmentationLayerName: String, mergeSegments: Boolean, volumeLayerName: Option[String])
POST /jobs/run/findLargestSegmentId/:organizationName/:datasetName controllers.JobsController.runFindLargestSegmentIdJob(organizationName: String, datasetName: String, layerName: String)
POST /jobs/run/renderAnimation/:organizationName/:datasetName controllers.JobsController.runRenderAnimationJob(organizationName: String, datasetName: String)
11 changes: 8 additions & 3 deletions frontend/javascripts/admin/admin_rest_api.ts
@@ -1287,12 +1287,17 @@ export function startNeuronInferralJob(
datasetName: string,
layerName: string,
bbox: Vector6,
outputSegmentationLayerName: string,
newDatasetName: string,
): Promise<APIJob> {
const urlParams = new URLSearchParams({
layerName,
bbox: bbox.join(","),
outputSegmentationLayerName,
newDatasetName,
});
return Request.receiveJSON(
`/api/jobs/run/inferNeurons/${organizationName}/${datasetName}?layerName=${layerName}&bbox=${bbox.join(
",",
)}&newDatasetName=${newDatasetName}`,
`/api/jobs/run/inferNeurons/${organizationName}/${datasetName}?${urlParams.toString()}`,
{
method: "POST",
},
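The refactor above replaces manual string interpolation with `URLSearchParams`, which percent-encodes parameter values and keeps the query string correct as new parameters (like `outputSegmentationLayerName`) are added. A sketch of the resulting query construction; the organization and dataset names are placeholders:

```typescript
// Sketch of the query-string construction used by the reworked
// startNeuronInferralJob. "sample_org" and "sample_ds" are placeholders.
const params = new URLSearchParams({
  layerName: "color",
  bbox: [0, 0, 0, 512, 512, 64].join(","),
  outputSegmentationLayerName: "segmentation",
  newDatasetName: "my_dataset_with_reconstructed_neurons",
});

// URLSearchParams percent-encodes values (e.g. "," becomes "%2C"),
// which the previous template-string implementation did not do.
const url = `/api/jobs/run/inferNeurons/sample_org/sample_ds?${params.toString()}`;
```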
@@ -68,6 +68,10 @@ const jobNameToImagePath: Record<
materialize_volume_annotation: "materialize_volume_annotation_example.jpg",
invisible: "",
};
const jobTypeWithConfigurableOutputSegmentationLayerName = [
"materialize_volume_annotation",
"neuron_inferral",
];
type Props = {
handleClose: () => void;
};
@@ -566,23 +570,24 @@ function StartJobForm(props: StartJobFormProps) {
initialOutputSegmentationLayerName = `${
initialOutputSegmentationLayerName || "segmentation"
}_corrected`;
// TODO: Other jobs also have an output segmentation layer. The names for these jobs should also be configurable.
const hasOutputSegmentationLayer = jobName === "materialize_volume_annotation";
const hasOutputSegmentationLayer =
jobTypeWithConfigurableOutputSegmentationLayerName.indexOf(jobName) > -1;
const notAllowedOutputLayerNames = allLayers
.filter((layer) => {
// Existing layer names may not be used for the output layer. The only exception
// is the name of the currently selected layer. This layer is the only one not
// copied over from the original dataset to the output dataset.
// Therefore, this name is available as the name for the output layer name.
// That is why that layer is filtered out here.
const currentSelectedVolumeLayerName = form.getFieldValue("layerName") || initialLayerName;
const currentSelectedVolumeLayerName = chooseSegmentationLayer
? form.getFieldValue("layerName") || initialLayerName
: undefined;
return (
getReadableNameOfVolumeLayer(layer, tracing) !== currentSelectedVolumeLayerName &&
layer.name !== currentSelectedVolumeLayerName
);
})
.map((layer) => getReadableNameOfVolumeLayer(layer, tracing) || layer.name);

return (
<Form
onFinish={startJob}
@@ -672,9 +677,14 @@ export function NeuronSegmentationForm() {
title="AI Neuron Segmentation"
suggestedDatasetSuffix="with_reconstructed_neurons"
isBoundingBoxConfigurable
jobApiCall={async ({ newDatasetName, selectedLayer: colorLayer, selectedBoundingBox }) => {
if (!selectedBoundingBox) {
return Promise.resolve();
jobApiCall={async ({
newDatasetName,
selectedLayer: colorLayer,
selectedBoundingBox,
outputSegmentationLayerName,
}) => {
if (!selectedBoundingBox || !outputSegmentationLayerName) {
return;
}

const bbox = computeArrayFromBoundingBox(selectedBoundingBox.boundingBox);
Expand All @@ -683,6 +693,7 @@ export function NeuronSegmentationForm() {
dataset.name,
colorLayer.name,
bbox,
outputSegmentationLayerName,
newDatasetName,
);
}}
@@ -788,9 +799,19 @@ export function MaterializeVolumeAnnotationModal({
outputSegmentationLayerName,
}) => {
if (outputSegmentationLayerName == null) {
return Promise.resolve();
return;
}
const volumeLayerName = getReadableNameOfVolumeLayer(segmentationLayer, tracing);
// There are 3 cases for the value assignments to volumeLayerName and baseSegmentationName for the job:
// 1. There is a volume annotation with a fallback layer. volumeLayerName will reference the volume layer
// and baseSegmentationName will reference the fallback layer. The job will merge those layers.
// 2. There is a segmentation layer without a fallback layer. volumeLayerName will be null and baseSegmentationName
// will reference the segmentation layer. The job will use the segmentation layer without any merging.
// 3. There is a volume annotation without a fallback layer. volumeLayerName will be null
// and baseSegmentationName will reference the volume layer. The job will use the volume annotation without any merging.
const volumeLayerName =
"fallbackLayer" in segmentationLayer && segmentationLayer.fallbackLayer != null
? getReadableNameOfVolumeLayer(segmentationLayer, tracing)
: null;
Comment on lines +811 to +814
Member:

so, if no fallbackLayer exists volumeLayerName will simply be null? what will the job do then?

Contributor Author:

> so, if no fallbackLayer exists volumeLayerName will simply be null? what will the job do then?

In short: do not merge the fallback data with the volume annotation; simply take the volume data of the given segmentation layer and apply the merger-mode skeletons.

Longer explanation (tried to make it clear 🙈):
The volumeLayerName is an optional parameter of the materialize job. The materialize job can apply merger-mode skeletons as well as merge fallback data with the volume data given by a volume annotation.

In case the user has no volume annotation but e.g. wants their skeletons to be applied to a segmentation layer (without any volume annotation), the job just takes that segmentation layer's data as the segmentation data. => segmentationLayer.fallbackLayer will be null and getBaseSegmentationName will return the layer name. Thus volumeLayerName is null and the job knows not to merge any segmentation data but only to apply the skeletons to the provided segmentation layer's data.

In case there is a volume annotation, the fallback layer name still needs to be supplied so that it can be merged with the volume annotation before a potential skeleton is applied to merge segments.

I hope this is kinda clear. Maybe taking a quick look at the worker job also helps / explains it better: https://github.com/scalableminds/voxelytics/blob/06df1bb2785361b7bfe9abbccb1fb2027d0b2a65/voxelytics/worker/jobs/materialize_volume_annotation_legacy.py#L205

Member:

Thanks for the clarification! It took me a while to grasp and I hope we can improve the naming and add some comments. I'll try to summarize (correct me if I'm wrong).

There are 3 cases:

  1. There is a fallback segmentation and a volume annotation that is based on that. volumeLayerName will reference the volume layer. baseSegmentationName will reference the fallback segmentation. The worker job will merge those.
  2. There is a segmentation layer and no volume annotation. volumeLayerName will be null (because segmentationLayer.fallbackLayer will be null). baseSegmentationName will reference the only existing segmentation. The worker won't merge anything because volumeLayerName is null.
  3. There is no fallback segmentation and a volume annotation. volumeLayerName will be null (because segmentationLayer.fallbackLayer will be null). baseSegmentationName will reference the volume annotation. As in (2), no merging will be done.

I think cases 1 and 2 make sense. However, in case 3, I find it quite confusing that the variable volumeLayerName will be null even though a volume layer exists. Maybe you have an idea for clearing this up a bit. Changing the argument names could get a bit complicated because of the worker and the DB tables, right?
In the simplest case, add a comment about these three cases (like my enumeration above, if it is correct) 🙂

Contributor Author:

You are totally correct in your enumeration.

I originally designed the first version of this job afaik. I apologize for the confusing naming / confusing arguments passed to the job in case 3. I do not remember whether I had a good reason for this back then.

> In the simplest case, add a comment about these three cases (like my enumeration above, if it is correct) 🙂
Ok 👍 I did that

const baseSegmentationName = getBaseSegmentationName(segmentationLayer);
return startMaterializingVolumeAnnotationJob(
dataset.owningOrganization,
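The three cases settled in the review thread can be condensed into a small sketch. The types and the helper name below are simplified stand-ins for the actual webknossos code, not its real API:

```typescript
// Simplified sketch of the three cases discussed in the review.
// "Layer" and "resolveJobLayerArguments" are illustrative stand-ins.
type Layer = { name: string; fallbackLayer?: string | null };

function resolveJobLayerArguments(segmentationLayer: Layer): {
  volumeLayerName: string | null;
  baseSegmentationName: string;
} {
  const hasFallback = segmentationLayer.fallbackLayer != null;
  return {
    // Case 1: volume annotation with a fallback layer -> pass the volume
    // layer name so the worker merges it into the fallback data.
    // Cases 2 and 3: no fallback layer -> pass null so the worker skips
    // merging and uses the base segmentation as-is.
    volumeLayerName: hasFallback ? segmentationLayer.name : null,
    // The base segmentation is the fallback layer if present, otherwise
    // the layer itself (mirrors the role of getBaseSegmentationName).
    baseSegmentationName: segmentationLayer.fallbackLayer ?? segmentationLayer.name,
  };
}
```

This matches the added code comment: merging only happens when a fallback layer exists, and in all other cases the job applies merger-mode skeletons directly to the base segmentation.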