Commit

Merge branch 'master' of github.com:scalableminds/webknossos into redesign-right-sidebar

* 'master' of github.com:scalableminds/webknossos:
  added Youtube videos to docs
  Log dataset uploads (with no conversion) to slack (#7157)
  Added "Automation Tutorial" to docs (#7160)
  fix logo image in README.md
  Second try for “Async IO for HttpsDataVault, Fox Error Handling” (#7155)
  Revert "Async IO for HttpsDataVault, Fox Error Handling (#7137)" (#7154)
  Async IO for HttpsDataVault, Fox Error Handling (#7137)
  Fix vault path for precomputed datasets (#7151)
  Add extended keyboard shortcut mode via ctrl + k for tool shortcuts (#7112)
  Shared Chunk Cache for all DatasetArrays, CacheWeight for AlfuCache (#7067)
hotzenklotz committed Jun 20, 2023
2 parents 53f54c2 + 110fcb4 commit bca04f0
Showing 80 changed files with 1,144 additions and 673 deletions.
3 changes: 3 additions & 0 deletions CHANGELOG.unreleased.md
@@ -11,6 +11,7 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
[Commits](https://github.com/scalableminds/webknossos/compare/23.06.0...HEAD)

### Added
- Added new shortcuts for fast tool switching. Look at the updated [Keyboard Shortcuts documentation](https://docs.webknossos.org/webknossos/keyboard_shortcuts.html#tool-switching-shortcuts) to see the new shortcuts. [#7112](https://github.com/scalableminds/webknossos/pull/7112)
- Subfolders of the currently active folder are now also rendered in the dataset table in the dashboard. [#6996](https://github.com/scalableminds/webknossos/pull/6996)
- Added ability to view [zarr v3](https://zarr-specs.readthedocs.io/en/latest/v3/core/v3.0.html) datasets. [#7079](https://github.com/scalableminds/webknossos/pull/7079)
- Added an index structure for volume annotation segments, in preparation for per-segment statistics. [#7063](https://github.com/scalableminds/webknossos/pull/7063)
@@ -19,6 +20,8 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
### Changed
- Creating bounding boxes can now be done by dragging the left mouse button (when the bounding box tool is selected). To move around in the dataset while this tool is active, keep ALT pressed. [#7118](https://github.com/scalableminds/webknossos/pull/7118)
- Agglomerate skeletons can only be modified if the proofreading tool is active so they stay in sync with the underlying segmentation and agglomerate graph. Agglomerate skeletons cannot be modified using any other means. They are marked in the skeleton list using the clipboard icon of the proofreading tool. When exporting skeletons in the NML format, trees ("things") now have a `type` property which is either "DEFAULT" or "AGGLOMERATE". [#7086](https://github.com/scalableminds/webknossos/pull/7086)
- The cache for remote dataset array contents can now have a configured size in bytes. New config option `datastore.cache.imageArrayChunks.maxSizeBytes`. Default is 2 GB, consider increasing for production. [#7067](https://github.com/scalableminds/webknossos/pull/7067)
- Optimized processing of parallel requests for remote datasets, improving performance and reducing idle waiting. [#7137](https://github.com/scalableminds/webknossos/pull/7137)
- Redesigned the info tab in the right-hand sidebar to fit the new branding and design language. [#7110](https://github.com/scalableminds/webknossos/pull/7110)

### Fixed
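
Aside: the #7067 entry above bounds the chunk cache by total bytes rather than by entry count. Below is a minimal sketch of that idea using Caffeine's weigher directly; the repository's own AlfuCache wraps its cache differently, so the names and wiring here are illustrative, not the actual implementation.

```scala
import com.github.benmanes.caffeine.cache.{Cache, Caffeine, Weigher}

object ChunkCacheSketch {
  // Mirrors the new config default datastore.cache.imageArrayChunks.maxSizeBytes (2 GB).
  val maxSizeBytes: Long = 2000000000L

  // Each entry is charged its payload size in bytes; eviction keeps the sum under budget.
  private val chunkWeigher: Weigher[String, Array[Byte]] =
    (_: String, chunk: Array[Byte]) => chunk.length

  val chunkCache: Cache[String, Array[Byte]] =
    Caffeine.newBuilder().maximumWeight(maxSizeBytes).weigher(chunkWeigher).build[String, Array[Byte]]()

  def main(args: Array[String]): Unit = {
    // Compute-if-absent keyed by chunk path; a 1 MiB chunk consumes 2^20 of the weight budget.
    val chunk = chunkCache.get("s3://bucket/dataset/color/1/0.0.0", (_: String) => Array.ofDim[Byte](1 << 20))
    println(chunk.length)
  }
}
```

With a byte-based weigher, a few huge chunks and many small ones are held to the same memory budget, which is why the changelog suggests raising the limit for production rather than tuning an entry count.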
2 changes: 1 addition & 1 deletion README.md
@@ -1,5 +1,5 @@
# [WEBKNOSSOS](https://webknossos.org/)
<img align="right" src="https://raw.githubusercontent.com/scalableminds/webknossos/master/public/images/icon-only.svg" alt="WEBKNOSSOS Logo" width="150" />
<img align="right" src="https://raw.githubusercontent.com/scalableminds/webknossos/master/public/images/logo-icon-only.svg" alt="WEBKNOSSOS Logo" width="150" />
WEBKNOSSOS is an open-source tool for annotating and exploring large 3D image datasets.

* Fly through your data for fast skeletonization and proof-reading
2 changes: 2 additions & 0 deletions app/WebKnossosModule.scala
@@ -1,4 +1,5 @@
import com.google.inject.AbstractModule
+import com.scalableminds.webknossos.datastore.storage.DataVaultService
import controllers.InitialDataService
import models.analytics.AnalyticsSessionService
import models.annotation.{AnnotationMutexService, AnnotationStore}
@@ -28,6 +29,7 @@ class WebKnossosModule extends AbstractModule {
bind(classOf[AnnotationMutexService]).asEagerSingleton()
bind(classOf[DataSetService]).asEagerSingleton()
bind(classOf[TimeSpanService]).asEagerSingleton()
+    bind(classOf[DataVaultService]).asEagerSingleton()
bind(classOf[TempFileService]).asEagerSingleton()
bind(classOf[MailchimpTicker]).asEagerSingleton()
bind(classOf[JobService]).asEagerSingleton()
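
Aside: the new binding registers DataVaultService as an eager singleton, so shared vault state is constructed once at startup instead of lazily on first use. A self-contained Guice sketch of the pattern, with a hypothetical service name standing in:

```scala
import com.google.inject.{AbstractModule, Guice}

// Hypothetical stand-in for a service holding shared caches.
class SharedCacheService {
  val createdAtMillis: Long = System.currentTimeMillis() // runs once, when the injector is created
}

class ExampleModule extends AbstractModule {
  override def configure(): Unit =
    // asEagerSingleton() builds the instance at injector creation time, so construction
    // errors surface at startup and every injection point shares one instance.
    bind(classOf[SharedCacheService]).asEagerSingleton()
}

object ExampleApp extends App {
  private val injector = Guice.createInjector(new ExampleModule)
  println(injector.getInstance(classOf[SharedCacheService]).createdAtMillis)
}
```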
33 changes: 25 additions & 8 deletions app/controllers/WKRemoteDataStoreController.scala
@@ -11,6 +11,7 @@ import com.scalableminds.webknossos.datastore.services.{
ReserveUploadInformation
}
import com.typesafe.scalalogging.LazyLogging

import javax.inject.Inject
import models.analytics.{AnalyticsService, UploadDatasetEvent}
import models.binary._
@@ -19,14 +20,15 @@ import models.folder.FolderDAO
import models.job.JobDAO
import models.organization.OrganizationDAO
import models.storage.UsedStorageService
-import models.user.{User, UserDAO, UserService}
+import models.user.{MultiUserDAO, User, UserDAO, UserService}
import net.liftweb.common.Full
import oxalis.mail.{MailchimpClient, MailchimpTag}
import oxalis.security.{WebknossosBearerTokenAuthenticatorService, WkSilhouetteEnvironment}
+import oxalis.telemetry.SlackNotificationService
import play.api.i18n.{Messages, MessagesProvider}
import play.api.libs.json.{JsError, JsSuccess, JsValue, Json}
import play.api.mvc.{Action, AnyContent, PlayBodyParsers}
-import utils.ObjectId
+import utils.{ObjectId, WkConf}

import scala.concurrent.{ExecutionContext, Future}

@@ -42,8 +44,11 @@ class WKRemoteDataStoreController @Inject()(
userDAO: UserDAO,
folderDAO: FolderDAO,
jobDAO: JobDAO,
+    multiUserDAO: MultiUserDAO,
credentialDAO: CredentialDAO,
mailchimpClient: MailchimpClient,
+    slackNotificationService: SlackNotificationService,
+    conf: WkConf,
wkSilhouetteEnvironment: WkSilhouetteEnvironment)(implicit ec: ExecutionContext, bodyParsers: PlayBodyParsers)
extends Controller
with LazyLogging {
@@ -78,8 +83,8 @@
}
}

-  def validateLayerToLink(layerIdentifier: LinkedLayerIdentifier,
-                          requestingUser: User)(implicit ec: ExecutionContext, m: MessagesProvider): Fox[Unit] =
+  private def validateLayerToLink(layerIdentifier: LinkedLayerIdentifier,
+                                  requestingUser: User)(implicit ec: ExecutionContext, m: MessagesProvider): Fox[Unit] =
for {
organization <- organizationDAO.findOneByName(layerIdentifier.organizationName)(GlobalAccessContext) ?~> Messages(
"organization.notFound",
@@ -95,21 +100,34 @@
token: String,
dataSetName: String,
dataSetSizeBytes: Long,
-                           needsConversion: Boolean): Action[AnyContent] =
+                           needsConversion: Boolean,
+                           viaAddRoute: Boolean): Action[AnyContent] =
Action.async { implicit request =>
dataStoreService.validateAccess(name, key) { dataStore =>
for {
user <- bearerTokenService.userForToken(token)
dataSet <- dataSetDAO.findOneByNameAndOrganization(dataSetName, user._organization)(GlobalAccessContext) ?~> Messages(
"dataSet.notFound",
dataSetName) ~> NOT_FOUND
-        _ <- Fox.runIf(!needsConversion)(usedStorageService.refreshStorageReportForDataset(dataSet))
+        _ <- Fox.runIf(!needsConversion && !viaAddRoute)(usedStorageService.refreshStorageReportForDataset(dataSet))
+        _ <- Fox.runIf(!needsConversion)(logUploadToSlack(user, dataSetName, viaAddRoute))
_ = analyticsService.track(UploadDatasetEvent(user, dataSet, dataStore, dataSetSizeBytes))
_ = if (!needsConversion) mailchimpClient.tagUser(user, MailchimpTag.HasUploadedOwnDataset)
} yield Ok
}
}

+  private def logUploadToSlack(user: User, datasetName: String, viaAddRoute: Boolean): Fox[Unit] =
+    for {
+      organization <- organizationDAO.findOne(user._organization)(GlobalAccessContext)
+      multiUser <- multiUserDAO.findOne(user._multiUser)(GlobalAccessContext)
+      resultLink = s"${conf.Http.uri}/datasets/${organization.name}/$datasetName"
+      addLabel = if (viaAddRoute) "(via explore+add)" else "(upload without conversion)"
+      superUserLabel = if (multiUser.isSuperUser) " (for superuser)" else ""
+      _ = slackNotificationService.info(s"Dataset added $addLabel$superUserLabel",
+                                        s"For organization: ${organization.displayName}. <$resultLink|Result>")
+    } yield ()

def statusUpdate(name: String, key: String): Action[JsValue] = Action.async(parse.json) { implicit request =>
dataStoreService.validateAccess(name, key) { _ =>
request.body.validate[DataStoreStatus] match {
@@ -172,11 +190,10 @@
.findOneByNameAndOrganizationName(datasourceId.name, datasourceId.team)(GlobalAccessContext)
.futureBox
_ <- existingDataset.flatMap {
-        case Full(dataset) => {
+        case Full(dataset) =>
dataSetDAO
.deleteDataset(dataset._id)
.flatMap(_ => usedStorageService.refreshStorageReportForDataset(dataset))
-        }
case _ => Fox.successful(())
}
} yield Ok
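
Aside: the controller gates its side effects with Fox.runIf — the storage report refresh is skipped for conversions and for datasets registered via explore+add, while Slack logging runs for any upload without conversion. A hedged sketch of what runIf plausibly does; the real helper lives in com.scalableminds.util.tools.Fox and may differ in detail:

```scala
import com.scalableminds.util.tools.Fox
import scala.concurrent.ExecutionContext

// Plausible simplification: run the effect only when the condition holds,
// otherwise succeed with None so the surrounding for-comprehension continues.
def runIfSketch[T](condition: Boolean)(effect: => Fox[T])(implicit ec: ExecutionContext): Fox[Option[T]] =
  if (condition) effect.map(Some(_)) else Fox.successful(None)
```

Under that reading, `Fox.runIf(!needsConversion && !viaAddRoute)(...)` turns the refresh into a no-op whenever either flag is set, without branching the for-comprehension.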
8 changes: 4 additions & 4 deletions app/models/binary/credential/CredentialService.scala
@@ -3,7 +3,7 @@ package models.binary.credential
import com.scalableminds.util.tools.Fox
import com.scalableminds.webknossos.datastore.storage.{
DataVaultCredential,
-  DataVaultsHolder,
+  DataVaultService,
GoogleServiceAccountCredential,
HttpBasicAuthCredential,
S3AccessKeyCredential
@@ -24,21 +24,21 @@ class CredentialService @Inject()(credentialDAO: CredentialDAO) {
userId: ObjectId,
organizationId: ObjectId): Option[DataVaultCredential] =
uri.getScheme match {
-      case DataVaultsHolder.schemeHttps | DataVaultsHolder.schemeHttp =>
+      case DataVaultService.schemeHttps | DataVaultService.schemeHttp =>
credentialIdentifier.map(
username =>
HttpBasicAuthCredential(uri.toString,
username,
credentialSecret.getOrElse(""),
userId.toString,
organizationId.toString))
-      case DataVaultsHolder.schemeS3 =>
+      case DataVaultService.schemeS3 =>
(credentialIdentifier, credentialSecret) match {
case (Some(keyId), Some(secretKey)) =>
Some(S3AccessKeyCredential(uri.toString, keyId, secretKey, userId.toString, organizationId.toString))
case _ => None
}
-      case DataVaultsHolder.schemeGS =>
+      case DataVaultService.schemeGS =>
for {
secret <- credentialSecret
secretJson <- tryo(Json.parse(secret)).toOption
8 changes: 5 additions & 3 deletions app/models/binary/explore/ExploreRemoteLayerService.scala
@@ -15,7 +15,7 @@ import com.scalableminds.webknossos.datastore.datareaders.zarr._
import com.scalableminds.webknossos.datastore.datareaders.zarr3.Zarr3ArrayHeader
import com.scalableminds.webknossos.datastore.datavault.VaultPath
import com.scalableminds.webknossos.datastore.models.datasource._
-import com.scalableminds.webknossos.datastore.storage.{DataVaultsHolder, RemoteSourceDescriptor}
+import com.scalableminds.webknossos.datastore.storage.{DataVaultService, RemoteSourceDescriptor}
import com.typesafe.scalalogging.LazyLogging
import models.binary.credential.CredentialService
import models.user.User
@@ -38,7 +38,9 @@ object ExploreRemoteDatasetParameters {
implicit val jsonFormat: OFormat[ExploreRemoteDatasetParameters] = Json.format[ExploreRemoteDatasetParameters]
}

-class ExploreRemoteLayerService @Inject()(credentialService: CredentialService) extends FoxImplicits with LazyLogging {
+class ExploreRemoteLayerService @Inject()(credentialService: CredentialService, dataVaultService: DataVaultService)
+    extends FoxImplicits
+    with LazyLogging {

def exploreRemoteDatasource(
urisWithCredentials: List[ExploreRemoteDatasetParameters],
@@ -172,7 +174,7 @@
requestingUser._organization)
remoteSource = RemoteSourceDescriptor(uri, credentialOpt)
credentialId <- Fox.runOptional(credentialOpt)(c => credentialService.insertOne(c)) ?~> "dataVault.credential.insert.failed"
-      remotePath <- DataVaultsHolder.getVaultPath(remoteSource) ?~> "dataVault.setup.failed"
+      remotePath <- dataVaultService.getVaultPath(remoteSource) ?~> "dataVault.setup.failed"
layersWithVoxelSizes <- exploreRemoteLayersForRemotePath(
remotePath,
credentialId.map(_.toString),
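
Aside: both this service and the controller chain steps with `?~>`, which tags a failing step with a readable error key such as "dataVault.setup.failed". Below is a toy model of the operator — deliberately not the real Fox, which wraps Lift's Box and carries more structure:

```scala
import scala.concurrent.{ExecutionContext, Future}

// Toy stand-in: a MiniFox is a Future of either an error message or a value.
final case class MiniFox[+T](futureEither: Future[Either[String, T]]) {
  // ?~> decorates a failure with context and leaves successes untouched, as in
  // `dataVaultService.getVaultPath(remoteSource) ?~> "dataVault.setup.failed"`.
  def ?~>(message: String)(implicit ec: ExecutionContext): MiniFox[T] =
    MiniFox(futureEither.map {
      case Left(original) => Left(s"$message <- $original")
      case success        => success
    })
}
```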
4 changes: 2 additions & 2 deletions app/models/job/Job.scala
@@ -293,7 +293,7 @@ class JobService @Inject()(wkConf: WkConf,
superUserLabel = if (multiUser.isSuperUser) " (for superuser)" else ""
durationLabel = jobAfterChange.duration.map(d => s" after ${formatDuration(d)}").getOrElse("")
_ = analyticsService.track(FailedJobEvent(user, jobBeforeChange.command))
msg = s"Job ${jobBeforeChange._id} failed$durationLabel. Command ${jobBeforeChange.command}, organization name: ${organization.name}."
msg = s"Job ${jobBeforeChange._id} failed$durationLabel. Command ${jobBeforeChange.command}, organization: ${organization.displayName}."
_ = logger.warn(msg)
_ = slackNotificationService.warn(
s"Failed job$superUserLabel",
@@ -313,7 +313,7 @@
multiUser <- multiUserDAO.findOne(user._multiUser)(GlobalAccessContext)
superUserLabel = if (multiUser.isSuperUser) " (for superuser)" else ""
durationLabel = jobAfterChange.duration.map(d => s" after ${formatDuration(d)}").getOrElse("")
msg = s"Job ${jobBeforeChange._id} succeeded$durationLabel. Command ${jobBeforeChange.command}, organization name: ${organization.name}.${resultLinkSlack
msg = s"Job ${jobBeforeChange._id} succeeded$durationLabel. Command ${jobBeforeChange.command}, organization: ${organization.displayName}.${resultLinkSlack
.getOrElse("")}"
_ = logger.info(msg)
_ = slackNotificationService.success(
8 changes: 4 additions & 4 deletions app/models/user/UserService.scala
@@ -6,7 +6,7 @@ import com.mohiva.play.silhouette.api.services.IdentityService
import com.mohiva.play.silhouette.api.util.PasswordInfo
import com.mohiva.play.silhouette.impl.providers.CredentialsProvider
import com.scalableminds.util.accesscontext.{DBAccessContext, GlobalAccessContext}
-import com.scalableminds.util.cache.AlfuFoxCache
+import com.scalableminds.util.cache.AlfuCache
import com.scalableminds.util.security.SCrypt
import com.scalableminds.util.time.Instant
import com.scalableminds.util.tools.{Fox, FoxImplicits}
@@ -47,8 +47,8 @@ class UserService @Inject()(conf: WkConf,
private lazy val Mailer =
actorSystem.actorSelection("/user/mailActor")

-  private val userCache: AlfuFoxCache[(ObjectId, String), User] =
-    AlfuFoxCache(timeToLive = conf.WebKnossos.Cache.User.timeout, timeToIdle = conf.WebKnossos.Cache.User.timeout)
+  private val userCache: AlfuCache[(ObjectId, String), User] =
+    AlfuCache(timeToLive = conf.WebKnossos.Cache.User.timeout, timeToIdle = conf.WebKnossos.Cache.User.timeout)

def userFromMultiUserEmail(email: String)(implicit ctx: DBAccessContext): Fox[User] =
for {
@@ -206,7 +206,7 @@ }
}

private def removeUserFromCache(userId: ObjectId): Unit =
-    userCache.remove(idAndAccessContextString => idAndAccessContextString._1 == userId)
+    userCache.clear(idAndAccessContextString => idAndAccessContextString._1 == userId)

def changePasswordInfo(loginInfo: LoginInfo, passwordInfo: PasswordInfo): Fox[PasswordInfo] =
for {
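
Aside: the user cache now exposes `clear` with a key predicate (renamed from `remove`). A hedged sketch of how a predicate-based clear can be built over a Caffeine-style cache; the actual AlfuCache implementation may differ:

```scala
import com.github.benmanes.caffeine.cache.Cache
import scala.jdk.CollectionConverters._

// Invalidate every entry whose key matches the predicate — here, all
// (userId, accessContext) entries for one user regardless of context.
def clearMatching[K, V](cache: Cache[K, V])(predicate: K => Boolean): Int = {
  val matchingKeys = cache.asMap().keySet().asScala.filter(predicate).toList
  matchingKeys.foreach(key => cache.invalidate(key))
  matchingKeys.size
}
```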
1 change: 1 addition & 0 deletions conf/application.conf
@@ -161,6 +161,7 @@ datastore {
cache {
dataCube.maxEntries = 40
mapping.maxEntries = 5
+    imageArrayChunks.maxSizeBytes = 2000000000 # 2 GB
agglomerateFile {
maxFileHandleEntries = 15
maxSegmentIdEntries = 625000
2 changes: 1 addition & 1 deletion conf/webknossos.latest.routes
@@ -101,7 +101,7 @@ PUT /datastores/:name/datasource
PUT /datastores/:name/datasources controllers.WKRemoteDataStoreController.updateAll(name: String, key: String)
PATCH /datastores/:name/status controllers.WKRemoteDataStoreController.statusUpdate(name: String, key: String)
POST /datastores/:name/reserveUpload controllers.WKRemoteDataStoreController.reserveDataSetUpload(name: String, key: String, token: String)
-POST /datastores/:name/reportDatasetUpload controllers.WKRemoteDataStoreController.reportDatasetUpload(name: String, key: String, token: String, dataSetName: String, dataSetSizeBytes: Long, needsConversion: Boolean)
+POST /datastores/:name/reportDatasetUpload controllers.WKRemoteDataStoreController.reportDatasetUpload(name: String, key: String, token: String, dataSetName: String, dataSetSizeBytes: Long, needsConversion: Boolean, viaAddRoute: Boolean)
POST /datastores/:name/deleteDataset controllers.WKRemoteDataStoreController.deleteDataset(name: String, key: String)
GET /datastores/:name/jobExportProperties controllers.WKRemoteDataStoreController.jobExportProperties(name: String, key: String, jobId: String)
GET /datastores/:name/findCredential controllers.WKRemoteDataStoreController.findCredential(name: String, key: String, credentialId: String)
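
Aside: callers of the updated route must now pass `viaAddRoute` alongside the existing query parameters. A hypothetical datastore-side call using Play WS — the URL prefix and wiring are assumptions, not taken from the datastore code:

```scala
import play.api.libs.ws.WSClient
import scala.concurrent.{ExecutionContext, Future}

// Hypothetical reporter mirroring the route signature above; all values travel
// in the query string because the route declares no request body.
def reportDatasetUpload(ws: WSClient, wkUri: String, name: String, key: String, token: String,
                        dataSetName: String, sizeBytes: Long, needsConversion: Boolean, viaAddRoute: Boolean)(
    implicit ec: ExecutionContext): Future[Int] =
  ws.url(s"$wkUri/datastores/$name/reportDatasetUpload")
    .withQueryStringParameters(
      "key" -> key,
      "token" -> token,
      "dataSetName" -> dataSetName,
      "dataSetSizeBytes" -> sizeBytes.toString,
      "needsConversion" -> needsConversion.toString,
      "viaAddRoute" -> viaAddRoute.toString
    )
    .withMethod("POST")
    .execute()
    .map(_.status)
```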
5 changes: 3 additions & 2 deletions docs/automated_analysis.md
@@ -7,8 +7,9 @@ The automated analysis features are designed to provide a general solution to a
We plan to add more automated analysis features in the future. If you want to work with us on an automated analysis project, [please contact us](mailto:[email protected]).
We would love to integrate analysis solutions for more modalities and use cases.

-Automated analysis is only available on [webknossos.org](https://webknossos.org) at the moment.
-If you want to set up on-premise automated analysis at your institute/workplace, then [please contact us](mailto:[email protected]).
+!!!info
+    Automated analysis is only available on [webknossos.org](https://webknossos.org) at the moment.
+    If you want to set up on-premise automated analysis at your institute/workplace, then [please contact us](mailto:[email protected]).

## Nuclei Inferral
As a first trial, WEBKNOSSOS includes nuclei segmentation. This analysis is designed to work with serial block-face electron microscopy (SBEM) data of neural tissue (brain/cortex) and will find and segment all nuclei within the dataset.