Visit the Index Page
\n", + " This workflow example is part of set of related workflows. If you haven't already, visit the index page for an introduction and guidance on choosing the appropriate workflow.\n", + "Visit the Index Page
\n", + " This workflow example is part of set of related workflows. If you haven't already, visit the index page for an introduction and guidance on choosing the appropriate workflow.\n", + "<xarray.DataArray 'value' (channel: 10, time: 1280)> Size: 102kB\n", - "array([[-3.97168397e-01, 3.16809427e-01, -7.24738207e-01, ...,\n", - " -3.80461687e+01, -3.87342359e+01, -3.87612205e+01],\n", - " [ 5.74970652e-01, 3.57991866e-01, 2.82790051e-02, ...,\n", - " -4.45808503e+00, -3.03739011e+00, -3.11130749e+00],\n", - " [-1.55057719e+00, -2.68244322e+00, -3.17888892e+00, ...,\n", - " -1.39981549e+01, -1.29738342e+01, -1.37371670e+01],\n", - " ...,\n", - " [ 7.51967849e-01, 7.22525060e-01, 5.80195695e-01, ...,\n", - " -4.60049743e+01, -4.58063681e+01, -4.53709490e+01],\n", - " [ 2.48108863e-01, 9.33010605e-02, 6.46380629e-02, ...,\n", - " 3.49839596e+01, 3.51516409e+01, 3.57323487e+01],\n", - " [-1.08325005e+00, 2.89083915e-01, 1.62128828e+00, ...,\n", - " -5.33306805e+01, -5.11417266e+01, -5.20163110e+01]])\n", - "Coordinates:\n", - " * channel (channel) <U5 200B 'EEG 0' 'EEG 1' 'EEG 2' ... 'EEG 8' 'EEG 9'\n", - " * time (time) float64 10kB 0.0 0.003909 0.007819 ... 4.992 4.996 5.0\n", - " group (channel) <U1 40B 'A' 'B' 'C' 'A' 'B' 'C' 'A' 'B' 'C' 'A'
Visit the Index Page
\n", + " This workflow example is part of set of related workflows. If you haven't already, visit the index page for an introduction and guidance on choosing the appropriate workflow.\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The intended use-case for this workflow is to browse and annotate multi-channel timeseries data from an [electrophysiological](https://en.wikipedia.org/wiki/Electrophysiology) recording session. Compared to the notebooks in this set of workflows, this particular workflow is focused on 'medium-sized' dataset, which we will loosely define as a dataset with >100k samples and comfortably fits into available RAM. \n", + "\n", + "Medium-sized datasets can start to slow down a browser, and may require strategies like downsampling - a processing strategy that only sends a strided subsample of the data from memory to the browser for visualization. If there are many timeseries and they utilize a common time index, we can often streamline the added processing computation by using a single index-based slicing operation on all the timeseries.\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Prerequisites and Resources\n", + "\n", + "| Topic | Type | Notes |\n", + "| --- | --- | --- |\n", + "| [Intro and Guidance](./index.ipynb) | Prerequisite | Background |\n", + "| [Time Range Annotation](./time_range_annotation.ipynb) | Next Step | Display and edit time ranges |\n", + "| [Smaller Dataset Workflow](./small_multi-chan-ts.ipynb) | Alternative | Use Pandas and downsample |\n", + "| [Larger Dataset Workflow](./large_multi-chan-ts.ipynb) | Alternative | Use dynamic data chunking |" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "---" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Imports and Configuration" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "from scipy.stats import zscore\n", + "import string\n", + "import wget\n", + "from pathlib import Path\n", + "\n", + "import mne\n", + "\n", + "import colorcet as cc\n", + "import holoviews as hv\n", + "from holoviews.plotting.links import RangeToolLink\n", + "from holoviews.operation.datashader import rasterize\n", + "from holoviews.operation.downsample import downsample1d\n", + "from bokeh.models import HoverTool\n", + "import panel as pn\n", + "\n", + "pn.extension()\n", + "hv.extension('bokeh')\n", + "np.random.seed(0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Download the data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's get some data! The following code downloads a dataset (2.6 MB) from a specified URL into a designated directory. It performs these steps:\n", + "\n", + "1. Sets the URL for the dataset.\n", + "2. Identifies the directory to store the downloaded file.\n", + "3. Ensures the directory exists, creating it if necessary.\n", + "4. Constructs the file path by combining the directory and dataset's filename.\n", + "5. Checks if the file already exists to avoid redundant downloads.\n", + "6. Downloads and saves the file if it's not already present." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "data_url = 'https://physionet.org/files/eegmmidb/1.0.0/S001/S001R04.edf'\n", + "output_directory = Path('./data')\n", + "\n", + "output_directory.mkdir(parents=True, exist_ok=True)\n", + "data_path = output_directory / Path(data_url).name\n", + "if not data_path.exists():\n", + " data_path = wget.download(data_url, out=str(data_path))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Read the data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, let's load the data into an MNE Raw object:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw = mne.io.read_raw_edf(data_path, preload=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's take a look at some general information for this data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print('num samples in dataset:', len(raw.times) * len(raw.ch_names))\n", + "raw" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here is the output from the previous code:\n", + "\n", + "```\n", + "num samples in dataset: 1280000\n", + "\n", + "General\n", + "Measurement date\tAugust 12, 2009 16:15:00 GMT\n", + "Experimenter\tUnknown\n", + "Participant\tX\n", + "Channels\n", + "Digitized points\tNot available\n", + "Good channels\t64 EEG\n", + "Bad channels\tNone\n", + "EOG channels\tNot available\n", + "ECG channels\tNot available\n", + "Data\n", + "Sampling frequency\t160.00 Hz\n", + "Highpass\t0.00 Hz\n", + "Lowpass\t80.00 Hz\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "So we have 64 channels of filtered 'EEG' data, sampled at 160Hz for about 2 minutes, and over a million data samples in total." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's preview the channel names, types, unit, and signal ranges. This `describe` method is from MNE, and we can have it return a Pandas DataFrame, from which we can `sample` some rows." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw.describe(data_frame=True).sample(5)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Pre-processing\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Averaging" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll first remove some of the large noise artifacts that impact all the channels by using an average reference. The idea is to compute the average across channels for every time point to get an average time series, and then subtract that average out of the raw EEG signal." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw.set_eeg_reference(\"average\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Clean Channel Names" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "From the output of the `describe` method, it looks like the channels are from commonly used standardized locations (e.g. 'Cz'), but contain some unnecessary periods, so let's clean those up." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw.rename_channels(lambda s: s.strip(\".\"));" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## *Optional*: Get Channel Locations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This is an optional step, but let's see if we can add locations to the channels. MNE has functionality to assign locations of the channels based on their standardized channel names, so we can go ahead and assign a commonly used arrangement (or 'montage') of electrodes ('10-05') to this data. Read more about making and setting the montage [here](https://mne.tools/stable/auto_tutorials/intro/40_sensor_locations.html#sphx-glr-auto-tutorials-intro-40-sensor-locations-py)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "montage = mne.channels.make_standard_montage(\"standard_1005\")\n", + "raw.set_montage(montage, match_case=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that the 'digitized points' (locations) are now added to the raw data.\n", + "\n", + "Now let's plot the channels ('sensors') using MNE [`plot_sensors`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.plot_sensors) on a top-down view of a head. Note, we'll adjust the reference point so the points are contained in the head." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sphere=(0, 0.015, 0, 0.099) # manually adjust the y origin coordinate and radius\n", + "raw.plot_sensors(show_names=True, sphere=sphere);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Prepare the data for plotting" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll use an MNE method, `to_data_frame`, to create a Pandas DataFrame. By default, MNE will convert EEG data from Volts to microVolts (µV) during this operation." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# TODO: file issue about rangetool not working with datetime (timezone error)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "df = raw.to_data_frame() # time_format='datetime'\n", + "df.set_index('time', inplace=True) \n", + "df.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Interactive plot" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As of writing, there's no easy way to track units with Pandas, so we can use a modular HoloViews approach to create and annotate dimensions with a unit, and then refer to these dimensions when plotting. Read more about annotating data with HoloViews [here](https://holoviews.org/user_guide/Annotating_Data.html)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "amplitude_dim = hv.Dimension(\"amplitude\", unit=\"µV\")\n", + "time_dim = hv.Dimension(\"time\", unit=\"s\") # matches the index name in the df" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we will loop over the columns (channels) in the dataframe, creating a HoloViews `Curve` element from each. 
Since each column in the df has a different name, we will use the `redim` method to map from the channel name to the common `amplitude_dim`. We'll set the Curve label to be the original channel name so we can still see this info in the hover tooltip.\n", + "\n", + "We will use HoloViews `.opts` to set the plotting options per Curve element. A couple of important options include `hover_tooltips` and `subcoordinate_y`.\n", + "\n", + "The custom `hover_tooltips` argument is new in HoloViews as of 1.19.0. It allows us to specify which data dimensions show up in the tooltip when hovering over a data point. We can also specify that the values of 'group' or 'label' arguments should be included as well. Read more about `hover_tooltips` and related arguments [here](https://holoviews.org/user_guide/Plotting_with_Bokeh.html).\n", + "\n", + "The `subcoordinate_y` argument was introduced in HoloViews 1.18.0. Setting this to True will automatically distribute overlay elements along the y-axis, each with their own distinct y-axis subcoordinate system. Read more about `subcoordinate_y` [here](https://holoviews.org/user_guide/Customizing_Plots.html#subcoordinate-y-axis).\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "\n", + "curves = {}\n", + "for channel_name, channel_data in df.items():\n", + " curve = (\n", + " hv.Curve(\n", + " df, kdims=[time_dim], vdims=[channel_name], group=\"EEG\", label=channel_name\n", + " )\n", + " .redim(**{channel_name: amplitude_dim})\n", + " .opts(\n", + " subcoordinate_y=True,\n", + " subcoordinate_scale=2,\n", + " color=\"black\",\n", + " line_width=1,\n", + " tools=[\"hover\"],\n", + " hover_tooltips=[\n", + " (\"type\", \"$group\"),\n", + " (\"channel\", \"$label\"),\n", + " (\"time\"), #'@time{%H:%M:%S.%3N}'), # hide date and use ms precision\n", + " (\"amplitude\"),\n", + " ],\n", + " # hover_formatters = {'time': 'datetime'},\n", + " )\n", + " )\n", + " curves[channel_name] = curve\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using a HoloViews `Overlay` container, we can now overlay all the curves on the same plot." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "\n", + "curves_overlay = hv.Overlay(curves, kdims=\"channel\").opts(\n", + " ylabel=\"channel\",\n", + " show_legend=False,\n", + " padding=0,\n", + " aspect=1.5,\n", + " responsive=True,\n", + " shared_axes=False,\n", + " framewise=False,\n", + " min_height=100,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since there are 64 channels and over a million data samples, we'll make use of downsampling before trying to send all that data to the browser. We can use `downsample1d` imported from HoloViews. Starting in HoloViews version 1.19.0, integration with the `tsdownsample` library introduces enhanced downsampling algorithms. Read more about downsampling [here](https://holoviews.org/user_guide/Large_Data.html)."
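Before applying it to the whole overlay, here is a minimal sketch of the operation on a single synthetic curve, just to show the knobs involved (the `width` value is an assumption; in the live plot the pixel width is taken from the rendered viewport, and the 'minmax-lttb' algorithm relies on the optional `tsdownsample` package):

```python
import numpy as np
import holoviews as hv
from holoviews.operation.downsample import downsample1d
hv.extension('bokeh')

# One noisy curve with 100k samples.
xs = np.linspace(0, 10, 100_000)
ys = np.sin(xs) + np.random.normal(scale=0.1, size=xs.size)
curve = hv.Curve((xs, ys))

# Reduce the points sent to the browser to roughly the plot's pixel width.
downsampled = downsample1d(curve, width=800, algorithm='minmax-lttb')
```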
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "curves_overlay = downsample1d(curves_overlay, algorithm='minmax-lttb')\n", + "curves_overlay" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we've created the main plot, let's add a secondary plot to hold the linked minimap element, which will allow for range control over the main plot, while contextualizing with a Datashaded rendering of all the data, so a view of the zoomed out data is maintained while navigating in on the main plot." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "channels = df.columns\n", + "time = df.index.values\n", + "\n", + "y_positions = range(len(channels))\n", + "yticks = [(i, ich) for i, ich in enumerate(channels)]\n", + "z_data = zscore(df, axis=0).T\n", + "minimap = rasterize(hv.Image((time, y_positions, z_data), [\"Time\", \"Channel\"], \"amplitude\"))\n", + "https://holoviews.org/user_guide/Large_Data.html = minimap.opts(\n", + " cmap=\"RdBu_r\",\n", + " colorbar=False,\n", + " xlabel='',\n", + " alpha=0.5,\n", + " yticks=[yticks[0], yticks[-1]],\n", + " toolbar='disable',\n", + " height=120,\n", + " responsive=True,\n", + " default_tools=[],\n", + " )\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the minimap created, we can now go ahead and link the minimap to the main plot using a HoloViews `RangeToolLink`. We'll also constrain the initial x-range view to a third of the duration." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Link minimap widget to curves overlay plot\n", + "RangeToolLink(minimap, curves_overlay, axes=[\"x\", \"y\"],\n", + " boundsx=(0, time[len(time)//3]) # limit the initial x-range of the minimap\n", + " )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, we'll layout the main plot and minimap and use HoloViz Panel to allow for serving the application from command line. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "app = (curves_overlay + minimap).cols(1)\n", + "app" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## *Optional:* Standalone App" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using HoloViz Panel, we can also set this application as servable so we can see it in a browser window, outside of a Jupyter Notebook." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "template = pn.template.FastListTemplate(\n", + " title = \"Medium Multi-Chanel Timeseries App\",\n", + " main = pn.Column(app, min_height=500)\n", + ").servable()" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "neuro-multi-chan", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.12.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/workflows/multi_channel_timeseries/dev/minimap.ipynb b/workflows/multi_channel_timeseries/dev/minimap.ipynb new file mode 100644 index 0000000..c8b4709 --- /dev/null +++ b/workflows/multi_channel_timeseries/dev/minimap.ipynb @@ -0,0 +1,65 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Medium Dataset Minimap" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Large Dataset Minimap" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Creating a minimap for the approach in the large multi channel workflow is very similar the work above so we will just make a note of the difference.Since in this case you would be working with a dataset that is too large to fit into memory, you cannot simply load and rasterize the full resolution version of the data into an image for the minimap. Instead, simply choose a level of downsampled courseness from the data pyramid that is able to fit into memory and rasterize into an image in a single pass. The higher resolution level you select, the more information the minimap will contain, but the longer it will take to compute and the closer to memory constraints you will be." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.12.2" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/workflows/multi_channel_timeseries/dev/test_ds_legend.ipynb b/workflows/multi_channel_timeseries/dev/test_ds_legend.ipynb new file mode 100644 index 0000000..49d4b29 --- /dev/null +++ b/workflows/multi_channel_timeseries/dev/test_ds_legend.ipynb @@ -0,0 +1,199 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "id": "2ab4d105-8757-4ec2-b2c9-7adb73ac4d4e", + "metadata": {}, + "outputs": [], + "source": [ + "import holoviews as hv; hv.extension('bokeh')\n", + "from holoviews.operation.datashader import rasterize, datashade, shade, inspect, inspect_points\n", + "import panel as pn; pn.extension()\n", + "import datashader as ds\n", + "import numpy as np\n", + "import string\n", + "import colorcet as cc" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5f063056-82dd-4450-b4f9-baf7b81f1cfc", + "metadata": {}, + "outputs": [], + "source": [ + "color_key = list(enumerate(cc.glasbey[0:n_curves]))\n", + "color_points = hv.NdOverlay({k: hv.Points([(0,0)], label=str(k)).opts(color=v, size=0) for k, v in color_key})" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "56ee8d1c-b692-487c-b584-26a6df2e72d1", + "metadata": {}, + "outputs": [], + "source": [ + "color_key" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f52691a7-fd8b-44a6-8bc7-246b600a5be2", + "metadata": {}, + "outputs": [], + "source": [ + "hv.Curve([1,2,3], label='A').opts(tools=['hover']) * hv.Curve([3,2,3], label='B').opts(tools=['hover'])" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e4e2af86-ab89-4c81-8f8a-bd0c7a8eb50f", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "n_curves = 4\n", + "\n", + "curves = {}\n", + "color_key = {}\n", + "\n", + "for i in np.arange(1,n_curves+1):\n", + " curves[string.ascii_uppercase[-i]] = hv.Curve(np.random.randint(10, size=10), label=string.ascii_uppercase[-i]).opts(color=cc.glasbey[-i], tools=['hover'],)\n", + " color_key[string.ascii_uppercase[-i]] = cc.glasbey[-i]\n", + "\n", + "color_points = hv.NdOverlay({k: hv.Points([(0,0)], label=str(k)).opts(color=v, size=0) for k, v in color_key.items()}).opts(legend_cols=2)\n", + "\n", + "orig_plot = hv.NdOverlay(curves, kdims='curve').opts(width=300, height=300, legend_cols=2, title='original')\n", + "ds_plot = datashade(hv.NdOverlay(curves, kdims='curve'), line_width=2, cmap=cc.glasbey[:n_curves], aggregator=ds.by('curve', ds.count())).opts(tools=['hover'], title='datashade', width=300, height=300)\n", + "r_plot = rasterize(hv.NdOverlay(curves, kdims='curve'),line_width=2, aggregator=ds.by('curve', ds.count())).opts(tools=['hover'], title='rasterize', cmap=cc.glasbey[:n_curves], width=300, height=300)\n", + "rs_plot = shade(rasterize(hv.NdOverlay(curves, kdims='curve'), line_width=2, aggregator=ds.by('curve', ds.count())).opts(cmap=cc.glasbey[:n_curves])).opts(tools=['hover'], title='rasterize+shade', width=300, height=300)\n", + "\n", + "orig_plot + (ds_plot * color_points) + (r_plot * color_points) + 
(rs_plot * color_points)\n", + "\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8156fad1-f045-450f-88f0-52462b8e2cdb", + "metadata": {}, + "outputs": [], + "source": [ + "hv.NdOverlay(curves, kdims='curve').opts(width=300, height=300, legend_cols=4, title='original')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ff16e2b2-b8fa-4edf-9fcd-b6fc9db4cfe9", + "metadata": {}, + "outputs": [], + "source": [ + "hv.streams.Tap(source=points, popup=form('Tap'))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "837009dc-5423-4ace-9287-5e7cbb8e4b2a", + "metadata": {}, + "outputs": [], + "source": [ + "def table_df(df):\n", + " return pn.pane.DataFrame(df)\n", + "\n", + "highlighter = inspect_points.instance(streams=[hv.streams.Tap])\n", + "\n", + "highlight = highlighter(ds_plot).opts(color='grey', tools=[\"hover\"], marker='circle', \n", + " size=5, fill_alpha=.1, line_dash='-', line_alpha=.4)\n", + "\n", + "table = pn.bind(table_df, df=highlighter.param.hits)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fd9a174d-2b77-423d-9c92-20eb86ddb9a2", + "metadata": {}, + "outputs": [], + "source": [ + "pn.Column((highlight * ds_plot.opts(tools=[])), table)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fb23ec33-0158-4c12-9da0-9c7bce1c2f15", + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import holoviews as hv\n", + "from holoviews import streams\n", + "hv.extension('bokeh')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6000fbe0-ec50-4b83-9bc6-106896263b1a", + "metadata": {}, + "outputs": [], + "source": [ + "Y, X = (np.mgrid[0:100, 0:100]-50.)/20.\n", + "img = hv.Image(np.sin(X**2 + Y**2))\n", + "\n", + "def coords(x):\n", + " # return pn.pane.Markdown(f'{x}, {y}')\n", + " return hv.Curve([x])\n", + "\n", + "# Declare pointer stream initializing at (0, 0) and linking to Image\n", + "pointer = streams.Tap(x=0, source=img, popup=coords)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d47c0c56-d138-4c89-a5a0-253c764c34fd", + "metadata": {}, + "outputs": [], + "source": [ + "img#.opts(tools=['hover'])" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ed948c8e-8c1b-45d3-b20c-e9c945e92d66", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.12.2" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/workflows/multi_channel_timeseries/dev/test_stocks_wide_df.ipynb b/workflows/multi_channel_timeseries/dev/test_stocks_wide_df.ipynb new file mode 100644 index 0000000..a3a79ef --- /dev/null +++ b/workflows/multi_channel_timeseries/dev/test_stocks_wide_df.ipynb @@ -0,0 +1,1031 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "id": "a16ff13d-2764-405f-8acf-5ed05d465776", + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import pandas as pd\n", + "from scipy.stats import zscore\n", + "import wget\n", + "from pathlib import Path\n", + "import mne\n", + "import colorcet as cc\n", + "import holoviews as hv\n", + "from holoviews.plotting.links import 
RangeToolLink\n", + "from holoviews.operation.datashader import rasterize\n", + "from holoviews.operation.downsample import downsample1d\n", + "from bokeh.models import HoverTool\n", + "import panel as pn\n", + "\n", + "pn.extension()\n", + "hv.extension('bokeh')\n", + "\n", + "np.random.seed(0)\n", + "\n", + "\n", + "data_url = 'https://physionet.org/files/eegmmidb/1.0.0/S001/S001R04.edf'\n", + "output_directory = Path('./data')\n", + "\n", + "output_directory.mkdir(parents=True, exist_ok=True)\n", + "data_path = output_directory / Path(data_url).name\n", + "if not data_path.exists():\n", + " data_path = wget.download(data_url, out=str(data_path))\n", + " \n", + " \n", + "raw = mne.io.read_raw_edf(data_path, preload=True)\n", + "\n", + "raw.set_eeg_reference(\"average\")\n", + "\n", + "raw.rename_channels(lambda s: s.strip(\".\"));\n", + "\n", + "df = raw.to_data_frame() # TODO: fix rangetool for time_format='datetime'\n", + "df.set_index('time', inplace=True) \n", + "df.head()\n", + "\n", + "# Viz\n", + "amplitude_dim = hv.Dimension(\"amplitude\", unit=\"µV\")\n", + "time_dim = hv.Dimension(\"time\", unit=\"s\") # match the index name in the df\n", + "\n", + "curves = {}\n", + "for channel_name, channel_data in df.items():\n", + " \n", + " curve = hv.Curve(df, kdims=[time_dim], vdims=[channel_name], group=\"EEG\", label=channel_name)\n", + " \n", + " # TODO: Without the redim, downsample1d errors. But with, it prevents common index slice optimization. :(\n", + " curve = curve.redim(**{str(channel_name): amplitude_dim})\n", + "\n", + " curve = curve.opts(\n", + " subcoordinate_y=True,\n", + " subcoordinate_scale=2,\n", + " color=\"black\",\n", + " line_width=1,\n", + " tools=[\"hover\"],\n", + " hover_tooltips=[\n", + " (\"type\", \"$group\"),\n", + " (\"channel\", \"$label\"),\n", + " (\"time\"), # TODO: '@time{%H:%M:%S.%3N}'),\n", + " (\"amplitude\"),\n", + " ],\n", + " )\n", + " curves[channel_name] = curve\n", + " \n", + "curves_overlay = hv.Overlay(curves, kdims=\"channel\").opts(\n", + " ylabel=\"channel\",\n", + " show_legend=False,\n", + " padding=0,\n", + " min_height=500,\n", + " responsive=True,\n", + " shared_axes=False,\n", + " framewise=False,\n", + ")\n", + "\n", + "curves_overlay = downsample1d(curves_overlay, algorithm='minmax-lttb')\n", + "\n", + "# minimap\n", + "\n", + "channels = df.columns\n", + "time = df.index.values\n", + "\n", + "y_positions = range(len(channels))\n", + "yticks = [(i, ich) for i, ich in enumerate(channels)]\n", + "z_data = zscore(df, axis=0).T\n", + "minimap = rasterize(hv.Image((time, y_positions, z_data), [\"Time\", \"Channel\"], \"amplitude\"))\n", + "minimap = minimap.opts(\n", + " cmap=\"RdBu_r\",\n", + " colorbar=False,\n", + " xlabel='',\n", + " alpha=0.5,\n", + " yticks=[yticks[0], yticks[-1]],\n", + " toolbar='disable',\n", + " height=120,\n", + " responsive=True,\n", + " # default_tools=[],\n", + " cnorm='eq_hist'\n", + " )\n", + "\n", + "RangeToolLink(minimap, curves_overlay, axes=[\"x\", \"y\"],\n", + " boundsx=(0, time[len(time)//3]) # limit the initial x-range of the minimap\n", + " )\n", + "\n", + "layout = (curves_overlay + minimap).cols(1)\n", + "layout" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b707d12f-d7c4-4b61-9c83-abb0479edd91", + "metadata": {}, + "outputs": [], + "source": [ + "df" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "id": "cf750d7b-18f2-4b2e-b3f9-561e6eaaf575", + "metadata": { + "tags": [] + }, + "outputs": [ + { + "data": { + "application/javascript": [ + 
"(function(root) {\n", + " function now() {\n", + " return new Date();\n", + " }\n", + "\n", + " var force = true;\n", + " var py_version = '3.4.1'.replace('rc', '-rc.').replace('.dev', '-dev.');\n", + " var reloading = false;\n", + " var Bokeh = root.Bokeh;\n", + "\n", + " if (typeof (root._bokeh_timeout) === \"undefined\" || force) {\n", + " root._bokeh_timeout = Date.now() + 5000;\n", + " root._bokeh_failed_load = false;\n", + " }\n", + "\n", + " function run_callbacks() {\n", + " try {\n", + " root._bokeh_onload_callbacks.forEach(function(callback) {\n", + " if (callback != null)\n", + " callback();\n", + " });\n", + " } finally {\n", + " delete root._bokeh_onload_callbacks;\n", + " }\n", + " console.debug(\"Bokeh: all callbacks have finished\");\n", + " }\n", + "\n", + " function load_libs(css_urls, js_urls, js_modules, js_exports, callback) {\n", + " if (css_urls == null) css_urls = [];\n", + " if (js_urls == null) js_urls = [];\n", + " if (js_modules == null) js_modules = [];\n", + " if (js_exports == null) js_exports = {};\n", + "\n", + " root._bokeh_onload_callbacks.push(callback);\n", + "\n", + " if (root._bokeh_is_loading > 0) {\n", + " console.debug(\"Bokeh: BokehJS is being loaded, scheduling callback at\", now());\n", + " return null;\n", + " }\n", + " if (js_urls.length === 0 && js_modules.length === 0 && Object.keys(js_exports).length === 0) {\n", + " run_callbacks();\n", + " return null;\n", + " }\n", + " if (!reloading) {\n", + " console.debug(\"Bokeh: BokehJS not loaded, scheduling load and callback at\", now());\n", + " }\n", + "\n", + " function on_load() {\n", + " root._bokeh_is_loading--;\n", + " if (root._bokeh_is_loading === 0) {\n", + " console.debug(\"Bokeh: all BokehJS libraries/stylesheets loaded\");\n", + " run_callbacks()\n", + " }\n", + " }\n", + " window._bokeh_on_load = on_load\n", + "\n", + " function on_error() {\n", + " console.error(\"failed to load \" + url);\n", + " }\n", + "\n", + " var skip = [];\n", + " if (window.requirejs) {\n", + " window.requirejs.config({'packages': {}, 'paths': {'tabulator': 'https://cdn.jsdelivr.net/npm/tabulator-tables@5.5.0/dist/js/tabulator.min', 'moment': 'https://cdn.jsdelivr.net/npm/luxon/build/global/luxon.min'}, 'shim': {}});\n", + " require([\"tabulator\"], function(Tabulator) {\n", + "\twindow.Tabulator = Tabulator\n", + "\ton_load()\n", + " })\n", + " require([\"moment\"], function(moment) {\n", + "\twindow.moment = moment\n", + "\ton_load()\n", + " })\n", + " root._bokeh_is_loading = css_urls.length + 2;\n", + " } else {\n", + " root._bokeh_is_loading = css_urls.length + js_urls.length + js_modules.length + Object.keys(js_exports).length;\n", + " }\n", + "\n", + " var existing_stylesheets = []\n", + " var links = document.getElementsByTagName('link')\n", + " for (var i = 0; i < links.length; i++) {\n", + " var link = links[i]\n", + " if (link.href != null) {\n", + "\texisting_stylesheets.push(link.href)\n", + " }\n", + " }\n", + " for (var i = 0; i < css_urls.length; i++) {\n", + " var url = css_urls[i];\n", + " if (existing_stylesheets.indexOf(url) !== -1) {\n", + "\ton_load()\n", + "\tcontinue;\n", + " }\n", + " const element = document.createElement(\"link\");\n", + " element.onload = on_load;\n", + " element.onerror = on_error;\n", + " element.rel = \"stylesheet\";\n", + " element.type = \"text/css\";\n", + " element.href = url;\n", + " console.debug(\"Bokeh: injecting link tag for BokehJS stylesheet: \", url);\n", + " document.body.appendChild(element);\n", + " } if (((window.Tabulator !== undefined) && 
(!(window.Tabulator instanceof HTMLElement))) || window.requirejs) {\n", + " var urls = ['https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/tabulator-tables@5.5.0/dist/js/tabulator.min.js'];\n", + " for (var i = 0; i < urls.length; i++) {\n", + " skip.push(urls[i])\n", + " }\n", + " } if (((window.moment !== undefined) && (!(window.moment instanceof HTMLElement))) || window.requirejs) {\n", + " var urls = ['https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/luxon/build/global/luxon.min.js'];\n", + " for (var i = 0; i < urls.length; i++) {\n", + " skip.push(urls[i])\n", + " }\n", + " } var existing_scripts = []\n", + " var scripts = document.getElementsByTagName('script')\n", + " for (var i = 0; i < scripts.length; i++) {\n", + " var script = scripts[i]\n", + " if (script.src != null) {\n", + "\texisting_scripts.push(script.src)\n", + " }\n", + " }\n", + " for (var i = 0; i < js_urls.length; i++) {\n", + " var url = js_urls[i];\n", + " if (skip.indexOf(url) !== -1 || existing_scripts.indexOf(url) !== -1) {\n", + "\tif (!window.requirejs) {\n", + "\t on_load();\n", + "\t}\n", + "\tcontinue;\n", + " }\n", + " var element = document.createElement('script');\n", + " element.onload = on_load;\n", + " element.onerror = on_error;\n", + " element.async = false;\n", + " element.src = url;\n", + " console.debug(\"Bokeh: injecting script tag for BokehJS library: \", url);\n", + " document.head.appendChild(element);\n", + " }\n", + " for (var i = 0; i < js_modules.length; i++) {\n", + " var url = js_modules[i];\n", + " if (skip.indexOf(url) !== -1 || existing_scripts.indexOf(url) !== -1) {\n", + "\tif (!window.requirejs) {\n", + "\t on_load();\n", + "\t}\n", + "\tcontinue;\n", + " }\n", + " var element = document.createElement('script');\n", + " element.onload = on_load;\n", + " element.onerror = on_error;\n", + " element.async = false;\n", + " element.src = url;\n", + " element.type = \"module\";\n", + " console.debug(\"Bokeh: injecting script tag for BokehJS library: \", url);\n", + " document.head.appendChild(element);\n", + " }\n", + " for (const name in js_exports) {\n", + " var url = js_exports[name];\n", + " if (skip.indexOf(url) >= 0 || root[name] != null) {\n", + "\tif (!window.requirejs) {\n", + "\t on_load();\n", + "\t}\n", + "\tcontinue;\n", + " }\n", + " var element = document.createElement('script');\n", + " element.onerror = on_error;\n", + " element.async = false;\n", + " element.type = \"module\";\n", + " console.debug(\"Bokeh: injecting script tag for BokehJS library: \", url);\n", + " element.textContent = `\n", + " import ${name} from \"${url}\"\n", + " window.${name} = ${name}\n", + " window._bokeh_on_load()\n", + " `\n", + " document.head.appendChild(element);\n", + " }\n", + " if (!js_urls.length && !js_modules.length) {\n", + " on_load()\n", + " }\n", + " };\n", + "\n", + " function inject_raw_css(css) {\n", + " const element = document.createElement(\"style\");\n", + " element.appendChild(document.createTextNode(css));\n", + " document.body.appendChild(element);\n", + " }\n", + "\n", + " var js_urls = [\"https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/tabulator-tables@5.5.0/dist/js/tabulator.min.js\", \"https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/luxon/build/global/luxon.min.js\", \"https://cdn.bokeh.org/bokeh/release/bokeh-3.4.1.min.js\", \"https://cdn.bokeh.org/bokeh/release/bokeh-gl-3.4.1.min.js\", \"https://cdn.bokeh.org/bokeh/release/bokeh-widgets-3.4.1.min.js\", 
\"https://cdn.bokeh.org/bokeh/release/bokeh-tables-3.4.1.min.js\", \"https://cdn.holoviz.org/panel/1.4.1/dist/panel.min.js\"];\n", + " var js_modules = [];\n", + " var js_exports = {};\n", + " var css_urls = [\"https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/tabulator-tables@5.5.0/dist/css/tabulator_simple.min.css?v=1.4.1\", \"https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.4.2/css/all.min.css\"];\n", + " var inline_js = [ function(Bokeh) {\n", + " Bokeh.set_log_level(\"info\");\n", + " },\n", + "function(Bokeh) {} // ensure no trailing comma for IE\n", + " ];\n", + "\n", + " function run_inline_js() {\n", + " if ((root.Bokeh !== undefined) || (force === true)) {\n", + " for (var i = 0; i < inline_js.length; i++) {\n", + "\ttry {\n", + " inline_js[i].call(root, root.Bokeh);\n", + "\t} catch(e) {\n", + "\t if (!reloading) {\n", + "\t throw e;\n", + "\t }\n", + "\t}\n", + " }\n", + " // Cache old bokeh versions\n", + " if (Bokeh != undefined && !reloading) {\n", + "\tvar NewBokeh = root.Bokeh;\n", + "\tif (Bokeh.versions === undefined) {\n", + "\t Bokeh.versions = new Map();\n", + "\t}\n", + "\tif (NewBokeh.version !== Bokeh.version) {\n", + "\t Bokeh.versions.set(NewBokeh.version, NewBokeh)\n", + "\t}\n", + "\troot.Bokeh = Bokeh;\n", + " }} else if (Date.now() < root._bokeh_timeout) {\n", + " setTimeout(run_inline_js, 100);\n", + " } else if (!root._bokeh_failed_load) {\n", + " console.log(\"Bokeh: BokehJS failed to load within specified timeout.\");\n", + " root._bokeh_failed_load = true;\n", + " }\n", + " root._bokeh_is_initializing = false\n", + " }\n", + "\n", + " function load_or_wait() {\n", + " // Implement a backoff loop that tries to ensure we do not load multiple\n", + " // versions of Bokeh and its dependencies at the same time.\n", + " // In recent versions we use the root._bokeh_is_initializing flag\n", + " // to determine whether there is an ongoing attempt to initialize\n", + " // bokeh, however for backward compatibility we also try to ensure\n", + " // that we do not start loading a newer (Panel>=1.0 and Bokeh>3) version\n", + " // before older versions are fully initialized.\n", + " if (root._bokeh_is_initializing && Date.now() > root._bokeh_timeout) {\n", + " root._bokeh_is_initializing = false;\n", + " root._bokeh_onload_callbacks = undefined;\n", + " console.log(\"Bokeh: BokehJS was loaded multiple times but one version failed to initialize.\");\n", + " load_or_wait();\n", + " } else if (root._bokeh_is_initializing || (typeof root._bokeh_is_initializing === \"undefined\" && root._bokeh_onload_callbacks !== undefined)) {\n", + " setTimeout(load_or_wait, 100);\n", + " } else {\n", + " root._bokeh_is_initializing = true\n", + " root._bokeh_onload_callbacks = []\n", + " var bokeh_loaded = Bokeh != null && (Bokeh.version === py_version || (Bokeh.versions !== undefined && Bokeh.versions.has(py_version)));\n", + " if (!reloading && !bokeh_loaded) {\n", + "\troot.Bokeh = undefined;\n", + " }\n", + " load_libs(css_urls, js_urls, js_modules, js_exports, function() {\n", + "\tconsole.debug(\"Bokeh: BokehJS plotting callback run at\", now());\n", + "\trun_inline_js();\n", + " });\n", + " }\n", + " }\n", + " // Give older versions of the autoload script a head-start to ensure\n", + " // they initialize before we start loading newer version.\n", + " setTimeout(load_or_wait, 100)\n", + "}(window));" + ], + "application/vnd.holoviews_load.v0+json": "(function(root) {\n function now() {\n return new Date();\n }\n\n var force = true;\n var py_version = 
'3.4.1'.replace('rc', '-rc.').replace('.dev', '-dev.');\n var reloading = false;\n var Bokeh = root.Bokeh;\n\n if (typeof (root._bokeh_timeout) === \"undefined\" || force) {\n root._bokeh_timeout = Date.now() + 5000;\n root._bokeh_failed_load = false;\n }\n\n function run_callbacks() {\n try {\n root._bokeh_onload_callbacks.forEach(function(callback) {\n if (callback != null)\n callback();\n });\n } finally {\n delete root._bokeh_onload_callbacks;\n }\n console.debug(\"Bokeh: all callbacks have finished\");\n }\n\n function load_libs(css_urls, js_urls, js_modules, js_exports, callback) {\n if (css_urls == null) css_urls = [];\n if (js_urls == null) js_urls = [];\n if (js_modules == null) js_modules = [];\n if (js_exports == null) js_exports = {};\n\n root._bokeh_onload_callbacks.push(callback);\n\n if (root._bokeh_is_loading > 0) {\n console.debug(\"Bokeh: BokehJS is being loaded, scheduling callback at\", now());\n return null;\n }\n if (js_urls.length === 0 && js_modules.length === 0 && Object.keys(js_exports).length === 0) {\n run_callbacks();\n return null;\n }\n if (!reloading) {\n console.debug(\"Bokeh: BokehJS not loaded, scheduling load and callback at\", now());\n }\n\n function on_load() {\n root._bokeh_is_loading--;\n if (root._bokeh_is_loading === 0) {\n console.debug(\"Bokeh: all BokehJS libraries/stylesheets loaded\");\n run_callbacks()\n }\n }\n window._bokeh_on_load = on_load\n\n function on_error() {\n console.error(\"failed to load \" + url);\n }\n\n var skip = [];\n if (window.requirejs) {\n window.requirejs.config({'packages': {}, 'paths': {'tabulator': 'https://cdn.jsdelivr.net/npm/tabulator-tables@5.5.0/dist/js/tabulator.min', 'moment': 'https://cdn.jsdelivr.net/npm/luxon/build/global/luxon.min'}, 'shim': {}});\n require([\"tabulator\"], function(Tabulator) {\n\twindow.Tabulator = Tabulator\n\ton_load()\n })\n require([\"moment\"], function(moment) {\n\twindow.moment = moment\n\ton_load()\n })\n root._bokeh_is_loading = css_urls.length + 2;\n } else {\n root._bokeh_is_loading = css_urls.length + js_urls.length + js_modules.length + Object.keys(js_exports).length;\n }\n\n var existing_stylesheets = []\n var links = document.getElementsByTagName('link')\n for (var i = 0; i < links.length; i++) {\n var link = links[i]\n if (link.href != null) {\n\texisting_stylesheets.push(link.href)\n }\n }\n for (var i = 0; i < css_urls.length; i++) {\n var url = css_urls[i];\n if (existing_stylesheets.indexOf(url) !== -1) {\n\ton_load()\n\tcontinue;\n }\n const element = document.createElement(\"link\");\n element.onload = on_load;\n element.onerror = on_error;\n element.rel = \"stylesheet\";\n element.type = \"text/css\";\n element.href = url;\n console.debug(\"Bokeh: injecting link tag for BokehJS stylesheet: \", url);\n document.body.appendChild(element);\n } if (((window.Tabulator !== undefined) && (!(window.Tabulator instanceof HTMLElement))) || window.requirejs) {\n var urls = ['https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/tabulator-tables@5.5.0/dist/js/tabulator.min.js'];\n for (var i = 0; i < urls.length; i++) {\n skip.push(urls[i])\n }\n } if (((window.moment !== undefined) && (!(window.moment instanceof HTMLElement))) || window.requirejs) {\n var urls = ['https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/luxon/build/global/luxon.min.js'];\n for (var i = 0; i < urls.length; i++) {\n skip.push(urls[i])\n }\n } var existing_scripts = []\n var scripts = document.getElementsByTagName('script')\n for (var i = 0; i < scripts.length; i++) {\n var 
script = scripts[i]\n if (script.src != null) {\n\texisting_scripts.push(script.src)\n }\n }\n for (var i = 0; i < js_urls.length; i++) {\n var url = js_urls[i];\n if (skip.indexOf(url) !== -1 || existing_scripts.indexOf(url) !== -1) {\n\tif (!window.requirejs) {\n\t on_load();\n\t}\n\tcontinue;\n }\n var element = document.createElement('script');\n element.onload = on_load;\n element.onerror = on_error;\n element.async = false;\n element.src = url;\n console.debug(\"Bokeh: injecting script tag for BokehJS library: \", url);\n document.head.appendChild(element);\n }\n for (var i = 0; i < js_modules.length; i++) {\n var url = js_modules[i];\n if (skip.indexOf(url) !== -1 || existing_scripts.indexOf(url) !== -1) {\n\tif (!window.requirejs) {\n\t on_load();\n\t}\n\tcontinue;\n }\n var element = document.createElement('script');\n element.onload = on_load;\n element.onerror = on_error;\n element.async = false;\n element.src = url;\n element.type = \"module\";\n console.debug(\"Bokeh: injecting script tag for BokehJS library: \", url);\n document.head.appendChild(element);\n }\n for (const name in js_exports) {\n var url = js_exports[name];\n if (skip.indexOf(url) >= 0 || root[name] != null) {\n\tif (!window.requirejs) {\n\t on_load();\n\t}\n\tcontinue;\n }\n var element = document.createElement('script');\n element.onerror = on_error;\n element.async = false;\n element.type = \"module\";\n console.debug(\"Bokeh: injecting script tag for BokehJS library: \", url);\n element.textContent = `\n import ${name} from \"${url}\"\n window.${name} = ${name}\n window._bokeh_on_load()\n `\n document.head.appendChild(element);\n }\n if (!js_urls.length && !js_modules.length) {\n on_load()\n }\n };\n\n function inject_raw_css(css) {\n const element = document.createElement(\"style\");\n element.appendChild(document.createTextNode(css));\n document.body.appendChild(element);\n }\n\n var js_urls = [\"https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/tabulator-tables@5.5.0/dist/js/tabulator.min.js\", \"https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/luxon/build/global/luxon.min.js\", \"https://cdn.bokeh.org/bokeh/release/bokeh-3.4.1.min.js\", \"https://cdn.bokeh.org/bokeh/release/bokeh-gl-3.4.1.min.js\", \"https://cdn.bokeh.org/bokeh/release/bokeh-widgets-3.4.1.min.js\", \"https://cdn.bokeh.org/bokeh/release/bokeh-tables-3.4.1.min.js\", \"https://cdn.holoviz.org/panel/1.4.1/dist/panel.min.js\"];\n var js_modules = [];\n var js_exports = {};\n var css_urls = [\"https://cdn.holoviz.org/panel/1.4.1/dist/bundled/datatabulator/tabulator-tables@5.5.0/dist/css/tabulator_simple.min.css?v=1.4.1\", \"https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.4.2/css/all.min.css\"];\n var inline_js = [ function(Bokeh) {\n Bokeh.set_log_level(\"info\");\n },\nfunction(Bokeh) {} // ensure no trailing comma for IE\n ];\n\n function run_inline_js() {\n if ((root.Bokeh !== undefined) || (force === true)) {\n for (var i = 0; i < inline_js.length; i++) {\n\ttry {\n inline_js[i].call(root, root.Bokeh);\n\t} catch(e) {\n\t if (!reloading) {\n\t throw e;\n\t }\n\t}\n }\n // Cache old bokeh versions\n if (Bokeh != undefined && !reloading) {\n\tvar NewBokeh = root.Bokeh;\n\tif (Bokeh.versions === undefined) {\n\t Bokeh.versions = new Map();\n\t}\n\tif (NewBokeh.version !== Bokeh.version) {\n\t Bokeh.versions.set(NewBokeh.version, NewBokeh)\n\t}\n\troot.Bokeh = Bokeh;\n }} else if (Date.now() < root._bokeh_timeout) {\n setTimeout(run_inline_js, 100);\n } else if (!root._bokeh_failed_load) {\n 
console.log(\"Bokeh: BokehJS failed to load within specified timeout.\");\n root._bokeh_failed_load = true;\n }\n root._bokeh_is_initializing = false\n }\n\n function load_or_wait() {\n // Implement a backoff loop that tries to ensure we do not load multiple\n // versions of Bokeh and its dependencies at the same time.\n // In recent versions we use the root._bokeh_is_initializing flag\n // to determine whether there is an ongoing attempt to initialize\n // bokeh, however for backward compatibility we also try to ensure\n // that we do not start loading a newer (Panel>=1.0 and Bokeh>3) version\n // before older versions are fully initialized.\n if (root._bokeh_is_initializing && Date.now() > root._bokeh_timeout) {\n root._bokeh_is_initializing = false;\n root._bokeh_onload_callbacks = undefined;\n console.log(\"Bokeh: BokehJS was loaded multiple times but one version failed to initialize.\");\n load_or_wait();\n } else if (root._bokeh_is_initializing || (typeof root._bokeh_is_initializing === \"undefined\" && root._bokeh_onload_callbacks !== undefined)) {\n setTimeout(load_or_wait, 100);\n } else {\n root._bokeh_is_initializing = true\n root._bokeh_onload_callbacks = []\n var bokeh_loaded = Bokeh != null && (Bokeh.version === py_version || (Bokeh.versions !== undefined && Bokeh.versions.has(py_version)));\n if (!reloading && !bokeh_loaded) {\n\troot.Bokeh = undefined;\n }\n load_libs(css_urls, js_urls, js_modules, js_exports, function() {\n\tconsole.debug(\"Bokeh: BokehJS plotting callback run at\", now());\n\trun_inline_js();\n });\n }\n }\n // Give older versions of the autoload script a head-start to ensure\n // they initialize before we start loading newer version.\n setTimeout(load_or_wait, 100)\n}(window));" + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "application/javascript": [ + "\n", + "if ((window.PyViz === undefined) || (window.PyViz instanceof HTMLElement)) {\n", + " window.PyViz = {comms: {}, comm_status:{}, kernels:{}, receivers: {}, plot_index: []}\n", + "}\n", + "\n", + "\n", + " function JupyterCommManager() {\n", + " }\n", + "\n", + " JupyterCommManager.prototype.register_target = function(plot_id, comm_id, msg_handler) {\n", + " if (window.comm_manager || ((window.Jupyter !== undefined) && (Jupyter.notebook.kernel != null))) {\n", + " var comm_manager = window.comm_manager || Jupyter.notebook.kernel.comm_manager;\n", + " comm_manager.register_target(comm_id, function(comm) {\n", + " comm.on_msg(msg_handler);\n", + " });\n", + " } else if ((plot_id in window.PyViz.kernels) && (window.PyViz.kernels[plot_id])) {\n", + " window.PyViz.kernels[plot_id].registerCommTarget(comm_id, function(comm) {\n", + " comm.onMsg = msg_handler;\n", + " });\n", + " } else if (typeof google != 'undefined' && google.colab.kernel != null) {\n", + " google.colab.kernel.comms.registerTarget(comm_id, (comm) => {\n", + " var messages = comm.messages[Symbol.asyncIterator]();\n", + " function processIteratorResult(result) {\n", + " var message = result.value;\n", + " console.log(message)\n", + " var content = {data: message.data, comm_id};\n", + " var buffers = []\n", + " for (var buffer of message.buffers || []) {\n", + " buffers.push(new DataView(buffer))\n", + " }\n", + " var metadata = message.metadata || {};\n", + " var msg = {content, buffers, metadata}\n", + " msg_handler(msg);\n", + " return messages.next().then(processIteratorResult);\n", + " }\n", + " return messages.next().then(processIteratorResult);\n", + " })\n", + " }\n", + " }\n", + "\n", + " 
JupyterCommManager.prototype.get_client_comm = function(plot_id, comm_id, msg_handler) {\n", + " if (comm_id in window.PyViz.comms) {\n", + " return window.PyViz.comms[comm_id];\n", + " } else if (window.comm_manager || ((window.Jupyter !== undefined) && (Jupyter.notebook.kernel != null))) {\n", + " var comm_manager = window.comm_manager || Jupyter.notebook.kernel.comm_manager;\n", + " var comm = comm_manager.new_comm(comm_id, {}, {}, {}, comm_id);\n", + " if (msg_handler) {\n", + " comm.on_msg(msg_handler);\n", + " }\n", + " } else if ((plot_id in window.PyViz.kernels) && (window.PyViz.kernels[plot_id])) {\n", + " var comm = window.PyViz.kernels[plot_id].connectToComm(comm_id);\n", + " comm.open();\n", + " if (msg_handler) {\n", + " comm.onMsg = msg_handler;\n", + " }\n", + " } else if (typeof google != 'undefined' && google.colab.kernel != null) {\n", + " var comm_promise = google.colab.kernel.comms.open(comm_id)\n", + " comm_promise.then((comm) => {\n", + " window.PyViz.comms[comm_id] = comm;\n", + " if (msg_handler) {\n", + " var messages = comm.messages[Symbol.asyncIterator]();\n", + " function processIteratorResult(result) {\n", + " var message = result.value;\n", + " var content = {data: message.data};\n", + " var metadata = message.metadata || {comm_id};\n", + " var msg = {content, metadata}\n", + " msg_handler(msg);\n", + " return messages.next().then(processIteratorResult);\n", + " }\n", + " return messages.next().then(processIteratorResult);\n", + " }\n", + " }) \n", + " var sendClosure = (data, metadata, buffers, disposeOnDone) => {\n", + " return comm_promise.then((comm) => {\n", + " comm.send(data, metadata, buffers, disposeOnDone);\n", + " });\n", + " };\n", + " var comm = {\n", + " send: sendClosure\n", + " };\n", + " }\n", + " window.PyViz.comms[comm_id] = comm;\n", + " return comm;\n", + " }\n", + " window.PyViz.comm_manager = new JupyterCommManager();\n", + " \n", + "\n", + "\n", + "var JS_MIME_TYPE = 'application/javascript';\n", + "var HTML_MIME_TYPE = 'text/html';\n", + "var EXEC_MIME_TYPE = 'application/vnd.holoviews_exec.v0+json';\n", + "var CLASS_NAME = 'output';\n", + "\n", + "/**\n", + " * Render data to the DOM node\n", + " */\n", + "function render(props, node) {\n", + " var div = document.createElement(\"div\");\n", + " var script = document.createElement(\"script\");\n", + " node.appendChild(div);\n", + " node.appendChild(script);\n", + "}\n", + "\n", + "/**\n", + " * Handle when a new output is added\n", + " */\n", + "function handle_add_output(event, handle) {\n", + " var output_area = handle.output_area;\n", + " var output = handle.output;\n", + " if ((output.data == undefined) || (!output.data.hasOwnProperty(EXEC_MIME_TYPE))) {\n", + " return\n", + " }\n", + " var id = output.metadata[EXEC_MIME_TYPE][\"id\"];\n", + " var toinsert = output_area.element.find(\".\" + CLASS_NAME.split(' ')[0]);\n", + " if (id !== undefined) {\n", + " var nchildren = toinsert.length;\n", + " var html_node = toinsert[nchildren-1].children[0];\n", + " html_node.innerHTML = output.data[HTML_MIME_TYPE];\n", + " var scripts = [];\n", + " var nodelist = html_node.querySelectorAll(\"script\");\n", + " for (var i in nodelist) {\n", + " if (nodelist.hasOwnProperty(i)) {\n", + " scripts.push(nodelist[i])\n", + " }\n", + " }\n", + "\n", + " scripts.forEach( function (oldScript) {\n", + " var newScript = document.createElement(\"script\");\n", + " var attrs = [];\n", + " var nodemap = oldScript.attributes;\n", + " for (var j in nodemap) {\n", + " if (nodemap.hasOwnProperty(j)) {\n", + " 
attrs.push(nodemap[j])\n", + " }\n", + " }\n", + " attrs.forEach(function(attr) { newScript.setAttribute(attr.name, attr.value) });\n", + " newScript.appendChild(document.createTextNode(oldScript.innerHTML));\n", + " oldScript.parentNode.replaceChild(newScript, oldScript);\n", + " });\n", + " if (JS_MIME_TYPE in output.data) {\n", + " toinsert[nchildren-1].children[1].textContent = output.data[JS_MIME_TYPE];\n", + " }\n", + " output_area._hv_plot_id = id;\n", + " if ((window.Bokeh !== undefined) && (id in Bokeh.index)) {\n", + " window.PyViz.plot_index[id] = Bokeh.index[id];\n", + " } else {\n", + " window.PyViz.plot_index[id] = null;\n", + " }\n", + " } else if (output.metadata[EXEC_MIME_TYPE][\"server_id\"] !== undefined) {\n", + " var bk_div = document.createElement(\"div\");\n", + " bk_div.innerHTML = output.data[HTML_MIME_TYPE];\n", + " var script_attrs = bk_div.children[0].attributes;\n", + " for (var i = 0; i < script_attrs.length; i++) {\n", + " toinsert[toinsert.length - 1].childNodes[1].setAttribute(script_attrs[i].name, script_attrs[i].value);\n", + " }\n", + " // store reference to server id on output_area\n", + " output_area._bokeh_server_id = output.metadata[EXEC_MIME_TYPE][\"server_id\"];\n", + " }\n", + "}\n", + "\n", + "/**\n", + " * Handle when an output is cleared or removed\n", + " */\n", + "function handle_clear_output(event, handle) {\n", + " var id = handle.cell.output_area._hv_plot_id;\n", + " var server_id = handle.cell.output_area._bokeh_server_id;\n", + " if (((id === undefined) || !(id in PyViz.plot_index)) && (server_id !== undefined)) { return; }\n", + " var comm = window.PyViz.comm_manager.get_client_comm(\"hv-extension-comm\", \"hv-extension-comm\", function () {});\n", + " if (server_id !== null) {\n", + " comm.send({event_type: 'server_delete', 'id': server_id});\n", + " return;\n", + " } else if (comm !== null) {\n", + " comm.send({event_type: 'delete', 'id': id});\n", + " }\n", + " delete PyViz.plot_index[id];\n", + " if ((window.Bokeh !== undefined) & (id in window.Bokeh.index)) {\n", + " var doc = window.Bokeh.index[id].model.document\n", + " doc.clear();\n", + " const i = window.Bokeh.documents.indexOf(doc);\n", + " if (i > -1) {\n", + " window.Bokeh.documents.splice(i, 1);\n", + " }\n", + " }\n", + "}\n", + "\n", + "/**\n", + " * Handle kernel restart event\n", + " */\n", + "function handle_kernel_cleanup(event, handle) {\n", + " delete PyViz.comms[\"hv-extension-comm\"];\n", + " window.PyViz.plot_index = {}\n", + "}\n", + "\n", + "/**\n", + " * Handle update_display_data messages\n", + " */\n", + "function handle_update_output(event, handle) {\n", + " handle_clear_output(event, {cell: {output_area: handle.output_area}})\n", + " handle_add_output(event, handle)\n", + "}\n", + "\n", + "function register_renderer(events, OutputArea) {\n", + " function append_mime(data, metadata, element) {\n", + " // create a DOM node to render to\n", + " var toinsert = this.create_output_subarea(\n", + " metadata,\n", + " CLASS_NAME,\n", + " EXEC_MIME_TYPE\n", + " );\n", + " this.keyboard_manager.register_events(toinsert);\n", + " // Render to node\n", + " var props = {data: data, metadata: metadata[EXEC_MIME_TYPE]};\n", + " render(props, toinsert[0]);\n", + " element.append(toinsert);\n", + " return toinsert\n", + " }\n", + "\n", + " events.on('output_added.OutputArea', handle_add_output);\n", + " events.on('output_updated.OutputArea', handle_update_output);\n", + " events.on('clear_output.CodeCell', handle_clear_output);\n", + " events.on('delete.Cell', 
handle_clear_output);\n", + " events.on('kernel_ready.Kernel', handle_kernel_cleanup);\n", + "\n", + " OutputArea.prototype.register_mime_type(EXEC_MIME_TYPE, append_mime, {\n", + " safe: true,\n", + " index: 0\n", + " });\n", + "}\n", + "\n", + "if (window.Jupyter !== undefined) {\n", + " try {\n", + " var events = require('base/js/events');\n", + " var OutputArea = require('notebook/js/outputarea').OutputArea;\n", + " if (OutputArea.prototype.mime_types().indexOf(EXEC_MIME_TYPE) == -1) {\n", + " register_renderer(events, OutputArea);\n", + " }\n", + " } catch(err) {\n", + " }\n", + "}\n" + ], + "application/vnd.holoviews_load.v0+json": "\nif ((window.PyViz === undefined) || (window.PyViz instanceof HTMLElement)) {\n window.PyViz = {comms: {}, comm_status:{}, kernels:{}, receivers: {}, plot_index: []}\n}\n\n\n function JupyterCommManager() {\n }\n\n JupyterCommManager.prototype.register_target = function(plot_id, comm_id, msg_handler) {\n if (window.comm_manager || ((window.Jupyter !== undefined) && (Jupyter.notebook.kernel != null))) {\n var comm_manager = window.comm_manager || Jupyter.notebook.kernel.comm_manager;\n comm_manager.register_target(comm_id, function(comm) {\n comm.on_msg(msg_handler);\n });\n } else if ((plot_id in window.PyViz.kernels) && (window.PyViz.kernels[plot_id])) {\n window.PyViz.kernels[plot_id].registerCommTarget(comm_id, function(comm) {\n comm.onMsg = msg_handler;\n });\n } else if (typeof google != 'undefined' && google.colab.kernel != null) {\n google.colab.kernel.comms.registerTarget(comm_id, (comm) => {\n var messages = comm.messages[Symbol.asyncIterator]();\n function processIteratorResult(result) {\n var message = result.value;\n console.log(message)\n var content = {data: message.data, comm_id};\n var buffers = []\n for (var buffer of message.buffers || []) {\n buffers.push(new DataView(buffer))\n }\n var metadata = message.metadata || {};\n var msg = {content, buffers, metadata}\n msg_handler(msg);\n return messages.next().then(processIteratorResult);\n }\n return messages.next().then(processIteratorResult);\n })\n }\n }\n\n JupyterCommManager.prototype.get_client_comm = function(plot_id, comm_id, msg_handler) {\n if (comm_id in window.PyViz.comms) {\n return window.PyViz.comms[comm_id];\n } else if (window.comm_manager || ((window.Jupyter !== undefined) && (Jupyter.notebook.kernel != null))) {\n var comm_manager = window.comm_manager || Jupyter.notebook.kernel.comm_manager;\n var comm = comm_manager.new_comm(comm_id, {}, {}, {}, comm_id);\n if (msg_handler) {\n comm.on_msg(msg_handler);\n }\n } else if ((plot_id in window.PyViz.kernels) && (window.PyViz.kernels[plot_id])) {\n var comm = window.PyViz.kernels[plot_id].connectToComm(comm_id);\n comm.open();\n if (msg_handler) {\n comm.onMsg = msg_handler;\n }\n } else if (typeof google != 'undefined' && google.colab.kernel != null) {\n var comm_promise = google.colab.kernel.comms.open(comm_id)\n comm_promise.then((comm) => {\n window.PyViz.comms[comm_id] = comm;\n if (msg_handler) {\n var messages = comm.messages[Symbol.asyncIterator]();\n function processIteratorResult(result) {\n var message = result.value;\n var content = {data: message.data};\n var metadata = message.metadata || {comm_id};\n var msg = {content, metadata}\n msg_handler(msg);\n return messages.next().then(processIteratorResult);\n }\n return messages.next().then(processIteratorResult);\n }\n }) \n var sendClosure = (data, metadata, buffers, disposeOnDone) => {\n return comm_promise.then((comm) => {\n comm.send(data, metadata, buffers, 
disposeOnDone);\n });\n };\n var comm = {\n send: sendClosure\n };\n }\n window.PyViz.comms[comm_id] = comm;\n return comm;\n }\n window.PyViz.comm_manager = new JupyterCommManager();\n \n\n\nvar JS_MIME_TYPE = 'application/javascript';\nvar HTML_MIME_TYPE = 'text/html';\nvar EXEC_MIME_TYPE = 'application/vnd.holoviews_exec.v0+json';\nvar CLASS_NAME = 'output';\n\n/**\n * Render data to the DOM node\n */\nfunction render(props, node) {\n var div = document.createElement(\"div\");\n var script = document.createElement(\"script\");\n node.appendChild(div);\n node.appendChild(script);\n}\n\n/**\n * Handle when a new output is added\n */\nfunction handle_add_output(event, handle) {\n var output_area = handle.output_area;\n var output = handle.output;\n if ((output.data == undefined) || (!output.data.hasOwnProperty(EXEC_MIME_TYPE))) {\n return\n }\n var id = output.metadata[EXEC_MIME_TYPE][\"id\"];\n var toinsert = output_area.element.find(\".\" + CLASS_NAME.split(' ')[0]);\n if (id !== undefined) {\n var nchildren = toinsert.length;\n var html_node = toinsert[nchildren-1].children[0];\n html_node.innerHTML = output.data[HTML_MIME_TYPE];\n var scripts = [];\n var nodelist = html_node.querySelectorAll(\"script\");\n for (var i in nodelist) {\n if (nodelist.hasOwnProperty(i)) {\n scripts.push(nodelist[i])\n }\n }\n\n scripts.forEach( function (oldScript) {\n var newScript = document.createElement(\"script\");\n var attrs = [];\n var nodemap = oldScript.attributes;\n for (var j in nodemap) {\n if (nodemap.hasOwnProperty(j)) {\n attrs.push(nodemap[j])\n }\n }\n attrs.forEach(function(attr) { newScript.setAttribute(attr.name, attr.value) });\n newScript.appendChild(document.createTextNode(oldScript.innerHTML));\n oldScript.parentNode.replaceChild(newScript, oldScript);\n });\n if (JS_MIME_TYPE in output.data) {\n toinsert[nchildren-1].children[1].textContent = output.data[JS_MIME_TYPE];\n }\n output_area._hv_plot_id = id;\n if ((window.Bokeh !== undefined) && (id in Bokeh.index)) {\n window.PyViz.plot_index[id] = Bokeh.index[id];\n } else {\n window.PyViz.plot_index[id] = null;\n }\n } else if (output.metadata[EXEC_MIME_TYPE][\"server_id\"] !== undefined) {\n var bk_div = document.createElement(\"div\");\n bk_div.innerHTML = output.data[HTML_MIME_TYPE];\n var script_attrs = bk_div.children[0].attributes;\n for (var i = 0; i < script_attrs.length; i++) {\n toinsert[toinsert.length - 1].childNodes[1].setAttribute(script_attrs[i].name, script_attrs[i].value);\n }\n // store reference to server id on output_area\n output_area._bokeh_server_id = output.metadata[EXEC_MIME_TYPE][\"server_id\"];\n }\n}\n\n/**\n * Handle when an output is cleared or removed\n */\nfunction handle_clear_output(event, handle) {\n var id = handle.cell.output_area._hv_plot_id;\n var server_id = handle.cell.output_area._bokeh_server_id;\n if (((id === undefined) || !(id in PyViz.plot_index)) && (server_id !== undefined)) { return; }\n var comm = window.PyViz.comm_manager.get_client_comm(\"hv-extension-comm\", \"hv-extension-comm\", function () {});\n if (server_id !== null) {\n comm.send({event_type: 'server_delete', 'id': server_id});\n return;\n } else if (comm !== null) {\n comm.send({event_type: 'delete', 'id': id});\n }\n delete PyViz.plot_index[id];\n if ((window.Bokeh !== undefined) & (id in window.Bokeh.index)) {\n var doc = window.Bokeh.index[id].model.document\n doc.clear();\n const i = window.Bokeh.documents.indexOf(doc);\n if (i > -1) {\n window.Bokeh.documents.splice(i, 1);\n }\n }\n}\n\n/**\n * Handle kernel restart 
<div class="admonition alert alert-info">
    <p class="admonition-title" style="font-weight:bold">Tip: Launch as web-app! 🚀</p>
    To launch any of this notebook's visualizations as a standalone application outside of Jupyter Notebook, use `panel serve [path to file] --show` at the command line.
</div>
\n", - "Visit the Index Page
\n", + " This workflow example is part of set of related workflows. If you haven't already, visit the index page for an introduction and guidance on choosing the appropriate workflow.\n", + "