
tutorial files and index.rst edited for docs
Conflicts:
	docs/index.rst
zain-sohail committed Oct 16, 2023
1 parent e340e7b commit 1d128d2
Showing 6 changed files with 46 additions and 41 deletions.
File renamed without changes.
5 changes: 4 additions & 1 deletion docs/index.rst
@@ -21,7 +21,10 @@ Single-Event DataFrame (SED) documentation
:numbered:
:caption: Examples

examples/example
tutorial/1_Binningfakedata.rst
tutorial/2_Binningofexampletime_resolvedARPESdata.rst
tutorial/3_ConversionPipelineforexampletime_resolvedARPESdata.rst
tutorial/4_MetadatacollectionandexporttoNeXus.rst

.. toctree::
:maxdepth: 2
12 changes: 6 additions & 6 deletions tutorial/1 - Binning fake data.ipynb
@@ -7,7 +7,7 @@
"tags": []
},
"source": [
"# Binning demonstration on locally generated fake data\n",
"### Binning demonstration on locally generated fake data\n",
"In this example, we generate a table with random data simulating a single event dataset.\n",
"We showcase the binning method, first on a simple single table using the bin_partition method and then in the distributed mehthod bin_dataframe, using daks dataframes.\n",
"The first method is never really called directly, as it is simply the function called by the bin_dataframe on each partition of the dask dataframe."
@@ -39,7 +39,7 @@
"id": "42a6afaa-17dd-4637-ba75-a28c4ead1adf",
"metadata": {},
"source": [
"# Generate Fake Data"
"## Generate Fake Data"
]
},
{
@@ -60,7 +60,7 @@
"id": "6902fd56-1456-4da6-83a4-0f3f6b831eb6",
"metadata": {},
"source": [
"# Define the binning range"
"## Define the binning range"
]
},
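Spelled out, such a binning range definition might look like this (the axis names and limits are placeholders, not the notebook's actual values):

```python
# Placeholder binning specification: one entry per binned column.
axes = ["X", "Y", "t"]
bins = [80, 80, 80]
ranges = [(0, 2048), (0, 2048), (60_000, 72_000)]
```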
{
@@ -81,7 +81,7 @@
"id": "00054b5d-fc96-4959-b562-7cb8545a9535",
"metadata": {},
"source": [
"# Compute the binning along the pandas dataframe"
"## Compute the binning along the pandas dataframe"
]
},
{
@@ -118,7 +118,7 @@
"id": "e632dc1d-5eb5-4621-8bef-4438ce2c6a0c",
"metadata": {},
"source": [
"# Transform to dask dataframe"
"## Transform to dask dataframe"
]
},
{
@@ -137,7 +137,7 @@
"id": "01066d40-010a-490b-9033-7339e5a21b26",
"metadata": {},
"source": [
"# compute distributed binning on the partitioned dask dataframe\n",
"## compute distributed binning on the partitioned dask dataframe\n",
"In this example, the small dataset does not give significant improvement over the pandas implementation, at least using this number of partitions.\n",
"A single partition would be faster (you can try...) but we use multiple for demonstration purpouses."
]
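The distributed variant can be sketched with plain dask: histogram each partition independently, then sum the partial grids. This mirrors the per-partition reduction described above, but it is not the sed implementation itself:

```python
import dask
import dask.dataframe as dd
import numpy as np
import pandas as pd

# Same kind of fake event table as above.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "X": rng.normal(1024, 300, 50_000),
    "Y": rng.normal(1024, 300, 50_000),
    "t": rng.normal(66_000, 1_500, 50_000),
})
ddf = dd.from_pandas(df, npartitions=4)

bins = (64, 64, 32)
ranges = ((0, 2048), (0, 2048), (60_000, 72_000))

# Histogram each partition independently, then sum the partial grids --
# the reduction pattern bin_dataframe applies across partitions.
partials = [
    dask.delayed(np.histogramdd)(part[["X", "Y", "t"]].to_numpy(),
                                 bins=bins, range=ranges)
    for part in ddf.to_delayed()
]
hist = sum(h for h, _ in dask.compute(*partials))
```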
13 changes: 7 additions & 6 deletions tutorial/2 - Binning of example time-resolved ARPES data.ipynb
@@ -7,7 +7,7 @@
"tags": []
},
"source": [
"# Binning example time-resolved ARPES data stored on Zenode\n",
"# Binning example time-resolved ARPES data stored on Zenodo\n",
"In this example, we pull some time-resolved ARPES data from Zenodo, and generate a dask dataframe using the methods of the mpes package. It requires the mpes package to be installed, in addition to the sed package.\n",
"For performance reasons, best store the data on a locally attached storage (no network drive)."
]
@@ -25,6 +25,7 @@
"\n",
"import matplotlib.pyplot as plt\n",
"from mpes import fprocessing as fp\n",
"\n",
"import os\n",
"import shutil\n",
"\n",
@@ -37,7 +38,7 @@
"id": "42a6afaa-17dd-4637-ba75-a28c4ead1adf",
"metadata": {},
"source": [
"# Load Data"
"## Load Data"
]
},
{
@@ -73,7 +74,7 @@
"id": "6902fd56-1456-4da6-83a4-0f3f6b831eb6",
"metadata": {},
"source": [
"# Define the binning range"
"## Define the binning range"
]
},
{
@@ -104,7 +105,7 @@
"id": "01066d40-010a-490b-9033-7339e5a21b26",
"metadata": {},
"source": [
"# compute distributed binning on the partitioned dask dataframe\n",
"## compute distributed binning on the partitioned dask dataframe\n",
"We generated 100 dataframe partiions from the 100 files in the dataset, which we will bin parallelly with the dataframe binning function into a 3D grid"
]
},
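In sketch form, the parallel 3D binning boils down to a call like the following. The name bin_dataframe comes from the tutorial text; the import path, signature, axis names, and ranges below are assumptions:

```python
# Assumed import path and signature -- check the sed documentation.
from sed.binning import bin_dataframe

axes = ["X", "Y", "t"]                    # invented axis names
bins = [100, 100, 100]
ranges = [(0, 1800), (0, 1800), (64_000, 68_000)]

# One histogram per partition, reduced into a single 3D grid.
result = bin_dataframe(ddf, bins=bins, axes=axes, ranges=ranges)
```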
@@ -141,7 +142,7 @@
"id": "4a3eaf0e",
"metadata": {},
"source": [
"# Compare to MPES binning"
"## Compare to MPES binning"
]
},
{
@@ -170,7 +171,7 @@
"id": "e3398aac",
"metadata": {},
"source": [
"# Test the class and the histogram function"
"## Test the class and the histogram function"
]
},
{
@@ -8,7 +8,7 @@
"tags": []
},
"source": [
"# Demonstration of the conversion pipeline using time-resolved ARPES data stored on Zenode\n",
"# Demonstration of the conversion pipeline using time-resolved ARPES data stored on Zenodo\n",
"In this example, we pull some time-resolved ARPES data from Zenodo, and load it into the sed package using functions of the mpes package. Then, we run a conversion pipeline on it, containing steps for visualizing the channels, correcting image distortions, calibrating the momentum space, correcting for energy distortions and calibrating the energy axis. Finally, the data are binned in calibrated axes.\n",
"For performance reasons, best store the data on a locally attached storage (no network drive). This can also be achieved transparently using the included MirrorUtil class."
]
@@ -37,7 +37,7 @@
"id": "42a6afaa-17dd-4637-ba75-a28c4ead1adf",
"metadata": {},
"source": [
"# Load Data"
"## Load Data"
]
},
{
@@ -123,8 +123,9 @@
"id": "70aa4343",
"metadata": {},
"source": [
"# Distortion correction and Momentum Calibration workflow\n",
"### 1. step: \n",
"## Distortion correction and Momentum Calibration workflow\n",
"### Distortion correction\n",
"#### 1. step: \n",
"Bin and load part of the dataframe in detector coordinates, and choose energy plane where high-symmetry points can well be identified. Either use the interactive tool, or pre-select the range:"
]
},
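Pre-selecting the range instead of using the interactive tool amounts to slicing the detector-coordinate histogram at a chosen time-of-flight window (slice indices invented):

```python
import matplotlib.pyplot as plt

# hist is a 3D (X, Y, t) histogram from a binning step as sketched earlier;
# average a few TOF planes where the high-symmetry points are sharp.
plane = hist[:, :, 28:32].sum(axis=2)

plt.imshow(plane.T, origin="lower")
plt.xlabel("X (detector px)")
plt.ylabel("Y (detector px)")
plt.show()
```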
@@ -145,7 +146,7 @@
"id": "fee3ca76",
"metadata": {},
"source": [
"### 2. Step:\n",
"#### 2. Step:\n",
"Next, we select a number of features corresponding to the rotational symmetry of the material, plus the center. These can either be auto-detected (for well-isolated points), or provided as a list (these can be read-off the graph in the cell above).\n",
"These are then symmetrized according to the rotational symmetry, and a spline-warping correction for the x/y coordinates is calculated, which corrects for any geometric distortions from the perfect n-fold rotational symmetry."
]
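Providing the landmarks as a list might look like this for a six-fold symmetric material (all coordinates are invented, of the kind one would read off the plot above):

```python
import numpy as np

# Six symmetry-equivalent features plus the zone center, in detector pixels
# (purely illustrative values).
features = np.array([
    [203.0, 341.0], [550.0, 231.0], [860.0, 293.0],
    [912.0, 650.0], [561.0, 768.0], [254.0, 702.0],
    [556.0, 506.0],  # center
])
rotation_symmetry = 6  # n-fold symmetry used to symmetrize the landmarks
```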
@@ -176,7 +177,7 @@
"id": "f7519ff8",
"metadata": {},
"source": [
"### 3. Step: \n",
"#### 3. Step: \n",
"Generate nonlinear correction using splinewarp algorithm. If no landmarks have been defined in previous step, default parameters from the config are used"
]
},
@@ -217,7 +218,7 @@
"id": "b5e69ffa",
"metadata": {},
"source": [
"### 4. Step:\n",
"#### 4. Step:\n",
"To adjust scaling, position and orientation of the corrected momentum space image, you can apply further affine transformations to the distortion correction field. Here, first a postential scaling is applied, next a translation, and finally a rotation around the center of the image (defined via the config). One can either use an interactive tool, or provide the adjusted values and apply them directly."
]
},
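The adjustment described here composes three affine maps. A self-contained sketch of the matrix being built, in homogeneous coordinates (parameter values are placeholders):

```python
import numpy as np

def adjustment_matrix(scale=1.0, xtrans=0.0, ytrans=0.0,
                      angle_deg=0.0, center=(256.0, 256.0)):
    """Compose scaling -> translation -> rotation about the image center
    into a single 3x3 homogeneous-coordinate matrix."""
    s = np.diag([scale, scale, 1.0])
    t = np.array([[1.0, 0.0, xtrans],
                  [0.0, 1.0, ytrans],
                  [0.0, 0.0, 1.0]])
    a = np.deg2rad(angle_deg)
    c, si = np.cos(a), np.sin(a)
    rot = np.array([[c, -si, 0.0],
                    [si, c, 0.0],
                    [0.0, 0.0, 1.0]])
    shift = np.array([[1.0, 0.0, center[0]],
                      [0.0, 1.0, center[1]],
                      [0.0, 0.0, 1.0]])
    # Conjugate the rotation so it acts about the image center, not the origin.
    rot_centered = shift @ rot @ np.linalg.inv(shift)
    return rot_centered @ t @ s

m = adjustment_matrix(scale=1.05, xtrans=3.0, ytrans=-2.0, angle_deg=-1.5)
```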
@@ -238,7 +239,7 @@
"id": "a78a68e9",
"metadata": {},
"source": [
"### 5. Step:\n",
"#### 5. Step:\n",
"Finally, the momentum correction is applied to the dataframe, and corresponding meta data are stored"
]
},
@@ -258,8 +259,8 @@
"id": "d9810488",
"metadata": {},
"source": [
"## Momentum calibration workflow\n",
"### 1. Step:\n",
"### Momentum calibration workflow\n",
"#### 1. Step:\n",
"First, the momentum scaling needs to be calibtrated. Either, one can provide the coordinates of one point outside the center, and provide its distane to the Brillouin zone center (which is assumed to be located in the center of the image), one can specify two points on the image and their distance (where the 2nd point marks the BZ center),or one can provide absolute k-coordinates of two distinct momentum points.\n",
"\n",
"If no points are provided, an interactive tool is created. Here, left mouse click selectes the off-center point (brillouin_zone_cetnered=True) or toggle-selects the off-center and center point."
@@ -285,7 +286,7 @@
"id": "1a3697b1",
"metadata": {},
"source": [
"#### Optional (Step 1a): \n",
"##### Optional (Step 1a): \n",
"Save momentum calibration parameters to configuration file in current data folder: "
]
},
@@ -306,7 +307,7 @@
"id": "c2f8a513",
"metadata": {},
"source": [
"### 2. Step:\n",
"#### 2. Step:\n",
"Now, the distortion correction and momentum calibration needs to be applied to the dataframe."
]
},
@@ -326,7 +327,7 @@
"id": "0bce2388",
"metadata": {},
"source": [
"# Energy Correction (optional)\n",
"## Energy Correction (optional)\n",
"The purpose of the energy correction is to correct for any momentum-dependent distortion of the energy axis, e.g. from geometric effects in the flight tube, or from space charge"
]
},
@@ -336,7 +337,7 @@
"id": "5289de59",
"metadata": {},
"source": [
"### 1st step:\n",
"#### 1st step:\n",
"Here, one can select the functional form to be used, and adjust its parameters. The binned data used for the momentum calibration is plotted around the Fermi energy (defined by tof_fermi), and the correction function is plotted ontop. Possible correction functions are: \"sperical\" (parameter: diameter), \"Lorentzian\" (parameter: gamma), \"Gaussian\" (parameter: sigma), and \"Lorentzian_asymmetric\" (parameters: gamma, amplitude2, gamma2).\n",
"\n",
"One can either use an interactive alignment tool, or provide parameters directly."
@@ -358,7 +359,7 @@
"id": "e43fbf33",
"metadata": {},
"source": [
"#### Optional (Step 1a): \n",
"##### Optional (Step 1a): \n",
"Save energy correction parameters to configuration file in current data folder: "
]
},
@@ -379,7 +380,7 @@
"id": "41a6a3e6",
"metadata": {},
"source": [
"### 2. Step\n",
"#### 2. Step\n",
"After adjustment, the energy correction is directly applied to the TOF axis."
]
},
@@ -399,7 +400,7 @@
"id": "8b571b4c",
"metadata": {},
"source": [
"# 3. Energy calibration\n",
"## 3. Energy calibration\n",
"For calibrating the energy axis, a set of data taken at different bias voltages around the value where the measurement was taken is required."
]
},
@@ -409,7 +410,7 @@
"id": "6bc28642",
"metadata": {},
"source": [
"### 1. Step:\n",
"#### 1. Step:\n",
"In a first step, the data are loaded, binned along the TOF dimension, and normalized. The used bias voltages can be either provided, or read from attributes in the source files if present."
]
},
@@ -434,7 +435,7 @@
"id": "314a79c8",
"metadata": {},
"source": [
"### 2. Step:\n",
"#### 2. Step:\n",
"Next, the same peak or feature needs to be selected in each curve. For this, one needs to define \"ranges\" for each curve, within which the peak of interest is located. One can either provide these ranges manually, or provide one range for a \"reference\" curve, and infer the ranges for the other curves using a dynamic time warping algorithm."
]
},
@@ -462,7 +463,7 @@
"id": "b2638818",
"metadata": {},
"source": [
"### 3. Step:\n",
"#### 3. Step:\n",
"Next, the detected peak positions and bias voltages are used to determine the calibration function. This can be either done by fitting the functional form d^2/(t-t0)^2 via lmfit (\"lmfit\"), or using a polynomial approxiamtion (\"lstsq\" or \"lsqr\"). Here, one can also define a reference id, and a reference energy. Those define the absolute energy position of the feature used for calibration in the \"reference\" trace, at the bias voltage where the final measurement has been performed. The energy scale can be either \"kientic\" (decreasing energy with increasing TOF), or \"binding\" (increasing energy with increasing TOF).\n",
"\n",
"After calculating the calibration, all traces corrected with the calibration are plotted ontop of each other, the calibration function together with the extracted features is plotted."
@@ -488,7 +489,7 @@
"id": "df63c6c7",
"metadata": {},
"source": [
"#### Optional (Step 3a): \n",
"##### Optional (Step 3a): \n",
"Save energy calibration parameters to configuration file in current data folder: "
]
},
@@ -509,7 +510,7 @@
"id": "563709c7",
"metadata": {},
"source": [
"### 4. Step:\n",
"#### 4. Step:\n",
"Finally, the the energy axis is added to the dataframe."
]
},
@@ -529,7 +530,7 @@
"id": "b2d8cdf9",
"metadata": {},
"source": [
"# 4. Delay calibration:\n",
"## 4. Delay calibration:\n",
"The delay axis is calculated from the ADC input column based on the provided delay range. ALternatively, the delay scan range can also be extracted from attributes inside a source file, if present."
]
},
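The ADC-to-delay mapping described here is linear between the scan endpoints; a sketch with invented endpoint values:

```python
import numpy as np

# Linear map from ADC counts to pump-probe delay (endpoint values invented).
adc_range = (650.0, 6450.0)     # ADC values at the scan endpoints
delay_range = (-180.0, 150.0)   # corresponding delays, e.g. in ps

def adc_to_delay(adc):
    frac = (np.asarray(adc) - adc_range[0]) / (adc_range[1] - adc_range[0])
    return delay_range[0] + frac * (delay_range[1] - delay_range[0])
```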
@@ -554,7 +555,7 @@
"id": "d9d0b018",
"metadata": {},
"source": [
"# 5. Visualization of calibrated histograms\n",
"## 5. Visualization of calibrated histograms\n",
"With all calibrated axes present in the dataframe, we can visualize the corresponding histograms, and determine the respective binning ranges"
]
},
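Inspecting the calibrated columns before choosing binning ranges can be as simple as per-axis histograms; df_calibrated and the column names below are stand-ins for the dataframe with calibrated axes:

```python
import matplotlib.pyplot as plt

# df_calibrated stands in for the dataframe carrying the calibrated axes.
fig, axs = plt.subplots(1, 4, figsize=(16, 3))
for ax, col in zip(axs, ["kx", "ky", "energy", "delay"]):
    ax.hist(df_calibrated[col], bins=100)
    ax.set_xlabel(col)
plt.tight_layout()
plt.show()
```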
@@ -576,7 +577,7 @@
"id": "6902fd56-1456-4da6-83a4-0f3f6b831eb6",
"metadata": {},
"source": [
"# Define the binning ranges and compute calibrated data volume"
"## Define the binning ranges and compute calibrated data volume"
]
},
{
@@ -598,7 +599,7 @@
"id": "523794dc",
"metadata": {},
"source": [
"# Some visualization:"
"## Some visualization:"
]
},
{
4 changes: 2 additions & 2 deletions tutorial/4 - Metadata collection and export to NeXus.ipynb
@@ -36,7 +36,7 @@
"id": "42a6afaa-17dd-4637-ba75-a28c4ead1adf",
"metadata": {},
"source": [
"# Load Data"
"## Load Data"
]
},
{
@@ -246,7 +246,7 @@
"id": "6902fd56-1456-4da6-83a4-0f3f6b831eb6",
"metadata": {},
"source": [
"# Compute final data volume"
"## Compute final data volume"
]
},
{
