This repository contains a Python package and scripts to investigate synchronization algorithms applied on top of the IEEE 1588 precision time protocol (PTP). The project focuses on offline analysis by processing datasets of timestamps collected from real hardware. Using this strategy, the user can process the same dataset with varying parameters and algorithms until achieving the best synchronization performance.
The PTP-DAL library implements several algorithms, such as packet selection, least-squares, and Kalman filtering. These are applied independently to the timestamps provided by a given dataset. This approach is analogous to running several algorithms in parallel in a real-time implementation.
After processing the selected algorithms, PTP-DAL outputs a comprehensive set of results comparing the synchronization performance achieved by each algorithm, with timing metrics such as the maximum absolute time error (max|TE|), maximum time interval error (MTIE), and so on. Additionally, the results include analyses of several aspects of the PTP network and the surrounding environment, such as the packet delay variation (PDV), PTP delay distributions, and temperature variations.
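For illustration, the snippet below is a minimal sketch (not PTP-DAL's implementation) of how two of these metrics can be computed. The time-error series here is synthetic, whereas an actual analysis would use the difference between each estimator's output and the true offset from the dataset.

```python
import numpy as np

rng = np.random.default_rng(0)
te = np.cumsum(rng.normal(0, 2, 1000))  # synthetic time-error series (ns)

# max|TE|: the worst-case absolute deviation from the reference time.
max_abs_te = np.max(np.abs(te))

# MTIE for a window of n samples: the largest peak-to-peak time error
# observed within any window of n consecutive samples.
def mtie(te, n):
    windows = np.lib.stride_tricks.sliding_window_view(te, n)
    return np.max(windows.max(axis=1) - windows.min(axis=1))

print(f"max|TE| = {max_abs_te:.1f} ns, MTIE(n=100) = {mtie(te, 100):.1f} ns")
```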
The project was specifically developed to analyze datasets of timestamps generated by the FPGA-based PTP synchronization testbed developed by LASSE - 5G & IoT Research Group. This testbed has been detailed in various publications, including:
- "Clock Synchronization Algorithms Over PTP-Unaware Networks: Reproducible Comparison Using an FPGA Testbed," in IEEE Access, 2021.
- "5G Fronthaul Synchronization via IEEE 1588 Precision Time Protocol: Algorithms and Use Cases," Ph.D. thesis, Federal University of Pará, Dec. 2020.
- "Testbed Evaluation of Distributed Radio Timing Alignment Over Ethernet Fronthaul Networks," in IEEE Access, 2020.
- "An FPGA-based Design of a Packetized Fronthaul Testbed with IEEE 1588 Clock Synchronization," European Wireless 2017.
In particular, Chapter 4 from reference [2] provides the most comprehensive description of the testbed and the dataset acquisition process, while Chapter 3 covers the algorithms supported by the PTP-DAL project.
The adopted datasets of timestamps comprise a large number of PTP two-way exchanges. Each exchange corresponds to a row in the dataset and includes a set of timestamps. More specifically, each row consists of the four timestamps involved in the two-way PTP packet exchange (t1, t2, t3, and t4), as well as auxiliary timestamps. The auxiliary timestamps indicate the actual one-way delay of each PTP packet and the true time offset affecting the slave at that moment. Ultimately, this supplemental information allows for analyzing the error between each time offset estimator and the actual time offset experienced by the slave clock at any point in time.
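To make the row contents concrete, the sketch below applies the standard two-way PTP computation to a single exchange. The field names and values are illustrative assumptions for this example, not the exact dataset schema:

```python
# One hypothetical two-way exchange (all values in ns, slave ahead by x).
row = {
    "t1": 1000.0,  # Sync departure (master time)
    "t2": 1150.0,  # Sync arrival (slave time)
    "t3": 1200.0,  # Delay_Req departure (slave time)
    "t4": 1160.0,  # Delay_Req arrival (master time)
    "x":  100.0,   # true slave time offset (auxiliary information)
}

t21 = row["t2"] - row["t1"]   # master-to-slave timestamp difference
t43 = row["t4"] - row["t3"]   # slave-to-master timestamp difference

delay_est  = (t21 + t43) / 2  # two-way (mean) delay estimate
offset_est = (t21 - t43) / 2  # raw time offset estimate

# The auxiliary true offset exposes the estimator's error, which stems
# from the delay asymmetry between the two directions:
print(f"offset estimate: {offset_est:.1f} ns, "
      f"error: {offset_est - row['x']:.1f} ns")
```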
The datasets produced by the testbed can be made available on demand. If you are interested in exploring PTP-DAL using datasets acquired from LASSE's PTP synchronization testbed, please read the dataset access section and contact us directly over email. Otherwise, this repository contains a simulator capable of generating compatible datasets through simulation.
The project requires Python 3.6 or higher.
If using virtualenvwrapper, run the following to create a virtual environment:
```
mkvirtualenv -r requirements.txt ptp
```
- `analyze.py`: Analyzes a dataset and compares synchronization algorithms.
- `batch.py`: Runs a batch of analyses (see the batch processing recipes).
- `catalog.py`: Catalogs datasets acquired with the testbed.
- `dataset.py`: Downloads and searches datasets by communicating with the dataset database.
- `compress.py`: Compresses a given dataset captured with the testbed.
- `ptp_plots.py`: Demonstrates a few plots that can be generated using the `ptp.metrics` module.
- `ptp_estimators.py`: Demonstrates estimators that can be used to post-process PTP measurements.
- `simulate.py`: Simulates PTP clocks and generates a timestamp dataset that can be processed with the same scripts used to process a testbed-generated dataset.
- `window_optimizer_demo.py`: Evaluates the performance of window-based estimators according to the observation window length.
- `kalman_demo.py`: Demonstrates the evaluation of Kalman filtering.
The main script for synchronization analysis is `analyze.py`, which can be executed as follows:

```
./analyze.py -vvvv -f [dataset-filename]
```

The script will download the specified dataset automatically and process it. Upon completion, the results become available in the `results/` directory.
The `recipes` directory contains preset recipes for running a batch of analyses based on distinct datasets. Refer to the instructions in that directory.
Every dataset downloaded through `analyze.py` gets cataloged automatically. The cataloging produces a JSON file at `data/catalog.json` and an HTML version at `data/index.html`.
The dataset catalog can also be generated manually by calling:

```
./catalog.py
```
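Once generated, the catalog can also be inspected programmatically. Below is a minimal sketch, assuming `data/catalog.json` holds one entry per dataset (the exact layout may differ):

```python
import json

# List the cataloged datasets. The precise structure of catalog.json
# is an assumption here; adapt the iteration to the actual contents.
with open("data/catalog.json") as fd:
    catalog = json.load(fd)

for entry in catalog:  # iterates entries (list) or keys (dict)
    print(entry)
```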
PTP-DAL also offers a simulator to generate a timestamp dataset formatted similarly to the datasets acquired from the testbed. With that, the same algorithms that can process timestamps from testbed datasets can process the data from simulated datasets.
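For intuition, the sketch below illustrates the kind of clock and delay model such a simulator can use. The model, parameters, and field names are assumptions for this example, not `simulate.py`'s actual internals:

```python
import random

freq_offset_ppb = 50.0    # constant slave frequency offset (ppb)
sync_period_ns  = 62.5e6  # 16 Sync messages per second
x = 100.0                 # initial true time offset (ns)
t1 = 0.0
rows = []
for _ in range(10):
    d_ms = 50 + random.expovariate(1 / 10)  # random m-to-s queuing delay (ns)
    d_sm = 50 + random.expovariate(1 / 10)  # random s-to-m queuing delay (ns)
    t2 = t1 + d_ms + x                      # Sync arrival (slave time)
    t3 = t2 + 1000                          # Delay_Req sent 1 us later
    t4 = t3 - x + d_sm                      # Delay_Req arrival (master time)
    rows.append({"t1": t1, "t2": t2, "t3": t3, "t4": t4, "x": x})
    x  += freq_offset_ppb * 1e-9 * sync_period_ns  # offset drift per interval
    t1 += sync_period_ns                           # next Sync departure
print(rows[0])
```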
To generate a simulation dataset, define the target number of PTP exchanges and run with the `--save` argument. For example, for 10000 exchanges, run:

```
./simulate.py -vvvv -N 10000 --save
```

where the `-vvvv` argument sets verbosity level 4 (info). Feel free to adjust the verbosity level as needed. For example, level 5 (`-vvvvv`) prints a great amount of debugging information.
After the simulation, the resulting (simulated) dataset is placed in the `data/` directory, where the analysis script expects it.
NOTE: all datasets generated by simulation are prefixed with `sim-`. In contrast, datasets acquired serially from the testbed are prefixed with `serial-`.
The datasets acquired with the FPGA-based testbed are kept within the PTP dataset database (DB). These are accessible through our PTP dataset API hosted at https://ptp.database.lasseufpa.org/api/.
This API uses mutual SSL authentication, in which the client and the server authenticate each other through digital certificates. Hence, to use this service, you need to obtain a valid client certificate signed by our certification authority (CA).
If you are interested in accessing our datasets, please follow the procedure below:
1. Generate a private/public key pair:

   ```
   # Client key
   openssl genrsa -out <your_name>.key 4096
   ```

2. Generate a certificate signing request (CSR), which contains your public key and is signed using your private key:

   ```
   # CSR to obtain certificate
   openssl req -new -key <your_name>.key -out <your_name>.csr
   ```

3. Send the CSR to us at [email protected] and let us know the network scenarios or types of datasets you are interested in exploring.

4. We sign your CSR and send you the final (CA-signed) digital certificate that you will use to access the dataset DB API.

5. Try accessing the dataset API. First, run:

   ```
   ./dataset.py search
   ```

   The application will prompt you for access information. When asked "Download via API or SSH?", reply with `API` (or just press enter to accept the default response). Next, fill in the paths to your private key (generated in step 1) and the digital certificate (obtained in step 4).

   After that, the command should return the list of datasets.
Dataset download:

```
GET: https://ptp.database.lasseufpa.org/api/dataset/<dataset_name>
```

Dataset search:

```
POST: https://ptp.database.lasseufpa.org/api/search
```
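With the Python `requests` package, for example, mutual TLS reduces to passing your certificate and key through the `cert` parameter. The sketch below makes assumptions about the request details (empty search body, raw file download, file names from the certificate procedure above) purely for illustration:

```python
import requests

API  = "https://ptp.database.lasseufpa.org/api"
cert = ("your_name.crt", "your_name.key")  # CA-signed certificate and key

# Search for datasets (POST /api/search). The body schema is assumed:
r = requests.post(f"{API}/search", json={}, cert=cert)
print(r.json())

# Download a dataset (GET /api/dataset/<dataset_name>):
name = "serial-example"  # hypothetical dataset name
r = requests.get(f"{API}/dataset/{name}", cert=cert)
with open(name, "wb") as fd:
    fd.write(r.content)
```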
Contact information: [email protected]