Automatic Differentiation-Based Full Waveform Inversion


πŸ‘©β€πŸ’» Introduction

  ADFWI is an open-source framework for high-resolution subsurface parameter estimation that minimizes discrepancies between observed and simulated waveforms. By leveraging automatic differentiation (AD), ADFWI simplifies the derivation and implementation of full waveform inversion (FWI) and streamlines the design and evaluation of new methodologies. It supports wave propagation in various media, including isotropic acoustic, isotropic elastic, and both vertical transverse isotropy (VTI) and horizontal transverse isotropy (HTI) media.

  In addition, ADFWI provides a comprehensive collection of objective functions, regularization techniques, optimization algorithms, and deep neural networks. This toolbox lets researchers conduct experiments and comparisons, explore innovative approaches, and refine their methodologies efficiently.



⚑️ Installation

To install ADFWI, please follow these steps:

  1. Ensure Prerequisites

  2. Create a Virtual Environment (Optional)

     It is recommended to create a virtual environment using conda:

         conda create --name adfwi-env python=3.8
         conda activate adfwi-env

  3. Install Required Packages

     • Method 1: Clone the GitHub repository. This provides the latest version, which may be more suitable for research:

           git clone https://github.com/liufeng2317/ADFWI.git
           cd ADFWI
           pip install -r requirements.txt

     • Method 2: Install from PyPI:

           pip install ADFWI-Torch

πŸ‘Ύ Examples

1. Gradient Comparison between AD & Central Difference

A comparative analysis of gradient calculations obtained through AD and the central difference method.

| Test Name | Status | Path | Figure |
| --- | --- | --- | --- |
| Acoustic (vp) | βœ… | Codes | Marmousi2GradientCmp |
| Elastic (vp/vs/rho) | βœ… | Codes | Marmousi2GradientCmp |
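The comparison can be sketched in a few lines of PyTorch. The quadratic misfit below is a hypothetical stand-in for the waveform misfit (the real objective involves a forward wave simulation), but the AD-versus-central-difference check is the same:

```python
import torch

# Toy "misfit": J(m) = ||A m - d||^2, a stand-in for an FWI objective.
torch.manual_seed(0)
A = torch.randn(5, 3, dtype=torch.float64)
d = torch.randn(5, dtype=torch.float64)

def misfit(m):
    return torch.sum((A @ m - d) ** 2)

m = torch.randn(3, dtype=torch.float64, requires_grad=True)

# Gradient via automatic differentiation
grad_ad = torch.autograd.grad(misfit(m), m)[0]

# Gradient via central differences, one model parameter at a time
h = 1e-5
grad_fd = torch.zeros(3, dtype=torch.float64)
with torch.no_grad():
    for i in range(3):
        e = torch.zeros(3, dtype=torch.float64)
        e[i] = h
        grad_fd[i] = (misfit(m + e) - misfit(m - e)) / (2 * h)

print(torch.allclose(grad_ad, grad_fd, atol=1e-8))  # → True
```

For a quadratic misfit the central difference is exact up to rounding, so the two gradients agree to high precision; for a real waveform misfit the agreement is approximate and depends on the step size h.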

2. Iso-acoustic Model Tests

| Test Name | Status | Example Path | Example Figure |
| --- | --- | --- | --- |
| Marmousi2 (low resolution) | βœ… | Example-Marmousi2 (low) | Inversion Process Marmousi2 |
| Marmousi2 (high resolution) | βœ… | Example-Marmousi2 (high) | Inversion Process Marmousi2 |
| Marmousi2 (vp and rho) | βœ… | Example-vp & rho | Inversion Process Anomaly |
| FootHill (low resolution) | βœ… | Example-FootHill (low) | Inversion Process FootHill |
| FootHill (high resolution) | βœ… | Example-FootHill (high) | Inversion Process FootHill |
| SEAM-I | βœ… | Example-SEAM-I | Inversion Process SEAM-I |
| Overthrust-offshore | βœ… | Example-Overthrust-offshore | Inversion Process Overthrust-offshore |
| Anomaly | βœ… | Example-Anomaly | Inversion Process Anomaly |

3. Iso-elastic & VTI-elastic Model Tests

| Test Name | Status | Path | Figure |
| --- | --- | --- | --- |
| Iso-elastic Anomaly | βœ… | Example-Anomaly | Inversion Process Marmousi2 |
| Iso-elastic Marmousi2-1 | βœ… | Shot & Rec on Surface | Inversion Process Marmousi2 |
| Iso-elastic Marmousi2-2 | βœ… | Shot & Rec Underwater | Inversion Process Marmousi2 |
| VTI-elastic Anomaly-1 | βœ… | Inv Epsilon | Inversion Process Anomaly |
| VTI-elastic Anomaly-2 | βœ… | Inv Epsilon & Delta | Inversion Process Anomaly |

4. Misfit Tests

We assess the convexity of different objective functions by simulating seismic records using shifted wavelets. The following table summarizes the results and provides examples for further exploration.

Convexity examples:

  • Example-Ricker Shift
  • Example-Ricker Shift & Varying Amplitude
  • Example-Ricker Shift & Dominant Frequency
  • Example-Ricker Shift & Gaussian Noise

(Figure: Ricker-cmp)

Note that the results above show the performance of the objective functions with relatively poor initial models; with a better initial model, every objective function performs noticeably better. Relevant results can be found in Better Initial Model.

| Test Name | Status | Path | Figure |
| --- | --- | --- | --- |
| L1-norm | βœ… | Example-L1 | Inversion Process L1 |
| L2-norm | βœ… | Example-L2 | Inversion Process L2 |
| T-distribution (StudentT) | βœ… | Example-StudentT | Inversion Process StudentT |
| Envelope | βœ… | Example-Envelope | Inversion Process Envelope |
| Global Correlation (GC) | βœ… | Example-GC | Inversion Process GC |
| Soft Dynamic Time Warping (soft-DTW) | βœ… | Example-SoftDTW | Inversion Process SoftDTW |
| Wasserstein Distance with Sinkhorn | βœ… | Example-Wasserstein | Inversion Process Wasserstein |
| Hybrid Misfit: Envelope & Global Correlation (WECI) | βœ… | Example-WECI | Inversion Process WECI |
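As a minimal illustration of how two of the misfits above differ, the sketch below evaluates an L2 misfit and a global-correlation (GC) misfit on a time-shifted Ricker wavelet. These are toy traces and simplified formulas, not ADFWI's API:

```python
import math
import torch

def ricker(t, f0=10.0):
    """Ricker wavelet with dominant frequency f0 (Hz)."""
    a = (math.pi * f0 * t) ** 2
    return (1 - 2 * a) * torch.exp(-a)

t = torch.linspace(-0.5, 0.5, 1001)
obs = ricker(t)           # "observed" trace
syn = ricker(t - 0.05)    # "synthetic" trace, shifted by 50 ms

# L2 misfit: sample-wise amplitude difference
l2 = 0.5 * torch.sum((syn - obs) ** 2)

# Global correlation misfit: 1 - normalized cross-correlation,
# insensitive to amplitude scaling
def gc_misfit(a, b):
    a_n = a / torch.linalg.norm(a)
    b_n = b / torch.linalg.norm(b)
    return 1.0 - torch.dot(a_n, b_n)

gc = gc_misfit(syn, obs)
print(float(l2), float(gc))
```

Scanning the misfit value over a range of shifts (as in the convexity examples above) is what reveals which objectives stay convex as the shift grows.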

The following misfit functions are still in beta version and are undergoing further development and validation. Their performance and reliability will be evaluated in future studies.

| Misfit Name | Status | Path | Figure |
| --- | --- | --- | --- |
| Travel Time | πŸ› οΈ | Example-TravelTime | Inversion Process TravelTime |
| Normalized Integration Method (NIM) | πŸ› οΈ | Example-NIM | πŸ–ΌοΈ Image under development |

5. Optimizer Tests

The results presented below specifically characterize the impact of using the L2-norm objective function in conjunction with the Marmousi2 model. It is important to note that the effects of different optimization algorithms may vary significantly when applied to other objective functions or models. Consequently, the findings should be interpreted within this specific context, and further investigations are recommended to explore the performance of these algorithms across a broader range of scenarios.

| Test Name | Status | Path | Figure |
| --- | --- | --- | --- |
| Stochastic Gradient Descent (SGD) | βœ… | Example-SGD | Inversion Process SGD |
| Average Stochastic Gradient Descent (ASGD) | βœ… | Example-ASGD | Inversion Process ASGD |
| Root Mean Square Propagation (RMSProp) | βœ… | Example-RMSProp | Inversion Process RMSProp |
| Adaptive Gradient Algorithm (Adagrad) | βœ… | Example-Adagrad | Inversion Process Adagrad |
| Adaptive Moment Estimation (Adam) | βœ… | Example-Adam | Inversion Process Adam |
| Adam with Weight Decay (AdamW) | βœ… | Example-AdamW | Inversion Process AdamW |
| Nesterov-accelerated Adam (NAdam) | βœ… | Example-NAdam | Inversion Process NAdam |
| Rectified Adam (RAdam) | βœ… | Example-RAdam | Inversion Process RAdam |
| L-BFGS | βœ… | Example-LBFGS | Inversion Process L-BFGS |
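Because the model is an ordinary tensor of trainable parameters, any torch.optim optimizer can be slotted into the inversion loop. A minimal sketch, with a toy quadratic standing in for the waveform misfit (which in ADFWI comes from a differentiable wave simulation):

```python
import torch

m_true = torch.tensor([2.0, -1.0, 0.5])     # "true" model
m = torch.zeros(3, requires_grad=True)       # initial model

# Swap in torch.optim.Adam, RMSprop, Adagrad, LBFGS, ... unchanged
optimizer = torch.optim.SGD([m], lr=0.4)

for _ in range(100):
    optimizer.zero_grad()
    loss = torch.sum((m - m_true) ** 2)      # stand-in for the misfit
    loss.backward()
    optimizer.step()

print(m.detach())  # converges to m_true
```

Only the optimizer line changes between the tests above; the forward simulation, misfit, and backward pass stay identical.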

6. Regularization Methods

| Test Name | Status | Path | Figure |
| --- | --- | --- | --- |
| No regularization | βœ… | no-regular | Inversion Process no-regular |
| Tikhonov-1st Order | βœ… | Example-Tikhonov1 | Inversion Process Tikhonov1 |
| Tikhonov-2nd Order | βœ… | Example-Tikhonov2 | Inversion Process Tikhonov2 |
| Total Variation-1st Order | βœ… | Example-TV1 | Inversion Process TV1 |
| Total Variation-2nd Order | βœ… | Example-TV2 | Inversion Process TV2 |
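The first-order regularizers above can be sketched as differentiable penalties on a 2-D model tensor. This is a sketch, not ADFWI's API; lam is a hypothetical weight balancing the penalty against the data misfit:

```python
import torch

def tikhonov_1st(m):
    """First-order Tikhonov: squared L2 norm of the model's spatial gradients."""
    dz = m[1:, :] - m[:-1, :]
    dx = m[:, 1:] - m[:, :-1]
    return torch.sum(dz ** 2) + torch.sum(dx ** 2)

def tv_1st(m, eps=1e-8):
    """First-order total variation, smoothed with eps so it stays differentiable."""
    dz = m[1:, :-1] - m[:-1, :-1]
    dx = m[:-1, 1:] - m[:-1, :-1]
    return torch.sum(torch.sqrt(dz ** 2 + dx ** 2 + eps))

m = torch.rand(50, 60, requires_grad=True)   # toy model (depth x distance)
lam = 1e-2
reg = lam * tv_1st(m)                        # or lam * tikhonov_1st(m)
reg.backward()                               # adds to the model gradient
```

Tikhonov penalties favor smooth models, while total variation tolerates sharp interfaces, which is why the two families give visibly different inversion results above.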

7. Multi-Scale Strategy in FWI

Multi-scale strategies play a critical role in FWI: they mitigate non-linearity and improve convergence, especially for complex models. Additional multi-scale strategies are under development to further improve robustness and efficiency.

| Test Name | Status | Path | Figure |
| --- | --- | --- | --- |
| Iso-elastic Marmousi2 | βœ… | Multi-freq (2 Hz, 3 Hz, 5 Hz) | Inversion Process Marmousi2 |

| Multi-Scale Name | Status | Path | Figure |
| --- | --- | --- | --- |
| Multi-Offsets | πŸ› οΈ | on-going | πŸ–ΌοΈ Image under development |
| Multi-scale in Time | πŸ› οΈ | on-going | πŸ–ΌοΈ Image under development |
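The multi-frequency strategy can be sketched as frequency continuation: low-pass the observed data and invert band by band from low to high frequency. The brick-wall filter below is an illustrative stand-in for a proper tapered filter, not ADFWI's implementation:

```python
import torch

def lowpass(trace, dt, fmax):
    """Zero out spectral components above fmax (brick-wall, for illustration)."""
    spec = torch.fft.rfft(trace)
    freqs = torch.fft.rfftfreq(trace.shape[-1], d=dt)
    spec[freqs > fmax] = 0
    return torch.fft.irfft(spec, n=trace.shape[-1])

dt = 0.001
trace = torch.randn(2000, dtype=torch.float64)  # stand-in observed trace

for fmax in (2.0, 3.0, 5.0):     # the 2 Hz / 3 Hz / 5 Hz stages above
    d_obs = lowpass(trace, dt, fmax)
    # ... run an inversion stage against d_obs, warm-starting the model
    # from the previous band's result ...
```

Starting from low frequencies keeps the misfit landscape smoother, so each stage hands the next one an initial model that is already close enough to avoid cycle skipping.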

8. Deep Image Prior (Earth Model Reparameterization)

| Test Name | Status | Path | Figure |
| --- | --- | --- | --- |
| No regularization | βœ… | no-regular | Inversion Process no-regular |
| 2-Layer CNN | βœ… | Example-2LayerCNN | Inversion Process 2LayerCNN |
| 3-Layer CNN | βœ… | Example-3LayerCNN | Inversion Process 3LayerCNN |
| 3-Layer UNet | βœ… | Example-3LayerUNet | Inversion Process 3LayerUNet |
| 4-Layer UNet | βœ… | Example-4LayerUNet | Inversion Process 4LayerUNet |
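The reparameterization idea can be sketched as follows: a small CNN maps a fixed random latent tensor to the velocity model, so the optimizer updates network weights instead of grid cells. The layer sizes are hypothetical, not ADFWI's defaults:

```python
import torch
import torch.nn as nn

class ModelGenerator(nn.Module):
    """Deep-Image-Prior style generator: fixed latent in, velocity model out."""
    def __init__(self, nz, nx):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.LeakyReLU(),
            nn.Conv2d(8, 1, 3, padding=1),
        )
        # The latent input is fixed; only the network weights are trained.
        self.register_buffer("latent", torch.randn(1, 1, nz, nx))

    def forward(self):
        return self.net(self.latent).squeeze(0).squeeze(0)

gen = ModelGenerator(nz=40, nx=60)
vp = gen()                    # (40, 60) model produced by the network
loss = torch.sum(vp ** 2)     # stand-in for the waveform misfit
loss.backward()               # gradients flow into the CNN weights
```

The network architecture itself acts as a learnable regularizer: its inductive bias favors spatially coherent models, which is the "deep image prior" effect exploited in the tests above.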

9. Uncertainty Estimation Using Deep Neural Networks (DNNs)

We employ the DNNs from the Deep Image Prior (DIP) tests above to perform uncertainty estimation. The variable p denotes the dropout ratio applied during both training and inference.

| Test Name | Status | Path | Figure |
| --- | --- | --- | --- |
| 2LayerCNN-uncertainty | βœ… | Codes | 2LayerCNN-uncertainty |
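The uncertainty estimate follows the Monte-Carlo dropout recipe: keep dropout active at inference time and summarize repeated stochastic forward passes by their mean and spread. A toy sketch (the network below is hypothetical, not the 2LayerCNN above):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
p = 0.2   # dropout ratio, as in the table above
net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(),
                    nn.Dropout(p), nn.Linear(32, 1))

x = torch.randn(1, 16)
net.train()                    # train() keeps Dropout stochastic
with torch.no_grad():
    samples = torch.stack([net(x) for _ in range(100)])

mean, std = samples.mean(0), samples.std(0)   # std serves as the uncertainty proxy
```

Regions of the model where the dropout-perturbed passes disagree (large std) are regions the data constrain weakly.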

πŸ“ Special Features

  • Deep Neural Network Integration

    • DNN Reparameterization: DNNs reparameterize the Earth model, introducing learnable regularization that improves the inversion process.
    • Dropout: Randomly dropping units during training and inference provides a measure of inversion uncertainty and model robustness.
    • Multiphysics Joint Inversion (on-going): Neural networks fuse data from different physical fields, enabling joint inversion for a more comprehensive and accurate Earth model.
  • Resource Management

    • Mini-batch: Multi-shot data can be large because of the many source positions and time steps; splitting it into mini-batches avoids loading the entire dataset into memory at once.
    • Checkpointing: A key memory-saving technique for backpropagation in FWI. Instead of storing all intermediate wavefields, only a few checkpoints are saved; the missing steps are recomputed during backpropagation, trading extra computation for reduced memory usage.
    • Boundary Saving (on-going): Methods are being developed to reduce memory usage by saving only the wavefield boundaries during forward propagation, rather than the entire wavefield, and using them to reconstruct the wavefield during backpropagation.
  • Acceleration Methods

    • GPU Acceleration: Uses the parallel processing power of GPUs to significantly speed up computations, especially for large-scale simulations such as FWI.
    • JIT (Just-in-Time) Compilation: Speeds up execution by compiling Python code into optimized machine code at runtime, improving performance without modifying the original codebase.
    • Reconstruction Using a Lower-Level Language (C++) (on-going): Rewriting performance-critical components in C++ to leverage lower-level optimizations for faster execution and improved overall efficiency.
  • Robustness and Portability

    • Each method is accompanied by a test script demonstrating its usage.
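The checkpointing technique mentioned above can be sketched with torch.utils.checkpoint: each chunk of time steps is recomputed during the backward pass instead of being stored. The tanh update below is a toy stand-in for one step of a wave-equation solver:

```python
import torch
from torch.utils.checkpoint import checkpoint

def chunk(u):
    """Fifty toy "time steps"; only the chunk's input is kept for backward."""
    for _ in range(50):
        u = torch.tanh(1.01 * u)
    return u

u0 = torch.randn(100, 100, requires_grad=True)  # stand-in initial wavefield
u = u0
for _ in range(10):                 # 500 steps, but only 10 chunk inputs stored
    u = checkpoint(chunk, u, use_reentrant=False)

loss = u.sum()
loss.backward()                     # chunks are recomputed here, then freed
```

Memory scales with the number of checkpoints rather than the number of time steps, at the cost of roughly one extra forward pass during backpropagation.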

βš–οΈ LICENSE

The Automatic Differentiation-Based Full Waveform Inversion (ADFWI) framework is licensed under the MIT License. This license allows you to:

  • Use: You can use the software for personal, academic, or commercial purposes.
  • Modify: You can modify the software to suit your needs.
  • Distribute: You can distribute the original or modified software to others.
  • Private Use: You can use the software privately without any restrictions.

πŸ—“οΈ To-Do List

  • Memory Optimization through Boundary and Wavefield Reconstruction Objective: Implement a strategy to save boundaries and reconstruct wavefields to reduce memory usage.

    Explanation: This approach focuses on saving only the wavefield boundaries during forward propagation and reconstructing the wavefields as needed. By doing so, it aims to minimize memory consumption while ensuring the accuracy of wave propagation simulations, particularly in large-scale models.

    Related Work:

    • [1] P. Yang, J. Gao, and B. Wang, RTM using Effective Boundary Saving: A Staggered Grid GPU Implementation, Comput. Geosci., vol. 68, pp. 64–72, Jul. 2014. doi: 10.1016/j.cageo.2014.04.004.
    • [2] Wang, S., Jiang, Y., Song, P., Tan, J., Liu, Z., & He, B., 2023. Memory Optimization in RNN-Based Full Waveform Inversion Using Boundary Saving Wavefield Reconstruction, IEEE Trans. Geosci. Remote Sens., 61, 1–12. doi: 10.1109/TGRS.2023.3317529.
  • C++ / C-Based Forward Propagator Objective: Develop a forward wave propagation algorithm using C++ or C.

    Explanation: Implementing the forward propagator in lower-level languages like C++ or C will significantly enhance computational performance, particularly for large-scale simulations. The aim is to leverage the improved memory management, faster execution, and more efficient parallel computing capabilities of these languages over Python-based implementations.

  • Resource Optimization for Memory Efficiency Objective: Reduce memory consumption for improved resource utilization.

    Explanation: The current computational framework may encounter memory bottlenecks, especially when processing large datasets. Optimizing memory usage by identifying redundant storage, streamlining data structures, and using efficient algorithms will help in scaling up the computations while maintaining or even enhancing performance. This task is critical for expanding the capacity of the system to handle larger and more complex datasets.

  • Custom Input Data Management System Objective: Develop a tailored system for managing input data effectively.

    Explanation: A customized data management framework is needed to better organize, preprocess, and handle input data efficiently. This may involve designing workflows for data formatting, conversion, and pre-processing steps, ensuring the consistency and integrity of input data. Such a system will provide flexibility in managing various input types and scales, and it will be crucial for maintaining control over data quality throughout the project lifecycle.

  • Enhanced Gradient Processing Objective: Implement advanced techniques for gradient handling.

    Explanation: Developing a sophisticated gradient processing strategy will improve the inversion results by ensuring that gradients are effectively utilized and interpreted. This may include techniques such as gradient clipping, adaptive learning rates, and noise reduction methods to enhance the stability and convergence of the optimization process, ultimately leading to more accurate inversion outcomes.

  • Multi-Scale Inversion Strategies (done!) Objective: Introduce multi-scale approaches for improved inversion accuracy.

    Explanation: Multi-scale inversion involves processing data at various scales to capture both large-scale trends and small-scale features effectively. Implementing this strategy will enhance the robustness of the inversion process, allowing for better resolution of subsurface structures. Techniques such as hierarchical modeling and wavelet analysis may be considered to achieve this goal, thus improving the overall quality of the inversion results.

  • Real Data Testing Objective: Evaluate the performance and robustness of the developed methodologies using real-world datasets. Explanation: Conducting tests with actual data is crucial for validating the effectiveness of the proposed algorithms. This will involve the following steps:
    1. Dataset Selection: Identify relevant real-world datasets that reflect the complexities of the target applications. These datasets should include diverse scenarios and noise characteristics typical in field data.

    2. Preprocessing: Apply necessary preprocessing techniques to ensure data quality and consistency. This may include data normalization, filtering, and handling missing or corrupted values.

    3. Implementation: Utilize the developed algorithms on the selected datasets, monitoring their performance metrics such as accuracy, computational efficiency, and convergence behavior.

    4. Comparison: Compare the results obtained from the proposed methods against established benchmarks or existing methodologies to assess improvements.

    5. Analysis: Analyze the outcomes to identify strengths and weaknesses, and document any discrepancies or unexpected behaviors. This analysis will help refine the algorithms and inform future iterations.

    6. Reporting: Summarize the findings in a comprehensive report, detailing the testing procedures, results, and any implications for future work.

    This actual data testing phase is essential for ensuring that the developed methodologies not only perform well in controlled environments but also translate effectively to real-world applications. It serves as a critical validation step before broader deployment and adoption.


πŸ”° Contact

Developed by Feng Liu at the University of Science and Technology of China (USTC) and Shanghai Jiao Tong University (SJTU).

The related paper Automatic Differentiation-based Full Waveform Inversion with Flexible Workflows is currently in preparation.

For any inquiries, please contact Liu Feng via email at: [email protected] or [email protected].

@software{ADFWI_LiuFeng_2024,
  author  = {Feng Liu and Haipeng Li and GuangYuan Zou and Junlun Li},
  title   = {ADFWI},
  month   = jul,
  year    = {2024},
  version = {v1.0},
}