This is a community-driven project and everyone is welcome to contribute.
The project is hosted at the PyGMT GitHub repository.
The goal is to maintain a diverse community that's pleasant for everyone. Please be considerate and respectful of others. Everyone must abide by our Code of Conduct and we encourage all to read it carefully.
- Tackle any issue that you wish! Some issues are labeled as "good first issue" to indicate that they are beginner friendly, meaning that they don't require extensive knowledge of the project.
- Make a tutorial or gallery example of how to do something.
- Improve the API documentation.
- Contribute code! This can be code that you already have and it doesn't need to be perfect! We will help you clean things up, test it, etc.
- Provide feedback about how we can improve the project or about your particular use case. Open an issue with a feature request or bug report, or post general comments/questions on the forum.
- Help triage issues, or give a "thumbs up" on issues that others reported which are relevant to you.
- Participate and answer questions on the PyGMT forum Q&A.
- Participate in discussions at the quarterly PyGMT Community Meetings, which are announced on the forum governance page.
- Cite PyGMT when using the project.
- Spread the word about PyGMT or star the project!
- Find the Issues tab on the top of the GitHub repository and click New issue.
- Click on Get started next to Bug report.
- Please try to fill out the template with as much detail as you can.
- After submitting your bug report, try to answer any follow-up questions about the bug as best as you can.
If you are aware that a bug is caused by an upstream GMT issue rather than a PyGMT-specific issue, you can optionally take the following steps to help resolve the problem:
- Add the line pygmt.config(GMT_VERBOSE="d") after your import statements, which will report the equivalent GMT commands as one of the debug messages (see the example script after this list).
- Either append all messages from running your script to your GitHub issue, or filter the messages to include only the GMT-equivalent commands using a command such as:
  python <test>.py 2>&1 | awk -F': ' '$2=="GMT_Call_Command string" {print $3}'
  where <test> is the name of your test script.
- If the bug is produced when passing an in-memory data object (e.g., a pandas.DataFrame or xarray.DataArray) to a PyGMT function, try writing the data to a file (e.g., a netCDF or ASCII txt file) and passing the data file to the PyGMT function instead. In the GitHub issue, please share the results for both cases along with your code.
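For example, a minimal debugging script could look like the sketch below; the plotting commands are placeholders, so substitute the code that actually triggers the bug:

import pygmt

# Report the equivalent GMT commands among the debug messages
pygmt.config(GMT_VERBOSE="d")

# Replace the following lines with the code that reproduces the bug
fig = pygmt.Figure()
fig.basemap(region=[0, 10, 0, 10], projection="X10c", frame=True)
fig.show()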
- Find the Issues tab on the top of the GitHub repository and click New issue.
- Click on Get started next to Feature request.
- Please try to fill out the template with as much detail as you can.
- After submitting your feature request, try to answer any follow-up questions as best as you can.
There are several pages on the Community Forum where you can submit general comments and/or questions:
- For questions about using PyGMT, select New Topic from the PyGMT Q&A Page.
- For general comments, select New Topic from the Lounge Page.
- To share your work, select New Topic from the Showcase Page.
Please take a look at these resources to learn about Git and pull requests (don't hesitate to ask questions):
- How to Contribute to Open Source.
- Git Workflow Tutorial by Aaron Meurer.
- How to Contribute to an Open Source Project on GitHub.
Discussion often happens on GitHub issues and pull requests. In addition, there is a Discourse forum for the project where you can ask questions.
We follow the git pull request workflow to make changes to our codebase. Every change made goes through a pull request, even our own, so that our continuous integration services have a chance to check that the code is up to standards and passes all our tests. This way, the main branch is always stable.
- What should be included in a PR
- Have a quick look at the titles of all the existing issues first. If there is already an issue that matches your PR, leave a comment there to let us know what you plan to do. Otherwise, open an issue describing what you want to do.
- Each pull request should consist of a small and logical collection of changes; larger changes should be broken down into smaller parts and integrated separately.
- Bug fixes should be submitted in separate PRs.
- How to write and submit a PR
- Use underscores for all Python (*.py) files as per PEP8, not hyphens. Directory names should also use underscores instead of hyphens.
- Describe what your PR changes and why this is a good thing. Be as specific as you can. The PR description is how we keep track of the changes made to the project over time.
- Do not commit changes to files that are irrelevant to your feature or bugfix (e.g., .gitignore, IDE project files, etc.).
- Write descriptive commit messages. Chris Beams has written a guide on how to write good commit messages.
- PR review
- Be willing to accept criticism and work on improving your code; we don't want to break other users' code, so care must be taken not to introduce bugs.
- Be aware that the pull request review process is not immediate, and is generally proportional to the size of the pull request.
After you've submitted a pull request, you should expect to hear at least a comment within a couple of days. We may suggest some changes, improvements or alternative implementation details.
To increase the chances of getting your pull request accepted quickly, try to:
- Submit a friendly PR
- Write a good and detailed description of what the PR does.
- Write some documentation for your code (docstrings) and leave comments explaining the reason behind non-obvious things.
- Write tests for the code you wrote/modified if needed. Please refer to Testing your code or Testing plots.
- Include an example of new features in the gallery or tutorials. Please refer to Gallery plots or Tutorials. If adding a new method/function/class, the gallery example or tutorial should be submitted in a separate pull request.
- Have a good coding style
- Use readable code, as it is better than clever code (even with comments).
- Follow the PEP8 style guide for code and the NumPy style guide for docstrings. Please refer to Code style.
Pull requests will automatically have tests run by GitHub Actions. This includes running both the unit tests as well as code linters. GitHub will show the status of these checks on the pull request. Try to get them all passing (green). If you have any trouble, leave a comment in the PR or get in touch.
These steps for setting up your environment are necessary for editing the documentation locally and contributing code. A local PyGMT development environment is not needed for editing the documentation on GitHub.
We highly recommend using Miniforge
and the mamba
package manager to install and manage your Python packages.
It will make your life a lot easier!
The repository includes a virtual environment file environment.yml
with the
specification for all development requirements to build and test the project.
In particular, these are some of the key development dependencies you will need
to install to build the documentation and run the unit tests locally:
- git (for cloning the repo and tracking changes in code)
- dvc (for downloading baseline images used in tests)
- pytest-mpl (for checking that generated plots match the baseline)
- sphinx-gallery (for building the gallery example page)
See the environment.yml
file for the full list of dependencies and the environment name (pygmt
).
Once you have forked and cloned the repository to your local machine, you can
use this file to create an isolated environment on which you can work.
Run the following on the base of the repository to create a new conda
environment from the environment.yml
file:
mamba env create --file environment.yml
Before building and testing the project, you have to activate the environment (you'll need to do this every time you start a new terminal):
mamba activate pygmt
We have a Makefile
that provides commands for installing, running the tests and coverage analysis,
running linters, etc. If you don't want to use make
, open the Makefile
and
copy the commands you want to run.
To install the current source code into your testing environment, run:
make install # on Linux/macOS
python -m pip install --no-deps -e . # on Windows
This installs your project in editable mode, meaning that changes made to the source code will be available when you import the package (even if you're on a different directory).
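A quick way to check that the editable install works is to print the version information from Python:

import pygmt

pygmt.show_versions()  # prints the PyGMT, GMT, and dependency versions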
There are four main components to PyGMT's documentation:
- Gallery examples, with source code in Python *.py files under the examples/gallery/ folder.
- Tutorial examples, with source code in Python *.py files under the examples/tutorials/ folder.
- API documentation, with source code in the docstrings in Python *.py files under the pygmt/src/ and pygmt/datasets/ folders.
- Getting started/developer documentation, with source text in ReST *.rst and markdown *.md files under the doc/ folder.
The documentation is written primarily in reStructuredText and built by Sphinx. Please refer to the reStructuredText Cheatsheet in the GMT documentation (devdocs/rst-cheatsheet.html) if you are new to reStructuredText. When contributing documentation, be sure to follow the general guidelines in the pull request workflow section.
There are two primary ways to edit the PyGMT documentation:
- For simple documentation changes, you can easily edit the documentation on GitHub. This only requires you to have a GitHub account.
- For more complicated changes, you can edit the documentation locally. In order to build the documentation locally, you first need to set up your environment.
If you're browsing the documentation and notice a typo or something that could be improved, please consider letting us know by creating an issue or (even better) submitting a fix.
You can submit fixes to the documentation pages completely online without having to download and install anything:
- On each documentation page, there should be an "Improve This Page" link at the very top.
- Click on that link to open the respective source file (usually an .rst file in the doc/ folder or a .py file in the examples/ folder) on GitHub for editing online (you'll need a GitHub account).
- Make your desired changes.
- When you're done, scroll to the bottom of the page.
- Fill out the two fields under "Commit changes": the first is a short title describing your fixes; the second is a more detailed description of the changes. Try to be as detailed as possible and describe why you changed something.
- Choose "Create a new branch for this commit and start a pull request" and click on the "Propose changes" button to open a pull request.
- The pull request will run the GMT automated tests and make a preview deployment. You can see how your change looks in the PyGMT documentation by clicking the "Details" button of the "docs/readthedocs.org:pygmt-dev" status check, after the building has finished (usually 10-15 minutes after the pull request was created).
- We'll review your pull request, recommend changes if necessary, and then merge them in if everything is OK.
- Done!
Alternatively, you can make the changes offline to the files in the doc
folder or the
example scripts. See editing the documentation locally
for instructions.
For more extensive changes, you can edit the documentation in your cloned repository and build the documentation to preview changes before submitting a pull request. First, follow the setting up your environment instructions. After making your changes, you can build the HTML files from sources using:
cd doc
make all
This will build the HTML files in doc/_build/html
.
Open doc/_build/html/index.html
in your browser to view the pages. Follow the
pull request workflow to submit your changes for review.
Many of the PyGMT functions have example code in their documentation. To contribute an
example, add an "Example" header and put the example code below it. Have all lines
begin with >>>
. To keep this example code from being run during testing, add the code
__doctest_skip__ = ["function_name"]
to the top of the module.
Inline code example
Below the import statements at the top of the file:
__doctest_skip__ = ["function_name"]
At the end of the function's docstring:
Example
-------
>>> import pygmt
>>> # Comment describing what is happening
>>> Code example
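For instance, a filled-in "Example" section might look like the following (the code here is illustrative only):

Example
-------
>>> import pygmt
>>> # Create a global basemap figure
>>> fig = pygmt.Figure()
>>> fig.basemap(region="g", projection="W15c", frame=True)
>>> fig.show()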
The gallery and tutorials are managed by
sphinx-gallery.
The source files for the example gallery are .py
scripts in examples/gallery/
that
generate one or more figures. They are executed automatically by sphinx-gallery when
the documentation is built. The output is gathered and
assembled into the gallery.
You can add a new plot by placing a new .py
file in one of the folders inside the
examples/gallery
folder of the repository. See the other examples to get an idea for the
format.
General guidelines for making a good gallery plot:
- Examples should highlight a single feature/command. Good: how to add a label to a colorbar. Bad: how to add a label to the colorbar and use two different CPTs and use subplots.
- Try to make the example as simple as possible. Good: use only commands that are required to show the feature you want to highlight. Bad: use advanced/complex Python features to make the code smaller.
- Use a sample dataset from pygmt.datasets if you need to plot data. If a suitable dataset isn't available, open an issue requesting one and we'll work together to add it.
- Add comments to explain things that aren't obvious from reading the code. Good: Use a Mercator projection and make the plot 15 centimeters wide. Bad: Draw coastlines and plot the data.
- Describe the feature that you're showcasing and link to other relevant parts of the documentation.
- SI units should be used in the example code for gallery plots.
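As a rough sketch, a new gallery example file might be structured like this (the title, text, and plotting commands below are illustrative only, not an existing example):

"""
Coastlines
==========

A short description of the single feature that the example highlights.
"""

# %%
import pygmt

fig = pygmt.Figure()
# Use a Mollweide projection and make the plot 15 centimeters wide
fig.basemap(region="g", projection="W15c", frame=True)
fig.coast(land="gray", water="lightblue")
fig.show()

The # %% line is a code block separator; see the code style section below for the conventions around it.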
The tutorials (the User Guide in the docs) are also built by sphinx-gallery from the
.py
files in the examples/tutorials
folder of the repository. To add a new tutorial:
- Create a .py file in the examples/tutorials/advanced folder.
- Write the tutorial in "notebook" style with code mixed with paragraphs explaining what is being done. See the other tutorials for the format.
- Choose the most representative figure as the thumbnail figure by adding the comment line # sphinx_gallery_thumbnail_number = <fig_number> at the end of the tutorial. The fig_number starts from 1.
Guidelines for a good tutorial:
- Each tutorial should focus on a particular set of tasks that a user might want to accomplish: plotting grids, interpolation, configuring the frame, projections, etc.
- The tutorial code should be as simple as possible. Avoid using advanced/complex Python features or abbreviations.
- Explain the options and features in as much detail as possible. The gallery has concise examples while the tutorials are detailed and full of text.
- SI units should be used in the example code for tutorial plots.
Note that the pygmt.Figure.show
method needs to be called for a plot
to be inserted into the documentation.
The API documentation is built from the docstrings in the Python *.py
files under
the pygmt/src/
and pygmt/datasets/
folders. All docstrings should follow the
NumPy style guide.
All functions/classes/methods should have docstrings with a full description of all
arguments and return values.
While the maximum line length for code is automatically set by ruff, docstrings must be formatted manually. To play nicely with Jupyter and IPython, keep docstrings limited to 88 characters per line.
When editing documentation, use the following standards to demonstrate the example code:
- Python arguments, such as import statements, Boolean expressions, and function arguments, should be wrapped as code by using `` on both sides of the code. Examples: ``import pygmt`` results in import pygmt, ``True`` results in True, ``style="v"`` results in style="v".
- Literal GMT arguments should be bold by wrapping the arguments with ** (two asterisks) on both sides. The argument description should be italicized with * (a single asterisk) on both sides. Examples: **+l**\ *label* results in +llabel, **05m** results in 05m.
- Optional arguments are wrapped with [ ] (square brackets).
- Arguments that are mutually exclusive are separated with a | (bar) to denote "or".
- Default arguments for parameters and configuration settings are wrapped with [ ] (square brackets) with the prefix "Default is". Example: [Default is p].
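Putting these conventions together, a parameter description in a docstring might look like this sketch (the function and parameter are made up for illustration):

def example_function(label=None):
    r"""
    Hypothetical docstring excerpt illustrating the formatting conventions above.

    Parameters
    ----------
    label : str
        Add a colorbar label, e.g. ``label="Elevation"``. In the underlying GMT
        syntax this corresponds to **+l**\ *label* [Default is no label].
    """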
The API reference is manually assembled in doc/api/index.rst
.
The autodoc sphinx extension will automatically create pages for each
function/class/module/method listed there.
You can reference functions, classes, modules, and methods from anywhere (including docstrings) using:
:func:`package.module.function`
:class:`package.module.class`
:meth:`package.module.method`
:mod:`package.module`
An example would be to use
:meth:`pygmt.Figure.grdview`
to link
to https://www.pygmt.org/latest/api/generated/pygmt.Figure.grdview.html.
PyGMT documentation that is not a class, method,
or module can be linked with :doc:`Any Link Text </path/to/the/file>`
.
For example, :doc:`Install instructions </install>`
links
to https://www.pygmt.org/latest/install.html.
Linking to the GMT documentation and GMT configuration parameters can be done using:
:gmt-docs:`page_name.html`
:gmt-term:`GMT_PARAMETER`
An example would be using :gmt-docs:`makecpt.html` to link to the makecpt page of the GMT documentation.
For GMT configuration parameters, an example is :gmt-term:`COLOR_FOREGROUND` to link to the COLOR_FOREGROUND entry in the GMT configuration documentation.
Sphinx will create a link to the automatically generated page for that function/class/module/method.
The source code for PyGMT is located in the pygmt/
directory. When contributing
code, please open an issue first to discuss the feature and its implementation
and be sure to follow the general guidelines in the
pull request workflow section.
We use the ruff tool to format the code, so we don't have to think about it. It loosely follows the PEP8 guide but with a few differences; regardless, you won't have to worry about formatting the code yourself. Before committing, run it to automatically format your code:
make format
For consistency, we also use pre-commit
hooks to enforce UNIX-style line endings
(\n
) and file permission 644 (-rw-r--r--
) throughout the whole project.
Don't worry if you forget to do it. Our continuous integration systems will
warn us and you can make a new commit with the formatted code.
Even better, you can just write /format
in the first line of any comment in a
pull request to lint the code automatically.
When wrapping a new alias, use an underscore to separate words bridged by vowels
(aeiou), such as no_skip
and z_only
. Do not use an underscore to separate
words bridged only by consonants, such as distcalc
, and crossprofile
. This
convention is not applied by the code checking tools, but the PyGMT maintainers
will comment on any pull requests as needed.
When working on a tutorial or a gallery plot, it is good practice to use code
block separators to split a long script into multiple blocks. The separators also
make it possible to run the script like a Jupyter notebook in some modern text
editors or IDEs. We consistently use # %%
as code block separators (please
refer to issue #2660
for the discussions) and require at least one separator in all example files.
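For instance, a script split into two blocks would look like this (the plotting code itself is illustrative):

# %%
import pygmt

fig = pygmt.Figure()
fig.basemap(region=[0, 10, 0, 10], projection="X10c", frame=True)

# %%
fig.plot(x=[2, 5, 8], y=[3, 7, 4], style="c0.3c", fill="red")
fig.show()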
We also use ruff to check the quality of the code and quickly catch common errors.
The Makefile
contains rules for running the linter checks:
make check # Runs ruff in check mode
Wrapping a new GMT module in PyGMT is usually a big task, which will progress quicker and smoother if done in small, manageable chunks. This section gives an overview of the specific tasks involved in wrapping a new module.
- Create a 'Wrapper for <module-name>' feature request issue.
- Open a 'Wrap <module-name>' initial feature implementation PR.
- Open an 'Add missing aliases to <module-name>' documentation PR.
- Open a 'support additional functionality in module' PR (optional).
- Add 'Gallery example for module' documentation PR.
- Add 'Tutorial for module' documentation PR (optional).
These steps will be tracked in the 'Wrapper for <module-name>
' issue and the
'wrapping GMT modules'
project board. The pull requests can be split between multiple contributors and
there is no obligation for a single contributor to complete all steps. Please
comment on the initial 'Wrapper for <module-name>' issue if you would like to open a pull request for any of these tasks to avoid redundant efforts.
- Find the Issues tab on the top of the GitHub repository and click New Issue.
- Click on Get started next to Feature request - Wrap new GMT module.
- Follow the prompts for filling out the issue template.
First, comment on the 'Wrapper for <module-name>
' issue that you will be
working on the initial feature implementation. This first pull request should
be as minimal as possible - only adding the required functionality (i.e.,
wrapping the required GMT arguments and supporting the primary input/output
types).
The following steps are common to all initial implementation pull requests that wrap a new GMT module (e.g., initial grdfill implementation):
- Create a new module <module-name>.py in pygmt/src. The module docstring should include the module name and a short description of the functionality (e.g., grdfill - Fill blank areas from a grid.).
- Add a function <module-name> to the module. When writing the new function, it is generally easiest to reference the source code for other functions that input/output similar object types (see the structural sketch after this list).
- Add a detailed docstring following the NumPy style guide.
- Add the function to the import statements in pygmt/src/__init__.py.
- Add the function to the import statements in pygmt/__init__.py.
- Add the function to the appropriate section of the API documentation in doc/api/index.rst.
- Add a testing module test_<module-name>.py in pygmt/tests, following the guidelines in the testing your code section.
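As a very rough structural sketch (not a working implementation), a new module file could look like the following. The module and its option here are hypothetical, and the helper names and signatures shown (use_alias, fmt_docstring, build_arg_list, Session.virtualfile_in) are assumptions that change between PyGMT versions, so copy the exact pattern from an existing module in pygmt/src/ rather than from this sketch:

"""
exampleinfo - Report information about a data table (hypothetical module).
"""
from pygmt.clib import Session
from pygmt.helpers import build_arg_list, fmt_docstring, use_alias


@fmt_docstring
@use_alias(V="verbose")
def exampleinfo(data, **kwargs):
    r"""
    Report information about a data table (hypothetical module).

    {aliases}

    Parameters
    ----------
    data
        The input data table (e.g., a file name or a pandas.DataFrame).
    {verbose}
    """
    with Session() as lib:
        with lib.virtualfile_in(data=data) as vintbl:
            lib.call_module(module="exampleinfo", args=build_arg_list(kwargs, infile=vintbl))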
After the initial implementation, the missing aliases can be added in a separate PR (e.g., add missing aliases to grd2xyz).
- Select a suitable alias for each GMT option, following the guidelines in the code style section. Before creating a new alias, check if the parameter is listed in the COMMON_DOCSTRINGS dictionary in pygmt/helpers/decorators.py, if other wrapped GMT modules have a similar parameter, and if GMT.jl has defined an alias.
- Update the use_alias decorator for the <module-name> function using the GMT option as the parameter and the alias as the argument.
- Add the alias and description to the parameters section of the docstring, using the fmt_docstring decorator to add descriptions for parameters included in the COMMON_DOCSTRINGS dictionary.
Automated testing helps ensure that our code is as free of bugs as it can be. It also lets us know immediately if a change we make breaks any other part of the code.
All of our test code and data are stored in the tests
subpackage.
We use the pytest framework to run the test suite.
Please write tests for your code so that we can be sure that it won't break any of the existing functionality. Tests also help us be confident that we won't break your code in the future.
When writing tests, don't test everything that the GMT function already tests, such as every unique combination of arguments. An exception to this would be the most popular methods, such as pygmt.Figure.plot and pygmt.Figure.basemap.
The highest priority for tests should be the Python-specific code, such as numpy,
pandas, and xarray objects and the virtualfile mechanism.
If you're new to testing, see existing test files for examples of things to do. Don't let the tests keep you from submitting your contribution! If you're not sure how to do this or are having trouble, submit your pull request anyway. We will help you create the tests and sort out any kind of problem during code review.
Pull the baseline images, run the tests, and calculate test coverage using:
dvc status # should report any files 'not_in_cache'
dvc pull # pull down files from DVC remote cache (fetch + checkout)
make test
The coverage report will let you know which lines of code are touched by the tests.
If all the tests pass, you can view the coverage reports by opening htmlcov/index.html
in your browser. Strive to get 100% coverage for the lines you changed.
It's OK if you can't or don't know how to test something.
Leave a comment in the PR and we'll help you out.
You can also run tests in just one test script using:
pytest pygmt/tests/NAME_OF_TEST_FILE.py
or run tests which contain names that match a specific keyword expression:
pytest -k KEYWORD pygmt/tests
Writing an image-based test is only slightly more difficult than a simple test.
The main consideration is that you must specify the "baseline" or reference
image, and compare it with a "generated" or test image. This is handled using
the decorator functions @pytest.mark.mpl_image_compare
and @check_figures_equal
whose usage is further described below.
This is the preferred way to test plots whenever possible.
This method uses the pytest-mpl
plug-in to test plot generating code.
Every time the tests are run, pytest-mpl
compares the generated plots with known
correct ones stored in pygmt/tests/baseline
.
If your test created a pygmt.Figure
object, you can test it by adding a decorator and
returning the pygmt.Figure
object:
import pytest
from pygmt import Figure


@pytest.mark.mpl_image_compare
def test_my_plotting_case():
    """
    Test that my plotting method works.
    """
    fig = Figure()
    fig.basemap(region=[0, 360, -90, 90], projection="W15c", frame=True)
    return fig
Your test function must return the pygmt.Figure
object and you can only
test one figure per function.
Before you can run your test, you'll need to generate a baseline (a correct version) of your plot. Run the following from the repository root:
pytest --mpl-generate-path=baseline pygmt/tests/NAME_OF_TEST_FILE.py
This will create a baseline
folder with all the plots generated in your test
file.
Visually inspect the one corresponding to your test function.
If it's correct, copy it (and only it) to pygmt/tests/baseline
.
When you run make test
the next time, your test should be executed and
passing.
Don't forget to commit the baseline image as well!
The images should be pushed up into a remote repository using dvc
(instead of
git
) as will be explained in the next section.
Using Data Version Control (dvc) to Manage Test Images
As the baseline images are quite large blob files that can change often (e.g.
with new GMT versions), it is not ideal to store them in git
(which is meant
for tracking plain text files). Instead, we will use dvc
which is like git
but for data. What dvc
does is to store the hash (md5sum)
of a file. For example, given an image file like test_logo.png
, dvc
will
generate a test_logo.png.dvc
plain text file containing the hash of the
image. This test_logo.png.dvc
file can be stored as usual on GitHub, while
the test_logo.png
file can be stored separately on our dvc
remote at
https://dagshub.com/GenericMappingTools/pygmt.
To pull or sync files from the dvc
remote to your local repository, use
the commands below. Note how dvc
commands are very similar to git
.
dvc status # should report any files 'not_in_cache'
dvc pull # pull down files from DVC remote cache (fetch + checkout)
Once the sync/download is complete, you should notice two things. There will be
images stored in the pygmt/tests/baseline
folder (e.g. test_logo.png
) and
these images are technically reflinks/symlinks/copies of the files under the
.dvc/cache
folder. You can now run the image comparison test suite as per
usual.
pytest pygmt/tests/test_logo.py # run only one test
make test # run the entire test suite
To push or sync changes from your local repository up to the dvc
remote
at DAGsHub, you will first need to set up authentication using the commands
below. This only needs to be done once, i.e. the first time you contribute a
test image to the PyGMT project.
dvc remote modify upstream --local auth basic
dvc remote modify upstream --local user "$DAGSHUB_USER"
dvc remote modify upstream --local password "$DAGSHUB_PASS"
The configuration will be stored inside your .dvc/config.local
file. Note
that the $DAGSHUB_PASS token can be generated at
https://dagshub.com/user/settings/tokens
after creating a DAGsHub account (can be linked to your GitHub account). Once
you have an account set up, please ask one of the PyGMT maintainers to add you
as a collaborator at
https://dagshub.com/GenericMappingTools/pygmt/settings/collaboration
before proceeding with the next steps.
The entire workflow for generating or modifying baseline test images can be summarized as follows:
# Sync with both git and dvc remotes
git pull
dvc pull
# Generate new baseline images
pytest --mpl-generate-path=baseline pygmt/tests/test_logo.py
mv baseline/*.png pygmt/tests/baseline/
# Generate hash for baseline image and stage the *.dvc file in git
dvc status # Check which files need to be added to dvc
dvc add pygmt/tests/baseline/test_logo.png
git add pygmt/tests/baseline/test_logo.png.dvc
# Commit changes and push to both the git and dvc remotes
git commit -m "Add test_logo.png into DVC"
dvc status --remote upstream # Report which files will be pushed to the dvc remote
dvc push # Run before git push to enable automated testing with the new images
git push
This approach draws the same figure using two different methods (the reference
method and the tested method), and checks that both of them are the same.
It takes two pygmt.Figure
objects (fig_ref
and fig_test
), generates a png
image, and checks for the Root Mean Square (RMS) error between the two.
Here's an example:
from pygmt import Figure
from pygmt.helpers.testing import check_figures_equal


@check_figures_equal()
def test_my_plotting_case():
    """
    Test that my plotting method works.
    """
    # Here "grid" is an in-memory grid, e.g. an xarray.DataArray
    fig_ref, fig_test = Figure(), Figure()
    fig_ref.grdimage("@earth_relief_01d_g", projection="W120/15c", cmap="geo")
    fig_test.grdimage(grid, projection="W120/15c", cmap="geo")
    return fig_ref, fig_test