Exercism has many exercises in many languages. There is a common pool of problem definitions that can be implemented in any of the language tracks.
The language-agnostic definition lives in the `exercises/` directory of the problem-specifications repository:

```
exercises/$SLUG/
├── canonical-data.json (OPTIONAL)
├── description.md
└── metadata.yml
```
The shared data will always have a `description.md` file, which contains the basic problem specification and is later used to generate the README for the exercise. The `metadata.yml` file contains a summary (or "blurb") of the exercise, as well as attribution and some other handy things that get used in various places.
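For illustration, the `metadata.yml` for a hypothetical "two-fer" exercise might look roughly like this; the field names below follow the common blurb/attribution pattern, but treat the exact schema as an assumption and check existing files in the repository:

```yaml
# Hypothetical metadata.yml for a "two-fer" exercise (illustrative fields)
blurb: "Create a sentence of the form \"One for X, one for me.\""
source: "An example exercise used throughout this guide."
source_url: "https://github.com/exercism/problem-specifications"
```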
For some exercises we have defined a `canonical-data.json` file. If this exists, you won't need to go look at implementations in other languages: it contains the implementation notes and the test inputs and outputs that you'll need to create a test suite in your target language.
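As a sketch, the canonical data for the hypothetical "two-fer" exercise has roughly this shape; the exact schema is defined in the problem-specifications repository, so the fields shown here are illustrative:

```json
{
  "exercise": "two-fer",
  "cases": [
    {
      "description": "no name given",
      "property": "twoFer",
      "input": { "name": null },
      "expected": "One for you, one for me."
    },
    {
      "description": "a name given",
      "property": "twoFer",
      "input": { "name": "Alice" },
      "expected": "One for Alice, one for me."
    }
  ]
}
```

Each entry in `cases` typically becomes one test in your track's test suite.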
Navigate to the language track on Exercism via the https://tracks.exercism.io page and select the language of interest. You will see a table with multiple tabs; click the tab labelled Unimplemented.
This is a full list of every exercise that exists but has not yet been implemented in that language track.
For every exercise it will link to:
- The generic README.
- The canonical data (if it exists).
- All the individual language implementations.
You'll need to find the slug for the exercise. That's the URL-friendly identifier for an exercise that is used throughout all of Exercism. The name of the exercise directory in the problem-specifications repository is the slug.

Create a new directory named after the slug in the `exercises` directory of the language track repository.
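For the hypothetical slug "two-fer", that step is just the following, run from the root of the track repository:

```shell
# "two-fer" is a hypothetical slug; substitute the real one.
mkdir -p exercises/two-fer
```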
The exercise should consist of, at minimum:
- A test suite.
- A reference solution that passes the tests.
Each language track might have additional requirements; check the README in the repository for the track.
Look at the other exercises in the track to get an idea of:
- The conventions used.
- How to name the test file.
The reference solution is named something with `example` or `Example` in the path. Again, check the other exercises to see what the naming pattern is.
The solution does not need to be particularly great code; it is only used to verify that the exercise is coherent.
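To make this concrete, here is what the pair might look like for a Python track and the hypothetical "two-fer" exercise. The file names, the `two_fer` function, and the layout are all assumptions for illustration; follow your track's actual conventions:

```python
# example.py -- reference solution; it only needs to pass the tests,
# it does not need to be exemplary code.
def two_fer(name="you"):
    return f"One for {name}, one for me."


# two_fer_test.py -- test suite, typically derived from canonical-data.json
import unittest


class TwoFerTest(unittest.TestCase):
    def test_no_name_given(self):
        self.assertEqual(two_fer(), "One for you, one for me.")

    def test_a_name_given(self):
        self.assertEqual(two_fer("Alice"), "One for Alice, one for me.")
```

In a real track these would be two separate files, with the test file importing the student's solution rather than the reference solution.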
Add a new entry under the `exercises` key in the `config.json` file in the root of the repository.
See the exercise configuration document for a detailed description of what you need to add to this entry.
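An entry often looks roughly like the following; the exact fields vary between tracks and config versions, so the ones shown here (`uuid`, `difficulty`, `topics`, and the placeholder UUID) are illustrative assumptions — the exercise configuration document is authoritative:

```json
{
  "slug": "two-fer",
  "uuid": "00000000-0000-0000-0000-000000000000",
  "core": false,
  "unlocked_by": null,
  "difficulty": 1,
  "topics": ["strings"]
}
```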
We have a tool, configlet, that will help make sure everything is configured correctly. Download it and run it against the track repository.
Rebase against the most recent upstream master, and submit a pull request.
If you have questions your best bet is the support chat. Once you figure it out, if you could help improve this documentation, that would be great!
Once you've created an exercise, you'll probably want to provide feedback to people who submit solutions to it. By default you only get access to exercises you've submitted a solution for.
You can fetch the problem using the CLI:

```
$ exercism fetch <track_id> <slug>
```
Go ahead and submit the reference solution that you wrote when creating the problem. Remember to archive it if you don't want other people to comment on it.