
Give an example of transientErrors(true) in refdoc
simonbasle committed Mar 10, 2020
1 parent d418857 commit ccda4b3
Showing 6 changed files with 544 additions and 4 deletions.
49 changes: 49 additions & 0 deletions docs/asciidoc/coreFeatures.adoc
@@ -924,6 +924,55 @@ well as more finely tuned retry strategies: `errorFlux.retryWhen(Retry.max(3));`
TIP: You can use similar code to implement an "`exponential backoff and retry`" pattern,
as shown in the <<faq.exponentialBackoff,FAQ>>.

The core-provided `Retry` builders, `RetrySpec` and `RetryBackoffSpec`, both allow advanced customizations like:

- setting the `filter(Predicate)` for the exceptions that can trigger a retry
- modifying such a previously set filter through `modifyErrorFilter(Function)`
- triggering a side effect such as logging around the retry trigger (i.e., for backoff, before and after the delay), provided the retry is validated (`doBeforeRetry()` and `doAfterRetry()` are additive)
- triggering an asynchronous `Mono<Void>` around the retry trigger, which lets you add asynchronous behavior on top of the base delay but thus further delays the trigger (`doBeforeRetryAsync` and `doAfterRetryAsync` are additive)
- customizing the exception in case the maximum number of attempts has been reached, through `onRetryExhaustedThrow(BiFunction)`.
By default, `Exceptions.retryExhausted(...)` is used, which can be distinguished with `Exceptions.isRetryExhausted(Throwable)`
- activating the handling of _transient errors_ (see below)

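For instance, the following sketch combines a few of these customizations on a backoff-based spec. It is not part of the reference example below: the `flakyConnection` source (assumed to be a `Flux<String>`) and the chosen thresholds are illustrative, and the usual `reactor.util.retry.Retry`, `java.time.Duration`, and `java.io.IOException` imports are assumed.

[source,java]
----
Retry retrySpec = Retry.backoff(3, Duration.ofMillis(100))
    .filter(error -> error instanceof IOException) // only I/O errors trigger a retry
    .doBeforeRetry(signal ->
        System.out.println("retrying, last failure: " + signal.failure()))
    .onRetryExhaustedThrow((spec, signal) ->
        new IllegalStateException("still failing after " + signal.totalRetries() + " retries",
            signal.failure()));

Flux<String> resilient = flakyConnection.retryWhen(retrySpec);
----
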
Transient error handling in the `Retry` specs makes use of `RetrySignal#totalRetriesInARow()`: when checking whether to retry and when computing the retry delays, the index used is an alternative one that is reset to 0 each time an `onNext` is emitted.
This has the consequence that if a re-subscribed source generates some data before failing again, previous failures don't count toward the maximum number of retry attempts.
In the case of an exponential backoff strategy, this also means that the next attempt will be back to using the minimum `Duration` backoff instead of a longer one.
This can be especially useful for long-lived sources that see sporadic bursts of errors (or _transient_ errors), where each burst should be retried with its own backoff.

====
[source,java]
----
AtomicInteger errorCount = new AtomicInteger(); // <1>
AtomicInteger transientHelper = new AtomicInteger();
Flux<Integer> transientFlux = Flux.<Integer>generate(sink -> {
    int i = transientHelper.getAndIncrement();
    if (i == 10) { // <2>
        sink.next(i);
        sink.complete();
    }
    else if (i % 3 == 0) { // <3>
        sink.next(i);
    }
    else {
        sink.error(new IllegalStateException("Transient error at " + i)); // <4>
    }
})
    .doOnError(e -> errorCount.incrementAndGet());

transientFlux.retryWhen(Retry.max(2).transientErrors(true)) // <5>
    .blockLast();
assertThat(errorCount).hasValue(6); // <6>
----
<1> We will count the number of errors in the retried sequence.
<2> We `generate` a source that has bursts of errors. It will successfully complete when the counter reaches 10.
<3> If the `transientHelper` atomic is at a multiple of `3`, we emit `onNext` and thus end the current burst.
<4> In other cases we emit an `onError`. That happens 2 out of 3 times, so we get bursts of 2 `onError` interrupted by 1 `onNext`.
<5> We use `retryWhen` on that source, configured for at most 2 retry attempts, but in `transientErrors` mode.
<6> At the end, the sequence reaches `onNext(10)` and completes, after `6` errors have been registered in `errorCount`.
====

Without `transientErrors(true)`, the configured maximum of `2` attempts would be reached by the second burst, and the sequence would fail after having emitted `onNext(3)`.
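
A quick way to observe that difference is shown below. This is a hypothetical check, not part of the reference documentation: it assumes the `errorCount` and `transientHelper` counters have been reset to zero and uses `StepVerifier` from `reactor-test`.

[source,java]
----
// same generator as above, but with a plain Retry.max(2): the third error (at i == 4)
// exceeds the maximum, so the sequence fails right after onNext(3)
StepVerifier.create(transientFlux.retryWhen(Retry.max(2)))
    .expectNext(0, 3)
    .expectErrorMatches(Exceptions::isRetryExhausted)
    .verify();
----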

=== Handling Exceptions in Operators or Functions

In general, all operators can themselves contain code that potentially trigger an
@@ -7261,7 +7261,7 @@ public final Flux<T> retryWhen(Function<Flux<Throwable>, ? extends Publisher<?>>
* Terminal signals in the companion terminate the sequence with the same signal, so emitting an {@link Subscriber#onError(Throwable)}
* will fail the resulting {@link Flux} with that same error.
* <p>
- * <img class="marble" src="doc-files/marbles/retryWhenForFlux.svg" alt="">
+ * <img class="marble" src="doc-files/marbles/retryWhenSpecForFlux.svg" alt="">
* <p>
* Note that the {@link Retry.RetrySignal} state can be transient and change between each source
* {@link org.reactivestreams.Subscriber#onError(Throwable) onError} or
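
The companion-driven contract described in this Javadoc can be sketched as follows. This is an illustrative snippet, not part of this commit; `source` is an assumed `Flux<String>`. An `onNext` from the companion triggers a re-subscription, while an error raised in the companion terminates the resulting `Flux` with that error.

[source,java]
----
Flux<String> retried = source.retryWhen(Retry.from(companion ->
    companion.map(signal -> {
        if (signal.totalRetries() < 3) {
            return signal.totalRetries(); // emitting a value triggers a retry
        }
        // re-throwing in the companion fails the resulting Flux with that error
        throw Exceptions.propagate(signal.failure());
    })));
----
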
@@ -3721,7 +3721,7 @@ public final Mono<T> retryWhen(Function<Flux<Throwable>, ? extends Publisher<?>>
* Terminal signals in the companion terminate the sequence with the same signal, so emitting an {@link Subscriber#onError(Throwable)}
* will fail the resulting {@link Mono} with that same error.
* <p>
- * <img class="marble" src="doc-files/marbles/retryWhenForMono.svg" alt="">
+ * <img class="marble" src="doc-files/marbles/retryWhenSpecForMono.svg" alt="">
* <p>
* Note that the {@link Retry.RetrySignal} state can be transient and change between each source
* {@link org.reactivestreams.Subscriber#onError(Throwable) onError} or