
Pickler derivation #3134

Merged: 52 commits, Sep 19, 2023

Changes from 42 commits

Commits (52)
153acb0
Scaffolding for pickler derivation
kciesielski Aug 16, 2023
4a99a1f
Implement writer logic (without customizations)
kciesielski Aug 25, 2023
ed78b0b
Support readers
kciesielski Aug 25, 2023
899ecf5
Respect schema's encodedName in writers
kciesielski Aug 25, 2023
1306ae2
Derive schema for product inside pickler derivation
kciesielski Aug 28, 2023
c549f65
Support encodedName in readers
kciesielski Aug 28, 2023
f49baea
Fix writers for oneOfUsingField
kciesielski Aug 31, 2023
598a68a
Implement support for Readers for oneOfUsingField
kciesielski Sep 1, 2023
ebc489a
Initial support for enumerations
kciesielski Sep 1, 2023
6f94aff
Handle case objects consistently using discriminators
kciesielski Sep 4, 2023
d85c66d
Support enums
kciesielski Sep 4, 2023
84fc706
Implement support for @default
kciesielski Sep 6, 2023
313d5a6
Support Option[T]
kciesielski Sep 6, 2023
ebf4f70
Support for iterables
kciesielski Sep 8, 2023
f851300
Support Either (the uPickle way)
kciesielski Sep 8, 2023
9167e41
Support Map (excluding keys as value classes)
kciesielski Sep 8, 2023
48eed1e
Support Arrays
kciesielski Sep 8, 2023
435ba96
Use Scala 3 convention for wildcard imports
kciesielski Sep 8, 2023
0365da1
Support value classes
kciesielski Sep 9, 2023
9fc5c34
Move code to a dedicated module
kciesielski Sep 9, 2023
8a30e98
Rename scala-3 to scala
kciesielski Sep 9, 2023
b834153
Fix using default Pickler + some cleanup
kciesielski Sep 12, 2023
dc89ce8
Cleanup in tests
kciesielski Sep 12, 2023
0ba847c
More refactoring
kciesielski Sep 12, 2023
b7891dd
Migrate SchemaGenericAutoTest
kciesielski Sep 12, 2023
6b677df
Adjust handling of validateEach
kciesielski Sep 12, 2023
16d89f1
Add a comment about missing support for `@description`
kciesielski Sep 12, 2023
cc4be0d
Add API for jsonBody
kciesielski Sep 12, 2023
f86ab4a
Ensure support for `derives`
kciesielski Sep 12, 2023
1c57a0a
Build for ScalaJS
kciesielski Sep 13, 2023
2bc2dfd
Remove debug code
kciesielski Sep 13, 2023
9893845
Tune error message
kciesielski Sep 14, 2023
1b5b949
Put all into a `pickler` package
kciesielski Sep 14, 2023
2b9fbef
Improve errors for missing picklers
kciesielski Sep 14, 2023
5a4731c
Report Pickler summon failure for the last actual failed case
kciesielski Sep 14, 2023
34a7b03
Code comments and more package private restrictions
kciesielski Sep 14, 2023
3b4db81
More dependencies for examples3
kciesielski Sep 14, 2023
4492a98
Restore test for deriving schema for list
kciesielski Sep 14, 2023
1994cfa
Documentation
kciesielski Sep 14, 2023
c82fbca
Don't compile Scala 3 snippets
kciesielski Sep 15, 2023
d42ff75
Test for enums with fields and default derivation method
kciesielski Sep 15, 2023
ae6b7bf
Improve usage of SubtypeDiscriminator
kciesielski Sep 15, 2023
70bce41
Fix Scaladoc formatting
kciesielski Sep 15, 2023
f5879e7
Handle sealed hierarchies disguised as enums
kciesielski Sep 15, 2023
048aaec
Revert "Documentation"
kciesielski Sep 15, 2023
9e2d824
Recommit docs without autoformatting
kciesielski Sep 15, 2023
bfff425
Add support for java.math.BigDecimal and BigInteger
kciesielski Sep 15, 2023
f65fe75
Documentation improvements
adamw Sep 18, 2023
00ac6c2
Formatting
adamw Sep 18, 2023
989b080
Docs
adamw Sep 18, 2023
dc8f5b5
Docs
adamw Sep 18, 2023
b9ddf2a
Remove unused parameter
adamw Sep 18, 2023
1 change: 1 addition & 0 deletions .scalafix.conf
@@ -1,3 +1,4 @@
OrganizeImports {
groupedImports = Merge
removeUnused = false
}
21 changes: 19 additions & 2 deletions build.sbt
@@ -67,7 +67,7 @@ val commonSettings = commonSmlBuildSettings ++ ossPublishSettings ++ Seq(
}.value,
mimaPreviousArtifacts := Set.empty, // we only use MiMa for `core` for now, using enableMimaSettings
ideSkipProject := (scalaVersion.value == scala2_12) ||
(scalaVersion.value == scala3) ||
(scalaVersion.value == scala2_13) ||
thisProjectRef.value.project.contains("Native") ||
thisProjectRef.value.project.contains("JS"),
bspEnabled := !ideSkipProject.value,
@@ -179,6 +179,7 @@ lazy val rawAllAggregates = core.projectRefs ++
zioMetrics.projectRefs ++
json4s.projectRefs ++
playJson.projectRefs ++
picklerJson.projectRefs ++
sprayJson.projectRefs ++
uPickleJson.projectRefs ++
tethysJson.projectRefs ++
@@ -861,6 +862,19 @@ lazy val uPickleJson: ProjectMatrix = (projectMatrix in file("json/upickle"))
)
.dependsOn(core)

lazy val picklerJson: ProjectMatrix = (projectMatrix in file("json/pickler"))
.settings(commonSettings)
.settings(
name := "tapir-json-pickler",
libraryDependencies ++= Seq(
"com.lihaoyi" %%% "upickle" % Versions.upickle,
scalaTest.value % Test
)
)
.jvmPlatform(scalaVersions = List(scala3))
Comment (Member): can we support js as well? ;)

Comment (Member, Author): I removed js support and left it for later due to some missing types. However, now it seems to work, so I'll bring it back and we'll see on the CI.

.jsPlatform(scalaVersions = List(scala3))
.dependsOn(core % "compile->compile;test->test")
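Once published, the module declared above can be consumed from an end-user build. A sketch, assuming the group id used by the other tapir artifacts; the version string is a placeholder:

```scala
// sbt: depend on the new pickler module (Scala 3 only, JVM and JS,
// matching the jvmPlatform/jsPlatform declarations above).
// "<version>" is a placeholder for the tapir release being used.
libraryDependencies += "com.softwaremill.sttp.tapir" %% "tapir-json-pickler" % "<version>"
```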

lazy val tethysJson: ProjectMatrix = (projectMatrix in file("json/tethys"))
.settings(commonSettings)
.settings(
@@ -2043,9 +2057,12 @@ lazy val examples3: ProjectMatrix = (projectMatrix in file("examples3"))
)
.jvmPlatform(scalaVersions = List(scala3))
.dependsOn(
circeJson,
http4sServer,
nettyServer,
picklerJson,
sttpClient,
swaggerUiBundle,
circeJson
)

//TODO this should be invoked by compilation process, see #https://github.com/scalameta/mdoc/issues/355
1 change: 1 addition & 0 deletions core/src/main/scala-3/sttp/tapir/macros/SchemaMacros.scala
@@ -199,6 +199,7 @@ private[tapir] object SchemaCompanionMacros {
case Block(List(defdef), _) => resolveFunctionName(defdef)
case DefDef(_, _, _, Some(body)) => resolveFunctionName(body)
case Apply(fun, _) => resolveFunctionName(fun)
case Ident(str) => str
case Select(_, kind) => kind
}

55 changes: 32 additions & 23 deletions doc/endpoint/json.md
@@ -1,34 +1,40 @@
# Working with JSON

Json values are supported through codecs, which encode/decode values to json strings. Most often, you'll be using a
third-party library to perform the actual json parsing/printing. See below for the list of supported libraries.

All the integrations, when imported into scope, define `jsonBody[T]` and `jsonQuery[T]` methods.

Instead of providing the json codec as an implicit value, this method depends on library-specific implicits being in
scope, and based on these values creates a json codec. The derivation also requires
an implicit `Schema[T]` instance, which can be automatically derived. For more details see sections on
[schema derivation](schemas.md) and on supporting [custom types](customtypes.md) in general. Such a design provides
better error reporting, in case one of the components required to create the json codec is missing.

```eval_rst
.. note::

Note that the process of deriving schemas, and deriving library-specific json encoders and decoders is entirely
separate. The first is controlled by tapir, the second - by the json library, unless you use the Pickler module mentioned below.
Otherwise, any customisation, e.g. for field naming or inheritance strategies, must be done separately for both derivations.
```

## Pickler

Alternatively, instead of deriving schemas and json codecs separately, you can use the [tapir-pickler](pickler.md) module,
which takes care of both derivations in a consistent way, while keeping the possibility to customize both through a common configuration API.
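A minimal sketch of what this looks like in practice, assuming the `sttp.tapir.json.pickler` package added in this PR; `Book` is an illustrative type:

```scala
import sttp.tapir.*
import sttp.tapir.json.pickler.*

// A single derivation produces both the Schema and the uPickle-based
// json codec, so the two cannot drift apart.
case class Book(title: String, year: Int) derives Pickler

// jsonBody[Book] resolves using the given Pickler[Book] in scope
val bookBody = jsonBody[Book]
```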


## Implicit json codecs

If you have a custom, implicit `Codec[String, T, Json]` instance, you should use the `customCodecJsonBody[T]` method instead.
This description of endpoint input/output, instead of deriving a codec based on other library-specific implicits, uses
the json codec that is in scope.

## JSON as string

If you'd like to work with JSON bodies in a serialised `String` form, instead of integrating on a higher level using
one of the libraries mentioned below, you should use the `stringJsonBody` input/output. Note that in this case, the
serialising/deserialising of the body must be part of the [server logic](../server/logic.md).

A schema can be provided in this case as well:
@@ -54,8 +60,8 @@ Next, import the package (or extend the `TapirJsonCirce` trait, see [MyTapir](..
import sttp.tapir.json.circe._
```

The above import brings into scope the `jsonBody[T]` body input/output description, which creates a codec, given an
in-scope circe `Encoder`/`Decoder` and a `Schema`. Circe includes a couple of approaches to generating encoders/decoders
(manual, semi-auto and auto), so you may choose whatever suits you.

Note that when using Circe's auto derivation, any encoders/decoders for custom types must be in scope as well.
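For instance, with semi-auto derivation the instances are spelled out explicitly (standard circe API; `Book` is an illustrative type):

```scala
import io.circe.{Decoder, Encoder}
import io.circe.generic.semiauto.{deriveDecoder, deriveEncoder}

case class Book(title: String, year: Int)

// Explicit, named instances: nothing is derived implicitly at the use site
implicit val bookEncoder: Encoder[Book] = deriveEncoder[Book]
implicit val bookDecoder: Decoder[Book] = deriveDecoder[Book]
```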
@@ -75,7 +81,7 @@ val bookInput: EndpointIO[Book] = jsonBody[Book]

### Configuring the circe printer

Circe lets you select an instance of `io.circe.Printer` to configure the way JSON objects are rendered. By default
Tapir uses `Printer.noSpaces`, which would render:

```scala mdoc:compile-only
@@ -90,10 +96,10 @@ Json.obj(
as

```json
{"key1":"present","key2":null}
```

Suppose we would instead want to omit `null`-values from the object and pretty-print it. You can configure this by
overriding the `jsonPrinter` in `tapir.circe.json.TapirJsonCirce`:

```scala mdoc:compile-only
Expand All @@ -110,7 +116,7 @@ import MyTapirJsonCirce._
Now the above JSON object will render as

```json
{"key1":"present"}
```
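The override elided in the block above uses circe's standard `Printer` API; a sketch of such a configuration, assuming 2-space indentation is wanted:

```scala
import io.circe.Printer
import sttp.tapir.json.circe.TapirJsonCirce

// Drop null fields and pretty-print with 2-space indentation
object MyTapirJsonCirce extends TapirJsonCirce {
  override def jsonPrinter: Printer = Printer.spaces2.copy(dropNullValues = true)
}
```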

## µPickle
@@ -148,6 +154,8 @@ Like Circe, µPickle allows you to control the rendered json output. Please see

For more examples, including making a custom encoder/decoder, see [TapirJsonuPickleTests.scala](https://github.com/softwaremill/tapir/blob/master/json/upickle/src/test/scala/sttp/tapir/json/upickle/TapirJsonuPickleTests.scala)

See also the [tapir-pickler](pickler.md) module, which offers a higher-level `Pickler` representation built on uPickle. It allows more flexible customization and takes care of generating both schemas and json codecs, keeping them in sync.

## Play JSON

To use [Play JSON](https://github.com/playframework/play-json) add the following dependency to your project:
@@ -162,7 +170,7 @@ Next, import the package (or extend the `TapirJsonPlay` trait, see [MyTapir](../
import sttp.tapir.json.play._
```

Play JSON requires `Reads` and `Writes` implicit values in scope for each type you want to serialize.
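These are typically generated with Play JSON's standard `Json.format` macro, which provides both at once (a sketch; `Book` is illustrative):

```scala
import play.api.libs.json.{Format, Json}

case class Book(title: String, year: Int)

object Book {
  // Json.format provides both Reads[Book] and Writes[Book]
  implicit val bookFormat: Format[Book] = Json.format[Book]
}
```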

## Spray JSON

@@ -178,7 +186,7 @@ Next, import the package (or extend the `TapirJsonSpray` trait, see [MyTapir](..
import sttp.tapir.json.spray._
```

Spray JSON requires a `JsonFormat` implicit value in scope for each type you want to serialize.
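With spray-json's standard protocol this is usually one line per case class (a sketch; `Book` is illustrative):

```scala
import spray.json.DefaultJsonProtocol._
import spray.json.RootJsonFormat

case class Book(title: String, year: Int)

// jsonFormat2 matches the two-field case class constructor
implicit val bookFormat: RootJsonFormat[Book] = jsonFormat2(Book.apply)
```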

## Tethys JSON

@@ -194,7 +202,7 @@ Next, import the package (or extend the `TapirJsonTethys` trait, see [MyTapir](.
import sttp.tapir.json.tethysjson._
```

Tethys JSON requires `JsonReader` and `JsonWriter` implicit values in scope for each type you want to serialize.
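A sketch of providing these, assuming tethys's `semiauto` derivation helpers (`Book` is illustrative):

```scala
import tethys.{JsonReader, JsonWriter}
import tethys.derivation.semiauto._

case class Book(title: String, year: Int)

// Explicitly derived reader and writer instances for Book
implicit val bookReader: JsonReader[Book] = jsonReader[Book]
implicit val bookWriter: JsonWriter[Book] = jsonWriter[Book]
```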

## Jsoniter Scala

@@ -210,7 +218,7 @@ Next, import the package (or extend the `TapirJsonJsoniter` trait, see [MyTapir]
import sttp.tapir.json.jsoniter._
```

Jsoniter Scala requires a `JsonValueCodec` implicit value in scope for each type you want to serialize.
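Such a codec is generated at compile time with `JsonCodecMaker` (a sketch; `Book` is illustrative):

```scala
import com.github.plokhotnyuk.jsoniter_scala.core.JsonValueCodec
import com.github.plokhotnyuk.jsoniter_scala.macros.JsonCodecMaker

case class Book(title: String, year: Int)

// JsonCodecMaker.make derives the codec at compile time via a macro
implicit val bookCodec: JsonValueCodec[Book] = JsonCodecMaker.make
```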

## Json4s

@@ -250,6 +258,7 @@ To use [zio-json](https://github.com/zio/zio-json), add the following dependency
```scala
"com.softwaremill.sttp.tapir" %% "tapir-json-zio" % "@VERSION@"
```

Next, import the package (or extend the `TapirJsonZio` trait, see [MyTapir](../mytapir.md) and add `TapirJsonZio` instead of `TapirCirceJson`):

```scala mdoc:compile-only
@@ -291,9 +300,9 @@ when these methods are called.

## Optional json bodies

When the body is specified as an option, e.g. `jsonBody[Option[Book]]`, an empty body will be decoded as `None`.

This is implemented by passing `null` to the json-library-specific decoder, when the schema specifies that the value is
optional, and the body is empty.
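A sketch using the circe integration (the endpoint and type names are illustrative):

```scala
import sttp.tapir._
import sttp.tapir.generic.auto._
import sttp.tapir.json.circe._
import io.circe.generic.auto._

case class Book(title: String)

// An empty request body decodes to None; a valid json body to Some(Book(...))
val updateBook = endpoint.post
  .in("books")
  .in(jsonBody[Option[Book]])
```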

## Next