Merge pull request #102 from scala/invisible-whitespaces
Remove invisible whitespaces
sjrd authored Jan 24, 2025
2 parents 946abec + b53c4bc commit 7750694
Showing 11 changed files with 159 additions and 159 deletions.
2 changes: 1 addition & 1 deletion content/42.type.md
@@ -582,7 +582,7 @@ terms.
### Byte and short literals
`Byte` and `Short` have singleton types, but lack any corresponding syntax either at the type or at the term level.
These types are important in libraries which deal with low-level numerics and protocol implementation
(see e.g. [Spire](https://github.com/non/spire) and [Scodec](https://github.com/scodec/scodec)) and
elsewhere, and the ability to, for instance, index a type class by a byte or short literal would be
38 changes: 19 additions & 19 deletions content/alternative-bind-variables.md
@@ -49,7 +49,7 @@ Typically, the commands are tokenized and parsed. After a parsing stage we may e
enum Word:
  case Get, North, Go, Pick, Up
  case Item(name: String)

case class Command(words: List[Word])
```

@@ -64,7 +64,7 @@ matching on a single stable identifier, `North` and the code would look like thi

~~~ scala
import Command.*

def loop(cmd: Command): Unit =
  cmd match
    case Command(North :: Nil) => // Code for going north
@@ -107,7 +107,7 @@ def loop(cmd: Cmd): Unit =
    case Command(Get :: Item(name)) => pickUp(name)
~~~

Or any number of different encodings. However, all of them are less intuitive and less obvious than the code we tried to write.
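Under this proposal, the match from the motivating example could instead be written directly, with alternative patterns that share a binding (a sketch; `goNorth` and `pickUp` are hypothetical helpers):

~~~ scala
import Command.*

def loop(cmd: Command): Unit =
  cmd match
    case Command(North :: Nil) | Command(Go :: North :: Nil) =>
      goNorth()
    case Command(Get :: Item(name) :: Nil) | Command(Pick :: Up :: Item(name) :: Nil) =>
      pickUp(name) // `name` is bound in both alternatives; rejected by current Scala
~~~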

## Commentary

@@ -147,7 +147,7 @@ type, like so:
enum Foo:
  case Bar(x: Int)
  case Baz(y: Int)

  def fun = this match
    case Bar(z) | Baz(z) => ... // z: Int
~~~
@@ -161,11 +161,11 @@ Removing the restriction would also allow recursive alternative patterns:
enum Foo:
  case Bar(x: Int)
  case Baz(x: Int)

enum Qux:
  case Quux(y: Int)
  case Corge(x: Foo)

  def fun = this match
    case Quux(z) | Corge(Bar(z) | Baz(z)) => ... // z: Int
~~~
@@ -177,8 +177,8 @@ We also expect to be able to use an explicit binding using an `@` like this:
enum Foo:
  case Bar()
  case Baz(bar: Bar)

  def fun = this match
    case Baz(x) | x @ Bar() => ... // x: Foo.Bar
~~~

@@ -191,7 +191,7 @@ inferred within each branch.
enum Foo:
  case Bar(x: Int)
  case Baz(y: String)

  def fun = this match
    case Bar(x) | Baz(x) => // x: Int | String
~~~
@@ -203,26 +203,26 @@ the following case to match all instances of `Bar`, regardless of the type of `A
enum Foo[A]:
  case Bar(a: A)
  case Baz(i: Int) extends Foo[Int]

  def fun = this match
    case Baz(x) | Bar(x) => // x: Int | A
~~~

### Given bind variables

It is possible to introduce bindings to the contextual scope within a pattern match branch.

Since most such bindings will be anonymous yet still referred to within the branches, we expect the _types_ present in the contextual scope for each branch to be the same, rather than the _names_.

~~~ scala
case class Context()

def run(using ctx: Context): Unit = ???

enum Foo:
  case Bar(ctx: Context)
  case Baz(i: Int, ctx: Context)

  def fun = this match
    case Bar(given Context) | Baz(_, given Context) => run // `Context` appears in both branches
~~~
@@ -233,7 +233,7 @@ This begs the question of what to do in the case of an explicit `@` binding wher
enum Foo:
  case Bar(s: String)
  case Baz(i: Int)

  def fun = this match
    case Bar(x @ given String) | Baz(x @ given Int) => ???
~~~
@@ -254,13 +254,13 @@ However, since untagged unions are part of Scala 3 and the fact that both are re

#### Type ascriptions in alternative branches

Another suggestion is that an _explicit_ type ascription by a user ought to be defined for all branches. For example, in the currently proposed rules, the following code would infer the return type to be `Int | A` even though the user has written the statement `id: Int`.

~~~scala
enum Foo[A]:
  case Bar[A](a: A)
  case Baz[A](a: A)

  def test = this match
    case Bar(id: Int) | Baz(id) => id
~~~
@@ -295,7 +295,7 @@ If `p_i` is a quoted pattern binding a variable or type variable, the alternativ

Each $`p_n`$ must introduce the same set of bindings, i.e. for each $`n`$, $`\Gamma_n`$ must have the same **named** members as $`\Gamma_{n+1}`$ and the set of $`{T_0, ... T_n}`$ must be the same.

If $`X_{n,i}`$ is the type of the binding $`x_i`$ within an alternative $`p_n`$, then the consequent type, $`X_i`$, of the
variable $`x_i`$ within the pattern scope, $`\Gamma`$ is the least upper-bound of all the types $`X_{n, i}`$ associated with
the variable, $`x_i`$ within each branch.
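To illustrate the least-upper-bound rule (a sketch that compiles only under the proposed relaxation):

~~~ scala
enum Pet:
  case Dog(age: Int)
  case Cat(lives: Int)

  def count = this match
    case Dog(n) | Cat(n) => n // n: Int in both alternatives, so n: Int in scope
~~~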

8 changes: 4 additions & 4 deletions content/better-fors.md
@@ -54,7 +54,7 @@ There are some clear pain points related to Scala 3's `for`-comprehensions and tho

This complicates the code, even in this simple example.
2. The simplicity of desugared code

The second pain point is that the desugared code of `for`-comprehensions can often be surprisingly complicated.

e.g.
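For instance, even a single pure alias forces today's desugaring to pack and unpack a tuple (an illustrative sketch; the exact output depends on the compiler version):

```scala
for
  a <- doSth(arg)
  b = a
yield a + b

// desugars today to roughly:
doSth(arg).map { a => val b = a; (a, b) }.map { case (a, b) => a + b }
```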
@@ -92,7 +92,7 @@ There are some clear pain points related to Scala 3's `for`-comprehensions and tho
This SIP suggests the following changes to `for` comprehensions:

1. Allow `for` comprehensions to start with pure aliases

e.g.
```scala
for
@@ -103,7 +103,7 @@ This SIP suggests the following changes to `for` comprehensions:
```
2. Simpler conditional desugaring of pure aliases. i.e. whenever a series of pure aliases is not immediately followed by an `if`, use a simpler way of desugaring.

e.g.
```scala
for
  a <- doSth(arg)
@@ -250,7 +250,7 @@ New desugaring rules will be introduced for simple desugaring.
For any N:
for (P <- G; P_1 = E_1; ... P_N = E_N; ...)
==>
G.flatMap (P => for (P_1 = E_1; ... P_N = E_N; ...))

And:

32 changes: 16 additions & 16 deletions content/byname-implicits.md
@@ -167,7 +167,7 @@ object Semigroup {
}
}
```

then we can manually write instances for, for example, tuples of types which have `Semigroup`
instances,

@@ -387,7 +387,7 @@ val showListInt: Show[List[Int]] =
showUnit
)
)
```

where at least one argument position between the val definition and the recursive occurrence of
`showListInt` is byname.
@@ -499,16 +499,16 @@ any _T<sub>j</sub>_, where _i_ < _j_.
The essence of the algorithm described in the Scala Language Specification is as follows,

> Call the sequence of open implicit types _O_. This is initially empty.
>
> To resolve an implicit of type _T_ given stack of open implicits _O_,
>
> + Identify the definition _d_ which satisfies _T_.
>
> + If the core type of _T_ dominates any element of _O_ then we have observed divergence and we're
>   done.
>
> + If _d_ has no implicit arguments then the result is the value yielded by _d_.
>
> + Otherwise for each implicit argument _a_ of _d_, resolve _a_ against _O+T_, and the result is the
>   value yielded by _d_ applied to its resolved arguments.
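For context, the kind of definition that exercises this check is an induction step that recurses through a byname argument, for example (a sketch in Scala 2 syntax):

```scala
trait Show[T] { def show(t: T): String }

case class Tree(children: List[Tree])

// Resolving Show[Tree] opens Show[List[Tree]], which opens Show[Tree] again;
// the byname (`=>`) arguments mark where this recursion may be tied off.
implicit def showList[T](implicit st: => Show[T]): Show[List[T]] =
  l => l.map(st.show).mkString("[", ", ", "]")

implicit def showTree(implicit sl: => Show[List[Tree]]): Show[Tree] =
  t => s"Tree(${sl.show(t.children)})"
```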
@@ -550,15 +550,15 @@ divergence check across the set of relevant implicit definitions.

This gives us the following,

> To resolve an implicit of type _T_ given stack of open implicits _O_,
>
> + Identify the definition _d_ which satisfies _T_.
>
> + If the core type of _T_ dominates the type _U_ of some element _<d, U>_ of _O_ then we have
>   observed divergence and we're done.
>
> + If _d_ has no implicit arguments then the result is the value yielded by _d_.
>
> + Otherwise for each implicit argument _a_ of _d_, resolve _a_ against _O+<d, T>_, and the result is
>   the value yielded by _d_ applied to its resolved arguments.
@@ -646,8 +646,8 @@ larger than _U_ despite using only elements that are present in _U_.

This gives us the following,

> To resolve an implicit of type _T_ given stack of open implicits _O_,
>
> + Identify the definition _d_ which satisfies _T_.
>
> + if there is an element _e_ of _O_ of the form _<d, T>_ such that at least one element between _e_
@@ -658,7 +658,7 @@ This gives us the following,
> observed divergence and we're done.
>
> + If _d_ has no implicit arguments then the result is the value yielded by _d_.
>
> + Otherwise for each implicit argument _a_ of _d_, resolve _a_ against _O+<d, T>_, and the result is
> the value yielded by _d_ applied to its resolved arguments.
2 changes: 1 addition & 1 deletion content/clause-interleaving.md
@@ -71,7 +71,7 @@ This definition provides the expected source API at call site, but it has two is

Another workaround is to return a polymorphic function, for example:
~~~scala
def getOrElse(k: Key): [V >: k.Value] => (default: V) => V =
[V] => (default: V) => ???
~~~
While again, this provides the expected API at call site, it also has issues:
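For contrast, a sketch of what this SIP's interleaved clauses would allow, declaring the type clause directly after the first term clause:

~~~scala
def getOrElse(k: Key)[V >: k.Value](default: V): V = ???
~~~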
2 changes: 1 addition & 1 deletion content/drop-stdlib-forwards-bin-compat.md
@@ -210,7 +210,7 @@ repositories {
dependencies {
  implementation 'org.scala-lang:scala-library:2.13.8'
  implementation 'com.softwaremill.sttp.client3:core_2.13:3.8.3'
  implementation 'com.softwaremill.sttp.shared:ws_2.13:1.2.7'
}
$> gradle dependencies --configuration runtimeClasspath
2 changes: 1 addition & 1 deletion content/interpolation-quote-escape.md
@@ -38,7 +38,7 @@ escape a `"` character to represent a literal `"` within a string.
## Motivating Example

That the `"` character can't be easily escaped in interpolations has been an
open issue since at least 2012[^1], and how to deal with this issue is a
somewhat common SO question[^2][^3]

{% highlight Scala %}
26 changes: 13 additions & 13 deletions content/polymorphic-eta-expansion.md
@@ -21,7 +21,7 @@ permalink: /sips/:title.html
- For a first-time reader, a high-level overview of what they should expect to see in the proposal.
- For returning readers, a quick reminder of what the proposal is about. -->

We propose to extend eta-expansion to polymorphic methods.
This means automatically transforming polymorphic methods into corresponding polymorphic functions when required, for example:
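A minimal sketch of the intended behavior (`id` is a hypothetical method; the assignment requires an explicit polymorphic lambda in current Scala):

~~~ scala
def id[T](t: T): T = t

val f: [T] => T => T = id // expands to [T'] => (t: T') => id[T'](t)
~~~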

@@ -44,10 +44,10 @@ This section should clearly express the scope of the proposal. It should make it

Regular eta-expansion is so ubiquitous that most users are not aware of it, for them it is intuitive and obvious that methods can be passed where functions are expected.

When manipulating polymorphic methods, we wager that most users find it confusing not to be able to do the same.
This is the main motivation of this proposal.

It however remains to be demonstrated that such cases appear often enough for time and maintenance to be devoted to fixing it.
To this end, the remainder of this section will show a manufactured example with tuples, as well as real-world examples taken from the [Shapeless-3](https://index.scala-lang.org/typelevel/shapeless-3) and [kittens](https://index.scala-lang.org/typelevel/kittens) libraries.


@@ -89,7 +89,7 @@ There is however the following case, where a function is very large:
case (acc, Some(t)) => Some((t, acc._1))
}
}
~~~

By factoring out the function, it is possible to make the code more readable:

@@ -113,7 +113,7 @@ By factoring out the function, it is possible to make the code more readable:
case (acc, Some(t)) => Some((t, acc._1))
}
}
~~~

It is natural at this point to want to transform the function into a method, as the syntax for the latter is more familiar, and more readable:

@@ -139,7 +139,7 @@ It is natural at this point to want to transform the function into a method, as
}
~~~

However, this does not compile.
Only monomorphic eta-expansion is applied, leading to the same issue as with our previous `Tuple.map` example.
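The issue reduces to a two-liner (a sketch; `headOpt` is a hypothetical method):

~~~ scala
def headOpt[T](l: List[T]): Option[T] = l.headOption

val f: [T] => List[T] => Option[T] = headOpt // rejected today; accepted under this proposal
~~~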

#### Kittens ([source](https://github.com/typelevel/kittens/blob/e10a03455ac3dd52096a1edf0fe6d4196a8e2cad/core/src/main/scala-3/cats/derived/DerivedTraverse.scala#L44-L48))
@@ -251,7 +251,7 @@ For example, if the syntax of the language is changed, this section should list

Before we go on, it is important to clarify what we mean by "polymorphic method": we do not mean, as one would expect, "a method taking at least one type parameter clause", but rather "a (potentially partially applied) method whose next clause is a type clause". Here is an example to illustrate:

~~~ scala
extension (x: Int)
  def poly[T](x: T): T = x
// signature: (Int)[T](T): T
@@ -279,15 +279,15 @@ Note: Polymorphic functions always take term parameters (but `k` can equal zero
1. Copies of `T_i`s are created, and replaced in `U_i`s, `L_i`s, `A_i`s and `R`, noted respectively `T'_i`, `U'_i`, `L'_i`, `A'_i` and `R'`.

2. Is the expected type a polymorphic context function?
* 1. If yes then `m` is replaced by the following:
~~~ scala
[T'_1 <: U'_1 >: L'_1, ... , T'_n <: U'_n >: L'_n]
=> (a_1: A'_1 ..., a_k: A'_k)
?=> m[T'_1, ..., T'_n]
~~~
* 2. If no then `m` is replaced by the following:
~~~ scala
[T'_1 <: U'_1 >: L'_1, ... , T'_n <: U'_n >: L'_n]
=> (a_1: A'_1 ..., a_k: A'_k)
=> m[T'_1, ..., T'_n](a_1, ..., a_k)
~~~
@@ -321,7 +321,7 @@ extension [A](x: A)
  def foo[B](y: B) = (x, y)

val voo: [T] => T => [U] => U => (T, U) = foo
// foo expands to:
// [T'] => (t: T') => ( foo[T'](t) with expected type [U] => U => (T', U) )
// [T'] => (t: T') => [U'] => (u: U') => foo[T'](t)[U'](u)
~~~
@@ -384,7 +384,7 @@ Not included in this proposal are:
* Polymorphic SAM conversion
* Polymorphic functions from wildcard: `foo[_](_)`

While all of the above could be argued to be valuable, we deem they are out of the scope of this proposal.

We encourage the creation of follow-up proposals to motivate their inclusion.
