Living Document. J. S. Choi, 2018-12.
Table of contents
- Relations to other work
- Pipelines in other programming languages
- Topic references in other programming languages
- `do` expressions
- Function binding
- Function composition
- Partial function application
- Optional `catch` binding
- Pattern matching
- Block parameters
- `do` expressions
- Private class fields, class decorators, nullish coalescing, and optional chaining
- Alternative pipeline Babel plugin
- Alternative pipeline proposals
The concept of a pipe operator appears in numerous other languages, variously called “pipeline”, “threading”, and “feed” operators. This is because developers find the concept useful.
[Comparison table damaged in extraction. Its recoverable column labels grouped these languages’ pipe-like operators by how each operator supplies its input: into a tacit only parameter, a tacit first parameter, a tacit last parameter, a tacit first or last parameter, or an arbitrary placeholder. Of the listed languages, only the row for the concatenative stack languages – Factor, Forth, Joy, Onyx, PostScript, and RPL, which pipe into a tacit only parameter – survived extraction intact.]
Pipe operators are also conceptually similar to WHATWG-stream piping and Node-stream piping.
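For instance, ordinary Node-stream piping already chains each stage’s output into the next stage’s input, much as a pipe operator chains expressions (an illustrative sketch; the file names are arbitrary):

```js
// Ordinary Node.js stream piping, shown only for conceptual comparison:
// each stage’s output feeds the next stage’s input, as in a pipeline.
const fs = require('fs');
const zlib = require('zlib');

fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));
```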
The concept of the “topic variable” already exists in many other programming languages, commonly named with an underscore `_` or `$_`. These languages often integrate their topic variables into their function-call control-flow syntaxes, with Perl 6 as perhaps the most extensive, unified example. Integration of topic with syntax enables especially pithy, terse tacit programming.
In addition, many JavaScript console REPLs expose similar special console variables – such as the WebKit Web Inspector’s console variables, other browsers’ console variables, and the Node.js REPL’s console variables – that refer to the result of the most recently evaluated expression.
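For instance, in a console that provides such a variable (the name varies by environment: `$_` in some browser consoles, `_` in the Node.js REPL), the previous result can be reused directly – a hypothetical transcript:

```js
// Hypothetical REPL transcript, shown as comments:
// > 2 + 3
// 5
// > $_ * 2   // $_ refers to the previously evaluated result
// 10
```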
Several disadvantages of these prior approaches may increase the probability of developer surprise, where “surprise” means behavior that is difficult for the developer to predict.
One disadvantage arises from their frequent use of dynamic binding rather than lexical binding: the former is not statically analyzable and is more stateful than the latter. Dynamic binding may also cause surprising results when coupled with bare/tacit calls: it becomes more difficult to tell whether a bare identifier `print` is meant to be a simple variable reference or a bare function call on the topic value.
Another disadvantage arises from the ability to clobber or overwrite the value of the topic variable, which may affect code in surprising ways.
However, JavaScript’s topic reference is different from this prior art. It is lexically bound, with simple scoping, and it is statically analyzable. It also cannot be accidentally bound; the developer must opt into binding it by using the pipe operator `|>`. (This includes Additional Feature TS, which requires the use of `|>`.) The topic also cannot be accidentally used; it is an early error when `#` is used outside of a pipeline step (see Core Proposal and static analyzability). The proposal as a whole is designed to [prevent footguns][“don’t shoot me in the foot”].
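For example, under the semantics described above (a sketch using the proposal’s syntax; `input` and `f` are hypothetical names):

```js
// Valid: the topic reference # appears inside a pipeline step,
// so it is bound to that step’s input (here, the result of `input |> f`).
const ok = input |> f |> # + 1;

// Early error: # appears outside of any pipeline step,
// so it has no topic binding and the program is rejected statically.
// const bad = # + 1;
```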
The topic is conceptually general and could be extended to other forms. This proposal is forward compatible with such extensions, which would increase its expressive versatility and potentially multiply its benefits toward untangled flow, terse variables, and human writability, while still preserving simple scoping and static analyzability.
There is a TC39 proposal for `do` expressions at Stage 1. Smart pipelines do not require `do` expressions. However, if `do` expressions also become part of JavaScript, then, as with any other type of expression, a pipeline step in topic style may use a `do` expression, as long as the `do` expression contains the topic reference `#`. The topic reference `#` is bound to its input’s value, the `do` expression is evaluated, then the result of the `do` block becomes the final result of that pipeline, and the lexical environment is reset – all as usual.
In this manner, pipelines with `do` expressions act as a way to create a “topic-context block”, similar to Perl 6’s given block. Within this block, statements may use the topic reference as an abbreviation for the same value. This can be useful for embedding side effects, `if` `else` statements, `try` statements, and `switch` statements within pipelines. They may be made even pithier with Additional Feature BP, explained later. However, smart pipelines do not depend on `do` expressions, with the exception of Additional Feature BP.
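A sketch of what such a step might look like, assuming both the `do`-expression proposal and this proposal (`input`, `double`, and `log` are hypothetical names):

```js
// Sketch assuming both the Stage-1 `do` expression proposal and this
// pipeline proposal; `input`, `double`, and `log` are hypothetical names.
input
|> do {
  // The topic reference # is bound to this step’s input value.
  if (# > 0)
    # |> double;
  else
    -#;
}
|> log; // The `do` block’s completion value feeds the next step.
```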
An existing proposal for ECMAScript function binding has three use cases:
- Extracting a method from an object as a standalone function: `object.method.bind(object)` as `::object.method`.
- Calling a function as if it were a method call on an object: `func.call(object, ...args)` as `object::func(...args)`.
- Creating a function by binding an object to it: `func.bind(object)` as `object::func`.
The smart-pipelines Core Proposal + Additional Feature PF subsumes the ECMAScript function binding proposal in the first use case (prefix `::`). But the other two use cases (infix `::`) are not addressed by smart pipelines. Smart pipelines and infix function binding `::` can and should coexist. In fact, infix function binding could be made more ergonomic in many cases by replacing prefix `::function` with a shortcut for the expression `#::function`.
With smart pipelines |
With existing proposal |
---|---|
Some forms of method extraction can be addressed by pipeline functions
alone, as a natural result of their pipe-operator-like semantics. Promise.resolve(123).then(+> console.log); |
Promise.resolve(123).then((...$) => console.log(...$)); |
$('.some-link').on('click', +> view.reset); |
$('.some-link').on('click', ::view.reset); |
Note that this is not the same as const consoleLog =
console.log.bind(console.log);
const arrayFrom =
Array.from.bind(Array.from);
const arrayMap =
Function.bind.call(Function.call,
Array.prototype.map);
…
input
|> process
|> consoleLog;
input
|> arrayFrom
|> arrayMap(#, $ => $ + 1)
|> consoleLog; This robust method extraction is a use case that this proposal leaves to
another operator, such as a hypothetical prefix `&`. |
const consoleLog =
console.log.bind(console.log);
const arrayFrom =
Array.from.bind(Array.from);
const arrayMap =
Function.bind.call(Function.call, Array.prototype.map);
…
consoleLog(
process(input));
consoleLog(
arrayMap(arrayFrom(input), $ => $ + 1)); |
…
input
|> process
|> &console.log;
input
|> &Array.from
|> #::&Array.prototype.map($ => $ + 1)
|> &console.log; Pipeline functions would not preclude adding another operator that addresses
robust method extraction with inline caching, such as the hypothetical prefix `&` used in this example. |
…
consoleLog(
process(input));
consoleLog(
&Array.from(input)
::&Array.prototype.map($ => $ + 1)); |
const { hasOwnProperty } = Object.prototype;
const x = { key: 5 };
x::hasOwnProperty;
x::hasOwnProperty('key'); For terse method calling/binding, the infix `::` operator would still be useful. |
const { hasOwnProperty } = Object.prototype;
const x = { key: 5 };
x::hasOwnProperty;
x::hasOwnProperty('key'); |
The function bind operator `::` can also be combined with pipeline functions: a(1, +>
::b(2, +> …)
); See block parameters for further examples. |
a(1, $ =>
$::b(2, $ => …)
); |
Terse composition on unary functions is a goal of smart pipelines. It is equivalent to piping a value through several function calls, within a unary function, starting with the outer function’s tacit unary parameter.
There are several existing proposals for unary functional composition, all of which Additional Feature PF would subsume. Additional Feature PF can compose not only unary functions but expressions of any type, including object methods, async functions, and `if` `else` statements. And with Additional Feature NP, even functional composition into n-ary functions would be supported, which no current proposal yet addresses.
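For instance, a pipeline function composed of several unary steps would behave roughly like the equivalent nested calls (a sketch per Additional Feature PF; `f`, `g`, and `h` are hypothetical unary functions):

```js
// Sketch per Additional Feature PF; f, g, and h are hypothetical unary functions.
const fgh = +> f |> g |> h;

// Roughly equivalent to this ordinary arrow function:
const fgh2 = x => h(g(f(x)));
```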
With smart pipelines | With alternative proposals |
---|---|
array.map(+> f |> g |> h(2, #) |> # + 2); |
array.map($ => h(2, g(f($))) + 2); |
When compared to the proposal for syntactic functional composition by
TheNavigateur, this syntax does not need
to give implicit special treatment to async functions. There is instead an async
version of the pipe-function operator, within which `await` may be used as usual. const doubleThenSquareThenHalfAsync =
async +>
|> double
|> await squareAsync
|> half; This example uses Additional Feature BA for `await squareAsync`. |
const doubleThenSquareThenHalfAsync =
async $ =>
half(await squareAsync(double($))); const doubleThenSquareThenHalfAsync =
double +> squareAsync +> half; From the proposal for syntactic functional composition by TheNavigateur. |
const toSlug =
$ => $
|> #.split(' ')
|> #.map($ => $.toLowerCase())
|> #.join('-')
|> encodeURIComponent; const toSlug =
+> #.split(' ')
|> #.map(+> #.toLowerCase())
|> #.join('-')
|> encodeURIComponent; When compared to the proposal for syntactic functional composition by Isiah Meadows, this syntax does not need to surround each non-function expression with an arrow function. The smart step syntax has more powerful expressive versatility, improving the readability of the code. |
const toSlug = $ =>
encodeURIComponent(
$.split(' ')
.map(str =>
str.toLowerCase())
.join('-')); const toSlug =
_ => _.split(" ")
:> _ => _.map(str =>
str.toLowerCase())
:> _ => _.join("-")
:> encodeURIComponent; From the proposal for syntactic functional composition by Isiah Meadows. |
Lifting of non-sync-function expressions into function expressions is unnecessary for composition with Additional Feature PF. const getTemperatureFromServerInLocalUnits =
async +>
|> await getTemperatureKelvinFromServerAsync
|> convertTemperatureKelvinToLocalUnits; This example uses Additional Feature BA for `await getTemperatureKelvinFromServerAsync`. |
Promise.prototype[Symbol.lift] = f => x =>
x.then(f);
const getTemperatureFromServerInLocalUnits =
getTemperatureKelvinFromServerAsync
:> convertTemperatureKelvinToLocalUnits; From the proposal for syntactic functional composition by Isiah Meadows. |
// Functional Building Blocks
const car = +>
|> startMotor
|> useFuel
|> turnKey;
const electricCar = +>
|> startMotor
|> usePower
|> turnKey;
// Control Flow Management
const getData = +>
|> truncate
|> sort
|> filter
|> request;
// Argument Assignment
const sortBy = 'date';
const getData = +>
|> truncate
|> sort
|> #::filter(sortBy)
|> request; This example also uses function binding. |
// Functional Building Blocks
const car = startMotor.compose(
useFuel, turnKey);
const electricCar = startMotor.compose(
usePower, turnKey);
// Control Flow Management
const getData = truncate.compose(
sort, filter, request);
// Argument Assignment
const sortBy = 'date';
const getData = truncate.compose(
sort,
$ => filter.bind($, sortBy),
request
); From the proposal for syntactic functional composition by Simon Staton. |
const pluck = +> map |> prop; |
const pluck = compose(map)(prop); From a comment about syntactic functional composition by Tom Harding. |
Terse partial application is a goal of smart pipelines. The current proposal for syntactic ECMAScript partial application by Ron Buckton would be subsumed by Additional Feature PF and Additional Feature NP. Terse partial application into an N-ary function is equivalent to piping N tacit parameters into an N-ary function-call expression, within which the parameters are resolvable topic references. (Additional Feature PF alone would only address partial application into unary functions.)
Pipeline functions look similar to the alternative proposal, except that partial-application expressions are simply pipeline steps that are prefixed by the pipeline-function operator, and consecutive `?` placeholders are instead consecutive topic references `#`, `##`, `###`.
The current proposal for partial function application assumes that each use of the same `?` placeholder token represents a different parameter. In contrast, each use of `#` within the same scope always refers to the same value. This is why additional topic parameters are required.
The resulting model is more flexible: with Additional Feature NP plus Additional Feature PF, `+> f(#, 4, ##)` is different from `+> f(#, 4, #)`. The former refers to a binary function: a function with two parameters, essentially `(x, y) => f(x, 4, y)`. The latter refers to a unary function that passes the same one argument into both the first and third parameters of the original function `f`: `x => f(x, 4, x)`. The same symbol refers to the same value in the same lexical environment.
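A sketch contrasting the two forms described above (per Additional Features PF + NP; `f` here is a hypothetical ternary function):

```js
// Sketch per Additional Features PF + NP; f is a hypothetical ternary function.
const f = (x, y, z) => [x, y, z];

// Two distinct topic references # and ## make a binary pipeline function:
const g = +> f(#, 4, ##);   // roughly (x, y) => f(x, 4, y)
g(1, 2);                    // [1, 4, 2]

// Reusing the single topic # makes a unary pipeline function:
const h = +> f(#, 4, #);    // roughly x => f(x, 4, x)
h(1);                       // [1, 4, 1]
```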
With smart pipelines | With alternative proposals |
---|---|
array.map($ => $ |> f(2, #));
array.map(+> f(2, #)); |
array.map(f(2, ?));
array.map($ => f(2, $)); |
const addOne = +> add(1, #);
addOne(2); // 3 |
const addOne = add(1, ?);
addOne(2); // 3 |
const addTen = +> add(#, 10);
addTen(2); // 12 |
const addTen = add(?, 10);
addTen(2); // 12 |
let newScore = player.score
|> add(7, #)
|> clamp(0, 100, #); |
let newScore = player.score
|> add(7, ?)
|> clamp(0, 100, ?); |
const toSlug = +>
|> encodeURIComponent
|> _.split(#, " ")
|> _.map(#, _.toLower)
|> _.join(#, "-"); Additional Feature PF simultaneously handles function composition and partial application into unary functions. |
const toSlug =
encodeURIComponent
:> _.split(?, " ")
:> _.map(?, _.toLower)
:> _.join(?, "-"); From the proposal for syntactic functional composition by Isiah Meadows. |
[ { x: 22 }, { x: 42 } ]
.map(+> #.x)
.reduce(+> # - ##, 0); |
[ { x: 22 }, { x: 42 } ]
.map(el => el.x)
.reduce((_0, _1) => _0 - _1, 0); |
const f = (x, y, z) => [x, y, z];
const g = +> f(#, 4, ##);
g(1, 2) // [1, 4, 2]; |
const f = (x, y, z) => [x, y, z];
const g = f(?, 4, ?);
g(1, 2) // [1, 4, 2]; |
const maxGreaterThanZero =
+> Math.max(0, ...);
maxGreaterThanZero(1, 2); // 2
maxGreaterThanZero(-1, -2); // 0 Partial application into a variadic function is also naturally handled by Additional Feature NP with Additional Feature PF. |
const maxGreaterThanZero =
Math.max(0, ...);
maxGreaterThanZero(1, 2); // 2
maxGreaterThanZero(-1, -2); // 0 In this case, the topic function version looks once again nearly identical to the other proposal’s code. |
In addition, a bare `catch` form, completely lacking a parenthesized antecedent, has already been proposed as ECMAScript optional `catch` binding. This bare form is mutually compatible with this proposal, including with Additional Feature TS. The developer must opt into using Additional Feature TS by using a pipeline token `|>`, followed by the pipeline step. No existing code would be affected. Any, some, or none of the three clauses in a `try` statement may be in a pipeline form versus the regular block form or the bare block form.
With smart pipelines too | With optional `catch` binding only |
---|---|
value
|> f
|> {
try {
|> 1 / #;
}
catch {
{ type: error };
}
}
|> g; Even with Additional Feature TS, omitting the `|>` token keeps a clause in its ordinary block form, as with the `catch` block here. |
let _1;
try {
_1 = 1 / f(value);
}
catch (error) {
_1 = { message: error.message };
}
g (_1, 1); |
value
|> f
|> {
try { 1 / #; }
catch {
#.message |> console.error;
}
}
|> g(#, 1);
// 🚫 Syntax Error:
// Lexical context `catch { … }`
// contains a topic reference
// but has no topic binding. If the developer leaves out the `|>` token after `catch`, then the `catch` clause is an ordinary block with no topic binding, and any topic reference inside it causes this early error. |
The smart pipelines and topic references of the Core Proposal would be a boon to the proposal for ECMAScript pattern matching.
With smart pipelines | With pattern matching only |
---|---|
…
|> f
|> match (#) {
100:
#,
Array:
#.length,
/(\d)(\d)(\d)/:
#.groups |> #[0] + #[1] + #[2],
} The |
match (f(…)) {
100:
x,
Array -> a:
x.length,
/(\d)(\d)(\d)/ -> m:
m.groups |> #[0] + #[1] + #[2],
} With a topic binding, the |
…
|> f
|> match {
{ x, y }:
(x ** 2 + y ** 2)
|> Math.sqrt,
[...]:
#.length,
else:
throw new Error(#),
} ECMAScript pattern matching could also have a completely tacit version,
in which the parenthesized antecedent (such as `(#)`) is omitted entirely, tacitly matching the topic instead. |
match (f(…)) {
{ x, y }:
(x ** 2 + y ** 2)
|> Math.sqrt,
[...] -> a:
a.length,
else:
throw new Error(vector),
} |
try { … }
catch
|> match {
SyntaxError:
#|> f,
TypeError:
#|> g |> h(#, {strict: true}),
Error:
throw #,
}; With ECMAScript pattern matching + the smart-pipelines Core Proposal + Additional Feature TS, handling caught errors (and promise rejections) based on error type becomes more ergonomic. |
try { … }
catch (error) {
match (error) {
SyntaxError:
f(error),
TypeError:
h(g(error), {strict: true}),
Error:
throw error,
};
} The version with pattern matching alone is more verbose, with five more references to the `error` variable than the version with smart pipelines. |
The proposed syntax of ECMAScript block parameters may greatly benefit from the pipeline and topic concepts, which would be able to explain much of their desired behavior.
With smart pipelines | With block parameters only |
---|---|
materials.map { #|> f |> .length; } The first parameter of the arrow function that the block parameter implicitly
creates is bound to the primary topic, which in turn is fed into the pipeline: `#|> f |> .length`. |
materials.map do (m) { f(m).length; } The block-parameter proposal itself has not yet settled on how to parameterize its block parameters. The topic reference may be the key to solving this problem, making other, special block parameters unnecessary. |
Note that this would be the same as: materials.map(+> f |> .length); |
materials.map (m) { f(m).length; }; |
a(1) {
#::b(2) { … };
}; The block-parameter proposal in particular has not settled on how to nest block parameters. But the Core Proposal’s simple syntax rules already can handle nested block parameters. |
a(1) {
::b(2) { … };
}; The block-parameter proposal’s authors have been exploring using a sigil –
perhaps related to the function bind operator `::`. |
This would simply be equivalent to: a(1, +> {
#::b(2, +> { … });
}); |
a(1) {
::b(2) { … };
}; |
…and if the function bind operator `::` is used as that sigil: a(1) {
::b(2) { … };
}; |
a(1) {
::b(2) { … };
}; |
server(app) {
#::get('/') do (response) {
request()
|> .get('param1')
|> `hello world ${#}`
|> response.send;
};
#::listen(3000) {
log('hello');
};
} And again, if the function bind operator `::` is used as that sigil. |
server(app) do (_app) {
_app::get('/') do (response) {
request()
|> #.get('param1')
|> `hello world ${#}`
|> response.send;
};
_app::listen(3000) {
log('hello');
};
}; |
In the event that TC39 seriously considers the topic function definitions shown above, a `function.topic` metaprogramming operator, in the style of the `new.target` operator, could be useful in creating topic-aware functions.
This might be especially useful in creating APIs resembling domain-specific languages with ECMAScript block parameters. This example creates three functions that form an API resembling Visual Basic’s `select` statement. Two of these functions (`when` and `otherwise`) are expected always to be called within the third function (`select`)’s callback block.
Such a solution is not yet specified by the current proposal for ECMAScript block parameters. Lexical topics can fill in that gap. (The example below also uses Additional Feature BC.)
class CompletionRecord {
type; value;
constructor (testValue) {
this.testValue = testValue;
}
}
export function select (value, block) {
const selectBlockTopic = new CompletionRecord();
return block(selectBlockTopic)
|> match {
CompletionRecord:
#.value,
else:
throw 'Invalid clause was used in select block'
|> new Error,
};
}
export function otherwise (block) {
return function.topic
|> match (selectBlockTopic) {
CompletionRecord: {
if (#.type === undefined) {
#.type = 'normal';
#.value = {};
}
#;
},
else:
throw 'Invalid otherwise clause was used outside select block'
|> new Error,
};
}
export function when (caseValue, block) {
return function.topic
|> match (selectBlockTopic) {
CompletionRecord:
|> applyWhen(#, caseValue, block),
else:
throw 'Invalid when clause used was outside select block'
|> new Error,
};
}
function applyWhen (selectBlockTopic, caseValue, block) {
return match (selectBlockTopic.value) {
[...]:
(selectBlockTopic, caseValue, block)
|> applyWhenArray,
else:
(selectBlockTopic, caseValue, block)
|> applyWhenValue,
}
}
function applyWhenArray (selectBlockTopic, testArray, block) {
return testArray.some(arrayValue =>
selectBlockTopic |> {
if (when(arrayValue, block))
#;
});
}
function applyWhenValue (contextTopic, testValue, block) {
return contextTopic |> {
let match
if (#.type !== undefined
&& (match = #[Symbol.matches](testValue))
) {
#.type = 'normal';
#.value = match |> block;
}
#;
};
}
select ('world') {
when ([Boolean, Number]) {
|> log;
};
when (String) {
|> `Hello ${#}`
|> log;
};
otherwise {
throw `Error: ${|> format}`
|> new Error;
};
};
Because pipeline topic style supports arbitrary expressions, when `do` expressions are added to JavaScript they too will be supported within pipeline steps. When this occurs, topic references would be allowed within inner `do` expressions, along with arrow functions and `if` and `try` statements. Additional Feature BP would extend this further, also supporting pipeline-step blocks that act nearly identically to `do` expressions.
This proposal’s compatibility with these four proposals depends on its choice of tokens for its topic references, such as `#`/`##`/`###`/`...`, `@`/`@@`/`@@@`, or `?`/`??`/`???`. This is being bikeshedded at tc39/proposal-pipe-operator issue #91.
Because topic references are nullary operators, they are unambiguous with all four proposals, with one exception: `?`/`??`/`???` is not compatible with nullish coalescing and optional chaining’s current choice of `… ??: …`, `…??.…`, `…??[…]`, and `…??(…)`. This is not a problem with `#`/`##`/`###`/`...` or with `@`/`@@`/`@@@`.
In fact, if Additional Feature PF and Additional Feature NP subsume the current partial-function-application proposal, which uses nullary `?`, then single `?` might be freed up for optional chaining.
Gajus Kuizinas wrote a Babel plugin for syntactic functional composition, which may be useful to compare with this proposal’s smart pipelines. Kuizinas’s plugin makes the choice to insert input values into its pipeline steps’ last parameters, which is convenient for Ramda-style functions but not for (equally reasonable) Underscore-style functions.
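Because the topic reference spells out the input’s position explicitly, either argument convention works (an illustrative sketch; `R` and `_` stand in for Ramda and Underscore/Lodash respectively):

```js
// Illustrative sketch: the explicit topic # works with either convention.
// Ramda-style functions take their data last; Underscore-style take it first.
input
|> R.map(f, #)   // data as the last argument (Ramda style)
|> _.map(#, g);  // data as the first argument (Underscore/Lodash style)
```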
With smart pipelines | Kuizinas’ Babel plugin |
---|---|
apple
|> foo('foo parameter 0', 'foo parameter 1', #)
|> bar('bar parameter 0', #)
|> baz('baz parameter 0', #); |
apple
::foo('foo parameter 0', 'foo parameter 1')
::bar('bar parameter 0')
::baz('baz parameter 0'); |
{x: 'x'}
|> assocPath(['y', 'a'], 'a', #)
|> assocPath(['y', 'b'], 'b', #); |
{x: 'x'}
::assocPath(['y', 'a'], 'a')
::assocPath(['y', 'b'], 'b'); |
There are several other alternative pipe-operator proposals competing with the smart-pipeline Core Proposal. The Core Proposal is only one variant of the first pipe-operator proposal also championed by Ehrenberg; this variant is listed as Proposal 4: Smart Mix in the pipe-proposal wiki. The variant resulted from previous discussions in the previous pipe-operator proposal, discussions which culminated in an invitation by Ehrenberg to try writing a specification draft.
All variants attempt to address the goals of untangled flow, distinguishable punctuators, terse function calls, and human writability. But the authors of this proposal believe that the smart pipe operator may be the best choice among these competing proposals at fulfilling all the goals listed above.
Only the smart pipe operator does not need to create unnecessary one-off arrow functions for non-function-call expressions, which better fulfills the goal of zero runtime cost. Only the smart pipe operator has the forward compatibility and conceptual generality to support not only terse unary function application but also terse N-ary function application, terse expression application, terse function composition, terse expression composition, terse partial application, and terse method extraction – all with a single simple and unified general concept.
Indeed, the original pipeline proposal was blocked from Stage 2 by TC39 during its 60th meeting, in September 2017, for similar reasons. At that time, several members expressed concern that it should be coordinated with the proposals for function binding and partial function application in a more coherent approach. Smart pipelines open the door to such an approach for all these use cases.
Smart pipelines and their smart step syntax sacrifice a small amount of simplicity in return for a vast amount of expressive versatility and conceptual generality. And because it makes many of the other operator proposals above either unnecessary or possibly simpler, it may result in less complexity on average anyway. And thanks to its syntactic locality and numerous statically detectable early errors, the mental burden on the developer in remembering smart step syntax is light.
The benefits of smart pipelines on many real-world examples are well demonstrated in the Motivation section above, and many of the examples are not possible with the other pipeline proposals. It is hoped that the Core Proposal is strongly considered by TC39, keeping in mind that its simple but versatile syntax would open the door to addressing the use cases of many other proposals in a uniform manner.