Roniel (ronielramos), EQI, Diadema - SP, Brazil. Developer

dev-tests/ionic-4-e2e 0

For studying e2e tests

JonasREJCS/start-graph-ql 0

First project using GraphQL with NodeJS and Express

ronielramos/Awesome-Profile-README-templates 0

A collection of awesome readme templates to display on your profile

ronielramos/continuos 0

A project for studying

ronielramos/continuos-server 0

Server for the continuos app

ronielramos/frontend-bootcamp 0

Frontend Workshop from HTML/CSS/JS to TypeScript/React/Redux

ronielramos/pdffiller 0

Take an existing PDF Form and data and PDF Filler will create a new PDF with all given fields populated.

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@laurieontech Considering your recent informal poll showing overwhelming desire for the pipeline operator to be supported (more upvotes than even your tweet) & your role within the TC39 Educator Committee, how can such findings help this proposal move forward, given what that committee can do within TC39's structure?

// cc @littledan (Current champion associated w/ this proposal) & @tabatkins

pygy

comment created time in a day

created repository Leandro1441/DigTeste

created time in a day

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

I personally don't really like the "smart" variant, because it's not a reverse application operator. In my opinion, a currying / partial application feature should not be limited to only one syntax construct (to one operator). I believe Scala has a more general approach:

object Functional { // helper stuff, skip to main function
   class PipedObject[T] private[Functional] (value:T) // not my code
   {
       def |>[R] (f : T => R) = f(this.value)
   }
   implicit def toPiped[T] (value:T) = new PipedObject[T](value)

   def map[T, U](f: (T) => U)(xs: Seq[T]): Seq[U] = xs.map(f)
   def foldLeft[T, A](z: A)(f: (A, T) => A)(xs: Seq[T]): A = xs.foldLeft(z)(f)
}

object HelloWorld {
   import Functional._
    
   def main(args: Array[String]) {
      val input = (1 to 10).toSeq
       
      val res0 = input.map(x => x + 1).foldLeft(0)((x, y) => x + y)
      println(res0)

      val res1 = input.map(_ + 1).foldLeft(0)(_ + _)
      println(res1)

      val plusOne: (Int) => Int = _ + 1
      val plus: (Int, Int) => Int = _ + _
      val res2 = input.map(plusOne).foldLeft(0)(plus)
      println(res2)

      val res3 = input |> map(plusOne) |> foldLeft(0)(plus)
      println(res3)

      val res4 = input |> (plusOne |> map[Int, Int]) |> (plus |> (0 |> foldLeft[Int, Int]))
      println(res4)
   }
}

All res* do the same thing. It's mostly referentially transparent even in the syntax sense (refactoring - just cut & paste and add types where the compiler struggles). If I understand the proposal correctly, you can't do const plusOne = plus # 1. It's also worth noting that Scala's _ (placeholder) is positional, so _ + _ is equivalent to (x, y) => x + y, not x => x + x.
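The positional-placeholder point can be sketched in today's JS (names here are purely illustrative):

```javascript
// A positional placeholder consumes the *next* argument each time it
// appears, so Scala's `_ + _` is a two-parameter function:
const plus = (x, y) => x + y;   // what `_ + _` means
const doubleIt = (x) => x + x;  // NOT what `_ + _` means

console.log(plus(2, 3));   // 5
console.log(doubleIt(2));  // 4
```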

How often is currying used in Clojure, or in Elixir? ... Just some perspective.

For perspective, I have some experience in Haskell (years of hobby projects; I am mildly advanced/intermediate) and currying (use of partial application) is everywhere. I would say 90% of functions use at least one partially applied function.

Here's the previous example rewritten in Haskell (quick Haskell introduction: & is the "pipe" operator and <&> is a "pipe-y" map operator, \x y -> x + y translates to JS as x => y => x + y, f a b roughly equals f(a)(b) in JS, fmap is like map, foldl is reduce in JS, $ is function application [usually the same as a pair of parentheses], <$> is map [reversed order of arguments compared to <&>]):

import Data.Function ((&))
import Data.Functor ((<&>))

main = do
    let xs = [1..10]

    let res0 = foldl (\x y -> x + y) 0 (fmap (\x -> x + 1) xs)
    print res0
    
    let res1 = foldl (+) 0 (fmap (+ 1) xs)
    print res1
    
    let plus = (+)
    let plusOne = (+ 1)
    let res2 = foldl plus 0 (fmap plusOne xs)
    print res2

    let res3 = xs <&> plusOne & foldl plus 0
    print res3
    
    -- in my opinion best solution (reads from left to right)
    let res3' = xs <&> (+ 1) & foldl (+) 0
    print res3'
    
    -- this is I believe idiomatic Haskell (reads from right to left)
    let res3'' = foldl (+) 0 $ (+ 1) <$> xs
    print res3''

    let res4 = xs & (plusOne & fmap) & (0 & (plus & foldl))
    print res4

In Haskell I think everything is fully referentially and cut/paste transparent (at least in this example; the compiler inferred everything correctly). As you can see, partial application (currying) is used in what I would consider common Haskell (res3'' and res3'): foldl (+) 0 (the last parameter is applied later by <$> or <&>) and (+ 1) (the first parameter is applied later by foldl).

I have significantly less experience with Reason (a few weeks) and PureScript (I only tried it for a few days, but it's really similar to Haskell); however, I would wager partial application is used there in copious amounts as well.

pygy

comment created time in 2 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

I think I'm starting to understand how fundamentally differently we both view the pipeline operator. I think that fundamental difference is somehow related to the evaluation order and context.

With the F# operator, I think of it as evaluating like this:

valueExpression |> functionExpression
  1. valueExpression is evaluated in the outermost context
  2. functionExpression is evaluated in the outermost context
  3. valueExpression |> functionExpression is evaluated:
    • Apply the result of valueExpression to the result of functionExpression.

I believe this is semantically completely the same as if it were written as (functionExpression) (valueExpression).
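In current JavaScript this reading can be emulated with a plain helper function (a sketch; `applyTo` is a hypothetical name, not proposal syntax):

```javascript
// F#-style: valueExpression |> functionExpression
// is just (functionExpression)(valueExpression), evaluated left to right.
const applyTo = (value, fn) => fn(value);

const double = (x) => x * 2;
const inc = (x) => x + 1;

// 5 |> double |> inc would evaluate as inc(double(5)):
const piped = applyTo(applyTo(5, double), inc);
console.log(piped); // 11
```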

Now your view, on the other hand, is that |> creates a lexical context. I think this may be the view that's also at the root of the misconception you had when giving your last example. Perhaps you thought that add(3) would be evaluated in a different context. So in the case of Hack, I guess it goes like this:

valueExpression |> topicExpression
  1. valueExpression is evaluated in the outermost context
  2. valueExpression |> topicExpression is evaluated:
    • Create a new context where # binds to the result of valueExpression
    • topicExpression is evaluated within this new context
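A minimal sketch of that reading in current JavaScript: an immediately-applied arrow function stands in for the topic scope (`topic` is an illustrative stand-in for #):

```javascript
// Hack-style: valueExpression |> topicExpression
// creates a new scope where the topic binds the piped value.
const add = (x, y) => x + y;

// 12 |> add(#, 3) would evaluate as ((topic) => add(topic, 3))(12):
const viaTopic = ((topic) => add(topic, 3))(12);
console.log(viaTopic); // 15
```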

If I were to think of |> as creating a lexical context (similarly to =>), then the arguments given in this thread that are pro-F# might seem very foreign to me. Personally, I think of |> as simply being function application in reverse (no additional lexical contexts), and so the arguments for Hack actually do seem very foreign to me. We are talking about two completely different operators.

pygy

comment created time in 2 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

I don't think my foundations are shaky, but I appreciate someone putting them under a pressure test :)

And I appreciate your counter examples. I should say, though, that initially, when I was posting my comment about referential transparency, I was under the impression that you were arguing for both (or either) the Smart or the Hack-only proposal. After reading some of your earlier comments, I realized that you were strictly arguing for the Hack-only proposal and no other, making some of the comments I made less relevant to you directly. I tried my best to modify my earlier comment to make that apparent. It seems we are in agreement about the importance of referential transparency.

To further clarify my comment, let me provide a summary: I essentially gave four arguments against the Smart proposal, and three against the Hack proposal:

  1. The proposal foregoes referential transparency: This only affects Smart, not Hack. As you have also noted. That's because the biggest problem I see with referential transparency is with the tacit call style of the Smart proposal.
  2. The proposal makes refactoring code more difficult: This affects the topic style of both the Smart and Hack proposals.
  3. The proposal has extra syntax on top of F# style: Referring to |> .. (#) vs just |>. This may seem obvious, but my real point was that due to the existence of my former two points, the only usable style remaining is having the topic in the trailing position, at which point the extra syntax just seems excessive.
  4. The proposal stifles the development of others: This affects Smart and Hack, as I am of the opinion that they are trying to take on too many problems at once, making it less appealing to solve these problems later in ways that benefit other parts of the language too.

I hope this clarifies the points I was trying to make.


Now with that out of the way, I would like to scrutinize the counter-examples you have given.

As you can see, referential transparency is about replacing an expression with its value. Here's an example with Hack pipeline [showing that Hack uses referentially transparent syntax]

You are right. I am not trying to deny that Hack has referentially transparent syntax, and as I hope I have made clear in the clarification I gave above, I never did. Instead, I argued that the tacit style from the Smart proposal foregoes referential transparency (see point 1).

My argument about Hack is that, although it's technically referentially transparent, it still does make code more difficult to refactor in a similar way to how the lack of referential transparency does (see point 2). To argue my point, I gave this example:

[41] |> map (?, x => x + 1)

//...is equivalent to:

const myExpression = map (?, x => x + 1)
[41] |> myExpression

...Where the value expression for myExpression did not need to be altered if this were F# + Partial Application, but it would have to be altered if this were Hack.
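For what it's worth, the `?` placeholder above is not implementable as an operator today, but its behaviour can be approximated with a userland helper (a sketch; `_`, `partial`, and this binary `map` are illustrative names, not proposal syntax):

```javascript
// Placeholder-based partial application: preset arguments are fixed,
// and each placeholder consumes the next call-time argument.
const _ = Symbol('placeholder');
const partial = (fn, ...preset) => (...args) =>
  fn(...preset.map((p) => (p === _ ? args.shift() : p)));

const map = (xs, fn) => xs.map(fn);

// map(?, x => x + 1) becomes:
const myExpression = partial(map, _, (x) => x + 1);
console.log(myExpression([41])); // [42]
```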

So to be clear: I agree that Hack is referentially transparent. My argument was never about that.

According to your definition of referential transparency, this purely functional code is not referentially transparent const foo = (x) => x + 3

I can see how I gave you that impression. I hope I have clarified that that's not what I think referential transparency is.

Finally, here is an example code that breaks* referential transparency, using the F# proposal:

const add = (x, y) => x + y;
12 |> add(3)

When you replace the RHS with its value*, you get

12 |> NaN

which would evaluate to NaN, clearly not the right result.

Now here, I think you have a misconception (or maybe I do), and I hope we can clear that up.

Both programs you have given in your example behave in the same way. Both examples try to apply 12 to NaN, leading to an error like NaN is not a function or something. This is what referential transparency is about: you were able to replace an expression with its corresponding value without altering the behaviour of the program. Therefore, I do not believe that any part of your example shows that referential transparency was broken.

pygy

comment created time in 2 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@xixixao Your definition of referential transparency matches @Avaq's, but your understanding seems lacking.

But I think it is a red herring anyway. Hack-style is referentially transparent from a semantic standpoint (i.e. deterministic and free of side effects), but not from a syntactic one, because of the implied (#) =>. The semantics matter more in FP, but it is nice to have syntactically as well when refactoring.

FP proponents in JS often use Ramda (and in effect, code in Ramda, whose functional operators replace most JS statements). For them, the F# style is syntactically superior both to a pipe() function (which is not visually distinctive) and to the Hack style (which is typographically heavier, besides the refactoring concerns).

pygy

comment created time in 2 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

First I want to mention that adding more conversation about the proposals here is likely not going to lead anywhere. We should wait for replies from people on TC-39 to understand which direction to take this discussion.

@xixixao If that's how you feel, then feel free to not participate and to just wait for the people at the top to decide which direction you should take. But please don't discourage other people from expressing their opinion and having a discussion about the future they want for their language.


I'm not sure that I followed your point, but in your example I would have curried the add function, of course:

const add = (x) => (y) => x + y;
12 |> add(3)

And so the Hack-style variant would look like this:

const add = (x) => (y) => x + y;
12 |> add(3) (#)

Which gives the feeling of double |> ... (#) operator mentioned by Avaq.

So I'm not sure what you wanted to demonstrate. Do you assume, by any chance, that Hack-style makes it possible to avoid currying, or something like that?

pygy

comment created time in 2 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

First I want to mention that adding more conversation about the proposals here is likely not going to lead anywhere. We should wait for replies from people on TC-39 to understand which direction to take this discussion.

@Avaq I appreciate your care about referential transparency. Indeed referential transparency is why I think the Hack proposal is superior to the F# / similar proposals discussed here. Please have a look at the definition of referential transparency, here is the first link Google spit out for me: https://www.sitepoint.com/what-is-referential-transparency/

As you can see, referential transparency is about replacing an expression with its value. Here's an example with Hack pipeline:

const add = (x, y) => x + y;
12 |> add(#, 3)

The RHS of the pipeline can be replaced with its value:

12 |> 15

to give the same result.

According to your definition of referential transparency, this purely functional code is not referentially transparent:

const foo = (x) => x + 3

because x is a binding (just like # is a binding above). Clearly, this code is referentially transparent.

Finally, here is an example code that breaks* referential transparency, using the F# proposal:

const add = (x, y) => x + y;
12 |> add(3)

When you replace the RHS with its value*, you get

12 |> NaN

which would evaluate to NaN, clearly not the right result.

* It's important that the value replacement is done based on the local expression syntax. Indeed, if you take into account that the expression is on the RHS of the F# pipeline and use its evaluation rules, the code is referentially transparent (any purely functional code is). But that's exactly "the problem" with this proposal and syntax. To retain the referential transparency property, you need to use non-local syntactic information. This is not true anywhere else in JS (besides with blocks, which is why they are now considered a Bad Part of JS).

So no, I don't think my foundations are shaky, but I appreciate someone putting them under a pressure test :)

pygy

comment created time in 2 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

Thanks for confirming my suspicion. I was using Clojure and Elixir as two examples to say that currying may not be a common practice in some functional languages. It's a commonly understood concept, but not used widely. Both languages also have pipeline operators.

pygy

comment created time in 3 days

created repository yankaique/challenge-ruptiva-react-native

created time in 4 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@highmountaintea A few years under the belt; almost never.

From time to time you find a function that takes one argument, but not because of thinking "oh, I need a curried function"; it was simply because of the needs of just that function.

pygy

comment created time in 5 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@highmountaintea As someone who works in Elixir daily, currying is essentially non-existent. Firstly, it's not built into the language. Secondly, between pipelines, pattern matching, and macros, it's hard to see when/where currying would be useful.

pygy

comment created time in 5 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

How often is currying used in Clojure, or in Elixir? Do collection/enumeration libraries such as Enum and List provide auto-currying functions? Just some perspective.

pygy

comment created time in 5 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

Thanks everybody for the corrections. I believe we are getting valuable input from everybody, but I want to ensure that we are not arguing at cross purposes. Everybody here wants some form of pipeline operator to be implemented, and I believe @xixixao is (seemingly) in favor of the Hack proposal. Labeling him as "anti-pipeline" seems incorrect and unfair to him.

pygy

comment created time in 5 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

Correction: I may have conflated some features I thought were attributed to the Hack proposal and some Smart proposal features. In the interest of keeping information accurate I have crossed out parts of my previous post that mentioned Hack. My comments are addressed to the Smart proposal in particular, which may not have been what @xixixao was arguing for. I'm going to make an attempt to better understand the differences between the Hack and the Smart proposals.

pygy

comment created time in 5 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

My argument is that the Hack-style pipeline is a language feature beneficial to both people who do "mainstream" FP and to people who are into "extreme" FP, while the F#-style is only useful to the latter group -- @xixixao in https://github.com/tc39/proposal-pipeline-operator/issues/167#issuecomment-756487569

I believe that this idea is what's at the root of @xixixao's view: the idea that the Hack/Smart style pipeline proposal is (just as) beneficial to what they'd call "extreme functional programmers" as the F# variant.

I'd like to explain why I don't think this is the case. I've said that the Hack and Smart proposals are "not half as useful" to, and that they "don't properly support", the functional programming style. I'll try to formulate why that is.

Firstly, though, I should mention that I don't agree with the classification of "extreme FP features" versus what's "mainstream". Other people have already argued this point with similar ideas to mine, so I'll leave it at that.

The problem that I (and I think many a functional programmer) have with the Hack or Smart pipeline proposal, is that they forego a property that is very fundamental to functional programming, and that is of course referential transparency.

An expression is called referentially transparent if it can be replaced with its corresponding value (and vice-versa) without changing the program's behavior. -- Wikipedia: referential transparency (from John C. Mitchell (2002). Concepts in Programming Languages)

const f = map (x => x + 1)

[41] |> f

Here, it is my understanding that the expression f in [41] |> f is referentially transparent if we can replace it by its corresponding value map (x => x + 1). This holds true for the unary pipeline operator and none other:

[41] |> map (x => x + 1)

The above is a syntax error in the Hack and Smart pipeline proposals. All I did was replace an expression with its corresponding value. The fact that this changed the program's behaviour is incredibly surprising to a functional programmer. So much so, in fact, that I would personally (and I don't think I'm speaking only for myself) not use it at all.
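The substitution described above can be checked in current JS with a userland pipe helper (a sketch; `pipe` and this curried `map` are illustrative names):

```javascript
// Referential transparency under F#-style piping: a named function and
// its defining expression are interchangeable.
const pipe = (value, ...fns) => fns.reduce((acc, fn) => fn(acc), value);
const map = (fn) => (xs) => xs.map(fn);

const f = map((x) => x + 1);

// [41] |> f  and  [41] |> map(x => x + 1) give the same result:
const a = pipe([41], f);
const b = pipe([41], map((x) => x + 1));
console.log(a, b); // [ 42 ] [ 42 ]
```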


The same is true to a lesser extent for the "topic mode" expressions:

[41] |> map (#, x => x + 1)

Here, the expression map (#, x => x + 1) is tied to its scope in the same way that a lambda that captures a variable is then tied to the scope of that variable. The only way to extract this expression now is to wrap it in a function:

const myExpression = $ => map ($, x => x + 1)
[41] |> myExpression (#)

So to extract this expression, I was forced to create a unary function, putting me back into a state where the unary pipeline operator would have been just as useful. I am aware this is not a direct argument for referential transparency, but I've highlighted it here because the same would be totally possible with the unary pipeline operator plus partial application:

[41] |> map (?, x => x + 1)

//...is equivalent to:

const myExpression = map (?, x => x + 1)
[41] |> myExpression

Note how the value of myExpression did not need to change for it to be assigned to a constant.

This leaves only one mode of operation that is useful to the functional programmer when it comes to the Hack/Smart proposal:

  • ~tacit style~ not referentially transparent
  • ~topic style with partial application~ difficult to extract expressions
  • topic style with the topic in the trailing place:
[41] |> map (x => x + 1) (#)

In the above example, I've left the topic trailing, and my map function is already curried. Now the expression map (x => x + 1) is referentially transparent, and I could assign it to a constant and replace the entire expression with the name of said constant.

This, to me, is the Smart/Hack style pipeline operator behaving in its most useful way, because it finally allows for using logical steps to refactor code. However, do you see how this is just the unary pipeline operator with extra syntax? Instead of x |> f, the proposal looks to me like x |> f (#): The operator is not |>, but it's some kind of dual operator |> .. (#) like for .. in. I think it is only natural that we have a strong preference for the former (|>).
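That "extra syntax" reading can be sketched in current JS: with the topic trailing and a curried function, the (#) step adds nothing over plain application (`pipe`, `map`, and `step` are illustrative names):

```javascript
const pipe = (value, fn) => fn(value);
const map = (fn) => (xs) => xs.map(fn);
const step = map((x) => x + 1); // extractable, referentially transparent

// Hack:  [41] |> step(#)  ~  pipe([41], (topic) => step(topic))
// F#:    [41] |> step     ~  pipe([41], step)
const viaHack = pipe([41], (topic) => step(topic));
const viaFsharp = pipe([41], step);
console.log(viaHack, viaFsharp); // [ 42 ] [ 42 ]
```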


Finally, besides the arguments of referential transparency and syntax, I would also like to reiterate the point that many of the problems that the Smart or Hack proposals try to solve can be addressed separately, in ways that wouldn't only complement the unary pipeline operator, but also many other existing patterns and features.

Solving these problems within the context of the pipeline operator means that we're lowering the return on investment from solving them outside of this context. On top of that, we're also assigning a symbol (#, ?, or @) to this feature, further lowering the chances that other solutions (such as partial application) would make it into the language.


In conclusion, I have tried showing that the foundation of @xixixao's arguments (that the Smart proposal is just as useful for functional programmers) is shaky. I also have concerns that the Smart proposal will stifle the development of other features that could be very useful to functional programmers (e.g. await expressions, bind syntax, partial application).

pygy

comment created time in 5 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

Reverse application (pipe operator), partial application and currying are "extreme FP" features? Maybe in JavaScript, but in my opinion in general, no, they are basic (beginner level) FP features. I would compare reverse application to calling a method in OOP (it's actually pretty similar - you pass/state a context, an operation and parameters).

return context.operation(para, meters);
return context |> operation(para, meters);

Using a fluent API one can mimic a "pipe" operator, but it's not generally extensible (yes, you can modify the prototype and do other magic, but that's usually not recommended); it's up to the library developer to add new operators (methods), unlike the "pipe" operator, where one can mix and match functions from many libraries or use custom ones.

// example A: basic use of fluent API vs pipe operator
return seq([1]).map(x => x + 1).op(7).value; // OOP
return [1] |> map(x => x + 1) |> op(7); // FP: `map` and `op` don't have to be from the same library

// example B: using custom functions
return [1] |> map(x => x + 1) |> myFunc |> mySecondFunc(5) |> myThirdFunc; // FP
return myThirdFunc(mySecondFunc(myFunc(seq([1]).map(x => x + 1).value), 5)); // "OOP"

"Extreme" (more like "advanced", assuming lazy pure static) FP features would be something like dependent types, proofs, proper use of laziness, use of more arcane math types (much more arcane than e.g. a common monad, monoid, functor), maybe GADTs and tracking of side-effects via monads (those two are probably just an intermediate level).

pygy

comment created time in 6 days

started BrunoRabbit/React-Native-tip

started time in 7 days

started Scrowszinho/TodoList

started time in 7 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@lozandier xixixao never said pipeline operator is an "extreme FP" feature. Instead he said currying is.

I quote @xixixao here:

So here's where I think the difference is: From your POV "functional programming paradigm" includes things I do not think belong to JS (or at least do not belong to it atm, with its current feature set). Partial application / currying, the F#-style pipeline operator that folks are arguing for here.

Based on my interpretation, he is saying F# style pipeline operator favors partial application / currying. Both of these features are considered "extreme FP". Hack style pipeline operator is more style agnostic in this sense.

I stand corrected on any misinterpretation; in any case, I also don't agree with partial application & currying being "extreme FP" features, for essentially the same reasoning as @samhh; the take that they are strikes me as bordering on naivety about the chicken-&-egg situation caused by proposals like this being indefinitely on hold.

I'm of the opinion that the partial application & currying friendliness of the F# proposal is a strong factor in why that proposal, which builds on the minimal proposal, has far more favorable traction than the Hack-style proposal as of this point, even when restricting such an analysis to the period when both were publicly available to be analyzed by the JS community & those active in threads like these.

pygy

comment created time in 10 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

For currying and partial application to be that useful you really need composition or at least piped function application a la the F# proposal here. Most users, in my anecdotal experience, prefer to stick relatively closely to what's standardised, come hell or high water.

With that being the case, of course they're "extreme FP" features insofar as they're not broadly utilised in the JS community; we don't have the requisite language features, like this, for them to be adopted en masse.

Adoption would increase with the F# proposal's standardisation, and it'd no longer be "extreme". Do you see the circularity of this argument?
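The dependency described here is visible even with a userland helper: curried functions read inside-out without piping, and left-to-right with it (`pipe`, `add`, and `double` are illustrative names):

```javascript
const pipe = (value, ...fns) => fns.reduce((acc, fn) => fn(acc), value);
const add = (n) => (x) => x + n; // curried
const double = (x) => x * 2;

const nested = double(add(1)(5));          // inside-out reading
const pipelined = pipe(5, add(1), double); // left-to-right reading
console.log(nested, pipelined); // 12 12
```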

pygy

comment created time in 11 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@lozandier xixixao never said pipeline operator is an "extreme FP" feature. Instead he said currying is.

I quote @xixixao here:

So here's where I think the difference is: From your POV "functional programming paradigm" includes things I do not think belong to JS (or at least do not belong to it atm, with its current feature set). Partial application / currying, the F#-style pipeline operator that folks are arguing for here.

Based on my interpretation, he is saying F# style pipeline operator favors partial application / currying. Both of these features are considered "extreme FP". Hack style pipeline operator is more style agnostic in this sense.

pygy

comment created time in 11 days

issue opened tc39/proposal-pipeline-operator

Possible ambiguity when tuples and smart pipelines' placeholder are used together

Currently, smart pipelines use the hash symbol for placeholders

['hello', 'world'] |> #.concat(#[0]);
// ['hello', 'world', 'hello']

But with the Record & Tuple proposal, which is currently at Stage 2, the hash symbol is also used for creating tuples and records. This means a user might expect a different result in the above example:

['hello', 'world'] |> #.concat(#[0]);
// ['hello', 'world', #[0]]

created time in 11 days


issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@xixixao Yeah, we have to respectfully disagree with categorizing the pipeline operator as an "extreme FP" feature. I think that's an egregious thing to associate with the operator.

It's first & foremost an operator strongly desired by functional programmers to greatly simplify everyday functional programming throughout the JS community. It would simplify API usage of a variety of functional programming libraries tremendously.

Furthermore, it's valuable for programmers doing data pipeline processing, much as unix shell users do every day with && and $.

Finally, it's often forgotten that the operator would make the management of mixins, a common OOP pattern, much clearer and easier, as demonstrated by this proposal's very readme:

// Before:
class Comment extends Sharable(Editable(Model)) {
  // ...
}

// After:
class Comment extends Model |> Editable |> Sharable {
  // ...
 }

For these reasons, among many more, it's to me a feature that has been desired for its broad appeal, versus the niche categorization of functional programming features you're suggesting by labeling it an "extreme FP" feature. It's appropriately seen as a game-changer, given how many will code leveraging it once approved. That's precisely because many strongly want it; that wouldn't be the case if it were the "extreme FP" feature you've categorized it to be.

I would argue that if it's ratified, it'll rival arrow functions & other functional programming affordances added to the language in how prevalent it becomes, perhaps rivaling the majority of the other new features added to JS in the past few years.

That to me can't be associated with an "extreme FP" feature.

pygy

comment created time in 11 days

issue comment tc39/proposal-pipeline-operator

argue this proposal's design-pattern is less readable than using a temp-variable

@kaizhu256 In my experience with Elixir, it would be a very rare scenario to have a pipeline with many functions that don't expect the preceding returned value as the 1st arg; which makes your subsequent points a bit of a strawman. In reality, it would look more like this:

foo
|> bar()
|> (returnedBarValue => oops(42, returnedBarValue))
|> baz()
|> qux()

Additionally, I don't see any reason for a "temp" variable. The returned value is simply the first argument in the next function (in this case, an anonymous function that wraps the oops function so you can use the preceding value as the 2nd arg).

kaizhu256

comment created time in 11 days

issue comment tc39/proposal-pipeline-operator

argue this proposal's design-pattern is less readable than using a temp-variable

It seems that the arrow-function pattern is more similar to the pipeline operator than the temp-variable one is.

Basic

// original
let result = exclaim(capitalize(doubleSay("hello")));

// pipeline-operator
let result = "hello"
  |> doubleSay
  |> capitalize
  |> exclaim;

// temp-variable
let tmp;
tmp = "hello";
tmp = doubleSay(tmp);
tmp = capitalize(tmp);
tmp = exclaim(tmp);
let result = tmp;

// arrow-function
let result = (_ => (_= "hello"
  ,_= doubleSay(_)
  ,_= capitalize(_)
  ,_= exclaim(_)))()

Functions with Multiple Arguments

// original
let newScore = boundScore( 0, 100, add(7, double(person.score)) );

// pipeline-operator
let newScore = person.score
  |> double
  |> (_ => add(7, _))
  |> (_ => boundScore(0, 100, _));

// temp-variable
let tmp;
tmp = person.score;
tmp = double(tmp);
tmp = add(7, tmp);
tmp = boundScore(0, 100, tmp);
let newScore = tmp;


// arrow-function
let newScore = (_ => (_= person.score
  ,_= double(_)
  ,_= add(7, _)
  ,_= boundScore(0, 100, _)))()
kaizhu256

comment created time in 11 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@ljharb Some friends of mine have quit programming before it has stabilized.

pygy

comment created time in 12 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

@mAAdhaTTah that is almost surely primarily a result of TypeScript and Angular unwisely pushing usage of that feature long, long before it had stabilized (which it still has not).

pygy

comment created time in 12 days

issue comment tc39/proposal-pipeline-operator

Moving ahead with the minimal proposal

As a data point in favor of the OOP focus, 57% of people have used Decorators despite them not even being finalized.

pygy

comment created time in 12 days
