
Why does :: need to be used while defining? #8

Open
LongTengDao opened this issue Mar 3, 2021 · 19 comments

Comments

@LongTengDao

I saw that the proposal is now const ::toArray = ....

Why not simply const toArray = ..., like the old proposal did?

@hax
Member

hax commented Mar 8, 2021

First, only the binary form needs defining first (or import ::{toArray} from ...).

The design goal is to make extension (virtual) methods/properties behave as closely as possible to normal (real) methods/properties. The only difference between them is how the methods/properties are looked up.

  • o.foo(...args) Normal method: look up the method foo on the object o
  • o::foo(...args) Binary form: look up the method foo among the locally defined extension methods
  • o::X:foo(...args) Ternary form: look up the method foo on the object X

Once the method is found, all three forms invoke it with the receiver o and the arguments args.
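A minimal sketch of the three lookup forms, assuming the syntax described above (the ::sum extension and the sample values are made up for illustration):

// Hypothetical extension method, defined in the separate extension namespace.
const ::sum = function () { return this.reduce((a, b) => a + b, 0); };

const o = { sum() { return 42; } };
const data = [1, 2, 3];

o.sum();                  // normal: looks up sum on o itself, returns 42
data::sum();              // binary: looks up sum among the local extensions, returns 6
data::Array:includes(2);  // ternary: borrows includes via Array (Array.prototype.includes), returns true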

Note that foo in o.foo does not use a local binding named foo. So to behave as closely as possible to o.foo, o::foo should also use a separate lexical namespace; otherwise it causes developer-experience problems.

One such problem of the old proposal (which looked foo up in the normal namespace, i.e. const foo = ...; o::foo()) is that it's very common to write let json = obj.json(), but let json = obj::json() doesn't work. This makes refactoring between normal methods and extension methods troublesome. There is also a shadowing problem (json may be shadowed by accident); see #4 for a similar discussion. Note that in many cases extension methods would use common words like first, last, max, etc., and so have a very high chance of being shadowed in the local scope.
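A small sketch of that hazard under the old proposal's single namespace (the json extension here is hypothetical):

// Old proposal: the extension is an ordinary lexical binding.
const json = function () { return JSON.stringify(this); };

function save(obj) {
  let json = obj::json(); // ReferenceError: the json on the right now refers to
                          // the local let json, which is still in its TDZ.
  return json;
}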

Another benefit of defining first is that errors can be reported at an earlier stage.

https://www.smashingmagazine.com/2018/10/taming-this-javascript-bind-operator/ has a good example:

const plus = (x) => this + x;
...
1::plus(1) // expected 2, but got "[object Window]1", NaN, or possibly some other value

With this new proposal, we could throw a TypeError in the first place.

Of course, forcing a definition first seems less convenient than the old proposal, but considering that the common usage is importing extension methods from modules via the import ::{first, last} from 'mod' syntax, it has the same ergonomics.
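A sketch of that common usage, assuming the proposed import form (the module name 'mod' and its exports are hypothetical):

// Declares first and last as extension methods in the extension namespace.
import ::{first, last} from 'mod';

const data = [1, 2, 3];
data::first(); // 1
data::last();  // 3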

@hax
Member

hax commented Mar 8, 2021

Note: I understand that a separate namespace is not common in JS; I'm still investigating other options that would mitigate the problem.

One possibility is to remove the const ::x = ... syntax entirely, so the only two syntaxes would be

  • import ::{foo, bar} from "module"
  • const ::{foo, bar} from extensionObject

which seems more consistent and simpler.

These would still use a separate namespace, but it may be more acceptable...
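For example, a sketch of how those two forms might look in use (the module name, extensionObject, and the names foo and bar are placeholders):

// Import extension methods from a module:
import ::{foo, bar} from "module";

// Declare extension methods from an existing object that provides them:
const ::{foo, bar} from extensionObject;

someValue::foo();
someValue::bar(42);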

@hax
Member

hax commented Mar 8, 2021

Another point: if we really want to apply a method from a local binding directly, we could have syntax like o::[foo] or o::(foo). In particular, o::(foo) is somewhat compatible with the old proposal. But:

  1. It's not easy to read code like o::(function square() { return this * this }); I expect most people would eventually write const square = ...; o::(square). From an ergonomic point of view, that is not much different from const ::square = ...; o::square.
  2. o::(function () {...}) makes it look more like a pipeline operator in FP, but o::(x => x * x) is wrong, unless we automatically convert arrow functions and use the first parameter as the receiver; and then what about function (x) { return x * x }? It seems user expectations would be broken in various ways whichever we choose. This is why that part of the old proposal is dead and was left to the pipeline proposal (though pipelines have their own issues).
  3. Use cases like arrayLike::(Array.prototype.map)(x => x * x) should have a more ergonomic form; this is one of the reasons for introducing the ternary form, so we can write arrayLike::Array:map(x => x * x) (see the sketch after this list).
  4. As we see, foo in o::(foo) would have to be a "free method" (an unbound function that uses dynamic this, outside of classes or object literals) or an accessor in property-descriptor-like form. Both seem uncommon in everyday code (and are especially rare in exports) and may be confusing to average programmers. (This is also one of the reasons I'm considering removing the const ::foo = ... syntax altogether.)
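A sketch of the method-borrowing case from item 3, contrasting the parenthesized form with the ternary form (arrayLike is just an example value):

const arrayLike = { 0: 1, 1: 2, 2: 3, length: 3 };

// Borrowing Array.prototype.map through a local expression:
arrayLike::(Array.prototype.map)(x => x * x);  // [1, 4, 9]

// The same borrowing with the ternary form:
arrayLike::Array:map(x => x * x);              // [1, 4, 9]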

@ljharb
Member

ljharb commented Mar 8, 2021

I’m not sure I see the case for a separate namespace at all. I don’t recall the old proposal being considered to have any problems related to o::json() or similar. Any kind of separate namespace is, to me, a bit of a nonstarter.

@hax
Member

hax commented Mar 8, 2021

@ljharb I hope I can express the ideas more clearly:

High-level goal: make extension methods behave as closely as possible to real methods in syntax, semantics, and developer experience; eliminate the barriers and friction of using extensions as much as we can.

Issues with the old proposal that affect this goal:

  1. A refactoring hazard between let json = o.json() <=> let json = o::json(), let size = collection.size <=> let size = collection::size, let max = Math.max(...data) <=> let max = data::max(), and many other possible cases.
  2. Extensions are easy to shadow by accident, because they tend to use short names and common words. This could be mitigated by a linter, but adopting a linter and specific rules always has a big ecosystem cost, and the no-shadow rule is not enabled by default in ESLint's recommended rules.
  3. If bindings are used directly, they have to be "free methods", which is uncommon and potentially confusing to average users. This proposal solves that problem by introducing an import syntax like import ::{last} from 'lodash', which automatically converts the functions to extension methods, avoids such pitfalls, and leverages the ecosystem as much as possible (see the sketch after this list). Technically such a converted last method could still live in the same namespace, but that would not actually be useful, and would only cause problems if it were wrongly used in the other form.
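A sketch of the conversion and separate-namespace behavior described in item 3 (how exactly the import converts the function is an assumption here, not spec text):

import ::{last} from 'lodash';

// lodash's last(array) takes the array as its first argument; the import above
// is assumed to adapt it so the receiver is passed as that argument, roughly
// like (...args) => lodashLast(this, ...args).
let last = [1, 2, 3]::last(); // 3

// The local let last does not shadow ::last, because extension names
// live in a separate lexical namespace under this proposal.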

@ljharb
Member

ljharb commented Mar 8, 2021

I'm not clear on how that's a refactoring hazard.

"Adopting a linter" is not really a cost; everyone who's going to use one likely already has one - but shadowing is also a normal part of the language, and it's often desired to shadow.

Why does a function need conversion? function () { return this; } should already be an extension method, as should Array.prototype.slice.

@hax
Member

hax commented Mar 8, 2021

The refactoring hazard is that programmers are forced to rename either the local variable or the extension method. And naming is hard.

"Adopting a linter" (and adopting special rules) is always a cost. But I don't want to argue this point with you again and again.

but shadowing is also a normal part of the language, and it's often desired to shadow.

I somewhat agree. This is why people often override and disable the no-shadow rule when they use a coding style (like the Airbnb ESLint config) that enables no-shadow by default.

The point here is that when someone writes let size = 10, it's very unlikely they intended to shadow the ::size extension method. And vice versa.

Why does a function need conversion

Because most libraries provide util functions like export function last(a) { return a[a.length - 1] }, not free methods like export function last() { return this[this.length - 1] }. But people want to use those util functions like extension methods: let last = a::last().
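To spell out the mismatch (a sketch; the old proposal's semantics are assumed for the last line):

// Util-function form: ignores this, takes the array as its first argument.
function lastUtil(a) { return a[a.length - 1]; }

// Free-method form: reads the array from this.
function lastMethod() { return this[this.length - 1]; }

const a = [1, 2, 3];
lastUtil(a);       // 3
a::lastMethod();   // 3 (conceptually lastMethod.call(a))
// a::lastUtil()   // old-proposal semantics would be lastUtil.call(a), which throws
//                 // because the parameter a is undefined; this is the gap that
//                 // automatic conversion is meant to close.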

@ljharb
Member

ljharb commented Mar 8, 2021

I don't understand why that's something we'd want to support. Methods used as if they're in a chain should be ones written to be used as such - just like decorators aren't designed to be used with functions that weren't written to be decorators.

A user who wants that can write function last() { return utilLast(this); }; the language shouldn't try to handle that (nor assume that such utility functions will always use the first argument).
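A sketch of that manual adapter (utilLast stands in for any library utility; the module name is hypothetical):

import { last as utilLast } from 'some-util-lib';

// Hand-written adapter: a free method that forwards this as the argument.
function last() { return utilLast(this); }

// Old-proposal usage:          [1, 2, 3]::last()
// New-proposal usage (sketch): const ::last = last; then [1, 2, 3]::last()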

@hax
Member

hax commented Mar 8, 2021

Methods used as if they're in a chain should be ones written to be used as such

Actually, I agree with that.

The point here is that there is actually not much difference between these two last functions: they have the same purpose and semantics, except that one uses the this receiver and the other uses the first argument. If you look inside lodash, most util functions match that pattern.

The reason lodash (and other util libraries) are written like that is simply that the language does NOT have extension methods. Library authors are forced to provide the functions in util-function form.

So if we introduce extensions at the language level, it would be better to leverage these existing assets where possible, rather than force every library to write the same conversion logic and publish a new version, and force users to upgrade to that new version to use the feature.

@ljharb
Member

ljharb commented Mar 8, 2021

I don't think that would be better. I think that libraries like lodash would either provide forms that use this, or provide their own adapter method.

@hax
Member

hax commented Mar 8, 2021

This question would be better answered by library maintainers, but I at least see some factors:

  • It requires publishing new versions even when the conversion is trivial.
  • Runtime conversion would have a performance cost.
  • Do we need to publish two packages, one for each form? Note that in a big project some people may use the extensions form and some the utils form, so such a policy increases deployment complexity for application developers.

Note that every library would need to repeat this, and there are also tons of libraries with no active maintenance.

@ljharb
Member

ljharb commented Mar 8, 2021

As a library maintainer, I would prefer that functions I wrote not to use this are not suddenly used in ways I did not intend, exposing me to a flood of new bug reports as a result.

You don't need two packages, but you could easily have one package with two entry points.

@ghost

ghost commented Mar 8, 2021

I've used Lodash via "lodash-bound" before and I thought it should be considered when evaluating how 3rd party libraries can adapt to this operator. I used it against the same babel implementation of the :: operator that we've always had. I'm not at a desktop or I might go into greater detail. :)

@hax
Member

hax commented Mar 9, 2021

suddenly expose me to a flood of new bug reports

Could you explain what kind of bug reports would occur? I hope we can try to analyze and avoid them.

@ljharb
Member

ljharb commented Mar 9, 2021

Any time someone uses a package in a way the author didn’t intend, there’s a likelihood of new bugs. An automatic adapter is very unlikely to be able to handle all edge cases, including “the second argument is the receiver”, “it’s sloppy mode but I don’t expect the global object”, “it’s strict mode but I do expect the global object”, “it’s sloppy mode but I do expect a primitive receiver”, the inverse of these, etc.
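To illustrate the first edge case, a naive first-argument adapter (everything here is hypothetical; the proposal does not specify such an adapter):

// Naive auto-adapter: assume the receiver always maps to the first argument.
const toExtension = (fn) => function (...args) { return fn(this, ...args); };

// A utility whose second argument is the receiver-like value:
function assignInto(sources, target) { return Object.assign(target, sources); }

const assignExt = toExtension(assignInto);
// target::assignExt(sources) would now pass target as sources and vice versa:
// exactly the kind of silent misuse an automatic conversion cannot rule out.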

@hax
Member

hax commented Apr 19, 2021

@ljharb Thank you for providing these cases; I'd like to do some research on them. If you could point to some real-world examples, that would be very helpful. Thank you very much!

@ljharb
Member

ljharb commented Apr 19, 2021

I don't have anything concrete to share. You're welcome to look over my packages for examples; the main thing is that if I don't explicitly intend a function to be used with a receiver, it should simply not be usable differently in a receiver position.

For example, with any function f that's written to ignore this, ({ f }).f(...args) or f.call(o, ...args) shouldn't do anything different than f(...args), and I would expect any "function borrowing" syntax to likewise not do anything different; whereas with any function g that's written to respect this, I would expect any "function borrowing" syntax to behave exactly like g.call(o, ...args).
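A minimal sketch of that expectation (f and g are just examples of the two kinds of functions):

function f(x) { return x + 1; }        // ignores this
function g() { return this.length; }   // respects this

const o = [1, 2, 3];
f(5) === f.call(o, 5);                 // true: the receiver makes no difference
g.call(o);                             // 3: the receiver is the whole point
// Expectation: o::f(5) should equal f(5), and o::g() should equal g.call(o).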

@Jamesernator

With the pipeline proposal reaching Stage 2, I'm not convinced that conversion utilities are as critical a part of an extensions proposal going forward. Even with extension methods we could just mix them with pipelining as needed, rather than trying to convert everything into one:

import { flatMap } from "./iterableUtils.js";
import { sortBy } from "lodash";
import { takeMiddle } from "./arrayUtil.js";

myCollection
  ::flatMap((i) => [i, i])
  |> Array.from(^)
  |> sortBy(^, item => item.id)
  ::takeMiddle(5);

I do think the

const plus = (x) => this + x;
...
1::plus(1)

hazard is a bit unfortunate, although for one thing it can be caught by tools like TypeScript, or even just unit testing.

Although, if using arrows in such a way were still sufficiently desirable, we could just make this configurable as a parameter for arrow functions, i.e.:

// possible syntax
const plus = (this, x) => this + x;

1::plus(1); // 2

The concern about shadowing (i.e. let last = a::last()) doesn't seem like an issue specific to this proposal. The exact same issue already happens today with regular functions (let last = last(a)).


Also, having previously used the older bind operator proposal with Babel quite a bit a few years ago, I can say from experience that it was easy enough to work with that I don't think huge changes were really necessary.

For functions written specifically to be this-taking methods, it was always trivial to work with. There was some boilerplate with converting some operations, but for the most part it wasn't too bad, and if pipelines had been a thing back then I don't think it would've been a problem at all.

@SimonMeskens

For me this is a non-starter. I don't understand the need for namespacing after reading this whole issue, and I prefer the old proposal. I feel like this proposal has way more complexity.

For me, the main value of this operator is a new calling style for functions that are already in scope, and with this proposal that wouldn't be allowed.
