These are chat archives for ramda/ramda

21st
Mar 2019
Rakesh Pai
@rakeshpai
Mar 21 02:19

@ildella I'd probably implement this as:

const f = pipe(f1, then(unless(isNil, f2)));
const result = await f(11);

f1 and f2 are regular (custom) functions, and are expected to return a promise. All the other functions are from ramda. Not sure if this satisfies what you're looking for.
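To make the pattern above concrete, here is a runnable sketch using minimal plain-JS stand-ins for the Ramda functions (in real code you'd import them from ramda); f1 and f2 here are hypothetical async steps invented for illustration:

```javascript
// Minimal stand-ins for the Ramda functions used above (sketch only).
const pipe = (...fns) => x => fns.reduce((acc, fn) => fn(acc), x);
const then = f => p => p.then(f);
const unless = (pred, f) => x => (pred(x) ? x : f(x));
const isNil = x => x == null;

// Hypothetical async steps: f1 may resolve to null, in which case f2 is skipped.
const f1 = async n => (n > 10 ? n * 2 : null);
const f2 = async n => n + 1;

const f = pipe(f1, then(unless(isNil, f2)));

f(11).then(console.log); // logs 23 (f1: 22, then f2: 23)
f(5).then(console.log);  // logs null (f2 skipped because f1 resolved to nil)
```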

Rakesh Pai
@rakeshpai
Mar 21 02:28
This might be more readable than using pipeWith, but f1 and f2 are hard-coded in the body of the function.
Arti Villa
@artivilla_twitter
Mar 21 06:01

I stopped using Ramda and forget it so quickly. How would I convert an array of arrays into an array of objects with keys?

const investmentSelections = [
    ['Crypto', 'Cryptocurrency'],
    ['HSA', 'HSA'],
    ['Roth 401K', 'Roth401K'],
];

into:


const investmentSelections = [
    {
        label: 'Crypto',
        selectionType: 'Cryptocurrency',
    },
    {
        label: 'HSA',
        selectionType: 'HSA',
    },
    {
        label: 'Roth 401K',
        selectionType: 'Roth401K',
    },
];
Nika
@overflowz
Mar 21 10:12
feels like it can be done better:
const arr = [
    ['Crypto', 'Cryptocurrency'],
    ['HSA', 'HSA'],
    ['Roth 401K', 'Roth401K'],
];

const convert = R.reduce((acc, x) => acc.concat({ label: R.head(x), selectionType: R.last(x) }), []);

convert(arr);
Ville Saukkonen
@villesau
Mar 21 16:12
Hi, why does R.mapAccum not map over objects?
Also, reduce is not available for objects.
Riku Tiira
@rikutiira
Mar 21 16:20
I’m guessing because iteration order can matter with reduce unlike with map or filter
you can do R.toPairs(obj) and reduce afterwards
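For illustration, the toPairs-then-reduce suggestion can be sketched in plain JS, where Object.entries plays the role of R.toPairs (the object and the fold are invented examples):

```javascript
// Sketch: reduce an object by going through its key/value pairs first.
const obj = { a: 1, b: 2, c: 3 };

// If order ever matters, the pair array can be sorted before reducing.
const sum = Object.entries(obj)
  .reduce((acc, [key, value]) => acc + value, 0);

console.log(sum); // 6
```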
Ville Saukkonen
@villesau
Mar 21 16:24
But that's pretty cumbersome and makes the code hard to read. I don't think that worrying about iteration order with objects is very practical. It's well known in the JS ecosystem that the iteration order of objects should not be trusted.
Riku Tiira
@rikutiira
Mar 21 16:26
that was just my guess, not sure if it’s the real reason or not
Ville Saukkonen
@villesau
Mar 21 16:26
If the iteration order matters, the same problem would be with R.toPairs as well.
Riku Tiira
@rikutiira
Mar 21 16:26
but you could just write your own implementation of reduce which works with objects and problem solved
Ville Saukkonen
@villesau
Mar 21 16:27
Yeah true. But I'm using libraries like Ramda because I'd like to avoid writing functions like that :)
Riku Tiira
@rikutiira
Mar 21 16:27
I feel the issue is more profound than that. If you explicitly choose to do R.toPairs + reduce, then you are making a conscious choice there.
Anyway, it might just be oversight too, who knows (maybe you can find an issue related to it?)
Ville Saukkonen
@villesau
Mar 21 16:28
Same argument could be used when you deliberately pass object to reduce
Bijoy Thomas
@bijoythomas
Mar 21 16:29
@villesau reduce expects something that's of type Foldable .. like an array .. so given an array of some type T, reduce will fold that array into a resulting value .. for example you can reduce an Array[Integer] to an Integer
Bijoy Thomas
@bijoythomas
Mar 21 16:29
what would you expect the resulting type of reduce on an Object to be?
Ville Saukkonen
@villesau
Mar 21 16:30
Whatever I return from the reducing function. Or did I misunderstand the question?
Brad Compton (he/him)
@Bradcomp
Mar 21 16:30
This has been an ongoing conversation among maintainers, start here: https://github.com/ramda/ramda/pull/2580#issuecomment-404455904
Here is another one: ramda/ramda#1067
That one links to a bunch of others
Riku Tiira
@rikutiira
Mar 21 16:32
@villesau I disagree that they are the same thing. As the issue I linked shows, the kind of result that reducing an object can have is not at all obvious, whereas with toPairs you end up with an array with an explicit order, and if you need to order it further, you can sort it before reducing.
When you pass an array to reduce, it’s very clear what’s happening, whereas it’s not with an object, and that makes reduce simple to reason about.
Ville Saukkonen
@villesau
Mar 21 16:35
You can, but the important question is: do you need to? I don't think so. At least so far I've not seen a case where ordering matters when reducing objects with Lodash, for example. So I don't think it's very pragmatic to worry about object key/value ordering.
Thanks @Bradcomp! Quite long threads. Any consensus there?
Bijoy Thomas
@bijoythomas
Mar 21 16:36
@villesau I believe ordering does matter .. that's why there is reduce and reduceRight
Brad Compton (he/him)
@Bradcomp
Mar 21 16:36
Nope!
Riku Tiira
@rikutiira
Mar 21 16:37
I think it’s more important to stick to clean and simple functions than end up with something like lodash because of convenience
Ville Saukkonen
@villesau
Mar 21 16:37
It might matter for lists; they are ordered by definition. Objects are not, so you shouldn't be caring about object ordering.
Bijoy Thomas
@bijoythomas
Mar 21 16:39
if the ordering does not matter, the same reduction on an object might return different results
is that something you are expecting?
Riku Tiira
@rikutiira
Mar 21 16:40
@bijoythomas he means that when you choose to reduce an object, you aren’t really doing the type of reducing where order matters and with that I agree (but I don’t agree that it’s a good reason to change the implementation of reduce)
Ville Saukkonen
@villesau
Mar 21 16:40
At least I would be comfortable with it. I'm not interested in my object property order before the iteration, nor after it.
Bijoy Thomas
@bijoythomas
Mar 21 16:41
you are comfortable with a reduce on an object returning potentially different results each time you call it?
Ville Saukkonen
@villesau
Mar 21 16:43
Can you be a bit more explicit?
Bijoy Thomas
@bijoythomas
Mar 21 16:46
each iteration of reduce invokes the iterator function with the accumulator and an element .. if you are reducing an object .. what would you pass as the element? the key, the value, or both key and value as a pair?
in any of those cases, how can you know that the ordering of the keys, values, or key/value pairs is consistent?
Ville Saukkonen
@villesau
Mar 21 16:46

This is how lodash works: _.reduce({a: 123, b: 456}, (red, value, key) => ({...red, [key]: value}), {c: 789}) // yields consistently {c: 789, a: 123, b: 456}

https://lodash.com/docs/4.17.11#reduce

Stefano Vozza
@svozza
Mar 21 16:47
ah we haven’t had the reduce on objects debate in a while
Riku Tiira
@rikutiira
Mar 21 16:48
that particular reduction does not depend on iteration order
Ville Saukkonen
@villesau
Mar 21 16:48
Should it be? Can you give an example with objects that would be?
Riku Tiira
@rikutiira
Mar 21 16:49
well an arbitrary example would be reducing all values and concatting them to a string, you could end up with values concatted in any arbitrary order if iteration order is not guaranteed
The linked issues explain the reasoning well
Ville Saukkonen
@villesau
Mar 21 16:50
So this: _.reduce({a: 123, b: 456}, (red, value, key) => red + value, '') // yields consistently "123456" ?
Riku Tiira
@rikutiira
Mar 21 16:50
that is not guaranteed
it can depend on totally nondeterministic things, such as the environment you are running it in
Ville Saukkonen
@villesau
Mar 21 16:51
Can you explain "guaranteed" in this case? Can you show an environment that would yield different results?
Riku Tiira
@rikutiira
Mar 21 16:52
Generally that will result in a consistent “123456”, but that’s basically you trusting that JavaScript engines do that despite the JavaScript spec not guaranteeing it.
Ville Saukkonen
@villesau
Mar 21 16:52
In theory it's not guaranteed. Each browser (and Node) has a similar way of handling objects. In practice there is no problem at all. Even React relies on that.
Bijoy Thomas
@bijoythomas
Mar 21 16:53
@villesau the lodash docs itself has the disclaimer that iteration order is not guaranteed https://lodash.com/docs/4.17.11#reduce
Riku Tiira
@rikutiira
Mar 21 16:53
Honestly if you feel that it’s okay knowing all that then I think the way we think about code is just quite different :smile:
It’d be weird to have a library of pure functions which can still be nondeterministic
Ville Saukkonen
@villesau
Mar 21 16:55
@bijoythomas Such disclaimer wouldn't hurt. Although there (AFAIK) wouldn't be a case where the ordering would change.
Riku Tiira
@rikutiira
Mar 21 16:55
And rely on something like “let’s just assume all environments implement this thing the same way despite there not being any rule to do so"
Ville Saukkonen
@villesau
Mar 21 16:56

Some rather large companies (and you too if you write any react) are relying on it already: https://reactjs.org/docs/create-fragment.html

Note also that we’re relying on the JavaScript engine preserving object enumeration order here, which is not guaranteed by the spec but is implemented by all major browsers and VMs for objects with non-numeric keys.

Bijoy Thomas
@bijoythomas
Mar 21 16:58
wow .. I wish I worked in an environment with such guarantees :-)
Brad Compton (he/him)
@Bradcomp
Mar 21 16:58
The implementation details are irrelevant as soon as we move from reference equality to value equality. The issue isn't that they have a consistent order; it's that the order is arbitrary, based on the instantiation code
so for the purposes of Ramda, equals({a: 1, b: 2} , {b:2, a:1}) === true
Riku Tiira
@rikutiira
Mar 21 16:59
@villesau Ah, alright! But even so, React generally knows the environments it will be used in (browsers or Node), and it’s also very different from Ramda, which only has referentially transparent functions, so it’s good to keep that in mind
Brad Compton (he/him)
@Bradcomp
Mar 21 17:00
but if you were to treat them equally during an object reduce you could get different answers for two items with value equality
Riku Tiira
@rikutiira
Mar 21 17:00
I don’t think you should really go with Ramda if you value convenience over its design principles
you could use lodash/fp for example
Brad Compton (he/him)
@Bradcomp
Mar 21 17:02
A reasonable notion of equality for objects is that they have the same set of keys and those keys map to the same values
but the JS implementation has a quirk in that it maintains an ordering that is based on the insertion order. This might be useful behavior, but when it comes to reduce we run into the issue (which you can see being discussed extensively in the links) that equals(a, b) doesn't imply equals(f(a), f(b)), which is problematic for referential transparency
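Brad's point can be demonstrated with a small sketch: two objects with the same keys and values, created in different insertion orders, fold to different results under a hypothetical order-sensitive object reduce (the `reduceObj` helper here is invented for illustration, not a Ramda function):

```javascript
// Two objects that are equal as values, differing only in insertion order.
// JS engines preserve string-key insertion order when enumerating.
const a = { x: 1, y: 2 };
const b = { y: 2, x: 1 };

// Hypothetical object reduce that folds values in key-enumeration order.
const reduceObj = (fn, acc, obj) => Object.values(obj).reduce(fn, acc);

const concatValues = (acc, v) => acc + String(v);

console.log(reduceObj(concatValues, '', a)); // "12"
console.log(reduceObj(concatValues, '', b)); // "21"
// f(a) !== f(b) even though a and b are equal by value,
// which is exactly the break in referential transparency described above.
```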
Ville Saukkonen
@villesau
Mar 21 17:05
ES6 has ways to ensure object order: https://stackoverflow.com/a/30919039/2201572
Bijoy Thomas
@bijoythomas
Mar 21 17:05
@Bradcomp that's an interesting point .. so two objects that are equal by Ramda's equals could potentially return different results with the same reduction
although I don't think I have ever used equals with objects
Ville Saukkonen
@villesau
Mar 21 17:07
Where can I read about Ramda design principles? What I'm after here is just pragmatic choices. I really love using Ramda, but lack of proper object iteration functions makes it quite cumbersome.
Riku Tiira
@rikutiira
Mar 21 17:09

https://ramdajs.com/

"Ramda emphasizes a purer functional style. Immutability and side-effect free functions are at the heart of its design philosophy. This can help you get the job done with simple, elegant code.”

Although I guess RT is not explicitly stated anywhere.

Ville Saukkonen
@villesau
Mar 21 17:10

This can help you get the job done with simple, elegant code.

This is exactly what I'm looking for :)

Riku Tiira
@rikutiira
Mar 21 17:10
is it really such a huge difference to do R.values(o).reduce(…) though? (guessing you don’t need keys)
Ville Saukkonen
@villesau
Mar 21 17:16
I'd say it's quite big. You would need two different mental models and two sets of documentation for iterating objects and arrays, when the overall expectation (based on my personal observations) is that you can get the job done confidently with one model in a library like Ramda. And judging from the issues pasted above, it's clearly quite a big thing for JS devs coming to Ramda.
Riku Tiira
@rikutiira
Mar 21 17:17
Ramda’s map and filter (for example) do work on objects, because there the iteration order doesn’t matter
Ville Saukkonen
@villesau
Mar 21 17:18
I actually ended up thinking about this when I saw code that used toPairs/fromPairs and went to refactor it, because "surely there is a neater way to do this, so that others don't need to think about what all this code does".
Bijoy Thomas
@bijoythomas
Mar 21 17:22
@villesau have you tried https://ramdajs.com/docs/#evolve ?
Ville Saukkonen
@villesau
Mar 21 17:25
Evolve does not seem to be what I'm after. I'm recursively applying an asynchronous operation (a promise) to each string property of an object. mapAccum looked like a good candidate for that.
Currently it's roughly like this:
export const doOperation = async (nestedObject, item) => {
  const nestedObjectPairs = await Promise.all(
    toPairs(nestedObject).map(async ([key, value]) => {
      if (typeof value === 'object') {
        return [key, await doOperation(value, item)];
      }
      const resolvedValue = await asyncOperation(value as string, item);
      return [key, resolvedValue];
    })
  );
  // @ts-ignore
  return fromPairs(nestedObjectPairs);
};
Ben Briggs
@ben-eb
Mar 21 17:45
@overflowz map(zipObj(['label', 'selectionType']))
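Spelled out with minimal plain-JS stand-ins for R.map and the curried R.zipObj (in real code you'd import both from ramda), the suggestion applied to the original data looks like this:

```javascript
// Minimal stand-in for the curried R.zipObj: pairs up keys with values.
const zipObj = keys => values =>
  keys.reduce((acc, k, i) => ({ ...acc, [k]: values[i] }), {});

const investmentSelections = [
  ['Crypto', 'Cryptocurrency'],
  ['HSA', 'HSA'],
  ['Roth 401K', 'Roth401K'],
];

const result = investmentSelections.map(zipObj(['label', 'selectionType']));

console.log(result[0]); // { label: 'Crypto', selectionType: 'Cryptocurrency' }
```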
Nika
@overflowz
Mar 21 17:50
@ben-eb exactly! x) I'm still new so, pardon me ;p
Ben Briggs
@ben-eb
Mar 21 17:51
That's alright, we're all still learning :)