These are chat archives for ramda/ramda

11th
Aug 2016
Tim Navrotskyy
@dypsilon
Aug 11 2016 00:18
@arzig thanks for your input, the question was answered in the fantasy land channel.
olsonpm
@olsonpm
Aug 11 2016 03:32
So I'm reading about transducers and am realizing my understanding of a 'reducer' is incorrect - however I'm not finding a good definition for what a reducer function is. What I'm really looking for is why it's called "reduce"
Coming from lodash, my understanding was that I mainly use reduce to turn a collection into an aggregate such as array -> obj or array -> sum etc.
that fit well for all my use-cases, but it conflicts with how transducers are being described
This is the article I'm stumbling through anyway
olsonpm
@olsonpm
Aug 11 2016 04:45
egh - I understand what a transducer is now and its utility - still stumped on why the heck a "reducer" function is named reduce. My new understanding of a reducer function is that it has no semantics. It takes a result and a new value, and spits out a new result. I can't see how this remotely applies to the word 'reduce', and more googling shows I'm not alone in this confusion.
James Forbes
@JAForbes
Aug 11 2016 07:09
@olsonpm that is an amazing article
olsonpm
@olsonpm
Aug 11 2016 07:13
Yeah I was impressed
James Forbes
@JAForbes
Aug 11 2016 07:22

Reduce takes a collection of values and returns a single result. I think the confusing part is that the result could be a list of values. So it seems like nothing is being reduced at all. But you just need to expand your definition of what a single "result" is.

The reducer is the function within the call to reduce; it accepts the previously transformed value and the current value being transformed and returns a single value. This type signature, (a, b) -> a, sums up why it is called reduce: take two things, return one.

Reducers are everywhere, they are the simplest possible abstraction you could have for combining items into a final result. So they are really more a discovery than an invention. You've probably been writing reducers in your applications already without even noticing.
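A quick sketch of that shape (just an illustration, assuming R is in scope):

  // a reducer: takes the accumulated result and the next value,
  // and returns the new accumulated result -- (a, b) -> a
  const sum = (acc, n) => acc + n

  R.reduce(sum, 0, [1, 2, 3, 4])    // => 10
  R.reduce(R.add, 0, [1, 2, 3, 4])  // => 10, same thing with Ramda's add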

Thank you for the article, it's the best I've seen re: transducers
Martin Broder
@mrtnbroder
Aug 11 2016 07:38
Is there a default sort function available for sort?
e.g. if I pass down an array instead of a function first, would it use a default sorting algorithm?
Martin Broder
@mrtnbroder
Aug 11 2016 07:45
looks like it doesn't.
James Forbes
@JAForbes
Aug 11 2016 08:06
Ramda functions are rarely variadic, which seems annoying at first, but it makes the API a lot more consistent
Dinesh Bhosale
@dineshbhosale
Aug 11 2016 08:37
@dypsilon, yes I asked this question by mistake
Barry G
@bgits
Aug 11 2016 12:19
Is there a ramda function that can create a new field in an object using other fields, something like this perhaps? https://dpaste.de/zc0C
Alastair Hole
@afhole
Aug 11 2016 12:47
Hey all - I want a function that takes a string and if it's empty returns null, otherwise returns the string. Is there anything pithier than when(isEmpty, always(null))? Not sure if there's something built-in that does it already.
Ryan Zeigler
@rzeigler
Aug 11 2016 13:33
@mrtnbroder I don't believe so, there's not much in terms of variadic things you can do
@mrtnbroder however, if you have something intrinsically sortable you can do stuff like const defaultSort = R.sortBy(R.identity)
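e.g. (rough sketch):

  // sortBy maps each element through the function and sorts by the result
  R.sortBy(R.identity, [3, 1, 2])        // => [1, 2, 3]
  R.sortBy(R.identity, ['b', 'c', 'a'])  // => ['a', 'b', 'c']

  // R.sort, by contrast, expects a -1 / 0 / 1 comparator
  R.sort((a, b) => a - b, [3, 1, 2])     // => [1, 2, 3]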
Martin Broder
@mrtnbroder
Aug 11 2016 13:34
@arzig yeah that's what I did at first but figured out that that one doesn't work for strings and alphabetical sort.
Ryan Zeigler
@rzeigler
Aug 11 2016 13:35
that is surprising to me
Martin Broder
@mrtnbroder
Aug 11 2016 13:35
so defaultSort for me is now (a, b) => a > b ? -1 : a < b ? 1 : 0
Ryan Zeigler
@rzeigler
Aug 11 2016 13:36
was this not what you were trying to do?
Martin Broder
@mrtnbroder
Aug 11 2016 13:36
ugh
I was using sort
lol
Ryan Zeigler
@rzeigler
Aug 11 2016 13:36
ah, yes, that will do it
Brad Compton (he/him)
@Bradcomp
Aug 11 2016 14:26
@mrtnbroder If it's useful, the comparator function can help for making functions to pass to sort. It takes a boolean function and turns it into a comparator function.
R.comparator
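e.g. something like (made-up data):

  // turns an "is a before b?" predicate into a -1 / 0 / 1 comparator
  const byAge = R.comparator((a, b) => a.age < b.age)

  R.sort(byAge, [{age: 30}, {age: 20}])  // => [{age: 20}, {age: 30}]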
olsonpm
@olsonpm
Aug 11 2016 14:35

@JAForbes - Yeah thanks for acknowledging the confusing part. I can settle with your definition because it's about as good as it's going to get. I think the property you mentioned "they are the simplest possible abstraction you could have for combining items" is what helps me understand why the author of that article was writing map and filter using reduce. I kept thinking "yeah sure you can, but why?"
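For reference, here's roughly what the article was doing (my paraphrase, not its exact code):

  // map and filter written in terms of reduce
  const map = (f, list) =>
    R.reduce((acc, x) => acc.concat([f(x)]), [], list)

  const filter = (pred, list) =>
    R.reduce((acc, x) => pred(x) ? acc.concat([x]) : acc, [], list)

  map(x => x * 2, [1, 2, 3])     // => [2, 4, 6]
  filter(x => x > 1, [1, 2, 3])  // => [2, 3]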

I appreciate your time!

James Forbes
@JAForbes
Aug 11 2016 15:24

I think I can help with the why.

So often, you will be aggregating some data into a new format, or a summary, maybe for reporting or for rendering in the UI. Most of what programmers do is turn one data format into another.

And you could use a forEach or a for loop, or a while loop, or a do while etc... but you will end up requiring some kind of state outside the closure, like a counting variable, or a final list of results.

So the first observation: this is all ceremony, it's stuff we do every time that doesn't need to be done. It is machinery, and that increases the opportunities to introduce error. A lot of bugs are caused by name collisions in loops or off-by-one errors. This level of abstraction is rarely relevant to the task at hand.
Aggregation is conceptually simpler than our code would lead us to believe.

So we refactor it to be a reducer. Just a function that accepts the aggregate, the current value and returns a new aggregate.
Well, now the entire process is localized, pure and unceremonial. Fewer parts tends to mean fewer bugs. That's great, but that's not all.
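A tiny before/after sketch of what I mean (made-up data):

  const prices = [10, 20, 12]

  // with a loop: the running total lives outside the loop body
  let total = 0
  for (let i = 0; i < prices.length; i++) {
    total = total + prices[i]
  }

  // with a reducer: the whole aggregation step is one pure function
  const addPrice = (acc, price) => acc + price
  R.reduce(addPrice, 0, prices)  // => 42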

Now we have the ability to control the aggregation directly via whatever means we choose. We could just plug our list into reduce. But it could also be completely asynchronous, we could process chunks of data as they arrive. Our reducer represents a single transaction in the larger process of aggregation so we can control that aggregation to a fine degree (without the ceremony).

So that's cool. But that isn't all either.

reduce is a generic concept that applies to any data-structure, not just lists. So you could reduce a Set, or a Map, or a Stream, or anything at all. Notice the reducer has no knowledge of the source data structure, and if you are using a transducer, you can also write transforms that are ignorant of the target data structure as well.

So instead of writing a new type of map, or filter for every data structure, we can just write it once for all data structures. But that isn't all either.
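A rough sketch of that with Ramda's transducer helpers:

  // the transform knows nothing about the source or the target structure
  const xf = R.compose(R.filter(x => x % 2 === 0), R.map(x => x * 10))

  // R.into picks the output structure; the same xf works for both
  R.into([], xf, [1, 2, 3, 4])  // => [20, 40]
  R.into('', xf, [1, 2, 3, 4])  // => '2040'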

The reducer itself is often a useful function on its own, even when we aren't reducing. e.g. add or concat are generic functions we can use in other contexts.
So by moving our logic out of the for loop and into a reducer, we are likely to be able to reuse useful abstractions again and again. (For example reusing the add function to sum a tree, or a key/value object, or a stream without editing our reducer.)

Additionally, if we can turn our custom code into reducers we will often discover our reducer already existed as a standard function in lodash, or ramda or even native JS. We might think we are doing something unique, but it already exists, it might be R.merge, R.max, R.product or even R.compose.
The more we can rely on these existing, well-tested functions, the less likely it is that our code will break.
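e.g. a couple of those, used directly as reducers:

  R.reduce(R.add, 0, [1, 2, 3])            // => 6
  R.reduce(R.max, -Infinity, [7, 3, 9])    // => 9
  R.reduce(R.merge, {}, [{a: 1}, {b: 2}])  // => {a: 1, b: 2}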

From there the benefits lead into other topics, like the article you linked to: transducers, monoids, channels. The fact that reduce is discovered, and not invented, means that relying on it is like relying on other fundamental principles of mathematics or physics; it will be compatible and useful in many domains, even domains we are yet to discover (transducers are fairly new in the scheme of things).

A great example of an architecture that relies heavily on reducers is the Elm Architecture / Redux. The update function is a reduction of the model and the current action (model -> action -> model). Elm reduces a stream of actions and the reducer is the update function. Such a simple signature scales to an app of any size.
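A tiny redux-style sketch of that signature (made-up actions):

  // update :: (model, action) -> model
  const update = (model, action) => {
    switch (action.type) {
      case 'INCREMENT': return {count: model.count + 1}
      case 'DECREMENT': return {count: model.count - 1}
      default: return model
    }
  }

  // replaying a stream of actions is just a reduce over them
  const actions = [{type: 'INCREMENT'}, {type: 'INCREMENT'}, {type: 'DECREMENT'}]
  R.reduce(update, {count: 0}, actions)  // => {count: 1}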

So reduce pulls its weight, big time.

Barry Kern
@bskern
Aug 11 2016 19:43
is there anything like props but that would return me the key & value? I am trying to pluck out a few keys and values from a large JSON object
Brad Compton (he/him)
@Bradcomp
Aug 11 2016 19:43
R.pick
Brad Compton (he/him)
@Bradcomp
Aug 11 2016 19:44
@bskern pick is probably what you're looking for
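e.g. (keys made up):

  R.pick(['name', 'created_at'], {name: 'a', created_at: 1470942594, extra: true})
  // => {name: 'a', created_at: 1470942594}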
Barry Kern
@bskern
Aug 11 2016 19:48
perfect, one more question... I picked some values I want out of a large JSON object and now I want to transform two date fields... would evolve be the best choice? The API returns time since epoch so I wanted to format that via moment. I have evolve written out but it's not working at the moment for me
Barry Kern
@bskern
Aug 11 2016 19:55
this is what I am trying (omitted the other transformations because they work)
Brad Compton (he/him)
@Bradcomp
Aug 11 2016 19:55
evolve is probably what you want.
Barry Kern
@bskern
Aug 11 2016 19:55
  const transformations = {
    my_date: moment(R.__).format('dddd, MMMM Do YYYY, h:mm:ss a')
  }
  const result = R.evolve(transformations,data);
end up still seeing something like
my_date:1470942594
Brad Compton (he/him)
@Bradcomp
Aug 11 2016 19:56
Yeah, you won't be able to use a placeholder there
You'll want

  const transformations = {
    my_date: d => moment(d).format('dddd, MMMM Do YYYY, h:mm:ss a')
  }
  const result = R.evolve(transformations,data);
moment is evaluated with R.__ as soon as the object is declared, so the real date never gets passed in.
Barry Kern
@bskern
Aug 11 2016 20:13
thanks again
Rick Medina
@rickmed
Aug 11 2016 22:33
hello all! A somewhat general question: what do you think about types (TypeScript/Flow) with FP style in JS?