Vladimir Kapustin
@kapooostin
ap and chain nicely cover the cases seemingly meant for converge, where there are just two functions and one of them is identity.
Vladimir Kapustin
@kapooostin
ap gives you ap (f, g) (a) = f (a) (g (a)) and chain gives you chain (f, g) (a) = f (g (a)) (a)
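Those two shapes can be sketched in plain JS (ap and chain below are hand-rolled stand-ins for Ramda's function-applicative versions, not the library itself; sub and double are just example functions):

```javascript
// Hand-rolled sketches of ap and chain specialized to functions
const ap = (f, g) => a => f(a)(g(a));     // ap(f, g)(a)    = f(a)(g(a))
const chain = (f, g) => a => f(g(a))(a);  // chain(f, g)(a) = f(g(a))(a)

const sub = x => y => x - y; // curried subtraction, so argument order is visible
const double = x => x * 2;

const viaAp = ap(sub, double)(3);       // sub(3)(double(3)) = 3 - 6 = -3
const viaChain = chain(sub, double)(3); // sub(double(3))(3) = 6 - 3 = 3
```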
Vladimir Kapustin
@kapooostin
:point_up: addPropertyLogin should be:
// addProposedLogin :: Object -> Object
const addProposedLogin = chain(
  assoc('proposedLogin'),
  getProposedLogin
);
Johnny Hauser
@m59peacemaker
I feel like I mostly understand transducers, but there's one case that blew up my brain and I'm having trouble getting through it: transducing a reactive type, like an FRP event/stream. It's pretty straightforward that you start with a stream, each value from it goes through the pipeline, and there's a builder on the end pushing the transformed value into the resulting stream. I don't think I have an issue with my mental model there. But when I approach the idea of a fold on an event/stream, I am lost.
eventStream.fold((acc, v) => [ ...acc, v ], [])
what would that mean with transducers?
Johnny Hauser
@m59peacemaker
I think it would be like [ 1, 2, 3 ] to [ [ 1 ], [ 1, 2 ], [ 1, 2, 3 ] ]
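In array terms that is a scan, i.e. a fold that emits every intermediate accumulator. A minimal plain-JS sketch:

```javascript
// scan: like reduce, but collects every intermediate accumulator
const scan = (fn, acc, xs) => {
  const out = [];
  for (const x of xs) {
    acc = fn(acc, x);
    out.push(acc);
  }
  return out;
};

const result = scan((acc, v) => [...acc, v], [], [1, 2, 3]);
// -> [[1], [1, 2], [1, 2, 3]]
```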
John Hartnup
@ukslim
@kapooostin @cilquirm, this is great. I've got a function I use often that isn't written with Ramda, but that I use alongside Ramda a lot:
```
const assocDerived = (name, f, obj) => ({
  ...obj,
  [name]: f(obj),
})
```
(Wrap it in curry() ... I left it off for brevity)
But I may start using chain instead.
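For what it's worth, assocDerived and chain(assoc(name), f) do line up. A plain-JS sketch (assoc and chain are hand-rolled stand-ins for the Ramda functions, and getProposedLogin plus the user shape are made up for illustration):

```javascript
// Hand-rolled stand-ins for the Ramda pieces
const assoc = name => value => obj => ({ ...obj, [name]: value });
const chain = (f, g) => a => f(g(a))(a); // chain on functions

const assocDerived = (name, f, obj) => ({ ...obj, [name]: f(obj) });

// Hypothetical derivation function and input shape, for illustration only
const getProposedLogin = obj => `${obj.first}.${obj.last}`;
const user = { first: "ada", last: "lovelace" };

const a = assocDerived("proposedLogin", getProposedLogin, user);
const b = chain(assoc("proposedLogin"), getProposedLogin)(user);
// a and b are both { first: "ada", last: "lovelace", proposedLogin: "ada.lovelace" }
```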
Aadi Deshpande
@cilquirm
thank you so much @kapooostin. this completely opened my :eyes:
Johnny Hauser
@m59peacemaker
I came across the statement 'transducers are perverse lenses', but I don't understand anything said on that topic, of which there is little. Transducers can be used on push-based reactive types. Can lenses be used for such things instead?
klequis
@klequis

I have been using a few functions from Ramda for a while but this is my first
time using transduce.

The sample is some data and code from a React app. There is a stripped-down
version of the app at: https://codesandbox.io/s/data-with-ramda-38j7b

The code builds a table, applying formatting when a dataField has a formatFn property.

It is expected to process about 1,000 objects/rows, each with 14 properties and maybe 7 of them having a formatFn property.

It works as-is, but I am wondering whether there is a better way to use Ramda that would make the code more efficient?

Data


    const data = [
      {
        _id: "1234",
        date: "2020-01-23T08:00:00.000Z",
        description: "ATM CHECK DEPOSIT",
        debit: 0,
        credit: 91.5,
        omit: false
      },
      {
        _id: "5678",
        date: "2020-02-10T08:00:00.000Z",
        description: "CHEVRON",
        debit: -63.51,
        credit: 0,
        omit: true
      }
    ];

    const columns = ["_id", "date", "description", "credit", "debit", "omit"];

    const dataFields = {
      _id: {
        name: "_id",
        description: "Id"
      },
      date: {
        name: "date",
        description: "Date",
        formatFn: d => `++${d}` // actual format is from date-fns
      },
      description: {
        name: "description",
        description: "Description"
      },
      credit: {
        name: "credit",
        description: "Credit"
      },
      debit: {
        name: "debit",
        description: "Debit"
      },
      omit: {
        name: "omit",
        description: "Omit",
        formatFn: d => (d ? "yes" : "no")
      }
    };

Code


    const modValues = (value, key, obj) => {
      const column = dataFields[key];
      if (has("formatFn")(column)) {
        const { formatFn } = column;
        return formatFn(value);
      } else {
        return value;
      }
    };

    const _transform = map(mapObjIndexed(modValues));
    transduce(_transform, flip(append), [], data);
Brad Compton (he/him)
@Bradcomp
@klequis When you are only performing one pass over the data, there is no performance benefit to using transduce rather than just calling map directly.
Also, consider using the evolve function for performing the formatting of your object:
map(evolve({
  date: d => `++${d}`,
  omit: d => (d ? "yes" : "no"),
  // ... and whatever else
}))
Brad Compton (he/him)
@Bradcomp

To expand a bit:

Transducers come with a certain amount of overhead. If you want to perform a number of operations on a list (map, filter, take, etc.) it will reduce the performance hit for the combined operations. In the case of functions like take it also has the benefit of stopping the whole composition, so there is no need to map over 1000 items if you just wanna take(5) at the end. The transducer will ensure you only process 5 items.
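That early-exit behavior can be sketched in plain JS; the machinery below is illustrative, not Ramda's actual internals:

```javascript
// Minimal transducer machinery: a "reduced" wrapper signals early termination
const REDUCED = Symbol("reduced");
const reduced = v => ({ [REDUCED]: true, value: v });
const isReduced = v => v != null && v[REDUCED] === true;

// A reduce that stops as soon as the step returns a reduced value
const reduceT = (step, acc, xs) => {
  for (const x of xs) {
    acc = step(acc, x);
    if (isReduced(acc)) return acc.value;
  }
  return acc;
};

// map and take as transducers: functions from step function to step function
const mapT = f => step => (acc, x) => step(acc, f(x));
const takeT = n => step => {
  let left = n;
  return (acc, x) => {
    left -= 1;
    acc = step(acc, x);
    return left <= 0 ? reduced(acc) : acc; // wrap to stop the outer loop
  };
};

let calls = 0;
const double = x => { calls += 1; return x * 2; };
const push = (acc, x) => (acc.push(x), acc);

// Values flow through mapT first, then takeT, then push
const xform = mapT(double)(takeT(5)(push));
const result = reduceT(xform, [], Array.from({ length: 1000 }, (_, i) => i));
// result is [0, 2, 4, 6, 8], and double ran only 5 times, not 1000
```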

// Let's turn this into something `evolve` can use.
const dataFields = {
  _id: {
    name: "_id",
    description: "Id"
  },
  date: {
    name: "date",
    description: "Date",
    formatFn: d => `++${d}` // actual format is from date-fns
  },
  description: {
    name: "description",
    description: "Description"
  },
  credit: {
    name: "credit",
    description: "Credit"
  },
  debit: {
    name: "debit",
    description: "Debit"
  },
  omit: {
    name: "omit",
    description: "Omit",
    formatFn: d => (d ? "yes" : "no")
  }
};

const evolver = filter(Boolean, map(prop('formatFn'), dataFields));

map(evolve(evolver))(data);
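In plain JS the same derivation looks roughly like this (evolve here is a simplified stand-in for Ramda's, and the dataFields sample is trimmed):

```javascript
// Trimmed sample of the fields table: only some entries carry a formatFn
const dataFields = {
  date: { name: "date", formatFn: d => `++${d}` },
  debit: { name: "debit" }, // no formatFn, so it should drop out
  omit: { name: "omit", formatFn: d => (d ? "yes" : "no") }
};

// map(prop('formatFn')) then filter(Boolean), written with Object.entries
const evolver = Object.fromEntries(
  Object.entries(dataFields)
    .map(([k, v]) => [k, v.formatFn])
    .filter(([, fn]) => Boolean(fn))
);

// Simplified evolve: apply the matching transform, pass other values through
const evolve = transforms => obj =>
  Object.fromEntries(
    Object.entries(obj).map(([k, v]) => [k, transforms[k] ? transforms[k](v) : v])
  );

const row = { date: "2020-01-23", debit: -63.51, omit: true };
const out = evolve(evolver)(row);
// out is { date: "++2020-01-23", debit: -63.51, omit: "yes" }
```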
Brad Compton (he/him)
@Bradcomp

I would start there and measure the performance...

If it's not sufficient then we can go from there. If you need the whole table to render at the same time then mutation will probably be the most efficient option but it is of course more dangerous

mitcho
@mitchoSR_twitter
can i get some wisdom on this, thought i was fine with reference passing but im feeling pretty silly atm
let data = [
    [
        { type:'dog' },
        { type:'cat' },
        { type:'cat' },
    ],
    [
        { type:'dog' },
        { type:'llama' },
        { type:'cat' },
    ],
];

const aReducer = R.reduce((a,c) => {

    if(a[c.type] === undefined){
        a[c.type] = 0;    
        console.log(a)
    }
    a[c.type] += 1;

    return a;

},{});

console.log(
    R.map(aReducer, data)
)

// expected 
// [
//     { dog:1, cat:2 },
//     { dog:1, cat:1, llama: 1 },
// ]

// got 
// [
//    {dog: 2, cat: 3, llama: 1}
//    {dog: 2, cat: 3, llama: 1}
// ]
John Hartnup
@ukslim
@mitchoSR_twitter It's subtle, but:
You create aReducer once. const aReducer = reduce(myIteratorFunction, {})
Now, your iteratorFunction mutates the accumulator {}
When you use aReducer a second time, it doesn't create a new base accumulator. It continues using the one you gave it (which has now been mutated)
John Hartnup
@ukslim
The simplest change to your code to make it work as you expect, is to make your iteratorFunction pure, by returning a new object:
```
const aReducer = R.reduce((a,c) => {
    const a1 = R.clone(a)
    if(a1[c.type] === undefined){
        a1[c.type] = 0;
    }
    a1[c.type] += 1;

    return a1;

},{});
```
But there are neater ways to write it without even mutating local variables.
(BTW you can make it "fail early" by making the base accumulator Object.freeze({}) )
系我衰又唔够衰
@CsYakamoz
const aReducer = R.reduce((a, c) => {
    if (a[c.type] === undefined) {
        a[c.type] = 0;
        // console.log(a);
    }
    a[c.type] += 1;

    return a;
});     // no accumulator

console.log(R.map(item => aReducer({}, item), data));
That's one solution. But I wonder if there is a more concise one?
John Hartnup
@ukslim
const bReducer = reduce(
  (a, c) => ({
     ...a,
     [c.type]: (a[c.type] === undefined) ? 1 : (a[c.type] + 1)
  }),
  {}
);
The spread operator ... and the ternary operator predicate ? something : elseSomething are good JS idioms for writing pure functions without using Ramda.
Generally it's good to think of += and ++ as code smells when you're going for FP.
mitcho
@mitchoSR_twitter
oh wow
thank you so much!
John Hartnup
@ukslim
@CsYakamoz Your solution works, but it's wasteful. It's good to reuse the base accumulator, as long as the iterator function is pure.
mitcho
@mitchoSR_twitter
havent fully read everything but will as soon as i can, but i get the main point
系我衰又唔够衰
@CsYakamoz
@ukslim It's good to reuse the base accumulator, as long as the iterator function is pure.
thank you, I got a new way to solve this question
John Hartnup
@ukslim
More Ramda-ish version of the reducer:
const cReducer = reduce(
  (a, c) => chain(
    assoc(c.type),
    pipe(
      prop(c.type),
      ifElse(isNil, always(1), add(1)),
    ),
  )(a),
  {}
);
Fully using Ramda practically guarantees you functional purity.
But IMO the spread operator version is more readable in this case.
Keep all your functions pure, and you don't get this kind of problem; that's the main point.
John Hartnup
@ukslim
One more thing... Extracting it into lots of small functions, I think makes it cleaner. And in general, runtimes can optimise that kind of thing better:
const getCountOrZero = propName => pipe(
  propOr(0, propName),
  inc,
)

const initOrIncrement = propName => chain(
  assoc(propName),
  getCountOrZero(propName),
)

const iteratorFn = (a, c) => initOrIncrement(c.type)(a)

const dReducer = reduce(iteratorFn, {});
(For extra style, that iteratorFn could use parameter destructuring: const iteratorFn = (a, { type }) => initOrIncrement(type)(a).)
klequis
@klequis

@Bradcomp

Average times, evolve version / transduce version:
19.26 / 45.94 = 42%

Thank you! Will spend some time understanding.

Johnny Hauser
@m59peacemaker
I can't find a test anywhere for the transducer preservingReduced as used in the cat transducers. My cat transducer tests pass with or without preservingReduced... not sure what case it's meant to solve. Any ideas?
Brad Compton (he/him)
@Bradcomp

Disclaimer: I haven't looked at the code

reduced is meant to short circuit a reducer. If you return reduced(acc) in a reducer function it will no longer keep iterating through the list. Transducer support, I think, would be to make sure that within a transducer it actually stops when reduced is returned...

const allExist = reduce((acc, item) => item ? true : reduced(false), true);
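A runnable plain-JS sketch of that short circuit (the reduced wrapper here is illustrative, not Ramda's internal representation):

```javascript
// A reduced wrapper and a reduce that honors it
const reduced = value => ({ __reduced: true, value });
const isReduced = x => x != null && x.__reduced === true;

const reduceShort = (fn, acc, xs) => {
  for (const x of xs) {
    acc = fn(acc, x);
    if (isReduced(acc)) return acc.value; // bail out immediately
  }
  return acc;
};

let visited = 0;
const allExist = xs =>
  reduceShort((acc, item) => {
    visited += 1;
    return item ? true : reduced(false);
  }, true, xs);

const answer = allExist([1, 2, null, 4, 5]);
// answer is false, and only 3 of the 5 items were visited
```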
Johnny Hauser
@m59peacemaker
I might be missing something, or didn't convey the issue. You have the main/top-level regular iterator calling the step function / pipeline on each value from the collection, and looking for Reduced as a result of that pipeline, which signals that iterator to bail. No problem there... but the cat transducers do a nested reduce, which also can return Reduced, short circuiting the nested reduce. I feel like that makes just as much sense.
But then there's this preservingReduced in the cat transducer
It looks like what happens is that cat's nested reduce gets short-circuited by Reduced, but preservingReduced has already wrapped it in an extra Reduced, so when the nested reduce unwraps it, it's still Reduced, and that sends a Reduced up to the primary iterator, short-circuiting that iterator too.
but that doesn't make any sense to me.