John Hartnup
@ukslim
You create aReducer once. const aReducer = reduce(myIteratorFunction, {})
Now, your iteratorFunction mutates the accumulator {}
When you use aReducer a second time, it doesn't create a new base accumulator. It continues using the one you gave it (which has now been mutated)
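A minimal plain-JS sketch of that failure mode (the helper names here are mine, not your code):

```
// A curried "reducer" that closes over a single base accumulator.
const makeReducer = (fn, acc) => list => list.reduce(fn, acc);

// Impure iterator function: mutates the accumulator in place.
const impure = (a, type) => { a[type] = (a[type] || 0) + 1; return a; };

const countTypes = makeReducer(impure, {});

console.log(countTypes(['x', 'x', 'y'])); // { x: 2, y: 1 }
console.log(countTypes(['x']));           // { x: 3, y: 1 } — the mutated base leaked in
```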
The simplest change to your code to make it work as you expect, is to make your iteratorFunction pure, by returning a new object:
John Hartnup
@ukslim
```
const aReducer = R.reduce((a, c) => {
    const a1 = R.clone(a);
    if (a1[c.type] === undefined) {
        a1[c.type] = 0;
    }
    a1[c.type] += 1;
    return a1;
}, {});
```
But there are neater ways to write it without even mutating local variables.
(BTW you can make it "fail early" by making the base accumulator Object.freeze({}) )
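A tiny sketch of the freeze trick (plain JS, illustrative only):

```
// Freezing the base accumulator makes an impure iterator function fail early:
// in strict mode (ES modules, classes) mutation throws a TypeError;
// in sloppy mode it is silently ignored. Either way the object never changes.
const base = Object.freeze({});
try {
  base.count = 1; // TypeError in strict mode; a no-op otherwise
} catch (e) {
  console.log('caught:', e instanceof TypeError);
}
console.log(base.count); // undefined — the frozen object was never mutated
```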
系我衰又唔够衰
@CsYakamoz
```
const aReducer = R.reduce((a, c) => {
    if (a[c.type] === undefined) {
        a[c.type] = 0;
    }
    a[c.type] += 1;
    return a;
}); // note: no base accumulator supplied here

console.log(R.map(item => aReducer({}, item), data));
```
That works as a solution, but I wonder if there is a more concise one?
John Hartnup
@ukslim
```
const bReducer = reduce(
  (a, c) => ({
     ...a,
     // 1 for a first occurrence, so counts match aReducer's
     [c.type]: (a[c.type] === undefined) ? 1 : (a[c.type] + 1)
  }),
  {}
);
```
The spread operator ... and the ternary operator predicate ? something : elseSomething are good JS idioms for writing pure functions without using Ramda.
Generally it's good to treat += and ++ as code smells when you're going for FP.
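As a tiny illustration of that point (the `bump` helper is mine, plain JS):

```
// Impure style:  counts[k] = (counts[k] || 0); counts[k]++;  // mutates counts
// Pure style: build a new object instead of mutating the old one.
const bump = (counts, k) => ({ ...counts, [k]: (counts[k] ?? 0) + 1 });

const before = { a: 1 };
const after = bump(before, 'a');
console.log(after);  // { a: 2 }
console.log(before); // { a: 1 } — untouched
```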
mitcho
@mitchoSR_twitter
oh wow
thank you so much!
John Hartnup
@ukslim
@CsYakamoz Your solution works, but it's wasteful. It's good to reuse the base accumulator, as long as the iterator function is pure.
mitcho
@mitchoSR_twitter
havent fully read everything but will as soon as i can, but i get the main point
系我衰又唔够衰
@CsYakamoz
@ukslim It's good to reuse the base accumulator, as long as the iterator function is pure.
thank you, I got a new way to solve this question
John Hartnup
@ukslim
More Ramda-ish version of the reducer:
```
const cReducer = reduce(
  (a, c) => chain(
    assoc(c.type),
    pipe(
      prop(c.type),
      ifElse(isNil, always(1), add(1)), // 1 for a first occurrence
    ),
  )(a),
  {}
);
```
Fully using Ramda practically guarantees you functional purity.
But IMO the spread operator version is more readable in this case.
Keep all your functions pure, and you don't get this kind of problem; that's the main point.
John Hartnup
@ukslim
One more thing... Extracting it into lots of small functions, I think makes it cleaner. And in general, runtimes can optimise that kind of thing better:
```
const getCountOrZero = propName => pipe(
  propOr(0, propName), // 0 when the key is absent
  inc,
);

const initOrIncrement = propName => chain(
  assoc(propName),
  getCountOrZero(propName),
);

const iteratorFn = (a, c) => initOrIncrement(c.type)(a);

const dReducer = reduce(iteratorFn, {});
```
(for extra style, that iteratorFn could use parameter destructuring: const iteratorFn = (a, { type }) => initOrIncrement(type)(a))
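For comparison, a plain-JS analogue of the same decomposition (helper names here are mine, no Ramda involved):

```
// Each helper does one thing; the reducer just composes them.
const getNewCount = key => obj => (obj[key] ?? 0) + 1; // 1 for a missing key
const initOrIncrement = key => obj => ({ ...obj, [key]: getNewCount(key)(obj) });
const iteratorFn = (a, { type }) => initOrIncrement(type)(a);

const countByType = list => list.reduce(iteratorFn, {});

console.log(countByType([{ type: 'a' }, { type: 'b' }, { type: 'a' }]));
// { a: 2, b: 1 }
```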
klequis
@klequis

@Bradcomp

average times, evolve version / transduce version:
19.26 / 45.94 = 42%

Thank you! Will spend some time understanding.

Johnny Hauser
@m59peacemaker
I can't find a test anywhere for the transducer preservingReduced as used in the cat transducers. My transducer cat tests pass with or without preservingReduced... not sure what case it's meant to solve. Any ideas?
Brad Compton (he/him)
@Bradcomp

Disclaimer: I haven't looked at the code

reduced is meant to short circuit a reducer. If you return reduced(acc) in a reducer function it will no longer keep iterating through the list. Transducer support, I think, would be to make sure that within a transducer it actually stops when reduced is returned...

```
const allExist = reduce((acc, item) => item ? true : reduced(false), true);
```
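A minimal plain-JS sketch of how that short circuit works (this mirrors Ramda's `@@transducer/reduced` convention, but it is not Ramda's actual source):

```
// A Reduced wrapper marks "stop iterating, here is the final value".
const reduced = x => ({ '@@transducer/reduced': true, '@@transducer/value': x });
const isReduced = x => x != null && x['@@transducer/reduced'] === true;

// A reduce that bails out as soon as the step returns a Reduced wrapper.
const reduceWithBail = (step, acc, list) => {
  for (const item of list) {
    acc = step(acc, item);
    if (isReduced(acc)) return acc['@@transducer/value'];
  }
  return acc;
};

const allExist = list =>
  reduceWithBail((acc, item) => (item ? true : reduced(false)), true, list);

console.log(allExist([1, 2, 3])); // true
console.log(allExist([1, 0, 3])); // false — stops at the 0, never visits 3
```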
Johnny Hauser
@m59peacemaker
I might be missing something, or didn't convey the issue. You have the main/top-level regular iterator calling the step function / pipeline on each value from the collection, and looking for Reduced as a result of that pipeline, which signals that iterator to bail. No problem there... but the cat transducers do a nested reduce, which also can return Reduced, short circuiting the nested reduce. I feel like that makes just as much sense.
But then there's this preservingReduced in the cat transducer
It looks like what happens is that cat nested reduce gets short circuited by Reduced, but preservingReduced has already wrapped it in an extra Reduced, so that when the nested reduce unwraps it, it's still Reduced, and that sends a Reduced up to the primary iterator, short circuiting that iterator, too
but that doesn't make any sense to me.
It must not do what I think it does, since it would flatten [ [ 1, 2 ], [ 3, 4 ] ] into [ 1, 2 ] and then quit the whole thing because the nested list was finished.
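If it helps, here's a simplified plain-JS model of the mechanism (not Ramda's actual source; names only mirror its internals). The key is that the inner reduce unwraps one layer of Reduced when it bails, so preservingReduced adds an extra layer to keep the signal alive for the outer loop:

```
const reduced = x => ({ '@@transducer/reduced': true, '@@transducer/value': x });
const isReduced = x => x != null && x['@@transducer/reduced'] === true;

// Both loops unwrap one layer of Reduced when they bail.
const bailReduce = (step, acc, list) => {
  for (const item of list) {
    acc = step(acc, item);
    if (isReduced(acc)) return acc['@@transducer/value'];
  }
  return acc;
};

// Re-wrap a Reduced result so the inner reduce's unwrap leaves it still Reduced.
const preservingReduced = step => (acc, x) => {
  const r = step(acc, x);
  return isReduced(r) ? reduced(r) : r;
};

// A take(3)-like step: signals Reduced once three items are collected.
const take3Step = (acc, x) => {
  const next = [...acc, x];
  return next.length >= 3 ? reduced(next) : next;
};

// cat-style step: a nested reduce over each inner list.
const catStep = (acc, xs) => bailReduce(preservingReduced(take3Step), acc, xs);

console.log(bailReduce(catStep, [], [[1, 2], [3, 4], [5, 6]]));
// [1, 2, 3] — the extra wrapper stops the outer loop too

// Without the extra wrapper, the inner reduce swallows the signal and the
// outer loop keeps going past the short circuit:
const naiveCat = (acc, xs) => bailReduce(take3Step, acc, xs);
console.log(bailReduce(naiveCat, [], [[1, 2], [3, 4], [5, 6]]));
// [1, 2, 3, 5] — oops
```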
Rafael Sorto
@rafaelsorto
hey guys, I hope you are all doing great!
I am building a function to convert every key in an object to snake_case. I am starting with Ramda, and I have come up with a solution that works, but I am sure there has to be a better way to do it. Is this the right channel to ask such things?
mitcho
@mitchoSR_twitter
yes
try a line break after ```
Rafael Sorto
@rafaelsorto
thanks sorry about that
```
import {
  allPass,
  adjust,
  fromPairs,
  identity,
  ifElse,
  compose,
  map,
  toPairs,
  values,
  toUpper,
} from "ramda";

// Guard against null: typeof null === "object", which would crash toPairs below.
const isObject = element => element !== null && typeof element === "object";

const isNotArray = element => !Array.isArray(element);

export const deepApplyFnToObjectKeys = fn => {
  const deepMapObjectKeys = obj =>
    compose(
      fromPairs,
      map(
        adjust(
          1,
          ifElse(allPass([isObject, isNotArray]), deepMapObjectKeys, identity)
        )
      ),
      map(
        adjust(
          1,
          ifElse(
            Array.isArray.bind(Array),
            compose(
              values,
              map(deepMapObjectKeys)
            ),
            identity
          )
        )
      ),
      map(adjust(0, fn)),
      toPairs
    )(obj);

  return deepMapObjectKeys;
};

const result = deepApplyFnToObjectKeys(toUpper)({
  hi: { hello: "world", other: { field: "yes", data: [{ i_dont_know: true }] } }
});
```
This is a codesandbox where I am testing it
https://codesandbox.io/s/zen-goldwasser-or6o6
It works currently: every key is converted, in this case with toUpper, and in my actual code I am using another package called snakeCase.
mitcho
@mitchoSR_twitter
const isObj = obj => Object.prototype.toString.call(obj) === '[object Object]';
this might help you simplify
mitcho
@mitchoSR_twitter
heres something i came up with
```
const isObj = obj => Object.prototype.toString.call(obj) === '[object Object]';

const mapKeysVals = R.curry(
    (fkey, fval, obj) =>
        Object.entries(obj)
            .map(([k, v]) => [fkey(k), fval(v)])
            .reduce((a, c) => ({ ...a, [c[0]]: c[1] }), {})
);

function deepMapKeys(x) {
    return R.ifElse(
        isObj,
        mapKeysVals(R.toUpper, deepMapKeys),
        R.identity
    )(x);
}
```
and a ramda repl https://tinyurl.com/u66ebl6
not sure how useful that is to you, i just wanted to try it myself :P
Brad Compton (he/him)
@Bradcomp
R.is(Object) :smile_cat:
mitcho
@mitchoSR_twitter
true for arrays?
Brad Compton (he/him)
@Bradcomp
yeah :expressionless:
John Hartnup
@ukslim
So, both(is(Object), complement(is(Array))), if that's what you want.
NB you can always replace ifElse(p, f1, identity) with when(p, f1)
eslint-plugin-ramda has a recommended rule for this.
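A quick sketch of that equivalence, with the two combinators reimplemented in plain JS rather than imported from Ramda:

```
// ifElse(p, f, g)(x) picks a branch; when(p, f) is ifElse with identity as g.
const ifElse = (p, onTrue, onFalse) => x => (p(x) ? onTrue(x) : onFalse(x));
const identity = x => x;
const when = (p, onTrue) => x => (p(x) ? onTrue(x) : x); // implicit identity branch

const doubleA = ifElse(x => x > 0, x => x * 2, identity);
const doubleB = when(x => x > 0, x => x * 2);

console.log(doubleA(3), doubleB(3));   // 6 6
console.log(doubleA(-3), doubleB(-3)); // -3 -3
```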
mitcho
@mitchoSR_twitter
nice tip
Rafael Sorto
@rafaelsorto
Thank you guys! I'm grabbing some nice tips from this. I was trying to avoid using so much reduce in order to stick with Ramda, but it seems you also think going with reduce is the best option here. At first I came up with a recursive function that calls itself when the value for a key is an object, but decided to give Ramda a try.