Thurston Sexton
@tbsexton
Actually, pattern-matching on the keys seems even more broken, e.g. when using tuple destructuring in the key:
{('q1',n): '1'+val} = {('q1',2):'12'}
print(n)
Thurston Sexton
@tbsexton

yknow what, I literally asked this in 2019 haha. You said:

@tbsexton Dictionary keys in patterns currently have to be constants, unfortunately--there's no current syntax for looking up keys based on values, only for looking up values based on keys. Feel free to raise an issue for that, though.

Evan Hubinger
@evhub
Yup, that's a limitation of the pattern-matching system—and unfortunately it's a very tricky one to solve, since it leads to a combinatorial explosion of possible matches based on trying to figure out which key should match.
Evan Hubinger
@evhub
(It's the same reason you can't do head + [2, 3] + last = [1, 2, 3, 4].)
Thurston Sexton
@tbsexton

yeah that's a great point. Though I will say, my workaround was to use tuples, which work fine.

It was a bit of an edge case, perhaps. This was data formed from an OpenAPI schema, so it had lists of single-pair dicts: [{k1: v1}, {k2: v2}, ...]. So I was pattern-matching on only one key to dispatch to whatever function needed to be executed on the value for a given key. I knew the dicts were singletons, so it was OK to use match def f(d is dict) = f(d.items() |> list |> .[0]) and do all the logic in tuples.
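For readers following along in plain Python, the single-key dispatch on a singleton dict described above can be sketched like this (the "q1" handler and the `dispatch` name are hypothetical, just to illustrate the shape):

```python
def dispatch(d):
    # d is known to be a single-pair dict whose key is a tuple,
    # e.g. {("q1", 2): "12"} -- unpack the one (key, value) pair
    ((key, n), val), = d.items()
    if key == "q1":
        return (n, val)  # hypothetical handler for "q1" entries
    raise ValueError(f"no handler for key {key!r}")
```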

I do have another question: what would the equivalent of yield from be in terms of maps?

When using a nested lazy sequence, even calling list on the resulting generator usually returns a list of a bunch of lazy sequences (of possibly unknown depth). Is there a command/pattern for recursively consuming/yielding all generators? I know in the past I used for gen in super-gen: yield from gen, but alas, this inhibits my quest to be rid of for-loops ^_^
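For reference, a recursive version of the for gen in super_gen: yield from gen pattern can be sketched in plain Python as a lazy generator (assuming "generator" here means any non-string iterable):

```python
from collections.abc import Iterable

def flatten(x):
    # Recursive "yield from": walk a tree of nested iterables,
    # yielding the leaves lazily in order (a DFS over the leaves).
    if isinstance(x, Iterable) and not isinstance(x, (str, bytes)):
        for item in x:
            yield from flatten(item)
    else:
        yield x
```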

Thurston Sexton
@tbsexton
(p.s. this is a super common pattern in e.g. Pandas dataframe construction, where there might be a single generator you pass to the constructor that yields rows, but internally you want it to be defined with sub-generators. Obviously I'd prefer to build these with map/fmap as opposed to for ... yield from, but I always seem to end up with a top-level list full of unexecuted generators.)
Thurston Sexton
@tbsexton
I did figure out a single-level version of this... top_gen |> (::) <*.. map$(nested_gen) but this is not recursive... it cannot exhaust all generators in the "stack". Mostly I just don't know what I'd call this; I'm imagining it's like a "flush", where every generator gets exhausted if it exists, returning its results (e.g. into a list).
Evan Hubinger
@evhub
@tbsexton Maybe try:
def eval_iters(() :: it) = it |> map$(eval_iters) |> list

addpattern def eval_iters(x) = x
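A plain-Python sketch of that eval_iters (assuming the () :: it pattern matches any non-string iterable) makes its behavior concrete. Note that it evaluates nested iterators into nested lists without flattening them:

```python
from collections.abc import Iterable

def eval_iters(x):
    # Recursively force nested iterables into nested lists.
    # Nesting is preserved: this evaluates, it does not flatten.
    if isinstance(x, Iterable) and not isinstance(x, (str, bytes)):
        return [eval_iters(item) for item in x]
    return x
```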
Thurston Sexton
@tbsexton
@evhub huh actually when you write it out like that, this is just an iterator over a "tree" of iterators to grab all the leaves in order.
like a DFS where we just return the leaves
we may not need the |> list part, rather leaving as a generator. the whole "stack" of generators would get flattened into one generator over the leaves
Evan Hubinger
@evhub
No, I don't think so—the function as I wrote it shouldn't flatten anything, and without list I don't think it does anything.
Evan Hubinger
@evhub
Here's a more general function that might help showcase what eval_iters is doing:
def recursive_map(func, () :: it) =
    it |> map$(recursive_map$(func)) |> func

addpattern def recursive_map(func, x) = func(x)

def list_it(() :: it) = list(it)

eval_iters = recursive_map$(list_it)

(|1, (|2, 3|), 4, (|5, 6|)|) |> eval_iters |> print
Henning
@henningsway
I use VSCode (Insiders) with the new native Notebook API for interactive Python scripting. As a former R user, the functional possibilities of Coconut appeal to me, but I have had trouble selecting a Coconut kernel in VSCode so far. It does work great for Jupyter in the browser, though. Is there a chance to get this to work in VSCode as well? :)
Evan Hubinger
@evhub
@henningsway Theoretically, VSCode supports TextMate syntax highlighting, which already exists for Coconut here—though I don't know how to get VSCode to use that file (I don't use VSCode myself). If you figure it out, I'd appreciate it if you could create a package or at least put the instructions here and I'll add them to the documentation.
kobarity
@kobarity
@evhub, I just published a VSCode extension kobarity.coconut based on your Coconut.tmLanguage. Thanks for your great work. Actually, another extension based on an older version of Coconut.tmLanguage already exists, but it seems unmaintained. My new extension supports syntax highlighting in Coconut files and in Coconut code blocks in Markdown.
kobarity
@kobarity
@henningsway, I tried VSCode Insiders and could select the Coconut kernel. First I installed the Jupyter, Python, and Coconut extensions. Then I connected to a remote Jupyter server and selected the Coconut kernel.
Evan Hubinger
@evhub
Thanks @kobarity! I'll add that to the documentation.
Uroš Nedić
@urosn
Does Coconut support typeclasses? Does Coconut support Unicode function names (combined with infix notation, we could do very modern things)?
Evan Hubinger
@evhub
@urosn For typeclass-like syntax, you can use addpattern to match a different function depending on the input type, e.g.:
match def to_json(x is str) = repr(x)
addpattern def to_json(x is int) = str(x)
addpattern def to_json(x is list) = x |> map$(to_json) |> list |> str
For unicode function names, Coconut supports PEP 3131 if given --target 3 (or above) and also supports a host of unicode alternatives to built-in operators.
Evan Hubinger
@evhub
Currently, as per PEP 3131, non-alphanumeric variable names are disallowed, though Coconut could theoretically add support for them if there's a compelling use case—feel free to raise an issue for that if you want it.
Also, for typeclasses, you can always just use typeclasses or something, though addpattern is probably going to be better than any library like that if you're working in Coconut.
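For comparison, the addpattern-based to_json above maps closely onto plain Python's functools.singledispatch, which likewise picks an implementation based on the argument's type:

```python
from functools import singledispatch

# A plain-Python analogue of the addpattern-based to_json above.
@singledispatch
def to_json(x):
    raise TypeError(f"cannot serialize {type(x).__name__}")

@to_json.register
def _(x: str):
    return repr(x)

@to_json.register
def _(x: int):
    return str(x)

@to_json.register
def _(x: list):
    return str([to_json(item) for item in x])
```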
fakuivan
@fakuivan
Hello
I'm using coconut as a replacement for the python kernel on jupyter
It's been great so far
There were some issues with the specs in kernel.json, though: I installed Coconut with pip3.9, but the argv array only referred to "python" and not to "python3.9" or the full path.
Also, is there an analogue to toolz's valmap function?
fakuivan
@fakuivan
coconut is great, thanks for making this c:
Evan Hubinger
@evhub
@fakuivan If you run coconut --jupyter, it should automatically install a coconut kernel for the version that you run that command under. As for valmap, you can use fmap:
def valmap(val_func, input_dict) = input_dict |> fmap$((k, v) -> (k, val_func(v)))
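If you're not using fmap, the same valmap is a one-line dict comprehension in plain Python:

```python
def valmap(val_func, input_dict):
    # Apply val_func to every value, keeping keys unchanged.
    return {k: val_func(v) for k, v in input_dict.items()}
```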
fakuivan
@fakuivan
Wait
Does fmap know to destructure the (key, value) tuple?
Evan Hubinger
@evhub
Yep! That's how fmap works on abc.Mapping objects.
servadestroya
@servadestroya
Hello
Is it possible to define fmap on new types? Like addpattern def fmap(f, obj is MyClass) = obj.map(f)
oh I see
the __fmap__ dunder
It would be nice to have fmap be a pattern-matched function, so that this functionality could be added to third-party libraries.
Evan Hubinger
@evhub
@servadestroya __fmap__ is the way to do it, yep. If you want to add fmap support to a third-party object, you can always monkey-patch in an __fmap__ with
def ThirdPartyObject.__fmap__(self, func) = ...
Alternatively, if you want to use pattern-matching, you could also just define your own new fmap with
match def my_fmap(obj is ThirdPartyObject, func) = ...
addpattern def my_fmap(obj, func) = fmap(obj, func)
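The monkey-patching approach works the same way in plain Python; here ThirdPartyObject is a hypothetical stand-in for a class defined in a library you don't control:

```python
class ThirdPartyObject:
    # Hypothetical stand-in for a third-party class you can't modify.
    def __init__(self, data):
        self.data = data

def _fmap(self, func):
    # Return a new instance with func applied to each element.
    return ThirdPartyObject([func(x) for x in self.data])

ThirdPartyObject.__fmap__ = _fmap  # patch the method in after the fact
```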
Pedro Queiroga
@pedroqueiroga

Hello friends, is this a bug?

>>> [-1] + rest = [-1,2,3,4]

CoconutParseError: parsing failed (line 1)

Coconut: Version 1.5.0 [Fish License] running on Python 3.9.2 and Cython cPyparsing v2.4.5.0.1.2

Evan Hubinger
@evhub
@pedroqueiroga Definitely a bug, but I just tested it on the latest coconut-develop and it looks like it's already fixed, so just pip install -U coconut-develop to get the fix.
anthonymccanny
@anthonymccanny

Hey all, I'm really new to Coconut, and I think this is the right place for this question, but if I'm wrong, let me know.
Is there any way in Coconut to pipe an object and then call a method of that object, passing one of that object's attributes as an argument of the method?
This question really arises from using pipes with pandas where I want to use a pipe to process a dataframe, and in that pipe change the index of that dataframe and then later access this new index as an argument within a .groupby() call or something similar. Of course I could assign a new dataframe variable, and complete this task but I was hoping there was a way to avoid it.
Here is a simple (and useless) example of what I'd like to do:

class A_Class:
    attribute = 1

    def method(self, argument):
        print(f"The argument is {argument}")

an_object = A_Class()

an_object |> \
    .method(.attribute)

Output: The argument is operator.attrgetter('attribute')
My Desired Output: The argument is 1

Is there any way to do what I'm trying to do? Am I misunderstanding something crucial? Thank you!

Evan Hubinger
@evhub
@anthonymccanny I think you probably just want to use a lambda for that case—so maybe something like:
result = (
    an_object
    |> o -> o.method(o.attribute)
)
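In plain Python terms, that pipe-into-lambda is just an immediately applied function. Reusing A_Class from the question (with method returning rather than printing, so the result is visible):

```python
class A_Class:
    attribute = 1

    def method(self, argument):
        # return instead of print so the result can be inspected
        return f"The argument is {argument}"

an_object = A_Class()

# plain-Python equivalent of: an_object |> o -> o.method(o.attribute)
result = (lambda o: o.method(o.attribute))(an_object)
```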