Hi there! I'm new to Coconut (amazing project, by the way!!), and I was wondering if there's a way to enable `--mypy` type checking in a Jupyter kernel. I tried starting the kernel with `python -m coconut.icoconut --mypy`, but this gives me an error saying basically that `--mypy` is not a valid argument. I'm wondering if there's a way to forward along `--mypy` to the underlying Coconut runtime?

Or alternatively (more jankily), is there a way to enable mypy (e.g. some magic constant I can set to `True`) within the already-running Coconut runtime in order to have it enable mypy type checking?

Thank you!
@michaeltingley There's currently no way to do this using the Coconut Jupyter kernel, but you can do it with the Coconut Jupyter magic if you do `%load_ext coconut` and then:

```
%%coconut --mypy
x: str = 1
```

which should give you an error. Also, I should note that calling `coconut.icoconut` directly is generally not what you want to be doing; you should probably default to `coconut --jupyter console` or `coconut --jupyter notebook` instead.
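Concretely, from the shell that would just be one of:

```
coconut --jupyter console    # interactive Jupyter console backed by the Coconut kernel
coconut --jupyter notebook   # launch the notebook interface with the Coconut kernel
```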
(There are also a few other warnings, which you can see in that image:

```
CoconutSyntaxWarning: unnecessary from __future__ import (Coconut does these automatically) (line 3)
  from __future__ import print_function, absolute_import, unicode_literals, division

CoconutSyntaxError: variable names cannot start with reserved prefix _coconut (line 4)
  import sys as _coconut_sys
  ^
File "<string>", line unknown
SyntaxError: variable names cannot start with reserved prefix _coconut
  import sys as _coconut_sys
```

)
`addpattern` syntax with lambda expressions. That would absolutely seal the deal for me :smile:
@miutamihai Statement lambdas certainly offer support for pattern-matching syntax, for example:

```coconut
xs |> map$(def ({"a": a}) -> a + 1)
```
And in terms of using `addpattern`, it's just a function, so you can certainly do something like:

```coconut
def ignore_zero(x, 0) = float("nan")
safediv = addpattern(ignore_zero)(def (x, y) -> x / y)
```
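Assuming those definitions compile as intended, usage would look roughly like:

```coconut
safediv(1, 2) |> print  # 0.5
safediv(1, 0) |> print  # nan (the ignore_zero pattern matches first)
```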
`coconut-language-server` as its own separate repo. Do you want to make a base repo with the dev-ops stuff you prefer, for us to PR to it? I've been researching how to implement an LSP with pygls: https://github.com/openlawlibrary/pygls
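For reference, a bare-bones pygls server skeleton (assuming pygls >= 1.0, which pulls in lsprotocol) looks something like the following; the name, version, and the single hard-coded completion are placeholders, not anything the actual language server would do:

```python
from pygls.server import LanguageServer
from lsprotocol.types import (
    TEXT_DOCUMENT_COMPLETION,
    CompletionItem,
    CompletionList,
    CompletionParams,
)

# placeholder name/version, just to show the plumbing
server = LanguageServer("coconut-language-server", "v0.0.1")

@server.feature(TEXT_DOCUMENT_COMPLETION)
def completions(params: CompletionParams) -> CompletionList:
    # a real implementation would ask the Coconut compiler for suggestions;
    # this returns one canned item so the server does something visible
    return CompletionList(is_incomplete=False, items=[CompletionItem(label="addpattern")])

if __name__ == "__main__":
    server.start_io()  # speak LSP over stdin/stdout
```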
`conf.py`? The sidebar never scrolls unless I get waaaay down in the docs... just a minor pet peeve ^_^ (https://github.com/ryan-roemer/sphinx-bootstrap-theme/issues/136#issuecomment-186860368)

The `globaltoc_depth: 3` option? If that does what I'm assuming, then the sidebar would just have less "stuff" in it.
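If it helps, this is roughly where that option would live, assuming the docs use sphinx-bootstrap-theme (a sketch, not the actual `conf.py`):

```python
# conf.py (sketch)
import sphinx_bootstrap_theme

html_theme = "bootstrap"
html_theme_path = sphinx_bootstrap_theme.get_html_theme_path()
html_theme_options = {
    # how many levels of the global table of contents show up in the sidebar
    "globaltoc_depth": 3,
}
```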
```coconut
['q' + n, val] = ['q1', 2]
print(n)
# >>> 1
{('q1', n): '1' + val} = {('q1', 2): '12'}
print(n)
```
yknow what, I literally asked this in 2019 haha. You said:

> @tbsexton Dictionary keys in patterns currently have to be constants, unfortunately--there's no current syntax for looking up keys based on values, only for looking up values based on keys. Feel free to raise an issue for that, though.
yeah that's a great point. Though I will say, my workaround was to use tuples, which work fine.

It was a bit of an edge-case, perhaps. This was data formed from an OpenAPI schema, so they had lists of single-pair dicts, `[{k1: v1}, {k2: v2}, ...]` etc. So I was pattern matching on only one key to dispatch to whatever function needed to be executed on the value for a given key. I knew the dicts were singletons, so it was ok to use `match def f(d is dict) = f(d.items() |> list |> .[0])` and do all the logic in tuples.
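In case the shape of that is useful to anyone else, here's a rough sketch of the tuple-based dispatch I mean; the `handle_q1`/`handle_q2` handlers and the `"q1"`/`"q2"` keys are made up for illustration:

```coconut
# hypothetical handlers, just for illustration
def handle_q1(val) = ("handled q1", val)
def handle_q2(val) = ("handled q2", val)

# unwrap a singleton dict into its (key, value) tuple, then dispatch on the key
match def dispatch(d is dict) = dispatch(d.items() |> list |> .[0])
addpattern def dispatch(("q1", val)) = handle_q1(val)
addpattern def dispatch(("q2", val)) = handle_q2(val)

{"q1": 2} |> dispatch |> print  # ('handled q1', 2)
```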
I do have another question: what would the equivalent of `yield from` be in terms of maps?

When using a nested lazy sequence, even calling `list` on the resulting generator usually returns a list of a bunch of lazy sequences (of possibly unknown depth). Is there a command/pattern for recursively consuming/yielding all generators? I know in the past I used `for gen in super_gen: yield from gen`, but alas, this inhibits my quest to be rid of for-loops ^_^
`map`/`fmap` as opposed to `for ... yield from`, but I always seem to end up with a top-level list full of un-executed generator functions

`|> list` part, rather leaving it as a generator. The whole "stack" of generators would get flattened into one generator over the leaves
`eval_iters` is doing:

```coconut
# recurse into nested iterables, applying func to each leaf and to each level's iterator
def recursive_map(func, () :: it) =
    it |> map$(recursive_map$(func)) |> func
addpattern def recursive_map(func, x) = func(x)

# force a lazy sequence into a concrete list; leave non-iterables untouched
def list_it(() :: it) = list(it)
addpattern def list_it(x) = x

eval_iters = recursive_map$(list_it)

(|1, (|2, 3|), 4, (|5, 6|)|) |> eval_iters |> print
```
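If I'm tracing that right, the last line forces every nested lazy list into a concrete list while preserving the nesting, so it should print `[1, [2, 3], 4, [5, 6]]`.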