Oldes Huhuman
@Oldes
@toomasv That is exactly what I was proposing.. not requesting.. that the file should be loadable. But I really don't care about JSON for now.
Vladimir Vasilyev
@9214
By gradually improving and extending JSON... you'll get Redbol. Evidently, all JSON extensions (comments, trailing commas, object literals, multiline strings, rich numeric syntax, etc) are trying to make it a proper subset of EcmaScript, which means that this whole trend was started by web developers trying to re-invent the Rebol wheel of "human readable" data exchange format.
Christopher Ross-Gill
@rgchris
@dockimbel Not sure, even just valid-utf8-char or some such. Something that's tied to a low-level UTF-8 decoder/validator: parse my-binary [some valid-utf8-char]. It may be worth distinguishing character sequences of different sizes in situations where you might have a length constraint.
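A rough sketch of what such a rule might look like today, written as ordinary Red parse rules over binary! input with byte-level bitsets; the names here (valid-utf8-char, valid-utf8?) are illustrative, not the proposed built-in, and this assumes bitsets match raw byte values when parsing binaries:

ascii:   charset [#"^(00)" - #"^(7F)"]     ; 1-byte sequence
cont:    charset [#"^(80)" - #"^(BF)"]     ; continuation byte 10xxxxxx
lead-2:  charset [#"^(C2)" - #"^(DF)"]
lead-3:  charset [#"^(E1)" - #"^(EC)" #"^(EE)" - #"^(EF)"]
lead-4:  charset [#"^(F1)" - #"^(F3)"]
byte-E0: charset [#"^(E0)"]
byte-ED: charset [#"^(ED)"]
byte-F0: charset [#"^(F0)"]
byte-F4: charset [#"^(F4)"]
cont-A0: charset [#"^(A0)" - #"^(BF)"]
cont-9F: charset [#"^(80)" - #"^(9F)"]
cont-90: charset [#"^(90)" - #"^(BF)"]
cont-8F: charset [#"^(80)" - #"^(8F)"]

valid-utf8-char: [
    ascii
    | lead-2 cont
    | byte-E0 cont-A0 cont          ; rejects overlong 3-byte forms
    | lead-3 cont cont
    | byte-ED cont-9F cont          ; rejects UTF-16 surrogates
    | byte-F0 cont-90 cont cont     ; rejects overlong 4-byte forms
    | lead-4 cont cont cont
    | byte-F4 cont-8F cont cont     ; rejects code points above U+10FFFF
]

valid-utf8?: func [data [binary!]][parse data [some valid-utf8-char]]

Distinguishing sequences of different lengths, as suggested, would just mean exposing the per-length alternatives as separate rules.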
Nenad Rakocevic
@dockimbel

@toomasv The ECMA-404 "The JSON Data Interchange Syntax" standard spec says in paragraph 2:

A conforming JSON text is a sequence of Unicode code points that strictly conforms to the JSON grammar defined by this specification.

A conforming processor of JSON texts should not accept any inputs that are not conforming JSON texts. A conforming processor may impose semantic restrictions that limit the set of conforming JSON texts that it will process.

Which contradicts the RFC... Is there anything in the web stack that is not a mess? ;-)

Gregg Irwin
@greggirwin

People keep commenting, but nobody speaks to my points here :point_up: September 17, 2019 7:30 PM. (Thanks @9214 for referencing it in cleanup.) Please say whether you agree with the points I make there, though you can still disagree with my choice. I'd also like to hear if you agree with my closing thought.

Let me ask a slightly different question. Anecdotally, many of us have never been affected by comments in JSON in the wild. Who here has ever needed this feature, because some API returned JSON with comments in it? I don't mean finding an example JSON config file in some JS project, but in your own work. That may include your own JSON configs, if noted as such. I ask because we don't have data; those who want this feature represent the JSON-commenters, but we don't know how many of them exist, or if they even care.

@x8x, would it help in your case to have Red support JSON comments? That is, if you're using Red to load the config, just use Red. If you're using JS tooling, don't use Red. If you're driving both JS and Red tooling with a single config, that's a special use case. (And, using JSON for configs makes you a rule-breaker, but we knew that about you already. ;^)

Nenad Rakocevic
@dockimbel
@greggirwin Maybe a separate cleaner lib, as you propose, covering common data exchange formats (JSON/XML/HTML/CSV/...), would be a useful tool to "fix" non-conforming files before passing them to the codecs. So we don't have to cripple our codecs with arbitrary non-conforming "extensions".
Toomas Vooglaid
@toomasv

@dockimbel That's really interesting. From ECMA-404:

(para2) This specification, ECMA-404, replaces those earlier definitions of the JSON syntax. Concurrently, the IETF published RFC 7158/7159 and in 2017 RFC 8259 as updates to RFC 4627. The JSON syntax specified by this specification and by RFC 8259 are intended to be identical.
[...]
(para3) This specification and [RFC8259] both provide specifications of the JSON grammar but do so using different formalisms. The intent is that both specifications define the same syntactic language.

And RFC 8259 repeats RFC 7159 verbatim, as referenced above.

Vladimir Vasilyev
@9214

@dockimbel

Maybe a separate cleaner lib as you propose

I second that. A dedicated clean dialect / module to clean up messy data in external formats (if possible at all) and make it conform to the respective specifications. This keeps codecs tiny and robust, while untying our hands in how to handle so-called "common out-of-standard use cases"; and it's a good project idea in itself, one that leverages Red's key strengths and can possibly attract users from other languages.

Gregg Irwin
@greggirwin
And it's a great parse example. If designed properly, it's also something others can extend.
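As a sketch of the kind of pre-cleaning pass being discussed, here is a minimal parse-based comment stripper; it assumes // and /* ... */ style comments (as in JSON5/JSONC) and takes care not to touch string literals. The name and details are made up, not a proposed API:

strip-json-comments: func [json [string!] /local non-special result s c][
    non-special: complement charset {"\}
    result: make string! length? json
    parse json [
        any [
            ; copy string literals verbatim, so "//" inside a value survives
            copy s [#"^"" any ["\" skip | non-special] #"^""] (append result s)
            ; drop // line comments (the trailing newline goes with them)
            | "//" [thru newline | to end]
            ; drop /* ... */ block comments
            | "/*" [thru "*/" | to end]
            ; anything else passes through unchanged
            | copy c skip (append result c)
        ]
    ]
    result
]

probe strip-json-comments {{"url": "http://ex.org", /* temp */ "n": 1}  // note}

The codec itself stays untouched; the cleaner only hands it conforming text, at the cost of a second pass over the source, as noted below.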
Petr Krenzelok
@pekr
You have to have two parse passes through one source, right? Not sure it is a problem, speed-wise. But it would not work for a "streamed" parser, if we see one some day?
Vladimir Vasilyev
@9214

Another point that only a few here seem to get (and that @greggirwin already articulated): if the core team, as language designers, accept this proposal, then it opens up a Pandora's box for other similar feature requests, with no end in sight. So, I agree with the no-comment stance; we need concrete data, not anecdotal evidence, and that's especially crucial with JSON, because, being one of the first codecs, it sets the tone for all future format parsers.

As far as I understand, comments in JSON are used only in configuration files maintained by humans, and they are never included in API responses and other automatically generated data. And I already said that using JSON for handwritten configs makes no sense in the Red universe; so, what we need are examples (both real and hypothetical) of Red interfacing with JSON-based tooling that uses said "messy" configs.

Gregg Irwin
@greggirwin

It's interesting to look at the tradeoffs and design intent behind a spec. IIRC, Crockford wanted to keep JSON's grammar as simple as possible, to facilitate writing correct parsers and consistent generators. Some aspects also came from JS itself, of course. Quoted keys, no trailing commas, removing comment support, no leading zeros on numbers, the Unicode char format; all conscious choices.

For us, it's also interesting to note that Rebol never had a formal grammar defined. @meijeru has taken on the monumental task of formalizing one for Red, which covers a lot. We offer libRed for others, so we are the reference standard, but Ren (ren-data.org) was intended to be a Redbol spin on JSON: define literal forms that could be consistently shared between implementations. I still want to revive that, but there are open questions to be answered.

We make these choices all the time, and few are easy. Consider date!'s format. It now supports T as well as / as the time separator, which Rebol consciously did not. We believe / is a better fit for our world, but T makes it match the ISO8601 spec, which is widely used. But note that we don't make T the default. We need to move forward, and make Red the best it can be, while still helping people adopt it. Helpers and band-aids may seem inconvenient, but the idea is that they can be added if needed, and removed over time, without affecting the core.
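To illustrate the date! point with a couple of console lines (assuming current behavior; the exact accepted and molded forms may vary between builds):

>> 2019-09-28T17:30:00           ; T separator, RFC 3339 style
== 28-Sep-2019/17:30:00
>> 28-Sep-2019/17:30:00          ; / separator, which is also the molded default
== 28-Sep-2019/17:30:00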

Dave Andersen
@dander
My sense is that most people using the tools just don't want to be blocked because they can't load a document. Whether that is possible from the built-in codec or a separate module is probably less important to a user of Red than whether they need to write their own parser or not. I understand that cleaning data is a big part of AI work too. That sounds like a great project.
Gregg Irwin
@greggirwin
I should have said RFC3339, rather than ISO8601 above.
Nenad Rakocevic
@dockimbel
FYI, we have postponed the switch of View code to a submodule to tomorrow morning, as we are not fully ready yet.
GaryMiller
@GaryMiller
Apparently almost 62,000 Javascript users can't live without JSON comments! https://www.npmjs.com/package/comment-json
Nenad Rakocevic
@dockimbel

@GaryMiller

supports comments everywhere, yes, EVERYWHERE in a JSON file, eventually
Stringify the objects into JSON strings with comments if there are

So fully non-compliant with all the official JSON specs! :+1:

Nenad Rakocevic
@dockimbel
From what I can find online, there are 10M+ JS developers in the world, so:
>> to-percent 62e3 / 10e6
== 0.62%
Gregg Irwin
@greggirwin

@GaryMiller good find. Not sure we can say "can't live without". ;^) At a quick glance, 58K downloads of json-parser, which depends on it, account for the vast bulk. json-parser is deprecated. It also doesn't tell us how much it's actually used. Still, it's a metric. It would be good to know, for example, how many use it to keep comments, rather than just stripping them. Because I don't think we're going to keep comments, especially after looking at the results for how they do it. If you need commentary in JSON, that's fine. Include it as data. It breaks no rules, it round-trips by default, and can easily be ignored by processors. The argument against bloat there would only apply in data exchange scenarios, where it would still be a better solution than comments, because every JSON system would support it by default.

It would be great to extract some of this chat, so the rationale isn't lost.
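A minimal illustration of the "include it as data" idea, reusing the codec call shown later in this chat (the key name "_comment" is just a convention, nothing special to the codec):

; commentary carried as ordinary data instead of comments,
; so it round-trips through every conforming JSON tool
data: load/as {{"_comment": "debug run, see tracker", "use_tb_logger": true}} 'JSON

Processors that don't care about the note simply ignore the key.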

Oldes Huhuman
@Oldes
The rationale is that, although it is quite easy to add a few lines to the existing parser, there is no will to do it, and you would rather let people write their own parsers if they need it and do two load passes. But can we just move on to something else? I really don't want to read anything about JSON anymore.
GaryMiller
@GaryMiller
True, but if one developer pulls it down with NPM to our server, our other 2,999 developers don't have to. So I should correct myself and say at least 62,000 shops that develop in Javascript, instead of individual developers. But I know too that not all shops have 3,000 developers. Where I see comments used a lot is in NoSQL databases, which are called document databases and are basically just large JSON collections distributed across multiple servers. They are commonly versioned, which means that new JSON data comes and other JSON data goes from version to version. Usually it's safer to comment out obsolete JSON data for a while, until you're sure you won't need to roll back to an earlier version.
Stan. S. Krupoderov
@iNode
@9214 "quantity isn't quality" - I can agree there, why one would write or use another JSON parser(s) if original one is good enough? Definition of "good enough" may differ of course as in this case. And/but having a number of compeeting libraries may make some sense from "evolution" stand point of view in case if community is quite big and have a lot of spare resources. But it's hardly applicable for Red community and we need more focus there. So duplicate libraries are waste at this stage and proposal to have your own parser when original one is present it is proposal to waste community time. And I'm OK with the point to save core team resources on some "minor" things, from their point of view, but it shouldn't be hidden under "we want to follow the spec in this case".
Vladimir Vasilyev
@9214
@Oldes it's quite easy, technically, to jump from the roof. You know why very few actually do that? Same reason we weigh in on JSON comments: it has far-reaching consequences. I think everyone agrees that Red should have an RFC-conforming codec, but whether or not it needs a JSON5 one is an open question, and to answer it we need some first-hand experience.
Oldes Huhuman
@Oldes
My experience was that I took the first JSON file I found and it failed to load... as someone who is not following JSON development, I was not aware that comments are not legal anymore, and thought that it was a Red bug. That's all... and now I can jump elsewhere.
Gregg Irwin
@greggirwin
I declare the JSON comment case...closed. <bangs gavel>
Vladimir Vasilyev
@9214
Actually, two interesting points:
  • The original JSON file that spawned this thread was used, as far as I can tell, to train GANs, and the comments in it are either a hacky way to manually patch the model's options, or debug leftovers.
  • It got replaced by YAML.
GiuseppeChillemi
@GiuseppeChillemi
Developers should receive the reason for the failure, to avoid this flow of thoughts: other tools load this JSON file -> Red does not -> the Red codec is buggy. My intuition tells me that in such a situation it is common to mistake a good, compliant implementation for a buggy one: if everything else loads the file and a good tool does not, then the good tool becomes the bad one. This makes me think about people following rules in a territory where people do the opposite: good guys become bad guys and vice versa! I see a pattern... I could call it an inversion of perceived law.
Gregg Irwin
@greggirwin
Red clearly tells you the offending JSON is invalid @GiuseppeChillemi.
hiiamboris
@hiiamboris

wow such a hot debate..

@dockimbel

@hiiamboris About your Red stack related post :point_up: September 10, 2019 9:59 AM:

I don't understand your point here. The stack is a working place for Red functions, so it's heavily used in many different places, as a lot of processing happens on the stack.
You are making a generality from a specific case (and you've picked up one of the worst). Fortunately, it's not that bad, even though the current stack handling is not fully satisfying.
I don't see how that would be helpful. Which problem are you trying to solve with that?

The point is that when I make Red function calls from R/S, I have to make sure my stack layout is correct. But I have no means to test for that correctness. Okay, sure, I can write a ton of asserts, but this approach has some shortcomings..

Anyway, it was just a vague idea born from pretty brief R/S exposure, that I wanted to share with you. If you say this problem is uncommon, I trust your judgement :)

GiuseppeChillemi
@GiuseppeChillemi
@greggirwin does Red tell you the reason?
Gregg Irwin
@greggirwin
@GaryMiller thanks for the use case info.
@GiuseppeChillemi do you mean "Your JSON contains comments, which are not valid."? No, because then our codec would become massively bloated with checks for every possible invalid construct someone could come up with.
That could be another external utility though.
Oldes Huhuman
@Oldes
@greggirwin the funny thing is that Red actually doesn't load the file for a reason other than the comments:
load/as  {
  "name": "debug_002_RRDB_ESRGAN_x4_DIV2K"
  , "use_tb_logger": true
} 'JSON
*** User Error: {Invalid json string. Near: {^^/  "name": "debug_002_RRDB_ESRGAN_x4_DIV}}
*** Where: ???
Vladimir Vasilyev
@9214
Well, what do you know. There's a missing pair of wrapping {...}.
GiuseppeChillemi
@GiuseppeChillemi
@greggirwin Just remember that our minds tend to fill uncertainty with their own explanations when no other explanation is provided. Sometimes this can put you off track. However, if the number of error cases is low (< 20?), explaining them would be good. Otherwise (> 20 cases), no explanation, or a generic one, or one divided into classes of errors, is advised.
GiuseppeChillemi
@GiuseppeChillemi
@9214 However, I have not followed the debate on the reason why LOAD will not have an /option refinement to pass a block of options to the internal format-specific data parser/codec. Has this happened here?
Gregg Irwin
@greggirwin
Load spec blocks are a conversation for a later time.
Gregg Irwin
@greggirwin
@Oldes as @9214 said, your example is malformed. If I remove the comments from the original URL-sourced data, it loads fine. So it is the comments.
Oldes Huhuman
@Oldes
:point_up: September 18, 2019 11:20 PM there is no comment.
Gregg Irwin
@greggirwin
Yes, and your content is malformed.
load/as  {{
  "name": "debug_002_RRDB_ESRGAN_x4_DIV2K"
  , "use_tb_logger": true
}} 'JSON
Oldes Huhuman
@Oldes
Ok.. I really have to end with this. Good night.
GiuseppeChillemi
@GiuseppeChillemi
Good night, Oldes, do not despair! We are humans. Our difficulties let us all exchange opinions and evolve. It's by falling down and emitting strange sounds that we learn to walk and talk.
Koba-yu
@koba-yu

Though I have not read and understood all of the discussion here (sorry, I don't have enough time and knowledge), I just want to comment that VS Code treats JSON and JSON with Comments as different languages.

https://drive.google.com/file/d/1rH2lXn1Nms_BDHamyNFMOBin7hgf-ufK/view?usp=sharing

Maybe this is one practical example of JSON and JSON with Comments being treated as different languages.
If Red distinguishes them in the same way, it is understandable that the JSON codec (as opposed to a JSON with Comments codec) does not allow comments.

Gregg Irwin
@greggirwin
Thanks for the info @koba-yu.
Koba-yu
@koba-yu
@greggirwin Thank you for reading!
packetrhino
@packetrhino
Question: is anyone familiar with the "expect" program that was based on Tcl? Is there an equivalent for Red, at least under Linux/Unix systems?
Gregg Irwin
@greggirwin
There is no Expect for Red...yet. I have Exploring Expect on my shelf, and have thought about implementing it at times. Step 1, I think, is a CLI dialect that offers interrogation and feedback for itself and the sub-commands it may launch.
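Not Expect, but a hint of the flavor using only pieces that exist today: call/output captures a command's output, and parse matches against it. Real Expect-style interaction needs bidirectional pipes and timeouts, which this sketch does not attempt; the command string is only an example:

out: copy ""
call/output "uname -a" out                    ; run the command, capture its stdout
either parse out [thru "Linux" to end][
    print "matched: looks like a Linux box"
][
    print "no match"
]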