Gregg Irwin
@greggirwin
@GiuseppeChillemi best to point people back to us here, or where we post team info in the future.
GiuseppeChillemi
@GiuseppeChillemi
@greggirwin So you are suggesting a link to your Gitter post?
Gregg Irwin
@greggirwin
Sure. Or just tell people there is a small team and if they come here, they'll meet some of us. :^)
GiuseppeChillemi
@GiuseppeChillemi
Of course! The best part of life is being together, not reading a web page with no people around.
Palaing
@Palaing
@greggirwin thanks, that (CSV codec) will certainly be better than the one I made!
Petr Krenzelok
@pekr
@rebolek Where should I report findings and/or questions about the CSV codec? First, when I try to use the delimiter parameter ";", all I get is an empty block. Second, `load-csv/with #";"` returns an error stating that the delimiter can't be a char! type, though its help string states otherwise. And lastly, why are some lines enclosed in quotes, whereas most of the lines are enclosed in {}?
Another question: why does the default mode return a block, whereas load-csv/header returns a map?
Boleslav Březovský
@rebolek
@pekr thanks for the report! I'm on my phone now, so I'll look at it in the evening and let you know. The "delimiter can't be a char" error looks very strange; there are tests for it and they pass.
If you can send me your CSV privately, it would really help.
Petr Krenzelok
@pekr
Btw, the difference between the "" and {} is there even with read/lines. I am on the phone right now too ....
Btw, is there any plan to prettify console output of help on objects?
Gregg Irwin
@greggirwin

"" vs {} came up not long ago. It's Rebol's design, and confuses people. Beyond 50 chars the runtime molds strings with {} instead of "". If, for example, truncated console output always added a closing } that would be useful, because you could persist console sessions.
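To make the molding rule concrete, a quick console sketch (behavior as described above; the exact 50-char threshold may vary by build):

>> "a short string"
== "a short string"
>> append/dup copy "" #"x" 60
== {xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx}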

Object help alignment looks like a bug/regression. @bitbegin, I hate to say it, but it was your recent help changes that caused it.

Good catch @pekr.
Gregg Irwin
@greggirwin

why default mode returns block, whereas load-csv/header returns a map?

The rationale here is that /header means it should be used, not just that it exists. If you want the default format, and to separate the header, we (you? ;^) can add an example like this to the wiki page:

>> s: {a,b,c^/1,2,3^/4,5,6}
== "a,b,c^/1,2,3^/4,5,6"
>> data: load-csv s
== [["a" "b" "c"] ["1" "2" "3"] ["4" "5" "6"]]
>> hdr: take data
== ["a" "b" "c"]
>> data
== [["1" "2" "3"] ["4" "5" "6"]]
Petr Krenzelok
@pekr
I can tell you just one thing, and you'll most probably agree: there's always a tradeoff between simplicity and the features offered. I gave the codec a try and found some bugs or inconsistencies with just a few tries. Well, bugs might get fixed, but I don't even want to think about why I should get a block vs. a map. What for? Just thinking about that aspect is probably more complex than it is useful to me.
I've worked with CSVs for 15-20 years with Rebol. I never had to use more than just a few lines of code, and it was always consistent: `data: read/lines %file.csv`, then `remove first data` if there is a header present ... and then finally the loop `foreach row data [items: parse/all row ";" ...]` ... done ....
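That manual approach, sketched as runnable Red (file name hypothetical; Red's split plays the role of R2's parse/all here):

data: read/lines %employees.csv     ; hypothetical file; one string per line
remove data                         ; drop the header row, if present
rows: collect [
    foreach line data [
        keep/only split line ";"    ; one block of fields per row
    ]
]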
Gregg Irwin
@greggirwin
It's still open for discussion, and is a tough call. If you get back a block of blocks by default, and use /header, what do you expect to get back?
Petr Krenzelok
@pekr
I will surely give it another try. The truth is that I never faced a delimiter other than semicolon. Constructing the resulting converted CSV was once again a foreach loop with write/append applied ....
OTOH I have also faced space-delimited data: fields of a fixed length. I missed R2's default left-pad function back in those days ...
Gregg Irwin
@greggirwin
I noted at one point that I've almost always taken the low level approach myself, because that's all we had, and it wasn't worth writing higher level mappings when you're trying to get a single job done, just for you. With this design we're trying to save lots of people effort over time.
Looking at past projects, there were two big ones where these features would have helped me a lot, and I probably should have taken the time to implement them. One was data aggregation and analysis (column model); the other was a data-scraping system that mapped data to records. In the latter case, whether you use "records" or just blocks depends on what you're tying it to, e.g. a SQL model for a DB, a grid UI that takes blocks, or a CRUD system.
Petr Krenzelok
@pekr
The problem is/was that Rebol's low-level approach was flexible enough to get things done in just a few lines of code :-)
Gregg Irwin
@greggirwin
And you still can.
That's even the default.
Gregg Irwin
@greggirwin
Another argument for the extended features being standard is that it facilitates data exploration and verification, including generic tools we can offer.
Having concrete examples is great @pekr, so if you have old code you port to use the codec, and can post both examples, that's helpful.
Boleslav Březovský
@rebolek

The truth is that I never faced other delimiter than semicolon.

CSV is comma-separated values, so comma is the default separator. Anyway, loading data with a semicolon should be as easy as `load-csv/with data #";"`. If it doesn't work, there may be a bug. We have tests for it, but it's possible I overlooked something.

>> load-csv/with {a;b;c^/1;2;3^/4;5;6} #";"
== [["a" "b" "c"] ["1" "2" "3"] ["4" "5" "6"]]

Anyway, what you describe, `parse data #";"`, is simply not enough. You were lucky that it always worked for you, or you were using very simple data.

>> to-csv/with [["@pekr" {convert this ; with your "parse loop"}]["@pekr" "I think it won't work"]] #";"
== {@pekr;"convert this ; with your ""parse loop"""^/@pekr;"I think it won't work"^/}
Petr Krenzelok
@pekr
The CSV name might suggest that, but for me comma is the last value to turn up, even though it's used by default. That's just my experience: always semicolon. It works without the /with refinement, and returns an error with one :-)
I work with company enterprise data and always check the length of the parse result, returning an error if it is not the same value through the whole data set. I know, call me just lucky :-)
Boleslav Březovský
@rebolek
@pekr as shown above, it works with /with refinement. If you have an example where it doesn't work, I'll be happy to see it and improve the codec.
Petr Krenzelok
@pekr
I cannot provide the whole dataset (company employee data from the HR system), but here is a sample:
>> data: load-csv/with ";" "12345;Krenzelok;Petr;;19.12.2018;;19.12.2018;19.12.2018;"
== [[""]]
>> data: load-csv/with #";" "12345;Krenzelok;Petr;;19.12.2018;;19.12.2018;19.12.2018;"
*** Script Error: load-csv does not allow char! for its data argument
*** Where: load-csv
*** Stack: load-csv
shouldn't both cases work? Or am I doing something incorrectly?
Vladimir Vasilyev
@9214
@pekr you do realize that the arguments are mixed up in your example?
So the codec actually behaves as expected.
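For reference, with the arguments in the documented order (data first, delimiter in the refinement), the call from the sample above would be written as (output omitted, since trailing-delimiter handling isn't shown elsewhere in this log):

load-csv/with "12345;Krenzelok;Petr;;19.12.2018;;19.12.2018;19.12.2018;" #";"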
Petr Krenzelok
@pekr
Ah, sorry then: data first, delimiter last. Years without active rebolling, that's just embarrassing on my side :-(
Boleslav Březovský
@rebolek
@pekr it's a refinement...
Ok, no problem :)
Petr Krenzelok
@pekr
I was confused by not using any refinement. It "loaded", and I thought the codec was not very useful, as it returned a block of strings, which I then parsed using split. That's why I wondered how it is useful compared to a "manual mode".
Boleslav Březovský
@rebolek
Just switch your args and it should be much more useful ;)
GiuseppeChillemi
@GiuseppeChillemi
(image attachment: image.png)

Hi, the Red console is totally hung trying to execute the following code:

probe bookmark-source:  to-file "/C/Users/mybook/AppData/Local/Google/Chrome/user data/Default/Bookmarks"

probe read bookmark-source

halt

My bookmark file is big: about 65,500 entries, a 53 MB file.
Using the latest (06 October) build.
GiuseppeChillemi
@GiuseppeChillemi
It seems the console does not like very big printouts.
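If the hang really is in the printing rather than the reading (an assumption), a sketch that avoids dumping the whole 53 MB to the console:

bookmark-source: to-file "/C/Users/mybook/AppData/Local/Google/Chrome/user data/Default/Bookmarks"
data: read bookmark-source           ; read the file into memory
print ["length:" length? data]       ; print a summary instead of the content
print copy/part data 500             ; peek at the first 500 characters only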
Semseddin Moldibi
@endo64
For the GUI console, yes; the CLI console can handle that. I just tested with a text file of over 50,000 lines: if the CLI console is minimized it is almost instant, otherwise it takes some time, but not too much (around 20-30 seconds).
GiuseppeChillemi
@GiuseppeChillemi
(image attachment: image.png)
It has not displayed anything on screen, only the HALT message.
It is totally hung and I had to quit.
GiuseppeChillemi
@GiuseppeChillemi

Well, I have changed the code to experiment.

I read several webpages, and the GUI is not updated until the last network read. Instead, it should print output on each read round.

sites: [
    http://www.repubblica.it
    http://www.virgilio.it/
    http://www.slashdot.org
]


forall sites [
    counter: 0
    until [
        counter: counter + 1
        page: first sites    
        page-text: attempt [read page]
        success: not none? page-text
        print ["page: " page " counter: " counter " Success: " success]
;        probe page-text
        counter > 1             ; stop after two attempts per site
    ]
]

Is it normal? Should I report it in the bug section?