These are chat archives for juttle/juttle

21st
Jan 2016
David Majda
@dmajda
Jan 21 2016 09:41 UTC
@demmer @go-oleg @rlgomes Re #225: After reviewing the changes and usage of collections in node-grok I decided it makes sense to go forward rather than revert the grok parser. So I just released juttle 0.3.1.
I did it by creating a 0.3.x branch off the v0.3.0 tag, porting the fix there + adapting the changelog (#228), and releasing. Then I merged 0.3.x into master (#229). I think this is the right approach since it means 0.3.1 has only the fix and no other changes. Merging back into master ensures convergence, and I think it's better than git cp because the merge clearly marks that the changes were taken back to master. Any possible future 0.3.x releases should go from the 0.3.x branch again. And I think we should use a similar approach for patch releases in the future (except when there are no commits on master since the last release at the time the need for a patch release is realized).
If you don't have anything against this approach, I'll add it to the Releases doc.
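(For reference, the patch-release flow described above can be sketched with a throwaway demo repo; the file names and commit messages here are made up, only the branch/tag/merge shape follows the description:)

```shell
# Hypothetical demo of the 0.3.x patch-release flow, in a temp repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
main=$(git symbolic-ref --short HEAD)   # master or main, depending on git config

echo v0.3.0 > code; git add code; git commit -qm "0.3.0 release"
git tag v0.3.0
echo feature >> code; git commit -qam "ongoing work on mainline"

# Patch release: branch off the release tag so 0.3.1 carries only the fix
git checkout -qb 0.3.x v0.3.0
echo fix > grok-fix; git add grok-fix; git commit -qm "fix + changelog"
git tag v0.3.1

# Merge back so the fix converges into the mainline, clearly marked as a merge
git checkout -q "$main"
git merge -q --no-edit 0.3.x
```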
Oleg Seletsky
@go-oleg
Jan 21 2016 15:58 UTC
That sounds good to me.
Michael Demmer
@demmer
Jan 21 2016 16:00 UTC
I agree -- that seems great.
Rodney Lopes Gomes
@rlgomes
Jan 21 2016 16:39 UTC
thanks @dmajda
David Cook
@davidbcook
Jan 21 2016 19:15 UTC
Hey guys, I'm trying to use the historical zoom functionality of timechart but the context chart isn't appearing. It seems like it's not appearing because the chart doesn't finish drawing. The chart will draw different lengths of data before it stops updating on different runs. If I replace the timechart with tail 1000 | view table all the data appears.
Oleg Seletsky
@go-oleg
Jan 21 2016 19:16 UTC
@davidbcook, can you post the complete juttle?
David Cook
@davidbcook
Jan 21 2016 19:17 UTC
(
  read file -file 'Archive/Empatica-Rollups/file1.json' -to :1970-01-01T02:29:08.000Z:
  | put city = 'Chicago'
  ;
  read file -file 'Archive/Empatica-Rollups/file2.json' -to :1970-01-01T02:29:08.000Z:
  ;
  read file -file 'Archive/Empatica-Rollups/file3.json' -to :1970-01-01T02:29:08.000Z:
  ;
  read file -file 'Archive/Empatica-Rollups/file4.json' -to :1970-01-01T02:29:08.000Z:
)
| filter time < :1970-01-01T02:29:08.000Z:
| reduce -every :1s: avg_hr = avg(avg_hr), avg_skin_temp = avg(avg_skin_temp), avg_bvp = avg(avg_bvp), avg_eda = avg(avg_eda), avg_rms = avg(avg_accel_rms), stan_dev_accel_rms = sigma(stan_dev_accel_rms)
| split avg_hr, avg_skin_temp, avg_rms, stan_dev_accel_rms
//| tail 1000
//| view table
| view timechart -keyField 'name' -valueField 'value' -display.dataDensity 0 -series [{name: 'stan_dev_accel_rms', yScale: 'secondary'}, {name: 'avg_rms', yScale: 'secondary'}] -title 'Empatica Revenant Data';
removing the -to options doesn't change anything noticeable; same with removing the filter...
Oleg Seletsky
@go-oleg
Jan 21 2016 19:20 UTC
okay i’ll look into it locally and get back to you
David Cook
@davidbcook
Jan 21 2016 19:21 UTC
thanks
Oleg Seletsky
@go-oleg
Jan 21 2016 19:37 UTC
@davidbcook, are you installing outrigger from npm github?
David Cook
@davidbcook
Jan 21 2016 19:37 UTC
yeah
Oleg Seletsky
@go-oleg
Jan 21 2016 19:37 UTC
which one ;-)?
oh oops i meant npm or github
David Cook
@davidbcook
Jan 21 2016 19:37 UTC
oh npm
Oleg Seletsky
@go-oleg
Jan 21 2016 19:38 UTC
hmm, having trouble reproducing what you describe with a simplified json file and juttle
David Cook
@davidbcook
Jan 21 2016 19:39 UTC
darn, I can't share my json files :(
is there another way to sort this out?
Oleg Seletsky
@go-oleg
Jan 21 2016 19:46 UTC
guessing you can’t share the view text output of the juttle program either?
do you see any errors in chrome devtools console?
David Cook
@davidbcook
Jan 21 2016 19:53 UTC
ok, so I hadn't removed some of the unneeded fields in that program above. After removing them, it now works
Oleg Seletsky
@go-oleg
Jan 21 2016 19:54 UTC
well glad it works, can you share what types those fields were?
since you specified both the keyField and the valueField it shouldn’t really have an impact on the chart
David Cook
@davidbcook
Jan 21 2016 19:58 UTC
they were floats
David Cook
@davidbcook
Jan 21 2016 20:16 UTC
Now I have a question about how -timeField is interpreted in the file source. I have a milliseconds since epoch field in a file. When that field was full of integers, the table output of the file said the first point occurred in the year 2000 even though the value for the point was 0. When I changed all the values from ints to floats, 0.00 got interpreted as Thu Jan 1 00:00:00 1970 UTC but then 267000.00 got interpreted as Sun Jan 4 02:10:00 1970 UTC. Weird!
I can share the file with someone privately if that would help
Rodney Lopes Gomes
@rlgomes
Jan 21 2016 21:04 UTC
milliseconds timefield with a value of 267000 ? how old is this data ? show me a few of those timestamps as I'm not sure they're epoch timestamps
if you punch 267000 into http://www.epochconverter.com/ you will get the exact "Sun Jan 4 02:10:00 1970 UTC" you got there
just for reference, today's timestamp as of a few seconds ago: 1453410354
David Cook
@davidbcook
Jan 21 2016 21:21 UTC
So I have data from several different dates that I want to compare. The dates are like 7:30-10 on one day and 6-8:30 on another day so I can't just overlay the dates with -duration. Instead, I'm zeroing the timestamps from each chunk of data based on the start of the data.
Rodney Lopes Gomes
@rlgomes
Jan 21 2016 21:23 UTC
humm ok so you're making all of your ranges start at :0: which is 1970-01-01T00:00:00.000Z
David Cook
@davidbcook
Jan 21 2016 21:23 UTC
Yeah. I thought the -timeField option required milliseconds (due to the conversation here juttle/juttle#166) but if it returns Sun Jan 4 02:10:00 1970 UTC from 267000.00 then it must think 267000.00 is a time in seconds since epoch
Daria Mehra
@dmehra
Jan 21 2016 21:32 UTC
after reviewing your use case, and realizing it’s the common one, the decision was to change default unit for -timeField to seconds.
not sure it’s already done in the build you’re running, can check
but i’m thinking you should still be able to overlay without doing low-level time resetting magic, lemme see.
can you do -duration :22.5 hours: -overlayTime true? or are the time periods of interest at unequal distances from each other?
David Cook
@davidbcook
Jan 21 2016 21:39 UTC
unequal distances
Daria Mehra
@dmehra
Jan 21 2016 21:39 UTC
i see. and no, the issue to treat time as seconds is still outstanding, not merged. so you should be working in milliseconds.
David Cook
@davidbcook
Jan 21 2016 21:39 UTC
and I haven't grabbed a new build since I created that issue
Daria Mehra
@dmehra
Jan 21 2016 21:43 UTC
ok, to put that to rest, yes we are still on milliseconds.
$ cat /tmp/data.json 
[
{ "mytime": 1.00, "a" : "a0" },
{ "mytime": 267000.00, "a": "a1" },
{ "mytime": 267001.00, "a": "a2" },
{ "mytime": 268000.00, "a": "a3" }
]
in juttle treated as ms:
juttle> read file -file '/tmp/data.json' -timeField 'mytime'
┌────────────────────────────────────┬──────────┬──────────┐
│ time                               │ a        │ mytime   │
├────────────────────────────────────┼──────────┼──────────┤
│ 1970-01-01T00:00:00.001Z           │ a0       │ 1        │
├────────────────────────────────────┼──────────┼──────────┤
│ 1970-01-01T00:04:27.000Z           │ a1       │ 267000   │
├────────────────────────────────────┼──────────┼──────────┤
│ 1970-01-01T00:04:27.001Z           │ a2       │ 267001   │
├────────────────────────────────────┼──────────┼──────────┤
│ 1970-01-01T00:04:28.000Z           │ a3       │ 268000   │
└────────────────────────────────────┴──────────┴──────────┘
note that the value “0.00” for -timeField doesn’t work, resulting in Warning: point is missing a time, so make sure you’re not zeroing out the first point to an actual 0
David Cook
@davidbcook
Jan 21 2016 21:46 UTC
Then how come my 267000.00 gives me a date on 1970-01-04 and not 1970-01-01?
Daria Mehra
@dmehra
Jan 21 2016 21:46 UTC
shouldn't
i’m using it above and getting 01-01 at time 04:27
at time 00:04:27 i mean
which is correct: 267,000 ms is 267 s is 4 min 27 s
Rodney Lopes Gomes
@rlgomes
Jan 21 2016 21:47 UTC
might be easier to keep the timestamps as they are, not use -timeField, and then add a | put time = Date.new(timestamp) + :today: so that all of your dates don't start at 0 but instead run from today's midnight forward?
Daria Mehra
@dmehra
Jan 21 2016 21:47 UTC
can you show the bit of code that gave you 01-04?
Rodney Lopes Gomes
@rlgomes
Jan 21 2016 21:48 UTC
267000 seconds are 3 days
@davidbcook ^
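(The two competing readings of 267000 above can be checked with a quick Python sketch; this is just illustrative arithmetic, not Juttle's actual parser:)

```python
from datetime import datetime, timezone

ms = 267000
# Treat 267000 as milliseconds since epoch: 267 s = 4 min 27 s past midnight
as_millis = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
# Treat 267000 as seconds since epoch: just over 3 days past midnight
as_seconds = datetime.fromtimestamp(ms, tz=timezone.utc)
print(as_millis)   # 1970-01-01 00:04:27+00:00
print(as_seconds)  # 1970-01-04 02:10:00+00:00
```

So seeing Jan 4 rather than 00:04:27 on Jan 1 means the value was interpreted as seconds somewhere along the way.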
David Cook
@davidbcook
Jan 21 2016 21:49 UTC
@rlgomes yeah, but jut should be interpreting 267000 as 267000 milliseconds
Daria Mehra
@dmehra
Jan 21 2016 21:49 UTC
it is for me, see code above. there must be something you’re doing differently
David Cook
@davidbcook
Jan 21 2016 21:49 UTC
read file -file 'Archive/events.csv' -timeField 'time_begin_ms' -format 'csv' | view table
the values for time_begin_ms are:
0.00
267000.00
1456000.00
1623000.00
2750000.00
2782000.00
2818000.00
3563000.00
4024000.00
4266000.00
5212000.00
6564000.00
6677000.00
6984000.00
7045000.00
7169000.00
7316000.00
7820000.00
8021000.00
8503000.00
8776000.00
Rodney Lopes Gomes
@rlgomes
Jan 21 2016 21:50 UTC
oh
CSV file ?
csv does not read "types"; everything is a string
as there are no types in csv
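(The CSV typing point can be illustrated with a small Python sketch; the data is made up, and Python's csv module stands in for any CSV reader:)

```python
import csv
import io

# CSV carries no type information: every field is read back as a string
data = "time_begin_ms,event\n0.00,start\n267000.00,next\n"
rows = list(csv.DictReader(io.StringIO(data)))
print(type(rows[1]["time_begin_ms"]).__name__)  # str, not float

# So a timestamp field must be converted explicitly before doing time math
ms = float(rows[1]["time_begin_ms"])
print(ms / 1000)  # 267.0 seconds
```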
Daria Mehra
@dmehra
Jan 21 2016 21:52 UTC
wait up, trying csv.
Rodney Lopes Gomes
@rlgomes
Jan 21 2016 21:52 UTC
csv is reading a string... I'm not sure what our moment parser does with the string containing a number
David Cook
@davidbcook
Jan 21 2016 21:54 UTC
it does something really screwy because if the numbers above are ints, the first two time values are Sat Jan 1 08:00:00 2000 UTC and Wed Jan 1 08:00:00 7000 UTC
Daria Mehra
@dmehra
Jan 21 2016 21:55 UTC
hold on...
Daria Mehra
@dmehra
Jan 21 2016 22:01 UTC
yes we are unable to properly read -timeField from a csv file (doesn’t matter if it has 267000.00 or just 267000)
the workaround is to do this
read file -file 'Archive/events.csv' -format 'csv' 
| put time = Date.new(Number.fromString(time_begin_ms)/1000)
the complication is that Date.new() constructor expects seconds which is why i did that conversion.
Daria Mehra
@dmehra
Jan 21 2016 22:07 UTC
you must have been getting Jan 4 when trying to follow a similar path through Date.new or other time math that was seconds-based. Only -timeField (in the build you’re running) is still expecting ms.
but we won’t be using -timeField here because it’s broken for csv type. I’m filing an issue on that.
let me know if the above works @davidbcook
David Cook
@davidbcook
Jan 21 2016 22:08 UTC
Ah ok, thanks for looking into the issue and providing the workaround (that worked for me) Daria!
Rodney Lopes Gomes
@rlgomes
Jan 21 2016 22:08 UTC
sweet!
Daria Mehra
@dmehra
Jan 21 2016 22:23 UTC
issue filed: juttle/juttle#243