These are chat archives for locationtech/geomesa

30th
Apr 2018
Ken Tore Tallakstad
@kentore82
Apr 30 2018 07:11
Hi guys! Been off for a while, so sorry if you've already been over this. Any experience with how well/poorly Spark 2.3 is supported? Are there plans to officially support it on GM 2.0? Cheers!
Emilio
@elahrvivaz
Apr 30 2018 13:00
@kentore82 I'm not sure we've tried it yet
James Hughes
@jnh5y
Apr 30 2018 16:30
I think @aheyne has poked at Spark 2.3 some and noticed some issues (but that may have just been with the Python bits)
as a general note, GeoMesa is using package protected interfaces
so a change in Spark might make what we do impossible or require a substantial amount of work to adapt
Austin Heyne
@aheyne
Apr 30 2018 16:31
The only issue I was having was with the .show() method being removed from DataFrames in Python. I'm not sure if we'd have the same problem in Scala; I didn't investigate
jg895512
@jg895512
Apr 30 2018 17:26
i'm loading some new data.. the source data has the date broken down into multiple fields (one each for year, month, day, hour, minute, second). what is the best way to transform? I see the concatenate function, so should I do something like this?:
transform = date('yyyyMMddHHmmss', concatenate(concatenate(concatenate(concatenate(concatenate($year, $month), $day), $hour), $minute), $second))
any other/better ideas?
Emilio
@elahrvivaz
Apr 30 2018 17:28
concat accepts varargs
jg895512
@jg895512
Apr 30 2018 17:29
variable number of args? so I can do it in just one then, nice!
Emilio
@elahrvivaz
Apr 30 2018 17:29
yep: date('pattern', concatenate($year, $month, $day, $hour, $minute, $second))
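For context, here is a minimal sketch of how that transform might sit in a full converter definition. The converter name ("mydata"), the CSV column order, and the assumption that the source columns are zero-padded to match the date pattern are all hypothetical, not from the conversation:

```hocon
geomesa.converters.mydata = {
  type   = "delimited-text"
  format = "CSV"
  id-field = "md5(string2bytes($0))"
  fields = [
    // pull the date components out of the assumed column order
    { name = "year",   transform = "$1" }
    { name = "month",  transform = "$2" }
    { name = "day",    transform = "$3" }
    { name = "hour",   transform = "$4" }
    { name = "minute", transform = "$5" }
    { name = "second", transform = "$6" }
    // concatenate accepts varargs, so one call covers all six parts
    { name = "dtg",    transform = "date('yyyyMMddHHmmss', concatenate($year, $month, $day, $hour, $minute, $second))" }
  ]
}
```

Note the lowercase 'yyyy': in Java date patterns, uppercase 'Y' is week-based year and can shift dates near year boundaries.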
jg895512
@jg895512
Apr 30 2018 17:29
thanks!
Emilio
@elahrvivaz
Apr 30 2018 17:30
sure, np
James Srinivasan
@jrs53
Apr 30 2018 19:00
Emilio
@elahrvivaz
Apr 30 2018 19:02
i added an example: sc.sql("select * from foo where dtg > cast('2016-01-01T01:00:00Z' as timestamp)")
James Srinivasan
@jrs53
Apr 30 2018 19:03
what breaks?
Emilio
@elahrvivaz
Apr 30 2018 19:03
i've got a fix up for it here: locationtech/geomesa#1944
the date gets turned into microseconds by Spark, which we naively set in our GeoTools filter
so you get filters that don't evaluate correctly
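To illustrate the mismatch (a standalone sketch, not GeoMesa's actual filter code): Spark's Catalyst engine stores TimestampType values internally as microseconds since the epoch, while `java.util.Date` expects milliseconds, so setting the raw value in a filter pushes the cutoff tens of thousands of years into the future:

```scala
import java.time.{Instant, ZoneOffset}
import java.util.Date

// Spark's Catalyst engine represents a TimestampType value internally
// as a Long of microseconds since the epoch
val micros: Long = Instant.parse("2016-01-01T01:00:00Z").toEpochMilli * 1000L

// Treating those microseconds as the milliseconds java.util.Date expects
// lands tens of thousands of years in the future, so a filter like
// 'dtg > cutoff' no longer matches the intended rows
val wrong: Date = new Date(micros)
println(Instant.ofEpochMilli(wrong.getTime).atZone(ZoneOffset.UTC).getYear) // far-future year

// Converting microseconds to milliseconds first recovers the intended instant
val right: Date = new Date(micros / 1000L)
println(right.toInstant) // 2016-01-01T01:00:00Z
```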
James Srinivasan
@jrs53
Apr 30 2018 19:04
ah, that's not good
glad I've not upgraded yet
Emilio
@elahrvivaz
Apr 30 2018 19:05
:)
James Srinivasan
@jrs53
Apr 30 2018 19:12
now I'll have to bug y'all for a 2.1.0 release
Emilio
@elahrvivaz
Apr 30 2018 19:12
yeah i think we're going to cut a 2.0.1 pretty soon