Siddharth Gupta
@sid88in
Hey Guys!
Anyone there?
+639278148667
@wazmigs2902_twitter
there u go
Sumit Khanna
@bohemia420
anyone around?
I have the jars, the same ones as in my maven repo
the maven project (java) is working just fine
wanted to port that code to scala for some reason
the jars are getting downloaded by my gradle task, and so are the poms
however, I am unable to import the io.druid classes and com.metamx tranquility
Sumit Khanna
@bohemia420
lol! that is an Eclipse problem; I am able to see them in IDEA
Lavanya Pant
@pantlavanya
hi all

Hello,

we have the following structure in our data. The JSON data comes in at 15-minute intervals.

Our use case needs to compute last minus first on the stream_value field for every 15 minutes.

{"timestamp": "2018-08-16T11:00Z","key":"key1","stream_value":"25"}
{"timestamp": "2018-08-16T11:15Z","key":"key1","stream_value":"50"}
{"timestamp": "2018-08-16T11:30Z","key":"key1","stream_value":"75"}
{"timestamp": "2018-08-16T11:45Z","key":"key1","stream_value":"100"}
{"timestamp": "2018-08-16T12:00Z","key":"key1","stream_value":"125"}

Below is the query we are using:

{
  "queryType": "groupBy",
  "dataSource": "test_datasource",
  "intervals": ["2018-08-16T11:00/2018-08-16T12:15"],
  "granularity": "fifteen_minute",
  "dimensions": ["key"],
  "aggregations": [
    { "type": "doubleLast", "name": "last_value", "fieldName": "stream_value" },
    { "type": "doubleFirst", "name": "first_value", "fieldName": "stream_value" }
  ],
  "postAggregations": [
    { "type": "arithmetic",
      "name": "difference",
      "fn": "-",
      "fields": [
        { "type": "fieldAccess", "fieldName": "last_value" },
        { "type": "fieldAccess", "fieldName": "first_value" }
      ]
    }
  ]
}

The granularity buckets seem to be inclusive at the left and exclusive at the right: the 11:15 row falls into [11:15, 11:30), not [11:00, 11:15), so each 15-minute bucket contains exactly one row, which is why the first and last values are the same.

[ {
  "version" : "v1",
  "timestamp" : "2018-08-16T11:00Z",
  "event" : {
    "difference" : 0.0,
    "first_value" : 25,
    "last_value" : 25,
    "key" : "key1"
  }
}, {
  "version" : "v1",
  "timestamp" : "2018-08-16T11:15Z",
  "event" : {
    "difference" : 0.0,
    "first_value" : 50,
    "last_value" : 50,
    "key" : "key1"
  }
} ]

Please let me know if there is any way to make the intervals inclusive at both ends.

Thanks very much for the help.
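
Since the granularity buckets are half-open, one workaround is to keep only the doubleLast value per 15-minute bucket and compute the difference between consecutive buckets downstream; the delta for the window starting at 11:00 is then the 11:15 bucket's last value minus the 11:00 bucket's last value. Below is a minimal client-side sketch in Python, assuming the groupBy rows have already been fetched and parsed into the result shape shown above (the rows list here is hypothetical sample data, not output from this thread's cluster):

# Difference consecutive 15-minute buckets per key, emulating a
# window that is inclusive at both ends.
rows = [  # shaped like the groupBy output above; assumed sorted by time
    {"timestamp": "2018-08-16T11:00Z", "event": {"key": "key1", "last_value": 25.0}},
    {"timestamp": "2018-08-16T11:15Z", "event": {"key": "key1", "last_value": 50.0}},
    {"timestamp": "2018-08-16T11:30Z", "event": {"key": "key1", "last_value": 75.0}},
]

previous = {}  # key -> (bucket timestamp, last_value) of the prior bucket
for row in rows:
    event = row["event"]
    key = event["key"]
    if key in previous:
        prev_ts, prev_value = previous[key]
        # Delta for the closed window [prev_ts, this bucket's start],
        # e.g. 11:00 -> 50 - 25 = 25.
        print(prev_ts, key, event["last_value"] - prev_value)
    previous[key] = (row["timestamp"], event["last_value"])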

Chandresh
@ChandTurakhia_twitter
How do we avoid creating new segments 20 minutes after completion of a Kafka ingestion spec in Druid?