    Krzysiek Ciesielski
    @kciesielski
    Hello, is there anything in progress regarding a build for Scala 2.13?
    Flavian Alexandru
    @alexflav23
    @kciesielski outworkers/phantom#898
    Still some work to do on upgrading collections library usage
    Krzysiek Ciesielski
    @kciesielski
    i see, thanks
    Wojciech Łukasiewicz
    @wojtuch

    hello everyone! I have a problem with pagination. I can't figure out the proper setup for reading the result in pages.
    after reading the first page:

        select
          .where(_.email eqs email)
          .orderBy(_.timestamp.descending)
          .paginateRecord(_.setFetchSize(numItems))

    when I want to use the returned pagingState to retrieve the next page

        select
          .where(_.email eqs email)
          .orderBy(_.timestamp.descending)
          .paginateRecord(pagingState)

    I get all the remaining records instead of the next page (of size numItems from the first call).

    Trying to add .limit(numItems) after orderBy and before paginateRecord in the second call results in PagingStateException: Paging state mismatch, this means that either the paging state contents were altered, or you're trying to apply it to a different statement.

    Any ideas what I am doing wrong?
    Thanks in advance!

    Wojciech Łukasiewicz
    @wojtuch
    I can't edit my message (maybe I hit the edit limit, as I've already edited it 2-3 times), so forgive me another post.
    This example (https://github.com/outworkers/phantom/blob/976479391ec8f2bf919e6738dc6243153bf34c8d/phantom-streams/src/test/scala/com/outworkers/phantom/streams/suites/iteratee/IteratorTest.scala#L50) doesn't help as the entire result is read in 2 batches.
    Wojciech Łukasiewicz
    @wojtuch
    I managed to get it to work. For those who experience a similar issue: subsequent calls should set fetchSize and pagingState on the filtering statement:
        select
          .where(_.email eqs email)
          .orderBy(_.timestamp.descending)
          .paginateRecord(_.setPagingState(pagingState).setFetchSize(numItems))
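    A fuller sketch of the two calls side by side, assuming the same table and bindings as above, an implicit ExecutionContext in scope, and that paginateRecord returns a ListResult exposing the fetched records and the driver's PagingState (the accessor names here are assumptions):

        // First page: only the fetch size is set.
        val firstPage = select
          .where(_.email eqs email)
          .orderBy(_.timestamp.descending)
          .paginateRecord(_.setFetchSize(numItems))

        // Next page: reuse the paging state AND set the fetch size again,
        // otherwise the driver falls back to returning everything that is left.
        val nextPage = firstPage.flatMap { page =>
          select
            .where(_.email eqs email)
            .orderBy(_.timestamp.descending)
            .paginateRecord(_.setPagingState(page.pagingState).setFetchSize(numItems))
        }
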
    Flavian Alexandru
    @alexflav23
    Yes
    Sorry for the late reply. @wojtuch that’s the specific method you need to call to get the paging state.
    There are docs as well
    Wojciech Łukasiewicz
    @wojtuch
    I didn't find a specific example of chaining the set... methods in a paginateRecord overload to achieve this, and it didn't occur to me immediately :) Thanks for the reply!
    Harsh Gupta
    @hardmettle
    Hi @/all, I just started exploring phantom and I have hit a block. I wanted to know what resolves this import: import com.outworkers.phantom.auto._
    Flavian Alexandru
    @alexflav23
    @hardmettle Auto is a pro feature. We have a subscription version that gets you an SLA, support, and some other modules, including auto, which helps with automated derivation of encodings.
    We are looking at next year’s roadmap for making some of that public
    Harsh Gupta
    @hardmettle
    @alexflav23 thanks, but as of now, if I have to write complicated nested case classes to Cassandra without pro, how can I do that? Please suggest.
    Jelmer Kuperus
    @jelmerk

    If I try to add a

        def save(record: JsonClass): Future[ResultSet] = store(record).future()

    method to

    https://github.com/outworkers/phantom/blob/v2.28.0/phantom-dsl/src/test/scala/com/outworkers/phantom/json/JsonTable.scala#L49

    it won't compile and I get the error no implicit parameters thl.

    Why does this not work?
    Jelmer Kuperus
    @jelmerk
    even the most basic things in this library give cryptic macro errors...
    Jelmer Kuperus
    @jelmerk
    And my next woe: why does select.function(t => sum(t.count)).aggregate() return Future[Option[Option[Int]]] instead of Future[Option[Int]]?
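    For reference, the nested Option can be collapsed with flatten (a sketch; assumes an implicit ExecutionContext is in scope for map):

        val total: Future[Option[Int]] =
          select.function(t => sum(t.count)).aggregate().map(_.flatten)
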
    Flavian Alexandru
    @alexflav23
    @hardmettle Unless you are very comfortable with shapeless or similar things, it's going to be very hard, which is why we put so much work into the pro product.
    @jelmerk That's a fair comment; it is unfortunate that this is the case. However, that's not as basic a piece of functionality as it seems, sadly.
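    For anyone stuck on the nested case class question without the pro module, one rough workaround is to serialise the nested part into a single text column through a custom primitive. The sketch below uses Primitive.derive from phantom's custom-primitive support; the Address model and the toJson/fromJson helpers are hypothetical placeholders for your own types and JSON library:

        import com.outworkers.phantom.dsl._

        // Hypothetical nested model.
        case class Address(street: String, city: String)

        // Hypothetical JSON helpers - plug in circe, play-json, etc.
        def toJson(a: Address): String = ???
        def fromJson(s: String): Address = ???

        // Store the nested type as text; phantom then treats it like a String column.
        implicit val addressPrimitive: Primitive[Address] =
          Primitive.derive[Address, String](toJson)(fromJson)
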
    Berke SOKHΛN
    @berkesokhan_twitter
    So IIUC, Cassandra UDT support and Monix support are in the Pro version. What is the pricing like? I couldn't seem to find public pricing for the Pro subscription.
    Also, we had issues with Quill's UDT-in-UDT scenarios. Does phantom-pro support mapping an object onto a frozen UDT list of an object which has another frozen UDT collection inside it?
    vonchav
    @voonchav_gitlab
    Hi @alexflav23, is phantom-monix a commercial product? I can't seem to find any docs about it.
    lkolla
    @lkolla
    Hi there, I'm trying to implement a piece of functionality: read around 1M records from Cassandra and write them into a file. Upon completion of the process, I see duplicate data in the file. I'm using the basics of phantom.
    I'm thinking phantom-dsl may be issuing duplicate queries (possibly due to a Cassandra timeout) to fetch the data, and at the end it may be holding data from both queries, which is causing the duplicate data in the file.
    lkolla
    @lkolla
    Does anyone have any idea how to solve this?
    thanks in advance
    Flavian Alexandru
    @alexflav23
    You could use phantom-streams
    and control the batches @lkolla
    @berkesokhan_twitter We do indeed support UDT in UDT.
    vonchav
    @voonchav_gitlab
    @alexflav23 Is it possible to run a raw query via Phantom? I wanted to test C* connection by running SELECT now() FROM system.local; Thanks.
    vonchav
    @voonchav_gitlab
    I can see there is a cql helper in the dsl package object. Just wondering if you can give an example how to wire things up and actually execute the cql? Thanks.
    vonchav
    @voonchav_gitlab
    I got it working :)
    Flavian Alexandru
    @alexflav23
    You call .future on it
    cql("SELECT * from ….").future()
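    A small sketch of wiring that up for the connection check mentioned above (assumes it lives somewhere that provides the implicit session, keyspace and execution context that future() needs, e.g. inside your Database class):

        import com.outworkers.phantom.dsl._

        // Raw CQL round-trip to verify the connection.
        def checkConnection(): Future[ResultSet] =
          cql("SELECT now() FROM system.local;").future()
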
    vonchav
    @voonchav_gitlab
    thanks
    Arjun Karnwal
    @arjunkarnwal

    I have a question !!
    I can run the following cql in cassandra

    select * from activation_events where customerid = 18 and (campaignid, entityid) in ((91, 'X'),(94, 'Y'));

    How can I write the same in phantom dsl as

    select
      .where(_.customerId in ?)
      .and(_.campaignId in ?) // Here I need to have it as tuple for campaignId and entityid
      .consistencyLevel_=(ConsistencyLevel.ONE)
      .prepare()

    Please note that campaignid and entityid are separate columns and not a tuple. As I said, I am able to run the query using CQL.

    Flavian Alexandru
    @alexflav23
    there’s a syntax for this too in phantom
    I believe it was using ~
    Manuel Ramírez
    @DeaThrash

    Hello everyone! I'm currently moving from phantom-dsl 2.13.4 to 2.59.0. Back in 2.13.4 I used to have a Seq of insertQueries of different tables like this

    val inserts = Seq[InsertQuery[_ >: TableA with TableB with TableC <: Table[_ >: TableA with TableB with TableC, _ >: TableARecord with TableBRecord with TableCRecord], _ >: TableARecord with TableBRecord with TableCRecord <: Product, Unspecified, HNil]]

    And with that I was able to map over them and execute each with future(), like this (future() looks like it is part of the ExecutableStatement trait):

    inserts.map(insert => insert.consistencyLevel_=(LOCAL_QUORUM).future())

    But now that I'm working with 2.59.0 I'm not able to do the same. Is there a workaround provided by phantom, or should I split the list into single-table statements?
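    One way to sidestep the heterogeneous InsertQuery type entirely (a sketch with hypothetical table and record names; assumes an implicit ExecutionContext) is to keep only the common Future[ResultSet] shape in the collection:

        // Defer execution with thunks so nothing runs until you decide to.
        val inserts: Seq[() => Future[ResultSet]] = Seq(
          () => db.tableA.store(recordA).consistencyLevel_=(LOCAL_QUORUM).future(),
          () => db.tableB.store(recordB).consistencyLevel_=(LOCAL_QUORUM).future(),
          () => db.tableC.store(recordC).consistencyLevel_=(LOCAL_QUORUM).future()
        )

        val results: Future[Seq[ResultSet]] = Future.sequence(inserts.map(_()))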

    vonchav
    @voonchav_gitlab
    @alexflav23 I have a similar problem doing an IN clause with tuples. You mentioned ~. I searched the Phantom docs as well as GitHub and couldn't find an example. Could you be so kind as to show an example of how to translate (campaignid, entityid) in ((91, 'X'),(94, 'Y')) in CQL into Phantom's DSL? Thanks a lot.
    Flavian Alexandru
    @alexflav23
    In is a list
    vonchav
    @voonchav_gitlab
    Hi, any plan for Phantom to be on Scala 2.13.3? I tried to upgrade my project to Scala 2.13.3 from 2.13.1, but I was getting some very weird compilation errors in my Phantom code - the rest of my project compiles okay. The main error is Auto-application to () is deprecated. Supply the empty argument list () explicitly to invoke method set, on a line like object Foo extends SetColumn[String] where I'm merely declaring a column in C*. The compiler seems to think I need to do SetColumn[String]() but I'm only extending from SetColumn.
    I had a similar problem in code where I'm using .where(_.id.in(aList)). The 2.13.3 compiler raises the same error for the .in call. The .in method takes a list and a curried implicit parameter. I couldn't figure out why the compiler complains that Auto-application to () is deprecated, unless it wants me to pass in the implicit explicitly.
    Flavian Alexandru
    @alexflav23
    You use ->?
    @voonchav_gitlab You can simply use a -> b syntax to construct tuples as well
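    Just to spell that out: -> is plain Scala tuple sugar, so the pairs for such a query can be built like this:

        // 91 -> "X" is shorthand for the tuple (91, "X").
        val pairs: Seq[(Int, String)] = Seq(91 -> "X", 94 -> "Y")
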
    vonchav
    @voonchav_gitlab
    @alexflav23 I don't think it's the tuple syntax. 2.13.3 deprecated omitting the empty () when calling a method.
    Leif Ramming
    @leifblaese
    Hey guys,
    first of all, Phantom is an amazing library, thanks for creating it.
    I have one question though. Let's say that I have a table that I want to add a column to. How do I do it using phantom? Is there a way to auto-apply changes to the table, as a sort of ALTER TABLE statement? I know that it is possible to automatically create the table if it does not exist, but how about updating it to match the Scala model?