    jeremyrsmith
    @jeremyrsmith
    (if you would rather it just work, I can make the change... but in case it might be fun for you :smile:)
    Lanking
    @lanking520
    Sure, I'll take a look at it here. Adding jvm-1.8 is necessary, but it could be made an option in case a user in the future wants to import Java 11+ packages and set this flag to a higher level. @jeremyrsmith should I apply the flag in the ScalaCompiler instead of build.sbt?
    jeremyrsmith
    @jeremyrsmith
    @lanking520 yes, this is a runtime-compile-time concern, not a compile-time compile-time concern :grinning: (it gets confusing...)
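    A minimal sketch of what setting the target programmatically might look like, assuming the compiler is configured through scala.tools.nsc.Settings (an assumption for illustration; Polynote's actual ScalaCompiler wiring may differ):

    import scala.tools.nsc.Settings

    val settings = new Settings()
    // "jvm-1.8" keeps the emitted bytecode runnable on Java 8; a (hypothetical)
    // config option could raise this for users who want to target Java 11+.
    settings.target.value = "jvm-1.8"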
    Lanking
    @lanking520
    I see. Could you also point me to where the parsing happens? I can help add that flag and let this module consume it.
    João Costa
    @JD557
    @VincentBrule just saw your blogpost on scala times: https://vincentbrule.com/2020-09-19-polynote/ :)
    Lanking
    @lanking520
    Are there any instructions for creating a fat jar for when I need to test locally? I tried running sbt package and it seemed to build each submodule separately. I'm trying to build from source.
    jeremyrsmith
    @jeremyrsmith
    @lanking520 if you want to build the distribution you can use sbt dist (make sure to build the frontend first with npm run build in polynote-frontend)
    ebigram
    @ebigram
    aye, been using polynote for a couple of years now and it has easily supplanted jupyter/zeppelin/spark-notebook for me. However, lately, I’ve been digging the IDE integrations for VSCode/IntelliJ/Atom w/ Zeppelin and Jupyter, especially w/ the adorable almond. Would this be theoretically doable down the line with polynote? I recognize that it might defeat the purpose, but I’m unorthodox that way.
    jeremyrsmith
    @jeremyrsmith
    @ebigram what sort of IDE integrations did you have in mind? Like, writing the notebook in the IDE and running it on a Polynote server?
    ebigram
    @ebigram
    Yeah pretty much, kinda like Hydrogen or IntelliJ’s Big Data Tools
    VSCode has Azure Databricks
    But I don’t have that kinda $$$
    ebigram
    @ebigram
    To be clear, it's not a dealbreaker or anything, just curious if it's feasible from an architecture POV
    I found the sequential execution model makes it nearly impossible for me to go back to other notebooks, even with the nice IDE support; it's very much vital to the way I work, as I don't like the idea of exogenous state being introduced into immutability
    ebigram
    @ebigram
    I do wish Databricks made a really basic, open/free community edition of their notebooks (no hosting, no support, no advanced features, staggered release of DBFS tooling), but they don't seem too keen on doing that. It makes projects like polynote all the more welcome.
    Lanking
    @lanking520
    Just sent a PR out. I tested it locally with sbt dist and used a Java 8 package. However, when I tried to launch the server with Java 8, I saw java.lang.NoSuchMethodError: java.nio.ByteBuffer.rewind()Ljava/nio/ByteBuffer; It seems the public distribution is built with Java 9 or above. Are we using JShell features? Or do we need to tell users to upgrade to Java 9+?
    jeremyrsmith
    @jeremyrsmith
    @lanking520 is there a stack trace?
    we use Java 8, so AFAIK it should work fine
    Lanking
    @lanking520
    [INFO]   Deploying with command:
       |     /Library/Java/JavaVirtualMachines/jdk1.8.0_231.jdk/Contents/Home/jre/bin/java -cp polynote.jar:deps/polynote-runtime.jar:deps/polynote-spark-runtime.jar:deps/scala-collection-compat_2.11-2.1.1.jar:deps/scala-compiler-2.11.12.jar:deps/scala-library-2.11.12.jar:deps/scala-reflect-2.11.12.jar:deps/scala-xml_2.11-1.2.0.jar polynote.kernel.remote.RemoteKernelClient --address 127.0.0.1 --port 52142 --kernelFactory polynote.kernel.LocalKernelFactory
    [ERROR]   (Logged from NotebookSession.scala:174)
       |     java.lang.NoSuchMethodError: java.nio.ByteBuffer.rewind()Ljava/nio/ByteBuffer;
       |     polynote.kernel.remote.SocketTransport$FramedSocket.polynote$kernel$remote$SocketTransport$FramedSocket$$readBuffer(transport.scala:414)
       |     polynote.kernel.remote.SocketTransport$FramedSocket$$anonfun$read$2.apply(transport.scala:437)
       |     polynote.kernel.remote.SocketTransport$FramedSocket$$anonfun$read$2.apply(transport.scala:437)
       |     zio.internal.FiberContext.evaluateNow(FiberContext.scala:458)
       |     zio.internal.FiberContext.zio$internal$FiberContext$$run$body$2(FiberContext.scala:687)
       |     zio.internal.FiberContext$$anonfun$12.run(FiberContext.scala:687)
       |     java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       |     java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       |     java.lang.Thread.run(Thread.java:748)
    [REMOTE | hello.ipynb]
            |   Fiber failed.
            |   An unchecked error was produced.
            |   java.lang.NoSuchMethodError: java.nio.ByteBuffer.rewind()Ljava/nio/ByteBuffer;
            |       at polynote.kernel.remote.SocketTransport$FramedSocket.polynote$kernel$remote$SocketTransport$FramedSocket$$writeSize(transport.scala:460)
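    (That error is the classic symptom of bytecode built on JDK 9+ without targeting Java 8: since Java 9, ByteBuffer.rewind() has a covariant override returning ByteBuffer, so code compiled against the newer JDK references a method descriptor that doesn't exist on a Java 8 runtime. A minimal sketch of the usual source-level workaround, shown here as an illustration rather than Polynote's actual fix:)

    import java.nio.ByteBuffer

    val buf: ByteBuffer = ByteBuffer.allocate(16)
    // Calling rewind() through the Buffer supertype pins the call to
    // java.nio.Buffer.rewind(), which exists on Java 8 and Java 9+ alike.
    (buf: java.nio.Buffer).rewind()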
    Erik LaBianca
    @easel
    Has anybody else had trouble with spark dependencies in polynote? spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1 works just fine, but trying the same thing as a polynote dependency results in Uncaught exception: org/apache/spark/kafka010/KafkaConfigUpdater (java.lang.NoClassDefFoundError).
    jonathanindig
    @jonathanindig
    Hi @easel can you provide more information? Which version of Polynote are you using? How are you setting dependencies? What does your code look like?
    Erik LaBianca
    @easel
    @jonathanindig sure thing. I’m using the latest version of polynote downloaded from GitHub this morning (0.3.12). It’s set up with a clean download of spark-3.0.1-hadoop-3.2, with SPARK_HOME set and SPARK_HOME/bin appended to the path. I’m using openjdk 11. I then have a pyenv virtualenv based on miniconda3-4.7.12 that runs polynote.py. I simply add the package to the dependencies section of the notebook setup. My test code is the following:
    (spark
      .read
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", “topic")
      .load()).show()
    jonathanindig
    @jonathanindig
    I wonder if this is some issue related to Spark 3. Do you run into the same issue with Spark 2.4?
    jonathanindig
    @jonathanindig
    oh also, is your test code written in Scala or Python? If it’s Python, try Scala
    Erik LaBianca
    @easel
    It’s running under scala. Let me try spark 2.4 rq.
    Erik LaBianca
    @easel
    @jonathanindig so for better or for worse, scala-2.11 with spark 2.4.7 works fine.
    jonathanindig
    @jonathanindig
    Hmm, is there anything potentially more interesting in Polynote’s logs?
    jonathanindig
    @jonathanindig
    We don’t really build or test Polynote against Spark 3, so there might be some sort of binary incompatibility. There’s an open issue about Spark 3.0 support (polynote/polynote#926); it seems to work for some people, but they are probably doing relatively trivial things / sticking to accidentally compatible APIs. Not sure if anyone here has had more luck using Spark 3
    jeremyrsmith
    @jeremyrsmith
    @easel your polynote is built for 2.12 right? Just checking
    Erik LaBianca
    @easel
    @jeremyrsmith yes, polynote 2.12 against spark 3 with 2.12, polynote 2.11 against spark 2.4 with scala 2.11.
    jeremyrsmith
    @jeremyrsmith
    (you would have to grab polynote-dist-2.12)
    Erik LaBianca
    @easel
    @jonathanindig i am in the process of seeing if I can build from source with the spark dependency swapped to 3.0.1 and see what happens. Does that seem like a reasonable step?
    jeremyrsmith
    @jeremyrsmith
    Hmm... it's hard to tell what could be going wrong there. I'll try to dig and repro when I get a chance
    @easel yes it sounds pretty reasonable. Though, if the issue was that polynote was built against a different spark version, it's hard to imagine how that would turn into NoClassDefFound within the kafka library
    Erik LaBianca
    @easel
    @jeremyrsmith agreed. I can say that this is not the first time I’ve run into classpath ordering issues with polynote, or spark in general. Sticking jars in extraJars has very different behavior than using --packages, which has different behavior than a fat jar, and from what I can tell polynote does something different from all of the above because it’s using coursier for resolution.
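    For comparison, a sketch of what the spark-shell --packages path amounts to; this illustrates standard Spark behavior (resolution via spark.jars.packages and Spark's own Ivy machinery), not how Polynote resolves notebook dependencies, and the local[*] master is just an assumption for a self-contained example:

    import org.apache.spark.sql.SparkSession

    // spark.jars.packages is the config-key equivalent of --packages: Spark
    // resolves the artifact itself, whereas Polynote's dependency section goes
    // through Coursier, so the resulting classpaths (and ordering) can differ.
    val spark = SparkSession.builder()
      .master("local[*]")
      .config("spark.jars.packages", "org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1")
      .getOrCreate()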
    Erik LaBianca
    @easel
    Is there a “DEVELOPING.md” or similar in the repo I should be aware of? I think i’ve got things figured out, and can try to gin something up if it would be useful.
    jeremyrsmith
    @jeremyrsmith
    @easel it's a work in progress :disappointed:
    But if it's just for building from source (we should have a simple doc that says this), the dist task will make a polynote release tgz (by default for 2.11 – +dist will make both 2.11 and 2.12). You should also go into polynote-frontend and run npm run dist first, in case sbt doesn't do that for you (I think it should, but YMMV)
    Erik LaBianca
    @easel
    Yeah it didn’t =p Doing that now.
    jeremyrsmith
    @jeremyrsmith
    heh, help is welcome on any of this :laughing: there's mainly just myself and @jonathanindig working on it when we can, so things like "decent developer experience" succumb to the cobbler's children's shoes effect I'm afraid
    Erik LaBianca
    @easel
    all good. i will keep some notes and pr them
    also need to have webpack installed =p
    jeremyrsmith
    @jeremyrsmith
    oh yeah, you'll have to npm install before npm run dist
    (that should take care of webpack, I think?)
    tbh frontend is what we both struggle with the most, I think... it all gets rather needlessly complex as soon as you want to use anything from the ecosystem
    Erik LaBianca
    @easel
    have you been targeting a specific node version?
    jeremyrsmith
    @jeremyrsmith
    no, I don't think so
    Erik LaBianca
    @easel
    ok. i will put 13 in the doc assuming it works for me =p
    node is generally pretty stable in my experience but i think npm has evolved more?
    jeremyrsmith
    @jeremyrsmith
    FWIW I have 14.11.0 locally. But it's only used to run the build so it's whatever those things need