    jeremyrsmith
    @jeremyrsmith
    Oh, like for the interpreter's compiler? Yeah, we definitely need to be able to pass compiler flags into there! I feel like we had some way to do this but I can't remember what it was... PR would be welcome for this though, even if it's something simple/non-configurable to start with
    (I feel like things like that are OK to start out as like env variables or server arguments, and then can grow into having UI around them... I don't want to stall useful features based on making UI around them)
    (or, even config file elements. That's not too bad and doesn't require UI around it)
    TBH I think -target:jvm-1.8 probably ought to be there by default in the non-spark case, because even without Spark I think we still require JVM 1.8+
    (still PR welcome :grimacing: )
    Lanking
    @lanking520
    Scala 2.11 will by default use -target:jvm-1.7, which is the minimum target it supports. Some packages in Java require 1.8, so we need to add the scalacOptions to compile and run the package.
    Do you know of any place where I can try to inject this flag and test? I can try to build from source and see if it works. If Polynote requires Java 8 and above, I think it is worthwhile to add this compiler flag.
    This is the error thrown:
    Error:Static methods in interface require -target:jvm-1.8 (Line 1)
    Lanking
    @lanking520
    Java (from 1.8) started supporting static and default interface methods
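    For reference, a minimal snippet that trips this error (a sketch, assuming Scala 2.11 with its default -target:jvm-1.7; java.util.Comparator.naturalOrder() is a standard-library static interface method added in Java 8):

```scala
// java.util.Comparator.naturalOrder() is a static interface method,
// introduced in Java 8. Under scalac's default -target:jvm-1.7 on
// Scala 2.11, this call fails to compile with:
//   "Static methods in interface require -target:jvm-1.8"
// With -target:jvm-1.8 (and a Java 8+ runtime) it works normally:
val cmp: java.util.Comparator[Integer] = java.util.Comparator.naturalOrder()
assert(cmp.compare(1, 2) < 0)
```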
    jeremyrsmith
    @jeremyrsmith
    I think it's fair to say that polynote requires Java 8 or above. We certainly don't test it with anything < Java 8. The first thing I would do here is add a notebook config option for "extra scalac flags" (probably called scalacOptions to align with sbt) that gets parsed into scala.tools.nsc.settings.ScalaSettings (there is a method to do this somewhere...) and gets applied to the default Settings in the ScalaInterpreter
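    A rough sketch of that parsing step: `processArgumentString` on the standard `MutableSettings` is the method alluded to above; the notebook-config wiring around it is hypothetical.

```scala
import scala.tools.nsc.Settings

// Sketch only: parse a scalacOptions string (which would come from a
// hypothetical notebook config field) into compiler Settings.
// processArgumentString returns whether parsing succeeded, plus any
// arguments it couldn't process.
val scalacOptions = "-target:jvm-1.8 -deprecation"
val settings = new Settings()
val (ok, unprocessed) = settings.processArgumentString(scalacOptions)
assert(ok && unprocessed.isEmpty)
// These settings would then be merged into the defaults the
// ScalaInterpreter already uses.
```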
    At some point we could also add a UI to edit that, or maybe even a fancy UI for those settings like IntelliJ has (with checkboxes for common things and whatnot)
    Actually, maybe just adding -target:jvm-1.8 as a default setting would be a fine first step.
    It wouldn't solve generally fiddling with the scalac flags, but at least would solve this thing and it's pretty easy and non-controversial. @lanking520 that would probably go hereabouts https://github.com/polynote/polynote/blob/master/polynote-kernel/src/main/scala/polynote/kernel/ScalaCompiler.scala#L658 if you want to play with it
    (if you would rather it just work, I can make the change... but in case it might be fun for you :smile:)
    Lanking
    @lanking520
    Sure, will take a look at it. Adding jvm-1.8 is necessary, but it could be an option in case users in the future would like to import Java 11+ packages and set this flag to a higher level. @jeremyrsmith should I apply the flag in the ScalaCompiler instead of build.sbt?
    jeremyrsmith
    @jeremyrsmith
    @lanking520 yes, this is a runtime-compile-time concern, not a compile-time compile-time concern :grinning: (it gets confusing...)
    Lanking
    @lanking520
    I see. Could you also point me to where the parsing happens? I can help add that flag and let this module consume it.
    João Costa
    @JD557
    @VincentBrule just saw your blogpost on scala times: https://vincentbrule.com/2020-09-19-polynote/ :)
    Lanking
    @lanking520
    Are there any instructions for creating a fat jar for when I need to test locally? I tried running sbt package and it seemed to build each submodule separately. I'm trying to build from source.
    jeremyrsmith
    @jeremyrsmith
    @lanking520 if you want to build the distribution you can use sbt dist (make sure to build the frontend first with npm run build in polynote-frontend)
    ebigram
    @ebigram
    aye, been using polynote for a couple of years now and it has easily supplanted jupyter/zeppelin/spark-notebook for me. However, lately, I’ve been digging the IDE integrations for VSCode/IntelliJ/Atom w/ Zeppelin and Jupyter, especially w/ the adorable almond. Would this be theoretically doable down the line with polynote? I recognize that it might defeat the purpose, but I’m unorthodox that way.
    jeremyrsmith
    @jeremyrsmith
    @ebigram what sort of IDE integrations did you have in mind? Like, writing the notebook in the IDE and running it on a Polynote server?
    ebigram
    @ebigram
    Yeah pretty much, kinda like Hydrogen or IntelliJ’s Big Data Tools
    VSCode has Azure Databricks
    But I don’t have that kinda $$$
    ebigram
    @ebigram
    To be clear, it's not a dealbreaker or anything, just curious if it's feasible from an architecture POV
    I found the sequential execution model to make it nearly impossible for me to go back to other notebooks, even with the nice IDE support; it's very much vital to the way I work as I don't like the idea of an exogenous state being introduced to immutability
    ebigram
    @ebigram
    I do wish Databricks made a really basic community edition open/free (no hosting, no support, no advanced features, staggered release of DBFS tooling) version of their notebooks, but they don't seem too keen on doing that. Makes projects like polynote all the more welcome.
    Lanking
    @lanking520
    Just sent a PR out. I tested it locally with sbt dist and used a Java 8 package. However, when I try to launch the server with Java 8, I see java.lang.NoSuchMethodError: java.nio.ByteBuffer.rewind()Ljava/nio/ByteBuffer; It seems the public distribution is built with Java 9 and above. Are we using JShell features? Or do we need to tell users to upgrade to Java 9+?
    jeremyrsmith
    @jeremyrsmith
    @lanking520 is there a stack trace?
    we use Java 8, so AFAIK it should work fine
    Lanking
    @lanking520
    [INFO]   Deploying with command:
       |     /Library/Java/JavaVirtualMachines/jdk1.8.0_231.jdk/Contents/Home/jre/bin/java -cp polynote.jar:deps/polynote-runtime.jar:deps/polynote-spark-runtime.jar:deps/scala-collection-compat_2.11-2.1.1.jar:deps/scala-compiler-2.11.12.jar:deps/scala-library-2.11.12.jar:deps/scala-reflect-2.11.12.jar:deps/scala-xml_2.11-1.2.0.jar polynote.kernel.remote.RemoteKernelClient --address 127.0.0.1 --port 52142 --kernelFactory polynote.kernel.LocalKernelFactory
    [ERROR]   (Logged from NotebookSession.scala:174)
       |     java.lang.NoSuchMethodError: java.nio.ByteBuffer.rewind()Ljava/nio/ByteBuffer;
       |     polynote.kernel.remote.SocketTransport$FramedSocket.polynote$kernel$remote$SocketTransport$FramedSocket$$readBuffer(transport.scala:414)
       |     polynote.kernel.remote.SocketTransport$FramedSocket$$anonfun$read$2.apply(transport.scala:437)
       |     polynote.kernel.remote.SocketTransport$FramedSocket$$anonfun$read$2.apply(transport.scala:437)
       |     zio.internal.FiberContext.evaluateNow(FiberContext.scala:458)
       |     zio.internal.FiberContext.zio$internal$FiberContext$$run$body$2(FiberContext.scala:687)
       |     zio.internal.FiberContext$$anonfun$12.run(FiberContext.scala:687)
       |     java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       |     java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       |     java.lang.Thread.run(Thread.java:748)
    [REMOTE | hello.ipynb]
            |   Fiber failed.
            |   An unchecked error was produced.
            |   java.lang.NoSuchMethodError: java.nio.ByteBuffer.rewind()Ljava/nio/ByteBuffer;
            |       at polynote.kernel.remote.SocketTransport$FramedSocket.polynote$kernel$remote$SocketTransport$FramedSocket$$writeSize(transport.scala:460)
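    For what it's worth, this particular NoSuchMethodError is a known JDK 9+ build artifact rather than a JShell feature: JDK 9 added covariant overrides such as ByteBuffer.rewind() returning ByteBuffer (on Java 8 only Buffer.rewind() returning Buffer exists), so jars compiled on JDK 9+ without --release 8 record a method descriptor that is missing from a Java 8 runtime. A sketch of the Java-8-safe call pattern:

```scala
import java.nio.{Buffer, ByteBuffer}

// When compiled on JDK 9+, `buf.rewind()` records the descriptor
// ()Ljava/nio/ByteBuffer; which Java 8's class library lacks.
// Upcasting to Buffer pins the Java-8-compatible descriptor
// ()Ljava/nio/Buffer; regardless of which JDK does the compiling:
val buf: ByteBuffer = ByteBuffer.allocate(16)
buf.put(1.toByte)
(buf: Buffer).rewind()
assert(buf.position() == 0)
```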
    Erik LaBianca
    @easel
    Anybody else had trouble with spark dependencies in polynote? spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1 works just fine, but trying the same thing as a polynote dependency results in Uncaught exception: org/apache/spark/kafka010/KafkaConfigUpdater (java.lang.NoClassDefFoundError).
    jonathanindig
    @jonathanindig
    Hi @easel can you provide more information? Which version of Polynote are you using? How are you setting dependencies? What does your code look like?
    Erik LaBianca
    @easel
    @jonathanindig sure thing. I’m using the latest version of polynote downloaded from GitHub this morning (0.3.12). It’s set up with a clean download of spark-3.0.1-hadoop-3.2, with SPARK_HOME set and SPARK_HOME/bin appended to the path. I’m using OpenJDK 11. I then have a pyenv virtualenv based on miniconda3-4.7.12 that runs polynote.py. I simply add the dependency to the dependencies section of the notebook setup. My test code is the following:
    (spark
      .read
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "topic")
      .load()).show()
    jonathanindig
    @jonathanindig
    I wonder if this is some issue related to Spark 3. Do you run into the same issue with Spark 2.4?
    jonathanindig
    @jonathanindig
    oh also, is your test code written in Scala or Python? If it’s Python, try Scala
    Erik LaBianca
    @easel
    It’s running under scala. Let me try spark 2.4 rq.
    Erik LaBianca
    @easel
    @jonathanindig so for better or for worse, scala-2.11 with spark 2.4.7 works fine.
    jonathanindig
    @jonathanindig
    Hmm, is there anything potentially more interesting in Polynote’s logs?
    jonathanindig
    @jonathanindig
    We don’t really build or test Polynote against Spark 3 so there might be some sort of binary incompatibility. There’s an open issue about Spark 3.0 support polynote/polynote#926 it seems to work for some people but they are probably doing relatively trivial things / sticking to accidentally compatible APIs. Not sure if anyone here has had more luck using Spark 3
    jeremyrsmith
    @jeremyrsmith
    @easel your polynote is built for 2.12 right? Just checking
    Erik LaBianca
    @easel
    @jeremyrsmith yes, polynote 2.12 against spark 3 with 2.12, polynote 2.11 against spark 2.4 with scala 2.11.
    jeremyrsmith
    @jeremyrsmith
    (you would have to grab polynote-dist-2.12)
    Erik LaBianca
    @easel
    @jonathanindig i am in the process of seeing if I can build from source with the spark dependency swapped to 3.0.1 and see what happens. Does that seem like a reasonable step?
    jeremyrsmith
    @jeremyrsmith
    Hmm... it's hard to tell what could be going wrong there. I'll try to dig and repro when I get a chance
    @easel yes it sounds pretty reasonable. Though, if the issue was that polynote was built against a different spark version, it's hard to imagine how that would turn into NoClassDefFound within the kafka library
    Erik LaBianca
    @easel
    @jeremyrsmith agreed. I can say that this is not the first time I’ve run into classpath ordering issues with polynote, or Spark in general. Sticking jars in extraJars has very different behavior than using --packages, which has different behavior than a fat jar, and from what I can tell polynote does something different from all of the above because it uses coursier for resolution.