    Fernando Margueirat
    @adatasetaday
    Never mind, apparently there's a bug https://issues.apache.org/jira/browse/TOREE-467
    Rickyhai
    @rickyhai11

    Dear everyone,
    I am trying to generate an uber-jar using the sbt compile and sbt package commands to run my application on our remote server, where Spark is installed in standalone mode. I used the deeplearning4j framework to build an LSTM neural network and intend to train the model through Spark. However, I ran into an issue when running the spark-submit command:

    spark-submit --class "lstm.SparkLSTM" --master local[*] stock_prediction_scala_2.11-0.1.jar --packages org.deeplearning4j:deeplearning4j-core:0.9.1 "/home/hadoop/ScalaWorkspace/Stock_Prediction_Scala/target/lstm_train/prices-split-adjusted.csv" "WLTW"

    The problem is that spark-submit seemingly did not take effect in my case. It finished right after I entered spark-submit, without throwing any error. I have not seen the progress of training in the output.

    [hadoop@gaion34 lstm_train]$ spark-submit --class "lstm.SparkLSTM" --master local[*] stock_prediction_scala_2.11-0.1.jar --packages org.deeplearning4j:deeplearning4j-core:0.9.1 "/home/hadoop/ScalaWorkspace/Stock_Prediction_Scala/target/lstm_train/prices-split-adjusted.csv" "WLTW"
    2018-04-25 17:06:50 WARN  Utils:66 - Your hostname, gaion34 resolves to a loopback address: 127.0.0.1; using 192.168.0.173 instead (on interface eno1)
    2018-04-25 17:06:50 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
    2018-04-25 17:06:51 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2018-04-25 17:06:51 INFO  ShutdownHookManager:54 - Shutdown hook called
    2018-04-25 17:06:51 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-c4aee15e-d23b-4c03-95a7-12d9d39f714a
    [hadoop@abc lstm_train]$ spark-submit --class "lstm.SparkLSTM" --master local[*] stock_prediction_scala_2.11-0.1.jar --packages org.deeplearning4j:deeplearning4j-nn:0.9.1 "/home/hadoop/ScalaWorkspace/Stock_Prediction_Scala/target/lstm_train/prices-split-adjusted.csv" "WLTW"
    2018-04-25 17:07:12 WARN  Utils:66 - Your hostname, abc resolves to a loopback address: 127.0.0.1; using 192.168.0.11 instead (on interface eno1)
    2018-04-25 17:07:12 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
    2018-04-25 17:07:13 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2018-04-25 17:07:13 INFO  ShutdownHookManager:54 - Shutdown hook called
    2018-04-25 17:07:13 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-82fdaebf-1121-4e31-8c4f-37aea9683922

    my main class:

    
    import org.apache.log4j.BasicConfigurator   // log4j 1.x, used below for default logging setup

    // StockPricePredictionLSTM and saveAsCsv are defined elsewhere in the project.
    object SparkLSTM {
      def main(args: Array[String]): Unit = {
        // Expects exactly two arguments: the input CSV path and a stock symbol.
        if (args.length == 2) {
          val filePath = args(0)    //"/Users/kym1992/STUDY/NEU/CSYE7200/Dataset/nyse/prices-split-adjusted.csv"
          val symbolName = args(1)
          BasicConfigurator.configure()
          val prepared = StockPricePredictionLSTM.prepare(filePath, symbolName, 0.90)
          val result = StockPricePredictionLSTM.predictPriceOneAhead(prepared._1, prepared._2, prepared._3, prepared._4, prepared._5)
          println("predicts, actual")
          (result.predicts, result.actuals).zipped.foreach((x, y) => println(x + ", " + y))
          saveAsCsv(result, symbolName)

          result.predicts.foreach(r => println(r))
        }
      }
    }

    Has anyone experienced this issue before? Please advise. Thanks!

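    A likely cause, given the args-length guard in SparkLSTM.main above: spark-submit passes everything after the application jar to the main class as arguments, so the misplaced --packages flag and its coordinate arrive as args(0) and args(1). That makes args.length 4, the if (args.length == 2) guard skips silently, and the job shuts down immediately, which matches the logs. A sketch of the corrected invocation, with --packages moved before the jar:

    spark-submit --class "lstm.SparkLSTM" --master local[*] \
      --packages org.deeplearning4j:deeplearning4j-core:0.9.1 \
      stock_prediction_scala_2.11-0.1.jar \
      "/home/hadoop/ScalaWorkspace/Stock_Prediction_Scala/target/lstm_train/prices-split-adjusted.csv" "WLTW"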
    Dean Wampler
    @deanwampler
    Hi, @rickyhai11. I see you asked on the Spark with Scala channel. That's a better place for general questions not specific to the "Just Enough..." tutorial. Good luck!
    abhisam
    @abhisam
    How does Spark's CREATE EXTERNAL TABLE work?
    Does it create the external location along with the table?
    Please help me with this.
    Dean Wampler
    @deanwampler
    It should, but I haven't tried it in a while. Try the https://gitter.im/spark-scala/Lobby channel, if you have problems. It has a lot more people participating.
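    For reference, a minimal sketch of creating an external (unmanaged) table from Spark SQL; the table name, schema, and path here are hypothetical. An explicit LOCATION is what makes the table external: Spark registers the metadata but does not take ownership of the files, so dropping the table leaves the data in place.

    // Hypothetical table name and path, for illustration only.
    spark.sql("""
      CREATE TABLE IF NOT EXISTS prices (symbol STRING, close DOUBLE)
      USING parquet
      LOCATION '/data/prices'
    """)
    // DROP TABLE removes only the metastore entry; the Parquet
    // files under /data/prices are untouched.
    spark.sql("DROP TABLE prices")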
    abhisam
    @abhisam
    Hi
    @deanwampler would you know how to write Scala test cases for object variables?
    Dean Wampler
    @deanwampler
    They are like static class variables in Java; you just reference them with MyClass.variable instead of myInstance.variable. With that, you can test them like any other variables.
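    A minimal sketch of that (the Config object and its members are hypothetical, in ScalaTest 3.x style):

    // Hypothetical singleton object with members under test.
    object Config {
      val maxRetries: Int = 3
      var verbose: Boolean = false
    }

    import org.scalatest.funsuite.AnyFunSuite

    class ConfigSpec extends AnyFunSuite {
      test("object members are referenced through the object's name") {
        assert(Config.maxRetries == 3)   // Config.member, not someInstance.member
        Config.verbose = true            // vars on an object are global state; reset between tests
        assert(Config.verbose)
      }
    }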
    abhisam
    @abhisam
    @deanwampler is it possible to override a Scala object's method for test cases?
    Dean Wampler
    @deanwampler
    No, they can't be overridden. You'll need to extract the method into a separate trait or class.
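    A sketch of that refactoring, with hypothetical names: move the method into a trait, keep the production object as an instance of it, and override the method in tests.

    // The method lives in a trait so it can be overridden.
    trait PriceSource {
      def fetchPrice(symbol: String): Double = 100.0  // real lookup in production
    }

    // Production code keeps a singleton entry point.
    object PriceSource extends PriceSource

    // A test can substitute behavior without touching the object.
    val stub: PriceSource = new PriceSource {
      override def fetchPrice(symbol: String): Double = 42.0
    }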
    abhisam
    @abhisam
    @deanwampler you mean I have to change the main code from an object to a trait or class?
    Dean Wampler
    @deanwampler
    If you like the tutorial that goes with this Gitter channel, I just updated my other tutorial, which teaches Spark using Scala. It now has Jupyter notebook examples, like this tutorial has used for a while. https://github.com/deanwampler/spark-scala-tutorial
    ksg
    @ksg97031
    Hi~
    Dean Wampler
    @deanwampler
    I just merged in some improvements to how Jupyter is used and the instructions in the README.
    Dean Wampler
    @deanwampler
    I added the option to use BeakerX instead of the Jupyter image. The README provides details. I also added an HTML printout of the tutorial notebook to go with the PDF printout. Otherwise, no changes to the tutorial itself.
    Dean Wampler
    @deanwampler
    Well I was a bit premature with the BeakerX support. Had to revert that commit and rework it on a branch. Stay tuned...
    Dean Wampler
    @deanwampler
    Scala 2.12 support was just merged into Spark master. https://twitter.com/deanwampler/status/1025030404078796800
    MuralikrishnaDanam
    @MuralikrishnaDanam
    Hello, I just started my Docker container for the Jupyter UI. I followed all the instructions, but when I logged into the UI, the "work" folder was empty. Could you please let me know what went wrong?
    MuralikrishnaDanam
    @MuralikrishnaDanam
    I'm sorry, I figured it out.
    Dean Wampler
    @deanwampler
    Glad you figured it out.
    François Sarradin
    @fsarradin
    Hi @deanwampler, my colleagues and I have translated your tutorial into French. It was run for the first time yesterday. The tutorial worked well and met with approval, so we would like to thank you for your great tutorial.
    The translation is available here: https://github.com/univalence/CeQuilFautDeScalaPourSpark
    Dean Wampler
    @deanwampler
    Awesome! Thanks for letting me know. I added a link in the README!
    Czkonverse
    @Czkonverse
    Hello~ I'm very grateful for this course, but I ran into a problem: when I run the file run.bat, the output is "Unable to find image 'jupyter/all-spark-notebook:latest' locally
    docker: Error response from daemon: Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers).
    See 'docker run --help'."
    I'm running on a Windows 10 64-bit system; has anyone else run into the same problem?
    @deanwampler
    Dean Wampler
    @deanwampler
    I’m guessing your local docker environment couldn’t download the image from Docker hub. Are you behind a network proxy, by chance? Can you pull other docker images? Example: docker pull hello-world pulls a very small test image. If you suspect problems pulling, this documentation should help: https://docs.docker.com/engine/reference/commandline/pull/. Note the proxy configuration section.
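    Concretely, a quick connectivity check along those lines:

    # Pull a tiny test image; if this also times out, the issue is
    # network/proxy configuration rather than the tutorial image.
    docker pull hello-world
    # Once that succeeds, pull the tutorial image explicitly:
    docker pull jupyter/all-spark-notebook:latest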
    Czkonverse
    @Czkonverse
    @deanwampler Thank you for your advice. I solved the problem after I configured the Docker daemon settings, and I have run the Jupyter notebook successfully. Thank you for the course again!
    Dean Wampler
    @deanwampler
    Great! Good luck.