    I mean using Matplotlib package ...
    Hi guys
    Joaquín Chemile
    Hello guys! Do you have any resource where I can find how to import my local Scala functions into the Notebook (something like "pip install ." with Python)? Thanks!
    Javier Bianco
    Hi guys, do you know how I can select a different sparkmagic "config.json" file when a kernel is starting? I've tried modifying the "kernel.json" file, adding two variables to the "env" section: "env": {"SPARKMAGIC_CONF_DIR": "/home/basic/.sparkmagic", "SPARKMAGIC_CONF_FILE": "config02.json"}, but it didn't work. Thanks!
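For reference, a kernel.json with an "env" section of the shape described above might look like this (a sketch only: the variable names SPARKMAGIC_CONF_DIR and SPARKMAGIC_CONF_FILE are taken from the question itself, and the display name and paths are placeholders):

```json
{
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "Python 3 (sparkmagic, config02)",
  "language": "python",
  "env": {
    "SPARKMAGIC_CONF_DIR": "/home/basic/.sparkmagic",
    "SPARKMAGIC_CONF_FILE": "config02.json"
  }
}
```

Note that a kernelspec's "env" only takes effect when a kernel is (re)started from that kernelspec, so the modified kernel.json must be the one the notebook actually launches.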
    John Pugliesi

    Hello - I'm using sparkmagic to execute long-running spark jobs, and running into an issue where the sparkmagic livy session dies if the submitted spark job runs longer than the configured livy.server.session.timeout. The error manifests as the usual:

    An error was encountered:
    Invalid status code '404' from http://my-livy-server:8998/sessions/38 with error payload: {"msg":"Session '38' not found."}

    Does the sparkmagic session heartbeat thread not keep the session alive if a cell runs longer than the livy session's timeout?

    Appreciate the help
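Sparkmagic's client-side heartbeat is governed by settings in its config.json; a sketch of the relevant keys (names as they appear in sparkmagic's example_config.json; verify them against your installed version):

```json
{
  "heartbeat_refresh_seconds": 30,
  "livy_server_heartbeat_timeout_seconds": 0
}
```

The heartbeat can only keep a session alive if the Livy server counts it as activity; if the server-side livy.server.session.timeout in livy.conf is shorter than the job, raising it on the server may also be necessary.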

    Itamar Turner-Trauring
    woah I did not realize this existed
    I guess it's rarely used so maybe that's OK
    but in general—if there's no one here and you have a problem, please file an issue on GitHub
    Hey Team, I am trying to figure out a way in sparkmagic to supply additional jars after session creation.

    i tried using:

    %%configure -f
    {"conf": {
        "spark.jars.packages": "com.xyz.my-jar:0.1.1"}}

    But this errors out with message: ERROR: Cell magic %%configure not found

    Can someone guide me on what to do in this case? I am using Jupyter Notebook.
    @here ^^
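The body passed to %%configure has to be valid JSON, and a truncated or malformed payload is easy to miss in a notebook cell. A quick stdlib sanity check (a sketch only; the package coordinate is the placeholder from the message above):

```python
import json

# Validate the %%configure payload before pasting it into the cell.
# json.loads raises json.JSONDecodeError on truncated or malformed input.
payload = '{"conf": {"spark.jars.packages": "com.xyz.my-jar:0.1.1"}}'
parsed = json.loads(payload)
print(parsed["conf"]["spark.jars.packages"])
```

As for "Cell magic %%configure not found" itself: %%configure is provided by sparkmagic, so a likely cause (not confirmed here) is running the cell in a kernel where the sparkmagic magics are not available, e.g. a plain IPython kernel, rather than in one of the sparkmagic wrapper kernels.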
    Palaniappan Nagarajan
    Hi Team, I have a question about configuring sparkmagic in a notebook programmatically. I am using Papermill to execute my notebook, and I want to parameterize the notebook so that I can execute it against different URL endpoints (overriding kernel_python_credentials in ~/.spark/config.json). Is there a way to accomplish this?
    Any help or pointers would be greatly appreciated. Thanks!
    @here: any examples of configuring default endpoints? I saw in the codebase that there is a way to configure default endpoints without the user having to go through the widget. Any pointers on this would be helpful.
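One way to parameterize the endpoint is to generate a per-endpoint config file before the kernel starts. This is a sketch under assumptions: that sparkmagic honors a SPARKMAGIC_CONF_DIR environment variable, and that kernel_python_credentials is the key to override, as the question states; write_endpoint_config is a hypothetical helper.

```python
import json
import os
import tempfile

def write_endpoint_config(url, conf_dir):
    # Hypothetical helper: materialize a minimal sparkmagic config.json
    # whose kernel_python_credentials points at the given Livy endpoint.
    config = {
        "kernel_python_credentials": {
            "username": "",
            "password": "",
            "url": url,
            "auth": "None",
        }
    }
    path = os.path.join(conf_dir, "config.json")
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
    return path

conf_dir = tempfile.mkdtemp()
path = write_endpoint_config("http://livy-staging:8998", conf_dir)
# Point sparkmagic at the generated config before the kernel starts.
os.environ["SPARKMAGIC_CONF_DIR"] = conf_dir
print(path)
```

A Papermill run could then export a different SPARKMAGIC_CONF_DIR per execution, so each notebook picks up a different endpoint.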

    Hi, I am using sparkmagic for the first time.

    I am in a good position now, as it connects to Livy, but I have a problem: no session is created.

    I execute:

    %load_ext sparkmagic.magics

    And the output is not a JavaScript widget, it is just:

    MagicsControllerWidget(children=(Tab(children=(ManageSessionWidget(children=(HTML(value='<br/>'), HTML(value='…

    I have verified that ipywidgets is installed (via pip install ipywidgets) and ran:

    jupyter nbextension enable --py --sys-prefix widgetsnbextension
    Enabling notebook extension jupyter-js-widgets/extension...
          - Validating: OK

    Using the latest jupyterhub image.

    Then if I run:

    %%configure -f
    {"executorMemory": "1000M", "executorCores": 4}
    Current session configs: {'executorMemory': '1000M', 'executorCores': 4, 'kind': 'spark'}
    No active sessions.

    There are no active sessions.

    Not sure what is missing; I am running out of ideas. Any ideas? Thanks a lot.
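For what it's worth, %%configure only records session configs; it does not itself start a session. In the IPython kernel a session is created either from the %manage_spark widget's "Add Session" tab or with the %spark line magic, roughly like this (a sketch: the session name and URL are placeholders, and the exact flags should be confirmed with %spark?):

```
%load_ext sparkmagic.magics
%spark add -s mysession -l python -u http://my-livy-server:8998
```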

    @dvirgiln: are you running it in Jupyter Notebook or JupyterLab?
    Shripad Deshmukh
    Does anyone know the reason for ERROR: Cell magic %%configure not found?
    Joaquín Chemile
    Hello! Is anyone using Spark Magic on Ubuntu 20.04? I have a question: I'm failing to install new kernels in sparkmagic. I'm following this tutorial: https://github.com/jupyter-incubator/sparkmagic#installation
    Marty Kemka
    Hi sparkmagic, I love Spark and I love magic, and your package has saved me years of work. However, I am not smart enough to solve this: I have a Jupyter session that connects to an EMR cluster with sparkmagic. How can I pass a variable from the Python session to the Spark session?
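Recent sparkmagic versions ship a cell magic for exactly this; a sketch (the variable names are placeholders, and the supported types are limited, to strings and pandas DataFrames as far as I know, so check %%send_to_spark? in your install):

```
%%send_to_spark -i my_local_var -t str -n my_remote_var
```

This makes the local Python variable my_local_var available in the remote Spark session as my_remote_var.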
    1 reply
    Udit Bhatia
    We have a remote Spark cluster running on AWS EMR with sparkmagic. When we connect to EMR, we have to restart our Jupyter kernel. Can we avoid restarting the Jupyter kernel?
    Ahmed Waheed
    Hello sparkmagic community. I need help connecting my local Jupyter notebook to an AWS EMR cluster. I tried looking it up but could not find anything concrete, and have now hit a roadblock. I found something similar for Azure (https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-jupyter-notebook-install-locally) but could not for AWS. I know I can launch notebooks in AWS that attach to the cluster, but they are not persistent.
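Assuming Livy is running on the EMR master node on its default port 8998 (and that the cluster's security group allows your machine to reach it), a local ~/.sparkmagic/config.json can point sparkmagic at the cluster; a sketch with a placeholder hostname:

```json
{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://<emr-master-public-dns>:8998",
    "auth": "None"
  }
}
```

The Azure article linked above follows the same pattern; only the Livy endpoint differs.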
    Hi all!
    I'm working on adding sparkmagic to my jupyterhub server images. It successfully installs but when running the following in my notebook:
    %load_ext sparkmagic.magics
    I am unable to create any sessions with my livy pyspark server. It either errors out with:
    HttpClientException: Error sending http request and maximum retry encountered.
    Or hangs entirely. Any help would be appreciated!!
    Hi! Under JupyterHub, is there a way to assign an owner to the corresponding Livy session (Livy is used on a local interface without authentication)? Currently, the owner and proxyUser are assigned null.