Ashwin K Ashok
@AshwinKAshok
In the context of Jupyter kernel gateway, is there a way to relate kernel_id to the underlying process id?
Kevin Bates
@kevin-bates

Hi @AshwinKAshok - Since Kernel Gateway only launches kernels local to itself, this answer would be similar to how Notebook relates kernel_ids to processes, since JKG is essentially the same implementation (just headless, with the kernel management taking place on the JKG server).

I don’t know of a way to do this with Notebook and nowhere in the framework is the kernel-pid surfaced AFAIK. Do you know of a way to accomplish this using standard Notebook?

Of course, kernel management knows this relationship (jupyter_client in particular), but the pid is rarely used internally (the Popen instance is used for signals and polling).

Ashwin K Ashok
@AshwinKAshok
@kevin-bates I am still trying to figure out a way to connect kernel_id to process_id. I am actually trying to find a way to get kernel-specific metrics from Kernel Gateway. The closest I have found is yuvipanda/nbresuse#41, but that is a notebook server extension and does not work with Kernel Gateway. I am trying to port that repo's logic into Kernel Gateway. The hurdle is that the mentioned repo uses session_id to connect to a notebook and subsequently figure out the process_id, and session_id is not something that Kernel Gateway maintains. The main motivation for doing this is that I have multiple remote kernels and I would like to provide metrics for these kernels, not just the metrics of the notebook server.
Kevin Bates
@kevin-bates
Thanks for the clarification. (I also should have noted that JKG is “essentially the same implementation [as notebook]” but only supports /api/kernels and /api/kernelspecs endpoints.)
I guess the answer is the same as what I responded with on this issue: jupyter/kernel_gateway#336.
Had the metrics of nbresuse been implemented to extend /api/kernels/{kernel-id}, and the gathering been done in MappingKernelManager where activity recording occurs, this would just work because that API is handled by gateway servers (JKG and EG).
I also see nbresuse using the kernel process's command line to make the kernel-id association, based on the fact that the kernel-id happens to be present in the connection-file name. This would not be necessary if the association were also made in MappingKernelManager - which has direct access (via the KernelManager instance) to the kernel process.
So, there are some things we can do, not so much in notebook, but perhaps in jupyter_server and I’d be happy to help (time permitting).
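[Editor's note] For reference, a rough sketch of the command-line association described above (a hypothetical standalone script, not nbresuse's actual code; it assumes psutil is installed and that the connection-file name embeds the kernel id):

import re
import psutil

# Connection files are conventionally named kernel-<kernel_id>.json, so the
# kernel id can be recovered from each kernel process's command line.
CONNECTION_FILE_RE = re.compile(r"kernel-([0-9a-f\-]+)\.json")

def kernel_pids():
    """Map kernel_id -> pid by scanning local process command lines."""
    mapping = {}
    for proc in psutil.process_iter(["pid", "cmdline"]):
        for arg in (proc.info.get("cmdline") or []):
            match = CONNECTION_FILE_RE.search(arg)
            if match:
                mapping[match.group(1)] = proc.info["pid"]
                break
    return mapping

print(kernel_pids())

Doing the association in MappingKernelManager instead would avoid this scan entirely, since each KernelManager instance already holds the process handle for its kernel.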
Victor Tolpegin
@DarkmatterVale
Hey all, I'm trying to add some security headers to JupyterHub's responses (see: jupyterhub/jupyterhub#3113)
I see here that we can specify the "headers" attribute in the JupyterHub config file to customize this information
But, it doesn't seem like customizing the headers data results in any change...when I curl the root page, I still don't see my new headers
Thoughts?
(and if I'm in the wrong chat room, which one would be the best place to post this question in?)
Kevin Bates
@kevin-bates
Hi @DarkmatterVale - you might get better traction in the Hub room: https://gitter.im/jupyterhub/jupyterhub
Victor Tolpegin
@DarkmatterVale
Thank you!
Monique Jones
@meshaun9
Hi, does anyone know if you can programmatically get the execution time of a cell by using the ExecuteTime NB extension?
Jason Grout
@jasongrout
@/all - We extended the JupyterCon Call for Proposals deadline two days until Wednesday, 22 July. There's still time to propose a talk, poster, tutorial, or sprint! https://jupytercon.com/participate/#Call%20for%20proposals
SenseTheTech
@SenseTheTech1_twitter
Can anyone help me with getting run time arguments through dialog.modal in Jupyter nb?
Kevin Bates
@kevin-bates

@/all - [ANN] Notebook 6.1.0 is available!
The Notebook 6.1.0 release is now available! You can catch up on the changes here.

This release is the culmination of a substantial community effort, with nearly 60 different contributors. Thank you to all those who contributed, raised issues, and participated in its success!

To get 6.1.0:

For pip-managed installations:
new: pip install notebook
existing: pip install --upgrade notebook

For conda-managed installations:
new: conda install -c conda-forge notebook
existing: conda update -c conda-forge notebook

T. George
@tgeorgeux
:shipit:
Jason Grout
@jasongrout
Congratulations!
Greg Werner
@jgwerner
newb question: is there a simple configuration option to set the jupyterlab launcher icons to open new browser tabs? in particular, those that are enabled with jupyter-server-proxy.
biancaisla1
@biancaisla1
Also a newb here. I am using a Jupyter notebook to collaborate and create a .py script. However, not everyone is using a Microsoft OS. Some people are using Linux and others are using Apple. Is there a way to set the wd so that it works universally?
Angus Hollands
@agoose77
@biancaisla1 you might need to explain the problem a bit more - the working directory should normally be the same as the notebook.
biancaisla1
@biancaisla1
So, my issue is that I'm trying to collaborate using a Jupyter notebook and then produce a .py script that can be run on both Linux and a Microsoft OS. So far we've just been commenting each other's WDs out, but that won't work for a script. Does that clarify?
Angus Hollands
@agoose77
So I think the issue here is more than different OSes - different people run notebooks in different places. The best way I can think of to handle this is to do things relative to the notebook cwd (which should be the directory in which the notebook exists), or to pass the path in as an environment variable.
If you need to manipulate files you can also use a file upload widget to provide a UI to loading files
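[Editor's note] A minimal sketch of that relative-path approach (the NOTEBOOK_DATA_DIR variable and the data/input.csv layout are made up for illustration):

import os
from pathlib import Path

# The kernel's working directory is normally the directory containing the
# notebook, so resolve everything relative to it (or to an env-var override)
# rather than hard-coding OS-specific absolute paths.
BASE_DIR = Path(os.environ.get("NOTEBOOK_DATA_DIR", Path.cwd()))

# pathlib's "/" join produces the right separators on Windows, Linux, and macOS.
data_file = BASE_DIR / "data" / "input.csv"
print(data_file.resolve())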
Carl Simon Adorf
@csadorf
Hi! I am currently trying to work out a solution where I can run individual notebooks in a dedicated virtual environment (with its own dedicated Jupyter kernel) to be able to isolate dependencies for these notebooks. Is there a way to dynamically select the correct Jupyter kernel based on a given notebook path? Or is there a better way to solve this issue?
Min RK
@minrk
@csadorf I'd do it in the Kernel Spec Manager, I think, dynamically creating kernelspecs
Carl Simon Adorf
@csadorf
@minrk Thx for the hint. This would allow me to dynamically "create" kernel specs, but it doesn't allow me to dynamically select them, does it?
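[Editor's note] For context, a rough sketch of the "dynamically creating kernelspecs" idea (the venv layout under VENV_ROOT and the venv- naming scheme are assumptions for illustration, not an existing extension):

from pathlib import Path

from jupyter_client.kernelspec import KernelSpec, KernelSpecManager

VENV_ROOT = Path.home() / "venvs"  # assumption: one virtualenv per project


class VenvKernelSpecManager(KernelSpecManager):
    """Synthesize one kernelspec per virtualenv found under VENV_ROOT."""

    def find_kernel_specs(self):
        specs = super().find_kernel_specs()
        for venv in VENV_ROOT.iterdir():
            if (venv / "bin" / "python").exists():
                specs[f"venv-{venv.name}"] = str(venv)
        return specs

    def get_kernel_spec(self, kernel_name):
        if kernel_name.startswith("venv-"):
            name = kernel_name[len("venv-"):]
            python = VENV_ROOT / name / "bin" / "python"
            # Launch ipykernel with that environment's interpreter, so picking
            # the kernel picks the environment.
            return KernelSpec(
                argv=[str(python), "-m", "ipykernel_launcher",
                      "-f", "{connection_file}"],
                display_name=f"Python ({name})",
                language="python",
            )
        return super().get_kernel_spec(kernel_name)

Such a manager could be enabled via c.NotebookApp.kernel_spec_manager_class. Selecting the spec per notebook path is a separate problem: the chosen kernel is recorded in the notebook's own metadata, so the mapping from notebook to environment still has to be made by the user or by other tooling.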
Sylvain Corlay
@SylvainCorlay
Hey I was off this morning. Doing this after lunch.
Biswarup Banerjee
@techguybiswa
Hello people!
So for the past few weeks, I have been building a Jupyter Notebook extension that creates a PySpark cluster locally or on Kubernetes at the click of a few buttons.
But I am facing an issue.
I have made a modal, and inside the modal there are a few input boxes. But when I click on an input box and try to type in values, the keyboard focus does not change. Whatever I try to type into the input box inside the modal seems to go to the Jupyter Notebook cells.
I am also attaching a screencast that will help you understand the issue: https://www.loom.com/share/342abfc3c34143b39bf9d98462cd9548
So do you all have any idea how to solve it? How to keep the focus of the keyboard on the modal and not on the Jupyter Notebook Cells?
Wei Ouyang
@oeway
Hi guys, I would like to support syntax highlighting for HTML code blocks inside quoted Python strings. Google Colab does that automatically when it sees IPython.display.HTML(...), so I am thinking maybe we can also support that in the Jupyter notebook by detecting quoted strings inside HTML() in a code block. Could you please give me some hints?
arghyakusumdas6163
@arghyakusumdas6163
Hi... I am getting an "xsrf cookie does not match" error. I am trying to access a Jupyter notebook from my web app deployed on the Heroku cloud.
However, when I access it directly from the web browser, it works.
Angus Hollands
@agoose77
Hi all - I'm running voila using the built-in Jupyter server, and it fails to resolve static files. It seems that it's looking at /voila, which redirects to /hub/voila instead of /user/.../voila. Is there a way to resolve this?
Yilong Li
@yl3

Hi, I am having a very weird problem trying to run Jupyter notebook from a server via SSH port forwarding.

  1. Notebook hangs after I try to open, create or copy a notebook using the Jupyter interface.
  2. Once Jupyter starts hanging, it stops responding to Ctrl-C on the server
  3. Opening, creating or copying text files on the same Jupyter interface works, so it doesn't seem to be a file permission issue.
  4. Renaming notebooks using the Jupyter interface works, so it doesn't seem to be a file permission issue.
  5. The issue isn't with my browser, since Jupyter works from another server as well as when run from my laptop
  6. I've tried deleting my browser's cache, using different browsers as well as incognito without success, so doesn't seem to be a browser issue
  7. The problem persists even when I run Jupyter from a fresh Conda installation, so doesn't seem to be a Jupyter version issue. I've also verified that the exact same version of Jupyter notebook works when I run it from the other server
  8. I've tried a range of different SSH ports. I am always able to connect to the main Jupyter interface, and Jupyter starts hanging only when I try to open, create or copy a notebook
  9. I get the same problem whether I try to create an R, Python or Bash notebook, so doesn't seem to relate to the Kernel used

It would be really great if anybody had any idea what I could try next...

Kevin Bates
@kevin-bates
Hi @yl3, Could you share your SSH command? I’m not able to reproduce this using ssh -L 9123:localhost:8888 remotehost with Notebook listening on remotehost:8888 and me hitting localhost:9123 from a browser.
Yilong Li
@yl3
@kevin-bates here is an example: ssh -L 8894:localhost:8894 remotehost
and this exact command works for another server that I am using
Kevin Bates
@kevin-bates
Thanks @yl3. I’m not an ssh (or networking) expert, so can’t provide any helpful information. Are there any debug/verbosity flags that can be added to the ssh command? I noticed that when I went to destroy my tunnel, it got into what seemed like a half-destroyed state and, at that point, the browser interaction wasn’t working.
Have you tried operations across the tunnel but unrelated to Jupyter? i.e., try troubleshooting at a lower level.
Yilong Li
@yl3
@kevin-bates yeah, I've tried to run ssh with -vvv but nothing struck my eye (I'm not a networking expert either, though). Regarding your second point, the thing is that Jupyter actually works completely fine - I am able to create and edit text files through Jupyter. It's only when I open or create a new notebook that Jupyter stops responding.
Kevin Bates
@kevin-bates
Yeah, those actions are tied to kernel invocations and I wonder if this is an issue with the websocket. Your original post mentioned that Copy was also an issue, which I found inconsistent with this argument but didn't bring that up. Is Copy still an issue? If not, then I would focus on the websocket angle to this (and this particular server).
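[Editor's note] One way to probe that websocket angle outside the browser (a rough sketch only; the forwarded port and TOKEN are placeholders for your setup, and it assumes the requests and websockets packages are installed):

import asyncio

import requests
import websockets

PORT = 9123                      # placeholder: your forwarded local port
TOKEN = "<your-notebook-token>"  # placeholder: token printed at server start

async def main():
    # Starting a kernel over the REST API works if plain HTTP survives the tunnel.
    kernel = requests.post(
        f"http://localhost:{PORT}/api/kernels?token={TOKEN}"
    ).json()
    print("kernel started:", kernel["id"])

    # If this connect hangs, the problem is likely the websocket upgrade
    # across the tunnel rather than Jupyter itself.
    ws_url = f"ws://localhost:{PORT}/api/kernels/{kernel['id']}/channels?token={TOKEN}"
    async with websockets.connect(ws_url):
        print("channels websocket connected")

asyncio.run(main())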
Yilong Li
@yl3
Yep, copy is indeed still an issue the same way. Once I close the unresponsive window, I get the following log in my ssh -v -v -v:
debug2: channel 2: read<=0 rfd 9 len 0
debug2: channel 2: read failed
debug2: channel 2: chan_shutdown_read (i0 o0 sock 9 wfd 9 efd -1 [closed])
debug2: channel 2: input open -> drain
debug2: channel 3: read<=0 rfd 10 len 0
debug2: channel 3: read failed
debug2: channel 3: chan_shutdown_read (i0 o0 sock 10 wfd 10 efd -1 [closed])
debug2: channel 3: input open -> drain
debug2: channel 2: ibuf empty
debug2: channel 2: send eof
debug3: send packet: type 96
debug2: channel 2: input drain -> closed
debug2: channel 3: ibuf empty
debug2: channel 3: send eof
debug3: send packet: type 96
debug2: channel 3: input drain -> closed
...
Does anyone have any idea what this could be?
Sylvain Corlay
@SylvainCorlay
Please RT the last announcement on signing up to jupytercon!: https://twitter.com/LorenaABarba/status/1311720288816685056
Matt Riedemann
@mriedem
Hi, I cloned the latest notebook master code and haven't tweaked any of the logging configuration, but when I start the notebook server with jupyter notebook I'm getting a KeyError on 'color'. Has anyone seen this?
--- Logging error ---
Traceback (most recent call last):
  File "/usr/lib/python3.6/logging/__init__.py", line 994, in emit
    msg = self.format(record)
  File "/usr/lib/python3.6/logging/__init__.py", line 840, in format
    return fmt.format(record)
  File "/home/osboxes/jupyter/notebook/.tox/lib/python3.6/site-packages/traitlets/config/application.py", line 117, in format
    return super(LevelFormatter, self).format(record)
  File "/usr/lib/python3.6/logging/__init__.py", line 580, in format
    s = self.formatMessage(record)
  File "/usr/lib/python3.6/logging/__init__.py", line 549, in formatMessage
    return self._style.format(record)
  File "/usr/lib/python3.6/logging/__init__.py", line 391, in format
    return self._fmt % record.__dict__
KeyError: 'color'
adamwoolhether
@adamwoolhether

Hi, I'm having difficulty switching conda environments while running a jupyter notebook. I first created a 2nd env, installed ipykernel in that environment, and created a 2nd kernel.

I shut down all kernels, open my notebook, and choose the 2nd kernel (env2), but when running the notebook I can see that the commands are still being run under the previous env, as demonstrated with a !conda env list.

As a test, I ran the sequence below. It seems like every subsequent command is run from a default shell.

!conda env list
!source /opt/conda/etc/profile.d/conda.sh && conda activate testy && conda env list
!conda env list
will show:
# conda environments:
#
base                  *  /opt/conda
testy                    /opt/conda/envs/testy

# conda environments:
#
base                     /opt/conda
testy                 *  /opt/conda/envs/testy

# conda environments:
#
base                  *  /opt/conda
testy                    /opt/conda/envs/testy
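[Editor's note] A quick way to separate the kernel's environment from the throwaway shells that ! spawns (a hypothetical check, run from a notebook cell):

import sys

# Each "!" line runs in its own fresh subshell, so a "conda activate" there
# does not persist to later "!" lines or affect the kernel. The kernel's own
# environment is fixed by the interpreter its kernelspec points at:
print(sys.executable)  # e.g. /opt/conda/envs/testy/bin/python for a kernel
                       # registered from the "testy" environment
print(sys.prefix)      # the environment prefix the kernel is running in

If this still shows the base environment's interpreter, the second kernelspec was likely registered against the base Python rather than the new environment's ipykernel.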

Dinendu Das
@dinendu
When I try to upgrade Python from 3.5 to 3.6 or a later version:

Preparing transaction: done
Verifying transaction: done
Executing transaction: failed
ERROR conda.core.link:_execute(502): An error occurred while uninstalling package 'conda-forge::notebook-4.2.3-py35_0'.
OSError(16, 'Device or resource busy')
Attempting to roll back.

Rolling back transaction: done

OSError(16, 'Device or resource busy')