Z[index:index+chunk_size] = chunk
index += chunk_size
But you have to take care of the case where index is close to the end of Z. In that case, you need to split your data into two parts and make two uploads.
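As a minimal sketch of that wraparound handling (the function name and buffer layout are illustrative, not from the VisPy example):

```python
import numpy as np

def write_chunk(Z, index, chunk):
    """Write `chunk` into circular buffer Z starting at `index`,
    splitting it into two uploads when it would run past the end."""
    n = len(Z)
    first = min(len(chunk), n - index)      # how much fits before the end
    Z[index:index + first] = chunk[:first]  # first upload
    Z[:len(chunk) - first] = chunk[first:]  # second upload (wrapped part)
    return (index + len(chunk)) % n         # new write position

Z = np.zeros(8)
idx = write_chunk(Z, 6, np.array([1.0, 2.0, 3.0]))
# positions 6, 7 and 0 now hold the chunk; idx == 1
```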
np.roll
You could also do this kind of thing on a NumPy array that's twice the size of the buffer you need and treat it like a circular buffer: you update the array in two places with each new chunk, but the data you upload to matplotlib/VisPy is the slice of the large array that represents the current N data points.
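A sketch of that double-size-buffer trick, under the assumptions above (names are illustrative): every sample is written at two positions N apart, so the most recent N samples are always available as one contiguous slice.

```python
import numpy as np

N = 4
buf = np.zeros(2 * N)
head = 0  # number of samples written so far

def push(sample):
    global head
    i = head % N
    buf[i] = sample      # first copy
    buf[i + N] = sample  # second copy, N positions later
    head += 1

def window():
    # Contiguous view of the most recent N samples, oldest first;
    # this is the slice you would upload to the plot.
    start = head % N
    return buf[start:start + N]

for s in [1.0, 2.0, 3.0, 4.0, 5.0]:
    push(s)
# window() is now [2., 3., 4., 5.]
```

The cost is writing each sample twice, but the upload side never has to deal with wraparound.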
Hey guys, I have gotten the original VisPy example up and running. What direction should I go to add cursors? I figured the first two things I need to do are to add this example to a wxPython frame (which seems possible) and to start getting cursors up and running. The cursors are to mimic an oscilloscope trigger (horizontal yellow bar) and location (vertical yellow bar). If you aren't familiar with an oscilloscope, these basically cause the signal content that breaks above the horizontal bar to be centered at the vertical bar. So I need a way for the user to drag these bars around on the screen to affect the data displayed and the x-axis location. I see how the 'y data' vector is working, but how do you add a line/cursor to the plot? It looks like I would need to plot two separate vectors: one being the data vector and one being the cursor line vector? Maybe a code listing would be better.
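Independently of how the cursors are drawn, the trigger behaviour described here can be sketched in plain NumPy (the function and parameter names are hypothetical, not part of the VisPy example): find where the signal first crosses the horizontal trigger level, then roll the data so that crossing lands at the vertical cursor position.

```python
import numpy as np

def center_on_trigger(y, trigger_level, cursor_index):
    """Return y rolled so the first upward crossing of trigger_level
    sits at cursor_index; returned unchanged if there is no crossing."""
    above = y >= trigger_level
    # indices where the signal goes from below to at-or-above the level
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
    if crossings.size == 0:
        return y
    return np.roll(y, cursor_index - crossings[0])

y = np.array([0.0, 0.2, 0.9, 0.4, 0.1, 0.0])
out = center_on_trigger(y, 0.5, 3)
# the first sample above 0.5 (index 2) has moved to index 3
```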
Basically my next step is to try to get this looking like a plot. I've added a picture, too.
Hi David, thank you! I am currently just trying to gauge the difficulty so I can estimate the time investment needed to get a pretty good proof-of-concept UI up and running. I don't have much code yet and am just working off a modified version of the example I originally started with (the realtime signals example). Before I jumped in, I was trying to figure out how to connect cursors and UI controls (wxPython widgets) to the data actually being plotted by the shaders (I think that would be the correct way to say it). As I'm new to this, ideally I could conceptually wrap my head around what path to go down to create this interface and how to interact with the plots. It looks like so far you just create variables in the shader code and they are linked through gloo.Program(), like the following:
self.program = gloo.Program(VERT_SHADER, FRAG_SHADER)
self.program['a_position'] = y.reshape(-1, 1)
self.program['a_color'] = color
self.program['a_index'] = index
self.program['u_scale'] = (1., 1.)
self.program['u_size'] = (nrows, ncols)
self.program['u_n'] = n
I was a little concerned, though, about how difficult it would be to add cursors or other lines to the same plot the data is on. My current task is to figure out how to get a line on the plot, along with the data, that I can move up and down like the horizontal one I drew in the screenshot.
So if I understood you correctly, my approach would be to modify the VERT_SHADER code shown below to support a second position attribute (I'm just calling it "b_position") to hold the horizontal line I'm using as a cursor?
VERT_SHADER = """
// y coordinate of the position.
attribute float a_position;
attribute float b_position;
// row, col, and time index.
attribute vec3 a_index;
varying vec3 v_index;
// 2D scaling factor (zooming).
uniform vec2 u_scale;
// Size of the table.
uniform vec2 u_size;
// Number of samples per signal.
uniform float u_n;
// Color.
attribute vec3 a_color;
varying vec4 v_color;
// Varying variables used for clipping in the fragment shader.
varying vec2 v_position;
varying vec4 v_ab;
void main() {
    float nrows = u_size.x;
    float ncols = u_size.y;

    // Compute the x coordinate from the time index.
    float x = -1 + 2*a_index.z / (u_n-1);
    vec2 position = vec2(x - (1 - 1 / u_scale.x), a_position);

    // Find the affine transformation for the subplots.
    vec2 a = vec2(1./ncols, 1./nrows)*.9;
    vec2 b = vec2(-1 + 2*(a_index.x+.5) / ncols,
                  -1 + 2*(a_index.y+.5) / nrows);

    // Apply the static subplot transformation + scaling.
    gl_Position = vec4(a*u_scale*position+b, 0.0, 1.0);

    v_color = vec4(a_color, 1.);
    v_index = a_index;

    // For clipping test in the fragment shader.
    v_position = gl_Position.xy;
    v_ab = vec4(a, b);
}
"""
@m3atwad Honestly, unless you are an OpenGL expert I would not use the realtime signals example; I would instead use the higher-level SceneCanvas API in VisPy. That's what I was trying to say before about not liking the realtime signals example. In the future I may make a tutorial video on handling realtime signals, but I can't promise anything in the short term.
The SceneCanvas has more built-in handling of mouse events, cameras, and drawing multiple visual elements. With pure gloo you are going to have to do it all from scratch or copy from examples.
Hi everyone, I'm new to VisPy and am trying to learn more about the API. To learn more about the package, I'm trying to build a simple image viewer application with PyQt5.
The first question I've encountered is whether I should use the app.Canvas or the scene.SceneCanvas to show images from a QMainWindow application?
The next question I have is regarding how to destroy either of these canvases once they've been created. It seems that the close() method at least hides the widget, but the canvas can be shown again by calling canvas_name.show() when it is stored in a layer list. I also can't figure out what pressing the window's 'x' button calls; it's not close(). Any advice on how to catch close events and make sure the canvas is actually destroyed?
Thanks!