Matthew Spellings
@klarh
Excellent, great job!
David Hoese
@djhoese
@larsoner @kmuehlbauer FYI I'd like to switch vispy to automatically using tags for version numbers. You'll see a lot of projects like xarray and dask using versioneer and I've used that in my own projects like Satpy. However, versioneer has been unmaintained for a while and it seems that setuptools-scm (https://pypi.org/project/setuptools-scm/) is the best alternative. I haven't completely switched to it for Satpy yet (PR waiting to be merged) but I'd like to use it for vispy too. Let me know what you think or if you have other ideas/concerns?
Kai Mühlbauer
@kmuehlbauer
@djhoese Can you be a bit more verbose about how things would work then? I have no experience with versioneer or setuptools-scm, so please be kind :-).
David Hoese
@djhoese

@kmuehlbauer Yeah no problem. I typed that out before I had to go offline so left out some details.

Right now we manually set the version of vispy in vispy/__init__.py. When we make a release we update it from 0.6.0.dev0 to 0.6.0, make the release, then update it again to 0.6.1.dev0.

With versioneer or setuptools-scm, the tools are set up to look at the latest git tag matching a specific pattern (e.g. vX.Y.Z). If the current commit is the same as the git tag then vispy.__version__ will be the version in the tag (e.g. 0.6.0). If the current checkout is a couple of commits beyond a tag then you'll get something like 0.6.0+abc123 where abc123 is the current commit hash. If the current working directory is dirty (uncommitted changes) then you'll get a version like 0.6.0+abc123.dirty or something like that.

This means that whenever we need to make a release, especially with the automated PyPI deployment I just got working, all we have to do is git tag -a v0.7.0 -m "Version 0.7.0"; git push --follow-tags. The CIs will clone the repository, use the current git tag as the version of the package, build the sdist/wheel for that version, then deploy it. Then as we merge more PRs/features, the version we install from github will automatically increase from the last released version.
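
For anyone following along, a minimal sketch of what the setuptools-scm wiring could look like (an assumed setup.py, not vispy's actual configuration):

    # setup.py -- assumed sketch of the setuptools-scm wiring, not vispy's real setup
    from setuptools import setup

    setup(
        name="vispy",
        # Derive the version from the latest git tag matching vX.Y.Z:
        # on the tag itself you get "0.7.0", commits past the tag get a
        # dev/local suffix, and dirty working trees get an extra marker.
        use_scm_version={"write_to": "vispy/version.py"},
        setup_requires=["setuptools_scm"],
    )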

Bugs in versioneer that haven't been fixed because of the lack of maintenance: wheels get a version of 0+unknown if the sdist isn't built first, and the extra files needed for versioneer to work don't meet standard style rules (IIRC)
Matthew Spellings
@klarh
it does prevent users from being able to install from github tarballs of the project, but that probably doesn't happen too often (I ran into this with some project within the last week or so, but it's certainly not the end of the world and setuptools-scm gives a very clear error message about it)
Elliott Sales de Andrade
@QuLogic
that can be somewhat worked around with setuptools_scm_git_archive
David Hoese
@djhoese
@klarh The tool @QuLogic mentions should fix that
David Hoese
@djhoese
@kmuehlbauer @larsoner I decided to play with .obj files and MeshVisuals because I really don't have much experience with them. A couple questions I was hoping you could verify for me:
  1. vispy has no "give me contents of .obj file and put it in a mesh" visual class or utility. Right?
  2. MeshVisuals have no way for the user to supply an array of normals, right? Is this a missing feature or relatively unnecessary? .obj files have normals defined, so wouldn't it be advantageous to be able to provide them?
For 1, I was mainly thinking if I have the texture data for an .obj and all of the information from the .obj file, there is no builtin Visual that takes mesh information and a texture...right?
[image attached: image.png]
Model of the GOES-16 satellite with no real texture data :smile:
Eric Larson
@larsoner
MeshData can handle normals
and internally somewhere Mesh must do it IIRC because things are smooth shaded
Kai Mühlbauer
@kmuehlbauer
@djhoese Quite some work was done already by @asnt, who put up a tracking issue: vispy/vispy#1665. For adding a texture there is the filter approach (also by @asnt, see the tracking issue) which attaches a texture to a mesh. It would be a good time to consolidate everything within the linked tracking issue.
David Hoese
@djhoese

@kmuehlbauer Thanks.

@larsoner Yes, normals are used but they are calculated from the vertex/face data. Nothing currently takes normals as an input from the user in any way
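
For context, a tiny sketch of how that looks today (assuming the vispy.geometry.MeshData constructor and getter names I remember; worth double-checking against the current API):

    import numpy as np
    from vispy.geometry import MeshData

    # A single triangle: three vertices, one face.
    vertices = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
    faces = np.array([[0, 1, 2]])

    mdata = MeshData(vertices=vertices, faces=faces)
    # Normals are derived from the vertex/face data; there is no argument
    # for passing in precomputed normals (e.g. from an .obj file).
    print(mdata.get_face_normals())    # expected ~[[0, 0, 1]]
    print(mdata.get_vertex_normals())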

David Hoese
@djhoese
@asnt in your opinion, what are the most important PR(s) that need to be merged before other work can be done on Meshes?
asnt
@asnt
@djhoese As far as I understood, #1462 was a required change in the logic of MeshVisual._update_data() before being able to implement the TextureFilter. Then, #1444 (texture filter) and #1463 (shading filter) would apply.
asnt
@asnt

I think there are some limitations in the current system:

Loading:

  • The obj loader assumes texture coordinates are vertex attributes but they can also be attributes of the face corners (e.g. allowing for a texture atlas or distinct normals on either side of a crease). The current obj loader seems to truncate the data when face corner attributes do not map to vertex attributes.
  • Reading of the texture image still needs to be added/supported.
  • (I can propose a tentative obj loader handling the above two cases.)
  • The current convention of the loaders is to return the tuple (vertices, faces, normals, texcoords) or something similar. This assumes texcoords are vertex attributes (see above). We probably need to be more flexible. More generally, there would be arrays of data (vertex array, normals array, texture coordinates array) and arrays of indices telling how the arrays of data are mapped to the face corners (currently only the "face indices" are well supported; see the small parsing sketch at the end of this message).

Internal representation / API:

  • The current MeshData might also assume texcoords are vertex attributes.
  • Extend API (?): Maybe separate the geometry of the mesh (MeshData) from its representation (e.g. a new Material class) and the possible light sources (e.g. a new Light class). MeshVisual would compose these and manage the rendering, drawing inspiration from threejs. Note: The filter idea is doing this in a way. Maybe if MeshData is extended to support all the data we need, it can be made accessible to the filter for appropriate rendering.
  • Should the mesh reader return a MeshData object for flexibility instead of a fixed tuple?

Future filters:

  • wireframe

These are some thoughts and suggestions I could gather. Not sure if it's totally clear/correct. Let's discuss.
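
To illustrate the face-corner indexing mentioned above, here is a hypothetical snippet (not the vispy loader): each corner of an OBJ face line carries its own position/texcoord/normal indices, so a flexible loader keeps one index array per attribute instead of forcing everything to be per-vertex.

    # Hypothetical parser for a single OBJ face line such as "f 1/4/2 2/5/2 3/6/2".
    def parse_face_line(line):
        corners = line.split()[1:]                        # drop the leading "f"
        v_idx, vt_idx, vn_idx = [], [], []
        for corner in corners:                            # forms: v, v/vt, v/vt/vn, v//vn
            parts = (corner.split("/") + ["", ""])[:3]
            v_idx.append(int(parts[0]) - 1)               # OBJ indices are 1-based
            vt_idx.append(int(parts[1]) - 1 if parts[1] else -1)
            vn_idx.append(int(parts[2]) - 1 if parts[2] else -1)
        return v_idx, vt_idx, vn_idx

    print(parse_face_line("f 1/4/2 2/5/2 3/6/2"))         # ([0, 1, 2], [3, 4, 5], [1, 1, 1])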

asnt
@asnt
I keep updating #1665 as thoughts come up
David Hoese
@djhoese

@asnt What are your thoughts on depending on a library like https://pypi.org/project/PyWavefront/

If a user has it installed then they can load .obj files, if not then they can't. We could keep the "dumb" reader for now

asnt
@asnt
@djhoese Of course that could work as long as the user can obtain the required arrays. Not sure about PyWavefront: it seems to interleave the arrays to be processed by the gpu. In vispy, we'd rather need the arrays as independent pieces, it seems.
asnt
@asnt
@djhoese Here's a basic obj loader/saver with the features I mentioned in #1665: https://gist.github.com/asnt/22c2fa04b9a5811c418e5ce781744717
  • Handle vertex color attributes
  • Load array of normal data (on top of vertices and texture coords)
  • Load face-corner attributes (faces, normals, and texture coordinates)
  • Parse the MTL file for the texture image (i.e. map_Kd "diffuse mapping")
  • Load texture image (depends on Pillow)
  • Able to write back this information to disk
    Only tested by me. Many other features of the wavefront format are ignored (some are present in PyWavefront, from a quick look).
David Hoese
@djhoese

@asnt What I want to avoid is maintaining our own functionality and/or library. I would hope to depend on someone else's work. Granted, it is just as likely that pywavefront will die and go away, BUT I would much rather contribute to an open source project and keep it alive by making it better. Looks like pywavefront has a roadmap that includes using numpy arrays: pywavefront/PyWavefront#92

And from what I can tell it is possible to access the individual elements (vertices, etc) as lists.

Side note: I have an optimization for your code: if you read all the vertices first and determine how many there are, you can pre-initialize (np.empty) the arrays for all the other types, so you don't have to waste memory/performance creating lists and converting them to numpy arrays. You can instead set the value for each index: norms[idx] = value.
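
Roughly what that suggestion looks like (a hypothetical snippet with made-up data, not code from the gist):

    import numpy as np

    lines = ["vn 0 0 1"] * 1000   # stand-in for the normal lines of an .obj file

    # list-append version: builds a Python list of lists, then copies it into an array
    normals = np.array([[float(x) for x in ln.split()[1:]] for ln in lines])

    # preallocated version: the count is known up front, so fill the array in place
    normals = np.empty((len(lines), 3), dtype=np.float32)
    for idx, ln in enumerate(lines):
        normals[idx] = [float(x) for x in ln.split()[1:]]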

If you disagree, of course, argue. I'm not the primary user of this stuff. I just don't want to accidentally become the maintainer of another project
asnt
@asnt
@djhoese I see your point. I think it's reasonable to implement the visualisation independently of the data loading. We can assume for now that the user provides the arrays. https://github.com/mikedh/trimesh might also be an alternative for loading.
@djhoese Thanks for the feedback on the code. Will definitely give it a try.
I'll have a look at syncing the PR's with master over the weekend.
David Hoese
@djhoese
@asnt Looks like we have vispy.io.stl that copies from trimesh
yes, I like trimesh much more
asnt
@asnt
@djhoese I synced the PR's with master: #1462 #1444 #1463 and added a tentative wireframe filter #1689
I am not sure about the CI error yet
David Hoese
@djhoese
I restarted the failed job on 1462, it happens occasionally and I haven't had the time to figure it out. Seems like a race condition with cassowary which is bundled with vispy. We have another PR to switch to a new version or other similar library but it was never finished
David Hoese
@djhoese
@asnt if some of your recently updated PRs are failing on the travis "examples" job you may want to merge with master again. A change in numpy was causing warnings, which was causing tests to fail
asnt
@asnt
@djhoese Thanks, I'll try when possible
David Hoese
@djhoese
thank you
Wei Jiang
@jiangwei221
Hello, I have a question about the camera transformation, for example view.camera.transform.matrix: is the matrix a world-to-camera or a camera-to-world matrix? Thanks!
David Hoese
@djhoese
@jiangwei221 Technically both. I believe the .map method should be world to camera and .imap should be camera to world.
David Hoese
@djhoese
sorry, the .map functions should be on the .transform object and that's what you should use instead of accessing .matrix directly
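
A small sketch of that, following the directions described above (untested, and assumes a turntable camera):

    import numpy as np
    from vispy import scene

    canvas = scene.SceneCanvas()
    view = canvas.central_widget.add_view()
    view.camera = 'turntable'

    point = np.array([1.0, 2.0, 3.0, 1.0])
    cam_coords = view.camera.transform.map(point)          # world -> camera (per the message above)
    world_again = view.camera.transform.imap(cam_coords)   # camera -> world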
Wei Jiang
@jiangwei221
Thanks! @djhoese
David Hoese
@djhoese
If anyone is around that uses _screenshot for some of their stuff: have you tried using it inside a PyQt5 widget? I just realized an old project of mine that switched from PyQt4 to PyQt5 is now saving the whole window instead of just the vispy canvas
KonstantinMakovsky
@KonstantinMakovsky
hello. Can anyone send the simplest example of a VisPy animation of a watch arrow?
David Hoese
@djhoese
Have you looked through the examples in the vispy repository? What do you mean by "watch arrow"?
KonstantinMakovsky
@KonstantinMakovsky
I looked at it, but I only need a few lines of code to do it. A line must move. X and Y data must be lists
David Hoese
@djhoese
@KonstantinMakovsky I'm still not sure what you mean exactly. But maybe you could get some ideas from https://github.com/vispy/vispy/blob/master/examples/basics/scene/line.py or https://github.com/vispy/vispy/blob/master/examples/basics/scene/line_update.py
KonstantinMakovsky
@KonstantinMakovsky
How can I set X and Y plot limits?
David Hoese
@djhoese

The plotting API for vispy is very limited and not well documented so it can be difficult to find answers for this kind of stuff. I think you should be able to do:

x_axis.axis.domain = (x_min, x_max)
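
A rough sketch of where that line fits (attribute names like pw.xaxis and pw.view are my assumptions about the vispy.plot API, untested):

    import numpy as np
    from vispy import plot as vp

    x_min, x_max, y_min, y_max = 0, 100, 0.0, 1.0

    fig = vp.Fig()
    pw = fig[0, 0]                     # a PlotWidget
    pw.plot(np.column_stack((np.arange(100), np.random.rand(100))))

    pw.xaxis.axis.domain = (x_min, x_max)   # the line suggested above
    # Setting the camera range is another way to control the visible limits:
    pw.view.camera.set_range(x=(x_min, x_max), y=(y_min, y_max))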