ImportError: DLL load failed while importing ogrext: The specified procedure could not be found.
Everything seems to work fine on local machines when installing the environment, so it might be a GitHub Actions issue. Have any of you experienced something similar, and if so, have you found a solution? You can find the error logs here: https://github.com/r5py/r5py/runs/7856124475?check_suite_focus=true
https://github.com/r5py/r5py/pull/149#issuecomment-1216617678
Is it possible to install shapely==2.0a1 alongside GeoPandas? It would be nice to start using some of the new features from shapely v2, but it currently clashes with GeoPandas. I see there is a pull request geopandas/geopandas#2275 working on this, so maybe this will be released as a prerelease at some point in the near-ish future?
Hey guys, quick question about runtime performance improvements for pygeos:
is it possible to store pygeos objects, such as Polygons, as memory-mapped files? I have a bunch of planes growing throughout a simulation, but not every part of every polygon is accessed or written at each time step, so it would be great if they could be mapped to memory, like numpy memory-mapped arrays...
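One possible direction (my own sketch, not an answer from this thread): GEOS geometries live in C heap memory and can't be memory-mapped directly, but their raw coordinates can sit in a numpy memmap, with the geometry rebuilt from them on access.

```python
# Sketch (an assumption, not from the thread): keep coordinates in a numpy
# memmap on disk and reconstruct the Polygon from them when it is needed.
import numpy as np
from shapely.geometry import Polygon

# Coordinates of a unit square, backed by a file instead of RAM
coords = np.memmap("coords.dat", dtype="float64", mode="w+", shape=(4, 2))
coords[:] = [(0, 0), (1, 0), (1, 1), (0, 1)]
coords.flush()  # persist coordinate updates to disk

# Rebuild the geometry from the mapped array when it is accessed
poly = Polygon(coords)
assert poly.area == 1.0
```

Only the coordinate storage is mapped here; the Polygon itself is still a regular in-memory object each time it is rebuilt.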
But using multiprocessing with a GEOS object is possible, right? Is there a way to pass a Polygon instance between different processes? Do I need to define that explicitly?
Between different processes you will have to actually copy (serialize/deserialize) the object; you can't share such a Python object between processes. You can do that with multithreading, though (we do that in dask-geopandas)
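A minimal sketch of what happens at the process boundary: the Polygon is serialized with pickle (which is what multiprocessing does when sending objects to workers) and a separate copy is rebuilt on the other side.

```python
# Sketch: multiprocessing pickles objects crossing the process boundary;
# the worker receives a deserialized copy, not shared memory.
import pickle
from shapely.geometry import Polygon

poly = Polygon([(0, 0), (1, 0), (1, 1)])
data = pickle.dumps(poly)   # what happens when the object is sent
copy = pickle.loads(data)   # what the other process receives

assert copy.equals(poly)    # geometrically identical...
assert copy is not poly     # ...but a distinct copy, not a shared object
```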
Once you share the copies, would you then have to merge the final versions of all the polygons processed in parallel back together?
Following up on this one, can I use Dask-Geopandas or GeoPandas to do operations that are done in pygeos, such as polygon.intersect() or polygon.union_all()?
And if I use GeoPandas to check the element-wise intersection between a list of polygons and another geometry, is this faster than using a list comprehension to check those intersections?
You don't need Cython if you install pygeos from pip or conda, but you do need it if you want to install from source.
> Following up on this one, can I use Dask-Geopandas or Geopandas to do operations that are done in pygeos? such as polygon.intersect(), polygon.union_all()?
Yes, if you have pygeos installed, it is automatically used under the hood to do all of these. Note that unary_union is the geopandas name for union_all.
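A short sketch of the two operations through GeoPandas (my example, using made-up boxes); with pygeos installed these dispatch to its vectorized routines under the hood:

```python
# Sketch: element-wise intersection and union-all via GeoPandas,
# where unary_union plays the role of union_all.
import geopandas
from shapely.geometry import box

gs = geopandas.GeoSeries([box(0, 0, 2, 2), box(1, 1, 3, 3)])
clip = box(0, 0, 1.5, 1.5)

inter = gs.intersection(clip)   # element-wise intersection
merged = gs.unary_union         # union of all geometries in the series

assert inter.area.tolist() == [2.25, 0.25]
assert merged.area == 7.0       # 4 + 4 - 1 (the 1x1 overlap counted once)
```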
> And if I use GeoPandas to check the element-wise intersection between a list of polygons and another geometry, is this faster than using list comprehension to check those intersections?
Yes, much faster, as it uses the vectorized approach of pygeos.
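To illustrate the difference (a hypothetical example of mine): the list comprehension makes one Python-level call per geometry, while GeoSeries.intersects performs a single C-level loop via pygeos.

```python
# Sketch: Python-level loop vs. a single vectorized predicate call.
import geopandas
from shapely.geometry import box

polys = [box(i, 0, i + 1, 1) for i in range(5)]
other = box(2.5, 0, 6, 1)

slow = [p.intersects(other) for p in polys]          # one call per geometry
fast = geopandas.GeoSeries(polys).intersects(other)  # single vectorized call

assert slow == fast.tolist() == [False, False, True, True, True]
```

The results are identical; the speedup comes purely from moving the loop out of Python.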