Joe Amenta
@airbreather
Technically, DotSpatialAffineCoordinateSequence storage is a hybrid of SoA and AoS: one array stores the data for two dimensions (X and Y interleaved), and each of the other two arrays stores the data for just one dimension... I'm not really sure what to call it either.
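
For reference, a minimal sketch (not the actual NTS field layout) of that hybrid for a sequence of n coordinates:

int n = 100;
double[] xy = new double[2 * n]; // AoS part: [x0, y0, x1, y1, ..., x(n-1), y(n-1)]
double[] z = new double[n];      // SoA part: [z0, z1, ..., z(n-1)]
double[] m = new double[n];      // SoA part: [m0, m1, ..., m(n-1)]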

optional support to point to an offset in the source arrays

If that's important, then perhaps the underlying storage could be Memory<T> / ReadOnlyMemory<T>? These types are more generalized versions of ArraySegment<T> that could be backed by other forms of (contiguous) memory.
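
A tiny sketch of that idea (the names are purely illustrative): a Memory<double> can point at an offset within an existing array without copying, much like ArraySegment<double>, and it can also be backed by other contiguous memory.

using System;

double[] source = new double[100];
Memory<double> window = source.AsMemory(10, 20);  // points at offset 10, length 20, no copy
ReadOnlyMemory<double> readOnlyWindow = window;   // read-only view of the same data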

Joe Amenta
@airbreather

WRT DotSpatialAffineCoordinateSequence I have no objections to deprecating it and replacing it with a functionally equivalent sequence/factory.

I agree... perhaps the (preferred) layout of data in the sequence could be configurable at the CoordinateSequenceFactory level? Some use cases (like parts of the bulk-optimized methods in ProjNet) want to see x, y, and z as parallel arrays, whereas others (like FlatGeobuf) want to see it laid out like in today's DotSpatialAffineCoordinateSequence, with x and y together but z and m still separate... I'm not sure one way or the other whether these should be the same class.

Björn Harrtell
@bjornharrtell
@airbreather: I've looked into the use of Memory/Span and find it a bit difficult to determine what is best. If a class owns a simple array, it can be used with Span, which is supposedly more efficient than using Memory?
@airbreather: Agreed there is no obvious good name at all. I don't see how a flexible class can be written with regard to array structure and still be performant.
Joe Amenta
@airbreather
@bjornharrtell Memory<T> has a Span property that gives you the same Span<T> that the simple array would give you. It is, indeed, slightly more efficient to convert a T[] to Span<T> than to convert a Memory<T> to a Span<T>, but it's really not that bad.
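
For illustration, a trivial sketch (not NTS code):

using System;

double[] array = new double[16];
Memory<double> memory = array;         // implicit conversion, no copy
Span<double> fromArray = array;        // direct conversion, cheapest
Span<double> fromMemory = memory.Span; // same data, slightly more work per conversion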
Joe Amenta
@airbreather

I don't see how a flexible class can be written with regard to array structure and still be performant.

Consider that when it's used as an abstract CoordinateSequence, every individual access is a virtual call which can't be inlined... Given that, I doubt it would make a huge difference to have a (Memory<double> array, int ordinateCount)[] to store the underlying data and a (int sourceIndex, int offset)[] that tells us where to look in there for each dimension. So the implementation of GetOrdinate would look something like:

public override double GetOrdinate(int index, int ordinateIndex)
{
    // Find which underlying array holds this ordinate and where it sits within a coordinate.
    var (sourceIndex, offset) = _ordinateLookup[ordinateIndex];

    // stride is the number of ordinates packed together in that array (its ordinateCount).
    var (memory, stride) = _underlyingData[sourceIndex];
    return memory.Span[(index * stride) + offset];
}

My intuition is that the performance of this kind of routine will be in the same ballpark as the performance of existing CoordinateSequence implementations.

We could also expose the underlying data (similar to what PackedFooCoordinateSequence.GetRawCoordinates does) using something like:

public Memory<double> GetRawData(int ordinateIndex, out int stride)
{
    // Same lookup as GetOrdinate, but hand back the whole strided block for bulk use.
    var (sourceIndex, offset) = _ordinateLookup[ordinateIndex];
    Memory<double> result;
    (result, stride) = _underlyingData[sourceIndex];
    return result.Slice(offset);
}

This underlying data could be used for routines like the bulk methods of ProjNet's MathTransform / MapProjection classes.
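
For example, a hypothetical consumer of that GetRawData-style method ('sequence' and 'offsetX' are assumed names, not NTS API):

using System;

double offsetX = 100.0;                                      // some translation to apply to every X
Memory<double> xs = sequence.GetRawData(0, out int stride);  // 'sequence' is an instance of the speculative type above; ordinate 0 = X
Span<double> span = xs.Span;
for (int i = 0; i * stride < span.Length; i++)
{
    span[i * stride] += offsetX; // translate every X value in place, with no per-coordinate virtual calls
}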

This is all just speculation, though. I don't have measurements to support either the claim that the abstract version is going to be fast enough for people to use it, or the claim that the GetRawData-style method is going to enable use cases compelling enough to justify the extra work of contorting themselves to use it. For example, if each ordinate gets its own dedicated array, then you can take full advantage of SIMD for routines that support it... but routines that don't use SIMD would probably tend to perform worse than when X and Y are packed together, because the X and Y of a coordinate would pretty much never be in the same cache line.
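
To illustrate the SIMD point, a generic sketch using System.Numerics (not NTS code):

using System.Numerics;

double[] xs = new double[1000]; // a dedicated, densely packed array of X values
int width = Vector<double>.Count;
int i = 0;
for (; i + width <= xs.Length; i += width)
{
    // Process a whole vector's worth of X values at once.
    var v = new Vector<double>(xs, i) + new Vector<double>(100.0);
    v.CopyTo(xs, i);
}
for (; i < xs.Length; i++)
    xs[i] += 100.0; // scalar tail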

Björn Harrtell
@bjornharrtell
@airbreather: Agreed, there are considerations here that make the design-choice tradeoffs difficult. I did make NetTopologySuite/NetTopologySuite.IO.PostGis#18 to get some measurements on that case though, and it will be even better if I can solve the remaining bugs with the writing part (NetTopologySuite/NetTopologySuite.IO.PostGis#19). I've not worked much with Span/Memory yet, but they do seem to be nice and performant abstractions. But if the memory model is paired XY with the other dimensions separate (which is the memory model I want to target optimization for), then (Memory<double> array, int ordinateCount)[] is probably not much better than transposing the data for PackedCoordinateSequence? The point of my optimization is avoiding transposing, or even copying, the data in the first place.
Joe Amenta
@airbreather

if the memory model is paired XY with the other dimensions separate (which is the memory model I want to target optimization for), then (Memory<double> array, int ordinateCount)[] is probably not much better than transposing the data for PackedCoordinateSequence? The point of my optimization is avoiding transposing, or even copying, the data in the first place.

This model should support any layout that keeps the underlying values for the coordinates ordered sequentially.

If you have [x0, y0, x1, y1, ..., xn, yn, z0, z1, ..., zn, m0, m1, ..., mn], then this model would look like:

byte[] data = /* ... */;                                          // raw bytes in the layout described above
Memory<double> xy = ConvertFrom(data.AsMemory(0, xyLength));      // interleaved [x0, y0, x1, y1, ...]
Memory<double> z = ConvertFrom(data.AsMemory(xyLength, zLength));
Memory<double> m = ConvertFrom(data.AsMemory(xyLength + zLength, mLength));

(Memory<double> array, int ordinateCount)[] dataDefinitions =
{
    (xy, 2), // X and Y interleaved, so two ordinates per coordinate
    (z, 1),
    (m, 1),
};

(int sourceIndex, int offset)[] ordinateDefinitions =
{
    (0, 0), // X: source 0, offset 0
    (0, 1), // Y: source 0, offset 1
    (1, 0), // Z: source 1, offset 0
    (2, 0), // M: source 2, offset 0
};
return new SparklyCoordinateSequence(dataDefinitions, ordinateDefinitions);

ConvertFrom(Memory<byte>) would probably have to return something backed by a custom MemoryManager<T> implementation in order to make it copy-free, which could have minor performance implications when accessing the values one-by-one, but it shouldn't require any transposing to get into that format.
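
For concreteness, a rough sketch (an assumption on my part, not existing NTS code) of the kind of MemoryManager<T> a copy-free ConvertFrom could return:

using System;
using System.Buffers;
using System.Runtime.InteropServices;

sealed class CastingMemoryManager : MemoryManager<double>
{
    private readonly Memory<byte> _bytes;

    public CastingMemoryManager(Memory<byte> bytes) => _bytes = bytes;

    // Reinterpret the underlying bytes as doubles without copying.
    public override Span<double> GetSpan() => MemoryMarshal.Cast<byte, double>(_bytes.Span);

    public override MemoryHandle Pin(int elementIndex = 0)
        => _bytes.Slice(elementIndex * sizeof(double)).Pin();

    public override void Unpin() { }

    protected override void Dispose(bool disposing) { }
}

// Usage: ConvertFrom(data.AsMemory(0, xyLength)) could then return
// new CastingMemoryManager(data.AsMemory(0, xyLength)).Memory;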

Björn Harrtell
@bjornharrtell
Ah yes I see it now, clever. 🙂
And we can call it MemoryCoordinateSequence?
Joe Amenta
@airbreather
MemoryCoordinateSequence is fine, CustomizablePackedDoubleCoordinateSequence is longer but perhaps more descriptive
Joe Amenta
@airbreather
#482 opened
Björn Harrtell
@bjornharrtell
Sounds good. But we need a strategy for the common layouts; they need to be well known and named to allow for a more optimized path when they are used?
Or perhaps the logic can just introspect whether two instances are compatible and use the optimized path... but still, it would be good to have a well-known name for what is called DotSpatialAffine today... hmm, I can't come up with a fitting name.
Felix Obermaier
@FObermaier
I like RawCoordinateSequence. For DotSpatialAffine- and PackedCoordinateSequence we have a GetRawCoordinates() function.
CoordinateSequence alone is already a long descriptor; we should keep the rest short.
Björn Harrtell
@bjornharrtell
Agreed RawCoordinateSequence sounds good.
Björn Harrtell
@bjornharrtell
Is DefaultCoordinateSequenceFactory on NtsGeometryServices intentionally get-only? It's documented as "Gets or sets the coordinate sequence factory to use".
Felix Obermaier
@FObermaier
Yes, it is get-only, but you can set it using the constructor of NtsGeometryServices.
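For example, a sketch (I believe there is a constructor overload taking a CoordinateSequenceFactory, a PrecisionModel and an SRID; check the overloads available in your version):

using NetTopologySuite;
using NetTopologySuite.Geometries;
using NetTopologySuite.Geometries.Implementation;

var services = new NtsGeometryServices(
    PackedCoordinateSequenceFactory.DoubleFactory,   // becomes DefaultCoordinateSequenceFactory
    new PrecisionModel(PrecisionModels.Floating),
    4326);                                           // SRID
var factory = services.CreateGeometryFactory();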
Amirmasoud Ramezani
@Itsamirmasoud
Hey guys, I am trying to create a shapefile in memory using an overload of ShapefileDataWriter which receives an instance of ShapefileStreamProviderRegistry. When I try to write the features, I get an exception saying "Memory stream is not expandable." Now, since none of the streams used in the code implements IDisposable, I can't use them in a using statement. I was wondering if anyone has encountered this issue before.
PaulContributor
@PaulContributor
Hello guys, I have a problem with .NET Core Entity Framework (3.1) & NetTopologySuite.
It's a deserialization problem: "parameterless constructor in Geometry class". I am already using NetTopologySuite.IO.GeoJSON.
Björn Harrtell
@bjornharrtell
I note that WKTReader(GeometryFactory factory) was deprecated in NTS 2.2, but it also seems its behaviour changed so that it no longer respects the supplied factory. Was this intentional?
Joe Amenta
@airbreather

it no longer respects the supplied factory

It never really did before. v1.15.3, v2.0.0, and v2.1.0 all decomposed it into the component pieces that it actually used in order to create a new factory on the fly, which it did every time. So the misleading constructor was marked obsolete.

I probably should have considered a change to make it start using the provided factory for #502, come to think of it.
Björn Harrtell
@bjornharrtell
@airbreather: Hmm, but I see completely different behaviour with 2.1 vs 2.2. I have now stopped using it, so it's not a big deal for me, but it had me confused for a while.
Martin
@mayermart
Hi, is there any NTS.IO package for reading directly from WFS 2.0 services, or can someone recommend a third-party tool?
Björn Harrtell
@bjornharrtell
When upgrading from NTS 2.1 to 2.2 it seems to interpolate Z; for example, intersecting LINESTRING (1 1 1, 3 3 3) with LINESTRING (1 3 3, 3 1 1) gives POINT Z(2 2 2) in NTS 2.2 but POINT (2 2) in NTS 2.1. I wonder if this is an intentional change?
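A minimal repro sketch of that observation (building the 3D line strings directly with CoordinateZ rather than via WKT, since Z parsing from WKT depends on reader settings):

using System;
using NetTopologySuite;
using NetTopologySuite.Geometries;

var gf = NtsGeometryServices.Instance.CreateGeometryFactory();
var a = gf.CreateLineString(new Coordinate[] { new CoordinateZ(1, 1, 1), new CoordinateZ(3, 3, 3) });
var b = gf.CreateLineString(new Coordinate[] { new CoordinateZ(1, 3, 3), new CoordinateZ(3, 1, 1) });

// Per the report above: NTS 2.1 prints POINT (2 2), NTS 2.2 prints POINT Z(2 2 2).
Console.WriteLine(a.Intersection(b));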
Björn Harrtell
@bjornharrtell
I have been able to confirm that the change originates from the JTS 1.17.1 to 1.18.0 update (without use of OverlayNG).
petterssonjohan
@petterssonjohan
How do I make a PATCH request work when it contains a replace for my Geometry object?
Microsoft.AspNetCore.JsonPatch.Exceptions.JsonPatchException: The value '{ "type": "Point", "coordinates": [ [ 20.30714, 63.843494 ] ] }' is invalid for target location. (dotnet)
I take a JsonPatchDocument<MyEntityModel> as a parameter in the patch method; on that model I call inParameter.ApplyTo(current); but with a replace for the Geometry object it throws this error. It works with other fields.
noamgat
@noamgat
Hi. Is there a recommended plotting library that works with JTS?
I would like to plot the polygons I calculated so I can visually judge their correctness.
ArnauGomar
@ArnauGomar
Hello everybody. I am working with this wonderful library for my final degree project and I am running into an error that I do not know how to solve. When executing multiple intersections, straight lines appear in places where they should not. Does anyone have any idea what I may be doing wrong? Thank you!
Felix Obermaier
@FObermaier
@ArnauGomar I don't know what you mean, could you provide an image, some data or code at https://github.com/NetTopologySuite/NetTopologySuite/discussions?
Thanks.
Ben
@abennouna12_twitter

Hello,

I'm contacting you to understand the tolerance parameter of the EqualsExact method: https://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.Geometry.html#

Let's assume we have a polygon with these coordinates (you can import them here to view the polygon: https://www.keene.edu/campus/maps/tool/)
2.5510168075561523,49.181485983121604
2.5504803657531734,49.17603653042171
2.552690505981445,49.17610666875947
2.553248405456543,49.1813807870118
2.5510168075561523,49.181485983121604

In the EqualsExact method there's a tolerance parameter. Could you please explain what the unit of this tolerance is?

For our case, let's set the tolerance to 0.001. Does that mean that the previous polygon will be equal to this:
2.5520168075561523,49.182485983121604
2.5514803657531734,49.17703653042171
2.553690505981445,49.17710666875947
2.554248405456543,49.1823807870118
2.5520168075561523,49.182485983121604

or this:
2.5500168075561523,49.180485983121604
2.5494803657531734,49.17503653042171
2.551690505981445,49.17510666875947
2.552248405456543,49.1803807870118
2.5500168075561523,49.180485983121604

Thanks for your help

Felix Obermaier
@FObermaier
@abennouna12_twitter NTS treats every coordinate as if it were planar. The tolerance has to be in the units of the coordinates.
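A small sketch of what that means in practice (hypothetical points, not the polygon above; the tolerance is in degrees here only because the coordinates are):

using System;
using NetTopologySuite;
using NetTopologySuite.Geometries;

var gf = NtsGeometryServices.Instance.CreateGeometryFactory();
var p1 = gf.CreatePoint(new Coordinate(2.5510168, 49.1814860));
var p2 = gf.CreatePoint(new Coordinate(2.5510170, 49.1814860)); // x shifted by 0.0000002
var p3 = gf.CreatePoint(new Coordinate(2.5530168, 49.1814860)); // x shifted by 0.002

Console.WriteLine(p1.EqualsExact(p2, 0.001)); // True: the offset is well within the tolerance
Console.WriteLine(p1.EqualsExact(p3, 0.001)); // False: the 0.002 offset exceeds the tolerance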
Davide Giannuzzi
@dvdgnz
Hi all, I am running into issues while deserializing a List<Coordinate[]> using GeoJSON and the CoordinateConverter. Does anybody have any tips or things I can try? It seems to be tripping up while parsing the JSON.
I have opened an issue here with more details:
Joe Amenta
@airbreather
I'm assuming there's no easy, built-in way to check to see if a set of polygons forms a coverage (i.e., that it's valid for use with CoverageUnion)?
The intention is to run a check (offline, fine if it's slow) on a set of polygons to see if the application will later be allowed to run CoverageUnion on any arbitrary subset of those polygons.
Current plan is to just check every pairwise combination using a spatial index to filter out pairs that aren't even close to one another, but if there's something in-the-box that I'm forgetting about (or that I never knew about), that would be just wonderful.
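A rough sketch of that pairwise plan (my own assumption about its shape, not code from the chat): an STRtree filters out pairs whose envelopes don't intersect, then any pair with overlapping interiors is flagged. Overlapping interiors break the coverage property, though a full coverage check would also have to verify that shared boundaries use identical vertices.

using System.Collections.Generic;
using NetTopologySuite.Geometries;
using NetTopologySuite.Index.Strtree;

static IEnumerable<(Polygon, Polygon)> FindCoverageViolations(IReadOnlyList<Polygon> polygons)
{
    var index = new STRtree<int>();
    for (int i = 0; i < polygons.Count; i++)
        index.Insert(polygons[i].EnvelopeInternal, i);

    for (int i = 0; i < polygons.Count; i++)
    {
        foreach (int j in index.Query(polygons[i].EnvelopeInternal))
        {
            if (j <= i)
                continue; // visit each unordered pair only once

            // Overlaps() is true when the interiors intersect without containment;
            // containment is equally invalid for a coverage, so check it too.
            if (polygons[i].Overlaps(polygons[j]) ||
                polygons[i].Contains(polygons[j]) ||
                polygons[j].Contains(polygons[i]))
            {
                yield return (polygons[i], polygons[j]);
            }
        }
    }
}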
Björn Harrtell
@bjornharrtell
There is a note on NetTopologySuite.IO.ShapeFile that it is obsoleted by NetTopologySuite.IO.Esri, but I cannot find any released version of NetTopologySuite.IO.Esri or a NuGet package. Is it simply not done yet, or am I looking in the wrong place?
Diego Guidi
@DGuidi
"simply not done yet"
Björn Harrtell
@bjornharrtell
Tried to use NetTopologySuite.IO.ShapeFile but it was no fun at all. Ended up copying the source of NetTopologySuite.IO.Esri and using that, it "just works". :)
Another thing entirely: I've long been unsuccessful at getting locationtech/jts#715 accepted into upstream JTS. But in the end my use case is in NTS. Could it be a feature (opting out of the Z-interpolation that was added in JTS 1.18) that is acceptable to NTS even if not added to upstream JTS? I would find that sad myself, but since I may have to give up on upstreaming it, I guess it doesn't hurt to ask.
Diego Guidi
@DGuidi
My 2 cents: the NTS policy is to stay as close as possible to the JTS code, but as long as this improvement is turned off by default and activated using some config flag or similar, I think it can be done. Something similar is already done with the overlay strategy (which is part of JTS, actually).
Felix Obermaier
@FObermaier

NetTopologySuite.IO.Esri and using that, it "just works". :)

We should do a release then.

Felix Obermaier
@FObermaier
I'd be open to accepting a PR that enables users of NTS to opt out of the use of an ElevationModel when performing buffer operations. I think that ElevationModel should be an IElevationModel.