Jerome Chatillon
@JChatillon

@abellgithub : Thanks, indeed the doc needs an update then.
When adding the where option to filters.cluster ("where": "Classification == 2"), I found something strange:

  • using the pdal command-line application: works fine
  • using python-pdal: pdal.execute() throws an error.

I have to investigate this a little further.
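
For reference, this is roughly what I'm running (a minimal sketch; the file names are placeholders, and it assumes a PDAL build new enough to support the per-stage "where" option):

import json
import pdal

# The "where" expression restricts clustering to ground points (Classification == 2).
spec = json.dumps({"pipeline": [
    "input.las",
    {
        "type": "filters.cluster",
        "where": "Classification == 2"
    },
    "clustered.las"
]})
pipeline = pdal.Pipeline(spec)
pipeline.execute()  # this is the call that raises the error for me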

Guilhem Villemin
@gui2dev
@mwilcoxnz setting OSR_STRIP_TOWGS84=YES should be the right way to do it.
Woutervb1
@Woutervb1
Thanks. That works. Much appreciated.
Jean-Roc Morreale
@Jean-Roc
Hello everyone, is there a way to apply a formula to dimensions, such as "X" + 10, in a pipeline without filters.python and numpy?
Jean-Roc Morreale
@Jean-Roc
and of course I found out about filters.assign by scrolling up to @JChatillon just after asking :) Jerome, we are also trying to evaluate the use of PDAL outside of aerial data. Do you have any public repository?
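For anyone who finds this later, a minimal sketch of the filters.assign approach (assuming a PDAL version whose filters.assign supports the expression-style "value" option; the file names are placeholders):

import json
import pdal

# Shift every X coordinate by 10 without filters.python or numpy.
spec = json.dumps({"pipeline": [
    "input.las",
    {
        "type": "filters.assign",
        "value": "X = X + 10"
    },
    "shifted.las"
]})
pdal.Pipeline(spec).execute()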
Alex Knoll
@arknoll
Quick question: what database does PDAL use when looking up EPSG codes?
Howard Butler
@hobu
@arknoll it uses whatever PROJ is using. Is there a specific issue?
Alex Knoll
@arknoll

@hobu There appears to be an issue with an EPSG code that is listed in meters in the database but should actually be in US survey feet: pyproj4/pyproj#160

My plan of action was to create a custom EPSG code in proj/other.extras.

But PDAL doesn't seem to be recognizing the new EPSG code. I'm getting:
(Error) GDAL failure (1) PROJ: proj_create_compound_crs: components of the compound CRS do not belong to one of the allowed combinations of http://docs.opengeospatial.org/as/18-005r4/18-005r4.html#34
[
    {
        "type": "readers.las",
        "filename": "/home/ubuntu/LiDARProcessingData/890/0/split/split_5.las"
    },
    {
        "type": "filters.reprojection",
        "in_srs": "epsg:32615",
        "out_srs": "epsg:8698+6360"
    },
    {
        "type": "writers.las",
        "filename": "/home/ubuntu/LiDARProcessingData/890/0/processed/mergedInt/split_5.las",
        "offset_x": "auto",
        "offset_y": "auto",
        "offset_z": "auto",
        "scale_x": "auto",
        "scale_y": "auto",
        "scale_z": "auto"
    }
]
8698 is the custom EPSG code I added to other.extras.
Howard Butler
@hobu
You might need to use this new unreleased PROJ capability to get what you want: OSGeo/PROJ#2577
If the EPSG db entry is bad, you should submit a fix to EPSG and they will include it as part of the next release
Alex Knoll
@arknoll
@hobu thank you for that!
Another possibility I have is using a WKT or PROJ string instead of the EPSG code. However, it is not clear how to use a PROJ or WKT string when specifying both the horizontal and vertical projection; see above where I have epsg:8698+6360.
Alex Knoll
@arknoll
@hobu I was also trying to follow this to use a custom projection: OSGeo/gdal#3136
Stuart Attenborrow
@stuarta0
Is there a way to create an explicit dimension using the implicit point id from an input file?
Andrew Bell
@abellgithub
@stuarta0 : EPT datasets have this value, but otherwise no, though it would be trivial to create a filter that provides this functionality. One potential issue to be aware of is that a pipeline may have many input sources which could complicate logic during processing, depending on your situation.
Stuart Attenborrow
@stuarta0
No worries. I've added a python stage for now with a numpy.arange. As a simple filter it would work well; you could always ferry the input source's point id using tags if you later used a merge or something.
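Roughly like this (a sketch; "PointId" is just a name I picked, the file names are placeholders, and it assumes a PDAL build with the filters.python plugin and its add_dimension option):

import json
import pdal

# Inline Python stage that writes a sequential id into a new dimension.
source = """
import numpy as np

def add_point_id(ins, outs):
    # One id per point, in file order. Dimensions created via add_dimension
    # default to double, so match that dtype.
    outs['PointId'] = np.arange(len(ins['X']), dtype=np.float64)
    return True
"""

spec = json.dumps({"pipeline": [
    "input.las",
    {
        "type": "filters.python",
        "source": source,
        "function": "add_point_id",
        "module": "anything",
        "add_dimension": "PointId"
    },
    {
        "type": "writers.las",
        "filename": "with_ids.las",
        "extra_dims": "all"  # carry PointId through as an extra-bytes dimension
    }
]})
pdal.Pipeline(spec).execute()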
James Lovell
@jlovelll4

Question regarding pdal translate. Whenever I forward or explicitly output a point data record format of 6, all of the points get counted as legacy point records. How do I prevent that from happening?

I am using version 1.9.1, although it was doing the same with the latest release.

Andrew Bell
@abellgithub
@jlovelll4 There is no support for this. I don't know a use case for it, though you apparently have one :)
The value of the field will be 0 if there are more points in the output than would fit in a LAS 1.3 file, as per the LAS 1.4 specification.
James Lovell
@jlovelll4

@abellgithub I see. I would use PDAL (a 1.4-to-1.4 translate) to clean up header info and make all files uniform prior to shipping to a client.

So the software is set up to choose the ASPRS LAS spec for you based on the number of points, essentially?

Andrew Bell
@abellgithub

@jlovelll4 : No. PDAL always writes the LAS version based on the minor_version option. The default is 2, so you'll get version 1.2 output unless you request otherwise.

This really has nothing to do with the legacy point count. The legacy point count is always the same as the actual point count unless the output version is 1.4 and the number of points is greater than 2^32 - 1.
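
For example, to request 1.4 output with point format 6 (a minimal sketch; the file names are placeholders):

import json
import pdal

# minor_version selects the LAS version; dataformat_id selects PDRF 6.
spec = json.dumps({"pipeline": [
    "input.las",
    {
        "type": "writers.las",
        "filename": "output.las",
        "minor_version": 4,
        "dataformat_id": 6
    }
]})
pdal.Pipeline(spec).execute()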

James Lovell
@jlovelll4
@abellgithub the problem with that is the newest 1.4 spec indicates that the legacy point count should be 0 when PDRF 6-10 is used.
Andrew Bell
@abellgithub

@jlovelll4 : You'll have to take this up with the LAS committee. IMO they shouldn't change the specification once released, yet they persist in doing this. There is no way for anyone to know if a file does or doesn't conform with a specification that changes.

I'm not sure why any software that reads LAS would care about the legacy point count field if it supports reading version 1.4, but perhaps some do.

James Lovell
@jlovelll4

@abellgithub I hear ya... but it does make a difference when a client will not accept LAS files because they are out of spec. I guess I could use lasinfo to correct the legacy points after PDAL processing. Unless I am missing something, pdal info doesn't read out legacy point counts. The client caught it using lasinfo, because it warns you that you have legacy points with 1.4 and a PDRF of 6-10.

So if you are saying this isn't a capability of PDAL, then it should be.

Patrick Sapinski
@drkow_gitlab
Some more sanity checking in general would be useful; we've run into a few LAS files that cause entwine and pdal info to completely freeze up (corruption, I'm assuming). I've spent the last few days randomly corrupting a LAS file to find the best way to weed these out. I'm curious whether y'all ran into any issues with bad lidar data when processing the datasets at usgs.entwine.io.
Andrew Bell
@abellgithub
@drkow_gitlab : If you have files that cause a problem with PDAL, you should open a ticket. We certainly can't intuit problems you're having.
Patrick Sapinski
@drkow_gitlab
Sure, I can upload the problem files I corrupted.
Andrew Bell
@abellgithub

@jlovelll4 : You should tell your client that the file conforms to LAS 1.4 R13. You can set bytes 107-130 to zero if you want the file to conform to the newer language.

There is no such thing as a "legacy point". Points are points.

The lasinfo warning is only valid if a file is LAS 1.4 R14 or R15. If it's R1 through R13, it's incorrect. Software can't know what is correct without more information that's not part of the file. The LAS standard is bad because it changes. ASPRS also doesn't qualify/validate software that produces LAS files.

James Lovell
@jlovelll4
@abellgithub Oh they know, it's USGS and they love those standards! Anyway, how would I go about setting bytes 107-130 to zero?
Guilhem Villemin
@gui2dev
@jlovelll4, something like this
https://stackoverflow.com/a/34524796
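i.e., for the LAS header, something like this (a sketch, untested; back up the file first):

# Zero the legacy point count (bytes 107-110) and the legacy
# number-of-points-by-return (bytes 111-130) in the LAS 1.4 header, in place.
with open("output.las", "r+b") as f:
    f.seek(107)
    f.write(b"\x00" * 24)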
James Lovell
@jlovelll4
@gui2dev that worked! Thank you so much!!!
James Lovell
@jlovelll4

Is there a way to create a custom vertical coordinate system? Prior to GDAL 3, vertcs.override.csv made it pretty easy. Specifically, I need a revised version of NAVD88, like this:

VERT_CS["NAVD88 height - Geoid12B (Meters)"

Any help would be appreciated

Howard Butler
@hobu
that's an invalid VERT_CS entity name, btw. USGS's specification is wrong here.
James Lovell
@jlovelll4
@hobu Yes indeed!! But it's a requirement with the 3DEP contract.
Preston Hartzell
@pjhartzell

If I crop a BPF file I lose the UTM projection info unless I set the "coord_id" option to "auto" on the BPF writer. So I lose the projection info for a pipe that looks like this:

[
    "in.bpf",
    {
        "type": "filters.crop",
        "bounds": "([626278.47, 627866.28], [4322579.16, 4324152.09])"
    },
    "out.bpf"
]

It seems clunky to have to explicitly create a BPF writer stage and set the "coord_id" option to "auto" in order to retain the projection info. Curious as to the reason "auto" is not the default when allowing PDAL to infer the writer from the filename extension?
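
For the record, the workaround looks like this (a sketch of my pipe above with the writer made explicit):

import json
import pdal

# An explicit BPF writer stage so coord_id can be set; "auto" carries the
# input's UTM zone through to the output.
spec = json.dumps({"pipeline": [
    "in.bpf",
    {
        "type": "filters.crop",
        "bounds": "([626278.47, 627866.28], [4322579.16, 4324152.09])"
    },
    {
        "type": "writers.bpf",
        "filename": "out.bpf",
        "coord_id": "auto"
    }
]})
pdal.Pipeline(spec).execute()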

Andrew Bell
@abellgithub
@pjhartzell : The crop filter should not change or eliminate a projection; you should see the same behavior without the crop filter. BPF doesn't support standard WKT, and it's not always clear whether a CRS should be coerced into a particular output CRS in BPF. "Auto" isn't the default because it didn't exist in early versions, so making it the default would be a potentially unexpected behavior change. You can lobby @hobu if you would like the default behavior modified.
Preston Hartzell
@pjhartzell
@abellgithub Makes sense. Point taken on it having nothing to do with any intermediate filters. I see the same behavior with a simple pdal translate to a different filename. I don't touch BPF files very much, so I'll leave it alone. Thanks.
ap
@apiszcz
Compiling PDAL with vcpkg, I am not getting all the drivers; density, icp, etc. are missing. Is there a command-line argument or other flag to enable building all the plugins on Windows? Thanks.
EvertEt
@EvertEt

Apologies if this is the wrong place to ask conda-related questions, but an issue on the feedstock felt worse.

Is there a way to install PDAL via conda with GDAL 3.2.x? I see the gdal32 PR is merged, but I'm not sure if this is related. Does there need to be a new PDAL release for this, or just a new conda release? Do they have a release schedule? Many questions and many thanks!

EvertEt
@EvertEt
I think I checked the wrong things; GDAL 3.2 is included in some builds, and it seems every master commit on the feedstock is pushed to a new build and is immediately available.
Philipp Glira
@pglira
Hi, can anybody tell me how pdal can be used to filter points based on an attribute threshold? E.g. all points with Planarity > 0.8. Thanks!
Philipp Glira
@pglira
Ok... found it! Obviously this can be done with filters.range (not easy to find).
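In case it helps the next person, a minimal sketch (the file names are placeholders, and it assumes a Planarity dimension already exists, e.g. from a covariance-features stage):

import json
import pdal

# Keep only points with Planarity >= 0.8; the '[0.8:]' range is inclusive,
# so bump the lower bound slightly if a strict > matters.
spec = json.dumps({"pipeline": [
    "input.las",
    {
        "type": "filters.range",
        "limits": "Planarity[0.8:]"
    },
    "planar.las"
]})
pdal.Pipeline(spec).execute()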
Alex
@wildintellect
I'm looking for a pipeline JSON example showing how to assign the type to a new dimension. In particular, I'm converting HDF5 to LAS, and I was using "extra_dims":"all", but it wasn't clear what type would get used if the reader doesn't specify what it should be read as.
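Something along these lines is what I'm experimenting with (a sketch; "Reflectance", the dataset path, and the file names are all made up, and it assumes the HDF reader plugin is available):

import json
import pdal

# Name each extra dimension with an explicit type on the writer instead of
# relying on extra_dims="all" to guess.
spec = json.dumps({"pipeline": [
    {
        "type": "readers.hdf",
        "filename": "input.h5",
        "dimensions": {"Reflectance": "/hypothetical/dataset/path"}
    },
    {
        "type": "writers.las",
        "filename": "output.las",
        "extra_dims": "Reflectance=float32"
    }
]})
pdal.Pipeline(spec).execute()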
Andrew Bell
@abellgithub
@wildintellect : What do you mean by "type"? There is commonly a dimension known as "classification". Is that what you mean?
ddarosa
@ddarosa
Hi! I have just started to learn about PDAL and EPT. It's amazing!
Does anyone know if there is one place where I could find which countries or organizations are serving data using EPT? Like a list of EPT sources.