    Toni
    @t-book

    @smcintosh-icf you're using docker, right? I've never wired the container up with an external db. In any case, since django probably has the db service set as a dependency in docker-compose, you would have to start it. I would suggest starting with small steps: first test that the connection and the docker network you use work fine. I'm not sure whether the django container has the postgres client installed; if not, you could test with something like:

    docker exec -it django4your_geonode bash
    apt-get update
    apt-get install -y postgresql-client
    psql --host=postgres.cb9sgrxx87k8.us-east-1.rds.amazonaws.com --port=5432 --username=geonode -c "SELECT 'Connection works';"

    After that, continue to update your .env file

    Scott McIntosh
    @smcintosh-icf
    @t-book yes, I'm using docker. I can test the connection easily. Mostly I was unclear on the URL format. I think I have the format correct (user:pwd@host etc.), although I did wonder why the password is supplied in the url if it's set in the password properties.
    Toni
    @t-book
    ahm, a password in the connection string :/ the only place I know where this happens is with basic auth. But maybe it is expected. Do you get anything meaningful in django's startup log,
    like 'connection not successful'...?
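    A password in the connection string is normal for URL-style database settings, which is how geonode-project's .env typically carries the credentials. A sketch of what those entries can look like (hostnames, database names, and credentials below are placeholders, not the actual values from this conversation):

```shell
# .env sketch -- placeholder credentials and hostnames, substitute your own;
# for an external RDS instance, "db:5432" becomes the RDS endpoint and port
DATABASE_URL=postgis://geonode:geonode@db:5432/geonode
GEODATABASE_URL=postgis://geonode_data:geonode_data@db:5432/geonode_data
```

    Check the key names against your own .env; they can differ between geonode-project versions.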
    Scott McIntosh
    @smcintosh-icf
    The connection worked fine; the postgres client was already there.
    Nothing in the log that looked like a connection attempt.
    Scott McIntosh
    @smcintosh-icf
    Let me look
    Scott McIntosh
    @smcintosh-icf
    Oh, yeah I did look at that.
    Toni
    @t-book
    the invoke tasks come from tasks.py. You should see the fixtures and migrations run in your startup logs;
    what is currently being done should be printed, like: https://github.com/GeoNode/geonode-project/blob/master/tasks.py#L194
    Scott McIntosh
    @smcintosh-icf
    Thanks. I'll look at logs some more
    Toni
    @t-book
    maybe only follow the django logs: docker-compose logs -f django
    Scott McIntosh
    @smcintosh-icf
    ok
    Toni
    @t-book
    if you're stuck you could even change tasks.py to gather more information about your connection. it's all python. good night ;))
    Scott McIntosh
    @smcintosh-icf
    @t-book Got it working. invoke.log was helpful.
    The first issue was that the Postgres client was in the container, but not on the host (which performs the migration).
    The second issue was that it always tried to use nysdot_geonode and nysdot_geonode_data even though I had geonode and geonode_data in the .env properties. So I gave up, renamed my databases to nysdot_geonode and nysdot_geonode_data, and it came up.
    Thought you'd like to know :)
    Toni
    @t-book
    Congrats! Great to hear!
    Scott McIntosh
    @smcintosh-icf
    Thanks. Now to get the external data moved to a mounted filesystem...
    Tek Bahadur Kshetri
    @iamtekson
    I am planning to add around 40 thousand rasters into geonode for data visualization, each raster with an average size of 300 MB. The rasters differ by time period, crop type, and Reflected Ceiling Plan. I am afraid everything will crash due to this huge amount of data. @afabiani @t-book could you please suggest how I can store these rasters so that GeoNode won't crash?
    I already asked a question on GIS StackExchange (https://gis.stackexchange.com/questions/363579/big-data-with-geoserver) but got no reply there, so I am asking here. Please suggest a better idea.
    Toni
    @t-book

    @iamtekson It is strongly advised to preprocess the raster files! First test JPEG compression and the YCbCr color model on a subset of your data. Keep in mind that you might have to handle 1- and 2-band rasters next to images with an alpha channel as well. This should help you get started: https://geoserver.geo-solutions.it/edu/en/raster_data/processing.html

    Besides lowering your image sizes and, e.g., block sizes, you should think about future data migration, backups, etc. Last but not least, do not forget: sync_permissions (geonode <> geoserver) is a slow process; 2,300 layers took around 4 hours for me yesterday. So be prepared, 40k will take some time!

    (in case you import them in geoserver first and sync afterwards ...)
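    The preprocessing in the linked tutorial boils down to gdal_translate creation options. A minimal sketch of that step, with the options collected in a variable first (block size, JPEG quality, and file names are placeholders to tune on a test subset; JPEG/YCbCr only applies to 3-band imagery):

```shell
# Creation options for inner-tiled, JPEG/YCbCr-compressed GeoTIFFs
# (placeholder values -- tune block size and quality on a subset first)
GDAL_OPTS="-co COMPRESS=JPEG -co PHOTOMETRIC=YCBCR -co TILED=YES -co BLOCKXSIZE=512 -co BLOCKYSIZE=512 -co JPEG_QUALITY=85"
echo "gdal_translate $GDAL_OPTS input.tif output.tif"
# Overviews let GeoServer answer zoomed-out requests without reading all tiles:
echo "gdaladdo -r average output.tif 2 4 8 16"
```

    Drop the echo wrappers once the printed commands look right for your data.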
    Tek Bahadur Kshetri
    @iamtekson
    Thank you for the suggestion. My rasters are GeoTIFF files; some of them are continuous rasters and some are classified rasters. I don't care how much time the first-time setup takes. What if I uploaded these rasters to GeoServer using the GeoServer REST API? And after that, how can I list the layers in GeoNode?
    Tek Bahadur Kshetri
    @iamtekson
    My main concern is whether GeoServer will crash due to the big data or not. Does it support such huge files?
    Alessio Fabiani
    @afabiani
    @smcintosh-icf if you could find some time/resources to improve the docs with your findings, that would be great
    Toni
    @t-book
    @iamtekson As said, I would suggest testing on a subset and seeing how things work with your server performance. In my opinion, the mentioned preprocessing is essential. For example, 346 GB of GeoTIFFs shrank to 46 GB in a recent job.
    Regarding performance, the first thing that comes to my mind is scaling, but since geoserver afaik does not scale horizontally, your only options are vertical scaling, optimization of data, and configuration. (A bit older, https://de.slideshare.net/geosolutions/geoserver-on-steroids-foss4g-2015, but still an interesting read.) By conscientious testing you might find bottlenecks before spending days or weeks trying to handle 40k raster files. In case you add your tiffs directly to geoserver (which I would do as well) you can use https://docs.geonode.org/en/master/admin/mgmt_commands/index.html?highlight=updatelayers#management-command-updatelayers
    Last but not least, I do remember a geonode instance with ~80k layers that had getCapabilities disabled so as not to stress the server with something it cannot handle. Not much, but hope this helps.
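    The updatelayers route might look roughly like this (the flag name is taken from the linked management-commands docs; verify it against your GeoNode version before relying on it):

```shell
# Sketch: register layers already published in GeoServer with GeoNode.
# --skip-geonode-registered (per the linked docs) leaves known layers alone.
CMD="python manage.py updatelayers --skip-geonode-registered"
echo "docker-compose exec django $CMD"
```

    Run the printed command from the directory that holds your docker-compose.yml.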
    Scott McIntosh
    @smcintosh-icf
    @afabiani As soon as I get everything up then I can do that. I'm making good notes as I go. But right now I've just started testing and discovered I can't upload a layer, so I need to sort that.
    Have you come across this error?
    Client Error: Bad Request
    Unexpected exception 'NoneType' object has no attribute 'store_type'
    Was simply uploading a small GeoJSON file. Upload went all the way, then the error.
    Toni
    @t-book
    @smcintosh-icf does your geoserver have access to your geonode_data db as well?
    Tek Bahadur Kshetri
    @iamtekson
    Wonderful ❣️😊. Thank you very much for your kind suggestions and the documentation links. The geonode instance with ~80k layers you mentioned motivated me a lot to visualize these rasters with geonode. I will try to follow your suggestions, and if any problem occurs I will let you know. Thank you very much for the quick response.😊😊
    Scott McIntosh
    @smcintosh-icf
    @toni I didn't think to check. I'll look tomorrow and let you know.
    Meant for @t-book ...
    adonis-albelda
    @adonis-albelda

    Hi all, I tried to run docker-compose but it shows this error:

    Unable to locate package postgresql-client #1537

    How do I solve this one?
    adonis-albelda
    @adonis-albelda
    @t-book ?
    Tek Bahadur Kshetri
    @iamtekson
    I just checked my rasters. They have only one band and type=Int16, so I am unable to compress these rasters with the JPEG YCbCr color model (the YCbCr model requires a source raster with exactly 3 bands, and BitsPerSample 16 is not allowed for JPEG). @t-book can you suggest another good compression technique that is compatible with GeoServer?
    Toni
    @t-book
    @iamtekson you could use DEFLATE to save disk space. Still, you should check BLOCKSIZE etc. See: https://geoserver.geo-solutions.it/edu/en/raster_data/advanced_gdal/example1.html
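    For single-band Int16 data, a lossless sketch along the lines of the linked example (file names, ZLEVEL, and block size are placeholders; PREDICTOR=2 is GDAL's horizontal-differencing predictor, which often helps integer rasters compress better):

```shell
# Lossless DEFLATE + predictor + inner tiling for a 1-band Int16 GeoTIFF
# (placeholder values -- benchmark ZLEVEL and block size on a subset first)
GDAL_OPTS="-co COMPRESS=DEFLATE -co PREDICTOR=2 -co ZLEVEL=6 -co TILED=YES -co BLOCKXSIZE=512 -co BLOCKYSIZE=512"
echo "gdal_translate $GDAL_OPTS classified_int16.tif classified_int16_deflate.tif"
```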
    Toni
    @t-book
    @adonis-albelda docker-compose logs -f gives you the stdout and stderr of all services. Can you tell me which container is throwing the error?
    adonis-albelda
    @adonis-albelda
    @t-book, I fixed it by downgrading from postgres client 11 to postgres client 9.6, since pg 11 has no package for arm64
    Toni
    @t-book
    Is this master branch?
    adonis-albelda
    @adonis-albelda
    yes
    Toni
    @t-book
    can you please open a ticket on github describing the steps to reproduce this? I just did an install this week which worked.
    (but we should make sure others can at least find your solution)
    adonis-albelda
    @adonis-albelda
    Okay, noted. I'll double-check it first; it seems a new error appeared next to my current issue (pg client).
    Toni
    @t-book
    see invoke.log, which might tell you more
    adonis-albelda
    @adonis-albelda
    but there's no invoke.log; on my previous install there was one.
    Toni
    @t-book
    @adonis-albelda maybe you forgot to build the container again (--build)? Also make sure that you have IS_FIRST_START=True
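    The rebuild step might look like this (a sketch assuming the stock geonode-project compose layout, with .env in the project root):

```shell
# Sketch: verify the first-start flag before rebuilding (paths assumed);
# grep -qs stays quiet whether or not .env exists
MSG=$(grep -qs '^IS_FIRST_START=True' .env && echo "ok" || echo "set IS_FIRST_START=True in .env")
echo "$MSG -- then run: docker-compose up -d --build"
```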
    Tek Bahadur Kshetri
    @iamtekson
    Ok thank you @t-book.
    Another small issue: I installed geonode core on ubuntu 18.04. Everything is working fine, but I am unable to preview the layers from geoserver. The geoserver.log file shows the following error,