    Erik van Sebille
    @erikvansebille
    Philippe Delandmeter
    @delandmeterp
    Hi everyone,
    To implement a new fancy feature into Parcels, we’d need a better understanding of ast module of Python: https://docs.python.org/3/library/ast.html
    Anyone familiar with ast?
    phand
    @HandmannP
    Dear everybody, I just came across a very weird thing. I am reading in the NEMO output fields with parcels 2.0.0beta. When I check beforehand, all fields contain the SPNA completely, as shown in the first attached plot; but as soon as I have read the fields with FieldSet.from_nemo, the region is cut down to plot nr. 2. What happens here?
    T_mask.png
    pset_with_field.png
    I need the full field?!
    Erik van Sebille
    @erikvansebille
    Hmm... This is probably an error in the plotting routine, rather than in the FieldSet method itself. What if you do a plt.imshow of your fset.U.data[0, 0, :, :]?
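    Erik's suggestion could look roughly like the sketch below; `u_data` is a synthetic stand-in for `fset.U.data`, which is assumed here to be indexed as `[time, depth, lat, lon]`:

    ```python
    import numpy as np
    import matplotlib
    matplotlib.use("Agg")  # non-interactive backend, safe for scripts
    import matplotlib.pyplot as plt

    # Synthetic stand-in for fset.U.data, assumed [time, depth, lat, lon]
    u_data = np.random.rand(2, 3, 50, 80)

    # Plot the first time step at the surface level directly with imshow,
    # bypassing parcels' own plotting routine:
    plt.imshow(u_data[0, 0, :, :], origin="lower")
    plt.colorbar(label="U")
    plt.savefig("u_field_check.png")
    ```

    If the full domain shows up here but not in field.show(), the data itself is fine and the problem sits in the plotting layer.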
    phand
    @HandmannP
    By doing this the full field is available! OK great, that's then an error in the plotting and not in my fields! Thanks for the quick response, Erik!
    Erik van Sebille
    @erikvansebille
    Good! An alternative solution is to add the domain keyword to the fset.U.show(). See also the last cell in https://nbviewer.jupyter.org/github/OceanParcels/parcels/blob/master/parcels/examples/tutorial_nemo_3D.ipynb
    phand
    @HandmannP
    I already tried that but got an error: as my borders were not included in the plotted field (but certainly in the original input field), it could not find them and hence gave an error.
    Erik van Sebille
    @erikvansebille
    Still, this is a bug I reckon: field.show() does not correctly determine the domain for curvilinear grids
    Strange. Can you copy-paste that error here?
    Erik van Sebille
    @erikvansebille
    Ok, I've now filed an Issue at OceanParcels/parcels#580
    phand
    @HandmannP
    I've just been running into another problem: although defined beforehand, while running parcels with dask the pointers are not found during the kernel build, and it gives a KeyError:
    _funccode_Diff_hom_200 = inspect.getsource(_Diff_hom_200.code)
    _funcvars_Diff_hom_200 = list(_Diff_hom_200.code.co_varnames)
    KeyError: '_funccode_Diff_hom_200'
    Willi Rath
    @willirath
    The problem is that globals() on the dask worker is not necessarily the same as on the frontend. (Which is closely related to the reason why we have to get the code and variable names on the front end in the first place.)
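    A minimal sketch of the namespace mismatch Willi describes (the kernel name here just mirrors the traceback above; the dicts stand in for the two separate `globals()`):

    ```python
    # Simulate the frontend namespace, where the kernel function is defined:
    frontend_ns = {}
    exec("def Diff_hom_200(particle):\n    return particle", frontend_ns)

    # On the frontend, looking the function up by name succeeds:
    assert "Diff_hom_200" in frontend_ns

    # A dask worker deserialises tasks into its *own* globals(), so a
    # name that was only ever defined on the frontend is absent there:
    worker_ns = {}
    try:
        worker_ns["_funccode_Diff_hom_200"]
    except KeyError as err:
        missing = str(err)

    print(missing)  # prints '_funccode_Diff_hom_200'
    ```

    This is why shipping the function object itself (which dask serialises along with the task) is more robust than looking it up by name on the worker.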
    phand
    @HandmannP
    So it is better to directly transfer the pointer to the function, I guess?
    phand
    @HandmannP
    This is also not working... Find an example in the attachment; the error is in the second file of the attachment.
    phand
    @HandmannP
    I am still struggling with this: it does not work even if the kernel is defined inside the run function... see the attached HTML file.
    Philippe Delandmeter
    @delandmeterp
    You shouldn’t import math and random in your kernel
    math is already available
    for random, you have access to parcels random functionalities
    random.uniform, random.normalvariate, random.expovariate, ...
    See these examples of diffusion kernels
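    Philippe's point can be illustrated with a small random-walk step. This sketch uses the stdlib `math` and `random`, which happen to share the names parcels exposes; inside a real parcels kernel you would not import either, since parcels provides `math` and its own `random` (with `random.uniform`, `random.normalvariate`, `random.expovariate`, ...) automatically:

    ```python
    import math
    import random

    def diff_hom_step(lat, lon, dt, kh):
        """One step of a homogeneous-diffusion random walk (a sketch).

        `kh` is a hypothetical diffusivity; the scaling below is the
        standard sqrt(2 * Kh * dt) random-walk step size.
        """
        scale = math.sqrt(2.0 * kh * abs(dt))
        return (lat + random.normalvariate(0.0, 1.0) * scale,
                lon + random.normalvariate(0.0, 1.0) * scale)

    random.seed(1234)  # reproducible draws for this demo
    lat, lon = diff_hom_step(50.0, -30.0, dt=3600.0, kh=10.0)
    ```

    In a parcels kernel the same two `random.normalvariate` calls would update `particle.lat` and `particle.lon` directly.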
    phand
    @HandmannP
    So the trick here was to set the unit converters, which are not set automatically (fieldset.Kx.units = parcels.tools.converters.GeographicSquared()), and to rename Kx and Ky in the kernel definition to Kxx and Kyy. This then also works parallelized with dask; find the example attached.
    Arianna Olivelli
    @OlivelliAri_twitter

    I am running a backwards simulation and have just got an OutOfTimeError(particle, fieldset) stating this:

    Field sampled outside time domain at time 2019-01-29T00:00:00.000000000. Try setting allow_time_extrapolation to True.

    My fieldset extends up until 20/3/2019, so I don't really understand why that happens.
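    What the error means can be sketched as a simple time-domain guard. The daily time axis and the check below are illustrative assumptions, not parcels' actual code:

    ```python
    from datetime import datetime, timedelta

    # Hypothetical daily time axis running from 1/1/2019 up to 20/3/2019:
    field_times = [datetime(2019, 1, 1) + timedelta(days=d) for d in range(79)]

    def check_sample_time(t, times, allow_time_extrapolation=False):
        """Mimic the time-domain guard behind the error message."""
        if allow_time_extrapolation or times[0] <= t <= times[-1]:
            return t
        raise RuntimeError(
            f"Field sampled outside time domain at time {t}. "
            "Try setting allow_time_extrapolation to True.")

    # 2019-01-29 lies well inside [1 Jan, 20 Mar], so this passes; if
    # parcels raises anyway, the time axis it actually loaded probably
    # differs from the one the user expects.
    check_sample_time(datetime(2019, 1, 29), field_times)
    ```

    So a useful first debugging step is to print the loaded fieldset's time axis and compare it with the dates you think you loaded.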

    Erik van Sebille
    @erikvansebille
    Hi @OlivelliAri_twitter. It’s tricky to figure out what happens just from this information. Can you copy/paste a notebook, like @HandmannP did above?
    Arianna Olivelli
    @OlivelliAri_twitter
    Yes, of course. The problematic date is the last day of the 'days_plus7' in the FieldCompare file.
    giselebury
    @giselebury
    Hello, I am running code to release particles using parcels and am getting a TimeExtrapolationError: temp sampled outside domain at time -3024000. Try setting allow_time_extrapolation to True. Are you able to run parcels for NCOM hourly outputs over an extended period? If so, how?
    Erik van Sebille
    @erikvansebille
    Hi @giselebury, I’m not sure what time units ncom has, but in principle it should be possible. What if you don’t sample temp? Can you advect the particles in your data?
    marcoturro
    @marcoturro
    Hey hello! Quick question: is the option to add kernels to the execution just by summing them (e.g. tut1, AdvectionRK4 + kWestVel) still available? I did not manage to make it work no matter what. It works fine as soon as I don't add k_WestVel. (The module k_WestVel is defined.)
    Erik van Sebille
    @erikvansebille
    You need to cast the first Kernel to a pset.Kernel object. So pset.Kernel(AdvectionRK4) + kWestVel should work
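    Why the cast is needed, as a toy sketch: a bare Python function doesn't define `+`, so the first operand must be an object that does. The `Kernel` class below only mimics the `+` behaviour of parcels' `pset.Kernel`, and the kernel bodies are made up:

    ```python
    class Kernel:
        """Toy stand-in for pset.Kernel: a callable pipeline of kernel
        functions that supports `+` for concatenation."""
        def __init__(self, *funcs):
            self.funcs = list(funcs)
        def __add__(self, other):
            other_funcs = other.funcs if isinstance(other, Kernel) else [other]
            return Kernel(*(self.funcs + other_funcs))
        def __call__(self, particle):
            for f in self.funcs:
                particle = f(particle)
            return particle

    def AdvectionRK4(p):   # hypothetical kernel acting on a number
        return p + 1
    def k_WestVel(p):      # hypothetical kernel acting on a number
        return p * 2

    # AdvectionRK4 + k_WestVel would raise TypeError (function + function),
    # but wrapping the first one gives `+` something to dispatch on:
    combined = Kernel(AdvectionRK4) + k_WestVel
    print(combined(1))  # prints 4
    ```

    Once the first kernel is wrapped, each further `+ func` appends to the pipeline, which is why only the first operand needs the explicit cast.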
    marcoturro
    @marcoturro
    @erikvansebille Thanks for the quick answer; the problem was in the definition of the module that I tried to customize.
    Ignasi Vallès
    @ignasivalles
    Hi all of you! After coming back to parcels I'm having a new problem with the new version and my old script. It looks like something goes wrong at the end of the script when exporting the temporary .npy files:
    Traceback (most recent call last):
      File "/home/valles/miniconda3/envs/py3_parcels2v1/lib/python3.7/site-packages/parcels/particlefile.py", line 195, in __del__
      File "/home/valles/miniconda3/envs/py3_parcels2v1/lib/python3.7/site-packages/parcels/particlefile.py", line 200, in close
      File "/home/valles/miniconda3/envs/py3_parcels2v1/lib/python3.7/site-packages/parcels/particlefile.py", line 361, in export
      File "/home/valles/miniconda3/envs/py3_parcels2v1/lib/python3.7/site-packages/numpy/lib/npyio.py", line 428, in load
    NameError: name 'open' is not defined
    Ignasi Vallès
    @ignasivalles
    I enclose the script to be clearer. I think the mistake is within my script, but I don't know where, because with parcels 2.0 it was fine. If I call pfile.close() it also gives me an error, but the output is converted to netcdf anyway. I have to check whether the results are OK or not. I also tried it without the while loop and I get the same errors.
    Traceback (most recent call last):
      File "run_MOCBOX_forward2v1.py", line 190, in <module>
        pfile.close()
      File "/home/valles/miniconda3/envs/py3_parcels2v1/lib/python3.7/site-packages/parcels/particlefile.py", line 200, in close
        self.export()
      File "/home/valles/miniconda3/envs/py3_parcels2v1/lib/python3.7/site-packages/parcels/particlefile.py", line 379, in export
        getattr(self, var)[:] = self.read_from_npy(global_file_list_once, 1, var)
      File "/home/valles/miniconda3/envs/py3_parcels2v1/lib/python3.7/site-packages/parcels/particlefile.py", line 331, in read_from_npy
        data[id_ind, t_ind] = data_dict[var]
    IndexError: index 1124 is out of bounds for axis 0 with size 1124
    Ignasi Vallès
    @ignasivalles
    I've finally seen where it fails: it's when it writes the constant variable VT, which I set to to_write='once' in class sParticle(JITParticle). If I set it to True everything is fine, but then VT is 2D, and I only need VT(traj) in order to reduce the size of the output file. So at the moment it works, but I would like to know if I can write this variable with only the traj dimension again.