    @HaydenSchilling Thanks!
    Daniel Hewitt
    Thanks for the speedy reply @erikvansebille. I didn't realise that it was implemented via particleset.from_field(). I should've been more specific: these crabs have multiple discrete spawning areas (/aggregations) and I want the release lat and lon to be randomised every repeatdt. Ultimately, I want a randomly chosen spot within several different areas and for a new random spot to be chosen for each area on the next release. My understanding is that this wouldn't be possible with particleset.from_field(). Is that correct? I just want to be sure so I don't reinvent the wheel.
    Hi @DEHewitt, if you would like to randomise your release locations at every new release, it is not possible at the moment to use repeatdt. Instead, I might suggest two alternatives using either Python's or NumPy's random number generators: the most straightforward way would be to specify all release times explicitly in a single particleset and randomise the lists of longitudes and latitudes. This is shown in the first example here. For better computing performance (and cleaner code, perhaps) you could write a for-loop in which you add new particles at random locations using particleset.add(). That could look something like this:
    pset = ParticleSet(fieldset, pclass, lon=[random list], lat=[random list], depth=[random list], time=[first release])
    for i in range(number_of_releases):
        pset_new = ParticleSet(fieldset, pclass, lon=[random list], lat=[random list], depth=[random list], time=[next release])
        pset.add(pset_new)
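A self-contained sketch of building randomised release lists for several discrete spawning areas, with a fresh random spot per area at every release. The box coordinates, number of releases, and daily release interval below are made-up placeholders; the resulting lists would then be handed to a single ParticleSet as described above.

```python
import numpy as np

# Hypothetical spawning areas as (lon_min, lon_max, lat_min, lat_max) boxes
areas = [(151.0, 151.5, -33.0, -32.5),
         (152.0, 152.4, -31.8, -31.4)]
n_releases = 5
release_interval = 86400.0  # seconds between releases (assumed daily)
rng = np.random.default_rng(1234)

lons, lats, times = [], [], []
for r in range(n_releases):
    for lon_min, lon_max, lat_min, lat_max in areas:
        # a new random spot per area, per release
        lons.append(rng.uniform(lon_min, lon_max))
        lats.append(rng.uniform(lat_min, lat_max))
        times.append(r * release_interval)

# lons, lats and times can then be passed to a single ParticleSet(...)
```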
    Luc Vandenbulcke
    Dear all,
    I am using Parcels for a couple of weeks and already love it. Maybe the gitter channel could be advertised more on the OceanParcels website and/or Github page, or maybe it's my fault that I didn't know about Gitter before.
    Anyway, with help obtained through the GitHub issues, I somehow managed to have Parcels transport particles using a 3D triply-nested field of currents (from NEMO-AGRIF nested grids), as well as CMEMS Stokes drift. I found the Breivik et al. 2016 code to propagate (surface) Stokes drift to depth in the OpenDrift source code, and I just copy/pasted it into my own kernel. On top of nested currents and a Stokes drift VectorField, my fieldset now also contains significant wave height and mean peak period (required to compute propagation to depth); and Parcels nicely (and quickly!) extracts all the required values for me!
    My question is the following: is there a tutorial or documentation about what one can or cannot do in custom kernels? When using Euler integration, it's OK to do the Stokes-to-depth computations directly in the kernel. With RK4, I would need to copy/paste this code four times, and it starts to look a little clumsy. So I wanted to define a new function in the module where I store my custom kernels and call it from the kernel. I couldn't make it work with JIT, nor even with SciPy. I can post detailed errors for both cases, but it's probably too long to discuss here.
    I also saw that there are some commits (from 2018!) about using directly C code for the kernel, but I didn't really find any documentation about how to do this.

    Hi @lucvandenbulcke,
    Glad to hear that you are so enthusiastic. We are not always very active on Gitter ourselves, which is why it is not very prominent on the website :). Regarding your question, it is often a challenge to prevent the kernels from looking a little clumsy. The documentation on what you can or cannot do in kernels is on another page which maybe should be more prominent on the website: https://oceanparcels.org/faq.html. If you want to write your own C kernels you might learn something from this example.

    Let me know if this helps,

    Good luck!

    Hi, @erikvansebille,
    Is it possible to define functions with return values in the kernels .py file? I am trying to define a temperature gradient field and then force the particles to follow the gradient. However, the temperature T(time, depth, lat, lon) should be passed to another function.
    Hi all. Thanks for maintaining this site; it is really helpful. I have a question about vertical locations of velocity components. I set up my 3D grid following your template (i.e. reading the values from *.nc files, and specifying f-points and w-levels). It is my understanding from your and NEMO's documentation, that the U and V components are meant to be located at mid-depth of a grid cell (T-levels). I assumed that Parcels would designate the location of these components accordingly, like it does with the horizontal offsets. However when I test my code, it appears that the U and V values that are read in are instead designated to be at the top of the cell (w-level k-1). I am concerned I am doing something incorrectly, and will therefore get an incorrect interpolation. Can you please explain what is actually going on?
    Hi @SaAlrabeei,
    If your particleclass contains a Variable like particle.T, you can sample the temperature (gradient) field at T(time,depth,lat,lon) and use particle.T in another function in a subsequent kernel. There is no need to return anything in this way as the functions will be executed as a single long kernel in which particle.T can be accessed by both functions. Does this answer your question?
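A minimal sketch of this sample-then-use pattern (the kernel names and the threshold are illustrative; T would be declared as a Variable on a custom particle class, and the fieldset is assumed to contain a field T):

```python
# Two kernel functions chained in one execution: the first samples the
# temperature field into particle.T, the second reuses that value.
# No return statements are needed; the kernels share state via the particle.

def SampleTemperature(particle, fieldset, time):
    # store the sampled field value on the particle
    particle.T = fieldset.T[time, particle.depth, particle.lat, particle.lon]

def UseTemperature(particle, fieldset, time):
    # particle.T set by the previous kernel is available here
    if particle.T > 20.0:
        particle.depth += 1.0
```

In Parcels these would be combined as pset.Kernel(SampleTemperature) + pset.Kernel(UseTemperature) and executed as a single long kernel, as described above.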
    Hi @ElizaJayne11 ,
    If you pass the locations of the f-nodes and the w-levels to parcels using either FieldSet.from_nemo, FieldSet.from_c_grid_dataset or Fieldset.from_netcdf(interp_method='cgrid_velocity' ), parcels should correctly shift the U and V values by half a depth-level to be at the mid-depth. If your test shows this is not the case, could you show how you have created the fieldset and how the test shows the U and V values are interpolated incorrectly?
    Hi. I had used FieldSet.from_nemo with dimensions for glamf and gphif. I did not use w-depths since strangely the depths in my w-files are t-depths. I had assumed that Parcels would treat those t-depth as if they were w-depths since that is what it was expecting. Therefore I assumed that my U and V would be shifted accordingly. However, what I am getting is that U and V are assumed to be at the t-level, which would be correct if the program actually knew it was a t-level. Does this mean that Parcels is smart enough to know I am really giving it t-levels?
    Hmmm, Parcels expects w-depths as you say, so it should not recognize that you have passed it t-levels. Can you show how your test shows that U and V are interpreted at the t-levels by Parcels? Here is an example of a unit test we have used to check whether the particles in a particleset show the values corresponding to the known fieldset values.
    Thanks @reint-fischer for the reply. Yes, I already have temperature for each particle. However, I will tell you what I want: I want to find the gradient of a function R. This function depends on the temperature T(time, depth, lat, lon). The function R is conditional.
    Here is what I did, but it didn't work:

    """Find the temperature gradient, where the temperature is defined as a temperature field.
    Function needs to be converted to a Kernel object before execution"""
    dx = .03  # for spherical coords, dx is in degrees
    dy = .008  # for spherical coords, dy is in degrees
    # (minx, maxx, miny, maxy) = fieldset.T.grid.lonlat_minmax  # recall the field T boundaries
    (minx, maxx, miny, maxy) = (4.629741, 47.646046, 64.129555, 78.43126)
    x = particle.lon
    y = particle.lat
    if x <= minx:
        dFdx = (fieldset.T[time, particle.depth, y, minx+dx] - fieldset.T[time, particle.depth, y, minx]) / dx
    elif x >= maxx:
        dFdx = (fieldset.T[time, particle.depth, y, maxx] - fieldset.T[time, particle.depth, y, maxx-dx]) / dx
    else:
        dFdx = (fieldset.T[time, particle.depth, y, x+dx] - fieldset.T[time, particle.depth, y, x-dx]) / (2 * dx)
    if y <= miny:
        dFdy = (fieldset.T[time, particle.depth, miny+dy, x] - fieldset.T[time, particle.depth, miny, x]) / dy
    elif y >= maxy:
        dFdy = (fieldset.T[time, particle.depth, maxy, x] - fieldset.T[time, particle.depth, maxy-dy, x]) / dy
    else:
        dFdy = (fieldset.T[time, particle.depth, y+dy, x] - fieldset.T[time, particle.depth, y-dy, x]) / (2 * dy)
    norm = math.hypot(dFdx, dFdy)
    if norm >= 1e-14:
        particle.gradx = dFdx / norm
        particle.grady = dFdy / norm
    else:
        particle.gradx = dFdx
        particle.grady = dFdy

    # def R(time, fieldset, particle, x, y):
    #     T = fieldset.T[time, particle.depth, y, x]
    #     T1 = fieldset.T1
    #     T2 = fieldset.T2
    #     if T <= T1:
    #         fun = -(T - T1)**3
    #     elif (T > T1) and (T < T2):
    #         fun = 0
    #     else:
    #         fun = -(T - T2)**2
    #     return fun
    Hi @SaAlrabeei , it is not exactly clear to me how the function R relates to the function to find the temperature gradient in your code. I think all operations that I see should work in Parcels except for the ** operator in JIT mode. Maybe you can try to use math.pow() there instead? If that doesn't solve your problem, could you maybe share the error message?
    @SaAlrabeei , I have some more tips that might solve some problems:
    • Kernels can only have the arguments (particle, fieldset, time), in that order. Check here for a more detailed explanation of kernels too.
    • If you want to use a constant in one of your kernels, like dx or dy, you can add it to the fieldset using FieldSet.add_constant(). That way, you can use the value in any kernel without having to define the constants inside your kernels.
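As a small illustration of that add_constant pattern (the kernel name GradientX and the Variable gradx are hypothetical; dx and the field T echo the kernel discussed above):

```python
# Constants would be registered once, when the fieldset is built, e.g.:
#     fieldset.add_constant('dx', 0.03)
# Inside any kernel the value is then available as fieldset.dx:

def GradientX(particle, fieldset, time):
    # centred difference using the shared constant fieldset.dx
    particle.gradx = (fieldset.T[time, particle.depth, particle.lat, particle.lon + fieldset.dx]
                      - fieldset.T[time, particle.depth, particle.lat, particle.lon - fieldset.dx]) / (2 * fieldset.dx)
```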

    Hi. Below is a snippet of the code that I am using. As you can read, I am querying what u is at a particular lat, lon, depth (u = -7.1577E-7), which is the western natural node for u on that grid cell, and thus should give me back the u in my file (-5.308E-7). It does not. Any insight is greatly appreciated. The bathymetry scalar value is correctly interpolated.

    ufiles = data_path + '3D_U_September_BNAM.nc'
    vfiles = data_path +'3D_V_September_BNAM.nc'
    wfiles = data_path + '3D_W_September_BNAM.nc'
    mesh_mask = data_path + '3D_Mask_September_BNAM.nc'

    filenames = {'U': {'lon': mesh_mask, 'lat': mesh_mask, 'depth': wfiles, 'data': ufiles},
                 'V': {'lon': mesh_mask, 'lat': mesh_mask, 'depth': wfiles, 'data': vfiles},
                 'W': {'lon': mesh_mask, 'lat': mesh_mask, 'depth': wfiles, 'data': wfiles}}

    variables = {'U': 'U', 'V': 'V', 'W': 'W'}

    dimensions = {'U': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'},
                  'V': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'},
                  'W': {'lon': 'glamf', 'lat': 'gphif', 'depth': 'deptht', 'time': 'time_counter'}}

    fieldset = FieldSet.from_nemo(filenames, variables, dimensions, allow_time_extrapolation=True)

    # Read in depth data

    bathy = loadmat(data_path +'lonlat_Bathy.mat')['Bathymetry']
    fieldset.add_field(Field('Bathy',data=bathy, lon=fieldset.W.lon, lat=fieldset.W.lat, interp_method = 'cgrid_tracer',allow_time_extrapolation=True))

    # Coords are for kn at ii=150, jj = 100
    # From nav_lon and nav_lat in Mixed coof codes

    # Load and set initial coordinates of particles
    # a = np.loadtxt('/Users/wendy/Library/Mobile Documents/com~apple~CloudDocs/CONNECTIVITY/SCALLOPS/SCALLOP DATA/ICs/ICs1.txt')

    lonp, latp = [-69.1964569, 40.6681061] #Cell center. In cell

    lonp, latp = [-69.2380676, 40.6671028] #Western U. In cell

    lonp, latp = [-69.15485, 40.66912] #Eastern U. Only in cell because refined longitude. These coords are clearly calculated and round off issues

    lonp, latp = [-69.19522, 40.63871] #Southern V. In cell

    lonp, latp = [-69.1977, 40.6975] #Northern V. In cell

    lonp, latp = [-69.23683, 40.637703] #SW F. In cell

    lonp, latp = [-69.15361, 40.639717] #SE F. Only in cell if add last "1" in decimal. NOT in f value

    lonp, latp = [ -69.2393, 40.69648] #NW F. Not In cell unless drop precision

    lonp, latp = [-69.156082, 40.69851] #NE F. not In cell not in cell

    depthp = np.ones(2)

    timep = np.zeros(2)

    lonpx = lonp
    latpx = latp
    print(lonpx, latpx)

    Hi @ElizaJayne11, unfortunately I find it hard to read what you have done exactly, partly because Gitter has formatted your code a bit awkwardly. I think Gitter uses Markdown which means that all lines starting with a '#' become headers. It is good to hear the bathymetry value is correctly interpolated. I think you have printed fieldset.U at [0,1.0207,lat_westernU,lon_westernU]. Could you explain why the depth value is 1.0207 and at what depth you do get the correct value? Earlier you mentioned the velocities were not shifted by the half-cell difference between w-levels and t-levels. Again, I think the correct way to create your fieldset is to pass the w-levels, so that might fix your problem.
    Thanks for getting back quickly. I agree the formatting of this Gitter site makes it very hard to see what is going on. Is there a way I can send you my files? Meanwhile... In answer to your question, 1.0207 is the midpoint of the depth between my top 2 t-levels, which I think Parcels is interpreting as z-levels. If so, then 1.0207 would be the midpoint where the velocity should be the same as the top value in the field. It is not. You are also correct that I used to be able to get back my velocities at the top, but I can no longer reproduce that result. My velocities do not appear correct at the t-level or midway between them. I also tried using a different set of files that have the data at the w-levels. Again, I can get my scalars (Bathy and Temperature) fine when I run with cgrid_tracer. But when I run with temperature not being C-grid, I can't get it to return the right value for the NE f point. I also don't get the correct value for the western U. It may be an indexing thing. I am using the cells at i=169,170 and j=99,100. I probably am doing something daft, and would appreciate your help figuring out what it is.
    Luc Vandenbulcke
    I'm trying to write a custom kernel in C. As a first step, I'm trying to re-write the AdvectionRK4 kernel in C.
    Thanks to the examples pointed by @reint-fischer on Oct 20, I think that I managed to correctly pass some arguments to the C function: the U and V CFields; the particle lon, lat,depth; and dt.
    However I have trouble finding out what function should be called to interpolate the CFields at the current location, and obtain the local u and v.
    If I look at the C code generated from the Python AdvectionRK4 kernel by codegenerator.py, it calls the function temporal_interpolationUV. Should I use that one also in my function? I would then need to pass extra arguments, such as the grids of the field, etc.? Could you help me with the exact syntax for that (especially in the case of nested U and V fields)?
    Thank you !
    Hi @lucvandenbulcke,
    The reason there is so little documentation available on writing your own C-kernels is that we are not aware of anyone seriously trying this before in Parcels. I think it is great that you are trying and we are definitely interested in your findings, but I am afraid we cannot really advise you on how to write the C code.
    Luc Vandenbulcke
    Hello @reint-fischer ,
    Thanks for your answer and encouragement. To give you some context, I use a beaching algorithm (a Python kernel) similar to the one in the North Sea example. But in my area there is a dike, and some particles manage to "jump" over it, as there is water on both sides. So what I would like to do is add a segment-intersection algorithm and, if necessary, push the particles back to where they came from. That is easy and already OK in my C kernel. My (only) problem is the call to the interpolation function (the C equivalent of fields[....] in Python).
    For velocity, for example, I would guess that I'm supposed to call temporal_interpolationUV(...) from the parcels code, I'm just having trouble finding the arguments to pass to the function. In the list, I understand how to get the first ones (lon, lat, depth, time, U, V), but I'm lost with the next ones: (&particles->xi[pnumngrid], &particles->yi[pnumngrid], &particles->zi[pnumngrid], &particles->ti[pnumngrid]). Then I guess the next 2 ones are the result (&parcels_tmpvar0, &parcels_tmpvar1) and the last 2 ones should be (for velocities coming from NEMO) : (CGRID_VELOCITY, NEMO)
    I understand you cannot help everybody and do their code for them, but I'd be so grateful if you could indicate how I obtain the xi, yi, zi ti arguments in C.
    Hi @lucvandenbulcke,
    The indices xi, yi, zi and ti describe the location of the particle in your gridded data at the previous timestep, so Parcels can find the data to interpolate faster. Every particle should have xi, yi, zi and ti defined as Variables, which you can access similarly to lon, lat, depth and time. In these lines they get defined. The JITParticle then writes these to 'xip' and 'cxi' here. Could you pass those to the interpolation function? I'm not sure about the exact syntax in C.
    Luc Vandenbulcke
    Hi @reint-fischer ,
    That worked! I use JITParticles, but in the C code I had to use the particle attributes xi, yi, zi, ti. If I try to use cxi, ..., the attribute is not found; the same goes for xip.
    Anyway, it seems to work with xi, ...; the values are updated along the simulation. Only ti stays at zero all the time (but maybe that's normal). I made an "AdvectionRK4" kernel in C (exactly like the standard Python one) and it's OK.
    I can now proceed with my custom kernels ! Thank you very much,
    Luc Vandenbulcke

    Hi again,
    When I used (multiple) Py kernels, I did something like this:

    kernels = pset.Kernel(Advection_RK4)
    kernels += DiffusionZ

    To use just a C kernel, I can use it with something like this:

    kernels = pset.Kernel(my_ckernel, c_include=my_ckernel)

    But I can't find out how to use the C kernel together with (i.e. after) Py kernels. For example, I would like to use the kernels Advection_RK4 + DiffusionZ + my_ckernel.
    I tried first defining kernels with the two Py kernels inside, and then instantiating a new Kernel (for the C kernel) with the idea of "+"-ing both afterwards. But to build the second Kernel instance, I need to provide the fieldset again...; so I suppose that's not the way to go.
    Thanks !

    Hello! I'm using Parcels to try and simulate particles in a small region influenced by tides (the English Channel). I took the ATLANTIC - EUROPEAN NORTH WEST SHELF - OCEAN PHYSICS ANALYSIS AND FORECAST data from the Copernicus Marine Service to get a good spatial resolution in such a small area of the world. For now, I've been able to visualize the particles on the map and the fieldset U, but when I try to advect them I get this error: 'Kernel' object has no attribute '_function'. I've tried to change the runtime to see if it was that, but it still doesn't work. Is there a line of code I need to add, or is it something else? Thank you!
    Hi @adegroodt, that is an uncommon error! It looks like you've done everything correctly. Maybe you can try to compile the kernel before execution with this line: pset.Kernel(AdvectionRK4). If that doesn't help, could you show us the code in the cells above the ones you've shown here?
    Hi! Thanks for the reply, it's greatly appreciated :) I've tried to compile the kernel, but I get another error which comes up very often: "OSError during compilation. Please check if compiler exists: gcc"
    Hi! I finally managed to make it work :) It was a problem with the folders
    Hi guys, I have been using Parcels for a while; it is a really useful and efficient framework. I have some doubts about the time-stepping. We have this scheme x = x + v*dt, and the netCDF file (the data in v) is read day-by-day by Parcels. Then, when we assign the time step (dt) to be, for example, 1 second, does Parcels interpolate the daily data down to seconds? If so, what is the function name?
    Hi - Thanks again for providing such a powerful and useful tool. My question today is about using it with time-evolving fields. Specifically, we have NetCDF files for velocities and other variables (MLD, temperature, etc.) that we want to linearly interpolate between. It is not the same as your summed fields, because the interpolation results in a weighted average that changes in time. I considered reading in multiple fields and then writing custom kernels for advection etc., but the problem is getting Parcels to recognize that I have multiple fields; specifically, I seem limited in ParticleSet to specifying one fieldset. Do you have a suggestion for how I might achieve temporal interpolation of my fields?
    Hi again, I am using Copernicus Marine Service data to simulate particles in a coastal area, but I cannot find a file that has the unbeaching velocities. I have been following several projects on GitHub, such as the Mediterranean or Galapagos ones, and they both use this boundary velocity file. Do you know where I can find such a file, or do I have to download it from Parcels? Thanks! :)
    Daniel Hewitt
    Hi all, I am using Parcels to simulate crab larval dispersal along eastern Australia, and I want to use two high-resolution regional models with some overlap in their domains. However, as they both use curvilinear grids, I cannot use the Parcels nesting functions for this (see OceanParcels/parcels#889). As a workaround, I am advecting particles using Model 1 and writing the output as normal. From this file I am then taking all particles that cross the boundary where the models overlap, and using their first lat/lon after crossing this boundary as the release lat/lon for advection in Model 2, then putting the two separate outputs together for further processing. I have the code worked out for this and things look as expected (i.e. no obviously strange particle behaviour). I am aware that this is a somewhat hacky workaround and am just hoping to check if, in principle, there is anything wrong with this approach that I may be overlooking.
    Luc Vandenbulcke

    I am trying to feed Parcels with ROMS outputs on a time-varying rectilinear s-grid. Thanks to the examples and git pages, I could make it work.
    However, the ROMS netCDF files use _FillValue=1.e37, not zero as is usual with NEMO. Parcels seems to use these values, and therefore particles sometimes get unrealistic positions, and an out-of-bounds or through-surface error is thrown.
    Is there a way to have Parcels use the _FillValue netCDF attribute and replace the corresponding velocity with zero? Maybe I can change something in field.py?
    Then, another problem... even with realistic velocities, it could happen that a particle gets "beached" below the bottom. The mask variables (from the netcdf file) cannot help me here, because being an s-grid, the mask is a 2D variable and does not depend on depth. I suppose that I am trying to do something similar to what Erik proposed in the example code at the bottom of OceanParcels/parcels#692:

    fieldset.add_field(Field('bottom_depth', fieldset.U.depth[-1, :, :], lon=lons, lat=lats))

    In my fieldset, I already have a field called z_w, which is actually used to set the (time-changing) depth coordinate of the W field. However, I don't understand how to access the bottom or top values, e.g. z_w[time, -1, particle.lon, particle.lat]. How do I indicate to Parcels that I want to interpolate in the horizontal and in time, but that -1 is the layer index, not an actual depth in meters? (The same question goes for Erik's example, when he uses

    if particle.depth <= fieldset.top_depth[0, 0, particle.lat, particle.lon]:

    how does Parcels know that the second zero means the first layer, and not depth = 0 meters?
    Thank you very much !

    Hello -- I have a program which runs Parcels many times on changing data which depends on the results of the prior Parcels run. I would like to run it in parallel for speed reasons. Unfortunately, when I use mpirun it runs the whole script multiple times simultaneously, as one would expect. Is it possible to invoke the parallelism of OceanParcels from within a Python program? Or should I make a little Python program that only does the particle tracking from some inputs it reads from temporary files, and then invoke that via mpirun as a shell command? I am familiar with MPI in Fortran.

    We have been working on setting up a fieldset that includes three monthly mean velocity fields, which are interpolated in time during our particle track simulation. We have used two different approaches to set up this time-varying fieldset. The first is to compile multiple monthly climatologies into one netcdf file with time stamps (in seconds) corresponding to the time represented by the fields. This single file is then loaded in by the Fieldset.from_nemo function in Parcels, and then Parcels does the interpolation.

    We have also tried an approach adapted from what was done in Kaandorp et al 2020's Parcels code for loading and adding additional velocity fields (in their case wind and stokes velocities). With this approach, our three monthly velocity fields are loaded separately into their own fieldsets, with September being associated with U and V, and then the other two are loaded and added to the original fieldset. Then the fields are manually linearly interpolated in time in a custom advection kernel.

    For this second approach, the series of commands we are using to load the additional fields is (borrowed from Kaandorp et al.):

    fieldset_Oct = FieldSet.from_netcdf(filenames2, variables2, dimensions2, allow_time_extrapolation=True)

    fieldset_Oct.U_Oct.units = GeographicPolar()
    fieldset_Oct.V_Oct.units = Geographic()


    vectorField_Oct = VectorField('UV_Oct',fieldset.U_Oct,fieldset.V_Oct)

    To our understanding, these two approaches to setting up the fieldset should result in the same spatially-interpolated velocity fields. However, when we query the two fieldsets at the same point in space (and time), Parcels does not return the same velocity value from the two fieldsets set up using our two different approaches. However, when we change the units call above to:

    fieldset_Oct.U_Oct.units = Geographic()
    fieldset_Oct.V_Oct.units = Geographic()

    the interpolated values correspond in space and time between the two fieldsets set up using our two different approaches. However, to our knowledge, Geographic is not the correct unit conversion for U velocities. We are hoping you can provide some insight into what is causing this discrepancy?
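For reference, the difference between the two unit converters on a spherical mesh amounts to a cos(latitude) factor on the zonal component. A plain-Python sketch of the conversion (the constants are the standard minute-of-arc degree/metre conversion; see parcels' converters module for the authoritative implementation):

```python
import math

def geographic(v_ms):
    # metres/s -> degrees/s along a meridian (Geographic)
    return v_ms / (1000.0 * 1.852 * 60.0)

def geographic_polar(u_ms, lat):
    # metres/s -> degrees/s along a circle of latitude (GeographicPolar):
    # same conversion, shrunk by the cos(latitude) of the parallel
    return u_ms / (1000.0 * 1.852 * 60.0 * math.cos(lat * math.pi / 180.0))
```

This is why swapping GeographicPolar for Geographic on U changes the interpolated values everywhere except on the equator.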


    I am trying to find the nearest grid point to each particle (actually the nearest 8 cells). I used this:

    (xsi, eta, _, xi, yi, _) = fieldset.T.search_indices(particle.lon, particle.lat, particle.depth, 0, 0)  # find the nearest grid point

    This works and gives me the index of the grid when I run it in the main file. However, when I define it as a function in the kernel, it gives me this error

    'FieldNode' object has no attribute 'search_indices'

    any help, please

    I tried to fix this by redefining some metric functions inside the kernel file, in which I needed to recall the grids. Every time I call, for example, fieldset.U.grid.lon, I get errors stating the same thing ('FieldNode' object has no attribute 'search_indices'). I defined the whole grid (lons and lats) as two fieldset constants to be able to call them from the kernel, but that didn't work either; I got an error stating that I can only recall a single value.
    One last try was to use the min and max functions, which are not in the math lib. I changed to the NumPy lib, which required changing the particle type to ScipyParticle, but it still didn't work; 'np' was not recognised although I had already imported NumPy inside the kernel file.

    Hello, I'm new to Parcels and Gitter, so I apologize in advance if I submit my problem incorrectly. I'm having some issues with field nesting from NetCDF data. I'm trying to nest a grid with low spatial resolution but high detail with a grid which has much higher spatial resolution but lower detail. Here is my code:

    from IPython import get_ipython
    get_ipython().run_line_magic('matplotlib', 'inline')
    from parcels import Field, FieldSet, NestedField, ParticleSet, JITParticle, AdvectionRK4, ErrorCode, ParticleFile, plotTrajectoriesFile
    from glob import glob
    from datetime import timedelta
    from datetime import timedelta as delta


    filenames = {'U':"D:/Data_harbour/BCNPRT-HC.nc",
    variables = {'U': 'u', 'V': 'v'}
    dimensions = {'lat': 'latitude', 'lon': 'longitude', 'time': 'time'}

    fieldset = FieldSet.from_netcdf(filenames, variables, dimensions,field_chunksize='auto', deferred_load=False, interp_method='cgrid_velocity', netcdf_engine='scipy')

    basepath_cst=data_dir_cst + 'BCNCST-PdE-hm-*-HC.nc'

    dimensionsU={'lon':'longitude', 'lat':'latitude', 'time':'time'}
    dimensionsV={'lon':'longitude', 'lat':'latitude', 'time':'time'}

    U_cst=Field.from_netcdf(filenames_cst, ('U_cst','u'), dimensionsU, fieldtype='U', field_chunksize='auto', deferred_load=False, interp_method='cgrid_velocity', netcdf_engine='scipy')
    V_cst=Field.from_netcdf(filenames_cst, ('V_cst','v'), dimensionsV, fieldtype='V', grid=U_cst.grid, dataFiles=U_cst.dataFiles, field_chunksize='auto', deferred_load=False, interp_method='cgrid_velocity', netcdf_engine='scipy')


    U=NestedField('U', [fieldset.U_cst, fieldset.U])
    V=NestedField('V', [fieldset.V_cst, fieldset.V])

    nest=FieldSet(U, V)

    pset = ParticleSet.from_line(nest, pclass=JITParticle,
    start=(2.163, 41.324),
    finish=(2.175, 41.307), size=5)

    output_file= ParticleFile(name='Simulation_nest.nc',particleset=pset, outputdt=timedelta(hours=1))

    def DeleteParticle(particle, fieldset, time):
        particle.delete()

    pset.execute(AdvectionRK4, runtime=timedelta(hours=10),
                 dt=timedelta(minutes=5), output_file=output_file,
                 recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})


    The problem is that when I try to run the simulation, the trajectories output indicates that only one field is used in pset.execute, and not the combination of the two. In particular, the simulation uses the first U and V velocities in the NestedField definitions (U=NestedField(...); V=NestedField(...)); it seems to ignore the second velocities. Probably this issue is linked to the fact that the program raises a warning before executing the pset: "WARNING: Trying to initialize a shared grid with different chunking sizes - action prohibited. Replacing requested field_chunksize with grid's master chunksize". I tried to intervene in the file chunking process, but I didn't obtain any result. I'm using data from opendap.puertos.es. Thank you very much for any help or suggestion!

    Hello! We found the problem in this script; it is within the data. In fact, the extent of the smaller grid is bigger than the area where the measurements are actually available. At the points where no measurement is available, the parameters are set to NaN. This implies that the measurements are surrounded by a NaN area, in which the particle behaviour cannot be calculated. This is probably what causes the failed nesting between the two grids. We are now working on the data, using indices to identify the areas where the measurements are available, to create the Fields without NaNs.
    I asked about this once before on here, but have had a further thought. I'm running Parcels with NEMO output, with a c-grid, z-levels, and partial/shaved cells at the bed. Partial cells are not implemented for c-grid z-levels, but I was wondering: if I tell Parcels this is a c-grid with s-levels, and give it the relevant grid information, will this effectively deal with the partial cells? Or do I just need to try it and see?
    Jennifer Wong-Ala
    Hello. I am currently new to using parcels and am trying to add in a constant diffusivity term using the DiffusionUniformKh. The code runs fine without the diffusion, but once we add in the DiffusionUniformKh kernel we get error messages and particles are not advected correctly. We looked at the examples on the GitHub, but am wondering if there is another example using the DiffusionUniformKh?
    Dear @jwongala, what are the errors? I just did some experiments with DiffusionUniformKh, maybe I can help.
    Hello! I have a very basic question but I can't find the answer anywhere on the documentations. What type of grid does FieldSet.from_data() load arrays in as? Is it possible to change the grid indexing type of this method?
    Daniel Hewitt
    Hi all, I am about to set about tracking particles in 3D, however the oceanographic model does not have a W velocity field (and it would require rerunning the analysis to obtain them - not an option). Does anybody know of any resources for generating a W-field populated with 0's. I want to use 3D advection as the species my particles represent spawn at the bottom and then float to the surface by ~ 10 days old. Any ideas? Thanks!

    @jwongala I add constant diffusivity (using DiffusionUniformKh) after running advection using AdvectionRK4. I had to add two more fields to my fieldset for it to work:

    kh = 10 # This is the constant eddy diffusivity in m2/s
    fieldset.add_constant_field('Kh_zonal', kh, mesh='spherical')
    fieldset.add_constant_field('Kh_meridional', kh, mesh='spherical')
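For reference, the random-walk step that a uniform-Kh diffusion kernel performs can be sketched in plain Python. This is an illustration of the underlying math only, not the actual DiffusionUniformKh implementation (which also handles the metre-to-degree conversion on spherical meshes):

```python
import math
import random

def diffusion_step(x, y, kh, dt, rng):
    """One uniform-Kh random-walk step: the displacement in each direction
    is drawn from a normal distribution with standard deviation
    sqrt(2 * Kh * dt), here in the same length units as x and y."""
    sigma = math.sqrt(2.0 * kh * dt)
    return (x + rng.gauss(0.0, 1.0) * sigma,
            y + rng.gauss(0.0, 1.0) * sigma)

rng = random.Random(0)
x, y = diffusion_step(0.0, 0.0, kh=10.0, dt=60.0, rng=rng)
```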

    Hi everyone! I am pretty new to Parcels (and Python). I'm trying to trace particles around Guam. The predominant flow is east to west, so I have to deal with beaching. As a first step to unbeach, I would like to "bounce" the particle back offshore to its previous position. Is there a way to reset latitude and longitude to the location from the previous timestep? How would I index that? Note: before this new kernel I run a kernel to flag the particle as "beached" (4) if the u and v velocity components are both == 0.

    def Bounce(particle, fieldset, time):
        if particle.beached == 4:  # particle identified as beached
            particle.lon = #INDEX PREVIOUS
            particle.lat = #INDEX PREVIOUS
            particle.beached = 5  # indicate that it has bounced back and needs to advect alongshore

    p.s. I apologize, I haven't figured out how to format code copied into this chat. Thank you for your help!
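One common way to fill in those placeholders (sketched here under assumptions: prev_lon/prev_lat are hypothetical extra Variables you would declare on a custom particle class, and the beached codes follow the question above) is to save the position in a kernel that runs before advection:

```python
def SavePosition(particle, fieldset, time):
    # run this kernel *before* the advection kernel, so that
    # prev_lon/prev_lat still hold the previous-timestep location
    particle.prev_lon = particle.lon
    particle.prev_lat = particle.lat

def Bounce(particle, fieldset, time):
    if particle.beached == 4:  # particle identified as beached
        particle.lon = particle.prev_lon
        particle.lat = particle.prev_lat
        particle.beached = 5  # bounced back; still needs to advect alongshore
```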