    vpavlo21
    @vpavlo21

    Hi all
    I am new to Parcels and Python in general and I just have a hopefully very quick question. I am trying to run a simulation that runs over multiple days, and for each day I have a new dataset. So I have created a list with all the files I am using, but when I try to initialize my fieldset I run into the following:

    file_names = {'U': files,
                  'V': files,
                  'W': files}
    variables = {'U': 'uo',
                 'V': 'vo',
                 'W': 'w'}
    dimensions = {'lat': 'latitude',
                  'lon': 'longitude',
                  'depth': 'depth',
                  'time': 'time'}
    fset = FieldSet.from_netcdf(file_names, variables, dimensions)

    which results in the error

    ValueError: Could not convert object to NumPy datetime

    Normally, when I run a simulation with just one day of data, I don't include the time dimension, but here if I don't put it in I get an error telling me it's required.
    Also when I execute the simulation on a particle set will it automatically switch to a new set of data after a day has passed?

    Thank you very much

    Erik van Sebille
    @erikvansebille
    Hi @JG-2020. For your question about slowness of setting the Fields: it’s probably easiest to create a Kernel that does this interpolation and then execute it once (with dt=0). If you use JITParticles, then the interpolation should be very fast
    Hi @vpavlo21. To answer your question, I need to know a bit more about the time dimension in your netcdf files. It may be that it uses a calendar that cannot be converted by xarray’s cftime handler. Also, where exactly (on which line in the parcels code) do you get the ValueError? It’s always a good idea to submit the entire error log
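    If the files’ time units really cannot be decoded, one possible workaround is to pass explicit timestamps; the sketch below assumes the `timestamps` keyword of `FieldSet.from_netcdf` (available in Parcels 2.x) and uses made-up file names:

```python
import numpy as np

# Hypothetical daily files, one per simulated day (names are illustrative).
files = [f"currents_day{i:02d}.nc" for i in range(3)]

# One numpy datetime64 array per file, in the same order as `files`.
timestamps = [np.array([np.datetime64("2020-01-01") + np.timedelta64(i, "D")])
              for i in range(len(files))]

# With explicit timestamps, Parcels does not need to decode the calendar:
# fset = FieldSet.from_netcdf(file_names, variables, dimensions,
#                             timestamps=timestamps)
```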
    JG-2020
    @JG-2020

    Hi @erikvansebille
    Thanks very much for your reply.

    I believe we are using JITparticles, although we created a new class of particle that samples our field properties, following the tutorial example. Our goal is to speed up the particle initialization because it is currently taking about an hour, and we plan to run simulations with larger domains and many more particles. As I am a beginner, I am hoping you could expand a bit on what you are recommending so I know exactly how to make the code more efficient. Specifically:

    1) Do you mean we should define a kernel that is used separately from our pset.execute kernels, since our issue is with slow initialization of variables before the tracking starts?

    2) Do you mean we should still use Parcels' fieldset.X, where X is one of our variables (e.g. MLD, Bathy etc.), to interpolate the variable to the particle location for each individual particle (one at a time) inside a for loop, or is there a way to do many particles at once and/or another way to interpolate?

    Many thanks

    Erik van Sebille
    @erikvansebille
    Hi @JG-2020. Yes, so what I suggest is something like
    def initSample(particle, fieldset, time):
        particle.bathyp = fieldset.Bathy[particle.time, particle.depth, particle.lat, particle.lon]
    
    pset.execute(initSample, dt=0)  # run without incrementing the time, to initialise particle.bathyp
    
    pset.execute(…)
    Where (…) is your ‘normal’ kernel execution
    JG-2020
    @JG-2020

    Thanks @erikvansebille for the help. We greatly appreciate it.

    On another note, I encountered some issues with running the latest Parcels (2.2), installed with Python 3.8, on MacBooks running the latest OS Catalina (please see below). I have identified workarounds for these issues, but am hoping that by posting here, we can get some advice about whether those workarounds are reasonable, what else we should do, and/or what the developers might want to change. I also have a question about implementing the Milstein scheme in 2D with spatially-varying advection.

    1) The newest Parcels does not appear to be backward compatible with previous versions, because you cannot import the BrownianMotion2D or SpatiallyVaryingBrownianMotion2D built-in kernels. WORKAROUND: Use the old version's kernel code as a custom kernel in the new version. Will this affect speed? If so, can the old kernels be added back to Parcels so we can run old versions of our code?

    2) The new advection-diffusion kernels are only for 2D advection, so the new Parcels has lost the built-in functionality of simulating 3D advection with spatially-varying horizontal diffusion that the old Parcels offered. WORKAROUND: I have modified AdvectionRK4DiffusionM1 to do a 3D RK4 for advection, and am using this as a custom kernel inside the main code. Will this affect speed? If so, can a 3D version be developed (or mine added) as a built-in kernel?

    3) The new AdvectionRK4DiffusionM1 does not execute. I get:
    AttributeError: 'FieldSet' object has no attribute 'dres'
    Is there something else I need to import or define to make this work?
    No workaround here: Right now using custom kernel with old SpatiallyVaryingBrownianMotion

    4) Running with the newer versions of Parcels results in warnings that did not occur with older versions. Specifically, when loading files I get:
    WARNING: Trying to initialize a shared grid with different chunking sizes - action prohibited. Replacing requested field_chunksize with grid's master chunksize.
    WARNING: Field chunksize and Grid master chunksize are not equal - erroneous behaviour expected.
    I am not sure what this warning means, and am a little concerned that our interpolation may be affected.
    Code still executes but want confirmation that this is not introducing a problem

    5) Curious: I see that you have partitioned the calculation of the displacements due to advection and the displacements due to diffusion in AdvectionRK4DiffusionM1. This means that the estimate of spatially varying diffusion does not account for the contribution of advection to the distances over which diffusion would vary. I was wondering whether you experimented with combining them, e.g. by including the Milstein kick in the "sampling" of rates of change at different locations that is done during RK4?

    Thank you very much.

    Erik van Sebille
    @erikvansebille
    Thank you for this extensive feedback, @JG-2020. Could I ask you to put this into a new Issue at https://github.com/OceanParcels/parcels/issues, so that we can address these issues one by one?
    JG-2020
    @JG-2020
    Hi @erikvansebille. Should I submit them as one issue or as separate issues? Thanks.
    Erik van Sebille
    @erikvansebille
    As one Issue is fine, I think
    Daniel Hewitt
    @DEHewitt

    Hi, I am pretty new to Parcels and am attempting to nest models - it appears I can nest U and V (i.e. no error appears when I run the code); however, when I attempt to include temperature I get the error AttributeError: 'NestedField' object has no attribute 'items'

    My code is as follows (where fieldset_ROMS and fieldset_BRAN are fieldsets derived from the respective models):
    U = NestedField('U', [fieldset_ROMS.U, fieldset_BRAN.U])
    V = NestedField('V', [fieldset_ROMS.V, fieldset_BRAN.V])
    temp = NestedField('temp', [fieldset_ROMS.temp, fieldset_BRAN.temp])
    fieldset = FieldSet(U, V, temp)

    and the full error message:
    `fieldset = FieldSet(U, V, temp)
    Traceback (most recent call last):

    File "<ipython-input-66-9784dc69a747>", line 1, in <module>
    fieldset = FieldSet(U, V, temp)

    File "C:\Users\Dan\anaconda3\envs\py3_parcels\lib\site-packages\parcels\fieldset.py", line 43, in init
    for name, field in fields.items():

    AttributeError: 'NestedField' object has no attribute 'items'`

    Any help would be awesome!

    phand
    @HandmannP

    After not using Parcels for a long time, I have now upgraded my version to run 3D experiments. I am using Parcels 2.1.3 and am currently setting up 3D experiments within a NEMO-based model,
    sampling MLD, T and S in addition to the trajectory information.

    I am having problems with my kernel deleting the particles:

    def DeleteParticle(particle, fieldset, time):
        particle.delete()

    .....

    pset.execute(AdvectionRK4_3D + k_SampleMLD,
                 runtime=timedelta(days=runtime_in_days),
                 dt=timedelta(minutes=dt_in_minutes*run_dir),
                 output_file=pfile,
                 recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})

    which fails with

    IndexError: list assignment index out of range

    The full code is in the html file

    phand
    @HandmannP
    I used the DeleteParticle as in my code before so I do not understand the error. I would really appreciate some ideas on that
    Erik van Sebille
    @erikvansebille
    Hi @HandmannP. This is not a bug in DeleteParticle, but in the new Dask Data Chunking. See the arrow at line 937.
    Erik van Sebille
    @erikvansebille
    I’m not sure though what exactly causes the bug. What if you add field_chunksize=False to the return FieldSet.from_nemo(…) line? Does that help?
    Erik van Sebille
    @erikvansebille
    Could you send me (via email) a weshare file or something with a subset of the netcdf files, so that I can check?
    JG-2020
    @JG-2020

    Hi, I have a question regarding fieldset interpolation of velocities.
    Our code uses a kernel which deletes particles on land by identifying particles which have zero velocity at each time step. The code for this is:

    def CheckLand(particle, fieldset, time):
        (up, vp, wp) = fieldset.UVW[time, particle.depth, particle.lat, particle.lon]
        if fabs(up) < 1e-10 and fabs(vp) < 1e-10 and fabs(wp) < 1e-10 and time < 35:
            particle.delete()

    From our initial 52,175 particles, 10 were removed using this kernel, leaving us with 52,165 particles.

    We also ran this code with the kernel defined slightly differently, as shown here:

    def CheckLand(particle, fieldset, time):
        up = fieldset.U[time, particle.depth, particle.lat, particle.lon]
        vp = fieldset.V[time, particle.depth, particle.lat, particle.lon]
        wp = fieldset.W[time, particle.depth, particle.lat, particle.lon]
        if fabs(up) < 1e-10 and fabs(vp) < 1e-10 and fabs(wp) < 1e-10 and time < 35:
            particle.delete()

    I would expect this to yield the exact same result, however, when I used this code 24 particles were removed, leaving us with 52,151 particles.
    We validated that this kernel is what was causing the discrepancy between the two results by running a barebones version of our main code, which allowed us to rule out any other potential factors.

    Would you please be able to explain why interpolating UVW at once returns us a different result than interpolating U and V and W separately?

    Thanks.

    Erik van Sebille
    @erikvansebille
    Hi @JG-2020. This could be the case if you use a (curvilinear) C-grid. Depending on the grid, fieldset.UVW uses a different interpolation than the three separately. See also https://www.geosci-model-dev.net/12/3571/2019/gmd-12-3571-2019.html and https://nbviewer.jupyter.org/github/OceanParcels/parcels/blob/master/parcels/examples/tutorial_interpolation.ipynb
    alanfox
    @alanfox
    Hi. I'm relatively new to using Parcels, having previously used Ariane and a tracking code I wrote myself. I have a bunch of questions but will start with the one that feels most important. I'm running Parcels with a NEMO model (ORCA C-grid, z-levels). This uses partial depth cells at the bed, but from what I can see these are not implemented in Parcels NEMO FieldSets. Is that right, and should I worry about it (I'm running particles throughout the water column)?
    Erik van Sebille
    @erikvansebille
    Hi @alanfox. Welcome to Parcels! No, we don’t have partial cell support (yet) in Parcels.
    JG-2020
    @JG-2020

    Hi @erikvansebille, we recently installed Parcels on another computer, and we are having an issue which we can't seem to figure out.

    When running our simulations, we are having trouble with the saving of output netCDF files. When we run the code the first time (i.e. after restarting kernel), the temporary folder will remain after the execution has completed, the netCDF file will not save, and we get the error messages shown below. However, if we run the code a second time, the netCDF file does save and we do not get the error messages.

    Do you have any idea why this may be happening and/or do you know how to solve this issue?

    Here is the execute command:

    pset.execute(kernels, runtime=delta(days=45), dt=delta(hours=1),recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle},
                 output_file=ParticleFile("Particles_Testing_2.nc", pset, outputdt=delta(days=1)))

    And here are the error messages:

    Exception ignored in: <function ParticleFile.__del__ at 0x7fb160a951f0>
    Traceback (most recent call last):
      File "/Users/Nadine/opt/anaconda3/envs/py3_parcels/lib/python3.8/site-packages/parcels/particlefile.py", line 196, in __del__
        self.close()
      File "/Users/Nadine/opt/anaconda3/envs/py3_parcels/lib/python3.8/site-packages/parcels/particlefile.py", line 201, in close
        self.export()
      File "/Users/Nadine/opt/anaconda3/envs/py3_parcels/lib/python3.8/site-packages/parcels/particlefile.py", line 356, in export
        raise RuntimeError("No npy files found in %s" % self.tempwritedir_base)
    RuntimeError: No npy files found in out-YVVZVFJW
    INFO: Compiled JITParticleAdvectionRK4DiffusionM1 ==> /var/folders/kb/3zhpw8y922jgdxy0ql0xpcs00000gp/T/parcels-502/5802e9976d3edb2f5718613e31fb4982_0.so
    INFO: Temporary output files are stored in out-OOHEAOEA.
    INFO: You can use "parcels_convert_npydir_to_netcdf out-OOHEAOEA" to convert these to a NetCDF file during the run.

    Thanks.

    alanfox
    @alanfox
    Thanks @erikvansebille. Any idea/guess what errors no partial cells might introduce for near-bed particles? Horizontal movement in the partial cell would presumably be unaffected, but vertical movement I'm not sure. And continuity in the bottom cells would be broken. I'm not even convinced that vertical movement/position of particles in partial cells is well-defined.
    Erik van Sebille
    @erikvansebille
    Hi @JG-2020. Do you run this in a notebook? In that case, you will need to
    output_file = ParticleFile("Particles_Testing_2.nc", pset, outputdt=delta(days=1))
    pset.execute(kernels, runtime=delta(days=45), dt=delta(hours=1),recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle},  output_file=output_file)
    output_file.close()
    Hi @alanfox. Good question. We haven’t needed partial/shaved cells ourselves in my team yet, so there are no plans to implement it in the near future. But if you want to, you can of course propose a way to deal with this yourself?
    alanfox
    @alanfox
    Thanks @erikvansebille. I might have a think about it, I know there is a recent paper https://doi.org/10.1016/j.envsoft.2020.104621 explaining how they implemented partial cells in LTRANS-Zlev. Doesn't look straightforward, and I don't know how it fits with how Parcels does c-grid advection etc.
    JG-2020
    @JG-2020

    Hi @erikvansebille, thanks for your response, that fix seems to work.

    On another note, I have a question about creating a 3D NEMO dimensions dictionary. I noticed that in your NEMO 3D tutorial, when creating your dimensions dictionary, you set your U, V, and W depth dimensions to all come from depthw from the w files. From looking at your data, depthw is offset from depthu and depthv by half a meter because the data is on a C grid. I am therefore confused as to why the dimensions dictionary for the FieldSet.from_nemo function uses the same depth dimensions for all three variables. Does the fieldset require all 3 variables to use w depths to establish the common grid and therefore perform the interpolations correctly? Or should I set the depths for each field to the depths associated with that specific field (i.e. depthu for U, depthv for V, depthw for W)? Also, would the interpolation work correctly if the depths for U, V, and W were all set to t depths, e.g. depthu?

    Thanks.

    ElizaJayne11
    @ElizaJayne11
    Help with user-contributed kernels: I had been interacting with you about Issue #858 which is now closed. You had suggested I contribute my new code to Parcels. I just tried doing that but ran into difficulty. I have posted my question about contributing as a response to that #858 issue, but am not sure you will see my question now that the issue is closed. So this message is just to alert you that I have posted a question there. Please advise on how to proceed.
    I also have a question about the spatially-varying diffusion calculations. Is horizontal diffusivity treated as a scalar within a grid cell? If so, then wouldn't the local gradients of Kh be zero for C-grids, such that the correction term is not applied and "corrections" only occur when dres puts the point in a neighbouring cell? Similarly, when Kh is linearly interpolated, wouldn't dres be unimportant (since a centered difference would give you the same gradient for different dres), unless dres put you in a neighbouring cell? If that is the case, can you comment on what type of cell-to-cell variation in Kh would warrant use of non-uniform diffusivity, and what kinds of errors are associated with constant/linear Kh within a cell and step-changes across a cell?
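    The centered-difference reasoning above can be illustrated with a small schematic (pure NumPy; the piecewise-constant Kh values and unit cell width are made up, and this is not the actual Parcels implementation): the gradient estimate stays zero until `dres` reaches into a neighbouring cell.

```python
import numpy as np

def centered_dKh(Kh_at, x, dres):
    """Centered-difference estimate of dKh/dx at x (schematic of the
    dres-based correction, not the actual Parcels code)."""
    return (Kh_at(x + dres) - Kh_at(x - dres)) / (2 * dres)

# Piecewise-constant Kh: one made-up value per unit-width grid cell.
Kh_cells = np.array([10.0, 10.0, 50.0])

def Kh_const(x):
    return Kh_cells[int(x)]          # cell index = floor(x)

small = centered_dKh(Kh_const, 1.5, 0.1)   # stays inside cell 1 -> 0.0
large = centered_dKh(Kh_const, 1.5, 0.7)   # reaches cell 2 -> (50-10)/1.4
```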
    Oakes Holland
    @EcoOakes_twitter
    Hi all! I am having some trouble with exporting to a netcdf file. I thought it was my code, so I tried the NestedFields tutorial code without modification and both came up with "AttributeError: 'ParticleFile' object has no attribute 'export'" after running output_file.export(). I've had a look at the documentation page and it says there that export() is a method of the ParticleFile class, so I am a bit lost as to how to fix this? I am running a Jupyter notebook with Anaconda.
    JG-2020
    @JG-2020

    Thank you for your prompt response. The tutorial on 2D interpolation you suggested did not fully address our question, so we are following up with what are hopefully clearer questions:

    1) We want to import 3D C-Grid velocity data to Parcels such that subsequent interpolations are done correctly. However, we are confused about assigning the lat/lon/depths in the dimension line because we are not sure what Parcels is doing with these values. Since your NEMO 3D tutorial assigns lat = glamf, lon = gphif, and depth = depthw = wfiles[0] for U, V and W, it seems that Parcels uses those general dimensions to define the grid cell, and then assigns the correct offsets for U, V and W to use in the interpolation (e.g. as in Delandmeter et al., 2019)? Is that correct, or should we be assigning U, V and W their respective lat, lon and tdepth/wdepths?

    2) We also want to import 2D and 3D scalars in a way that their subsequent interpolation will be consistent with the ocean circulation model (i.e. constant throughout a grid cell). We had been using fieldset.add_field with lat = fieldset.W.lat, lon = fieldset.W.lon, depth = fieldset.U.depth, which we believe puts them at the cell center, and then using "nearest" for the interpolation. However, we are confused because your 2D tracer interpolation tutorial says that tracers are set to the upper right corner of the grid cell, which implies using a different lat/lon value when adding to the fieldset and then using "cgrid_tracer" interpolation. Can you please advise which is appropriate, or if we should be doing something else entirely?

    Thanks.

    Erik van Sebille
    @erikvansebille
    Hi @EcoOakes_twitter, sorry for the slow response. Anyways, I don’t understand what causes your bug. Could you file an Issue in Github, so that we can discuss it there?
    Erik van Sebille
    @erikvansebille
    Hi @JG-2020. Interpolation is really tricky, and to be honest even we don’t have our head around it entirely. If you really want to understand what’s going on, I suggest you dig into the code at https://github.com/OceanParcels/parcels/blob/master/parcels/field.py#L822-L970 (probably this python code used for ScipyParticles is more readable than the C++ code for JITParticle interpolation, but they do the same)
    JamiePringle
    @JamiePringle
    Quick question. How can I run particle tracking with particles released at different times, but each particle is only tracked for a sub set of the run -- for example, particles drift for 30 days, but are released over six months? This is useful for, e.g., larval dispersal experiments. Right now, if I use either runtime or endtime, I end up tracking particles released at the beginning of the run for the full integration time, and the computational load grows as the length of the model time squared. If I could truncate the particle runs at a given time, the computational load would only grow linearly... Another way to do this would be to kill the particles after they have drifted for a set time. Thanks, Jamie
    HaydenSchilling
    @HaydenSchilling
    @JamiePringle You can kill the particles after X time with a kernel which tracks the age and deletes the particle once this age is exceeded. An example would be:
    fieldset.add_constant('maxage', 40.*86400) # an example for 40 days (40 x number of seconds in a day)
    
    def SampleAge(particle, fieldset, time):
        particle.age = particle.age + math.fabs(particle.dt)
        if particle.age > fieldset.maxage:
            particle.delete()
    Daniel Hewitt
    @DEHewitt
    Hi all, I am attempting to simulate the dispersal of a few species of crab along the east coast of Australia and want to release particles from within their spawning area. Does anybody know of a way that I can randomise release locations within an area defined by latitude, longitude (and depth possibly)?
    Erik van Sebille
    @erikvansebille
    Sure, use particleset.from_field()? Or, if you want to have more control, just write your own python function that generates a list of longitude and latitudes and then feed that to Parcels
    JamiePringle
    @JamiePringle
    @HaydenSchilling Thanks!
    Daniel Hewitt
    @DEHewitt
    Thanks for the speedy reply @erikvansebille. I didn't realise that it was implemented via particleset.from_field(). I should've been more specific: these crabs have multiple discrete spawning areas (/aggregations) and I want the release lat and lon to be randomised every repeatdt. Ultimately, I want a randomly chosen spot within several different areas and for a new random spot to be chosen for each area on the next release. My understanding is that this wouldn't be possible with particleset.from_field(). Is that correct? I just want to be sure so I don't reinvent the wheel.
    Reint
    @reint-fischer
    Hi @DEHewitt, if you would like to randomise your release locations at every new release, it is not possible at the moment to use repeatdt. Instead, I might suggest two alternatives using either Python's or NumPy's random number generators: the most straightforward way would be to specify all release times explicitly in a single particleset and randomise the lists of longitudes and latitudes. This is shown in the first example here. For better computing performance (and cleaner coding perhaps) you could write a for-loop in which you add new particles at random locations using particleset.add(). That could look something like this:
    pset = ParticleSet(fieldset, pclass, lon=[random list], lat=[random list], depth=[random list], time=[first release])
    
    for i in range(number_of_releases):
        pset.execute(dt=repeatdt)
        pset_new = ParticleSet(fieldset, pclass, lon=[random list], lat=[random list], depth=[random list], time=[next release])
        pset.add(pset_new)
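    The "[random list]" placeholders above could be filled by a small helper along these lines (the function name and bounding-box format are assumptions, not Parcels API; it just draws one uniform random point per spawning area before each release):

```python
import numpy as np

def random_release_points(boxes, n_per_box=1, rng=None):
    """Draw uniform random (lon, lat) points inside each bounding box.

    boxes: list of (lon_min, lon_max, lat_min, lat_max) tuples, one per
    spawning area. Returns flat lon/lat lists ready to pass to ParticleSet.
    """
    rng = np.random.default_rng() if rng is None else rng
    lons, lats = [], []
    for lon_min, lon_max, lat_min, lat_max in boxes:
        lons.extend(rng.uniform(lon_min, lon_max, n_per_box))
        lats.extend(rng.uniform(lat_min, lat_max, n_per_box))
    return lons, lats

# One fresh random spot per area, redrawn before every release:
areas = [(153.0, 153.5, -28.5, -28.0),   # made-up east-Australia boxes
         (152.5, 153.0, -30.0, -29.5)]
lon, lat = random_release_points(areas)
```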
    Luc Vandenbulcke
    @lucvandenbulcke
    Dear all,
    I am using Parcels for a couple of weeks and already love it. Maybe the gitter channel could be advertised more on the OceanParcels website and/or Github page, or maybe it's my fault that I didn't know about Gitter before.
    Anyway, with help obtained through the Github issues, I somehow managed to have Parcels transport particles using a 3D triply-nested field of currents (from Nemo-Agrif nested grids), as well as CMEMS Stokes drift. I found the Breivik et al 2016 code to propagate (surface) Stokes drift to depth in the OpenDrift source code, and I just copy/pasted it into my own kernel. On top of nested currents and a Stokes drift VectorField, my fieldset now also contains significant wave height and mean peak period (required to compute propagation to depth); and Parcels nicely (and quickly!) extracts all the required values for me!
    My question is the following. Is there a tutorial or documentation about what one can or cannot do in custom kernels? When using Euler integration, it's OK to do the Stokes->depth computations directly in the kernel. With RK4, I would need to copy/paste this code 4 times, and it starts to look a little clumsy. So I wanted to use a new function in the module where I store my custom kernels, and call that in the kernel. I couldn't make it work with JIT, and not even with Scipy. I can post detailed errors for both cases, but it's probably too long to discuss here.
    I also saw that there are some commits (from 2018!) about using directly C code for the kernel, but I didn't really find any documentation about how to do this.
    Reint
    @reint-fischer

    Hi @lucvandenbulcke,
    Glad to hear that you are so enthusiastic. We are not always very active on Gitter ourselves, which is why it is not very prominent on the website :). Regarding your question, it is often a challenge to prevent the kernels from looking a little clumsy. The documentation on what you can or cannot do in kernels is on another page which maybe should be more prominent on the website: https://oceanparcels.org/faq.html. If you want to write your own C kernels you might learn something from this example.

    Let me know if this helps,

    Good luck!

    SaAlrabeei
    @SaAlrabeei
    Hi, @erikvansebille,
    Is it possible to define functions with return values in the kernels .py file? I am trying to define a temperature gradient field and then force the particles to follow the gradient. However, the temperature T(time,depth,lat,lon) needs to be passed to another function.
    ElizaJayne11
    @ElizaJayne11
    Hi all. Thanks for maintaining this site; it is really helpful. I have a question about vertical locations of velocity components. I set up my 3D grid following your template (i.e. reading the values from *.nc files, and specifying f-points and w-levels). It is my understanding from your and NEMO's documentation, that the U and V components are meant to be located at mid-depth of a grid cell (T-levels). I assumed that Parcels would designate the location of these components accordingly, like it does with the horizontal offsets. However when I test my code, it appears that the U and V values that are read in are instead designated to be at the top of the cell (w-level k-1). I am concerned I am doing something incorrectly, and will therefore get an incorrect interpolation. Can you please explain what is actually going on?
    Reint
    @reint-fischer
    Hi @SaAlrabeei,
    If your particleclass contains a Variable like particle.T, you can sample the temperature (gradient) field at T(time,depth,lat,lon) and use particle.T in another function in a subsequent kernel. There is no need to return anything in this way as the functions will be executed as a single long kernel in which particle.T can be accessed by both functions. Does this answer your question?
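    The sharing mechanism described above can be mimicked in plain Python (FakeParticle and the field callable are stand-ins, not Parcels code): chained kernel functions run back-to-back on the same particle object, so a value stored by the first function is visible to the second without any return statement.

```python
class FakeParticle:
    """Stand-in for a Parcels particle with custom Variables 'T' and 'drift'."""
    T = 0.0
    drift = 0.0

def SampleTemp(particle, temp_at, time):
    # First "kernel": sample the (hypothetical) temperature field and store it.
    particle.T = temp_at(time)

def FollowGradient(particle, temp_at, time):
    # Second "kernel": reuse the stored sample; no return value is needed.
    particle.drift = -0.1 * particle.T   # made-up response to temperature

p = FakeParticle()
SampleTemp(p, lambda t: 21.5, 0.0)       # runs first in the combined kernel
FollowGradient(p, lambda t: 21.5, 0.0)   # then reads particle.T
```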
    Reint
    @reint-fischer
    Hi @ElizaJayne11 ,
    If you pass the locations of the f-nodes and the w-levels to parcels using either FieldSet.from_nemo, FieldSet.from_c_grid_dataset or Fieldset.from_netcdf(interp_method='cgrid_velocity' ), parcels should correctly shift the U and V values by half a depth-level to be at the mid-depth. If your test shows this is not the case, could you show how you have created the fieldset and how the test shows the U and V values are interpolated incorrectly?
    ElizaJayne11
    @ElizaJayne11
    Hi. I had used FieldSet.from_nemo with dimensions for glamf and gphif. I did not use w-depths since, strangely, the depths in my w-files are t-depths. I had assumed that Parcels would treat those t-depths as if they were w-depths, since that is what it was expecting. Therefore I assumed that my U and V would be shifted accordingly. However, what I am getting is that U and V are assumed to be at the t-level, which would be correct if the program actually knew it was a t-level. Does this mean that Parcels is smart enough to know I am really giving it t-levels?
    Reint
    @reint-fischer
    Hmmm, Parcels expects w-depths as you say, so it should not recognize that you have passed it t-levels. Can you show how your test shows that U and V are interpreted at the t-levels by Parcels? Here is an example of a unit test we have used to check whether the particles in a particleset show the values corresponding to the known fieldset values.