Siphon - A collection of Python utilities for retrieving data from Unidata data technologies.
`.catalog_refs` attribute.
from datetime import datetime

base_url = 'https://rda.ucar.edu/thredds/catalog/files/g/ds084.1/'
dt = datetime(2017, 10, 1)
url = base_url + f'{dt:%Y/%Y%m%d}'
HTTPSConnectionPool(host='rda.ucar.edu', port=443): Max retries exceeded with url: /thredds/catalog/files/g/ds084.1/2019/20190224/catalog.xml (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fec30e04650>: Failed to establish a new connection: [Errno 110] Connection timed out',))
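For reference, a minimal sketch of how the dated catalog URL above would typically be opened with Siphon's `TDSCatalog` (the network call is commented out because it requires connectivity to rda.ucar.edu, which is exactly what times out in the traceback above):

```python
from datetime import datetime

base_url = 'https://rda.ucar.edu/thredds/catalog/files/g/ds084.1/'
dt = datetime(2017, 10, 1)
# Append the year/yearmonthday path and the catalog document name
url = base_url + f'{dt:%Y/%Y%m%d}' + '/catalog.xml'
print(url)

# Opening the catalog needs network access and may time out, as above:
# from siphon.catalog import TDSCatalog
# cat = TDSCatalog(url)
# print(list(cat.catalog_refs))  # nested catalog references
```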
Unfortunately, NCSS does not allow requesting multiple levels. (Correct me if I'm wrong, @lesserwhirls)
Dang... sorry for the delay here @PTH1_twitter @dopplershift! Currently in 5.0, NCSS can only pull out a single level, but all the variables in the request need to be on the same vertical coordinate system. For example, in GRIB collections you'll often see coordinates like `isobaric1`, `isobaric2`, `height_above_ground`, `altitude_above_msl`, etc. The variables would all need to use the same coordinate in order to subset a single level from the vertical (e.g. all using `isobaric1` and not `isobaric2`). Also, the value you'd use in the request needs to be in the same unit as the coordinate system is defined in, as the NCSS API does not support supplying a unit. So, for GFS 0.25 degree forecast temperatures on isobaric surfaces, we'd have pressure in Pa, and would request something like `vertCoord=85000`.
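As a sketch of the point above, a single-level NCSS request would encode `vertCoord` in the coordinate's own unit (Pa here, so 85000 for 850 hPa). The endpoint path and variable name below are assumptions for illustration, not the actual dataset being discussed:

```python
from urllib.parse import urlencode

# Hypothetical NCSS grid endpoint; a real dataset path will differ
endpoint = 'https://thredds.example.edu/thredds/ncss/grid/gfs_0p25/Best'

params = {
    'var': 'Temperature_isobaric',  # must live on a single coordinate, e.g. isobaric1
    'time': '2019-11-30T18:00:00Z',
    'vertCoord': 85000,             # Pa, matching the coordinate's unit (no unit param in NCSS)
    'accept': 'netcdf',
}
request_url = endpoint + '?' + urlencode(params)
print(request_url)
```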
`time=2019-11-30T18:00:00&timeInterval=P6h`, which would signal a 6-hour duration for the accumulation valid at 2019-11-30 18Z. Then, for a request with a time range, it would return any 6-hour accumulation intersecting the `time_start` and `time_end` parameters of the request. We'd still need to decide what to do by default if no `timeInterval` was specified in the request.
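The proposed request could be sketched as below. Note that `timeInterval` is a proposed parameter, not part of the current NCSS API, and the variable name is an assumption for illustration:

```python
from urllib.parse import urlencode

# Sketch of the proposal: a 6-hour accumulation valid at 2019-11-30 18Z
params = {
    'var': 'Total_precipitation_surface',  # assumed accumulated variable name
    'time': '2019-11-30T18:00:00',
    'timeInterval': 'P6h',                 # proposed: duration of the accumulation
}
query = urlencode(params)
print(query)
```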