John Leeman
@jrleeman
Should we lowercase whatever the user passes for region and use that as the key, so North America, NORTH AMERICA, and north america all work?
Ryan May
@dopplershift
sure
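For reference, a minimal sketch of the case-insensitive lookup being discussed (the table contents and names here are hypothetical, not Siphon's actual internals):

# Hypothetical region table; keys stored lowercase so any capitalization works.
REGION_ENDPOINTS = {
    'north america': 'naconf',
    'south america': 'samer',
    'antarctica': 'ant',
}

def get_region_endpoint(region):
    """Look up a region's endpoint, ignoring the caller's capitalization."""
    try:
        return REGION_ENDPOINTS[region.lower()]
    except KeyError:
        raise ValueError('Unknown region: {}'.format(region))

# 'North America', 'NORTH AMERICA', and 'north america' all resolve the same way.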
John Leeman
@jrleeman
Wyoming soundings have station lat/lon/elevation in the footer. IA does not. Should we include it for WY, or leave it out since the two sources won't match?
Ryan May
@dopplershift
:scream:
John Leeman
@jrleeman
I'd opt for reporting it - if someone is using multiple sources it could be handled by checking for the column. Just seems like really useful data
Speaking from experience on the CAPE project
Ryan May
@dopplershift
We don’t have one interface to paper over them, so I guess it’s not a big deal for them to differ.
John Leeman
@jrleeman
I also need to email Larry that the region selector on the API doesn't actually do anything. You can request OUN with a region of Antarctica and it's happy to return the right data
Ryan May
@dopplershift
lol
Sean Arms
@lesserwhirls
45c11 d474 rul35 lolololol
Andrew
@IAteAnDrew1_twitter
I have a question about the examples page. I'm a bit confused about what I can do with this example after I get the list of catalog_refs
[screenshot attached]
John Leeman
@jrleeman
You can drill further down or grab datasets. Have a look at some of the gallery examples: http://unidata.github.io/python-gallery/index.html or the workshop material on siphon https://github.com/Unidata/unidata-python-workshop/tree/master/notebooks/Siphon
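For instance, a minimal sketch of drilling down from a catalog_ref to a dataset (the catalog URL is just a placeholder):

from siphon.catalog import TDSCatalog

cat = TDSCatalog('http://thredds.ucar.edu/thredds/catalog.xml')

# Each entry in catalog_refs can be followed to get the nested TDSCatalog...
first_ref = list(cat.catalog_refs)[0]
sub_cat = cat.catalog_refs[first_ref].follow()

# ...and once a catalog holds actual datasets, you can grab them from .datasets.
print(list(sub_cat.datasets))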
Andrew
@IAteAnDrew1_twitter
Thanks! I feel like the workshop material should be linked on the documentation (I might have missed it though!)
John Leeman
@jrleeman
@IAteAnDrew1_twitter That's an excellent point - would you mind opening an issue on siphon for that? We'll make sure it happens!
Andrew
@IAteAnDrew1_twitter
yep, just did it. thanks!
Scott
@scollis
Hey all! Rather than add to your issue tracker, I thought I would check here about an issue I am having...
Sometimes I get this error: requests.exceptions.HTTPError: Error accessing http://thredds.ucar.edu/thredds/ncss/nws/metar/ncdecoded/Metar_Station_Data_fc.cdmr?var=weather&var=dew_point_temperature&var=inches_ALTIM&var=air_temperature&var=wind_from_direction&var=cloud_area_fraction&var=wind_speed&time=2018-06-11T15%3A17%3A59&west=-87.0&east=-89.0&south=40.5&north=42.7&accept=csv: 500 Infinite loop in linked list at recno= 23284
Note I have now wrapped it in a try/except to get it working again... but no pretty obs.
Is it a server issue?
Ryan May
@dopplershift
Yeah, it’s a pretty normal issue with the surface obs. Somebody thought it was a good idea to implement a linked list within a netCDF file, which means things go haywire occasionally. We need to replace it.
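In the meantime, a small retry wrapper is one way to paper over the intermittent 500s; a sketch, not anything built into Siphon:

import time
import requests

def get_with_retries(fetch, attempts=3, delay=2):
    """Call fetch() (e.g. a lambda wrapping ncss.get_data), retrying on HTTP errors."""
    for attempt in range(attempts):
        try:
            return fetch()
        except requests.exceptions.HTTPError:
            if attempt == attempts - 1:
                raise  # give up after the last attempt
            time.sleep(delay)

# Usage sketch: obs = get_with_retries(lambda: ncss.get_data(query))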
Scott
@scollis
Cool... so not part of Siphon? Let me know if I can help.
dafekt1ve
@dafekt1ve
@jrleeman Thanks for letting me know about the IAStateUpperAir data pull update. When do these updates hit anaconda? Is there a nightly build?
Ryan May
@dopplershift
No nightly build. If you can manage a git checkout (or even just download the zip of the repo from GitHub), you can install it with a pip install -e . from the checkout.
dafekt1ve
@dafekt1ve
Thanks @dopplershift. Got it and I am trying to test it out for a case I am looking at. You had mentioned to me at the Users Workshop that there was a lookup table for the sounding sites to get their lat-long if you knew the 3 or 4-letter code. Can you point me to it?
Ryan May
@dopplershift
I did?
dafekt1ve
@dafekt1ve
Hehe... I thought you did. Maybe it was one of the Johns instead. I am using the IA State Upper Air pull and it doesn't have the lat-long points like the University of Wyoming dataset does, so I need to find them somewhere. You have an idea of how/where to do that?
Ryan May
@dopplershift
There are some lists of stations here: https://github.com/Unidata/station-lists But no code to easily do what you’re looking for.
Maybe @jrleeman or @jthielen know something else
John Leeman
@jrleeman
I've got a station list in the CAPE work I did that might be helpful: https://github.com/jrleeman/CAPE-SciPy-2017/tree/master/Notebooks
Bryan Guarente
@bryanguarente
Thanks @jrleeman That'll work nicely. Is something like this lookup table worth putting into the codebase or building a class/function for doing this? Seems like something that will have to be done often now with the new IAStateUpperAir method.
John Leeman
@jrleeman
I've thought about including it... but (there's always one) we need to encode time as well since multiple stations have been relocated throughout history. Probably something to go data hunting for at some point.
Andrew
@IAteAnDrew1_twitter
is this related?
https://mesonet.agron.iastate.edu/sites/networks.php?network=RAOB&format=csv&nohtml=on
if so, you can do pd.read_csv('https://mesonet.agron.iastate.edu/sites/networks.php?network=RAOB&format=csv&nohtml=on')
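Building on that one-liner, a quick sketch of a station-to-lat/lon lookup (assuming the CSV has stid, lat, and lon columns; check the actual header, since those names are a guess):

import pandas as pd

# RAOB station listing from the IEM (URL from above).
stations = pd.read_csv(
    'https://mesonet.agron.iastate.edu/sites/networks.php'
    '?network=RAOB&format=csv&nohtml=on')

# Index by the (assumed) station-identifier column for easy lookups.
lookup = stations.set_index('stid')[['lat', 'lon']]
print(lookup.loc['OUN'])  # lat/lon for a station ID, if present in the listing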
Ryan May
@dopplershift
Yeah, that looks right. May eventually be worth including that.
Bryan Guarente
@bryanguarente
If you change that read_csv from @IAteAnDrew1_twitter to pd.read_csv('https://mesonet.agron.iastate.edu/sites/networks.php?network=_ALL_&format=csv&nohtml=on'), it broadens to every station in the IEM database, which is quite useful; the link above is limited to the US. I am working on some code to see if this works on my end, but nothing yet for within Siphon or MetPy itself. I may try writing a class and method eventually, but not yet (lacking some of those skills, but I need the professional development soon).
Andrew
@IAteAnDrew1_twitter
yep! that website is really useful for grabbing station data (and with pandas, you can parse most of it in one line) [also they have ASOS data that can easily be pulled similarly]
Andrew
@IAteAnDrew1_twitter
some example code
import pandas as pd

# Daily ASOS data request from the IEM; {0} is filled with a station ID.
DATA_URL_FMT = (
    'http://mesonet.agron.iastate.edu/'
    'cgi-bin/request/daily.py?'
    'network=IL_ASOS&stations={0}&'
    'year1=2014&month1=1&day1=1&year2=2018&month2=1&day2=1'
)
STATIONS = ['ORD', 'DEC']

def to_df(station):
    """Fetch one station's data and coerce its columns to numeric."""
    data_url = DATA_URL_FMT.format(station)
    df = pd.read_csv(data_url, index_col=['station', 'day'], parse_dates=True)
    df = df.apply(pd.to_numeric, errors='coerce')
    return df

df = pd.concat(to_df(station) for station in STATIONS)
Andrew
@IAteAnDrew1_twitter
is there any way to optimize reading a point from GFS using Siphon? it's taking me 12 minutes
[screenshot attached]
Ryan May
@dopplershift
OUCH. That query is being done on the server, so not much you can do.
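For reference, the kind of server-side point query in question looks roughly like this (catalog path and variable name are placeholders; adjust for the actual GFS dataset):

from datetime import datetime
from siphon.catalog import TDSCatalog

cat = TDSCatalog('http://thredds.ucar.edu/thredds/catalog/grib/NCEP/GFS/'
                 'Global_0p5deg/catalog.xml')
ncss = cat.datasets[0].subset()  # NetCDF Subset Service client for the first dataset

query = ncss.query()
query.lonlat_point(-105.0, 40.0).time(datetime.utcnow())
query.variables('Temperature_isobaric')
data = ncss.get_data(query)  # the extraction itself happens on the server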
Andrew
@IAteAnDrew1_twitter
alright, I guess I'll stick to downloading the files with fastgrib; thanks!
Bryan Guarente
@bryanguarente
@IAteAnDrew1_twitter Have you seen any documentation from the IAState website about what the 'begints' (begin timestamp) format is? More specifically, there are some dates that make little to no sense to me: 1874-10-01 06:09:24-05:50:36. I can't seem to figure out what the second time is (after the dash). It would seemingly be a UTC Offset based on the pattern of other datetimes in this file, but I have never seen a UTC Offset like that before. Thoughts?
Andrew
@IAteAnDrew1_twitter
Which station is that? If it's international it's likely bad data
Andrew
@IAteAnDrew1_twitter
actually there's quite a lot of those... Chicago, Nebraska, Michigan, Wisconsin, and other midwestern states are all UTC-6 (CST), which is pretty close to -5:50:36 (that looks like a pre-standard-time local mean time offset). I suggest just rounding those, or doing a simple replace: df['begints'] = df['begints'].str.replace('-05:50:36', '-06:00:00') (note the leading zero, so it actually matches the value in the file)
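Put together, the cleanup might look like this (URL from the _ALL_ suggestion above; rounding the odd local-mean-time-style offset to CST before parsing):

import pandas as pd

df = pd.read_csv(
    'https://mesonet.agron.iastate.edu/sites/networks.php'
    '?network=_ALL_&format=csv&nohtml=on')

# Normalize the pre-standard-time offset to a standard one, then parse to UTC.
df['begints'] = df['begints'].str.replace('-05:50:36', '-06:00:00')
df['begints'] = pd.to_datetime(df['begints'], utc=True, errors='coerce')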
James Simkins
@jsimkins2

Hello! I'm not sure if this is the correct Gitter channel to post in, and I apologize if it isn't, but I've been unable to connect to the Unidata THREDDS AWS radar server all day today. Here is a code snippet that I use and the error that I've been getting:

from siphon.catalog import TDSCatalog
from siphon.radarserver import RadarServer, get_radarserver_datasets
from datetime import datetime, timedelta

cat = TDSCatalog('http://thredds-aws.unidata.ucar.edu/thredds/radarServer/catalog.xml')
rs = RadarServer(cat.catalog_refs['S3 NEXRAD Level II'].href)
query = rs.query()
query.stations('KDOX').time(datetime.utcnow())
cat = rs.get_catalog(query)
raw_list = list(cat.catalog_refs)
ds = list(cat.datasets.values())[0]

"ConnectionError: HTTPConnectionPool(host='thredds-aws.unidata.ucar.edu', port=80): Max retries exceeded with url"

Ryan May
@dopplershift
@jsimkins2 Here is fine. Not sure what went wrong, but a quick reboot seems to have fixed the issues I saw. Is it working for you now?
James Simkins
@jsimkins2

@dopplershift works now for Python 3.5! ...for 2.7, I receive this error message...

"HTTPError: 403 Client Error: Forbidden for url: http://thredds-aws.unidata.ucar.edu/thredds/radarServer/catalog.xml"

Ryan May
@dopplershift
@jsimkins2 I’m not seeing any problems when I test on Python 2.7. If 2.7 is important for your use, if you point me to the code that’s failing I can dig in further.
James Simkins
@jsimkins2

Hello!

I'm receiving a ConnectionError ('Connection refused') due to max retries exceeded. Last time it just required a reboot. Here's the error:

requests.exceptions.ConnectionError: HTTPConnectionPool(host='thredds-aws.unidata.ucar.edu', port=80): Max retries exceeded with url: /thredds/radarServer/nexrad/level2/S3/dataset.xml (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f24ce4c3748>: Failed to establish a new connection: [Errno 111] Connection refused',))