Hi.
I am using the event_client.get_events method to find events: 1) everywhere, 2) near a certain location (within a 30° radius), 3) far from that location (outside a 30° radius).
for events happening everywhere:
cat = event_client.get_events(
    starttime=UTCDateTime(research_time["starttime"]),
    endtime=UTCDateTime(research_time["endtime"]),
    minmagnitude=magnitude["min"], maxmagnitude=magnitude["max"])
working great.
Far events:
minradius = 30
maxradius = 180
cat = event_client.get_events(
    starttime=UTCDateTime(research_time["starttime"]),
    endtime=UTCDateTime(research_time["endtime"]),
    minmagnitude=magnitude["min"], maxmagnitude=magnitude["max"],
    latitude=station["latitude"], longitude=station["longitude"],
    minradius=minradius, maxradius=maxradius)
working great.
But for near events I have tried many options without getting what I want. It always shows me the same result as the "everywhere" option.
Here's what I tried:
minradius = 0
maxradius = 30
cat = event_client.get_events(
    starttime=UTCDateTime(research_time["starttime"]),
    endtime=UTCDateTime(research_time["endtime"]),
    minmagnitude=magnitude["min"], maxmagnitude=magnitude["max"],
    latitude=station["latitude"], longitude=station["longitude"],
    minradius=minradius, maxradius=maxradius)
or
minlatitude = 39.9
maxlatitude = 64
minlongitude = -19
maxlongitude = 27.84
cat = event_client.get_events(
    starttime=UTCDateTime(research_time["starttime"]),
    endtime=UTCDateTime(research_time["endtime"]),
    minmagnitude=magnitude["min"], maxmagnitude=magnitude["max"],
    minlatitude=minlatitude, maxlatitude=maxlatitude,
    minlongitude=minlongitude, maxlongitude=maxlongitude)
In both cases it shows me events from all over the globe.
@PatrickChawah - I just tried your final example with IRIS (event_client = Client("IRIS")) and did not get a global catalogue. Can you check yours, and if the problem still exists, please copy and paste a minimal but complete example of the code that reproduces your issue.
The code I ran is:
from obspy import UTCDateTime
from obspy.clients.fdsn import Client
event_client = Client("IRIS")
minlatitude, maxlatitude, minlongitude, maxlongitude = 39.9, 64, -19, 27.84
cat = event_client.get_events(
    starttime=UTCDateTime(2019, 1, 1), endtime=UTCDateTime(2019, 6, 1),
    minmagnitude=2, maxmagnitude=7, minlatitude=minlatitude,
    minlongitude=minlongitude, maxlatitude=maxlatitude,
    maxlongitude=maxlongitude)
The Morlet wavelet is implemented in the continuous wavelet transform obspy.signal.tf_misfit.cwt.
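If it helps to see what that transform does, here is a toy numpy sketch of a Morlet CWT. This is purely illustrative, not obspy's implementation, and the test signal, frequency grid, and w0 value are made up:

```python
import numpy as np

def morlet_cwt(data, dt, w0, freqs):
    """Toy continuous wavelet transform with a Morlet wavelet."""
    n = len(data)
    t = (np.arange(n) - n // 2) * dt
    out = np.empty((len(freqs), n), dtype=complex)
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)  # scale whose centre frequency is f
        wavelet = (np.pi ** -0.25
                   * np.exp(1j * w0 * t / s)
                   * np.exp(-0.5 * (t / s) ** 2))
        # Correlate the data against the scaled wavelet.
        out[i] = np.convolve(data, np.conj(wavelet), mode="same") * np.sqrt(dt / s)
    return out

dt = 0.01                            # 100 Hz sampling (made up)
t = np.arange(0, 10, dt)
sig = np.sin(2 * np.pi * 5 * t)      # 5 Hz test tone
freqs = np.linspace(1, 20, 40)
scalogram = np.abs(morlet_cwt(sig, dt, w0=6, freqs=freqs))
peak = freqs[np.argmax(scalogram[:, len(t) // 2])]
print(peak)  # strongest response should be near 5 Hz
```

The obspy function does the frequency loop for you and returns the time-frequency representation directly.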
Hi all, I'm running into an issue with obspy.clients.fdsn.Client.get_stations, where it tells me "TypeError: The parameter 'includerestricted' is not supported by the service". However, the station service does appear to support includerestricted, and a direct web service request to the URL succeeds. Does anyone have a solution for this?
Example:
client = obspy.clients.fdsn.Client("SCEDC")
inventory = client.get_stations(network="CI", station="ABL", channel="EHZ",
                                level="channel", includerestricted=True)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/lib/python3.8/site-packages/obspy/clients/fdsn/client.py", line 722, in get_stations
url = self._create_url_from_parameters(
File "/lib/python3.8/site-packages/obspy/clients/fdsn/client.py", line 1226, in _create_url_from_parameters
raise TypeError(msg)
TypeError: The parameter 'includerestricted' is not supported by the service.
But the following url works fine, with the includerestricted parameter:
http://service.scedc.caltech.edu/fdsnws/station/1/query?net=CI&sta=ABL&cha=EHZ&level=channel&includerestricted=TRUE&format=text
Nothing for a specific station - the closest is catalog.filter. Otherwise just use something like:
cat.events = [ev for ev in cat if bob in {pick.waveform_id.station_code for pick in ev.picks}]
where bob is your chosen station.
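To see that comprehension in action without a server, here it is run against minimal stand-in classes. These mocks are hypothetical and only mimic the pick.waveform_id.station_code attribute chain; with a real obspy Catalog the comprehension is unchanged:

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for obspy's Pick/Event, just to show the logic.
@dataclass
class WaveformID:
    station_code: str

@dataclass
class Pick:
    waveform_id: WaveformID

@dataclass
class Event:
    picks: list = field(default_factory=list)

cat = [
    Event(picks=[Pick(WaveformID("BOB")), Pick(WaveformID("ALF"))]),
    Event(picks=[Pick(WaveformID("ALF"))]),
]

bob = "BOB"  # your chosen station
filtered = [ev for ev in cat
            if bob in {pick.waveform_id.station_code for pick in ev.picks}]
print(len(filtered))  # → 1 (only the first event has a pick on BOB)
```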
@mnky9800n you can just trim the trace (using trace.trim or trace.slice) around the time and then merge the cut traces using fill_value=0, e.g. (this is deliberately verbose and could be shortened):
glitch_start, glitch_end = UTCDateTime(1990, 1, 1, 1, 0, 0), UTCDateTime(1990, 1, 1, 1, 0, 5)
# Assuming your trace is called tr
before_glitch = tr.slice(tr.stats.starttime, glitch_start)
after_glitch = tr.slice(glitch_end, tr.stats.endtime)
# merge() lives on Stream, not Trace, so put the pieces in a Stream
# and let merge(fill_value=0) zero-fill the gap
st = Stream(traces=[before_glitch, after_glitch])
st.merge(fill_value=0)
tr_deglitched = st[0]
Or you could just edit the tr.data numpy array to set it to zero where you want zeros.
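That numpy route can look like this. A standalone sketch with made-up numbers; with a real Trace you would take sampling_rate and starttime from tr.stats and operate on tr.data in place:

```python
import numpy as np

sampling_rate = 100.0   # Hz (stand-in for tr.stats.sampling_rate)
trace_start = 0.0       # epoch seconds (stand-in for tr.stats.starttime)
data = np.ones(10000)   # stand-in for tr.data (100 s of samples)

# Glitch window in the same time reference as trace_start (made up).
glitch_start, glitch_end = 60.0, 65.0

# Convert times to sample indices and zero the slice in place.
i0 = int(round((glitch_start - trace_start) * sampling_rate))
i1 = int(round((glitch_end - trace_start) * sampling_rate))
data[i0:i1] = 0.0
```

This avoids the slice/merge round trip entirely, at the cost of doing the time-to-index arithmetic yourself.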
You can build Catalog and Event objects from any information. The supported formats for reading catalogs are here, but CSV is not a well-described format - you will probably have to convert to obspy Events yourself and write them out to QuakeML that way.
Hi! I'm kind of new to obspy, and I was wondering if it is possible to save files of a given time duration. Let's say I am receiving miniSEED files from a ringserver; after receiving 10 minutes' worth of data, I want to merge them and save them in a single .sac file.
My first approach would be to look at the miniSEED files' timestamps, save each trace contained in the miniSEED files in a stream, and after I have received 10 minutes of data, save the stream to a .sac file (after merging the traces with Stream.merge() to avoid gaps). Then I would start over with another stream, and so on. Or I guess I could do something similar using the sampling frequency and the number of samples in each packet.
The approach feels kind of clunky though, and I was wondering if there is maybe a better way or a dedicated function?
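As far as I know there is no dedicated chunking helper for this, but the windowing logic described above is small enough to write directly. A pure-Python sketch of just the buffering part (the ChunkWriter name and flush callback are made up; with obspy, flush is where you would merge the buffered Traces into a Stream and call st.write(..., format="SAC")):

```python
CHUNK = 600.0  # seconds of data per output file (10 minutes)

class ChunkWriter:
    """Buffer incoming packets; flush when a new 10-minute window starts.

    `flush` receives the list of buffered packets - with obspy it would
    merge the Traces into a Stream and write out one .sac file.
    """

    def __init__(self, flush, chunk=CHUNK):
        self.flush = flush
        self.chunk = chunk
        self.buffered = []
        self.current = None  # index of the window currently being filled

    def add(self, start_timestamp, packet):
        wid = int(start_timestamp // self.chunk)
        if self.current is None:
            self.current = wid
        if wid != self.current:
            # Packet belongs to a new window: write out the finished one.
            self.flush(list(self.buffered))
            self.buffered.clear()
            self.current = wid
        self.buffered.append(packet)

written = []
w = ChunkWriter(flush=written.append)
for ts, pkt in [(0.0, "pkt-a"), (300.0, "pkt-b"), (600.0, "pkt-c")]:
    w.add(ts, pkt)
print(written)  # the first completed window: [['pkt-a', 'pkt-b']]
```

One caveat: the last, partial window only gets flushed when a packet crosses the next boundary, so you would flush the remaining buffer explicitly on shutdown.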
Hi everyone,
I want to set up a client to request data from an archive with a specific directory structure on the local filesystem. The structure is:
"{year}-{month}/{year}-{month}-{day}/{network}.{station}.{location}.{channel}.{year}-{julianday}"
Do you know any way to do this?
Thank you very much for your help.
Emmanuel Castillo
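One possible direction, sketched with hypothetical helper names: build the paths yourself from that same format string and hand the matches to obspy.read(). (The built-in SDS client, obspy.clients.filesystem.sds.Client, uses a similar FMTSTR template internally, but I'm not sure it exposes all the fields this layout needs, so the standalone version below may be safer.)

```python
import glob
import os

# Format string mirroring the layout from the question; zero-padding of
# month/day/julianday is an assumption about the archive.
FMTSTR = ("{year}-{month:02d}/{year}-{month:02d}-{day:02d}/"
          "{network}.{station}.{location}.{channel}.{year}-{julianday:03d}")

def archive_paths(root, **fields):
    """Resolve matching files for one channel-day under `root`."""
    return sorted(glob.glob(os.path.join(root, FMTSTR.format(**fields))))

# Hypothetical example values:
rel = FMTSTR.format(network="CM", station="URMC", location="00",
                    channel="HHZ", year=2020, month=3, day=5, julianday=65)
print(rel)  # 2020-03/2020-03-05/CM.URMC.00.HHZ.2020-065
```

Each matched file can then be read with obspy.read() and the resulting streams merged as usual.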