I was thinking that it might not be a bad idea to read not only a single HDF5 file but also a whole directory containing HDF5 files, keeping a list of all the files in that directory. Since each entry is just a pointer (before any real reading), there should not be any problem in terms of memory. This would make it possible to filter on events, stations, etc. Let me give an example to clarify what I am suggesting:
- Assume that we have the following directory containing several HDF5 files:

  ```
  dir_h5
  file1.h5  file2.h5  file3.h5
  ```

- Read the whole directory with, let's say:

  ```python
  data_set = ASDFDataSet("dir_h5")
  ```

  which reads all the files.

- Suppose we are only interested in the events and/or stations in one specific location (or whatever):

  ```python
  data_set.filter_events("...similar filtering as read_events in obspy...")
  ```

  or:

  ```python
  data_set.filter_stations("...channel='*Z', coordinates, etc...")
  ```

- End up with a dataset containing only the desired stations/events.
The advantage is that we would be able to read all the files at once and filter them in one or two lines, ending up with exactly the desired dataset out of a larger archive.
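A minimal sketch of the idea, independent of pyasdf: a wrapper class that only collects the paths of the `.h5` files in a directory (so nothing is actually opened or read) and exposes a generic filter. The class name `DirectoryDataSet` and the `filter_files` method are hypothetical illustrations, not part of the pyasdf API; the proposed `filter_events`/`filter_stations` would apply ObsPy-style criteria instead of a plain path predicate.

```python
import glob
import os
import tempfile

class DirectoryDataSet:
    """Hypothetical sketch: hold paths to all .h5 files in a
    directory without opening any of them (cheap and lazy)."""

    def __init__(self, directory):
        # Only the file names are stored, no HDF5 I/O happens here.
        self.files = sorted(glob.glob(os.path.join(directory, "*.h5")))

    def filter_files(self, predicate):
        """Return the subset of paths for which predicate(path) is True.
        In the real proposal this would be filter_events/filter_stations
        with ObsPy-style selection criteria."""
        return [f for f in self.files if predicate(f)]

# Demo: create a throwaway directory with three empty .h5 files.
tmp = tempfile.mkdtemp()
for name in ("file1.h5", "file2.h5", "file3.h5"):
    open(os.path.join(tmp, name), "w").close()

ds = DirectoryDataSet(tmp)
print(len(ds.files))                                  # 3
print(ds.filter_files(lambda f: "file2" in f))        # only file2.h5
```

Because only paths are kept, scanning even a large archive stays cheap; the expensive per-file reads would happen after filtering narrows the list down.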