Compare commits: develop...38-simplif (31 commits)

Commits: e434bda993, c0c3cbbd7b, 76d4ec290c, 63dac0fff6, b15cfe2e1d, 29a1e4ebe6,
50cabb0038, 8dd5789b0c, c4aeab0d89, ef69fc429f, c8f9c1c33a, 59f2c4b46f, 8eb958c91b,
6b7f297d7a, 70d5c2d621, 7393201b90, d02f74ab10, 0a5f5f0817, db976e5ea9, c021ca19d7,
356988e71d, ad62284e0e, 0fb0b0f11c, e3dd4a4e28, 2bbb84190c, fb32d5e0c5, f043401cc0,
eb15382b5f, 7fbc3bc5ae, 221743fe20, 554afc5a81
**.gitignore** (vendored, 1 change)

```diff
@@ -2,4 +2,3 @@
 *~
 .idea
 pylot/RELEASE-VERSION
-/tests/test_autopicker/dmt_database_test/
```
**.mailmap** (file deleted, 39 lines)

```diff
@@ -1,39 +0,0 @@
-Darius Arnold <Darius.Arnold@ruhr-uni-bochum.de> <Darius_A@web.de>
-Darius Arnold <Darius.Arnold@ruhr-uni-bochum.de> <darius.arnold@rub.de>
-Darius Arnold <Darius.Arnold@ruhr-uni-bochum.de> <darius.arnold@ruhr-uni-bochum.de>
-Darius Arnold <Darius.Arnold@ruhr-uni-bochum.de> <mail@dariusarnold.de>
-
-Dennis Wlecklik <dennisw@minos02.geophysik.ruhr-uni-bochum.de>
-
-Jeldrik Gaal <jeldrikgaal@gmail.com>
-
-Kaan Coekerim <kaan.coekerim@ruhr-uni-bochum.de>
-Kaan Coekerim <kaan.coekerim@ruhr-uni-bochum.de> <kaan.coekerim@rub.de>
-
-Ludger Kueperkoch <kueperkoch@igem-energie.de> <kueperkoch@bestec-for-nature.com>
-Ludger Kueperkoch <kueperkoch@igem-energie.de> <ludger@quake2.(none)>
-Ludger Kueperkoch <kueperkoch@igem-energie.de> <ludger@sauron.bestec-for-nature>
-
-Marc S. Boxberg <marc.boxberg@rub.de>
-
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel.paffrath@rub.de>
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@minos01.geophysik.ruhr-uni-bochum.de>
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@minos02.geophysik.ruhr-uni-bochum.de>
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@minos25.geophysik.ruhr-uni-bochum.de>
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@email.com>
-
-Sally Zimmermann <sally.zimmermann@ruhr-uni-bochum.de>
-
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastianw@minos01.geophysik.ruhr-uni-bochum.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastianw@minos02.geophysik.ruhr-uni-bochum.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastianw@minos22.geophysik.ruhr-uni-bochum.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastian.wehling-benatelli@scisys.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastian.wehling@rub.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastian.wehling@rub.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <DarkBeQst@users.noreply.github.com>
-
-Thomas Moeller <thomas.moeller@rub.de>
-
-Ann-Christin Koch <ann-christin.koch@ruhr-uni-bochum.de> <Ann-Christin.Koch@ruhr-uni-bochum.de>
-
-Sebastian Priebe <sebastian.priebe@rub.de>
```
**README.md** (47 changes)

```diff
@@ -11,7 +11,7 @@ PILOT has originally been developed in Mathworks' MatLab. In order to distribute
 problems, it has been decided to redevelop the software package in Python. The great work of the ObsPy group allows easy
 handling of a bunch of seismic data and PyLoT will benefit a lot compared to the former MatLab version.
 
-The development of PyLoT is part of the joint research project MAGS2, AlpArray and AdriaArray.
+The development of PyLoT is part of the joint research project MAGS2 and AlpArray.
 
 ## Installation
 
@@ -27,44 +27,58 @@ Afterwards run (from the PyLoT main directory where the files *requirements.txt*
     conda env create -f pylot.yml
 or
 
-    conda create -c conda-forge --name pylot_311 python=3.11 --file requirements.txt
+    conda create --name pylot_38 --file requirements.txt
 
-to create a new Anaconda environment called *pylot_311*.
+to create a new Anaconda environment called "pylot_38".
 
 Afterwards activate the environment by typing
 
-    conda activate pylot_311
+    conda activate pylot_38
 
 #### Prerequisites:
 
 In order to run PyLoT you need to install:
 
 - Python 3
-- cartopy
-- joblib
 - obspy
-- pyaml
-- pyqtgraph
 - pyside2
+- pyqtgraph
+- cartopy
 
 (the following are already dependencies of the above packages):
 - scipy
 - numpy
-- matplotlib
+- matplotlib <= 3.3.x
 
 #### Some handwork:
 
-Some extra information on error estimates (just needed for reading old PILOT data) and the Richter magnitude scaling
+PyLoT needs a properties folder on your system to work. It should be situated in your home directory
+(on Windows usually C:/Users/*username*):
+
+    mkdir ~/.pylot
+
+In the next step you have to copy some files to this directory:
+
+*for local distance seismicity*
+
+    cp path-to-pylot/inputs/pylot_local.in ~/.pylot/pylot.in
+
+*for regional distance seismicity*
+
+    cp path-to-pylot/inputs/pylot_regional.in ~/.pylot/pylot.in
+
+*for global distance seismicity*
+
+    cp path-to-pylot/inputs/pylot_global.in ~/.pylot/pylot.in
+
+and some extra information on error estimates (just needed for reading old PILOT data) and the Richter magnitude scaling
 relation
 
     cp path-to-pylot/inputs/PILOT_TimeErrors.in path-to-pylot/inputs/richter_scaling.data ~/.pylot/
 
 You may need to do some modifications to these files. Especially folder names should be reviewed.
 
-PyLoT has been tested on Mac OSX (10.11), Debian Linux 8 and on Windows 10/11.
-
-## Example Dataset
-An example dataset with waveform data, metadata and automatic picks in the obspy-dmt dataset format for testing the teleseismic picking can be found at https://zenodo.org/doi/10.5281/zenodo.13759803
+PyLoT has been tested on Mac OSX (10.11), Debian Linux 8 and on Windows 10.
 
 ## Release notes
 
@@ -73,7 +87,6 @@ An example dataset with waveform data, metadata and automatic picks in the obspy
 - event organisation in project files and waveform visualisation
 - consistent manual phase picking through predefined SNR dependant zoom level
 - consistent automatic phase picking routines using Higher Order Statistics, AIC and Autoregression
-- pick correlation correction for teleseismic waveforms
 - interactive tuning of auto-pick parameters
 - uniform uncertainty estimation from waveform's properties for automatic and manual picks
 - pdf representation and comparison of picks taking the uncertainty intrinsically into account
@@ -82,7 +95,7 @@ An example dataset with waveform data, metadata and automatic picks in the obspy
 
 #### Known issues:
 
-Current release is still in development progress and has several issues. We are currently lacking manpower, but hope to assess many of the issues in the near future.
+We hope to solve these with the next release.
 
 ## Staff
 
@@ -95,4 +108,4 @@ Others: A. Bruestle, T. Meier, W. Friederich
 
 [ObsPy]: http://github.com/obspy/obspy/wiki
 
-September 2024
+April 2022
```
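The "handwork" step restored above amounts to creating *~/.pylot* and copying one of the distance-specific input files into it. A minimal sketch of that step in Python, assuming the checkout lives at a hypothetical `PYLOT_SRC` path (the README itself documents the shell commands, not this script):

```python
import os
import shutil

# Hypothetical location of the PyLoT checkout; adjust to your system.
PYLOT_SRC = os.path.expanduser("~/code/pylot")

def setup_pylot_properties(regime: str = "local") -> None:
    """Create ~/.pylot and copy the input file for the given distance regime.

    regime is one of 'local', 'regional' or 'global', matching the
    pylot_<regime>.in files shipped in the inputs/ directory.
    """
    props = os.path.expanduser("~/.pylot")
    os.makedirs(props, exist_ok=True)

    # Distance-regime specific defaults -> ~/.pylot/pylot.in
    src = os.path.join(PYLOT_SRC, "inputs", f"pylot_{regime}.in")
    shutil.copyfile(src, os.path.join(props, "pylot.in"))

    # Extra files: PILOT time errors and the Richter scaling relation.
    for fname in ("PILOT_TimeErrors.in", "richter_scaling.data"):
        shutil.copyfile(os.path.join(PYLOT_SRC, "inputs", fname),
                        os.path.join(props, fname))

setup_pylot_properties("regional")
```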
**autoPyLoT.py** (29 changes)

```diff
@@ -136,9 +136,11 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
     if parameter.hasParam('datastructure'):
         # getting information on data structure
         datastructure = DATASTRUCTURE[parameter.get('datastructure')]()
-        dsfields = {'dpath': parameter.get('datapath'),}
+        dsfields = {'root': parameter.get('rootpath'),
+                    'dpath': parameter.get('datapath'),
+                    'dbase': parameter.get('database')}
 
-        exf = ['dpath']
+        exf = ['root', 'dpath', 'dbase']
 
         if parameter['eventID'] != '*' and fnames == 'None':
             dsfields['eventID'] = parameter['eventID']
@@ -184,15 +186,15 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
     if not input_dict:
         # started in production mode
         datapath = datastructure.expandDataPath()
-        if fnames in [None, 'None'] and parameter['eventID'] == '*':
+        if fnames == 'None' and parameter['eventID'] == '*':
             # multiple event processing
             # read each event in database
             events = [event for event in glob.glob(os.path.join(datapath, '*')) if
                       (os.path.isdir(event) and not event.endswith('EVENTS-INFO'))]
-        elif fnames in [None, 'None'] and parameter['eventID'] != '*' and not type(parameter['eventID']) == list:
+        elif fnames == 'None' and parameter['eventID'] != '*' and not type(parameter['eventID']) == list:
             # single event processing
             events = glob.glob(os.path.join(datapath, parameter['eventID']))
-        elif fnames in [None, 'None'] and type(parameter['eventID']) == list:
+        elif fnames == 'None' and type(parameter['eventID']) == list:
             # multiple event processing
             events = []
             for eventID in parameter['eventID']:
@@ -204,10 +206,12 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
             locflag = 2
     else:
         # started in tune or interactive mode
-        datapath = parameter['datapath']
+        datapath = os.path.join(parameter['rootpath'],
+                                parameter['datapath'])
        events = []
        for eventID in eventid:
            events.append(os.path.join(datapath,
+                                       parameter['database'],
                                       eventID))
 
     if not events:
@@ -234,15 +238,12 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
                 data.get_evt_data().path = eventpath
                 print('Reading event data from filename {}...'.format(filename))
             except Exception as e:
-                if type(e) == FileNotFoundError:
-                    print('Creating new event file.')
-                else:
-                    print('Could not read event from file {}: {}'.format(filename, e))
+                print('Could not read event from file {}: {}'.format(filename, e))
                 data = Data()
                 pylot_event = Event(eventpath)  # event should be path to event directory
                 data.setEvtData(pylot_event)
-            if fnames in [None, 'None']:
-                data.setWFData(glob.glob(os.path.join(event_datapath, '*')))
+            if fnames == 'None':
+                data.set_wf_data(glob.glob(os.path.join(datapath, event_datapath, '*')))
                 # the following is necessary because within
                 # multiple event processing no event ID is provided
                 # in autopylot.in
@@ -257,7 +258,7 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
                                               now.minute)
                 parameter.setParam(eventID=eventID)
             else:
-                data.setWFData(fnames)
+                data.set_wf_data(fnames)
 
             eventpath = events[0]
             # now = datetime.datetime.now()
@@ -267,7 +268,7 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
             #                                     now.hour,
             #                                     now.minute)
             parameter.setParam(eventID=eventid)
-        wfdat = data.getWFData()  # all available streams
+        wfdat = data.get_wf_data()  # all available streams
         if not station == 'all':
             wfdat = wfdat.select(station=station)
         if not wfdat:
```
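The core of this diff is how the per-event search path is assembled: one side resolves event directories from a single `datapath`, the other composes `rootpath/datapath/database`. An illustrative sketch of the two variants (function names are mine, not the repository's; parameter keys come from the hunks above):

```python
import glob
import os

def event_dirs_single_path(parameter: dict) -> list:
    # Variant 1: 'datapath' already points at the directory holding event folders.
    datapath = parameter['datapath']
    return [d for d in glob.glob(os.path.join(datapath, '*'))
            if os.path.isdir(d) and not d.endswith('EVENTS-INFO')]

def event_dirs_composed_path(parameter: dict) -> list:
    # Variant 2: the event directory is rootpath/datapath/database/<event>.
    datapath = os.path.join(parameter['rootpath'], parameter['datapath'])
    return [d for d in glob.glob(os.path.join(datapath, parameter['database'], '*'))
            if os.path.isdir(d) and not d.endswith('EVENTS-INFO')]
```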
**Pick-Correlation Correction README** (file deleted, 77 lines)

@@ -1,77 +0,0 @@

# Pick-Correlation Correction

## Introduction
Currently, the pick-correlation correction algorithm is not accessible from the PyLoT GUI. The main file *pick_correlation_correction.py* is located in the directory *pylot/correlation*.
The program only works for an obspy dmt database structure.

The basic workflow of the algorithm is shown in the following diagram. The first step **(1)** is the normal (automatic) picking procedure in PyLoT. Everything from step **(2)** to **(5)** is part of the correlation correction algorithm.

*Note: The first step is not required in case theoretical onsets are used instead of external picks, i.e. when the parameter use_taupy_onsets is set to True. However, an existing event quakeML (.xml) file generated by PyLoT might be required for each event in case external picks are not used.*

![images/workflow_stacking.png](images/workflow_stacking.png)

A detailed description of the algorithm can be found in the corresponding publication:

*Paffrath, M., Friederich, W., and the AlpArray and AlpArray-SWATH D Working Groups: Teleseismic P waves at the AlpArray seismic network: wave fronts, absolute travel times and travel-time residuals, Solid Earth, 12, 1635–1660, https://doi.org/10.5194/se-12-1635-2021, 2021.*

## How to use
To use the program you have to call the main program providing two mandatory arguments: a path to the obspy dmt database folder *dmt_database_path* and the path to the PyLoT infile *pylot.in* for picking of the beam trace:

```python pick_correlation_correction.py dmt_database_path pylot.in```

By default, the parameter file *parameters.yaml* is used. You can use the command line option *--params* to specify a different parameter file, and other optional arguments such as *-pd* for plotting detailed information or *-n 4* to use 4 cores for parallel processing:

```python pick_correlation_correction.py dmt_database_path pylot.in --params parameters_adriaarray.yaml -pd -n 4```

## Cross-Correlation Parameters

The program uses the parameters in the file *parameters.yaml* by default. You can use the command line option *--params* to specify a different parameter file. An example of the parameter file is provided in the *correlation/parameters.yaml* file.

At the top level of the parameter file the logging level *logging* can be set, as well as a list of pick phases *pick_phases* (e.g. ['P', 'S']).

For each pick phase the different parameters can be set in the first sub-level of the parameter file, e.g.:

```yaml
logging: info
pick_phases: ['P', 'S']

P:
  min_corr_stacking: 0.8
  min_corr_export: 0.6
  [...]

S:
  min_corr_stacking: 0.7
  [...]
```

The following parameters are available:

| Parameter Name | Description | Parameter Type |
|--------------------------------|------------------------------------------------------------------------------------------------------|----------------|
| min_corr_stacking | Minimum correlation coefficient for building beam trace | float |
| min_corr_export | Minimum correlation coefficient for pick export | float |
| min_stack | Minimum number of stations for building beam trace | int |
| t_before | Correlation window before reference pick | float |
| t_after | Correlation window after reference pick | float |
| cc_maxlag | Maximum shift for initial correlation | float |
| cc_maxlag2 | Maximum shift for second (final) correlation (also for calculating pick uncertainty) | float |
| initial_pick_outlier_threshold | Threshold for excluding large outliers of initial (AIC) picks | float |
| export_threshold | Automatically exclude all onsets which deviate more than this threshold from corrected taup onsets | float |
| min_picks_export | Minimum number of correlated picks for export | int |
| min_picks_autopylot | Minimum number of reference auto picks to continue with event | int |
| check_RMS | Do RMS check to search for restitution errors (very experimental) | bool |
| use_taupy_onsets | Use taupy onsets as reference picks instead of external picks | bool |
| station_list | Use the following stations as reference for stacking | list[str] |
| use_stacked_trace | Use existing stacked trace if found (spares re-computation) | bool |
| data_dir | obspyDMT data subdirectory (e.g. 'raw', 'processed') | str |
| pickfile_extension | Use quakeML files (PyLoT output) with the following extension | str |
| dt_stacking | Time difference for stacking window (in seconds) | list[float] |
| filter_options | Filter for first correlation (rough) | dict |
| filter_options_final | Filter for second correlation (fine) | dict |
| filter_type | Filter type (e.g. bandpass) | str |
| sampfreq | Sampling frequency (in Hz) | float |

## Example Dataset
An example dataset with waveform data, metadata and automatic picks in the obspy-dmt dataset format for testing can be found at https://zenodo.org/doi/10.5281/zenodo.13759803
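A minimal sketch (not part of the repository) of reading the parameter layout shown above with PyYAML — the actual loader in *pick_correlation_correction.py* may differ:

```python
import yaml  # PyYAML; the pylot environment ships pyaml, which depends on it

# Load the two-level layout: top-level options plus one mapping per phase.
with open("parameters.yaml") as f:
    params = yaml.safe_load(f)

print(params["logging"])             # e.g. 'info'
for phase in params["pick_phases"]:  # e.g. ['P', 'S']
    phase_params = params[phase]
    # Per-phase thresholds from the table above.
    print(phase, phase_params["min_corr_stacking"])
```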
```diff
@@ -203,6 +203,8 @@ The meaning of the header entries is:
 PyLoT GUI starts with an empty project. To add events, use the add event data button. Select one or multiple folders
 containing events.
 
+[//]: <> (TODO: explain _Directories: Root path, Data path, Database path_)
+
 ### Saving projects
 
 Save the current project from the menu with File->Save project or File->Save project as. PyLoT uses ``.plp`` files to
```
Binary image file removed (not shown; before: 64 KiB).
**docs/tuning.md** (199 changes)

The hunks in this file only re-flow the parameter tables (long entries in the Name column had been wrapped across rows); the descriptions are unchanged, so each table is shown once below in reconstructed form.

@@ -6,123 +6,122 @@ A description of the parameters used for determining automatic picks.

Parameters applied to the traces before the picking algorithm starts.

| Name | Description |
|------|-------------|
| *P Start*, *P Stop* | Define time interval relative to trace start time for CF calculation on vertical trace. Value is relative to theoretical onset time if 'Use TauPy' option is enabled in main settings of 'Tune Autopicker' dialogue. |
| *S Start*, *S Stop* | Define time interval relative to trace start time for CF calculation on horizontal traces. Value is relative to theoretical onset time if 'Use TauPy' option is enabled in main settings of 'Tune Autopicker' dialogue. |
| *Bandpass Z1* | Filter settings for Butterworth bandpass applied to vertical trace for calculation of initial P pick. |
| *Bandpass Z2* | Filter settings for Butterworth bandpass applied to vertical trace for calculation of precise P pick. |
| *Bandpass H1* | Filter settings for Butterworth bandpass applied to horizontal traces for calculation of initial S pick. |
| *Bandpass H2* | Filter settings for Butterworth bandpass applied to horizontal traces for calculation of precise S pick. |

## Initial P pick

Parameters used for determination of the initial P pick.

| Name | Description |
|------|-------------|
| *tLTA* | Size of gliding LTA window in seconds used for calculation of HOS-CF. |
| *pickwinP* | Size of time window in seconds in which the minimum of the AIC-CF in front of the maximum of the HOS-CF is determined. |
| *AICtsmooth* | Average of samples in this time window will be used for smoothing of the AIC-CF. |
| *checkwinP* | Time in front of the global maximum of the HOS-CF in which to search for a second local extremum. |
| *minfactorP* | Used with *checkwinP*. If a second local maximum is found, it has to be at least as big as the first maximum * *minfactorP*. |
| *tsignal* | Time window in seconds after the initial P pick used for determining signal amplitude. |
| *tnoise* | Time window in seconds in front of the initial P pick used for determining noise amplitude. |
| *tsafetey* | Time in seconds between *tsignal* and *tnoise*. |
| *tslope* | Time window in seconds after the initial P pick in which the slope of the onset is calculated. |

## Initial S pick

Parameters used for determination of the initial S pick.

| Name | Description |
|------|-------------|
| *tdet1h* | Length of time window in seconds in which AR params of the waveform are determined. |
| *tpred1h* | Length of time window in seconds in which the waveform is predicted using the AR model. |
| *AICtsmoothS* | Average of samples in this time window is used for smoothing the AIC-CF. |
| *pickwinS* | Time window in which the minimum in the AIC-CF in front of the maximum in the ARH-CF is determined. |
| *checkwinS* | Time in front of the global maximum of the ARH-CF in which to search for a second local extremum. |
| *minfactorS* | Used with *checkwinS*. If a second local maximum is found, it has to be at least as big as the first maximum * *minfactorS*. |
| *tsignal* | Time window in seconds after the initial P pick used for determining signal amplitude. |
| *tnoise* | Time window in seconds in front of the initial P pick used for determining noise amplitude. |
| *tsafetey* | Time in seconds between *tsignal* and *tnoise*. |
| *tslope* | Time window in seconds after the initial P pick in which the slope of the onset is calculated. |

## Precise P pick

Parameters used for determination of the precise P pick.

| Name | Description |
|------|-------------|
| *Precalcwin* | Time window in seconds for recalculation of the HOS-CF. The new CF will be two times the size of *Precalcwin*, since it will be calculated from the initial pick to +/- *Precalcwin*. |
| *tsmoothP* | Average of samples in this time window will be used for smoothing the second HOS-CF. |
| *ausP* | Controls artificial uplift of samples during precise picking. A common local minimum of the smoothed and unsmoothed HOS-CF is found when the previous sample is larger or equal to the current sample times (1+*ausP*). |

## Precise S pick

Parameters used for determination of the precise S pick.

| Name | Description |
|------|-------------|
| *tdet2h* | Time window for determination of AR coefficients. |
| *tpred2h* | Time window in which the waveform is predicted using the determined AR parameters. |
| *Srecalcwin* | Time window for recalculation of ARH-CF. New CF will be calculated from initial pick +/- *Srecalcwin*. |
| *tsmoothS* | Average of samples in this time window will be used for smoothing the second ARH-CF. |
| *ausS* | Controls artificial uplift of samples during precise picking. A common local minimum of the smoothed and unsmoothed ARH-CF is found when the previous sample is larger or equal to the current sample times (1+*ausS*). |
| *pickwinS* | Time window around the initial pick in which to look for a precise pick. |

## Pick quality control

Parameters used for checking quality and integrity of automatic picks.

| Name | Description |
|------|-------------|
| *minAICPslope* | Initial P picks with a slope lower than this value will be discarded. |
| *minAICPSNR* | Initial P picks with an SNR below this value will be discarded. |
| *minAICSslope* | Initial S picks with a slope lower than this value will be discarded. |
| *minAICSSNR* | Initial S picks with an SNR below this value will be discarded. |
| *minsiglength*, *noisefactor*, *minpercent* | Parameters for checking signal length. In the time window of size *minsiglength* after the initial P pick, *minpercent* of samples have to be larger than the RMS value. |

@@ -140,12 +139,12 @@ wdttolerance* | Maximum allowed deviation of S onset

Parameters for discrete quality classes.

| Name | Description |
|------|-------------|
| *timeerrorsP* | Width of the time windows in seconds between earliest and latest possible pick which represent the quality classes 0, 1, 2, 3 for P onsets. |
| *timeerrorsS* | Width of the time windows in seconds between earliest and latest possible pick which represent the quality classes 0, 1, 2, 3 for S onsets. |
| *nfacP*, *nfacS* | For determination of latest possible onset time. The time when the signal reaches an amplitude of *nfac* * mean value of the RMS amplitude in the time window *tnoise* corresponds to the latest possible onset time. |
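The *timeerrorsP*/*timeerrorsS* values define the boundaries between the discrete quality classes. A sketch of the mapping, assuming the pick uncertainty is compared against the four boundaries and anything beyond the last one falls into a worst class 4 (this function is illustrative, not PyLoT's actual implementation):

```python
from bisect import bisect_left
from typing import Sequence

def quality_class(uncertainty: float, timeerrors: Sequence[float]) -> int:
    """Map a pick uncertainty (seconds) to a discrete quality class.

    timeerrors holds the class boundaries (e.g. the four values behind
    *timeerrorsP*); uncertainties beyond the last boundary fall into the
    worst class 4.
    """
    return bisect_left(list(timeerrors), uncertainty)

# Hypothetical boundaries for P onsets, classes 0..4:
assert quality_class(0.03, (0.04, 0.08, 0.16, 0.32)) == 0
assert quality_class(0.50, (0.04, 0.08, 0.16, 0.32)) == 4
```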
**PyLoT input files** (inputs/*.in): three input files receive the same restructuring of the main settings block.

*Input file (global extent):*

```diff
@@ -4,8 +4,10 @@
 %Parameters are optimized for %extent data sets!
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 #main settings#
-         #datapath#       %data path
-         #eventID#        %event ID for single event processing (* for all events found in datapath)
+         #rootpath#       %project path
+         #datapath#       %data path
+         #database#       %name of data base
+         #eventID#        %event ID for single event processing (* for all events found in database)
          #invdir#         %full path to inventory or dataless-seed file
 PILOT    #datastructure#  %choose data structure
 True     #apverbose#      %choose 'True' or 'False' for terminal output
@@ -41,7 +43,6 @@ global #extent# %extent of a
 1150.0   #sstop#          %end time [s] after P-onset for calculating CF for S-picking
 True     #use_taup#       %use estimated traveltimes from TauPy for calculating windows for CF
-iasp91   #taup_model#     %define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
-P,Pdiff  #taup_phases#    %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
+iasp91   #taup_model#     %define TauPy model for traveltime estimation
 0.05 0.5   #bpz1#         %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
 0.001 0.5  #bpz2#         %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
 0.05 0.5   #bph1#         %lower/upper corner freq. of first band pass filter H-comp. [Hz]
```

*Input file (local extent, Insheim example):*

```diff
@@ -4,8 +4,10 @@
 %Parameters are optimized for %extent data sets!
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 #main settings#
-/DATA/Insheim/EVENT_DATA/LOCAL/2018.02_Insheim #datapath# %data path
-e0006.038.18      #eventID#        %event ID for single event processing (* for all events found in datapath)
+/DATA/Insheim     #rootpath#       %project path
+EVENT_DATA/LOCAL  #datapath#       %data path
+2018.02_Insheim   #database#       %name of data base
+e0006.038.18      #eventID#        %event ID for single event processing (* for all events found in database)
 /DATA/Insheim/STAT_INFO #invdir#   %full path to inventory or dataless-seed file
 PILOT             #datastructure#  %choose data structure
 True              #apverbose#      %choose 'True' or 'False' for terminal output
@@ -40,8 +42,7 @@ local #extent# %extent of a
 -0.5     #sstart#         %start time [s] relative to P-onset for calculating CF for S-picking
 10.0     #sstop#          %end time [s] after P-onset for calculating CF for S-picking
 False    #use_taup#       %use estimated traveltimes from TauPy for calculating windows for CF
-iasp91   #taup_model#     %define TauPy model for traveltime estimation
-P        #taup_phases#    %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
+iasp91   #taup_model#     %define TauPy model for traveltime estimation
 2.0 20.0   #bpz1#         %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
 2.0 30.0   #bpz2#         %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
 2.0 10.0   #bph1#         %lower/upper corner freq. of first band pass filter H-comp. [Hz]
```

*Input file (local extent, template):*

```diff
@@ -4,8 +4,10 @@
 %Parameters are optimized for %extent data sets!
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 #main settings#
-         #datapath#       %data path
-         #eventID#        %event ID for single event processing (* for all events found in datapath)
+         #rootpath#       %project path
+         #datapath#       %data path
+         #database#       %name of data base
+         #eventID#        %event ID for single event processing (* for all events found in database)
          #invdir#         %full path to inventory or dataless-seed file
 PILOT    #datastructure#  %choose data structure
 True     #apverbose#      %choose 'True' or 'False' for terminal output
@@ -40,8 +42,7 @@ local #extent# %extent of a
 -1.0     #sstart#         %start time [s] relative to P-onset for calculating CF for S-picking
 10.0     #sstop#          %end time [s] after P-onset for calculating CF for S-picking
 True     #use_taup#       %use estimated traveltimes from TauPy for calculating windows for CF
-iasp91   #taup_model#     %define TauPy model for traveltime estimation
-P        #taup_phases#    %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
+iasp91   #taup_model#     %define TauPy model for traveltime estimation
 2.0 10.0   #bpz1#         %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
 2.0 12.0   #bpz2#         %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
 2.0 8.0    #bph1#         %lower/upper corner freq. of first band pass filter H-comp. [Hz]
```
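Each line of these input files pairs an optional value with a `#key#` token and a `%comment`. A tolerant parser sketch for that layout (hypothetical; PyLoT's own `PylotParameter` class is the authoritative reader):

```python
import re

# One line of a PyLoT .in file looks like:
#   2.0 30.0   #bpz2#   %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
LINE = re.compile(r"^(?P<value>[^#%]*)#(?P<key>[^#]+)#\s*(?:%(?P<comment>.*))?$")

def parse_in_line(line: str):
    """Return (key, value tokens, comment) for one .in line, or None."""
    m = LINE.match(line.strip())
    if not m:
        return None
    value = m.group("value").strip()
    return (m.group("key"),
            value.split() if value else [],
            (m.group("comment") or "").strip())

print(parse_in_line("2.0 30.0 #bpz2# %lower/upper corner freq."))
# ('bpz2', ['2.0', '30.0'], 'lower/upper corner freq.')
```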
**pylot.yml** (18 changes)

```diff
@@ -1,12 +1,14 @@
-name: pylot_311
+name: pylot_38
 channels:
   - conda-forge
   - defaults
 dependencies:
-  - cartopy=0.23.0=py311hcf9f919_1
-  - joblib=1.4.2=pyhd8ed1ab_0
-  - obspy=1.4.1=py311he736701_3
-  - pyaml=24.7.0=pyhd8ed1ab_0
-  - pyqtgraph=0.13.7=pyhd8ed1ab_0
-  - pyside2=5.15.8=py311h3d699ce_4
-  - pytest=8.3.2=pyhd8ed1ab_0
+  - cartopy=0.20.2
+  - matplotlib-base=3.3.4
+  - numpy=1.22.3
+  - obspy=1.3.0
+  - pyqtgraph=0.12.4
+  - pyside2>=5.13.2
+  - python=3.8.12
+  - qt>=5.12.9
+  - scipy=1.8.0
```
```diff
@@ -9,7 +9,7 @@ PyLoT - the Python picking and Localization Tool
 
 This python library contains a graphical user interfaces for picking
 seismic phases. This software needs ObsPy (http://github.com/obspy/obspy/wiki)
-and the Qt libraries to be installed first.
+and the Qt4 libraries to be installed first.
 
 PILOT has been developed in Mathworks' MatLab. In order to distribute
 PILOT without facing portability problems, it has been decided to re-
```
@ -1,660 +1,69 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
import copy
|
||||
import logging
|
||||
import os
|
||||
from dataclasses import dataclass, field
|
||||
from typing import List, Union
|
||||
|
||||
from PySide2.QtWidgets import QMessageBox
|
||||
from obspy import read_events
|
||||
from obspy.core import read, Stream, UTCDateTime
|
||||
from obspy.core.event import Event as ObsPyEvent
|
||||
from obspy.io.sac import SacIOError
|
||||
from obspy import UTCDateTime
|
||||
|
||||
import pylot.core.loc.focmec as focmec
|
||||
import pylot.core.loc.hypodd as hypodd
|
||||
import pylot.core.loc.velest as velest
|
||||
from pylot.core.io.phases import readPILOTEvent, picks_from_picksdict, \
|
||||
picksdict_from_pilot, merge_picks, PylotParameter
|
||||
from pylot.core.util.errors import FormatError, OverwriteError
|
||||
from pylot.core.util.event import Event
|
||||
from pylot.core.util.obspyDMT_interface import qml_from_obspyDMT
|
||||
from pylot.core.util.utils import fnConstructor, full_range, check4rotated, \
|
||||
check_for_gaps_and_merge, trim_station_components, check_for_nan
|
||||
from pylot.core.io.event import EventData
|
||||
from pylot.core.io.waveformdata import WaveformData
|
||||
from pylot.core.util.dataprocessing import Metadata
|
||||
|
||||
|
||||
class Data(object):
|
||||
"""
|
||||
Data container with attributes wfdata holding ~obspy.core.stream.
|
||||
|
||||
:type parent: PySide2.QtWidgets.QWidget object, optional
|
||||
:param parent: A PySide2.QtWidgets.QWidget object utilized when
|
||||
called by a GUI to display a PySide2.QtWidgets.QMessageBox instead of printing
|
||||
to standard out.
|
||||
:type evtdata: ~obspy.core.event.Event object, optional
|
||||
:param evtdata ~obspy.core.event.Event object containing all derived or
|
||||
loaded event. Container object holding, e.g. phase arrivals, etc.
|
||||
"""
|
||||
@dataclass
|
||||
class Data:
|
||||
event_data: EventData = field(default_factory=EventData)
|
||||
waveform_data: WaveformData = field(default_factory=WaveformData)
|
||||
metadata: Metadata = field(default_factory=Metadata)
|
||||
_parent: Union[None, 'QtWidgets.QWidget'] = None
|
||||
|
||||
def __init__(self, parent=None, evtdata=None):
|
||||
self._parent = parent
|
||||
if self.getParent():
|
||||
self.comp = parent.getComponent()
|
||||
else:
|
||||
self.comp = 'Z'
|
||||
self.wfdata = Stream()
|
||||
self._new = False
|
||||
if isinstance(evtdata, ObsPyEvent) or isinstance(evtdata, Event):
|
||||
pass
|
||||
elif isinstance(evtdata, dict):
|
||||
evt = readPILOTEvent(**evtdata)
|
||||
evtdata = evt
|
||||
elif isinstance(evtdata, str):
|
||||
try:
|
||||
cat = read_events(evtdata)
|
||||
if len(cat) != 1:
|
||||
raise ValueError('ambiguous event information for file: '
|
||||
'{file}'.format(file=evtdata))
|
||||
evtdata = cat[0]
|
||||
except TypeError as e:
|
||||
if 'Unknown format for file' in e.message:
|
||||
if 'PHASES' in evtdata:
|
||||
picks = picksdict_from_pilot(evtdata)
|
||||
evtdata = ObsPyEvent()
|
||||
evtdata.picks = picks_from_picksdict(picks)
|
||||
elif 'LOC' in evtdata:
|
||||
raise NotImplementedError('PILOT location information '
|
||||
'read support not yet '
|
||||
'implemented.')
|
||||
elif 'event.pkl' in evtdata:
|
||||
evtdata = qml_from_obspyDMT(evtdata)
|
||||
else:
|
||||
raise e
|
||||
else:
|
||||
raise e
|
||||
else: # create an empty Event object
|
||||
self.setNew()
|
||||
evtdata = ObsPyEvent()
|
||||
evtdata.picks = []
|
||||
self.evtdata = evtdata
|
||||
self.wforiginal = None
|
||||
self.cuttimes = None
|
||||
self.dirty = False
|
||||
self.processed = None
|
||||
self.event_data = EventData(evtdata)
|
||||
self.waveform_data = WaveformData()
|
||||
|
||||
def __str__(self):
|
||||
return str(self.wfdata)
|
||||
return str(self.waveform_data.wfdata)
|
||||
|
||||
def __add__(self, other):
|
||||
assert isinstance(other, Data), "operands must be of same type 'Data'"
|
||||
rs_id = self.get_evt_data().get('resource_id')
|
||||
rs_id_other = other.get_evt_data().get('resource_id')
|
||||
if other.isNew() and not self.isNew():
|
||||
picks_to_add = other.get_evt_data().picks
|
||||
old_picks = self.get_evt_data().picks
|
||||
wf_ids_old = [pick.waveform_id for pick in old_picks]
|
||||
for new_pick in picks_to_add:
|
||||
wf_id = new_pick.waveform_id
|
||||
if wf_id in wf_ids_old:
|
||||
for old_pick in old_picks:
|
||||
comparison = [old_pick.waveform_id == new_pick.waveform_id,
|
||||
old_pick.phase_hint == new_pick.phase_hint,
|
||||
old_pick.method_id == new_pick.method_id]
|
||||
if all(comparison):
|
||||
del (old_pick)
|
||||
old_picks.append(new_pick)
|
||||
elif not other.isNew() and self.isNew():
|
||||
new = other + self
|
||||
self.evtdata = new.get_evt_data()
|
||||
elif self.isNew() and other.isNew():
|
||||
if not isinstance(other, Data):
|
||||
raise TypeError("Operands must be of type 'Data'")
|
||||
if self.event_data.is_new() and other.event_data.is_new():
|
||||
pass
|
||||
elif rs_id == rs_id_other:
|
||||
other.setNew()
|
||||
elif other.event_data.is_new():
|
||||
new_picks = other.event_data.evtdata.picks
|
||||
old_picks = self.event_data.evtdata.picks
|
||||
old_picks.extend([pick for pick in new_picks if pick not in old_picks])
|
||||
elif self.event_data.is_new():
|
||||
return other + self
|
||||
elif self.event_data.get_id() == other.event_data.get_id():
|
||||
other.event_data.set_new()
|
||||
return self + other
|
||||
else:
|
||||
raise ValueError("both Data objects have differing "
|
||||
"unique Event identifiers")
|
||||
raise ValueError("Both Data objects have differing unique Event identifiers")
|
||||
return self
|
||||
|
||||
def getPicksStr(self):
|
||||
"""
|
||||
Return picks in event data
|
||||
:return: picks seperated by newlines
|
||||
:rtype: str
|
||||
"""
|
||||
picks_str = ''
|
||||
for pick in self.get_evt_data().picks:
|
||||
picks_str += str(pick) + '\n'
|
||||
return picks_str
|
||||
|
||||
def getParent(self):
|
||||
"""
|
||||
Get PySide.QtGui.QWidget parent object
|
||||
"""
|
||||
def get_parent(self):
|
||||
return self._parent
|
||||
|
||||
def isNew(self):
|
||||
return self._new
|
||||
def filter_wf_data(self, **kwargs):
|
||||
self.waveform_data.wfdata.detrend('linear')
|
||||
self.waveform_data.wfdata.taper(0.02, type='cosine')
|
||||
self.waveform_data.wfdata.filter(**kwargs)
|
||||
self.waveform_data.dirty = True
|
||||
|
||||
def setNew(self):
|
||||
self._new = True
|
||||
def set_wf_data(self, fnames: List[str], fnames_alt: List[str] = None, check_rotated=False, metadata=None, tstart=0, tstop=0):
|
||||
return self.waveform_data.load_waveforms(fnames, fnames_alt, check_rotated, metadata, tstart, tstop)
|
||||
|
||||
def getCutTimes(self):
|
||||
"""
|
||||
Returns earliest start and latest end of all waveform data
|
||||
:return: minimum start time and maximum end time as a tuple
|
||||
:rtype: (UTCDateTime, UTCDateTime)
|
||||
"""
|
||||
if self.cuttimes is None:
|
||||
self.updateCutTimes()
|
||||
return self.cuttimes
|
||||
def reset_wf_data(self):
|
||||
self.waveform_data.reset()
|
||||
|
||||
def updateCutTimes(self):
|
||||
"""
|
||||
Update cuttimes to contain earliest start and latest end time
|
||||
of all waveform data
|
||||
:rtype: None
|
||||
"""
|
||||
self.cuttimes = full_range(self.getWFData())
|
||||
def get_wf_data(self):
|
||||
return self.waveform_data.wfdata
|
||||
|
||||
def getEventFileName(self):
|
||||
ID = self.getID()
|
||||
# handle forbidden filenames especially on windows systems
|
||||
return fnConstructor(str(ID))
|
||||
|
||||
def checkEvent(self, event, fcheck, forceOverwrite=False):
|
||||
"""
|
||||
Check information in supplied event and own event and replace with own
|
||||
information if no other information are given or forced by forceOverwrite
|
||||
:param event: Event that supplies information for comparison
|
||||
:type event: pylot.core.util.event.Event
|
||||
:param fcheck: check and delete existing information
|
||||
can be a str or a list of strings of ['manual', 'auto', 'origin', 'magnitude']
|
||||
:type fcheck: str, [str]
|
||||
:param forceOverwrite: Set to true to force overwrite own information. If false,
|
||||
supplied information from event is only used if there is no own information in that
|
||||
category (given in fcheck: manual, auto, origin, magnitude)
|
||||
:type forceOverwrite: bool
|
||||
:return:
|
||||
:rtype: None
|
||||
"""
|
||||
if 'origin' in fcheck:
|
||||
self.replaceOrigin(event, forceOverwrite)
|
||||
if 'magnitude' in fcheck:
|
||||
self.replaceMagnitude(event, forceOverwrite)
|
||||
if 'auto' in fcheck:
|
||||
self.replacePicks(event, 'auto')
|
||||
if 'manual' in fcheck:
|
||||
self.replacePicks(event, 'manual')
|
||||
|
||||
def replaceOrigin(self, event, forceOverwrite=False):
|
||||
"""
|
||||
Replace own origin with the one supplied in event if own origin is not
|
||||
existing or forced by forceOverwrite = True
|
||||
:param event: Event that supplies information for comparison
|
||||
:type event: pylot.core.util.event.Event
|
||||
:param forceOverwrite: always replace own information with supplied one if true
|
||||
:type forceOverwrite: bool
|
||||
:return:
|
||||
:rtype: None
|
||||
"""
|
||||
if self.get_evt_data().origins or forceOverwrite:
|
||||
if event.origins:
|
||||
print("Found origin, replace it by new origin.")
|
||||
event.origins = self.get_evt_data().origins
|
||||
|
||||
def replaceMagnitude(self, event, forceOverwrite=False):
|
||||
"""
|
||||
Replace own magnitude with the one supplied in event if own magnitude is not
|
||||
existing or forced by forceOverwrite = True
|
||||
:param event: Event that supplies information for comparison
|
||||
:type event: pylot.core.util.event.Event
|
||||
:param forceOverwrite: always replace own information with supplied one if true
|
||||
:type forceOverwrite: bool
|
||||
:return:
|
||||
:rtype: None
|
||||
"""
|
||||
if self.get_evt_data().magnitudes or forceOverwrite:
|
||||
if event.magnitudes:
|
||||
print("Found magnitude, replace it by new magnitude")
|
||||
event.magnitudes = self.get_evt_data().magnitudes
|
||||
|
||||
def replacePicks(self, event, picktype):
|
||||
"""
|
||||
Replace picks in event with own picks
|
||||
:param event: Event that supplies information for comparison
|
||||
:type event: pylot.core.util.event.Event
|
||||
:param picktype: 'auto' or 'manual' picks
|
||||
:type picktype: str
|
||||
:return:
|
||||
:rtype: None
|
||||
"""
|
||||
checkflag = 1
|
||||
picks = event.picks
|
||||
# remove existing picks
|
||||
for j, pick in reversed(list(enumerate(picks))):
|
||||
try:
|
||||
if picktype in str(pick.method_id.id):
|
||||
picks.pop(j)
|
||||
checkflag = 2
|
||||
except AttributeError as e:
|
||||
msg = '{}'.format(e)
|
||||
print(e)
|
||||
checkflag = 0
|
||||
if checkflag > 0:
|
||||
if checkflag == 1:
|
||||
print("Write new %s picks to catalog." % picktype)
|
||||
if checkflag == 2:
|
||||
print("Found %s pick(s), remove them and append new picks to catalog." % picktype)
|
||||
|
||||
# append new picks
|
||||
for pick in self.get_evt_data().picks:
|
||||
if picktype in str(pick.method_id.id):
|
||||
picks.append(pick)
|
||||
|
||||
def exportEvent(self, fnout, fnext='.xml', fcheck='auto', upperErrors=None):
|
||||
"""
|
||||
Export event to file
|
||||
:param fnout: basename of file
|
||||
:param fnext: file extensions xml, cnv, obs, focmec, or/and pha
|
||||
:param fcheck: check and delete existing information
|
||||
can be a str or a list of strings of ['manual', 'auto', 'origin', 'magnitude']
|
||||
"""
|
||||
from pylot.core.util.defaults import OUTPUTFORMATS
|
||||
if not type(fcheck) == list:
|
||||
fcheck = [fcheck]
|
||||
|
||||
try:
|
||||
evtformat = OUTPUTFORMATS[fnext]
|
||||
except KeyError as e:
|
||||
errmsg = '{0}; selected file extension {1} not ' \
|
||||
'supported'.format(e, fnext)
|
||||
raise FormatError(errmsg)
|
||||
|
||||
if hasattr(self.get_evt_data(), 'notes'):
|
||||
try:
|
||||
with open(os.path.join(os.path.dirname(fnout), 'notes.txt'), 'w') as notes_file:
|
||||
notes_file.write(self.get_evt_data().notes)
|
||||
except Exception as e:
|
||||
print('Warning: Could not save notes.txt: ', str(e))
|
||||
|
||||
# check for already existing xml-file
|
||||
if fnext == '.xml':
|
||||
if os.path.isfile(fnout + fnext):
|
||||
print("xml-file already exists! Check content ...")
|
||||
cat = read_events(fnout + fnext)
|
||||
if len(cat) > 1:
|
||||
raise IOError('Ambigious event information in file {}'.format(fnout + fnext))
|
||||
if len(cat) < 1:
|
||||
raise IOError('No event information in file {}'.format(fnout + fnext))
|
||||
event = cat[0]
|
||||
if not event.resource_id == self.get_evt_data().resource_id:
|
||||
QMessageBox.warning(self, 'Warning', 'Different resource IDs!')
|
||||
return
|
||||
self.checkEvent(event, fcheck)
|
||||
self.setEvtData(event)
|
||||
|
||||
self.get_evt_data().write(fnout + fnext, format=evtformat)
|
||||
|
||||
# try exporting event
|
||||
else:
|
||||
evtdata_org = self.get_evt_data()
|
||||
picks = evtdata_org.picks
|
||||
eventpath = evtdata_org.path
|
||||
picks_copy = copy.deepcopy(picks)
|
||||
evtdata_copy = Event(eventpath)
|
||||
evtdata_copy.picks = picks_copy
|
||||
|
||||
# check for stations picked automatically as well as manually
|
||||
# Prefer manual picks!
|
||||
for i in range(len(picks)):
|
||||
if picks[i].method_id == 'manual':
|
||||
mstation = picks[i].waveform_id.station_code
|
||||
mstation_ext = mstation + '_'
|
||||
for k in range(len(picks_copy)):
|
||||
if ((picks_copy[k].waveform_id.station_code == mstation) or
|
||||
(picks_copy[k].waveform_id.station_code == mstation_ext)) and \
|
||||
(picks_copy[k].method_id == 'auto'):
|
||||
del picks_copy[k]
|
||||
break
|
||||
lendiff = len(picks) - len(picks_copy)
|
||||
if lendiff != 0:
|
||||
print("Manual as well as automatic picks available. Prefered the {} manual ones!".format(lendiff))
|
||||
|
||||
|
||||
        no_uncertainties_p = []
        no_uncertainties_s = []
        if upperErrors:
            # check for pick uncertainties exceeding adjusted upper errors
            # Picks with larger uncertainties will not be saved in the output file!
            for j in range(len(picks)):
                for i in range(len(picks_copy)):
                    if picks_copy[i].phase_hint[0] == 'P':
                        # skip the pick if no upper_uncertainty is found and warn the user
                        if picks_copy[i].time_errors['upper_uncertainty'] is None:
                            # print("{1} P-Pick of station {0} does not have upper_uncertainty and can't be checked".format(
                            #     picks_copy[i].waveform_id.station_code,
                            #     picks_copy[i].method_id))
                            if picks_copy[i].waveform_id.station_code not in no_uncertainties_p:
                                no_uncertainties_p.append(picks_copy[i].waveform_id.station_code)
                            continue

                        # print("checking for upper_uncertainty")
                        if (picks_copy[i].time_errors['uncertainty'] is None) or \
                                (picks_copy[i].time_errors['upper_uncertainty'] >= upperErrors[0]):
                            print("Uncertainty exceeds or equals adjusted upper time error!")
                            print("Adjusted uncertainty: {}".format(upperErrors[0]))
                            print("Pick uncertainty: {}".format(picks_copy[i].time_errors['uncertainty']))
                            print("{1} P-Pick of station {0} will not be saved in output file".format(
                                picks_copy[i].waveform_id.station_code,
                                picks_copy[i].method_id))
                            del picks_copy[i]
                            break
                    if picks_copy[i].phase_hint[0] == 'S':

                        # skip the pick if no upper_uncertainty is found and warn the user
                        if picks_copy[i].time_errors['upper_uncertainty'] is None:
                            # print("{1} S-Pick of station {0} does not have upper_uncertainty and can't be checked".format(
                            #     picks_copy[i].waveform_id.station_code,
                            #     picks_copy[i].method_id))
                            if picks_copy[i].waveform_id.station_code not in no_uncertainties_s:
                                no_uncertainties_s.append(picks_copy[i].waveform_id.station_code)
                            continue

                        if (picks_copy[i].time_errors['uncertainty'] is None) or \
                                (picks_copy[i].time_errors['upper_uncertainty'] >= upperErrors[1]):
                            print("Uncertainty exceeds or equals adjusted upper time error!")
                            print("Adjusted uncertainty: {}".format(upperErrors[1]))
                            print("Pick uncertainty: {}".format(picks_copy[i].time_errors['uncertainty']))
                            print("{1} S-Pick of station {0} will not be saved in output file".format(
                                picks_copy[i].waveform_id.station_code,
                                picks_copy[i].method_id))
                            del picks_copy[i]
                            break
            for s in no_uncertainties_p:
                print("P-Pick of station {0} does not have upper_uncertainty and can't be checked".format(s))
            for s in no_uncertainties_s:
                print("S-Pick of station {0} does not have upper_uncertainty and can't be checked".format(s))

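The same threshold test is spelled out twice above, once per phase. A hedged refactoring sketch (helper name hypothetical) that captures the shared rule, with `upperErrors[0]` applying to P and `upperErrors[1]` to S:

```python
def exceeds_upper_error(pick, threshold):
    """Sketch: True if the pick's uncertainty is missing or >= the allowed upper error."""
    errors = pick.time_errors
    return errors['uncertainty'] is None or errors['upper_uncertainty'] >= threshold

# hypothetical usage, mirroring the loops above:
# threshold = upperErrors[0] if pick.phase_hint[0] == 'P' else upperErrors[1]
# if pick.time_errors['upper_uncertainty'] is not None and exceeds_upper_error(pick, threshold):
#     ...drop the pick from picks_copy...
```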
        if fnext == '.obs':
            try:
                evtdata_copy.write(fnout + fnext, format=evtformat)
                # write header afterwards
                evid = str(evtdata_org.resource_id).split('/')[1]
                header = '# EQEVENT: Label: EQ%s Loc: X 0.00 Y 0.00 Z 10.00 OT 0.00 \n' % evid
                nllocfile = open(fnout + fnext)
                lines = nllocfile.readlines()
                # Adding A0/Generic Amplitude to .obs file
                # lines2 = []
                # for li in lines:
                #     for amp in evtdata_org.amplitudes:
                #         if amp.waveform_id.station_code == li[0:5].strip():
                #             li = li[0:64] + '{:0.2e}'.format(amp.generic_amplitude) + li[73:-1] + '\n'
                #     lines2.append(li)
                # lines = lines2
                nllocfile.close()
                lines.insert(0, header)
                nllocfile = open(fnout + fnext, 'w')
                nllocfile.write("".join(lines))
                nllocfile.close()
            except KeyError as e:
                raise KeyError('''{0} export format
                not implemented: {1}'''.format(evtformat, e))
        if fnext == '.cnv':
            try:
                velest.export(picks_copy, fnout + fnext, eventinfo=self.get_evt_data())
            except KeyError as e:
                raise KeyError('''{0} export format
                not implemented: {1}'''.format(evtformat, e))
        if fnext == '_focmec.in':
            try:
                parameter = PylotParameter()
                logging.warning('Using default input parameter')
                focmec.export(picks_copy, fnout + fnext, parameter, eventinfo=self.get_evt_data())
            except KeyError as e:
                raise KeyError('''{0} export format
                not implemented: {1}'''.format(evtformat, e))
        if fnext == '.pha':
            try:
                parameter = PylotParameter()
                logging.warning('Using default input parameter')
                hypodd.export(picks_copy, fnout + fnext, parameter, eventinfo=self.get_evt_data())
            except KeyError as e:
                raise KeyError('''{0} export format
                not implemented: {1}'''.format(evtformat, e))

    def getComp(self):
        """
        Get component (ZNE)
        """
        return self.comp

    def getID(self):
        """
        Get unique resource id
        """
        try:
            return self.evtdata.get('resource_id').id
        except:
            return None

    def filterWFData(self, kwargs):
        """
        Filter waveform data
        :param kwargs: arguments to pass through to the ObsPy filter function
        """
        data = self.getWFData()
        data.detrend('linear')
        data.taper(0.02, type='cosine')
        data.filter(**kwargs)
        self.dirty = True

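`filterWFData` forwards its kwargs to ObsPy's `Stream.filter` after detrending and tapering; a minimal usage sketch (the band limits are illustrative only):

```python
# data is an instance of the class above with waveforms already loaded
data.filterWFData(dict(type='bandpass', freqmin=0.5, freqmax=2.0, corners=4, zerophase=True))
```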
    def setWFData(self, fnames, fnames_alt=None, checkRotated=False, metadata=None, tstart=0, tstop=0):
        """
        Clear current waveform data and set given waveform data
        :param fnames: waveform data names to append
        :type fnames: list
        :param fnames_alt: alternative data to show (e.g. synthetic/processed)
        :type fnames_alt: list
        """
        def check_fname_exists(filenames: list) -> list:
            if filenames:
                filenames = [fn for fn in filenames if os.path.isfile(fn)]
            return filenames

        self.wfdata = Stream()
        self.wforiginal = None
        self.wf_alt = Stream()
        if tstart == tstop:
            tstart = tstop = None
        self.tstart = tstart
        self.tstop = tstop

        # remove directories and non-existing files
        fnames = check_fname_exists(fnames)
        fnames_alt = check_fname_exists(fnames_alt)

        # if obspy_dmt:
        #     wfdir = 'raw'
        #     self.processed = False
        #     for fname in fnames:
        #         if fname.endswith('processed'):
        #             wfdir = 'processed'
        #             self.processed = True
        #             break
        #     for fpath in fnames:
        #         if fpath.endswith(wfdir):
        #             wffnames = [os.path.join(fpath, fname) for fname in os.listdir(fpath)]
        #         if 'syngine' in fpath.split('/')[-1]:
        #             wffnames_syn = [os.path.join(fpath, fname) for fname in os.listdir(fpath)]
        # else:
        #     wffnames = fnames
        if fnames is not None:
            self.appendWFData(fnames)
            if fnames_alt is not None:
                self.appendWFData(fnames_alt, alternative=True)
        else:
            return False

        # various pre-processing steps:
        # remove possible underscores in station names
        # self.wfdata = remove_underscores(self.wfdata)
        # check for gaps and merge
        self.wfdata, _ = check_for_gaps_and_merge(self.wfdata)
        # check for nans
        check_for_nan(self.wfdata)
        # check for stations with rotated components
        if checkRotated and metadata is not None:
            self.wfdata = check4rotated(self.wfdata, metadata, verbosity=0)
        # trim station components to same start value
        trim_station_components(self.wfdata, trim_start=True, trim_end=False)

        # make a copy of original data
        self.wforiginal = self.getWFData().copy()
        self.dirty = False
        return True

    def appendWFData(self, fnames, alternative=False):
        """
        Read waveform data from fnames and append it to current wf data
        :param fnames: waveform data to append
        :type fnames: list
        """
        assert isinstance(fnames, list), "input parameter 'fnames' is " \
                                         "supposed to be of type 'list' " \
                                         "but is actually" \
                                         " {0}".format(type(fnames))
        if self.dirty:
            self.resetWFData()

        orig_or_alternative_data = {True: self.wf_alt,
                                    False: self.wfdata}

        warnmsg = ''
        for fname in set(fnames):
            try:
                orig_or_alternative_data[alternative] += read(fname, starttime=self.tstart, endtime=self.tstop)
            except TypeError:
                try:
                    orig_or_alternative_data[alternative] += read(fname, format='GSE2', starttime=self.tstart,
                                                                  endtime=self.tstop)
                except Exception as e:
                    try:
                        orig_or_alternative_data[alternative] += read(fname, format='SEGY', starttime=self.tstart,
                                                                      endtime=self.tstop)
                    except Exception as e:
                        warnmsg += '{0}\n{1}\n'.format(fname, e)
            except SacIOError as se:
                warnmsg += '{0}\n{1}\n'.format(fname, se)
        if warnmsg:
            warnmsg = 'WARNING in appendWFData: unable to read waveform data\n' + warnmsg
            print(warnmsg)

    def getWFData(self):
        return self.wfdata

    def getOriginalWFData(self):
        return self.wforiginal

    def getAltWFdata(self):
        return self.wf_alt

    def resetWFData(self):
        """
        Set waveform data to original waveform data
        """
        if self.getOriginalWFData():
            self.wfdata = self.getOriginalWFData().copy()
        else:
            self.wfdata = Stream()
        self.dirty = False

    def resetPicks(self):
        """
        Clear all picks from event
        """
        self.get_evt_data().picks = []

    def get_evt_data(self):
        return self.evtdata

    def setEvtData(self, event):
        self.evtdata = event

    def applyEVTData(self, data, typ='pick'):
        """
        Either takes an `obspy.core.event.Event` object and applies all new
        information on the event to the actual data if typ is 'event', or
        creates ObsPy pick objects from the PyLoT dictionary containing all
        picks and appends them to the picks list if typ is 'pick'
        :param data: data to apply, either picks or a complete event
        :type data:
        :param typ: which event data to apply, 'pick' or 'event'
        :type typ: str
        :param authority_id: (currently unused)
        :type: str
        :raise OverwriteError:
        """

        def applyPicks(picks):
            """
            Creates ObsPy pick objects from the PyLoT dictionary containing
            all picks and appends them to the picks list.
            :param picks:
            :raise OverwriteError: raises an OverwriteError if the picks list is
            not empty. The GUI will then ask for a decision.
            """
            # firstonset = find_firstonset(picks)
            # check for automatic picks
            print("Writing phases to ObsPy-quakeml file")
            for key in picks:
                if not picks[key].get('P'):
                    continue
                if picks[key]['P']['picker'] == 'auto':
                    print("Existing auto-picks will be overwritten in pick-dictionary!")
                    picks = picks_from_picksdict(picks)
                    break
                else:
                    if self.get_evt_data().picks:
                        raise OverwriteError('Existing picks would be overwritten!')
                    else:
                        picks = picks_from_picksdict(picks)
                        break
            self.get_evt_data().picks = picks
            # if 'smi:local' in self.getID() and firstonset:
            #     fonset_str = firstonset.strftime('%Y_%m_%d_%H_%M_%S')
            #     ID = ResourceIdentifier('event/' + fonset_str)
            #     ID.convertIDToQuakeMLURI(authority_id=authority_id)
            #     self.get_evt_data().resource_id = ID

        def applyEvent(event):
            """
            takes an `obspy.core.event.Event` object and applies all new
            information on the event to the actual data
            :param event:
            """
            if event is None:
                print("applyEvent: Received None")
                return
            if self.isNew():
                self.setEvtData(event)
            else:
                # prevent overwriting original pick information
                event_old = self.get_evt_data()
                if not event_old.resource_id == event.resource_id:
                    print("WARNING: Mismatch in event resource id's: {} and {}".format(
                        event_old.resource_id,
                        event.resource_id))
                else:
                    picks = copy.deepcopy(event_old.picks)
                    event = merge_picks(event, picks)
                # apply event information from location
                event_old.update(event)

        applydata = {'pick': applyPicks,
                     'event': applyEvent}

        applydata[typ](data)
        self._new = False

    def rotate_wf_data(self):
        self.waveform_data.rotate_zne(self.metadata)


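A minimal usage sketch of the `applydata` dispatch above (variable names hypothetical; the picks dictionary layout `{station: {'P': {...}, 'S': {...}}}` is the one assumed by `applyPicks`):

```python
# apply a PyLoT picks dictionary to the event
data.applyEVTData(picks_dict, typ='pick')

# or merge a relocated ObsPy Event back into the current event data
data.applyEVTData(relocated_event, typ='event')
```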
class GenericDataStructure(object):
@ -839,22 +248,6 @@ class PilotDataStructure(GenericDataStructure):
        self.setExpandFields(['root', 'database'])


class ObspyDMTdataStructure(GenericDataStructure):
    """
    Object containing the data access information for the ObsPy DMT data
    structure.
    """

    def __init__(self, **fields):
        if not fields:
            fields = {'database': '',
                      'root': ''}

        GenericDataStructure.__init__(self, **fields)

        self.setExpandFields(['root', 'database'])


class SeiscompDataStructure(GenericDataStructure):
    """
    Dictionary containing the data access information for an SDS data archive:

@ -6,14 +6,24 @@ import numpy as np
Default parameters used for picking
"""

defaults = {'datapath': {'type': str,
                         'tooltip': 'path to eventfolders',
defaults = {'rootpath': {'type': str,
                         'tooltip': 'project path',
                         'value': '',
                         'namestring': 'Root path'},

            'datapath': {'type': str,
                         'tooltip': 'data path',
                         'value': '',
                         'namestring': 'Data path'},

            'database': {'type': str,
                         'tooltip': 'name of data base',
                         'value': '',
                         'namestring': 'Database path'},

            'eventID': {'type': str,
                        'tooltip': 'event ID for single event processing (* for all events found in datapath)',
                        'value': '*',
                        'tooltip': 'event ID for single event processing (* for all events found in database)',
                        'value': '',
                        'namestring': 'Event ID'},

            'extent': {'type': str,
@ -501,7 +511,7 @@ defaults = {'datapath': {'type': str,

            'taup_model': {'type': str,
                           'tooltip': 'Define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6',
                           'value': 'iasp91',
                           'value': None,
                           'namestring': 'TauPy model'},

            'taup_phases': {'type': str,
@ -512,7 +522,9 @@ defaults = {'datapath': {'type': str,

settings_main = {
    'dirs': [
        'rootpath',
        'datapath',
        'database',
        'eventID',
        'invdir',
        'datastructure',
106 pylot/core/io/event.py (new file)
@ -0,0 +1,106 @@
import copy
from dataclasses import dataclass
from typing import Union

from obspy import read_events
from obspy.core.event import Event as ObsPyEvent


@dataclass
class EventData:
    evtdata: Union[ObsPyEvent, None] = None
    _new: bool = False

    def __init__(self, evtdata=None):
        self.set_event_data(evtdata)

    def set_event_data(self, evtdata):
        if isinstance(evtdata, ObsPyEvent):
            self.evtdata = evtdata
        elif isinstance(evtdata, dict):
            self.evtdata = self.read_pilot_event(**evtdata)
        elif isinstance(evtdata, str):
            self.evtdata = self.read_event_file(evtdata)
        else:
            self.set_new()
            self.evtdata = ObsPyEvent(picks=[])

    def read_event_file(self, evtdata: str) -> ObsPyEvent:
        try:
            cat = read_events(evtdata)
            if len(cat) != 1:
                raise ValueError(f'ambiguous event information for file: {evtdata}')
            return cat[0]
        except TypeError as e:
            # return the event reconstructed by the error handler
            return self.handle_event_file_error(e, evtdata)

    def handle_event_file_error(self, e: TypeError, evtdata: str) -> ObsPyEvent:
        if 'Unknown format for file' in str(e):
            if 'PHASES' in evtdata:
                picks = self.picksdict_from_pilot(evtdata)
                return ObsPyEvent(picks=self.picks_from_picksdict(picks))
            elif 'LOC' in evtdata:
                raise NotImplementedError('PILOT location information read support not yet implemented.')
            elif 'event.pkl' in evtdata:
                return self.qml_from_obspy_dmt(evtdata)
            else:
                raise e
        else:
            raise e

    def set_new(self):
        self._new = True

    def is_new(self) -> bool:
        return self._new

    def get_picks_str(self) -> str:
        return '\n'.join(str(pick) for pick in self.evtdata.picks)

    def replace_origin(self, event: ObsPyEvent, force_overwrite: bool = False):
        if self.evtdata.origins or force_overwrite:
            event.origins = self.evtdata.origins

    def replace_magnitude(self, event: ObsPyEvent, force_overwrite: bool = False):
        if self.evtdata.magnitudes or force_overwrite:
            event.magnitudes = self.evtdata.magnitudes

    def replace_picks(self, event: ObsPyEvent, picktype: str):
        checkflag = 1
        picks = event.picks
        # remove existing picks of the given type, iterating in reverse so
        # that popping by index stays safe
        for j, pick in reversed(list(enumerate(picks))):
            if picktype in str(pick.method_id.id):
                picks.pop(j)
                checkflag = 2

        if checkflag > 0:
            for pick in self.evtdata.picks:
                if picktype in str(pick.method_id.id):
                    picks.append(pick)

    def get_id(self) -> Union[str, None]:
        try:
            return self.evtdata.resource_id.id
        except AttributeError:
            return None

    def apply_event_data(self, data, typ='pick'):
        if typ == 'pick':
            self.apply_picks(data)
        elif typ == 'event':
            self.apply_event(data)

    def apply_picks(self, picks):
        self.evtdata.picks = picks

    def apply_event(self, event: ObsPyEvent):
        if self.is_new():
            self.evtdata = event
        else:
            old_event = self.evtdata
            if old_event.resource_id == event.resource_id:
                picks = copy.deepcopy(old_event.picks)
                event = self.merge_picks(event, picks)
                old_event.update(event)
            else:
                print(f"WARNING: Mismatch in event resource id's: {old_event.resource_id} and {event.resource_id}")
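A minimal usage sketch of the new `EventData` container (the file path is hypothetical):

```python
event_data = EventData('e0001.015.16/PyLoT_e0001.015.16.xml')  # QuakeML read via obspy read_events
print(event_data.get_id())
print(event_data.get_picks_str())

empty = EventData()  # no input -> fresh ObsPy Event, flagged as new
assert empty.is_new()
```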
@ -1,7 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import logging
import os

from pylot.core.io import default_parameters
from pylot.core.util.errors import ParameterError
@ -90,10 +88,10 @@ class PylotParameter(object):
        return bool(self.__parameter)

    def __getitem__(self, key):
        if key in self.__parameter:
            return self.__parameter[key]
        else:
            logging.warning(f'{key} not found in PylotParameter')

    def __getitem__(self, key):
        try:
            return self.__parameter[key]
        except:
            return None

    def __setitem__(self, key, value):
        try:
@ -420,28 +418,6 @@ class PylotParameter(object):
            line = value + name + ttip
            fid.write(line)

    @staticmethod
    def check_deprecated_parameters(parameters):
        if parameters.hasParam('database') and parameters.hasParam('rootpath'):
            parameters['datapath'] = os.path.join(parameters['rootpath'], parameters['datapath'],
                                                  parameters['database'])
            logging.warning(
                f'Parameters database and rootpath are deprecated. '
                f'Tried to merge them into the new path: {parameters["datapath"]}.'
            )

        remove_keys = []
        for key in parameters:
            if key not in default_parameters.defaults.keys():
                remove_keys.append(key)
                logging.warning(f'Removing deprecated parameter: {key}')

        for key in remove_keys:
            del parameters[key]

        parameters._settings_main = default_parameters.settings_main
        parameters._settings_special_pick = default_parameters.settings_special_pick


class FilterOptions(object):
    '''
|
||||
backtransformFilterString, loopIdentifyPhase, identifyPhase
|
||||
|
||||
|
||||
def add_amplitudes(event, amplitudes):
|
||||
amplitude_list = []
|
||||
for pick in event.picks:
|
||||
try:
|
||||
a0 = amplitudes[pick.waveform_id.station_code]
|
||||
amplitude = ope.Amplitude(generic_amplitude=a0 * 1e-3)
|
||||
amplitude.unit = 'm'
|
||||
amplitude.category = 'point'
|
||||
amplitude.waveform_id = pick.waveform_id
|
||||
amplitude.magnitude_hint = 'ML'
|
||||
amplitude.pick_id = pick.resource_id
|
||||
amplitude.type = 'AML'
|
||||
amplitude_list.append(amplitude)
|
||||
except KeyError:
|
||||
continue
|
||||
event.amplitudes = amplitude_list
|
||||
return event
|
||||
|
||||
|
||||
def readPILOTEvent(phasfn=None, locfn=None, authority_id='RUB', **kwargs):
|
||||
"""
|
||||
readPILOTEvent - function
|
||||
@ -193,31 +174,6 @@ def convert_pilot_times(time_array):
|
||||
return UTCDateTime(*times)
|
||||
|
||||
|
||||
def picksdict_from_obs(fn):
|
||||
"""
|
||||
create pick dictionary from obs file
|
||||
:param fn: filename
|
||||
:type fn:
|
||||
:return:
|
||||
:rtype:
|
||||
"""
|
||||
picks = dict()
|
||||
station_name = str()
|
||||
for line in open(fn, 'r'):
|
||||
if line.startswith('#'):
|
||||
continue
|
||||
else:
|
||||
phase_line = line.split()
|
||||
if not station_name == phase_line[0]:
|
||||
phase = dict()
|
||||
station_name = phase_line[0]
|
||||
phase_name = phase_line[4].upper()
|
||||
pick = UTCDateTime(phase_line[6] + phase_line[7] + phase_line[8])
|
||||
phase[phase_name] = dict(mpp=pick, fm=phase_line[5])
|
||||
picks[station_name] = phase
|
||||
return picks
|
||||
|
||||
|
||||
def picksdict_from_picks(evt, parameter=None):
|
||||
"""
|
||||
Takes an Event object and return the pick dictionary commonly used within
|
||||
@ -373,636 +329,228 @@ def picks_from_picksdict(picks, creation_info=None):
|
||||
return picks_list
|
||||
|
||||
|
||||
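For orientation, a sketch of the NLLoc-style phase line `picksdict_from_obs` above expects (field values hypothetical; the parser uses the station name at index 0, the phase at index 4, the first motion at index 5, and date/hour-minute/seconds at indices 6-8):

```python
line = 'ST01 ? ? ? P U 20160101 0203 14.5500'
fields = line.split()
# UTCDateTime('20160101' + '0203' + '14.5500') -> 2016-01-01T02:03:14.55
```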
def reassess_pilot_db(root_dir, db_dir, out_dir=None, fn_param=None, verbosity=0):
    # TODO: change root to datapath
    db_root = os.path.join(root_dir, db_dir)
    evt_list = glob.glob1(db_root, 'e????.???.??')

    for evt in evt_list:
        if verbosity > 0:
            print('Reassessing event {0}'.format(evt))
        reassess_pilot_event(root_dir, db_dir, evt, out_dir, fn_param, verbosity)


def reassess_pilot_event(root_dir, db_dir, event_id, out_dir=None, fn_param=None, verbosity=0):
    from obspy import read

    from pylot.core.io.inputs import PylotParameter
    from pylot.core.pick.utils import earllatepicker
    # TODO: change root to datapath

    default = PylotParameter(fn_param, verbosity)

    search_base = os.path.join(root_dir, db_dir, event_id)
    phases_file = glob.glob(os.path.join(search_base, 'PHASES.mat'))
    if not phases_file:
        return
    if verbosity > 1:
        print('Opening PILOT phases file: {fn}'.format(fn=phases_file[0]))
    picks_dict = picksdict_from_pilot(phases_file[0])
    if verbosity > 0:
        print('Dictionary read from PHASES.mat:\n{0}'.format(picks_dict))
    datacheck = list()
    info = None
    for station in picks_dict.keys():
        fn_pattern = os.path.join(search_base, '{0}*'.format(station))
        try:
            st = read(fn_pattern)
        except TypeError as e:
            if 'Unknown format for file' in str(e):
                try:
                    st = read(fn_pattern, format='GSE2')
                except ValueError as e:
                    if str(e) == 'second must be in 0..59':
                        info = 'A known error was raised. Please find the list of corrupted files and double-check these files.'
                        datacheck.append(fn_pattern + ' (time info)\n')
                        continue
                    else:
                        raise ValueError(str(e))
                except Exception as e:
                    if 'No file matching file pattern:' in str(e):
                        if verbosity > 0:
                            warnings.warn('no waveform data found for station {station}'.format(station=station),
                                          RuntimeWarning)
                        datacheck.append(fn_pattern + ' (no data)\n')
                        continue
                    else:
                        raise e
            else:
                raise e
        for phase in picks_dict[station].keys():
            try:
                mpp = picks_dict[station][phase]['mpp']
            except KeyError as e:
                print(str(e), station)
                continue
            sel_st = select_for_phase(st, phase)
            if not sel_st:
                msg = 'no waveform data found for station {station}'.format(station=station)
                warnings.warn(msg, RuntimeWarning)
                continue
            stime, etime = full_range(sel_st)
            rel_pick = mpp - stime
            epp, lpp, spe = earllatepicker(sel_st,
                                           default.get('nfac{0}'.format(phase)),
                                           default.get('tsnrz' if phase == 'P' else 'tsnrh'),
                                           Pick1=rel_pick,
                                           iplot=0,
                                           verbosity=0)
            if epp is None or lpp is None:
                continue
            epp = stime + epp
            lpp = stime + lpp
            min_diff = 3 * st[0].stats.delta
            if lpp - mpp < min_diff:
                lpp = mpp + min_diff
            if mpp - epp < min_diff:
                epp = mpp - min_diff
            picks_dict[station][phase] = dict(epp=epp, mpp=mpp, lpp=lpp, spe=spe)
    if datacheck:
        if info:
            if verbosity > 0:
                print(info + ': {0}'.format(search_base))
        fncheck = open(os.path.join(search_base, 'datacheck_list'), 'w')
        fncheck.writelines(datacheck)
        fncheck.close()
        del datacheck
    # create Event object for export
    evt = ope.Event(resource_id=event_id)
    evt.picks = picks_from_picksdict(picks_dict)
    # write phase information to file
    if not out_dir:
        fnout_prefix = os.path.join(root_dir, db_dir, event_id, 'PyLoT_{0}.'.format(event_id))
    else:
        out_dir = os.path.join(out_dir, db_dir)
        if not os.path.isdir(out_dir):
            os.makedirs(out_dir)
        fnout_prefix = os.path.join(out_dir, 'PyLoT_{0}.'.format(event_id))
    evt.write(fnout_prefix + 'xml', format='QUAKEML')

def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
def write_phases(arrivals, fformat, filename, parameter=None, eventinfo=None):
    """
    Writes earthquake phase data to one of the standard file formats used
    for locating earthquakes: NLLoc, HYPO71, HYPOSAT, VELEST, HYPODD,
    FOCMEC, and HASH.

    :param arrivals: dictionary containing all phase information, including
        station ID, phase, first motion, weight (uncertainty), etc.
    :type arrivals: dict
    :param fformat: file format to write to (e.g. 'NLLoc', 'HYPO71',
        'HYPOSAT', 'VELEST', 'HYPODD', 'FOCMEC', 'HASH')
    :type fformat: str
    :param filename: full path and name of the output phase file
    :type filename: str
    :param parameter: all input information
    :type parameter: object
    :param eventinfo: optional event information, needed for the VELEST cnv
        file and the FOCMEC and HASH input files
    :type eventinfo: obspy.core.event.Event
    """
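A minimal call sketch (the dictionary layout is inferred from the loops below; station, times, and picks are hypothetical, and `parameter` is assumed to be a `PylotParameter` instance):

```python
from obspy import UTCDateTime

arrivals = {'ST01': {'P': {'mpp': UTCDateTime(2016, 1, 1, 2, 3, 14, 550000),
                           'fm': 'U', 'weight': 1},
                     'S': {'mpp': UTCDateTime(2016, 1, 1, 2, 3, 20, 100000),
                           'weight': 2, 'Ao': 12.3}}}
write_phases(arrivals, 'NLLoc', '/tmp/e0001.obs', parameter=parameter)
```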
    if fformat == 'NLLoc':
        print("Writing phases to %s for NLLoc" % filename)
        fid = open("%s" % filename, 'w')
        # write header
        fid.write('# EQEVENT: %s Label: EQ%s Loc: X 0.00 Y 0.00 Z 10.00 OT 0.00 \n' %
                  (parameter.get('datapath'), parameter.get('eventID')))
        arrivals = chooseArrivals(arrivals)  # MP MP what is chooseArrivals? It is not defined anywhere
        for key in arrivals:
            # P onsets
            if 'P' in arrivals[key]:
                try:
                    fm = arrivals[key]['P']['fm']
                except KeyError as e:
                    print(e)
                    fm = None
                if fm is None:
                    fm = '?'
                onset = arrivals[key]['P']['mpp']
                year = onset.year
                month = onset.month
                day = onset.day
                hh = onset.hour
                mm = onset.minute
                ss = onset.second
                ms = onset.microsecond
                ss_ms = ss + ms / 1000000.0
                pweight = 1  # use pick
                try:
                    if arrivals[key]['P']['weight'] >= 4:
                        pweight = 0  # do not use pick
                        print("Station {}: Uncertain pick, do not use it!".format(key))
                except KeyError as e:
                    print(str(e) + '; no weight set during processing')
                fid.write('%s ? ? ? P %s %d%02d%02d %02d%02d %7.4f GAU 0 0 0 0 %d \n' % (key, fm, year, month, day,
                                                                                         hh, mm, ss_ms, pweight))
            # S onsets
            if 'S' in arrivals[key] and arrivals[key]['S']['mpp'] is not None:
                fm = '?'
                onset = arrivals[key]['S']['mpp']
                year = onset.year
                month = onset.month
                day = onset.day
                hh = onset.hour
                mm = onset.minute
                ss = onset.second
                ms = onset.microsecond
                ss_ms = ss + ms / 1000000.0
                sweight = 1  # use pick
                try:
                    if arrivals[key]['S']['weight'] >= 4:
                        sweight = 0  # do not use pick
                except KeyError as e:
                    print(str(e) + '; no weight set during processing')
                Ao = arrivals[key]['S']['Ao']  # peak-to-peak amplitude
                if Ao is None:
                    Ao = 0.0
                # fid.write('%s ? ? ? S %s %d%02d%02d %02d%02d %7.4f GAU 0 0 0 0 %d \n' % (key,
                fid.write('%s ? ? ? S %s %d%02d%02d %02d%02d %7.4f GAU 0 %9.2f 0 0 %d \n' % (key, fm, year, month, day,
                                                                                             hh, mm, ss_ms, Ao, sweight))

        fid.close()
    elif fformat == 'HYPO71':
        print("Writing phases to %s for HYPO71" % filename)
        fid = open("%s" % filename, 'w')
        # write header
        fid.write(' %s\n' %
                  parameter.get('eventID'))
        arrivals = chooseArrivals(arrivals)  # MP MP what is chooseArrivals? It is not defined anywhere
        for key in arrivals:
            if arrivals[key]['P']['weight'] < 4:
                stat = key
                if len(stat) > 4:  # HYPO71 handles only 4-character station IDs
                    stat = stat[1:5]
                Ponset = arrivals[key]['P']['mpp']
                Sonset = arrivals[key]['S']['mpp']
                pweight = arrivals[key]['P']['weight']
                sweight = arrivals[key]['S']['weight']
                fm = arrivals[key]['P']['fm']
                if fm is None:
                    fm = '-'
                Ao = arrivals[key]['S']['Ao']
                if Ao is None:
                    Ao = ''
                else:
                    Ao = str('%7.2f' % Ao)
                year = Ponset.year
                if year >= 2000:
                    year = year - 2000
                else:
                    year = year - 1900
                month = Ponset.month
                day = Ponset.day
                hh = Ponset.hour
                mm = Ponset.minute
                ss = Ponset.second
                ms = Ponset.microsecond
                ss_ms = ss + ms / 1000000.0
                if pweight < 2:
                    pstr = 'I'
                elif pweight >= 2:
                    pstr = 'E'
                if arrivals[key]['S']['weight'] < 4:
                    Sss = Sonset.second
                    Sms = Sonset.microsecond
                    Sss_ms = Sss + Sms / 1000000.0
                    Sss_ms = str('%5.02f' % Sss_ms)
                    if sweight < 2:
                        sstr = 'I'
                    elif sweight >= 2:
                        sstr = 'E'
                    fid.write('%-4s%sP%s%d %02d%02d%02d%02d%02d%5.2f %s%sS %d %s\n' % (stat, pstr, fm, pweight,
                                                                                       year, month, day, hh, mm,
                                                                                       ss_ms, Sss_ms, sstr, sweight,
                                                                                       Ao))
                else:
                    fid.write('%-4s%sP%s%d %02d%02d%02d%02d%02d%5.2f %s\n' % (stat, pstr, fm, pweight,
                                                                              year, month, day, hh, mm,
                                                                              ss_ms, Ao))
        fid.close()
    def write_nlloc():
        with open(filename, 'w') as fid:
            fid.write('# EQEVENT: {} Label: EQ{} Loc: X 0.00 Y 0.00 Z 10.00 OT 0.00 \n'.format(
                parameter.get('database'), parameter.get('eventID')))
            for key, value in arrivals.items():
                for phase in ['P', 'S']:
                    if phase in value:
                        fm = value[phase].get('fm', '?')
                        onset = value[phase]['mpp']
                        ss_ms = onset.second + onset.microsecond / 1000000.0
                        weight = 1 if value[phase].get('weight', 0) < 4 else 0
                        amp = value[phase].get('Ao', 0.0) if phase == 'S' else ''
                        # zero-pad date and time fields as the NLLoc writer above does
                        fid.write('{} ? ? ? {} {} {:04d}{:02d}{:02d} {:02d}{:02d} {:7.4f} GAU 0 {} 0 0 {}\n'.format(
                            key, phase, fm, onset.year, onset.month, onset.day, onset.hour, onset.minute, ss_ms, amp,
                            weight))
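For reference, the phase line produced by `write_nlloc` above for a hypothetical P onset would look like this:

```python
print('{} ? ? ? {} {} {:04d}{:02d}{:02d} {:02d}{:02d} {:7.4f} GAU 0 {} 0 0 {}'.format(
    'ST01', 'P', 'U', 2016, 1, 1, 2, 3, 14.55, '', 1))
# -> ST01 ? ? ? P U 20160101 0203 14.5500 GAU 0  0 0 1
```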
    def write_hypo71():
        with open(filename, 'w') as fid:
            fid.write(' {}\n'.format(parameter.get('eventID')))
            for key, value in arrivals.items():
                if value['P'].get('weight', 0) < 4:
                    stat = key[:4]
                    Ponset = value['P']['mpp']
                    Sonset = value.get('S', {}).get('mpp')
                    pweight = value['P'].get('weight', 0)
                    sweight = value.get('S', {}).get('weight', 0)
                    fm = value['P'].get('fm', '-')
                    Ao = value.get('S', {}).get('Ao', '')
                    year = Ponset.year - 2000 if Ponset.year >= 2000 else Ponset.year - 1900
                    ss_ms = Ponset.second + Ponset.microsecond / 1000000.0
                    if Sonset:
                        Sss_ms = Sonset.second + Sonset.microsecond / 1000000.0
                        # keep the station/quality/weight field order of the HYPO71 writer above
                        fid.write('{:<4}{}P{}{} {:02d}{:02d}{:02d}{:02d}{:02d}{:5.2f} {:5.2f}{}S {} {}\n'.format(
                            stat, 'I' if pweight < 2 else 'E', fm, pweight, year, Ponset.month, Ponset.day,
                            Ponset.hour, Ponset.minute, ss_ms, Sss_ms, 'I' if sweight < 2 else 'E', sweight, Ao))
                    else:
                        fid.write('{:<4}{}P{}{} {:02d}{:02d}{:02d}{:02d}{:02d}{:5.2f} {}\n'.format(
                            stat, 'I' if pweight < 2 else 'E', fm, pweight, year, Ponset.month, Ponset.day,
                            Ponset.hour, Ponset.minute, ss_ms, Ao))
    elif fformat == 'HYPOSAT':
        print("Writing phases to %s for HYPOSAT" % filename)
        fid = open("%s" % filename, 'w')
        # write header
        fid.write('%s, event %s \n' % (parameter.get('datapath'), parameter.get('eventID')))
        arrivals = chooseArrivals(arrivals)  # MP MP what is chooseArrivals? It is not defined anywhere
        for key in arrivals:
            # P onsets
            if 'P' in arrivals[key] and arrivals[key]['P']['mpp'] is not None:
                if arrivals[key]['P']['weight'] < 4:
                    Ponset = arrivals[key]['P']['mpp']
                    pyear = Ponset.year
                    pmonth = Ponset.month
                    pday = Ponset.day
                    phh = Ponset.hour
                    pmm = Ponset.minute
                    pss = Ponset.second
                    pms = Ponset.microsecond
                    Pss = pss + pms / 1000000.0
                    # use symmetrized picking error as std
                    # (read the HYPOSAT manual)
                    pstd = arrivals[key]['P']['spe']
                    if pstd is None:
                        errorsP = parameter.get('timeerrorsP')
                        if arrivals[key]['P']['weight'] == 0:
                            pstd = errorsP[0]
                        elif arrivals[key]['P']['weight'] == 1:
                            pstd = errorsP[1]
                        elif arrivals[key]['P']['weight'] == 2:
                            pstd = errorsP[2]
                        elif arrivals[key]['P']['weight'] == 3:
                            pstd = errorsP[3]
                        else:
                            pstd = errorsP[4]
                    fid.write('%-5s P1 %4.0f %02d %02d %02d %02d %05.02f %5.3f -999. 0.00 -999. 0.00\n'
                              % (key, pyear, pmonth, pday, phh, pmm, Pss, pstd))
            # S onsets
            if 'S' in arrivals[key] and arrivals[key]['S']['mpp'] is not None:
                if arrivals[key]['S']['weight'] < 4:
                    Sonset = arrivals[key]['S']['mpp']
                    syear = Sonset.year
                    smonth = Sonset.month
                    sday = Sonset.day
                    shh = Sonset.hour
                    smm = Sonset.minute
                    sss = Sonset.second
                    sms = Sonset.microsecond
                    Sss = sss + sms / 1000000.0
                    sstd = arrivals[key]['S']['spe']
                    if sstd is None:
                        errorsS = parameter.get('timeerrorsS')
                        if arrivals[key]['S']['weight'] == 0:
                            sstd = errorsS[0]
                        elif arrivals[key]['S']['weight'] == 1:
                            sstd = errorsS[1]
                        elif arrivals[key]['S']['weight'] == 2:
                            sstd = errorsS[2]
                        elif arrivals[key]['S']['weight'] == 3:
                            sstd = errorsS[3]
                        else:
                            sstd = errorsS[4]
                    fid.write('%-5s S1 %4.0f %02d %02d %02d %02d %05.02f %5.3f -999. 0.00 -999. 0.00\n'
                              % (key, syear, smonth, sday, shh, smm, Sss, sstd))
        fid.close()
    def write_hyposat():
        with open(filename, 'w') as fid:
            fid.write('{}, event {} \n'.format(parameter.get('database'), parameter.get('eventID')))
            for key, value in arrivals.items():
                for phase in ['P', 'S']:
                    if phase in value and value[phase].get('weight', 0) < 4:
                        onset = value[phase]['mpp']
                        ss_ms = onset.second + onset.microsecond / 1000000.0
                        # use the symmetrized picking error (spe) if available, otherwise
                        # fall back to the weight-dependent default errors for this phase
                        errors_key = 'timeerrorsP' if phase == 'P' else 'timeerrorsS'
                        std = value[phase].get('spe', parameter.get(errors_key)[value[phase].get('weight', 0)])
                        fid.write(
                            '{:<5} {}1 {:4} {:02} {:02} {:02} {:02} {:05.02f} {:5.3f} -999. 0.00 -999. 0.00\n'.format(
                                key, phase, onset.year, onset.month, onset.day, onset.hour, onset.minute, ss_ms, std))
    elif fformat == 'VELEST':
        print("Writing phases to %s for VELEST" % filename)
        fid = open("%s" % filename, 'w')
        # get information needed in cnv-file
        # check whether latitude is N or S and longitude is E or W
        try:
            eventsource = eventinfo.origins[0]
        except:
            print("No source origin calculated yet, thus no cnv-file creation possible!")
            return
        if eventsource['latitude'] < 0:
            cns = 'S'
        else:
            cns = 'N'
        if eventsource['longitude'] < 0:
            cew = 'W'
        else:
            cew = 'E'
        # get last two integers of origin year
        stime = eventsource['time']
        if stime.year - 2000 >= 0:
            syear = stime.year - 2000
        else:
            syear = stime.year - 1900
        ifx = 0  # default value, see VELEST manual, pp. 22-23
        # write header
        fid.write('%s%02d%02d %02d%02d %05.2f %7.4f%c %8.4f%c %7.2f %6.2f %02.0f 0.0 0.03 1.0 1.0\n' % (
            syear, stime.month, stime.day, stime.hour, stime.minute, stime.second, eventsource['latitude'],
            cns, eventsource['longitude'], cew, eventsource['depth'], eventinfo.magnitudes[0]['mag'], ifx))
        n = 0
        # check whether arrivals are dictionaries (autoPyLoT) or pick objects (PyLoT)
        if not isinstance(arrivals, dict):
            # convert pick object (PyLoT) into dictionary
            evt = ope.Event(resource_id=eventinfo['resource_id'])
            evt.picks = arrivals
            arrivals = picksdict_from_picks(evt)
        # check for automatic and manual picks
        # prefer manual picks
        usedarrivals = chooseArrivals(arrivals)
        for key in usedarrivals:
            # P onsets
            if 'P' in usedarrivals[key]:
                if usedarrivals[key]['P']['weight'] < 4:
                    n += 1
                    stat = key
                    if len(stat) > 4:  # VELEST handles only 4-character station IDs
                        stat = stat[1:5]
                    Ponset = usedarrivals[key]['P']['mpp']
                    Pweight = usedarrivals[key]['P']['weight']
                    Prt = Ponset - stime  # onset time relative to source time
                    if n % 6 != 0:
                        fid.write('%-4sP%d%6.2f' % (stat, Pweight, Prt))
                    else:
                        fid.write('%-4sP%d%6.2f\n' % (stat, Pweight, Prt))
            # S onsets
            if 'S' in usedarrivals[key]:
                if usedarrivals[key]['S']['weight'] < 4:
                    n += 1
                    stat = key
                    if len(stat) > 4:  # VELEST handles only 4-character station IDs
                        stat = stat[1:5]
                    Sonset = usedarrivals[key]['S']['mpp']
                    Sweight = usedarrivals[key]['S']['weight']
                    Srt = Sonset - stime  # onset time relative to source time
                    if n % 6 != 0:
                        fid.write('%-4sS%d%6.2f' % (stat, Sweight, Srt))
                    else:
                        fid.write('%-4sS%d%6.2f\n' % (stat, Sweight, Srt))
        fid.close()

    def write_velest():
        if not eventinfo:
            print("No source origin calculated yet, thus no cnv-file creation possible!")
            return
        with open(filename, 'w') as fid:
            origin = eventinfo.origins[0]
            lat_dir = 'S' if origin.latitude < 0 else 'N'
            lon_dir = 'W' if origin.longitude < 0 else 'E'
            year = origin.time.year - 2000 if origin.time.year >= 2000 else origin.time.year - 1900
            fid.write(
                '{:02d}{:02d}{:02d} {:02d}{:02d} {:05.2f} {:7.4f}{} {:8.4f}{} {:7.2f} {:6.2f} {:02.0f} 0.0 0.03 1.0 1.0\n'.format(
                    year, origin.time.month, origin.time.day, origin.time.hour, origin.time.minute, origin.time.second,
                    origin.latitude, lat_dir, origin.longitude, lon_dir, origin.depth, eventinfo.magnitudes[0].mag, 0))
            for key, value in arrivals.items():
                for phase in ['P', 'S']:
                    if phase in value and value[phase].get('weight', 0) < 4:
                        onset = value[phase]['mpp']
                        # subtracting UTCDateTime objects already yields seconds as a float
                        rt = onset - origin.time
                        fid.write('{:<4}{}{}{:6.2f}\n'.format(key[:4], phase, value[phase].get('weight', 0), rt))
    elif fformat == 'HYPODD':
        print("Writing phases to %s for hypoDD" % filename)
        fid = open("%s" % filename, 'w')
        # get event information needed for hypoDD phase file
        try:
            eventsource = eventinfo.origins[0]
        except:
            print("No source origin calculated yet, thus no hypoDD-infile creation possible!")
            return
        stime = eventsource['time']
        try:
            event = eventinfo['pylot_id']
            hddID = event.split('.')[0][1:5]
        except:
            print("Error: could not determine hypoDD event ID!")
            hddID = "00000"
        # write header
        fid.write('# %d %d %d %d %d %5.2f %7.4f +%6.4f %7.4f %4.2f 0.1 0.5 %4.2f %s\n' % (
            stime.year, stime.month, stime.day, stime.hour, stime.minute, stime.second,
            eventsource['latitude'], eventsource['longitude'], eventsource['depth'] / 1000,
            eventinfo.magnitudes[0]['mag'], eventsource['quality']['standard_error'], hddID))
        # check whether arrivals are dictionaries (autoPyLoT) or pick objects (PyLoT)
        if not isinstance(arrivals, dict):
            # convert pick object (PyLoT) into dictionary
            evt = ope.Event(resource_id=eventinfo['resource_id'])
            evt.picks = arrivals
            arrivals = picksdict_from_picks(evt)
        # check for automatic and manual picks
        # prefer manual picks
        usedarrivals = chooseArrivals(arrivals)
        for key in usedarrivals:
            if 'P' in usedarrivals[key]:
                # P onsets
                if usedarrivals[key]['P']['weight'] < 4:
                    Ponset = usedarrivals[key]['P']['mpp']
                    Prt = Ponset - stime  # onset time relative to source time
                    fid.write('%s %6.3f 1 P\n' % (key, Prt))
            if 'S' in usedarrivals[key]:
                # S onsets
                if usedarrivals[key]['S']['weight'] < 4:
                    Sonset = usedarrivals[key]['S']['mpp']
                    Srt = Sonset - stime  # onset time relative to source time
                    fid.write('%-5s %6.3f 1 S\n' % (key, Srt))
        fid.close()

    def write_hypodd():
        if not eventinfo:
            print("No source origin calculated yet, thus no hypoDD-infile creation possible!")
            return
        with open(filename, 'w') as fid:
            origin = eventinfo.origins[0]
            stime = origin.time
            fid.write('# {} {} {} {} {} {} {:7.4f} +{:6.4f} {:7.4f} {:4.2f} 0.1 0.5 {:4.2f} {}\n'.format(
                stime.year, stime.month, stime.day, stime.hour, stime.minute, stime.second,
                origin.latitude, origin.longitude, origin.depth / 1000, eventinfo.magnitudes[0].mag,
                origin.quality.standard_error, "00000"))
            for key, value in arrivals.items():
                for phase in ['P', 'S']:
                    if phase in value and value[phase].get('weight', 0) < 4:
                        onset = value[phase]['mpp']
                        # subtracting UTCDateTime objects already yields seconds as a float
                        rt = onset - stime
                        fid.write('{} {:6.3f} 1 {}\n'.format(key, rt, phase))
    elif fformat == 'FOCMEC':
        print("Writing phases to %s for FOCMEC" % filename)
        fid = open("%s" % filename, 'w')
        # get event information needed for FOCMEC input file
        try:
            eventsource = eventinfo.origins[0]
        except:
            print("No source origin calculated yet, thus no FOCMEC-infile creation possible!")
            return
        stime = eventsource['time']

        # avoid printing '*' in FOCMEC input file
        if parameter.get('eventid') == '*' or parameter.get('eventid') is None:
            evID = 'e0000'
        else:
            evID = parameter.get('eventid')

        # write header line including event information
        fid.write('%s %d%02d%02d%02d%02d%02.0f %7.4f %6.4f %3.1f %3.1f\n' % (evID,
                  stime.year, stime.month, stime.day,
                  stime.hour, stime.minute, stime.second,
                  eventsource['latitude'],
                  eventsource['longitude'],
                  eventsource['depth'] / 1000,
                  eventinfo.magnitudes[0]['mag']))
        picks = eventinfo.picks
        # check whether arrivals are dictionaries (autoPyLoT) or pick objects (PyLoT)
        if not isinstance(arrivals, dict):
            # convert pick object (PyLoT) into dictionary
            evt = ope.Event(resource_id=eventinfo['resource_id'])
            evt.picks = arrivals
            arrivals = picksdict_from_picks(evt)
        # check for automatic and manual picks
        # prefer manual picks
        usedarrivals = chooseArrivals(arrivals)
        for key in usedarrivals:
            if 'P' in usedarrivals[key]:
                if usedarrivals[key]['P']['weight'] < 4 and usedarrivals[key]['P']['fm'] is not None:
                    stat = key
                    for i in range(len(picks)):
                        station = picks[i].waveform_id.station_code
                        if station == stat:
                            # get resource ID
                            resid_picks = picks[i].get('resource_id')
                            # find the same ID in eventinfo; there it is the pick_id!
                            for j in range(len(eventinfo.origins[0].arrivals)):
                                resid_eventinfo = eventinfo.origins[0].arrivals[j].get('pick_id')
                                if resid_eventinfo == resid_picks and eventinfo.origins[0].arrivals[j].phase == 'P':
                                    if len(stat) > 4:  # FOCMEC handles only 4-character station IDs
                                        stat = stat[1:5]
                                    az = eventinfo.origins[0].arrivals[j].get('azimuth')
                                    inz = eventinfo.origins[0].arrivals[j].get('takeoff_angle')
                                    fid.write('%-4s %6.2f %6.2f%s \n' % (stat,
                                                                         az,
                                                                         inz,
                                                                         usedarrivals[key]['P']['fm']))
        fid.close()

    def write_focmec():
        if not eventinfo:
            print("No source origin calculated yet, thus no FOCMEC-infile creation possible!")
            return
        with open(filename, 'w') as fid:
            origin = eventinfo.origins[0]
            stime = origin.time
            fid.write('{} {}{:02d}{:02d}{:02d}{:02d}{:02.0f} {:7.4f} {:6.4f} {:3.1f} {:3.1f}\n'.format(
                parameter.get('eventid', 'e0000'), stime.year, stime.month, stime.day, stime.hour, stime.minute,
                stime.second, origin.latitude, origin.longitude, origin.depth / 1000, eventinfo.magnitudes[0].mag))
            for key, value in arrivals.items():
                if 'P' in value and value['P'].get('weight', 0) < 4 and value['P'].get('fm'):
                    for pick in eventinfo.picks:
                        if pick.waveform_id.station_code == key:
                            for arrival in origin.arrivals:
                                if arrival.pick_id == pick.resource_id and arrival.phase == 'P':
                                    stat = key[:4]
                                    az = arrival.azimuth
                                    inz = arrival.takeoff_angle
                                    fid.write('{:<4} {:6.2f} {:6.2f}{}\n'.format(stat, az, inz, value['P']['fm']))
                                    break
    def write_hash():
        # Define filenames for HASH drivers 1 and 2
        filename1 = f"{filename}drv1.phase"
        filename2 = f"{filename}drv2.phase"

        print(f"Writing phases to {filename1} for HASH-driver 1")
        print(f"Writing phases to {filename2} for HASH-driver 2")

        # Open files for writing
        with open(filename1, 'w') as fid1, open(filename2, 'w') as fid2:
            # Get event information needed for the HASH input file
            try:
                eventsource = eventinfo.origins[0]
            except IndexError:
                print("No source origin calculated yet, thus no HASH-infile creation possible!")
                return

            event = parameter.get('eventID')
            hashID = event.split('.')[0][1:5]
            # truncate to integer degrees so the fixed-width integer format specs below work
            latdeg = int(eventsource['latitude'])
            latmin = (eventsource['latitude'] * 60) / 10000
            londeg = int(eventsource['longitude'])
            lonmin = (eventsource['longitude'] * 60) / 10000

            erh = (eventsource.origin_uncertainty['min_horizontal_uncertainty'] +
                   eventsource.origin_uncertainty['max_horizontal_uncertainty']) / 2000
            erz = eventsource.depth_errors['uncertainty']

            stime = eventsource['time']
            syear = stime.year % 100  # two-digit year

            picks = eventinfo.picks

            # Write header line including event information for HASH-driver 1
            fid1.write(f"{syear:02d}{stime.month:02d}{stime.day:02d}{stime.hour:02d}{stime.minute:02d}"
                       f"{stime.second:05.2f}{latdeg:2d}N{latmin:05.2f}{londeg:3d}E{lonmin:05.2f}"
                       f"{eventsource['depth']:6.2f}{eventinfo.magnitudes[0]['mag']:4.2f}{erh:5.2f}{erz:5.2f}{hashID}\n")

            # Write header line including event information for HASH-driver 2
            fid2.write(f"{syear:02d}{stime.month:02d}{stime.day:02d}{stime.hour:02d}{stime.minute:02d}"
                       f"{stime.second:05.2f}{latdeg}N{latmin:05.2f}{londeg}E{lonmin:6.2f}{eventsource['depth']:5.2f}"
                       f"{eventsource['quality']['used_phase_count']:3d}{erh:5.2f}{erz:5.2f}"
                       f"{eventinfo.magnitudes[0]['mag']:4.2f}{hashID}\n")

            # Write phase lines
            for key, arrival in arrivals.items():
                if 'P' in arrival and arrival['P']['weight'] < 4 and arrival['P']['fm'] is not None:
                    stat = key
                    ccode = arrival['P']['channel']
                    ncode = arrival['P']['network']
                    Pqual = 'I' if arrival['P']['weight'] < 2 else 'E'

                    for pick in picks:
                        if pick.waveform_id.station_code == stat:
                            resid_picks = pick.get('resource_id')
                            for origin_arrival in eventinfo.origins[0].arrivals:
                                if (origin_arrival.get('pick_id') == resid_picks and
                                        origin_arrival.phase == 'P'):
                                    if len(stat) > 4:  # HASH handles only 4-character station IDs
                                        stat = stat[1:5]

                                    az = origin_arrival.get('azimuth')
                                    inz = origin_arrival.get('takeoff_angle')
                                    dist = origin_arrival.get('distance')

                                    # Write phase line for HASH-driver 1
                                    fid1.write(f"{stat:<4}{Pqual}P{arrival['P']['fm']}{arrival['P']['weight']:d}"
                                               f"{dist:3.1f}{int(inz):03d}{int(az):03d}{ccode}\n")

                                    # Write phase line for HASH-driver 2
                                    fid2.write(f"{stat:<4} {ncode} {ccode} {Pqual} {arrival['P']['fm']}\n")
                                    break

            fid1.write(f"{'':<36}{hashID}")

    # Prefer manual picks over automatic ones if possible
    arrivals = chooseArrivals(arrivals)  # Function not defined here; assumed to exist

    if fformat == 'NLLoc':
        write_nlloc()
    elif fformat == 'HYPO71':
        write_hypo71()
    elif fformat == 'HYPOSAT':
        write_hyposat()
    elif fformat == 'VELEST':
        write_velest()
    elif fformat == 'HYPODD':
        write_hypodd()
    elif fformat == 'FOCMEC':
        write_focmec()
    elif fformat == 'HASH':
        # two different input files for
        # HASH-driver 1 and 2 (see HASH manual!)
        filename1 = filename + 'drv1' + '.phase'
        filename2 = filename + 'drv2' + '.phase'
        print("Writing phases to %s for HASH for HASH-driver 1" % filename1)
        fid1 = open("%s" % filename1, 'w')
        print("Writing phases to %s for HASH for HASH-driver 2" % filename2)
        fid2 = open("%s" % filename2, 'w')
        # get event information needed for HASH input file
        try:
            eventsource = eventinfo.origins[0]
        except:
            print("No source origin calculated yet, thus no HASH-infile creation possible!")
            return
        event = parameter.get('eventID')
        hashID = event.split('.')[0][1:5]
        latdeg = eventsource['latitude']
        latmin = eventsource['latitude'] * 60 / 10000
        londeg = eventsource['longitude']
        lonmin = eventsource['longitude'] * 60 / 10000
        erh = 1 / 2 * (eventsource.origin_uncertainty['min_horizontal_uncertainty'] +
                       eventsource.origin_uncertainty['max_horizontal_uncertainty']) / 1000
        erz = eventsource.depth_errors['uncertainty']
        stime = eventsource['time']
        if stime.year - 2000 >= 0:
            syear = stime.year - 2000
        else:
            syear = stime.year - 1900
        picks = eventinfo.picks
        # write header line including event information
        # for HASH-driver 1
        fid1.write('%s%02d%02d%02d%02d%5.2f%2dN%5.2f%3dE%5.2f%6.3f%4.2f%5.2f%5.2f%s\n' % (syear,
                   stime.month, stime.day,
                   stime.hour, stime.minute,
                   stime.second,
                   latdeg, latmin, londeg,
                   lonmin, eventsource['depth'],
                   eventinfo.magnitudes[0]['mag'], erh, erz,
                   hashID))
        # write header line including event information
        # for HASH-driver 2
        fid2.write(
            '%d%02d%02d%02d%02d%5.2f%dN%5.2f%3dE%6.2f%5.2f %d %5.2f %5.2f %4.2f %s \n' % (
                syear, stime.month, stime.day,
                stime.hour, stime.minute, stime.second,
                latdeg, latmin, londeg, lonmin,
                eventsource['depth'],
                eventsource['quality']['used_phase_count'],
                erh, erz, eventinfo.magnitudes[0]['mag'],
                hashID))
        # Prefer manual picks over automatic ones if possible
        arrivals = chooseArrivals(arrivals)  # MP MP what is chooseArrivals? It is not defined anywhere
        # write phase lines
        for key in arrivals:
            if 'P' in arrivals[key]:
                if arrivals[key]['P']['weight'] < 4 and arrivals[key]['P']['fm'] is not None:
                    stat = key
                    ccode = arrivals[key]['P']['channel']
                    ncode = arrivals[key]['P']['network']

                    if arrivals[key]['P']['weight'] < 2:
                        Pqual = 'I'
                    else:
                        Pqual = 'E'

                    for i in range(len(picks)):
                        station = picks[i].waveform_id.station_code
                        if station == stat:
                            # get resource ID
                            resid_picks = picks[i].get('resource_id')
                            # find the same ID in eventinfo; there it is the pick_id!
                            for j in range(len(eventinfo.origins[0].arrivals)):
                                resid_eventinfo = eventinfo.origins[0].arrivals[j].get('pick_id')
                                if resid_eventinfo == resid_picks and eventinfo.origins[0].arrivals[j].phase == 'P':
                                    if len(stat) > 4:  # HASH handles only 4-character station IDs
                                        stat = stat[1:5]
                                    az = eventinfo.origins[0].arrivals[j].get('azimuth')
                                    inz = eventinfo.origins[0].arrivals[j].get('takeoff_angle')
                                    dist = eventinfo.origins[0].arrivals[j].get('distance')
                                    # write phase line for HASH-driver 1
                                    fid1.write(
                                        '%-4s%sP%s%d 0 %3.1f %03d %03d 2 1 %s\n' % (
                                            stat, Pqual, arrivals[key]['P']['fm'], arrivals[key]['P']['weight'],
                                            dist, inz, az, ccode))
                                    # write phase line for HASH-driver 2
                                    fid2.write('%-4s %s %s %s %s \n' % (
                                        stat,
                                        ncode,
                                        ccode,
                                        Pqual,
                                        arrivals[key]['P']['fm']))
                                    break

        fid1.write(' %s' % hashID)
        fid1.close()
        fid2.close()
        write_hash()

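`chooseArrivals` is called throughout `write_phases`, but as the inline comments note it is not defined anywhere in this file. A plausible minimal sketch of what it is expected to do, based on the manual-over-automatic preference applied elsewhere in PyLoT (this is an assumption, not the project's actual implementation):

```python
def choose_arrivals(arrivals):
    """Hypothetical stand-in for chooseArrivals: prefer manually revised picks.

    Assumes each pick dict carries a 'picker' entry ('manual' or 'auto'),
    as used elsewhere in this module.
    """
    manual = {station: phases for station, phases in arrivals.items()
              if any(p.get('picker') == 'manual' for p in phases.values())}
    # fall back to all arrivals if nothing was picked manually
    return manual or arrivals
```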
def chooseArrivals(arrivals):
@ -1122,7 +670,7 @@ def getQualitiesfromxml(path, errorsP, errorsS, plotflag=1, figure=None, verbosi
            mstation = pick.waveform_id.station_code
            mstation_ext = mstation + '_'
            for mpick in arrivals_copy:
                phase = identifyPhase(loopIdentifyPhase(pick.phase_hint))  # MP MP catch if this fails?
                if ((mpick.waveform_id.station_code == mstation) or
                        (mpick.waveform_id.station_code == mstation_ext)) and \
                        (mpick.method_id.id.split('/')[1] == 'auto') and \
177 pylot/core/io/project.py (new file)
@ -0,0 +1,177 @@
import os
|
||||
|
||||
from pylot.core.util.event import Event


class Project(object):
    '''
    Pickable class containing information of a PyLoT project, like event lists and file locations.
    '''

    # TODO: remove rootpath
    def __init__(self):
        self.eventlist = []
        self.location = None
        self.rootpath = None
        self.datapath = None
        self.dirty = False
        self.parameter = None
        self._table = None

    def add_eventlist(self, eventlist):
        '''
        Add events from an eventlist containing paths to event directories.
        Will skip existing paths.
        '''
        if len(eventlist) == 0:
            return
        for item in eventlist:
            event = Event(item)
            event.rootpath = self.parameter['rootpath']
            event.database = self.parameter['database']
            event.datapath = self.parameter['datapath']
            if not event.path in self.getPaths():
                self.eventlist.append(event)
                self.setDirty()
            else:
                print('Skipping event with path {}. Already part of project.'.format(event.path))
        self.eventlist.sort(key=lambda x: x.pylot_id)
        self.search_eventfile_info()

    def remove_event(self, event):
        self.eventlist.remove(event)

    def remove_event_by_id(self, eventID):
        for event in self.eventlist:
            if eventID in str(event.resource_id):
                self.remove_event(event)
                break

    def read_eventfile_info(self, filename, separator=','):
        '''
        Try to read event information from file (:param:filename) comparing specific event datetimes.
        File structure (each row): event, date, time, magnitude, latitude, longitude, depth,
        each field separated by :param:separator.
        '''
        with open(filename, 'r') as infile:
            for line in infile.readlines():
                eventID, date, time, mag, lat, lon, depth = line.split(separator)[:7]
                # skip first line
                try:
                    day, month, year = date.split('/')
                except:
                    continue
                year = int(year)
                # hardcoded, if year only consists of 2 digits (e.g. 16 instead of 2016)
                if year < 100:
                    year += 2000
                datetime = '{}-{}-{}T{}'.format(year, month, day, time)
                try:
                    datetime = UTCDateTime(datetime)
                except Exception as e:
                    print(e, datetime, filename)
                    continue
                for event in self.eventlist:
                    if eventID in str(event.resource_id) or eventID in event.origins:
                        if event.origins:
                            origin = event.origins[0]  # should have only one origin
                            if origin.time == datetime:
                                origin.latitude = float(lat)
                                origin.longitude = float(lon)
                                origin.depth = float(depth)
                            else:
                                continue
                        elif not event.origins:
                            origin = Origin(resource_id=event.resource_id,
                                            time=datetime, latitude=float(lat),
                                            longitude=float(lon), depth=float(depth))
                            event.origins.append(origin)
                        event.magnitudes.append(Magnitude(resource_id=event.resource_id,
                                                          mag=float(mag),
                                                          mag_type='M'))
                        break

    def search_eventfile_info(self):
        '''
        Search all datapaths in rootpath for filenames with given file extension fext
        and try to read event info from them.
        '''
        datapaths = []
        fext = '.csv'
        for event in self.eventlist:
            if not event.datapath in datapaths:
                datapaths.append(event.datapath)
        for datapath in datapaths:
            # datapath = os.path.join(self.rootpath, datapath)
            if os.path.isdir(datapath):
                for filename in os.listdir(datapath):
                    filename = os.path.join(datapath, filename)
                    if os.path.isfile(filename) and filename.endswith(fext):
                        try:
                            self.read_eventfile_info(filename)
                        except Exception as e:
                            print('Failed on reading eventfile info from file {}: {}'.format(filename, e))
            else:
                print("Directory %s does not exist!" % datapath)

    def getPaths(self):
        '''
        Returns paths (eventlist) of all events saved in the project.
        '''
        paths = []
        for event in self.eventlist:
            paths.append(event.path)
        return paths

    def setDirty(self, value=True):
        self.dirty = value

    def getEventFromPath(self, path):
        '''
        Search for an event in the project by event path.
        '''
        for event in self.eventlist:
            if event.path == path:
                return event

    def save(self, filename=None):
        '''
        Save PyLoT Project to a file.
        Can be loaded by using project.load(filename).
        '''
        try:
            import pickle
        except ImportError:
            import _pickle as pickle

        if filename:
            self.location = filename
        else:
            filename = self.location

        table = self._table  # MP: see below
        try:
            outfile = open(filename, 'wb')
            self._table = []  # MP: Workaround as long as table cannot be saved as part of project
            pickle.dump(self, outfile, protocol=pickle.HIGHEST_PROTOCOL)
            self.setDirty(False)
            self._table = table  # MP: see above
            return True
        except Exception as e:
            print('Could not pickle PyLoT project. Reason: {}'.format(e))
            self.setDirty()
            self._table = table  # MP: see above
            return False

    @staticmethod
    def load(filename):
        '''
        Load project from filename.
        '''
        import pickle
        infile = open(filename, 'rb')
        project = pickle.load(infile)
        infile.close()
        project.location = filename
        print('Loaded %s' % filename)
        return project
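For orientation, a minimal save/load round trip with the class above. This is a sketch only: the file path is a placeholder and the parameter dictionary is a stand-in for a real PyLoT parameter object.

```python
# minimal sketch; Project is the class defined above, the path below is a placeholder
project = Project()
project.parameter = {'rootpath': '', 'database': '', 'datapath': ''}  # stand-in parameters
project.add_eventlist([])                  # empty list, returns immediately
if project.save('/tmp/example.plp'):       # pickles the project, stashing _table first
    restored = Project.load('/tmp/example.plp')
    print(restored.getPaths())             # -> []
```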
13
pylot/core/io/utils.py
Normal file
@ -0,0 +1,13 @@
import os
from typing import List


def validate_filenames(filenames: List[str]) -> List[str]:
    """
    Validate a list of filenames for file existence.
    :param filenames: list of possible filenames
    :type filenames: List[str]
    :return: list of valid filenames
    :rtype: List[str]
    """
    return [fn for fn in filenames if os.path.isfile(fn)]
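A quick usage sketch of the new helper (the glob pattern and directory are hypothetical):

```python
from glob import glob

from pylot.core.io.utils import validate_filenames

# collect candidate files, then keep only those that actually exist on disk
candidates = glob('/data/events/e0001.015.16/*.mseed')  # hypothetical event directory
valid = validate_filenames(candidates)
print('{} of {} candidate files exist'.format(len(valid), len(candidates)))
```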
123
pylot/core/io/waveformdata.py
Normal file
@ -0,0 +1,123 @@
import logging
from dataclasses import dataclass, field
from typing import Union, List

from obspy import Stream, read
from obspy.io.sac import SacIOError

from pylot.core.io.utils import validate_filenames
from pylot.core.util.dataprocessing import Metadata
from pylot.core.util.utils import get_stations, check_for_nan, check4rotated


@dataclass
class WaveformData:
    wfdata: Stream = field(default_factory=Stream)
    wforiginal: Union[Stream, None] = None
    wf_alt: Stream = field(default_factory=Stream)
    dirty: bool = False

    def load_waveforms(self, fnames: List[str], fnames_alt: List[str] = None, check_rotated=False, metadata=None,
                       tstart=0, tstop=0):
        fn_list = validate_filenames(fnames)
        if not fn_list:
            logging.warning('No valid filenames given for loading waveforms')
        else:
            self.clear()
            self.add_waveforms(fn_list)

        if fnames_alt is None:
            alt_fn_list = []  # keep the name bound for the combined check below
        else:
            alt_fn_list = validate_filenames(fnames_alt)
            if not alt_fn_list:
                logging.warning('No valid alternative filenames given for loading waveforms')
            else:
                self.add_waveforms(alt_fn_list, alternative=True)

        if not fn_list and not alt_fn_list:
            logging.error('No filenames or alternative filenames given for loading waveforms')
            return False

        self.merge()
        self.replace_nan()
        if not check_rotated or not metadata:
            pass
        else:
            self.rotate_zne(metadata)  # forward the metadata so check4rotated can use it
        self.trim_station_traces()
        self.wforiginal = self.wfdata.copy()
        self.dirty = False
        return True

    def add_waveforms(self, fnames: List[str], alternative: bool = False):
        data_stream = self.wf_alt if alternative else self.wfdata
        warnmsg = ''
        for fname in set(fnames):
            try:
                data_stream += read(fname)
            except TypeError:
                # fall back to explicit formats if the file type cannot be detected
                try:
                    data_stream += read(fname, format='GSE2')
                except Exception:
                    try:
                        data_stream += read(fname, format='SEGY')
                    except Exception as e:
                        warnmsg += f'{fname}\n{e}\n'
            except SacIOError as se:
                warnmsg += f'{fname}\n{se}\n'

        if warnmsg:
            print(f'WARNING in add_waveforms: unable to read waveform data\n{warnmsg}')

    def clear(self):
        self.wfdata = Stream()
        self.wforiginal = None
        self.wf_alt = Stream()

    def reset(self):
        """
        Resets the waveform data to its original state.
        """
        if self.wforiginal:
            self.wfdata = self.wforiginal.copy()
        else:
            self.wfdata = Stream()
        self.dirty = False

    def merge(self):
        """
        Check for gaps in Stream and merge if gaps are found.
        """
        gaps = self.wfdata.get_gaps()
        if gaps:
            merged = ['{}.{}.{}.{}'.format(*gap[:4]) for gap in gaps]
            self.wfdata.merge(method=1)
            logging.info('Merged the following stations because of gaps:')
            for station in merged:
                logging.info(station)

    def replace_nan(self):
        """
        Replace all NaNs in data with 0. (in place)
        """
        self.wfdata = check_for_nan(self.wfdata)

    def rotate_zne(self, metadata: Metadata = None):
        """
        Check all traces in stream for rotation. If a trace is not in ZNE rotation (last symbol of channel code is
        numeric) and the trace is in the metadata with azimuth and dip, rotate it to classical ZNE orientation.
        Rotating the traces requires them to be of the same length, so all traces will be trimmed to a common
        length as a side effect.
        """
        self.wfdata = check4rotated(self.wfdata, metadata)

    def trim_station_traces(self):
        """
        Trim data stream to common time window.
        """
        for station in get_stations(self.wfdata):
            station_traces = self.wfdata.select(station=station)
            station_traces.trim(starttime=max([trace.stats.starttime for trace in station_traces]),
                                endtime=min([trace.stats.endtime for trace in station_traces]))
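As a quick orientation, a hedged usage sketch of the new class; the file names are hypothetical, and `check_rotated` would additionally need a `Metadata` instance:

```python
from pylot.core.io.waveformdata import WaveformData

wfd = WaveformData()
# hypothetical miniSEED files; non-existing paths are filtered out by validate_filenames
ok = wfd.load_waveforms(['/data/e0001/GR.GRA1..BHZ.mseed',
                         '/data/e0001/GR.GRA1..BHN.mseed'])
if ok:
    print(wfd.wfdata)                                        # merged, NaN-free, trimmed stream
    wfd.wfdata.filter('bandpass', freqmin=0.5, freqmax=5.0)  # work on the active copy
    wfd.reset()                                              # restore the pristine wforiginal copy
```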
@ -1,7 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.io.phases import write_phases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
@ -25,4 +25,4 @@ def export(picks, fnout, parameter, eventinfo):
    :type eventinfo: list object
    '''
    # write phases to FOCMEC-phase file
    writephases(picks, 'FOCMEC', fnout, parameter, eventinfo)
    write_phases(picks, 'FOCMEC', fnout, parameter, eventinfo)

@ -1,7 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.io.phases import write_phases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
@ -25,4 +25,4 @@ def export(picks, fnout, parameter, eventinfo):
    :type eventinfo: list object
    '''
    # write phases to HASH-phase file
    writephases(picks, 'HASH', fnout, parameter, eventinfo)
    write_phases(picks, 'HASH', fnout, parameter, eventinfo)

@ -1,7 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.io.phases import write_phases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
@ -22,4 +22,4 @@ def export(picks, fnout, parameter):
    :type parameter: object
    '''
    # write phases to HYPO71-phase file
    writephases(picks, 'HYPO71', fnout, parameter)
    write_phases(picks, 'HYPO71', fnout, parameter)

@ -1,7 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.io.phases import write_phases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
@ -25,4 +25,4 @@ def export(picks, fnout, parameter, eventinfo):
    :type eventinfo: list object
    '''
    # write phases to hypoDD-phase file
    writephases(picks, 'HYPODD', fnout, parameter, eventinfo)
    write_phases(picks, 'HYPODD', fnout, parameter, eventinfo)

@ -1,7 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.io.phases import write_phases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
@ -22,4 +22,4 @@ def export(picks, fnout, parameter):
    :type parameter: object
    '''
    # write phases to HYPOSAT-phase file
    writephases(picks, 'HYPOSAT', fnout, parameter)
    write_phases(picks, 'HYPOSAT', fnout, parameter)

@ -7,7 +7,7 @@ import subprocess

from obspy import read_events

from pylot.core.io.phases import writephases
from pylot.core.io.phases import write_phases
from pylot.core.util.gui import which
from pylot.core.util.utils import getPatternLine, runProgram
from pylot.core.util.version import get_git_version as _getVersionString
@ -34,7 +34,7 @@ def export(picks, fnout, parameter):
    :type parameter: object
    '''
    # write phases to NLLoc-phase file
    writephases(picks, 'NLLoc', fnout, parameter)
    write_phases(picks, 'NLLoc', fnout, parameter)


def modify_inputs(ctrfn, root, nllocoutn, phasefn, tttn):

@ -1,7 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.io.phases import write_phases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
@ -25,4 +25,4 @@ def export(picks, fnout, eventinfo, parameter=None):
    :type parameter: object
    '''
    # write phases to VELEST-phase file
    writephases(picks, 'VELEST', fnout, parameter, eventinfo)
    write_phases(picks, 'VELEST', fnout, parameter, eventinfo)
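The hunks above are a mechanical rename of `writephases` to `write_phases` across all location-tool exporters. If external scripts still import the old name, a thin alias could keep them working; this is a sketch under that assumption, not part of the branch:

```python
# hypothetical compatibility alias, e.g. at the bottom of pylot/core/io/phases.py
import warnings


def writephases(*args, **kwargs):
    """Deprecated alias for write_phases, kept for external scripts."""
    warnings.warn('writephases is deprecated, use write_phases instead',
                  DeprecationWarning, stacklevel=2)
    return write_phases(*args, **kwargs)
```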
@ -262,10 +262,6 @@ class AutopickStation(object):
        self.metadata = metadata
        self.origin = origin

        # initialize TauPy pick estimates
        self.estFirstP = None
        self.estFirstS = None

        # initialize picking results
        self.p_results = PickingResults()
        self.s_results = PickingResults()
@ -447,15 +443,15 @@ class AutopickStation(object):
            for arr in arrivals:
                phases[identifyPhaseID(arr.phase.name)].append(arr)
            # get first P and S onsets from arrivals list
            arrival_time_p = 0
            arrival_time_s = 0
            estFirstP = 0
            estFirstS = 0
            if len(phases['P']) > 0:
                arrP, arrival_time_p = min([(arr, arr.time) for arr in phases['P']], key=lambda t: t[1])
                arrP, estFirstP = min([(arr, arr.time) for arr in phases['P']], key=lambda t: t[1])
            if len(phases['S']) > 0:
                arrS, arrival_time_s = min([(arr, arr.time) for arr in phases['S']], key=lambda t: t[1])
                arrS, estFirstS = min([(arr, arr.time) for arr in phases['S']], key=lambda t: t[1])
            print('autopick: estimated first arrivals for P: {} s, S:{} s after event'
                  ' origin time using TauPy'.format(arrival_time_p, arrival_time_s))
            return arrival_time_p, arrival_time_s
                  ' origin time using TauPy'.format(estFirstP, estFirstS))
            return estFirstP, estFirstS

        def exit_taupy():
            """If taupy failed to calculate theoretical starttimes, picking continues.
@ -481,13 +477,10 @@ class AutopickStation(object):
            raise AttributeError('No source origins given!')

        arrivals = create_arrivals(self.metadata, self.origin, self.pickparams["taup_model"])
        arrival_P, arrival_S = first_PS_onsets(arrivals)

        self.estFirstP = (self.origin[0].time + arrival_P) - self.ztrace.stats.starttime

        estFirstP, estFirstS = first_PS_onsets(arrivals)
        # modify pstart and pstop relative to estimated first P arrival (relative to station time axis)
        self.pickparams["pstart"] += self.estFirstP
        self.pickparams["pstop"] += self.estFirstP
        self.pickparams["pstart"] += (self.origin[0].time + estFirstP) - self.ztrace.stats.starttime
        self.pickparams["pstop"] += (self.origin[0].time + estFirstP) - self.ztrace.stats.starttime
        print('autopick: CF calculation times respectively:'
              ' pstart: {} s, pstop: {} s'.format(self.pickparams["pstart"], self.pickparams["pstop"]))
        # make sure pstart and pstop are inside the starttime/endtime of vertical trace
@ -498,10 +491,9 @@ class AutopickStation(object):
        # for the two horizontal components take earliest and latest time to make sure that the s onset is not clipped
        # if start and endtime of horizontal traces differ, the s windowsize will automatically increase
        trace_s_start = min([self.etrace.stats.starttime, self.ntrace.stats.starttime])
        self.estFirstS = (self.origin[0].time + arrival_S) - trace_s_start
        # modify sstart and sstop relative to estimated first S arrival (relative to station time axis)
        self.pickparams["sstart"] += self.estFirstS
        self.pickparams["sstop"] += self.estFirstS
        self.pickparams["sstart"] += (self.origin[0].time + estFirstS) - trace_s_start
        self.pickparams["sstop"] += (self.origin[0].time + estFirstS) - trace_s_start
        print('autopick: CF calculation times respectively:'
              ' sstart: {} s, sstop: {} s'.format(self.pickparams["sstart"], self.pickparams["sstop"]))
        # make sure pstart and pstop are inside the starttime/endtime of horizontal traces
@ -617,12 +609,6 @@ class AutopickStation(object):
            # plot tapered trace filtered with bpz2 filter settings
            ax1.plot(tdata, self.tr_filt_z_bpz2.data / max(self.tr_filt_z_bpz2.data), color=linecolor, linewidth=0.7,
                     label='Data')
            # plot pickwindows for P
            pstart, pstop = self.pickparams['pstart'], self.pickparams['pstop']
            if pstart is not None and pstop is not None:
                ax1.axvspan(pstart, pstop, color='r', alpha=0.1, zorder=0, label='P window')
            if self.estFirstP is not None:
                ax1.axvline(self.estFirstP, ls='dashed', color='r', alpha=0.4, label='TauPy estimate')
            if self.p_results.weight < 4:
                # plot CF of initial onset (HOScf or ARZcf)
                ax1.plot(self.cf1.getTimeArray(), self.cf1.getCF() / max(self.cf1.getCF()), 'b', label='CF1')
@ -727,15 +713,6 @@ class AutopickStation(object):
            ax3.plot([refSpick.getpick() - 0.5, refSpick.getpick() + 0.5], [-1.3, -1.3], 'g', linewidth=2)
            ax3.plot([self.s_results.lpp, self.s_results.lpp], [-1.1, 1.1], 'g--', label='lpp')
            ax3.plot([self.s_results.epp, self.s_results.epp], [-1.1, 1.1], 'g--', label='epp')

            # plot pickwindows for S
            sstart, sstop = self.pickparams['sstart'], self.pickparams['sstop']
            if sstart is not None and sstop is not None:
                for axis in [ax2, ax3]:
                    axis.axvspan(sstart, sstop, color='b', alpha=0.1, zorder=0, label='S window')
                    if self.estFirstS is not None:
                        axis.axvline(self.estFirstS, ls='dashed', color='b', alpha=0.4, label='TauPy estimate')

            ax3.legend(loc=1)
            ax3.set_yticks([])
            ax3.set_ylim([-1.5, 1.5])
@ -858,21 +835,14 @@ class AutopickStation(object):
            self.cf1 = None
        assert isinstance(self.cf1, CharacteristicFunction), 'cf1 is not set correctly: maybe the algorithm name ({})' \
                                                             ' is corrupted'.format(self.pickparams["algoP"])
        # get the original waveform stream from first CF class cut to identical length as CF for plotting
        cut_ogstream = self.cf1.getDataArray(self.cf1.getCut())

        # MP: Rename to cf_stream for further use of z_copy and to prevent chaos when z_copy suddenly becomes a cf
        # stream and later again a waveform stream
        cf_stream = z_copy.copy()
        cf_stream[0].data = self.cf1.getCF()

        # calculate AIC cf from first cf (either HOS or ARZ)
        aiccf = AICcf(cf_stream, cuttimes)
        z_copy[0].data = self.cf1.getCF()
        aiccf = AICcf(z_copy, cuttimes)
        # get preliminary onset time from AIC-CF
        self.set_current_figure('aicFig')
        aicpick = AICPicker(aiccf, self.pickparams["tsnrz"], self.pickparams["pickwinP"], self.iplot,
                            Tsmooth=self.pickparams["aictsmooth"], fig=self.current_figure,
                            linecolor=self.current_linecolor, ogstream=cut_ogstream)
                            linecolor=self.current_linecolor)
        # save aicpick for plotting later
        self.p_data.aicpick = aicpick
        # add pstart and pstop to aic plot
@ -885,7 +855,7 @@ class AutopickStation(object):
                     label='P stop')
            ax.legend(loc=1)

        Pflag = self._pick_p_quality_control(aicpick, cf_stream, tr_filt)
        Pflag = self._pick_p_quality_control(aicpick, z_copy, tr_filt)
        # go on with processing if AIC onset passes quality control
        slope = aicpick.getSlope()
        if not slope: slope = 0
@ -924,7 +894,7 @@ class AutopickStation(object):
            refPpick = PragPicker(self.cf2, self.pickparams["tsnrz"], self.pickparams["pickwinP"], self.iplot,
                                  self.pickparams["ausP"],
                                  self.pickparams["tsmoothP"], aicpick.getpick(), self.current_figure,
                                  self.current_linecolor, ogstream=cut_ogstream)
                                  self.current_linecolor)
            # save PragPicker result for plotting
            self.p_data.refPpick = refPpick
            self.p_results.mpp = refPpick.getpick()
@ -1176,14 +1146,11 @@ class AutopickStation(object):
        # calculate AIC cf
        haiccf = self._calculate_aic_cf_s_pick(cuttimesh)

        # get the original waveform stream cut to identical length as CF for plotting
        ogstream = haiccf.getDataArray(haiccf.getCut())

        # get preliminary onset time from AIC cf
        self.set_current_figure('aicARHfig')
        aicarhpick = AICPicker(haiccf, self.pickparams["tsnrh"], self.pickparams["pickwinS"], self.iplot,
                               Tsmooth=self.pickparams["aictsmoothS"], fig=self.current_figure,
                               linecolor=self.current_linecolor, ogstream=ogstream)
                               linecolor=self.current_linecolor)
        # save pick for later plotting
        self.aicarhpick = aicarhpick
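The hunks above shift the P and S picking windows by theoretical first arrivals. For context, a hedged sketch of how such estimates are typically obtained with ObsPy's TauPy; the model name and source geometry are placeholders:

```python
from obspy.taup import TauPyModel

model = TauPyModel(model='iasp91')  # placeholder velocity model
arrivals = model.get_travel_times(source_depth_in_km=10.0,
                                  distance_in_degree=3.5,
                                  phase_list=['P', 'S'])
# earliest P and S travel times in seconds after origin time
est_first_p = min(a.time for a in arrivals if a.phase.name.startswith('P'))
est_first_s = min(a.time for a in arrivals if a.phase.name.startswith('S'))
print(est_first_p, est_first_s)
```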
@ -17,11 +17,7 @@ autoregressive prediction: application ot local and regional distances, Geophys.
:author: MAGS2 EP3 working group
"""
import numpy as np
try:
    from scipy.signal import tukey
except ImportError:
    from scipy.signal.windows import tukey

from scipy import signal
from obspy.core import Stream

from pylot.core.pick.utils import PickingFailedException
@ -60,7 +56,7 @@ class CharacteristicFunction(object):
        self.setOrder(order)
        self.setFnoise(fnoise)
        self.setARdetStep(t2)
        self.calcCF()
        self.calcCF(self.getDataArray())
        self.arpara = np.array([])
        self.xpred = np.array([])

@ -212,15 +208,17 @@ class CharacteristicFunction(object):
            data = self.orig_data.copy()
        return data

    def calcCF(self):
        pass
    def calcCF(self, data=None):
        self.cf = data


class AICcf(CharacteristicFunction):

    def calcCF(self):
    def calcCF(self, data):
        """
        Function to calculate the Akaike Information Criterion (AIC) after Maeda (1985).
        :param data: data, time series (whether seismogram or CF)
        :type data: tuple
        :return: AIC function
        :rtype:
        """
@ -229,7 +227,7 @@ class AICcf(CharacteristicFunction):
        ind = np.where(~np.isnan(xnp))[0]
        if ind.size:
            xnp[:ind[0]] = xnp[ind[0]]
        xnp = tukey(len(xnp), alpha=0.05) * xnp
        xnp = signal.tukey(len(xnp), alpha=0.05) * xnp
        xnp = xnp - np.mean(xnp)
        datlen = len(xnp)
        k = np.arange(1, datlen)
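For readers unfamiliar with the Maeda (1985) variant used here: it computes the AIC directly from a time series x of length N, without fitting AR models, as AIC(k) = k log(var(x[1..k])) + (N - k - 1) log(var(x[k+1..N])), and the minimum marks the onset. A compact NumPy sketch, illustrative only and without the tapering and NaN handling above:

```python
import numpy as np


def maeda_aic(x: np.ndarray) -> np.ndarray:
    """AIC after Maeda (1985); the argmin is the onset estimate."""
    n = len(x)
    aic = np.full(n, np.nan)
    for k in range(1, n - 1):
        var1 = np.var(x[:k])   # variance of the leading segment
        var2 = np.var(x[k:])   # variance of the trailing segment
        if var1 > 0 and var2 > 0:  # guard against log(0) on constant segments
            aic[k] = k * np.log(var1) + (n - k - 1) * np.log(var2)
    return aic

# onset_index = np.nanargmin(maeda_aic(trace_data))  # trace_data: 1-D seismogram array
```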
@ -258,11 +256,13 @@ class HOScf(CharacteristicFunction):
        """
        super(HOScf, self).__init__(data, cut, pickparams["tlta"], pickparams["hosorder"])

    def calcCF(self):
    def calcCF(self, data):
        """
        Function to calculate skewness (statistics of order 3) or kurtosis
        (statistics of order 4), using one long moving window, as published
        in Kueperkoch et al. (2010), or order 2, i.e. STA/LTA.
        :param data: data, time series (whether seismogram or CF)
        :type data: tuple
        :return: HOS cf
        :rtype:
        """
@ -277,28 +277,47 @@ class HOScf(CharacteristicFunction):
        elif self.getOrder() == 4:  # this is kurtosis
            y = np.power(xnp, 4)
            y1 = np.power(xnp, 2)
        elif self.getOrder() == 2:  # this is variance, used for STA/LTA processing
            y = np.power(xnp, 2)
            y1 = np.power(xnp, 2)

        # Initialisation
        # t2: long term moving window
        ilta = int(round(self.getTime2() / self.getIncrement()))
        ista = int(round((self.getTime2() / 10) / self.getIncrement()))  # TODO: still hard coded!!
        lta = y[0]
        lta1 = y1[0]
        sta = y[0]
        # moving windows
        LTA = np.zeros(len(xnp))
        STA = np.zeros(len(xnp))
        for j in range(0, len(xnp)):
            if j < 4:
                LTA[j] = 0
                STA[j] = 0
            elif j <= ista and self.getOrder() == 2:
                lta = (y[j] + lta * (j - 1)) / j
                if self.getOrder() == 2:
                    sta = (y[j] + sta * (j - 1)) / j
            # elif j < 4:
            elif j <= ilta:
                lta = (y[j] + lta * (j - 1)) / j
                lta1 = (y1[j] + lta1 * (j - 1)) / j
                if self.getOrder() == 2:
                    sta = (y[j] - y[j - ista]) / ista + sta
            else:
                lta = (y[j] - y[j - ilta]) / ilta + lta
                lta1 = (y1[j] - y1[j - ilta]) / ilta + lta1
                if self.getOrder() == 2:
                    sta = (y[j] - y[j - ista]) / ista + sta
            # define LTA
            if self.getOrder() == 3:
                LTA[j] = lta / np.power(lta1, 1.5)
            elif self.getOrder() == 4:
                LTA[j] = lta / np.power(lta1, 2)
            else:
                LTA[j] = lta
            STA[j] = sta

        # remove NaN's with first not-NaN-value,
        # so autopicker doesn't pick discontinuity at start of the trace
@ -307,7 +326,10 @@ class HOScf(CharacteristicFunction):
            first = ind[0]
            LTA[:first] = LTA[first]

        self.cf = LTA
        if self.getOrder() > 2:
            self.cf = LTA
        else:  # order 2 means STA/LTA!
            self.cf = STA / LTA
        self.xcf = x
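The order-2 branch added above effectively turns HOScf into a classical STA/LTA detector. ObsPy ships a reference implementation that can serve as a cross-check; window lengths are in samples, and the values below are placeholders:

```python
from obspy import read
from obspy.signal.trigger import classic_sta_lta

tr = read()[0]                        # ObsPy's bundled example trace
df = tr.stats.sampling_rate
# 1 s short-term and 10 s long-term window, both converted to samples
cft = classic_sta_lta(tr.data, int(1 * df), int(10 * df))
print(cft.max())                      # peak of the STA/LTA characteristic function
```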
@ -317,10 +339,12 @@ class ARZcf(CharacteristicFunction):
        super(ARZcf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Parorder"],
                                    fnoise=pickparams["addnoise"])

    def calcCF(self):
    def calcCF(self, data):
        """
        Function used to calculate the AR prediction error from a single vertical trace. Can be used to pick
        P onsets.
        :param data:
        :type data: ~obspy.core.stream.Stream
        :return: ARZ cf
        :rtype:
        """
@ -451,12 +475,14 @@ class ARHcf(CharacteristicFunction):
        super(ARHcf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Sarorder"],
                                    fnoise=pickparams["addnoise"])

    def calcCF(self):
    def calcCF(self, data):
        """
        Function to calculate a characteristic function using autoregressive modelling of the waveform of
        both horizontal traces.
        The waveform is predicted in a moving time window using the calculated AR parameters. The difference
        between the predicted and the actual waveform serves as a characteristic function.
        :param data: waveform stream
        :type data: ~obspy.core.stream.Stream
        :return: ARH cf
        :rtype:
        """
@ -605,12 +631,14 @@ class AR3Ccf(CharacteristicFunction):
        super(AR3Ccf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Sarorder"],
                                    fnoise=pickparams["addnoise"])

    def calcCF(self):
    def calcCF(self, data):
        """
        Function to calculate a characteristic function using autoregressive modelling of the waveform of
        all three traces.
        The waveform is predicted in a moving time window using the calculated AR parameters. The difference
        between the predicted and the actual waveform serves as a characteristic function.
        :param data: stream holding all three traces
        :type data: ~obspy.core.stream.Stream
        :return: AR3C cf
        :rtype:
        """
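The three AR-based CFs share one idea: predict the waveform from fitted AR coefficients and use the misfit as the characteristic function. A deliberately simplified single-window sketch of that idea; the real classes refit coefficients in a moving window and add noise stabilization:

```python
import numpy as np


def ar_prediction_error(x: np.ndarray, order: int = 4) -> np.ndarray:
    """Fit AR(order) on the whole trace by least squares and return the
    squared one-step prediction error (simplified: no moving window)."""
    # regression matrix of lagged samples: row j predicts x[order + j]
    rows = [x[i:len(x) - order + i] for i in range(order)]
    A = np.column_stack(rows)
    b = x[order:]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    pred = A @ coeffs
    err = np.zeros_like(x, dtype=float)
    err[order:] = (b - pred) ** 2
    return err
```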
@ -37,8 +37,7 @@ class AutoPicker(object):

    warnings.simplefilter('ignore')

    def __init__(self, cf, TSNR, PickWindow, iplot=0, aus=None, Tsmooth=None, Pick1=None,
                 fig=None, linecolor='k', ogstream=None):
    def __init__(self, cf, TSNR, PickWindow, iplot=0, aus=None, Tsmooth=None, Pick1=None, fig=None, linecolor='k'):
        """
        Create AutoPicker object
        :param cf: characteristic function, on which the picking algorithm is applied
@ -60,15 +59,12 @@ class AutoPicker(object):
        :type fig: `~matplotlib.figure.Figure`
        :param linecolor: matplotlib line color string
        :type linecolor: str
        :param ogstream: original stream (waveform), e.g. for plotting purposes
        :type ogstream: `~obspy.core.stream.Stream`
        """

        assert isinstance(cf, CharacteristicFunction), "%s is not a CharacteristicFunction object" % str(cf)
        self._linecolor = linecolor
        self._pickcolor_p = 'b'
        self.cf = cf.getCF()
        self.ogstream = ogstream
        self.Tcf = cf.getTimeArray()
        self.Data = cf.getXCF()
        self.dt = cf.getIncrement()
@ -177,7 +173,7 @@ class AICPicker(AutoPicker):
        nn = np.isnan(self.cf)
        if len(nn) > 1:
            self.cf[nn] = 0
        # taper AIC-CF to get rid of side maxima
        # taper AIC-CF to get rid off side maxima
        tap = np.hanning(len(self.cf))
        aic = tap * self.cf + max(abs(self.cf))
        # smooth AIC-CF
@ -320,7 +316,16 @@ class AICPicker(AutoPicker):
                    plt.close(fig)
                return
            iislope = islope[0][0:imax + 1]
            dataslope = self.Data[0].data[iislope]
            # MP MP change slope calculation
            # get all maxima of aicsmooth
            iaicmaxima = argrelmax(aicsmooth)[0]
            # get first index of maximum after pickindex (indices saved in iaicmaxima)
            aicmax = iaicmaxima[np.where(iaicmaxima > pickindex)[0]]
            if len(aicmax) > 0:
                iaicmax = aicmax[0]
            else:
                iaicmax = -1
            dataslope = aicsmooth[pickindex: iaicmax]
            # calculate slope as polynomial fit of order 1
            xslope = np.arange(0, len(dataslope), 1)
            try:
@ -331,7 +336,7 @@ class AICPicker(AutoPicker):
                else:
                    self.slope = 1 / (len(dataslope) * self.Data[0].stats.delta) * (datafit[-1] - datafit[0])
                    # normalize slope to maximum of cf to make it unit independent
                    self.slope /= self.Data[0].data[icfmax]
                    self.slope /= aicsmooth[iaicmax]
            except Exception as e:
                print("AICPicker: Problems with data fitting! {}".format(e))

@ -351,12 +356,6 @@ class AICPicker(AutoPicker):
                self.Tcf = self.Tcf[0:len(self.Tcf) - 1]
            ax1.plot(self.Tcf, cf / max(cf), color=self._linecolor, linewidth=0.7, label='(HOS-/AR-) Data')
            ax1.plot(self.Tcf, aicsmooth / max(aicsmooth), 'r', label='Smoothed AIC-CF')
            # plot the original waveform also for evaluation of the CF and pick
            if self.ogstream:
                data = self.ogstream[0].data
                if len(data) == len(self.Tcf):
                    ax1.plot(self.Tcf, 0.5 * data / max(data), 'k', label='Seismogram', alpha=0.3, zorder=0,
                             lw=0.5)
            if self.Pick is not None:
                ax1.plot([self.Pick, self.Pick], [-0.1, 0.5], 'b', linewidth=2, label='AIC-Pick')
            ax1.set_xlabel('Time [s] since %s' % self.Data[0].stats.starttime)
@ -377,7 +376,7 @@ class AICPicker(AutoPicker):
                         label='Signal Window')
            ax2.axvspan(self.Tcf[iislope[0]], self.Tcf[iislope[-1]], color='g', alpha=0.2, lw=0,
                        label='Slope Window')
            ax2.plot(self.Tcf[iislope], datafit, 'g', linewidth=2,
            ax2.plot(self.Tcf[pickindex: iaicmax], datafit, 'g', linewidth=2,
                     label='Slope')  # MP MP changed temporarily!

            if self.slope is not None:
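The new slope computation above fits a first-order polynomial to the smoothed AIC between the pick index and the next local maximum, then normalizes by the CF amplitude there. A toy sketch of that fit; the data and indices are stand-ins:

```python
import numpy as np
from scipy.signal import argrelmax

aicsmooth = np.random.randn(1000).cumsum()   # stand-in for the smoothed AIC-CF
pickindex = 400                              # stand-in preliminary pick index
maxima = argrelmax(aicsmooth)[0]
after = maxima[maxima > pickindex]
iaicmax = after[0] if len(after) else -1     # first maximum after the pick
segment = aicsmooth[pickindex:iaicmax]
xs = np.arange(len(segment))
datafit = np.polyval(np.polyfit(xs, segment, 1), xs)
slope = (datafit[-1] - datafit[0]) / len(segment)  # per-sample slope before normalization
print(slope)
```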
@ -15,7 +15,7 @@ import numpy as np
from obspy.core import Stream, UTCDateTime
from scipy.signal import argrelmax

from pylot.core.util.utils import get_bool, get_none, SetChannelComponents, common_range
from pylot.core.util.utils import get_bool, get_none, SetChannelComponents


def earllatepicker(X, nfac, TSNR, Pick1, iplot=0, verbosity=1, fig=None, linecolor='k'):
@ -828,22 +828,14 @@ def checksignallength(X, pick, minsiglength, pickparams, iplot=0, fig=None, line
    if len(X) > 1:
        # all three components available
        # make sure all components have equal lengths
        earliest_starttime = min(tr.stats.starttime for tr in X)
        cuttimes = common_range(X)
        X = X.slice(cuttimes[0], cuttimes[1])
        x1, x2, x3 = X[:3]

        if not (len(x1) == len(x2) == len(x3)):
            raise PickingFailedException('checksignallength: unequal lengths of components!')

        ilen = min([len(X[0].data), len(X[1].data), len(X[2].data)])
        x1 = X[0][0:ilen]
        x2 = X[1][0:ilen]
        x3 = X[2][0:ilen]
        # get RMS trace
        rms = np.sqrt((np.power(x1, 2) + np.power(x2, 2) + np.power(x3, 2)) / 3)
        ilen = len(rms)
        dt = earliest_starttime - X[0].stats.starttime
        pick -= dt
    else:
        x1 = X[0].data
        x2 = x3 = None
        ilen = len(x1)
        rms = abs(x1)

@ -882,10 +874,6 @@ def checksignallength(X, pick, minsiglength, pickparams, iplot=0, fig=None, line
            fig._tight = True
        ax = fig.add_subplot(111)
        ax.plot(t, rms, color=linecolor, linewidth=0.7, label='RMS Data')
        ax.plot(t, x1, 'k', alpha=0.3, lw=0.3, zorder=0)
        if x2 is not None and x3 is not None:
            ax.plot(t, x2, 'r', alpha=0.3, lw=0.3, zorder=0)
            ax.plot(t, x3, 'g', alpha=0.3, lw=0.3, zorder=0)
        ax.axvspan(t[inoise[0]], t[inoise[-1]], color='y', alpha=0.2, lw=0, label='Noise Window')
        ax.axvspan(t[isignal[0]], t[isignal[-1]], color='b', alpha=0.2, lw=0, label='Signal Window')
        ax.plot([t[isignal[0]], t[isignal[len(isignal) - 1]]],
@ -895,7 +883,6 @@ def checksignallength(X, pick, minsiglength, pickparams, iplot=0, fig=None, line
        ax.set_xlabel('Time [s] since %s' % X[0].stats.starttime)
        ax.set_ylabel('Counts')
        ax.set_title('Check for Signal Length, Station %s' % X[0].stats.station)
        ax.set_xlim(pickparams["pstart"], pickparams["pstop"])
        ax.set_yticks([])
        if plt_flag == 1:
            fig.show()
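The hunk above replaces the common_range slicing with explicit truncation to the shortest component before building the three-component RMS trace. In essence:

```python
import numpy as np

# x1, x2, x3: the three component arrays, possibly of unequal length (stand-ins here)
x1, x2, x3 = (np.random.randn(n) for n in (1000, 998, 1000))
ilen = min(len(x1), len(x2), len(x3))
x1, x2, x3 = x1[:ilen], x2[:ilen], x3[:ilen]   # truncate to common length
rms = np.sqrt((x1 ** 2 + x2 ** 2 + x3 ** 2) / 3)  # 3-component RMS trace
```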
@ -5,17 +5,14 @@ import traceback

import cartopy.crs as ccrs
import cartopy.feature as cf
from cartopy.mpl.ticker import LongitudeFormatter, LatitudeFormatter
import matplotlib
import matplotlib.patheffects as PathEffects
import matplotlib.pyplot as plt
import numpy as np
import obspy
from PySide2 import QtWidgets, QtGui
from PySide2 import QtWidgets
from matplotlib.backends.backend_qt5agg import FigureCanvasQTAgg as FigureCanvas
from mpl_toolkits.axes_grid1.inset_locator import inset_axes
from obspy import UTCDateTime

from pylot.core.util.utils import identifyPhaseID
from scipy.interpolate import griddata

@ -27,10 +24,10 @@ matplotlib.use('Qt5Agg')

class MplCanvas(FigureCanvas):

    def __init__(self, extern_axes=None, projection=None, width=15, height=5, dpi=100):
    def __init__(self, parent=None, extern_axes=None, width=5, height=4, dpi=100):
        if extern_axes is None:
            self.fig = plt.figure(figsize=(width, height), dpi=dpi)
            self.axes = self.fig.add_subplot(111, projection=projection)
            self.axes = self.fig.add_subplot(111)
        else:
            self.fig = extern_axes.figure
            self.axes = extern_axes
@ -62,30 +59,24 @@ class Array_map(QtWidgets.QWidget):
        self.parameter = parameter if parameter else parent._inputs

        self.picks_rel = {}
        self.picks_rel_mean_corrected = {}
        self.marked_stations = []
        self.highlighted_stations = []

        # call functions to draw everything
        self.projection = ccrs.PlateCarree()
        self.init_graphics()
        self.ax = self.canvas.axes
        self.ax.set_adjustable('datalim')

        self.init_stations()
        self.init_crtpyMap()
        self.init_map()

        # set original map limits to fall back on when home button is pressed
        self.org_xlim = self.ax.get_xlim()
        self.org_ylim = self.ax.get_ylim()

        self.org_xlim = self.canvas.axes.get_xlim()
        self.org_ylim = self.canvas.axes.get_ylim()
        # initial map without event
        self.ax.set_xlim(self.org_xlim[0], self.org_xlim[1])
        self.ax.set_ylim(self.org_ylim[0], self.org_ylim[1])
        self.canvas.axes.set_xlim(self.org_xlim[0], self.org_xlim[1])
        self.canvas.axes.set_ylim(self.org_ylim[0], self.org_ylim[1])

        self._style = None if not hasattr(parent, '_style') else parent._style

    def init_map(self):
        self.init_colormap()
        self.connectSignals()
@ -98,24 +89,23 @@ class Array_map(QtWidgets.QWidget):
        # initialize figure elements

        if self.extern_plot_axes is None:
            self.canvas = MplCanvas(projection=self.projection)
            self.canvas = MplCanvas(self)
            self.plotWidget = FigureCanvas(self.canvas.fig)
        else:
            self.canvas = MplCanvas(extern_axes=self.extern_plot_axes)

        self.plotWidget = self.canvas
            self.canvas = MplCanvas(self, extern_axes=self.extern_plot_axes)
            self.plotWidget = FigureCanvas(self.canvas.fig)

        # initialize GUI elements
        self.status_label = QtWidgets.QLabel()
        self.map_reset_button = QtWidgets.QPushButton('Reset Map View')
        self.save_map_button = QtWidgets.QPushButton('Save Map')
        self.go2eq_button = QtWidgets.QPushButton('Go to Event Location')
        self.subtract_mean_cb = QtWidgets.QCheckBox('Subtract mean')

        self.main_box = QtWidgets.QVBoxLayout()
        self.setLayout(self.main_box)

        self.top_row = QtWidgets.QHBoxLayout()
        self.main_box.addLayout(self.top_row, 0)
        self.main_box.addLayout(self.top_row, 1)

        self.comboBox_phase = QtWidgets.QComboBox()
        self.comboBox_phase.insertItem(0, 'P')
@ -134,8 +124,8 @@ class Array_map(QtWidgets.QWidget):
        self.cmaps_box = QtWidgets.QComboBox()
        self.cmaps_box.setMaxVisibleItems(20)
        [self.cmaps_box.addItem(map_name) for map_name in sorted(plt.colormaps())]
        # try to set to plasma as default
        self.cmaps_box.setCurrentIndex(self.cmaps_box.findText('plasma'))
        # try to set to viridis as default
        self.cmaps_box.setCurrentIndex(self.cmaps_box.findText('viridis'))

        self.top_row.addWidget(QtWidgets.QLabel('Select a phase: '))
        self.top_row.addWidget(self.comboBox_phase)
@ -148,15 +138,14 @@ class Array_map(QtWidgets.QWidget):
        self.top_row.addWidget(self.auto_refresh_box)
        self.top_row.addWidget(self.refresh_button)

        self.main_box.addWidget(self.plotWidget, 10)
        self.main_box.addWidget(self.plotWidget, 1)

        self.bot_row = QtWidgets.QHBoxLayout()
        self.main_box.addLayout(self.bot_row, 0)
        self.main_box.addLayout(self.bot_row, 0.3)
        self.bot_row.addWidget(QtWidgets.QLabel(''), 5)
        self.bot_row.addWidget(self.map_reset_button, 2)
        self.bot_row.addWidget(self.go2eq_button, 2)
        self.bot_row.addWidget(self.save_map_button, 2)
        self.bot_row.addWidget(self.subtract_mean_cb, 0)
        self.bot_row.addWidget(self.status_label, 5)

    def init_colormap(self):
@ -164,12 +153,14 @@ class Array_map(QtWidgets.QWidget):
        self.init_lat_lon_grid()

    def init_crtpyMap(self):
        self.ax.add_feature(cf.LAND)
        self.ax.add_feature(cf.OCEAN)
        self.ax.add_feature(cf.COASTLINE, linewidth=1, edgecolor='gray')
        self.ax.add_feature(cf.BORDERS, alpha=0.7)
        self.ax.add_feature(cf.LAKES, alpha=0.7)
        self.ax.add_feature(cf.RIVERS, linewidth=1)
        self.canvas.axes.cla()
        self.canvas.axes = plt.axes(projection=ccrs.PlateCarree())
        self.canvas.axes.add_feature(cf.LAND)
        self.canvas.axes.add_feature(cf.OCEAN)
        self.canvas.axes.add_feature(cf.COASTLINE, linewidth=1, edgecolor='gray')
        self.canvas.axes.add_feature(cf.BORDERS, alpha=0.7)
        self.canvas.axes.add_feature(cf.LAKES, alpha=0.7)
        self.canvas.axes.add_feature(cf.RIVERS, linewidth=1)

        # parallels and meridians
        self.add_merid_paral()
@ -177,8 +168,12 @@ class Array_map(QtWidgets.QWidget):
        self.canvas.fig.tight_layout()

    def add_merid_paral(self):
        self.gridlines = self.ax.gridlines(draw_labels=False, alpha=0.6, color='gray',
                                           linewidth=self.linewidth / 2, zorder=7, crs=ccrs.PlateCarree())
        self.gridlines = self.canvas.axes.gridlines(draw_labels=False, alpha=0.6, color='gray',
                                                    linewidth=self.linewidth / 2, zorder=7)
        # TODO: current cartopy version does not support label removal. Devs are working on it.
        # Should be fixed in coming cartopy versions
        # self.gridlines.xformatter = LONGITUDE_FORMATTER
        # self.gridlines.yformatter = LATITUDE_FORMATTER

    def remove_merid_paral(self):
        if len(self.gridlines.xline_artists):
@ -186,24 +181,24 @@ class Array_map(QtWidgets.QWidget):
            self.gridlines.yline_artists[0].remove()

    def org_map_view(self):
        self.ax.set_xlim(self.org_xlim[0], self.org_xlim[1])
        self.ax.set_ylim(self.org_ylim[0], self.org_ylim[1])
        self.canvas.axes.set_xlim(self.org_xlim[0], self.org_xlim[1])
        self.canvas.axes.set_ylim(self.org_ylim[0], self.org_ylim[1])
        # parallels and meridians
        #self.remove_merid_paral()
        #self.add_merid_paral()
        self.remove_merid_paral()
        self.add_merid_paral()

        self.canvas.draw_idle()
        self.canvas.axes.figure.canvas.draw_idle()

    def go2eq(self):
        if self.eventLoc:
            lats, lons = self.eventLoc
            self.ax.set_xlim(lons - 10, lons + 10)
            self.ax.set_ylim(lats - 5, lats + 5)
            self.canvas.axes.set_xlim(lons - 10, lons + 10)
            self.canvas.axes.set_ylim(lats - 5, lats + 5)
            # parallels and meridians
            #self.remove_merid_paral()
            #self.add_merid_paral()
            self.remove_merid_paral()
            self.add_merid_paral()

            self.canvas.draw_idle()
            self.canvas.axes.figure.canvas.draw_idle()

        else:
            self.status_label.setText('No event information available')
@ -217,7 +212,6 @@ class Array_map(QtWidgets.QWidget):
        self.map_reset_button.clicked.connect(self.org_map_view)
        self.go2eq_button.clicked.connect(self.go2eq)
        self.save_map_button.clicked.connect(self.saveFigure)
        self.subtract_mean_cb.stateChanged.connect(self.toggle_subtract_mean)

        self.plotWidget.mpl_connect('motion_notify_event', self.mouse_moved)
        self.plotWidget.mpl_connect('scroll_event', self.mouse_scroll)
@ -226,32 +220,21 @@ class Array_map(QtWidgets.QWidget):

    # set mouse events -----------------------------------------------------
    def mouse_moved(self, event):
        if not event.inaxes == self.ax:
        if not event.inaxes == self.canvas.axes:
            return
        else:
            cont, inds = self.sc.contains(event)
            lat = event.ydata
            lon = event.xdata
            text = f'Longitude: {lon:3.3f}, Latitude: {lat:3.3f}'

            if cont:
                indices = inds['ind']
                text += ' | Station: ' if len(indices) == 1 else ' | Stations: '
                text += ' - '.join([self._station_onpick_ids[index] for index in indices[:5]])
                if len(indices) > 5:
                    text += '...'

            self.status_label.setText(text)
            self.status_label.setText('Latitude: {:3.5f}, Longitude: {:3.5f}'.format(lat, lon))

    def mouse_scroll(self, event):
        if not event.inaxes == self.ax:
        if not event.inaxes == self.canvas.axes:
            return

        zoom = {'up': 1. / 2., 'down': 2.}

        if event.button in zoom:
            xlim = self.ax.get_xlim()
            ylim = self.ax.get_ylim()
            xlim = self.canvas.axes.get_xlim()
            ylim = self.canvas.axes.get_ylim()

            x, y = event.xdata, event.ydata

@ -263,24 +246,24 @@ class Array_map(QtWidgets.QWidget):
            yb = y - 0.5 * ydiff
            yt = y + 0.5 * ydiff

            self.ax.set_xlim(xl, xr)
            self.ax.set_ylim(yb, yt)
            self.canvas.axes.set_xlim(xl, xr)
            self.canvas.axes.set_ylim(yb, yt)
            # parallels and meridians
            #self.remove_merid_paral()
            #self.add_merid_paral()
            self.remove_merid_paral()
            self.add_merid_paral()

            self.ax.figure.canvas.draw_idle()
            self.canvas.axes.figure.canvas.draw_idle()

    def mouseLeftPress(self, event):
        if not event.inaxes == self.ax:
        if not event.inaxes == self.canvas.axes:
            return
        self.map_x = event.xdata
        self.map_y = event.ydata
        self.map_xlim = self.ax.get_xlim()
        self.map_ylim = self.ax.get_ylim()
        self.map_xlim = self.canvas.axes.get_xlim()
        self.map_ylim = self.canvas.axes.get_ylim()

    def mouseLeftRelease(self, event):
        if not event.inaxes == self.ax:
        if not event.inaxes == self.canvas.axes:
            return
        new_x = event.xdata
        new_y = event.ydata
@ -288,13 +271,13 @@ class Array_map(QtWidgets.QWidget):
        dx = new_x - self.map_x
        dy = new_y - self.map_y

        self.ax.set_xlim((self.map_xlim[0] - dx, self.map_xlim[1] - dx))
        self.ax.set_ylim(self.map_ylim[0] - dy, self.map_ylim[1] - dy)
        self.canvas.axes.set_xlim((self.map_xlim[0] - dx, self.map_xlim[1] - dx))
        self.canvas.axes.set_ylim(self.map_ylim[0] - dy, self.map_ylim[1] - dy)
        # parallels and meridians
        #self.remove_merid_paral()
        #self.add_merid_paral()
        self.remove_merid_paral()
        self.add_merid_paral()

        self.ax.figure.canvas.draw_idle()
        self.canvas.axes.figure.canvas.draw_idle()

    def onpick(self, event):
        btn_msg = {1: ' in selection. Aborted', 2: ' to delete a pick on. Aborted', 3: ' to display info.'}
@ -374,6 +357,12 @@ class Array_map(QtWidgets.QWidget):
    def get_max_from_stations(self, key):
        return self._from_dict(max, key)

    def get_min_from_picks(self):
        return min(self.picks_rel.values())

    def get_max_from_picks(self):
        return max(self.picks_rel.values())

    def current_picks_dict(self):
        picktype = self.comboBox_am.currentText().split(' ')[0]
        auto_manu = {'auto': self.autopicks_dict,
@ -418,34 +407,22 @@ class Array_map(QtWidgets.QWidget):
                print('Cannot display pick for station {}. Reason: {}'.format(station_name, e))
            return picks, uncertainties

        def get_picks_rel(picks, func=min):
        def get_picks_rel(picks):
            picks_rel = {}
            picks_utc = []
            for pick in picks.values():
                if type(pick) is UTCDateTime:
                    picks_utc.append(pick.timestamp)
                if type(pick) is obspy.core.utcdatetime.UTCDateTime:
                    picks_utc.append(pick)
            if picks_utc:
                self._reference_picktime = UTCDateTime(func(picks_utc))
                self._earliest_picktime = min(picks_utc)
            for st_id, pick in picks.items():
                if type(pick) is UTCDateTime:
                    pick -= self._reference_picktime
                if type(pick) is obspy.core.utcdatetime.UTCDateTime:
                    pick -= self._earliest_picktime
                picks_rel[st_id] = pick
            return picks_rel

        def get_picks_rel_mean_corr(picks):
            return get_picks_rel(picks, func=np.nanmean)

        self.picks, self.uncertainties = get_picks(self.stations_dict)
        self.picks_rel = get_picks_rel(self.picks)
        self.picks_rel_mean_corrected = get_picks_rel_mean_corr(self.picks)

    def toggle_subtract_mean(self):
        if self.subtract_mean_cb.isChecked():
            cmap = 'seismic'
        else:
            cmap = 'viridis'
        self.cmaps_box.setCurrentIndex(self.cmaps_box.findText(cmap))
        self._refresh_drawings()

    def init_lat_lon_dimensions(self):
        # init minimum and maximum lon and lat dimensions
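The reworked get_picks_rel above normalizes all picks to a common reference time: the earliest pick in the old branch code, or optionally the mean when "Subtract mean" is active. Conceptually, with made-up pick times:

```python
from obspy import UTCDateTime

picks = {'GR.GRA1': UTCDateTime('2016-01-01T00:00:02'),   # made-up picks
         'GR.GRA2': UTCDateTime('2016-01-01T00:00:05')}
timestamps = [p.timestamp for p in picks.values()]
reference = UTCDateTime(min(timestamps))   # or the mean of the timestamps
picks_rel = {st: p - reference for st, p in picks.items()}
print(picks_rel)  # seconds relative to the reference onset
```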
@ -476,12 +453,11 @@ class Array_map(QtWidgets.QWidget):
        return stations, latitudes, longitudes

    def get_picks_lat_lon(self):
        picks_rel = self.picks_rel_mean_corrected if self.subtract_mean_cb.isChecked() else self.picks_rel
        picks = []
        uncertainties = []
        latitudes = []
        longitudes = []
        for st_id, pick in picks_rel.items():
        for st_id, pick in self.picks_rel.items():
            picks.append(pick)
            uncertainties.append(self.uncertainties.get(st_id))
            latitudes.append(self.stations_dict[st_id]['latitude'])
@ -493,20 +469,13 @@ class Array_map(QtWidgets.QWidget):
        stat_dict = self.stations_dict['{}.{}'.format(network, station)]
        lat = stat_dict['latitude']
        lon = stat_dict['longitude']
        self.highlighted_stations.append(self.ax.scatter(lon, lat, s=self.pointsize, edgecolors=color,
                                                         facecolors='none', zorder=12,
                                                         transform=ccrs.PlateCarree(), label='deleted'))
        self.highlighted_stations.append(self.canvas.axes.scatter(lon, lat, s=self.pointsize, edgecolors=color,
                                                                  facecolors='none', zorder=12,
                                                                  transform=ccrs.PlateCarree(), label='deleted'))

    def openPickDlg(self, ind):
        try:
            wfdata = self._parent.get_data().getWFData()
        except AttributeError:
            QtWidgets.QMessageBox.warning(
                self, "PyLoT Warning",
                "No waveform data found. Check if they were already loaded in Waveform plot tab."
            )
            return
        wfdata_comp = self._parent.get_data().getAltWFdata()
        wfdata = self._parent.get_data().get_wf_data()
        wfdata_comp = self._parent.get_data().get_wf_dataComp()
        for index in ind:
            network, station = self._station_onpick_ids[index].split('.')[:2]
            pyl_mw = self._parent
@ -521,6 +490,7 @@ class Array_map(QtWidgets.QWidget):
                              picks=self._parent.get_current_event().getPick(station),
                              autopicks=self._parent.get_current_event().getAutopick(station),
                              filteroptions=self._parent.filteroptions, metadata=self.metadata,
                              model=self.parameter.get('taup_model'),
                              event=pyl_mw.get_current_event())
            except Exception as e:
                message = 'Could not generate Plot for station {st}.\n {er}'.format(st=station, er=e)
@ -548,19 +518,12 @@ class Array_map(QtWidgets.QWidget):
            print(message, e)
            print(traceback.format_exc())

    def draw_contour_filled(self, nlevel=51):
        if self.subtract_mean_cb.isChecked():
            abs_max = self.get_residuals_absmax()
            levels = np.linspace(-abs_max, abs_max, nlevel)
        else:
            levels = np.linspace(min(self.picks_rel.values()), max(self.picks_rel.values()), nlevel)
    def draw_contour_filled(self, nlevel=50):
        levels = np.linspace(self.get_min_from_picks(), self.get_max_from_picks(), nlevel)

        self.contourf = self.ax.contourf(self.longrid, self.latgrid, self.picksgrid_active, levels,
                                         linewidths=self.linewidth * 5, transform=ccrs.PlateCarree(),
                                         alpha=0.4, zorder=8, cmap=self.get_colormap())

    def get_residuals_absmax(self):
        return np.max(np.absolute(list(self.picks_rel_mean_corrected.values())))
        self.contourf = self.canvas.axes.contourf(self.longrid, self.latgrid, self.picksgrid_active, levels,
                                                  linewidths=self.linewidth * 5, transform=ccrs.PlateCarree(),
                                                  alpha=0.4, zorder=8, cmap=self.get_colormap())

    def get_colormap(self):
        return plt.get_cmap(self.cmaps_box.currentText())
@ -568,18 +531,18 @@ class Array_map(QtWidgets.QWidget):
    def scatter_all_stations(self):
        stations, lats, lons = self.get_st_lat_lon_for_plot()

        self.sc = self.ax.scatter(lons, lats, s=self.pointsize * 3, facecolor='none', marker='.',
                                  zorder=10, picker=True, edgecolor='0.5', label='Not Picked',
                                  transform=ccrs.PlateCarree())
        self.sc = self.canvas.axes.scatter(lons, lats, s=self.pointsize * 3, facecolor='none', marker='.',
                                           zorder=10, picker=True, edgecolor='0.5', label='Not Picked',
                                           transform=ccrs.PlateCarree())

        self.cid = self.plotWidget.mpl_connect('pick_event', self.onpick)
        self._station_onpick_ids = stations
        if self.eventLoc:
            lats, lons = self.eventLoc
            self.sc_event = self.ax.scatter(lons, lats, s=5 * self.pointsize, facecolor='red', zorder=11,
                                            label='Event (might be outside map region)', marker='*',
                                            edgecolors='black',
                                            transform=ccrs.PlateCarree())
            self.sc_event = self.canvas.axes.scatter(lons, lats, s=5 * self.pointsize, facecolor='red', zorder=11,
                                                     label='Event (might be outside map region)', marker='*',
                                                     edgecolors='black',
                                                     transform=ccrs.PlateCarree())

    def scatter_picked_stations(self):
        picks, uncertainties, lats, lons = self.get_picks_lat_lon()
@ -592,13 +555,8 @@ class Array_map(QtWidgets.QWidget):
                          for uncertainty in uncertainties])

        cmap = self.get_colormap()

        vmin = vmax = None
        if self.subtract_mean_cb.isChecked():
            vmin, vmax = -self.get_residuals_absmax(), self.get_residuals_absmax()

        self.sc_picked = self.ax.scatter(lons, lats, s=sizes, edgecolors='white', cmap=cmap, vmin=vmin, vmax=vmax,
                                         c=picks, zorder=11, label='Picked', transform=ccrs.PlateCarree())
        self.sc_picked = self.canvas.axes.scatter(lons, lats, s=sizes, edgecolors='white', cmap=cmap,
                                                  c=picks, zorder=11, label='Picked', transform=ccrs.PlateCarree())

    def annotate_ax(self):
        self.annotations = []
@ -616,20 +574,20 @@ class Array_map(QtWidgets.QWidget):
            if st in self.marked_stations:
                color = 'red'
            self.annotations.append(
                self.ax.annotate(f'{st}', xy=(x + 0.003, y + 0.003), fontsize=self.pointsize / 4.,
                                 fontweight='semibold', color=color, alpha=0.8,
                                 transform=ccrs.PlateCarree(), zorder=14,
                                 path_effects=[PathEffects.withStroke(
                                     linewidth=self.pointsize / 15., foreground='k')]))
                self.canvas.axes.annotate(' %s' % st, xy=(x + 0.003, y + 0.003), fontsize=self.pointsize / 4.,
                                          fontweight='semibold', color=color, alpha=0.8,
                                          transform=ccrs.PlateCarree(), zorder=14,
                                          path_effects=[PathEffects.withStroke(
                                              linewidth=self.pointsize / 15., foreground='k')]))

        self.legend = self.ax.legend(loc=1, framealpha=1)
        self.legend = self.canvas.axes.legend(loc=1, framealpha=1)
        self.legend.set_zorder(100)
        self.legend.get_frame().set_facecolor((1, 1, 1, 0.95))

    def add_cbar(self, label):
        self.cbax_bg = inset_axes(self.ax, width="6%", height="75%", loc=5)
        cbax = inset_axes(self.ax, width='2%', height='70%', loc=5)
        cbar = self.ax.figure.colorbar(self.sc_picked, cax=cbax)
        self.cbax_bg = inset_axes(self.canvas.axes, width="6%", height="75%", loc=5)
        cbax = inset_axes(self.canvas.axes, width='2%', height='70%', loc=5)
        cbar = self.canvas.axes.figure.colorbar(self.sc_picked, cax=cbax)
        cbar.set_label(label)
        cbax.yaxis.tick_left()
        cbax.yaxis.set_label_position('left')
@ -674,9 +632,7 @@ class Array_map(QtWidgets.QWidget):
        if picks_available:
            self.scatter_picked_stations()
            if hasattr(self, 'sc_picked'):
                self.cbar = self.add_cbar(
                    label='Time relative to reference onset ({}) [s]'.format(self._reference_picktime)
                )
                self.cbar = self.add_cbar(label='Time relative to first onset ({}) [s]'.format(self._earliest_picktime))
                self.comboBox_phase.setEnabled(True)
        else:
            self.comboBox_phase.setEnabled(False)
@ -27,10 +27,6 @@ class Metadata(object):
        # saves which metadata files are from obspy dmt
        self.obspy_dmt_invs = []
        if inventory:
            # make sure that no accidental backslashes mess up the path
            if isinstance(inventory, str):
                inventory = inventory.replace('\\', '/')
                inventory = os.path.abspath(inventory)
            if os.path.isdir(inventory):
                self.add_inventory(inventory)
            if os.path.isfile(inventory):
@ -59,8 +55,6 @@ class Metadata(object):
        :type path_to_inventory: str
        :return: None
        """
        path_to_inventory = path_to_inventory.replace('\\', '/')
        path_to_inventory = os.path.abspath(path_to_inventory)
        assert (os.path.isdir(path_to_inventory)), '{} is no directory'.format(path_to_inventory)
        if path_to_inventory not in self.inventories:
            self.inventories.append(path_to_inventory)
@ -268,6 +262,9 @@ class Metadata(object):
        if not fnames:
            # search for station name in filename
            fnames = glob.glob(os.path.join(path_to_inventory, '*' + station + '*'))
        if not fnames:
            # search for network name in filename
            fnames = glob.glob(os.path.join(path_to_inventory, '*' + network + '*'))
        if not fnames:
            if self.verbosity:
                print('Could not find filenames matching station name, network name or seed id')
@ -279,7 +276,7 @@ class Metadata(object):
                continue
            invtype, robj = self._read_metadata_file(os.path.join(path_to_inventory, fname))
            try:
                robj.get_coordinates(station_seed_id)
                # robj.get_coordinates(station_seed_id)  # TODO: Commented out, failed with Parser, is this needed?
                self.inventory_files[fname] = {'invtype': invtype,
                                               'data': robj}
                if station_seed_id in self.seed_ids.keys():
@ -287,7 +284,6 @@ class Metadata(object):
                self.seed_ids[station_seed_id] = fname
                return True
            except Exception as e:
                logging.warning(e)
                continue
        print('Could not find metadata for station_seed_id {} in path {}'.format(station_seed_id, path_to_inventory))

@ -342,19 +338,6 @@ class Metadata(object):
        return inv, exc


def time_from_header(header):
    """
    Function takes in the second line from a .gse file and extracts the date and time from that line.
    :param header: second line from .gse file
    :type header: string
    :return: a list of integers of form [year, month, day, hour, minute, second, microsecond]
    """
    timeline = header.split(' ')
    time = timeline[1].split('/') + timeline[2].split(':')
    time = time[:-1] + time[-1].split('.')
    return [int(t) for t in time]


def check_time(datetime):
    """
    Function takes in date and time as a list and validates its values by trying to make a UTCDateTime object from it
@ -392,167 +375,6 @@ def check_time(datetime):
    return False


# TODO: change root to datapath
def get_file_list(root_dir):
    """
    Function uses a directory to get all the *.gse files from it.
    :param root_dir: a directory leading to the .gse files
    :type root_dir: string
    :return: returns a list of filenames (without path to them)
    """
    file_list = glob.glob1(root_dir, '*.gse')
    return file_list


def checks_station_second(datetime, file):
    """
    Function uses the given list to check if the parameter 'second' is set to 60 by mistake
    and sets the time correctly if so. Can only correct time if no date change would be necessary.
    :param datetime: [year, month, day, hour, minute, second, microsecond]
    :return: returns the input with the correct value for second
    """
    if datetime[5] == 60:
        if datetime[4] == 59:
            if datetime[3] == 23:
                err_msg = 'Date should be next day. ' \
                          'File not changed: {0}'.format(file)
                raise ValueError(err_msg)
            else:
                datetime[3] += 1
                datetime[4] = 0
                datetime[5] = 0
        else:
            datetime[4] += 1
            datetime[5] = 0
    return datetime


def make_time_line(line, datetime):
    """
    Function takes in the original line from a .gse file and a list of date and
    time values to make a new line with corrected date and time.
    :param line: second line from .gse file.
    :type line: string
    :param datetime: list of integers [year, month, day, hour, minute, second, microsecond]
    :type datetime: list
    :return: returns a string to write it into a file.
    """
    ins_form = '{0:02d}:{1:02d}:{2:02d}.{3:03d}'
    insertion = ins_form.format(int(datetime[3]),
                                int(datetime[4]),
                                int(datetime[5]),
                                int(datetime[6] * 1e-3))
    newline = line[:16] + insertion + line[28:]
    return newline
def evt_head_check(root_dir, out_dir=None):
|
||||
"""
|
||||
A function to make sure that an arbitrary number of .gse files have correct values in their header.
|
||||
:param root_dir: a directory leading to the .gse files.
|
||||
:type root_dir: string
|
||||
:param out_dir: a directory to store the new files somwhere els.
|
||||
:return: returns nothing
|
||||
"""
|
||||
if not out_dir:
|
||||
print('WARNING files are going to be overwritten!')
|
||||
inp = str(input('Continue? [y/N]'))
|
||||
if not inp == 'y':
|
||||
sys.exit()
|
||||
filelist = get_file_list(root_dir)
|
||||
nfiles = 0
|
||||
for file in filelist:
|
||||
infile = open(os.path.join(root_dir, file), 'r')
|
||||
lines = infile.readlines()
|
||||
infile.close()
|
||||
datetime = time_from_header(lines[1])
|
||||
if check_time(datetime):
|
||||
continue
|
||||
else:
|
||||
nfiles += 1
|
||||
datetime = checks_station_second(datetime, file)
|
||||
print('writing ' + file)
|
||||
# write File
|
||||
lines[1] = make_time_line(lines[1], datetime)
|
||||
if not out_dir:
|
||||
out = open(os.path.join(root_dir, file), 'w')
|
||||
out.writelines(lines)
|
||||
out.close()
|
||||
else:
|
||||
out = open(os.path.join(out_dir, file), 'w')
|
||||
out.writelines(lines)
|
||||
out.close()
|
||||
print(nfiles)
|
||||
|
||||
|
||||
def read_metadata(path_to_inventory):
|
||||
"""
|
||||
take path_to_inventory and return either the corresponding list of files
|
||||
found or the Parser object for a network dataless seed volume to prevent
|
||||
read overhead for large dataless seed volumes
|
||||
:param path_to_inventory:
|
||||
:return: tuple containing a either list of files or `obspy.io.xseed.Parser`
|
||||
object and the inventory type found
|
||||
:rtype: tuple
|
||||
"""
|
||||
dlfile = list()
|
||||
invfile = list()
|
||||
respfile = list()
|
||||
# possible file extensions specified here:
|
||||
inv = dict(dless=dlfile, xml=invfile, resp=respfile, dseed=dlfile[:])
|
||||
if os.path.isfile(path_to_inventory):
|
||||
ext = os.path.splitext(path_to_inventory)[1].split('.')[1]
|
||||
inv[ext] += [path_to_inventory]
|
||||
else:
|
||||
for ext in inv.keys():
|
||||
inv[ext] += glob.glob1(path_to_inventory, '*.{0}'.format(ext))
|
||||
|
||||
invtype = key_for_set_value(inv)
|
||||
|
||||
if invtype is None:
|
||||
print("Neither dataless-SEED file, inventory-xml file nor "
|
||||
"RESP-file found!")
|
||||
print("!!WRONG CALCULATION OF SOURCE PARAMETERS!!")
|
||||
robj = None,
|
||||
elif invtype == 'dless': # prevent multiple read of large dlsv
|
||||
print("Reading metadata information from dataless-SEED file ...")
|
||||
if len(inv[invtype]) == 1:
|
||||
fullpath_inv = os.path.join(path_to_inventory, inv[invtype][0])
|
||||
robj = Parser(fullpath_inv)
|
||||
else:
|
||||
robj = inv[invtype]
|
||||
else:
|
||||
print("Reading metadata information from inventory-xml file ...")
|
||||
robj = read_inventory(inv[invtype])
|
||||
return invtype, robj
|
||||
|
||||
|
||||
# idea to optimize read_metadata
|
||||
# def read_metadata_new(path_to_inventory):
|
||||
# metadata_objects = []
|
||||
# # read multiple files from directory
|
||||
# if os.path.isdir(path_to_inventory):
|
||||
# fnames = os.listdir(path_to_inventory)
|
||||
# # read single file
|
||||
# elif os.path.isfile(path_to_inventory):
|
||||
# fnames = [path_to_inventory]
|
||||
# else:
|
||||
# print("Neither dataless-SEED file, inventory-xml file nor "
|
||||
# "RESP-file found!")
|
||||
# print("!!WRONG CALCULATION OF SOURCE PARAMETERS!!")
|
||||
# fnames = []
|
||||
#
|
||||
# for fname in fnames:
|
||||
# path_to_inventory_filename = os.path.join(path_to_inventory, fname)
|
||||
# try:
|
||||
# ftype, robj = read_metadata_file(path_to_inventory_filename)
|
||||
# metadata_objects.append((ftype, robj))
|
||||
# except Exception as e:
|
||||
# print('Could not read metadata file {} '
|
||||
# 'because of the following Exception: {}'.format(path_to_inventory_filename, e))
|
||||
# return metadata_objects
|
||||
|
||||
|
||||
def restitute_trace(input_tuple):
|
||||
def no_metadata(tr, seed_id):
|
||||
print('no metadata file found '
|
||||
@ -652,8 +474,6 @@ def restitute_data(data, metadata, unit='VEL', force=False, ncores=0):
|
||||
"""
|
||||
|
||||
# data = remove_underscores(data)
|
||||
if not data:
|
||||
return
|
||||
|
||||
# loop over traces
|
||||
input_tuples = []
|
||||
@ -661,14 +481,9 @@ def restitute_data(data, metadata, unit='VEL', force=False, ncores=0):
|
||||
input_tuples.append((tr, metadata, unit, force))
|
||||
data.remove(tr)
|
||||
|
||||
if ncores == 0:
|
||||
result = []
|
||||
for input_tuple in input_tuples:
|
||||
result.append(restitute_trace(input_tuple))
|
||||
else:
|
||||
pool = gen_Pool(ncores)
|
||||
result = pool.imap_unordered(restitute_trace, input_tuples)
|
||||
pool.close()
|
||||
pool = gen_Pool(ncores)
|
||||
result = pool.imap_unordered(restitute_trace, input_tuples)
|
||||
pool.close()
|
||||
|
||||
for tr, remove_trace in result:
|
||||
if not remove_trace:
|
||||
|
@ -22,11 +22,14 @@ class Event(ObsPyEvent):
:param path: path to event directory
:type path: str
"""
# TODO: remove rootpath and database
self.pylot_id = path.split('/')[-1]
# initialize super class
super(Event, self).__init__(resource_id=ResourceIdentifier('smi:local/' + self.pylot_id))
self.path = path
self.database = path.split('/')[-2]
self.datapath = os.path.split(path)[0]  # path.split('/')[-3]
self.rootpath = '/' + os.path.join(*path.split('/')[:-3])
self.pylot_autopicks = {}
self.pylot_picks = {}
self.notes = ''
@ -20,7 +20,7 @@ import matplotlib
matplotlib.use('Qt5Agg')
sys.path.append(os.path.join('/'.join(sys.argv[0].split('/')[:-1]), '../../..'))

from PyLoT import Project
from pylot.core.io.project import Project
from pylot.core.util.dataprocessing import Metadata
from pylot.core.util.array_map import Array_map
@ -1,7 +1,6 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
from functools import lru_cache

try:
import pyqtgraph as pg
@ -26,14 +25,14 @@ def pick_linestyle_pg(picktype, key):
:return: Qt line style parameters
:rtype:
"""
linestyles_manu = {'mpp': (QtCore.Qt.SolidLine, 2),
'epp': (QtCore.Qt.DashLine, 1),
'lpp': (QtCore.Qt.DashLine, 1),
'spe': (QtCore.Qt.DashLine, 1)}
linestyles_auto = {'mpp': (QtCore.Qt.DotLine, 2),
'epp': (QtCore.Qt.DashDotLine, 1),
'lpp': (QtCore.Qt.DashDotLine, 1),
'spe': (QtCore.Qt.DashDotLine, 1)}
linestyles_manu = {'mpp': (QtCore.Qt.SolidLine, 2.),
'epp': (QtCore.Qt.DashLine, 1.),
'lpp': (QtCore.Qt.DashLine, 1.),
'spe': (QtCore.Qt.DashLine, 1.)}
linestyles_auto = {'mpp': (QtCore.Qt.DotLine, 2.),
'epp': (QtCore.Qt.DashDotLine, 1.),
'lpp': (QtCore.Qt.DashDotLine, 1.),
'spe': (QtCore.Qt.DashDotLine, 1.)}
linestyles = {'manual': linestyles_manu,
'auto': linestyles_auto}
return linestyles[picktype][key]
@ -81,7 +80,6 @@ def which(program, parameter):
return None


@lru_cache(maxsize=128)
def make_pen(picktype, phase, key, quality):
"""
Make PyQtGraph.QPen
@ -6,7 +6,7 @@ Created on Wed Jan 26 17:47:25 2015
@author: sebastianw
"""

from pylot.core.io.data import SeiscompDataStructure, PilotDataStructure, ObspyDMTdataStructure
from pylot.core.io.data import SeiscompDataStructure, PilotDataStructure

DATASTRUCTURE = {'PILOT': PilotDataStructure, 'SeisComP': SeiscompDataStructure,
'obspyDMT': ObspyDMTdataStructure, None: PilotDataStructure}
'obspyDMT': PilotDataStructure, None: PilotDataStructure}
@ -8,7 +8,6 @@ import platform
import re
import subprocess
import warnings
from typing import Literal, Tuple, Type
from functools import lru_cache

import numpy as np
@ -107,7 +106,7 @@ def gen_Pool(ncores=0):

print('gen_Pool: Generated multiprocessing Pool with {} cores\n'.format(ncores))

pool = multiprocessing.Pool(ncores, maxtasksperchild=100)
pool = multiprocessing.Pool(ncores)
return pool


@ -358,8 +357,10 @@ def get_bool(value):
False
>>> get_bool(None)
None
>>> get_bool('Stream')
'Stream'
"""
if type(value) is bool:
if type(value) == bool:
return value
elif value in ['True', 'true']:
return True
@ -900,6 +901,19 @@ def trim_station_components(data, trim_start=True, trim_end=True):
return data


def merge_stream(stream):
gaps = stream.get_gaps()
if gaps:
# list of merged stations (seed_ids)
merged = ['{}.{}.{}.{}'.format(*gap[:4]) for gap in gaps]
stream.merge(method=1)
print('Merged the following stations because of gaps:')
for merged_station in merged:
print(merged_station)

return stream, gaps


def check4gapsAndRemove(data):
"""
check for gaps in Stream and remove them
@ -920,12 +934,12 @@ def check4gapsAndRemove(data):
return data


def check_for_gaps_and_merge(data):
def check4gapsAndMerge(data):
"""
check for gaps in Stream and merge if gaps are found
:param data: stream of seismic data
:type data: `~obspy.core.stream.Stream`
:return: data stream, gaps returned from obspy get_gaps
:return: data stream
:rtype: `~obspy.core.stream.Stream`
"""
gaps = data.get_gaps()
@ -936,7 +950,7 @@ def check_for_gaps_and_merge(data):
for merged_station in merged:
print(merged_station)

return data, gaps
return data


def check4doubled(data):
@ -1076,7 +1090,7 @@ def check4rotated(data, metadata=None, verbosity=1):
return wfs_in

# check metadata quality
t_start = full_range(wfs_in)[0]
t_start = full_range(wfs_in)
try:
azimuths = []
dips = []
@ -1084,9 +1098,8 @@ def check4rotated(data, metadata=None, verbosity=1):
azimuths.append(metadata.get_coordinates(tr_id, t_start)['azimuth'])
dips.append(metadata.get_coordinates(tr_id, t_start)['dip'])
except (KeyError, TypeError) as err:
logging.warning(f"Rotating not possible, not all azimuth and dip information "
f"available in metadata. Stream remains unchanged.")
logging.debug(f"Rotating not possible, {err=}, {type(err)=}")
logging.error(f"{type(err)=} occurred: {err=} Rotating not possible, not all azimuth and dip information "
f"available in metadata. Stream remains unchanged.")
return wfs_in
except Exception as err:
print(f"Unexpected {err=}, {type(err)=}")
@ -1205,7 +1218,6 @@ def identifyPhase(phase):
return False


@lru_cache
def identifyPhaseID(phase):
"""
Returns phase id (capital P or S)
@ -8,7 +8,6 @@ import copy
import datetime
import getpass
import glob
import logging
import multiprocessing
import os
import subprocess
@ -141,18 +140,6 @@ class LogWidget(QtWidgets.QWidget):
self.stderr.append(60 * '#' + '\n\n')


def getDataType(parent):
type = QInputDialog().getItem(parent, "Select phases type", "Type:",
["manual", "automatic"])

if type[0].startswith('auto'):
type = 'auto'
else:
type = type[0]

return type


def plot_pdf(_axes, x, y, annotation, bbox_props, xlabel=None, ylabel=None,
title=None):
# try method or data
@ -1010,15 +997,6 @@ class WaveformWidgetPG(QtWidgets.QWidget):
time_ax = np.linspace(time_ax[0], time_ax[-1], num=len(data))
return data, time_ax

# def getAxes(self):
#     return self.axes

# def getXLims(self):
#     return self.getAxes().get_xlim()

# def getYLims(self):
#     return self.getAxes().get_ylim()

def setXLims(self, lims):
vb = self.plotWidget.getPlotItem().getViewBox()
vb.setXRange(float(lims[0]), float(lims[1]), padding=0)
@ -1172,8 +1150,6 @@ class PylotCanvas(FigureCanvas):
break
if not ax_check: return

# self.updateCurrentLimits() #maybe put this down to else:

# calculate delta (relative values in axis)
old_x, old_y = self.press_rel
xdiff = gui_event.x - old_x
@ -1381,125 +1357,140 @@ class PylotCanvas(FigureCanvas):
component='*', nth_sample=1, iniPick=None, verbosity=0,
plot_additional=False, additional_channel=None, scaleToChannel=None,
snr=None):
def get_wf_dict(data: Stream = Stream(), linecolor = 'k', offset: float = 0., **plot_kwargs):
return dict(data=data, linecolor=linecolor, offset=offset, plot_kwargs=plot_kwargs)
ax = self.prepare_plot()
self.clearPlotDict()

wfstart, wfend = self.get_wf_range(wfdata)
compclass = self.get_comp_class()
plot_streams = self.get_plot_streams(wfdata, wfdata_compare, component, compclass)

st_main = plot_streams['wfdata']['data']
if mapping:
plot_positions = self.calcPlotPositions(st_main, compclass)

nslc = self.get_sorted_nslc(st_main)
nmax = self.plot_traces(ax, plot_streams, nslc, wfstart, mapping, plot_positions,
scaleToChannel, noiselevel, scaleddata, nth_sample, verbosity)

if plot_additional and additional_channel:
self.plot_additional_trace(ax, wfdata, additional_channel, scaleToChannel,
scaleddata, nth_sample, wfstart)

self.finalize_plot(ax, wfstart, wfend, nmax, zoomx, zoomy, iniPick, title, snr)

def prepare_plot(self):
ax = self.axes[0]
ax.cla()
return ax

self.clearPlotDict()
wfstart, wfend = full_range(wfdata)
nmax = 0
def get_wf_range(self, wfdata):
return full_range(wfdata)

def get_comp_class(self):
settings = QSettings()
compclass = SetChannelComponents.from_qsettings(settings)
return SetChannelComponents.from_qsettings(settings)

def get_plot_streams(self, wfdata, wfdata_compare, component, compclass):
def get_wf_dict(data=Stream(), linecolor='k', offset=0., **plot_kwargs):
return dict(data=data, linecolor=linecolor, offset=offset, plot_kwargs=plot_kwargs)

linecolor = (0., 0., 0., 1.) if not self.style else self.style['linecolor']['rgba_mpl']
plot_streams = {
'wfdata': get_wf_dict(linecolor=linecolor, linewidth=0.7),
'wfdata_comp': get_wf_dict(offset=0.1, linecolor='b', alpha=0.7, linewidth=0.5)
}

plot_streams = dict(wfdata=get_wf_dict(linecolor=linecolor, linewidth=0.7),
wfdata_comp=get_wf_dict(offset=0.1, linecolor='b', alpha=0.7, linewidth=0.5))

if not component == '*':
if component != '*':
alter_comp = compclass.getCompPosition(component)
# alter_comp = str(alter_comp[0])

plot_streams['wfdata']['data'] = wfdata.select(component=component)
plot_streams['wfdata']['data'] += wfdata.select(component=alter_comp)
plot_streams['wfdata']['data'] = wfdata.select(component=component) + wfdata.select(component=alter_comp)
if wfdata_compare:
plot_streams['wfdata_comp']['data'] = wfdata_compare.select(component=component)
plot_streams['wfdata_comp']['data'] += wfdata_compare.select(component=alter_comp)
plot_streams['wfdata_comp']['data'] = wfdata_compare.select(
component=component) + wfdata_compare.select(component=alter_comp)
else:
plot_streams['wfdata']['data'] = wfdata
if wfdata_compare:
plot_streams['wfdata_comp']['data'] = wfdata_compare

st_main = plot_streams['wfdata']['data']
return plot_streams

if mapping:
plot_positions = self.calcPlotPositions(st_main, compclass)

# list containing tuples of network, station, channel and plot position (for sorting)
nslc = []
for plot_pos, trace in enumerate(st_main):
if not trace.stats.channel[-1] in ['Z', 'N', 'E', '1', '2', '3']:
print('Warning: Unrecognized channel {}'.format(trace.stats.channel))
continue
nslc.append(trace.get_id())
nslc.sort()
nslc.reverse()
def get_sorted_nslc(self, st_main):
nslc = [trace.get_id() for trace in st_main if trace.stats.channel[-1] in ['Z', 'N', 'E', '1', '2', '3']]
nslc.sort(reverse=True)
return nslc

def plot_traces(self, ax, plot_streams, nslc, wfstart, mapping, plot_positions, scaleToChannel, noiselevel,
scaleddata, nth_sample, verbosity):
nmax = 0
for n, seed_id in enumerate(nslc):
network, station, location, channel = seed_id.split('.')
for wf_name, wf_dict in plot_streams.items():
st_select = wf_dict.get('data')
if not st_select:
continue
st = st_select.select(id=seed_id)
trace = st[0].copy()
trace = st_select.select(id=seed_id)[0].copy()
if mapping:
n = plot_positions[trace.stats.channel]
if n > nmax:
nmax = n
if verbosity:
msg = 'plotting %s channel of station %s' % (channel, station)
print(msg)
stime = trace.stats.starttime - wfstart
time_ax = prep_time_axis(stime, trace)
if time_ax is not None:
if scaleToChannel:
st_scale = wfdata.select(channel=scaleToChannel)
if st_scale:
tr = st_scale[0]
trace.detrend('constant')
trace.normalize(np.max(np.abs(tr.data)) * 2)
scaleddata = True
if not scaleddata:
trace.detrend('constant')
trace.normalize(np.max(np.abs(trace.data)) * 2)
print(f'plotting {channel} channel of station {station}')
time_ax = prep_time_axis(trace.stats.starttime - wfstart, trace)
self.plot_trace(ax, trace, time_ax, wf_dict, n, scaleToChannel, noiselevel, scaleddata, nth_sample)
self.setPlotDict(n, seed_id)
return nmax

offset = wf_dict.get('offset')
def plot_trace(self, ax, trace, time_ax, wf_dict, n, scaleToChannel, noiselevel, scaleddata, nth_sample):
if time_ax is not None:
if scaleToChannel:
self.scale_trace(trace, scaleToChannel)
scaleddata = True
if not scaleddata:
trace.detrend('constant')
trace.normalize(np.max(np.abs(trace.data)) * 2)
offset = wf_dict.get('offset')

times = [time for index, time in enumerate(time_ax) if not index % nth_sample]
data = [datum + n + offset for index, datum in enumerate(trace.data) if not index % nth_sample]
ax.axhline(n, color="0.5", lw=0.5)
ax.plot(times, data, color=wf_dict.get('linecolor'), **wf_dict.get('plot_kwargs'))
if noiselevel is not None:
for level in [-noiselevel[channel], noiselevel[channel]]:
ax.plot([time_ax[0], time_ax[-1]],
[n + level, n + level],
color=wf_dict.get('linecolor'),
linestyle='dashed')
self.setPlotDict(n, seed_id)
if plot_additional and additional_channel:
compare_stream = wfdata.select(channel=additional_channel)
if compare_stream:
trace = compare_stream[0]
if scaleToChannel:
st_scale = wfdata.select(channel=scaleToChannel)
if st_scale:
tr = st_scale[0]
trace.detrend('constant')
trace.normalize(np.max(np.abs(tr.data)) * 2)
scaleddata = True
if not scaleddata:
trace.detrend('constant')
trace.normalize(np.max(np.abs(trace.data)) * 2)
time_ax = prep_time_axis(stime, trace)
times = [time for index, time in enumerate(time_ax) if not index % nth_sample]
p_data = compare_stream[0].data
# #normalize
# p_max = max(abs(p_data))
# p_data /= p_max
for index in range(3):
ax.plot(times, p_data, color='red', alpha=0.5, linewidth=0.7)
p_data += 1
times = [time for index, time in enumerate(time_ax) if not index % nth_sample]
data = [datum + n + offset for index, datum in enumerate(trace.data) if not index % nth_sample]
ax.axhline(n, color="0.5", lw=0.5)
ax.plot(times, data, color=wf_dict.get('linecolor'), **wf_dict.get('plot_kwargs'))
if noiselevel is not None:
self.plot_noise_level(ax, time_ax, noiselevel, channel, n, wf_dict.get('linecolor'))

def scale_trace(self, trace, scaleToChannel):
st_scale = wfdata.select(channel=scaleToChannel)
if st_scale:
tr = st_scale[0]
trace.detrend('constant')
trace.normalize(np.max(np.abs(tr.data)) * 2)

def plot_noise_level(self, ax, time_ax, noiselevel, channel, n, linecolor):
for level in [-noiselevel[channel], noiselevel[channel]]:
ax.plot([time_ax[0], time_ax[-1]], [n + level, n + level], color=linecolor, linestyle='dashed')

def plot_additional_trace(self, ax, wfdata, additional_channel, scaleToChannel, scaleddata, nth_sample, wfstart):
compare_stream = wfdata.select(channel=additional_channel)
if compare_stream:
trace = compare_stream[0]
if scaleToChannel:
self.scale_trace(trace, scaleToChannel)
scaleddata = True
if not scaleddata:
trace.detrend('constant')
trace.normalize(np.max(np.abs(trace.data)) * 2)
time_ax = prep_time_axis(trace.stats.starttime - wfstart, trace)
self.plot_additional_data(ax, trace, time_ax, nth_sample)

def plot_additional_data(self, ax, trace, time_ax, nth_sample):
times = [time for index, time in enumerate(time_ax) if not index % nth_sample]
p_data = trace.data
for index in range(3):
ax.plot(times, p_data, color='red', alpha=0.5, linewidth=0.7)
p_data += 1

def finalize_plot(self, ax, wfstart, wfend, nmax, zoomx, zoomy, iniPick, title, snr):
if iniPick:
ax.vlines(iniPick, ax.get_ylim()[0], ax.get_ylim()[1],
colors='m', linestyles='dashed',
linewidth=2)
xlabel = 'seconds since {0}'.format(wfstart)
ax.vlines(iniPick, ax.get_ylim()[0], ax.get_ylim()[1], colors='m', linestyles='dashed', linewidth=2)
xlabel = f'seconds since {wfstart}'
ylabel = ''
self.updateWidget(xlabel, ylabel, title)
self.setXLims(ax, [0, wfend - wfstart])
@ -1509,15 +1500,14 @@ class PylotCanvas(FigureCanvas):
if zoomy is not None:
self.setYLims(ax, zoomy)
if snr is not None:
if snr < 2:
warning = 'LOW SNR'
if snr < 1.5:
warning = 'VERY LOW SNR'
ax.text(0.1, 0.9, 'WARNING - {}'.format(warning), ha='center', va='center', transform=ax.transAxes,
color='red')

self.plot_snr_warning(ax, snr)
self.draw()

def plot_snr_warning(self, ax, snr):
if snr < 2:
warning = 'LOW SNR' if snr >= 1.5 else 'VERY LOW SNR'
ax.text(0.1, 0.9, f'WARNING - {warning}', ha='center', va='center', transform=ax.transAxes, color='red')

@staticmethod
def getXLims(ax):
return ax.get_xlim()
@ -1871,14 +1861,13 @@ class PickDlg(QDialog):

def __init__(self, parent=None, data=None, data_compare=None, station=None, network=None, location=None, picks=None,
autopicks=None, rotate=False, parameter=None, embedded=False, metadata=None, show_comp_data=False,
event=None, filteroptions=None, wftype=None):
event=None, filteroptions=None, model=None, wftype=None):
super(PickDlg, self).__init__(parent, Qt.Window)
self.orig_parent = parent
self.setAttribute(Qt.WA_DeleteOnClose)

# initialize attributes
self.parameter = parameter
model = self.parameter.get('taup_model')
self._embedded = embedded
self.showCompData = show_comp_data
self.station = station
@ -1923,7 +1912,7 @@ class PickDlg(QDialog):
# set attribute holding data
if data is None or not data:
try:
data = parent.get_data().getWFData().copy()
data = parent.get_data().get_wf_data().copy()
self.data = data.select(station=station)
except AttributeError as e:
errmsg = 'You either have to put in a data or an appropriate ' \
@ -1957,7 +1946,7 @@ class PickDlg(QDialog):
self.cur_xlim = None
self.cur_ylim = None

self.stime, self.etime = full_range(self.getWFData())
self.stime, self.etime = full_range(self.get_wf_data())

# initialize plotting widget
self.multicompfig = PylotCanvas(parent=self, multicursor=True)
@ -1971,7 +1960,7 @@ class PickDlg(QDialog):
self.referenceChannel.addItem('-', None)
self.scaleChannel.addItem('individual', None)

for trace in self.getWFData():
for trace in self.get_wf_data():
channel = trace.stats.channel
self.referenceChannel.addItem(channel, trace)
if not channel[-1] in ['Z', 'N', 'E', '1', '2', '3']:
@ -1990,7 +1979,7 @@ class PickDlg(QDialog):
if self.wftype is not None:
title += ' | ({})'.format(self.wftype)

self.multicompfig.plotWFData(wfdata=self.getWFData(), wfdata_compare=self.getWFDataComp(),
self.multicompfig.plotWFData(wfdata=self.get_wf_data(), wfdata_compare=self.get_wf_dataComp(),
title=title)

self.multicompfig.setZoomBorders2content()
@ -2271,8 +2260,8 @@ class PickDlg(QDialog):
arrivals = func[plot](source_origin.depth,
source_origin.latitude,
source_origin.longitude,
station_coords.get('latitude'),
station_coords.get('longitude'),
station_coords['latitude'],
station_coords['longitude'],
phases)
self.arrivals = arrivals

@ -2525,7 +2514,7 @@ class PickDlg(QDialog):
if self.autoFilterAction.isChecked():
for filteraction in [self.filterActionP, self.filterActionS]:
filteraction.setChecked(False)
self.filterWFData()
self.filter_wf_data()
self.draw()
else:
self.draw()
@ -2609,10 +2598,10 @@ class PickDlg(QDialog):
def getGlobalLimits(self, ax, axis):
return self.multicompfig.getGlobalLimits(ax, axis)

def getWFData(self):
def get_wf_data(self):
return self.data

def getWFDataComp(self):
def get_wf_dataComp(self):
if self.showCompData:
return self.data_compare
else:
@ -2627,17 +2616,17 @@ class PickDlg(QDialog):
return tr

if component == 'E' or component == 'N':
for trace in self.getWFData():
for trace in self.get_wf_data():
trace = selectTrace(trace, 'NE')
if trace:
wfdata.append(trace)
elif component == '1' or component == '2':
for trace in self.getWFData():
for trace in self.get_wf_data():
trace = selectTrace(trace, '12')
if trace:
wfdata.append(trace)
else:
wfdata = self.getWFData().select(component=component)
wfdata = self.get_wf_data().select(component=component)
return wfdata

def getPicks(self, picktype='manual'):
@ -2740,8 +2729,8 @@ class PickDlg(QDialog):
stime = self.getStartTime()

# copy wfdata for plotting
wfdata = self.getWFData().copy()
wfdata_comp = self.getWFDataComp().copy()
wfdata = self.get_wf_data().copy()
wfdata_comp = self.get_wf_dataComp().copy()
wfdata = self.getPickPhases(wfdata, phase)
wfdata_comp = self.getPickPhases(wfdata_comp, phase)
for wfd in [wfdata, wfdata_comp]:
@ -2850,7 +2839,7 @@ class PickDlg(QDialog):
filteroptions = None

# copy and filter data for earliest and latest possible picks
wfdata = self.getWFData().copy().select(channel=channel)
wfdata = self.get_wf_data().copy().select(channel=channel)
if filteroptions:
try:
wfdata.detrend('linear')
@ -2897,7 +2886,7 @@ class PickDlg(QDialog):
minFMSNR = parameter.get('minFMSNR')
quality = get_quality_class(spe, parameter.get('timeerrorsP'))
if quality <= minFMweight and snr >= minFMSNR:
FM = fmpicker(self.getWFData().select(channel=channel).copy(), wfdata.copy(), parameter.get('fmpickwin'),
FM = fmpicker(self.get_wf_data().select(channel=channel).copy(), wfdata.copy(), parameter.get('fmpickwin'),
pick - stime_diff)

# save pick times for actual phase
@ -3189,7 +3178,7 @@ class PickDlg(QDialog):
def togglePickBlocker(self):
return not self.pick_block

def filterWFData(self, phase=None):
def filter_wf_data(self, phase=None):
if not phase:
phase = self.currentPhase
if self.getPhaseID(phase) == 'P':
@ -3207,8 +3196,8 @@ class PickDlg(QDialog):
self.cur_xlim = self.multicompfig.axes[0].get_xlim()
self.cur_ylim = self.multicompfig.axes[0].get_ylim()
# self.multicompfig.updateCurrentLimits()
wfdata = self.getWFData().copy()
wfdata_comp = self.getWFDataComp().copy()
wfdata = self.get_wf_data().copy()
wfdata_comp = self.get_wf_dataComp().copy()
title = self.getStation()
if filter:
filtoptions = None
@ -3245,14 +3234,14 @@ class PickDlg(QDialog):
def filterP(self):
self.filterActionS.setChecked(False)
if self.filterActionP.isChecked():
self.filterWFData('P')
self.filter_wf_data('P')
else:
self.refreshPlot()

def filterS(self):
self.filterActionP.setChecked(False)
if self.filterActionS.isChecked():
self.filterWFData('S')
self.filter_wf_data('S')
else:
self.refreshPlot()

@ -3311,7 +3300,7 @@ class PickDlg(QDialog):
if self.autoFilterAction.isChecked():
self.filterActionP.setChecked(False)
self.filterActionS.setChecked(False)
# data = self.getWFData().copy()
# data = self.get_wf_data().copy()
# title = self.getStation()
filter = False
phase = None
@ -3811,8 +3800,8 @@ class TuneAutopicker(QWidget):
fnames = self.station_ids[self.get_current_station_id()]
if not fnames == self.fnames:
self.fnames = fnames
self.data.setWFData(fnames)
wfdat = self.data.getWFData()  # all available streams
self.data.set_wf_data(fnames)
wfdat = self.data.get_wf_data()  # all available streams
# remove possible underscores in station names
# wfdat = remove_underscores(wfdat)
# rotate misaligned stations to ZNE
@ -3837,7 +3826,7 @@ class TuneAutopicker(QWidget):
self.stb_names = ['aicARHfig', 'refSpick', 'el_S1pick', 'el_S2pick']

def add_parameters(self):
self.paraBox = PylotParameterWidget(self.parameter, parent=self, windowflag=Qt.Widget)
self.paraBox = PylotParaBox(self.parameter, parent=self, windowflag=Qt.Widget)
self.paraBox.set_tune_mode(True)
self.update_eventID()
self.parameter_layout.addWidget(self.paraBox)
@ -3917,8 +3906,8 @@ class TuneAutopicker(QWidget):
network = None
location = None

wfdata = self.data.getWFData()
wfdata_comp = self.data.getWFDataComp()
wfdata = self.data.get_wf_data()
wfdata_comp = self.data.get_wf_dataComp()
metadata = self.parent().metadata
event = self.get_current_event()
filteroptions = self.parent().filteroptions
@ -3973,7 +3962,7 @@ class TuneAutopicker(QWidget):
for plotitem in self._manual_pick_plots:
self.clear_plotitem(plotitem)
self._manual_pick_plots = []
st = self.data.getWFData()
st = self.data.get_wf_data()
tr = st.select(station=self.get_current_station())[0]
starttime = tr.stats.starttime
# create two lists with figure names and subindices (for subplots) to get the correct axes
@ -4204,7 +4193,7 @@ class TuneAutopicker(QWidget):
self.qmb.show()


class PylotParameterWidget(QtWidgets.QWidget):
class PylotParaBox(QtWidgets.QWidget):
accepted = QtCore.Signal(str)
rejected = QtCore.Signal(str)

@ -4318,11 +4307,6 @@ class PylotParameterWidget(QtWidgets.QWidget):
grid = QtWidgets.QGridLayout()

for index1, name in enumerate(parameter_names):
if name in ['rootpath', 'database']:
logging.warning(
f'Deprecated parameter loaded: {name}. Check if datapath is still correct in parameter widget.'
)
continue
default_item = self.parameter.get_defaults()[name]
tooltip = default_item['tooltip']
tooltip += ' | type: {}'.format(default_item['type'])
@ -4892,7 +4876,7 @@ class PropTab(QWidget):
def getValues(self):
return None

def resetValues(self, infile):
def resetValues(self, infile=None):
return None


@ -4989,7 +4973,12 @@ class InputsTab(PropTab):
else:
index = 2
datapath = para.get('datapath') if not para.get('datapath') is None else ''
values = {"data/dataRoot": self.dataDirEdit.setText("%s" % datapath),
rootpath = para.get('rootpath') if not para.get('rootpath') is None else ''
database = para.get('database') if not para.get('database') is None else ''
if isinstance(database, int):
database = str(database)
path = os.path.join(os.path.expanduser('~'), rootpath, datapath, database)
values = {"data/dataRoot": self.dataDirEdit.setText("%s" % path),
"user/FullName": self.fullNameEdit.text(),
"data/Structure": self.structureSelect.setCurrentIndex(index),
"tstart": self.tstartBox.setValue(0),
@ -1,2 +0,0 @@
# -*- coding: utf-8 -*-
#
@ -1,95 +0,0 @@
############################# correlation parameters #####################################
# min_corr_stacking: minimum correlation coefficient for building beam trace
# min_corr_export: minimum correlation coefficient for pick export
# min_stack: minimum number of stations for building beam trace
# t_before: correlation window before pick
# t_after: correlation window after pick#
# cc_maxlag: maximum shift for initial correlation
# cc_maxlag2: maximum shift for second (final) correlation (also for calculating pick uncertainty)
# initial_pick_outlier_threshold: (hopefully) threshold for excluding large outliers of initial (AIC) picks
# export_threshold: automatically exclude all onsets which deviate more than this threshold from corrected taup onsets
# min_picks_export: minimum number of correlated picks for export
# min_picks_autopylot: minimum number of reference auto picks to continue with event
# check_RMS: do RMS check to search for restitution errors (very experimental)
# use_taupy_onsets: use taupy onsets as reference picks instead of external picks
# station_list: use the following stations as reference for stacking
# use_stacked_trace: use existing stacked trace if found (spare re-computation)
# data_dir: obspyDMT data subdirectory (e.g. 'raw', 'processed')
# pickfile_extension: use quakeML files (PyLoT output) with the following extension, e.g. '_autopylot' for pickfiles
# such as 'PyLoT_20170501_141822_autopylot.xml'
# dt_stacking: time shift for stacking (e.g. [0, 250] for 0 and 250 seconds shift)
# filter_options: filter for first correlation (rough)
# filter_options_final: filter for second correlation (fine)
# filter_type: e.g. 'bandpass'
# sampfreq: sampling frequency of the data

logging: info
pick_phases: ['P', 'S']

# P-phase
P:
  min_corr_stacking: 0.8
  min_corr_export: 0.6
  min_stack: 20
  t_before: 30.
  t_after: 50.
  cc_maxlag: 50.
  cc_maxlag2: 5.
  initial_pick_outlier_threshold: 30.
  export_threshold: 2.5
  min_picks_export: 100
  min_picks_autopylot: 50
  check_RMS: True
  use_taupy_onsets: False
  station_list: ['HU.MORH', 'HU.TIH', 'OX.FUSE', 'OX.BAD']
  use_stacked_trace: False
  data_dir: 'processed'
  pickfile_extension: '_autopylot'
  dt_stacking: [250, 250]

  # filter for first correlation (rough)
  filter_options:
    freqmax: 0.5
    freqmin: 0.03
  # filter for second correlation (fine)
  filter_options_final:
    freqmax: 0.5
    freqmin: 0.03

  filter_type: bandpass
  sampfreq: 20.0

# S-phase
S:
  min_corr_stacking: 0.7
  min_corr_export: 0.6
  min_stack: 20
  t_before: 60.
  t_after: 60.
  cc_maxlag: 100.
  cc_maxlag2: 25.
  initial_pick_outlier_threshold: 30.
  export_threshold: 5.0
  min_picks_export: 200
  min_picks_autopylot: 50
  check_RMS: True
  use_taupy_onsets: False
  station_list: ['HU.MORH','HU.TIH', 'OX.FUSE', 'OX.BAD']
  use_stacked_trace: False
  data_dir: 'processed'
  pickfile_extension: '_autopylot'
  dt_stacking: [250, 250]

  # filter for first correlation (rough)
  filter_options:
    freqmax: 0.1
    freqmin: 0.01

  # filter for second correlation (fine)
  filter_options_final:
    freqmax: 0.2
    freqmin: 0.01

  filter_type: bandpass
  sampfreq: 20.0
File diff suppressed because it is too large
@ -1,40 +0,0 @@
#!/bin/bash

#ulimit -s 8192
#ulimit -v $(ulimit -v | awk '{printf("%d",$1*0.95)}')
#ulimit -v

#655360

source /opt/anaconda3/etc/profile.d/conda.sh
conda activate pylot_311
NSLOTS=20

#qsub -l low -cwd -l "os=*stretch" -pe smp 40 submit_pick_corr_correction.sh
#$ -l low
#$ -l h_vmem=6G
#$ -cwd
#$ -pe smp 20
#$ -N corr_pick


export PYTHONPATH="$PYTHONPATH:/home/marcel/git/pylot_tools/"
export PYTHONPATH="$PYTHONPATH:/home/marcel/git/"
export PYTHONPATH="$PYTHONPATH:/home/marcel/git/pylot/"

#export MKL_NUM_THREADS=${NSLOTS:=1}
#export NUMEXPR_NUM_THREADS=${NSLOTS:=1}
#export OMP_NUM_THREADS=${NSLOTS:=1}

#python pick_correlation_correction.py '/data/AlpArray_Data/dmt_database_mantle_M5.8-6.0' '/home/marcel/.pylot/pylot_alparray_mantle_corr_stack_0.03-0.5.in' -pd -n ${NSLOTS:=1} -istart 0 -istop 100
#python pick_correlation_correction.py '/data/AlpArray_Data/dmt_database_mantle_M5.8-6.0' '/home/marcel/.pylot/pylot_alparray_mantle_corr_stack_0.03-0.5.in' -pd -n ${NSLOTS:=1} -istart 100 -istop 200
#python pick_correlation_correction.py '/data/AlpArray_Data/dmt_database_mantle_M6.0-6.5' '/home/marcel/.pylot/pylot_alparray_mantle_corr_stack_0.03-0.5.in' -pd -n ${NSLOTS:=1} -istart 0 -istop 100
#python pick_correlation_correction.py '/data/AlpArray_Data/dmt_database_mantle_M5.8-6.0' '/home/marcel/.pylot/pylot_alparray_mantle_corr_stack_0.03-0.5.in' -pd -n ${NSLOTS:=1} -istart 100 -istop 200
#python pick_correlation_correction.py 'H:\sciebo\dmt_database' 'H:\Sciebo\dmt_database\pylot_alparray_mantle_corr_S_0.01-0.2.in' -pd -n 4 -t

pylot_infile='/home/marcel/.pylot/pylot_alparray_syn_fwi_mk6_it3.in'
#pylot_infile='/home/marcel/.pylot/pylot_adriaarray_corr_P_and_S.in'

# THIS SCRIPT SHOLD BE CALLED BY "submit_to_grid_engine.py" using the following line:
python pick_correlation_correction.py $1 $pylot_infile -pd -n ${NSLOTS:=1} -istart $2 --params 'parameters_fwi_mk6_it3.yaml'
#--event_blacklist eventlist.txt
@ -1,23 +0,0 @@
#!/usr/bin/env python

import subprocess

fnames = [
    ('/data/AlpArray_Data/dmt_database_synth_model_mk6_it3_no_rotation', 0),
]

#fnames = [('/data/AlpArray_Data/dmt_database_mantle_0.01-0.2_SKS-phase', 0),
#          ('/data/AlpArray_Data/dmt_database_mantle_0.01-0.2_S-phase', 0),]

####
script_location = '/home/marcel/VersionCtrl/git/code_base/correlation_picker/submit_pick_corr_correction.sh'
####

for fnin, istart in fnames:
    input_cmds = f'qsub -q low.q@minos15,low.q@minos14,low.q@minos13,low.q@minos12,low.q@minos11 {script_location} {fnin} {istart}'

    print(input_cmds)
    print(subprocess.check_output(input_cmds.split()))

@ -1,61 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import os
import glob
import json

from obspy import read_events

from pylot.core.util.dataprocessing import Metadata
from pylot.core.util.obspyDMT_interface import qml_from_obspyDMT


def get_event_obspy_dmt(eventdir):
    event_pkl_file = os.path.join(eventdir, 'info', 'event.pkl')
    if not os.path.exists(event_pkl_file):
        raise IOError('Could not find event path for event: {}'.format(eventdir))
    event = qml_from_obspyDMT(event_pkl_file)
    return event


def get_event_pylot(eventdir, extension=''):
    event_id = get_event_id(eventdir)
    filename = os.path.join(eventdir, 'PyLoT_{}{}.xml'.format(event_id, extension))
    if not os.path.isfile(filename):
        return
    cat = read_events(filename)
    return cat[0]


def get_event_id(eventdir):
    event_id = os.path.split(eventdir)[-1]
    return event_id


def get_picks(eventdir, extension=''):
    event_id = get_event_id(eventdir)
    filename = 'PyLoT_{}{}.xml'
    filename = filename.format(event_id, extension)
    fpath = os.path.join(eventdir, filename)
    fpaths = glob.glob(fpath)
    if len(fpaths) == 1:
        cat = read_events(fpaths[0])
        picks = cat[0].picks
        return picks
    elif len(fpaths) == 0:
        print('get_picks: File not found: {}'.format(fpath))
        return
    print(f'WARNING: Ambiguous pick file specification. Found the following pick files {fpaths}\nFilemask: {fpath}')
    return


def write_json(object, fname):
    with open(fname, 'w') as outfile:
        json.dump(object, outfile, sort_keys=True, indent=4)


def get_metadata(eventdir):
    metadata_path = os.path.join(eventdir, 'resp')
    metadata = Metadata(inventory=metadata_path, verbosity=0)
    return metadata
@ -1,7 +1,9 @@
Cartopy==0.23.0
joblib==1.4.2
obspy==1.4.1
pyaml==24.7.0
pyqtgraph==0.13.7
PySide2==5.15.8
pytest==8.3.2
# This file may be used to create an environment using:
# $ conda create --name <env> --file <this file>
# platform: win-64
cartopy>=0.20.2
numpy<2
obspy>=1.3.0
pyqtgraph>=0.12.4
pyside2>=5.13.2
scipy>=1.8.0
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -1,99 +0,0 @@
%This is a parameter input file for PyLoT/autoPyLoT.
%All main and special settings regarding data handling
%and picking are to be set here!
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
dmt_database_test #datapath# %data path
20171010_063224.a #eventID# %event ID for single event processing (* for all events found in database)
#invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#NLLoc settings#
None #nllocbin# %path to NLLoc executable
None #nllocroot# %root of NLLoc-processing directory
None #phasefile# %name of autoPyLoT-output phase file for NLLoc
None #ctrfile# %name of autoPyLoT-output control file for NLLoc
ttime #ttpatter# %pattern of NLLoc ttimes from grid
AUTOLOC_nlloc #outpatter# %pattern of NLLoc-output file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#parameters for seismic moment estimation#
3530.0 #vp# %average P-wave velocity
2500.0 #rho# %average rock density [kg/m^3]
300.0 0.8 #Qp# %quality factor for P waves (Qp*f^a); list(Qp, a)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#settings local magnitude#
1.0 1.0 1.0 #WAscaling# %Scaling relation (log(Ao)+Alog(r)+Br+C) of Wood-Anderson amplitude Ao [nm] If zeros are set, original Richter magnitude is calculated!
1.0 1.0 #magscaling# %Scaling relation for derived local magnitude [a*Ml+b]. If zeros are set, no scaling of network magnitude is applied!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#filter settings#
0.03 0.03 #minfreq# %Lower filter frequency [P, S]
0.5 0.5 #maxfreq# %Upper filter frequency [P, S]
4 4 #filter_order# %filter order [P, S]
bandpass bandpass #filter_type# %filter type (bandpass, bandstop, lowpass, highpass) [P, S]
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#common settings picker#
global #extent# %extent of array ("local", "regional" or "global")
-100.0 #pstart# %start time [s] for calculating CF for P-picking (if TauPy: seconds relative to estimated onset)
50.0 #pstop# %end time [s] for calculating CF for P-picking (if TauPy: seconds relative to estimated onset)
-50.0 #sstart# %start time [s] relative to P-onset for calculating CF for S-picking
50.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
ak135 #taup_model# %Define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
P,Pdiff,S,SKS #taup_phases# %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
0.03 0.5 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
0.01 0.5 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
0.03 0.5 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
0.01 0.5 #bph2# %lower/upper corner freq. of second band pass filter z-comp. [Hz]
#special settings for calculating CF#
%!!Edit the following only if you know what you are doing!!%
#Z-component#
HOS #algoP# %choose algorithm for P-onset determination (HOS, ARZ, or AR3)
300.0 #tlta# %for HOS-/AR-AIC-picker, length of LTA window [s]
4 #hosorder# %for HOS-picker, order of Higher Order Statistics
2 #Parorder# %for AR-picker, order of AR process of Z-component
16.0 #tdet1z# %for AR-picker, length of AR determination window [s] for Z-component, 1st pick
10.0 #tpred1z# %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick
12.0 #tdet2z# %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick
6.0 #tpred2z# %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick
0.001 #addnoise# %add noise to seismogram for stable AR prediction
60.0 5.0 20.0 12.0 #tsnrz# %for HOS/AR, window lengths for SNR-and slope estimation [tnoise, tsafetey, tsignal, tslope] [s]
50.0 #pickwinP# %for initial AIC pick, length of P-pick window [s]
30.0 #Precalcwin# %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick)
2.0 #aictsmooth# %for HOS/AR, take average of samples for smoothing of AIC-function [s]
2.0 #tsmoothP# %for HOS/AR, take average of samples in this time window for smoothing CF [s]
0.006 #ausP# %for HOS/AR, artificial uplift of samples (aus) of CF (P)
2.0 #nfacP# %for HOS/AR, noise factor for noise level determination (P)
#H-components#
ARH #algoS# %choose algorithm for S-onset determination (ARH or AR3)
12.0 #tdet1h# %for HOS/AR, length of AR-determination window [s], H-components, 1st pick
6.0 #tpred1h# %for HOS/AR, length of AR-prediction window [s], H-components, 1st pick
8.0 #tdet2h# %for HOS/AR, length of AR-determinaton window [s], H-components, 2nd pick
4.0 #tpred2h# %for HOS/AR, length of AR-prediction window [s], H-components, 2nd pick
4 #Sarorder# %for AR-picker, order of AR process of H-components
100.0 #Srecalcwin# %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H)
195.0 #pickwinS# %for initial AIC pick, length of S-pick window [s]
60.0 10.0 30.0 12.0 #tsnrh# %for ARH/AR3, window lengths for SNR-and slope estimation [tnoise, tsafetey, tsignal, tslope] [s]
22.0 #aictsmoothS# %for AIC-picker, take average of samples in this time window for smoothing of AIC-function [s]
20.0 #tsmoothS# %for AR-picker, take average of samples for smoothing CF [s] (S)
0.001 #ausS# %for HOS/AR, artificial uplift of samples (aus) of CF (S)
2.0 #nfacS# %for AR-picker, noise factor for noise level determination (S)
#first-motion picker#
1 #minfmweight# %minimum required P weight for first-motion determination
3.0 #minFMSNR# %miniumum required SNR for first-motion determination
10.0 #fmpickwin# %pick window [s] around P onset for calculating zero crossings
#quality assessment#
0.1 0.2 0.4 0.8 #timeerrorsP# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
4.0 8.0 16.0 32.0 #timeerrorsS# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
0.005 #minAICPslope# %below this slope [counts/s] the initial P pick is rejected
1.1 #minAICPSNR# %below this SNR the initial P pick is rejected
0.002 #minAICSslope# %below this slope [counts/s] the initial S pick is rejected
1.3 #minAICSSNR# %below this SNR the initial S pick is rejected
20.0 #minsiglength# %length of signal part for which amplitudes must exceed noiselevel [s]
1.0 #noisefactor# %noiselevel*noisefactor=threshold
10.0 #minpercent# %required percentage of amplitudes exceeding threshold
0.1 #zfac# %P-amplitude must exceed at least zfac times RMS-S amplitude
100.0 #mdttolerance# %maximum allowed deviation of P picks from median [s]
50.0 #wdttolerance# %maximum allowed deviation from Wadati-diagram
25.0 #jackfactor# %pick is removed if the variance of the subgroup with the pick removed is larger than the mean variance of all subgroups times safety factor
@ -1,67 +0,0 @@
import os
import pytest

from obspy import read_events

from autoPyLoT import autoPyLoT


class TestAutopickerGlobal():
    def init(self):
        self.params_infile = 'pylot_alparray_mantle_corr_stack_0.03-0.5.in'
        self.test_event_dir = 'dmt_database_test'
        self.fname_outfile_xml = os.path.join(
            self.test_event_dir, '20171010_063224.a', 'PyLoT_20171010_063224.a_autopylot.xml'
        )

        # check if the input files exist
        if not os.path.isfile(self.params_infile):
            print(f'Test input file {os.path.abspath(self.params_infile)} not found.')
            return False

        if not os.path.exists(self.test_event_dir):
            print(
                f'Test event directory not found at location "{os.path.abspath(self.test_event_dir)}". '
                f'Make sure to load it from the website first.'
            )
            return False

        return True

    def test_autopicker(self):
        assert self.init(), 'Initialization failed due to missing input files.'
        # check for output file in test directory and remove it if necessary
        if os.path.isfile(self.fname_outfile_xml):
            os.remove(self.fname_outfile_xml)
        autoPyLoT(inputfile=self.params_infile, eventid='20171010_063224.a', obspyDMT_wfpath='processed')

        # test for different known output files if they are identical or not
        compare_pickfiles(self.fname_outfile_xml, 'PyLoT_20171010_063224.a_autopylot.xml', True)
        compare_pickfiles(self.fname_outfile_xml, 'PyLoT_20171010_063224.a_saved_from_GUI.xml', True)
        compare_pickfiles(self.fname_outfile_xml, 'PyLoT_20171010_063224.a_corrected_taup_times_0.03-0.5_P.xml', False)


def compare_pickfiles(pickfile1: str, pickfile2: str, samefile: bool = True) -> None:
    """
    Compare the pick times and errors from two pick files.

    Parameters:
    pickfile1 (str): The path to the first pick file.
    pickfile2 (str): The path to the second pick file.
    samefile (bool): A flag indicating whether the two files are expected to be the same. Defaults to True.

    Returns:
    None
    """
    cat1 = read_events(pickfile1)
    cat2 = read_events(pickfile2)
    picks1 = sorted(cat1[0].picks, key=lambda pick: str(pick.waveform_id))
    picks2 = sorted(cat2[0].picks, key=lambda pick: str(pick.waveform_id))
    pick_times1 = [pick.time for pick in picks1]
    pick_times2 = [pick.time for pick in picks2]
    pick_terrs1 = [pick.time_errors for pick in picks1]
    pick_terrs2 = [pick.time_errors for pick in picks2]

    # check if times and errors are identical or not depending on the samefile flag
    assert (pick_times1 == pick_times2) is samefile, 'Pick times error'
    assert (pick_terrs1 == pick_terrs2) is samefile, 'Pick time errors errors'
@ -4,8 +4,10 @@
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
/home/darius/alparray/waveforms_used #datapath# %data path
e0093.173.16 #eventID# %event ID for single event processing (* for all events found in datapath)
/home/darius #rootpath# %project path
alparray #datapath# %data path
waveforms_used #database# %name of data base
e0093.173.16 #eventID# %event ID for single event processing (* for all events found in database)
/home/darius/alparray/metadata #invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
@ -41,7 +43,6 @@ global #extent# %extent of a
875.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
False #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
IASP91 #taup_model# %define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
P,Pdiff,S,Sdiff #taup_phases# %Specify possible phases for TauPy (comma separated). See ObsPy TauPy documentation for possible values.
0.01 0.1 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
0.001 0.5 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
0.01 0.5 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
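Reading aid, not part of the diff: the hunk above replaces the old rootpath/datapath/database triple with a single datapath. Assuming the new value is simply the join of the old three, the mapping is:

import os

# Values copied from the removed lines of the hunk above.
rootpath = '/home/darius'     # old #rootpath#
datapath = 'alparray'         # old #datapath#
database = 'waveforms_used'   # old #database#

# The added line collapses the triple into one path; under the assumption
# that the new value is a plain join of the old three, this holds:
assert os.path.join(rootpath, datapath, database) == '/home/darius/alparray/waveforms_used'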
@ -4,8 +4,10 @@
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
/home/darius/alparray/waveforms_used #datapath# %data path
e0093.173.16 #eventID# %event ID for single event processing (* for all events found in datapath)
/home/darius #rootpath# %project path
alparray #datapath# %data path
waveforms_used #database# %name of data base
e0093.173.16 #eventID# %event ID for single event processing (* for all events found in database)
/home/darius/alparray/metadata #invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
@ -41,7 +43,6 @@ global #extent# %extent of a
875.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
IASP91 #taup_model# %define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
P,Pdiff,S,Sdiff #taup_phases# %Specify possible phases for TauPy (comma separated). See ObsPy TauPy documentation for possible values.
0.01 0.1 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
0.001 0.5 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
0.01 0.5 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
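The use_taup, taup_model and taup_phases settings above feed ObsPy's TauPy travel-time estimation. A minimal, self-contained sketch of that lookup (source depth and distance are invented for illustration):

from obspy.taup import TauPyModel

# Model name and phase list mirror the #taup_model# (iasp91) and
# #taup_phases# (P, Pdiff, S, Sdiff) settings above; depth and distance are made up.
model = TauPyModel(model='iasp91')
arrivals = model.get_travel_times(source_depth_in_km=10.0,
                                  distance_in_degree=65.0,
                                  phase_list=['P', 'Pdiff', 'S', 'Sdiff'])
for arrival in arrivals:
    # estimated onsets like these would define the windows for the characteristic function
    print(arrival.name, arrival.time)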
@ -1,7 +1,6 @@
import os
import sys
import unittest
import pytest

import obspy
from obspy import UTCDateTime
@ -106,6 +105,7 @@ class TestAutopickStation(unittest.TestCase):
        # show complete diff when differences in the result dictionaries are found
        self.maxDiff = None

    # @skip("Works")
    def test_autopickstation_taupy_disabled_gra1(self):
        expected = {
            'P': {'picker': 'auto', 'snrdb': 15.405649120980094, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None,
@ -121,8 +121,8 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=self.gra1, pickparam=self.pickparam_taupy_disabled,
                                              metadata=(None, None))
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
        self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
        self.assertEqual('GRA1', station)

    def test_autopickstation_taupy_enabled_gra1(self):
@ -140,8 +140,8 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=self.gra1, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
        self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
        self.assertEqual('GRA1', station)

    def test_autopickstation_taupy_disabled_gra2(self):
@ -157,8 +157,8 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=self.gra2, pickparam=self.pickparam_taupy_disabled,
                                              metadata=(None, None))
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
        self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
        self.assertEqual('GRA2', station)

    def test_autopickstation_taupy_enabled_gra2(self):
@ -175,8 +175,8 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=self.gra2, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
        self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
        self.assertEqual('GRA2', station)

    def test_autopickstation_taupy_disabled_ech(self):
@ -190,8 +190,8 @@ class TestAutopickStation(unittest.TestCase):
                  'fm': None, 'spe': None, 'channel': u'LHE'}}
        with HidePrints():
            result, station = autopickstation(wfstream=self.ech, pickparam=self.pickparam_taupy_disabled)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
        self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
        self.assertEqual('ECH', station)

    def test_autopickstation_taupy_enabled_ech(self):
@ -208,8 +208,8 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=self.ech, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
        self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
        self.assertEqual('ECH', station)

    def test_autopickstation_taupy_disabled_fiesa(self):
@ -224,8 +224,8 @@ class TestAutopickStation(unittest.TestCase):
                  'fm': None, 'spe': None, 'channel': u'LHE'}}
        with HidePrints():
            result, station = autopickstation(wfstream=self.fiesa, pickparam=self.pickparam_taupy_disabled)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
        self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
        self.assertEqual('FIESA', station)

    def test_autopickstation_taupy_enabled_fiesa(self):
@ -242,8 +242,8 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=self.fiesa, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
        self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
        self.assertEqual('FIESA', station)

    def test_autopickstation_gra1_z_comp_missing(self):
@ -272,8 +272,7 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=wfstream, pickparam=self.pickparam_taupy_disabled,
                                              metadata=(None, None))
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertEqual(expected, result)
        self.assertEqual('GRA1', station)

    def test_autopickstation_a106_taupy_enabled(self):
@ -291,9 +290,7 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=self.a106, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')

        self.assertEqual(expected, result)

    def test_autopickstation_station_missing_in_metadata(self):
        """This station is not in the metadata, but Taupy is enabled. Taupy should exit cleanly and modify the starttime
@ -314,37 +311,8 @@ class TestAutopickStation(unittest.TestCase):
        with HidePrints():
            result, station = autopickstation(wfstream=self.a005a, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertEqual(expected, result)


def run_dict_comparison(result, expected):
    for key, expected_value in expected.items():
        if isinstance(expected_value, dict):
            run_dict_comparison(result[key], expected[key])
        else:
            res = result[key]
            if isinstance(res, UTCDateTime) and isinstance(expected_value, UTCDateTime):
                res = res.timestamp
                expected_value = expected_value.timestamp
            assert expected_value == pytest.approx(res), f'{key}: {expected_value} != {res}'


def compare_dicts(result, expected, hint=''):
    try:
        run_dict_comparison(result, expected)
    except AssertionError:
        raise AssertionError(f'{hint}Dictionaries not equal.'
                             f'\n\n<<Expected>>\n{pretty_print_dict(expected)}'
                             f'\n\n<<Result>>\n{pretty_print_dict(result)}')


def pretty_print_dict(dct):
    retstr = ''
    for key, value in sorted(dct.items(), key=lambda x: x[0]):
        retstr += f"{key} : {value}\n"

    return retstr

if __name__ == '__main__':
    unittest.main()
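For orientation, a small invented example of what run_dict_comparison/compare_dicts above accept: nested dictionaries are walked recursively, UTCDateTime values are reduced to float timestamps, and values are matched with pytest.approx. The data below is made up and assumes the helpers above are in scope:

from obspy import UTCDateTime

# Invented example data, not taken from the test fixtures.
expected = {'P': {'mpp': UTCDateTime('2017-10-10T06:37:00.0'), 'weight': 0}}
result = {'P': {'mpp': UTCDateTime('2017-10-10T06:37:00.0') + 1e-4, 'weight': 0}}

# passes: the 0.1 ms difference is well within the relative tolerance of pytest.approx
compare_dicts(result=result, expected=expected, hint='P-')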
@ -1,76 +0,0 @@
import pytest
from obspy import read, Trace, UTCDateTime

from pylot.correlation.pick_correlation_correction import XCorrPickCorrection


class TestXCorrPickCorrection():
    def setup(self):
        self.make_test_traces()
        self.make_test_picks()
        self.t_before = 2.
        self.t_after = 2.
        self.cc_maxlag = 0.5

    def make_test_traces(self):
        # take first trace of the ObsPy example Stream
        tr1 = read()[0]
        # filter trace
        tr1.filter('bandpass', freqmin=1, freqmax=20)
        # make a copy and shift the copy by 0.1 s
        tr2 = tr1.copy()
        tr2.stats.starttime += 0.1

        self.trace1 = tr1
        self.trace2 = tr2

    def make_test_picks(self):
        # create an artificial reference pick on the reference trace (trace1) and another one on the 0.1 s shifted trace
        self.tpick1 = UTCDateTime('2009-08-24T00:20:07.7')
        # shift the second pick by 0.2 s; the correction should then come out at around -0.1 s
        self.tpick2 = self.tpick1 + 0.2

    def test_slice_trace_okay(self):
        self.setup()
        xcpc = XCorrPickCorrection(UTCDateTime(), Trace(), UTCDateTime(), Trace(),
                                   t_before=self.t_before, t_after=self.t_after, cc_maxlag=self.cc_maxlag)

        test_trace = self.trace1
        pick_time = self.tpick2

        sliced_trace = xcpc.slice_trace(test_trace, pick_time)
        assert ((sliced_trace.stats.starttime == pick_time - self.t_before - self.cc_maxlag / 2)
                and (sliced_trace.stats.endtime == pick_time + self.t_after + self.cc_maxlag / 2))

    def test_slice_trace_fails(self):
        self.setup()

        test_trace = self.trace1
        pick_time = self.tpick1

        with pytest.raises(ValueError):
            xcpc = XCorrPickCorrection(UTCDateTime(), Trace(), UTCDateTime(), Trace(),
                                       t_before=self.t_before + 20, t_after=self.t_after, cc_maxlag=self.cc_maxlag)
            xcpc.slice_trace(test_trace, pick_time)

        with pytest.raises(ValueError):
            xcpc = XCorrPickCorrection(UTCDateTime(), Trace(), UTCDateTime(), Trace(),
                                       t_before=self.t_before, t_after=self.t_after + 50, cc_maxlag=self.cc_maxlag)
            xcpc.slice_trace(test_trace, pick_time)

    def test_cross_correlation(self):
        self.setup()

        # create XCorrPickCorrection object
        xcpc = XCorrPickCorrection(self.tpick1, self.trace1, self.tpick2, self.trace2, t_before=self.t_before,
                                   t_after=self.t_after, cc_maxlag=self.cc_maxlag)

        # execute correlation
        correction, cc_max, uncert, fwfm = xcpc.cross_correlation(False, '', '')

        # define the expected test result
        test_result = (-0.09983091718314982, 0.9578431835689154, 0.0015285160561610929, 0.03625786256084631)

        # check results
        assert pytest.approx(test_result, rel=1e-6) == (correction, cc_max, uncert, fwfm)
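The expected correction of roughly -0.1 s follows directly from the test setup above: trace2 lags trace1 by 0.1 s, while tpick2 was placed 0.2 s after tpick1, so the pick overshoots the true arrival by about 0.1 s:

# Worked arithmetic using the values from make_test_traces / make_test_picks above:
true_shift = 0.1    # trace2.stats.starttime was shifted by +0.1 s
pick_offset = 0.2   # tpick2 = tpick1 + 0.2
# The cross-correlation correction should therefore be about 0.1 - 0.2 = -0.1 s,
# in line with the -0.0998 s in test_result above.
correction_expected = true_shift - pick_offset
assert abs(correction_expected + 0.1) < 1e-12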