Compare commits
No commits in common. "develop" and "445120990715f43c8aae18955ca533ff0d6d4fb8" have entirely different histories.
.gitignore (vendored)

```
@@ -1,5 +1,3 @@
*.pyc
*~
.idea
pylot/RELEASE-VERSION
/tests/test_autopicker/dmt_database_test/
```
.mailmap (removed)

```
@@ -1,39 +0,0 @@
-Darius Arnold <Darius.Arnold@ruhr-uni-bochum.de> <Darius_A@web.de>
-Darius Arnold <Darius.Arnold@ruhr-uni-bochum.de> <darius.arnold@rub.de>
-Darius Arnold <Darius.Arnold@ruhr-uni-bochum.de> <darius.arnold@ruhr-uni-bochum.de>
-Darius Arnold <Darius.Arnold@ruhr-uni-bochum.de> <mail@dariusarnold.de>
-
-Dennis Wlecklik <dennisw@minos02.geophysik.ruhr-uni-bochum.de>
-
-Jeldrik Gaal <jeldrikgaal@gmail.com>
-
-Kaan Coekerim <kaan.coekerim@ruhr-uni-bochum.de>
-Kaan Coekerim <kaan.coekerim@ruhr-uni-bochum.de> <kaan.coekerim@rub.de>
-
-Ludger Kueperkoch <kueperkoch@igem-energie.de> <kueperkoch@bestec-for-nature.com>
-Ludger Kueperkoch <kueperkoch@igem-energie.de> <ludger@quake2.(none)>
-Ludger Kueperkoch <kueperkoch@igem-energie.de> <ludger@sauron.bestec-for-nature>
-
-Marc S. Boxberg <marc.boxberg@rub.de>
-
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel.paffrath@rub.de>
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@minos01.geophysik.ruhr-uni-bochum.de>
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@minos02.geophysik.ruhr-uni-bochum.de>
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@minos25.geophysik.ruhr-uni-bochum.de>
-Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@email.com>
-
-Sally Zimmermann <sally.zimmermann@ruhr-uni-bochum.de>
-
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastianw@minos01.geophysik.ruhr-uni-bochum.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastianw@minos02.geophysik.ruhr-uni-bochum.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastianw@minos22.geophysik.ruhr-uni-bochum.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastian.wehling-benatelli@scisys.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastian.wehling@rub.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <sebastian.wehling@rub.de>
-Sebastian Wehling-Benatelli <sebastian.wehling-benatelli@cgi.com> <DarkBeQst@users.noreply.github.com>
-
-Thomas Moeller <thomas.moeller@rub.de>
-
-Ann-Christin Koch <ann-christin.koch@ruhr-uni-bochum.de> <Ann-Christin.Koch@ruhr-uni-bochum.de>
-
-Sebastian Priebe <sebastian.priebe@rub.de>
```
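The removed .mailmap maps alternate author emails to one canonical "Name &lt;email&gt;" per contributor, so `git log` and `git shortlog` report a single identity. The sketch below is a simplified, hypothetical re-implementation of that resolution (real git mailmap syntax supports more forms, e.g. name matching and comments); it only illustrates the line format visible in the diff above.

```python
import re

# Simplified sketch (not PyLoT or git code): resolve alias emails using
# .mailmap lines of the forms
#   "Canonical Name <canonical@mail>"
#   "Canonical Name <canonical@mail> <alias@mail>"
MAILMAP_LINE = re.compile(
    r'^(?P<name>[^<]+?)\s*<(?P<email>[^>]+)>\s*(?:<(?P<alias>[^>]+)>)?$')

def build_map(mailmap_text):
    """Map alias email (lowercased) -> (canonical name, canonical email)."""
    mapping = {}
    for line in mailmap_text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        m = MAILMAP_LINE.match(line)
        if not m:
            continue
        # Without an explicit alias, the canonical email maps to itself.
        alias = m.group('alias') or m.group('email')
        mapping[alias.lower()] = (m.group('name'), m.group('email'))
    return mapping

aliases = build_map(
    "Marcel Paffrath <marcel.paffrath@ruhr-uni-bochum.de> <marcel@email.com>")
assert aliases['marcel@email.com'] == (
    'Marcel Paffrath', 'marcel.paffrath@ruhr-uni-bochum.de')
```

Removing the file (as this comparison side does) makes git fall back to the raw author identities recorded in each commit.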
README.md

@@ -1,70 +1,66 @@

# PyLoT

version: 0.3
version: 0.2

The Python picking and Localisation Tool

This python library contains a graphical user interfaces for picking seismic phases. This software needs [ObsPy][ObsPy]
and the PySide2 Qt5 bindings for python to be installed first.
This python library contains a graphical user interfaces for picking
seismic phases. This software needs [ObsPy][ObsPy]
and the PySide Qt4 bindings for python to be installed first.

PILOT has originally been developed in Mathworks' MatLab. In order to distribute PILOT without facing portability
problems, it has been decided to redevelop the software package in Python. The great work of the ObsPy group allows easy
handling of a bunch of seismic data and PyLoT will benefit a lot compared to the former MatLab version.
PILOT has originally been developed in Mathworks' MatLab. In order to
distribute PILOT without facing portability problems, it has been decided
to redevelop the software package in Python. The great work of the ObsPy
group allows easy handling of a bunch of seismic data and PyLoT will
benefit a lot compared to the former MatLab version.

The development of PyLoT is part of the joint research project MAGS2, AlpArray and AdriaArray.
The development of PyLoT is part of the joint research project MAGS2 and AlpArray.

## Installation

At the moment there is no automatic installation procedure available for PyLoT. Best way to install is to clone the
repository and add the path to your Python path.

It is highly recommended to use Anaconda for a simple creation of a Python installation using either the *pylot.yml* or the *requirements.txt* file found in the PyLoT root directory. First make sure that the *conda-forge* channel is available in your Anaconda installation:

    conda config --add channels conda-forge

Afterwards run (from the PyLoT main directory where the files *requirements.txt* and *pylot.yml* are located)

    conda env create -f pylot.yml

or

    conda create -c conda-forge --name pylot_311 python=3.11 --file requirements.txt

to create a new Anaconda environment called *pylot_311*.

Afterwards activate the environment by typing

    conda activate pylot_311

At the moment there is no automatic installation procedure available for PyLoT.
Best way to install is to clone the repository and add the path to your Python path.

#### Prerequisites:

In order to run PyLoT you need to install:

- Python 3
- cartopy
- joblib
- obspy
- pyaml
- pyqtgraph
- pyside2

(the following are already dependencies of the above packages):
- python 2 or 3
- scipy
- numpy
- matplotlib
- obspy
- pyside

#### Some handwork:

Some extra information on error estimates (just needed for reading old PILOT data) and the Richter magnitude scaling
relation
PyLoT needs a properties folder on your system to work. It should be situated in your home directory
(on Windows usually C:/Users/*username*):

    mkdir ~/.pylot

In the next step you have to copy some files to this directory:

*for local distance seismicity*

    cp path-to-pylot/inputs/pylot_local.in ~/.pylot/pylot.in

*for regional distance seismicity*

    cp path-to-pylot/inputs/pylot_regional.in ~/.pylot/pylot.in

*for global distance seismicity*

    cp path-to-pylot/inputs/pylot_global.in ~/.pylot/pylot.in

and some extra information on error estimates (just needed for reading old PILOT data) and the Richter magnitude scaling relation

    cp path-to-pylot/inputs/PILOT_TimeErrors.in path-to-pylot/inputs/richter_scaling.data ~/.pylot/

You may need to do some modifications to these files. Especially folder names should be reviewed.

PyLoT has been tested on Mac OSX (10.11), Debian Linux 8 and on Windows 10/11.
PyLoT has been tested on Mac OSX (10.11), Debian Linux 8 and on Windows 10.

## Example Dataset

An example dataset with waveform data, metadata and automatic picks in the obspy-dmt dataset format for testing the teleseismic picking can be found at https://zenodo.org/doi/10.5281/zenodo.13759803

## Release notes
@@ -73,26 +69,28 @@ An example dataset with waveform data, metadata and automatic picks in the obspy

- event organisation in project files and waveform visualisation
- consistent manual phase picking through predefined SNR dependant zoom level
- consistent automatic phase picking routines using Higher Order Statistics, AIC and Autoregression
- pick correlation correction for teleseismic waveforms
- interactive tuning of auto-pick parameters
- uniform uncertainty estimation from waveform's properties for automatic and manual picks
- pdf representation and comparison of picks taking the uncertainty intrinsically into account
- pdf representation and comparison of picks taking the uncertainty intrinsically into account
- Richter and moment magnitude estimation
- location determination with external installation of [NonLinLoc](http://alomax.free.fr/nlloc/index.html)

#### Known issues:

Current release is still in development progress and has several issues. We are currently lacking manpower, but hope to assess many of the issues in the near future.
- Sometimes an error might occur when using Qt

We hope to solve these with the next release.

## Staff

Developer(s): M. Paffrath, S. Wehling-Benatelli, L. Kueperkoch, D. Arnold, K. Cökerim, K. Olbert, M. Bischoff, C. Wollin, M. Rische, S. Zimmermann
Original author(s): L. Kueperkoch, S. Wehling-Benatelli, M. Bischoff (PILOT)

Original author(s): M. Rische, S. Wehling-Benatelli, L. Kueperkoch, M. Bischoff (PILOT)
Developer(s): S. Wehling-Benatelli, L. Kueperkoch, K. Olbert, M. Bischoff,
C. Wollin, M. Rische, M. Paffrath

Others: A. Bruestle, T. Meier, W. Friederich

[ObsPy]: http://github.com/obspy/obspy/wiki

March 2025
September 2017
autoPyLoT.py
```
@@ -8,7 +8,6 @@ import datetime
import glob
import os
import traceback

from obspy import read_events
from obspy.core.event import ResourceIdentifier

@@ -28,7 +27,7 @@ from pylot.core.util.dataprocessing import restitute_data, Metadata
from pylot.core.util.defaults import SEPARATOR
from pylot.core.util.event import Event
from pylot.core.util.structure import DATASTRUCTURE
from pylot.core.util.utils import get_none, trim_station_components, check4gapsAndRemove, check4doubled, \
from pylot.core.util.utils import get_None, trim_station_components, check4gapsAndRemove, check4doubled, \
    check4rotated
from pylot.core.util.version import get_git_version as _getVersionString

@@ -91,9 +90,9 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
                     sp=sp_info)
    print(splash)

    parameter = get_none(parameter)
    inputfile = get_none(inputfile)
    eventid = get_none(eventid)
    parameter = get_None(parameter)
    inputfile = get_None(inputfile)
    eventid = get_None(eventid)

    fig_dict = None
    fig_dict_wadatijack = None

@@ -119,9 +118,13 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
        obspyDMT_wfpath = input_dict['obspyDMT_wfpath']

    if not parameter:
        if not inputfile:
            print('Using default input parameter')
        parameter = PylotParameter(inputfile)
        if inputfile:
            parameter = PylotParameter(inputfile)
            # iplot = parameter['iplot']
        else:
            infile = os.path.join(os.path.expanduser('~'), '.pylot', 'pylot.in')
            print('Using default input file {}'.format(infile))
            parameter = PylotParameter(infile)
    else:
        if not type(parameter) == PylotParameter:
            print('Wrong input type for parameter: {}'.format(type(parameter)))

@@ -136,11 +139,13 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
    if parameter.hasParam('datastructure'):
        # getting information on data structure
        datastructure = DATASTRUCTURE[parameter.get('datastructure')]()
        dsfields = {'dpath': parameter.get('datapath'),}
        dsfields = {'root': parameter.get('rootpath'),
                    'dpath': parameter.get('datapath'),
                    'dbase': parameter.get('database')}

        exf = ['dpath']
        exf = ['root', 'dpath', 'dbase']

        if parameter['eventID'] != '*' and fnames == 'None':
        if parameter['eventID'] is not '*' and fnames == 'None':
            dsfields['eventID'] = parameter['eventID']
            exf.append('eventID')

@@ -148,7 +153,7 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
        datastructure.setExpandFields(exf)

    # check if default location routine NLLoc is available and all stations are used
    if get_none(parameter['nllocbin']) and station == 'all':
    if get_None(parameter['nllocbin']) and station == 'all':
        locflag = 1
        # get NLLoc-root path
        nllocroot = parameter.get('nllocroot')

@@ -184,15 +189,15 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
    if not input_dict:
        # started in production mode
        datapath = datastructure.expandDataPath()
        if fnames in [None, 'None'] and parameter['eventID'] == '*':
        if fnames == 'None' and parameter['eventID'] is '*':
            # multiple event processing
            # read each event in database
            events = [event for event in glob.glob(os.path.join(datapath, '*')) if
                      (os.path.isdir(event) and not event.endswith('EVENTS-INFO'))]
        elif fnames in [None, 'None'] and parameter['eventID'] != '*' and not type(parameter['eventID']) == list:
        elif fnames == 'None' and parameter['eventID'] is not '*' and not type(parameter['eventID']) == list:
            # single event processing
            events = glob.glob(os.path.join(datapath, parameter['eventID']))
        elif fnames in [None, 'None'] and type(parameter['eventID']) == list:
        elif fnames == 'None' and type(parameter['eventID']) == list:
            # multiple event processing
            events = []
            for eventID in parameter['eventID']:

@@ -204,10 +209,12 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
            locflag = 2
    else:
        # started in tune or interactive mode
        datapath = parameter['datapath']
        datapath = os.path.join(parameter['rootpath'],
                                parameter['datapath'])
        events = []
        for eventID in eventid:
            events.append(os.path.join(datapath,
                                       parameter['database'],
                                       eventID))

    if not events:

@@ -234,15 +241,12 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
            data.get_evt_data().path = eventpath
            print('Reading event data from filename {}...'.format(filename))
        except Exception as e:
            if type(e) == FileNotFoundError:
                print('Creating new event file.')
            else:
                print('Could not read event from file {}: {}'.format(filename, e))
            print('Could not read event from file {}: {}'.format(filename, e))
            data = Data()
            pylot_event = Event(eventpath)  # event should be path to event directory
            data.setEvtData(pylot_event)
        if fnames in [None, 'None']:
            data.setWFData(glob.glob(os.path.join(event_datapath, '*')))
        if fnames == 'None':
            data.setWFData(glob.glob(os.path.join(datapath, event_datapath, '*')))
            # the following is necessary because within
            # multiple event processing no event ID is provided
            # in autopylot.in

@@ -273,7 +277,7 @@ def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, even
        if not wfdat:
            print('Could not find station {}. STOP!'.format(station))
            return
        # wfdat = remove_underscores(wfdat)
        #wfdat = remove_underscores(wfdat)
        # trim components for each station to avoid problems with different trace starttimes for one station
        wfdat = check4gapsAndRemove(wfdat)
        wfdat = check4doubled(wfdat)
```

(grid-engine job submission script; file name not captured in this view)

```
@@ -3,10 +3,8 @@
#$ -l low
#$ -cwd
#$ -pe smp 40
##$ -l mem=3G
#$ -l h_vmem=6G
#$ -l mem=2G
#$ -l h_vmem=2G
#$ -l os=*stretch

conda activate pylot_311

python ./autoPyLoT.py -i /home/marcel/.pylot/pylot_adriaarray.in -c 20 -dmt processed
python ./autoPyLoT.py -i /home/marcel/.pylot/pylot_alparray_mantle_corr_stack_0.03-0.5.in -dmt processed -c $NSLOTS
```
(documentation file present only on the develop side; file name not captured in this view)

@@ -1,77 +0,0 @@

# Pick-Correlation Correction

## Introduction

Currently, the pick-correlation correction algorithm is not accessible from the PyLoT GUI. The main file *pick_correlation_correction.py* is located in the directory *pylot\correlation*.
The program only works for an obspy dmt database structure.

The basic workflow of the algorithm is shown in the following diagram. The first step **(1)** is the normal (automatic) picking procedure in PyLoT. Everything from step **(2)** to **(5)** is part of the correlation correction algorithm.

*Note: The first step is not required in case theoretical onsets are used instead of external picks when the parameter use_taupy_onsets is set to True. However, an existing event quakeML (.xml) file generated by PyLoT might be required for each event in case external picks are not used.*

![images/workflow_stacking.png](images/workflow_stacking.png)

A detailed description of the algorithm can be found in the corresponding publication:

*Paffrath, M., Friederich, W., and the AlpArray and AlpArray-SWATH D Working Groups: Teleseismic P waves at the AlpArray seismic network: wave fronts, absolute travel times and travel-time residuals, Solid Earth, 12, 1635–1660, https://doi.org/10.5194/se-12-1635-2021, 2021.*

## How to use

To use the program you have to call the main program providing two mandatory arguments: a path to the obspy dmt database folder *dmt_database_path* and the path to the PyLoT infile *pylot.in* for picking of the beam trace:

```python pick_correlation_correction.py dmt_database_path pylot.in```

By default, the parameter file *parameters.yaml* is used. You can use the command line option *--params* to specify a different parameter file and other optional arguments such as *-pd* for plotting detailed information or *-n 4* to use 4 cores for parallel processing:

```python pick_correlation_correction.py dmt_database_path pylot.in --params parameters_adriaarray.yaml -pd -n 4```

## Cross-Correlation Parameters

The program uses the parameters in the file *parameters.yaml* by default. You can use the command line option *--params* to specify a different parameter file. An example of the parameter file is provided in the *correlation\parameters.yaml* file.

In the top level of the parameter file the logging level *logging* can be set, as well as a list of pick phases *pick_phases* (e.g. ['P', 'S']).

For each pick phase the different parameters can be set in the first sub-level of the parameter file, e.g.:

```yaml
logging: info
pick_phases: ['P', 'S']

P:
  min_corr_stacking: 0.8
  min_corr_export: 0.6
  [...]

S:
  min_corr_stacking: 0.7
  [...]
```

The following parameters are available:

| Parameter Name | Description | Parameter Type |
|--------------------------------|----------------------------------------------------------------------------------------------------|----------------|
| min_corr_stacking | Minimum correlation coefficient for building beam trace | float |
| min_corr_export | Minimum correlation coefficient for pick export | float |
| min_stack | Minimum number of stations for building beam trace | int |
| t_before | Correlation window before reference pick | float |
| t_after | Correlation window after reference pick | float |
| cc_maxlag | Maximum shift for initial correlation | float |
| cc_maxlag2 | Maximum shift for second (final) correlation (also for calculating pick uncertainty) | float |
| initial_pick_outlier_threshold | Threshold for excluding large outliers of initial (AIC) picks | float |
| export_threshold | Automatically exclude all onsets which deviate more than this threshold from corrected taup onsets | float |
| min_picks_export | Minimum number of correlated picks for export | int |
| min_picks_autopylot | Minimum number of reference auto picks to continue with event | int |
| check_RMS | Do RMS check to search for restitution errors (very experimental) | bool |
| use_taupy_onsets | Use taupy onsets as reference picks instead of external picks | bool |
| station_list | Use the following stations as reference for stacking | list[str] |
| use_stacked_trace | Use existing stacked trace if found (spare re-computation) | bool |
| data_dir | obspyDMT data subdirectory (e.g. 'raw', 'processed') | str |
| pickfile_extension | Use quakeML files (PyLoT output) with the following extension | str |
| dt_stacking | Time difference for stacking window (in seconds) | list[float] |
| filter_options | Filter for first correlation (rough) | dict |
| filter_options_final | Filter for second correlation (fine) | dict |
| filter_type | Filter type (e.g. bandpass) | str |
| sampfreq | Sampling frequency (in Hz) | float |

## Example Dataset

An example dataset with waveform data, metadata and automatic picks in the obspy-dmt dataset format for testing can be found at https://zenodo.org/doi/10.5281/zenodo.13759803
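The per-phase layout described above (top-level `logging` and `pick_phases`, one mapping per phase) can be read with PyYAML, assuming it is installed (the README lists *pyaml* among the dependencies). A minimal sketch, using an inline document in place of the real *parameters.yaml*:

```python
import yaml  # PyYAML, assumed installed

# Inline stand-in for parameters.yaml, following the example above.
doc = """
logging: info
pick_phases: ['P', 'S']

P:
  min_corr_stacking: 0.8
  min_corr_export: 0.6

S:
  min_corr_stacking: 0.7
"""

params = yaml.safe_load(doc)
assert params['pick_phases'] == ['P', 'S']

# Each phase key maps parameter names (see table above) to values.
for phase in params['pick_phases']:
    settings = params[phase]
    assert isinstance(settings['min_corr_stacking'], float)
```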
docs/gui.md
@ -2,36 +2,36 @@
|
||||
|
||||
- [PyLoT Documentation](#pylot-documentation)
|
||||
- [PyLoT GUI](#pylot-gui)
|
||||
- [First start](#first-start)
|
||||
- [Main Screen](#main-screen)
|
||||
- [Waveform Plot](#waveform-plot)
|
||||
- [Mouse view controls](#mouse-view-controls)
|
||||
- [Buttons](#buttons)
|
||||
- [Array Map](#array-map)
|
||||
- [Eventlist](#eventlist)
|
||||
- [Usage](#usage)
|
||||
- [Projects and Events](#projects-and-events)
|
||||
- [Event folder structure](#event-folder-structure)
|
||||
- [Loading event information from CSV file](#loading-event-information-from-csv-file)
|
||||
- [Adding events to project](#adding-events-to-project)
|
||||
- [Saving projects](#saving-projects)
|
||||
- [Adding metadata](#adding-metadata)
|
||||
- [First start](#first-start)
|
||||
- [Main Screen](#main-screen)
|
||||
- [Waveform Plot](#waveform-plot)
|
||||
- [Mouse view controls](#mouse-view-controls)
|
||||
- [Buttons](#buttons)
|
||||
- [Array Map](#array-map)
|
||||
- [Eventlist](#eventlist)
|
||||
- [Usage](#usage)
|
||||
- [Projects and Events](#projects-and-events)
|
||||
- [Event folder structure](#event-folder-structure)
|
||||
- [Loading event information from CSV file](#loading-event-information-from-csv-file)
|
||||
- [Adding events to project](#adding-events-to-project)
|
||||
- [Saving projects](#saving-projects)
|
||||
- [Adding metadata](#adding-metadata)
|
||||
- [Picking](#picking)
|
||||
- [Manual Picking](#manual-picking)
|
||||
- [Picking window](#picking-window)
|
||||
- [Picking Window Settings](#picking-window-settings)
|
||||
- [Filtering](#filtering)
|
||||
- [Export and Import of manual picks](#export-and-import-of-manual-picks)
|
||||
- [Export](#export)
|
||||
- [Import](#import)
|
||||
- [Automatic Picking](#automatic-picking)
|
||||
- [Tuning](#tuning)
|
||||
- [Production run of the autopicker](#production-run-of-the-autopicker)
|
||||
- [Evaluation of automatic picks](#evaluation-of-automatic-picks)
|
||||
- [1. Jackknife check](#1-jackknife-check)
|
||||
- [2. Wadati check](#2-wadati-check)
|
||||
- [Comparison between automatic and manual picks](#comparison-between-automatic-and-manual-picks)
|
||||
- [Export and Import of automatic picks](#export-and-import-of-automatic-picks)
|
||||
- [Manual Picking](#manual-picking)
|
||||
- [Picking window](#picking-window)
|
||||
- [Picking Window Settings](#picking-window-settings)
|
||||
- [Filtering](#filtering)
|
||||
- [Export and Import of manual picks](#export-and-import-of-manual-picks)
|
||||
- [Export](#export)
|
||||
- [Import](#import)
|
||||
- [Automatic Picking](#automatic-picking)
|
||||
- [Tuning](#tuning)
|
||||
- [Production run of the autopicker](#production-run-of-the-autopicker)
|
||||
- [Evaluation of automatic picks](#evaluation-of-automatic-picks)
|
||||
- [1. Jackknife check](#1-jackknife-check)
|
||||
- [2. Wadati check](#2-wadati-check)
|
||||
- [Comparison between automatic and manual picks](#comparison-between-automatic-and-manual-picks)
|
||||
- [Export and Import of automatic picks](#export-and-import-of-automatic-picks)
|
||||
- [Location determination](#location-determination)
|
||||
- [FAQ](#faq)
|
||||
|
||||
@ -44,17 +44,15 @@ This section describes how to use PyLoT graphically to view waveforms and create
|
||||
After opening PyLoT for the first time, the setup routine asks for the following information:
|
||||
|
||||
Questions:
|
||||
|
||||
1. Full Name
|
||||
2. Authority: Enter authority/institution name
|
||||
3. Format: Enter output format (*.xml, *.cnv, *.obs)
|
||||
|
||||
[//]: <> (TODO: explain what these things mean, where they are used)
|
||||
[//]: <> (TODO: explain what these things mean, where they are used)
|
||||
|
||||
## Main Screen
|
||||
|
||||
After entering the [information](#first-start), PyLoTs main window is shown. It defaults to a view of
|
||||
the [Waveform Plot](#waveform-plot), which starts empty.
|
||||
After entering the [information](#first-start), PyLoTs main window is shown. It defaults to a view of the [Waveform Plot](#waveform-plot), which starts empty.
|
||||
|
||||
<img src=images/gui/pylot-main-screen.png alt="Tune autopicks button" title="Tune autopicks button">
|
||||
|
||||
@ -63,21 +61,24 @@ Add trace data by [loading a project](#projects-and-events) or by [adding event
|
||||
### Waveform Plot
|
||||
|
||||
The waveform plot shows a trace list of all stations of an event.
|
||||
Click on any trace to open the stations [picking window](#picking-window), where you can review automatic and manual
|
||||
picks.
|
||||
Click on any trace to open the stations [picking window](#picking-window), where you can review automatic and manual picks.
|
||||
|
||||
<img src=images/gui/pylot-waveform-plot.png alt="A Waveform Plot showing traces of one event">
|
||||
|
||||
Above the traces the currently displayed event can be selected. In the bottom bar information about the trace under the
|
||||
mouse cursor is shown. This information includes the station name (station), the absolute UTC time (T) of the point
|
||||
under the mouse cursor and the relative time since the first trace start in seconds (t) as well as a trace count.
|
||||
Above the traces the currently displayed event can be selected.
|
||||
In the bottom bar information about the trace under the mouse cursor is shown. This information includes the station name (station), the absolute UTC time (T) of the point under the mouse cursor and the relative time since the first trace start in seconds (t) as well as a trace count.
|
||||
|
||||
#### Mouse view controls
|
||||
#### Mouse view controls
|
||||
|
||||
Hold left mouse button and drag to pan view.
|
||||
|
||||
Hold right mouse button and Direction | Result --- | --- Move the mouse up | Increase amplitude scale Move the mouse
|
||||
down | Decrease amplitude scale Move the mouse right | Increase time scale Move the mouse left | Decrease time scale
|
||||
Hold right mouse button and
|
||||
Direction | Result
|
||||
--- | ---
|
||||
Move the mouse up | Increase amplitude scale
|
||||
Move the mouse down | Decrease amplitude scale
|
||||
Move the mouse right | Increase time scale
|
||||
Move the mouse left | Decrease time scale
|
||||
|
||||
Press right mouse button and click "View All" from the context menu to reset the view.
|
||||
|
||||
@ -85,100 +86,81 @@ Press right mouse button and click "View All" from the context menu to reset the
|
||||
|
||||
[//]: <> (Hack: We need these invisible spaces to add space to the first column, otherwise )
|
||||
|
||||
| Icon | Description |
|
||||
|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
|
||||
| <img src="../icons/newfile.png" alt="Create new project" width="64" height="64"> | Create a new project, for more information about projects see [Projects and Events](#projects-and-events). |
|
||||
| <img src="../icons/openproject.png" alt="Open project" width="64" height="64"> | Load a project file from disk. |
|
||||
| <img src="../icons/saveproject.png" alt="Save Project" width="64" height="64"> | Save all current events into an associated project file on disk. If there is no project file currently associated, you will be asked to create a new one. |
|
||||
| <img src="../icons/saveprojectas.png" alt="Save Project as" width="64" height="64"> | Save all current events into a new project file on disk. See [Saving projects](#saving-projects). |
|
||||
| <img src="../icons/add.png" alt="Add event data" width="64" height="64"> | Add event data by selecting directories containing waveforms. For more information see [Event folder structure](#event-folder-structure). |
|
||||
| <img src="../icons/openpick.png" alt="Load event information" width="64" height="64"> | Load picks/origins from disk into the currently displayed event. If a pick already exists for a station, the one from file will overwrite the existing one. |
|
||||
| <img src="../icons/openpicks.png" alt="Load information for all events" width="64" height="64"> | Load picks/origins for all events of the current project. PyLoT searches for files within the directory of the event and tries to load them for that event. For this function to work, the files containing picks/origins have to be named as described in [Event folder structure](#event-folder-structure). If a pick already exists for a station, the one from file will overwrite the existing one. |
| <img src="../icons/savepicks.png" alt="Save picks" width="64" height="64"> | Save event information such as picks and origin to file. You will be asked to select a directory in which this information should be saved. |
| <img src="../icons/openloc.png" alt="Load location information" width="64" height="64"> | Load location information from disk. |
| <img src="../icons/Matlab_PILOT_icon.png" alt="Load legacy information" width="64" height="64"> | Load event information from a previous, MATLAB-based PILOT version. |
| <img src="../icons/key_Z.png" alt="Display Z" width="64" height="64"> | Display Z component of streams in waveform plot. |
| <img src="../icons/key_N.png" alt="Display N" width="64" height="64"> | Display N component of streams in waveform plot. |
| <img src="../icons/key_E.png" alt="Display E" width="64" height="64"> | Display E component of streams in waveform plot. |
| <img src="../icons/tune.png" alt="Tune Autopicker" width="64" height="64"> | Open the [Tune Autopicker window](#tuning). |
| <img src="../icons/autopylot_button.png" alt="Start autopicker for all events" width="64" height="64"> | Opens a window that allows starting the autopicker for all events ([Production run of the AutoPicker](#production-run-of-the-autopicker)). |
| <img src="../icons/compare_button.png" alt="Comparison" width="64" height="64"> | Compare automatic and manual picks, only available if automatic and manual picks for an event exist. See [Comparison between automatic and manual picks](#comparison-between-automatic-and-manual-picks). |
| <img src="../icons/locate_button.png" alt="Locate event" width="64" height="64"> | Run a location routine (NonLinLoc) as configured in the settings on the picks. See [Location determination](#location-determination). |
### Array Map

The array map will display a color diagram to allow a visual check of the consistency of picks across multiple stations. This works by calculating the time difference of every onset to the earliest onset. Then isolines are drawn between stations with the same time difference and the areas between isolines are colored.

The result should resemble a color gradient as the wavefront rolls over the network area. Stations where picks are earlier/later than their neighbours can be reviewed by clicking on them, which opens the [picking window](#picking-window).
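The time differences underlying the map can be sketched in a few lines (the station names and onset times below are made-up illustration values):

```python
# Hypothetical onset times in seconds, one per station.
onsets = {"GR.A": 102.4, "GR.B": 103.1, "GR.C": 104.8}

# Time difference of every onset to the earliest onset;
# these values are what the isolines are drawn from.
earliest = min(onsets.values())
diffs = {sta: round(t - earliest, 2) for sta, t in onsets.items()}
print(diffs)  # {'GR.A': 0.0, 'GR.B': 0.7, 'GR.C': 2.4}
```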

Above the Array Map the picks that are used to create the map can be customized.
The phase of picks that should be used can be selected, which allows checking the consistency of the P- and S-phases separately.
Additionally, the pick type can be set to manual, automatic or hybrid, meaning the map displays only manual picks, only automatic picks, or automatic picks for stations where no manual picks exist.

<img src="images/gui/array_map.png" alt="Array Map">

*Array Map for an event at the Northern Mid-Atlantic Ridge, between North Africa and Mexico (Lat. 22.58, Lon. -45.11). The wavefront moved from west to east over the network area (Alps and Balkan region), with the earliest onsets in blue in the west.*

To display an array map, PyLoT needs to load an inventory file containing the metadata of the seismic stations. For more information see [Metadata](#adding-metadata). Additionally, automatic or manual picks need to exist for the current event.

### Eventlist

The eventlist displays event parameters. The displayed parameters are saved in the .xml file in the event folder. Events can be deleted from the project by pressing the red X in the leftmost column of the corresponding event.

<img src="images/gui/eventlist.png" alt="Eventlist">

| Column | Description |
|---|---|
| Event | Full path to the event's folder. |
| Time | Time of the event. |
| Lat | Latitude in degrees of the event location. |
| Lon | Longitude in degrees of the event location. |
| Depth | Depth in km of the event. |
| Mag | Magnitude of the event. |
| [N] MP | Number of manual picks. |
| [N] AP | Number of automatic picks. |
| Tuning Set | Select whether this event is a Tuning event. See [Automatic Picking](#automatic-picking). |
| Test Set | Select whether this event is a Test event. See [Automatic Picking](#automatic-picking). |
| Notes | Free-form text field for notes regarding this event. Text will be saved in the notes.txt file in the event folder. |

## Usage

### Projects and Events

PyLoT uses projects to categorize different seismic data. A project consists of one or multiple events. Events contain seismic traces from one or multiple stations. An event also contains further information, e.g. origin time, source parameters, and automatic as well as manual picks.

Projects are used to group events which should be analysed together. A project could contain all events from a specific region within a timeframe of interest, or all recorded events of a seismological experiment.

### Event folder structure

PyLoT expects the following folder structure for seismic data:

* Every event should be in its own folder, named according to the scheme ``e[id].[doy].[yy]``, where ``[id]`` is a four-digit numerical id increasing from 0001, ``[doy]`` the three-digit day of year and ``[yy]`` the last two digits of the year of the event. This structure has to be created manually by the user of PyLoT.
* These folders should contain the seismic data for their event as ``.mseed`` files or another supported filetype.
* All automatic and manual picks should be in an ``.xml`` file in their event folder. PyLoT saves picks in this file; it does not have to be added manually unless there are picks to be imported. The format used to save picks is QuakeML. Picks are saved in a file with the same filename as the event folder, with ``PyLoT_`` prepended.
* The file ``notes.txt`` is used for saving analysts' comments. Everything saved here will be displayed in the 'Notes' column of the eventlist.
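Under this scheme, a folder name follows directly from the event id and origin date. A minimal Python sketch (the helper function is our own illustration, not part of PyLoT):

```python
from datetime import datetime

def event_folder_name(event_id: int, origin_time: datetime) -> str:
    """Build an event folder name following the e[id].[doy].[yy] scheme."""
    doy = origin_time.timetuple().tm_yday  # day of year, 1-366
    yy = origin_time.year % 100            # last two digits of the year
    return f"e{event_id:04d}.{doy:03d}.{yy:02d}"

# An event recorded on 2016-03-01 (day 61 of a leap year) with id 17:
print(event_folder_name(17, datetime(2016, 3, 1)))  # e0017.061.16
```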

### Loading event information from CSV file

Event information can be saved in a ``.csv`` file located in the rootpath. The file consists of one header line followed by one or multiple data lines. Values are separated by commas, with a dot as the decimal separator.
This information is then shown in the table in the [Eventlist tab](#eventlist).

One example header and data line is shown below.

The meaning of the header entries is:

| Header | Description |
|---|---|
| event | Event id; has to be the same as the name of the folder in which the waveform data for this event is kept. |
| Data | Origin date of the event, format DD/MM/YY or DD/MM/YYYY. |
| Time | Origin time of the event, format HH:MM:SS. |
| Lat, Long | Origin latitude and longitude in decimal degrees. |
| Region | Flinn-Engdahl region name. |
| Basis Lat, Basis Lon | Latitude and longitude of the basis of the station network in decimal degrees. |
| Distance [km] | Distance from origin coordinates to basis coordinates in km. |
| Distance [rad] | Distance from origin coordinates to basis coordinates in rad. |

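Such a file can be read with Python's ``csv`` module. A minimal sketch; the file contents below are a hypothetical example using a subset of the columns from the table above:

```python
import csv
import io

# Hypothetical contents of an events .csv file (header keys as in the table above).
csv_text = (
    "event,Data,Time,Lat,Long,Region\n"
    "e0001.061.16,01/03/2016,05:23:14,22.58,-45.11,North Atlantic Ocean\n"
)

with io.StringIO(csv_text) as f:  # a real file handle would be used instead
    for row in csv.DictReader(f):
        # All values are read as strings; coordinates use a dot as decimal separator.
        lat, lon = float(row["Lat"]), float(row["Long"])
        print(row["event"], lat, lon)  # e0001.061.16 22.58 -45.11
```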
### Adding events to project

PyLoT GUI starts with an empty project. To add events, use the add event data button. Select one or multiple folders containing events.

[//]: <> (TODO: explain _Directories: Root path, Data path, Database path_)

### Saving projects

Save the current project from the menu with File->Save project or File->Save project as.
PyLoT uses ``.plp`` files to save project information. This file format is not interchangeable between different versions of Python interpreters.
Saved projects contain the automatic and manual picks. Seismic trace data is not included in the ``.plp`` file, but read from the location recorded when the file was saved.

### Adding metadata

[//]: <> (TODO: Add picture of metadata "manager" when it is done)

PyLoT can handle the ``.dless``, ``.xml``, ``.resp`` and ``.dseed`` file formats for metadata. Metadata files stored on disk can be added to a project by clicking *Edit*->*Manage Inventories*. This opens a window in which the folders containing metadata files can be selected. PyLoT will then search these files for the station names when it needs the information.

# Picking

PyLoT's automatic and manual pick determination works as follows:

* Using certain parameters, a first initial/coarse pick is determined. The first manual pick is determined by visual review of the whole waveform and selection of the most likely onset by the analyst. The first automatic pick is determined by calculating a characteristic function (CF) for the seismic trace. When a wave arrives, the properties of the CF change, which is interpreted as the signal's onset.
* Afterwards, a refined set of parameters is applied to a small part of the waveform around the initial onset. For manual picks this means a closer view of the trace; for automatic picks this is done by recalculating the CF with different parameters.
* This second picking phase results in the precise pick, which is treated as the onset time.
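The two-stage coarse/precise idea can be illustrated with a generic STA/LTA characteristic function, a common CF in seismology. Note that PyLoT's actual CFs and parameters differ; the window lengths and threshold below are arbitrary illustration values on a synthetic trace:

```python
import random

def sta_lta(trace, nsta, nlta):
    """Simple STA/LTA characteristic function (generic illustration, not PyLoT's CF)."""
    cf = [0.0] * len(trace)
    for i in range(nlta, len(trace)):
        sta = sum(abs(x) for x in trace[i - nsta:i]) / nsta  # short-term average
        lta = sum(abs(x) for x in trace[i - nlta:i]) / nlta  # long-term average
        cf[i] = sta / lta if lta > 0 else 0.0
    return cf

random.seed(0)
# Synthetic trace: weak noise, then a stronger "signal" starting at sample 500.
trace = [random.gauss(0, 0.1) for _ in range(500)] + \
        [random.gauss(0, 1.0) for _ in range(500)]

# Stage 1: coarse pick = first sample where the long-window CF exceeds a threshold.
cf = sta_lta(trace, nsta=20, nlta=100)
coarse = next(i for i, v in enumerate(cf) if v > 3.0)

# Stage 2: recompute the CF with shorter windows on a small part around the coarse pick.
lo, hi = max(coarse - 100, 0), min(coarse + 100, len(trace))
cf2 = sta_lta(trace[lo:hi], nsta=5, nlta=50)
precise = lo + max(range(len(cf2)), key=lambda i: cf2[i])
print(coarse, precise)  # both land near the true onset at sample 500
```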

## Manual Picking

To create manual picks, you will need to open or create a project that contains seismic trace data (see [Adding events to projects](#adding-events-to-project)). Click on a trace to open the [Picking window](#picking-window).

### Picking window

Open the picking window of a station by left-clicking on any trace in the waveform plot. Here you can create manual picks for the selected station.

<img src="images/gui/picking/pickwindow.png" alt="Picking window">

#### Picking Window Settings

| Icon | Shortcut | Menu Alternative | Description |
|---|---|---|---|
| <img src="../icons/filter_p.png" alt="Filter P" width="64" height="64"> | p | Filter->Apply P Filter | Filter all channels according to the options specified in Filter parameter, P Filter section. |
| <img src="../icons/filter_s.png" alt="Filter S" width="64" height="64"> | s | Filter->Apply S Filter | Filter all channels according to the options specified in Filter parameter, S Filter section. |
| <img src="../icons/key_A.png" alt="Filter Automatically" width="64" height="64"> | Ctrl + a | Filter->Automatic Filtering | If enabled, automatically select the correct filter option (P, S) depending on the selected phase to be picked. |
|  | 1 (P) or 5 (S) | Picks->P or S | Select phase to pick. If Automatic Filtering is enabled, this will apply the appropriate filter depending on the phase. |
|  | - | - | Zoom into waveform. |
|  | - | - | Reset zoom to default view. |
|  | - | - | Delete all manual picks on this station. |
|  | - | - | Click this button and then the picked phase to rename it. |
|  | - | - | If checked, after accepting the manual picks for this station with 'OK', the picking window for the next station will be opened. This option is useful for fast manual picking of a complete event. |
| Estimated onsets | - | - | Show the theoretical onsets for this station. Needs metadata and origin information. |
| Compare to channel | - | - | Select a data channel to compare against. The selected channel will be displayed in the picking window behind every channel, allowing the analyst to visually compare signal correlation between different channels. |
| Scaling | - | - | Individual means every channel is scaled to its own maximum. If a channel is selected here, all channels will be scaled relative to this channel. |

| Menu Command | Shortcut | Description |
|---|---|---|
| P Channels and S Channels | - | Select which channels should be treated as P or S channels during picking. When picking a phase, only the corresponding channels will be shown during the precise pick. Normally, the Z channel should be selected for the P phase and the N and E channel for the S phase. |

### Filtering

Access the Filter options by pressing Ctrl+f on the Waveform plot or via the menu.

<img src="images/gui/pylot-filter-options.png">

Here you are able to select the filter type, order and frequencies for the P and S picks separately. These settings are used in the GUI for displaying the filtered waveform data and during manual picking. The values used by PyLoT for automatic picking are displayed next to the manual values. They can be changed in the [Tune Autopicker dialog](#tuning).
A green automatic value means the automatic and manual filter parameters are configured the same; red means they are configured differently.
By toggling the "Overwrite filteroptions" checkmark you can set whether the manual precise/second pick uses the filter settings of the automatic picker (unchecked) or the filter options in this dialog (checked).
To guarantee consistent picking results between automatic and manual picking it is recommended to use the same filter settings for the determination of automatic and manual picks.
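As a rough illustration of what such a bandpass setting does to a trace, here is a SciPy sketch on a synthetic signal. This is not PyLoT's internal filtering code, and the frequencies and order are example values, not recommendations:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 100.0  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
# Synthetic trace: a 5 Hz component inside the passband plus a 30 Hz component outside it.
trace = np.sin(2 * np.pi * 5 * t) + np.sin(2 * np.pi * 30 * t)

# 4th-order Butterworth bandpass, 1-10 Hz, applied forwards and backwards (zero phase).
sos = butter(4, [1.0, 10.0], btype="bandpass", fs=fs, output="sos")
filtered = sosfiltfilt(sos, trace)
# The 30 Hz component is strongly attenuated; the 5 Hz component passes nearly unchanged.
```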

### Export and Import of manual picks

#### Export

After creating manual picks, they can be saved in the project file (see [Saving projects](#saving-projects)). Alternatively, the picks can be exported by pressing the <img src="../icons/savepicks.png" alt="Save event information button" title="Save picks button" height=24 width=24> button above the waveform plot or via the menu File->Save event information (shortcut Ctrl+p). Select the event directory in which to save the file. The filename will be ``PyLoT_[event_folder_name].[filetype selected during first startup]``.
You can rename and copy this file, but PyLoT will then no longer be able to automatically recognize the correct picks for an event, and the file will have to be selected manually when loading.
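Given this naming scheme, the expected pick file location for an event folder can be sketched as follows (the helper and paths are illustrative, not part of PyLoT):

```python
from pathlib import Path

def expected_pick_file(event_dir: Path, filetype: str = "xml") -> Path:
    """Path where a PyLoT-style pick file is expected, per the naming scheme above."""
    return event_dir / f"PyLoT_{event_dir.name}.{filetype}"

print(expected_pick_file(Path("/data/project/e0017.061.16")).as_posix())
# /data/project/e0017.061.16/PyLoT_e0017.061.16.xml
```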

#### Import

To import previously saved picks, press the <img src="../icons/openpick.png" alt="Load event information button" width="24" height="24"> button and select the file to load. You will be asked to save the current state of your project if you have not done so before. You can continue without saving by pressing "Discard". This does not delete any information from your project; it just means that no project file is saved before the changes from importing picks are applied.
PyLoT will automatically load files named after the scheme it uses when saving picks, described in the paragraph above. If it can't find any matching files, a file dialog will open and you can select the file you wish to load.

If you see a warning "Mismatch in event identifiers" and are asked whether to continue loading the picks, PyLoT doesn't recognize the picks in the file as belonging to this specific event. They could either have been saved under a different installation of PyLoT with the same waveform data, in which case they are still compatible and you can continue loading them, or they could be picks from a different event, in which case loading them is not recommended.

## Automatic Picking

The general workflow for automatic picking is as follows:

- After setting up the project by loading waveforms and optionally metadata, the right parameters for the autopicker have to be determined
- This [tuning](#tuning) is done for single stations with immediate graphical feedback of all picking results
- Afterwards the autopicker can be run for all or a subset of events from the project

For automatic picking PyLoT distinguishes between tune and test events, which the user has to mark as such. Tune events are used to calibrate the autopicking algorithm; test events are then used to test the calibration. The purpose of this is to check whether the parameters found during tuning can reliably pick the "unknown" test events.
If this behaviour is not desired and all events should be handled the same, don't mark any events. Since this is just a way to group events to compare the picking results, nothing else will change.
|
||||
|
||||
### Tuning

Tuning describes the process of adjusting the autopicker settings to the characteristics of your data set. To do this in PyLoT, use the <img src=../icons/tune.png height=24 alt="Tune autopicks button" title="Tune autopicks button"> button to open the Tune Autopicker.

<img src=images/gui/tuning/tune_autopicker.png>

*View of a station in the Tune Autopicker window.*

1. Select the event to be displayed and processed.
2. Select the station from the event.
3. To pick the currently displayed trace, click the <img src=images/gui/tuning/autopick_trace_button.png alt="Pick trace button" title="Autopick trace button" height=16> button.
4. These tabs are used to select the current view. __Traces Plot__ contains a plot of the station's traces, where manual picks can be created/edited. __Overview__ contains graphical results of the automatic picking process. The __P and S tabs__ contain the automatic picking results of the P and S phase, while __log__ contains a useful text output of automatic picking.
5. These buttons are used to load/save/reset settings for automatic picking. The parameters can be saved in PyLoT input files, which have the file ending *.in*. They are human-readable text files, which can also be edited by hand. Saving the parameters allows you to load them again later, even on different machines.
6. These menus control the creation of manual picks from the Tune Autopicker window. Picks allows selecting the phase for which a manual pick should be created, Filter allows filtering waveforms and editing the filter parameters. P-Channels and S-Channels allow selecting the channels that should be displayed when creating a manual P or S pick.
7. This menu is the same as in the [Picking Window](#picking-window-settings), with the exception of the __Manual Onsets__ options. The __Manual Onsets__ buttons accept or reject the manual picks created in the Tune Autopicker window; pressing accept adds them to the manual picks for the event, while reject removes them.
8. The traces plot in the centre allows creating manual picks and viewing the waveforms.
9. The parameters which influence the autopicking result are in the Main settings and Advanced settings tabs on the left side. For a description of all the parameters see the [tuning documentation](tuning.md).

### Production run of the autopicker

Once the settings used during tuning give the desired results, the autopicker can be used on the complete dataset. To invoke the autopicker on the whole set of events, click the <img src=../icons/autopylot_button.png alt="Autopick" title="Autopick" height=32> button.

### Evaluation of automatic picks

PyLoT has two internal consistency checks for automatic picks that were determined for an event:

1. Jackknife check
2. Wadati check

#### 1. Jackknife check

The jackknife test in PyLoT checks the consistency of automatically determined P-picks via the statistical variance of the picks. The variance of all P-picks is calculated and compared to the variance of subsets in which one pick is removed.

The idea is that picks which are close together in time should not influence the estimate of the variance much, while picks whose positions deviate from the norm influence the variance to a greater extent. If the estimated variance of a subset with a pick removed differs too much from the estimated variance of all picks, the pick that was removed from the subset is marked as invalid.

The factor by which picks are allowed to skew the variance estimate can be configured; it is called *jackfactor*, see [here](tuning.md#Pick-quality-control).
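
The delete-one test described above can be sketched in a few lines (a minimal illustration; the function name is made up and PyLoT's actual implementation differs in detail, but the pseudo-value criterion follows the *jackfactor* description in the tuning documentation):

```python
import numpy as np

def jackknife_check(onsets, jackfactor=5.0):
    """Flag P onsets whose removal shifts the variance estimate too much.

    Returns a boolean array: True for picks that pass the test. A pick is
    rejected if the jackknife pseudo-value of the variance of its subgroup
    exceeds the variance of all picks times the jackfactor.
    """
    onsets = np.asarray(onsets, dtype=float)
    n = len(onsets)
    full_var = onsets.var()
    valid = np.empty(n, dtype=bool)
    for i in range(n):
        # Variance of the subgroup with pick i removed
        subset_var = np.delete(onsets, i).var()
        # Jackknife pseudo-value of the variance for that subgroup
        pseudo = n * full_var - (n - 1) * subset_var
        valid[i] = pseudo <= jackfactor * full_var
    return valid
```

An outlier pick inflates the overall variance; the subgroup without it has a much smaller variance and therefore a large pseudo-value, so only the outlier is flagged.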

Additionally, the deviation of picks from the median is checked. For that, the median of all P-picks that passed the Jackknife test is calculated. Picks whose onset times deviate from the median onset time by more than the *mdttolerance* are marked as invalid.

<img src=images/gui/jackknife_plot.png title="Jackknife/Median test diagram">

*The result of both tests (Jackknife and Median) is shown in a diagram afterwards. The onset time is plotted against a running number of stations. Picks that failed either the Jackknife or the median test are colored red. The median is plotted as a green line.*

The Jackknife and median checks are suitable for finding picks that are outside of the expected time window, for example, when a wrong phase was picked. They won't recognize picks that are in close proximity to the correct onset but just slightly too late/early.

#### 2. Wadati check

The Wadati check tests the consistency of S picks. For this, the S-P time, the time difference between S and P onset, is plotted against the P onset time. A line is fitted to the points, minimizing the error. Then the deviation of single picks from this line is checked. If the deviation in seconds is above the *wdttolerance* parameter ([see here](tuning.md#Pick-quality-control)), the pick is marked as invalid.
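
The core of such a check can be sketched as follows (the names and the plain least-squares fit are illustrative assumptions; the second line fit PyLoT performs after removing invalid picks is omitted here):

```python
import numpy as np

def wadati_check(p_onsets, s_onsets, wdttolerance):
    """Flag S picks whose S-P time deviates too far from the fitted line.

    Returns (valid, vpvs): a boolean array marking picks within
    wdttolerance seconds of the fitted line, and the vp/vs ratio
    implied by the line's slope.
    """
    p = np.asarray(p_onsets, dtype=float)
    sp = np.asarray(s_onsets, dtype=float) - p      # S-P time differences
    slope, intercept = np.polyfit(p, sp, 1)         # least-squares line
    residuals = np.abs(sp - (slope * p + intercept))
    valid = residuals <= wdttolerance
    vpvs = 1.0 + slope                              # slope of the Wadati line
    return valid, vpvs
```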

<img src=images/gui/wadati_plot.png title="Output diagram of Wadati check">

*The Wadati plot in PyLoT shows the S-P onset time difference over the P onset time. A first line is fitted (black). All picks which deviate too much from this line are marked invalid (red). Then a second line is fitted which excludes the invalid picks. From this line's slope, the ratio of P and S wave velocity is determined.*

### Comparison between automatic and manual picks

Every pick in PyLoT consists of an earliest possible, latest possible and most likely onset time.
The earliest and latest possible onset times characterize the uncertainty of a pick.
This approach is described in Diehl, Kissling and Bormann (2012) - Tutorial for consistent phase picking at local to regional distances.
These times are represented as a Probability Density Function (PDF) for every pick.
The PDF is implemented as two exponential distributions around the most likely onset as the expected value.

To compare two single picks, their PDFs are cross-correlated to create a new PDF.
This corresponds to the subtraction of the automatic pick from the manual pick.
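
A numerical sketch of this comparison (the two-sided exponential shape follows the description above; all names and parameter values are illustrative, not PyLoT's code):

```python
import numpy as np

dt = 0.01                                  # sampling interval in seconds
t = np.arange(-5.0, 5.0, dt)

def exp_pdf(t, mu, scale_left, scale_right):
    """Two-sided exponential PDF around the most likely onset mu."""
    pdf = np.where(t < mu,
                   np.exp((t - mu) / scale_left),
                   np.exp(-(t - mu) / scale_right))
    return pdf / (pdf.sum() * dt)          # normalize to unit area

manual = exp_pdf(t, mu=0.0, scale_left=0.2, scale_right=0.3)
auto = exp_pdf(t, mu=-0.5, scale_left=0.4, scale_right=0.4)

# Cross-correlating the two PDFs yields the PDF of (manual - automatic) onset
comparison = np.correlate(manual, auto, mode="full") * dt
lags = np.arange(-(len(t) - 1), len(t)) * dt

expected = np.sum(lags * comparison) * dt  # mean onset-time difference
std = np.sqrt(np.sum((lags - expected) ** 2 * comparison) * dt)
```

The expected value of the comparison PDF recovers the time distance between the two picks, and its standard deviation combines both uncertainties.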

<img src=images/gui/comparison/comparison_pdf.png title="Comparison between automatic and manual pick">

*Comparison between an automatic and a manual pick for a station in PyLoT by comparing their PDFs.*
*The upper plot shows the difference between the two single picks that are shown in the lower plot.*
*The difference is implemented as a cross-correlation between the two PDFs and results in a new PDF, the comparison PDF.*
*The expected value of the comparison PDF corresponds to the time distance between the automatic and manual picks' most likely onsets.*
*The standard deviation corresponds to the combined uncertainty.*

To compare the automatic and manual picks between multiple stations of an event, the properties of all the comparison PDFs are shown in a histogram.

<img src=images/gui/comparison/compare_widget.png title="Comparison between picks of an event">

*The bottom left plot shows the expected values of the comparison PDFs for P picks.*
*The top right plot shows the standard deviation of the comparison PDFs for S picks.*
*The bottom right plot shows the expected values of the comparison PDFs for S picks.*
*The standard deviation plots show that most P picks have an uncertainty between 1 and 2 seconds, while S pick uncertainties have a much larger spread, between 1 and 15 seconds.*
*This means P picks have higher quality classes on average than S picks.*
*The expected values are largely negative, meaning that the algorithm tends to pick earlier than the analyst with the applied settings (Manual - Automatic).*
*The number of samples mentioned in the plots' legends is the number of stations that have both an automatic and a manual P pick.*

### Export and Import of automatic picks

To be added.

# FAQ

Q: During manual picking the error "No channel to plot for phase ..." is displayed, and I am unable to create a pick.
A: Select a channel that should be used for the corresponding phase in the Pickwindow. For further information read [Picking Window settings](#picking-window-settings).

Q: I see a warning "Mismatch in event identifiers" when loading picks from a file.
A: This means that PyLoT doesn't recognize the picks in the file as belonging to this specific event. They could have been saved under a different installation of PyLoT but with the same waveform data, which means they are still compatible and you can continue loading them, or they could be picks of a different event, in which case loading them is not recommended.

# docs/tuning.md

A description of the parameters used for determining automatic picks.

Parameters applied to the traces before the picking algorithm starts.

Name | Description
--- | ---
*P Start*, *P Stop* | Define time interval relative to trace start time for CF calculation on vertical trace. Value is relative to theoretical onset time if 'Use TauPy' option is enabled in main settings of 'Tune Autopicker' dialogue.
*S Start*, *S Stop* | Define time interval relative to trace start time for CF calculation on horizontal traces. Value is relative to theoretical onset time if 'Use TauPy' option is enabled in main settings of 'Tune Autopicker' dialogue.
*Bandpass Z1* | Filter settings for Butterworth bandpass applied to vertical trace for calculation of initial P pick.
*Bandpass Z2* | Filter settings for Butterworth bandpass applied to vertical trace for calculation of precise P pick.
*Bandpass H1* | Filter settings for Butterworth bandpass applied to horizontal traces for calculation of initial S pick.
*Bandpass H2* | Filter settings for Butterworth bandpass applied to horizontal traces for calculation of precise S pick.
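
As an illustration of what such a bandpass stage does, here is a sketch using a standard Butterworth filter (the corner frequencies, filter order and synthetic trace are made up for the example and are not PyLoT defaults):

```python
import numpy as np
from scipy.signal import butter, sosfilt

fs = 100.0                                   # sampling rate in Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
# Synthetic vertical trace: slow drift plus a 10 Hz "phase"
trace = np.sin(2 * np.pi * 0.2 * t) + 0.5 * np.sin(2 * np.pi * 10.0 * t)

# Butterworth bandpass in the spirit of *Bandpass Z1* (2-20 Hz, order 4)
sos = butter(4, [2.0, 20.0], btype="bandpass", fs=fs, output="sos")
filtered = sosfilt(sos, trace)
```

The drift is removed while the 10 Hz signal passes nearly unchanged, which is the point of the pre-filter stage.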

## Initial P pick

Parameters used for determination of initial P pick.

Name | Description
--- | ---
*tLTA* | Size of gliding LTA window in seconds used for calculation of HOS-CF.
*pickwin P* | Size of time window in seconds in which the minimum of the AIC-CF in front of the maximum of the HOS-CF is determined.
*AICtsmooth* | Average of samples in this time window will be used for smoothing of the AIC-CF.
*checkwinP* | Time in front of the global maximum of the HOS-CF in which to search for a second local extremum.
*minfactorP* | Used with *checkwinP*. If a second local maximum is found, it has to be at least as big as the first maximum * *minfactorP*.
*tsignal* | Time window in seconds after the initial P pick used for determining signal amplitude.
*tnoise* | Time window in seconds in front of initial P pick used for determining noise amplitude.
*tsafetey* | Time in seconds between *tsignal* and *tnoise*.
*tslope* | Time window in seconds after initial P pick in which the slope of the onset is calculated.

## Initial S pick

Parameters used for determination of initial S pick.

Name | Description
--- | ---
*tdet1h* | Length of time window in seconds in which AR params of the waveform are determined.
*tpred1h* | Length of time window in seconds in which the waveform is predicted using the AR model.
*AICtsmoothS* | Average of samples in this time window is used for smoothing the AIC-CF.
*pickwinS* | Time window in which the minimum in the AIC-CF in front of the maximum in the ARH-CF is determined.
*checkwinS* | Time in front of the global maximum of the ARH-CF in which to search for a second local extremum.
*minfactorS* | Used with *checkwinS*. If a second local maximum is found, it has to be at least as big as the first maximum * *minfactorS*.
*tsignal* | Time window in seconds after the initial P pick used for determining signal amplitude.
*tnoise* | Time window in seconds in front of initial P pick used for determining noise amplitude.
*tsafetey* | Time in seconds between *tsignal* and *tnoise*.
*tslope* | Time window in seconds after initial P pick in which the slope of the onset is calculated.

## Precise P pick

Parameters used for determination of precise P pick.

Name | Description
--- | ---
*Precalcwin* | Time window in seconds for recalculation of the HOS-CF. The new CF will be two times the size of *Precalcwin*, since it will be calculated from the initial pick to +/- *Precalcwin*.
*tsmoothP* | Average of samples in this time window will be used for smoothing the second HOS-CF.
*ausP* | Controls artificial uplift of samples during precise picking. A common local minimum of the smoothed and unsmoothed HOS-CF is found when the previous sample is larger or equal to the current sample times (1+*ausP*).
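
The uplift condition for *ausP* can be written out as a small helper (a sketch of the stated condition only; the function name is illustrative and the real picker does more than this):

```python
def common_local_minimum(cf_raw, cf_smoothed, aus):
    """Return the first index where both CFs drop by a factor (1 + aus).

    A common local minimum is accepted when the previous sample is larger
    than or equal to the current sample times (1 + aus), in both the raw
    and the smoothed characteristic function.
    """
    for i in range(1, min(len(cf_raw), len(cf_smoothed))):
        if (cf_raw[i - 1] >= cf_raw[i] * (1.0 + aus)
                and cf_smoothed[i - 1] >= cf_smoothed[i] * (1.0 + aus)):
            return i
    return None
```

Raising *ausP* therefore demands a sharper drop in the CF before a sample is accepted as the onset.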

## Precise S pick

Parameters used for determination of precise S pick.

Name | Description
--- | ---
*tdet2h* | Time window for determination of AR coefficients.
*tpred2h* | Time window in which the waveform is predicted using the determined AR parameters.
*Srecalcwin* | Time window for recalculation of ARH-CF. New CF will be calculated from initial pick +/- *Srecalcwin*.
*tsmoothS* | Average of samples in this time window will be used for smoothing the second ARH-CF.
*ausS* | Controls artificial uplift of samples during precise picking. A common local minimum of the smoothed and unsmoothed ARH-CF is found when the previous sample is larger or equal to the current sample times (1+*ausS*).
*pickwinS* | Time window around initial pick in which to look for a precise pick.

## Pick quality control

Parameters used for checking quality and integrity of automatic picks.

Name | Description
--- | ---
*minAICPslope* | Initial P picks with a slope lower than this value will be discarded.
*minAICPSNR* | Initial P picks with a SNR below this value will be discarded.
*minAICSslope* | Initial S picks with a slope lower than this value will be discarded.
*minAICSSNR* | Initial S picks with a SNR below this value will be discarded.
*minsiglength*, *noisefactor*, *minpercent* | Parameters for checking signal length. In the time window of size *minsiglength* after the initial P pick, *minpercent* of samples have to be larger than the RMS value.
*zfac* | To recognize misattributed S picks, the RMS amplitudes of vertical and horizontal traces are compared. The RMS amplitude of the vertical traces has to be at least *zfac* times higher than the RMS amplitude on the horizontal traces for the pick to be accepted as a valid P pick.
*jackfactor* | A P pick is removed if the jackknife pseudo value of the variance of its subgroup is larger than the variance of all picks multiplied with the *jackfactor*.
*mdttolerance* | Maximum allowed deviation of P onset times from the median. Value in seconds.
*wdttolerance* | Maximum allowed deviation of S onset times from the line during the Wadati test. Value in seconds.
|
||||
|
||||
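The *jackfactor* check described above can be sketched as a leave-one-out (jackknife) variance test. This is a simplified illustration of the idea, not PyLoT's actual implementation; the function name and threshold handling are assumptions:

```python
import numpy as np

def jackknife_check(onsets, jackfactor=2.0):
    """Flag onsets whose jackknife pseudo value of the variance exceeds
    jackfactor times the variance of all onsets (True = keep pick)."""
    onsets = np.asarray(onsets, dtype=float)
    n = len(onsets)
    var_all = np.var(onsets)
    keep = []
    for i in range(n):
        # subgroup: all onsets except the i-th one
        subgroup = np.delete(onsets, i)
        # jackknife pseudo value of the variance estimator
        pseudo = n * var_all - (n - 1) * np.var(subgroup)
        keep.append(pseudo <= jackfactor * var_all)
    return keep
```

An outlying onset inflates the overall variance far more than the variance of any subgroup that excludes it, so its pseudo value stands out and the pick is dropped.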
## Pick quality determination

Parameters for discrete quality classes.
| Name | Description |
|---|---|
| *timeerrorsP* | Width of the time windows in seconds between earliest and latest possible pick which represent the quality classes 0, 1, 2, 3 for P onsets. |
| *timeerrorsS* | Width of the time windows in seconds between earliest and latest possible pick which represent the quality classes 0, 1, 2, 3 for S onsets. |
| *nfacP*, *nfacS* | For determination of the latest possible onset time. The time when the signal reaches an amplitude of *nfac* * mean value of the RMS amplitude in the time window *tnoise* corresponds to the latest possible onset time. |
Name | Description
--- | ---
*timeerrorsP* | Width of the time windows in seconds between earliest and latest possible pick which represent the quality classes 0, 1, 2, 3 for P onsets.
*timeerrorsS* | Width of the time windows in seconds between earliest and latest possible pick which represent the quality classes 0, 1, 2, 3 for S onsets.
*nfacP*, *nfacS* | For determination of the latest possible onset time. The time when the signal reaches an amplitude of *nfac* * mean value of the RMS amplitude in the time window *tnoise* corresponds to the latest possible onset time.
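The *timeerrorsP*/*timeerrorsS* bounds map a pick's time uncertainty (the window between earliest and latest possible pick) onto a discrete quality class. A minimal sketch of such a mapping; the helper name and the example bounds are illustrative assumptions, not PyLoT's actual code:

```python
def quality_class(uncertainty, timeerrors):
    """Map a pick's time uncertainty [s] to a discrete quality class:
    0 (best) through len(timeerrors) (worst, unusable)."""
    for cls, bound in enumerate(timeerrors):
        if uncertainty <= bound:
            return cls
    return len(timeerrors)

# example error bounds for P onsets (assumed values for illustration)
timeerrorsP = [0.01, 0.02, 0.04, 0.08]
```

With these bounds, an uncertainty of 0.03 s falls between 0.02 s and 0.04 s and is assigned class 2.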
@@ -30,7 +30,6 @@
<file>icons/openloc.png</file>
<file>icons/compare_button.png</file>
<file>icons/pick_qualities_button.png</file>
<file>icons/eventlist_xml_button.png</file>
<file>icons/locate_button.png</file>
<file>icons/Matlab_PILOT_icon.png</file>
<file>icons/printer.png</file>
Binary file not shown. (Before: 16 KiB)
Binary file not shown. (Before: 3.2 KiB)
icons_rc_2.py (9039 changes): file diff suppressed because it is too large.
icons_rc_3.py (218911 changes): file diff suppressed because it is too large.
@@ -4,8 +4,10 @@
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
#datapath# %data path
#eventID# %event ID for single event processing (* for all events found in datapath)
#rootpath# %project path
#datapath# %data path
#database# %name of data base
#eventID# %event ID for single event processing (* for all events found in database)
#invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
@@ -41,7 +43,6 @@ global #extent# %extent of a
1150.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
iasp91 #taup_model# %define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
P,Pdiff #taup_phases# %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
0.05 0.5 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
0.001 0.5 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
0.05 0.5 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
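The parameter files above follow a `value #key# %comment` line layout. A minimal sketch of how one such line could be parsed (the format is inferred from the snippets above; this is not PyLoT's actual parser):

```python
import re

def parse_param_line(line):
    """Parse one 'value(s) #key# %comment' line of a PyLoT-style
    parameter file. Returns (key, value, comment) or None."""
    match = re.match(
        r'\s*(?P<value>.*?)\s*#(?P<key>\w+)#\s*(?:%(?P<comment>.*))?',
        line)
    if not match:
        return None
    comment = (match.group('comment') or '').strip()
    return match.group('key'), match.group('value'), comment
```

The lazy `value` group lets multi-token values like `0.05 0.5` pass through intact, while an absent value (as in the template file) simply yields an empty string.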
@@ -4,8 +4,10 @@
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
/DATA/Insheim/EVENT_DATA/LOCAL/2018.02_Insheim #datapath# %data path
e0006.038.18 #eventID# %event ID for single event processing (* for all events found in datapath)
/DATA/Insheim #rootpath# %project path
EVENT_DATA/LOCAL #datapath# %data path
2018.02_Insheim #database# %name of data base
e0006.038.18 #eventID# %event ID for single event processing (* for all events found in database)
/DATA/Insheim/STAT_INFO #invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
@@ -40,8 +42,7 @@ local #extent# %extent of a
-0.5 #sstart# %start time [s] relative to P-onset for calculating CF for S-picking
10.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
False #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
iasp91 #taup_model# %define TauPy model for traveltime estimation
P #taup_phases# %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
iasp91 #taup_model# %define TauPy model for traveltime estimation
2.0 20.0 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
2.0 30.0 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
2.0 10.0 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
@@ -4,8 +4,10 @@
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
#datapath# %data path
#eventID# %event ID for single event processing (* for all events found in datapath)
#rootpath# %project path
#datapath# %data path
#database# %name of data base
#eventID# %event ID for single event processing (* for all events found in database)
#invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
@@ -40,8 +42,7 @@ local #extent# %extent of a
-1.0 #sstart# %start time [s] relative to P-onset for calculating CF for S-picking
10.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
iasp91 #taup_model# %define TauPy model for traveltime estimation
P #taup_phases# %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
iasp91 #taup_model# %define TauPy model for traveltime estimation
2.0 10.0 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
2.0 12.0 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
2.0 8.0 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
pylot.yml (12 changes)
@@ -1,12 +0,0 @@
name: pylot_311
channels:
  - conda-forge
  - defaults
dependencies:
  - cartopy=0.23.0=py311hcf9f919_1
  - joblib=1.4.2=pyhd8ed1ab_0
  - obspy=1.4.1=py311he736701_3
  - pyaml=24.7.0=pyhd8ed1ab_0
  - pyqtgraph=0.13.7=pyhd8ed1ab_0
  - pyside2=5.15.8=py311h3d699ce_4
  - pytest=8.3.2=pyhd8ed1ab_0
@@ -9,7 +9,7 @@ PyLoT - the Python picking and Localization Tool

This python library contains a graphical user interfaces for picking
seismic phases. This software needs ObsPy (http://github.com/obspy/obspy/wiki)
and the Qt libraries to be installed first.
and the Qt4 libraries to be installed first.

PILOT has been developed in Mathworks' MatLab. In order to distribute
PILOT without facing portability problems, it has been decided to re-
@@ -144,10 +144,6 @@ class Magnitude(object):
azimuthal_gap=self.origin_id.get_referred_object().quality.azimuthal_gap)
else:
# no scaling necessary
# Temporary fix needs rework
if (len(self.magnitudes.keys()) == 0):
print("Error in local magnitude calculation ")
return None
mag = ope.Magnitude(
mag=np.median([M.mag for M in self.magnitudes.values()]),
magnitude_type=self.type,
@@ -225,16 +221,11 @@ class LocalMagnitude(Magnitude):

power = [np.power(tr.data, 2) for tr in st if tr.stats.channel[-1] not
in 'Z3']
# checking horizontal count and calculating power_sum accordingly
if len(power) == 1:
print('WARNING: Only one horizontal found for station {0}.'.format(st[0].stats.station))
power_sum = power[0]
elif len(power) == 2:
power_sum = power[0] + power[1]
else:
raise ValueError('Wood-Anderson aomplitude defintion only valid for'
' up to two horizontals: {0} given'.format(len(power)))

if len(power) != 2:
raise ValueError('Wood-Anderson amplitude defintion only valid for '
'two horizontals: {0} given'.format(len(power)))
power_sum = power[0] + power[1]
#
sqH = np.sqrt(power_sum)

# get time array
@@ -286,13 +277,12 @@ class LocalMagnitude(Magnitude):
for a in self.arrivals:
if a.phase not in 'sS':
continue
pick = a.pick_id.get_referred_object()
station = pick.waveform_id.station_code
# make sure calculating Ml only from reliable onsets
# NLLoc: time_weight = 0 => do not use onset!
if a.time_weight == 0:
print("Uncertain pick at Station {}, do not use it!".format(station))
continue
pick = a.pick_id.get_referred_object()
station = pick.waveform_id.station_code
wf = select_for_phase(self.stream.select(
station=station), a.phase)
if not wf:
@@ -329,7 +319,7 @@ class LocalMagnitude(Magnitude):
if self.verbose:
print(
"Local Magnitude for station {0}: ML = {1:3.1f}".format(
station, magnitude.mag))
station, magnitude.mag))
magnitude.origin_id = self.origin_id
magnitude.waveform_id = pick.waveform_id
magnitude.amplitude_id = amplitude.resource_id
@@ -404,32 +394,24 @@ class MomentMagnitude(Magnitude):
print("WARNING: No instrument corrected data available,"
" no magnitude calculation possible! Go on.")
continue
wf = self.stream.select(station=station)
scopy = self.stream.copy()
wf = scopy.select(station=station)
if not wf:
continue
try:
scopy = wf.copy()
except AssertionError:
print("WARNING: Something's wrong with the data,"
"station {},"
"no calculation of moment magnitude possible! Go on.".format(station))
continue
onset = pick.time
distance = degrees2kilometers(a.distance)
azimuth = a.azimuth
incidence = a.takeoff_angle
if not 0. <= incidence <= 360.:
if self.verbose:
print(f'WARNING: Incidence angle outside bounds - {incidence}')
return
w0, fc = calcsourcespec(scopy, onset, self.p_velocity, distance,
w0, fc = calcsourcespec(wf, onset, self.p_velocity, distance,
azimuth, incidence, self.p_attenuation,
self.plot_flag, self.verbose)
if w0 is None or fc is None:
if self.verbose:
print("WARNING: insufficient frequency information")
continue
WF = select_for_phase(scopy, "P")
WF = select_for_phase(self.stream.select(
station=station), a.phase)
WF = select_for_phase(WF, "P")
m0, mw = calcMoMw(WF, w0, self.rock_density, self.p_velocity,
distance, self.verbose)
self.moment_props = (station, dict(w0=w0, fc=fc, Mo=m0))
@@ -440,40 +422,6 @@ class MomentMagnitude(Magnitude):
self.event.station_magnitudes.append(magnitude)
self.magnitudes = (station, magnitude)

# WIP JG
def getSourceSpec(self):
for a in self.arrivals:
if a.phase not in 'pP':
continue
# make sure calculating Mo only from reliable onsets
# NLLoc: time_weight = 0 => do not use onset!
if a.time_weight == 0:
continue
pick = a.pick_id.get_referred_object()
station = pick.waveform_id.station_code
if len(self.stream) <= 2:
print("Station:" '{0}'.format(station))
print("WARNING: No instrument corrected data available,"
" no magnitude calculation possible! Go on.")
continue
wf = self.stream.select(station=station)
if not wf:
continue
try:
scopy = wf.copy()
except AssertionError:
print("WARNING: Something's wrong with the data,"
"station {},"
"no calculation of moment magnitude possible! Go on.".format(station))
continue
onset = pick.time
distance = degrees2kilometers(a.distance)
azimuth = a.azimuth
incidence = a.takeoff_angle
w0, fc, plt = calcsourcespec(scopy, onset, self.p_velocity, distance,
azimuth, incidence, self.p_attenuation,
3, self.verbose)
return w0, fc, plt

def calcMoMw(wfstream, w0, rho, vp, delta, verbosity=False):
'''
@@ -506,7 +454,7 @@ def calcMoMw(wfstream, w0, rho, vp, delta, verbosity=False):

# additional common parameters for calculating Mo
# average radiation pattern of P waves (Aki & Richards, 1980)
rP = 2 / np.sqrt(15)
rP = 2 / np.sqrt(15)
freesurf = 2.0  # free surface correction, assuming vertical incidence

Mo = w0 * 4 * np.pi * rho * np.power(vp, 3) * delta / (rP * freesurf)
@@ -566,7 +514,7 @@ def calcsourcespec(wfstream, onset, vp, delta, azimuth, incidence,
iplot = 2
else:
iplot = 0

# get Q value
Q, A = qp

@@ -636,15 +584,15 @@ def calcsourcespec(wfstream, onset, vp, delta, azimuth, incidence,

# fft
fny = freq / 2
# l = len(xdat) / freq
#l = len(xdat) / freq
# number of fft bins after Bath
# n = freq * l
#n = freq * l
# find next power of 2 of data length
m = pow(2, np.ceil(np.log(len(xdat)) / np.log(2)))
N = min(int(np.power(m, 2)), 16384)
# N = int(np.power(m, 2))
#N = int(np.power(m, 2))
y = dt * np.fft.fft(xdat, N)
Y = abs(y[: int(N / 2)])
Y = abs(y[: N / 2])
L = (N - 1) / freq
f = np.arange(0, fny, 1 / L)

@@ -685,8 +633,8 @@ def calcsourcespec(wfstream, onset, vp, delta, azimuth, incidence,
w0 = np.median([w01, w02])
Fc = np.median([fc1, fc2])
if verbosity:
print("calcsourcespec: Using w0-value = %e m/Hz and fc = %f Hz" % (
w0, Fc))
print("calcsourcespec: Using w0-value = %e m/Hz and fc = %f Hz" % (
w0, Fc))
if iplot >= 1:
f1 = plt.figure()
tLdat = np.arange(0, len(Ldat) * dt, dt)
@@ -721,8 +669,6 @@ def calcsourcespec(wfstream, onset, vp, delta, azimuth, incidence,
plt.xlabel('Frequency [Hz]')
plt.ylabel('Amplitude [m/Hz]')
plt.grid()
if iplot == 3:
return w0, Fc, plt
plt.show()
try:
input()
@@ -777,7 +723,7 @@ def fitSourceModel(f, S, fc0, iplot, verbosity=False):
iplot = 2
else:
iplot = 0

w0 = []
stdw0 = []
fc = []
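The seismic moment relation visible in `calcMoMw` above (`Mo = w0 * 4 * np.pi * rho * np.power(vp, 3) * delta / (rP * freesurf)`) can be sketched together with the standard conversion to moment magnitude. The helper below is an illustration under stated assumptions: the Mw step uses the Hanks & Kanamori (1979) relation, which may differ in form from the constants PyLoT actually uses:

```python
import numpy as np

def calc_mo_mw(w0, rho, vp, delta, rP=2 / np.sqrt(15), freesurf=2.0):
    """Seismic moment M0 from the low-frequency spectral plateau w0 [m/Hz],
    density rho [kg/m^3], P velocity vp [m/s], and hypocentral distance
    delta [m]; rP is the average P radiation pattern (Aki & Richards, 1980)
    and freesurf the free-surface correction. Mw via Hanks & Kanamori."""
    m0 = w0 * 4 * np.pi * rho * vp ** 3 * delta / (rP * freesurf)
    mw = 2.0 / 3.0 * (np.log10(m0) - 9.1)  # M0 in N*m
    return m0, mw
```

Note that doubling the plateau `w0` doubles `M0` and therefore raises `Mw` by exactly `2/3 * log10(2)`, about 0.2 magnitude units.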
@@ -2,26 +2,29 @@
# -*- coding: utf-8 -*-

import copy
import logging
import os

from PySide2.QtWidgets import QMessageBox
from obspy import read_events
from obspy.core import read, Stream, UTCDateTime
from obspy.core.event import Event as ObsPyEvent
from obspy.io.sac import SacIOError

from PySide2.QtWidgets import QMessageBox

import pylot.core.loc.velest as velest
import pylot.core.loc.focmec as focmec
import pylot.core.loc.hypodd as hypodd
import pylot.core.loc.velest as velest
from pylot.core.io.phases import readPILOTEvent, picks_from_picksdict, \
picksdict_from_pilot, merge_picks, PylotParameter
from pylot.core.util.errors import FormatError, OverwriteError
from pylot.core.util.event import Event
from pylot.core.util.obspyDMT_interface import qml_from_obspyDMT
from pylot.core.util.utils import fnConstructor, full_range, check4rotated, \
check_for_gaps_and_merge, trim_station_components, check_for_nan
check4gapsAndMerge, trim_station_components

try:
str_TypeLst = [str, unicode]  # if python 2.X
except NameError:
str_TypeLst = [str]  # if python 3.*

class Data(object):
"""
@@ -36,17 +39,8 @@ class Data(object):
loaded event. Container object holding, e.g. phase arrivals, etc.
"""

def __init__(self, parent=None, evtdata=None, picking_parameter=None):
def __init__(self, parent=None, evtdata=None):
self._parent = parent

if not picking_parameter:
if hasattr(parent, '_inputs'):
picking_parameter = parent._inputs
else:
logging.warning('No picking parameters found! Using default input parameters!!!')
picking_parameter = PylotParameter()
self.picking_parameter = picking_parameter

if self.getParent():
self.comp = parent.getComponent()
else:
@@ -58,10 +52,10 @@ class Data(object):
elif isinstance(evtdata, dict):
evt = readPILOTEvent(**evtdata)
evtdata = evt
elif isinstance(evtdata, str):
elif type(evtdata) in str_TypeLst:
try:
cat = read_events(evtdata)
if len(cat) != 1:
if len(cat) is not 1:
raise ValueError('ambiguous event information for file: '
'{file}'.format(file=evtdata))
evtdata = cat[0]
@@ -74,7 +68,7 @@ class Data(object):
elif 'LOC' in evtdata:
raise NotImplementedError('PILOT location information '
'read support not yet '
'implemented.')
'implemeted.')
elif 'event.pkl' in evtdata:
evtdata = qml_from_obspyDMT(evtdata)
else:
@@ -110,7 +104,7 @@ class Data(object):
old_pick.phase_hint == new_pick.phase_hint,
old_pick.method_id == new_pick.method_id]
if all(comparison):
del (old_pick)
del(old_pick)
old_picks.append(new_pick)
elif not other.isNew() and self.isNew():
new = other + self
@@ -122,7 +116,7 @@ class Data(object):
return self + other
else:
raise ValueError("both Data objects have differing "
"unique Event identifiers")
"unique Event identifiers")
return self

def getPicksStr(self):
@@ -173,8 +167,9 @@ class Data(object):

def checkEvent(self, event, fcheck, forceOverwrite=False):
"""
Check information in supplied event and own event and replace with own
information if no other information are given or forced by forceOverwrite
Check information in supplied event and own event and replace own
information with supplied information if own information not exiisting
or forced by forceOverwrite
:param event: Event that supplies information for comparison
:type event: pylot.core.util.event.Event
:param fcheck: check and delete existing information
@@ -230,7 +225,7 @@ class Data(object):

def replacePicks(self, event, picktype):
"""
Replace picks in event with own picks
Replace own picks with the one in event
:param event: Event that supplies information for comparison
:type event: pylot.core.util.event.Event
:param picktype: 'auto' or 'manual' picks
@@ -238,29 +233,26 @@ class Data(object):
:return:
:rtype: None
"""
checkflag = 1
checkflag = 0
picks = event.picks
# remove existing picks
for j, pick in reversed(list(enumerate(picks))):
try:
if picktype in str(pick.method_id.id):
picks.pop(j)
checkflag = 2
checkflag = 1
except AttributeError as e:
msg = '{}'.format(e)
print(e)
checkflag = 0
if checkflag > 0:
if checkflag == 1:
print("Write new %s picks to catalog." % picktype)
if checkflag == 2:
print("Found %s pick(s), remove them and append new picks to catalog." % picktype)
if checkflag:
print("Found %s pick(s), remove them and append new picks to catalog." % picktype)

# append new picks
for pick in self.get_evt_data().picks:
if picktype in str(pick.method_id.id):
picks.append(pick)

def exportEvent(self, fnout, fnext='.xml', fcheck='auto', upperErrors=None):
"""
Export event to file
@@ -270,6 +262,7 @@ class Data(object):
can be a str or a list of strings of ['manual', 'auto', 'origin', 'magnitude']
"""
from pylot.core.util.defaults import OUTPUTFORMATS

if not type(fcheck) == list:
fcheck = [fcheck]

@@ -281,11 +274,8 @@ class Data(object):
raise FormatError(errmsg)

if hasattr(self.get_evt_data(), 'notes'):
try:
with open(os.path.join(os.path.dirname(fnout), 'notes.txt'), 'w') as notes_file:
notes_file.write(self.get_evt_data().notes)
except Exception as e:
print('Warning: Could not save notes.txt: ', str(e))
with open(os.path.join(os.path.dirname(fnout), 'notes.txt'), 'w') as notes_file:
notes_file.write(self.get_evt_data().notes)

# check for already existing xml-file
if fnext == '.xml':
@@ -302,9 +292,7 @@ class Data(object):
return
self.checkEvent(event, fcheck)
self.setEvtData(event)

self.get_evt_data().write(fnout + fnext, format=evtformat)

# try exporting event
else:
evtdata_org = self.get_evt_data()
@@ -327,64 +315,38 @@ class Data(object):
del picks_copy[k]
break
lendiff = len(picks) - len(picks_copy)
if lendiff != 0:
if lendiff is not 0:
print("Manual as well as automatic picks available. Prefered the {} manual ones!".format(lendiff))

no_uncertainties_p = []
no_uncertainties_s = []
if upperErrors:
# check for pick uncertainties exceeding adjusted upper errors
# Picks with larger uncertainties will not be saved in output file!
for j in range(len(picks)):
for i in range(len(picks_copy)):
if picks_copy[i].phase_hint[0] == 'P':
# Skipping pick if no upper_uncertainty is found and warning user
if picks_copy[i].time_errors['upper_uncertainty'] is None:
#print("{1} P-Pick of station {0} does not have upper_uncertainty and cant be checked".format(
# picks_copy[i].waveform_id.station_code,
# picks_copy[i].method_id))
if not picks_copy[i].waveform_id.station_code in no_uncertainties_p:
no_uncertainties_p.append(picks_copy[i].waveform_id.station_code)
continue

#print ("checking for upper_uncertainty")
if (picks_copy[i].time_errors['uncertainty'] is None) or \
(picks_copy[i].time_errors['upper_uncertainty'] >= upperErrors[0]):
if (picks_copy[i].time_errors['upper_uncertainty'] >= upperErrors[0]) or \
(picks_copy[i].time_errors['uncertainty'] is None):
print("Uncertainty exceeds or equal adjusted upper time error!")
print("Adjusted uncertainty: {}".format(upperErrors[0]))
print("Pick uncertainty: {}".format(picks_copy[i].time_errors['uncertainty']))
print("{1} P-Pick of station {0} will not be saved in outputfile".format(
picks_copy[i].waveform_id.station_code,
picks_copy[i].method_id))
print("#")
del picks_copy[i]
break
if picks_copy[i].phase_hint[0] == 'S':

# Skipping pick if no upper_uncertainty is found and warning user
if picks_copy[i].time_errors['upper_uncertainty'] is None:
#print("{1} S-Pick of station {0} does not have upper_uncertainty and cant be checked".format(
#picks_copy[i].waveform_id.station_code,
#picks_copy[i].method_id))
if not picks_copy[i].waveform_id.station_code in no_uncertainties_s:
no_uncertainties_s.append(picks_copy[i].waveform_id.station_code)
continue

if (picks_copy[i].time_errors['uncertainty'] is None) or \
(picks_copy[i].time_errors['upper_uncertainty'] >= upperErrors[1]):
if (picks_copy[i].time_errors['upper_uncertainty'] >= upperErrors[1]) or \
(picks_copy[i].time_errors['uncertainty'] is None):
print("Uncertainty exceeds or equal adjusted upper time error!")
print("Adjusted uncertainty: {}".format(upperErrors[1]))
print("Pick uncertainty: {}".format(picks_copy[i].time_errors['uncertainty']))
print("{1} S-Pick of station {0} will not be saved in outputfile".format(
picks_copy[i].waveform_id.station_code,
picks_copy[i].method_id))
print("#")
del picks_copy[i]
break
for s in no_uncertainties_p:
print("P-Pick of station {0} does not have upper_uncertainty and cant be checked".format(s))
for s in no_uncertainties_s:
print("S-Pick of station {0} does not have upper_uncertainty and cant be checked".format(s))

if fnext == '.obs':
try:
@@ -394,14 +356,6 @@ class Data(object):
header = '# EQEVENT: Label: EQ%s Loc: X 0.00 Y 0.00 Z 10.00 OT 0.00 \n' % evid
nllocfile = open(fnout + fnext)
l = nllocfile.readlines()
# Adding A0/Generic Amplitude to .obs file
# l2 = []
# for li in l:
# for amp in evtdata_org.amplitudes:
# if amp.waveform_id.station_code == li[0:5].strip():
# li = li[0:64] + '{:0.2e}'.format(amp.generic_amplitude) + li[73:-1] + '\n'
# l2.append(li)
# l = l2
nllocfile.close()
l.insert(0, header)
nllocfile = open(fnout + fnext, 'w')
@@ -412,19 +366,25 @@ class Data(object):
not implemented: {1}'''.format(evtformat, e))
if fnext == '.cnv':
try:
velest.export(picks_copy, fnout + fnext, self.picking_parameter, eventinfo=self.get_evt_data())
velest.export(picks_copy, fnout + fnext, eventinfo=self.get_evt_data())
except KeyError as e:
raise KeyError('''{0} export format
not implemented: {1}'''.format(evtformat, e))
if fnext == '_focmec.in':
try:
focmec.export(picks_copy, fnout + fnext, self.picking_parameter, eventinfo=self.get_evt_data())
infile = os.path.join(os.path.expanduser('~'), '.pylot', 'pylot.in')
print('Using default input file {}'.format(infile))
parameter = PylotParameter(infile)
focmec.export(picks_copy, fnout + fnext, parameter, eventinfo=self.get_evt_data())
except KeyError as e:
raise KeyError('''{0} export format
not implemented: {1}'''.format(evtformat, e))
if fnext == '.pha':
try:
hypodd.export(picks_copy, fnout + fnext, self.picking_parameter, eventinfo=self.get_evt_data())
infile = os.path.join(os.path.expanduser('~'), '.pylot', 'pylot.in')
print('Using default input file {}'.format(infile))
parameter = PylotParameter(infile)
hypodd.export(picks_copy, fnout + fnext, parameter, eventinfo=self.get_evt_data())
except KeyError as e:
raise KeyError('''{0} export format
not implemented: {1}'''.format(evtformat, e))
@@ -455,30 +415,20 @@ class Data(object):
data.filter(**kwargs)
self.dirty = True

def setWFData(self, fnames, fnames_alt=None, checkRotated=False, metadata=None, tstart=0, tstop=0):
def setWFData(self, fnames, fnames_syn=None, checkRotated=False, metadata=None, tstart=0, tstop=0):
"""
Clear current waveform data and set given waveform data
:param fnames: waveform data names to append
:param fnames_alt: alternative data to show (e.g. synthetic/processed)
:type fnames: list
"""
def check_fname_exists(filenames: list) -> list:
if filenames:
filenames = [fn for fn in filenames if os.path.isfile(fn)]
return filenames

self.wfdata = Stream()
self.wforiginal = None
self.wf_alt = Stream()
self.wfsyn = Stream()
if tstart == tstop:
tstart = tstop = None
self.tstart = tstart
self.tstop = tstop

# remove directories
fnames = check_fname_exists(fnames)
fnames_alt = check_fname_exists(fnames_alt)

# if obspy_dmt:
# wfdir = 'raw'
# self.processed = False
@@ -496,8 +446,8 @@ class Data(object):
# wffnames = fnames
if fnames is not None:
self.appendWFData(fnames)
if fnames_alt is not None:
self.appendWFData(fnames_alt, alternative=True)
if fnames_syn is not None:
self.appendWFData(fnames_syn, synthetic=True)
else:
return False

@@ -505,9 +455,7 @@ class Data(object):
# remove possible underscores in station names
# self.wfdata = remove_underscores(self.wfdata)
# check for gaps and merge
self.wfdata, _ = check_for_gaps_and_merge(self.wfdata)
# check for nans
check_for_nan(self.wfdata)
self.wfdata = check4gapsAndMerge(self.wfdata)
# check for stations with rotated components
if checkRotated and metadata is not None:
self.wfdata = check4rotated(self.wfdata, metadata, verbosity=0)
@@ -519,7 +467,7 @@ class Data(object):
self.dirty = False
return True

def appendWFData(self, fnames, alternative=False):
def appendWFData(self, fnames, synthetic=False):
"""
Read waveform data from fnames and append it to current wf data
:param fnames: waveform data to append
@@ -532,20 +480,19 @@ class Data(object):
if self.dirty:
self.resetWFData()

orig_or_alternative_data = {True: self.wf_alt,
False: self.wfdata}
real_or_syn_data = {True: self.wfsyn,
False: self.wfdata}

warnmsg = ''
for fname in set(fnames):
try:
orig_or_alternative_data[alternative] += read(fname, starttime=self.tstart, endtime=self.tstop)
real_or_syn_data[synthetic] += read(fname, starttime=self.tstart, endtime=self.tstop)
except TypeError:
try:
orig_or_alternative_data[alternative] += read(fname, format='GSE2', starttime=self.tstart, endtime=self.tstop)
real_or_syn_data[synthetic] += read(fname, format='GSE2', starttime=self.tstart, endtime=self.tstop)
except Exception as e:
try:
orig_or_alternative_data[alternative] += read(fname, format='SEGY', starttime=self.tstart,
endtime=self.tstop)
real_or_syn_data[synthetic] += read(fname, format='SEGY', starttime=self.tstart, endtime=self.tstop)
except Exception as e:
warnmsg += '{0}\n{1}\n'.format(fname, e)
except SacIOError as se:
@@ -560,8 +507,8 @@ class Data(object):
def getOriginalWFData(self):
return self.wforiginal

def getAltWFdata(self):
return self.wf_alt
def getSynWFData(self):
return self.wfsyn

def resetWFData(self):
"""
@@ -585,7 +532,7 @@ class Data(object):
def setEvtData(self, event):
self.evtdata = event

def applyEVTData(self, data, typ='pick'):
def applyEVTData(self, data, typ='pick', authority_id='rub'):
"""
Either takes an `obspy.core.event.Event` object and applies all new
information on the event to the actual data if typ is 'event or
@@ -836,8 +783,8 @@ class PilotDataStructure(GenericDataStructure):

def __init__(self, **fields):
if not fields:
fields = {'database': '',
'root': ''}
fields = {'database': '2006.01',
'root': '/data/Egelados/EVENT_DATA/LOCAL'}

GenericDataStructure.__init__(self, **fields)
|
||||
|
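The `appendWFData` hunk above routes traces into either the real or the synthetic stream by indexing a dict with a boolean flag instead of branching with `if`/`else`. A minimal standalone sketch of that dispatch pattern (plain lists stand in for obspy `Stream` objects; `WaveformContainer` is a hypothetical stand-in, not the PyLoT `Data` class):

```python
class WaveformContainer:
    # Hypothetical stand-in for PyLoT's Data class; lists replace obspy Streams.
    def __init__(self):
        self.wfdata = []  # real waveform data
        self.wfsyn = []   # synthetic waveform data

    def append(self, trace, synthetic=False):
        # Boolean-keyed dict dispatch, as in appendWFData above:
        # indexing with the flag selects the target container.
        real_or_syn_data = {True: self.wfsyn, False: self.wfdata}
        real_or_syn_data[synthetic].append(trace)

wc = WaveformContainer()
wc.append('trace-1')                  # goes to the real-data stream
wc.append('trace-2', synthetic=True)  # goes to the synthetic stream
print(len(wc.wfdata), len(wc.wfsyn))
```

The dict is rebuilt on every call, which is cheap here and keeps both targets visible at the call site.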
@@ -6,14 +6,24 @@ import numpy as np
Default parameters used for picking
"""

defaults = {'datapath': {'type': str,
                         'tooltip': 'path to eventfolders',
defaults = {'rootpath': {'type': str,
                         'tooltip': 'project path',
                         'value': '',
                         'namestring': 'Root path'},

            'datapath': {'type': str,
                         'tooltip': 'data path',
                         'value': '',
                         'namestring': 'Data path'},

            'database': {'type': str,
                         'tooltip': 'name of data base',
                         'value': '',
                         'namestring': 'Database path'},

            'eventID': {'type': str,
                        'tooltip': 'event ID for single event processing (* for all events found in datapath)',
                        'value': '*',
                        'tooltip': 'event ID for single event processing (* for all events found in database)',
                        'value': '',
                        'namestring': 'Event ID'},

            'extent': {'type': str,
@@ -505,14 +515,16 @@ defaults = {'datapath': {'type': str,
                       'namestring': 'TauPy model'},

            'taup_phases': {'type': str,
                            'tooltip': 'Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.',
                            'value': 'ttall',
                            'namestring': 'TauPy phases'},
                            'tooltip': 'Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.',
                            'value': 'ttall',
                            'namestring': 'TauPy phases'},
            }

settings_main = {
    'dirs': [
        'rootpath',
        'datapath',
        'database',
        'eventID',
        'invdir',
        'datastructure',
@@ -1,84 +0,0 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-

"""
Script to get event parameters from PyLoT-xml file to write
them into eventlist.
LK, igem, 03/2021
Edited for use in PyLoT
JG, igem, 01/2022
"""

import os
import argparse
import numpy as np
import matplotlib.pyplot as plt
import glob

from obspy.core.event import read_events
from pyproj import Proj

"""
Creates an eventlist file summarizing all events found in a certain folder. Only called by pressing UI Button eventlis_xml_action

:rtype:
:param path: Path to root folder where single Event folder are to found
"""


def geteventlistfromxml(path, outpath):
    p = Proj(proj='utm', zone=32, ellps='WGS84')

    # open eventlist file and write header
    evlist = outpath + '/eventlist'
    evlistobj = open(evlist, 'w')
    evlistobj.write(
        'EventID Date To Lat Lon EAST NORTH Dep Ml NoP NoS RMS errH errZ Gap \n')

    # data path
    dp = path + "/e*/*.xml"
    # list of all available xml-files
    xmlnames = glob.glob(dp)

    # read all onset weights
    for names in xmlnames:
        print("Getting location parameters from {}".format(names))
        cat = read_events(names)
        try:
            st = cat.events[0].origins[0].time
            Lat = cat.events[0].origins[0].latitude
            Lon = cat.events[0].origins[0].longitude
            EAST, NORTH = p(Lon, Lat)
            Dep = cat.events[0].origins[0].depth / 1000
            Ml = cat.events[0].magnitudes[1].mag
            NoP = []
            NoS = []
        except IndexError:
            print('Insufficient data found for event (not localised): ' + names.split('/')[-1].split('_')[-1][
                :-4] + ' Skipping event for eventlist.')
            continue

        for i in range(len(cat.events[0].origins[0].arrivals)):
            if cat.events[0].origins[0].arrivals[i].phase == 'P':
                NoP.append(cat.events[0].origins[0].arrivals[i].phase)
            elif cat.events[0].origins[0].arrivals[i].phase == 'S':
                NoS.append(cat.events[0].origins[0].arrivals[i].phase)
        # NoP = cat.events[0].origins[0].quality.used_station_count
        errH = cat.events[0].origins[0].origin_uncertainty.max_horizontal_uncertainty
        errZ = cat.events[0].origins[0].depth_errors.uncertainty
        Gap = cat.events[0].origins[0].quality.azimuthal_gap
        # evID = names.split('/')[6]
        evID = names.split('/')[-1].split('_')[-1][:-4]
        Date = str(st.year) + str('%02d' % st.month) + str('%02d' % st.day)
        To = str('%02d' % st.hour) + str('%02d' % st.minute) + str('%02d' % st.second) + \
             '.' + str('%06d' % st.microsecond)

        # write into eventlist
        evlistobj.write('%s %s %s %9.6f %9.6f %13.6f %13.6f %8.6f %3.1f %d %d NaN %d %d %d\n' % (evID, \
                        Date, To, Lat, Lon,
                        EAST, NORTH, Dep, Ml,
                        len(NoP), len(NoS),
                        errH, errZ, Gap))
        print('Adding Event ' + names.split('/')[-1].split('_')[-1][:-4] + ' to eventlist')
    print('Eventlist created and saved in: ' + outpath)
    evlistobj.close()
@@ -1,7 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import logging
import os

from pylot.core.io import default_parameters
from pylot.core.util.errors import ParameterError
@@ -53,16 +51,10 @@ class PylotParameter(object):
        self.__parameter = {}
        self._verbosity = verbosity
        self._parFileCont = {}

        # io from parsed arguments alternatively
        for key, val in kwargs.items():
            self._parFileCont[key] = val
        self.from_file()

        # if no filename or kwargs given, use default values
        if not fnin and not kwargs:
            self.reset_defaults()

        if fnout:
            self.export2File(fnout)

@@ -96,10 +88,10 @@ class PylotParameter(object):
        return bool(self.__parameter)

    def __getitem__(self, key):
        if key in self.__parameter:
        try:
            return self.__parameter[key]
        else:
            logging.warning(f'{key} not found in PylotParameter')
        except:
            return None

    def __setitem__(self, key, value):
        try:
@@ -426,28 +418,6 @@ class PylotParameter(object):
            line = value + name + ttip
            fid.write(line)

    @staticmethod
    def check_deprecated_parameters(parameters):
        if parameters.hasParam('database') and parameters.hasParam('rootpath'):
            parameters['datapath'] = os.path.join(parameters['rootpath'], parameters['datapath'],
                                                  parameters['database'])
            logging.warning(
                f'Parameters database and rootpath are deprecated. '
                f'Tried to merge them to now path: {parameters["datapath"]}.'
            )

        remove_keys = []
        for key in parameters:
            if not key in default_parameters.defaults.keys():
                remove_keys.append(key)
                logging.warning(f'Removing deprecated parameter: {key}')

        for key in remove_keys:
            del parameters[key]

        parameters._settings_main = default_parameters.settings_main
        parameters._settings_special_pick = default_parameters.settings_special_pick


class FilterOptions(object):
    '''

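The `check_deprecated_parameters` hunk above folds the deprecated `rootpath` and `database` parameters into a single `datapath`. A hedged sketch of that migration with a plain dict standing in for the `PylotParameter` object (`merge_deprecated` is an illustrative name, not a PyLoT function):

```python
import os
import logging

def merge_deprecated(parameters):
    # Sketch of the migration in check_deprecated_parameters above:
    # rootpath + datapath + database collapse into one datapath.
    if 'database' in parameters and 'rootpath' in parameters:
        parameters['datapath'] = os.path.join(parameters['rootpath'],
                                              parameters.get('datapath', ''),
                                              parameters['database'])
        logging.warning('Parameters database and rootpath are deprecated; '
                        'merged to datapath: %s', parameters['datapath'])
        # drop the deprecated keys, mirroring the remove_keys loop above
        del parameters['rootpath'], parameters['database']
    return parameters

params = merge_deprecated({'rootpath': '/data',
                           'datapath': 'events',
                           'database': '2006.01'})
print(params)
```

Projects saved with the old three-part layout thus keep loading, while only `datapath` survives in the cleaned parameter set.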
@@ -1,7 +1,7 @@
from obspy import UTCDateTime
from obspy.core import event as ope

from pylot.core.util.utils import get_login, get_hash
from pylot.core.util.utils import getLogin, getHash


def create_amplitude(pickID, amp, unit, category, cinfo):
@@ -61,7 +61,7 @@ def create_creation_info(agency_id=None, creation_time=None, author=None):
    :return:
    '''
    if author is None:
        author = get_login()
        author = getLogin()
    if creation_time is None:
        creation_time = UTCDateTime()
    return ope.CreationInfo(agency_id=agency_id, author=author,
@@ -210,7 +210,7 @@ def create_resourceID(timetohash, restype, authority_id=None, hrstr=None):
    '''
    assert isinstance(timetohash, UTCDateTime), "'timetohash' is not an ObsPy" \
                                                "UTCDateTime object"
    hid = get_hash(timetohash)
    hid = getHash(timetohash)
    if hrstr is None:
        resID = ope.ResourceIdentifier(restype + '/' + hid[0:6])
    else:

@@ -1,14 +1,13 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import glob
import logging
import os
import warnings

import glob
import matplotlib.pyplot as plt
import numpy as np
import obspy.core.event as ope
import os
import scipy.io as sio
import warnings
from obspy.core import UTCDateTime
from obspy.core.event import read_events
from obspy.core.util import AttribDict
@@ -17,8 +16,8 @@ from pylot.core.io.inputs import PylotParameter
from pylot.core.io.location import create_event, \
    create_magnitude
from pylot.core.pick.utils import select_for_phase, get_quality_class
from pylot.core.util.utils import get_owner, full_range, four_digits, transformFilterString4Export, \
    backtransformFilterString, loopIdentifyPhase, identifyPhase
from pylot.core.util.utils import getOwner, full_range, four_digits, transformFilterString4Export, \
    backtransformFilterString


def add_amplitudes(event, amplitudes):
@@ -59,7 +58,7 @@ def readPILOTEvent(phasfn=None, locfn=None, authority_id='RUB', **kwargs):
    if phasfn is not None and os.path.isfile(phasfn):
        phases = sio.loadmat(phasfn)
        phasctime = UTCDateTime(os.path.getmtime(phasfn))
        phasauthor = get_owner(phasfn)
        phasauthor = getOwner(phasfn)
    else:
        phases = None
        phasctime = None
@@ -67,7 +66,7 @@ def readPILOTEvent(phasfn=None, locfn=None, authority_id='RUB', **kwargs):
    if locfn is not None and os.path.isfile(locfn):
        loc = sio.loadmat(locfn)
        locctime = UTCDateTime(os.path.getmtime(locfn))
        locauthor = get_owner(locfn)
        locauthor = getOwner(locfn)
    else:
        loc = None
        locctime = None
@@ -218,7 +217,7 @@ def picksdict_from_obs(fn):
    return picks


def picksdict_from_picks(evt, parameter=None):
def picksdict_from_picks(evt):
    """
    Takes an Event object and return the pick dictionary commonly used within
    PyLoT
@@ -231,10 +230,9 @@ def picksdict_from_picks(evt, parameter=None):
        'auto': {}
    }
    for pick in evt.picks:
        errors = None
        phase = {}
        station = pick.waveform_id.station_code
        if pick.waveform_id.channel_code is None:
        if pick.waveform_id.channel_code == None:
            channel = ''
        else:
            channel = pick.waveform_id.channel_code
@@ -246,15 +244,16 @@ def picksdict_from_picks(evt, parameter=None):
        else:
            filter_id = None
        try:
            pick_method = str(pick.method_id)
            if pick_method.startswith('smi:local/'):
                pick_method = pick_method.split('smi:local/')[1]
            picker = str(pick.method_id)
            if picker.startswith('smi:local/'):
                picker = picker.split('smi:local/')[1]
        except IndexError:
            pick_method = 'manual'  # MP MP TODO maybe improve statement
        if pick_method == 'None':
            pick_method = 'manual'
            picker = 'manual'  # MP MP TODO maybe improve statement
        if picker == 'None':
            picker = 'manual'
        try:
            onsets = picksdict[pick_method][station]
            # onsets = picksdict[picker][station]
            onsets = picksdict[station]
        except KeyError as e:
            # print(e)
            onsets = {}
@@ -275,32 +274,36 @@ def picksdict_from_picks(evt, parameter=None):
        phase['epp'] = epp
        phase['lpp'] = lpp
        phase['spe'] = spe
        weight = phase.get('weight')
        if not weight:
            if not parameter:
                logging.warning('Using default input parameter')
                parameter = PylotParameter()
            pick.phase_hint = identifyPhase(pick.phase_hint)
        try:
            phase['weight'] = weight
        except:
            # get onset weight from uncertainty
            infile = os.path.join(os.path.expanduser('~'), '.pylot', 'pylot.in')
            print('Using default input file {}'.format(infile))
            parameter = PylotParameter(infile)
            if pick.phase_hint == 'P':
                errors = parameter['timeerrorsP']
            elif pick.phase_hint == 'S':
                errors = parameter['timeerrorsS']
            if errors:
                weight = get_quality_class(spe, errors)
                phase['weight'] = weight
            weight = get_quality_class(spe, errors)
            phase['weight'] = weight
        phase['channel'] = channel
        phase['network'] = network
        phase['picker'] = pick_method
        if pick.polarity == 'positive':
            phase['fm'] = 'U'
        elif pick.polarity == 'negative':
            phase['fm'] = 'D'
        else:
        phase['picker'] = picker
        try:
            if pick.polarity == 'positive':
                phase['fm'] = 'U'
            elif pick.polarity == 'negative':
                phase['fm'] = 'D'
            else:
                phase['fm'] = 'N'
        except:
            print("No FM info available!")
            phase['fm'] = 'N'
        phase['filter_id'] = filter_id if filter_id is not None else ''

        onsets[pick.phase_hint] = phase.copy()
        picksdict[pick_method][station] = onsets.copy()
        picksdict[station] = onsets.copy()
    return picksdict

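In `picksdict_from_picks` above, a missing onset weight is derived from the symmetric pick error `spe` via `get_quality_class`, using the `timeerrorsP`/`timeerrorsS` threshold lists from the parameter file. A sketch of that threshold-to-class mapping (the real function lives in `pylot.core.pick.utils`; the threshold values below are illustrative assumptions, not PyLoT defaults):

```python
def get_quality_class(uncertainty, errors):
    # Sketch: errors is an ascending list of uncertainty thresholds for
    # quality classes 0..len(errors)-1; larger uncertainties fall into
    # the worst class (here: 4 for a four-element threshold list).
    for weight, threshold in enumerate(errors):
        if uncertainty <= threshold:
            return weight
    return len(errors)

timeerrorsP = [0.01, 0.02, 0.04, 0.08]  # assumed thresholds in seconds
print(get_quality_class(0.015, timeerrorsP))  # class 1
print(get_quality_class(0.5, timeerrorsP))    # class 4 (unusable pick)
```

Weight 4 is what `writephases` later treats as "do not use pick", so the mapping directly controls which onsets reach the location codes.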
@@ -373,7 +376,8 @@ def picks_from_picksdict(picks, creation_info=None):


def reassess_pilot_db(root_dir, db_dir, out_dir=None, fn_param=None, verbosity=0):
    # TODO: change root to datapath
    import glob

    db_root = os.path.join(root_dir, db_dir)
    evt_list = glob.glob1(db_root, 'e????.???.??')

@@ -388,7 +392,9 @@ def reassess_pilot_event(root_dir, db_dir, event_id, out_dir=None, fn_param=None

    from pylot.core.io.inputs import PylotParameter
    from pylot.core.pick.utils import earllatepicker
    # TODO: change root to datapath

    if fn_param is None:
        fn_param = defaults.AUTOMATIC_DEFAULTS

    default = PylotParameter(fn_param, verbosity)

@@ -478,6 +484,7 @@ def reassess_pilot_event(root_dir, db_dir, event_id, out_dir=None, fn_param=None
        os.makedirs(out_dir)
    fnout_prefix = os.path.join(out_dir, 'PyLoT_{0}.'.format(event_id))
    evt.write(fnout_prefix + 'xml', format='QUAKEML')
    # evt.write(fnout_prefix + 'cnv', format='VELEST')


def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
@@ -506,16 +513,16 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
    and FOCMEC- and HASH-input files
    :type eventinfo: `obspy.core.event.Event` object
    """

    if fformat == 'NLLoc':
        print("Writing phases to %s for NLLoc" % filename)
        fid = open("%s" % filename, 'w')
        # write header
        fid.write('# EQEVENT: %s Label: EQ%s Loc: X 0.00 Y 0.00 Z 10.00 OT 0.00 \n' %
                  (parameter.get('datapath'), parameter.get('eventID')))
        arrivals = chooseArrivals(arrivals)
                  (parameter.get('database'), parameter.get('eventID')))
        for key in arrivals:
            # P onsets
            if 'P' in arrivals[key]:
            if arrivals[key].has_key('P'):
                try:
                    fm = arrivals[key]['P']['fm']
                except KeyError as e:
@@ -536,7 +543,6 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
                try:
                    if arrivals[key]['P']['weight'] >= 4:
                        pweight = 0  # do not use pick
                        print("Station {}: Uncertain pick, do not use it!".format(key))
                except KeyError as e:
                    print(e.message + '; no weight set during processing')
                fid.write('%s ? ? ? P %s %d%02d%02d %02d%02d %7.4f GAU 0 0 0 0 %d \n' % (key,
@@ -549,7 +555,7 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
                                                                                         ss_ms,
                                                                                         pweight))
            # S onsets
            if 'S' in arrivals[key] and arrivals[key]['S']['mpp'] is not None:
            if arrivals[key].has_key('S') and arrivals[key]['S']['mpp'] is not None:
                fm = '?'
                onset = arrivals[key]['S']['mpp']
                year = onset.year
@@ -566,20 +572,18 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
                        sweight = 0  # do not use pick
                except KeyError as e:
                    print(str(e) + '; no weight set during processing')
                Ao = arrivals[key]['S']['Ao']  # peak-to-peak amplitude
                if Ao == None:
                    Ao = 0.0
                # fid.write('%s ? ? ? S %s %d%02d%02d %02d%02d %7.4f GAU 0 0 0 0 %d \n' % (key,
                Ao = arrivals[key]['S']['Ao']  # peak-to-peak amplitude
                #fid.write('%s ? ? ? S %s %d%02d%02d %02d%02d %7.4f GAU 0 0 0 0 %d \n' % (key,
                fid.write('%s ? ? ? S %s %d%02d%02d %02d%02d %7.4f GAU 0 %9.2f 0 0 %d \n' % (key,
                                                                                             fm,
                                                                                             year,
                                                                                             month,
                                                                                             day,
                                                                                             hh,
                                                                                             mm,
                                                                                             ss_ms,
                                                                                             Ao,
                                                                                             sweight))
                                                                                             fm,
                                                                                             year,
                                                                                             month,
                                                                                             day,
                                                                                             hh,
                                                                                             mm,
                                                                                             ss_ms,
                                                                                             Ao,
                                                                                             sweight))

        fid.close()
    elif fformat == 'HYPO71':
@@ -588,7 +592,6 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
        # write header
        fid.write(' %s\n' %
                  parameter.get('eventID'))
        arrivals = chooseArrivals(arrivals)  # MP MP what is chooseArrivals? It is not defined anywhere
        for key in arrivals:
            if arrivals[key]['P']['weight'] < 4:
                stat = key
@@ -664,11 +667,10 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
        print("Writing phases to %s for HYPOSAT" % filename)
        fid = open("%s" % filename, 'w')
        # write header
        fid.write('%s, event %s \n' % (parameter.get('datapath'), parameter.get('eventID')))
        arrivals = chooseArrivals(arrivals)
        fid.write('%s, event %s \n' % (parameter.get('database'), parameter.get('eventID')))
        for key in arrivals:
            # P onsets
            if 'P' in arrivals[key] and arrivals[key]['P']['mpp'] is not None:
            if arrivals[key].has_key('P') and arrivals[key]['P']['mpp'] is not None:
                if arrivals[key]['P']['weight'] < 4:
                    Ponset = arrivals[key]['P']['mpp']
                    pyear = Ponset.year
@@ -697,7 +699,7 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
                    fid.write('%-5s P1 %4.0f %02d %02d %02d %02d %05.02f %5.3f -999. 0.00 -999. 0.00\n'
                              % (key, pyear, pmonth, pday, phh, pmm, Pss, pstd))
            # S onsets
            if 'S' in arrivals[key] and arrivals[key]['S']['mpp'] is not None:
            if arrivals[key].has_key('S') and arrivals[key]['S']['mpp'] is not None:
                if arrivals[key]['S']['weight'] < 4:
                    Sonset = arrivals[key]['S']['mpp']
                    syear = Sonset.year
@@ -756,40 +758,37 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
            cns, eventsource['longitude'], cew, eventsource['depth'], eventinfo.magnitudes[0]['mag'], ifx))
        n = 0
        # check whether arrivals are dictionaries (autoPyLoT) or pick object (PyLoT)
        if isinstance(arrivals, dict) is False:
        if isinstance(arrivals, dict) == False:
            # convert pick object (PyLoT) into dictionary
            evt = ope.Event(resource_id=eventinfo['resource_id'])
            evt.picks = arrivals
            arrivals = picksdict_from_picks(evt, parameter=parameter)
        # check for automatic and manual picks
        # prefer manual picks
        usedarrivals = chooseArrivals(arrivals)
        for key in usedarrivals:
            arrivals = picksdict_from_picks(evt)
        for key in arrivals:
            # P onsets
            if 'P' in usedarrivals[key]:
                if usedarrivals[key]['P']['weight'] < 4:
            if arrivals[key].has_key('P'):
                if arrivals[key]['P']['weight'] < 4:
                    n += 1
                    stat = key
                    if len(stat) > 4:  # VELEST handles only 4-string station IDs
                        stat = stat[1:5]
                    Ponset = usedarrivals[key]['P']['mpp']
                    Pweight = usedarrivals[key]['P']['weight']
                    Ponset = arrivals[key]['P']['mpp']
                    Pweight = arrivals[key]['P']['weight']
                    Prt = Ponset - stime  # onset time relative to source time
                    if n % 6 != 0:
                    if n % 6 is not 0:
                        fid.write('%-4sP%d%6.2f' % (stat, Pweight, Prt))
                    else:
                        fid.write('%-4sP%d%6.2f\n' % (stat, Pweight, Prt))
            # S onsets
            if 'S' in usedarrivals[key]:
                if usedarrivals[key]['S']['weight'] < 4:
            if arrivals[key].has_key('S'):
                if arrivals[key]['S']['weight'] < 4:
                    n += 1
                    stat = key
                    if len(stat) > 4:  # VELEST handles only 4-string station IDs
                        stat = stat[1:5]
                    Sonset = usedarrivals[key]['S']['mpp']
                    Sweight = usedarrivals[key]['S']['weight']
                    Sonset = arrivals[key]['S']['mpp']
                    Sweight = arrivals[key]['S']['weight']
                    Srt = Ponset - stime  # onset time relative to source time
                    if n % 6 != 0:
                    if n % 6 is not 0:
                        fid.write('%-4sS%d%6.2f' % (stat, Sweight, Srt))
                    else:
                        fid.write('%-4sS%d%6.2f\n' % (stat, Sweight, Srt))
@@ -805,13 +804,9 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
            print("No source origin calculated yet, thus no hypoDD-infile creation possible!")
            return
        stime = eventsource['time']
        try:
            event = eventinfo['pylot_id']
            hddID = event.split('.')[0][1:5]
        except:
            print("Error 1111111!")
            hddID = "00000"
        # write header
        event = parameter.get('eventID')
        hddID = event.split('.')[0][1:5]
        # write header
        fid.write('# %d %d %d %d %d %5.2f %7.4f +%6.4f %7.4f %4.2f 0.1 0.5 %4.2f %s\n' % (
            stime.year, stime.month, stime.day, stime.hour, stime.minute, stime.second,
            eventsource['latitude'], eventsource['longitude'], eventsource['depth'] / 1000,
@@ -821,21 +816,18 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
            # convert pick object (PyLoT) into dictionary
            evt = ope.Event(resource_id=eventinfo['resource_id'])
            evt.picks = arrivals
            arrivals = picksdict_from_picks(evt, parameter=parameter)
        # check for automatic and manual picks
        # prefer manual picks
        usedarrivals = chooseArrivals(arrivals)
        for key in usedarrivals:
            if 'P' in usedarrivals[key]:
            arrivals = picksdict_from_picks(evt)
        for key in arrivals:
            if arrivals[key].has_key('P'):
                # P onsets
                if usedarrivals[key]['P']['weight'] < 4:
                    Ponset = usedarrivals[key]['P']['mpp']
                if arrivals[key]['P']['weight'] < 4:
                    Ponset = arrivals[key]['P']['mpp']
                    Prt = Ponset - stime  # onset time relative to source time
                    fid.write('%s %6.3f 1 P\n' % (key, Prt))
            if 'S' in usedarrivals[key]:
            if arrivals[key].has_key('S'):
                # S onsets
                if usedarrivals[key]['S']['weight'] < 4:
                    Sonset = usedarrivals[key]['S']['mpp']
                if arrivals[key]['S']['weight'] < 4:
                    Sonset = arrivals[key]['S']['mpp']
                    Srt = Sonset - stime  # onset time relative to source time
                    fid.write('%-5s %6.3f 1 S\n' % (key, Srt))

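Several of the `writephases` changes above are Python 2/3 idiom fixes rather than behavior changes: `dict.has_key()` was removed in Python 3 in favor of the `in` operator, identity tests against literals (`n % 6 is not 0`, `== None`) are replaced by value comparison and `is None`. A minimal demonstration of the modern forms on a hypothetical arrivals dict:

```python
arrivals = {'AB01': {'P': {'mpp': 1.23, 'weight': 0}}}

# Python 3 removed dict.has_key(); membership uses the in operator:
assert 'P' in arrivals['AB01']
assert 'S' not in arrivals['AB01']

# Identity checks against int literals are unreliable (and a SyntaxWarning
# in recent CPython); compare values instead:
n = 7
assert n % 6 != 0

# None is a singleton, so it is compared with 'is', not '==':
Ao = None
assert Ao is None
print('python3 idioms ok')
```

`n % 6 is not 0` only happened to work under CPython's small-integer caching, which is why the diff replaces it with `!= 0`.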
@@ -872,13 +864,10 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
            # convert pick object (PyLoT) into dictionary
            evt = ope.Event(resource_id=eventinfo['resource_id'])
            evt.picks = arrivals
            arrivals = picksdict_from_picks(evt, parameter=parameter)
        # check for automatic and manual picks
        # prefer manual picks
        usedarrivals = chooseArrivals(arrivals)
        for key in usedarrivals:
            if 'P' in usedarrivals[key]:
                if usedarrivals[key]['P']['weight'] < 4 and usedarrivals[key]['P']['fm'] is not None:
            arrivals = picksdict_from_picks(evt)
        for key in arrivals:
            if arrivals[key].has_key('P'):
                if arrivals[key]['P']['weight'] < 4 and arrivals[key]['P']['fm'] is not None:
                    stat = key
                    for i in range(len(picks)):
                        station = picks[i].waveform_id.station_code
@@ -897,7 +886,7 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
                            fid.write('%-4s %6.2f %6.2f%s \n' % (stat,
                                                                 az,
                                                                 inz,
                                                                 usedarrivals[key]['P']['fm']))
                                                                 arrivals[key]['P']['fm']))
                            break

        fid.close()
@@ -955,11 +944,10 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
                                           eventsource['quality']['used_phase_count'],
                                           erh, erz, eventinfo.magnitudes[0]['mag'],
                                           hashID))
            # Prefer Manual Picks over automatic ones if possible
            arrivals = chooseArrivals(arrivals)  # MP MP what is chooseArrivals? It is not defined anywhere

            # write phase lines
            for key in arrivals:
                if 'P' in arrivals[key]:
                if arrivals[key].has_key('P'):
                    if arrivals[key]['P']['weight'] < 4 and arrivals[key]['P']['fm'] is not None:
                        stat = key
                        ccode = arrivals[key]['P']['channel']
@@ -1004,25 +992,6 @@ def writephases(arrivals, fformat, filename, parameter=None, eventinfo=None):
    fid2.close()


def chooseArrivals(arrivals):
    """
    takes arrivals and returns the manual picks if manual and automatic ones are there
    returns automatic picks if only automatic picks are there
    :param arrivals: 'dictionary' with automatic and or manual arrivals
    :return: arrivals but with the manual picks prefered if possible
    """
    # If len of arrivals is greater than 2 it comes from autopicking so only autopicks are available
    if len(arrivals) > 2:
        return arrivals
    if arrivals['auto'] and arrivals['manual']:
        usedarrivals = arrivals['manual']
    elif arrivals['auto']:
        usedarrivals = arrivals['auto']
    elif arrivals['manual']:
        usedarrivals = arrivals['manual']
    return usedarrivals


def merge_picks(event, picks):
    """
    takes an event object and a list of picks and searches for matching

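The removed `chooseArrivals` above resolves the `{'auto': ..., 'manual': ...}` pick dictionary to a single set, preferring manual picks. A self-contained sketch of the same selection logic (renamed `choose_arrivals` to flag that it is a stand-alone illustration; unlike the original, it returns an empty dict instead of raising when neither kind is present):

```python
def choose_arrivals(arrivals):
    # Prefer manual picks; fall back to automatic ones.
    if len(arrivals) > 2:
        # flat station-keyed dict: autopicker output, nothing to choose
        return arrivals
    if arrivals.get('manual'):
        return arrivals['manual']
    return arrivals.get('auto', {})

both = {'auto': {'AB01': {'P': {}}},
        'manual': {'AB01': {'P': {}, 'S': {}}}}
print(choose_arrivals(both) is both['manual'])        # manual wins

only_auto = {'auto': {'AB01': {'P': {}}}, 'manual': {}}
print(choose_arrivals(only_auto) is only_auto['auto'])  # fallback
```

The `len(arrivals) > 2` guard is a heuristic: a dict with more than the two `auto`/`manual` keys is assumed to already be station-keyed autopicker output.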
@ -1052,63 +1021,37 @@ def merge_picks(event, picks):
|
||||
return event
|
||||
|
||||
|
||||
def getQualitiesfromxml(path, errorsP, errorsS, plotflag=1, figure=None, verbosity=0):
|
||||
def getQualitiesfromxml(xmlnames, ErrorsP, ErrorsS, plotflag=1):
|
||||
"""
|
||||
Script to get onset uncertainties from Quakeml.xml files created by PyLoT.
|
||||
Uncertainties are tranformed into quality classes and visualized via histogram if desired.
|
||||
Ludger Küperkoch, BESTEC GmbH, 07/2017
|
||||
:param path: path containing xml files
|
||||
:type path: str
|
||||
:param errorsP: time errors of P waves for the four discrete quality classes
|
||||
:type errorsP:
|
||||
:param errorsS: time errors of S waves for the four discrete quality classes
|
||||
:type errorsS:
|
||||
:param xmlnames: list of xml obspy event files containing picks
|
||||
:type xmlnames: list
|
||||
:param ErrorsP: time errors of P waves for the four discrete quality classes
|
||||
:type ErrorsP:
|
||||
:param ErrorsS: time errors of S waves for the four discrete quality classes
|
||||
:type ErrorsS:
|
||||
:param plotflag:
|
||||
:type plotflag:
|
||||
:return:
|
||||
:rtype:
|
||||
"""
|
||||
|
||||
def calc_perc(uncertainties, ntotal):
|
||||
''' simple function that calculates percentage of number of uncertainties (list length)'''
|
||||
if len(uncertainties) == 0:
|
||||
return 0
|
||||
else:
|
||||
return 100. / ntotal * len(uncertainties)
|
||||
|
||||
def calc_weight_perc(psweights, weight_ids):
|
||||
''' calculate percentages of different weights (pick classes!?) of total number of uncertainties of a phase'''
|
||||
# count total number of list items for this phase
|
||||
numWeights = np.sum([len(weight) for weight in psweights.values()])
|
||||
|
||||
# iterate over all available weights to return a list with percentages for plotting
|
||||
plot_list = []
|
||||
for weight_id in weight_ids:
|
||||
plot_list.append(calc_perc(psweights[weight_id], numWeights))
|
||||
|
||||
return plot_list, numWeights
|
||||
|
||||
# get all xmlfiles in path (maybe this should be changed to one xml file for this function, selectable via GUI?)
|
||||
xmlnames = glob.glob(os.path.join(path, '*.xml'))
|
||||
if len(xmlnames) == 0:
|
||||
print(f'No files found in path {path}.')
|
||||
return False
|
||||
|
||||
# first define possible phases here
|
||||
phases = ['P', 'S']
|
||||
|
||||
# define possible weights (0-4)
|
||||
weight_ids = list(range(5))
|
||||
|
||||
# put both error lists in a dictionary with P/S key so that amount of code can be halfed by simply using P/S as key
|
||||
errors = dict(P=errorsP, S=errorsS)
|
||||
|
||||
# create dictionaries for each phase (P/S) with a dictionary of empty list for each weight defined in weights
|
||||
# tuple above
|
||||
weights = {}
|
||||
for phase in phases:
|
||||
weights[phase] = {weight_id: [] for weight_id in weight_ids}
|
||||
from pylot.core.pick.utils import get_quality_class
|
||||
from pylot.core.util.utils import loopIdentifyPhase, identifyPhase
|
||||
|
||||
# read all onset weights
|
||||
Pw0 = []
|
||||
Pw1 = []
|
||||
Pw2 = []
|
||||
Pw3 = []
|
||||
Pw4 = []
|
||||
Sw0 = []
|
||||
Sw1 = []
|
||||
Sw2 = []
|
||||
Sw3 = []
|
||||
Sw4 = []
|
||||
for names in xmlnames:
|
||||
print("Getting onset weights from {}".format(names))
|
||||
cat = read_events(names)
|
||||
@ -1116,60 +1059,117 @@ def getQualitiesfromxml(path, errorsP, errorsS, plotflag=1, figure=None, verbosi
|
||||
arrivals = cat.events[0].picks
|
||||
arrivals_copy = cat_copy.events[0].picks
|
||||
# Prefere manual picks if qualities are sufficient!
|
||||
for pick in arrivals:
|
||||
if pick.method_id.id.split('/')[1] == 'manual':
|
||||
mstation = pick.waveform_id.station_code
|
||||
for Pick in arrivals:
|
||||
if Pick.method_id.id.split('/')[1] == 'manual':
|
||||
mstation = Pick.waveform_id.station_code
|
||||
mstation_ext = mstation + '_'
|
||||
for mpick in arrivals_copy:
|
||||
phase = identifyPhase(loopIdentifyPhase(pick.phase_hint)) # MP MP catch if this fails?
|
||||
if ((mpick.waveform_id.station_code == mstation) or
|
||||
(mpick.waveform_id.station_code == mstation_ext)) and \
|
||||
(mpick.method_id.id.split('/')[1] == 'auto') and \
|
||||
(mpick.time_errors['uncertainty'] <= errors[phase][3]):
|
||||
del mpick
|
||||
break
|
||||
phase = identifyPhase(loopIdentifyPhase(Pick.phase_hint))
|
||||
if phase == 'P':
|
||||
if ((mpick.waveform_id.station_code == mstation) or
|
||||
(mpick.waveform_id.station_code == mstation_ext)) and \
|
||||
(mpick.method_id.split('/')[1] == 'auto') and \
|
||||
(mpick.time_errors['uncertainty'] <= ErrorsP[3]):
|
||||
del mpick
|
||||
break
|
||||
elif phase == 'S':
|
||||
if ((mpick.waveform_id.station_code == mstation) or
|
||||
(mpick.waveform_id.station_code == mstation_ext)) and \
|
||||
(mpick.method_id.split('/')[1] == 'auto') and \
|
||||
(mpick.time_errors['uncertainty'] <= ErrorsS[3]):
|
||||
del mpick
|
||||
break
|
||||
lendiff = len(arrivals) - len(arrivals_copy)
|
||||
if lendiff != 0:
|
||||
if lendiff is not 0:
|
||||
print("Found manual as well as automatic picks, prefered the {} manual ones!".format(lendiff))

for pick in arrivals_copy:
    phase = identifyPhase(loopIdentifyPhase(pick.phase_hint))
    uncertainty = pick.time_errors.uncertainty
    if not uncertainty:
        if verbosity > 0:
            print('No uncertainty, pick {} invalid!'.format(pick.method_id.id))
        continue
    # check P/S phase
    if phase not in phases:
for Pick in arrivals_copy:
    phase = identifyPhase(loopIdentifyPhase(Pick.phase_hint))
    if phase == 'P':
        Pqual = get_quality_class(Pick.time_errors.uncertainty, ErrorsP)
        if Pqual == 0:
            Pw0.append(Pick.time_errors.uncertainty)
        elif Pqual == 1:
            Pw1.append(Pick.time_errors.uncertainty)
        elif Pqual == 2:
            Pw2.append(Pick.time_errors.uncertainty)
        elif Pqual == 3:
            Pw3.append(Pick.time_errors.uncertainty)
        elif Pqual == 4:
            Pw4.append(Pick.time_errors.uncertainty)
    elif phase == 'S':
        Squal = get_quality_class(Pick.time_errors.uncertainty, ErrorsS)
        if Squal == 0:
            Sw0.append(Pick.time_errors.uncertainty)
        elif Squal == 1:
            Sw1.append(Pick.time_errors.uncertainty)
        elif Squal == 2:
            Sw2.append(Pick.time_errors.uncertainty)
        elif Squal == 3:
            Sw3.append(Pick.time_errors.uncertainty)
        elif Squal == 4:
            Sw4.append(Pick.time_errors.uncertainty)
    else:
        print("Phase hint not defined for picking!")
        continue

    qual = get_quality_class(uncertainty, errors[phase])
    weights[phase][qual].append(uncertainty)
    pass
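Both versions of the loop rely on the same binning idea: an uncertainty is assigned to the first quality class whose error threshold it does not exceed. A minimal stand-in for PyLoT's `get_quality_class` (the real one lives in `pylot.core.pick.utils`; this sketch only illustrates the binning):

```python
def quality_class(uncertainty, thresholds):
    """Index of the first threshold the uncertainty fits under;
    anything above all thresholds falls into the worst class."""
    for quality, threshold in enumerate(thresholds):
        if uncertainty <= threshold:
            return quality
    return len(thresholds)
```

With four thresholds this yields the quality classes 0 through 4 used for the `Pw0`..`Pw4` / `Sw0`..`Sw4` lists.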

if plotflag == 0:
    p_unc = [weights['P'][weight_id] for weight_id in weight_ids]
    s_unc = [weights['S'][weight_id] for weight_id in weight_ids]
    return p_unc, s_unc
    Punc = [Pw0, Pw1, Pw2, Pw3, Pw4]
    Sunc = [Sw0, Sw1, Sw2, Sw3, Sw4]
    return Punc, Sunc
else:
    if not figure:
        fig = plt.figure()
        ax = fig.add_subplot(111)
    # get percentage of weights
    listP, numPweights = calc_weight_perc(weights['P'], weight_ids)
    listS, numSweights = calc_weight_perc(weights['S'], weight_ids)
    numPweights = np.sum([len(Pw0), len(Pw1), len(Pw2), len(Pw3), len(Pw4)])
    numSweights = np.sum([len(Sw0), len(Sw1), len(Sw2), len(Sw3), len(Sw4)])
    if len(Pw0) > 0:
        P0perc = 100 / numPweights * len(Pw0)
    else:
        P0perc = 0
    if len(Pw1) > 0:
        P1perc = 100 / numPweights * len(Pw1)
    else:
        P1perc = 0
    if len(Pw2) > 0:
        P2perc = 100 / numPweights * len(Pw2)
    else:
        P2perc = 0
    if len(Pw3) > 0:
        P3perc = 100 / numPweights * len(Pw3)
    else:
        P3perc = 0
    if len(Pw4) > 0:
        P4perc = 100 / numPweights * len(Pw4)
    else:
        P4perc = 0
    if len(Sw0) > 0:
        S0perc = 100 / numSweights * len(Sw0)
    else:
        S0perc = 0
    if len(Sw1) > 0:
        S1perc = 100 / numSweights * len(Sw1)
    else:
        S1perc = 0
    if len(Sw2) > 0:
        S2perc = 100 / numSweights * len(Sw2)
    else:
        S2perc = 0
    if len(Sw3) > 0:
        S3perc = 100 / numSweights * len(Sw3)
    else:
        S3perc = 0
    if len(Sw4) > 0:
        S4perc = 100 / numSweights * len(Sw4)
    else:
        S4perc = 0
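The develop branch folds the ten repeated `if len(...) > 0` blocks into a single `calc_weight_perc` call. A sketch of what such a helper plausibly does (its signature is assumed from the call site above):

```python
def calc_weight_perc(phase_weights, weight_ids):
    """Percentage of picks per quality class plus the total number of picks.
    phase_weights maps a class id to the list of uncertainties in that class."""
    counts = [len(phase_weights[wid]) for wid in weight_ids]
    total = sum(counts)
    percentages = [100 * count / total if total else 0 for count in counts]
    return percentages, total
```

Unlike the unrolled version, the helper also guards against a zero total instead of relying on each class list being checked individually.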

    y_pos = np.arange(len(weight_ids))
    weights = ('0', '1', '2', '3', '4')
    y_pos = np.arange(len(weights))
    width = 0.34
    ax.bar(y_pos - width, listP, width, color='black')
    ax.bar(y_pos, listS, width, color='red')
    ax.set_ylabel('%')
    ax.set_xticks(y_pos, weight_ids)
    ax.set_xlim([-0.5, 4.5])
    ax.set_xlabel('Qualities')
    ax.set_title('{0} P-Qualities, {1} S-Qualities'.format(numPweights, numSweights))

    if not figure:
        fig.show()

    return listP, listS
    plt.bar(y_pos - width, [P0perc, P1perc, P2perc, P3perc, P4perc], width, color='black')
    plt.bar(y_pos, [S0perc, S1perc, S2perc, S3perc, S4perc], width, color='red')
    plt.ylabel('%')
    plt.xticks(y_pos, weights)
    plt.xlim([-0.5, 4.5])
    plt.xlabel('Qualities')
    plt.title('{0} P-Qualities, {1} S-Qualities'.format(numPweights, numSweights))
    plt.show()
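Both versions draw the same grouped bar chart. A self-contained sketch with dummy percentages, rendered off-screen via the Agg backend; note it uses `set_xticklabels` rather than the two-argument `ax.set_xticks(y_pos, weight_ids)` seen above, which requires matplotlib >= 3.5:

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen, no display needed
import matplotlib.pyplot as plt
import numpy as np

listP = [60, 20, 10, 5, 5]  # example P percentages per quality class
listS = [50, 25, 15, 5, 5]  # example S percentages per quality class
weight_ids = [0, 1, 2, 3, 4]

fig, ax = plt.subplots()
y_pos = np.arange(len(weight_ids))
width = 0.34
ax.bar(y_pos - width, listP, width, color='black', label='P')
ax.bar(y_pos, listS, width, color='red', label='S')
ax.set_xticks(y_pos)
ax.set_xticklabels([str(w) for w in weight_ids])
ax.set_xlim([-0.5, 4.5])
ax.set_xlabel('Qualities')
ax.set_ylabel('%')
```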
@@ -4,12 +4,11 @@
import glob
import os
import subprocess

from obspy import read_events

from pylot.core.io.phases import writephases
from pylot.core.util.gui import which
from pylot.core.util.utils import getPatternLine, runProgram
from pylot.core.util.gui import which
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
@@ -82,8 +81,9 @@ def locate(fnin, parameter=None):
    :param fnin: external program name
    :return: None
    """
    exe_path = os.path.join(parameter['nllocbin'], 'NLLoc')
    if not os.path.isfile(exe_path):

    exe_path = which('NLLoc', parameter)
    if exe_path is None:
        raise NLLocError('NonLinLoc executable not found; check your '
                         'environment variables')
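The lookup order shown here (configured binary directory first, then a PATH search via PyLoT's `which`) can be sketched with the stdlib alone; `find_executable` is illustrative, not the PyLoT helper:

```python
import os
import shutil


def find_executable(name, binary_dir):
    """Prefer an explicitly configured binary directory, then fall back to the PATH."""
    candidate = os.path.join(binary_dir, name)
    if os.path.isfile(candidate):
        return candidate
    found = shutil.which(name)  # stdlib analogue of pylot.core.util.gui.which
    if found is None:
        raise RuntimeError('{} executable not found; check your '
                           'environment variables'.format(name))
    return found
```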
@@ -9,21 +9,21 @@ function conglomerate utils.
:author: MAGS2 EP3 working group / Ludger Kueperkoch
"""
import copy
import traceback

import matplotlib.pyplot as plt
import numpy as np
from obspy import Trace
import traceback
from obspy.taup import TauPyModel

from pylot.core.pick.charfuns import CharacteristicFunction
from pylot.core.pick.charfuns import HOScf, AICcf, ARZcf, ARHcf, AR3Ccf
from pylot.core.pick.picker import AICPicker, PragPicker
from pylot.core.pick.utils import checksignallength, checkZ4S, earllatepicker, \
    getSNR, fmpicker, checkPonsets, wadaticheck, get_quality_class, PickingFailedException, MissingTraceException
from pylot.core.util.utils import getPatternLine, gen_Pool, \
    get_bool, identifyPhaseID, get_none, correct_iplot
    getSNR, fmpicker, checkPonsets, wadaticheck, get_pickparams, get_quality_class
from pylot.core.util.utils import getPatternLine, gen_Pool, \
    get_Bool, identifyPhaseID, get_None, correct_iplot

from obspy.taup import TauPyModel
from obspy import Trace


def autopickevent(data, param, iplot=0, fig_dict=None, fig_dict_wadatijack=None, ncores=0, metadata=None, origin=None):
    """
@@ -182,25 +182,25 @@ class PickingResults(dict):
        # TODO What are those?
        self.w0 = None
        self.fc = None
        self.Ao = None  # Wood-Anderson peak-to-peak amplitude

        # Station information
        self.network = None
        self.channel = None

        # pick information
        self.picker = 'auto'  # type of pick
        self.marked = []

        # pick results
        self.epp = None  # earliest possible pick
        self.mpp = None  # most likely onset
        self.lpp = None  # latest possible pick
        self.fm = 'N'  # first motion polarity, can be set to 'U' (Up) or 'D' (Down)
        self.snr = None  # signal-to-noise ratio of onset
        self.snrdb = None  # signal-to-noise ratio of onset [dB]
        self.spe = None  # symmetrized picking error
        self.weight = 4  # weight of onset

    # to correctly provide dot access to dictionary attributes, all attribute access of the class is forwarded to the
    # dictionary
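The forwarding the comment describes is the usual `__getattr__`/`__setattr__` pattern on a dict subclass. A minimal sketch of the idea (not the actual PickingResults implementation):

```python
class DotDict(dict):
    """Dict whose entries are also readable and writable as attributes."""

    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self[name] = value
```

Raising `AttributeError` (rather than letting `KeyError` escape) keeps `hasattr` and default attribute protocols working as expected.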
@@ -232,6 +232,20 @@ class PickingContainer:
        self.Sflag = 0


class MissingTraceException(ValueError):
    """
    Used to indicate missing traces in an obspy.core.stream.Stream object
    """
    pass


class PickingFailedException(Exception):
    """
    Raised when picking fails due to missing values etc.
    """
    pass


class AutopickStation(object):

    def __init__(self, wfstream, pickparam, verbose, iplot=0, fig_dict=None, metadata=None, origin=None):
@@ -258,14 +272,10 @@ class AutopickStation(object):
        self.pickparams = copy.deepcopy(pickparam)
        self.verbose = verbose
        self.iplot = correct_iplot(iplot)
        self.fig_dict = get_none(fig_dict)
        self.fig_dict = get_None(fig_dict)
        self.metadata = metadata
        self.origin = origin

        # initialize TauPy pick estimates
        self.estFirstP = None
        self.estFirstS = None

        # initialize picking results
        self.p_results = PickingResults()
        self.s_results = PickingResults()
@@ -325,10 +335,9 @@ class AutopickStation(object):
        """
        waveform_data = {}
        for key in self.channelorder:
            waveform_data[key] = self.wfstream.select(component=key)  # try ZNE first
            if len(waveform_data[key]) == 0:
                waveform_data[key] = self.wfstream.select(
                    component=str(self.channelorder[key]))  # use 123 as second option
        return waveform_data['Z'], waveform_data['N'], waveform_data['E']
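The two-step component lookup (ZNE letter codes first, numeric 123 codes as fallback) can be sketched independently of ObsPy streams; the dict-based traces here are illustrative stand-ins:

```python
def select_component(traces, letter, number):
    """Return traces matching the ZNE letter code, falling back to the 123 number code."""
    selected = [tr for tr in traces if tr['component'] == letter]
    if not selected:
        selected = [tr for tr in traces if tr['component'] == str(number)]
    return selected
```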

    def get_traces_from_streams(self):
@@ -419,7 +428,7 @@ class AutopickStation(object):
        if station_coords is None:
            exit_taupy()
            raise AttributeError('Warning: Could not find station in metadata')
        # TODO: raise when metadata.get_coordinates returns None
        source_origin = origin[0]
        model = TauPyModel(taup_model)
        taup_phases = self.pickparams['taup_phases']
@@ -447,29 +456,31 @@ class AutopickStation(object):
        for arr in arrivals:
            phases[identifyPhaseID(arr.phase.name)].append(arr)
        # get first P and S onsets from arrivals list
        arrival_time_p = 0
        arrival_time_s = 0
        estFirstP = 0
        estFirstS = 0
        if len(phases['P']) > 0:
            arrP, arrival_time_p = min([(arr, arr.time) for arr in phases['P']], key=lambda t: t[1])
            arrP, estFirstP = min([(arr, arr.time) for arr in phases['P']], key=lambda t: t[1])
        if len(phases['S']) > 0:
            arrS, arrival_time_s = min([(arr, arr.time) for arr in phases['S']], key=lambda t: t[1])
            arrS, estFirstS = min([(arr, arr.time) for arr in phases['S']], key=lambda t: t[1])
        print('autopick: estimated first arrivals for P: {} s, S: {} s after event'
              ' origin time using TauPy'.format(arrival_time_p, arrival_time_s))
        return arrival_time_p, arrival_time_s
              ' origin time using TauPy'.format(estFirstP, estFirstS))
        return estFirstP, estFirstS
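Stripped of the ObsPy arrival objects, picking the earliest arrival per phase reduces to a minimum over grouped times. A sketch with `(phase_id, time)` pairs standing in for TauPy arrivals:

```python
def first_onsets(arrivals):
    """Earliest P and S times from (phase_id, time) pairs; 0 when a phase is
    absent, mirroring the estFirstP/estFirstS defaults."""
    phases = {'P': [], 'S': []}
    for phase_id, time in arrivals:
        if phase_id in phases:
            phases[phase_id].append(time)
    est_first_p = min(phases['P']) if phases['P'] else 0
    est_first_s = min(phases['S']) if phases['S'] else 0
    return est_first_p, est_first_s
```

This avoids the `min([(arr, arr.time) ...], key=...)` tuple dance entirely when only the time is needed.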

        def exit_taupy():
            """If taupy failed to calculate theoretical starttimes, picking continues.
            For this a clean exit is required, since the P starttime is no longer relative to the theoretical onset but
            to the vertical trace starttime, e.g. it can't be < 0."""
            # TODO here the pickparams is modified, instead of a copy
            if self.pickparams["pstart"] < 0:
                self.pickparams["pstart"] = 0
            if self.pickparams["sstart"] < 0:
                self.pickparams["sstart"] = 0

        if get_bool(self.pickparams["use_taup"]) is False:
        if self.pickparams["use_taup"] is False or not self.origin or not self.metadata:
            # correct user mistake where a relative cuttime is selected (pstart < 0) but use of taupy is disabled or
            # lacks the required parameters
            if not self.origin:
                print('Requested use_taup but no origin given. Exit taupy.')
            if not self.metadata:
                print('Requested use_taup but no metadata given. Exit taupy.')
            exit_taupy()
            return
@@ -481,34 +492,16 @@ class AutopickStation(object):
            raise AttributeError('No source origins given!')

        arrivals = create_arrivals(self.metadata, self.origin, self.pickparams["taup_model"])
        arrival_P, arrival_S = first_PS_onsets(arrivals)

        self.estFirstP = (self.origin[0].time + arrival_P) - self.ztrace.stats.starttime

        estFirstP, estFirstS = first_PS_onsets(arrivals)
        # modify pstart and pstop relative to estimated first P arrival (relative to station time axis)
        self.pickparams["pstart"] += self.estFirstP
        self.pickparams["pstop"] += self.estFirstP
        self.pickparams["pstart"] += (self.origin[0].time + estFirstP) - self.ztrace.stats.starttime
        self.pickparams["pstop"] += (self.origin[0].time + estFirstP) - self.ztrace.stats.starttime
        print('autopick: CF calculation times respectively:'
              ' pstart: {} s, pstop: {} s'.format(self.pickparams["pstart"], self.pickparams["pstop"]))
        # make sure pstart and pstop are inside the starttime/endtime of vertical trace
        self.pickparams["pstart"] = max(self.pickparams["pstart"], 0)
        self.pickparams["pstop"] = min(self.pickparams["pstop"], len(self.ztrace) * self.ztrace.stats.delta)

        if self.horizontal_traces_exist():
            # for the two horizontal components take earliest and latest time to make sure that the S onset is not clipped
            # if start and endtime of horizontal traces differ, the S windowsize will automatically increase
            trace_s_start = min([self.etrace.stats.starttime, self.ntrace.stats.starttime])
            self.estFirstS = (self.origin[0].time + arrival_S) - trace_s_start
            # modify sstart and sstop relative to estimated first S arrival (relative to station time axis)
            self.pickparams["sstart"] += self.estFirstS
            self.pickparams["sstop"] += self.estFirstS
            print('autopick: CF calculation times respectively:'
                  ' sstart: {} s, sstop: {} s'.format(self.pickparams["sstart"], self.pickparams["sstop"]))
            # make sure sstart and sstop are inside the starttime/endtime of horizontal traces
            self.pickparams["sstart"] = max(self.pickparams["sstart"], 0)
            self.pickparams["sstop"] = min(self.pickparams["sstop"], len(self.ntrace) * self.ntrace.stats.delta,
                                           len(self.etrace) * self.etrace.stats.delta)
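The max/min clamping applied to both windows is worth isolating: shift the window by the estimated onset, then clip it to the extent of the trace. A small sketch (function and argument names are illustrative):

```python
def clamp_window(start, stop, n_samples, delta):
    """Clip a pick window (seconds, relative to trace start) to the trace extent."""
    trace_length = n_samples * delta
    return max(start, 0), min(stop, trace_length)
```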

    def autopickstation(self):
        """
        Main function of autopickstation, which calculates P and S picks and returns them in a dictionary.
@@ -518,17 +511,6 @@ class AutopickStation(object):
        station's value is the station name on which the picks were calculated.
        :rtype: dict
        """

        if get_bool(self.pickparams['use_taup']) is True and self.origin is not None:
            try:
                # modify pstart, pstop, sstart, sstop to be around theoretical onset if taupy should be used,
                # else do nothing
                self.modify_starttimes_taupy()
            except AttributeError as ae:
                print(ae)
            except MissingTraceException as mte:
                print(mte)

        try:
            self.pick_p_phase()
        except MissingTraceException as mte:
@@ -536,19 +518,17 @@ class AutopickStation(object):
        except PickingFailedException as pfe:
            print(pfe)

        if self.horizontal_traces_exist():
            if (self.p_results.weight is not None and self.p_results.weight < 4) or \
                    get_bool(self.pickparams.get('use_taup')):
                try:
                    self.pick_s_phase()
                except MissingTraceException as mte:
                    print(mte)
                except PickingFailedException as pfe:
                    print(pfe)
        if self.horizontal_traces_exist() and self.p_results.weight is not None and self.p_results.weight < 4:
            try:
                self.pick_s_phase()
            except MissingTraceException as mte:
                print(mte)
            except PickingFailedException as pfe:
                print(pfe)

        self.plot_pick_results()
        self.finish_picking()
        return [{'P': self.p_results, 'S': self.s_results}, self.ztrace.stats.station]

    def finish_picking(self):

@@ -597,7 +577,7 @@ class AutopickStation(object):

        self.s_results.channel = self.etrace.stats.channel
        self.s_results.network = self.etrace.stats.network
        self.s_results.fm = None  # override default value 'N'

    def plot_pick_results(self):
        if self.iplot > 0:
@@ -612,20 +592,12 @@ class AutopickStation(object):
            plt_flag = 0
            fig._tight = True
            ax1 = fig.add_subplot(311)
            tdata = np.linspace(start=0, stop=self.ztrace.stats.endtime - self.ztrace.stats.starttime,
                                num=self.ztrace.stats.npts)
            # plot tapered trace filtered with bpz2 filter settings
            ax1.plot(tdata, self.tr_filt_z_bpz2.data / max(self.tr_filt_z_bpz2.data), color=linecolor, linewidth=0.7,
                     label='Data')
            # plot pickwindows for P
            pstart, pstop = self.pickparams['pstart'], self.pickparams['pstop']
            if pstart is not None and pstop is not None:
                ax1.axvspan(pstart, pstop, color='r', alpha=0.1, zorder=0, label='P window')
            if self.estFirstP is not None:
                ax1.axvline(self.estFirstP, ls='dashed', color='r', alpha=0.4, label='TauPy estimate')
            if self.p_results.weight < 4:
                # plot CF of initial onset (HOScf or ARZcf)
                ax1.plot(self.cf1.getTimeArray(), self.cf1.getCF() / max(self.cf1.getCF()), 'b', label='CF1')
            if self.p_data.p_aic_plot_flag == 1:
                aicpick = self.p_data.aicpick
                refPpick = self.p_data.refPpick
@@ -660,44 +632,38 @@ class AutopickStation(object):
            ax1.set_ylim([-1.5, 1.5])
            ax1.set_ylabel('Normalized Counts')

            if self.horizontal_traces_exist():  # and self.s_data.Sflag == 1:
            if self.horizontal_traces_exist() and self.s_data.Sflag == 1:
                # plot E trace
                ax2 = fig.add_subplot(3, 1, 2, sharex=ax1)
                th1data = np.linspace(0, self.etrace.stats.endtime - self.etrace.stats.starttime,
                                      self.etrace.stats.npts)
                # plot filtered and tapered waveform
                ax2.plot(th1data, self.etrace.data / max(self.etrace.data), color=linecolor, linewidth=0.7,
                         label='Data')
                if self.p_results.weight < 4:
                    # plot initial CF (ARHcf or AR3Ccf)
                    ax2.plot(self.arhcf1.getTimeArray(), self.arhcf1.getCF() / max(self.arhcf1.getCF()), 'b',
                             label='CF1')
                if self.s_data.aicSflag == 1 and self.s_results.weight <= 4:
                    aicarhpick = self.aicarhpick
                    refSpick = self.refSpick
                    # plot second cf, used for determining precise onset (ARHcf or AR3Ccf)
                    ax2.plot(self.arhcf2.getTimeArray(), self.arhcf2.getCF() / max(self.arhcf2.getCF()), 'm',
                             label='CF2')
                    # plot preliminary onset time, calculated from CF1
                    ax2.plot([aicarhpick.getpick(), aicarhpick.getpick()], [-1, 1], 'g', label='Initial S Onset')
                    ax2.plot([aicarhpick.getpick() - 0.5, aicarhpick.getpick() + 0.5], [1, 1], 'g')
                    ax2.plot([aicarhpick.getpick() - 0.5, aicarhpick.getpick() + 0.5], [-1, -1], 'g')
                    # plot precise onset time, calculated from CF2
                    ax2.plot([refSpick.getpick(), refSpick.getpick()], [-1.3, 1.3], 'g', linewidth=2,
                             label='Final S Pick')
                    ax2.plot([refSpick.getpick() - 0.5, refSpick.getpick() + 0.5], [1.3, 1.3], 'g', linewidth=2)
                    ax2.plot([refSpick.getpick() - 0.5, refSpick.getpick() + 0.5], [-1.3, -1.3], 'g', linewidth=2)
                    ax2.plot([self.s_results.lpp, self.s_results.lpp], [-1.1, 1.1], 'g--', label='lpp')
                    ax2.plot([self.s_results.epp, self.s_results.epp], [-1.1, 1.1], 'g--', label='epp')
                    title = '{channel}, S weight={sweight}, SNR={snr:7.2}, SNR[dB]={snrdb:7.2}'
                    ax2.set_title(title.format(channel=str(self.etrace.stats.channel),
                                               sweight=str(self.s_results.weight),
                                               snr=str(self.s_results.snr),
                                               snrdb=str(self.s_results.snrdb)))
                    ax2.set_title(title.format(channel=self.etrace.stats.channel,
                                               sweight=self.s_results.weight,
                                               snr=self.s_results.snr,
                                               snrdb=self.s_results.snrdb))
                else:
                    title = '{channel}, S weight={sweight}, SNR=None, SNR[dB]=None'
                    ax2.set_title(title.format(channel=str(self.etrace.stats.channel),
                                               sweight=str(self.s_results.weight)))
                    ax2.set_title(title.format(channel=self.etrace.stats.channel, sweight=self.s_results.weight))
                ax2.legend(loc=1)
                ax2.set_yticks([])
                ax2.set_ylim([-1.5, 1.5])
@@ -705,19 +671,15 @@ class AutopickStation(object):

                # plot N trace
                ax3 = fig.add_subplot(3, 1, 3, sharex=ax1)
                th2data = np.linspace(0, self.ntrace.stats.endtime - self.ntrace.stats.starttime,
                                      self.ntrace.stats.npts)
                # plot trace
                ax3.plot(th2data, self.ntrace.data / max(self.ntrace.data), color=linecolor, linewidth=0.7,
                         label='Data')
                if self.p_results.weight < 4:
                    p22, = ax3.plot(self.arhcf1.getTimeArray(), self.arhcf1.getCF() / max(self.arhcf1.getCF()), 'b',
                                    label='CF1')
                if self.s_data.aicSflag == 1:
                    aicarhpick = self.aicarhpick
                    refSpick = self.refSpick
                    ax3.plot(self.arhcf2.getTimeArray(), self.arhcf2.getCF() / max(self.arhcf2.getCF()), 'm',
                             label='CF2')
                    ax3.plot([aicarhpick.getpick(), aicarhpick.getpick()], [-1, 1], 'g', label='Initial S Onset')
                    ax3.plot([aicarhpick.getpick() - 0.5, aicarhpick.getpick() + 0.5], [1, 1], 'g')
                    ax3.plot([aicarhpick.getpick() - 0.5, aicarhpick.getpick() + 0.5], [-1, -1], 'g')
@@ -727,15 +689,6 @@ class AutopickStation(object):
                    ax3.plot([refSpick.getpick() - 0.5, refSpick.getpick() + 0.5], [-1.3, -1.3], 'g', linewidth=2)
                    ax3.plot([self.s_results.lpp, self.s_results.lpp], [-1.1, 1.1], 'g--', label='lpp')
                    ax3.plot([self.s_results.epp, self.s_results.epp], [-1.1, 1.1], 'g--', label='epp')

                # plot pickwindows for S
                sstart, sstop = self.pickparams['sstart'], self.pickparams['sstop']
                if sstart is not None and sstop is not None:
                    for axis in [ax2, ax3]:
                        axis.axvspan(sstart, sstop, color='b', alpha=0.1, zorder=0, label='S window')
                        if self.estFirstS is not None:
                            axis.axvline(self.estFirstS, ls='dashed', color='b', alpha=0.4, label='TauPy estimate')

                ax3.legend(loc=1)
                ax3.set_yticks([])
                ax3.set_ylim([-1.5, 1.5])
@@ -767,8 +720,7 @@ class AutopickStation(object):
        if aicpick.getpick() is None:
            msg = "Bad initial (AIC) P-pick, skipping this onset!\nAIC-SNR={0}, AIC-Slope={1}counts/s\n " \
                  "(min. AIC-SNR={2}, min. AIC-Slope={3}counts/s)"
            msg = msg.format(aicpick.getSNR(), aicpick.getSlope(), self.pickparams["minAICPSNR"],
                             self.pickparams["minAICPslope"])
            self.vprint(msg)
            return 0
        # Quality check initial pick with minimum signal length
@@ -778,16 +730,14 @@ class AutopickStation(object):
        if len(self.nstream) == 0 or len(self.estream) == 0:
            msg = 'One or more horizontal component(s) missing!\n' \
                  'Signal length only checked on vertical component!\n' \
                  'Decreasing minsiglength from {0} to {1}' \
                  .format(minsiglength, minsiglength / 2)
            self.vprint(msg)
            minsiglength = minsiglength / 2
        else:
            # filter, taper other traces as well since signal length is compared on all traces
            trH1_filt, _ = self.prepare_wfstream(self.estream, freqmin=self.pickparams["bph1"][0],
                                                 freqmax=self.pickparams["bph1"][1])
            trH2_filt, _ = self.prepare_wfstream(self.nstream, freqmin=self.pickparams["bph1"][0],
                                                 freqmax=self.pickparams["bph1"][1])
            zne += trH1_filt
            zne += trH2_filt
            minsiglength = minsiglength
@@ -834,6 +784,15 @@ class AutopickStation(object):
        # save filtered trace in instance for later plotting
        self.tr_filt_z_bpz2 = tr_filt

        if get_Bool(self.pickparams['use_taup']) is True and self.origin is not None:
            try:
                # modify pstart, pstop to be around theoretical onset if taupy should be used, else does nothing
                self.modify_starttimes_taupy()
            except AttributeError as ae:
                print(ae)
            except MissingTraceException as mte:
                print(mte)

        Lc = self.pickparams['pstop'] - self.pickparams['pstart']

        Lwf = self.ztrace.stats.endtime - self.ztrace.stats.starttime
@@ -858,34 +817,24 @@ class AutopickStation(object):
            self.cf1 = None
        assert isinstance(self.cf1, CharacteristicFunction), 'cf1 is not set correctly: maybe the algorithm name ({})' \
                                                             ' is corrupted'.format(self.pickparams["algoP"])
        # get the original waveform stream from first CF class cut to identical length as CF for plotting
        cut_ogstream = self.cf1.getDataArray(self.cf1.getCut())

        # MP: Rename to cf_stream for further use of z_copy and to prevent chaos when z_copy suddenly becomes a cf
        # stream and later again a waveform stream
        cf_stream = z_copy.copy()
        cf_stream[0].data = self.cf1.getCF()

        # calculate AIC cf from first cf (either HOS or ARZ)
        aiccf = AICcf(cf_stream, cuttimes)
        z_copy[0].data = self.cf1.getCF()
        aiccf = AICcf(z_copy, cuttimes)
        # get preliminary onset time from AIC-CF
        self.set_current_figure('aicFig')
        aicpick = AICPicker(aiccf, self.pickparams["tsnrz"], self.pickparams["pickwinP"], self.iplot,
                            Tsmooth=self.pickparams["aictsmooth"], fig=self.current_figure,
                            linecolor=self.current_linecolor, ogstream=cut_ogstream)
                            Tsmooth=self.pickparams["aictsmooth"], fig=self.current_figure, linecolor=self.current_linecolor)
        # save aicpick for plotting later
        self.p_data.aicpick = aicpick
        # add pstart and pstop to aic plot
        if self.current_figure:
            # TODO remove plotting from picking, make own plot function
            for ax in self.current_figure.axes:
                ax.vlines(self.pickparams["pstart"], ax.get_ylim()[0], ax.get_ylim()[1], color='c',
                          linestyles='dashed', label='P start')
                ax.vlines(self.pickparams["pstop"], ax.get_ylim()[0], ax.get_ylim()[1], color='c',
                          linestyles='dashed', label='P stop')
                ax.legend(loc=1)

        Pflag = self._pick_p_quality_control(aicpick, cf_stream, tr_filt)
        Pflag = self._pick_p_quality_control(aicpick, z_copy, tr_filt)
        # go on with processing if AIC onset passes quality control
        slope = aicpick.getSlope()
        if not slope: slope = 0
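AICPicker's underlying idea is that the Akaike Information Criterion of splitting the characteristic function into a "noise" and a "signal" segment is minimal at the onset. A pure-Python sketch of the classic Maeda (1985) formulation (not PyLoT's implementation, which works on the smoothed CF and adds slope/SNR checks):

```python
import math


def _var(x):
    """Population variance without external dependencies."""
    m = sum(x) / len(x)
    return sum((xi - m) ** 2 for xi in x) / len(x)


def maeda_aic(data):
    """AIC of splitting the series at sample k:
    AIC(k) = k*log(var(data[:k])) + (n-k-1)*log(var(data[k:])).
    The index of the minimum marks the most likely onset."""
    n = len(data)
    aic = [float('inf')] * n
    for k in range(2, n - 2):
        v1, v2 = _var(data[:k]), _var(data[k:])
        if v1 > 0 and v2 > 0:  # skip degenerate (constant) segments
            aic[k] = k * math.log(v1) + (n - k - 1) * math.log(v2)
    return aic.index(min(aic))
```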
@@ -896,8 +845,7 @@ class AutopickStation(object):
            error_msg = 'AIC P onset slope too small: got {}, min {}'.format(slope, self.pickparams["minAICPslope"])
            raise PickingFailedException(error_msg)
        if aicpick.getSNR() < self.pickparams["minAICPSNR"]:
            error_msg = 'AIC P onset SNR too small: got {}, min {}'.format(aicpick.getSNR(),
                                                                           self.pickparams["minAICPSNR"])
            raise PickingFailedException(error_msg)

        self.p_data.p_aic_plot_flag = 1
@@ -905,8 +853,7 @@ class AutopickStation(object):
              'autopickstation: re-filtering vertical trace...'.format(aicpick.getSlope(), aicpick.getSNR())
        self.vprint(msg)
        # refilter waveform with larger bandpass
        tr_filt, z_copy = self.prepare_wfstream(self.zstream, freqmin=self.pickparams["bpz2"][0],
                                                freqmax=self.pickparams["bpz2"][1])
        # save filtered trace in instance for later plotting
        self.tr_filt_z_bpz2 = tr_filt
        # determine new times around initial onset
@@ -918,29 +865,25 @@ class AutopickStation(object):
        else:
            self.cf2 = None
        assert isinstance(self.cf2, CharacteristicFunction), 'cf2 is not set correctly: maybe the algorithm name ({}) is ' \
                                                             'corrupted'.format(self.pickparams["algoP"])
        self.set_current_figure('refPpick')
        # get refined onset time from CF2
        refPpick = PragPicker(self.cf2, self.pickparams["tsnrz"], self.pickparams["pickwinP"], self.iplot,
                              self.pickparams["ausP"],
                              self.pickparams["tsmoothP"], aicpick.getpick(), self.current_figure,
                              self.current_linecolor, ogstream=cut_ogstream)
        refPpick = PragPicker(self.cf2, self.pickparams["tsnrz"], self.pickparams["pickwinP"], self.iplot, self.pickparams["ausP"],
                              self.pickparams["tsmoothP"], aicpick.getpick(), self.current_figure, self.current_linecolor)
        # save PragPicker result for plotting
        self.p_data.refPpick = refPpick
        self.p_results.mpp = refPpick.getpick()
        if self.p_results.mpp is None:
            msg = 'Bad initial (AIC) P-pick, skipping this onset!\n AIC-SNR={}, AIC-Slope={}counts/s\n' \
                  '(min. AIC-SNR={}, min. AIC-Slope={}counts/s)'
            msg = msg.format(aicpick.getSNR(), aicpick.getSlope(), self.pickparams["minAICPSNR"],
                             self.pickparams["minAICPslope"])
            self.vprint(msg)
            self.s_data.Sflag = 0
            raise PickingFailedException(msg)
        # quality assessment, get earliest/latest pick and symmetrized uncertainty
        # todo quality assessment in own function
        self.set_current_figure('el_Ppick')
        elpicker_results = earllatepicker(z_copy, self.pickparams["nfacP"], self.pickparams["tsnrz"],
                                          self.p_results.mpp,
                                          self.iplot, fig=self.current_figure, linecolor=self.current_linecolor)
        self.p_results.epp, self.p_results.lpp, self.p_results.spe = elpicker_results
        snr_results = getSNR(z_copy, self.pickparams["tsnrz"], self.p_results.mpp)
@@ -948,11 +891,10 @@ class AutopickStation(object):

# weight P-onset using symmetric error
self.p_results.weight = get_quality_class(self.p_results.spe, self.pickparams["timeerrorsP"])
if self.p_results.weight <= self.pickparams["minfmweight"] and self.p_results.snr >= self.pickparams[
"minFMSNR"]:
if self.p_results.weight <= self.pickparams["minfmweight"] and self.p_results.snr >= self.pickparams["minFMSNR"]:
# if SNR is high enough, try to determine first motion of onset
self.set_current_figure('fm_picker')
self.p_results.fm = fmpicker(self.zstream.copy(), z_copy, self.pickparams["fmpickwin"], self.p_results.mpp,
self.p_results.fm = fmpicker(self.zstream, z_copy, self.pickparams["fmpickwin"], self.p_results.mpp,
self.iplot, self.current_figure, self.current_linecolor)
msg = "autopickstation: P-weight: {}, SNR: {}, SNR[dB]: {}, Polarity: {}"
msg = msg.format(self.p_results.weight, self.p_results.snr, self.p_results.snrdb, self.p_results.fm)
@@ -1022,7 +964,7 @@ class AutopickStation(object):
trH1_filt, _ = self.prepare_wfstream(self.zstream, filter_freq_min, filter_freq_max)
trH2_filt, _ = self.prepare_wfstream(self.estream, filter_freq_min, filter_freq_max)
trH3_filt, _ = self.prepare_wfstream(self.nstream, filter_freq_min, filter_freq_max)
h_copy = self.hdat.copy()
h_copy =self. hdat.copy()
h_copy[0].data = trH1_filt.data
h_copy[1].data = trH2_filt.data
h_copy[2].data = trH3_filt.data
@@ -1164,11 +1106,9 @@ class AutopickStation(object):
''.format(self.s_results.weight, self.s_results.snr, self.s_results.snrdb))

def pick_s_phase(self):
if get_bool(self.pickparams.get('use_taup')) is True:
cuttimesh = (self.pickparams.get('sstart'), self.pickparams.get('sstop'))
else:
# determine time window for calculating CF after P onset
cuttimesh = self._calculate_cuttimes(type='S', iteration=1)

# determine time window for calculating CF after P onset
cuttimesh = self._calculate_cuttimes(type='S', iteration=1)

# calculate autoregressive CF
self.arhcf1 = self._calculate_autoregressive_cf_s_pick(cuttimesh)
@@ -1176,14 +1116,10 @@ class AutopickStation(object):
# calculate AIC cf
haiccf = self._calculate_aic_cf_s_pick(cuttimesh)

# get the original waveform stream cut to identical length as CF for plotting
ogstream = haiccf.getDataArray(haiccf.getCut())

# get preliminary onset time from AIC cf
self.set_current_figure('aicARHfig')
aicarhpick = AICPicker(haiccf, self.pickparams["tsnrh"], self.pickparams["pickwinS"], self.iplot,
Tsmooth=self.pickparams["aictsmoothS"], fig=self.current_figure,
linecolor=self.current_linecolor, ogstream=ogstream)
Tsmooth=self.pickparams["aictsmoothS"], fig=self.current_figure, linecolor=self.current_linecolor)
# save pick for later plotting
self.aicarhpick = aicarhpick

@@ -1194,10 +1130,8 @@ class AutopickStation(object):

# get refined onset time from CF2
self.set_current_figure('refSpick')
refSpick = PragPicker(arhcf2, self.pickparams["tsnrh"], self.pickparams["pickwinS"], self.iplot,
self.pickparams["ausS"],
self.pickparams["tsmoothS"], aicarhpick.getpick(), self.current_figure,
self.current_linecolor)
refSpick = PragPicker(arhcf2, self.pickparams["tsnrh"], self.pickparams["pickwinS"], self.iplot, self.pickparams["ausS"],
self.pickparams["tsmoothS"], aicarhpick.getpick(), self.current_figure, self.current_linecolor)
# save refSpick for later plotting
self.refSpick = refSpick
self.s_results.mpp = refSpick.getpick()
@@ -1221,6 +1155,7 @@ class AutopickStation(object):
self.current_linecolor = plot_style['linecolor']['rgba_mpl']



def autopickstation(wfstream, pickparam, verbose=False, iplot=0, fig_dict=None, metadata=None, origin=None):
"""
Main function to calculate picks for the station.
@@ -1308,11 +1243,11 @@ def iteratepicker(wf, NLLocfile, picks, badpicks, pickparameter, fig_dict=None):
print(
"iteratepicker: The following picking parameters have been modified for iterative picking:")
print(
"pstart: %fs => %fs" % (pstart_old, pickparameter.get('pstart')))
"pstart: %fs => %fs" % (pstart_old, pickparameter.get('pstart')))
print(
"pstop: %fs => %fs" % (pstop_old, pickparameter.get('pstop')))
"pstop: %fs => %fs" % (pstop_old, pickparameter.get('pstop')))
print(
"sstop: %fs => %fs" % (sstop_old, pickparameter.get('sstop')))
"sstop: %fs => %fs" % (sstop_old, pickparameter.get('sstop')))
print("pickwinP: %fs => %fs" % (
pickwinP_old, pickparameter.get('pickwinP')))
print("Precalcwin: %fs => %fs" % (
@@ -16,16 +16,11 @@ autoregressive prediction: application to local and regional distances, Geophys.

:author: MAGS2 EP3 working group
"""

import numpy as np
try:
from scipy.signal import tukey
except ImportError:
from scipy.signal.windows import tukey

from scipy import signal
from obspy.core import Stream

from pylot.core.pick.utils import PickingFailedException


class CharacteristicFunction(object):
"""
@@ -60,7 +55,7 @@ class CharacteristicFunction(object):
self.setOrder(order)
self.setFnoise(fnoise)
self.setARdetStep(t2)
self.calcCF()
self.calcCF(self.getDataArray())
self.arpara = np.array([])
self.xpred = np.array([])

@@ -155,7 +150,7 @@ class CharacteristicFunction(object):
if self.cut[0] == 0 and self.cut[1] == 0:
start = 0
stop = len(self.orig_data[0])
elif self.cut[0] == 0 and self.cut[1] != 0:
elif self.cut[0] == 0 and self.cut[1] is not 0:
start = 0
stop = self.cut[1] / self.dt
else:
@@ -164,7 +159,7 @@ class CharacteristicFunction(object):
zz = self.orig_data.copy()
z1 = zz[0].copy()
zz[0].data = z1.data[int(start):int(stop)]
if zz[0].stats.npts == 0:  # cut times do not fit data length!
if zz[0].stats.npts == 0: # cut times do not fit data length!
zz[0].data = z1.data  # take entire data
data = zz
return data
@@ -172,7 +167,7 @@ class CharacteristicFunction(object):
if self.cut[0] == 0 and self.cut[1] == 0:
start = 0
stop = min([len(self.orig_data[0]), len(self.orig_data[1])])
elif self.cut[0] == 0 and self.cut[1] != 0:
elif self.cut[0] == 0 and self.cut[1] is not 0:
start = 0
stop = min([self.cut[1] / self.dt, len(self.orig_data[0]),
len(self.orig_data[1])])
@@ -192,7 +187,7 @@ class CharacteristicFunction(object):
start = 0
stop = min([self.cut[1] / self.dt, len(self.orig_data[0]),
len(self.orig_data[1]), len(self.orig_data[2])])
elif self.cut[0] == 0 and self.cut[1] != 0:
elif self.cut[0] == 0 and self.cut[1] is not 0:
start = 0
stop = self.cut[1] / self.dt
else:
@@ -212,15 +207,17 @@ class CharacteristicFunction(object):
data = self.orig_data.copy()
return data

def calcCF(self):
pass
def calcCF(self, data=None):
self.cf = data


class AICcf(CharacteristicFunction):

def calcCF(self):
def calcCF(self, data):
"""
Function to calculate the Akaike Information Criterion (AIC) after Maeda (1985).
:param data: data, time series (whether seismogram or CF)
:type data: tuple
:return: AIC function
:rtype:
"""
@@ -229,7 +226,7 @@ class AICcf(CharacteristicFunction):
ind = np.where(~np.isnan(xnp))[0]
if ind.size:
xnp[:ind[0]] = xnp[ind[0]]
xnp = tukey(len(xnp), alpha=0.05) * xnp
xnp = signal.tukey(len(xnp), alpha=0.05) * xnp
xnp = xnp - np.mean(xnp)
datlen = len(xnp)
k = np.arange(1, datlen)
@@ -244,7 +241,7 @@
ff = np.where(inf is True)
if len(ff) >= 1:
cf[ff] = 0


self.cf = cf - np.mean(cf)
self.xcf = x

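The `AICcf` hunks above compute the criterion after Maeda (1985), which evaluates the AIC directly from the time series rather than from fitted AR models: the sample that best splits the trace into two segments of minimal variance marks the most likely onset. A minimal, self-contained sketch of that formula (the function name and synthetic trace are mine, not PyLoT code):

```python
import numpy as np

def aic_maeda(x):
    # AIC(k) = k * log(var(x[:k])) + (n - k - 1) * log(var(x[k:]));
    # the global minimum of this curve marks the most likely onset sample
    x = np.asarray(x, dtype=float)
    n = len(x)
    aic = np.full(n, np.nan)
    for k in range(1, n - 1):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return aic

# synthetic trace: weak noise followed by a five times stronger signal at sample 100
rng = np.random.default_rng(0)
trace = np.concatenate([0.2 * rng.standard_normal(100), rng.standard_normal(100)])
onset = int(np.nanargmin(aic_maeda(trace)))
```

Tapering and smoothing the AIC curve, as done in the hunks above, suppress spurious side minima before this argmin is taken.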
@@ -258,11 +255,13 @@ class HOScf(CharacteristicFunction):
"""
super(HOScf, self).__init__(data, cut, pickparams["tlta"], pickparams["hosorder"])

def calcCF(self):
def calcCF(self, data):
"""
Function to calculate skewness (statistics of order 3) or kurtosis
(statistics of order 4), using one long moving window, as published
in Kueperkoch et al. (2010), or order 2, i.e. STA/LTA.
in Kueperkoch et al. (2010).
:param data: data, time series (whether seismogram or CF)
:type data: tuple
:return: HOS cf
:rtype:
"""
@@ -306,7 +305,7 @@
if ind.size:
first = ind[0]
LTA[:first] = LTA[first]


self.cf = LTA
self.xcf = x

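The higher-order-statistics CF described in the `HOScf` docstring deflects when skewness or kurtosis of a long trailing window departs from its Gaussian-noise value. A rough illustration of the idea (a simplification, not PyLoT's implementation; window handling and normalization are my own):

```python
import numpy as np

def hos_cf(x, nlta, order=4):
    # trailing-window higher-order statistics: order 3 = skewness,
    # order 4 = kurtosis (close to 3 for pure Gaussian noise)
    x = np.asarray(x, dtype=float)
    cf = np.zeros(len(x))
    for j in range(nlta, len(x)):
        w = x[j - nlta:j]
        s = w.std()
        if s > 0:
            cf[j] = np.mean((w - w.mean()) ** order) / s ** order
    return cf

rng = np.random.default_rng(1)
# an amplitude jump at sample 200 mimics a phase onset
sig = np.concatenate([rng.standard_normal(200), 5.0 * rng.standard_normal(200)])
cf = hos_cf(sig, nlta=100)
```

While the window straddles the onset it contains a mixture of weak and strong samples, which is heavy-tailed, so the kurtosis CF peaks shortly after the onset.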
@@ -314,13 +313,14 @@

class ARZcf(CharacteristicFunction):

def __init__(self, data, cut, t1, t2, pickparams):
super(ARZcf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Parorder"],
fnoise=pickparams["addnoise"])
super(ARZcf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Parorder"], fnoise=pickparams["addnoise"])

def calcCF(self):
def calcCF(self, data):
"""
function used to calculate the AR prediction error from a single vertical trace. Can be used to pick
P onsets.
:param data:
:type data: ~obspy.core.stream.Stream
:return: ARZ cf
:rtype:
"""
@@ -448,15 +448,16 @@

class ARHcf(CharacteristicFunction):

def __init__(self, data, cut, t1, t2, pickparams):
super(ARHcf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Sarorder"],
fnoise=pickparams["addnoise"])
super(ARHcf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Sarorder"], fnoise=pickparams["addnoise"])

def calcCF(self):
def calcCF(self, data):
"""
Function to calculate a characteristic function using autoregressive modelling of the waveform of
both horizontal traces.
The waveform is predicted in a moving time window using the calculated AR parameters. The difference
between the predicted and the actual waveform serves as a characteristic function.
:param data: waveform stream
:type data: ~obspy.core.stream.Stream
:return: ARH cf
:rtype:
"""
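The `ARHcf` docstring describes the autoregressive approach: AR coefficients are estimated in a moving window, the waveform is predicted forward, and the prediction error forms the CF, which grows sharply when the signal character changes at an onset. A toy single-trace version (a least-squares fit stands in for PyLoT's own AR estimation; all names here are illustrative):

```python
import numpy as np

def ar_prediction_cf(x, order=4, nwin=60):
    # fit AR(order) coefficients on a trailing window by least squares,
    # predict the next sample, and use the squared prediction error as CF
    x = np.asarray(x, dtype=float)
    cf = np.zeros(len(x))
    for j in range(nwin, len(x)):
        w = x[j - nwin:j]
        # row i of A holds the `order` samples preceding w[order + i]
        A = np.column_stack([w[order - k - 1:nwin - k - 1] for k in range(order)])
        coef, *_ = np.linalg.lstsq(A, w[order:], rcond=None)
        pred = x[j - order:j][::-1] @ coef  # predict x[j] from its lags
        cf[j] = (x[j] - pred) ** 2
    return cf

rng = np.random.default_rng(2)
# prediction error grows once the signal character changes at sample 200
sig = np.concatenate([0.1 * rng.standard_normal(200), rng.standard_normal(200)])
cf = ar_prediction_cf(sig)
```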
@@ -464,9 +465,6 @@
print('Calculating AR-prediction error from both horizontal traces ...')

xnp = self.getDataArray(self.getCut())
if len(xnp[0]) == 0:
raise PickingFailedException('calcCF: Found empty data trace for cut times. Return')

n0 = np.isnan(xnp[0].data)
if len(n0) > 1:
xnp[0].data[n0] = 0
@@ -602,15 +600,16 @@

class AR3Ccf(CharacteristicFunction):

def __init__(self, data, cut, t1, t2, pickparams):
super(AR3Ccf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Sarorder"],
fnoise=pickparams["addnoise"])
super(AR3Ccf, self).__init__(data, cut, t1=t1, t2=t2, order=pickparams["Sarorder"], fnoise=pickparams["addnoise"])

def calcCF(self):
def calcCF(self, data):
"""
Function to calculate a characteristic function using autoregressive modelling of the waveform of
all three traces.
The waveform is predicted in a moving time window using the calculated AR parameters. The difference
between the predicted and the actual waveform serves as a characteristic function
:param data: stream holding all three traces
:type data: ~obspy.core.stream.Stream
:return: AR3C cf
:rtype:
"""
@@ -2,11 +2,10 @@
# -*- coding: utf-8 -*-

import copy
import operator
import os

import matplotlib.pyplot as plt
import numpy as np
import operator
import os
from obspy.core import AttribDict

from pylot.core.util.pdf import ProbabilityDensityFunction
@@ -118,7 +117,7 @@ class Comparison(object):

pdf_a = self.get('auto').generate_pdf_data(type)
pdf_b = self.get('manu').generate_pdf_data(type)


for station, phases in pdf_a.items():
if station in pdf_b.keys():
compare_pdf = dict()
@@ -402,8 +401,6 @@ class PDFstatistics(object):
Takes a path as argument.
"""

# TODO: change root to datapath

def __init__(self, directory):
"""Initiates some values needed when dealing with pdfs later"""
self._rootdir = directory
@@ -19,10 +19,9 @@ calculated after Diehl & Kissling (2009).

:author: MAGS2 EP3 working group / Ludger Kueperkoch
"""

import warnings

import matplotlib.pyplot as plt
import numpy as np
import warnings
from scipy.signal import argrelmax, argrelmin

from pylot.core.pick.charfuns import CharacteristicFunction
@@ -37,8 +36,7 @@ class AutoPicker(object):

warnings.simplefilter('ignore')

def __init__(self, cf, TSNR, PickWindow, iplot=0, aus=None, Tsmooth=None, Pick1=None,
fig=None, linecolor='k', ogstream=None):
def __init__(self, cf, TSNR, PickWindow, iplot=0, aus=None, Tsmooth=None, Pick1=None, fig=None, linecolor='k'):
"""
Create AutoPicker object
:param cf: characteristic function, on which the picking algorithm is applied
@@ -60,15 +58,12 @@ class AutoPicker(object):
:type fig: `~matplotlib.figure.Figure`
:param linecolor: matplotlib line color string
:type linecolor: str
:param ogstream: original stream (waveform), e.g. for plotting purposes
:type ogstream: `~obspy.core.stream.Stream`
"""

assert isinstance(cf, CharacteristicFunction), "%s is not a CharacteristicFunction object" % str(cf)
self._linecolor = linecolor
self._pickcolor_p = 'b'
self.cf = cf.getCF()
self.ogstream = ogstream
self.Tcf = cf.getTimeArray()
self.Data = cf.getXCF()
self.dt = cf.getIncrement()
@@ -177,14 +172,12 @@ class AICPicker(AutoPicker):
nn = np.isnan(self.cf)
if len(nn) > 1:
self.cf[nn] = 0
# taper AIC-CF to get rid of side maxima
# taper AIC-CF to get rid off side maxima
tap = np.hanning(len(self.cf))
aic = tap * self.cf + max(abs(self.cf))
# smooth AIC-CF
ismooth = int(round(self.Tsmooth / self.dt))
# MP MP better start with original data than zeros if array shall be smoothed, created artificial value before
# when starting with i in range(1...) loop below and subtracting offset afterwards
aicsmooth = np.copy(aic)
aicsmooth = np.zeros(len(aic))
if len(aic) < ismooth:
print('AICPicker: Tsmooth larger than CF!')
return
@@ -194,7 +187,7 @@ class AICPicker(AutoPicker):
ii1 = i - ismooth
aicsmooth[i] = aicsmooth[i - 1] + (aic[i] - aic[ii1]) / ismooth
else:
aicsmooth[i] = np.mean(aic[0: i])  # MP MP created np.nan for i=1
aicsmooth[i] = np.mean(aic[1: i])
# remove offset in AIC function
offset = abs(min(aic) - min(aicsmooth))
aicsmooth = aicsmooth - offset
@@ -203,7 +196,7 @@
# minimum in AIC function
icfmax = np.argmax(cf)

# TODO: If this shall be kept, maybe add thresh_factor to pylot parameters
# MP MP testing threshold
thresh_hit = False
thresh_factor = 0.7
thresh = thresh_factor * cf[icfmax]
@@ -215,6 +208,7 @@
if sample <= cf[index - 1]:
icfmax = index - 1
break
# MP MP ---

# find minimum in AIC-CF front of maximum of HOS/AR-CF
lpickwindow = int(round(self.PickWindow / self.dt))
@@ -320,7 +314,16 @@
plt.close(fig)
return
iislope = islope[0][0:imax + 1]
dataslope = self.Data[0].data[iislope]
# MP MP change slope calculation
# get all maxima of aicsmooth
iaicmaxima = argrelmax(aicsmooth)[0]
# get first index of maximum after pickindex (indices saved in iaicmaxima)
aicmax = iaicmaxima[np.where(iaicmaxima > pickindex)[0]]
if len(aicmax) > 0:
iaicmax = aicmax[0]
else:
iaicmax = -1
dataslope = aicsmooth[pickindex: iaicmax]
# calculate slope as polynomial fit of order 1
xslope = np.arange(0, len(dataslope), 1)
try:
@@ -331,8 +334,8 @@
else:
self.slope = 1 / (len(dataslope) * self.Data[0].stats.delta) * (datafit[-1] - datafit[0])
# normalize slope to maximum of cf to make it unit independent
self.slope /= self.Data[0].data[icfmax]
except Exception as e:
self.slope /= aicsmooth[iaicmax]
except ValueError as e:
print("AICPicker: Problems with data fitting! {}".format(e))

else:
@@ -351,12 +354,6 @@
self.Tcf = self.Tcf[0:len(self.Tcf) - 1]
ax1.plot(self.Tcf, cf / max(cf), color=self._linecolor, linewidth=0.7, label='(HOS-/AR-) Data')
ax1.plot(self.Tcf, aicsmooth / max(aicsmooth), 'r', label='Smoothed AIC-CF')
# plot the original waveform also for evaluation of the CF and pick
if self.ogstream:
data = self.ogstream[0].data
if len(data) == len(self.Tcf):
ax1.plot(self.Tcf, 0.5 * data / max(data), 'k', label='Seismogram', alpha=0.3, zorder=0,
lw=0.5)
if self.Pick is not None:
ax1.plot([self.Pick, self.Pick], [-0.1, 0.5], 'b', linewidth=2, label='AIC-Pick')
ax1.set_xlabel('Time [s] since %s' % self.Data[0].stats.starttime)
@@ -377,7 +374,7 @@
label='Signal Window')
ax2.axvspan(self.Tcf[iislope[0]], self.Tcf[iislope[-1]], color='g', alpha=0.2, lw=0,
label='Slope Window')
ax2.plot(self.Tcf[iislope], datafit, 'g', linewidth=2,
ax2.plot(self.Tcf[pickindex: iaicmax], datafit, 'g', linewidth=2,
label='Slope')  # MP MP changed temporarily!

if self.slope is not None:
@@ -479,7 +476,7 @@ class PragPicker(AutoPicker):
cfpick_r = 0
cfpick_l = 0
lpickwindow = int(round(self.PickWindow / self.dt))
# for i in range(max(np.insert(ipick, 0, 2)), min([ipick1 + lpickwindow + 1, len(self.cf) - 1])):
#for i in range(max(np.insert(ipick, 0, 2)), min([ipick1 + lpickwindow + 1, len(self.cf) - 1])):
# # local minimum
# if self.cf[i + 1] > self.cf[i] <= self.cf[i - 1]:
# if cfsmooth[i - 1] * (1 + aus1) >= cfsmooth[i]:
@@ -511,9 +508,9 @@
if flagpick_l > 0 and flagpick_r > 0 and cfpick_l <= 3 * cfpick_r:
self.Pick = pick_l
pickflag = 1
# elif flagpick_l > 0 and flagpick_r > 0 and cfpick_l >= cfpick_r:
# self.Pick = pick_r
# pickflag = 1
elif flagpick_l > 0 and flagpick_r > 0 and cfpick_l >= cfpick_r:
self.Pick = pick_r
pickflag = 1
elif flagpick_l == 0 and flagpick_r > 0 and cfpick_l >= cfpick_r:
self.Pick = pick_l
pickflag = 1
@@ -7,15 +7,13 @@

:author: Ludger Kueperkoch, BESTEC GmbH
"""

import warnings

import matplotlib.pyplot as plt
import numpy as np
from obspy.core import Stream, UTCDateTime
from scipy.signal import argrelmax
from obspy.core import Stream, UTCDateTime

from pylot.core.util.utils import get_bool, get_none, SetChannelComponents, common_range
from pylot.core.util.utils import get_Bool, get_None, SetChannelComponents


def earllatepicker(X, nfac, TSNR, Pick1, iplot=0, verbosity=1, fig=None, linecolor='k'):
@@ -62,8 +60,8 @@ def earllatepicker(X, nfac, TSNR, Pick1, iplot=0, verbosity=1, fig=None, linecol
plt_flag = 0
try:
iplot = int(iplot)
except ValueError:
if get_bool(iplot):
except:
if get_Bool(iplot):
iplot = 2
else:
iplot = 0
@@ -74,7 +72,7 @@

x = X[0].data
t = np.linspace(0, X[0].stats.endtime - X[0].stats.starttime,
X[0].stats.npts)
X[0].stats.npts)
inoise = getnoisewin(t, Pick1, TSNR[0], TSNR[1])
# get signal window
isignal = getsignalwin(t, Pick1, TSNR[2])
@@ -136,7 +134,7 @@
PickError = symmetrize_error(diffti_te, diffti_tl)

if iplot > 1:
if get_none(fig) is None:
if get_None(fig) is None:
fig = plt.figure() # iplot)
plt_flag = 1
fig._tight = True
@@ -219,7 +217,7 @@ def fmpicker(Xraw, Xfilt, pickwin, Pick, iplot=0, fig=None, linecolor='k'):
xraw = Xraw[0].data
xfilt = Xfilt[0].data
t = np.linspace(0, Xraw[0].stats.endtime - Xraw[0].stats.starttime,
Xraw[0].stats.npts)
Xraw[0].stats.npts)
# get pick window
ipick = np.where((t <= min([Pick + pickwin, len(Xraw[0])])) & (t >= Pick))
if len(ipick[0]) <= 1:
@@ -344,7 +342,7 @@ def fmpicker(Xraw, Xfilt, pickwin, Pick, iplot=0, fig=None, linecolor='k'):
print("fmpicker: Found polarity %s" % FM)

if iplot > 1:
if get_none(fig) is None:
if get_None(fig) is None:
fig = plt.figure() # iplot)
plt_flag = 1
fig._tight = True
@@ -537,10 +535,9 @@ def getslopewin(Tcf, Pick, tslope):
:rtype: `numpy.ndarray`
"""
# TODO: fill out docstring
slope = np.where((Tcf <= min(Pick + tslope, Tcf[-1])) & (Tcf >= Pick))
slope = np.where( (Tcf <= min(Pick + tslope, Tcf[-1])) & (Tcf >= Pick) )
return slope[0]


def getResolutionWindow(snr, extent):
"""
Produce the half of the time resolution window width from given SNR value
@@ -598,8 +595,6 @@ def select_for_phase(st, phase):
alter_comp = compclass.getCompPosition(comp)
alter_comp = str(alter_comp[0])
sel_st += st.select(component=comp)
if len(sel_st) < 1:
sel_st += st.select(component="Q")
sel_st += st.select(component=alter_comp)
elif phase.upper() == 'S':
comps = 'NE'
@@ -816,7 +811,7 @@ def checksignallength(X, pick, minsiglength, pickparams, iplot=0, fig=None, line
try:
iplot = int(iplot)
except:
if get_bool(iplot):
if get_Bool(iplot):
iplot = 2
else:
iplot = 0
@@ -828,22 +823,14 @@
if len(X) > 1:
# all three components available
# make sure, all components have equal lengths
earliest_starttime = min(tr.stats.starttime for tr in X)
cuttimes = common_range(X)
X = X.slice(cuttimes[0], cuttimes[1])
x1, x2, x3 = X[:3]

if not (len(x1) == len(x2) == len(x3)):
raise PickingFailedException('checksignallength: unequal lengths of components!')

ilen = min([len(X[0].data), len(X[1].data), len(X[2].data)])
x1 = X[0][0:ilen]
x2 = X[1][0:ilen]
x3 = X[2][0:ilen]
# get RMS trace
rms = np.sqrt((np.power(x1, 2) + np.power(x2, 2) + np.power(x3, 2)) / 3)
ilen = len(rms)
dt = earliest_starttime - X[0].stats.starttime
pick -= dt
else:
x1 = X[0].data
x2 = x3 = None
ilen = len(x1)
rms = abs(x1)

@@ -858,7 +845,7 @@
print("Presumably picked noise peak, pick is rejected!")
print("(min. signal length required: %s s)" % minsiglength)
returnflag = 0
else:
else:
# calculate minimum adjusted signal level
minsiglevel = np.mean(rms[inoise]) * nfac
# minimum adjusted number of samples over minimum signal level
@@ -876,16 +863,12 @@
returnflag = 0

if iplot > 1:
if get_none(fig) is None:
if get_None(fig) is None:
fig = plt.figure() # iplot)
plt_flag = 1
fig._tight = True
ax = fig.add_subplot(111)
ax.plot(t, rms, color=linecolor, linewidth=0.7, label='RMS Data')
ax.plot(t, x1, 'k', alpha=0.3, lw=0.3, zorder=0)
if x2 is not None and x3 is not None:
ax.plot(t, x2, 'r', alpha=0.3, lw=0.3, zorder=0)
ax.plot(t, x3, 'g', alpha=0.3, lw=0.3, zorder=0)
ax.axvspan(t[inoise[0]], t[inoise[-1]], color='y', alpha=0.2, lw=0, label='Noise Window')
ax.axvspan(t[isignal[0]], t[isignal[-1]], color='b', alpha=0.2, lw=0, label='Signal Window')
ax.plot([t[isignal[0]], t[isignal[len(isignal) - 1]]],
@@ -895,7 +878,6 @@
ax.set_xlabel('Time [s] since %s' % X[0].stats.starttime)
ax.set_ylabel('Counts')
ax.set_title('Check for Signal Length, Station %s' % X[0].stats.station)
ax.set_xlim(pickparams["pstart"], pickparams["pstop"])
ax.set_yticks([])
if plt_flag == 1:
fig.show()
@@ -903,8 +885,6 @@
input()
except SyntaxError:
pass
except EOFError:
pass
plt.close(fig)

return returnflag
@@ -1145,7 +1125,7 @@ def checkZ4S(X, pick, pickparams, iplot, fig=None, linecolor='k'):
try:
iplot = int(iplot)
except:
if get_bool(iplot):
if get_Bool(iplot):
iplot = 2
else:
iplot = 0
@@ -1224,16 +1204,16 @@
rms = rms_dict[key]
trace = traces_dict[key]
t = np.linspace(diff_dict[key], trace.stats.endtime - trace.stats.starttime + diff_dict[key],
trace.stats.npts)
trace.stats.npts)
if i == 0:
if get_none(fig) is None:
if get_None(fig) is None:
fig = plt.figure() # self.iplot) ### WHY? MP MP
plt_flag = 1
ax1 = fig.add_subplot(3, 1, i + 1)
ax = ax1
ax.set_title('CheckZ4S, Station %s' % zdat[0].stats.station)
else:
if get_none(fig) is None:
if get_None(fig) is None:
fig = plt.figure() # self.iplot) ### WHY? MP MP
plt_flag = 1
ax = fig.add_subplot(3, 1, i + 1, sharex=ax1)
@@ -1335,7 +1315,6 @@ def get_quality_class(uncertainty, weight_classes):
:return: quality of pick (0-4)
:rtype: int
"""
if not uncertainty: return len(weight_classes)
try:
# create generator expression containing all indices of values in weight classes that are >= than uncertainty.
# call next on it once to receive first value
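`get_quality_class` maps the symmetrized pick uncertainty onto discrete quality weights by finding the first error bound that is not exceeded, falling through to the worst class when the uncertainty is missing or larger than every bound. The same logic in isolation (the bound values below are illustrative, not PyLoT defaults):

```python
def quality_from_uncertainty(uncertainty, weight_classes):
    # first class whose error bound is >= uncertainty wins;
    # a missing or oversized uncertainty falls through to the worst class
    if not uncertainty:
        return len(weight_classes)
    for quality, bound in enumerate(weight_classes):
        if uncertainty <= bound:
            return quality
    return len(weight_classes)

bounds = (0.01, 0.02, 0.04, 0.08)  # seconds, illustrative values only
```

With these bounds an uncertainty of 0.03 s falls into class 2, and anything above 0.08 s (or `None`) gets the worst class, 4.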
@@ -1346,6 +1325,20 @@
quality = len(weight_classes)
return quality

def set_NaNs_to(data, nan_value):
"""
Replace all NaNs in data with nan_value
:param data: array holding data
:type data: `~numpy.ndarray`
:param nan_value: value which all NaNs are set to
:type nan_value: float, int
:return: data array with all NaNs replaced with nan_value
:rtype: `~numpy.ndarray`
"""
nn = np.isnan(data)
if np.any(nn):
data[nn] = nan_value
return data

def taper_cf(cf):
"""
@@ -1358,7 +1351,6 @@ def taper_cf(cf):
tap = np.hanning(len(cf))
return tap * cf


def cf_positive(cf):
"""
Shifts cf so that all values are positive
@@ -1369,7 +1361,6 @@ def cf_positive(cf):
"""
return cf + max(abs(cf))


def smooth_cf(cf, t_smooth, delta):
"""
Smooth cf by taking samples over t_smooth length
@@ -1398,7 +1389,6 @@
cf_smooth -= offset  # remove offset from smoothed function
return cf_smooth


def check_counts_ms(data):
"""
check if data is in counts or m/s
@@ -1458,9 +1448,9 @@ def calcSlope(Data, datasmooth, Tcf, Pick, TSNR):
if imax == 0:
print("AICPicker: Maximum for slope determination right at the beginning of the window!")
print("Choose longer slope determination window!")
raise IndexError
raise IndexError
iislope = islope[0][0:imax + 1]  # cut index so it contains only the first maximum
dataslope = Data[0].data[iislope]  # slope will only be calculated to the first maximum
dataslope = Data[0].data[iislope] # slope will only be calculated to the first maximum
# calculate slope as polynomial fit of order 1
xslope = np.arange(0, len(dataslope))
P = np.polyfit(xslope, dataslope, 1)
@@ -1481,10 +1471,8 @@ def get_pickparams(pickparam):
:rtype: (dict, dict, dict, dict)
"""
# Define names of all parameters in different groups
p_parameter_names = 'algoP pstart pstop use_taup taup_model tlta tsnrz hosorder bpz1 bpz2 pickwinP aictsmooth tsmoothP ausP nfacP tpred1z tdet1z Parorder addnoise Precalcwin minAICPslope minAICPSNR timeerrorsP checkwindowP minfactorP'.split(
' ')
s_parameter_names = 'algoS sstart sstop bph1 bph2 tsnrh pickwinS tpred1h tdet1h tpred2h tdet2h Sarorder aictsmoothS tsmoothS ausS minAICSslope minAICSSNR Srecalcwin nfacS timeerrorsS zfac checkwindowS minfactorS'.split(
' ')
p_parameter_names = 'algoP pstart pstop use_taup taup_model tlta tsnrz hosorder bpz1 bpz2 pickwinP aictsmooth tsmoothP ausP nfacP tpred1z tdet1z Parorder addnoise Precalcwin minAICPslope minAICPSNR timeerrorsP checkwindowP minfactorP'.split(' ')
s_parameter_names = 'algoS sstart sstop bph1 bph2 tsnrh pickwinS tpred1h tdet1h tpred2h tdet2h Sarorder aictsmoothS tsmoothS ausS minAICSslope minAICSSNR Srecalcwin nfacS timeerrorsS zfac checkwindowS minfactorS'.split(' ')
first_motion_names = 'minFMSNR fmpickwin minfmweight'.split(' ')
signal_length_names = 'minsiglength minpercent noisefactor'.split(' ')
# Get list of values from pickparam by name
@@ -1498,16 +1486,15 @@
first_motion_params = dict(zip(first_motion_names, fm_parameter_values))
signal_length_params = dict(zip(signal_length_names, sl_parameter_values))

p_params['use_taup'] = get_bool(p_params['use_taup'])
p_params['use_taup'] = get_Bool(p_params['use_taup'])

return p_params, s_params, first_motion_params, signal_length_params


def getQualityFromUncertainty(uncertainty, Errors):
# set initial quality to 4 (worst) and change only if one condition is hit
quality = 4

if get_none(uncertainty) is None:
if get_None(uncertainty) is None:
return quality

if uncertainty <= Errors[0]:
@ -1526,22 +1513,7 @@ def getQualityFromUncertainty(uncertainty, Errors):
|
||||
|
||||
return quality
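The hunk above only shows the first threshold comparison of `getQualityFromUncertainty`; the full logic assigns the worst class (4) and improves it only when the uncertainty undercuts one of the error bounds. A minimal sketch of that pattern (function name and threshold values here are illustrative, not PyLoT's exact implementation):

```python
def quality_from_uncertainty(uncertainty, errors):
    # Start at the worst quality class (4) and return a better (lower)
    # class as soon as the uncertainty falls below an error bound.
    quality = 4
    if uncertainty is None:
        return quality
    for q, bound in enumerate(errors):
        if uncertainty <= bound:
            return q
    return quality

# smaller pick uncertainty -> better (lower) quality class
print(quality_from_uncertainty(0.01, [0.02, 0.04, 0.08, 0.16]))
print(quality_from_uncertainty(0.05, [0.02, 0.04, 0.08, 0.16]))
```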

if __name__ == '__main__':
import doctest

doctest.testmod()


class PickingFailedException(Exception):
"""
Raised when picking fails due to missing values etc.
"""
pass


class MissingTraceException(ValueError):
"""
Used to indicate missing traces in an obspy.core.stream.Stream object
"""
pass

File diff suppressed because it is too large
@ -2,13 +2,12 @@

# -*- coding: utf-8 -*-

try:
# noinspection PyUnresolvedReferences
from urllib2 import urlopen
except:
from urllib.request import urlopen


def checkurl(url='https://git.geophysik.ruhr-uni-bochum.de/marcel/pylot/'):
def checkurl(url='https://ariadne.geophysik.ruhr-uni-bochum.de/trac/PyLoT/'):
"""
check if URL is available
:param url: url
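The try/except import at the top of this hunk is the usual Python 2/3 compatibility shim for `urlopen`. A self-contained sketch (narrowing the bare `except:` to `except ImportError:`, which is the safer form):

```python
# Py2/Py3 compatible import: try the Python 2 module name first and
# fall back to the Python 3 location of urlopen.
try:
    from urllib2 import urlopen  # Python 2
except ImportError:
    from urllib.request import urlopen  # Python 3

print(callable(urlopen))
```

On Python 3 the first import always fails, so the fallback is what actually runs today.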
@ -2,11 +2,9 @@

# -*- coding: utf-8 -*-

import glob
import logging
import numpy as np
import os
import sys

import numpy as np
from obspy import UTCDateTime, read_inventory, read
from obspy.io.xseed import Parser

@ -27,10 +25,6 @@ class Metadata(object):
# saves which metadata files are from obspy dmt
self.obspy_dmt_invs = []
if inventory:
# make sure that no accidental backslashes mess up the path
if isinstance(inventory, str):
inventory = inventory.replace('\\', '/')
inventory = os.path.abspath(inventory)
if os.path.isdir(inventory):
self.add_inventory(inventory)
if os.path.isfile(inventory):
@ -52,15 +46,13 @@ class Metadata(object):
def __repr__(self):
return self.__str__()

def add_inventory(self, path_to_inventory, obspy_dmt_inv=False):
def add_inventory(self, path_to_inventory, obspy_dmt_inv = False):
"""
Add path to list of inventories.
:param path_to_inventory: Path to a folder
:type path_to_inventory: str
:return: None
"""
path_to_inventory = path_to_inventory.replace('\\', '/')
path_to_inventory = os.path.abspath(path_to_inventory)
assert (os.path.isdir(path_to_inventory)), '{} is no directory'.format(path_to_inventory)
if path_to_inventory not in self.inventories:
self.inventories.append(path_to_inventory)
@ -91,10 +83,10 @@ class Metadata(object):
print('Path {} not in inventories list.'.format(path_to_inventory))
return
self.inventories.remove(path_to_inventory)
for filename in list(self.inventory_files.keys()):
for filename in self.inventory_files.keys():
if filename.startswith(path_to_inventory):
del (self.inventory_files[filename])
for seed_id in list(self.seed_ids.keys()):
for seed_id in self.seed_ids.keys():
if self.seed_ids[seed_id].startswith(path_to_inventory):
del (self.seed_ids[seed_id])
# have to clean self.stations_dict as well
@ -196,11 +188,7 @@ class Metadata(object):
metadata = self.get_metadata(seed_id, time)
if not metadata:
return
try:
return metadata['data'].get_coordinates(seed_id, time)
# no specific exception defined in obspy inventory
except Exception as e:
logging.warning(f'Could not get metadata for {seed_id}')
return metadata['data'].get_coordinates(seed_id, time)

def get_all_coordinates(self):
def stat_info_from_parser(parser):
@ -220,10 +208,9 @@ class Metadata(object):
network_name = network.code
if not station_name in self.stations_dict.keys():
st_id = '{}.{}'.format(network_name, station_name)
self.stations_dict[st_id] = {'latitude': station.latitude,
'longitude': station.longitude,
'elevation': station.elevation}

self.stations_dict[st_id] = {'latitude': station[0].latitude,
'longitude': station[0].longitude,
'elevation': station[0].elevation}
read_stat = {'xml': stat_info_from_inventory,
'dless': stat_info_from_parser}

@ -268,6 +255,9 @@ class Metadata(object):
if not fnames:
# search for station name in filename
fnames = glob.glob(os.path.join(path_to_inventory, '*' + station + '*'))
if not fnames:
# search for network name in filename
fnames = glob.glob(os.path.join(path_to_inventory, '*' + network + '*'))
if not fnames:
if self.verbosity:
print('Could not find filenames matching station name, network name or seed id')
@ -287,7 +277,6 @@ class Metadata(object):
self.seed_ids[station_seed_id] = fname
return True
except Exception as e:
logging.warning(e)
continue
print('Could not find metadata for station_seed_id {} in path {}'.format(station_seed_id, path_to_inventory))

@ -362,25 +351,25 @@ def check_time(datetime):
:type datetime: list
:return: returns True if Values are in supposed range, returns False otherwise

>>> check_time([1999, 1, 1, 23, 59, 59, 999000])
>>> check_time([1999, 01, 01, 23, 59, 59, 999000])
True
>>> check_time([1999, 1, 1, 23, 59, 60, 999000])
>>> check_time([1999, 01, 01, 23, 59, 60, 999000])
False
>>> check_time([1999, 1, 1, 23, 59, 59, 1000000])
>>> check_time([1999, 01, 01, 23, 59, 59, 1000000])
False
>>> check_time([1999, 1, 1, 23, 60, 59, 999000])
>>> check_time([1999, 01, 01, 23, 60, 59, 999000])
False
>>> check_time([1999, 1, 1, 23, 60, 59, 999000])
>>> check_time([1999, 01, 01, 23, 60, 59, 999000])
False
>>> check_time([1999, 1, 1, 24, 59, 59, 999000])
>>> check_time([1999, 01, 01, 24, 59, 59, 999000])
False
>>> check_time([1999, 1, 31, 23, 59, 59, 999000])
>>> check_time([1999, 01, 31, 23, 59, 59, 999000])
True
>>> check_time([1999, 2, 30, 23, 59, 59, 999000])
>>> check_time([1999, 02, 30, 23, 59, 59, 999000])
False
>>> check_time([1999, 2, 29, 23, 59, 59, 999000])
>>> check_time([1999, 02, 29, 23, 59, 59, 999000])
False
>>> check_time([2000, 2, 29, 23, 59, 59, 999000])
>>> check_time([2000, 02, 29, 23, 59, 59, 999000])
True
>>> check_time([2000, 13, 29, 23, 59, 59, 999000])
False
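The doctests above (note the diff only drops the Python-2-era leading zeros, which are a syntax error in Python 3) can be reproduced with a small validator; this is a sketch, since the hunk does not show PyLoT's actual implementation:

```python
import datetime

def check_time(values):
    # values: [year, month, day, hour, minute, second, microsecond].
    # datetime.datetime enforces calendar rules (month/day ranges, leap
    # years) and the hour/minute/second/microsecond bounds for us.
    year, month, day, hour, minute, second, microsecond = values
    try:
        datetime.datetime(year, month, day, hour, minute, second, microsecond)
        return True
    except ValueError:
        return False

print(check_time([1999, 1, 1, 23, 59, 59, 999000]))   # True
print(check_time([1999, 2, 29, 23, 59, 59, 999000]))  # False: 1999 is no leap year
```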
@ -392,7 +381,6 @@ def check_time(datetime):
return False


# TODO: change root to datapath
def get_file_list(root_dir):
"""
Function uses a directory to get all the *.gse files from it.
@ -456,7 +444,7 @@ def evt_head_check(root_dir, out_dir=None):
"""
if not out_dir:
print('WARNING files are going to be overwritten!')
inp = str(input('Continue? [y/N]'))
inp = str(raw_input('Continue? [y/N]'))
if not inp == 'y':
sys.exit()
filelist = get_file_list(root_dir)
@ -652,8 +640,6 @@ def restitute_data(data, metadata, unit='VEL', force=False, ncores=0):
"""

# data = remove_underscores(data)
if not data:
return

# loop over traces
input_tuples = []
@ -661,14 +647,9 @@ def restitute_data(data, metadata, unit='VEL', force=False, ncores=0):
input_tuples.append((tr, metadata, unit, force))
data.remove(tr)

if ncores == 0:
result = []
for input_tuple in input_tuples:
result.append(restitute_trace(input_tuple))
else:
pool = gen_Pool(ncores)
result = pool.imap_unordered(restitute_trace, input_tuples)
pool.close()
pool = gen_Pool(ncores)
result = pool.imap_unordered(restitute_trace, input_tuples)
pool.close()

for tr, remove_trace in result:
if not remove_trace:
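The `restitute_data` hunk adds a serial fallback for `ncores == 0` alongside the pooled path. The dispatch pattern can be sketched like this (names are illustrative, not PyLoT's API):

```python
import multiprocessing

def process_all(worker, input_tuples, ncores=0):
    # ncores == 0: run serially - easier to debug, no argument pickling.
    if ncores == 0:
        return [worker(t) for t in input_tuples]
    # otherwise hand the same tuples to a process pool
    with multiprocessing.Pool(ncores) as pool:
        return list(pool.imap_unordered(worker, input_tuples))

def square(t):
    # top-level function so the pooled path could pickle it, too
    return t[0] * t[0]

print(process_all(square, [(1,), (2,), (3,)], ncores=0))  # [1, 4, 9]
```

Note that `imap_unordered` returns results in completion order, so code consuming the pooled results must not rely on input order.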
@ -26,7 +26,9 @@ elif system_name == "Windows":

# suffix for phase name if not phase identified by last letter (P, p, etc.)
ALTSUFFIX = ['diff', 'n', 'g', '1', '2', '3']

FILTERDEFAULTS = readDefaultFilterInformation()
FILTERDEFAULTS = readDefaultFilterInformation(os.path.join(os.path.expanduser('~'),
'.pylot',
'pylot.in'))

TIMEERROR_DEFAULTS = os.path.join(os.path.expanduser('~'),
'.pylot',

@ -2,7 +2,6 @@

# -*- coding: utf-8 -*-

import os

from obspy import UTCDateTime
from obspy.core.event import Event as ObsPyEvent
from obspy.core.event import Origin, ResourceIdentifier
@ -26,7 +25,9 @@ class Event(ObsPyEvent):
# initialize super class
super(Event, self).__init__(resource_id=ResourceIdentifier('smi:local/' + self.pylot_id))
self.path = path
self.datapath = os.path.split(path)[0]  # path.split('/')[-3]
self.database = path.split('/')[-2]
self.datapath = path.split('/')[-3]
self.rootpath = '/' + os.path.join(*path.split('/')[:-3])
self.pylot_autopicks = {}
self.pylot_picks = {}
self.notes = ''
@ -72,7 +73,7 @@ class Event(ObsPyEvent):
text = lines[0]
self.addNotes(text)
try:
datetime = UTCDateTime(self.path.split('/')[-1])
datetime = UTCDateTime(path.split('/')[-1])
origin = Origin(resource_id=self.resource_id, time=datetime, latitude=0, longitude=0, depth=0)
self.origins.append(origin)
except:
@ -295,13 +296,13 @@ class Event(ObsPyEvent):
:rtype: None
"""
try:
import pickle
import cPickle
except ImportError:
import _pickle as pickle
import _pickle as cPickle

try:
outfile = open(filename, 'wb')
pickle.dump(self, outfile, -1)
cPickle.dump(self, outfile, -1)
self.dirty = False
except Exception as e:
print('Could not pickle PyLoT event. Reason: {}'.format(e))
@ -316,11 +317,11 @@ class Event(ObsPyEvent):
:rtype: Event
"""
try:
import pickle
import cPickle
except ImportError:
import _pickle as pickle
import _pickle as cPickle
infile = open(filename, 'rb')
event = pickle.load(infile)
event = cPickle.load(infile)
event.dirty = False
print('Loaded %s' % filename)
return event
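The `Event.save()`/`Event.load()` pair above is a plain pickle round-trip with the highest protocol (`-1`). A minimal sketch of the same pattern with an in-memory buffer instead of a file (the `Note` class is a stand-in, not part of PyLoT):

```python
import io
import pickle

class Note:
    def __init__(self, text):
        self.text = text

buf = io.BytesIO()
pickle.dump(Note('hello'), buf, -1)   # like Event.save(): protocol -1 = highest
buf.seek(0)
restored = pickle.load(buf)           # like Event.load()
print(restored.text)  # hello
```

The diff's rename from `cPickle` to `pickle` reflects Python 3, where the C implementation is used automatically and `cPickle` no longer exists.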
@ -3,37 +3,24 @@

# small script that creates array maps for each event within a previously generated PyLoT project

import os

num_thread = "16"
os.environ["OMP_NUM_THREADS"] = num_thread
os.environ["OPENBLAS_NUM_THREADS"] = num_thread
os.environ["MKL_NUM_THREADS"] = num_thread
os.environ["VECLIB_MAXIMUM_THREADS"] = num_thread
os.environ["NUMEXPR_NUM_THREADS"] = num_thread
os.environ["NUMEXPR_MAX_THREADS"] = num_thread

import multiprocessing
import sys
import glob
import matplotlib

matplotlib.use('Qt5Agg')
sys.path.append(os.path.join('/'.join(sys.argv[0].split('/')[:-1]), '../../..'))

from PyLoT import Project
from pylot.core.util.dataprocessing import Metadata
from pylot.core.util.array_map import Array_map

import matplotlib.pyplot as plt
import argparse


def main(project_file_path, manual=False, auto=True, file_format='png', f_ext='', ncores=None):
project = Project.load(project_file_path)
nEvents = len(project.eventlist)
input_list = []
print('\n')

for index, event in enumerate(project.eventlist):
# MP MP TESTING +++
#if not eventdir.endswith('20170908_044946.a'):
# continue
# MP MP ----
kwargs = dict(project=project, event=event, nEvents=nEvents, index=index, manual=manual, auto=auto,
file_format=file_format, f_ext=f_ext)
input_list.append(kwargs)
@ -42,21 +29,15 @@ def main(project_file_path, manual=False, auto=True, file_format='png', f_ext=''
for item in input_list:
array_map_worker(item)
else:
pool = multiprocessing.Pool(ncores, maxtasksperchild=1000)
pool.map(array_map_worker, input_list)
pool = multiprocessing.Pool(ncores)
result = pool.map(array_map_worker, input_list)
pool.close()
pool.join()


def array_map_worker(input_dict):
event = input_dict['event']
eventdir = event.path
print('Working on event: {} ({}/{})'.format(eventdir, input_dict['index'] + 1, input_dict['nEvents']))
xml_picks = glob.glob(os.path.join(eventdir, f'*{input_dict["f_ext"]}.xml'))
if not len(xml_picks):
print('Event {} does not have any picks associated with event file extension {}'.format(eventdir,
input_dict['f_ext']))
return
# check for picks
manualpicks = event.getPicks()
autopicks = event.getAutopicks()
@ -70,12 +51,11 @@ def array_map_worker(input_dict):
continue
if not metadata:
metadata = Metadata(inventory=metadata_path, verbosity=0)

# create figure to plot on
fig, ax = plt.subplots(figsize=(15, 9))
fig = plt.figure(figsize=(16,9))
# create array map object
map = Array_map(None, metadata, parameter=input_dict['project'].parameter, axes=ax,
width=2.13e6, height=1.2e6, pointsize=25., linewidth=1.0)
map = Array_map(None, metadata, parameter=input_dict['project'].parameter, figure=fig,
width=2.13e6, height=1.2e6, pointsize=15., linewidth=1.0)
# set combobox to auto/manual to plot correct pick type
map.comboBox_am.setCurrentIndex(map.comboBox_am.findText(pick_type))
# add picks to map and save file
@ -85,13 +65,11 @@ def array_map_worker(input_dict):
fig.savefig(fpath_out, dpi=300.)
print('Wrote file: {}'.format(fpath_out))


if __name__ == '__main__':
cl = argparse.ArgumentParser()
cl.add_argument('--dataroot', help='Directory containing the PyLoT .plp file', type=str)
cl.add_argument('--infiles', help='.plp files to use', nargs='+')
cl.add_argument('--ncores', help='Specify number of parallel processes', type=int, default=1)
args = cl.parse_args()
dataroot = '/home/marcel'
infiles=['alparray_all_events_0.03-0.1_mantle_correlated_v3.plp']

for infile in args.infiles:
main(os.path.join(args.dataroot, infile), f_ext='_correlated_0.03-0.1', ncores=args.ncores)
for infile in infiles:
main(os.path.join(dataroot, infile), f_ext='_correlated_0.1Hz', ncores=10)
#main('E:\Shared\AlpArray\\test_aa.plp', f_ext='_correlated_0.5Hz', ncores=1)
#main('/home/marcel/alparray_m6.5-6.9_mantle_correlated_v3.plp', f_ext='_correlated_0.5Hz')
@ -1,7 +1,6 @@

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
from functools import lru_cache

try:
import pyqtgraph as pg
@ -12,6 +11,7 @@ try:
except Exception as e:
print('Warning: Could not import module QtCore.')


from pylot.core.util.utils import pick_color


@ -26,14 +26,14 @@ def pick_linestyle_pg(picktype, key):
:return: Qt line style parameters
:rtype:
"""
linestyles_manu = {'mpp': (QtCore.Qt.SolidLine, 2),
'epp': (QtCore.Qt.DashLine, 1),
'lpp': (QtCore.Qt.DashLine, 1),
'spe': (QtCore.Qt.DashLine, 1)}
linestyles_auto = {'mpp': (QtCore.Qt.DotLine, 2),
'epp': (QtCore.Qt.DashDotLine, 1),
'lpp': (QtCore.Qt.DashDotLine, 1),
'spe': (QtCore.Qt.DashDotLine, 1)}
linestyles_manu = {'mpp': (QtCore.Qt.SolidLine, 2.),
'epp': (QtCore.Qt.DashLine, 1.),
'lpp': (QtCore.Qt.DashLine, 1.),
'spe': (QtCore.Qt.DashLine, 1.)}
linestyles_auto = {'mpp': (QtCore.Qt.DotLine, 2.),
'epp': (QtCore.Qt.DashDotLine, 1.),
'lpp': (QtCore.Qt.DashDotLine, 1.),
'spe': (QtCore.Qt.DashDotLine, 1.)}
linestyles = {'manual': linestyles_manu,
'auto': linestyles_auto}
return linestyles[picktype][key]
@ -49,7 +49,7 @@ def which(program, parameter):
:rtype: str
"""
try:
from PySide2.QtCore import QSettings
from PySide.QtCore import QSettings
settings = QSettings()
for key in settings.allKeys():
if 'binPath' in key:
@ -57,7 +57,7 @@ def which(program, parameter):
nllocpath = ":" + parameter.get('nllocbin')
os.environ['PATH'] += nllocpath
except Exception as e:
print(e)
print(e.message)

def is_exe(fpath):
return os.path.exists(fpath) and os.access(fpath, os.X_OK)
@ -81,7 +81,6 @@ def which(program, parameter):
return None


@lru_cache(maxsize=128)
def make_pen(picktype, phase, key, quality):
"""
Make PyQtGraph.QPen
@ -101,4 +100,4 @@ def make_pen(picktype, phase, key, quality):
rgba = pick_color(picktype, phase, quality)
linestyle, width = pick_linestyle_pg(picktype, key)
pen = pg.mkPen(rgba, width=width, style=linestyle)
return pen
return pen
@ -2,7 +2,6 @@

# -*- coding: utf-8 -*-

import os

from obspy import UTCDateTime


@ -35,25 +34,17 @@ def qml_from_obspyDMT(path):

if not os.path.exists(path):
return IOError('Could not find Event at {}'.format(path))

with open(path, 'rb') as infile:
event_dmt = pickle.load(infile)  # , fix_imports=True)

infile = open(path, 'rb')
event_dmt = pickle.load(infile)#, fix_imports=True)
event_dmt['origin_id'].id = str(event_dmt['origin_id'].id)

ev = Event(resource_id=event_dmt['event_id'])
# small bugfix "unhashable type: 'newstr' "
#small bugfix "unhashable type: 'newstr' "
event_dmt['origin_id'].id = str(event_dmt['origin_id'].id)

origin = Origin(resource_id=event_dmt['origin_id'],
time=event_dmt['datetime'],
longitude=event_dmt['longitude'],
latitude=event_dmt['latitude'],
depth=event_dmt['depth'])
mag = Magnitude(mag=event_dmt['magnitude'],
magnitude_type=event_dmt['magnitude_type'],
origin = Origin(resource_id=event_dmt['origin_id'], time=event_dmt['datetime'], longitude=event_dmt['longitude'],
latitude=event_dmt['latitude'], depth=event_dmt['depth'])
mag = Magnitude(mag=event_dmt['magnitude'], magnitude_type=event_dmt['magnitude_type'],
origin_id=event_dmt['origin_id'])

ev.magnitudes.append(mag)
ev.origins.append(origin)
return ev
@ -1,9 +1,8 @@

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import warnings

import numpy as np
import warnings
from obspy import UTCDateTime

from pylot.core.util.utils import fit_curve, clims
@ -1,9 +1,6 @@

# -*- coding: utf-8 -*-
import sys, os, traceback
import multiprocessing
import os
import sys
import traceback

from PySide2.QtCore import QThread, Signal, Qt, Slot, QRunnable, QObject
from PySide2.QtWidgets import QDialog, QProgressBar, QLabel, QHBoxLayout, QPushButton

@ -22,11 +19,9 @@ class Thread(QThread):
self.abortButton = abortButton
self.finished.connect(self.hideProgressbar)
self.showProgressbar()
self.old_stdout = None

def run(self):
if self.redirect_stdout:
self.old_stdout = sys.stdout
sys.stdout = self
try:
if self.arg is not None:
@ -41,8 +36,7 @@ class Thread(QThread):
exctype, value = sys.exc_info()[:2]
self._executedErrorInfo = '{} {} {}'. \
format(exctype, value, traceback.format_exc())
if self.redirect_stdout:
sys.stdout = self.old_stdout
sys.stdout = sys.__stdout__

def showProgressbar(self):
if self.progressText:
@ -99,25 +93,23 @@ class Worker(QRunnable):
self.progressText = progressText
self.pb_widget = pb_widget
self.redirect_stdout = redirect_stdout
self.old_stdout = None

@Slot()
def run(self):
if self.redirect_stdout:
self.old_stdout = sys.stdout
sys.stdout = self

try:
result = self.fun(self.args)
except:
exctype, value = sys.exc_info()[:2]
exctype, value = sys.exc_info ()[:2]
print(exctype, value, traceback.format_exc())
self.signals.error.emit((exctype, value, traceback.format_exc()))
self.signals.error.emit ((exctype, value, traceback.format_exc ()))
else:
self.signals.result.emit(result)
finally:
self.signals.finished.emit('Done')
sys.stdout = self.old_stdout
sys.stdout = sys.__stdout__

def write(self, text):
self.signals.message.emit(text)
@ -149,18 +141,16 @@ class MultiThread(QThread):
self.progressText = progressText
self.pb_widget = pb_widget
self.redirect_stdout = redirect_stdout
self.old_stdout = None
self.finished.connect(self.hideProgressbar)
self.showProgressbar()

def run(self):
if self.redirect_stdout:
self.old_stdout = sys.stdout
sys.stdout = self
try:
if not self.ncores:
self.ncores = multiprocessing.cpu_count()
pool = multiprocessing.Pool(self.ncores, maxtasksperchild=1000)
pool = multiprocessing.Pool(self.ncores)
self.data = pool.map_async(self.func, self.args, callback=self.emitDone)
# self.data = pool.apply_async(self.func, self.shotlist, callback=self.emitDone) #emit each time returned
pool.close()
@ -171,7 +161,7 @@ class MultiThread(QThread):
exc_type, exc_obj, exc_tb = sys.exc_info()
fname = os.path.split(exc_tb.tb_frame.f_code.co_filename)[1]
print('Exception: {}, file: {}, line: {}'.format(exc_type, fname, exc_tb.tb_lineno))
sys.stdout = self.old_stdout
sys.stdout = sys.__stdout__

def showProgressbar(self):
if self.progressText:
@ -1,17 +1,13 @@

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import glob

import hashlib
import logging
import numpy as np
import os
import platform
import re
import subprocess
import warnings
from typing import Literal, Tuple, Type
from functools import lru_cache

import numpy as np
from obspy import UTCDateTime, read
from obspy.core import AttribDict
from obspy.signal.rotate import rotate2zne
@ -21,10 +17,6 @@ from pylot.core.io.inputs import PylotParameter, FilterOptions
from pylot.core.util.obspyDMT_interface import check_obspydmt_eventfolder
from pylot.styles import style_settings

Rgba: Type[tuple] = Tuple[int, int, int, int]
Mplrgba: Type[tuple] = Tuple[float, float, float, float]
Mplrgbastr: Type[tuple] = Tuple[str, str, str, str]


def _pickle_method(m):
if m.im_self is None:
@ -44,13 +36,15 @@ def getAutoFilteroptions(phase, parameter):
return filteroptions


def readDefaultFilterInformation():
def readDefaultFilterInformation(fname):
"""
Read default filter information from pylot.in file
:param fname: path to pylot.in file
:type fname: str
:return: dictionary containing the default filter information
:rtype: dict
"""
pparam = PylotParameter()
pparam = PylotParameter(fname)
return readFilterInformation(pparam)


@ -87,6 +81,25 @@ def fit_curve(x, y):
return splev, splrep(x, y)


def getindexbounds(f, eta):
"""
Get indices of values closest below and above maximum value in an array
:param f: array
:type f: `~numpy.ndarray`
:param eta: look for value in array that is closest to max_value * eta
:type eta: float
:return: tuple containing index of max value, index of value closest below max value,
index of value closest above max value
:rtype: (int, int, int)
"""
mi = f.argmax()  # get indices of max values
m = max(f)  # get maximum value
b = m * eta  #
l = find_nearest(f[:mi], b)  # find closest value below max value
u = find_nearest(f[mi:], b) + mi  # find closest value above max value
return mi, l, u
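`getindexbounds` depends on PyLoT's `find_nearest`, which is not shown in this diff. A self-contained sketch, assuming `find_nearest(arr, value)` returns the index of the array element closest to `value`:

```python
import numpy as np

def find_nearest(arr, value):
    # assumed helper: index of the element closest to value
    return int(np.abs(np.asarray(arr) - value).argmin())

def getindexbounds(f, eta):
    mi = f.argmax()                     # index of the maximum
    b = f.max() * eta                   # threshold relative to the maximum
    l = find_nearest(f[:mi], b)         # closest to b left of the maximum
    u = find_nearest(f[mi:], b) + mi    # closest to b right of the maximum
    return mi, l, u

f = np.array([0.0, 0.5, 1.0, 2.0, 1.0, 0.5, 0.0])
print(getindexbounds(f, 0.5))  # (3, 2, 4)
```

Note the sketch assumes the maximum is not the first sample; `f[:mi]` would be empty otherwise.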


def gen_Pool(ncores=0):
"""
Generate multiprocessing pool object utilizing ncores amount of cores
@ -106,7 +119,7 @@ def gen_Pool(ncores=0):

print('gen_Pool: Generated multiprocessing Pool with {} cores\n'.format(ncores))

pool = multiprocessing.Pool(ncores, maxtasksperchild=100)
pool = multiprocessing.Pool(ncores)
return pool


@ -152,11 +165,11 @@ def clims(lim1, lim2):
"""
takes two pairs of limits and returns one pair of common limits
:param lim1: limit 1
:type lim1: List[int]
:type lim1: int
:param lim2: limit 2
:type lim2: List[int]
:type lim2: int
:return: new upper and lower limit common to both given limits
:rtype: List[int]
:rtype: [int, int]

>>> clims([0, 4], [1, 3])
[0, 4]
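The `clims` docstring above fully specifies the behaviour: merge two limit pairs into one pair covering both. A sketch consistent with the doctest:

```python
def clims(lim1, lim2):
    # common limits: the smaller lower bound and the larger upper bound
    return [min(lim1[0], lim2[0]), max(lim1[1], lim2[1])]

print(clims([0, 4], [1, 3]))  # [0, 4]
print(clims([1, 3], [0, 4]))  # [0, 4]
```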
@ -214,7 +227,7 @@ def findComboBoxIndex(combo_box, val):
:type val: basestring
:return: index value of item with name val or 0
"""
return combo_box.findText(val) if combo_box.findText(val) != -1 else 0
return combo_box.findText(val) if combo_box.findText(val) is not -1 else 0
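The change from `is not -1` to `!= -1` in this hunk is a genuine bug fix, not a style tweak: `is` compares object identity, and it only ever appeared to work because CPython caches small integers. A short illustration:

```python
# `!=` tests value equality - the defined, portable comparison.
# `is` tests object identity; whether two -1 objects are the same
# object depends on interpreter int caching, and newer CPython even
# emits a SyntaxWarning for literal `is not -1`.
index = int('-1')      # a value of -1, e.g. returned by findText()
print(index != -1)     # False: value comparison is always correct here
```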


def find_in_list(list, str):
@ -288,7 +301,7 @@ def fnConstructor(s):
if type(s) is str:
s = s.split(':')[-1]
else:
s = get_hash(UTCDateTime())
s = getHash(UTCDateTime())

badchars = re.compile(r'[^A-Za-z0-9_. ]+|^\.|\.$|^ | $|^$')
badsuffix = re.compile(r'(aux|com[1-9]|con|lpt[1-9]|prn)(\.|$)')
@ -300,75 +313,32 @@ def fnConstructor(s):
return fn


def get_none(value):
def get_None(value):
"""
Convert "None" to None
:param value:
:type value: str, NoneType
:type value: str, bool
:return:
:rtype: type(value) or NoneType

>>> st = read()
>>> print(get_none(st))
3 Trace(s) in Stream:
BW.RJOB..EHZ | 2009-08-24T00:20:03.000000Z - 2009-08-24T00:20:32.990000Z | 100.0 Hz, 3000 samples
BW.RJOB..EHN | 2009-08-24T00:20:03.000000Z - 2009-08-24T00:20:32.990000Z | 100.0 Hz, 3000 samples
BW.RJOB..EHE | 2009-08-24T00:20:03.000000Z - 2009-08-24T00:20:32.990000Z | 100.0 Hz, 3000 samples
>>> get_none('Stream')
'Stream'
>>> get_none(0)
0
>>> get_none(0.)
0.0
>>> print(get_none('None'))
None
>>> print(get_none(None))
None
:rtype: bool
"""
if value is None or (type(value) is str and value == 'None'):
if value == 'None':
return None
else:
return value
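The develop-side `get_none` above also guards against non-string inputs (the old `value == 'None'` comparison was applied to everything). A self-contained sketch mirroring the doctests:

```python
def get_none(value):
    # Normalize the string 'None' (e.g. read from a parameter file) to
    # the real None; pass every other value through unchanged.
    if value is None or (isinstance(value, str) and value == 'None'):
        return None
    return value

print(get_none('None'))    # None
print(get_none(0))         # 0
print(get_none('Stream'))  # Stream
```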


def get_bool(value):
def get_Bool(value):
"""
Convert string representations of bools to their true boolean value. Return value if it cannot be identified as bool.
Convert string representations of bools to their true boolean value
:param value:
:type value: str, bool, int, float
:type value: str, bool
:return: true boolean value
:rtype: bool

>>> get_bool(True)
True
>>> get_bool(False)
False
>>> get_bool(0)
False
>>> get_bool(0.)
False
>>> get_bool(0.1)
True
>>> get_bool(2)
True
>>> get_bool(-1)
False
>>> get_bool(-0.3)
False
>>> get_bool(None)
None
"""
if type(value) is bool:
return value
elif value in ['True', 'true']:
if value in ['True', 'true']:
return True
elif value in ['False', 'false']:
return False
elif isinstance(value, float) or isinstance(value, int):
if value > 0. or value > 0:
return True
else:
return False
else:
return value
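Following the doctests above, the develop-side `get_bool` can be sketched compactly: real bools pass through, `'True'`/`'False'` strings are parsed, numbers are truthy only when positive, and anything else (such as `None`) is returned unchanged.

```python
def get_bool(value):
    if isinstance(value, bool):
        return value            # real bools pass through untouched
    if value in ('True', 'true'):
        return True
    if value in ('False', 'false'):
        return False
    if isinstance(value, (int, float)):
        return value > 0        # positive numbers count as True
    return value                # unidentifiable input is returned as-is

print(get_bool('true'), get_bool(-1), get_bool(None))  # True False None
```

The `isinstance(value, bool)` check must come first, because `bool` is a subclass of `int` in Python and would otherwise fall into the numeric branch.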

@ -382,8 +352,8 @@ def four_digits(year):
:return: four digit year correspondent
:rtype: int

>>> four_digits(75)
1975
>>> four_digits(20)
1920
>>> four_digits(16)
2016
>>> four_digits(00)
@ -465,53 +435,36 @@ def backtransformFilterString(st):
return st


def get_hash(time):
def getHash(time):
"""
takes a time object and returns the corresponding SHA1 hash of the formatted date string
:param time: time object for which a hash should be calculated
:type time: `~obspy.core.utcdatetime.UTCDateTime`
:return: SHA1 hash
:rtype: str

>>> time = UTCDateTime(0)
>>> get_hash(time)
'7627cce3b1b58dd21b005dac008b34d18317dd15'
>>> get_hash(0)
Traceback (most recent call last):
...
AssertionError: 'time' is not an ObsPy UTCDateTime object
"""
assert isinstance(time, UTCDateTime), '\'time\' is not an ObsPy UTCDateTime object'
hg = hashlib.sha1()
hg.update(time.strftime('%Y-%m-%d %H:%M:%S.%f').encode('utf-8'))
hg.update(time.strftime('%Y-%m-%d %H:%M:%S.%f'))
return hg.hexdigest()
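The `.encode('utf-8')` added in this hunk is the essential Python 3 change: `hashlib` only accepts bytes. A sketch of the same hashing using a plain `datetime` in place of ObsPy's `UTCDateTime` (the epoch formats to the same string, so the digest matches the doctest above):

```python
import datetime
import hashlib

def get_hash(time):
    # SHA1 of the formatted date string; hashlib requires bytes in Py3,
    # hence the explicit encode.
    text = time.strftime('%Y-%m-%d %H:%M:%S.%f')
    return hashlib.sha1(text.encode('utf-8')).hexdigest()

epoch = datetime.datetime(1970, 1, 1)
digest = get_hash(epoch)
print(len(digest))  # 40 hex characters
```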


def get_login():
def getLogin():
"""
returns the actual user's name
:return: login name
returns the actual user's login ID
:return: login ID
:rtype: str
"""
import getpass
return getpass.getuser()


def get_owner(fn):
def getOwner(fn):
"""
takes a filename and return the login ID of the actual owner of the file
:param fn: filename of the file tested
:type fn: str
:return: login ID of the file's owner
:rtype: str

>>> import tempfile
>>> with tempfile.NamedTemporaryFile() as tmpfile:
... tmpfile.write(b'') and True
... tmpfile.flush()
... get_owner(tmpfile.name) == os.path.expanduser('~').split('/')[-1]
0
True
"""
system_name = platform.system()
if system_name in ["Linux", "Darwin"]:
@ -557,11 +510,6 @@ def is_executable(fn):
:param fn: path to the file to be tested
:return: True or False
:rtype: bool

>>> is_executable('/bin/ls')
True
>>> is_executable('/var/log/system.log')
False
"""
return os.path.isfile(fn) and os.access(fn, os.X_OK)

@ -588,36 +536,24 @@ def isSorted(iterable):
>>> isSorted([2,3,1,4])
False
"""
assert is_iterable(iterable), "object is not iterable; object: {}".format(iterable)
assert isIterable(iterable), 'object is not iterable; object: {' \
'}'.format(iterable)
if type(iterable) is str:
iterable = [s for s in iterable]
return sorted(iterable) == iterable


def is_iterable(obj):
def isIterable(obj):
"""
takes a python object and returns True if the object is iterable and
False otherwise
:param obj: a python object
:type obj: obj
:type obj: object
:return: True or False
:rtype: bool

>>> is_iterable(1)
False
>>> is_iterable(True)
False
>>> is_iterable(0.)
False
>>> is_iterable((0,1,3,4))
True
>>> is_iterable([1])
True
>>> is_iterable('a')
True
"""
try:
iter(obj)
iterator = iter(obj)
except TypeError as te:
return False
return True
|
||||
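The `is_iterable`/`isSorted` pair above can be sketched standalone (same logic, no PyLoT imports), which also shows why strings are converted to lists before comparing:

```python
def is_iterable(obj):
    """True if iter() accepts the object, False otherwise."""
    try:
        iter(obj)
    except TypeError:
        return False
    return True


def is_sorted(iterable):
    """True if the iterable is in ascending order; strings compare per character."""
    assert is_iterable(iterable), "object is not iterable; object: {}".format(iterable)
    if isinstance(iterable, str):
        # sorted() always returns a list, so the string must be listified first
        iterable = list(iterable)
    return sorted(iterable) == iterable


print(is_sorted('abc'), is_sorted([2, 3, 1, 4]))
```

Note that `bool` and numbers are rejected by `iter()`, matching the doctests above.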
@ -626,19 +562,13 @@ def is_iterable(obj):
def key_for_set_value(d):
    """
    takes a dictionary and returns the first key for which the value's
    boolean representation is True
    :param d: dictionary containing values
    :type d: dict
    :return: key to the first non-False value found; None if no value's
        boolean equals True
    :rtype: object or NoneType

    >>> key_for_set_value({'one': 0, 'two': 1})
    'two'
    >>> print(key_for_set_value({1: 0, 2: False}))
    None
    """
    assert type(d) is dict, "Function only defined for inputs of type 'dict'."
    r = None
    for k, v in d.items():
        if v:
@ -646,53 +576,32 @@ def key_for_set_value(d):
    return r


def prep_time_axis(offset, trace, verbosity=0):
    """
    takes an offset and a trace object and returns a valid time axis for
    plotting
    :param offset: offset of the actual seismogram on plotting axis
    :type offset: float or int
    :param trace: seismic trace object
    :type trace: `~obspy.core.trace.Trace`
    :param verbosity: if != 0, debug output will be written to console
    :type verbosity: int
    :return: valid numpy array with time stamps for plotting
    :rtype: `~numpy.ndarray`

    >>> tr = read()[0]
    >>> prep_time_axis(0., tr)
    array([0.00000000e+00, 1.00033344e-02, 2.00066689e-02, ...,
           2.99799933e+01, 2.99899967e+01, 3.00000000e+01])
    >>> prep_time_axis(22.5, tr)
    array([22.5       , 22.51000333, 22.52000667, ..., 52.47999333,
           52.48999667, 52.5       ])
    >>> prep_time_axis(tr.stats.starttime, tr)
    Traceback (most recent call last):
    ...
    AssertionError: 'offset' is not of type 'float' or 'int'; type: <class 'obspy.core.utcdatetime.UTCDateTime'>
    >>> tr.stats.npts -= 1
    >>> prep_time_axis(0, tr)
    array([0.00000000e+00, 1.00033356e-02, 2.00066711e-02, ...,
           2.99699933e+01, 2.99799967e+01, 2.99900000e+01])
    >>> tr.stats.npts += 2
    >>> prep_time_axis(0, tr)
    array([0.00000000e+00, 1.00033333e-02, 2.00066667e-02, ...,
           2.99899933e+01, 2.99999967e+01, 3.00100000e+01])
    """
    assert isinstance(offset, (float, int)), "'offset' is not of type 'float' or 'int'; type: {}".format(type(offset))
    nsamp = trace.stats.npts
    srate = trace.stats.sampling_rate
    tincr = trace.stats.delta
    etime = offset + nsamp / srate
    time_ax = np.linspace(offset, etime, nsamp)
    if len(time_ax) < nsamp:
        if verbosity:
            print('elongate time axes by one datum')
        time_ax = np.arange(offset, etime + tincr, tincr)
    elif len(time_ax) > nsamp:
        if verbosity:
            print('shorten time axes by one datum')
        time_ax = np.arange(offset, etime - tincr, tincr)
    if len(time_ax) != nsamp:
        print('Station {0}, {1} samples of data \n '
              '{2} length of time vector \n'
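The invariant `prep_time_axis` enforces — exactly `npts` time stamps starting at `offset` with spacing `delta` — can be checked without NumPy; this is a sketch of the intended result, not of the linspace/arange fix-up logic above:

```python
def time_axis_sketch(offset, npts, delta):
    """Return npts time stamps: offset, offset + delta, ..., offset + (npts - 1) * delta."""
    return [offset + i * delta for i in range(npts)]


# matches the second doctest above: 3001 samples at 100 Hz starting at 22.5 s
axis = time_axis_sketch(22.5, 3001, 0.01)
print(len(axis), axis[0], axis[-1])
```

The linspace/arange branching in the real function only exists to repair the off-by-one that floating-point end times can introduce.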
@ -708,13 +617,13 @@ def find_horizontals(data):
    :param data: waveform data
    :type data: `obspy.core.stream.Stream`
    :return: components list
    :rtype: List(str)

    Example:

    >>> st = read()
    >>> find_horizontals(st)
    ['N', 'E']
    """
    rval = []
    for tr in data:
@ -725,7 +634,7 @@ def find_horizontals(data):
    return rval


def pick_color(picktype: Literal['manual', 'automatic'], phase: Literal['P', 'S'], quality: int = 0) -> Rgba:
    """
    Create pick color by modifying the base color by the quality.

@ -738,7 +647,7 @@ def pick_color(picktype: Literal['manual', 'automatic'], phase: Literal['P', 'S'
    :param quality: quality of pick. Decides the new intensity of the modifier color
    :type quality: int
    :return: tuple containing modified rgba color values
    :rtype: Rgba
    """
    min_quality = 3
    bpc = base_phase_colors(picktype, phase)  # returns dict like {'modifier': 'g', 'rgba': (0, 0, 255, 255)}
@ -794,17 +703,17 @@ def pick_linestyle_plt(picktype, key):
    return linestyles[picktype][key]


def modify_rgba(rgba: Rgba, modifier: Literal['r', 'g', 'b'], intensity: float) -> Rgba:
    """
    Modify rgba color by adding the given intensity to the modifier color
    :param rgba: tuple containing rgba values
    :type rgba: Rgba
    :param modifier: which color should be modified; options: 'r', 'g', 'b'
    :type modifier: Literal['r', 'g', 'b']
    :param intensity: intensity to be added to selected color
    :type intensity: float
    :return: tuple containing rgba values
    :rtype: Rgba
    """
    rgba = list(rgba)
    index = {'r': 0,
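Since the body of `modify_rgba` is truncated by the diff, here is a self-contained sketch of the channel modification; the clamp to [0, 255] is an assumption about the elided code, and the helper name is hypothetical:

```python
from typing import Tuple

Rgba = Tuple[int, int, int, int]


def modify_rgba_sketch(rgba: Rgba, modifier: str, intensity: float) -> Rgba:
    """Add `intensity` to one of the r/g/b channels, clamped to the valid range."""
    index = {'r': 0, 'g': 1, 'b': 2}[modifier]
    values = list(rgba)
    values[index] = int(min(255, max(0, values[index] + intensity)))
    return tuple(values)


# brighten the green channel of a blue base color
print(modify_rgba_sketch((0, 0, 255, 255), 'g', 100.0))
```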
@ -838,20 +747,18 @@ def transform_colors_mpl_str(colors, no_alpha=False):
    Transforms rgba color values to a matplotlib string of color values with a range of [0, 1]
    :param colors: tuple of rgba color values ranging from [0, 255]
    :type colors: (float, float, float, float)
    :param no_alpha: Whether to return an alpha value in the matplotlib color string
    :type no_alpha: bool
    :return: String containing r, g, b values and alpha value if no_alpha is False (default)
    :rtype: str

    >>> transform_colors_mpl_str((255., 255., 255., 255.), True)
    '(1.0, 1.0, 1.0)'
    >>> transform_colors_mpl_str((255., 255., 255., 255.))
    '(1.0, 1.0, 1.0, 1.0)'
    """
    if no_alpha:
        return '({}, {}, {})'.format(*transform_colors_mpl(colors))
    else:
        return '({}, {}, {}, {})'.format(*transform_colors_mpl(colors))


def transform_colors_mpl(colors):
@ -861,16 +768,27 @@ def transform_colors_mpl(colors):
    :type colors: (float, float, float, float)
    :return: tuple of rgba color values ranging from [0, 1]
    :rtype: (float, float, float, float)

    >>> transform_colors_mpl((127.5, 0., 63.75, 255.))
    (0.5, 0.0, 0.25, 1.0)
    >>> transform_colors_mpl(())
    ()
    """
    colors = list(colors)
    colors_mpl = tuple([color / 255. for color in colors])
    return colors_mpl
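The scaling both helpers rely on is plain division by 255; a minimal standalone version (the names `to_mpl`/`to_mpl_str` are hypothetical):

```python
def to_mpl(colors):
    """Scale rgba values from [0, 255] to matplotlib's [0, 1] range."""
    return tuple(c / 255. for c in colors)


def to_mpl_str(colors, no_alpha=False):
    """Format the scaled values as the string form used by the plotting code."""
    scaled = to_mpl(colors)
    if no_alpha:
        return '({}, {}, {})'.format(*scaled[:3])
    return '({}, {}, {}, {})'.format(*scaled)


print(to_mpl_str((255., 255., 255., 255.), no_alpha=True))
```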


def remove_underscores(data):
    """
    takes a `obspy.core.stream.Stream` object and removes all underscores
    from station names
    :param data: stream of seismic data
    :type data: `~obspy.core.stream.Stream`
    :return: data stream
    :rtype: `~obspy.core.stream.Stream`
    """
    # for tr in data:
    #     # remove underscores
    #     tr.stats.station = tr.stats.station.strip('_')
    return data


def trim_station_components(data, trim_start=True, trim_end=True):
    """
    cut a stream so only the part common to all three traces is kept to avoid dealing with offsets
@ -899,6 +817,19 @@ def trim_station_components(data, trim_start=True, trim_end=True):
    return data


def merge_stream(stream):
    gaps = stream.get_gaps()
    if gaps:
        # list of merged stations (seed_ids)
        merged = ['{}.{}.{}.{}'.format(*gap[:4]) for gap in gaps]
        stream.merge(method=1)
        print('Merged the following stations because of gaps:')
        for merged_station in merged:
            print(merged_station)

    return stream, gaps



def check4gapsAndRemove(data):
    """
    check for gaps in Stream and remove them
@ -919,12 +850,12 @@ def check4gapsAndRemove(data):
    return data


def check_for_gaps_and_merge(data):
    """
    check for gaps in Stream and merge if gaps are found
    :param data: stream of seismic data
    :type data: `~obspy.core.stream.Stream`
    :return: data stream, gaps returned from obspy get_gaps
    :rtype: (`~obspy.core.stream.Stream`, list)
    """
    gaps = data.get_gaps()
@ -935,7 +866,7 @@ def check_for_gaps_and_merge(data):
        for merged_station in merged:
            print(merged_station)

    return data, gaps


def check4doubled(data):
@ -965,53 +896,13 @@ def check4doubled(data):
    return data


def check_for_nan(data, nan_value=0.):
    """
    Replace all NaNs in data with nan_value (in place)
    :param data: stream of seismic data
    :type data: `~obspy.core.stream.Stream`
    :param nan_value: value which all NaNs are set to
    :type nan_value: float, int
    :return: None
    """
    if not data:
        return
    for trace in data:
        np.nan_to_num(trace.data, copy=False, nan=nan_value)

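The in-place NaN replacement that `np.nan_to_num` performs per trace array can be sketched on a plain list with stdlib only:

```python
import math


def replace_nan(values, nan_value=0.0):
    """Replace NaNs in a list of floats in place, mirroring np.nan_to_num."""
    for i, v in enumerate(values):
        if math.isnan(v):
            values[i] = nan_value
    return values


samples = [1.0, float('nan'), -3.5]
print(replace_nan(samples))
```

NaN cannot be found with `==` (NaN compares unequal to everything, including itself), hence the explicit `isnan` test.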
def get_pylot_eventfile_with_extension(event, fext):
    if hasattr(event, 'path'):
        eventpath = event.path
    else:
        logging.warning('No attribute path found for event.')
        return
    eventname = event.pylot_id
    filename = os.path.join(eventpath, 'PyLoT_' + eventname + fext)
    if os.path.isfile(filename):
        return filename


def get_possible_pylot_eventfile_extensions(event, fext):
    if hasattr(event, 'path'):
        eventpath = event.path
    else:
        logging.warning('No attribute path found for event.')
        return []
    eventname = event.pylot_id
    filename = os.path.join(eventpath, 'PyLoT_' + eventname + fext)
    filenames = glob.glob(filename)
    extensions = [os.path.split(path)[-1].split('PyLoT_' + eventname)[-1] for path in filenames]
    return extensions

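How the glob filemask yields the extension suffixes can be shown with stdlib only; the event id and file names here are hypothetical:

```python
import glob
import os
import tempfile

with tempfile.TemporaryDirectory() as eventpath:
    eventname = '20170501_141822'  # hypothetical event id
    # create two pick files with different extensions
    for ext in ('_autopylot', '_corrected'):
        open(os.path.join(eventpath, 'PyLoT_' + eventname + ext + '.xml'), 'w').close()
    # same filemask construction as get_possible_pylot_eventfile_extensions
    mask = os.path.join(eventpath, 'PyLoT_' + eventname + '*.xml')
    found = sorted(glob.glob(mask))
    extensions = [os.path.split(p)[-1].split('PyLoT_' + eventname)[-1] for p in found]
    print(extensions)
```

Splitting the basename on the `'PyLoT_' + eventname` prefix leaves exactly the trailing suffix, extension included.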
def get_stations(data):
    """
    Get list of all station names in data stream
    :param data: stream containing seismic traces
    :type data: `~obspy.core.stream.Stream`
    :return: list of all station names in data, no duplicates
    :rtype: List(str)
    """
    stations = []
    for tr in data:
@ -1038,88 +929,63 @@ def check4rotated(data, metadata=None, verbosity=1):
    :rtype: `~obspy.core.stream.Stream`
    """

    def rotation_required(trace_ids):
        """
        Derive if any rotation is required from the orientation code of the input.

        :param trace_ids: string identifiers of waveform data traces
        :type trace_ids: List(str)
        :return: boolean representing if rotation is necessary for any of the traces
        :rtype: bool
        """
        orientations = [trace_id[-1] for trace_id in trace_ids]
        return any(orientation.isnumeric() for orientation in orientations)

    def rotate_components(wfs_in, metadata=None):
        """
        Rotate components if orientation code is numeric (= non traditional orientation).

        Azimuth and dip are fetched from metadata. To be rotated, traces of a station have to be cut to the same length.
        Returns unrotated traces if no metadata is provided.
        :param wfs_in: stream containing seismic traces of a station
        :type wfs_in: `~obspy.core.stream.Stream`
        :param metadata: tuple containing metadata type string and metadata parser object
        :type metadata: (str, `~obspy.io.xseed.parser.Parser`)
        :return: stream object with traditionally oriented traces (ZNE)
        :rtype: `~obspy.core.stream.Stream`
        """

        if len(wfs_in) < 3:
            print(f"Stream {wfs_in=}, has not enough components to rotate.")
            return wfs_in

        # check if any traces in this station need to be rotated
        trace_ids = [trace.id for trace in wfs_in]
        if not rotation_required(trace_ids):
            logging.debug(f"Stream does not need any rotation: Traces are {trace_ids=}")
            return wfs_in

        # check metadata quality
        t_start = full_range(wfs_in)[0]
        try:
            azimuths = []
            dips = []
            for tr_id in trace_ids:
                azimuths.append(metadata.get_coordinates(tr_id, t_start)['azimuth'])
                dips.append(metadata.get_coordinates(tr_id, t_start)['dip'])
        except (KeyError, TypeError) as err:
            logging.warning("Rotating not possible, not all azimuth and dip information "
                            "available in metadata. Stream remains unchanged.")
            logging.debug(f"Rotating not possible, {err=}, {type(err)=}")
            return wfs_in
        except Exception as err:
            print(f"Unexpected {err=}, {type(err)=}")
            raise

        # to rotate all traces must have same length, so trim them
        wfs_out = trim_station_components(wfs_in, trim_start=True, trim_end=True)
        try:
            z, n, e = rotate2zne(wfs_out[0], azimuths[0], dips[0],
                                 wfs_out[1], azimuths[1], dips[1],
                                 wfs_out[2], azimuths[2], dips[2])
            print('check4rotated: rotated traces {} to ZNE'.format(trace_ids))
            # replace old data with rotated data, change the channel code to ZNE
            # get z-trace index; z has minimum dip of -90 (dip is measured from 0 to -90, with -90 being vertical)
            z_index = dips.index(min(dips))
            wfs_out[z_index].data = z
            wfs_out[z_index].stats.channel = wfs_out[z_index].stats.channel[0:-1] + 'Z'
            del trace_ids[z_index]
            for trace_id in trace_ids:
                coordinates = metadata.get_coordinates(trace_id, t_start)
                dip, az = coordinates['dip'], coordinates['azimuth']
                trace = wfs_out.select(id=trace_id)[0]
                if az > 315 or az <= 45 or 135 < az <= 225:
                    trace.data = n
                    trace.stats.channel = trace.stats.channel[0:-1] + 'N'
                elif 45 < az <= 135 or 225 < az <= 315:
                    trace.data = e
                    trace.stats.channel = trace.stats.channel[0:-1] + 'E'
        except ValueError as err:
            print(f"{err=} Rotation failed. Stream remains unchanged.")
            return wfs_in

        return wfs_out

    if metadata is None:
        if verbosity:
@ -1133,6 +999,38 @@ def check4rotated(data, metadata=None, verbosity=1):
    return data


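The azimuth-to-channel decision inside `rotate_components` reduces to two quadrant tests; a standalone sketch (the helper name is hypothetical, the boundaries are taken verbatim from the code above):

```python
def channel_for_azimuth(az: float) -> str:
    """Map a sensor azimuth in degrees to the closer traditional horizontal channel."""
    if az > 315 or az <= 45 or 135 < az <= 225:
        return 'N'
    if 45 < az <= 135 or 225 < az <= 315:
        return 'E'
    raise ValueError(f"azimuth out of range: {az}")


print([channel_for_azimuth(az) for az in (0, 90, 180, 270)])
```

Sensors pointing roughly north or south receive the rotated north component; those pointing roughly east or west receive the east component.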
def scaleWFData(data, factor=None, components='all'):
    """
    produce scaled waveforms from given waveform data and a scaling factor,
    waveform may be selected by their components name
    :param data: waveform data to be scaled
    :type data: `~obspy.core.stream.Stream` object
    :param factor: scaling factor
    :type factor: float
    :param components: components labels for the traces in data to be scaled by
        the scaling factor (optional, default: 'all')
    :type components: tuple
    :return: scaled waveform data
    :rtype: `~obspy.core.stream.Stream` object
    """
    if components != 'all':
        for comp in components:
            if factor is None:
                max_val = np.max(np.abs(data.select(component=comp)[0].data))
                data.select(component=comp)[0].data /= 2 * max_val
            else:
                data.select(component=comp)[0].data /= 2 * factor
    else:
        for tr in data:
            if factor is None:
                max_val = float(np.max(np.abs(tr.data)))
                tr.data /= 2 * max_val
            else:
                tr.data /= 2 * factor

    return data


def runProgram(cmd, parameter=None):
    """
    run an external program specified by cmd with parameters input returning the
@ -1204,7 +1102,6 @@ def identifyPhase(phase):
    return False


@lru_cache
def identifyPhaseID(phase):
    """
    Returns phase id (capital P or S)
@ -1268,7 +1165,7 @@ def correct_iplot(iplot):
    try:
        iplot = int(iplot)
    except ValueError:
        if get_bool(iplot):
            iplot = 2
        else:
            iplot = 0
@ -35,9 +35,9 @@ from __future__ import print_function

__all__ = "get_git_version"

import inspect
# NO IMPORTS FROM PYLOT IN THIS FILE! (file gets used at installation time)
import os
from subprocess import Popen, PIPE

File diff suppressed because it is too large
@ -1,2 +0,0 @@
# -*- coding: utf-8 -*-
#
@ -1,101 +0,0 @@
############################# correlation parameters #####################################
# min_corr_stacking: minimum correlation coefficient for building beam trace
# min_corr_export: minimum correlation coefficient for pick export
# min_stack: minimum number of stations for building beam trace
# t_before: correlation window before pick
# t_after: correlation window after pick
# cc_maxlag: maximum shift for initial correlation
# cc_maxlag2: maximum shift for second (final) correlation (also for calculating pick uncertainty)
# initial_pick_outlier_threshold: threshold for excluding large outliers of initial (AIC) picks
# export_threshold: automatically exclude all onsets which deviate more than this threshold from corrected taup onsets
# min_picks_export: minimum number of correlated picks for export
# min_picks_autopylot: minimum number of reference auto picks to continue with event
# check_RMS: do RMS check to search for restitution errors (very experimental)
# use_taupy_onsets: use taupy onsets as reference picks instead of external picks
# station_list: use the following stations as reference for stacking
# use_stacked_trace: use existing stacked trace if found (spares re-computation)
# data_dir: obspyDMT data subdirectory (e.g. 'raw', 'processed')
# pickfile_extension: use quakeML files (PyLoT output) with the following extension, e.g. '_autopylot' for pickfiles
#                     such as 'PyLoT_20170501_141822_autopylot.xml'
# dt_stacking: time shift for stacking (e.g. [0, 250] for 0 and 250 seconds shift)
# filter_options: filter for first correlation (rough)
# filter_options_final: filter for second correlation (fine)
# filter_type: e.g. 'bandpass'
# sampfreq: sampling frequency of the data

logging: info
pick_phases: ['P', 'S']

# P-phase
P:
  min_corr_stacking: 0.8
  min_corr_export: 0.6
  min_stack: 20
  t_before: 30.
  t_after: 50.
  cc_maxlag: 50.
  cc_maxlag2: 5.
  initial_pick_outlier_threshold: 30.
  export_threshold: 2.5
  min_picks_export: 100
  min_picks_autopylot: 50
  check_RMS: True
  use_taupy_onsets: False
  station_list: ['HU.MORH', 'HU.TIH', 'OX.FUSE', 'OX.BAD']
  use_stacked_trace: False
  data_dir: 'processed'
  pickfile_extension: '_autopylot'
  dt_stacking: [250, 250]

  # filter for first correlation (rough)
  filter_options:
    freqmax: 0.5
    freqmin: 0.03
  # filter for second correlation (fine)
  filter_options_final:
    freqmax: 0.5
    freqmin: 0.03

  filter_type: bandpass
  sampfreq: 20.0

  # ignore if autopylot fails to pick master-trace (not recommended if absolute onset times matter)
  ignore_autopylot_fail_on_master: True

# S-phase
S:
  min_corr_stacking: 0.7
  min_corr_export: 0.6
  min_stack: 20
  t_before: 60.
  t_after: 60.
  cc_maxlag: 100.
  cc_maxlag2: 25.
  initial_pick_outlier_threshold: 30.
  export_threshold: 5.0
  min_picks_export: 200
  min_picks_autopylot: 50
  check_RMS: True
  use_taupy_onsets: False
  station_list: ['HU.MORH', 'HU.TIH', 'OX.FUSE', 'OX.BAD']
  use_stacked_trace: False
  data_dir: 'processed'
  pickfile_extension: '_autopylot'
  dt_stacking: [250, 250]

  # filter for first correlation (rough)
  filter_options:
    freqmax: 0.1
    freqmin: 0.01

  # filter for second correlation (fine)
  filter_options_final:
    freqmax: 0.2
    freqmin: 0.01

  filter_type: bandpass
  sampfreq: 20.0

  # ignore if autopylot fails to pick master-trace (not recommended if absolute onset times matter)
  ignore_autopylot_fail_on_master: True
File diff suppressed because it is too large
@ -1,41 +0,0 @@
#!/bin/bash

#ulimit -s 8192
#ulimit -v $(ulimit -v | awk '{printf("%d",$1*0.95)}')
#ulimit -v
#655360

source /opt/anaconda3/etc/profile.d/conda.sh
conda activate pylot_311
NSLOTS=20

#qsub -l low -cwd -l "os=*stretch" -pe smp 40 submit_pick_corr_correction.sh
#$ -l low
#$ -l h_vmem=6G
#$ -cwd
#$ -pe smp 20
#$ -N corr_pick

export PYTHONPATH="$PYTHONPATH:/home/marcel/git/pylot_tools/"
export PYTHONPATH="$PYTHONPATH:/home/marcel/git/"
export PYTHONPATH="$PYTHONPATH:/home/marcel/git/pylot/"

#export MKL_NUM_THREADS=${NSLOTS:=1}
#export NUMEXPR_NUM_THREADS=${NSLOTS:=1}
#export OMP_NUM_THREADS=${NSLOTS:=1}

#python pick_correlation_correction.py '/data/AlpArray_Data/dmt_database_mantle_M5.8-6.0' '/home/marcel/.pylot/pylot_alparray_mantle_corr_stack_0.03-0.5.in' -pd -n ${NSLOTS:=1} -istart 0 -istop 100
#python pick_correlation_correction.py '/data/AlpArray_Data/dmt_database_mantle_M5.8-6.0' '/home/marcel/.pylot/pylot_alparray_mantle_corr_stack_0.03-0.5.in' -pd -n ${NSLOTS:=1} -istart 100 -istop 200
#python pick_correlation_correction.py '/data/AlpArray_Data/dmt_database_mantle_M6.0-6.5' '/home/marcel/.pylot/pylot_alparray_mantle_corr_stack_0.03-0.5.in' -pd -n ${NSLOTS:=1} -istart 0 -istop 100
#python pick_correlation_correction.py 'H:\sciebo\dmt_database' 'H:\Sciebo\dmt_database\pylot_alparray_mantle_corr_S_0.01-0.2.in' -pd -n 4 -t

#pylot_infile='/home/marcel/.pylot/pylot_alparray_syn_fwi_mk6_it3.in'
pylot_infile='/home/marcel/.pylot/pylot_adriaarray_corr_P_and_S.in'

# THIS SCRIPT SHOULD BE CALLED BY "submit_to_grid_engine.py" using the following line:
# use -pd for detailed plots in eventdir/correlation_XX_XX/figures
python pick_correlation_correction.py $1 $pylot_infile -n ${NSLOTS:=1} -istart $2 --params 'parameters_adriaarray.yaml' # -pd
#--event_blacklist eventlist.txt
@ -1,28 +0,0 @@
#!/usr/bin/env python

import subprocess

fnames = [
    ('/data/AdriaArray_Data/dmt_database_mantle_M5.0-5.4', 0),
    ('/data/AdriaArray_Data/dmt_database_mantle_M5.4-5.7', 0),
    ('/data/AdriaArray_Data/dmt_database_mantle_M5.7-6.0', 0),
    ('/data/AdriaArray_Data/dmt_database_mantle_M6.0-6.3', 0),
    ('/data/AdriaArray_Data/dmt_database_mantle_M6.3-10.0', 0),
    # ('/data/AdriaArray_Data/dmt_database_ISC_mantle_M5.0-5.4', 0),
    # ('/data/AdriaArray_Data/dmt_database_ISC_mantle_M5.4-5.7', 0),
    # ('/data/AdriaArray_Data/dmt_database_ISC_mantle_M5.7-6.0', 0),
    # ('/data/AdriaArray_Data/dmt_database_ISC_mantle_M6.0-10.0', 0),
]

# fnames = [('/data/AlpArray_Data/dmt_database_mantle_0.01-0.2_SKS-phase', 0),
#           ('/data/AlpArray_Data/dmt_database_mantle_0.01-0.2_S-phase', 0),]

####
script_location = '/home/marcel/VersionCtrl/git/pylot/pylot/correlation/submit_pick_corr_correction.sh'
####

for fnin, istart in fnames:
    input_cmds = f'qsub -q low.q@minos15,low.q@minos14,low.q@minos13,low.q@minos12,low.q@minos11 {script_location} {fnin} {istart}'

    print(input_cmds)
    print(subprocess.check_output(input_cmds.split()))
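The command assembly in the loop above is plain string building followed by whitespace splitting for `subprocess`; a minimal sketch with hypothetical paths:

```python
script_location = '/path/to/submit_pick_corr_correction.sh'  # hypothetical
fnin, istart = '/data/dmt_database_example', 0  # hypothetical database path

# same f-string pattern as the submit loop, reduced to one queue host
input_cmds = f'qsub -q low.q@minos15 {script_location} {fnin} {istart}'
print(input_cmds.split())
```

Splitting on whitespace only works because none of the arguments contain spaces; paths with spaces would need `shlex.split` on a properly quoted string instead.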
@ -1,61 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import os
import glob
import json

from obspy import read_events

from pylot.core.util.dataprocessing import Metadata
from pylot.core.util.obspyDMT_interface import qml_from_obspyDMT


def get_event_obspy_dmt(eventdir):
    event_pkl_file = os.path.join(eventdir, 'info', 'event.pkl')
    if not os.path.exists(event_pkl_file):
        raise IOError('Could not find event path for event: {}'.format(eventdir))
    event = qml_from_obspyDMT(event_pkl_file)
    return event


def get_event_pylot(eventdir, extension=''):
    event_id = get_event_id(eventdir)
    filename = os.path.join(eventdir, 'PyLoT_{}{}.xml'.format(event_id, extension))
    if not os.path.isfile(filename):
        return
    cat = read_events(filename)
    return cat[0]


def get_event_id(eventdir):
    event_id = os.path.split(eventdir)[-1]
    return event_id


def get_picks(eventdir, extension=''):
    event_id = get_event_id(eventdir)
    filename = 'PyLoT_{}{}.xml'.format(event_id, extension)
    fpath = os.path.join(eventdir, filename)
    fpaths = glob.glob(fpath)
    if len(fpaths) == 1:
        cat = read_events(fpaths[0])
        picks = cat[0].picks
        return picks
    elif len(fpaths) == 0:
        print('get_picks: File not found: {}'.format(fpath))
        return
    print(f'WARNING: Ambiguous pick file specification. Found the following pick files {fpaths}\nFilemask: {fpath}')
    return


def write_json(obj, fname):
    with open(fname, 'w') as outfile:
        json.dump(obj, outfile, sort_keys=True, indent=4)


def get_metadata(eventdir):
    metadata_path = os.path.join(eventdir, 'resp')
    metadata = Metadata(inventory=metadata_path, verbosity=0)
    return metadata
@ -1,7 +0,0 @@
Cartopy==0.23.0
joblib==1.4.2
obspy==1.4.1
pyaml==24.7.0
pyqtgraph==0.13.7
PySide2==5.15.8
pytest==8.3.2
setup.py (new file)
@ -0,0 +1,17 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from distutils.core import setup

setup(
    name='PyLoT',
    version='0.2',
    packages=['pylot', 'pylot.core', 'pylot.core.loc', 'pylot.core.pick',
              'pylot.core.io', 'pylot.core.util', 'pylot.core.active',
              'pylot.core.analysis', 'pylot.testing'],
    requires=['obspy', 'PySide', 'matplotlib', 'numpy', 'scipy', 'pyqtgraph'],
    url='dummy',
    license='LGPLv3',
    author='Sebastian Wehling-Benatelli',
    author_email='sebastian.wehling@rub.de',
    description='Comprehensive Python picking and Location Toolbox for seismological data.'
)
File diff suppressed because it is too large
@ -1,8 +1,6 @@
import unittest

from pylot.core.pick.autopick import PickingResults


class TestPickingResults(unittest.TestCase):

    def setUp(self):
@ -72,9 +70,9 @@ class TestPickingResults(unittest.TestCase):
            curr_len = len(self.pr)
        except Exception:
            self.fail("test_dunder_attributes overwrote an instance internal dunder method")
        self.assertEqual(prev_len + 1, curr_len)  # +1 for the added __len__ key/value-pair

        self.pr.__len__ = 42

        self.assertEqual(42, self.pr['__len__'])
        self.assertEqual(prev_len + 1, curr_len, msg="__len__ was overwritten")
@@ -1,6 +1,5 @@
import os
import unittest

from obspy import UTCDateTime
from obspy.io.xseed import Parser
from obspy.io.xseed.utils import SEEDParserException
@@ -28,7 +27,7 @@ class TestMetadata(unittest.TestCase):
        result = {}
        for channel in ('Z', 'N', 'E'):
            with HidePrints():
                coords = self.m.get_coordinates(self.station_id + channel, time=self.time)
                coords = self.m.get_coordinates(self.station_id+channel, time=self.time)
            result[channel] = coords
            self.assertDictEqual(result[channel], expected[channel])

@@ -43,7 +42,7 @@ class TestMetadata(unittest.TestCase):
        result = {}
        for channel in ('Z', 'N', 'E'):
            with HidePrints():
                coords = self.m.get_coordinates(self.station_id + channel)
                coords = self.m.get_coordinates(self.station_id+channel)
            result[channel] = coords
            self.assertDictEqual(result[channel], expected[channel])

@@ -146,7 +145,7 @@ class TestMetadata_read_single_file(unittest.TestCase):

    def test_read_single_file(self):
        """Test if reading a single file works"""
        fname = os.path.join(self.metadata_folders[0], 'DATALESS.' + self.station_id)
        fname = os.path.join(self.metadata_folders[0], 'DATALESS.'+self.station_id)
        with HidePrints():
            res = self.m.read_single_file(fname)
        # method should return true if file is successfully read
@@ -173,7 +172,7 @@ class TestMetadata_read_single_file(unittest.TestCase):

    def test_read_single_file_multiple_times(self):
        """Test if reading a file twice doesnt add it twice to the metadata object"""
        fname = os.path.join(self.metadata_folders[0], 'DATALESS.' + self.station_id)
        fname = os.path.join(self.metadata_folders[0], 'DATALESS.'+self.station_id)
        with HidePrints():
            res1 = self.m.read_single_file(fname)
            res2 = self.m.read_single_file(fname)
@@ -198,8 +197,7 @@ class TestMetadataMultipleTime(unittest.TestCase):
    def setUp(self):
        self.seed_id = 'LE.ROTT..HN'
        path = os.path.dirname(__file__)  # gets path to currently running script
        metadata = os.path.join('test_data', 'dless_multiple_times',
                                'MAGS2_LE_ROTT.dless')  # specific subfolder of test data
        metadata = os.path.join('test_data', 'dless_multiple_times', 'MAGS2_LE_ROTT.dless')  # specific subfolder of test data
        metadata_path = os.path.join(path, metadata)
        self.m = Metadata(metadata_path)
        self.p = Parser(metadata_path)
@@ -301,8 +299,7 @@ Channels:
    def setUp(self):
        self.seed_id = 'KB.TMO07.00.HHZ'
        path = os.path.dirname(__file__)  # gets path to currently running script
        metadata = os.path.join('test_data', 'dless_multiple_instruments',
                                'MAGS2_KB_TMO07.dless')  # specific subfolder of test data
        metadata = os.path.join('test_data', 'dless_multiple_instruments', 'MAGS2_KB_TMO07.dless')  # specific subfolder of test data
        metadata_path = os.path.join(path, metadata)
        self.m = Metadata(metadata_path)
        self.p = Parser(metadata_path)
35
tests/test_PickingParameters.py
Normal file
@@ -0,0 +1,35 @@
import unittest
from pylot.core.pick.autopick import PickingParameters

class TestPickingParameters(unittest.TestCase):

    def setUp(self):
        self.simple_dict = {'a': 3, 'b': 14}
        self.nested_dict = {'a': self.simple_dict, 'b': self.simple_dict}

    def assertParameterEquality(self, dic, instance):
        """Test wether all parameters given in dic are found in instance"""
        for key, value in dic.items():
            self.assertEqual(value, getattr(instance, key))

    def test_add_params_from_dict_simple(self):
        pickparam = PickingParameters()
        pickparam.add_params_from_dict(self.simple_dict)
        self.assertParameterEquality(self.simple_dict, pickparam)

    def test_add_params_from_dict_nested(self):
        pickparam = PickingParameters()
        pickparam.add_params_from_dict(self.nested_dict)
        self.assertParameterEquality(self.nested_dict, pickparam)

    def test_init(self):
        pickparam = PickingParameters(self.simple_dict)
        self.assertParameterEquality(self.simple_dict, pickparam)

    def test_dot_access(self):
        pickparam = PickingParameters(self.simple_dict)
        self.assertEqual(pickparam.a, self.simple_dict['a'])


if __name__ == '__main__':
    unittest.main()
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -1,99 +0,0 @@
%This is a parameter input file for PyLoT/autoPyLoT.
%All main and special settings regarding data handling
%and picking are to be set here!
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
dmt_database_test #datapath# %data path
20171010_063224.a #eventID# %event ID for single event processing (* for all events found in database)
#invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#NLLoc settings#
None #nllocbin# %path to NLLoc executable
None #nllocroot# %root of NLLoc-processing directory
None #phasefile# %name of autoPyLoT-output phase file for NLLoc
None #ctrfile# %name of autoPyLoT-output control file for NLLoc
ttime #ttpatter# %pattern of NLLoc ttimes from grid
AUTOLOC_nlloc #outpatter# %pattern of NLLoc-output file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#parameters for seismic moment estimation#
3530.0 #vp# %average P-wave velocity
2500.0 #rho# %average rock density [kg/m^3]
300.0 0.8 #Qp# %quality factor for P waves (Qp*f^a); list(Qp, a)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#settings local magnitude#
1.0 1.0 1.0 #WAscaling# %Scaling relation (log(Ao)+Alog(r)+Br+C) of Wood-Anderson amplitude Ao [nm] If zeros are set, original Richter magnitude is calculated!
1.0 1.0 #magscaling# %Scaling relation for derived local magnitude [a*Ml+b]. If zeros are set, no scaling of network magnitude is applied!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#filter settings#
0.03 0.03 #minfreq# %Lower filter frequency [P, S]
0.5 0.5 #maxfreq# %Upper filter frequency [P, S]
4 4 #filter_order# %filter order [P, S]
bandpass bandpass #filter_type# %filter type (bandpass, bandstop, lowpass, highpass) [P, S]
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#common settings picker#
global #extent# %extent of array ("local", "regional" or "global")
-100.0 #pstart# %start time [s] for calculating CF for P-picking (if TauPy: seconds relative to estimated onset)
50.0 #pstop# %end time [s] for calculating CF for P-picking (if TauPy: seconds relative to estimated onset)
-50.0 #sstart# %start time [s] relative to P-onset for calculating CF for S-picking
50.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
ak135 #taup_model# %Define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
P,Pdiff,S,SKS #taup_phases# %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
0.03 0.5 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
0.01 0.5 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
0.03 0.5 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
0.01 0.5 #bph2# %lower/upper corner freq. of second band pass filter z-comp. [Hz]
#special settings for calculating CF#
%!!Edit the following only if you know what you are doing!!%
#Z-component#
HOS #algoP# %choose algorithm for P-onset determination (HOS, ARZ, or AR3)
300.0 #tlta# %for HOS-/AR-AIC-picker, length of LTA window [s]
4 #hosorder# %for HOS-picker, order of Higher Order Statistics
2 #Parorder# %for AR-picker, order of AR process of Z-component
16.0 #tdet1z# %for AR-picker, length of AR determination window [s] for Z-component, 1st pick
10.0 #tpred1z# %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick
12.0 #tdet2z# %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick
6.0 #tpred2z# %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick
0.001 #addnoise# %add noise to seismogram for stable AR prediction
60.0 5.0 20.0 12.0 #tsnrz# %for HOS/AR, window lengths for SNR-and slope estimation [tnoise, tsafetey, tsignal, tslope] [s]
50.0 #pickwinP# %for initial AIC pick, length of P-pick window [s]
30.0 #Precalcwin# %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick)
2.0 #aictsmooth# %for HOS/AR, take average of samples for smoothing of AIC-function [s]
2.0 #tsmoothP# %for HOS/AR, take average of samples in this time window for smoothing CF [s]
0.006 #ausP# %for HOS/AR, artificial uplift of samples (aus) of CF (P)
2.0 #nfacP# %for HOS/AR, noise factor for noise level determination (P)
#H-components#
ARH #algoS# %choose algorithm for S-onset determination (ARH or AR3)
12.0 #tdet1h# %for HOS/AR, length of AR-determination window [s], H-components, 1st pick
6.0 #tpred1h# %for HOS/AR, length of AR-prediction window [s], H-components, 1st pick
8.0 #tdet2h# %for HOS/AR, length of AR-determinaton window [s], H-components, 2nd pick
4.0 #tpred2h# %for HOS/AR, length of AR-prediction window [s], H-components, 2nd pick
4 #Sarorder# %for AR-picker, order of AR process of H-components
100.0 #Srecalcwin# %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H)
195.0 #pickwinS# %for initial AIC pick, length of S-pick window [s]
60.0 10.0 30.0 12.0 #tsnrh# %for ARH/AR3, window lengths for SNR-and slope estimation [tnoise, tsafetey, tsignal, tslope] [s]
22.0 #aictsmoothS# %for AIC-picker, take average of samples in this time window for smoothing of AIC-function [s]
20.0 #tsmoothS# %for AR-picker, take average of samples for smoothing CF [s] (S)
0.001 #ausS# %for HOS/AR, artificial uplift of samples (aus) of CF (S)
2.0 #nfacS# %for AR-picker, noise factor for noise level determination (S)
#first-motion picker#
1 #minfmweight# %minimum required P weight for first-motion determination
3.0 #minFMSNR# %miniumum required SNR for first-motion determination
10.0 #fmpickwin# %pick window [s] around P onset for calculating zero crossings
#quality assessment#
0.1 0.2 0.4 0.8 #timeerrorsP# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
4.0 8.0 16.0 32.0 #timeerrorsS# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
0.005 #minAICPslope# %below this slope [counts/s] the initial P pick is rejected
1.1 #minAICPSNR# %below this SNR the initial P pick is rejected
0.002 #minAICSslope# %below this slope [counts/s] the initial S pick is rejected
1.3 #minAICSSNR# %below this SNR the initial S pick is rejected
20.0 #minsiglength# %length of signal part for which amplitudes must exceed noiselevel [s]
1.0 #noisefactor# %noiselevel*noisefactor=threshold
10.0 #minpercent# %required percentage of amplitudes exceeding threshold
0.1 #zfac# %P-amplitude must exceed at least zfac times RMS-S amplitude
100.0 #mdttolerance# %maximum allowed deviation of P picks from median [s]
50.0 #wdttolerance# %maximum allowed deviation from Wadati-diagram
25.0 #jackfactor# %pick is removed if the variance of the subgroup with the pick removed is larger than the mean variance of all subgroups times safety factor
@@ -1,67 +0,0 @@
import os
import pytest

from obspy import read_events

from autoPyLoT import autoPyLoT


class TestAutopickerGlobal():
    def init(self):
        self.params_infile = 'pylot_alparray_mantle_corr_stack_0.03-0.5.in'
        self.test_event_dir = 'dmt_database_test'
        self.fname_outfile_xml = os.path.join(
            self.test_event_dir, '20171010_063224.a', 'PyLoT_20171010_063224.a_autopylot.xml'
        )

        # check if the input files exist
        if not os.path.isfile(self.params_infile):
            print(f'Test input file {os.path.abspath(self.params_infile)} not found.')
            return False

        if not os.path.exists(self.test_event_dir):
            print(
                f'Test event directory not found at location "{os.path.abspath(self.test_event_dir)}". '
                f'Make sure to load it from the website first.'
            )
            return False

        return True

    def test_autopicker(self):
        assert self.init(), 'Initialization failed due to missing input files.'
        # check for output file in test directory and remove it if necessary
        if os.path.isfile(self.fname_outfile_xml):
            os.remove(self.fname_outfile_xml)
        autoPyLoT(inputfile=self.params_infile, eventid='20171010_063224.a', obspyDMT_wfpath='processed')

        # test for different known output files if they are identical or not
        compare_pickfiles(self.fname_outfile_xml, 'PyLoT_20171010_063224.a_autopylot.xml', True)
        compare_pickfiles(self.fname_outfile_xml, 'PyLoT_20171010_063224.a_saved_from_GUI.xml', True)
        compare_pickfiles(self.fname_outfile_xml, 'PyLoT_20171010_063224.a_corrected_taup_times_0.03-0.5_P.xml', False)


def compare_pickfiles(pickfile1: str, pickfile2: str, samefile: bool = True) -> None:
    """
    Compare the pick times and errors from two pick files.

    Parameters:
    pickfile1 (str): The path to the first pick file.
    pickfile2 (str): The path to the second pick file.
    samefile (bool): A flag indicating whether the two files are expected to be the same. Defaults to True.

    Returns:
    None
    """
    cat1 = read_events(pickfile1)
    cat2 = read_events(pickfile2)
    picks1 = sorted(cat1[0].picks, key=lambda pick: str(pick.waveform_id))
    picks2 = sorted(cat2[0].picks, key=lambda pick: str(pick.waveform_id))
    pick_times1 = [pick.time for pick in picks1]
    pick_times2 = [pick.time for pick in picks2]
    pick_terrs1 = [pick.time_errors for pick in picks1]
    pick_terrs2 = [pick.time_errors for pick in picks2]

    # check if times and errors are identical or not depending on the samefile flag
    assert (pick_times1 == pick_times2) is samefile, 'Pick times error'
    assert (pick_terrs1 == pick_terrs2) is samefile, 'Pick time errors errors'
@@ -4,8 +4,10 @@
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
/home/darius/alparray/waveforms_used #datapath# %data path
e0093.173.16 #eventID# %event ID for single event processing (* for all events found in datapath)
/home/darius #rootpath# %project path
alparray #datapath# %data path
waveforms_used #database# %name of data base
e0093.173.16 #eventID# %event ID for single event processing (* for all events found in database)
/home/darius/alparray/metadata #invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
@@ -41,7 +43,6 @@ global #extent# %extent of a
875.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
False #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
IASP91 #taup_model# %define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
P,Pdiff,S,Sdiff #taup_phases# %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
0.01 0.1 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
0.001 0.5 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
0.01 0.5 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
@@ -4,8 +4,10 @@
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
/home/darius/alparray/waveforms_used #datapath# %data path
e0093.173.16 #eventID# %event ID for single event processing (* for all events found in datapath)
/home/darius #rootpath# %project path
alparray #datapath# %data path
waveforms_used #database# %name of data base
e0093.173.16 #eventID# %event ID for single event processing (* for all events found in database)
/home/darius/alparray/metadata #invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
@@ -41,7 +43,6 @@ global #extent# %extent of a
875.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
IASP91 #taup_model# %define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
P,Pdiff,S,Sdiff #taup_phases# %Specify possible phases for TauPy (comma separated). See Obspy TauPy documentation for possible values.
0.01 0.1 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
0.001 0.5 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
0.01 0.5 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
@@ -1,21 +1,21 @@
<?xml version='1.0' encoding='utf-8'?>
<q:quakeml xmlns:q="http://quakeml.org/xmlns/quakeml/1.2" xmlns="http://quakeml.org/xmlns/bed/1.2">
  <eventParameters publicID="smi:local/53a38563-739a-48b2-9f34-bf40ee7b656a">
    <event publicID="smi:local/e0001.024.16">
      <origin publicID="smi:local/e0001.024.16">
        <time>
          <value>2016-01-24T10:30:30.000000Z</value>
        </time>
        <latitude>
          <value>59.66</value>
        </latitude>
        <longitude>
          <value>-153.45</value>
        </longitude>
        <depth>
          <value>128.0</value>
        </depth>
      </origin>
    </event>
  </eventParameters>
  <eventParameters publicID="smi:local/53a38563-739a-48b2-9f34-bf40ee7b656a">
    <event publicID="smi:local/e0001.024.16">
      <origin publicID="smi:local/e0001.024.16">
        <time>
          <value>2016-01-24T10:30:30.000000Z</value>
        </time>
        <latitude>
          <value>59.66</value>
        </latitude>
        <longitude>
          <value>-153.45</value>
        </longitude>
        <depth>
          <value>128.0</value>
        </depth>
      </origin>
    </event>
  </eventParameters>
</q:quakeml>
@@ -1,14 +1,12 @@
import os
import sys
import unittest
import pytest

from unittest import skip
import obspy
from obspy import UTCDateTime

from pylot.core.io.data import Data
from pylot.core.io.inputs import PylotParameter
import os
import sys
from pylot.core.pick.autopick import autopickstation
from pylot.core.io.inputs import PylotParameter
from pylot.core.io.data import Data
from pylot.core.util.utils import trim_station_components
@ -95,155 +93,82 @@ class TestAutopickStation(unittest.TestCase):
|
||||
self.inputfile_taupy_disabled = os.path.join(os.path.dirname(__file__), 'autoPyLoT_global_taupy_false.in')
|
||||
self.pickparam_taupy_enabled = PylotParameter(fnin=self.inputfile_taupy_enabled)
|
||||
self.pickparam_taupy_disabled = PylotParameter(fnin=self.inputfile_taupy_disabled)
|
||||
self.xml_file = os.path.join(os.path.dirname(__file__), self.event_id, 'PyLoT_' + self.event_id + '.xml')
|
||||
self.xml_file = os.path.join(os.path.dirname(__file__),self.event_id, 'PyLoT_'+self.event_id+'.xml')
|
||||
self.data = Data(evtdata=self.xml_file)
|
||||
# create origin for taupy testing
|
||||
self.origin = [obspy.core.event.origin.Origin(magnitude=7.1, latitude=59.66, longitude=-153.45, depth=128.0,
|
||||
time=UTCDateTime("2016-01-24T10:30:30.0"))]
|
||||
self.origin = [obspy.core.event.origin.Origin(magnitude=7.1, latitude=59.66, longitude=-153.45, depth=128.0, time=UTCDateTime("2016-01-24T10:30:30.0"))]
|
||||
# mocking metadata since reading it takes a long time to read from file
|
||||
self.metadata = MockMetadata()
|
||||
|
||||
# show complete diff when difference in results dictionaries are found
|
||||
self.maxDiff = None
|
||||
|
||||
#@skip("Works")
|
||||
def test_autopickstation_taupy_disabled_gra1(self):
|
||||
expected = {
|
||||
'P': {'picker': 'auto', 'snrdb': 15.405649120980094, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None,
|
||||
'fc': None, 'snr': 34.718816470730317, 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 31, 690000),
|
||||
'w0': None, 'spe': 0.93333333333333235, 'network': u'GR',
|
||||
'epp': UTCDateTime(2016, 1, 24, 10, 41, 28, 890000),
|
||||
'lpp': UTCDateTime(2016, 1, 24, 10, 41, 32, 690000), 'fm': 'D', 'channel': u'LHZ'},
|
||||
'S': {'picker': 'auto', 'snrdb': 10.669661906545489, 'network': u'GR', 'weight': 0, 'Ao': None,
|
||||
'lpp': UTCDateTime(2016, 1, 24, 10, 50, 30, 690000), 'snr': 11.667187857573905,
|
||||
'epp': UTCDateTime(2016, 1, 24, 10, 50, 21, 690000),
|
||||
'mpp': UTCDateTime(2016, 1, 24, 10, 50, 29, 690000), 'fm': None, 'spe': 2.6666666666666665,
|
||||
'channel': u'LHE'}}
|
||||
expected = {'P': {'picker': 'auto', 'snrdb': 15.405649120980094, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None, 'fc': None, 'snr': 34.718816470730317, 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 31, 690000), 'w0': None, 'spe': 0.93333333333333235, 'network': u'GR', 'epp': UTCDateTime(2016, 1, 24, 10, 41, 28, 890000), 'lpp': UTCDateTime(2016, 1, 24, 10, 41, 32, 690000), 'fm': 'D', 'channel': u'LHZ'}, 'S': {'picker': 'auto', 'snrdb': 10.669661906545489, 'network': u'GR', 'weight': 0, 'Ao': None, 'lpp': UTCDateTime(2016, 1, 24, 10, 50, 30, 690000), 'snr': 11.667187857573905, 'epp': UTCDateTime(2016, 1, 24, 10, 50, 21, 690000), 'mpp': UTCDateTime(2016, 1, 24, 10, 50, 29, 690000), 'fm': None, 'spe': 2.6666666666666665, 'channel': u'LHE'}}
|
||||
with HidePrints():
|
||||
result, station = autopickstation(wfstream=self.gra1, pickparam=self.pickparam_taupy_disabled,
|
||||
metadata=(None, None))
|
||||
compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
|
||||
compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
|
||||
result, station = autopickstation(wfstream=self.gra1, pickparam=self.pickparam_taupy_disabled, metadata=(None, None))
|
||||
self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
|
||||
self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
|
||||
self.assertEqual('GRA1', station)
|
||||
|
||||
def test_autopickstation_taupy_enabled_gra1(self):
|
||||
expected = {
|
||||
'P': {'picker': 'auto', 'snrdb': 15.599905299126778, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None,
|
||||
'fc': None, 'snr': 36.307013769185403, 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 27, 690000),
|
||||
'w0': None, 'spe': 0.93333333333333235, 'network': u'GR',
|
||||
'epp': UTCDateTime(2016, 1, 24, 10, 41, 24, 890000),
|
||||
'lpp': UTCDateTime(2016, 1, 24, 10, 41, 28, 690000), 'fm': 'U', 'channel': u'LHZ'},
|
||||
'S': {'picker': 'auto', 'snrdb': 10.669661906545489, 'network': u'GR', 'weight': 0, 'Ao': None,
|
||||
'lpp': UTCDateTime(2016, 1, 24, 10, 50, 30, 690000), 'snr': 11.667187857573905,
|
||||
'epp': UTCDateTime(2016, 1, 24, 10, 50, 21, 690000),
|
||||
'mpp': UTCDateTime(2016, 1, 24, 10, 50, 29, 690000), 'fm': None, 'spe': 2.6666666666666665,
|
||||
'channel': u'LHE'}}
|
||||
expected = {'P': {'picker': 'auto', 'snrdb': 15.599905299126778, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None, 'fc': None, 'snr': 36.307013769185403, 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 27, 690000), 'w0': None, 'spe': 0.93333333333333235, 'network': u'GR', 'epp': UTCDateTime(2016, 1, 24, 10, 41, 24, 890000), 'lpp': UTCDateTime(2016, 1, 24, 10, 41, 28, 690000), 'fm': 'U', 'channel': u'LHZ'}, 'S': {'picker': 'auto', 'snrdb': 10.669661906545489, 'network': u'GR', 'weight': 0, 'Ao': None, 'lpp': UTCDateTime(2016, 1, 24, 10, 50, 30, 690000), 'snr': 11.667187857573905, 'epp': UTCDateTime(2016, 1, 24, 10, 50, 21, 690000), 'mpp': UTCDateTime(2016, 1, 24, 10, 50, 29, 690000), 'fm': None, 'spe': 2.6666666666666665, 'channel': u'LHE'}}
|
||||
with HidePrints():
|
||||
result, station = autopickstation(wfstream=self.gra1, pickparam=self.pickparam_taupy_enabled,
|
||||
metadata=self.metadata, origin=self.origin)
|
||||
compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
|
||||
compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
|
||||
result, station = autopickstation(wfstream=self.gra1, pickparam=self.pickparam_taupy_enabled, metadata=self.metadata, origin=self.origin)
|
||||
self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
|
||||
self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
|
||||
self.assertEqual('GRA1', station)
|
||||
|
||||
def test_autopickstation_taupy_disabled_gra2(self):
|
||||
expected = {
|
||||
'P': {'picker': 'auto', 'snrdb': None, 'weight': 9, 'Mo': None, 'marked': 'shortsignallength', 'Mw': None,
|
||||
'fc': None, 'snr': None, 'mpp': UTCDateTime(2016, 1, 24, 10, 36, 59, 150000), 'w0': None, 'spe': None,
|
||||
'network': u'GR', 'epp': UTCDateTime(2016, 1, 24, 10, 36, 43, 150000),
|
||||
'lpp': UTCDateTime(2016, 1, 24, 10, 37, 15, 150000), 'fm': 'N', 'channel': u'LHZ'},
|
||||
'S': {'picker': 'auto', 'snrdb': None, 'network': u'GR', 'weight': 4, 'Ao': None,
|
||||
'lpp': UTCDateTime(2016, 1, 24, 10, 37, 15, 150000), 'snr': None,
|
||||
'epp': UTCDateTime(2016, 1, 24, 10, 36, 43, 150000),
|
||||
'mpp': UTCDateTime(2016, 1, 24, 10, 36, 59, 150000), 'fm': None, 'spe': None, 'channel': u'LHE'}}
|
||||
expected = {'P': {'picker': 'auto', 'snrdb': None, 'weight': 9, 'Mo': None, 'marked': 'shortsignallength', 'Mw': None, 'fc': None, 'snr': None, 'mpp': UTCDateTime(2016, 1, 24, 10, 36, 59, 150000), 'w0': None, 'spe': None, 'network': u'GR', 'epp': UTCDateTime(2016, 1, 24, 10, 36, 43, 150000), 'lpp': UTCDateTime(2016, 1, 24, 10, 37, 15, 150000), 'fm': 'N', 'channel': u'LHZ'}, 'S': {'picker': 'auto', 'snrdb': None, 'network': u'GR', 'weight': 4, 'Ao': None, 'lpp': UTCDateTime(2016, 1, 24, 10, 37, 15, 150000), 'snr': None, 'epp': UTCDateTime(2016, 1, 24, 10, 36, 43, 150000), 'mpp': UTCDateTime(2016, 1, 24, 10, 36, 59, 150000), 'fm': None, 'spe': None, 'channel': u'LHE'}}
|
||||
with HidePrints():
|
||||
result, station = autopickstation(wfstream=self.gra2, pickparam=self.pickparam_taupy_disabled,
|
||||
metadata=(None, None))
|
||||
compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
|
||||
compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
|
||||
result, station = autopickstation(wfstream=self.gra2, pickparam=self.pickparam_taupy_disabled, metadata=(None, None))
|
||||
self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
|
||||
self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
|
||||
self.assertEqual('GRA2', station)
|
||||
|
||||
def test_autopickstation_taupy_enabled_gra2(self):
|
||||
expected = {
|
||||
'P': {'picker': 'auto', 'snrdb': 13.957959025719253, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None,
|
||||
'fc': None, 'snr': 24.876879503607871, 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 29, 150000),
|
||||
'w0': None, 'spe': 1.0, 'network': u'GR', 'epp': UTCDateTime(2016, 1, 24, 10, 41, 26, 150000),
|
||||
'lpp': UTCDateTime(2016, 1, 24, 10, 41, 30, 150000), 'fm': None, 'channel': u'LHZ'},
|
||||
'S': {'picker': 'auto', 'snrdb': 10.573236990555648, 'network': u'GR', 'weight': 1, 'Ao': None,
|
||||
'lpp': UTCDateTime(2016, 1, 24, 10, 50, 34, 150000), 'snr': 11.410999834108294,
|
||||
'epp': UTCDateTime(2016, 1, 24, 10, 50, 21, 150000),
|
||||
'mpp': UTCDateTime(2016, 1, 24, 10, 50, 33, 150000), 'fm': None, 'spe': 4.666666666666667,
|
||||
'channel': u'LHE'}}
|
||||
expected = {'P': {'picker': 'auto', 'snrdb': 13.957959025719253, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None, 'fc': None, 'snr': 24.876879503607871, 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 29, 150000), 'w0': None, 'spe': 1.0, 'network': u'GR', 'epp': UTCDateTime(2016, 1, 24, 10, 41, 26, 150000), 'lpp': UTCDateTime(2016, 1, 24, 10, 41, 30, 150000), 'fm': None, 'channel': u'LHZ'}, 'S': {'picker': 'auto', 'snrdb': 10.573236990555648, 'network': u'GR', 'weight': 1, 'Ao': None, 'lpp': UTCDateTime(2016, 1, 24, 10, 50, 34, 150000), 'snr': 11.410999834108294, 'epp': UTCDateTime(2016, 1, 24, 10, 50, 21, 150000), 'mpp': UTCDateTime(2016, 1, 24, 10, 50, 33, 150000), 'fm': None, 'spe': 4.666666666666667, 'channel': u'LHE'}}
|
||||
with HidePrints():
|
||||
result, station = autopickstation(wfstream=self.gra2, pickparam=self.pickparam_taupy_enabled,
|
||||
metadata=self.metadata, origin=self.origin)
|
||||
compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
|
||||
compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
|
||||
result, station = autopickstation(wfstream=self.gra2, pickparam=self.pickparam_taupy_enabled, metadata=self.metadata, origin = self.origin)
|
||||
self.assertDictContainsSubset(expected=expected['P'], actual=result['P'])
|
||||
self.assertDictContainsSubset(expected=expected['S'], actual=result['S'])
|
||||
self.assertEqual('GRA2', station)
|
||||
|
||||
    def test_autopickstation_taupy_disabled_ech(self):
        expected = {'P': {'picker': 'auto', 'snrdb': None, 'weight': 9, 'Mo': None, 'marked': 'SinsteadP', 'Mw': None,
                          'fc': None, 'snr': None, 'mpp': UTCDateTime(2016, 1, 24, 10, 26, 57), 'w0': None, 'spe': None,
                          'network': u'G', 'epp': UTCDateTime(2016, 1, 24, 10, 26, 41),
                          'lpp': UTCDateTime(2016, 1, 24, 10, 27, 13), 'fm': 'N', 'channel': u'LHZ'},
                    'S': {'picker': 'auto', 'snrdb': None, 'network': u'G', 'weight': 4, 'Ao': None,
                          'lpp': UTCDateTime(2016, 1, 24, 10, 27, 13), 'snr': None,
                          'epp': UTCDateTime(2016, 1, 24, 10, 26, 41), 'mpp': UTCDateTime(2016, 1, 24, 10, 26, 57),
                          'fm': None, 'spe': None, 'channel': u'LHE'}}
        with HidePrints():
            result, station = autopickstation(wfstream=self.ech, pickparam=self.pickparam_taupy_disabled)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertEqual('ECH', station)

    def test_autopickstation_taupy_enabled_ech(self):
        # this station has a long time before the first onset, so taupy will help during picking
        expected = {
            'P': {'picker': 'auto', 'snrdb': 9.9753586609166316, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None,
                  'fc': None, 'snr': 9.9434218804137107, 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 34), 'w0': None,
                  'spe': 1.6666666666666667, 'network': u'G', 'epp': UTCDateTime(2016, 1, 24, 10, 41, 29),
                  'lpp': UTCDateTime(2016, 1, 24, 10, 41, 35), 'fm': None, 'channel': u'LHZ'},
            'S': {'picker': 'auto', 'snrdb': 12.698999454169567, 'network': u'G', 'weight': 0, 'Ao': None,
                  'lpp': UTCDateTime(2016, 1, 24, 10, 50, 44), 'snr': 18.616581906366577,
                  'epp': UTCDateTime(2016, 1, 24, 10, 50, 33), 'mpp': UTCDateTime(2016, 1, 24, 10, 50, 43), 'fm': None,
                  'spe': 3.3333333333333335, 'channel': u'LHE'}}
        with HidePrints():
            result, station = autopickstation(wfstream=self.ech, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertEqual('ECH', station)

    def test_autopickstation_taupy_disabled_fiesa(self):
        # this station has a long time before the first onset, so taupy will help during picking
        expected = {'P': {'picker': 'auto', 'snrdb': None, 'weight': 9, 'Mo': None, 'marked': 'SinsteadP', 'Mw': None,
                          'fc': None, 'snr': None, 'mpp': UTCDateTime(2016, 1, 24, 10, 35, 58), 'w0': None, 'spe': None,
                          'network': u'CH', 'epp': UTCDateTime(2016, 1, 24, 10, 35, 42),
                          'lpp': UTCDateTime(2016, 1, 24, 10, 36, 14), 'fm': 'N', 'channel': u'LHZ'},
                    'S': {'picker': 'auto', 'snrdb': None, 'network': u'CH', 'weight': 4, 'Ao': None,
                          'lpp': UTCDateTime(2016, 1, 24, 10, 36, 14), 'snr': None,
                          'epp': UTCDateTime(2016, 1, 24, 10, 35, 42), 'mpp': UTCDateTime(2016, 1, 24, 10, 35, 58),
                          'fm': None, 'spe': None, 'channel': u'LHE'}}
        with HidePrints():
            result, station = autopickstation(wfstream=self.fiesa, pickparam=self.pickparam_taupy_disabled)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertEqual('FIESA', station)

    def test_autopickstation_taupy_enabled_fiesa(self):
        # this station has a long time before the first onset, so taupy will help during picking
        expected = {
            'P': {'picker': 'auto', 'snrdb': 13.921049277904373, 'weight': 0, 'Mo': None, 'marked': [], 'Mw': None,
                  'fc': None, 'snr': 24.666352170589487, 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 47), 'w0': None,
                  'spe': 1.2222222222222285, 'network': u'CH', 'epp': UTCDateTime(2016, 1, 24, 10, 41, 43, 333333),
                  'lpp': UTCDateTime(2016, 1, 24, 10, 41, 48), 'fm': None, 'channel': u'LHZ'},
            'S': {'picker': 'auto', 'snrdb': 10.893086316477728, 'network': u'CH', 'weight': 0, 'Ao': None,
                  'lpp': UTCDateTime(2016, 1, 24, 10, 51, 5), 'snr': 12.283118216397849,
                  'epp': UTCDateTime(2016, 1, 24, 10, 50, 59, 333333), 'mpp': UTCDateTime(2016, 1, 24, 10, 51, 2),
                  'fm': None, 'spe': 2.8888888888888764, 'channel': u'LHE'}}
        with HidePrints():
            result, station = autopickstation(wfstream=self.fiesa, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertEqual('FIESA', station)

    def test_autopickstation_gra1_z_comp_missing(self):
        wfstream = self.gra1.copy()
        wfstream = wfstream.select(channel='*E') + wfstream.select(channel='*N')
        with HidePrints():
            result, station = autopickstation(wfstream=wfstream, pickparam=self.pickparam_taupy_disabled,
                                              metadata=(None, None))
        self.assertIsNone(result)
        self.assertEqual('GRA1', station)

@@ -260,91 +184,28 @@ class TestAutopickStation(unittest.TestCase):
        """Picking on a stream without horizontal traces should still pick the P phase on the vertical component"""
        wfstream = self.gra1.copy()
        wfstream = wfstream.select(channel='*Z')
        expected = {
            'P': {'picker': 'auto', 'snrdb': 15.405649120980094, 'network': u'GR', 'weight': 0, 'Ao': None, 'Mo': None,
                  'marked': [], 'lpp': UTCDateTime(2016, 1, 24, 10, 41, 32, 690000), 'Mw': None, 'fc': None,
                  'snr': 34.718816470730317, 'epp': UTCDateTime(2016, 1, 24, 10, 41, 28, 890000),
                  'mpp': UTCDateTime(2016, 1, 24, 10, 41, 31, 690000), 'w0': None, 'spe': 0.9333333333333323, 'fm': 'D',
                  'channel': u'LHZ'},
            'S': {'picker': 'auto', 'snrdb': None, 'network': None, 'weight': 4, 'Mo': None, 'Ao': None, 'lpp': None,
                  'Mw': None, 'fc': None, 'snr': None, 'marked': [], 'mpp': None, 'w0': None, 'spe': None, 'epp': None,
                  'fm': 'N', 'channel': None}}
        with HidePrints():
            result, station = autopickstation(wfstream=wfstream, pickparam=self.pickparam_taupy_disabled,
                                              metadata=(None, None))
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')
        self.assertEqual('GRA1', station)

    def test_autopickstation_a106_taupy_enabled(self):
        """This station has invalid values recorded on both N and E component, but a pick can still be found on Z"""
        expected = {
            'P': {'picker': 'auto', 'snrdb': 12.862128789922826, 'network': u'Z3', 'weight': 0, 'Ao': None, 'Mo': None,
                  'marked': [], 'lpp': UTCDateTime(2016, 1, 24, 10, 41, 34), 'Mw': None, 'fc': None,
                  'snr': 19.329155459132608, 'epp': UTCDateTime(2016, 1, 24, 10, 41, 30),
                  'mpp': UTCDateTime(2016, 1, 24, 10, 41, 33), 'w0': None, 'spe': 1.6666666666666667, 'fm': None,
                  'channel': u'LHZ'},
            'S': {'picker': 'auto', 'snrdb': None, 'network': u'Z3', 'weight': 4, 'Ao': None, 'Mo': None, 'marked': [],
                  'lpp': UTCDateTime(2016, 1, 24, 10, 28, 56), 'Mw': None, 'fc': None, 'snr': None,
                  'epp': UTCDateTime(2016, 1, 24, 10, 28, 24), 'mpp': UTCDateTime(2016, 1, 24, 10, 28, 40), 'w0': None,
                  'spe': None, 'fm': None, 'channel': u'LHE'}}
        with HidePrints():
            result, station = autopickstation(wfstream=self.a106, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')

    def test_autopickstation_station_missing_in_metadata(self):
        """This station is not in the metadata, but Taupy is enabled. Taupy should exit cleanly and modify the starttime
        relative to the theoretical onset to one relative to the trace's starttime, i.e. never negative.
        """
        self.pickparam_taupy_enabled.setParamKV('pstart', -100)  # modify starttime to be relative to theoretical onset
        expected = {
            'P': {'picker': 'auto', 'snrdb': 14.464757855513506, 'network': u'Z3', 'weight': 0, 'Mo': None, 'Ao': None,
                  'lpp': UTCDateTime(2016, 1, 24, 10, 41, 39, 605000), 'Mw': None, 'fc': None,
                  'snr': 27.956048519707181, 'marked': [], 'mpp': UTCDateTime(2016, 1, 24, 10, 41, 38, 605000),
                  'w0': None, 'spe': 1.6666666666666667, 'epp': UTCDateTime(2016, 1, 24, 10, 41, 35, 605000),
                  'fm': None, 'channel': u'LHZ'},
            'S': {'picker': 'auto', 'snrdb': 10.112844176301248, 'network': u'Z3', 'weight': 1, 'Mo': None, 'Ao': None,
                  'lpp': UTCDateTime(2016, 1, 24, 10, 50, 51, 605000), 'Mw': None, 'fc': None,
                  'snr': 10.263238413785425, 'marked': [], 'mpp': UTCDateTime(2016, 1, 24, 10, 50, 48, 605000),
                  'w0': None, 'spe': 4.666666666666667, 'epp': UTCDateTime(2016, 1, 24, 10, 50, 40, 605000), 'fm': None,
                  'channel': u'LHE'}}
        with HidePrints():
            result, station = autopickstation(wfstream=self.a005a, pickparam=self.pickparam_taupy_enabled,
                                              metadata=self.metadata, origin=self.origin)
        compare_dicts(expected=expected['P'], result=result['P'], hint='P-')
        compare_dicts(expected=expected['S'], result=result['S'], hint='S-')

def run_dict_comparison(result, expected):
    for key, expected_value in expected.items():
        if isinstance(expected_value, dict):
            run_dict_comparison(result[key], expected[key])
        else:
            res = result[key]
            if isinstance(res, UTCDateTime) and isinstance(expected_value, UTCDateTime):
                res = res.timestamp
                expected_value = expected_value.timestamp
            assert expected_value == pytest.approx(res), f'{key}: {expected_value} != {res}'


def compare_dicts(result, expected, hint=''):
    try:
        run_dict_comparison(result, expected)
    except AssertionError:
        raise AssertionError(f'{hint}Dictionaries not equal.'
                             f'\n\n<<Expected>>\n{pretty_print_dict(expected)}'
                             f'\n\n<<Result>>\n{pretty_print_dict(result)}')


def pretty_print_dict(dct):
    retstr = ''
    for key, value in sorted(dct.items(), key=lambda x: x[0]):
        retstr += f"{key} : {value}\n"
    return retstr
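run_dict_comparison converts UTCDateTime values to POSIX timestamps so that pytest.approx can absorb floating-point noise in the pick times. The same idea in a self-contained sketch — hypothetical values, with stdlib math.isclose standing in for pytest.approx:

```python
import math

def approx_dict_equal(result, expected, rel_tol=1e-9):
    """Recursively compare dicts, letting float values differ within tolerance."""
    for key, expected_value in expected.items():
        res = result[key]
        if isinstance(expected_value, dict):
            approx_dict_equal(res, expected_value, rel_tol)
        elif isinstance(expected_value, float):
            # tolerant comparison for floats (snr, spe, timestamps, ...)
            assert math.isclose(res, expected_value, rel_tol=rel_tol), f'{key}: {expected_value} != {res}'
        else:
            # exact comparison for everything else (weights, channel names, ...)
            assert res == expected_value, f'{key}: {expected_value} != {res}'

# hypothetical pick results, identical up to float round-off
a = {'snr': 9.94342188041371, 'P': {'spe': 1.6666666666666667}}
b = {'snr': 9.943421880413711, 'P': {'spe': 1.6666666666666665}}
approx_dict_equal(a, b)  # passes, where an exact == on the floats could fail
```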

if __name__ == '__main__':
    unittest.main()

@@ -1,33 +0,0 @@
import unittest

from pylot.core.io.phases import getQualitiesfromxml


class TestQualityFromXML(unittest.TestCase):
    def setUp(self):
        self.path = '.'
        self.ErrorsP = [0.02, 0.04, 0.08, 0.16]
        self.ErrorsS = [0.04, 0.08, 0.16, 0.32]
        self.test0_result = [[0.0136956521739, 0.0126, 0.0101612903226, 0.00734848484849, 0.0135069444444,
                              0.00649659863946, 0.0129513888889, 0.0122747747748, 0.0119252873563, 0.0103947368421,
                              0.0092380952381, 0.00916666666667, 0.0104444444444, 0.0125333333333, 0.00904761904762,
                              0.00885714285714, 0.00911616161616, 0.0164166666667, 0.0128787878788, 0.0122756410256,
                              0.013653253667966917], [0.0239333333333, 0.0223791578953, 0.0217974304255],
                             [0.0504861111111, 0.0610833333333], [], [0.171029411765]], [
                                [0.0195, 0.0203623188406, 0.0212121212121, 0.0345833333333, 0.0196180555556,
                                 0.0202536231884, 0.0200347222222, 0.0189, 0.0210763888889, 0.018275862069,
                                 0.0213888888889, 0.0319791666667, 0.0205303030303, 0.0156388888889, 0.0192,
                                 0.0231349206349, 0.023625, 0.02875, 0.0195512820513, 0.0239393939394, 0.0234166666667,
                                 0.0174702380952, 0.0204151307995], [0.040314343081226646], [0.148555555556], [], []]
        self.test1_result = [77.77777777777777, 11.11111111111111, 7.407407407407407, 0, 3.7037037037037037], \
                            [92.0, 4.0, 4.0, 0, 0]

    def test_result_plotflag0(self):
        self.assertEqual(getQualitiesfromxml(self.path, self.ErrorsP, self.ErrorsS, 0), self.test0_result)

    def test_result_plotflag1(self):
        self.assertEqual(getQualitiesfromxml(self.path, self.ErrorsP, self.ErrorsS, 1), self.test1_result)


if __name__ == '__main__':
    unittest.main()
@@ -1,5 +1,4 @@
import unittest

from pylot.core.pick.utils import get_quality_class


@@ -53,6 +52,5 @@ class TestQualityClassFromUncertainty(unittest.TestCase):
        # Error exactly in class 3
        self.assertEqual(3, get_quality_class(5.6, self.error_classes))


if __name__ == '__main__':
    unittest.main()
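get_quality_class maps a pick uncertainty onto a discrete quality class using a list of error bounds such as the ErrorsS list above. A minimal sketch of that mapping — an assumption for illustration, not PyLoT's actual implementation; in particular the boundary handling (strict vs non-strict comparison) may differ:

```python
def quality_class(uncertainty, error_bounds):
    """Return the quality class (0 = best) for a pick uncertainty,
    where error_bounds holds the upper bound of each class.
    Sketch only; not the get_quality_class from pylot.core.pick.utils."""
    for quality, bound in enumerate(error_bounds):
        if uncertainty < bound:
            return quality
    # larger than every bound: worst class
    return len(error_bounds)

errors_s = [0.04, 0.08, 0.16, 0.32]  # S-phase bounds from the setUp above
quality_class(0.05, errors_s)  # -> 1
quality_class(0.50, errors_s)  # -> 4 (worse than every bound)
```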
@@ -1,76 +0,0 @@
import pytest
from obspy import read, Trace, UTCDateTime

from pylot.correlation.pick_correlation_correction import XCorrPickCorrection


class TestXCorrPickCorrection():
    def setup(self):
        self.make_test_traces()
        self.make_test_picks()
        self.t_before = 2.
        self.t_after = 2.
        self.cc_maxlag = 0.5

    def make_test_traces(self):
        # take first trace of test Stream from obspy
        tr1 = read()[0]
        # filter trace
        tr1.filter('bandpass', freqmin=1, freqmax=20)
        # make a copy and shift the copy by 0.1 s
        tr2 = tr1.copy()
        tr2.stats.starttime += 0.1

        self.trace1 = tr1
        self.trace2 = tr2

    def make_test_picks(self):
        # create an artificial reference pick on the reference trace (trace1) and another one on the 0.1 s shifted trace
        self.tpick1 = UTCDateTime('2009-08-24T00:20:07.7')
        # shift the second pick by 0.2 s, the correction should be around 0.1 s now
        self.tpick2 = self.tpick1 + 0.2

    def test_slice_trace_okay(self):
        self.setup()
        xcpc = XCorrPickCorrection(UTCDateTime(), Trace(), UTCDateTime(), Trace(),
                                   t_before=self.t_before, t_after=self.t_after, cc_maxlag=self.cc_maxlag)

        test_trace = self.trace1
        pick_time = self.tpick2

        sliced_trace = xcpc.slice_trace(test_trace, pick_time)
        assert ((sliced_trace.stats.starttime == pick_time - self.t_before - self.cc_maxlag / 2)
                and (sliced_trace.stats.endtime == pick_time + self.t_after + self.cc_maxlag / 2))

    def test_slice_trace_fails(self):
        self.setup()

        test_trace = self.trace1
        pick_time = self.tpick1

        with pytest.raises(ValueError):
            xcpc = XCorrPickCorrection(UTCDateTime(), Trace(), UTCDateTime(), Trace(),
                                       t_before=self.t_before + 20, t_after=self.t_after, cc_maxlag=self.cc_maxlag)
            xcpc.slice_trace(test_trace, pick_time)

        with pytest.raises(ValueError):
            xcpc = XCorrPickCorrection(UTCDateTime(), Trace(), UTCDateTime(), Trace(),
                                       t_before=self.t_before, t_after=self.t_after + 50, cc_maxlag=self.cc_maxlag)
            xcpc.slice_trace(test_trace, pick_time)

    def test_cross_correlation(self):
        self.setup()

        # create XCorrPickCorrection object
        xcpc = XCorrPickCorrection(self.tpick1, self.trace1, self.tpick2, self.trace2, t_before=self.t_before,
                                   t_after=self.t_after, cc_maxlag=self.cc_maxlag)

        # execute correlation
        correction, cc_max, uncert, fwfm = xcpc.cross_correlation(False, '', '')

        # define expected test result
        test_result = (-0.09983091718314982, 0.9578431835689154, 0.0015285160561610929, 0.03625786256084631)

        # check results
        assert pytest.approx(test_result, rel=1e-6) == (correction, cc_max, uncert, fwfm)
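test_cross_correlation expects a correction of roughly -0.1 s: the second pick sits 0.2 s after the first, while the waveform itself is only shifted by 0.1 s, so the correlation peak exposes the 0.1 s discrepancy. The peak search at the heart of such a correction can be sketched on toy signals (an illustration of the principle, not the XCorrPickCorrection implementation):

```python
def best_lag(a, b, max_lag):
    """Return the integer lag (in samples) that maximises the
    cross-correlation of b against a, searched in [-max_lag, max_lag]."""
    def xcorr(lag):
        # correlate a[i] with b[i + lag] over the overlapping samples
        total = 0.0
        for i, value in enumerate(a):
            j = i + lag
            if 0 <= j < len(b):
                total += value * b[j]
        return total

    return max(range(-max_lag, max_lag + 1), key=xcorr)

# toy signal: a single pulse; b is the same pulse delayed by 3 samples
a = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0]
b = [0.0] * 3 + a[:-3]
best_lag(a, b, max_lag=5)  # -> 3
```

Dividing the winning lag by the sampling rate gives the time correction; the real implementation additionally interpolates around the peak for sub-sample precision and estimates an uncertainty.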
@@ -33,7 +33,6 @@ class HidePrints:
        def silencer(*args, **kwargs):
            with HidePrints():
                func(*args, **kwargs)

        return silencer

    def __init__(self, hide_prints=True):

@@ -50,4 +49,4 @@
    def __exit__(self, exc_type, exc_val, exc_tb):
        """Reinstate old stdout"""
        if self.hide:
            sys.stdout = self._original_stdout
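The __exit__ above restores the saved stdout only when printing was actually hidden. A self-contained sketch of the whole pattern — io.StringIO stands in for whatever sink the real helper uses:

```python
import io
import sys

class HidePrints:
    """Context manager that silences print output; a minimal sketch of the
    helper patched in this diff, not the class from the test utilities."""

    def __init__(self, hide_prints=True):
        self.hide = hide_prints

    def __enter__(self):
        if self.hide:
            self._original_stdout = sys.stdout
            sys.stdout = io.StringIO()  # swallow everything printed inside the block

    def __exit__(self, exc_type, exc_val, exc_tb):
        """Reinstate old stdout"""
        if self.hide:
            sys.stdout = self._original_stdout

with HidePrints():
    print('suppressed: this never reaches the console')
print('visible again')
```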