Compare commits
2 Commits
38-simplif ... 0.2

| Author | SHA1 | Date |
|---|---|---|
| | 8aaad643ec | |
| | 503ea419c4 | |

README.md (112 lines changed)
@@ -1,2 +1,110 @@
# PyLoT

version: 0.2

The Python picking and Localisation Tool

This Python library provides a graphical user interface for picking
seismic phases. [ObsPy][ObsPy] and the PySide Qt4 bindings for Python
need to be installed first.

PILOT was originally developed in MathWorks' MATLAB. In order to
distribute PILOT without portability problems, it was decided to
redevelop the software package in Python. The great work of the ObsPy
group allows easy handling of seismic data, and PyLoT benefits greatly
compared with the former MATLAB version.

The development of PyLoT is part of the joint research projects MAGS2 and AlpArray.

## Installation

At the moment there is no automatic installation procedure available for PyLoT.
The best way to install it is to clone the repository and add the path to your Python path.
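For example (a minimal sketch; the clone location is illustrative), the cloned directory can be put on the Python path at runtime:

    # hypothetical path; adjust to where you cloned PyLoT
    import sys
    sys.path.insert(0, '/path/to/pylot')

Alternatively, extend the `PYTHONPATH` environment variable accordingly.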
#### Prerequisites:

In order to run PyLoT, you need to install:

- Python 2 or 3
- scipy
- numpy
- matplotlib
- obspy
- pyside
#### Some handwork:

PyLoT needs a properties folder on your system to work. It should be situated in your home directory
(on Windows usually C:/Users/*username*):

    mkdir ~/.pylot

In the next step you have to copy some files to this directory:

*for local distance seismicity*

    cp path-to-pylot/inputs/pylot_local.in ~/.pylot/pylot.in

*for regional distance seismicity*

    cp path-to-pylot/inputs/pylot_regional.in ~/.pylot/pylot.in

*for global distance seismicity*

    cp path-to-pylot/inputs/pylot_global.in ~/.pylot/pylot.in

and some extra information on error estimates (only needed for reading old PILOT data) and the Richter magnitude scaling relation:

    cp path-to-pylot/inputs/PILOT_TimeErrors.in path-to-pylot/inputs/richter_scaling.data ~/.pylot/

You may need to modify these files; in particular, the folder names should be reviewed.

PyLoT has been tested on Mac OS X (10.11), Debian Linux 8, and Windows 10.
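The same handwork can also be scripted; a minimal Python sketch mirroring the commands above, assuming the repository was cloned to `path_to_pylot` (here for local distance seismicity):

    # mirrors the mkdir/cp steps above; 'path_to_pylot' is illustrative
    import os
    import shutil

    path_to_pylot = '/path/to/pylot'
    pylot_home = os.path.join(os.path.expanduser('~'), '.pylot')
    if not os.path.isdir(pylot_home):
        os.makedirs(pylot_home)
    shutil.copy(os.path.join(path_to_pylot, 'inputs', 'pylot_local.in'),
                os.path.join(pylot_home, 'pylot.in'))
    for extra in ('PILOT_TimeErrors.in', 'richter_scaling.data'):
        shutil.copy(os.path.join(path_to_pylot, 'inputs', extra), pylot_home)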
## Release notes

#### Features:

- centralized control of all PyLoT functionality from within the main GUI
- handling of multiple events inside the GUI with project files (save and load work progress)
- GUI-based adjustment of pick parameters and I/O
- interactive tuning of parameters from within the GUI
- call the automatic picking algorithm from within the GUI
- comparison of automatic with manual picks for multiple events, using a clear differentiation of manual picks into 'tune' and 'test' sets (beta)
- manual picking of different (user-defined) phase types
- phase onset estimation with ObsPy TauPy
- interactive zoom/scale functionality in all plots (mouse wheel, pan, pan-zoom)
- array map to visualize stations and control onsets (beta feature; switching to manual picks not implemented)

##### Platform support:
- Python 3 support
- Windows support

##### Performance:
- multiprocessing for automatic picking and restitution of multiple stations
- use of the pyqtgraph library for better performance of the main waveform plot

##### Visualization:
- pick uncertainty (quality classes) visualized with gradients
- unified pick colors across all plots
- new icons and stylesheets

#### Known Issues:
- some Qt-related errors might occur at runtime
- filter toggle not working in pickDlg
- the PyLoT data structure requires at least three parent directories for the waveform data directory

## Staff

Original author(s): L. Kueperkoch, S. Wehling-Benatelli, M. Bischoff (PILOT)

Developer(s): S. Wehling-Benatelli, L. Kueperkoch, M. Paffrath, K. Olbert,
M. Bischoff, C. Wollin, M. Rische

Others: A. Bruestle, T. Meier, W. Friederich

[ObsPy]: http://github.com/obspy/obspy/wiki

September 2017
autoPyLoT.py (new executable file, 505 lines)
@@ -0,0 +1,505 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-

from __future__ import print_function

import argparse
import datetime
import glob
import os

import pylot.core.loc.focmec as focmec
import pylot.core.loc.hash as hash
import pylot.core.loc.hypo71 as hypo71
import pylot.core.loc.hypodd as hypodd
import pylot.core.loc.hyposat as hyposat
import pylot.core.loc.nll as nll
import pylot.core.loc.velest as velest
from obspy import read_events
from obspy.core.event import ResourceIdentifier
# from PySide.QtGui import QWidget, QInputDialog
from pylot.core.analysis.magnitude import MomentMagnitude, LocalMagnitude
from pylot.core.io.data import Data
from pylot.core.io.inputs import PylotParameter
from pylot.core.pick.autopick import autopickevent, iteratepicker
from pylot.core.util.dataprocessing import restitute_data, read_metadata
from pylot.core.util.defaults import SEPARATOR
from pylot.core.util.event import Event
from pylot.core.util.structure import DATASTRUCTURE
from pylot.core.util.utils import real_None, remove_underscores, trim_station_components, check4gaps, check4doubled, \
    check4rotated
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


def autoPyLoT(input_dict=None, parameter=None, inputfile=None, fnames=None, eventid=None, savepath=None,
              savexml=True, station='all', iplot=0, ncores=0):
    """
    Determine phase onsets automatically, utilizing the automatic picking
    algorithms by Kueperkoch et al. 2010/2012.

    :param inputfile: path to the input file containing all parameter
        information for automatic picking (for formatting details, see
        `~pylot.core.io.inputs.PylotParameter`)
    :type inputfile: str
    :return: dictionary mapping each processed event ID to its picks

    .. rubric:: Example

    """

    if ncores == 1:
        sp_info = 'autoPyLoT is running serially on 1 core.'
    else:
        if ncores == 0:
            ncores_readable = 'all available'
        else:
            ncores_readable = ncores
        sp_info = 'autoPyLoT is running in parallel on {} cores.'.format(ncores_readable)

    splash = '''************************************\n
                *********autoPyLoT starting*********\n
                The Python picking and Location Tool\n
                Version {version} 2017\n
                \n
                Authors:\n
                L. Kueperkoch (BESTEC GmbH, Landau i. d. Pfalz)\n
                M. Paffrath (Ruhr-Universitaet Bochum)\n
                S. Wehling-Benatelli (Ruhr-Universitaet Bochum)\n

                {sp}
                ***********************************'''.format(version=_getVersionString(),
                                                               sp=sp_info)
    print(splash)

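    # the real_None() calls below presumably map the string 'None' (as produced
    # by str() on missing command line arguments, cf. __main__) back to None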
    parameter = real_None(parameter)
    inputfile = real_None(inputfile)
    eventid = real_None(eventid)

    fig_dict = None
    fig_dict_wadatijack = None

    locflag = 1
    if input_dict and isinstance(input_dict, dict):
        if 'parameter' in input_dict:
            parameter = input_dict['parameter']
        if 'fig_dict' in input_dict:
            fig_dict = input_dict['fig_dict']
        if 'fig_dict_wadatijack' in input_dict:
            fig_dict_wadatijack = input_dict['fig_dict_wadatijack']
        if 'station' in input_dict:
            station = input_dict['station']
        if 'fnames' in input_dict:
            fnames = input_dict['fnames']
        if 'eventid' in input_dict:
            eventid = input_dict['eventid']
        if 'iplot' in input_dict:
            iplot = input_dict['iplot']
        if 'locflag' in input_dict:
            locflag = input_dict['locflag']
        if 'savexml' in input_dict:
            savexml = input_dict['savexml']

    if not parameter:
        if inputfile:
            parameter = PylotParameter(inputfile)
            # iplot = parameter['iplot']
        else:
            infile = os.path.join(os.path.expanduser('~'), '.pylot', 'pylot.in')
            print('Using default input file {}'.format(infile))
            parameter = PylotParameter(infile)
    else:
        if not isinstance(parameter, PylotParameter):
            print('Wrong input type for parameter: {}'.format(type(parameter)))
            return
        if inputfile:
            print('Both a parameter object and an input file were given. Please provide only one of them.')
            return

    evt = None

    # reading parameter file
    if parameter.hasParam('datastructure'):
        # getting information on data structure
        datastructure = DATASTRUCTURE[parameter.get('datastructure')]()
        dsfields = {'root': parameter.get('rootpath'),
                    'dpath': parameter.get('datapath'),
                    'dbase': parameter.get('database')}

        exf = ['root', 'dpath', 'dbase']

        if parameter['eventID'] != '*' and fnames == 'None':
            dsfields['eventID'] = parameter['eventID']
            exf.append('eventID')

        datastructure.modifyFields(**dsfields)
        datastructure.setExpandFields(exf)

    # check if the default location routine NLLoc is available
    if real_None(parameter['nllocbin']) and locflag:
        # get NLLoc root path
        nllocroot = parameter.get('nllocroot')
        # get path to NLLoc executable
        nllocbin = parameter.get('nllocbin')
        nlloccall = '%s/NLLoc' % nllocbin
        # get name of phase file
        phasef = parameter.get('phasefile')
        phasefile = '%s/obs/%s' % (nllocroot, phasef)
        # get name of NLLoc control file
        ctrf = parameter.get('ctrfile')
        ctrfile = '%s/run/%s' % (nllocroot, ctrf)
        # pattern of NLLoc ttimes from location grid
        ttpat = parameter.get('ttpatter')
        # pattern of NLLoc output file
        nllocoutpatter = parameter.get('outpatter')
        maxnumit = 3  # maximum number of iterations for re-picking
    else:
        locflag = 0
        print(" !!! ")
        print("!!No location routine available, autoPyLoT is running in non-location mode!!")
        print("!!No source parameter estimation possible!!")
        print(" !!! ")

    if not input_dict:
        # started in production mode
        datapath = datastructure.expandDataPath()
        if fnames == 'None' and parameter['eventID'] == '*':
            # multiple event processing
            # read each event in database
            events = [events for events in glob.glob(os.path.join(datapath, '*')) if os.path.isdir(events)]
        elif fnames == 'None' and parameter['eventID'] != '*' and not isinstance(parameter['eventID'], list):
            # single event processing
            events = glob.glob(os.path.join(datapath, parameter['eventID']))
        elif fnames == 'None' and isinstance(parameter['eventID'], list):
            # multiple event processing
            events = []
            for eventID in parameter['eventID']:
                events.append(os.path.join(datapath, eventID))
        else:
            # autoPyLoT was initialized from the GUI
            events = []
            events.append(eventid)
            evID = os.path.split(eventid)[-1]
            locflag = 2
    else:
        # started in tune or interactive mode
        datapath = os.path.join(parameter['rootpath'],
                                parameter['datapath'])
        events = []
        for eventID in eventid:
            events.append(os.path.join(datapath,
                                       parameter['database'],
                                       eventID))

    if not events:
        print('autoPyLoT: No events given. Return!')
        return

    # transform system path separator to '/'
    for index, eventpath in enumerate(events):
        eventpath = eventpath.replace(SEPARATOR, '/')
        events[index] = eventpath

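    # remember the global location flag: the per-event loop below may downgrade
    # locflag (e.g. when restitution fails), and it is restored after each event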
    allpicks = {}
    glocflag = locflag
    for eventpath in events:
        evID = os.path.split(eventpath)[-1]
        fext = '.xml'
        filename = os.path.join(eventpath, 'PyLoT_' + evID + fext)
        try:
            data = Data(evtdata=filename)
            data.get_evt_data().path = eventpath
            print('Reading event data from filename {}...'.format(filename))
        except Exception as e:
            print('Could not read event from file {}: {}'.format(filename, e))
            data = Data()
            pylot_event = Event(eventpath)  # event should be path to event directory
            data.setEvtData(pylot_event)
        if fnames == 'None':
            data.setWFData(glob.glob(os.path.join(datapath, eventpath, '*')))
            # the following is necessary because within
            # multiple event processing no event ID is provided
            # in autopylot.in
            try:
                parameter.get('eventID')
            except Exception:
                now = datetime.datetime.now()
                eventID = '%d%02d%02d%02d%02d' % (now.year,
                                                  now.month,
                                                  now.day,
                                                  now.hour,
                                                  now.minute)
                parameter.setParam(eventID=eventID)
        else:
            data.setWFData(fnames)

            eventpath = events[0]
            # now = datetime.datetime.now()
            # evID = '%d%02d%02d%02d%02d' % (now.year,
            #                                now.month,
            #                                now.day,
            #                                now.hour,
            #                                now.minute)
            parameter.setParam(eventID=eventid)
        wfdat = data.getWFData()  # all available streams
        if station != 'all':
            wfdat = wfdat.select(station=station)
            if not wfdat:
                print('Could not find station {}. STOP!'.format(station))
                return
        wfdat = remove_underscores(wfdat)
        # trim components for each station to avoid problems with different trace starttimes for one station
        wfdat = check4gaps(wfdat)
        wfdat = check4doubled(wfdat)
        wfdat = trim_station_components(wfdat, trim_start=True, trim_end=False)
        metadata = read_metadata(parameter.get('invdir'))
        # rotate stations to ZNE
        wfdat = check4rotated(wfdat, metadata)
        corr_dat = None
        if locflag:
            print("Restitute data ...")
            corr_dat = restitute_data(wfdat.copy(), *metadata, ncores=ncores)
        if not corr_dat and locflag:
            locflag = 2
        print('Working on event %s. Stations: %s' % (eventpath, station))
        print(wfdat)
        ##########################################################
        # !automated picking starts here!
        fdwj = None
        if fig_dict_wadatijack:
            fdwj = fig_dict_wadatijack[evID]
        picks = autopickevent(wfdat, parameter, iplot=iplot, fig_dict=fig_dict,
                              fig_dict_wadatijack=fdwj,
                              ncores=ncores, metadata=metadata, origin=data.get_evt_data().origins)

        ##########################################################
        # locating
        if locflag > 0:
            # write phases to NLLoc phase file
            nll.export(picks, phasefile, parameter)

            # for locating the event, the NLLoc control file has to be modified!
            nllocout = '%s_%s' % (evID, nllocoutpatter)
            # create comment line for NLLoc control file
            nll.modify_inputs(ctrf, nllocroot, nllocout, phasef,
                              ttpat)

            # locate the event
            nll.locate(ctrfile, inputfile)

            # !iterative picking if traces remained unpicked or occupied with bad picks!
            # get theoretical onset times for picks with weights >= 4
            # in order to reprocess them using smaller time windows around the theoretical onset
            # get stations with bad onsets
            badpicks = []
            for key in picks:
                if picks[key]['P']['weight'] >= 4 or picks[key]['S']['weight'] >= 4:
                    badpicks.append([key, picks[key]['P']['mpp']])

            # TODO keep code DRY (Don't Repeat Yourself): the following part is written twice;
            # suggestion: delete this block and modify the later, similar block to work properly

            if len(badpicks) == 0:
                print("autoPyLoT: No bad onsets found, thus no iterative picking necessary!")
                # get NLLoc location file
                locsearch = '%s/loc/%s.????????.??????.grid?.loc.hyp' % (nllocroot, nllocout)
                if len(glob.glob(locsearch)) > 0:
                    # get latest NLLoc location file if several are available
                    nllocfile = max(glob.glob(locsearch), key=os.path.getctime)
                    evt = read_events(nllocfile)[0]
                    # calculate seismic moment Mo and moment magnitude Mw
                    moment_mag = MomentMagnitude(corr_dat, evt, parameter.get('vp'),
                                                 parameter.get('Qp'),
                                                 parameter.get('rho'), True,
                                                 iplot)
                    # update picks with moment property values (w0, fc, Mo)
                    for stats, props in moment_mag.moment_props.items():
                        picks[stats]['P'].update(props)
                    evt = moment_mag.updated_event()
                    net_mw = moment_mag.net_magnitude()
                    print("Network moment magnitude: %4.1f" % net_mw.mag)
                    # calculate local (Richter) magnitude
                    WAscaling = parameter.get('WAscaling')
                    magscaling = parameter.get('magscaling')
                    local_mag = LocalMagnitude(corr_dat, evt,
                                               parameter.get('sstop'),
                                               WAscaling, True, iplot)
                    for stats, amplitude in local_mag.amplitudes.items():
                        picks[stats]['S']['Ao'] = amplitude.generic_amplitude
                    print("Local station magnitudes scaled with:")
                    print("log(Ao) + %f * log(r) + %f * r + %f" % (WAscaling[0],
                                                                   WAscaling[1],
                                                                   WAscaling[2]))
                    evt = local_mag.updated_event(magscaling)
                    net_ml = local_mag.net_magnitude(magscaling)
                    print("Network local magnitude: %4.1f" % net_ml.mag)
                    print("Network local magnitude scaled with:")
                    print("%f * Ml + %f" % (magscaling[0], magscaling[1]))
                else:
                    print("autoPyLoT: No NLLoc location file available!")
                    print("No source parameter estimation possible!")
                    locflag = 9
            else:
                # get theoretical P onset times from the NLLoc location file
                locsearch = '%s/loc/%s.????????.??????.grid?.loc.hyp' % (nllocroot, nllocout)
                if len(glob.glob(locsearch)) > 0:
                    # get the latest file if several are available
                    nllocfile = max(glob.glob(locsearch), key=os.path.getctime)
                    nlloccounter = 0
                    while len(badpicks) > 0 and nlloccounter <= maxnumit:
                        nlloccounter += 1
                        if nlloccounter > maxnumit:
                            print("autoPyLoT: Maximum number of iterations reached, stopping iterative picking!")
                            break
                        print("autoPyLoT: Starting with iteration No. %d ..." % nlloccounter)
                        if input_dict:
                            if 'fig_dict' in input_dict:
                                fig_dict = input_dict['fig_dict']
                            picks = iteratepicker(wfdat, nllocfile, picks, badpicks, parameter,
                                                  fig_dict=fig_dict)
                        else:
                            picks = iteratepicker(wfdat, nllocfile, picks, badpicks, parameter)
                        # write phases to NLLoc phase file
                        nll.export(picks, phasefile, parameter)
                        # remove actual NLLoc location file to keep only the last
                        os.remove(nllocfile)
                        # locate the event
                        nll.locate(ctrfile, inputfile)
                        print("autoPyLoT: Iteration No. %d finished." % nlloccounter)
                        # get updated NLLoc location file
                        nllocfile = max(glob.glob(locsearch), key=os.path.getctime)
                        # check for bad picks
                        badpicks = []
                        for key in picks:
                            if picks[key]['P']['weight'] >= 4 or picks[key]['S']['weight'] >= 4:
                                badpicks.append([key, picks[key]['P']['mpp']])
                        print("autoPyLoT: After iteration No. %d: %d bad onsets found ..." % (nlloccounter,
                                                                                              len(badpicks)))
                        if len(badpicks) == 0:
                            print("autoPyLoT: No more bad onsets found, stopping iterative picking!")
                            nlloccounter = maxnumit
                    evt = read_events(nllocfile)[0]
                    if locflag < 2:
                        # calculate seismic moment Mo and moment magnitude Mw
                        moment_mag = MomentMagnitude(corr_dat, evt, parameter.get('vp'),
                                                     parameter.get('Qp'),
                                                     parameter.get('rho'), True,
                                                     iplot)
                        # update picks with moment property values (w0, fc, Mo)
                        for stats, props in moment_mag.moment_props.items():
                            if stats in picks:
                                picks[stats]['P'].update(props)
                        evt = moment_mag.updated_event()
                        net_mw = moment_mag.net_magnitude()
                        print("Network moment magnitude: %4.1f" % net_mw.mag)
                        # calculate local (Richter) magnitude
                        WAscaling = parameter.get('WAscaling')
                        magscaling = parameter.get('magscaling')
                        local_mag = LocalMagnitude(corr_dat, evt,
                                                   parameter.get('sstop'),
                                                   WAscaling, True, iplot)
                        for stats, amplitude in local_mag.amplitudes.items():
                            if stats in picks:
                                picks[stats]['S']['Ao'] = amplitude.generic_amplitude
                        print("Local station magnitudes scaled with:")
                        print("log(Ao) + %f * log(r) + %f * r + %f" % (WAscaling[0],
                                                                       WAscaling[1],
                                                                       WAscaling[2]))
                        evt = local_mag.updated_event(magscaling)
                        net_ml = local_mag.net_magnitude(magscaling)
                        print("Network local magnitude: %4.1f" % net_ml.mag)
                        print("Network local magnitude scaled with:")
                        print("%f * Ml + %f" % (magscaling[0], magscaling[1]))
                else:
                    print("autoPyLoT: No NLLoc location file available! Stopping iteration!")
                    locflag = 9
        ##########################################################
        # write phase files for various location
        # and fault mechanism calculation routines
        # ObsPy event object
        if evt is not None:
            event_id = eventpath.split('/')[-1]
            evt.resource_id = ResourceIdentifier('smi:local/' + event_id)
            data.applyEVTData(evt, 'event')
        data.applyEVTData(picks)
        if savexml:
            if savepath == 'None' or savepath is None:
                saveEvtPath = eventpath
            else:
                saveEvtPath = savepath
            fnqml = '%s/PyLoT_%s' % (saveEvtPath, evID)
            data.exportEvent(fnqml, fnext='.xml', fcheck=['auto', 'magnitude', 'origin'])
        if locflag == 1:
            # HYPO71
            hypo71file = '%s/PyLoT_%s_HYPO71_phases' % (eventpath, evID)
            hypo71.export(picks, hypo71file, parameter)
            # HYPOSAT
            hyposatfile = '%s/PyLoT_%s_HYPOSAT_phases' % (eventpath, evID)
            hyposat.export(picks, hyposatfile, parameter)
            # VELEST
            velestfile = '%s/PyLoT_%s_VELEST_phases.cnv' % (eventpath, evID)
            velest.export(picks, velestfile, evt, parameter)
            # hypoDD
            hypoddfile = '%s/PyLoT_%s_hypoDD_phases.pha' % (eventpath, evID)
            hypodd.export(picks, hypoddfile, parameter, evt)
            # FOCMEC
            focmecfile = '%s/PyLoT_%s_FOCMEC.in' % (eventpath, evID)
            focmec.export(picks, focmecfile, parameter, evt)
            # HASH
            hashfile = '%s/PyLoT_%s_HASH' % (eventpath, evID)
            hash.export(picks, hashfile, parameter, evt)

        endsplash = '''------------------------------------------\n
                       -----Finished event %s!-----\n
                       ------------------------------------------''' % evID
        print(endsplash)
        locflag = glocflag
        if locflag == 0:
            print("autoPyLoT was running in non-location mode!")

        # save picks for the current event ID to the dictionary with ALL picks
        allpicks[evID] = picks

    endsp = '''####################################\n
               ************************************\n
               *********autoPyLoT terminates*******\n
               The Python picking and Location Tool\n
               ************************************'''
    print(endsp)
    return allpicks


if __name__ == "__main__":
    # parse arguments
    parser = argparse.ArgumentParser(
        description='''autoPyLoT automatically picks phase onset times using higher order statistics,
                       autoregressive prediction and AIC, followed by locating the seismic events using
                       NLLoc''')

    parser.add_argument('-i', '-I', '--inputfile', type=str,
                        action='store',
                        help='''full path to the file containing the input
                        parameters for autoPyLoT''')
    parser.add_argument('-p', '-P', '--iplot', type=int,
                        action='store', default=0,
                        help='''optional, logical variable for plotting: 0=none, 1=partial, 2=all''')
    parser.add_argument('-f', '-F', '--fnames', type=str,
                        action='store',
                        help='''optional, list of data file names''')
    parser.add_argument('-e', '--eventid', type=str,
                        action='store',
                        help='''optional, event path incl. event ID''')
    parser.add_argument('-s', '-S', '--spath', type=str,
                        action='store',
                        help='''optional, save path for autoPyLoT output''')
    parser.add_argument('-c', '-C', '--ncores', type=int,
                        action='store', default=0,
                        help='''optional, number of CPU cores used for parallel processing (default: all available (=0))''')

    cla = parser.parse_args()

    picks = autoPyLoT(inputfile=str(cla.inputfile), fnames=str(cla.fnames),
                      eventid=str(cla.eventid), savepath=str(cla.spath),
                      ncores=cla.ncores, iplot=int(cla.iplot))
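A minimal usage sketch for the function above (the input file path and core count are illustrative; with no input file, autoPyLoT falls back to `~/.pylot/pylot.in`):

    # hypothetical programmatic use; assumes autoPyLoT.py is on the Python path
    from autoPyLoT import autoPyLoT

    # returns a dictionary mapping each processed event ID to its picks
    allpicks = autoPyLoT(inputfile='/path/to/pylot.in', ncores=4)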
help/index.html (new file, 19 lines)
@@ -0,0 +1,19 @@
<html>
<head><title>PyLoT - the Python picking and Localisation Tool</title></head>
<body>
<p><b>PyLoT</b> is a program that is capable of picking seismic phases,
exporting these in numerous standard phase formats, and locating the
corresponding seismic event with external software such as:</p>
<ul type="circle">
<li><a href="http://alomax.free.fr/nlloc/index.html">NonLinLoc</a></li>
<li>HypoInvers</li>
<li>HypoSat</li>
<li>whatever you want ...</li>
</ul>
<p>Read more on the
<a href="https://ariadne.geophysik.rub.de/trac/PyLoT/wiki/">PyLoT WikiPage</a>.</p>
<p>Bug reports are very much appreciated and can also be delivered on our
<a href="https://ariadne.geophysik.rub.de/trac/PyLoT">PyLoT TracPage</a> after
successful registration.</p>
</body>
</html>
icons.qrc (new file, 57 lines)
@@ -0,0 +1,57 @@
<RCC>
  <qresource>
    <file>icons/pylot.ico</file>
    <file>icons/pylot.png</file>
    <file>icons/back.png</file>
    <file>icons/home.png</file>
    <file>icons/newfile.png</file>
    <file>icons/open.png</file>
    <file>icons/openproject.png</file>
    <file>icons/add.png</file>
    <file>icons/save.png</file>
    <file>icons/saveas.png</file>
    <file>icons/saveproject.png</file>
    <file>icons/saveprojectas.png</file>
    <file>icons/manupicksicon_small.png</file>
    <file>icons/autopicksicon_small.png</file>
    <file>icons/tune.png</file>
    <file>icons/autopylot_button.png</file>
    <file>icons/pick.png</file>
    <file>icons/waveform.png</file>
    <file>icons/openpick.png</file>
    <file>icons/openpicks.png</file>
    <file>icons/openpick.png</file>
    <file>icons/openpicks.png</file>
    <file>icons/savepicks.png</file>
    <file>icons/preferences.png</file>
    <file>icons/parameter.png</file>
    <file>icons/inventory.png</file>
    <file>icons/map.png</file>
    <file>icons/openloc.png</file>
    <file>icons/compare_button.png</file>
    <file>icons/locate_button.png</file>
    <file>icons/Matlab_PILOT_icon.png</file>
    <file>icons/printer.png</file>
    <file>icons/delete.png</file>
    <file>icons/key_E.png</file>
    <file>icons/key_N.png</file>
    <file>icons/key_P.png</file>
    <file>icons/key_Q.png</file>
    <file>icons/key_R.png</file>
    <file>icons/key_S.png</file>
    <file>icons/key_T.png</file>
    <file>icons/key_U.png</file>
    <file>icons/key_V.png</file>
    <file>icons/key_W.png</file>
    <file>icons/key_Z.png</file>
    <file>icons/filter.png</file>
    <file>icons/sync.png</file>
    <file>icons/zoom_0.png</file>
    <file>icons/zoom_in.png</file>
    <file>icons/zoom_out.png</file>
    <file>splash/splash.png</file>
  </qresource>
  <qresource prefix="/help">
    <file>help/index.html</file>
  </qresource>
</RCC>
BIN  icons/Matlab_PILOT_icon.png  (new file; 50 KiB)
BIN  icons/add.png  (new file; 44 KiB)
BIN  icons/autopicksicon_small.png  (new file; 26 KiB)
BIN  icons/autopicsicon.png  (new file; 7.1 KiB)
BIN  icons/autopylot_button.png  (new file; 30 KiB)
BIN  icons/back.png  (new file; 16 KiB)
BIN  icons/compare_button.png  (new file; 18 KiB)
BIN  icons/delete.png  (new file; 5.0 KiB)
BIN  icons/filter.png  (new file; 4.6 KiB)
BIN  icons/home.png  (new file; 23 KiB)
BIN  icons/inventory.png  (new file; 24 KiB)
BIN  icons/key_E.png  (new file; 37 KiB)
BIN  icons/key_N.png  (new file; 44 KiB)
BIN  icons/key_P.png  (new file; 43 KiB)
BIN  icons/key_Q.png  (new file; 59 KiB)
BIN  icons/key_R.png  (new file; 47 KiB)
BIN  icons/key_S.png  (new file; 55 KiB)
BIN  icons/key_T.png  (new file; 36 KiB)
BIN  icons/key_U.png  (new file; 44 KiB)
BIN  icons/key_V.png  (new file; 48 KiB)
BIN  icons/key_W.png  (new file; 54 KiB)
BIN  icons/key_Z.png  (new file; 45 KiB)
BIN  icons/locate_button.png  (new file; 14 KiB)
BIN  icons/manupicksicon_small.png  (new file; 22 KiB)
BIN  icons/manupicsicon.png  (new file; 7.5 KiB)
BIN  icons/map.png  (new file; 56 KiB)
BIN  icons/newfile.png  (new file; 19 KiB)
BIN  icons/open.png  (new file; 37 KiB)
BIN  icons/openfile.png  (new file; 50 KiB)
BIN  icons/openloc.png  (new file; 57 KiB)
BIN  icons/openpick.png  (new file; 50 KiB)
BIN  icons/openpicks.png  (new file; 52 KiB)
BIN  icons/openproject.png  (new file; 58 KiB)
BIN  icons/parameter.png  (new file; 38 KiB)
BIN  icons/pick.png  (new file; 23 KiB)
BIN  icons/preferences.png  (new file; 38 KiB)
BIN  icons/printer.png  (new file; 6.5 KiB)
BIN  icons/pylot.ico  (new file; 2.2 KiB)
BIN  icons/pylot.png  (new file; 39 KiB)
BIN  icons/save.png  (new file; 19 KiB)
BIN  icons/saveas.png  (new file; 25 KiB)
BIN  icons/savepicks.png  (new file; 28 KiB)
BIN  icons/saveproject.png  (new file; 35 KiB)
BIN  icons/saveprojectas.png  (new file; 41 KiB)
BIN  icons/sync.png  (new file; 4.2 KiB)
BIN  icons/tune.png  (new file; 37 KiB)
BIN  icons/waveform.png  (new file; 16 KiB)
BIN  icons/zoom_0.png  (new file; 6.9 KiB)
BIN  icons/zoom_in.png  (new file; 7.0 KiB)
BIN  icons/zoom_out.png  (new file; 6.9 KiB)
icons_rc_2.py (new file, 104841 lines)
icons_rc_3.py (new file, 104841 lines)
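The two `icons_rc_*.py` modules are the compiled forms of `icons.qrc` for Python 2 and Python 3. A minimal sketch of how the compiled resources would be used from PySide (importing an rcc-generated module registers its resources):

    # resource paths mirror icons.qrc; PySide (Qt4) assumed
    from PySide import QtGui
    import icons_rc_3  # Python 3 build of the compiled resources

    app = QtGui.QApplication([])
    icon = QtGui.QIcon(':/icons/pylot.png')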
inputs/PILOT_TimeErrors.in (new file, 3 lines)
@@ -0,0 +1,3 @@
## default time errors for old PILOT phases
0.04 0.08 0.16 0.32 #timeerrorsP# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
0.04 0.08 0.16 0.32 #timeerrorsS# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
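A sketch of how these discrete errors map picking weights to time uncertainties (values copied from the file above; the helper itself is illustrative, not PyLoT API):

    # time errors [s] for picking weights 0-3, from PILOT_TimeErrors.in
    TIMEERRORS_P = (0.04, 0.08, 0.16, 0.32)

    def p_uncertainty(weight):
        # weights of 4 and above are treated as bad picks in autoPyLoT.py
        return TIMEERRORS_P[weight] if weight < len(TIMEERRORS_P) else None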
inputs/pylot_global.in (new file, 100 lines)
@@ -0,0 +1,100 @@
%This is a parameter input file for PyLoT/autoPyLoT.
%All main and special settings regarding data handling
%and picking are to be set here!
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
#rootpath# %project path
#datapath# %data path
#database# %name of data base
#eventID# %event ID for single event processing (* for all events found in database)
#invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#NLLoc settings#
None #nllocbin# %path to NLLoc executable
None #nllocroot# %root of NLLoc-processing directory
None #phasefile# %name of autoPyLoT-output phase file for NLLoc
None #ctrfile# %name of autoPyLoT-output control file for NLLoc
ttime #ttpatter# %pattern of NLLoc ttimes from grid
AUTOLOC_nlloc #outpatter# %pattern of NLLoc-output file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#parameters for seismic moment estimation#
3530.0 #vp# %average P-wave velocity
2500.0 #rho# %average rock density [kg/m^3]
300.0 0.8 #Qp# %quality factor for P waves (Qp*f^a); list(Qp, a)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#settings local magnitude#
1.0 1.0 1.0 #WAscaling# %scaling relation (log(Ao)+Alog(r)+Br+C) of Wood-Anderson amplitude Ao [nm]; if zeros are set, the original Richter magnitude is calculated!
1.0 1.0 #magscaling# %scaling relation for derived local magnitude [a*Ml+b]; if zeros are set, no scaling of network magnitude is applied!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#filter settings#
0.01 0.01 #minfreq# %lower filter frequency [P, S]
0.3 0.3 #maxfreq# %upper filter frequency [P, S]
3 3 #filter_order# %filter order [P, S]
bandpass bandpass #filter_type# %filter type (bandpass, bandstop, lowpass, highpass) [P, S]
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#common settings picker#
global #extent# %extent of array ("local", "regional" or "global")
-150.0 #pstart# %start time [s] for calculating CF for P-picking (if TauPy: seconds relative to estimated onset)
600.0 #pstop# %end time [s] for calculating CF for P-picking (if TauPy: seconds relative to estimated onset)
200.0 #sstart# %start time [s] relative to P-onset for calculating CF for S-picking
1150.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
iasp91 #taup_model# %define TauPy model for traveltime estimation. Possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6
0.05 0.5 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
0.001 0.5 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
0.05 0.5 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
0.001 0.5 #bph2# %lower/upper corner freq. of second band pass filter H-comp. [Hz]
#special settings for calculating CF#
%!!Edit the following only if you know what you are doing!!%
#Z-component#
HOS #algoP# %choose algorithm for P-onset determination (HOS, ARZ, or AR3)
150.0 #tlta# %for HOS-/AR-AIC-picker, length of LTA window [s]
4 #hosorder# %for HOS-picker, order of Higher Order Statistics
2 #Parorder# %for AR-picker, order of AR process of Z-component
16.0 #tdet1z# %for AR-picker, length of AR determination window [s] for Z-component, 1st pick
10.0 #tpred1z# %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick
12.0 #tdet2z# %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick
6.0 #tpred2z# %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick
0.001 #addnoise# %add noise to seismogram for stable AR prediction
60.0 10.0 40.0 10.0 #tsnrz# %for HOS/AR, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
150.0 #pickwinP# %for initial AIC pick, length of P-pick window [s]
35.0 #Precalcwin# %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick)
6.0 #aictsmooth# %for HOS/AR, take average of samples for smoothing of AIC function [s]
4.0 #tsmoothP# %for HOS/AR, take average of samples for smoothing CF [s]
0.001 #ausP# %for HOS/AR, artificial uplift of samples (aus) of CF (P)
1.1 #nfacP# %for HOS/AR, noise factor for noise level determination (P)
#H-components#
ARH #algoS# %choose algorithm for S-onset determination (ARH or AR3)
12.0 #tdet1h# %for HOS/AR, length of AR determination window [s], H-components, 1st pick
6.0 #tpred1h# %for HOS/AR, length of AR prediction window [s], H-components, 1st pick
8.0 #tdet2h# %for HOS/AR, length of AR determination window [s], H-components, 2nd pick
4.0 #tpred2h# %for HOS/AR, length of AR prediction window [s], H-components, 2nd pick
4 #Sarorder# %for AR-picker, order of AR process of H-components
30.0 #Srecalcwin# %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H)
195.0 #pickwinS# %for initial AIC pick, length of S-pick window [s]
100.0 10.0 45.0 10.0 #tsnrh# %for ARH/AR3, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
22.0 #aictsmoothS# %for AIC-picker, take average of samples for smoothing of AIC function [s]
10.0 #tsmoothS# %for AR-picker, take average of samples for smoothing CF [s] (S)
0.001 #ausS# %for HOS/AR, artificial uplift of samples (aus) of CF (S)
1.2 #nfacS# %for AR-picker, noise factor for noise level determination (S)
#first-motion picker#
1 #minfmweight# %minimum required P weight for first-motion determination
3.0 #minFMSNR# %minimum required SNR for first-motion determination
10.0 #fmpickwin# %pick window around P onset for calculating zero crossings
#quality assessment#
1.0 2.0 4.0 8.0 #timeerrorsP# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
4.0 8.0 16.0 32.0 #timeerrorsS# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
0.5 #minAICPslope# %below this slope [counts/s] the initial P pick is rejected
1.1 #minAICPSNR# %below this SNR the initial P pick is rejected
1.0 #minAICSslope# %below this slope [counts/s] the initial S pick is rejected
1.3 #minAICSSNR# %below this SNR the initial S pick is rejected
5.0 #minsiglength# %length of signal part for which amplitudes must exceed noise level [s]
1.0 #noisefactor# %noiselevel*noisefactor=threshold
10.0 #minpercent# %required percentage of amplitudes exceeding threshold
1.2 #zfac# %P-amplitude must exceed at least zfac times RMS-S amplitude
25.0 #mdttolerance# %maximum allowed deviation of P picks from median [s]
50.0 #wdttolerance# %maximum allowed deviation from Wadati diagram
5.0 #jackfactor# %pick is removed if the variance of the subgroup with the pick removed is larger than the mean variance of all subgroups times this safety factor
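The `#parameter#` tokens name the keys under which autoPyLoT.py accesses these values; a short sketch based on the calls visible in autoPyLoT.py (the input file path and event ID are illustrative):

    # PylotParameter usage as seen in autoPyLoT.py
    from pylot.core.io.inputs import PylotParameter

    parameter = PylotParameter('/path/to/.pylot/pylot.in')
    vp = parameter.get('vp')             # e.g. 3530.0 in the shipped templates
    extent = parameter.get('extent')     # 'global' for the file above
    parameter.setParam(eventID='e0001')  # hypothetical event ID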
inputs/pylot_local.in (new file, 100 lines)
@@ -0,0 +1,100 @@
%This is a parameter input file for PyLoT/autoPyLoT.
%All main and special settings regarding data handling
%and picking are to be set here!
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
#rootpath# %project path
#datapath# %data path
#database# %name of data base
#eventID# %event ID for single event processing (* for all events found in database)
#invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#NLLoc settings#
None #nllocbin# %path to NLLoc executable
None #nllocroot# %root of NLLoc-processing directory
None #phasefile# %name of autoPyLoT-output phase file for NLLoc
None #ctrfile# %name of autoPyLoT-output control file for NLLoc
ttime #ttpatter# %pattern of NLLoc ttimes from grid
AUTOLOC_nlloc #outpatter# %pattern of NLLoc-output file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#parameters for seismic moment estimation#
3530.0 #vp# %average P-wave velocity
2500.0 #rho# %average rock density [kg/m^3]
300.0 0.8 #Qp# %quality factor for P waves (Qp*f^a); list(Qp, a)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#settings local magnitude#
1.11 0.0009 -2.0 #WAscaling# %scaling relation (log(Ao)+Alog(r)+Br+C) of Wood-Anderson amplitude Ao [nm]; if zeros are set, the original Richter magnitude is calculated!
1.0382 -0.447 #magscaling# %scaling relation for derived local magnitude [a*Ml+b]; if zeros are set, no scaling of network magnitude is applied!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#filter settings#
1.0 1.0 #minfreq# %lower filter frequency [P, S]
10.0 10.0 #maxfreq# %upper filter frequency [P, S]
2 2 #filter_order# %filter order [P, S]
bandpass bandpass #filter_type# %filter type (bandpass, bandstop, lowpass, highpass) [P, S]
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#common settings picker#
local #extent# %extent of array ("local", "regional" or "global")
15.0 #pstart# %start time [s] for calculating CF for P-picking
60.0 #pstop# %end time [s] for calculating CF for P-picking
-1.0 #sstart# %start time [s] relative to P-onset for calculating CF for S-picking
10.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
iasp91 #taup_model# %define TauPy model for traveltime estimation
2.0 10.0 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
2.0 12.0 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
2.0 8.0 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
2.0 10.0 #bph2# %lower/upper corner freq. of second band pass filter H-comp. [Hz]
#special settings for calculating CF#
%!!Edit the following only if you know what you are doing!!%
#Z-component#
HOS #algoP# %choose algorithm for P-onset determination (HOS, ARZ, or AR3)
7.0 #tlta# %for HOS-/AR-AIC-picker, length of LTA window [s]
4 #hosorder# %for HOS-picker, order of Higher Order Statistics
2 #Parorder# %for AR-picker, order of AR process of Z-component
1.2 #tdet1z# %for AR-picker, length of AR determination window [s] for Z-component, 1st pick
0.4 #tpred1z# %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick
0.6 #tdet2z# %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick
0.2 #tpred2z# %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick
0.001 #addnoise# %add noise to seismogram for stable AR prediction
3.0 0.1 0.5 1.0 #tsnrz# %for HOS/AR, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
3.0 #pickwinP# %for initial AIC pick, length of P-pick window [s]
6.0 #Precalcwin# %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick)
0.2 #aictsmooth# %for HOS/AR, take average of samples for smoothing of AIC function [s]
0.1 #tsmoothP# %for HOS/AR, take average of samples for smoothing CF [s]
0.001 #ausP# %for HOS/AR, artificial uplift of samples (aus) of CF (P)
1.3 #nfacP# %for HOS/AR, noise factor for noise level determination (P)
#H-components#
ARH #algoS# %choose algorithm for S-onset determination (ARH or AR3)
0.8 #tdet1h# %for HOS/AR, length of AR determination window [s], H-components, 1st pick
0.4 #tpred1h# %for HOS/AR, length of AR prediction window [s], H-components, 1st pick
0.6 #tdet2h# %for HOS/AR, length of AR determination window [s], H-components, 2nd pick
0.3 #tpred2h# %for HOS/AR, length of AR prediction window [s], H-components, 2nd pick
4 #Sarorder# %for AR-picker, order of AR process of H-components
5.0 #Srecalcwin# %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H)
4.0 #pickwinS# %for initial AIC pick, length of S-pick window [s]
2.0 0.3 1.5 1.0 #tsnrh# %for ARH/AR3, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
1.0 #aictsmoothS# %for AIC-picker, take average of samples for smoothing of AIC function [s]
0.7 #tsmoothS# %for AR-picker, take average of samples for smoothing CF [s] (S)
0.9 #ausS# %for HOS/AR, artificial uplift of samples (aus) of CF (S)
1.5 #nfacS# %for AR-picker, noise factor for noise level determination (S)
#first-motion picker#
1 #minfmweight# %minimum required P weight for first-motion determination
2.0 #minFMSNR# %minimum required SNR for first-motion determination
0.2 #fmpickwin# %pick window around P onset for calculating zero crossings
#quality assessment#
0.02 0.04 0.08 0.16 #timeerrorsP# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
0.04 0.08 0.16 0.32 #timeerrorsS# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
0.8 #minAICPslope# %below this slope [counts/s] the initial P pick is rejected
1.1 #minAICPSNR# %below this SNR the initial P pick is rejected
1.0 #minAICSslope# %below this slope [counts/s] the initial S pick is rejected
1.5 #minAICSSNR# %below this SNR the initial S pick is rejected
1.0 #minsiglength# %length of signal part for which amplitudes must exceed noise level [s]
1.0 #noisefactor# %noiselevel*noisefactor=threshold
10.0 #minpercent# %required percentage of amplitudes exceeding threshold
1.5 #zfac# %P-amplitude must exceed at least zfac times RMS-S amplitude
6.0 #mdttolerance# %maximum allowed deviation of P picks from median [s]
1.0 #wdttolerance# %maximum allowed deviation from Wadati diagram
5.0 #jackfactor# %pick is removed if the variance of the subgroup with the pick removed is larger than the mean variance of all subgroups times this safety factor
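The filter settings translate directly into ObsPy filtering; a sketch with the local-distance values above (the waveform source is illustrative):

    # bandpass 1.0-10.0 Hz, order 2, as configured for P in pylot_local.in
    from obspy import read

    stream = read()  # ObsPy example data; replace with your own waveforms
    stream.filter('bandpass', freqmin=1.0, freqmax=10.0, corners=2,
                  zerophase=False)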
inputs/pylot_regional.in (new file, 100 lines)
@@ -0,0 +1,100 @@
%This is a parameter input file for PyLoT/autoPyLoT.
%All main and special settings regarding data handling
%and picking are to be set here!
%Parameters are optimized for %extent data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
#rootpath# %project path
#datapath# %data path
#database# %name of data base
#eventID# %event ID for single event processing (* for all events found in database)
#invdir# %full path to inventory or dataless-seed file
PILOT #datastructure# %choose data structure
True #apverbose# %choose 'True' or 'False' for terminal output
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#NLLoc settings#
None #nllocbin# %path to NLLoc executable
None #nllocroot# %root of NLLoc-processing directory
None #phasefile# %name of autoPyLoT-output phase file for NLLoc
None #ctrfile# %name of autoPyLoT-output control file for NLLoc
ttime #ttpatter# %pattern of NLLoc ttimes from grid
AUTOLOC_nlloc #outpatter# %pattern of NLLoc-output file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#parameters for seismic moment estimation#
3530.0 #vp# %average P-wave velocity
2500.0 #rho# %average rock density [kg/m^3]
300.0 0.8 #Qp# %quality factor for P waves (Qp*f^a); list(Qp, a)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#settings local magnitude#
1.11 0.0009 -2.0 #WAscaling# %scaling relation (log(Ao)+Alog(r)+Br+C) of Wood-Anderson amplitude Ao [nm]; if zeros are set, the original Richter magnitude is calculated!
1.0382 -0.447 #magscaling# %scaling relation for derived local magnitude [a*Ml+b]; if zeros are set, no scaling of network magnitude is applied!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#filter settings#
1.0 1.0 #minfreq# %lower filter frequency [P, S]
10.0 10.0 #maxfreq# %upper filter frequency [P, S]
2 2 #filter_order# %filter order [P, S]
bandpass bandpass #filter_type# %filter type (bandpass, bandstop, lowpass, highpass) [P, S]
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#common settings picker#
local #extent# %extent of array ("local", "regional" or "global")
15.0 #pstart# %start time [s] for calculating CF for P-picking
60.0 #pstop# %end time [s] for calculating CF for P-picking
-1.0 #sstart# %start time [s] relative to P-onset for calculating CF for S-picking
10.0 #sstop# %end time [s] after P-onset for calculating CF for S-picking
True #use_taup# %use estimated traveltimes from TauPy for calculating windows for CF
iasp91 #taup_model# %define TauPy model for traveltime estimation
2.0 10.0 #bpz1# %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
2.0 12.0 #bpz2# %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
2.0 8.0 #bph1# %lower/upper corner freq. of first band pass filter H-comp. [Hz]
2.0 10.0 #bph2# %lower/upper corner freq. of second band pass filter H-comp. [Hz]
#special settings for calculating CF#
%!!Edit the following only if you know what you are doing!!%
#Z-component#
HOS #algoP# %choose algorithm for P-onset determination (HOS, ARZ, or AR3)
7.0 #tlta# %for HOS-/AR-AIC-picker, length of LTA window [s]
4 #hosorder# %for HOS-picker, order of Higher Order Statistics
2 #Parorder# %for AR-picker, order of AR process of Z-component
1.2 #tdet1z# %for AR-picker, length of AR determination window [s] for Z-component, 1st pick
0.4 #tpred1z# %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick
0.6 #tdet2z# %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick
0.2 #tpred2z# %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick
0.001 #addnoise# %add noise to seismogram for stable AR prediction
3.0 0.1 0.5 1.0 #tsnrz# %for HOS/AR, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
3.0 #pickwinP# %for initial AIC pick, length of P-pick window [s]
6.0 #Precalcwin# %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick)
0.2 #aictsmooth# %for HOS/AR, take average of samples for smoothing of AIC function [s]
0.1 #tsmoothP# %for HOS/AR, take average of samples for smoothing CF [s]
0.001 #ausP# %for HOS/AR, artificial uplift of samples (aus) of CF (P)
1.3 #nfacP# %for HOS/AR, noise factor for noise level determination (P)
#H-components#
ARH #algoS# %choose algorithm for S-onset determination (ARH or AR3)
0.8 #tdet1h# %for HOS/AR, length of AR determination window [s], H-components, 1st pick
0.4 #tpred1h# %for HOS/AR, length of AR prediction window [s], H-components, 1st pick
0.6 #tdet2h# %for HOS/AR, length of AR determination window [s], H-components, 2nd pick
0.3 #tpred2h# %for HOS/AR, length of AR prediction window [s], H-components, 2nd pick
4 #Sarorder# %for AR-picker, order of AR process of H-components
5.0 #Srecalcwin# %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H)
4.0 #pickwinS# %for initial AIC pick, length of S-pick window [s]
2.0 0.3 1.5 1.0 #tsnrh# %for ARH/AR3, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
1.0 #aictsmoothS# %for AIC-picker, take average of samples for smoothing of AIC function [s]
0.7 #tsmoothS# %for AR-picker, take average of samples for smoothing CF [s] (S)
0.9 #ausS# %for HOS/AR, artificial uplift of samples (aus) of CF (S)
1.5 #nfacS# %for AR-picker, noise factor for noise level determination (S)
#first-motion picker#
1 #minfmweight# %minimum required P weight for first-motion determination
2.0 #minFMSNR# %minimum required SNR for first-motion determination
0.2 #fmpickwin# %pick window around P onset for calculating zero crossings
#quality assessment#
0.02 0.04 0.08 0.16 #timeerrorsP# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
0.04 0.08 0.16 0.32 #timeerrorsS# %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
0.8 #minAICPslope# %below this slope [counts/s] the initial P pick is rejected
1.1 #minAICPSNR# %below this SNR the initial P pick is rejected
1.0 #minAICSslope# %below this slope [counts/s] the initial S pick is rejected
1.5 #minAICSSNR# %below this SNR the initial S pick is rejected
1.0 #minsiglength# %length of signal part for which amplitudes must exceed noise level [s]
1.0 #noisefactor# %noiselevel*noisefactor=threshold
10.0 #minpercent# %required percentage of amplitudes exceeding threshold
1.5 #zfac# %P-amplitude must exceed at least zfac times RMS-S amplitude
6.0 #mdttolerance# %maximum allowed deviation of P picks from median [s]
1.0 #wdttolerance# %maximum allowed deviation from Wadati diagram
5.0 #jackfactor# %pick is removed if the variance of the subgroup with the pick removed is larger than the mean variance of all subgroups times this safety factor
inputs/richter_scaling.data (new file, 53 lines)
@@ -0,0 +1,53 @@
0 1.4
10 1.5
20 1.7
25 1.9
30 2.1
35 2.3
40 2.4
45 2.5
50 2.6
60 2.8
70 2.8
75 2.9
85 2.9
90 3.0
100 3.0
110 3.1
120 3.1
130 3.2
140 3.2
150 3.3
160 3.3
170 3.4
180 3.4
190 3.5
200 3.5
210 3.6
230 3.7
240 3.7
250 3.8
260 3.8
270 3.9
280 3.9
290 4.0
300 4.0
310 4.1
320 4.2
330 4.2
340 4.2
350 4.3
360 4.3
370 4.3
380 4.4
390 4.4
400 4.5
430 4.6
470 4.7
510 4.8
560 4.9
600 5.1
700 5.2
800 5.4
900 5.5
1000 5.7
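The two columns are plausibly distance and the corresponding Richter scaling value; a hedged sketch of how such a table could be evaluated at arbitrary distances (the column meaning and file location are assumptions):

    # hypothetical reader for richter_scaling.data; assumes the columns are
    # distance and scaling value, as the file name suggests
    import numpy as np

    dist, scale = np.loadtxt('inputs/richter_scaling.data', unpack=True)

    def richter_scaling(distance):
        # linear interpolation between the tabulated points
        return np.interp(distance, dist, scale)

    print(richter_scaling(65.0))  # -> 2.8 for the values tabulated above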
226
makePyLoT.py
Normal file
@@ -0,0 +1,226 @@
#!/usr/bin/env python
# encoding: utf-8
from __future__ import print_function

"""
makePyLoT -- build and install PyLoT

makePyLoT is a Python make file that establishes the folder structure
and meets the prerequisites.

It defines
:class CLIError:
:method main:

:author: Sebastian Wehling-Benatelli

:copyright: 2014 MAGS2 EP3 Working Group. All rights reserved.

:license: GNU Lesser General Public License, Version 3
(http://www.gnu.org/copyleft/lesser.html)

:contact: sebastian.wehling@rub.de

updated: Updated
"""

import glob
import os
import sys
import shutil
import copy

from argparse import ArgumentParser
from argparse import RawDescriptionHelpFormatter

__all__ = []
__version__ = 0.1
__date__ = '2014-11-26'
__updated__ = '2016-04-28'

DEBUG = 0
TESTRUN = 0
PROFILE = 0


class CLIError(Exception):
    """Generic exception to raise and log different fatal errors."""

    def __init__(self, msg):
        super(CLIError, self).__init__(msg)
        self.msg = "E: %s" % msg

    def __str__(self):
        return self.msg

    def __unicode__(self):
        return self.msg


def main(argv=None):  # IGNORE:C0111
    '''Command line options.'''

    if argv is None:
        argv = sys.argv
    else:
        sys.argv.extend(argv)

    program_name = os.path.basename(sys.argv[0])
    program_version = "v%s" % __version__
    program_build_date = str(__updated__)
    program_version_message = 'makePyLoT %s (%s)' % (
        program_version, program_build_date)
    program_shortdesc = __import__('__main__').__doc__.split("\n")[1]
    program_license = '''{0:s}

  Created by Sebastian Wehling-Benatelli on {1:s}.
  Copyright 2014 MAGS2 EP3 Working Group. All rights reserved.

  GNU Lesser General Public License, Version 3
  (http://www.gnu.org/copyleft/lesser.html)

  Distributed on an "AS IS" basis without warranties
  or conditions of any kind, either express or implied.

USAGE
'''.format(program_shortdesc, str(__date__))

    try:
        # Setup argument parser
        parser = ArgumentParser(description=program_license,
                                formatter_class=RawDescriptionHelpFormatter)
        parser.add_argument("-b", "--build", dest="build", action="store_true",
                            help="build PyLoT")
        parser.add_argument("-v", "--verbose", dest="verbose", action="count",
                            help="set verbosity level")
        parser.add_argument("-i", "--install", dest="install",
                            action="store_true",
                            help="install PyLoT on the system")
        parser.add_argument("-d", "--directory", dest="directory",
                            help="installation directory", metavar="RE")
        parser.add_argument('-V', '--version', action='version',
                            version=program_version_message)

        # Process arguments
        args = parser.parse_args()

        verbose = args.verbose
        build = args.build
        install = args.install
        directory = args.directory

        if verbose and verbose > 0:
            print("Verbose mode on")
        if install and not directory:
            raise CLIError("""Trying to install without appropriate
                           destination; please specify an installation
                           directory!""")
        if build and install:
            print("Building and installing PyLoT ...\n")
            buildPyLoT(verbose)
            installPyLoT(verbose)
        elif build and not install:
            print("Building PyLoT without installing! Please wait ...\n")
            buildPyLoT(verbose)
        cleanUp()
        return 0
    except KeyboardInterrupt:
        cleanUp(1)
        return 0
    except Exception as e:
        if DEBUG or TESTRUN:
            raise e
        indent = len(program_name) * " "
        sys.stderr.write(program_name + ": " + repr(e) + "\n")
        sys.stderr.write(indent + " for help use --help")
        return 2


def buildPyLoT(verbosity=None):
    system = sys.platform
    if verbosity and verbosity > 1:
        msg = ("... on system: {0}\n"
               "\n"
               "  Current working directory: {1}\n"
               ).format(system, os.getcwd())
        print(msg)
    if system.startswith(('win', 'microsoft')):
        raise CLIError(
            "building on Windows system not tested yet; implementation pending")
    elif system == 'darwin':
        # create a symbolic link to the desired python interpreter in order to
        # display the right application name
        for path in os.getenv('PATH').split(':'):
            found = glob.glob(os.path.join(path, 'python'))
            if found:
                # glob returns a list; link the first match
                os.symlink(found[0], './PyLoT')
                break


def installPyLoT(verbosity=None):
    files_to_copy = {'pylot_local.in': ['~', '.pylot'],
                     'pylot_regional.in': ['~', '.pylot'],
                     'pylot_global.in': ['~', '.pylot']}
    if verbosity and verbosity > 0:
        print('starting installation of PyLoT ...')
    if verbosity and verbosity > 1:
        print('copying input files into destination folder ...')
    ans = input('please specify scope of interest '
                '([0]=local, 1=regional, 2=global) :') or 0
    if not isinstance(ans, int):
        ans = int(ans)
    if ans == 0:
        ans = 'local'
    elif ans == 1:
        ans = 'regional'
    elif ans == 2:
        ans = 'global'
    link_dest = []
    for file, destination in files_to_copy.items():
        link_file = ans in file
        if link_file:
            link_dest = copy.deepcopy(destination)
            link_dest.append('pylot.in')
            # expand '~' so the symlink target is a usable path
            link_dest = os.path.expanduser(os.path.join(*link_dest))
        destination.append(file)
        destination = os.path.expanduser(os.path.join(*destination))
        srcfile = os.path.join('inputs', file)
        assert not os.path.isabs(srcfile), 'source files seem to be ' \
                                           'corrupted ...'
        if verbosity and verbosity > 1:
            print('copying file {file} to folder {dest}'.format(file=file, dest=destination))
        shutil.copyfile(srcfile, destination)
        if link_file:
            if verbosity:
                print('linking input file for autoPyLoT ...')
            os.symlink(destination, link_dest)


def cleanUp(verbosity=None):
    if verbosity and verbosity >= 1:
        print('cleaning up build files...')
    if sys.platform == 'darwin':
        os.remove('./PyLoT')


if __name__ == "__main__":
    if DEBUG:
        sys.argv.append("-h")
        sys.argv.append("-v")
    if TESTRUN:
        import doctest

        doctest.testmod()
    if PROFILE:
        import cProfile
        import pstats

        profile_filename = 'makePyLoT_profile.txt'
        cProfile.run('main()', profile_filename)
        statsfile = open("profile_stats.txt", "wb")
        p = pstats.Stats(profile_filename, stream=statsfile)
        stats = p.strip_dirs().sort_stats('cumulative')
        stats.print_stats()
        statsfile.close()
        sys.exit(0)
    sys.exit(main())
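Run as a script, makePyLoT exposes exactly the flags wired up above; a hypothetical session driving it from Python (equivalent to calling `python makePyLoT.py ...` in the repository root) might look like:

    import os
    import subprocess
    import sys

    # build only
    subprocess.call([sys.executable, 'makePyLoT.py', '--build'])
    # build and install; -d is mandatory with -i (see the CLIError in main())
    subprocess.call([sys.executable, 'makePyLoT.py', '-b', '-i',
                     '-d', os.path.expanduser('~/pylot')])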
27
pylot/__init__.py
Normal file
@@ -0,0 +1,27 @@
# -*- coding: utf-8 -*-
# --------------------------------------------------------
#   Purpose: Convenience imports for PyLoT
#
'''
================================================
PyLoT - the Python picking and Localization Tool
================================================

This Python library contains a graphical user interface for picking
seismic phases. This software needs ObsPy (http://github.com/obspy/obspy/wiki)
and the Qt4 libraries to be installed first.

PILOT has been developed in Mathworks' MatLab. In order to distribute
PILOT without facing portability problems, it has been decided to
redevelop the software package in Python. The great work of the ObsPy
group allows easy handling of large amounts of seismic data, and PyLoT
benefits a lot compared to the former MatLab version.

The development of PyLoT is part of the joint research project MAGS2.

:copyright:
    The PyLoT Development Team
:license:
    GNU Lesser General Public License, Version 3
    (http://www.gnu.org/copyleft/lesser.html)
'''
1
pylot/core/__init__.py
Normal file
@@ -0,0 +1 @@
# -*- coding: utf-8 -*-
1
pylot/core/analysis/__init__.py
Normal file
@@ -0,0 +1 @@
# -*- coding: utf-8 -*-
761
pylot/core/analysis/magnitude.py
Normal file
@@ -0,0 +1,761 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Created autumn/winter 2015.
Revised/extended summer 2017.

:author: Ludger Küperkoch / MAGS2 EP3 working group
"""
import matplotlib.pyplot as plt
import numpy as np
import obspy.core.event as ope
from obspy.geodetics import degrees2kilometers
from pylot.core.pick.utils import getsignalwin, crossings_nonzero_all, \
    select_for_phase
from pylot.core.util.utils import common_range, fit_curve
from scipy import integrate, signal
from scipy.optimize import curve_fit


def richter_magnitude_scaling(delta):
    distance = np.array([0, 10, 20, 25, 30, 35, 40, 45, 50, 60, 70, 75, 85, 90, 100, 110,
                         120, 130, 140, 150, 160, 170, 180, 190, 200, 210, 230, 240, 250,
                         260, 270, 280, 290, 300, 310, 320, 330, 340, 350, 360, 370, 380,
                         390, 400, 430, 470, 510, 560, 600, 700, 800, 900, 1000])
    richter_scaling = np.array([1.4, 1.5, 1.7, 1.9, 2.1, 2.3, 2.4, 2.5, 2.6, 2.8, 2.8, 2.9,
                                2.9, 3.0, 3.1, 3.1, 3.2, 3.2, 3.3, 3.3, 3.4, 3.4, 3.5, 3.5,
                                3.6, 3.7, 3.7, 3.8, 3.8, 3.9, 3.9, 4.0, 4.0, 4.1, 4.2, 4.2,
                                4.2, 4.2, 4.3, 4.3, 4.3, 4.4, 4.4, 4.5, 4.6, 4.7, 4.8, 4.9,
                                5.1, 5.2, 5.4, 5.5, 5.7])
    # prepare spline interpolation to calculate return value
    func, params = fit_curve(distance, richter_scaling)
    return func(delta, params)
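In `LocalMagnitude.calc()` further below, this correction enters the classical Richter relation

    M_L = \log_{10} A_0 + S(\Delta)

with A_0 the Wood-Anderson amplitude read off in `peak_to_peak()` and S(\Delta) the tabulated distance correction returned by `richter_magnitude_scaling()`.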
class Magnitude(object):
    """
    Base class object for Magnitude calculation within PyLoT.
    """

    def __init__(self, stream, event, verbosity=False, iplot=0):
        self._type = "M"
        self._stream = stream
        self._plot_flag = iplot
        self._verbosity = verbosity
        self._event = event
        self._magnitudes = dict()

    def __str__(self):
        # build a printable summary instead of printing directly
        lines = ['number of stations used: {0}'.format(len(self.magnitudes.values())),
                 '\tstation\tmagnitude']
        for s, m in self.magnitudes.items():
            lines.append('\t{0}\t{1}'.format(s, m))
        return '\n'.join(lines)

    def __nonzero__(self):
        return bool(self.magnitudes)

    # Python 3 name of the truth-value hook used by `if self:` in net_magnitude()
    __bool__ = __nonzero__

    @property
    def type(self):
        return self._type

    @property
    def plot_flag(self):
        return self._plot_flag

    @plot_flag.setter
    def plot_flag(self, value):
        self._plot_flag = value

    @property
    def verbose(self):
        return self._verbosity

    @verbose.setter
    def verbose(self, value):
        if not isinstance(value, bool):
            print('WARNING: only boolean values accepted...\n')
            value = bool(value)
        self._verbosity = value

    @property
    def stream(self):
        return self._stream

    @stream.setter
    def stream(self, value):
        self._stream = value

    @property
    def event(self):
        return self._event

    @property
    def origin_id(self):
        return self._event.origins[0].resource_id

    @property
    def arrivals(self):
        return self._event.origins[0].arrivals

    @property
    def magnitudes(self):
        return self._magnitudes

    @magnitudes.setter
    def magnitudes(self, value):
        """
        takes a tuple and saves the key-value pair to the private
        attribute _magnitudes
        :param value: station, magnitude value pair
        :type value: tuple or list
        """
        station, magnitude = value
        self._magnitudes[station] = magnitude

    def calc(self):
        pass

    def updated_event(self, magscaling=None):
        self.event.magnitudes.append(self.net_magnitude(magscaling))
        return self.event

    def net_magnitude(self, magscaling=None):
        if self:
            if magscaling is not None and str(magscaling) != '[0.0, 0.0]':
                # scaling necessary
                print("Scaling network magnitude ...")
                mag = ope.Magnitude(
                    mag=np.median([M.mag for M in self.magnitudes.values()]) *
                        magscaling[0] + magscaling[1],
                    magnitude_type=self.type,
                    origin_id=self.origin_id,
                    station_count=len(self.magnitudes),
                    azimuthal_gap=self.origin_id.get_referred_object().quality.azimuthal_gap)
            else:
                # no scaling necessary
                mag = ope.Magnitude(
                    mag=np.median([M.mag for M in self.magnitudes.values()]),
                    magnitude_type=self.type,
                    origin_id=self.origin_id,
                    station_count=len(self.magnitudes),
                    azimuthal_gap=self.origin_id.get_referred_object().quality.azimuthal_gap)
            return mag
        return None
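`net_magnitude()` reduces the collected station magnitudes to one network value; with the optional two-element `magscaling` pair (a, b) this is simply

    M_{net} = a \cdot \operatorname{median}(M_1, \dots, M_N) + b

and without scaling it is the plain median of all station magnitudes.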
class LocalMagnitude(Magnitude):
    """
    Method to derive the peak-to-peak amplitude as seen on a Wood-Anderson
    seismograph. Has to be derived from instrument-corrected traces!
    """

    # poles, zeros and sensitivity of WA seismograph
    # (see Uhrhammer & Collins, 1990, BSSA, pp. 702-716)
    _paz = {
        'poles': [5.6089 - 5.4978j, -5.6089 - 5.4978j],
        'zeros': [0j, 0j],
        'gain': 2080,
        'sensitivity': 1
    }

    _amplitudes = dict()

    def __init__(self, stream, event, calc_win, wascaling, verbosity=False, iplot=0):
        super(LocalMagnitude, self).__init__(stream, event, verbosity, iplot)

        self._calc_win = calc_win
        self._wascaling = wascaling
        self._type = 'ML'
        self.calc()

    @property
    def calc_win(self):
        return self._calc_win

    @calc_win.setter
    def calc_win(self, value):
        self._calc_win = value

    @property
    def wascaling(self):
        return self._wascaling

    @property
    def amplitudes(self):
        return self._amplitudes

    @amplitudes.setter
    def amplitudes(self, value):
        station, a0 = value
        self._amplitudes[station] = a0

    def peak_to_peak(self, st, t0):

        try:
            iplot = int(self.plot_flag)
        except (TypeError, ValueError):
            if self.plot_flag == True or self.plot_flag == 'True':
                iplot = 2
            else:
                iplot = 0

        # simulate Wood-Anderson response
        st.simulate(paz_remove=None, paz_simulate=self._paz)

        # trim waveform to common range
        stime, etime = common_range(st)
        st.trim(stime, etime)

        # get time delta from waveform data
        dt = st[0].stats.delta

        power = [np.power(tr.data, 2) for tr in st if tr.stats.channel[-1] not
                 in 'Z3']
        if len(power) != 2:
            raise ValueError('Wood-Anderson amplitude definition only valid for '
                             'two horizontals: {0} given'.format(len(power)))
        power_sum = power[0] + power[1]

        sqH = np.sqrt(power_sum)

        # get time array
        th = np.arange(0, len(sqH) * dt, dt)
        # get maximum peak within pick window
        iwin = getsignalwin(th, t0 - stime, self.calc_win)
        ii = min([iwin[len(iwin) - 1], len(th)])
        iwin = iwin[0:ii]
        wapp = np.max(sqH[iwin])
        if self.verbose:
            print("Determined Wood-Anderson peak-to-peak amplitude for station {0}: {1} "
                  "mm".format(st[0].stats.station, wapp))

        # check for plot flag (for debugging only)
        fig = None
        if iplot > 1:
            st.plot()
            fig = plt.figure()
            ax = fig.add_subplot(111)
            ax.plot(th, sqH)
            ax.plot(th[iwin], sqH[iwin], 'g')
            ax.plot([t0, t0], [0, max(sqH)], 'r', linewidth=2)
            ax.set_title('Station %s, RMS Horizontal Traces, WA-peak-to-peak=%4.1f mm'
                         % (st[0].stats.station, wapp))
            ax.set_xlabel('Time [s]')
            ax.set_ylabel('Displacement [mm]')

        return wapp, fig
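The quantity maximised inside the signal window is the vector sum of the two simulated horizontal components,

    \mathrm{sqH}(t) = \sqrt{N(t)^2 + E(t)^2}

so `wapp` is the largest horizontal Wood-Anderson displacement within `calc_win` after the S onset `t0`.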
    def calc(self):
        for a in self.arrivals:
            if a.phase not in 'sS':
                continue
            pick = a.pick_id.get_referred_object()
            station = pick.waveform_id.station_code
            wf = select_for_phase(self.stream.select(
                station=station), a.phase)
            if not wf:
                if self.verbose:
                    print(
                        'WARNING: no waveform data found for station {0}'.format(
                            station))
                continue
            delta = degrees2kilometers(a.distance)
            onset = pick.time
            a0, self.p2p_fig = self.peak_to_peak(wf, onset)
            amplitude = ope.Amplitude(generic_amplitude=a0 * 1e-3)
            amplitude.unit = 'm'
            amplitude.category = 'point'
            amplitude.waveform_id = pick.waveform_id
            amplitude.magnitude_hint = self.type
            amplitude.pick_id = pick.resource_id
            amplitude.type = 'AML'
            self.event.amplitudes.append(amplitude)
            self.amplitudes = (station, amplitude)
            # using standard Gutenberg-Richter relation
            # or scale WA amplitude with given scaling relation
            if str(self.wascaling) == '[0.0, 0.0, 0.0]':
                print("Calculating original Richter magnitude ...")
                magnitude = ope.StationMagnitude(mag=np.log10(a0)
                                                 + richter_magnitude_scaling(delta))
            else:
                print("Calculating scaled local magnitude ...")
                a0 = a0 * 1e03  # mm to nm (see Havskov & Ottemöller, 2010)
                magnitude = ope.StationMagnitude(mag=np.log10(a0)
                                                 + self.wascaling[0] * np.log10(delta)
                                                 + self.wascaling[1] * delta
                                                 + self.wascaling[2])
            magnitude.origin_id = self.origin_id
            magnitude.waveform_id = pick.waveform_id
            magnitude.amplitude_id = amplitude.resource_id
            magnitude.station_magnitude_type = self.type
            self.event.station_magnitudes.append(magnitude)
            self.magnitudes = (station, magnitude)


class MomentMagnitude(Magnitude):
    '''
    Method to calculate seismic moment Mo and moment magnitude Mw.
    Requires results of class calcsourcespec for calculating plateau w0
    and corner frequency fc of source spectrum, respectively. Uses
    subfunction calcMoMw.py. Returns modified dictionary of picks including
    the DC value (plateau w0), corner frequency fc, seismic moment Mo and
    corresponding moment magnitude Mw.
    '''

    _props = dict()

    def __init__(self, stream, event, vp, Qp, density, verbosity=False,
                 iplot=False):
        super(MomentMagnitude, self).__init__(stream, event, verbosity, iplot)

        self._vp = vp
        self._Qp = Qp
        self._density = density
        self._type = 'Mw'
        self.calc()

    @property
    def p_velocity(self):
        return self._vp

    @property
    def p_attenuation(self):
        return self._Qp

    @property
    def rock_density(self):
        return self._density

    @property
    def moment_props(self):
        return self._props

    @moment_props.setter
    def moment_props(self, value):
        station, props = value
        self._props[station] = props

    @property
    def seismic_moment(self):
        return self._m0

    @seismic_moment.setter
    def seismic_moment(self, value):
        self._m0 = value

    def calc(self):
        for a in self.arrivals:
            if a.phase not in 'pP':
                continue
            pick = a.pick_id.get_referred_object()
            station = pick.waveform_id.station_code
            scopy = self.stream.copy()
            wf = scopy.select(station=station)
            if not wf:
                continue
            onset = pick.time
            distance = degrees2kilometers(a.distance)
            azimuth = a.azimuth
            incidence = a.takeoff_angle
            w0, fc = calcsourcespec(wf, onset, self.p_velocity, distance,
                                    azimuth, incidence, self.p_attenuation,
                                    self.plot_flag, self.verbose)
            if w0 is None or fc is None:
                if self.verbose:
                    print("WARNING: insufficient frequency information")
                continue
            WF = select_for_phase(self.stream.select(
                station=station), a.phase)
            WF = select_for_phase(WF, "P")
            m0, mw = calcMoMw(WF, w0, self.rock_density, self.p_velocity,
                              distance, self.verbose)
            self.moment_props = (station, dict(w0=w0, fc=fc, Mo=m0))
            magnitude = ope.StationMagnitude(mag=mw)
            magnitude.origin_id = self.origin_id
            magnitude.waveform_id = pick.waveform_id
            magnitude.station_magnitude_type = self.type
            self.event.station_magnitudes.append(magnitude)
            self.magnitudes = (station, magnitude)


def calcMoMw(wfstream, w0, rho, vp, delta, verbosity=False):
    '''
    Subfunction of run_calcMoMw to calculate individual
    seismic moments and corresponding moment magnitudes.

    :param: wfstream
    :type: `~obspy.core.stream.Stream`

    :param: w0, height of plateau of source spectrum
    :type: float

    :param: rho, rock density [kg/m³]
    :type: integer

    :param: vp, P-wave velocity [m/s]
    :type: float

    :param: delta, hypocentral distance [km]
    :type: integer
    '''

    tr = wfstream[0]
    delta = delta * 1000  # hypocentral distance in [m]

    if verbosity:
        print(
            "calcMoMw: Calculating seismic moment Mo and moment magnitude Mw for station {0} ...".format(
                tr.stats.station))

    # additional common parameters for calculating Mo
    rP = 2 / np.sqrt(
        15)  # average radiation pattern of P waves (Aki & Richards, 1980)
    freesurf = 2.0  # free surface correction, assuming vertical incidence

    Mo = w0 * 4 * np.pi * rho * np.power(vp, 3) * delta / (rP * freesurf)

    # Mw = np.log10(Mo * 1e07) * 2 / 3 - 10.7  # after Hanks & Kanamori (1979), defined for [dyn*cm]!
    Mw = np.log10(Mo) * 2 / 3 - 6.7  # for metric units

    if verbosity:
        print(
            "calcMoMw: Calculated seismic moment Mo = {0} Nm => Mw = {1:3.1f} ".format(
                Mo, Mw))

    return Mo, Mw
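`calcMoMw()` implements the standard point-source relations; in the code's symbols (plateau w0, hypocentral distance delta in metres, radiation pattern rP, free-surface factor freesurf):

    M_0 = \frac{4 \pi \rho \, v_p^3 \, \delta \, w_0}{r_P \cdot \mathrm{freesurf}},
    \qquad
    M_w = \tfrac{2}{3} \log_{10}\!\bigl(M_0 \cdot 10^{7}\bigr) - 10.7

the Mw relation after Hanks & Kanamori (1979) for M0 in dyn·cm, as quoted in the comment next to the metric variant the code actually uses.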
def calcsourcespec(wfstream, onset, vp, delta, azimuth, incidence,
                   qp, iplot=0, verbosity=False):
    '''
    Subfunction to calculate the source spectrum and to derive from that the plateau
    (usually called omega0) and the corner frequency assuming Aki's omega-square
    source model. Has to be derived from instrument-corrected displacement traces,
    thus restitution and integration necessary! Integrated traces are rotated
    into the ray-coordinate system ZNE => LQT using ObsPy's rotate module!

    :param: wfstream (corrected for instrument)
    :type: `~obspy.core.stream.Stream`

    :param: onset, P-phase onset time
    :type: float

    :param: vp, P-wave velocity
    :type: float

    :param: delta, hypocentral distance [km]
    :type: integer

    :param: azimuth
    :type: integer

    :param: incidence
    :type: integer

    :param: qp, quality factor for P-waves
    :type: integer

    :param: iplot, show results (iplot>1) or not (iplot<1)
    :type: integer
    '''
    if verbosity:
        print("Calculating source spectrum for station %s ...." % wfstream[0].stats.station)

    try:
        iplot = int(iplot)
    except (TypeError, ValueError):
        if iplot == True or iplot == 'True':
            iplot = 2
        else:
            iplot = 0

    # get Q value
    Q, A = qp

    dist = delta * 1000  # hypocentral distance in [m]

    fc = None
    w0 = None

    zdat = select_for_phase(wfstream, "P")

    dt = zdat[0].stats.delta

    freq = zdat[0].stats.sampling_rate

    # trim traces to common range (for rotation)
    trstart, trend = common_range(wfstream)
    wfstream.trim(trstart, trend)

    # rotate into LQT (ray-coordinate) system using ObsPy's rotate
    # L: P-wave direction
    # Q: SV-wave direction
    # T: SH-wave direction
    LQT = wfstream.rotate('ZNE->LQT', azimuth, incidence)
    ldat = LQT.select(component="L")
    if len(ldat) == 0:
        # if horizontal channels are 1 and 2
        # no azimuth information is available and thus no
        # rotation is possible!
        if verbosity:
            print("calcsourcespec: Azimuth information is missing, "
                  "no rotation of components possible!")
        # instead, use component 3
        ldat = LQT.select(component="3")
        if len(ldat) == 0:
            # maybe component z available
            ldat = LQT.select(component="Z")

    # integrate to displacement
    # unrotated vertical component (for comparison)
    inttrz = signal.detrend(integrate.cumtrapz(zdat[0].data, None, dt))
    # rotated component Z => L
    Ldat = signal.detrend(integrate.cumtrapz(ldat[0].data, None, dt))

    # get window after P pulse for
    # calculating source spectrum
    rel_onset = onset - trstart
    impickP = int(rel_onset * freq)
    wfzc = Ldat[impickP: len(Ldat) - 1]
    # get time array
    t = np.arange(0, len(inttrz) * dt, dt)
    # calculate spectrum using only first cycles of
    # waveform after P onset!
    zc = crossings_nonzero_all(wfzc)
    if np.size(zc) == 0 or len(zc) <= 3:
        if verbosity:
            print("calcsourcespec: Something is wrong with the waveform, "
                  "no zero crossings derived!\n")
            print("No calculation of source spectrum possible!")
        plotflag = 0
    else:
        plotflag = 1
        index = min([3, len(zc) - 1])
        calcwin = (zc[index] - zc[0]) * dt
        iwin = getsignalwin(t, rel_onset, calcwin)
        xdat = Ldat[iwin]

        # fft
        fny = freq / 2
        l = len(xdat) / freq
        # number of fft bins after Bath
        n = freq * l
        # find next power of 2 of data length
        m = pow(2, np.ceil(np.log(len(xdat)) / np.log(2)))
        N = int(np.power(m, 2))
        y = dt * np.fft.fft(xdat, N)
        Y = abs(y[: N // 2])
        L = (N - 1) / freq
        f = np.arange(0, fny, 1 / L)

        # remove zero-frequency and frequencies above
        # corner frequency of seismometer (assumed
        # to be 100 Hz)
        fi = np.where((f >= 1) & (f < 100))
        F = f[fi]
        YY = Y[fi]

        # correction for attenuation
        wa = 2 * np.pi * F  # angular frequency
        D = np.exp((wa * dist) / (2 * vp * Q * F ** A))
        YYcor = YY.real * D

        # get plateau (DC value) and corner frequency
        # initial guess of plateau
        w0in = np.mean(YYcor[0:100])
        # initial guess of corner frequency
        # where spectral level reached 50% of flat level
        iin = np.where(YYcor >= 0.5 * w0in)
        Fcin = F[iin[0][np.size(iin) - 1]]

        # use of implicit scipy optimization function
        fit = synthsourcespec(F, w0in, Fcin)
        [optspecfit, _] = curve_fit(synthsourcespec, F, YYcor, [w0in, Fcin])
        w0 = optspecfit[0]
        fc = optspecfit[1]
        # w01 = optspecfit[0]
        # fc1 = optspecfit[1]
        if verbosity:
            print("calcsourcespec: Determined w0-value: %e m/Hz, \n"
                  "calcsourcespec: Determined corner frequency: %f Hz" % (w0, fc))

        # use of conventional fitting
        # [w02, fc2] = fitSourceModel(F, YYcor, Fcin, iplot, verbosity)

        # get w0 and fc as median of both
        # source spectrum fits
        # w0 = np.median([w01, w02])
        # fc = np.median([fc1, fc2])
        # if verbosity:
        #     print("calcsourcespec: Using w0-value = %e m/Hz and fc = %f Hz" % (
        #         w0, fc))

    if iplot > 1:
        f1 = plt.figure()
        tLdat = np.arange(0, len(Ldat) * dt, dt)
        plt.subplot(2, 1, 1)
        # show displacement in mm
        p1, = plt.plot(t, np.multiply(inttrz, 1000), 'k')
        p2, = plt.plot(tLdat, np.multiply(Ldat, 1000))
        plt.legend([p1, p2], ['Displacement', 'Rotated Displacement'])
        if plotflag == 1:
            plt.plot(t[iwin], np.multiply(xdat, 1000), 'g')
            plt.title('Seismogram and P Pulse, Station %s-%s'
                      % (zdat[0].stats.station, zdat[0].stats.channel))
        else:
            plt.title('Seismogram, Station %s-%s'
                      % (zdat[0].stats.station, zdat[0].stats.channel))
        plt.xlabel('Time since %s' % zdat[0].stats.starttime)
        plt.ylabel('Displacement [mm]')

        if plotflag == 1:
            plt.subplot(2, 1, 2)
            p1, = plt.loglog(f, Y.real, 'k')
            p2, = plt.loglog(F, YY.real)
            p3, = plt.loglog(F, YYcor, 'r')
            p4, = plt.loglog(F, fit, 'g')
            plt.loglog([fc, fc], [w0 / 100, w0], 'g')
            plt.legend([p1, p2, p3, p4], ['Raw Spectrum',
                                          'Used Raw Spectrum',
                                          'Q-Corrected Spectrum',
                                          'Fit to Spectrum'])
            plt.title('Source Spectrum from P Pulse, w0=%e m/Hz, fc=%6.2f Hz'
                      % (w0, fc))
            plt.xlabel('Frequency [Hz]')
            plt.ylabel('Amplitude [m/Hz]')
            plt.grid()

    return w0, fc
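The attenuation correction applied to the spectrum in `calcsourcespec()` above uses the frequency-dependent quality factor Q(f) = Q·f^A taken from the `qp` pair:

    D(f) = \exp\!\left( \frac{2\pi f \cdot \mathrm{dist}}{2 \, v_p \, Q f^{A}} \right),
    \qquad
    \Omega_{\mathrm{cor}}(f) = \Omega(f) \, D(f)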
def synthsourcespec(f, omega0, fcorner):
    '''
    Calculates synthetic source spectrum from given plateau and corner
    frequency assuming Aki's omega-square model.

    :param: f, frequencies
    :type: array

    :param: omega0, DC-value (plateau) of source spectrum
    :type: float

    :param: fcorner, corner frequency of source spectrum
    :type: float
    '''

    # ssp = omega0 / (pow(2, (1 + f / fcorner)))
    ssp = omega0 / (1 + pow(2, (f / fcorner)))

    return ssp


def fitSourceModel(f, S, fc0, iplot, verbosity=False):
    '''
    Calculates synthetic source spectra by varying the corner frequency fc.
    Returns the best approximated plateau omega0 and corner frequency, i.e.
    the pair with the least common standard deviation.

    :param: f, frequencies
    :type: array

    :param: S, observed source spectrum
    :type: array

    :param: fc0, initial corner frequency
    :type: float
    '''

    try:
        iplot = int(iplot)
    except (TypeError, ValueError):
        if iplot == True or iplot == 'True':
            iplot = 2
        else:
            iplot = 0

    w0 = []
    stdw0 = []
    fc = []
    stdfc = []
    STD = []
    # get window around initial corner frequency for trials
    # left side of initial corner frequency
    fcstopl = max(f[0], fc0 - max(1, fc0 / 2))
    il = np.where(f <= fcstopl)
    il = il[0][np.size(il) - 1]
    # right side of initial corner frequency
    fcstopr = min(fc0 + (fc0 / 2), f[len(f) - 1])
    ir = np.where(f >= fcstopr)
    # check, if fcstopr is available
    if np.size(ir) == 0:
        fcstopr = fc0
        ir = len(f) - 1
    else:
        ir = ir[0][0]

    # vary corner frequency around initial point
    print("fitSourceModel: Varying corner frequency "
          "around initial corner frequency ...")
    # check difference of il and ir in order to
    # keep calculation time acceptable
    idiff = ir - il
    if idiff > 10000:
        increment = 100
    elif idiff <= 20:
        increment = 1
    else:
        increment = 10

    for i in range(il, ir, increment):
        FC = f[i]
        indexdc = np.where((f > 0) & (f <= FC))
        dc = np.mean(S[indexdc])
        stddc = np.std(dc - S[indexdc])
        w0.append(dc)
        stdw0.append(stddc)
        fc.append(FC)
        # slope
        indexfc = np.where((f >= FC) & (f <= fcstopr))
        yi = dc / (1 + (f[indexfc] / FC) ** 2)
        stdFC = np.std(yi - S[indexfc])
        stdfc.append(stdFC)
        STD.append(stddc + stdFC)

    # get best found w0 and fc from minimum
    if len(STD) > 0:
        fc = fc[np.argmin(STD)]
        w0 = w0[np.argmin(STD)]
    elif len(STD) == 0:
        fc = fc0
        w0 = max(S)
    if verbosity:
        print(
            "fitSourceModel: best fc: {0} Hz, best w0: {1} m/Hz".format(fc, w0))

    if iplot > 1:
        plt.figure()  # iplot)
        plt.loglog(f, S, 'k')
        plt.loglog([f[0], fc], [w0, w0], 'g')
        plt.loglog([fc, fc], [w0 / 100, w0], 'g')
        plt.title('Calculated Source Spectrum, Omega0=%e m/Hz, fc=%6.2f Hz'
                  % (w0, fc))
        plt.xlabel('Frequency [Hz]')
        plt.ylabel('Amplitude [m/Hz]')
        plt.grid()
        plt.figure()  # iplot + 1)
        plt.subplot(311)
        plt.plot(f[il:ir], STD, '*')
        plt.title('Common Standard Deviations')
        plt.xticks([])
        plt.subplot(312)
        plt.plot(f[il:ir], stdw0, '*')
        plt.title('Standard Deviations of w0-Values')
        plt.xticks([])
        plt.subplot(313)
        plt.plot(f[il:ir], stdfc, '*')
        plt.title('Standard Deviations of Corner Frequencies')
        plt.xlabel('Corner Frequencies [Hz]')

    return w0, fc
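The trial spectrum evaluated for every candidate corner frequency inside the loop of `fitSourceModel()` is the plain omega-square shape,

    S_{\mathrm{syn}}(f) = \frac{\Omega_0}{1 + (f / f_c)^2}

and the (Omega0, fc) pair with the smallest summed standard deviation of plateau fit and fall-off fit is returned.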
1
pylot/core/io/__init__.py
Normal file
@@ -0,0 +1 @@
# -*- coding: utf-8 -*-
750
pylot/core/io/data.py
Normal file
@@ -0,0 +1,750 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import copy
import os

from obspy import read_events
from obspy.core import read, Stream, UTCDateTime
from obspy.core.event import Event as ObsPyEvent
from obspy.io.sac import SacIOError
from pylot.core.io.phases import readPILOTEvent, picks_from_picksdict, \
    picksdict_from_pilot, merge_picks
from pylot.core.util.errors import FormatError, OverwriteError
from pylot.core.util.event import Event
from pylot.core.util.utils import fnConstructor, full_range
import pylot.core.loc.velest as velest


class Data(object):
    """
    Data container with attribute wfdata holding an ~obspy.core.stream.Stream.

    :type parent: PySide.QtGui.QWidget object, optional
    :param parent: A PySide.QtGui.QWidget object utilized when
    called by a GUI to display a PySide.QtGui.QMessageBox instead of printing
    to standard out.
    :type evtdata: ~obspy.core.event.Event object, optional
    :param evtdata: ~obspy.core.event.Event object containing all derived or
    loaded event information. Container object holding, e.g. phase arrivals, etc.
    """

    def __init__(self, parent=None, evtdata=None):
        self._parent = parent
        if self.getParent():
            self.comp = parent.getComponent()
        else:
            self.comp = 'Z'
        self.wfdata = Stream()
        self._new = False
        if isinstance(evtdata, ObsPyEvent) or isinstance(evtdata, Event):
            pass
        elif isinstance(evtdata, dict):
            evt = readPILOTEvent(**evtdata)
            evtdata = evt
        elif isinstance(evtdata, str):
            try:
                cat = read_events(evtdata)
                if len(cat) != 1:
                    raise ValueError('ambiguous event information for file: '
                                     '{file}'.format(file=evtdata))
                evtdata = cat[0]
            except TypeError as e:
                if 'Unknown format for file' in str(e):
                    if 'PHASES' in evtdata:
                        picks = picksdict_from_pilot(evtdata)
                        evtdata = ObsPyEvent()
                        evtdata.picks = picks_from_picksdict(picks)
                    elif 'LOC' in evtdata:
                        raise NotImplementedError('PILOT location information '
                                                  'read support not yet '
                                                  'implemented.')
                    else:
                        raise e
                else:
                    raise e
        else:  # create an empty Event object
            self.setNew()
            evtdata = ObsPyEvent()
            evtdata.picks = []
        self.evtdata = evtdata
        self.wforiginal = None
        self.cuttimes = None
        self.dirty = False

    def __str__(self):
        return str(self.wfdata)

    def __add__(self, other):
        assert isinstance(other, Data), "operands must be of same type 'Data'"
        rs_id = self.get_evt_data().get('resource_id')
        rs_id_other = other.get_evt_data().get('resource_id')
        if other.isNew() and not self.isNew():
            picks_to_add = other.get_evt_data().picks
            old_picks = self.get_evt_data().picks
            for pick in picks_to_add:
                if pick not in old_picks:
                    old_picks.append(pick)
        elif not other.isNew() and self.isNew():
            new = other + self
            self.evtdata = new.get_evt_data()
        elif self.isNew() and other.isNew():
            pass
        elif rs_id == rs_id_other:
            other.setNew()
            return self + other
        else:
            raise ValueError("both Data objects have differing "
                             "unique Event identifiers")
        return self

    def getPicksStr(self):
        picks_str = ''
        for pick in self.get_evt_data().picks:
            picks_str += str(pick) + '\n'
        return picks_str

    def getParent(self):
        return self._parent

    def isNew(self):
        return self._new

    def setNew(self):
        self._new = True

    def getCutTimes(self):
        if self.cuttimes is None:
            self.updateCutTimes()
        return self.cuttimes

    def updateCutTimes(self):
        self.cuttimes = full_range(self.getWFData())

    def getEventFileName(self):
        ID = self.getID()
        # handle forbidden filenames especially on windows systems
        return fnConstructor(str(ID))

    def checkEvent(self, event, fcheck, forceOverwrite=False):
        if 'origin' in fcheck:
            self.replaceOrigin(event, forceOverwrite)
        if 'magnitude' in fcheck:
            self.replaceMagnitude(event, forceOverwrite)
        if 'auto' in fcheck:
            self.replacePicks(event, 'auto')
        if 'manual' in fcheck:
            self.replacePicks(event, 'manual')

    def replaceOrigin(self, event, forceOverwrite=False):
        if self.get_evt_data().origins or forceOverwrite:
            if event.origins:
                print("Found origin, replace it by new origin.")
            event.origins = self.get_evt_data().origins

    def replaceMagnitude(self, event, forceOverwrite=False):
        if self.get_evt_data().magnitudes or forceOverwrite:
            if event.magnitudes:
                print("Found magnitude, replace it by new magnitude.")
            event.magnitudes = self.get_evt_data().magnitudes

    def replacePicks(self, event, picktype):
        checkflag = 0
        picks = event.picks
        # remove existing picks
        for j, pick in reversed(list(enumerate(picks))):
            if picktype in str(pick.method_id.id):
                picks.pop(j)
                checkflag = 1
        if checkflag:
            print("Found %s pick(s), remove them and append new picks to catalog." % picktype)

        # append new picks
        for pick in self.get_evt_data().picks:
            if picktype in str(pick.method_id.id):
                picks.append(pick)

    def exportEvent(self, fnout, fnext='.xml', fcheck='auto', upperErrors=None):
        """
        :param fnout: basename of file
        :param fnext: file extension
        :param fcheck: check and delete existing information;
        can be a str or a list of strings of ['manual', 'auto', 'origin', 'magnitude']
        """
        from pylot.core.util.defaults import OUTPUTFORMATS

        if not type(fcheck) == list:
            fcheck = [fcheck]

        try:
            evtformat = OUTPUTFORMATS[fnext]
        except KeyError as e:
            errmsg = '{0}; selected file extension {1} not ' \
                     'supported'.format(e, fnext)
            raise FormatError(errmsg)

        # check for already existing xml-file
        if fnext == '.xml':
            if os.path.isfile(fnout + fnext):
                print("xml-file already exists! Check content ...")
                cat = read_events(fnout + fnext)
                if len(cat) > 1:
                    raise IOError('Ambiguous event information in file {}'.format(fnout + fnext))
                if len(cat) < 1:
                    raise IOError('No event information in file {}'.format(fnout + fnext))
                event = cat[0]
                if not event.resource_id == self.get_evt_data().resource_id:
                    raise IOError("Mismatching event resource IDs: {} and {}".format(
                        event.resource_id, self.get_evt_data().resource_id))
                self.checkEvent(event, fcheck)
                self.setEvtData(event)
            self.get_evt_data().write(fnout + fnext, format=evtformat)
        # try exporting event
        else:
            evtdata_org = self.get_evt_data()
            picks = evtdata_org.picks
            eventpath = evtdata_org.path
            picks_copy = copy.deepcopy(picks)
            evtdata_copy = Event(eventpath)
            evtdata_copy.picks = picks_copy

            # check for stations picked automatically as well as manually
            # Prefer manual picks!
            for i in range(len(picks)):
                if picks[i].method_id == 'manual':
                    mstation = picks[i].waveform_id.station_code
                    mstation_ext = mstation + '_'
                    for k in range(len(picks_copy)):
                        if ((picks_copy[k].waveform_id.station_code == mstation) or
                            (picks_copy[k].waveform_id.station_code == mstation_ext)) and \
                                (picks_copy[k].method_id == 'auto'):
                            del picks_copy[k]
                            break
            lendiff = len(picks) - len(picks_copy)
            if lendiff != 0:
                print("Manual as well as automatic picks available. Preferred the {} manual ones!".format(lendiff))

            if upperErrors:
                # check for pick uncertainties exceeding adjusted upper errors
                # Picks with larger uncertainties will not be saved in the output file!
                for j in range(len(picks)):
                    for i in range(len(picks_copy)):
                        if picks_copy[i].phase_hint[0] == 'P':
                            if (picks_copy[i].time_errors['upper_uncertainty'] >= upperErrors[0]) or \
                                    (picks_copy[i].time_errors['uncertainty'] is None):
                                print("Uncertainty exceeds or equals adjusted upper time error!")
                                print("Adjusted uncertainty: {}".format(upperErrors[0]))
                                print("Pick uncertainty: {}".format(picks_copy[i].time_errors['uncertainty']))
                                print("{1} P-Pick of station {0} will not be saved in output file".format(
                                    picks_copy[i].waveform_id.station_code,
                                    picks_copy[i].method_id))
                                print("#")
                                del picks_copy[i]
                                break
                        if picks_copy[i].phase_hint[0] == 'S':
                            if (picks_copy[i].time_errors['upper_uncertainty'] >= upperErrors[1]) or \
                                    (picks_copy[i].time_errors['uncertainty'] is None):
                                print("Uncertainty exceeds or equals adjusted upper time error!")
                                print("Adjusted uncertainty: {}".format(upperErrors[1]))
                                print("Pick uncertainty: {}".format(picks_copy[i].time_errors['uncertainty']))
                                print("{1} S-Pick of station {0} will not be saved in output file".format(
                                    picks_copy[i].waveform_id.station_code,
                                    picks_copy[i].method_id))
                                print("#")
                                del picks_copy[i]
                                break

            if fnext == '.obs':
                try:
                    evtdata_copy.write(fnout + fnext, format=evtformat)
                    # write header afterwards
                    evid = str(evtdata_org.resource_id).split('/')[1]
                    header = '# EQEVENT: Label: EQ%s Loc: X 0.00 Y 0.00 Z 10.00 OT 0.00 \n' % evid
                    nllocfile = open(fnout + fnext)
                    l = nllocfile.readlines()
                    nllocfile.close()
                    l.insert(0, header)
                    nllocfile = open(fnout + fnext, 'w')
                    nllocfile.write("".join(l))
                    nllocfile.close()
                except KeyError as e:
                    raise KeyError('''{0} export format
                    not implemented: {1}'''.format(evtformat, e))
            if fnext == '.cnv':
                try:
                    velest.export(picks_copy, fnout + fnext, eventinfo=self.get_evt_data())
                except KeyError as e:
                    raise KeyError('''{0} export format
                    not implemented: {1}'''.format(evtformat, e))

    def getComp(self):
        return self.comp

    def getID(self):
        try:
            return self.evtdata.get('resource_id').id
        except AttributeError:
            return None

    def filterWFData(self, kwargs):
        self.getWFData().filter(**kwargs)
        self.dirty = True

    def setWFData(self, fnames):
        self.wfdata = Stream()
        self.wforiginal = None
        if fnames is not None:
            self.appendWFData(fnames)
        else:
            return False
        self.wforiginal = self.getWFData().copy()
        self.dirty = False
        return True

    def appendWFData(self, fnames):
        assert isinstance(fnames, list), "input parameter 'fnames' is " \
                                         "supposed to be of type 'list' " \
                                         "but is actually " \
                                         "{0}".format(type(fnames))
        if self.dirty:
            self.resetWFData()

        warnmsg = ''
        for fname in fnames:
            try:
                self.wfdata += read(fname)
            except TypeError:
                try:
                    self.wfdata += read(fname, format='GSE2')
                except Exception as e:
                    warnmsg += '{0}\n{1}\n'.format(fname, e)
            except SacIOError as se:
                warnmsg += '{0}\n{1}\n'.format(fname, se)
        if warnmsg:
            warnmsg = 'WARNING: unable to read\n' + warnmsg
            print(warnmsg)

    def getWFData(self):
        return self.wfdata

    def getOriginalWFData(self):
        return self.wforiginal

    def resetWFData(self):
        self.wfdata = self.getOriginalWFData().copy()
        self.dirty = False

    def resetPicks(self):
        self.get_evt_data().picks = []

    def get_evt_data(self):
        return self.evtdata

    def setEvtData(self, event):
        self.evtdata = event

    def applyEVTData(self, data, typ='pick', authority_id='rub'):
        """
        :param data:
        :param typ:
        :param authority_id:
        :raise OverwriteError:
        """

        def applyPicks(picks):
            """
            Creates ObsPy pick objects from the PyLoT picks dictionary
            and appends them to the event's picks list.
            :param picks:
            :raise OverwriteError: raises an OverwriteError if the picks list is
            not empty. The GUI will then ask for a decision.
            """
            # firstonset = find_firstonset(picks)
            # check for automatic picks
            print("Writing phases to ObsPy-quakeml file")
            for key in picks:
                if picks[key]['P']['picker'] == 'auto':
                    print("Existing picks will be overwritten!")
                    picks = picks_from_picksdict(picks)
                    break
                else:
                    if self.get_evt_data().picks:
                        raise OverwriteError('Existing picks would be overwritten!')
                    else:
                        picks = picks_from_picksdict(picks)
                        break
            self.get_evt_data().picks = picks
            # if 'smi:local' in self.getID() and firstonset:
            #     fonset_str = firstonset.strftime('%Y_%m_%d_%H_%M_%S')
            #     ID = ResourceIdentifier('event/' + fonset_str)
            #     ID.convertIDToQuakeMLURI(authority_id=authority_id)
            #     self.get_evt_data().resource_id = ID

        def applyEvent(event):
            """
            takes an `obspy.core.event.Event` object and applies all new
            information on the event to the actual data
            :param event:
            """
            if self.isNew():
                self.setEvtData(event)
            else:
                # prevent overwriting original pick information
                event_old = self.get_evt_data()
                if not event_old.resource_id == event.resource_id:
                    print("WARNING: Mismatch in event resource IDs: {} and {}".format(
                        event_old.resource_id,
                        event.resource_id))
                else:
                    picks = copy.deepcopy(event_old.picks)
                    event = merge_picks(event, picks)
                # apply event information from location
                event_old.update(event)

        applydata = {'pick': applyPicks,
                     'event': applyEvent}

        applydata[typ](data)
        self._new = False
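A hypothetical round trip through the `Data` container, assuming a QuakeML file `event.xml` and a waveform file `trace.mseed` exist in the working directory:

    from pylot.core.io.data import Data

    data = Data(evtdata='event.xml')   # parse the event metadata
    data.setWFData(['trace.mseed'])    # read waveforms into an obspy Stream
    print(data.getID())                # resource identifier of the event
    data.exportEvent('event_out')      # write QuakeML ('.xml' is the default)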
class GenericDataStructure(object):
    """
    GenericDataBase type holds all information about the current
    database being worked on.
    """

    def __init__(self, **kwargs):

        self.allowedFields = []
        self.expandFields = ['root']
        self.dsFields = {}

        self.modifyFields(**kwargs)

    def modifyFields(self, **kwargs):
        assert isinstance(kwargs, dict), 'dictionary type object expected'

        if not self.extraAllowed():
            kwargs = self.updateNotAllowed(kwargs)

        for key, value in kwargs.items():
            key = str(key).lower()
            if value is not None:
                if type(value) not in (str, int, float):
                    for n, val in enumerate(value):
                        value[n] = str(val)
                else:
                    value = str(value)
            try:
                self.setFieldValue(key, value)
            except KeyError as e:
                errmsg = ''
                errmsg += 'WARNING:\n'
                errmsg += 'unable to set values for datastructure fields\n'
                errmsg += '%s; desired value was: %s\n' % (e, value)
                print(errmsg)

    def isField(self, key):
        return key in self.getFields().keys()

    def getFieldValue(self, key):
        if self.isField(key):
            return self.getFields()[key]
        else:
            return

    def setFieldValue(self, key, value):
        if not self.extraAllowed() and key not in self.getAllowed():
            raise KeyError
        else:
            if not self.isField(key):
                print('creating new field "%s"' % key)
            self.getFields()[key] = value

    def getFields(self):
        return self.dsFields

    def getExpandFields(self):
        return self.expandFields

    def setExpandFields(self, keys):
        expandFields = []
        for key in keys:
            if self.isField(key):
                expandFields.append(key)
        self.expandFields = expandFields

    def getAllowed(self):
        return self.allowedFields

    def extraAllowed(self):
        return not self.allowedFields

    def updateNotAllowed(self, kwargs):
        for key in kwargs:
            if key not in self.getAllowed():
                kwargs.__delitem__(key)
        return kwargs

    def hasSuffix(self):
        try:
            self.getFieldValue('suffix')
        except KeyError:
            return False
        else:
            if self.getFieldValue('suffix'):
                return True
        return False

    def expandDataPath(self):
        expandList = []
        for item in self.getExpandFields():
            expandList.append(self.getFieldValue(item))
        if self.hasSuffix():
            expandList.append('*%s' % self.getFieldValue('suffix'))
        return os.path.join(*expandList)

    def getCatalogName(self):
        return os.path.join(self.getFieldValue('root'), 'catalog.qml')


class PilotDataStructure(GenericDataStructure):
    """
    Object containing the data access information for the old PILOT data
    structure.
    """

    def __init__(self, **fields):
        if not fields:
            fields = {'database': '2006.01',
                      'root': '/data/Egelados/EVENT_DATA/LOCAL'}

        GenericDataStructure.__init__(self, **fields)

        self.setExpandFields(['root', 'database'])


class SeiscompDataStructure(GenericDataStructure):
    """
    Dictionary containing the data access information for an SDS data archive:

    :param str dataType: Desired data type. Default: ``'waveform'``
    :param sdate, edate: Either date string or an instance of
         :class:`obspy.core.utcdatetime.UTCDateTime`. Default: ``None``
    :type sdate, edate: str or UTCDateTime or None
    """

    def __init__(self, rootpath='/data/SDS', dataformat='MSEED',
                 filesuffix=None, **kwargs):
        super(SeiscompDataStructure, self).__init__()

        edate = UTCDateTime()
        halfyear = UTCDateTime('1970-07-01')
        sdate = UTCDateTime(edate - halfyear)
        del halfyear

        year = ''
        if not edate.year == sdate.year:
            nyears = edate.year - sdate.year
            for yr in range(nyears):
                year += '{0:04d},'.format(sdate.year + yr)
            year = '{' + year[:-1] + '}'
        else:
            year = '{0:04d}'.format(sdate.year)

        # SDS fields' default values
        # definitions from
        # http://www.seiscomp3.org/wiki/doc/applications/slarchive/SDS

        self.dsFields = {'root': '/data/SDS', 'YEAR': year, 'NET': '??',
                         'STA': '????', 'CHAN': 'HH?', 'TYPE': 'D', 'LOC': '',
                         'DAY': '{0:03d}'.format(sdate.julday)
                         }
        self.modifyFields(**kwargs)

    def modifyFields(self, **kwargs):
        if kwargs and isinstance(kwargs, dict):
            for key, value in kwargs.items():
                key = str(key)
                if type(value) not in (str, int, float):
                    for n, val in enumerate(value):
                        value[n] = str(val)
                else:
                    value = str(value)
                try:
                    self.setFieldValue(key, value)
                except KeyError as e:
                    errmsg = ''
                    errmsg += 'WARNING:\n'
                    errmsg += 'unable to set values for SDS fields\n'
                    errmsg += '%s; desired value was: %s\n' % (e, value)
                    print(errmsg)

    def setFieldValue(self, key, value):
        if self.isField(key):
            self.getFields()[key] = value
        else:
            print('Warning: trying to set value of non-existent field '
                  '{field}'.format(field=key))

    def expandDataPath(self):
        # 'root' and 'TYPE' are the field names defined in dsFields above
        fullChan = '{0}.{1}'.format(self.getFields()['CHAN'],
                                    self.getFields()['TYPE'])
        dataPath = os.path.join(self.getFields()['root'],
                                self.getFields()['YEAR'],
                                self.getFields()['NET'],
                                self.getFields()['STA'],
                                fullChan,
                                '*{0}'.format(self.getFields()['DAY']))
        return dataPath
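For the default fields this assembles an SDS-style glob pattern; a sketch of the path `expandDataPath()` produces, with a hypothetical year and Julian day filled in:

    import os

    fields = {'root': '/data/SDS', 'YEAR': '2016', 'NET': '??',
              'STA': '????', 'CHAN': 'HH?', 'TYPE': 'D', 'DAY': '120'}
    full_chan = '{0}.{1}'.format(fields['CHAN'], fields['TYPE'])
    print(os.path.join(fields['root'], fields['YEAR'], fields['NET'],
                       fields['STA'], full_chan, '*' + fields['DAY']))
    # -> /data/SDS/2016/??/????/HH?.D/*120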
491
pylot/core/io/default_parameters.py
Normal file
@@ -0,0 +1,491 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

defaults = {
    'rootpath': {'type': str, 'tooltip': 'project path',
                 'value': '', 'namestring': 'Root path'},

    'datapath': {'type': str, 'tooltip': 'data path',
                 'value': '', 'namestring': 'Data path'},

    'database': {'type': str, 'tooltip': 'name of database',
                 'value': '', 'namestring': 'Database path'},

    'eventID': {'type': str,
                'tooltip': 'event ID for single event processing (* for all events found in database)',
                'value': '', 'namestring': 'Event ID'},

    'extent': {'type': str, 'tooltip': 'extent of array ("local", "regional" or "global")',
               'value': 'local', 'namestring': 'Array extent'},

    'invdir': {'type': str, 'tooltip': 'full path to inventory or dataless-seed file',
               'value': '', 'namestring': 'Inventory dir'},

    'datastructure': {'type': str, 'tooltip': 'choose data structure',
                      'value': 'PILOT', 'namestring': 'Datastructure'},

    'apverbose': {'type': bool, 'tooltip': "choose 'True' or 'False' for terminal output",
                  'value': True, 'namestring': 'App. verbosity'},

    'nllocbin': {'type': str, 'tooltip': 'path to NLLoc executable',
                 'value': '', 'namestring': 'NLLoc bin path'},

    'nllocroot': {'type': str, 'tooltip': 'root of NLLoc-processing directory',
                  'value': '', 'namestring': 'NLLoc root path'},

    'phasefile': {'type': str, 'tooltip': 'name of autoPyLoT-output phase file for NLLoc',
                  'value': 'AUTOPHASES.obs', 'namestring': 'Phase filename'},

    'ctrfile': {'type': str, 'tooltip': 'name of autoPyLoT-output control file for NLLoc',
                'value': 'Insheim_min1d2015_auto.in', 'namestring': 'Control filename'},

    'ttpatter': {'type': str, 'tooltip': 'pattern of NLLoc ttimes from grid',
                 'value': 'ttime', 'namestring': 'Traveltime pattern'},

    'outpatter': {'type': str, 'tooltip': 'pattern of NLLoc-output file',
                  'value': 'AUTOLOC_nlloc', 'namestring': 'NLLoc output pattern'},

    'vp': {'type': float, 'tooltip': 'average P-wave velocity',
           'value': 3530., 'namestring': 'P-velocity'},

    'rho': {'type': float, 'tooltip': 'average rock density [kg/m^3]',
            'value': 2500., 'namestring': 'Density'},

    'Qp': {'type': (float, float), 'tooltip': 'quality factor for P waves (Qp*f^a); list(Qp, a)',
           'value': (300., 0.8), 'namestring': ('Quality factor', 'Qp1', 'Qp2')},

    'pstart': {'type': float,
               'tooltip': 'start time [s] for calculating CF for P-picking (if TauPy: seconds relative to estimated onset)',
               'value': 15.0, 'namestring': 'P start'},

    'pstop': {'type': float,
              'tooltip': 'end time [s] for calculating CF for P-picking (if TauPy: seconds relative to estimated onset)',
              'value': 60.0, 'namestring': 'P stop'},

    'sstart': {'type': float,
               'tooltip': 'start time [s] relative to P-onset for calculating CF for S-picking',
               'value': -1.0, 'namestring': 'S start'},

    'sstop': {'type': float, 'tooltip': 'end time [s] after P-onset for calculating CF for S-picking',
              'value': 10.0, 'namestring': 'S stop'},

    'bpz1': {'type': (float, float),
             'tooltip': 'lower/upper corner freq. of first band pass filter Z-comp. [Hz]',
             'value': (2, 20), 'namestring': ('Z-bandpass 1', 'Lower', 'Upper')},

    'bpz2': {'type': (float, float),
             'tooltip': 'lower/upper corner freq. of second band pass filter Z-comp. [Hz]',
             'value': (2, 30), 'namestring': ('Z-bandpass 2', 'Lower', 'Upper')},

    'bph1': {'type': (float, float),
             'tooltip': 'lower/upper corner freq. of first band pass filter H-comp. [Hz]',
             'value': (2, 15), 'namestring': ('H-bandpass 1', 'Lower', 'Upper')},

    'bph2': {'type': (float, float),
             'tooltip': 'lower/upper corner freq. of second band pass filter H-comp. [Hz]',
             'value': (2, 20), 'namestring': ('H-bandpass 2', 'Lower', 'Upper')},

    'algoP': {'type': str, 'tooltip': 'choose algorithm for P-onset determination (HOS, ARZ, or AR3)',
              'value': 'HOS', 'namestring': 'P algorithm'},

    'tlta': {'type': float, 'tooltip': 'for HOS-/AR-AIC-picker, length of LTA window [s]',
             'value': 7.0, 'namestring': 'LTA window'},

    'hosorder': {'type': int, 'tooltip': 'for HOS-picker, order of Higher Order Statistics',
                 'value': 4, 'namestring': 'HOS order'},

    'Parorder': {'type': int, 'tooltip': 'for AR-picker, order of AR process of Z-component',
                 'value': 2, 'namestring': 'AR order P'},

    'tdet1z': {'type': float,
               'tooltip': 'for AR-picker, length of AR determination window [s] for Z-component, 1st pick',
               'value': 1.2, 'namestring': 'AR det. window Z 1'},

    'tpred1z': {'type': float,
                'tooltip': 'for AR-picker, length of AR prediction window [s] for Z-component, 1st pick',
                'value': 0.4, 'namestring': 'AR pred. window Z 1'},

    'tdet2z': {'type': float,
               'tooltip': 'for AR-picker, length of AR determination window [s] for Z-component, 2nd pick',
               'value': 0.6, 'namestring': 'AR det. window Z 2'},

    'tpred2z': {'type': float,
                'tooltip': 'for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick',
                'value': 0.2, 'namestring': 'AR pred. window Z 2'},

    'addnoise': {'type': float, 'tooltip': 'add noise to seismogram for stable AR prediction',
                 'value': 0.001, 'namestring': 'Add noise'},

    'tsnrz': {'type': (float, float, float, float),
              'tooltip': 'for HOS/AR, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]',
              'value': (3, 0.1, 0.5, 1.0),
              'namestring': ('SNR windows P', 'Noise', 'Safety', 'Signal', 'Slope')},

    'pickwinP': {'type': float, 'tooltip': 'for initial AIC pick, length of P-pick window [s]',
                 'value': 3.0, 'namestring': 'AIC window P'},

    'Precalcwin': {'type': float,
                   'tooltip': 'for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick)',
                   'value': 6.0, 'namestring': 'Recal. window P'},

    'aictsmooth': {'type': float,
                   'tooltip': 'for HOS/AR, take average of samples for smoothing of AIC-function [s]',
                   'value': 0.2, 'namestring': 'AIC smooth P'},

    'tsmoothP': {'type': float, 'tooltip': 'for HOS/AR, take average of samples for smoothing CF [s]',
                 'value': 0.1, 'namestring': 'CF smooth P'},

    'ausP': {'type': float, 'tooltip': 'for HOS/AR, artificial uplift of samples (aus) of CF (P)',
             'value': 0.001, 'namestring': 'Artificial uplift P'},

    'nfacP': {'type': float, 'tooltip': 'for HOS/AR, noise factor for noise level determination (P)',
              'value': 1.3, 'namestring': 'Noise factor P'},

    'algoS': {'type': str, 'tooltip': 'choose algorithm for S-onset determination (ARH or AR3)',
              'value': 'ARH', 'namestring': 'S algorithm'},

    'tdet1h': {'type': float,
               'tooltip': 'for HOS/AR, length of AR-determination window [s], H-components, 1st pick',
               'value': 0.8, 'namestring': 'AR det. window H 1'},

    'tpred1h': {'type': float,
                'tooltip': 'for HOS/AR, length of AR-prediction window [s], H-components, 1st pick',
                'value': 0.4, 'namestring': 'AR pred. window H 1'},

    'tdet2h': {'type': float,
               'tooltip': 'for HOS/AR, length of AR-determination window [s], H-components, 2nd pick',
               'value': 0.6, 'namestring': 'AR det. window H 2'},

    'tpred2h': {'type': float,
                'tooltip': 'for HOS/AR, length of AR-prediction window [s], H-components, 2nd pick',
                'value': 0.3, 'namestring': 'AR pred. window H 2'},

    'Sarorder': {'type': int, 'tooltip': 'for AR-picker, order of AR process of H-components',
                 'value': 4, 'namestring': 'AR order S'},

    'Srecalcwin': {'type': float,
                   'tooltip': 'for AR-picker, window length [s] for recalculation of CF (2nd pick) (H)',
                   'value': 5.0, 'namestring': 'Recal. window S'},

    'pickwinS': {'type': float, 'tooltip': 'for initial AIC pick, length of S-pick window [s]',
                 'value': 3.0, 'namestring': 'AIC window S'},

    'tsnrh': {'type': (float, float, float, float),
              'tooltip': 'for ARH/AR3, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]',
              'value': (2, 0.2, 1.5, 0.5),
              'namestring': ('SNR windows S', 'Noise', 'Safety', 'Signal', 'Slope')},

    'aictsmoothS': {'type': float,
                    'tooltip': 'for AIC-picker, take average of samples for smoothing of AIC-function [s]',
                    'value': 0.5, 'namestring': 'AIC smooth S'},

    'tsmoothS': {'type': float,
                 'tooltip': 'for AR-picker, take average of samples for smoothing CF [s] (S)',
                 'value': 0.7, 'namestring': 'CF smooth S'},

    'ausS': {'type': float, 'tooltip': 'for HOS/AR, artificial uplift of samples (aus) of CF (S)',
             'value': 0.9, 'namestring': 'Artificial uplift S'},

    'nfacS': {'type': float, 'tooltip': 'for AR-picker, noise factor for noise level determination (S)',
              'value': 1.5, 'namestring': 'Noise factor S'},

    'minfmweight': {'type': int, 'tooltip': 'minimum required P weight for first-motion determination',
                    'value': 1, 'namestring': 'Min. P weight'},

    'minFMSNR': {'type': float, 'tooltip': 'minimum required SNR for first-motion determination',
                 'value': 2., 'namestring': 'Min SNR'},

    'fmpickwin': {'type': float, 'tooltip': 'pick window around P onset for calculating zero crossings',
                  'value': 0.2, 'namestring': 'Zero crossings window'},

    'timeerrorsP': {'type': (float, float, float, float),
                    'tooltip': 'discrete time errors [s] corresponding to picking weights [0 1 2 3] for P',
                    'value': (0.01, 0.02, 0.04, 0.08),
                    'namestring': ('Time errors P', '0', '1', '2', '3')},

    'timeerrorsS': {'type': (float, float, float, float),
                    'tooltip': 'discrete time errors [s] corresponding to picking weights [0 1 2 3] for S',
                    'value': (0.04, 0.08, 0.16, 0.32),
                    'namestring': ('Time errors S', '0', '1', '2', '3')},

    'minAICPslope': {'type': float,
                     'tooltip': 'below this slope [counts/s] the initial P pick is rejected',
                     'value': 0.8, 'namestring': 'Min. slope P'},

    'minAICPSNR': {'type': float, 'tooltip': 'below this SNR the initial P pick is rejected',
                   'value': 1.1, 'namestring': 'Min. SNR P'},

    'minAICSslope': {'type': float,
                     'tooltip': 'below this slope [counts/s] the initial S pick is rejected',
                     'value': 1., 'namestring': 'Min. slope S'},

    'minAICSSNR': {'type': float, 'tooltip': 'below this SNR the initial S pick is rejected',
                   'value': 1.5, 'namestring': 'Min. SNR S'},

    'minsiglength': {'type': float,
                     'tooltip': 'length of signal part for which amplitudes must exceed noise level [s]',
                     'value': 1., 'namestring': 'Min. signal length'},

    'noisefactor': {'type': float, 'tooltip': 'noiselevel*noisefactor=threshold',
                    'value': 1.0, 'namestring': 'Noise factor'},

    'minpercent': {'type': float, 'tooltip': 'required percentage of amplitudes exceeding threshold',
                   'value': 10., 'namestring': 'Min amplitude [%]'},

    'zfac': {'type': float, 'tooltip': 'P-amplitude must exceed at least zfac times RMS-S amplitude',
             'value': 1.5, 'namestring': 'Z factor'},

    'mdttolerance': {'type': float, 'tooltip': 'maximum allowed deviation of P picks from median [s]',
                     'value': 6.0, 'namestring': 'Median tolerance'},

    'wdttolerance': {'type': float, 'tooltip': 'maximum allowed deviation from Wadati-diagram',
                     'value': 1.0, 'namestring': 'Wadati tolerance'},

    'jackfactor': {'type': float,
                   'tooltip': 'pick is removed if the variance of the subgroup with the pick removed is larger than the mean variance of all subgroups times this safety factor',
                   'value': 5.0, 'namestring': 'Jackknife safety factor'},

    'WAscaling': {'type': (float, float, float),
                  'tooltip': 'scaling relation (log(Ao)+Alog(r)+Br+C) of Wood-Anderson amplitude Ao [nm]; if zeros are set, the original Richter magnitude is calculated!',
                  'value': (0., 0., 0.),
                  'namestring': ('Wood-Anderson scaling', '', '', '')},

    'magscaling': {'type': (float, float),
                   'tooltip': 'scaling relation for derived local magnitude [a*Ml+b]; if zeros are set, no scaling of network magnitude is applied!',
                   'value': (0., 0.), 'namestring': ('Local mag. scaling', '', '')},

    'minfreq': {'type': (float, float), 'tooltip': 'lower filter frequency [P, S]',
                'value': (1.0, 1.0), 'namestring': ('Lower freq.', 'P', 'S')},

    'maxfreq': {'type': (float, float), 'tooltip': 'upper filter frequency [P, S]',
                'value': (10.0, 10.0), 'namestring': ('Upper freq.', 'P', 'S')},

    'filter_order': {'type': (int, int), 'tooltip': 'filter order [P, S]',
                     'value': (2, 2), 'namestring': ('Order', 'P', 'S')},

    'filter_type': {'type': (str, str),
                    'tooltip': 'filter type (bandpass, bandstop, lowpass, highpass) [P, S]',
                    'value': ('bandpass', 'bandpass'), 'namestring': ('Type', 'P', 'S')},

    'use_taup': {'type': bool,
                 'tooltip': 'use estimated traveltimes from TauPy for calculating windows for CF',
                 'value': True, 'namestring': 'Use TauPy'},

    'taup_model': {'type': str,
                   'tooltip': 'define TauPy model for traveltime estimation; possible values: 1066a, 1066b, ak135, ak135f, herrin, iasp91, jb, prem, pwdk, sp6',
                   'value': 'iasp91', 'namestring': 'TauPy model'}
}

settings_main = {
    'dirs': ['rootpath', 'datapath', 'database', 'eventID', 'invdir',
             'datastructure', 'apverbose'],
    'nlloc': ['nllocbin', 'nllocroot', 'phasefile', 'ctrfile', 'ttpatter',
              'outpatter'],
    'smoment': ['vp', 'rho', 'Qp'],
    'localmag': ['WAscaling', 'magscaling'],
    'filter': ['minfreq', 'maxfreq', 'filter_order', 'filter_type'],
    'pick': ['extent', 'pstart', 'pstop', 'sstart', 'sstop', 'use_taup',
             'taup_model', 'bpz1', 'bpz2', 'bph1', 'bph2']
}

settings_special_pick = {
    'z': ['algoP', 'tlta', 'hosorder', 'Parorder', 'tdet1z', 'tpred1z',
          'tdet2z', 'tpred2z', 'addnoise', 'tsnrz', 'pickwinP', 'Precalcwin',
          'aictsmooth', 'tsmoothP', 'ausP', 'nfacP'],
    'h': ['algoS', 'tdet1h', 'tpred1h', 'tdet2h', 'tpred2h', 'Sarorder',
          'Srecalcwin', 'pickwinS', 'tsnrh', 'aictsmoothS', 'tsmoothS',
          'ausS', 'nfacS'],
    'fm': ['minfmweight', 'minFMSNR', 'fmpickwin'],
    'quality': ['timeerrorsP', 'timeerrorsS', 'minAICPslope', 'minAICPSNR',
                'minAICSslope', 'minAICSSNR', 'minsiglength', 'noisefactor',
                'minpercent', 'zfac', 'mdttolerance', 'wdttolerance',
                'jackfactor']
}
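Every entry in defaults carries the expected type, a GUI tooltip, the default value and a display name, while settings_main and settings_special_pick only group parameter names into GUI sections. A minimal sketch of how the two structures fit together (essentially what PylotParameter.reset_defaults in inputs.py below does):

    # flatten the defaults into a plain name -> value mapping
    parameters = {name: entry['value'] for name, entry in defaults.items()}
    assert parameters['algoP'] == 'HOS'

    # sanity check: every name listed in a GUI section has a default entry
    for section in list(settings_main.values()) + list(settings_special_pick.values()):
        for name in section:
            assert name in defaults, name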
371
pylot/core/io/inputs.py
Normal file
@@ -0,0 +1,371 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io import default_parameters
from pylot.core.util.errors import ParameterError


class PylotParameter(object):
    '''
    PylotParameter is a parameter-type object capable of reading and/or
    writing parameter ASCII files.

    :param fnin: filename of the input file
    :type fnin: str

    Parameters are given, for example, as follows:
    ========== ========== =======================================
    Name       Value      Comment
    ========== ========== =======================================
    phl        S          # phaselabel
    ff1        0.1        # freqmin
    ff2        0.5        # freqmax
    tdet       6.875      # det-window_(s)_for_ar
    tpred      2.5        # pred-window_(s)_for_ar
    order      4          # order_of_ar
    fnoise     0          # noise_level_for_ar
    suppp      7          # envelopecoeff
    tolt       300        # (s)time around arrival time
    f1tpwt     4          # propfact_minfreq_secondtaper
    pickwindow 9          # length_of_pick_window
    w1         1          # length_of_smoothing_window
    w2         0.37       # cf(i-1)*(1+peps)_for_local_min
    w3         0.25       # cf(i-1)*(1+peps)_for_local_min
    tslope     0.8;2      # slope_det_window_loc_glob
    aerr       30;60      # adjusted_error_slope_fitting_loc_glob
    tsn        20;5;20;10 # length_signal_window_S/N
    proPh      Sn         # nextprominentphase
    ========== ========== =======================================
    '''

    def __init__(self, fnin=None, fnout=None, verbosity=0, **kwargs):
        '''
        Initialize the parameter object: read the content of an ASCII file
        and form a type-consistent dictionary containing all parameters.
        '''
        self.__init_default_paras()
        self.__init_subsettings()
        self.__filename = fnin
        self._verbosity = verbosity
        self._parFileCont = {}
        # make sure the parameter dictionary exists even if no input file
        # is given (from_file returns early in that case)
        self.__parameter = {}
        # alternatively, initialize from parsed keyword arguments
        for key, val in kwargs.items():
            self._parFileCont[key] = val
        self.from_file()
        if fnout:
            self.export2File(fnout)

    # human-readable string representation of the object
    def __str__(self):
        string = ''
        string += 'Automated picking parameter:\n\n'
        if self.__parameter:
            for key, value in self.iteritems():
                string += '%s:\t\t%s\n' % (key, value)
        else:
            string += 'Empty parameter dictionary.'
        return string

    # set default values of parameter names
    def __init_default_paras(self):
        self.__defaults = default_parameters.defaults

    def __init_subsettings(self):
        self._settings_main = default_parameters.settings_main
        self._settings_special_pick = default_parameters.settings_special_pick

    # string representation of the object
    def __repr__(self):
        return "PylotParameter('%s')" % self.__filename

    # boolean test (Python 2); aliased to __bool__ for Python 3
    def __nonzero__(self):
        return bool(self.__parameter)

    __bool__ = __nonzero__

    def __getitem__(self, key):
        try:
            return self.__parameter[key]
        except KeyError:
            return None

    def __setitem__(self, key, value):
        self.__parameter[key] = value

    def __delitem__(self, key):
        del self.__parameter[key]

    def __iter__(self):
        return iter(self.__parameter)

    def __len__(self):
        return len(self.__parameter.keys())

    def iteritems(self):
        for key, value in self.__parameter.items():
            yield key, value

    def hasParam(self, parameter):
        return parameter in self.__parameter.keys()

    def get(self, *args):
        try:
            for param in args:
                try:
                    return self.__getitem__(param)
                except KeyError as e:
                    self._printParameterError(e)
                    raise ParameterError(e)
        except TypeError:
            try:
                return self.__getitem__(args)
            except KeyError as e:
                self._printParameterError(e)
                raise ParameterError(e)

    def get_defaults(self):
        return self.__defaults

    def get_main_para_names(self):
        return self._settings_main

    def get_special_para_names(self):
        return self._settings_special_pick

    def get_all_para_names(self):
        all_names = []
        all_names += self.get_main_para_names()['dirs']
        all_names += self.get_main_para_names()['nlloc']
        all_names += self.get_main_para_names()['smoment']
        all_names += self.get_main_para_names()['localmag']
        all_names += self.get_main_para_names()['pick']
        all_names += self.get_main_para_names()['filter']
        all_names += self.get_special_para_names()['z']
        all_names += self.get_special_para_names()['h']
        all_names += self.get_special_para_names()['fm']
        all_names += self.get_special_para_names()['quality']
        return all_names

    def checkValue(self, param, value):
        is_type = type(value)
        expect_type = self.get_defaults()[param]['type']
        if not is_type == expect_type and not is_type == tuple:
            message = 'Type check failed for param: {}, is type: {}, expected type: {}'
            message = message.format(param, is_type, expect_type)
            print(Warning(message))

    def setParamKV(self, param, value):
        self.__setitem__(param, value)

    def setParam(self, **kwargs):
        for key in kwargs:
            self.__setitem__(key, kwargs[key])

    @staticmethod
    def _printParameterError(errmsg):
        print('ParameterError:\n non-existent parameter %s' % errmsg)

    def reset_defaults(self):
        defaults = self.get_defaults()
        for param in defaults:
            self.setParamKV(param, defaults[param]['value'])

    def from_file(self, fnin=None):
        if not fnin:
            if self.__filename is not None:
                fnin = self.__filename
            else:
                return

        if isinstance(fnin, (list, tuple)):
            fnin = fnin[0]

        with open(fnin, 'r') as inputFile:
            lines = inputFile.readlines()
        try:
            # new-style parameter file: tab-separated key/value pairs
            for line in lines:
                parspl = line.split('\t')[:2]
                self._parFileCont[parspl[0].strip()] = parspl[1]
        except IndexError as e:
            if self._verbosity > 0:
                self._printParameterError(e)
            # fall back to old-style files: 'value # key' separated by '#'
            for line in lines:
                if not line.startswith(('#', '%', '\n', ' ')):
                    parspl = line.split('#')[:2]
                    self._parFileCont[parspl[1].strip()] = parspl[0].strip()
        for key, value in self._parFileCont.items():
            try:
                val = int(value)
            except (ValueError, TypeError):
                try:
                    val = float(value)
                except (ValueError, TypeError):
                    if len(value.split(' ')) > 1:
                        vallist = value.strip().split(' ')
                        val = []
                        for val0 in vallist:
                            try:
                                val0 = float(val0)
                            except ValueError:
                                pass
                            val.append(val0)
                    else:
                        val = str(value.strip())
            self._parFileCont[key] = val
        self.__parameter = self._parFileCont

    def export2File(self, fnout):
        fid_out = open(fnout, 'w')
        header = ('%This is a parameter input file for PyLoT/autoPyLoT.\n' +
                  '%All main and special settings regarding data handling\n' +
                  '%and picking are to be set here!\n' +
                  '%Parameters are optimized for %{} data sets!\n'.format(self.get_main_para_names()['pick'][0]))
        separator = '%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%\n'

        fid_out.write(header)
        self.write_section(fid_out, self.get_main_para_names()['dirs'],
                           'main settings', separator)
        self.write_section(fid_out, self.get_main_para_names()['nlloc'],
                           'NLLoc settings', separator)
        self.write_section(fid_out, self.get_main_para_names()['smoment'],
                           'parameters for seismic moment estimation', separator)
        self.write_section(fid_out, self.get_main_para_names()['localmag'],
                           'settings local magnitude', separator)
        self.write_section(fid_out, self.get_main_para_names()['filter'],
                           'filter settings', separator)
        self.write_section(fid_out, self.get_main_para_names()['pick'],
                           'common settings picker', separator)
        fid_out.write(('#special settings for calculating CF#\n' +
                       '%!!Edit the following only if you know what you are doing!!%\n'))
        self.write_section(fid_out, self.get_special_para_names()['z'],
                           'Z-component', None)
        self.write_section(fid_out, self.get_special_para_names()['h'],
                           'H-components', None)
        self.write_section(fid_out, self.get_special_para_names()['fm'],
                           'first-motion picker', None)
        self.write_section(fid_out, self.get_special_para_names()['quality'],
                           'quality assessment', None)
        # make sure the output file is flushed and closed
        fid_out.close()

    def write_section(self, fid, names, title, separator):
        if separator:
            fid.write(separator)
        fid.write('#{}#\n'.format(title))
        l_val = 50
        l_name = 15
        l_ttip = 100
        for name in names:
            value = self[name]
            if isinstance(value, (list, tuple)):
                value_tmp = ''
                for vl in value:
                    value_tmp += '{} '.format(vl)
                value = value_tmp
            tooltip = self.get_defaults()[name]['tooltip']
            if not len(str(value)) > l_val:
                value = '{:<{}} '.format(str(value), l_val)
            else:
                value = '{} '.format(str(value))
            name += '#'
            if not len(name) > l_name:
                name = '#{:<{}} '.format(name, l_name)
            else:
                name = '#{} '.format(name)
            if not len(tooltip) > l_ttip:
                ttip = '%{:<{}}\n'.format(tooltip, l_ttip)
            else:
                ttip = '%{}\n'.format(tooltip)
            line = value + name + ttip
            fid.write(line)


class FilterOptions(object):
    '''
    FilterOptions is a parameter object type providing Butterworth filter
    option parameters for PyLoT. Its easy-to-access properties help to
    manage file-based as well as GUI-based parameter manipulation.

    :type filtertype: str, optional
    :param filtertype: String containing the desired filter type. For
        information about the supported filter types see the
        _`Supported Filter` section.

    :type freq: list, optional
    :param freq: list of float(s) describing the cutoff limits of the filter

    :type order: int, optional
    :param order: Integer value describing the order of the desired
        Butterworth filter.

    .. rubric:: _`Supported Filter`

    ``'bandpass'``
        Butterworth-Bandpass

    ``'bandstop'``
        Butterworth-Bandstop

    ``'lowpass'``
        Butterworth-Lowpass

    ``'highpass'``
        Butterworth-Highpass
    '''

    def __init__(self, filtertype='bandpass', freq=[2., 5.], order=3,
                 **kwargs):
        self._order = order
        self._filtertype = filtertype
        self._freq = freq

    def __str__(self):
        hrs = '''\n\tFilter parameter:\n
        Type:\t\t{ftype}\n
        Frequencies:\t{freq}\n
        Order:\t\t{order}\n
        '''.format(ftype=self.getFilterType(),
                   freq=self.getFreq(),
                   order=self.getOrder())
        return hrs

    # boolean test (Python 2); aliased to __bool__ for Python 3
    def __nonzero__(self):
        return bool(self.getFilterType())

    __bool__ = __nonzero__

    def parseFilterOptions(self):
        if self:
            robject = {'type': self.getFilterType(), 'corners': self.getOrder()}
            if not self.getFilterType() in ['highpass', 'lowpass']:
                robject['freqmin'] = self.getFreq()[0]
                robject['freqmax'] = self.getFreq()[1]
            elif self.getFilterType() == 'highpass':
                robject['freq'] = self.getFreq()[0]
            elif self.getFilterType() == 'lowpass':
                robject['freq'] = self.getFreq()[1]
            return robject
        return None

    def getFreq(self):
        return self.__getattribute__('_freq')

    def setFreq(self, freq):
        self.__setattr__('_freq', freq)

    def getOrder(self):
        return self.__getattribute__('_order')

    def setOrder(self, order):
        self.__setattr__('_order', order)

    def getFilterType(self):
        return self.__getattribute__('_filtertype')

    def setFilterType(self, filtertype):
        self.__setattr__('_filtertype', filtertype)
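parseFilterOptions returns a keyword dictionary that can be passed straight to ObsPy's Stream.filter. A minimal sketch; 'example.mseed' is a placeholder file name:

    from obspy import read

    fopts = FilterOptions(filtertype='bandpass', freq=[2., 20.], order=4)
    st = read('example.mseed')
    # expands to type='bandpass', corners=4, freqmin=2.0, freqmax=20.0
    st.filter(**fopts.parseFilterOptions())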
220
pylot/core/io/location.py
Normal file
@@ -0,0 +1,220 @@
from obspy import UTCDateTime
from obspy.core import event as ope

from pylot.core.util.utils import getLogin, getHash


def create_amplitude(pickID, amp, unit, category, cinfo):
    '''
    create_amplitude - function to create an ObsPy Amplitude

    :param pickID: resource identifier of the pick the amplitude refers to
    :param amp: generic amplitude value
    :param unit: unit of the amplitude
    :param category: amplitude category
    :param cinfo: creation info
    :return: An ObsPy :class: `~obspy.core.event.Amplitude` object
    '''
    amplitude = ope.Amplitude()
    amplitude.creation_info = cinfo
    amplitude.generic_amplitude = amp
    amplitude.unit = ope.AmplitudeUnit(unit)
    amplitude.type = ope.AmplitudeCategory(category)
    amplitude.pick_id = pickID
    return amplitude


def create_arrival(pickresID, cinfo, phase, azimuth=None, dist=None):
    '''
    create_arrival - function to create an ObsPy Arrival

    :param pickresID: Resource identifier of the created pick
    :type pickresID: :class: `~obspy.core.event.ResourceIdentifier` object
    :param cinfo: An ObsPy :class: `~obspy.core.event.CreationInfo` object
        holding information on the creation of the returned object
    :type cinfo: :class: `~obspy.core.event.CreationInfo` object
    :param phase: name of the arrival's seismic phase
    :type phase: str
    :param azimuth: azimuth between source and receiver
    :type azimuth: float or int, optional
    :param dist: distance between source and receiver
    :type dist: float or int, optional
    :return: An ObsPy :class: `~obspy.core.event.Arrival` object
    '''
    arrival = ope.Arrival()
    arrival.creation_info = cinfo
    arrival.pick_id = pickresID
    arrival.phase = phase
    if azimuth is not None:
        arrival.azimuth = float(azimuth) if azimuth > -180 else azimuth + 360.
    else:
        arrival.azimuth = azimuth
    arrival.distance = dist
    return arrival


def create_creation_info(agency_id=None, creation_time=None, author=None):
    '''
    create_creation_info - function to create an ObsPy CreationInfo; author
    and creation time default to the current login name and the current time

    :param agency_id: name of the institution carrying out the processing
    :param creation_time: time of creation
    :param author: name of the author
    :return: An ObsPy :class: `~obspy.core.event.CreationInfo` object
    '''
    if author is None:
        author = getLogin()
    if creation_time is None:
        creation_time = UTCDateTime()
    return ope.CreationInfo(agency_id=agency_id, author=author,
                            creation_time=creation_time)


def create_event(origintime, cinfo, originloc=None, etype='earthquake',
                 resID=None, authority_id=None):
    '''
    create_event - function to create an ObsPy Event

    :param origintime: the event's origin time
    :type origintime: :class: `~obspy.core.utcdatetime.UTCDateTime` object
    :param cinfo: An ObsPy :class: `~obspy.core.event.CreationInfo` object
        holding information on the creation of the returned object
    :type cinfo: :class: `~obspy.core.event.CreationInfo` object
    :param originloc: tuple containing the location of the origin
        (LAT, LON, DEP) affiliated with the event which is created
    :type originloc: tuple, list
    :param etype: Event type str object, converted via ObsPy to a valid
        event type string.
    :type etype: str
    :param resID: Resource identifier of the created event
    :type resID: :class: `~obspy.core.event.ResourceIdentifier` object, str
    :param authority_id: name of the institution carrying out the processing
    :type authority_id: str
    :return: An ObsPy :class: `~obspy.core.event.Event` object
    '''
    if originloc is not None:
        o = create_origin(origintime, cinfo,
                          originloc[0], originloc[1], originloc[2])
    else:
        o = None
    if not resID:
        resID = create_resourceID(origintime, etype, authority_id)
    elif isinstance(resID, str):
        resID = create_resourceID(origintime, etype, authority_id, resID)
    elif not isinstance(resID, ope.ResourceIdentifier):
        raise TypeError("unsupported type(resID) for resource identifier "
                        "generation: %s" % type(resID))
    event = ope.Event(resource_id=resID)
    event.creation_info = cinfo
    event.event_type = etype
    if o:
        event.origins = [o]
    return event


def create_magnitude(originID, cinfo):
    '''
    create_magnitude - function to create an ObsPy Magnitude object

    :param originID: resource identifier of the origin the magnitude refers to
    :param cinfo: creation info
    :return: An ObsPy :class: `~obspy.core.event.Magnitude` object
    '''
    magnitude = ope.Magnitude()
    magnitude.creation_info = cinfo
    magnitude.origin_id = originID
    return magnitude


def create_origin(origintime, cinfo, latitude, longitude, depth):
    '''
    create_origin - function to create an ObsPy Origin

    :param origintime: the origin's time of occurrence
    :type origintime: :class: `~obspy.core.utcdatetime.UTCDateTime` object
    :param cinfo: creation info
    :param latitude: latitude in decimal degrees of the origin's location
    :type latitude: float
    :param longitude: longitude in decimal degrees of the origin's location
    :type longitude: float
    :param depth: hypocentral depth of the origin
    :type depth: float
    :return: An ObsPy :class: `~obspy.core.event.Origin` object
    '''
    assert isinstance(origintime, UTCDateTime), "origintime has to be " \
                                                "a UTCDateTime object, but " \
                                                "actually is of type " \
                                                "'%s'" % type(origintime)

    origin = ope.Origin()
    origin.time = origintime
    origin.creation_info = cinfo
    origin.latitude = latitude
    origin.longitude = longitude
    origin.depth = depth
    return origin


def create_pick(origintime, picknum, picktime, eventnum, cinfo, phase, station,
                wfseedstr, authority_id):
    '''
    create_pick - function to create an ObsPy Pick

    :param origintime: origin time used to derive the pick's resource identifier
    :param picknum: number of the created pick
    :type picknum: int
    :param picktime: onset time of the pick
    :param eventnum: human-readable event identifier
    :type eventnum: str
    :param cinfo: An ObsPy :class: `~obspy.core.event.CreationInfo` object
        holding information on the creation of the returned object
    :type cinfo: :class: `~obspy.core.event.CreationInfo` object
    :param phase: name of the arrival's seismic phase
    :type phase: str
    :param station: name of the station at which the seismic phase has been
        picked
    :type station: str
    :param wfseedstr: A SEED-formatted string of the form
        network.station.location.channel in order to set a referenced waveform
    :type wfseedstr: str, SEED formatted
    :param authority_id: name of the institution carrying out the processing
    :type authority_id: str
    :return: An ObsPy :class: `~obspy.core.event.Pick` object
    '''
    pickID = eventnum + '_' + station.strip() + '/{0:03d}'.format(picknum)
    pickresID = create_resourceID(origintime, 'pick', authority_id, pickID)
    pick = ope.Pick()
    pick.resource_id = pickresID
    pick.time = picktime
    pick.creation_info = cinfo
    pick.phase_hint = phase
    pick.waveform_id = ope.ResourceIdentifier(id=wfseedstr, prefix='file:/')
    return pick


def create_resourceID(timetohash, restype, authority_id=None, hrstr=None):
    '''
    create_resourceID - function to create a unique resource identifier

    :param timetohash: time the hash is derived from
    :type timetohash: :class: `~obspy.core.utcdatetime.UTCDateTime` object
    :param restype: type of the resource, e.g. 'orig', 'earthquake' ...
    :type restype: str
    :param authority_id: name of the institution carrying out the processing
    :type authority_id: str, optional
    :param hrstr: human-readable string used instead of the hash
    :type hrstr: str, optional
    :return: An ObsPy :class: `~obspy.core.event.ResourceIdentifier` object
    '''
    assert isinstance(timetohash, UTCDateTime), "'timetohash' is not an ObsPy" \
                                                "UTCDateTime object"
    hid = getHash(timetohash)
    if hrstr is None:
        resID = ope.ResourceIdentifier(restype + '/' + hid[0:6])
    else:
        resID = ope.ResourceIdentifier(restype + '/' + hrstr)
    if authority_id is not None:
        resID.convertIDToQuakeMLURI(authority_id=authority_id)
    return resID
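The factory functions above are meant to be chained: a CreationInfo object is created once and handed to the event, origin, pick and amplitude constructors. A minimal sketch with illustrative values:

    from obspy import UTCDateTime

    cinfo = create_creation_info(agency_id='XX', author='autoPyLoT')
    t0 = UTCDateTime('2016-02-25T12:00:00')
    event = create_event(t0, cinfo, originloc=(49.1, 8.1, 8000.),
                         etype='earthquake', authority_id='XX')
    print(event.origins[0].depth)  # 8000.0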
1016
pylot/core/io/phases.py
Normal file
2
pylot/core/loc/__init__.py
Normal file
@@ -0,0 +1,2 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
28
pylot/core/loc/focmec.py
Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


def export(picks, fnout, parameter, eventinfo):
    '''
    Take a <picks> dictionary and export the picking data to a FOCMEC
    <phasefile> without creating an ObsPy event object.

    :param picks: picking data dictionary
    :type picks: dict

    :param fnout: complete path to the exported obs file
    :type fnout: str

    :param parameter: all input information
    :type parameter: object

    :param eventinfo: source information needed for the FOCMEC format
    :type eventinfo: list object
    '''
    # write phases to FOCMEC-phase file
    writephases(picks, 'FOCMEC', fnout, parameter, eventinfo)
28
pylot/core/loc/hash.py
Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


def export(picks, fnout, parameter, eventinfo):
    '''
    Take a <picks> dictionary and export the picking data to a HASH
    <phasefile> without creating an ObsPy event object.

    :param picks: picking data dictionary
    :type picks: dict

    :param fnout: complete path to the exported obs file
    :type fnout: str

    :param parameter: all input information
    :type parameter: object

    :param eventinfo: source information needed for the HASH format
    :type eventinfo: list object
    '''
    # write phases to HASH-phase file
    writephases(picks, 'HASH', fnout, parameter, eventinfo)
25
pylot/core/loc/hypo71.py
Normal file
@@ -0,0 +1,25 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


def export(picks, fnout, parameter):
    '''
    Take a <picks> dictionary and export the picking data to a HYPO71
    <phasefile> without creating an ObsPy event object.

    :param picks: picking data dictionary
    :type picks: dict

    :param fnout: complete path to the exported obs file
    :type fnout: str

    :param parameter: all input information
    :type parameter: object
    '''
    # write phases to HYPO71-phase file
    writephases(picks, 'HYPO71', fnout, parameter)
28
pylot/core/loc/hypodd.py
Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


def export(picks, fnout, parameter, eventinfo):
    '''
    Take a <picks> dictionary and export the picking data to a hypoDD
    <phasefile> without creating an ObsPy event object.

    :param picks: picking data dictionary
    :type picks: dict

    :param fnout: complete path to the exported obs file
    :type fnout: str

    :param parameter: all input information
    :type parameter: object

    :param eventinfo: source information needed for the hypoDD format
    :type eventinfo: list object
    '''
    # write phases to hypoDD-phase file
    writephases(picks, 'hypoDD', fnout, parameter, eventinfo)
25
pylot/core/loc/hyposat.py
Normal file
@@ -0,0 +1,25 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


def export(picks, fnout, parameter):
    '''
    Take a <picks> dictionary and export the picking data to a HYPOSAT
    <phasefile> without creating an ObsPy event object.

    :param picks: picking data dictionary
    :type picks: dict

    :param fnout: complete path to the exported obs file
    :type fnout: str

    :param parameter: all input information
    :type parameter: object
    '''
    # write phases to HYPOSAT-phase file
    writephases(picks, 'HYPOSAT', fnout, parameter)
108
pylot/core/loc/nll.py
Normal file
@@ -0,0 +1,108 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import glob
import os
import subprocess

from obspy import read_events
from pylot.core.io.phases import writephases
from pylot.core.util.utils import getPatternLine, runProgram, which
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


class NLLocError(EnvironmentError):
    pass


def export(picks, fnout, parameter):
    '''
    Take a <picks> dictionary and export the picking data to a NLLOC-obs
    <phasefile> without creating an ObsPy event object.

    :param picks: picking data dictionary
    :type picks: dict

    :param fnout: complete path to the exported obs file
    :type fnout: str

    :param parameter: all input information
    :type parameter: object
    '''
    # write phases to NLLoc-phase file
    writephases(picks, 'NLLoc', fnout, parameter)


def modify_inputs(ctrfn, root, nllocoutn, phasefn, tttn):
    '''
    Modify the NLLoc control file so that it points to the current phase
    file, traveltime tables and output file before locating the event.

    :param ctrfn: name of NLLoc-control file
    :type ctrfn: str

    :param root: root path to NLLoc working directory
    :type root: str

    :param nllocoutn: name of NLLoc-location output file
    :type nllocoutn: str

    :param phasefn: name of NLLoc-input phase file
    :type phasefn: str

    :param tttn: pattern of precalculated NLLoc traveltime tables
    :type tttn: str
    '''
    # in order to locate the event, the NLLoc-control file has to be
    # modified: create the LOCFILES line pointing to the phase file,
    # traveltime tables and NLLoc-output file
    ctrfile = os.path.join(root, 'run', ctrfn)
    nllocout = os.path.join(root, 'loc', nllocoutn)
    phasefile = os.path.join(root, 'obs', phasefn)
    tttable = os.path.join(root, 'time', tttn)
    locfiles = 'LOCFILES %s NLLOC_OBS %s %s 0\n' % (phasefile, tttable, nllocout)

    # modification of NLLoc-control file
    print("Modifying NLLoc-control file %s ..." % ctrfile)
    curlocfiles = getPatternLine(ctrfile, 'LOCFILES')
    with open(ctrfile, 'r') as nllfile:
        filedata = nllfile.read()
    if filedata.find(locfiles) < 0:
        # replace the old LOCFILES line
        filedata = filedata.replace(curlocfiles, locfiles)
        with open(ctrfile, 'w') as nllfile:
            nllfile.write(filedata)


def locate(fnin, infile=None):
    """
    Locate the event by calling the external NonLinLoc executable on the
    given control file.

    :param fnin: full path to the NLLoc control file
    :param infile: optional input file used when searching for the NLLoc
        executable
    """
    if infile is None:
        exe_path = which('NLLoc')
    else:
        exe_path = which('NLLoc', infile)
    if exe_path is None:
        raise NLLocError('NonLinLoc executable not found; check your '
                         'environment variables')

    # locate the event utilizing the external NonLinLoc installation
    try:
        runProgram(exe_path, fnin)
    except subprocess.CalledProcessError as e:
        raise RuntimeError(e.output)


def read_location(fn):
    path, file_pattern = os.path.split(fn)
    files = glob.glob1(path, file_pattern + '.[0-9]*.grid0.loc.hyp')
    if len(files) > 1:
        raise IOError('ambiguous location name {0}'.format(files))
    elif not files:
        # raise a clear error instead of an IndexError further down
        raise IOError('no location file found matching {0}'.format(fn))
    fn = os.path.join(path, files[0])
    return read_events(fn)[0]


if __name__ == '__main__':
    pass
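Put together, a NLLoc run follows the sequence: export phases, point the control file at them, locate, read the result back. A minimal sketch using the default file names from default_parameters.py above; `picks` and `parameter` are assumed to come from the picking stage, and all paths are placeholders:

    import os

    root = '/data/nlloc'  # placeholder NLLoc root directory
    export(picks, os.path.join(root, 'obs', 'AUTOPHASES.obs'), parameter)
    modify_inputs('Insheim_min1d2015_auto.in', root, 'AUTOLOC_nlloc',
                  'AUTOPHASES.obs', 'ttime')
    locate(os.path.join(root, 'run', 'Insheim_min1d2015_auto.in'))
    event = read_location(os.path.join(root, 'loc', 'AUTOLOC_nlloc'))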
28
pylot/core/loc/velest.py
Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pylot.core.io.phases import writephases
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


def export(picks, fnout, eventinfo, parameter=None):
    '''
    Take a <picks> dictionary and export the picking data to a VELEST-cnv
    <phasefile> without creating an ObsPy event object.

    :param picks: picking data dictionary
    :type picks: dict

    :param fnout: complete path to the exported obs file
    :type fnout: str

    :param eventinfo: source time needed for the VELEST-cnv format
    :type eventinfo: list object

    :param parameter: all input information
    :type parameter: object
    '''
    # write phases to VELEST-phase file
    writephases(picks, 'VELEST', fnout, parameter, eventinfo)
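All modules in pylot.core.loc expose the same thin export wrapper around writephases; only the signatures differ slightly (HYPO71, HYPOSAT and NLLoc take no eventinfo, and VELEST expects eventinfo before parameter). A hedged dispatch sketch; export_phases is not part of PyLoT:

    # 'hash' here is the pylot.core.loc.hash module, not the builtin
    from pylot.core.loc import focmec, hash, hypo71, hypodd, hyposat, nll, velest

    def export_phases(fmt, picks, fnout, parameter, eventinfo=None):
        if fmt == 'VELEST':  # note the swapped argument order
            velest.export(picks, fnout, eventinfo, parameter)
        elif fmt in ('HYPO71', 'HYPOSAT', 'NLLoc'):
            mod = {'HYPO71': hypo71, 'HYPOSAT': hyposat, 'NLLoc': nll}[fmt]
            mod.export(picks, fnout, parameter)
        else:  # FOCMEC, HASH, hypoDD
            mod = {'FOCMEC': focmec, 'HASH': hash, 'hypoDD': hypodd}[fmt]
            mod.export(picks, fnout, parameter, eventinfo)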
2
pylot/core/pick/__init__.py
Normal file
@@ -0,0 +1,2 @@
# -*- coding: utf-8 -*-
#
1183
pylot/core/pick/autopick.py
Normal file
705
pylot/core/pick/charfuns.py
Normal file
@@ -0,0 +1,705 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Created Oct/Nov 2014

Implementation of the Characteristic Functions (CF) published and described in:

Kueperkoch, L., Meier, T., Lee, J., Friederich, W., & EGELADOS Working Group, 2010:
Automated determination of P-phase arrival times at regional and local distances
using higher order statistics, Geophys. J. Int., 181, 1159-1170

Kueperkoch, L., Meier, T., Bruestle, A., Lee, J., Friederich, W., & EGELADOS
Working Group, 2012: Automated determination of S-phase arrival times using
autoregressive prediction: application to local and regional distances,
Geophys. J. Int., 188, 687-702.

:author: MAGS2 EP3 working group
"""

import numpy as np
from obspy.core import Stream


class CharacteristicFunction(object):
    '''
    SuperClass for different types of characteristic functions.
    '''

    def __init__(self, data, cut, t2=None, order=None, t1=None, fnoise=None):
        '''
        Initialize the data type object with information from the original
        seismogram.

        :param data: seismic data
        :type data: `~obspy.core.stream.Stream`

        :param cut: cut window (start, stop) [s]
        :type cut: tuple

        :param t2: length of the moving window [s]
        :type t2: float

        :param order: order of the characteristic function
        :type order: int

        :param t1: (optional, only for AR)
        :type t1: float

        :param fnoise: (optional, only for AR)
        :type fnoise: float
        '''
        assert isinstance(data, Stream), "%s is not a stream object" % str(data)

        self.orig_data = data
        self.dt = self.orig_data[0].stats.delta
        self.setCut(cut)
        self.setTime1(t1)
        self.setTime2(t2)
        self.setOrder(order)
        self.setFnoise(fnoise)
        self.setARdetStep(t2)
        self.calcCF(self.getDataArray())
        self.arpara = np.array([])
        self.xpred = np.array([])

    def __str__(self):
        return '''\n\t{name} object:\n
        Cut:\t\t{cut}\n
        t1:\t{t1}\n
        t2:\t{t2}\n
        Order:\t\t{order}\n
        Fnoise:\t{fnoise}\n
        ARdetStep:\t{ardetstep}\n
        '''.format(name=type(self).__name__,
                   cut=self.getCut(),
                   t1=self.getTime1(),
                   t2=self.getTime2(),
                   order=self.getOrder(),
                   fnoise=self.getFnoise(),
                   ardetstep=self.getARdetStep()[0])  # fixed: call the getter, then index

    def getCut(self):
        return self.cut

    def setCut(self, cut):
        self.cut = cut

    def getTime1(self):
        return self.t1

    def setTime1(self, t1):
        self.t1 = t1

    def getTime2(self):
        return self.t2

    def setTime2(self, t2):
        self.t2 = t2

    def getARdetStep(self):
        return self.ARdetStep

    def setARdetStep(self, t1):
        if t1:
            self.ARdetStep = []
            self.ARdetStep.append(t1 / 4)
            self.ARdetStep.append(int(np.ceil(self.getTime2() / self.getIncrement()) / 4))

    def getOrder(self):
        return self.order

    def setOrder(self, order):
        self.order = order

    def getIncrement(self):
        """
        :rtype : int
        """
        return self.dt

    def getTimeArray(self):
        incr = self.getIncrement()
        self.TimeArray = np.arange(0, len(self.getCF()) * incr, incr) + self.getCut()[0]
        return self.TimeArray

    def getFnoise(self):
        return self.fnoise

    def setFnoise(self, fnoise):
        self.fnoise = fnoise

    def getCF(self):
        return self.cf

    def getXCF(self):
        return self.xcf

    def getDataArray(self, cut=None):
        '''
        If cut times are given, the time series is cut from cut[0] (start
        time) till cut[1] (stop time) in order to calculate the CF only for
        the part where you expect the signal!
        input: cut (tuple), cutting window
        '''
        if cut is not None:
            if len(self.orig_data) == 1:
                if self.cut[0] == 0 and self.cut[1] == 0:
                    start = 0
                    stop = len(self.orig_data[0])
                elif self.cut[0] == 0 and self.cut[1] != 0:
                    start = 0
                    stop = self.cut[1] / self.dt
                else:
                    start = self.cut[0] / self.dt
                    stop = self.cut[1] / self.dt
                zz = self.orig_data.copy()
                z1 = zz[0].copy()
                zz[0].data = z1.data[int(start):int(stop)]
                data = zz
                return data
            elif len(self.orig_data) == 2:
                if self.cut[0] == 0 and self.cut[1] == 0:
                    start = 0
                    stop = min([len(self.orig_data[0]), len(self.orig_data[1])])
                elif self.cut[0] == 0 and self.cut[1] != 0:
                    start = 0
                    stop = min([self.cut[1] / self.dt, len(self.orig_data[0]),
                                len(self.orig_data[1])])
                else:
                    start = max([0, self.cut[0] / self.dt])
                    stop = min([self.cut[1] / self.dt, len(self.orig_data[0]),
                                len(self.orig_data[1])])
                hh = self.orig_data.copy()
                h1 = hh[0].copy()
                h2 = hh[1].copy()
                hh[0].data = h1.data[int(start):int(stop)]
                hh[1].data = h2.data[int(start):int(stop)]
                data = hh
                return data
            elif len(self.orig_data) == 3:
                if self.cut[0] == 0 and self.cut[1] == 0:
                    start = 0
                    # fixed: use the full common trace length here (the
                    # original used self.cut[1] / self.dt, i.e. zero)
                    stop = min([len(self.orig_data[0]), len(self.orig_data[1]),
                                len(self.orig_data[2])])
                elif self.cut[0] == 0 and self.cut[1] != 0:
                    start = 0
                    stop = self.cut[1] / self.dt
                else:
                    start = max([0, self.cut[0] / self.dt])
                    stop = min([self.cut[1] / self.dt, len(self.orig_data[0]),
                                len(self.orig_data[1]), len(self.orig_data[2])])
                hh = self.orig_data.copy()
                h1 = hh[0].copy()
                h2 = hh[1].copy()
                h3 = hh[2].copy()
                hh[0].data = h1.data[int(start):int(stop)]
                hh[1].data = h2.data[int(start):int(stop)]
                hh[2].data = h3.data[int(start):int(stop)]
                data = hh
                return data
        else:
            data = self.orig_data.copy()
            return data

    def calcCF(self, data=None):
        self.cf = data


class AICcf(CharacteristicFunction):
    '''
    Function to calculate the Akaike Information Criterion (AIC) after
    Maeda (1985).
    :param data: time series (either seismogram or CF)
    :type data: tuple

    Output: AIC function
    '''

    def calcCF(self, data):
        x = self.getDataArray()
        xnp = x[0].data
        ind = np.where(~np.isnan(xnp))[0]
        if ind.size:
            xnp[:ind[0]] = xnp[ind[0]]
        datlen = len(xnp)
        k = np.arange(1, datlen)
        cf = np.zeros(datlen)
        cumsumcf = np.cumsum(np.power(xnp, 2))
        i = np.where(cumsumcf == 0)
        cumsumcf[i] = np.finfo(np.float64).eps
        cf[k] = ((k - 1) * np.log(cumsumcf[k] / k) + (datlen - k + 1) *
                 np.log((cumsumcf[datlen - 1] - cumsumcf[k - 1]) / (datlen - k + 1)))
        cf[0] = cf[1]
        # zero out infinite values that may arise from the logarithms
        cf[np.isinf(cf)] = 0

        self.cf = cf - np.mean(cf)
        self.xcf = x


class HOScf(CharacteristicFunction):
    '''
    Function to calculate skewness (statistics of order 3) or kurtosis
    (statistics of order 4), using one long moving window, as published
    in Kueperkoch et al. (2010).
    '''

    def calcCF(self, data):
        x = self.getDataArray(self.getCut())
        xnp = x[0].data
        # replace NaNs with zeros for stable processing
        xnp[np.isnan(xnp)] = 0
        if self.getOrder() == 3:  # this is skewness
            y = np.power(xnp, 3)
            y1 = np.power(xnp, 2)
        elif self.getOrder() == 4:  # this is kurtosis
            y = np.power(xnp, 4)
            y1 = np.power(xnp, 2)

        # initialisation
        # t2: long term moving window
        ilta = int(round(self.getTime2() / self.getIncrement()))
        lta = y[0]
        lta1 = y1[0]
        # moving windows
        LTA = np.zeros(len(xnp))
        for j in range(0, len(xnp)):
            if j < 4:
                LTA[j] = 0
            elif j <= ilta:
                lta = (y[j] + lta * (j - 1)) / j
                lta1 = (y1[j] + lta1 * (j - 1)) / j
            else:
                lta = (y[j] - y[j - ilta]) / ilta + lta
                lta1 = (y1[j] - y1[j - ilta]) / ilta + lta1
            # define LTA
            if self.getOrder() == 3:
                LTA[j] = lta / np.power(lta1, 1.5)
            elif self.getOrder() == 4:
                LTA[j] = lta / np.power(lta1, 2)

        # replace NaNs with the first non-NaN value,
        # so the autopicker doesn't pick the discontinuity at the trace start
        ind = np.where(~np.isnan(LTA))[0]
        if ind.size:
            first = ind[0]
            LTA[:first] = LTA[first]
        self.cf = LTA
        self.xcf = x
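The classes above are typically chained: a HOS (kurtosis) CF is computed on the vertical trace and the AIC of that CF gives the initial pick candidate as its minimum. A minimal sketch; the file name is a placeholder and the window values follow the defaults (pstart/pstop, tlta, hosorder) from default_parameters.py:

    from obspy import read

    z = read('example.mseed').select(component='Z')
    hoscf = HOScf(z, (15.0, 60.0), t2=7.0, order=4)  # cut window, LTA length, kurtosis

    z_cf = z.copy()
    z_cf[0].data = hoscf.getCF()  # feed the HOS CF into the AIC picker
    aiccf = AICcf(z_cf, (15.0, 60.0))
    t_pick = aiccf.getTimeArray()[aiccf.getCF().argmin()]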
class ARZcf(CharacteristicFunction):
|
||||
def calcCF(self, data):
|
||||
|
||||
print('Calculating AR-prediction error from single trace ...')
|
||||
x = self.getDataArray(self.getCut())
|
||||
xnp = x[0].data
|
||||
nn = np.isnan(xnp)
|
||||
if len(nn) > 1:
|
||||
xnp[nn] = 0
|
||||
# some parameters needed
|
||||
# add noise to time series
|
||||
xnoise = xnp + np.random.normal(0.0, 1.0, len(xnp)) * self.getFnoise() * max(abs(xnp))
|
||||
tend = len(xnp)
|
||||
# Time1: length of AR-determination window [sec]
|
||||
# Time2: length of AR-prediction window [sec]
|
||||
ldet = int(round(self.getTime1() / self.getIncrement())) # length of AR-determination window [samples]
|
||||
lpred = int(np.ceil(self.getTime2() / self.getIncrement())) # length of AR-prediction window [samples]
|
||||
|
||||
cf = np.zeros(len(xnp))
|
||||
loopstep = self.getARdetStep()
|
||||
arcalci = ldet + self.getOrder() # AR-calculation index
|
||||
for i in range(ldet + self.getOrder(), tend - lpred - 1):
|
||||
if i == arcalci:
|
||||
# determination of AR coefficients
|
||||
# to speed up calculation, AR-coefficients are calculated only every i+loopstep[1]!
|
||||
self.arDetZ(xnoise, self.getOrder(), i - ldet, i)
|
||||
arcalci = arcalci + loopstep[1]
|
||||
# AR prediction of waveform using calculated AR coefficients
|
||||
self.arPredZ(xnp, self.arpara, i + 1, lpred)
|
||||
# prediction error = CF
|
||||
cf[i + lpred - 1] = np.sqrt(np.sum(np.power(self.xpred[i:i + lpred - 1] - xnp[i:i + lpred - 1], 2)) / lpred)
|
||||
nn = np.isnan(cf)
|
||||
if len(nn) > 1:
|
||||
cf[nn] = 0
|
||||
# remove zeros and artefacts
|
||||
tap = np.hanning(len(cf))
|
||||
cf = tap * cf
|
||||
io = np.where(cf == 0)
|
||||
ino = np.where(cf > 0)
|
||||
if np.size(ino):
|
||||
cf[io] = cf[ino[0][0]]
|
||||
|
||||
self.cf = cf
|
||||
self.xcf = x
|
||||
|
||||
    def arDetZ(self, data, order, rind, ldet):
        '''
        Function to calculate AR parameters arpara after Thomas Meier (CAU), published
        in Kueperkoch et al. (2012). This function solves the system of linear equations
        (SLE) using the Moore-Penrose inverse, i.e. the least-squares approach.
        :param: data, time series to calculate AR parameters from
        :type: array

        :param: order, order of AR process
        :type: int

        :param: rind, first running summation index
        :type: int

        :param: ldet, length of AR-determination window (=end of summation index)
        :type: int

        Output: AR parameters arpara
        '''

        # recursive calculation of data vector (right part of eq. 6.5 in Kueperkoch et al. (2012))
        rhs = np.zeros(self.getOrder())
        for k in range(0, self.getOrder()):
            for i in range(rind, ldet + 1):
                ki = k + 1
                rhs[k] = rhs[k] + data[i] * data[i - ki]

        # recursive calculation of data array (second sum at left part of eq. 6.5 in Kueperkoch et al. 2012)
        A = np.zeros((self.getOrder(), self.getOrder()))
        for k in range(1, self.getOrder() + 1):
            for j in range(1, k + 1):
                for i in range(rind, ldet + 1):
                    ki = k - 1
                    ji = j - 1
                    A[ki, ji] = A[ki, ji] + data[i - j] * data[i - k]

                A[ji, ki] = A[ki, ji]

        # apply Moore-Penrose inverse for SVD yielding the AR-parameters
        self.arpara = np.dot(np.linalg.pinv(A), rhs)

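    # Reading the loops in arDetZ above, the assembled system is the standard
    # AR least-squares problem (a sketch of eq. 6.5; 0-based indexing as in the code):
    #     rhs[k] = sum_i data[i] * data[i-(k+1)]
    #     A[k, j] = sum_i data[i-(j+1)] * data[i-(k+1)]
    # so that arpara = pinv(A) . rhs minimises
    #     sum_i (data[i] - sum_k arpara[k] * data[i-(k+1)])**2 .
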
    def arPredZ(self, data, arpara, rind, lpred):
        '''
        Function to predict waveform, assuming an autoregressive process of order
        p (=size(arpara)), with AR parameters arpara calculated in arDet. After
        Thomas Meier (CAU), published in Kueperkoch et al. (2012).
        :param: data, time series to be predicted
        :type: array

        :param: arpara, AR parameters
        :type: float

        :param: rind, first running summation index
        :type: int

        :param: lpred, length of prediction window (=end of summation index)
        :type: int

        Output: predicted waveform z
        '''
        # make sure the summation indices are within bounds
        if rind < len(arpara):
            rind = len(arpara)
        if rind > len(data) - lpred:
            rind = len(data) - lpred
        if lpred < 1:
            lpred = 1
        if lpred > len(data) - 2:
            lpred = len(data) - 2

        z = np.append(data[0:rind], np.zeros(lpred))
        for i in range(rind, rind + lpred):
            for j in range(1, len(arpara) + 1):
                ji = j - 1
                z[i] = z[i] + arpara[ji] * z[i - j]

        self.xpred = z

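# Illustrative sketch only (not part of PyLoT's pipeline): the determination/
# prediction scheme of ARZcf on a plain 1-D numpy array. The toy signal and
# window sizes below are hypothetical.
def _ar_fit_predict_sketch():
    x = np.sin(np.linspace(0, 20, 200))  # toy trace
    p, ldet, lpred = 4, 150, 10  # AR order, end of determination window, prediction length
    idx = np.arange(p, ldet + 1)  # summation indices (cf. rind/ldet in arDetZ)
    # normal equations A . a = rhs (eq. 6.5 in Kueperkoch et al., 2012)
    A = np.array([[np.sum(x[idx - j] * x[idx - k]) for j in range(1, p + 1)]
                  for k in range(1, p + 1)])
    rhs = np.array([np.sum(x[idx] * x[idx - k]) for k in range(1, p + 1)])
    a = np.dot(np.linalg.pinv(A), rhs)
    # recursive prediction over the window, as in arPredZ
    z = np.append(x[:ldet + 1], np.zeros(lpred))
    for i in range(ldet + 1, ldet + 1 + lpred):
        z[i] = np.dot(a, z[i - 1:i - p - 1:-1])
    # RMS prediction error over the window = one sample of the CF
    return np.sqrt(np.sum((z[ldet + 1:] - x[ldet + 1:ldet + 1 + lpred]) ** 2) / lpred)
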
class ARHcf(CharacteristicFunction):
    def calcCF(self, data):

        print('Calculating AR-prediction error from both horizontal traces ...')

        xnp = self.getDataArray(self.getCut())
        n0 = np.isnan(xnp[0].data)
        if len(n0) > 1:
            xnp[0].data[n0] = 0
        n1 = np.isnan(xnp[1].data)
        if len(n1) > 1:
            xnp[1].data[n1] = 0

        # some parameters needed
        # add noise to time series
        xenoise = xnp[0].data + np.random.normal(0.0, 1.0, len(xnp[0].data)) * self.getFnoise() * max(abs(xnp[0].data))
        xnnoise = xnp[1].data + np.random.normal(0.0, 1.0, len(xnp[1].data)) * self.getFnoise() * max(abs(xnp[1].data))
        Xnoise = np.array([xenoise.tolist(), xnnoise.tolist()])
        tend = len(xnp[0].data)
        # Time1: length of AR-determination window [sec]
        # Time2: length of AR-prediction window [sec]
        ldet = int(round(self.getTime1() / self.getIncrement()))  # length of AR-determination window [samples]
        lpred = int(np.ceil(self.getTime2() / self.getIncrement()))  # length of AR-prediction window [samples]

        cf = np.zeros(len(xenoise))
        loopstep = self.getARdetStep()
        arcalci = lpred + self.getOrder() - 1  # AR-calculation index
        # arcalci = ldet + self.getOrder() - 1  # AR-calculation index
        for i in range(lpred + self.getOrder() - 1, tend - 2 * lpred + 1):
            if i == arcalci:
                # determination of AR coefficients
                # to speed up calculation, AR-coefficients are calculated only every i+loopstep[1]!
                self.arDetH(Xnoise, self.getOrder(), i - ldet, i)
                arcalci = arcalci + loopstep[1]
            # AR prediction of waveform using calculated AR coefficients
            self.arPredH(xnp, self.arpara, i + 1, lpred)
            # prediction error = CF
            cf[i + lpred] = np.sqrt(np.sum(np.power(self.xpred[0][i:i + lpred] - xnp[0][i:i + lpred], 2)
                                           + np.power(self.xpred[1][i:i + lpred] - xnp[1][i:i + lpred], 2)) / (2 * lpred))
        nn = np.isnan(cf)
        if len(nn) > 1:
            cf[nn] = 0
        # remove zeros and artefacts
        tap = np.hanning(len(cf))
        cf = tap * cf
        io = np.where(cf == 0)
        ino = np.where(cf > 0)
        if np.size(ino):
            cf[io] = cf[ino[0][0]]

        self.cf = cf
        self.xcf = xnp

    def arDetH(self, data, order, rind, ldet):
        '''
        Function to calculate AR parameters arpara after Thomas Meier (CAU), published
        in Kueperkoch et al. (2012). This function solves the system of linear equations
        (SLE) using the Moore-Penrose inverse, i.e. the least-squares approach.
        "data" is a structured array. AR parameters are calculated based on both
        horizontal components in order to account for polarization.
        :param: data, horizontal component seismograms to calculate AR parameters from
        :type: structured array

        :param: order, order of AR process
        :type: int

        :param: rind, first running summation index
        :type: int

        :param: ldet, length of AR-determination window (=end of summation index)
        :type: int

        Output: AR parameters arpara
        '''

        # recursive calculation of data vector (right part of eq. 6.5 in Kueperkoch et al. (2012))
        rhs = np.zeros(self.getOrder())
        for k in range(0, self.getOrder()):
            for i in range(rind, ldet):
                rhs[k] = rhs[k] + data[0, i] * data[0, i - k] + data[1, i] * data[1, i - k]

        # recursive calculation of data array (second sum at left part of eq. 6.5 in Kueperkoch et al. 2012)
        A = np.zeros((self.getOrder(), self.getOrder()))
        for k in range(1, self.getOrder() + 1):
            for j in range(1, k + 1):
                for i in range(rind, ldet):
                    ki = k - 1
                    ji = j - 1
                    A[ki, ji] = A[ki, ji] + data[0, i - ji] * data[0, i - ki] + data[1, i - ji] * data[1, i - ki]

                A[ji, ki] = A[ki, ji]

        # apply Moore-Penrose inverse for SVD yielding the AR-parameters
        self.arpara = np.dot(np.linalg.pinv(A), rhs)

    def arPredH(self, data, arpara, rind, lpred):
        '''
        Function to predict waveform, assuming an autoregressive process of order
        p (=size(arpara)), with AR parameters arpara calculated in arDet. After
        Thomas Meier (CAU), published in Kueperkoch et al. (2012).
        :param: data, horizontal component seismograms to be predicted
        :type: structured array

        :param: arpara, AR parameters
        :type: float

        :param: rind, first running summation index
        :type: int

        :param: lpred, length of prediction window (=end of summation index)
        :type: int

        Output: predicted waveform z
        :type: structured array
        '''
        # make sure the summation indices are within bounds
        if rind < len(arpara) + 1:
            rind = len(arpara) + 1
        if rind > len(data[0]) - lpred + 1:
            rind = len(data[0]) - lpred + 1
        if lpred < 1:
            lpred = 1
        if lpred > len(data[0]) - 1:
            lpred = len(data[0]) - 1

        z1 = np.append(data[0][0:rind], np.zeros(lpred))
        z2 = np.append(data[1][0:rind], np.zeros(lpred))
        for i in range(rind, rind + lpred):
            for j in range(1, len(arpara) + 1):
                ji = j - 1
                z1[i] = z1[i] + arpara[ji] * z1[i - ji]
                z2[i] = z2[i] + arpara[ji] * z2[i - ji]

        z = np.array([z1.tolist(), z2.tolist()])
        self.xpred = z

class AR3Ccf(CharacteristicFunction):
    def calcCF(self, data):

        print('Calculating AR-prediction error from all 3 components ...')

        xnp = self.getDataArray(self.getCut())
        n0 = np.isnan(xnp[0].data)
        if len(n0) > 1:
            xnp[0].data[n0] = 0
        n1 = np.isnan(xnp[1].data)
        if len(n1) > 1:
            xnp[1].data[n1] = 0
        n2 = np.isnan(xnp[2].data)
        if len(n2) > 1:
            xnp[2].data[n2] = 0

        # some parameters needed
        # add noise to time series
        xenoise = xnp[0].data + np.random.normal(0.0, 1.0, len(xnp[0].data)) * self.getFnoise() * max(abs(xnp[0].data))
        xnnoise = xnp[1].data + np.random.normal(0.0, 1.0, len(xnp[1].data)) * self.getFnoise() * max(abs(xnp[1].data))
        xznoise = xnp[2].data + np.random.normal(0.0, 1.0, len(xnp[2].data)) * self.getFnoise() * max(abs(xnp[2].data))
        Xnoise = np.array([xenoise.tolist(), xnnoise.tolist(), xznoise.tolist()])
        tend = len(xnp[0].data)
        # Time1: length of AR-determination window [sec]
        # Time2: length of AR-prediction window [sec]
        ldet = int(round(self.getTime1() / self.getIncrement()))  # length of AR-determination window [samples]
        lpred = int(np.ceil(self.getTime2() / self.getIncrement()))  # length of AR-prediction window [samples]

        cf = np.zeros(len(xenoise))
        loopstep = self.getARdetStep()
        arcalci = ldet + self.getOrder() - 1  # AR-calculation index
        for i in range(ldet + self.getOrder() - 1, tend - 2 * lpred + 1):
            if i == arcalci:
                # determination of AR coefficients
                # to speed up calculation, AR-coefficients are calculated only every i+loopstep[1]!
                self.arDet3C(Xnoise, self.getOrder(), i - ldet, i)
                arcalci = arcalci + loopstep[1]

            # AR prediction of waveform using calculated AR coefficients
            self.arPred3C(xnp, self.arpara, i + 1, lpred)
            # prediction error = CF
            cf[i + lpred] = np.sqrt(np.sum(np.power(self.xpred[0][i:i + lpred] - xnp[0][i:i + lpred], 2)
                                           + np.power(self.xpred[1][i:i + lpred] - xnp[1][i:i + lpred], 2)
                                           + np.power(self.xpred[2][i:i + lpred] - xnp[2][i:i + lpred], 2)) / (3 * lpred))
        nn = np.isnan(cf)
        if len(nn) > 1:
            cf[nn] = 0
        # remove zeros and artefacts
        tap = np.hanning(len(cf))
        cf = tap * cf
        io = np.where(cf == 0)
        ino = np.where(cf > 0)
        if np.size(ino):
            cf[io] = cf[ino[0][0]]

        self.cf = cf
        self.xcf = xnp

    def arDet3C(self, data, order, rind, ldet):
        '''
        Function to calculate AR parameters arpara after Thomas Meier (CAU), published
        in Kueperkoch et al. (2012). This function solves the system of linear equations
        (SLE) using the Moore-Penrose inverse, i.e. the least-squares approach.
        "data" is a structured array. AR parameters are calculated based on both
        horizontal components and the vertical component.
        :param: data, horizontal and vertical component seismograms to calculate AR parameters from
        :type: structured array

        :param: order, order of AR process
        :type: int

        :param: rind, first running summation index
        :type: int

        :param: ldet, length of AR-determination window (=end of summation index)
        :type: int

        Output: AR parameters arpara
        '''

        # recursive calculation of data vector (right part of eq. 6.5 in Kueperkoch et al. (2012))
        rhs = np.zeros(self.getOrder())
        for k in range(0, self.getOrder()):
            for i in range(rind, ldet):
                rhs[k] = rhs[k] + data[0, i] * data[0, i - k] + data[1, i] * data[1, i - k] \
                         + data[2, i] * data[2, i - k]

        # recursive calculation of data array (second sum at left part of eq. 6.5 in Kueperkoch et al. 2012)
        A = np.zeros((self.getOrder(), self.getOrder()))
        for k in range(1, self.getOrder() + 1):
            for j in range(1, k + 1):
                for i in range(rind, ldet):
                    ki = k - 1
                    ji = j - 1
                    A[ki, ji] = A[ki, ji] + data[0, i - ji] * data[0, i - ki] + data[1, i - ji] * data[1, i - ki] \
                                + data[2, i - ji] * data[2, i - ki]

                A[ji, ki] = A[ki, ji]

        # apply Moore-Penrose inverse for SVD yielding the AR-parameters
        self.arpara = np.dot(np.linalg.pinv(A), rhs)

    def arPred3C(self, data, arpara, rind, lpred):
        '''
        Function to predict waveform, assuming an autoregressive process of order
        p (=size(arpara)), with AR parameters arpara calculated in arDet3C. After
        Thomas Meier (CAU), published in Kueperkoch et al. (2012).
        :param: data, horizontal and vertical component seismograms to be predicted
        :type: structured array

        :param: arpara, AR parameters
        :type: float

        :param: rind, first running summation index
        :type: int

        :param: lpred, length of prediction window (=end of summation index)
        :type: int

        Output: predicted waveform z
        :type: structured array
        '''
        # make sure the summation indices are within bounds
        if rind < len(arpara) + 1:
            rind = len(arpara) + 1
        if rind > len(data[0]) - lpred + 1:
            rind = len(data[0]) - lpred + 1
        if lpred < 1:
            lpred = 1
        if lpred > len(data[0]) - 1:
            lpred = len(data[0]) - 1

        z1 = np.append(data[0][0:rind], np.zeros(lpred))
        z2 = np.append(data[1][0:rind], np.zeros(lpred))
        z3 = np.append(data[2][0:rind], np.zeros(lpred))
        for i in range(rind, rind + lpred):
            for j in range(1, len(arpara) + 1):
                ji = j - 1
                z1[i] = z1[i] + arpara[ji] * z1[i - ji]
                z2[i] = z2[i] + arpara[ji] * z2[i - ji]
                z3[i] = z3[i] + arpara[ji] * z3[i - ji]

        z = np.array([z1.tolist(), z2.tolist(), z3.tolist()])
        self.xpred = z

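# Note on the variants above (a sketch, reading the code): ARHcf and AR3Ccf
# generalise ARZcf by accumulating the squared prediction residuals over
# C components before normalising, i.e.
#     cf = sqrt( sum_c sum_n (xpred_c[n] - x_c[n])**2 / (C * lpred) )
# with C = 2 for the horizontals and C = 3 for all three components.
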
pylot/core/pick/compare.py (new file, 515 lines)
@@ -0,0 +1,515 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import copy
import operator
import os

import matplotlib.pyplot as plt
import numpy as np
from obspy import read_events
from obspy.core import AttribDict
from pylot.core.io.phases import picksdict_from_picks
from pylot.core.util.pdf import ProbabilityDensityFunction
from pylot.core.util.utils import find_in_list
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
__author__ = 'sebastianw'


class Comparison(object):
    """
    A Comparison object contains information on the evaluated picks' probability
    density function and compares these in terms of building the difference of
    compared pick sets. The results can be displayed as histograms showing their
    properties.
    """

    def __init__(self, **kwargs):
        self._pdfs = dict()
        names = self.iter_kwargs(kwargs)
        if len(names) > 2:
            raise ValueError('Comparison is only defined for two '
                             'arguments!')
        self._names = names
        self._compare = self.compare_picksets()

    def __nonzero__(self):
        if not len(self.names) == 2 or not self._pdfs:
            return False
        return True

    def iter_kwargs(self, kwargs):
        names = list()
        for name, fn in kwargs.items():
            if name == 'eventlist':
                names = self.init_by_eventlist(fn)
                break
            if isinstance(fn, PDFDictionary):
                self._pdfs[name] = fn
            elif isinstance(fn, dict) or isinstance(fn, AttribDict):
                self._pdfs[name] = PDFDictionary(fn)
            else:
                self._pdfs[name] = PDFDictionary.from_quakeml(fn)
            names.append(name)
        return names

    def init_by_eventlist(self, eventlist):
        # create one dictionary containing all picks for all events (therefore modify station key)
        global_picksdict = {}
        for event in eventlist:
            automanu = {'manu': event.pylot_picks,
                        'auto': event.pylot_autopicks}
            for method, picksdict in automanu.items():
                if not method in global_picksdict.keys():
                    global_picksdict[method] = {}
                for station, picks in picksdict.items():
                    new_picksdict = global_picksdict[method]
                    # new id combining event and station in one dictionary for all events
                    id = '{}_{}'.format(event.pylot_id, station)
                    new_picksdict[id] = picks
        for method, picksdict in global_picksdict.items():
            self._pdfs[method] = PDFDictionary(picksdict)
        names = list(global_picksdict.keys())
        return names

    def get(self, name):
        return self._pdfs[name]

    @property
    def names(self):
        return self._names

    @names.setter
    def names(self, names):
        assert isinstance(names, list) and len(names) == 2, \
            'variable "names" is either not a list or its length is not 2: names: {names}'.format(names=names)
        self._names = names

    @property
    def comparison(self):
        return self._compare

    @property
    def stations(self):
        return self.comparison.keys()

    @property
    def nstations(self):
        return len(self.stations)

    def compare_picksets(self, type='exp'):
        """
        Compare two picksets A and B and return a dictionary compiling the results.
        Comparison is carried out with the help of pdf representations of the picks
        and a probabilistic approach to the time difference of two onset
        measurements.
        :param type: type of the pdf representation used for comparison (default: 'exp')
        :type type: str
        :return: dictionary containing the resulting comparison pdfs for all picks
        :rtype: dict
        """
        compare_pdfs = dict()

        pdf_a = self.get('auto').generate_pdf_data(type)
        pdf_b = self.get('manu').generate_pdf_data(type)

        for station, phases in pdf_a.items():
            if station in pdf_b.keys():
                compare_pdf = dict()
                for phase in phases:
                    if phase in pdf_b[station].keys():
                        compare_pdf[phase] = phases[phase] - pdf_b[station][phase]
                if compare_pdf is not None:
                    compare_pdfs[station] = compare_pdf

        return compare_pdfs

    def plot(self, stations=None):
        if stations is None:
            nstations = self.nstations
            stations = list(self.stations)
        else:
            nstations = len(stations)
        istations = range(nstations)
        fig, axarr = plt.subplots(nstations, 2, sharex='col', sharey='row')

        for n in istations:
            station = stations[n]
            if station not in self.comparison.keys():
                continue
            compare_pdf = self.comparison[station]
            for l, phase in enumerate(compare_pdf.keys()):
                axarr[n, l].plot(compare_pdf[phase].axis,
                                 compare_pdf[phase].data)
                if n == 0:
                    axarr[n, l].set_title(phase)
                if l == 0:
                    axann = axarr[n, l].annotate(station, xy=(.05, .5),
                                                 xycoords='axes fraction')
                    bbox_props = dict(boxstyle='round', facecolor='lightgrey',
                                      alpha=.7)
                    axann.set_bbox(bbox_props)
                if n == int(np.median(istations)) and l == 0:
                    label = 'probability density (qualitative)'
                    axarr[n, l].set_ylabel(label)
        plt.setp([a.get_xticklabels() for a in axarr[0, :]], visible=False)
        plt.setp([a.get_yticklabels() for a in axarr[:, 1]], visible=False)
        plt.setp([a.get_yticklabels() for a in axarr[:, 0]], visible=False)

        plt.show()

    def get_all(self, phasename):
        pdf_dict = self.comparison
        rlist = list()
        for phases in pdf_dict.values():
            try:
                rlist.append(phases[phasename])
            except KeyError:
                continue
        return rlist

    def get_array(self, phase, method_name):
        method = operator.methodcaller(method_name)
        pdf_list = self.get_all(phase)
        rarray = list(map(method, pdf_list))
        return np.array(rarray)

    def get_expectation_array(self, phase):
        return self.get_array(phase, 'expectation')

    def get_std_array(self, phase):
        return self.get_array(phase, 'standard_deviation')

    def hist_expectation(self, phases='all', bins=20, normed=False):
        phases = phases.strip()
        if phases.find('all') == 0:
            phases = 'ps'
        phases = phases.upper()
        nsp = len(phases)
        fig, axarray = plt.subplots(1, nsp, sharey=True)
        for n, phase in enumerate(phases):
            ax = axarray[n]
            data = self.get_expectation_array(phase)
            xlims = [min(data), max(data)]
            ax.hist(data, range=xlims, bins=bins, normed=normed)
            title_str = 'phase: {0}, samples: {1}'.format(phase, len(data))
            ax.set_title(title_str)
            ax.set_xlabel('expectation [s]')
            if n == 0:
                ax.set_ylabel('abundance [-]')
        plt.setp([a.get_yticklabels() for a in axarray[1:]], visible=False)
        plt.show()

    def hist_standard_deviation(self, phases='all', bins=20, normed=False):
        phases = phases.strip()
        if phases.find('all') == 0:
            phases = 'ps'
        phases = phases.upper()
        nsp = len(phases)
        fig, axarray = plt.subplots(1, nsp, sharey=True)
        for n, phase in enumerate(phases):
            ax = axarray[n]
            data = self.get_std_array(phase)
            xlims = [min(data), max(data)]
            ax.hist(data, range=xlims, bins=bins, normed=normed)
            title_str = 'phase: {0}, samples: {1}'.format(phase, len(data))
            ax.set_title(title_str)
            ax.set_xlabel('standard deviation [s]')
            if n == 0:
                ax.set_ylabel('abundance [-]')
        plt.setp([a.get_yticklabels() for a in axarray[1:]], visible=False)
        plt.show()

    def hist(self, type='std'):
        pass

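# Illustrative usage sketch only (values hypothetical): Comparison requires
# exactly the keyword names 'auto' and 'manu', since compare_picksets() looks
# them up explicitly; the pick dicts follow the layout expected by
# PDFDictionary.generate_pdf_data below.
def _comparison_sketch():
    picks = {'ST01': {'P': dict(epp=12.30, mpp=12.42, lpp=12.60)}}
    comp = Comparison(auto=picks, manu=picks)
    return comp.get_expectation_array('P')
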
class PDFDictionary(object):
    """
    A PDFDictionary is a dictionary-like object containing structured data on
    the probability density function of seismic phase onsets.
    """

    def __init__(self, data):
        self._pickdata = data
        self._pdfdata = self.generate_pdf_data()

    def __nonzero__(self):
        if len(self.pick_data) < 1:
            return False
        else:
            return True

    def __getitem__(self, item):
        return self.pdf_data[item]

    @property
    def pdf_data(self):
        return self._pdfdata

    @pdf_data.setter
    def pdf_data(self, data):
        self._pdfdata = data

    @property
    def pick_data(self):
        return self._pickdata

    @pick_data.setter
    def pick_data(self, data):
        self._pickdata = data

    @property
    def stations(self):
        return self.pick_data.keys()

    @property
    def nstations(self):
        return len(self.stations)

    @classmethod
    def from_quakeml(cls, fn):
        return cls(fn)

    def get_all(self, phase):
        rlist = list()
        for phases in self.pdf_data.values():
            try:
                rlist.append(phases[phase])
            except KeyError:
                continue
        return rlist

    def generate_pdf_data(self, type='exp'):
        """
        Returns a probability density function dictionary containing the
        representation of the actual pick_data.
        :param type: type of the returned
            `~pylot.core.util.pdf.ProbabilityDensityFunction` object
        :type type: str
        :return: a dictionary containing the picks represented as pdfs
        """

        pdf_picks = copy.deepcopy(self.pick_data)

        for station, phases in pdf_picks.items():
            for phase, values in phases.items():
                if phase not in 'PS':
                    continue
                phases[phase] = ProbabilityDensityFunction.from_pick(
                    values['epp'],
                    values['mpp'],
                    values['lpp'],
                    type=type)
        return pdf_picks

    def plot(self, stations=None):
        '''
        Plots all probability density functions, either for the desired stations
        or for all available data.
        :param stations: list of stations to be plotted
        :type stations: list
        :return: matplotlib figure object containing the plot
        '''
        assert stations is None or isinstance(stations, list), \
            'parameter stations should be a list not {0}'.format(type(stations))
        if not stations:
            nstations = self.nstations
            stations = list(self.stations)
        else:
            nstations = len(stations)

        istations = range(nstations)
        fig, axarr = plt.subplots(nstations, 2, sharex='col', sharey='row')
        hide_labels = True

        for n in istations:
            station = stations[n]
            pdfs = self.pdf_data[station]
            for l, phase in enumerate(pdfs.keys()):
                try:
                    axarr[n, l].plot(pdfs[phase].axis, pdfs[phase].data())
                    if n == 0:
                        axarr[n, l].set_title(phase)
                    if l == 0:
                        axann = axarr[n, l].annotate(station, xy=(.05, .5),
                                                     xycoords='axes fraction')
                        bbox_props = dict(boxstyle='round', facecolor='lightgrey',
                                          alpha=.7)
                        axann.set_bbox(bbox_props)
                    if n == int(np.median(istations)) and l == 0:
                        label = 'probability density (qualitative)'
                        axarr[n, l].set_ylabel(label)
                except IndexError as e:
                    print('trying aligned plotting\n{0}'.format(e))
                    hide_labels = False
                    axarr[l].plot(pdfs[phase].axis, pdfs[phase].data())
                    axarr[l].set_title(phase)
                    if l == 0:
                        axann = axarr[l].annotate(station, xy=(.05, .5),
                                                  xycoords='axes fraction')
                        bbox_props = dict(boxstyle='round', facecolor='lightgrey',
                                          alpha=.7)
                        axann.set_bbox(bbox_props)
        if hide_labels:
            plt.setp([a.get_xticklabels() for a in axarr[0, :]], visible=False)
            plt.setp([a.get_yticklabels() for a in axarr[:, 1]], visible=False)
            plt.setp([a.get_yticklabels() for a in axarr[:, 0]], visible=False)

        return fig

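# NB (reading the code): PDFDictionary.from_quakeml() currently just forwards
# its argument to the constructor; parsing an actual QuakeML file would
# presumably go through read_events/picksdict_from_picks imported above.
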
class PDFstatistics(object):
    """
    This object can be used to get various statistical values from probability density functions.
    Takes a path as argument.
    """

    def __init__(self, directory):
        """Initialises some values needed when dealing with pdfs later"""
        self._rootdir = directory
        self._evtlist = list()
        self._rphase = None
        self.make_fnlist()

    def make_fnlist(self, fn_pattern='*.xml'):
        """
        Takes a file pattern and searches for it recursively in the path set for the object.
        :param fn_pattern: a pattern that can identify all datafiles (default: '*.xml')
        :type fn_pattern: string
        :return: creates a list of events saved in the PDFstatistics object.
        """
        evtlist = list()
        for root, _, files in os.walk(self.root):
            for file in files:
                if file.endswith(fn_pattern[1:]):
                    evtlist.append(os.path.join(root, file))
        self._evtlist = evtlist

    def __iter__(self):
        for evt in self._evtlist:
            yield PDFDictionary.from_quakeml(evt)

    def __getitem__(self, item):
        evt = find_in_list(self._evtlist, item)
        if evt:
            return PDFDictionary.from_quakeml(evt)
        return None

    @property
    def root(self):
        return self._rootdir

    @root.setter
    def root(self, value):
        if os.path.exists(value):
            self._rootdir = value
        else:
            raise ValueError("path doesn't exist: %s" % value)

    @property
    def curphase(self):
        """
        return the current phase type of interest
        :return: current phase
        """
        return self._rphase

    @curphase.setter
    def curphase(self, type):
        """
        setter method for property curphase
        :param type: specify the phase type of interest
        :type type: string ('p' or 's')
        :return: -
        """
        if type.upper() not in 'PS':
            raise ValueError("phase type must be either 'P' or 'S'!")
        else:
            self._rphase = type.upper()

    def get(self, property='std', value=None):
        """
        takes a property string and a probability value and returns all of the
        property's values for the current phase of interest
        :func:`self.curphase`

        :param property: property name (default: 'std')
        :type property: str
        :param value: probability value :math:`\\alpha`
        :type value: float
        :return: list containing all of the property's values
        """
        assert isinstance(self.curphase, str), \
            'phase has to be set before being able to iterate over items...'
        rlist = []
        method_options = dict(STD='standard_deviation',
                              Q='quantile',
                              QD='quantile_distance',
                              QDF='quantile_dist_frac')

        # create method caller for easy mapping
        if property.upper() == 'STD':
            method = operator.methodcaller(method_options[property.upper()])
        elif value is not None:
            try:
                method = operator.methodcaller(method_options[property.upper()],
                                               value)
            except KeyError:
                raise KeyError('unknown property: {0}'.format(property.upper()))
        else:
            raise ValueError("for call to method {0} value has to be "
                             "defined but is 'None' ".format(method_options[property.upper()]))

        for pdf_dict in self:
            # create worklist
            wlist = pdf_dict.get_all(self.curphase)
            # map method calls to objects in worklist
            rlist += map(method, wlist)

        return rlist

    def writeThetaToFile(self, array, out_dir):
        """
        Method to write array-like data to file. Useful since acquiring the data can
        take a serious amount of time when dealing with large databases.
        :param array: List of values.
        :type array: list
        :param out_dir: Path to save the file to, including the file name.
        :type out_dir: str
        :return: Saves a file at the given output directory.
        """
        with open(os.path.join(out_dir), 'w') as fid:
            for val in array:
                fid.write(str(val) + '\n')


def main():
    root_dir = '/home/sebastianp/Codetesting/xmls/'
    Insheim = PDFstatistics(root_dir)
    Insheim.curphase = 'p'
    qdlist = Insheim.get('qdf', 0.2)
    print(qdlist)


if __name__ == "__main__":
    import cProfile

    pr = cProfile.Profile()
    pr.enable()
    main()
    pr.disable()
    # after your program ends
    pr.print_stats(sort="calls")
pylot/core/pick/picker.py (new file, 502 lines)
@@ -0,0 +1,502 @@
# -*- coding: utf-8 -*-
"""
Created Dec 2014 to Feb 2015
Implementation of the automated picking algorithms published and described in:

Kueperkoch, L., Meier, T., Lee, J., Friederich, W., & Egelados Working Group, 2010:
Automated determination of P-phase arrival times at regional and local distances
using higher order statistics, Geophys. J. Int., 181, 1159-1170

Kueperkoch, L., Meier, T., Bruestle, A., Lee, J., Friederich, W., & Egelados
Working Group, 2012: Automated determination of S-phase arrival times using
autoregressive prediction: application to local and regional distances, Geophys. J. Int.,
188, 687-702.

The picks with the above described algorithms are assumed to be the most likely picks.
For each most likely pick the corresponding earliest and latest possible picks are
calculated after Diehl & Kissling (2009).

:author: MAGS2 EP3 working group / Ludger Kueperkoch
"""

import warnings

import matplotlib.pyplot as plt
import numpy as np
from scipy.signal import argrelmax
from pylot.core.pick.charfuns import CharacteristicFunction
from pylot.core.pick.utils import getnoisewin, getsignalwin


class AutoPicker(object):
    '''
    Superclass of different automated picking algorithms, applied on a CF determined
    using AIC, HOS, or AR prediction.
    '''

    warnings.simplefilter('ignore')

    def __init__(self, cf, TSNR, PickWindow, iplot=0, aus=None, Tsmooth=None, Pick1=None, fig=None, linecolor='k'):
        '''
        :param: cf, characteristic function, on which the picking algorithm is applied
        :type: `~pylot.core.pick.CharFuns.CharacteristicFunction` object

        :param: TSNR, length of time windows around pick used to determine SNR [s]
        :type: tuple (T_noise, T_gap, T_signal)

        :param: PickWindow, length of pick window [s]
        :type: float

        :param: iplot, no. of figure window for plotting interim results
        :type: integer

        :param: aus ("artificial uplift of samples"), find local minimum at i if aic(i-1)*(1+aus) >= aic(i)
        :type: float

        :param: Tsmooth, length of moving smoothing window to calculate smoothed CF [s]
        :type: float

        :param: Pick1, initial (preliminary) onset time, starting point for PragPicker and
        EarlLatePicker
        :type: float

        '''

        assert isinstance(cf, CharacteristicFunction), "%s is not a CharacteristicFunction object" % str(cf)
        self._linecolor = linecolor
        self._pickcolor_p = 'b'
        self.cf = cf.getCF()
        self.Tcf = cf.getTimeArray()
        self.Data = cf.getXCF()
        self.dt = cf.getIncrement()
        self.setTSNR(TSNR)
        self.setPickWindow(PickWindow)
        self.setiplot(iplot)
        self.setaus(aus)
        self.setTsmooth(Tsmooth)
        self.setpick1(Pick1)
        self.fig = fig
        self.calcPick()

    def __str__(self):
        return '''\n\t{name} object:\n
        TSNR:\t\t\t{TSNR}\n
        PickWindow:\t{PickWindow}\n
        aus:\t{aus}\n
        Tsmooth:\t{Tsmooth}\n
        Pick1:\t{Pick1}\n
        '''.format(name=type(self).__name__,
                   TSNR=self.getTSNR(),
                   PickWindow=self.getPickWindow(),
                   aus=self.getaus(),
                   Tsmooth=self.getTsmooth(),
                   Pick1=self.getpick1())

    def getTSNR(self):
        return self.TSNR

    def setTSNR(self, TSNR):
        self.TSNR = TSNR

    def getPickWindow(self):
        return self.PickWindow

    def setPickWindow(self, PickWindow):
        self.PickWindow = PickWindow

    def getaus(self):
        return self.aus

    def setaus(self, aus):
        self.aus = aus

    def setTsmooth(self, Tsmooth):
        self.Tsmooth = Tsmooth

    def getTsmooth(self):
        return self.Tsmooth

    def getpick(self):
        return self.Pick

    def getSNR(self):
        return self.SNR

    def getSlope(self):
        return self.slope

    def getiplot(self):
        return self.iplot

    def setiplot(self, iplot):
        self.iplot = iplot

    def getpick1(self):
        return self.Pick1

    def setpick1(self, Pick1):
        self.Pick1 = Pick1

    def calcPick(self):
        self.Pick = None

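# Design note (a sketch of what the code above does): AutoPicker follows a
# template-method pattern. The constructor wires up the CF and picking
# parameters and then calls calcPick(), which the base class stubs out with
# self.Pick = None; AICPicker and PragPicker below override calcPick() with
# the actual picking logic.
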
class AICPicker(AutoPicker):
    '''
    Method to derive the onset time of an arriving phase based on a CF
    derived from AIC. In order to get an impression of the quality of this initial pick,
    a quality assessment is applied based on SNR and slope determination derived from the CF,
    from which the AIC has been calculated.
    '''

    def calcPick(self):

        print('AICPicker: Get initial onset time (pick) from AIC-CF ...')

        self.Pick = None
        self.slope = None
        self.SNR = None
        plt_flag = 0
        try:
            iplot = int(self.iplot)
        except:
            if self.iplot == True or self.iplot == 'True':
                iplot = 2
            else:
                iplot = 0

        # find NaN's
        nn = np.isnan(self.cf)
        if len(nn) > 1:
            self.cf[nn] = 0
        # taper AIC-CF to get rid of side maxima
        tap = np.hanning(len(self.cf))
        aic = tap * self.cf + max(abs(self.cf))
        # smooth AIC-CF
        ismooth = int(round(self.Tsmooth / self.dt))
        aicsmooth = np.zeros(len(aic))
        if len(aic) < ismooth:
            print('AICPicker: Tsmooth larger than CF!')
            return
        else:
            for i in range(1, len(aic)):
                if i > ismooth:
                    ii1 = i - ismooth
                    aicsmooth[i] = aicsmooth[i - 1] + (aic[i] - aic[ii1]) / ismooth
                else:
                    aicsmooth[i] = np.mean(aic[1: i])
        # remove offset in AIC function
        offset = abs(min(aic) - min(aicsmooth))
        aicsmooth = aicsmooth - offset
        # get maximum of HOS/AR-CF as starting point for searching
        # minimum in AIC function
        icfmax = np.argmax(self.Data[0].data)

        # find minimum in AIC-CF in front of maximum of HOS/AR-CF
        lpickwindow = int(round(self.PickWindow / self.dt))
        for i in range(icfmax - 1, max([icfmax - lpickwindow, 2]), -1):
            if aicsmooth[i - 1] >= aicsmooth[i]:
                self.Pick = self.Tcf[i]
                break
        # if no minimum could be found:
        # search in 1st derivative of AIC-CF
        if self.Pick is None:
            diffcf = np.diff(aicsmooth)
            # find NaN's
            nn = np.isnan(diffcf)
            if len(nn) > 1:
                diffcf[nn] = 0
            # taper CF to get rid of side maxima
            tap = np.hanning(len(diffcf))
            diffcf = tap * diffcf * max(abs(aicsmooth))
            for i in range(icfmax - 1, max([icfmax - lpickwindow, 2]), -1):
                if diffcf[i - 1] >= diffcf[i]:
                    self.Pick = self.Tcf[i]
                    break

        # quality assessment using SNR and slope from CF
        if self.Pick is not None:
            # get noise window
            inoise = getnoisewin(self.Tcf, self.Pick, self.TSNR[0], self.TSNR[1])
            # check, if these are counts or m/s, important for slope estimation!
            # this is quick and dirty, better solution?
            if max(self.Data[0].data) < 1e-3 and max(self.Data[0].data) >= 1e-6:
                self.Data[0].data = self.Data[0].data * 1000000.
            elif max(self.Data[0].data) < 1e-6:
                self.Data[0].data = self.Data[0].data * 1e13
            # get signal window
            isignal = getsignalwin(self.Tcf, self.Pick, self.TSNR[2])
            if len(isignal) == 0:
                return
            ii = min([isignal[len(isignal) - 1], len(self.Tcf)])
            isignal = isignal[0:ii]
            try:
                self.Data[0].data[isignal]
            except IndexError as e:
                msg = "Time series out of bounds! {}".format(e)
                print(msg)
                return
            # calculate SNR from CF
            self.SNR = max(abs(self.Data[0].data[isignal] - np.mean(self.Data[0].data[isignal]))) / \
                       max(abs(self.Data[0].data[inoise] - np.mean(self.Data[0].data[inoise])))
            # calculate slope from CF after initial pick
            # get slope window
            tslope = self.TSNR[3]  # slope determination window
            islope = np.where((self.Tcf <= min([self.Pick + tslope, self.Tcf[-1]]))
                              & (self.Tcf >= self.Pick))  # TODO: put this in a separate function like getsignalwin
            # find maximum within slope determination window
            # because slope should be calculated up to first local minimum only!
            try:
                dataslope = self.Data[0].data[islope[0][0:-1]]
            except IndexError:
                print("Slope Calculation: empty array islope, check signal window")
                return
            if len(dataslope) < 1:
                print('No data in slope window found!')
                return
            imaxs, = argrelmax(dataslope)
            if imaxs.size:
                imax = imaxs[0]
            else:
                imax = np.argmax(dataslope)
            iislope = islope[0][0:imax + 1]
            if len(iislope) < 2:
                # calculate slope from initial onset to maximum of AIC function
                print("AICPicker: Not enough data samples left for slope calculation!")
                print("Calculating slope from initial onset to maximum of AIC function ...")
                imax = np.argmax(aicsmooth[islope[0][0:-1]])
                if imax == 0:
                    print("AICPicker: Maximum for slope determination right at the beginning of the window!")
                    print("Choose longer slope determination window!")
                    if self.iplot > 1:
                        if self.fig is None or self.fig == 'None':
                            fig = plt.figure()
                            plt_flag = 1
                        else:
                            fig = self.fig
                        ax = fig.add_subplot(111)
                        x = self.Data[0].data
                        ax.plot(self.Tcf, x / max(x), color=self._linecolor, linewidth=0.7, label='(HOS-/AR-) Data')
                        ax.plot(self.Tcf, aicsmooth / max(aicsmooth), 'r', label='Smoothed AIC-CF')
                        ax.legend(loc=1)
                        ax.set_xlabel('Time [s] since %s' % self.Data[0].stats.starttime)
                        ax.set_yticks([])
                        ax.set_title(self.Data[0].stats.station)
                        if plt_flag == 1:
                            fig.show()
                            try:
                                input()
                            except SyntaxError:
                                pass
                            plt.close(fig)
                    return
                iislope = islope[0][0:imax + 1]
            dataslope = self.Data[0].data[iislope]
            # calculate slope as polynomial fit of order 1
            xslope = np.arange(0, len(dataslope), 1)
            P = np.polyfit(xslope, dataslope, 1)
            datafit = np.polyval(P, xslope)
            if datafit[0] >= datafit[-1]:
                print('AICPicker: Negative slope, bad onset skipped!')
                return
            self.slope = 1 / (len(dataslope) * self.Data[0].stats.delta) * (datafit[-1] - datafit[0])

        else:
            self.SNR = None
            self.slope = None

        if iplot > 1:
            if self.fig is None or self.fig == 'None':
                fig = plt.figure()  # self.iplot)
                plt_flag = 1
            else:
                fig = self.fig
            fig._tight = True
            ax1 = fig.add_subplot(211)
            x = self.Data[0].data
            if len(self.Tcf) > len(self.Data[0].data):  # why? LK
                self.Tcf = self.Tcf[0:len(self.Tcf) - 1]
            ax1.plot(self.Tcf, x / max(x), color=self._linecolor, linewidth=0.7, label='(HOS-/AR-) Data')
            ax1.plot(self.Tcf, aicsmooth / max(aicsmooth), 'r', label='Smoothed AIC-CF')
            if self.Pick is not None:
                ax1.plot([self.Pick, self.Pick], [-0.1, 0.5], 'b', linewidth=2, label='AIC-Pick')
            ax1.set_xlabel('Time [s] since %s' % self.Data[0].stats.starttime)
            ax1.set_yticks([])
            ax1.legend(loc=1)

            if self.Pick is not None:
                ax2 = fig.add_subplot(2, 1, 2, sharex=ax1)
                ax2.plot(self.Tcf, x, color=self._linecolor, linewidth=0.7, label='Data')
                ax1.axvspan(self.Tcf[inoise[0]], self.Tcf[inoise[-1]], color='y', alpha=0.2, lw=0, label='Noise Window')
                ax1.axvspan(self.Tcf[isignal[0]], self.Tcf[isignal[-1]], color='b', alpha=0.2, lw=0,
                            label='Signal Window')
                ax1.axvspan(self.Tcf[iislope[0]], self.Tcf[iislope[-1]], color='g', alpha=0.2, lw=0,
                            label='Slope Window')

                ax2.axvspan(self.Tcf[inoise[0]], self.Tcf[inoise[-1]], color='y', alpha=0.2, lw=0, label='Noise Window')
                ax2.axvspan(self.Tcf[isignal[0]], self.Tcf[isignal[-1]], color='b', alpha=0.2, lw=0,
                            label='Signal Window')
                ax2.axvspan(self.Tcf[iislope[0]], self.Tcf[iislope[-1]], color='g', alpha=0.2, lw=0,
                            label='Slope Window')
                ax2.plot(self.Tcf[iislope], datafit, 'g', linewidth=2, label='Slope')

                ax1.set_title('Station %s, SNR=%7.2f, Slope= %12.2f counts/s' % (self.Data[0].stats.station,
                                                                                 self.SNR, self.slope))
                ax2.set_xlabel('Time [s] since %s' % self.Data[0].stats.starttime)
                ax2.set_ylabel('Counts')
                ax2.set_yticks([])
                ax2.legend(loc=1)
                if plt_flag == 1:
                    fig.show()
                    try:
                        input()
                    except SyntaxError:
                        pass
                    plt.close(fig)
            else:
                ax1.set_title(self.Data[0].stats.station)
                if plt_flag == 1:
                    fig.show()
                    try:
                        input()
                    except SyntaxError:
                        pass
                    plt.close(fig)

        if self.Pick is None:
            print('AICPicker: Could not find minimum, picking window too short?')

        return

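# Illustrative sketch only: the smoothing loops in AICPicker above and
# PragPicker below implement a trailing (causal) running mean of width
# ismooth; up to the window alignment and edge handling this is equivalent
# to the vectorised form
#     cfsmooth = np.convolve(cf, np.ones(ismooth) / ismooth, mode='same')
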
class PragPicker(AutoPicker):
    '''
    Method of pragmatic picking exploiting the information given by the CF.
    '''

    def calcPick(self):

        try:
            iplot = int(self.getiplot())
        except:
            if self.getiplot() == True or self.getiplot() == 'True':
                iplot = 2
            else:
                iplot = 0

        if self.getpick1() is not None:
            print('PragPicker: Get most likely pick from HOS- or AR-CF using pragmatic picking algorithm ...')

            self.Pick = None
            self.SNR = None
            self.slope = None
            pickflag = 0
            plt_flag = 0
            # smooth CF
            ismooth = int(round(self.Tsmooth / self.dt))
            cfsmooth = np.zeros(len(self.cf))
            if len(self.cf) < ismooth:
                print('PragPicker: Tsmooth larger than CF!')
                return
            else:
                for i in range(1, len(self.cf)):
                    if i > ismooth:
                        ii1 = i - ismooth
                        cfsmooth[i] = cfsmooth[i - 1] + (self.cf[i] - self.cf[ii1]) / ismooth
                    else:
                        cfsmooth[i] = np.mean(self.cf[1: i])

            # select picking window
            # which is centered around tpick1
            ipick = np.where((self.Tcf >= self.getpick1() - self.PickWindow / 2)
                             & (self.Tcf <= self.getpick1() + self.PickWindow / 2))
            cfipick = self.cf[ipick] - np.mean(self.cf[ipick])
            Tcfpick = self.Tcf[ipick]
            cfsmoothipick = cfsmooth[ipick] - np.mean(self.cf[ipick])
            ipick1 = np.argmin(abs(self.Tcf - self.getpick1()))
            cfpick1 = 2 * self.cf[ipick1]

            # check trend of CF, i.e. differences of CF, and adjust aus ("artificial uplift
            # of picks") regarding this trend
            # prominent trend: decrease aus
            # flat: use given aus
            cfdiff = np.diff(cfipick)
            if len(cfdiff) < 20:
                print('PragPicker: Very few samples for CF. Check LTA window dimensions!')
            i0diff = np.where(cfdiff > 0)
            cfdiff = cfdiff[i0diff]
            if len(cfdiff) < 1:
                print('PragPicker: Negative slope for CF. Check LTA window dimensions! STOP')
                self.Pick = None
                return
            minaus = min(cfdiff * (1 + self.aus))
            aus1 = max([minaus, self.aus])

            # at first we look to the right until the end of the pick window is reached
            flagpick_r = 0
            flagpick_l = 0
            cfpick_r = 0
            cfpick_l = 0
            lpickwindow = int(round(self.PickWindow / self.dt))
            for i in range(max(np.insert(ipick, 0, 2)), min([ipick1 + lpickwindow + 1, len(self.cf) - 1])):
                if self.cf[i + 1] > self.cf[i] and self.cf[i - 1] >= self.cf[i]:
                    if cfsmooth[i - 1] * (1 + aus1) >= cfsmooth[i]:
                        if cfpick1 >= self.cf[i]:
                            pick_r = self.Tcf[i]
                            self.Pick = pick_r
                            flagpick_l = 1
                            cfpick_r = self.cf[i]
                            break

            # now we look to the left
            if len(self.cf) > ipick1 + 1:
                for i in range(ipick1, max([ipick1 - lpickwindow + 1, 2]), -1):
                    if self.cf[i + 1] > self.cf[i] and self.cf[i - 1] >= self.cf[i]:
                        if cfsmooth[i - 1] * (1 + aus1) >= cfsmooth[i]:
                            if cfpick1 >= self.cf[i]:
                                pick_l = self.Tcf[i]
                                self.Pick = pick_l
                                flagpick_r = 1
                                cfpick_l = self.cf[i]
                                break
            else:
                msg = 'PragPicker: Initial onset too close to start of CF! ' \
                      'Stop finalizing pick to the left.'
                print(msg)

            # now decide which pick: left or right?
            if flagpick_l > 0 and flagpick_r > 0 and cfpick_l <= 3 * cfpick_r:
                self.Pick = pick_l
                pickflag = 1
            elif flagpick_l > 0 and flagpick_r > 0 and cfpick_l >= cfpick_r:
                self.Pick = pick_r
                pickflag = 1
            elif flagpick_l == 0 and flagpick_r > 0 and cfpick_l >= cfpick_r:
                self.Pick = pick_l
                pickflag = 1
            else:
                print("PragPicker: Could not find reliable onset!")
                self.Pick = None
                pickflag = 0

            if iplot > 1:
                if self.fig is None or self.fig == 'None':
                    fig = plt.figure()  # self.getiplot())
                    plt_flag = 1
                else:
                    fig = self.fig
                fig._tight = True
                ax = fig.add_subplot(111)
                ax.plot(Tcfpick, cfipick, color=self._linecolor, linewidth=0.7, label='CF')
                ax.plot(Tcfpick, cfsmoothipick, 'r', label='Smoothed CF')
                if pickflag > 0:
                    ax.plot([self.Pick, self.Pick], [min(cfipick), max(cfipick)], self._pickcolor_p, linewidth=2,
                            label='Pick')
                ax.set_xlabel('Time [s] since %s' % self.Data[0].stats.starttime)
                ax.set_yticks([])
                ax.set_title(self.Data[0].stats.station)
                ax.legend(loc=1)
                if plt_flag == 1:
                    fig.show()
                    try:
                        input()
                    except SyntaxError:
                        pass
                    plt.close(fig)
            return

        else:
            print("PragPicker: No initial onset time given! Check input!")
            self.Pick = None
            return
pylot/core/pick/utils.py (new file, 1186 lines)

pylot/core/util/__init__.py (new file, 2 lines)
@@ -0,0 +1,2 @@
# -*- coding: utf-8 -*-
from pylot.core.util.version import get_git_version as _getVersionString

pylot/core/util/connection.py (new file, 16 lines)
@@ -0,0 +1,16 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

try:
    from urllib2 import urlopen  # Python 2
except ImportError:
    from urllib.request import urlopen  # Python 3


def checkurl(url='https://ariadne.geophysik.ruhr-uni-bochum.de/trac/PyLoT/'):
    try:
        urlopen(url, timeout=1)
        return True
    except:
        pass
    return False
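# Usage sketch (hypothetical caller): checkurl() lets the GUI probe
# connectivity cheaply before offering online resources, e.g.
#     if checkurl():
#         enable_online_help()  # hypothetical function
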
pylot/core/util/dataprocessing.py (new file, 353 lines)
@@ -0,0 +1,353 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import glob
import os
import sys

import numpy as np
from obspy import UTCDateTime, read_inventory, read
from obspy.io.xseed import Parser
from pylot.core.util.utils import key_for_set_value, find_in_list, \
    remove_underscores, gen_Pool


def time_from_header(header):
    """
    Function takes in the second line from a .gse file and extracts the date and time from it.
    :param header: second line from .gse file
    :type header: string
    :return: a list of integers of form [year, month, day, hour, minute, second, microsecond]
    """
    timeline = header.split(' ')
    time = timeline[1].split('/') + timeline[2].split(':')
    time = time[:-1] + time[-1].split('.')
    return [int(t) for t in time]


def check_time(datetime):
    """
    Function takes in date and time as a list and validates its values by trying to create a
    UTCDateTime object from it
    :param datetime: list of integers [year, month, day, hour, minute, second, microsecond]
    :type datetime: list
    :return: returns True if the values are in the supposed range, returns False otherwise

    >>> check_time([1999, 1, 1, 23, 59, 59, 999000])
    True
    >>> check_time([1999, 1, 1, 23, 59, 60, 999000])
    False
    >>> check_time([1999, 1, 1, 23, 59, 59, 1000000])
    False
    >>> check_time([1999, 1, 1, 23, 60, 59, 999000])
    False
    >>> check_time([1999, 1, 1, 24, 59, 59, 999000])
    False
    >>> check_time([1999, 1, 31, 23, 59, 59, 999000])
    True
    >>> check_time([1999, 2, 30, 23, 59, 59, 999000])
    False
    >>> check_time([1999, 2, 29, 23, 59, 59, 999000])
    False
    >>> check_time([2000, 2, 29, 23, 59, 59, 999000])
    True
    >>> check_time([2000, 13, 29, 23, 59, 59, 999000])
    False
    """
    try:
        UTCDateTime(*datetime)
        return True
    except ValueError:
        return False

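# The examples in check_time's docstring double as doctests; they can be run
# with the standard library's doctest runner, e.g.
#     python -m doctest pylot/core/util/dataprocessing.py -v
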
def get_file_list(root_dir):
|
||||
"""
|
||||
Function uses a directorie to get all the *.gse files from it.
|
||||
:param root_dir: a directorie leading to the .gse files
|
||||
:type root_dir: string
|
||||
:return: returns a list of filenames (without path to them)
|
||||
"""
|
||||
file_list = glob.glob1(root_dir, '*.gse')
|
||||
return file_list
|
||||
|
||||
|
||||
def checks_station_second(datetime, file):
|
||||
"""
|
||||
Function uses the given list to check if the parameter 'second' is set to 60 by mistake
|
||||
and sets the time correctly if so. Can only correct time if no date change would be necessary.
|
||||
:param datetime: [year, month, day, hour, minute, second, microsecond]
|
||||
:return: returns the input with the correct value for second
|
||||
"""
|
||||
if datetime[5] == 60:
|
||||
if datetime[4] == 59:
|
||||
if datetime[3] == 23:
|
||||
err_msg = 'Date should be next day. ' \
|
||||
'File not changed: {0}'.format(file)
|
||||
raise ValueError(err_msg)
|
||||
else:
|
||||
datetime[3] += 1
|
||||
datetime[4] = 0
|
||||
datetime[5] = 0
|
||||
else:
|
||||
datetime[4] += 1
|
||||
datetime[5] = 0
|
||||
return datetime
|
||||
|
||||
|
||||
def make_time_line(line, datetime):
|
||||
"""
|
||||
Function takes in the original line from a .gse file and a list of date and
|
||||
time values to make a new line with corrected date and time.
|
||||
:param line: second line from .gse file.
|
||||
:type line: string
|
||||
:param datetime: list of integers [year, month, day, hour, minute, second, microsecond]
|
||||
:type datetime: list
|
||||
:return: returns a string to write it into a file.
|
||||
"""
|
||||
ins_form = '{0:02d}:{1:02d}:{2:02d}.{3:03d}'
|
||||
insertion = ins_form.format(int(datetime[3]),
|
||||
int(datetime[4]),
|
||||
int(datetime[5]),
|
||||
int(datetime[6] * 1e-3))
|
||||
newline = line[:16] + insertion + line[28:]
|
||||
return newline
|
||||
|
||||
|
||||
def evt_head_check(root_dir, out_dir=None):
|
||||
"""
|
||||
A function to make sure that an arbitrary number of .gse files have correct values in their header.
|
||||
:param root_dir: a directory leading to the .gse files.
|
||||
:type root_dir: string
|
||||
:param out_dir: a directory to store the new files somwhere els.
|
||||
:return: returns nothing
|
||||
"""
|
||||
if not out_dir:
|
||||
print('WARNING files are going to be overwritten!')
|
||||
inp = str(raw_input('Continue? [y/N]'))
|
||||
if not inp == 'y':
|
||||
sys.exit()
|
||||
filelist = get_file_list(root_dir)
|
||||
nfiles = 0
|
||||
for file in filelist:
|
||||
infile = open(os.path.join(root_dir, file), 'r')
|
||||
lines = infile.readlines()
|
||||
infile.close()
|
||||
datetime = time_from_header(lines[1])
|
||||
if check_time(datetime):
|
||||
continue
|
||||
else:
|
||||
nfiles += 1
|
||||
datetime = checks_station_second(datetime, file)
|
||||
print('writing ' + file)
|
||||
# write File
|
||||
lines[1] = make_time_line(lines[1], datetime)
|
||||
if not out_dir:
|
||||
out = open(os.path.join(root_dir, file), 'w')
|
||||
out.writelines(lines)
|
||||
out.close()
|
||||
else:
|
||||
out = open(os.path.join(out_dir, file), 'w')
|
||||
out.writelines(lines)
|
||||
out.close()
|
||||
print(nfiles)
|
||||
|
||||
|
||||
def read_metadata(path_to_inventory):
|
||||
"""
|
||||
take path_to_inventory and return either the corresponding list of files
|
||||
found or the Parser object for a network dataless seed volume to prevent
|
||||
read overhead for large dataless seed volumes
|
||||
:param path_to_inventory:
|
||||
:return: tuple containing a either list of files or `obspy.io.xseed.Parser`
|
||||
object and the inventory type found
|
||||
:rtype: tuple
|
||||
"""
|
||||
dlfile = list()
|
||||
invfile = list()
|
||||
respfile = list()
|
||||
# possible file extensions specified here:
|
||||
inv = dict(dless=dlfile, xml=invfile, resp=respfile, dseed=dlfile[:])
|
||||
if os.path.isfile(path_to_inventory):
|
||||
ext = os.path.splitext(path_to_inventory)[1].split('.')[1]
|
||||
inv[ext] += [path_to_inventory]
|
||||
else:
|
||||
for ext in inv.keys():
|
||||
inv[ext] += glob.glob1(path_to_inventory, '*.{0}'.format(ext))
|
||||
|
||||
invtype = key_for_set_value(inv)
|
||||
|
||||
if invtype is None:
|
||||
print("Neither dataless-SEED file, inventory-xml file nor "
|
||||
"RESP-file found!")
|
||||
print("!!WRONG CALCULATION OF SOURCE PARAMETERS!!")
|
||||
robj = None,
|
||||
elif invtype == 'dless': # prevent multiple read of large dlsv
|
||||
print("Reading metadata information from dataless-SEED file ...")
|
||||
if len(inv[invtype]) == 1:
|
||||
fullpath_inv = os.path.join(path_to_inventory, inv[invtype][0])
|
||||
robj = Parser(fullpath_inv)
|
||||
else:
|
||||
robj = inv[invtype]
|
||||
else:
|
||||
print("Reading metadata information from inventory-xml file ...")
|
||||
robj = inv[invtype]
|
||||
return invtype, robj
|
||||
|
||||
|
||||
def restitute_trace(input_tuple):
    tr, invtype, inobj, unit, force = input_tuple

    remove_trace = False

    seed_id = tr.get_id()
    # check, whether this trace has already been corrected
    if 'processing' in tr.stats.keys() \
            and np.any(['remove' in p for p in tr.stats.processing]) \
            and not force:
        print("Trace {0} has already been corrected!".format(seed_id))
        return tr, False
    stime = tr.stats.starttime
    prefilt = get_prefilt(tr)
    if invtype == 'resp':
        fresp = find_in_list(inobj, seed_id)
        if not fresp:
            raise IOError('no response file found '
                          'for trace {0}'.format(seed_id))
        fname = fresp
        seedresp = dict(filename=fname,
                        date=stime,
                        units=unit)
        kwargs = dict(paz_remove=None, pre_filt=prefilt, seedresp=seedresp)
    elif invtype == 'dless':
        if type(inobj) is list:
            fname = Parser(find_in_list(inobj, seed_id))
        else:
            fname = inobj
        seedresp = dict(filename=fname,
                        date=stime,
                        units=unit)
        kwargs = dict(pre_filt=prefilt, seedresp=seedresp)
    elif invtype == 'xml':
        invlist = inobj
        if len(invlist) > 1:
            finv = find_in_list(invlist, seed_id)
        else:
            finv = invlist[0]
        inventory = read_inventory(finv, format='STATIONXML')
    elif invtype is None:
        print("No restitution possible, as there is no station metadata available!")
        return tr, True
    else:
        remove_trace = True
    # apply restitution to data
    print("Correcting instrument at station %s, channel %s"
          % (tr.stats.station, tr.stats.channel))
    try:
        if invtype in ['resp', 'dless']:
            try:
                tr.simulate(**kwargs)
            except ValueError as e:
                vmsg = '{0}'.format(e)
                print(vmsg)

        else:
            tr.attach_response(inventory)
            tr.remove_response(output=unit,
                               pre_filt=prefilt)
    except ValueError as e:
        msg0 = 'Response for {0} not found in Parser'.format(seed_id)
        msg1 = 'evalresp failed to calculate response'
        # re-raise unless the error matches one of the two known, tolerable cases
        if msg0 not in str(e) and msg1 not in str(e):
            raise
        else:
            # restitution done to copies of data thus deleting traces
            # that failed should not be a problem
            remove_trace = True

    return tr, remove_trace


def restitute_data(data, invtype, inobj, unit='VEL', force=False, ncores=0):
    """
    takes a data stream and a path_to_inventory and returns the corrected
    waveform data stream
    :param data: seismic data stream
    :param invtype: type of found metadata
    :param inobj: either list of metadata files or `obspy.io.xseed.Parser`
        object
    :param unit: unit to correct for (default: 'VEL')
    :param force: force restitution for already corrected traces (default:
        False)
    :param ncores: number of cores to use (0 means all available)
    :return: corrected data stream
    """

    restflag = list()

    data = remove_underscores(data)

    # loop over traces; gather all input tuples first, because removing
    # traces from the stream while iterating over it would skip every
    # other trace
    input_tuples = []
    for tr in data:
        input_tuples.append((tr, invtype, inobj, unit, force))
    data.traces = []

    pool = gen_Pool(ncores)
    result = pool.map(restitute_trace, input_tuples)
    pool.close()

    for tr, remove_trace in result:
        if not remove_trace:
            data.traces.append(tr)

    # check if ALL traces could be restituted, take care of large datasets
    # better try restitution for smaller subsets of data (e.g. station by
    # station)

    # if len(restflag) > 0:
    #     restflag = bool(np.all(restflag))
    # else:
    #     restflag = False
    return data

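# Usage sketch (illustrative only): the intended end-to-end restitution
# pipeline; the file paths are placeholders and `st` is an obspy Stream.
#
# >>> from obspy import read  # doctest: +SKIP
# >>> st = read('path/to/waveforms.mseed')  # doctest: +SKIP
# >>> invtype, inobj = read_metadata('path/to/inventory')  # doctest: +SKIP
# >>> st = restitute_data(st, invtype, inobj, unit='VEL')  # doctest: +SKIP
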
def get_prefilt(trace, tlow=(0.5, 0.9), thi=(5., 2.), verbosity=0):
    """
    takes an `obspy.core.stream.Trace` object, taper parameters tlow and thi
    and returns the pre-filtering corner frequencies for the cosine taper for
    further processing
    :param trace: seismic data trace
    :type trace: `obspy.core.stream.Trace`
    :param tlow: tuple or list containing the desired lower corner
        frequencies for a cosine taper
    :type tlow: tuple or list
    :param thi: tuple or list containing the percentage values of the
        Nyquist frequency for the desired upper corner frequencies of the
        cosine taper
    :type thi: tuple or list
    :param verbosity: verbosity level
    :type verbosity: int
    :return: pre-filt cosine taper corner frequencies
    :rtype: tuple

    .. example::

    >>> st = read()
    >>> get_prefilt(st[0])
    (0.5, 0.9, 47.5, 49.0)
    """
    if verbosity:
        print("Calculating pre-filter values for %s, %s ..." % (
            trace.stats.station, trace.stats.channel))
    # get corner frequencies for pre-filtering
    fny = trace.stats.sampling_rate / 2
    fc21 = fny - (fny * thi[0] / 100.)
    fc22 = fny - (fny * thi[1] / 100.)
    return (tlow[0], tlow[1], fc21, fc22)


if __name__ == "__main__":
    import doctest

    doctest.testmod()
95
pylot/core/util/defaults.py
Normal file
@@ -0,0 +1,95 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Created on Wed Feb 26 12:31:25 2014

@author: sebastianw
"""

import os
import platform

from pylot.core.util.utils import readDefaultFilterInformation
from pylot.core.loc import hypo71
from pylot.core.loc import hypodd
from pylot.core.loc import hyposat
from pylot.core.loc import nll
from pylot.core.loc import velest


# determine system dependent path separator
system_name = platform.system()
if system_name in ["Linux", "Darwin"]:
    SEPARATOR = '/'
elif system_name == "Windows":
    SEPARATOR = '\\'

# suffix for phase name if not phase identified by last letter (P, p, etc.)
ALTSUFFIX = ['diff', 'n', 'g', '1', '2', '3']

FILTERDEFAULTS = readDefaultFilterInformation(os.path.join(os.path.expanduser('~'),
                                                           '.pylot',
                                                           'pylot.in'))

TIMEERROR_DEFAULTS = os.path.join(os.path.expanduser('~'),
                                  '.pylot',
                                  'PILOT_TimeErrors.in')

OUTPUTFORMATS = {'.xml': 'QUAKEML',
                 '.cnv': 'CNV',
                 '.obs': 'NLLOC_OBS'}

LOCTOOLS = dict(nll=nll, hyposat=hyposat, velest=velest, hypo71=hypo71, hypodd=hypodd)


class SetChannelComponents(object):
    def __init__(self):
        self.setDefaultCompPosition()

    def setDefaultCompPosition(self):
        # default component order
        self.compPosition_Map = dict(Z=2, N=1, E=0)
        self.compName_Map = {'3': 'Z',
                             '1': 'N',
                             '2': 'E'}

    def _getCurrentPosition(self, component):
        for key, value in self.compName_Map.items():
            if value == component:
                return key, value
        errMsg = 'getCurrentPosition: Could not find former position of component {}.'.format(component)
        raise ValueError(errMsg)

    def _switch(self, component, component_alter):
        # Without switching, multiple definitions of the same alter_comp are possible
        old_alter_comp, _ = self._getCurrentPosition(component)
        old_comp = self.compName_Map[component_alter]
        if not old_alter_comp == component_alter and not old_comp == component:
            self.compName_Map[old_alter_comp] = old_comp
            print('switch: Automatically switched component {} to {}'.format(old_alter_comp, old_comp))

    def setCompPosition(self, component_alter, component, switch=True):
        component_alter = str(component_alter)
        if component_alter not in self.compName_Map.keys():
            errMsg = 'setCompPosition: Unrecognized alternative component {}. Expecting one of {}.'
            raise ValueError(errMsg.format(component_alter, self.compName_Map.keys()))
        if component not in self.compPosition_Map.keys():
            errMsg = 'setCompPosition: Unrecognized target component {}. Expecting one of {}.'
            raise ValueError(errMsg.format(component, self.compPosition_Map.keys()))
        print('setCompPosition: set component {} to {}'.format(component_alter, component))
        if switch:
            self._switch(component, component_alter)
        self.compName_Map[component_alter] = component

    def getCompPosition(self, component):
        return self._getCurrentPosition(component)[0]

    def getPlotPosition(self, component):
        component = str(component)
        if component in self.compPosition_Map.keys():
            return self.compPosition_Map[component]
        elif component in self.compName_Map.keys():
            return self.compPosition_Map[self.compName_Map[component]]
        else:
            errMsg = 'getCompPosition: Unrecognized component {}. Expecting one of {} or {}.'
            raise ValueError(errMsg.format(component, self.compPosition_Map.keys(), self.compName_Map.keys()))
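
# Usage sketch (illustrative only): mapping numeric channel codes onto ZNE
# plot positions with SetChannelComponents; values are the defaults above.
#
# >>> scc = SetChannelComponents()  # doctest: +SKIP
# >>> scc.getPlotPosition('3')  # '3' maps to 'Z', plotted at position 2  # doctest: +SKIP
# 2
# >>> scc.setCompPosition('1', 'E')  # re-map alternative component '1' to East  # doctest: +SKIP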
30
pylot/core/util/errors.py
Normal file
@@ -0,0 +1,30 @@
# -*- coding: utf-8 -*-
"""
Created on Thu Mar 20 09:47:04 2014

@author: sebastianw
"""


class OptionsError(Exception):
    pass


class FormatError(Exception):
    pass


class DatastructureError(Exception):
    pass


class OverwriteError(IOError):
    pass


class ParameterError(Exception):
    pass


class ProcessingError(RuntimeError):
    pass
172
pylot/core/util/event.py
Normal file
@@ -0,0 +1,172 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import os

from obspy import UTCDateTime
from obspy.core.event import Event as ObsPyEvent
from obspy.core.event import Origin, ResourceIdentifier
from pylot.core.io.phases import picks_from_picksdict


class Event(ObsPyEvent):
    '''
    Pickable class derived from ~obspy.core.event.Event containing information on a single event.
    '''

    def __init__(self, path):
        self.pylot_id = path.split('/')[-1]
        # initialize super class
        super(Event, self).__init__(resource_id=ResourceIdentifier('smi:local/' + self.pylot_id))
        self.path = path
        self.database = path.split('/')[-2]
        self.datapath = path.split('/')[-3]
        self.rootpath = '/' + os.path.join(*path.split('/')[:-3])
        self.pylot_autopicks = {}
        self.pylot_picks = {}
        self.notes = ''
        self._testEvent = False
        self._refEvent = False
        self.get_notes()

    def get_notes_path(self):
        notesfile = os.path.join(self.path, 'notes.txt')
        return notesfile

    def get_notes(self):
        notesfile = self.get_notes_path()
        if os.path.isfile(notesfile):
            with open(notesfile) as infile:
                path = str(infile.readlines()[0].split('\n')[0])
                text = '[eventInfo: ' + path + ']'
                self.addNotes(text)
                try:
                    datetime = UTCDateTime(path.split('/')[-1])
                    origin = Origin(resource_id=self.resource_id, time=datetime, latitude=0, longitude=0, depth=0)
                    self.origins.append(origin)
                except Exception:
                    pass

    def addNotes(self, notes):
        self.notes = str(notes)

    def clearNotes(self):
        self.notes = None

    def isRefEvent(self):
        return self._refEvent

    def isTestEvent(self):
        return self._testEvent

    def setRefEvent(self, value):
        self._refEvent = value
        if value: self._testEvent = False

    def setTestEvent(self, value):
        self._testEvent = value
        if value: self._refEvent = False

    def clearObsPyPicks(self, picktype):
        for index, pick in reversed(list(enumerate(self.picks))):
            if picktype in str(pick.method_id):
                self.picks.pop(index)

    def addPicks(self, picks):
        '''
        add pylot picks and overwrite existing ones
        '''
        for station in picks:
            self.pylot_picks[station] = picks[station]
        # add ObsPy picks (clear old manual and copy all new manual from pylot)
        self.clearObsPyPicks('manual')
        self.picks += picks_from_picksdict(self.pylot_picks)

    def addAutopicks(self, autopicks):
        for station in autopicks:
            self.pylot_autopicks[station] = autopicks[station]
        # add ObsPy picks (clear old auto and copy all new auto from pylot)
        self.clearObsPyPicks('auto')
        self.picks += picks_from_picksdict(self.pylot_autopicks)

    def setPick(self, station, pick):
        if pick:
            self.pylot_picks[station] = pick
        else:
            try:
                self.pylot_picks.pop(station)
            except Exception as e:
                print('Could not remove pick {} from station {}: {}'.format(pick, station, e))
        self.clearObsPyPicks('manual')
        self.picks += picks_from_picksdict(self.pylot_picks)

    def setPicks(self, picks):
        '''
        set pylot picks and delete and overwrite all existing
        '''
        self.pylot_picks = picks
        self.clearObsPyPicks('manual')
        self.picks += picks_from_picksdict(self.pylot_picks)

    def getPick(self, station):
        if station in self.pylot_picks.keys():
            return self.pylot_picks[station]

    def getPicks(self):
        return self.pylot_picks

    def setAutopick(self, station, pick):
        if pick:
            self.pylot_autopicks[station] = pick
        else:
            try:
                self.pylot_autopicks.pop(station)
            except Exception as e:
                print('Could not remove pick {} from station {}: {}'.format(pick, station, e))
        self.clearObsPyPicks('auto')
        self.picks += picks_from_picksdict(self.pylot_autopicks)

    def setAutopicks(self, picks):
        '''
        set pylot autopicks and delete and overwrite all existing
        '''
        self.pylot_autopicks = picks
        self.clearObsPyPicks('auto')
        self.picks += picks_from_picksdict(self.pylot_autopicks)

    def getAutopick(self, station):
        if station in self.pylot_autopicks.keys():
            return self.pylot_autopicks[station]

    def getAutopicks(self):
        return self.pylot_autopicks

    def save(self, filename):
        '''
        Save PyLoT Event to a file.
        Can be loaded by using event.load(filename).
        '''
        try:
            import cPickle
        except ImportError:
            import _pickle as cPickle

        try:
            with open(filename, 'wb') as outfile:
                cPickle.dump(self, outfile, -1)
        except Exception as e:
            print('Could not pickle PyLoT event. Reason: {}'.format(e))

    @staticmethod
    def load(filename):
        '''
        Load project from filename.
        '''
        try:
            import cPickle
        except ImportError:
            import _pickle as cPickle
        with open(filename, 'rb') as infile:
            event = cPickle.load(infile)
        print('Loaded %s' % filename)
        return event
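
# Usage sketch (illustrative only): round-tripping a PyLoT Event through
# save/load; the event path and file name are placeholders.
#
# >>> event = Event('/data/database/e0001.015.17')  # doctest: +SKIP
# >>> event.save('e0001.pylot')  # doctest: +SKIP
# >>> restored = Event.load('e0001.pylot')  # doctest: +SKIP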
376
pylot/core/util/map_projection.py
Normal file
@@ -0,0 +1,376 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import matplotlib.pyplot as plt
import numpy as np
import obspy
from PySide import QtGui
from matplotlib.backends.backend_qt4agg import NavigationToolbar2QT as NavigationToolbar
from mpl_toolkits.basemap import Basemap
from pylot.core.util.widgets import PickDlg
from scipy.interpolate import griddata

plt.interactive(False)


class map_projection(QtGui.QWidget):
    def __init__(self, parent, figure=None):
        '''
        :param parent: PyLoT main window used to fetch metadata, waveform data and picks
        :param figure: optional matplotlib figure to draw the map into
        '''
        QtGui.QWidget.__init__(self)
        self._parent = parent
        self.metadata = parent.metadata
        self.parser = parent.metadata[1]
        self.picks = None
        self.picks_dict = None
        self.eventLoc = None
        self.figure = figure
        self.init_graphics()
        self.init_stations()
        self.init_basemap(resolution='l')
        self.init_map()
        # self.show()

    def init_map(self):
        self.init_lat_lon_dimensions()
        self.init_lat_lon_grid()
        self.init_x_y_dimensions()
        self.connectSignals()
        self.draw_everything()

    def onpick(self, event):
        ind = event.ind
        button = event.mouseevent.button
        if not len(ind) or button != 1:
            return
        data = self._parent.get_data().getWFData()
        for index in ind:
            station = str(self.station_names[index].split('.')[-1])
            try:
                pickDlg = PickDlg(self, parameter=self._parent._inputs,
                                  data=data.select(station=station),
                                  station=station,
                                  picks=self._parent.get_current_event().getPick(station),
                                  autopicks=self._parent.get_current_event().getAutopick(station))
            except Exception as e:
                message = 'Could not generate Plot for station {st}.\n {er}'.format(st=station, er=e)
                self._warn(message)
                print(message, e)
                return
            pyl_mw = self._parent
            try:
                if pickDlg.exec_():
                    pyl_mw.setDirty(True)
                    pyl_mw.update_status('picks accepted ({0})'.format(station))
                    replot = pyl_mw.get_current_event().setPick(station, pickDlg.getPicks())
                    self._refresh_drawings()
                    if replot:
                        pyl_mw.plotWaveformData()
                        pyl_mw.drawPicks()
                        pyl_mw.draw()
                    else:
                        pyl_mw.drawPicks(station)
                        pyl_mw.draw()
                else:
                    pyl_mw.update_status('picks discarded ({0})'.format(station))
            except Exception as e:
                message = 'Could not save picks for station {st}.\n{er}'.format(st=station, er=e)
                self._warn(message)
                print(message, e)

    def connectSignals(self):
        self.comboBox_phase.currentIndexChanged.connect(self._refresh_drawings)
        self.zoom_id = self.basemap.ax.figure.canvas.mpl_connect('scroll_event', self.zoom)

    def init_graphics(self):
        if not self.figure:
            if not hasattr(self._parent, 'am_figure'):
                self.figure = plt.figure()
                self.toolbar = NavigationToolbar(self.figure.canvas, self)
            else:
                self.figure = self._parent.am_figure
                self.toolbar = self._parent.am_toolbar

        self.main_ax = self.figure.add_subplot(111)
        self.canvas = self.figure.canvas

        self.main_box = QtGui.QVBoxLayout()
        self.setLayout(self.main_box)

        self.top_row = QtGui.QHBoxLayout()
        self.main_box.addLayout(self.top_row)

        self.comboBox_phase = QtGui.QComboBox()
        self.comboBox_phase.insertItem(0, 'P')
        self.comboBox_phase.insertItem(1, 'S')

        self.comboBox_am = QtGui.QComboBox()
        self.comboBox_am.insertItem(0, 'auto')
        self.comboBox_am.insertItem(1, 'manual')

        self.top_row.addWidget(QtGui.QLabel('Select a phase: '))
        self.top_row.addWidget(self.comboBox_phase)
        self.top_row.setStretch(1, 1)  # set stretch of item 1 to 1

        self.main_box.addWidget(self.canvas)
        self.main_box.addWidget(self.toolbar)

    def init_stations(self):
        def get_station_names_lat_lon(parser):
            station_names = []
            lat = []
            lon = []
            for station in parser.stations:
                station_name = station[0].station_call_letters
                network = station[0].network_code
                # compare the fully qualified name; the bare station name
                # would never match the 'network.station' entries
                if network + '.' + station_name not in station_names:
                    station_names.append(network + '.' + station_name)
                    lat.append(station[0].latitude)
                    lon.append(station[0].longitude)
            return station_names, lat, lon

        station_names, lat, lon = get_station_names_lat_lon(self.parser)
        self.station_names = station_names
        self.lat = lat
        self.lon = lon

    def init_picks(self):
        phase = self.comboBox_phase.currentText()

        def get_picks(station_names):
            picks = []
            for station in station_names:
                try:
                    station = station.split('.')[-1]
                    picks.append(self.picks_dict[station][phase]['mpp'])
                except Exception:
                    picks.append(np.nan)
            return picks

        def get_picks_rel(picks):
            picks_rel = []
            picks_utc = []
            for pick in picks:
                if type(pick) is obspy.core.utcdatetime.UTCDateTime:
                    picks_utc.append(pick)
            minp = min(picks_utc)
            for pick in picks:
                if type(pick) is obspy.core.utcdatetime.UTCDateTime:
                    pick -= minp
                picks_rel.append(pick)
            return picks_rel

        self.picks = get_picks(self.station_names)
        self.picks_rel = get_picks_rel(self.picks)

    def init_picks_active(self):
        def remove_nan_picks(picks):
            picks_no_nan = []
            for pick in picks:
                if not np.isnan(pick):
                    picks_no_nan.append(pick)
            return picks_no_nan

        self.picks_no_nan = remove_nan_picks(self.picks_rel)

    def init_stations_active(self):
        def remove_nan_lat_lon(picks, lat, lon):
            lat_no_nan = []
            lon_no_nan = []
            for index, pick in enumerate(picks):
                if not np.isnan(pick):
                    lat_no_nan.append(lat[index])
                    lon_no_nan.append(lon[index])
            return lat_no_nan, lon_no_nan

        self.lat_no_nan, self.lon_no_nan = remove_nan_lat_lon(self.picks_rel, self.lat, self.lon)

    def init_lat_lon_dimensions(self):
        def get_lon_lat_dim(lon, lat):
            londim = max(lon) - min(lon)
            latdim = max(lat) - min(lat)
            return londim, latdim

        self.londim, self.latdim = get_lon_lat_dim(self.lon, self.lat)

    def init_x_y_dimensions(self):
        def get_x_y_dim(x, y):
            xdim = max(x) - min(x)
            ydim = max(y) - min(y)
            return xdim, ydim

        self.x, self.y = self.basemap(self.lon, self.lat)
        self.xdim, self.ydim = get_x_y_dim(self.x, self.y)

    def init_basemap(self, resolution='l'):
        # basemap = Basemap(projection=projection, resolution=resolution, ax=self.main_ax)
        basemap = Basemap(projection='lcc', resolution=resolution, ax=self.main_ax,
                          width=5e6, height=2e6,
                          lat_0=(min(self.lat) + max(self.lat)) / 2.,
                          lon_0=(min(self.lon) + max(self.lon)) / 2.)

        # basemap.fillcontinents(color=None, lake_color='aqua', zorder=1)
        basemap.drawmapboundary(zorder=2)  # fill_color='darkblue')
        basemap.shadedrelief(zorder=3)
        basemap.drawcountries(zorder=4)
        basemap.drawstates(zorder=5)
        basemap.drawcoastlines(zorder=6)
        self.basemap = basemap
        self.figure.tight_layout()

    def init_lat_lon_grid(self):
        def get_lat_lon_axis(lat, lon):
            steplat = (max(lat) - min(lat)) / 250
            steplon = (max(lon) - min(lon)) / 250

            lataxis = np.arange(min(lat), max(lat), steplat)
            lonaxis = np.arange(min(lon), max(lon), steplon)
            return lataxis, lonaxis

        def get_lat_lon_grid(lataxis, lonaxis):
            longrid, latgrid = np.meshgrid(lonaxis, lataxis)
            return latgrid, longrid

        self.lataxis, self.lonaxis = get_lat_lon_axis(self.lat, self.lon)
        self.latgrid, self.longrid = get_lat_lon_grid(self.lataxis, self.lonaxis)

    def init_picksgrid(self):
        self.picksgrid_no_nan = griddata((self.lat_no_nan, self.lon_no_nan),
                                         self.picks_no_nan, (self.latgrid, self.longrid),
                                         method='linear')

    def draw_contour_filled(self, nlevel=50):
        levels = np.linspace(min(self.picks_no_nan), max(self.picks_no_nan), nlevel)
        self.contourf = self.basemap.contourf(self.longrid, self.latgrid, self.picksgrid_no_nan,
                                              levels, latlon=True, zorder=9, alpha=0.5)

    def scatter_all_stations(self):
        self.sc = self.basemap.scatter(self.lon, self.lat, s=50, facecolor='none', latlon=True,
                                       zorder=10, picker=True, edgecolor='m', label='Not Picked')
        self.cid = self.canvas.mpl_connect('pick_event', self.onpick)
        if self.eventLoc:
            lat, lon = self.eventLoc
            self.sc_event = self.basemap.scatter(lon, lat, s=100, facecolor='red',
                                                 latlon=True, zorder=11, label='Event (might be outside map region)')

    def scatter_picked_stations(self):
        lon = self.lon_no_nan
        lat = self.lat_no_nan

        # workaround because of an issue with latlon transformation of arrays with len < 3
        if len(lon) <= 2 and len(lat) <= 2:
            self.sc_picked = self.basemap.scatter(lon[0], lat[0], s=50, facecolor='white',
                                                  c=self.picks_no_nan[0], latlon=True, zorder=11, label='Picked')
            if len(lon) == 2 and len(lat) == 2:
                self.sc_picked = self.basemap.scatter(lon[1], lat[1], s=50, facecolor='white',
                                                      c=self.picks_no_nan[1], latlon=True, zorder=11)
        else:
            self.sc_picked = self.basemap.scatter(lon, lat, s=50, facecolor='white',
                                                  c=self.picks_no_nan, latlon=True, zorder=11, label='Picked')

    def annotate_ax(self):
        self.annotations = []
        for index, name in enumerate(self.station_names):
            self.annotations.append(self.main_ax.annotate(' %s' % name, xy=(self.x[index], self.y[index]),
                                                          fontsize='x-small', color='white', zorder=12))
        self.legend = self.main_ax.legend(loc=1)

    def add_cbar(self, label):
        cbar = self.main_ax.figure.colorbar(self.sc_picked, fraction=0.025)
        cbar.set_label(label)
        return cbar

    def refresh_drawings(self, picks=None):
        self.picks_dict = picks
        self._refresh_drawings()

    def _refresh_drawings(self):
        self.remove_drawings()
        self.draw_everything()

    def draw_everything(self):
        if self.picks_dict:
            self.init_picks()
            self.init_picks_active()
            self.init_stations_active()
            if len(self.picks_no_nan) >= 3:
                self.init_picksgrid()
                self.draw_contour_filled()
        self.scatter_all_stations()
        if self.picks_dict:
            self.scatter_picked_stations()
            self.cbar = self.add_cbar(label='Time relative to first onset [s]')
            self.comboBox_phase.setEnabled(True)
        else:
            self.comboBox_phase.setEnabled(False)
        self.annotate_ax()
        self.canvas.draw()

    def remove_drawings(self):
        if hasattr(self, 'sc_picked'):
            self.sc_picked.remove()
            del (self.sc_picked)
        if hasattr(self, 'sc_event'):
            self.sc_event.remove()
            del (self.sc_event)
        if hasattr(self, 'cbar'):
            self.cbar.remove()
            del (self.cbar)
        if hasattr(self, 'contourf'):
            self.remove_contourf()
            del (self.contourf)
        if hasattr(self, 'cid'):
            self.canvas.mpl_disconnect(self.cid)
            del (self.cid)
        try:
            self.sc.remove()
        except Exception as e:
            print('Warning: could not remove station scatter plot.\nReason: {}'.format(e))
        try:
            self.legend.remove()
        except Exception as e:
            print('Warning: could not remove legend. Reason: {}'.format(e))
        self.canvas.draw()

    def remove_contourf(self):
        for item in self.contourf.collections:
            item.remove()

    def remove_annotations(self):
        for annotation in self.annotations:
            annotation.remove()

    def zoom(self, event):
        map = self.basemap
        xlim = map.ax.get_xlim()
        ylim = map.ax.get_ylim()
        x, y = event.xdata, event.ydata
        zoom = {'up': 1. / 2.,
                'down': 2.}

        if not event.xdata or not event.ydata:
            return

        if event.button in zoom:
            factor = zoom[event.button]
            xdiff = (xlim[1] - xlim[0]) * factor
            xl = x - 0.5 * xdiff
            xr = x + 0.5 * xdiff
            ydiff = (ylim[1] - ylim[0]) * factor
            yb = y - 0.5 * ydiff
            yt = y + 0.5 * ydiff

            if xl < map.xmin or yb < map.ymin or xr > map.xmax or yt > map.ymax:
                xl, xr = map.xmin, map.xmax
                yb, yt = map.ymin, map.ymax
            map.ax.set_xlim(xl, xr)
            map.ax.set_ylim(yb, yt)
            map.ax.figure.canvas.draw()

    def _warn(self, message):
        self.qmb = QtGui.QMessageBox(QtGui.QMessageBox.Icon.Warning,
                                     'Warning', message)
        self.qmb.show()
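
# Interpolation sketch (illustrative only): the core of init_picksgrid and
# draw_contour_filled reduced to plain numpy/scipy; all coordinates and pick
# times below are made up.
#
# >>> import numpy as np  # doctest: +SKIP
# >>> from scipy.interpolate import griddata  # doctest: +SKIP
# >>> lat, lon = [49., 50., 51.], [7., 8., 9.]  # doctest: +SKIP
# >>> picks = [0., 1.2, 2.5]  # onsets relative to first pick [s]  # doctest: +SKIP
# >>> grid_lat, grid_lon = np.meshgrid(np.linspace(49, 51, 50),
# ...                                  np.linspace(7, 9, 50))  # doctest: +SKIP
# >>> field = griddata((lat, lon), picks, (grid_lat, grid_lon), method='linear')  # doctest: +SKIP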
489
pylot/core/util/pdf.py
Normal file
@@ -0,0 +1,489 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import warnings

import numpy as np
from obspy import UTCDateTime
from pylot.core.util.utils import fit_curve, clims
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()
__author__ = 'sebastianw'


def create_axis(x0, incr, npts):
    ax = np.zeros(npts)
    for i in range(npts):
        ax[i] = x0 + incr * i
    return ax


def gauss_parameter(te, tm, tl, eta):
    '''
    takes three onset times and returns the parameters sig1, sig2, a1 and a2
    to represent the pick as a probability density function (PDF) with two
    Gauss branches
    :param te: earliest possible onset time
    :param tm: most likely onset time
    :param tl: latest possible onset time
    :param eta: decline factor at the interval boundaries
    :return:
    '''

    sig1 = (tm - te) / np.sqrt(2 * np.log(1 / eta))
    sig2 = (tl - tm) / np.sqrt(2 * np.log(1 / eta))

    a1 = 2 / (1 + sig2 / sig1)
    a2 = 2 / (1 + sig1 / sig2)

    return tm, sig1, sig2, a1, a2


def exp_parameter(te, tm, tl, eta):
    '''
    takes three onset times te, tm and tl and returns the parameters sig1,
    sig2 and a to represent the pick as a probability density function (PDF)
    with two exponential decay branches
    :param te: earliest possible onset time
    :param tm: most likely onset time
    :param tl: latest possible onset time
    :param eta: decline factor at the interval boundaries
    :return:
    '''

    sig1 = np.log(eta) / (te - tm)
    sig2 = np.log(eta) / (tm - tl)
    if np.isinf(sig1):
        sig1 = np.log(eta) / (tm - tl)
    if np.isinf(sig2):
        sig2 = np.log(eta) / (te - tm)
    a = 1 / (1 / sig1 + 1 / sig2)
    return tm, sig1, sig2, a

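# Derivation note and quick check (illustrative only): sig1/sig2 above follow
# from requiring the branch value at the interval boundary to have declined to
# eta times the maximum, e.g. exp(-((te - tm) / sig1)**2 / 2) = eta for the
# left Gauss branch, which rearranges to sig1 = (tm - te) / sqrt(2 * ln(1 / eta)).
#
# >>> tm, sig1, sig2, a1, a2 = gauss_parameter(te=0., tm=1., tl=3., eta=0.01)  # doctest: +SKIP
# >>> round(np.exp(-((0. - tm) / sig1) ** 2 / 2), 2)  # doctest: +SKIP
# 0.01
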
def gauss_branches(k, param_tuple):
    '''
    function gauss_branches takes an axis k, a center value mu, two sigma
    values sig1 and sig2 and two scaling factors a1 and a2 and returns a
    list containing the values of a probability density function (PDF)
    consisting of Gauss branches
    :param k:
    :type k:
    :param mu:
    :type mu:
    :param sig1:
    :type sig1:
    :param sig2:
    :type sig2:
    :param a1:
    :type a1:
    :param a2:
    :returns fun_vals: list with function values along axis k
    '''

    # python 3 workaround
    mu, sig1, sig2, a1, a2 = param_tuple

    def _func(k, mu, sig1, sig2, a1, a2):
        if k < mu:
            rval = a1 * 1 / (np.sqrt(2 * np.pi) * sig1) * np.exp(-((k - mu) / sig1) ** 2 / 2)
        else:
            rval = a2 * 1 / (np.sqrt(2 * np.pi) * sig2) * np.exp(-((k - mu) /
                                                                   sig2) ** 2 / 2)
        return rval

    try:
        return [_func(x, mu, sig1, sig2, a1, a2) for x in iter(k)]
    except TypeError:
        return _func(k, mu, sig1, sig2, a1, a2)


def exp_branches(k, param_tuple):
    '''
    function exp_branches takes an axis k, a center value mu, two sigma
    values sig1 and sig2 and a scaling factor a and returns a
    list containing the values of a probability density function (PDF)
    consisting of exponential decay branches
    :param k:
    :param mu:
    :param sig1:
    :param sig2:
    :param a:
    :returns fun_vals: list with function values along axis k
    '''

    # python 3 workaround
    mu, sig1, sig2, a = param_tuple

    def _func(k, mu, sig1, sig2, a):
        mu = float(mu)
        if k < mu:
            rval = a * np.exp(sig1 * (k - mu))
        else:
            rval = a * np.exp(-sig2 * (k - mu))
        return rval

    try:
        return [_func(x, mu, sig1, sig2, a) for x in iter(k)]
    except TypeError:
        return _func(k, mu, sig1, sig2, a)


# define container dictionaries for different types of pdfs
parameter = dict(gauss=gauss_parameter, exp=exp_parameter)
branches = dict(gauss=gauss_branches, exp=exp_branches)


class ProbabilityDensityFunction(object):
    '''
    A probability density function toolkit.
    '''

    version = __version__

    def __init__(self, x0, incr, npts, pdf, mu, params, eta=0.01):
        self.x0 = x0
        self.incr = incr
        self.npts = npts
        self.axis = create_axis(x0, incr, npts)
        self.mu = mu
        self.eta = eta
        self._pdf = pdf
        self.params = params

    def __add__(self, other):

        assert self.eta == other.eta, 'decline factors differ; please use equally defined pdfs for comparison'

        eta = self.eta

        x0, incr, npts = self.commonparameter(other)

        axis = create_axis(x0, incr, npts)
        pdf_self = np.array(self.data(axis))
        pdf_other = np.array(other.data(axis))

        pdf = np.convolve(pdf_self, pdf_other, 'full') * incr

        # shift axis values for correct plotting
        npts = pdf.size
        x0 *= 2
        axis = create_axis(x0, incr, npts)
        mu = axis[np.where(pdf == max(pdf))][0]

        func, params = fit_curve(axis, pdf)

        return ProbabilityDensityFunction(x0, incr, npts, func, mu,
                                          params, eta)

    def __sub__(self, other):

        assert self.eta == other.eta, 'decline factors differ; please use equally defined pdfs for comparison'

        eta = self.eta

        x0, incr, npts = self.commonparameter(other)

        axis = create_axis(x0, incr, npts)
        pdf_self = np.array(self.data(axis))
        pdf_other = np.array(other.data(axis))

        pdf = np.correlate(pdf_self, pdf_other, 'full') * incr

        # shift axis values for correct plotting
        npts = len(pdf)
        midpoint = npts // 2
        x0 = -incr * midpoint
        axis = create_axis(x0, incr, npts)
        mu = axis[np.where(pdf == max(pdf))][0]

        func, params = fit_curve(axis, pdf)

        return ProbabilityDensityFunction(x0, incr, npts, func, mu,
                                          params, eta)

    def __nonzero__(self):
        prec = self.precision(self.incr)
        data = np.array(self.data())
        gtzero = np.all(data >= 0)
        probone = bool(np.round(self.prob_gt_val(self.axis[0]), prec) == 1.)
        return bool(gtzero and probone)

    # Python 3 evaluates truthiness via __bool__; alias it so `if self:`
    # keeps working on both interpreter versions
    __bool__ = __nonzero__

    def __str__(self):
        return str([self.data()])

    @staticmethod
    def precision(incr):
        prec = int(np.ceil(np.abs(np.log10(incr)))) - 2
        return prec if prec >= 0 else 0

    def data(self, value=None):
        if value is None:
            return self._pdf(self.axis, self.params)
        return self._pdf(value, self.params)

    @property
    def eta(self):
        return self._eta

    @eta.setter
    def eta(self, value):
        self._eta = value

    @property
    def mu(self):
        return self._mu

    @mu.setter
    def mu(self, mu):
        self._mu = mu

    @property
    def axis(self):
        return self._x

    @axis.setter
    def axis(self, x):
        self._x = np.array(x)

    @classmethod
    def from_pick(cls, lbound, barycentre, rbound, incr=0.1, decfact=0.01,
                  type='exp'):
        '''
        Initialize a new ProbabilityDensityFunction object.
        Takes incr, lbound, barycentre and rbound to derive x0 and the number
        of points npts for the axis vector.
        Maximum density is given at the barycentre and on the boundaries the
        function has declined to decfact times the maximum value. Integration
        of the function over a particular interval gives the probability for
        the variable value to be in that interval.
        '''

        # derive adequate window of definition
        margin = 2. * np.max([barycentre - lbound, rbound - barycentre])

        # find midpoint accounting also for `~obspy.UTCDateTime` object usage
        try:
            midpoint = (rbound + lbound) / 2
        except TypeError:
            try:
                midpoint = (rbound + float(lbound)) / 2
            except TypeError:
                midpoint = float(rbound + float(lbound)) / 2

        # find x0 on a grid point and sufficient npts
        was_datetime = None
        if isinstance(barycentre, UTCDateTime):
            barycentre = float(barycentre)
            was_datetime = True
        n = int(np.ceil((barycentre - midpoint) / incr))
        m = int(np.ceil((margin / incr)))
        midpoint = barycentre - n * incr
        margin = m * incr
        x0 = midpoint - margin
        npts = 2 * m

        if was_datetime:
            barycentre = UTCDateTime(barycentre)

        # calculate parameter for pdf representing function
        params = parameter[type](lbound, barycentre, rbound, decfact)

        # select pdf type
        pdf = branches[type]

        # return the object
        return ProbabilityDensityFunction(x0, incr, npts, pdf, barycentre,
                                          params, decfact)

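    # Usage sketch (illustrative only): building a pick PDF from an
    # earliest/most likely/latest onset triple and querying it; the numbers
    # are made up.
    #
    # >>> pdf = ProbabilityDensityFunction.from_pick(0.0, 0.5, 1.5,
    # ...                                            incr=0.01, type='exp')  # doctest: +SKIP
    # >>> pdf.expectation()  # doctest: +SKIP
    # 0.5
    # >>> width = pdf.quantile_distance(0.1)  # spread between the 10 % and 90 % quantiles  # doctest: +SKIP
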
    def broadcast(self, pdf, si, ei, data):
        try:
            pdf[si:ei] = data
        except ValueError as e:
            warnings.warn(str(e), Warning)
            return self.broadcast(pdf, si, ei, data[:-1])
        return pdf

    def expectation(self):
        '''
        returns the expectation value of the actual pdf object

        .. math::
            mu_{\Delta t} = \int\limits_{-\infty}^\infty x \cdot f(x)dx

        :return float: rval
        '''

        # rval = 0
        # for x in self.axis:
        #     rval += x * self.data(x)
        rval = self.mu
        # Not sure about this! That might not be the barycentre.
        # However, for std calculation (next function)
        # self.mu is also used!! (LK, 02/2017)
        return rval

    def standard_deviation(self):
        mu = self.mu
        rval = 0
        for x in self.axis:
            rval += (x - float(mu)) ** 2 * self.data(x)
        return rval * self.incr

    def prob_lt_val(self, value):
        if value <= self.axis[0] or value > self.axis[-1]:
            raise ValueError('value out of bounds: {0}'.format(value))
        return self.prob_limits((self.axis[0], value))

    def prob_gt_val(self, value):
        if value < self.axis[0] or value >= self.axis[-1]:
            raise ValueError('value out of bounds: {0}'.format(value))
        return self.prob_limits((value, self.axis[-1]))

    def prob_limits(self, limits, oversampling=1.):
        sampling = self.incr / oversampling
        lim = np.arange(limits[0], limits[1], sampling)
        data = self.data(lim)
        min_est, max_est = 0., 0.
        for n in range(len(data) - 1):
            min_est += min(data[n], data[n + 1])
            max_est += max(data[n], data[n + 1])
        return (min_est + max_est) / 2. * sampling

    def prob_val(self, value):
        if not (self.axis[0] <= value <= self.axis[-1]):
            warnings.warn('{0} not on axis'.format(value), Warning)
            return None
        return self.data(value) * self.incr

    def quantile(self, prob_value, eps=0.01):
        '''
        takes a probability value and returns the corresponding quantile by
        bisection on the axis interval
        :param prob_value: probability for which the quantile is sought
        :param eps: tolerated deviation from prob_value
        :return: quantile value
        '''
        l = self.axis[0]
        r = self.axis[-1]
        m = (r + l) / 2
        diff = prob_value - self.prob_lt_val(m)
        while abs(diff) > eps and ((r - l) > self.incr):
            if diff > 0:
                l = m
            else:
                r = m
            m = (r + l) / 2
            diff = prob_value - self.prob_lt_val(m)
        return m

    def quantile_distance(self, prob_value):
        """
        takes a probability value and returns the distance
        between two complementary quantiles

        .. math::

            QA_\alpha = Q(1 - \alpha) - Q(\alpha)

        :param prob_value: probability value :math:`\alpha`
        :type prob_value: float
        :return: quantile distance
        """
        if 0 >= prob_value or prob_value >= 0.5:
            raise ValueError('Value out of range.')
        ql = self.quantile(prob_value)
        qu = self.quantile(1 - prob_value)
        return qu - ql

    def quantile_dist_frac(self, x):
        """
        takes a probability value and returns the fraction of two
        corresponding quantile distances (
        :func:`pylot.core.util.pdf.ProbabilityDensityFunction
        #quantile_distance`)

        .. math::

            Q\Theta_\alpha = \frac{QA(0.5 - \alpha)}{QA(\alpha)}

        :param x: probability value :math:`\alpha`
        :return: quantile distance fraction
        """
        if x <= 0 or x >= 0.25:
            raise ValueError('Value out of range.')
        return self.quantile_distance(0.5 - x) / self.quantile_distance(x)

    def plot(self, label=None):
        import matplotlib.pyplot as plt

        plt.plot(self.axis, self.data())
        plt.xlabel('x')
        plt.ylabel('f(x)')
        plt.autoscale(axis='x', tight=True)
        if self:
            title_str = 'Probability density function '
            if label:
                title_str += label
            title_str = title_str.strip()
        else:
            title_str = 'Function not suitable as probability density function'
        plt.title(title_str)
        plt.show()

    def limits(self):
        l1 = self.x0
        r1 = l1 + self.incr * self.npts

        return l1, r1

    def cincr(self, other):
        if not self.incr == other.incr:
            raise NotImplementedError(
                'Upsampling of the lower sampled PDF not implemented yet!')
        else:
            return self.incr

    def commonlimits(self, incr, other, max_npts=1e5):
        '''
        Takes an increment incr and a second PDF and returns the left-most
        limit and the minimum number of points needed to cover the whole
        interval spanned by both PDFs.
        :param incr: axis increment
        :param other: the other `ProbabilityDensityFunction` object
        :param max_npts: maximum tolerated number of axis points
        :return:
        '''

        x0, r = clims(self.limits(), other.limits())

        # calculate index for rounding
        ri = self.precision(incr)

        npts = int(round(r - x0, ri) // incr)

        if npts > max_npts:
            raise ValueError('Maximum number of points exceeded:\n'
                             'max_npts - %d\n'
                             'npts - %d\n' % (max_npts, npts))

        npts = np.max([npts, self.npts, other.npts])

        if npts < self.npts or npts < other.npts:
            raise ValueError('new npts is too small')

        return x0, npts

    def commonparameter(self, other):
        assert isinstance(other, ProbabilityDensityFunction), \
            'both operands must be of type ProbabilityDensityFunction'

        incr = self.cincr(other)

        x0, npts = self.commonlimits(incr, other)

        return x0, incr, npts
58
pylot/core/util/plotting.py
Normal file
@@ -0,0 +1,58 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import matplotlib.pyplot as plt


def create_bin_list(l_boundary, u_boundary, nbins=100):
    """
    takes two boundaries and a number of bins and creates a list of bins for
    histogram plotting
    :param l_boundary: Any number.
    :type l_boundary: float
    :param u_boundary: Any number that is greater than l_boundary.
    :type u_boundary: float
    :param nbins: Any positive integer.
    :type nbins: int
    :return: A list of equidistant bins.
    """
    if u_boundary <= l_boundary:
        raise ValueError('Upper boundary must be greater than lower!')
    elif nbins <= 0:
        raise ValueError('Number of bins is not valid.')
    binlist = []
    for i in range(nbins):
        binlist.append(l_boundary + i * (u_boundary - l_boundary) / nbins)
    return binlist


def histplot(array, binlist, xlab='Values',
             ylab='Frequency', title=None, fnout=None):
    """
    function to quickly show some distribution of data. Takes array like data,
    and a list of bins. Editing detail and inserting a legend is not possible.
    :param array: List of values.
    :type array: Array like
    :param binlist: List of bins.
    :type binlist: list
    :param xlab: A label for the x-axis.
    :type xlab: str
    :param ylab: A label for the y-axis.
    :type ylab: str
    :param title: A title for the plot.
    :type title: str
    :param fnout: A path to save the plot instead of showing.
        Has to contain filename and type. Like: 'path/to/file.png'
    :type fnout: str
    :return: -
    """

    plt.hist(array, bins=binlist)
    plt.xlabel(xlab)
    plt.ylabel(ylab)
    if title:
        plt.title(title)
    if fnout:
        plt.savefig(fnout)
    else:
        plt.show()
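
# Usage sketch (illustrative only): histogram of pick residuals; the values
# are made up.
#
# >>> residuals = [0.02, -0.1, 0.05, 0.3, -0.2, 0.0]  # doctest: +SKIP
# >>> bins = create_bin_list(-0.5, 0.5, nbins=20)  # doctest: +SKIP
# >>> histplot(residuals, bins, xlab='Residual [s]', title='Pick residuals')  # doctest: +SKIP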
12
pylot/core/util/structure.py
Normal file
@@ -0,0 +1,12 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Created on Wed Jan 26 17:47:25 2015

@author: sebastianw
"""

from pylot.core.io.data import SeiscompDataStructure, PilotDataStructure

DATASTRUCTURE = {'PILOT': PilotDataStructure, 'SeisComP': SeiscompDataStructure,
                 None: None}
187
pylot/core/util/thread.py
Normal file
@@ -0,0 +1,187 @@
# -*- coding: utf-8 -*-
import sys, os, traceback
import multiprocessing
from PySide.QtCore import QThread, Signal, Qt, Slot, QRunnable, QObject
from PySide.QtGui import QDialog, QProgressBar, QLabel, QHBoxLayout, QPushButton


class Thread(QThread):
    message = Signal(str)

    def __init__(self, parent, func, arg=None, progressText=None,
                 pb_widget=None, redirect_stdout=False, abortButton=False):
        QThread.__init__(self, parent)
        self.func = func
        self.arg = arg
        self.progressText = progressText
        self.pb_widget = pb_widget
        self.redirect_stdout = redirect_stdout
        self.abortButton = abortButton
        self.finished.connect(self.hideProgressbar)
        self.showProgressbar()

    def run(self):
        if self.redirect_stdout:
            sys.stdout = self
        try:
            if self.arg:
                self.data = self.func(self.arg)
            else:
                self.data = self.func()
            self._executed = True
        except Exception as e:
            self._executed = False
            self._executedError = e
            traceback.print_exc()
            exctype, value = sys.exc_info()[:2]
            self._executedErrorInfo = '{} {} {}'.\
                format(exctype, value, traceback.format_exc())
        sys.stdout = sys.__stdout__

    def showProgressbar(self):
        if self.progressText:

            # generate widget if not given in init
            if not self.pb_widget:
                self.pb_widget = QDialog(self.parent())
                self.pb_widget.setWindowFlags(Qt.SplashScreen)
                self.pb_widget.setModal(True)

            # add button
            delete_button = QPushButton('X')
            delete_button.clicked.connect(self.exit)
            hl = QHBoxLayout()
            pb = QProgressBar()
            pb.setRange(0, 0)
            hl.addWidget(pb)
            hl.addWidget(QLabel(self.progressText))
            if self.abortButton:
                hl.addWidget(delete_button)
            self.pb_widget.setLayout(hl)
            self.pb_widget.show()

    def hideProgressbar(self):
        if self.pb_widget:
            self.pb_widget.hide()

    def write(self, text):
        self.message.emit(text)

    def flush(self):
        pass


class Worker(QRunnable):
    '''
    Worker class to be run by MultiThread(QThread).
    '''

    def __init__(self, fun, args,
                 progressText=None,
                 pb_widget=None,
                 redirect_stdout=False):
        super(Worker, self).__init__()
        self.fun = fun
        self.args = args
        # self.kwargs = kwargs
        self.signals = WorkerSignals()
        self.progressText = progressText
        self.pb_widget = pb_widget
        self.redirect_stdout = redirect_stdout

    @Slot()
    def run(self):
        if self.redirect_stdout:
            sys.stdout = self

        try:
            result = self.fun(self.args)
        except Exception:
            exctype, value = sys.exc_info()[:2]
            print(exctype, value, traceback.format_exc())
            self.signals.error.emit((exctype, value, traceback.format_exc()))
        else:
            self.signals.result.emit(result)
        finally:
            self.signals.finished.emit('Done')
            sys.stdout = sys.__stdout__

    def write(self, text):
        self.signals.message.emit(text)

    def flush(self):
        pass


class WorkerSignals(QObject):
    '''
    Class to provide signals for Worker Class
    '''
    finished = Signal(str)
    message = Signal(str)
    error = Signal(tuple)
    result = Signal(object)

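# Usage sketch (illustrative only): dispatching a Worker from GUI code;
# `long_running` and `args` are hypothetical.
#
# >>> from PySide.QtCore import QThreadPool  # doctest: +SKIP
# >>> worker = Worker(long_running, args)  # doctest: +SKIP
# >>> worker.signals.result.connect(print)  # doctest: +SKIP
# >>> QThreadPool.globalInstance().start(worker)  # doctest: +SKIP
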
class MultiThread(QThread):
    finished = Signal(str)
    message = Signal(str)

    def __init__(self, parent, func, args, ncores=1,
                 progressText=None, pb_widget=None, redirect_stdout=False):
        QThread.__init__(self, parent)
        self.func = func
        self.args = args
        self.ncores = ncores
        self.progressText = progressText
        self.pb_widget = pb_widget
        self.redirect_stdout = redirect_stdout
        self.finished.connect(self.hideProgressbar)
        self.showProgressbar()

    def run(self):
        if self.redirect_stdout:
            sys.stdout = self
        try:
            if not self.ncores:
                self.ncores = multiprocessing.cpu_count()
            pool = multiprocessing.Pool(self.ncores)
            self.data = pool.map_async(self.func, self.args, callback=self.emitDone)
            # self.data = pool.apply_async(self.func, self.shotlist, callback=self.emitDone)  # emit each time returned
            pool.close()
            self._executed = True
        except Exception as e:
            self._executed = False
            self._executedError = e
            exc_type, exc_obj, exc_tb = sys.exc_info()
            fname = os.path.split(exc_tb.tb_frame.f_code.co_filename)[1]
            print('Exception: {}, file: {}, line: {}'.format(exc_type, fname, exc_tb.tb_lineno))
        sys.stdout = sys.__stdout__

    def showProgressbar(self):
        if self.progressText:
            if not self.pb_widget:
                self.pb_widget = QDialog(self.parent())
                self.pb_widget.setWindowFlags(Qt.SplashScreen)
                self.pb_widget.setModal(True)
            hl = QHBoxLayout()
            pb = QProgressBar()
            pb.setRange(0, 0)
            hl.addWidget(pb)
            hl.addWidget(QLabel(self.progressText))
            self.pb_widget.setLayout(hl)
            self.pb_widget.show()

    def hideProgressbar(self):
        if self.pb_widget:
            self.pb_widget.hide()

    def write(self, text):
        self.message.emit(text)

    def flush(self):
        pass

    def emitDone(self, result):
        print('emitDone!')
        self.finished.emit('Done thread!')
        self.hideProgressbar()
939
pylot/core/util/utils.py
Normal file
@@ -0,0 +1,939 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import hashlib
import os
import platform
import re
import subprocess

import numpy as np
from obspy import UTCDateTime, read
from obspy.core import AttribDict
from obspy.signal.rotate import rotate2zne
from obspy.io.xseed.utils import SEEDParserException

from pylot.core.io.inputs import PylotParameter
from pylot.styles import style_settings

from scipy.interpolate import splrep, splev
from PySide import QtCore, QtGui

try:
    import pyqtgraph as pg
except Exception as e:
    print('PyLoT: Could not import pyqtgraph. {}'.format(e))
    pg = None


def _pickle_method(m):
    if m.im_self is None:
        return getattr, (m.im_class, m.im_func.func_name)
    else:
        return getattr, (m.im_self, m.im_func.func_name)


def readDefaultFilterInformation(fname):
    pparam = PylotParameter(fname)
    return readFilterInformation(pparam)


def readFilterInformation(pylot_parameter):
    p_filter = {'filtertype': pylot_parameter['filter_type'][0],
                'freq': [pylot_parameter['minfreq'][0], pylot_parameter['maxfreq'][0]],
                'order': int(pylot_parameter['filter_order'][0])}
    s_filter = {'filtertype': pylot_parameter['filter_type'][1],
                'freq': [pylot_parameter['minfreq'][1], pylot_parameter['maxfreq'][1]],
                'order': int(pylot_parameter['filter_order'][1])}
    filter_information = {'P': p_filter,
                          'S': s_filter}
    return filter_information


def fit_curve(x, y):
    return splev, splrep(x, y)


def getindexbounds(f, eta):
    mi = f.argmax()
    m = max(f)
    b = m * eta
    l = find_nearest(f[:mi], b)
    u = find_nearest(f[mi:], b) + mi
    return mi, l, u


def gen_Pool(ncores=0):
    '''
    :param ncores: number of CPU cores for multiprocessing.Pool, if ncores == 0 use all available
    :return: multiprocessing.Pool object
    '''
    import multiprocessing

    if ncores == 0:
        ncores = multiprocessing.cpu_count()

    print('gen_Pool: Generated multiprocessing Pool with {} cores\n'.format(ncores))

    pool = multiprocessing.Pool(ncores)
    return pool

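# Usage sketch (illustrative only): gen_Pool as used by restitute_data;
# `some_function` and `work_items` are hypothetical.
#
# >>> pool = gen_Pool(ncores=4)  # doctest: +SKIP
# >>> result = pool.map(some_function, work_items)  # doctest: +SKIP
# >>> pool.close()  # doctest: +SKIP
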
def excludeQualityClasses(picks, qClasses, timeerrorsP, timeerrorsS):
|
||||
'''
|
||||
takes PyLoT picks dictionary and returns a new dictionary with certain classes excluded.
|
||||
:param picks: PyLoT picks dictionary
|
||||
:param qClasses: list (or int) of quality classes (0-4) to exclude
|
||||
:param timeerrorsP: time errors for classes (0-4) for P
|
||||
:param timeerrorsS: time errors for classes (0-4) for S
|
||||
:return: new picks dictionary
|
||||
'''
|
||||
from pylot.core.pick.utils import getQualityFromUncertainty
|
||||
|
||||
if type(qClasses) in [int, float]:
|
||||
qClasses = [qClasses]
|
||||
|
||||
picksdict_new = {}
|
||||
|
||||
phaseError = {'P': timeerrorsP,
|
||||
'S': timeerrorsS}
|
||||
|
||||
for station, phases in picks.items():
|
||||
for phase, pick in phases.items():
|
||||
if not type(pick) in [AttribDict, dict]:
|
||||
continue
|
||||
pickerror = phaseError[identifyPhaseID(phase)]
|
||||
quality = getQualityFromUncertainty(pick['spe'], pickerror)
|
||||
if not quality in qClasses:
|
||||
if not station in picksdict_new:
|
||||
picksdict_new[station] = {}
|
||||
picksdict_new[station][phase] = pick
|
||||
|
||||
return picksdict_new
|
||||
|
||||
|
||||
def clims(lim1, lim2):
|
||||
"""
|
||||
takes two pairs of limits and returns one pair of common limts
|
||||
:param lim1:
|
||||
:param lim2:
|
||||
:return:
|
||||
|
||||
>>> clims([0, 4], [1, 3])
|
||||
[0, 4]
|
||||
>>> clims([1, 4], [0, 3])
|
||||
[0, 4]
|
||||
>>> clims([1, 3], [0, 4])
|
||||
[0, 4]
|
||||
>>> clims([0, 3], [1, 4])
|
||||
[0, 4]
|
||||
>>> clims([0, 3], [0, 4])
|
||||
[0, 4]
|
||||
>>> clims([1, 4], [0, 4])
|
||||
[0, 4]
|
||||
>>> clims([0, 4], [0, 4])
|
||||
[0, 4]
|
||||
>>> clims([0, 4], [1, 4])
|
||||
[0, 4]
|
||||
>>> clims([0, 4], [0, 3])
|
||||
[0, 4]
|
||||
"""
|
||||
lim = [None, None]
|
||||
if lim1[0] < lim2[0]:
|
||||
lim[0] = lim1[0]
|
||||
else:
|
||||
lim[0] = lim2[0]
|
||||
if lim1[1] > lim2[1]:
|
||||
lim[1] = lim1[1]
|
||||
else:
|
||||
lim[1] = lim2[1]
|
||||
return lim
|
||||
|
||||
|
||||
def demeanTrace(trace, window):
|
||||
"""
|
||||
takes a trace object and returns the same trace object but with data
|
||||
demeaned within a certain time window
|
||||
:param trace: waveform trace object
|
||||
:type trace: `~obspy.core.stream.Trace`
|
||||
:param window:
|
||||
:type window: tuple
|
||||
:return: trace
|
||||
:rtype: `~obspy.core.stream.Trace`
|
||||
"""
|
||||
trace.data -= trace.data[window].mean()
|
||||
return trace
|
||||
|
||||
|
||||
def findComboBoxIndex(combo_box, val):
|
||||
"""
|
||||
Function findComboBoxIndex takes a QComboBox object and a string and
|
||||
returns either 0 or the index throughout all QComboBox items.
|
||||
:param combo_box: Combo box object.
|
||||
:type combo_box: `~QComboBox`
|
||||
:param val: Name of a combo box to search for.
|
||||
:type val: basestring
|
||||
:return: index value of item with name val or 0
|
||||
"""
|
||||
return combo_box.findText(val) if combo_box.findText(val) is not -1 else 0
|
||||
|
||||
|
||||
def find_in_list(lst, pattern):
    """
    takes a list of strings and a string and returns the first list item
    matching the string pattern (parameters renamed from `list`/`str` to
    avoid shadowing the built-ins)
    :param lst: list to search in
    :param pattern: pattern to search for
    :return: first list item containing pattern

    .. example::

    >>> l = ['/dir/e1234.123.12', '/dir/e2345.123.12', 'abc123', 'def456']
    >>> find_in_list(l, 'dir')
    '/dir/e1234.123.12'
    >>> find_in_list(l, 'e1234')
    '/dir/e1234.123.12'
    >>> find_in_list(l, 'e2')
    '/dir/e2345.123.12'
    >>> find_in_list(l, 'ABC')
    'abc123'
    >>> find_in_list(l, 'f456')
    'def456'
    >>> find_in_list(l, 'gurke')

    """
    rlist = [s for s in lst if pattern.lower() in s.lower()]
    if rlist:
        return rlist[0]
    return None

def find_nearest(array, value):
    '''
    function find_nearest takes an array and a value and returns the
    index of the nearest value found in the array
    :param array: array containing values
    :type array: `~numpy.ndarray`
    :param value: number searched for
    :return: index of the array item being nearest to the value

    >>> a = np.array([ 1.80339578, -0.72546654,  0.95769195, -0.98320759, 0.85922623])
    >>> find_nearest(a, 1.3)
    2
    >>> find_nearest(a, 0)
    1
    >>> find_nearest(a, 2)
    0
    >>> find_nearest(a, -1)
    3
    >>> a = np.array([ 1.1, -0.7, 0.9, -0.9, 0.8])
    >>> find_nearest(a, 0.849)
    4
    '''
    return (np.abs(array - value)).argmin()

def fnConstructor(s):
    '''
    takes a string and returns a valid filename (especially on windows machines)
    :param s: desired filename
    :type s: str
    :return: valid filename
    '''
    if type(s) is str:
        s = s.split(':')[-1]
    else:
        s = getHash(UTCDateTime())

    badchars = re.compile(r'[^A-Za-z0-9_. ]+|^\.|\.$|^ | $|^$')
    badsuffix = re.compile(r'(aux|com[1-9]|con|lpt[1-9]|prn)(\.|$)')

    fn = badchars.sub('_', s)

    if badsuffix.match(fn):
        fn = '_' + fn
    return fn

def real_None(value):
    '''
    converts the string 'None' to the NoneType, leaves any other value untouched
    '''
    if value == 'None':
        return None
    return value


def real_Bool(value):
    '''
    converts the strings 'True' and 'False' to their boolean representation, leaves any other value untouched
    '''
    if value == 'True':
        return True
    elif value == 'False':
        return False
    return value

def four_digits(year):
    """
    takes a two digit year integer and returns the corresponding four digit
    year from the last 100 years
    :param year: two digit year
    :type year: int
    :return: corresponding four digit year

    >>> four_digits(20)
    1920
    >>> four_digits(16)
    2016
    >>> four_digits(00)
    2000
    """
    if year + 2000 <= UTCDateTime.utcnow().year:
        year += 2000
    else:
        year += 1900
    return year

def common_range(stream):
    '''
    takes a stream object and returns the earliest end and the latest start
    time of all contained trace objects
    :param stream: seismological data stream
    :type stream: `~obspy.core.stream.Stream`
    :return: maximum start time and minimum end time
    '''
    max_start = None
    min_end = None
    for trace in stream:
        if max_start is None or trace.stats.starttime > max_start:
            max_start = trace.stats.starttime
        if min_end is None or trace.stats.endtime < min_end:
            min_end = trace.stats.endtime
    return max_start, min_end

def full_range(stream):
    '''
    takes a stream object and returns the latest end and the earliest start
    time of all contained trace objects
    :param stream: seismological data stream
    :type stream: `~obspy.core.stream.Stream`
    :return: minimum start time and maximum end time
    '''
    # initialise both with None (symmetric to common_range) instead of the
    # current time, which would fail for traces starting in the future
    min_start = None
    max_end = None
    for trace in stream:
        if min_start is None or trace.stats.starttime < min_start:
            min_start = trace.stats.starttime
        if max_end is None or trace.stats.endtime > max_end:
            max_end = trace.stats.endtime
    return min_start, max_end

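# Usage sketch (illustrative): for the ObsPy example stream all traces share
# the same time span, so both ranges coincide.
#
#     from obspy import read
#     st = read()
#     tmin, tmax = common_range(st)   # latest start, earliest end
#     t0, t1 = full_range(st)         # earliest start, latest end
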
def getHash(time):
    '''
    takes a time object and returns the corresponding SHA1 hash of the
    formatted date string
    :param time: time object for which a hash should be calculated
    :type time: `~obspy.core.utcdatetime.UTCDateTime`
    :return: str
    '''
    hg = hashlib.sha1()
    # hashlib requires bytes in Python 3, hence the explicit encode
    hg.update(time.strftime('%Y-%m-%d %H:%M:%S.%f').encode('utf-8'))
    return hg.hexdigest()

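# Usage sketch (illustrative): derive a reproducible identifier from an
# event's origin time.
#
#     event_hash = getHash(UTCDateTime(2016, 1, 1, 12, 0, 0))
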
def getLogin():
    '''
    returns the current user's login ID
    :return: login ID
    '''
    import getpass
    # getpass.getuser() is more portable than os.getlogin(), which requires a
    # controlling terminal and fails e.g. in GUI or cron sessions
    return getpass.getuser()

def getOwner(fn):
    '''
    takes a filename and returns the login ID of the actual owner of the file
    :param fn: filename of the file tested
    :type fn: str
    :return: login ID of the file's owner
    '''
    system_name = platform.system()
    if system_name in ["Linux", "Darwin"]:
        import pwd
        return pwd.getpwuid(os.stat(fn).st_uid).pw_name
    elif system_name == "Windows":
        import win32security
        f = win32security.GetFileSecurity(fn, win32security.OWNER_SECURITY_INFORMATION)
        (username, domain, sid_name_use) = win32security.LookupAccountSid(None, f.GetSecurityDescriptorOwner())
        return username

def getPatternLine(fn, pattern):
    """
    takes a file name and a pattern string to search for in the file and
    returns the first line which contains the pattern string otherwise 'None'
    :param fn: file name
    :type fn: str
    :param pattern: pattern string to search for
    :type pattern: str
    :return: the complete line containing the pattern string or None

    >>> getPatternLine('utils.py', 'python')
    '#!/usr/bin/env python\\n'
    >>> print(getPatternLine('version.py', 'palindrome'))
    None
    """
    # a context manager guarantees the file handle is closed even when the
    # pattern is not found (the original leaked the handle in that case)
    with open(fn, 'r') as fobj:
        for line in fobj:
            if pattern in line:
                return line

    return None

def is_executable(fn):
    """
    takes a filename and returns True if the file is executable on the system
    and False otherwise
    :param fn: path to the file to be tested
    :return: True or False
    """
    return os.path.isfile(fn) and os.access(fn, os.X_OK)

def isSorted(iterable):
    '''
    takes an iterable and returns 'True' if the items are in order otherwise
    'False'
    :param iterable: an iterable object
    :type iterable: iterable
    :return: Boolean

    >>> isSorted(1)
    Traceback (most recent call last):
    ...
    AssertionError: object is not iterable; object: 1
    >>> isSorted([1,2,3,4])
    True
    >>> isSorted('abcd')
    True
    >>> isSorted('bcad')
    False
    >>> isSorted([2,3,1,4])
    False
    '''
    assert isIterable(iterable), 'object is not iterable; object: {0}'.format(iterable)
    if type(iterable) is str:
        iterable = list(iterable)
    return sorted(iterable) == iterable

def isIterable(obj):
    """
    takes a python object and returns 'True' if the object is iterable and
    'False' otherwise
    :param obj: a python object
    :return: True or False
    """
    try:
        iter(obj)
    except TypeError:
        return False
    return True

def key_for_set_value(d):
    """
    takes a dictionary and returns the first key whose value evaluates to
    True
    :param d: dictionary containing values
    :type d: dict
    :return: key to the first non-False value found; None if no value's
    boolean equals True
    """
    for k, v in d.items():
        if v:
            return k
    return None

def prepTimeAxis(stime, trace, verbosity=0):
    '''
    takes a starttime and a trace object and returns a valid time axis for
    plotting
    :param stime: start time of the actual seismogram as UTCDateTime
    :param trace: seismic trace object
    :param verbosity: if set, report adjustments made to the time axis
    :return: valid numpy array with time stamps for plotting
    '''
    nsamp = trace.stats.npts
    srate = trace.stats.sampling_rate
    tincr = trace.stats.delta
    etime = stime + nsamp / srate
    time_ax = np.arange(stime, etime, tincr)
    if len(time_ax) < nsamp:
        if verbosity:
            print('elongate time axes by one datum')
        time_ax = np.arange(stime, etime + tincr, tincr)
    elif len(time_ax) > nsamp:
        if verbosity:
            print('shorten time axes by one datum')
        time_ax = np.arange(stime, etime - tincr, tincr)
    if len(time_ax) != nsamp:
        print('Station {0}, {1} samples of data \n '
              '{2} length of time vector \n'
              'delta: {3}'.format(trace.stats.station,
                                  nsamp, len(time_ax), tincr))
        time_ax = None
    return time_ax

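# Usage sketch (illustrative): build a time axis relative to the trace start
# and use it for plotting.
#
#     from obspy import read
#     tr = read()[0]
#     t = prepTimeAxis(0., tr)   # seconds relative to the first sample
#     # plt.plot(t, tr.data)
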
def find_horizontals(data):
    """
    takes an `obspy.core.stream.Stream` object and returns a list containing
    the component labels of the horizontal components available
    :param data: waveform data
    :type data: `obspy.core.stream.Stream`
    :return: components list
    :rtype: list

    .. example::

    >>> st = read()
    >>> find_horizontals(st)
    [u'N', u'E']
    """
    rval = []
    for tr in data:
        if tr.stats.channel[-1].upper() in ['Z', '3']:
            continue
        rval.append(tr.stats.channel[-1].upper())
    return rval

def make_pen(picktype, phase, key, quality):
    """
    creates a pyqtgraph pen for drawing a pick line; returns None if
    pyqtgraph is not available
    """
    if pg:
        rgba = pick_color(picktype, phase, quality)
        linestyle, width = pick_linestyle_pg(picktype, key)
        pen = pg.mkPen(rgba, width=width, style=linestyle)
        return pen

def pick_color(picktype, phase, quality=0):
    """
    returns an RGBA tuple for the given pick type ('manual'/'auto') and
    phase, shading the base color according to the pick quality
    """
    min_quality = 3
    bpc = base_phase_colors(picktype, phase)
    rgba = bpc['rgba']
    modifier = bpc['modifier']
    intensity = 255. * quality / min_quality
    rgba = modify_rgba(rgba, modifier, intensity)
    return rgba

def pick_color_plt(picktype, phase, quality=0):
    rgba = list(pick_color(picktype, phase, quality))
    for index, val in enumerate(rgba):
        rgba[index] /= 255.
    return rgba


def pick_linestyle_plt(picktype, key):
    linestyles_manu = {'mpp': ('solid', 2.),
                       'epp': ('dashed', 1.),
                       'lpp': ('dashed', 1.),
                       'spe': ('dashed', 1.)}
    linestyles_auto = {'mpp': ('dotted', 2.),
                       'epp': ('dashdot', 1.),
                       'lpp': ('dashdot', 1.),
                       'spe': ('dashdot', 1.)}
    linestyles = {'manual': linestyles_manu,
                  'auto': linestyles_auto}
    return linestyles[picktype][key]


def pick_linestyle_pg(picktype, key):
    linestyles_manu = {'mpp': (QtCore.Qt.SolidLine, 2.),
                       'epp': (QtCore.Qt.DashLine, 1.),
                       'lpp': (QtCore.Qt.DashLine, 1.),
                       'spe': (QtCore.Qt.DashLine, 1.)}
    linestyles_auto = {'mpp': (QtCore.Qt.DotLine, 2.),
                       'epp': (QtCore.Qt.DashDotLine, 1.),
                       'lpp': (QtCore.Qt.DashDotLine, 1.),
                       'spe': (QtCore.Qt.DashDotLine, 1.)}
    linestyles = {'manual': linestyles_manu,
                  'auto': linestyles_auto}
    return linestyles[picktype][key]

def modify_rgba(rgba, modifier, intensity):
    """
    adds an intensity to the red, green or blue channel of an RGBA tuple,
    clipping the result to the interval [0, 255]
    """
    rgba = list(rgba)
    index = {'r': 0,
             'g': 1,
             'b': 2}
    val = rgba[index[modifier]] + intensity
    if val > 255.:
        val = 255.
    elif val < 0.:
        val = 0.
    rgba[index[modifier]] = val
    return tuple(rgba)

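# Usage sketch (illustrative): raise the red channel of a base color by a
# quality-dependent intensity; values are clipped at 255.
#
#     modify_rgba((100, 50, 50, 255), 'r', 170.)   # -> (255.0, 50, 50, 255)
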
def base_phase_colors(picktype, phase):
    """
    returns the base color settings (rgba value and modifier channel) for the
    given pick type and phase
    """
    phasecolors = style_settings.phasecolors
    return phasecolors[picktype][phase]


def transform_colors_mpl_str(colors, no_alpha=False):
    """
    converts a 0-255 (R, G, B, A) tuple to a matplotlib color string
    """
    colors = list(colors)
    colors_mpl = tuple([color / 255. for color in colors])
    if no_alpha:
        colors_mpl = '({}, {}, {})'.format(*colors_mpl)
    else:
        colors_mpl = '({}, {}, {}, {})'.format(*colors_mpl)
    return colors_mpl


def transform_colors_mpl(colors):
    """
    converts a 0-255 (R, G, B, A) tuple to the 0-1 range used by matplotlib
    """
    colors = list(colors)
    colors_mpl = tuple([color / 255. for color in colors])
    return colors_mpl

def remove_underscores(data):
    """
    takes an `obspy.core.stream.Stream` object and removes leading and
    trailing underscores from station names
    :param data: stream of seismic data
    :type data: `obspy.core.stream.Stream`
    :return: data stream
    """
    for tr in data:
        # strip() only removes leading and trailing underscores
        tr.stats.station = tr.stats.station.strip('_')
    return data

def trim_station_components(data, trim_start=True, trim_end=True):
    '''
    cut a stream so only the part common to all three traces is kept to avoid dealing with offsets
    :param data: stream of seismic data
    :type data: `obspy.core.stream.Stream`
    :param trim_start: trim start of stream
    :type trim_start: bool
    :param trim_end: trim end of stream
    :type trim_end: bool
    :return: data stream
    '''
    starttime = {False: None}
    endtime = {False: None}

    stations = get_stations(data)

    print('trim_station_components: Will trim stream for trim_start: {} and for '
          'trim_end: {}.'.format(trim_start, trim_end))
    for station in stations:
        wf_station = data.select(station=station)
        starttime[True] = max([trace.stats.starttime for trace in wf_station])
        endtime[True] = min([trace.stats.endtime for trace in wf_station])
        wf_station.trim(starttime=starttime[trim_start], endtime=endtime[trim_end])

    return data

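# Usage sketch (illustrative): trim all components of every station to their
# common time window, e.g. before rotating misaligned components.
#
#     data = trim_station_components(data, trim_start=True, trim_end=True)
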
def check4gaps(data):
    '''
    check for gaps in Stream and remove them
    :param data: stream of seismic data
    :return: data stream
    '''
    stations = get_stations(data)

    for station in stations:
        wf_station = data.select(station=station)
        if wf_station.get_gaps():
            for trace in wf_station:
                data.remove(trace)
            print('check4gaps: Found gaps and removed station {} from waveform data.'.format(station))

    return data


def check4doubled(data):
    '''
    check for doubled stations for same channel in Stream and take only the first one
    :param data: stream of seismic data
    :return: data stream
    '''
    stations = get_stations(data)

    for station in stations:
        wf_station = data.select(station=station)
        # create list of all possible channels
        channels = []
        for trace in wf_station:
            channel = trace.stats.channel
            if channel not in channels:
                channels.append(channel)
            else:
                print('check4doubled: removed the following trace for station {}, as there is'
                      ' already a trace with the same channel given:\n{}'.format(
                    station, trace
                ))
                data.remove(trace)
    return data

def get_stations(data):
    """
    returns a list of all station names contained in a waveform data stream
    """
    stations = []
    for tr in data:
        station = tr.stats.station
        if station not in stations:
            stations.append(station)

    return stations

def check4rotated(data, metadata=None, verbosity=1):
    '''
    checks all stations in a stream for misaligned channels (numeric
    orientation codes) and rotates them to ZNE using the inventory metadata
    '''

    def rotate_components(wfstream, metadata=None):
        """rotates components if orientation code is numeric.
        azimuth and dip are fetched from metadata"""
        try:
            # indexing fails if metadata is None
            metadata[0]
        except (TypeError, IndexError):
            if verbosity:
                msg = 'Warning: could not rotate traces since no metadata was given\nset Inventory file!'
                print(msg)
            return wfstream
        if metadata[0] is None:
            # sometimes metadata is (None, (None,))
            if verbosity:
                msg = 'Warning: could not rotate traces since no metadata was given\nCheck inventory directory!'
                print(msg)
            return wfstream
        else:
            parser = metadata[1]

        def get_dip_azimut(parser, trace_id):
            """gets azimuth and dip for a trace out of the metadata parser"""
            dip = None
            azimut = None
            try:
                blockettes = parser._select(trace_id)
            except SEEDParserException as e:
                print(e)
                raise ValueError
            for blockette_ in blockettes:
                if blockette_.id != 52:
                    continue
                dip = blockette_.dip
                azimut = blockette_.azimuth
                break
            if dip is None or azimut is None:
                error_msg = 'Dip and azimuth not available for trace_id {}'.format(trace_id)
                raise ValueError(error_msg)
            return dip, azimut

        trace_ids = [trace.id for trace in wfstream]
        for trace_id in trace_ids:
            orientation = trace_id[-1]
            if orientation.isdigit():  # isdigit (unlike isnumeric) works for both Python 2 and 3 strings
                # misaligned channels have a number as orientation
                azimuts = []
                dips = []
                for trace_id in trace_ids:
                    try:
                        dip, azimut = get_dip_azimut(parser, trace_id)
                    except ValueError as e:
                        print(e)
                        print('Failed to rotate station {}, no azimuth or dip available in metadata'.format(trace_id))
                        return wfstream
                    azimuts.append(azimut)
                    dips.append(dip)
                # to rotate, all traces must have the same length
                wfstream = trim_station_components(wfstream, trim_start=True, trim_end=True)
                z, n, e = rotate2zne(wfstream[0], azimuts[0], dips[0],
                                     wfstream[1], azimuts[1], dips[1],
                                     wfstream[2], azimuts[2], dips[2])
                print('check4rotated: rotated station {} to ZNE'.format(trace_id))
                z_index = dips.index(min(dips))  # get z-trace index (dip is measured from 0 to -90)
                wfstream[z_index].data = z
                wfstream[z_index].stats.channel = wfstream[z_index].stats.channel[0:-1] + 'Z'
                del trace_ids[z_index]
                for trace_id in trace_ids:
                    dip, az = get_dip_azimut(parser, trace_id)
                    trace = wfstream.select(id=trace_id)[0]
                    # the N sector spans (315, 360], [0, 45] and (135, 225];
                    # note 'az > 315 and az <= 45' can never be True
                    if az > 315 or az <= 45 or 135 < az <= 225:
                        trace.data = n
                        trace.stats.channel = trace.stats.channel[0:-1] + 'N'
                    elif 45 < az <= 135 or 225 < az <= 315:
                        trace.data = e
                        trace.stats.channel = trace.stats.channel[0:-1] + 'E'
                break
            else:
                continue
        return wfstream

    stations = get_stations(data)

    for station in stations:
        wf_station = data.select(station=station)
        wf_station = rotate_components(wf_station, metadata)
    return data

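# Usage sketch (illustrative): rotate misaligned channels to ZNE. The code
# above expects metadata as a pair whose second element is a dataless SEED
# parser; `invtype` and `parser` are hypothetical names.
#
#     data = check4rotated(data, metadata=(invtype, parser))
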
def scaleWFData(data, factor=None, components='all'):
    """
    produce scaled waveforms from given waveform data and a scaling factor,
    waveforms may be selected by their component names
    :param data: waveform data to be scaled
    :type data: `~obspy.core.stream.Stream` object
    :param factor: scaling factor
    :type factor: float
    :param components: components labels for the traces in data to be scaled by
    the scaling factor (optional, default: 'all')
    :type components: tuple
    :return: scaled waveform data
    :rtype: `~obspy.core.stream.Stream` object
    """
    # '!=' instead of 'is not': identity checks on str literals are unreliable
    if components != 'all':
        for comp in components:
            if factor is None:
                max_val = np.max(np.abs(data.select(component=comp)[0].data))
                data.select(component=comp)[0].data /= 2 * max_val
            else:
                data.select(component=comp)[0].data /= 2 * factor
    else:
        for tr in data:
            if factor is None:
                max_val = float(np.max(np.abs(tr.data)))
                tr.data /= 2 * max_val
            else:
                tr.data /= 2 * factor

    return data

def runProgram(cmd, parameter=None):
    """
    run an external program specified by cmd with parameters input returning the
    stdout output
    :param cmd: name of the command to run
    :type cmd: str
    :param parameter: filename of parameter file or parameter string
    :type parameter: str
    :return: stdout output
    :rtype: str
    """

    if parameter:
        cmd = cmd.strip()  # strip() returns a new string and must be reassigned
        cmd += ' %s 2>&1' % parameter

    # check_output collects stdout; tee mirrors it to stderr for live feedback
    return subprocess.check_output('{} | tee /dev/stderr'.format(cmd), shell=True)

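# Usage sketch (illustrative): run NonLinLoc with a control file; command and
# file names are hypothetical and the binary must be on the PATH.
#
#     output = runProgram('NLLoc', 'nlloc.in')
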
def which(program, infile=None):
    """
    takes a program name and returns the full path to the executable or None
    modified after: http://stackoverflow.com/questions/377017/test-if-executable-exists-in-python
    :param program: name of the desired external program
    :param infile: optional name of a PyLoT parameter file in ~/.pylot used
    to look up the NonLinLoc binary path (default: 'pylot.in')
    :return: full path of the executable file
    """
    try:
        from PySide.QtCore import QSettings
        settings = QSettings()
        for key in settings.allKeys():
            if 'binPath' in key:
                os.environ['PATH'] += ':{0}'.format(settings.value(key))
        if infile is None:
            # use default parameter-file name
            bpath = os.path.join(os.path.expanduser('~'), '.pylot', 'pylot.in')
        else:
            bpath = os.path.join(os.path.expanduser('~'), '.pylot', infile)

        if os.path.exists(bpath):
            nllocpath = ":" + PylotParameter(bpath).get('nllocbin')
            os.environ['PATH'] += nllocpath
    except ImportError as e:
        print(e)  # Python 3: ImportError has no .message attribute

    def is_exe(fpath):
        return os.path.exists(fpath) and os.access(fpath, os.X_OK)

    def ext_candidates(fpath):
        yield fpath
        for ext in os.environ.get("PATHEXT", "").split(os.pathsep):
            yield fpath + ext

    fpath, fname = os.path.split(program)
    if fpath:
        if is_exe(program):
            return program
    else:
        for path in os.environ["PATH"].split(os.pathsep):
            exe_file = os.path.join(path, program)
            for candidate in ext_candidates(exe_file):
                if is_exe(candidate):
                    return candidate

    return None

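# Usage sketch (illustrative): locate the NonLinLoc binary before calling it.
#
#     nlloc_exe = which('NLLoc')
#     if nlloc_exe is None:
#         print('NLLoc executable not found on PATH')
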
def loopIdentifyPhase(phase):
    '''
    Loop through phase string and try to recognize its type (P or S wave).
    Global variable ALTSUFFIX gives alternative suffixes for phases if they do not end with P, p or S, s.
    If ALTSUFFIX is not given, the function will cut the last letter of the phase string until the string ends
    with P or S.
    :param phase: phase name (str)
    :return: phase string with identifiable suffix, or None if the phase could not be identified
    '''
    from pylot.core.util.defaults import ALTSUFFIX

    phase_copy = phase
    while not identifyPhase(phase_copy):
        identified = False
        for alt_suf in ALTSUFFIX:
            if phase_copy.endswith(alt_suf):
                phase_copy = phase_copy.split(alt_suf)[0]
                identified = True
        if not identified:
            phase_copy = phase_copy[:-1]
        if len(phase_copy) < 1:
            print('Warning: Could not identify phase {}!'.format(phase))
            return
    return phase_copy

def identifyPhase(phase):
    '''
    Returns capital P or S if phase string is identified by last letter. Else returns False.
    :param phase: phase name (str)
    :return: 'P', 'S' or False
    '''
    # common phase suffixes for P and S
    common_P = ['P', 'p']
    common_S = ['S', 's']
    if phase[-1] in common_P:
        return 'P'
    if phase[-1] in common_S:
        return 'S'
    return False

def identifyPhaseID(phase):
    """
    returns the phase ID ('P' or 'S') for a given phase name, stripping
    alternative suffixes if necessary
    """
    return identifyPhase(loopIdentifyPhase(phase))

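# Usage sketch (illustrative): arbitrary phase labels are mapped onto their
# P/S phase ID.
#
#     identifyPhaseID('Pg')   # -> 'P'
#     identifyPhaseID('SmS')  # -> 'S'
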
def has_spe(pick):
    """
    returns the symmetric pick error ('spe') of a pick dictionary, or None if it is not set
    """
    if 'spe' not in pick:
        return None
    return pick['spe']

if __name__ == "__main__":
    import doctest

    doctest.testmod()
114
pylot/core/util/version.py
Normal file
@@ -0,0 +1,114 @@
# -*- coding: utf-8 -*-
# Author: Douglas Creager <dcreager@dcreager.net>
# This file is placed into the public domain.

# Calculates the current version number. If possible, this is the
# output of “git describe”, modified to conform to the versioning
# scheme that setuptools uses. If “git describe” returns an error
# (most likely because we're in an unpacked copy of a release tarball,
# rather than in a git working copy), then we fall back on reading the
# contents of the RELEASE-VERSION file.
#
# To use this script, simply import it in your setup.py file, and use the
# results of get_git_version() as your package version:
#
# from version import *
#
# setup(
#     version=get_git_version(),
#     .
#     .
#     .
# )
#
# This will automatically update the RELEASE-VERSION file, if
# necessary. Note that the RELEASE-VERSION file should *not* be
# checked into git; please add it to your top-level .gitignore file.
#
# You'll probably want to distribute the RELEASE-VERSION file in your
# sdist tarballs; to do this, just create a MANIFEST.in file that
# contains the following line:
#
# include RELEASE-VERSION

from __future__ import print_function

__all__ = ["get_git_version"]  # __all__ must be a list of names, not a plain string

# NO IMPORTS FROM PYLOT IN THIS FILE! (file gets used at installation time)
import os
import inspect
from subprocess import Popen, PIPE

script_dir = os.path.abspath(os.path.dirname(inspect.getfile(
    inspect.currentframe())))
PYLOT_ROOT = os.path.abspath(os.path.join(script_dir, os.pardir,
                                          os.pardir, os.pardir))
VERSION_FILE = os.path.join(PYLOT_ROOT, "pylot", "RELEASE-VERSION")

def call_git_describe(abbrev=4):
    try:
        p = Popen(['git', 'rev-parse', '--show-toplevel'],
                  cwd=PYLOT_ROOT, stdout=PIPE, stderr=PIPE)
        p.stderr.close()
        path = p.stdout.readlines()[0].strip().decode()  # decode bytes for Python 3
    except Exception:
        return None
    if os.path.normpath(path) != PYLOT_ROOT:
        return None
    try:
        p = Popen(['git', 'describe', '--dirty', '--abbrev=%d' % abbrev,
                   '--always'],
                  cwd=PYLOT_ROOT, stdout=PIPE, stderr=PIPE)
        p.stderr.close()
        line = p.stdout.readlines()[0].decode()  # decode bytes for Python 3
        # (this line prevents official releases)
        if "-" not in line and "." not in line:
            line = "0.0.0-g%s" % line
        return line.strip()
    except Exception:
        return None

def read_release_version():
    try:
        # a context manager guarantees the file handle is closed again
        with open(VERSION_FILE, "r") as fobj:
            version = fobj.readlines()[0]
        return version.strip()
    except Exception:
        return None


def write_release_version(version):
    with open(VERSION_FILE, "w") as fobj:
        fobj.write("%s\n" % version)

def get_git_version(abbrev=4):
    # Read in the version that's currently in RELEASE-VERSION.
    release_version = read_release_version()

    # First try to get the current version using “git describe”.
    version = call_git_describe(abbrev)

    # If that doesn't work, fall back on the value that's in
    # RELEASE-VERSION.
    if version is None:
        version = release_version

    # If we still don't have anything, that's an error.
    if version is None:
        return '0.0.0-tar/zipball'

    # If the current version is different from what's in the
    # RELEASE-VERSION file, update the file to be current.
    if version != release_version:
        write_release_version(version)

    # Finally, return the current version.
    return version

if __name__ == "__main__":
    print(get_git_version())