release version: 0.1a

release notes:
==============

Features:

- consistent manual phase picking through a predefined, SNR-dependent zoom level
- uniform uncertainty estimation from waveform properties for automatic and manual picks
- PDF representation and comparison of picks, taking the uncertainty intrinsically into account
- Richter and moment magnitude estimation
- location determination with an external installation of [NonLinLoc](http://alomax.free.fr/nlloc/index.html)

Known issues:

- magnitude estimation in manual PyLoT takes some time (instrument correction)

QtPyLoT.py (1177 lines, executable file)

README.md (92 lines)
@@ -1,2 +1,92 @@
# PyLoT

The Python picking and Localisation Tool

version: 0.1a

This Python library contains a graphical user interface for picking
seismic phases. This software needs [ObsPy][ObsPy]
and the PySide Qt4 bindings for Python to be installed first.

PILOT was originally developed in MathWorks' MATLAB. In order to
distribute PILOT without facing portability problems, it was decided
to redevelop the software package in Python. The great work of the ObsPy
group allows easy handling of large amounts of seismic data, and PyLoT
benefits considerably compared to the former MATLAB version.

The development of PyLoT is part of the joint research project MAGS2.

## Installation

At the moment there is no automatic installation procedure available for PyLoT.
The best way to install it is to clone the repository and add the path to your
Python path, for instance as in the sketch below.
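For example (a minimal sketch; `/path/to/pylot` is a placeholder for your local checkout):

    # Minimal sketch: make a cloned PyLoT checkout importable.
    # '/path/to/pylot' is a placeholder for your local clone.
    import sys
    sys.path.insert(0, '/path/to/pylot')

    import pylot  # should now resolve against the checkout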
| 
#### Prerequisites

In order to run PyLoT you need to install:

- python
- scipy
- numpy
- matplotlib
- obspy
- pyside

#### Some handwork

PyLoT needs a properties folder on your system to work. It should be situated in your home directory:

    mkdir ~/.pylot

In the next step you have to copy some files to this directory:

    cp path-to-pylot/inputs/pylot.in ~/.pylot/

For local distance seismicity:

    cp path-to-pylot/inputs/autoPyLoT_local.in ~/.pylot/autoPyLoT.in

For regional distance seismicity:

    cp path-to-pylot/inputs/autoPyLoT_regional.in ~/.pylot/autoPyLoT.in

And some extra information on filtering, error estimates (only needed for reading old PILOT data) and the Richter magnitude scaling relation:

    cp path-to-pylot/inputs/filter.in path-to-pylot/inputs/PILOT_TimeErrors.in path-to-pylot/inputs/richter_scaling.data ~/.pylot/

You may need to make some modifications to these files; in particular, folder names should be reviewed. The same setup can also be scripted, as in the sketch below.
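A minimal Python sketch of the steps above (assuming a local-seismicity setup; `PYLOT_ROOT` is a placeholder for your clone):

    # Minimal setup sketch: create ~/.pylot and copy the input files there.
    import os
    import shutil

    PYLOT_ROOT = '/path/to/pylot'  # placeholder: path to your PyLoT clone
    inputs = os.path.join(PYLOT_ROOT, 'inputs')
    dest = os.path.expanduser('~/.pylot')
    if not os.path.isdir(dest):
        os.mkdir(dest)

    # main parameter file plus the picker defaults for local seismicity
    shutil.copy(os.path.join(inputs, 'pylot.in'), dest)
    shutil.copy(os.path.join(inputs, 'autoPyLoT_local.in'),
                os.path.join(dest, 'autoPyLoT.in'))

    # extra information: filtering, PILOT time errors, Richter scaling
    for fname in ('filter.in', 'PILOT_TimeErrors.in', 'richter_scaling.data'):
        shutil.copy(os.path.join(inputs, fname), dest)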
| 
PyLoT has been tested on Mac OS X (10.11) and Debian Linux 8.

## Release notes

#### Features

- consistent manual phase picking through a predefined, SNR-dependent zoom level
- uniform uncertainty estimation from waveform properties for automatic and manual picks
- PDF representation and comparison of picks, taking the uncertainty intrinsically into account
- Richter and moment magnitude estimation
- location determination with an external installation of [NonLinLoc](http://alomax.free.fr/nlloc/index.html)

#### Known issues

- magnitude estimation in manual PyLoT takes some time (instrument correction)

We hope to solve these with the next release.

#### Staff

Original author(s): L. Kueperkoch, S. Wehling-Benatelli, M. Bischoff (PILOT)

Developer(s): S. Wehling-Benatelli, L. Kueperkoch, K. Olbert, M. Bischoff,
              C. Wollin, M. Rische, M. Paffrath

Others: A. Bruestle, T. Meier, W. Friederich

[ObsPy]: http://github.com/obspy/obspy/wiki

October 2016

autoPyLoT.py (270 lines, executable file)

@@ -0,0 +1,270 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-

from __future__ import print_function

import argparse
import glob
import os

from obspy import read_events

import pylot.core.loc.hsat as hsat
import pylot.core.loc.nll as nll
from pylot.core.analysis.magnitude import MomentMagnitude, RichterMagnitude
from pylot.core.io.data import Data
from pylot.core.io.inputs import AutoPickParameter
from pylot.core.pick.autopick import autopickevent, iteratepicker
from pylot.core.util.dataprocessing import restitute_data, read_metadata, \
    remove_underscores
from pylot.core.util.structure import DATASTRUCTURE
from pylot.core.util.version import get_git_version as _getVersionString

__version__ = _getVersionString()


def autoPyLoT(inputfile):
    """
    Determine phase onsets automatically, utilizing the automatic picking
    algorithms by Kueperkoch et al. (2010, 2012).

    :param inputfile: path to the input file containing all parameter
        information for automatic picking (for formatting details, see
        :class:`~pylot.core.io.inputs.AutoPickParameter`)
    :type inputfile: str
    :return:

    .. rubric:: Example

    """
    splash = '''************************************\n
                *********autoPyLoT starting*********\n
                The Python picking and Location Tool\n
                Version {version} 2015\n
                \n
                Authors:\n
                S. Wehling-Benatelli (Ruhr-Universität Bochum)\n
                L. Küperkoch (BESTEC GmbH, Landau i. d. Pfalz)\n
                K. Olbert (Christian-Albrechts Universität zu Kiel)\n
                ***********************************'''.format(version=_getVersionString())
    print(splash)
| 
    # reading parameter file
    parameter = AutoPickParameter(inputfile)

    data = Data()

    evt = None

    # getting information on data structure
    if parameter.hasParam('datastructure'):
        datastructure = DATASTRUCTURE[parameter.get('datastructure')]()
        dsfields = {'root': parameter.get('rootpath'),
                    'dpath': parameter.get('datapath'),
                    'dbase': parameter.get('database')}

        exf = ['root', 'dpath', 'dbase']

        if parameter.hasParam('eventID'):
            dsfields['eventID'] = parameter.get('eventID')
            exf.append('eventID')

        datastructure.modifyFields(**dsfields)
        datastructure.setExpandFields(exf)

        # check if default location routine NLLoc is available
        if parameter.hasParam('nllocbin'):
            locflag = 1
            # get NLLoc-root path
            nllocroot = parameter.get('nllocroot')
            # get path to NLLoc executable
            nllocbin = parameter.get('nllocbin')
            nlloccall = '%s/NLLoc' % nllocbin
            # get name of phase file
            phasef = parameter.get('phasefile')
            phasefile = '%s/obs/%s' % (nllocroot, phasef)
            # get name of NLLoc-control file
            ctrf = parameter.get('ctrfile')
            ctrfile = '%s/run/%s' % (nllocroot, ctrf)
            # pattern of NLLoc ttimes from location grid
            ttpat = parameter.get('ttpatter')
            # pattern of NLLoc-output file
            nllocoutpatter = parameter.get('outpatter')
            maxnumit = 3  # maximum number of iterations for re-picking
        else:
            locflag = 0
            print("                 !!!              ")
            print("!!No location routine available, autoPyLoT is running in non-location mode!!")
            print("!!No source parameter estimation possible!!")
            print("                 !!!              ")

        datapath = datastructure.expandDataPath()
        if not parameter.hasParam('eventID'):
            # multiple event processing: read each event in the database
            events = [evt_dir for evt_dir in glob.glob(os.path.join(datapath, '*'))
                      if os.path.isdir(evt_dir)]
        else:
            # single event processing
            events = glob.glob(os.path.join(datapath, parameter.get('eventID')))
        for event in events:
            data.setWFData(glob.glob(os.path.join(datapath, event, '*')))
            evID = os.path.split(event)[-1]
            print('Working on event %s' % event)
            print(data)
            wfdat = data.getWFData()  # all available streams
            wfdat = remove_underscores(wfdat)
            metadata = read_metadata(parameter.get('invdir'))
            corr_dat, rest_flag = restitute_data(wfdat.copy(), *metadata)
            ##########################################################
            # !automated picking starts here!
            picks = autopickevent(wfdat, parameter)
            ##########################################################
            # locating
            if locflag == 1:
                # write phases to NLLoc-phase file
                nll.export(picks, phasefile)

                # For locating the event, the NLLoc-control file has to be modified!
                nllocout = '%s_%s' % (evID, nllocoutpatter)
                # create comment line for NLLoc-control file
                nll.modify_inputs(ctrf, nllocroot, nllocout, phasef,
                                  ttpat)

                # locate the event
                nll.locate(ctrfile)

                # !iterative picking if traces remained unpicked or occupied with bad picks!
                # get theoretical onset times for picks with weights >= 4
                # in order to reprocess them using smaller time windows around the theoretical onset
                # get stations with bad onsets
                badpicks = []
                for key in picks:
                    if picks[key]['P']['weight'] >= 4 or picks[key]['S']['weight'] >= 4:
                        badpicks.append([key, picks[key]['P']['mpp']])

                # TODO keep code DRY (Don't Repeat Yourself): the following part
                # is written twice; suggestion: delete this block and modify the
                # later, similar block to work properly

                if len(badpicks) == 0:
                    print("autoPyLoT: No bad onsets found, thus no iterative picking necessary!")
                    # get NLLoc-location file
                    locsearch = '%s/loc/%s.????????.??????.grid?.loc.hyp' % (nllocroot, nllocout)
                    if len(glob.glob(locsearch)) > 0:
                        # get latest NLLoc-location file if several are available
                        nllocfile = max(glob.glob(locsearch), key=os.path.getctime)
                        evt = read_events(nllocfile)[0]
                        # calculating seismic moment Mo and moment magnitude Mw
                        moment_mag = MomentMagnitude(corr_dat, evt, parameter.get('vp'),
                                                     parameter.get('Qp'),
                                                     parameter.get('rho'), True, 0)
                        # update picks with moment property values (w0, fc, Mo)
                        for station, props in moment_mag.moment_props.items():
                            picks[station]['P'].update(props)
                        evt = moment_mag.updated_event()
                        local_mag = RichterMagnitude(corr_dat, evt,
                                                     parameter.get('sstop'), True, 0)
                        for station, amplitude in local_mag.amplitudes.items():
                            picks[station]['S']['Ao'] = amplitude.generic_amplitude
                        evt = local_mag.updated_event()
                    else:
                        print("autoPyLoT: No NLLoc-location file available!")
                        print("No source parameter estimation possible!")
                else:
                    # get theoretical P-onset times from NLLoc-location file
                    locsearch = '%s/loc/%s.????????.??????.grid?.loc.hyp' % (nllocroot, nllocout)
                    if len(glob.glob(locsearch)) > 0:
                        # get latest file if several are available
                        nllocfile = max(glob.glob(locsearch), key=os.path.getctime)
                        nlloccounter = 0
                        while len(badpicks) > 0 and nlloccounter <= maxnumit:
                            nlloccounter += 1
                            if nlloccounter > maxnumit:
                                print("autoPyLoT: Maximum number of iterations reached, stopping iterative picking!")
                                break
                            print("autoPyLoT: Starting with iteration No. %d ..." % nlloccounter)
                            picks = iteratepicker(wfdat, nllocfile, picks, badpicks, parameter)
                            # write phases to NLLoc-phase file
                            nll.export(picks, phasefile)
                            # remove current NLLoc-location file to keep only the latest one
                            os.remove(nllocfile)
                            # locate the event
                            nll.locate(ctrfile)
                            print("autoPyLoT: Iteration No. %d finished." % nlloccounter)
                            # get updated NLLoc-location file
                            nllocfile = max(glob.glob(locsearch), key=os.path.getctime)
                            # check for bad picks
                            badpicks = []
                            for key in picks:
                                if picks[key]['P']['weight'] >= 4 or picks[key]['S']['weight'] >= 4:
                                    badpicks.append([key, picks[key]['P']['mpp']])
                            print("autoPyLoT: After iteration No. %d: %d bad onsets found ..." %
                                  (nlloccounter, len(badpicks)))
                            if len(badpicks) == 0:
                                print("autoPyLoT: No more bad onsets found, stopping iterative picking!")
                                nlloccounter = maxnumit
                        evt = read_events(nllocfile)[0]
                        # calculating seismic moment Mo and moment magnitude Mw
                        moment_mag = MomentMagnitude(corr_dat, evt, parameter.get('vp'),
                                                     parameter.get('Qp'),
                                                     parameter.get('rho'), True, 0)
                        # update picks with moment property values (w0, fc, Mo)
                        for station, props in moment_mag.moment_props.items():
                            picks[station]['P'].update(props)
                        evt = moment_mag.updated_event()
                        local_mag = RichterMagnitude(corr_dat, evt,
                                                     parameter.get('sstop'), True, 0)
                        for station, amplitude in local_mag.amplitudes.items():
                            picks[station]['S']['Ao'] = amplitude.generic_amplitude
                        evt = local_mag.updated_event()
                        net_mw = moment_mag.net_magnitude()
                        print("Network moment magnitude: %4.1f" % net_mw.mag)
                    else:
                        print("autoPyLoT: No NLLoc-location file available! Stopping iteration!")
            ##########################################################
            # write phase files for various location routines
            # HYPO71
            hypo71file = '%s/autoPyLoT_HYPO71.pha' % event
            hsat.export(picks, hypo71file)
            data.applyEVTData(picks)
            if evt is not None:
                data.applyEVTData(evt, 'event')
            fnqml = '%s/autoPyLoT' % event
            data.exportEvent(fnqml)

            endsplash = '''------------------------------------------\n
                           -----Finished event {event}!-----\n
                           ------------------------------------------'''.format(event=evID)
            print(endsplash)
            if locflag == 0:
                print("autoPyLoT was running in non-location mode!")

    endsp = '''####################################\n
               ************************************\n
               *********autoPyLoT terminates*******\n
               The Python picking and Location Tool\n
               ************************************'''
    print(endsp)


if __name__ == "__main__":
    from pylot.core.util.defaults import AUTOMATIC_DEFAULTS

    # parse arguments
    parser = argparse.ArgumentParser(
        description='''autoPyLoT automatically picks phase onset times using higher order statistics,
                       autoregressive prediction and AIC''')

    parser.add_argument('-i', '-I', '--inputfile', type=str,
                        action='store',
                        help='''full path to the file containing the input
                        parameters for autoPyLoT''',
                        default=AUTOMATIC_DEFAULTS)
    parser.add_argument('-v', '-V', '--version', action='version',
                        version='autoPyLoT ' + __version__,
                        help='show version information and exit')

    cla = parser.parse_args()

    autoPyLoT(str(cla.inputfile))
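The script can be run from the shell as `python autoPyLoT.py -i <inputfile>`, or the pipeline can be driven directly from Python; a minimal sketch (the parameter-file path is a placeholder):

    # Minimal sketch: run the automatic picking pipeline from Python,
    # assuming the PyLoT checkout is on the Python path.
    import os

    from autoPyLoT import autoPyLoT

    autoPyLoT(os.path.expanduser('~/.pylot/autoPyLoT.in'))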
							
								
								
									
help/index.html (17 lines, normal file)

@@ -0,0 +1,17 @@
<html><head><title>PyLoT - the Python picking and Localisation Tool</title></head>
<body>
<p><b>PyLoT</b> is a program which is capable of picking seismic phases,
exporting these in numerous standard phase formats and localising the
corresponding seismic event with external software such as:</p>
<ul type="circle">
<li><a href="http://alomax.free.fr/nlloc/index.html">NonLinLoc</a></li>
<li>HypoInverse</li>
<li>HypoSat</li>
<li>whatever you want ...</li>
</ul>
<p>Read more on the
<a href="https://ariadne.geophysik.rub.de/trac/PyLoT/wiki/">PyLoT WikiPage</a>.</p>
<p>Bug reports are very much appreciated and can also be delivered on our
<a href="https://ariadne.geophysik.rub.de/trac/PyLoT">PyLoT TracPage</a> after
successful registration.</p>
</body></html>
							
								
								
									
icons.qrc (30 lines, normal file)

@@ -0,0 +1,30 @@
<RCC>
    <qresource>
        <file>icons/pylot.ico</file>
        <file>icons/pylot.png</file>
        <file>icons/locate.png</file>
        <file>icons/printer.png</file>
        <file>icons/delete.png</file>
        <file>icons/compare.png</file>
        <file>icons/key_E.png</file>
        <file>icons/key_N.png</file>
        <file>icons/key_P.png</file>
        <file>icons/key_Q.png</file>
        <file>icons/key_R.png</file>
        <file>icons/key_S.png</file>
        <file>icons/key_T.png</file>
        <file>icons/key_U.png</file>
        <file>icons/key_V.png</file>
        <file>icons/key_W.png</file>
        <file>icons/key_Z.png</file>
        <file>icons/filter.png</file>
        <file>icons/sync.png</file>
        <file>icons/zoom_0.png</file>
        <file>icons/zoom_in.png</file>
        <file>icons/zoom_out.png</file>
        <file>splash/splash.png</file>
    </qresource>
    <qresource prefix="/help">
        <file>help/index.html</file>
    </qresource>
</RCC>
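The icons_rc.py module listed below is presumably generated from this resource file with PySide's resource compiler (pyside-rcc); importing it registers the resources at runtime. A minimal sketch:

    # Minimal sketch: load an icon through the Qt resource system.
    # Assumes icons_rc.py was generated with: pyside-rcc icons.qrc -o icons_rc.py
    from PySide.QtGui import QApplication, QIcon

    import icons_rc  # noqa: F401 -- importing registers the resources

    app = QApplication([])             # a QApplication must exist for GUI objects
    icon = QIcon(':/icons/pylot.png')  # resource path as declared in icons.qrc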
							
								
								
									
										
Binary files added (icons, 2.2 to 22 KiB each): icons/compare.png, icons/delete.png,
icons/filter.png, icons/key_E.png, icons/key_N.png, icons/key_P.png, icons/key_Q.png,
icons/key_R.png, icons/key_S.png, icons/key_T.png, icons/key_U.png, icons/key_V.png,
icons/key_W.png, icons/key_Z.png, icons/locate.png, icons/printer.png, icons/pylot.ico,
icons/pylot.png, icons/sync.png, icons/zoom_0.png, icons/zoom_in.png, icons/zoom_out.png

icons_rc.py (21 lines, normal file)

inputs/PILOT_TimeErrors.in (3 lines, normal file)
@@ -0,0 +1,3 @@
## default time errors for old PILOT phases
0.04 0.08 0.16 0.32                 #timeerrorsP#   %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
0.04 0.08 0.16 0.32                 #timeerrorsS#   %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
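These two lines map the discrete PILOT picking weights to time uncertainties; a minimal sketch of how such a lookup can be applied (illustrative names, not PyLoT's API):

    # Minimal sketch: map PILOT picking weights [0 1 2 3] to discrete
    # time errors [s], mirroring the two lines above. Illustrative only.
    TIMEERRORS_P = [0.04, 0.08, 0.16, 0.32]

    def uncertainty_from_weight(weight, errors=TIMEERRORS_P):
        """Return the discrete time error [s] for a picking weight 0-3."""
        return errors[weight] if 0 <= weight < len(errors) else None

    print(uncertainty_from_weight(2))  # -> 0.16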
							
								
								
									
inputs/autoPyLoT.in (100 lines, normal file)

@@ -0,0 +1,100 @@
%This is a parameter input file for autoPyLoT.
%All main and special settings regarding data handling
%and picking are to be set here!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

#main settings#
/DATA/Insheim                       #rootpath#     %project path
EVENT_DATA/LOCAL                    #datapath#     %data path
2013.02_Insheim                     #database#     %name of data base
e0019.048.13                        #eventID#      %event ID for processing
True                                #apverbose#
PILOT                               #datastructure# %choose data structure
0                                   #iplot#        %flag for plotting: 0 none, 1 partly, >1 everything
AUTOPHASES_AIC_HOS4_ARH             #phasefile#    %name of autoPILOT output phase file
AUTOLOC_AIC_HOS4_ARH                #locfile#      %name of autoPILOT output location file
AUTOFOCMEC_AIC_HOS4_ARH.in          #focmecin#     %name of focmec input file containing polarities
HYPOSAT                             #locrt#        %location routine used ("HYPOINVERSE" or "HYPOSAT")
6                                   #pmin#         %minimum required P picks for location
4                                   #p0min#        %minimum required P picks for location if at least
                                                   %3 excellent P picks are found
2                                   #smin#         %minimum required S picks for location
/home/ludger/bin/run_HYPOSAT4autoPILOT.csh #cshellp# %path and name of c-shell script to run location routine
7.6 8.5                             #blon#         %longitude bounding for location map
49 49.4                             #blat#         %latitude bounding for location map
#parameters for moment magnitude estimation#
5000                                #vp#           %average P-wave velocity
2800                                #vs#           %average S-wave velocity
2200                                #rho#          %rock density [kg/m^3]
300                                 #Qp#           %quality factor for P waves
100                                 #Qs#           %quality factor for S waves
#common settings picker#
15                                  #pstart#       %start time [s] for calculating CF for P-picking
40                                  #pstop#        %end time [s] for calculating CF for P-picking
-1.0                                #sstart#       %start time [s] after or before (-) P-onset for calculating CF for S-picking
7                                   #sstop#        %end time [s] after P-onset for calculating CF for S-picking
2 20                                #bpz1#         %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
2 30                                #bpz2#         %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
2 15                                #bph1#         %lower/upper corner freq. of first band pass filter H-comp. [Hz]
2 20                                #bph2#         %lower/upper corner freq. of second band pass filter H-comp. [Hz]
#special settings for calculating CF#
%!!Be careful when editing the following!!
#Z-component#
HOS                                 #algoP#        %choose algorithm for P-onset determination (HOS, ARZ, or AR3)
7                                   #tlta#         %for HOS-/AR-AIC-picker, length of LTA window [s]
4                                   #hosorder#     %for HOS-picker, order of Higher Order Statistics
2                                   #Parorder#     %for AR-picker, order of AR process of Z-component
1.2                                 #tdet1z#       %for AR-picker, length of AR determination window [s] for Z-component, 1st pick
0.4                                 #tpred1z#      %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick
0.6                                 #tdet2z#       %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick
0.2                                 #tpred2z#      %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick
0.001                               #addnoise#     %add noise to seismogram for stable AR prediction
3 0.1 0.5 0.1                       #tsnrz#        %for HOS/AR, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
3                                   #pickwinP#     %for initial AIC pick, length of P-pick window [s]
8                                   #Precalcwin#   %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick)
0                                   #peps4aic#     %for HOS/AR, artificial uplift of samples of AIC function (P)
0.2                                 #aictsmooth#   %for HOS/AR, take average of samples for smoothing of AIC function [s]
0.1                                 #tsmoothP#     %for HOS/AR, take average of samples for smoothing CF [s]
0.001                               #ausP#         %for HOS/AR, artificial uplift of samples (aus) of CF (P)
1.3                                 #nfacP#        %for HOS/AR, noise factor for noise level determination (P)
#H-components#
ARH                                 #algoS#        %choose algorithm for S-onset determination (ARH or AR3)
0.8                                 #tdet1h#       %for HOS/AR, length of AR determination window [s], H-components, 1st pick
0.4                                 #tpred1h#      %for HOS/AR, length of AR prediction window [s], H-components, 1st pick
0.6                                 #tdet2h#       %for HOS/AR, length of AR determination window [s], H-components, 2nd pick
0.3                                 #tpred2h#      %for HOS/AR, length of AR prediction window [s], H-components, 2nd pick
4                                   #Sarorder#     %for AR-picker, order of AR process of H-components
6                                   #Srecalcwin#   %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H)
3                                   #pickwinS#     %for initial AIC pick, length of S-pick window [s]
2 0.2 1.5 0.5                       #tsnrh#        %for ARH/AR3, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
0.05                                #aictsmoothS#  %for AIC-picker, take average of samples for smoothing of AIC function [s]
0.02                                #tsmoothS#     %for AR-picker, take average of samples for smoothing CF [s] (S)
0.2                                 #pepsS#        %for AR-picker, artificial uplift of samples of CF (S)
0.4                                 #ausS#         %for HOS/AR, artificial uplift of samples (aus) of CF (S)
1.5                                 #nfacS#        %for AR-picker, noise factor for noise level determination (S)
%first-motion picker%
1                                   #minfmweight#  %minimum required P weight for first-motion determination
2                                   #minFMSNR#     %minimum required SNR for first-motion determination
0.2                                 #fmpickwin#    %pick window around P onset for calculating zero crossings
%quality assessment%
#initial AIC onset#
0.01 0.02 0.04 0.08                 #timeerrorsP#  %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
0.04 0.08 0.16 0.32                 #timeerrorsS#  %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
80                                  #minAICPslope# %below this slope [counts/s] the initial P pick is rejected
1.2                                 #minAICPSNR#   %below this SNR the initial P pick is rejected
50                                  #minAICSslope# %below this slope [counts/s] the initial S pick is rejected
1.5                                 #minAICSSNR#   %below this SNR the initial S pick is rejected
#check duration of signal using envelope function#
1.5                                 #prepickwin#   %pre-signal window length [s] for noise level estimation
0.7                                 #minsiglength# %minimum required length of signal [s]
0.2                                 #sgap#         %safety gap between noise and signal window [s]
2                                   #noisefactor#  %noiselevel*noisefactor=threshold
60                                  #minpercent#   %required percentage of samples above threshold
#check for spuriously picked S onsets#
3.0                                 #zfac#         %P amplitude must exceed zfac times RMS S amplitude
#jackknife processing for P picks#
3                                   #thresholdweight# %minimum required weight of picks
3                                   #dttolerance#  %maximum allowed deviation of P picks from median [s]
4                                   #minstats#     %minimum number of stations with reliable P picks
3                                   #Sdttolerance# %maximum allowed deviation from Wadati diagram
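Each non-comment line follows a `value  #key#  %comment` layout. A minimal sketch of a parser for this layout (a simplification; the actual reader is `pylot.core.io.inputs.AutoPickParameter`):

    # Minimal sketch: parse 'value  #key#  %comment' lines as used in the
    # autoPyLoT parameter files. A simplification, not PyLoT's implementation.
    import re

    def parse_parameter_file(path):
        params = {}
        pattern = re.compile(r'^(?P<value>[^#%]+)#(?P<key>\w+)#')
        with open(path) as fid:
            for line in fid:
                match = pattern.match(line)
                if match:
                    params[match.group('key')] = match.group('value').strip()
        return params

    # e.g. parse_parameter_file('autoPyLoT.in')['vp'] -> '5000'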
| 
inputs/autoPyLoT_local.in (99 lines, normal file)

@@ -0,0 +1,99 @@
%This is a parameter input file for autoPyLoT.
%All main and special settings regarding data handling
%and picking are to be set here!
%Parameters are optimized for local data sets!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#main settings#
/DATA/Insheim                       #rootpath#     %project path
EVENT_DATA/LOCAL                    #datapath#     %data path
2016.08_Insheim                     #database#     %name of data base
e0007.224.16                        #eventID#      %event ID for single event processing
/DATA/Insheim/STAT_INFO             #invdir#       %full path to inventory or dataless-seed file
PILOT                               #datastructure# %choose data structure
0                                   #iplot#        %flag for plotting: 0 none, 1 partly, >1 everything
True                                #apverbose#    %choose 'True' or 'False' for terminal output
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#NLLoc settings#
/home/ludger/NLLOC                  #nllocbin#     %path to NLLoc executable
/home/ludger/NLLOC/Insheim          #nllocroot#    %root of NLLoc-processing directory
AUTOPHASES.obs                      #phasefile#    %name of autoPyLoT-output phase file for NLLoc
                                                   %(in nllocroot/obs)
Insheim_min1d032016_auto.in         #ctrfile#      %name of autoPyLoT-output control file for NLLoc
                                                   %(in nllocroot/run)
ttime                               #ttpatter#     %pattern of NLLoc ttimes from grid
                                                   %(in nllocroot/times)
AUTOLOC_nlloc                       #outpatter#    %pattern of NLLoc-output file
                                                   %(returns 'eventID_outpatter')
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#parameters for seismic moment estimation#
3530                                #vp#           %average P-wave velocity
2500                                #rho#          %average rock density [kg/m^3]
300 0.8                             #Qp#           %quality factor for P waves ([Qp, ap], Qp*f^a)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
AUTOFOCMEC_AIC_HOS4_ARH.in          #focmecin#     %name of focmec input file containing derived polarities
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
#common settings picker#
15.0                                #pstart#       %start time [s] for calculating CF for P-picking
60.0                                #pstop#        %end time [s] for calculating CF for P-picking
-1.0                                #sstart#       %start time [s] relative to P-onset for calculating CF for S-picking
10.0                                #sstop#        %end time [s] after P-onset for calculating CF for S-picking
2 20                                #bpz1#         %lower/upper corner freq. of first band pass filter Z-comp. [Hz]
2 30                                #bpz2#         %lower/upper corner freq. of second band pass filter Z-comp. [Hz]
2 15                                #bph1#         %lower/upper corner freq. of first band pass filter H-comp. [Hz]
2 20                                #bph2#         %lower/upper corner freq. of second band pass filter H-comp. [Hz]
#special settings for calculating CF#
%!!Edit the following only if you know what you are doing!!%
#Z-component#
HOS                                 #algoP#        %choose algorithm for P-onset determination (HOS, ARZ, or AR3)
7.0                                 #tlta#         %for HOS-/AR-AIC-picker, length of LTA window [s]
4                                   #hosorder#     %for HOS-picker, order of Higher Order Statistics
2                                   #Parorder#     %for AR-picker, order of AR process of Z-component
1.2                                 #tdet1z#       %for AR-picker, length of AR determination window [s] for Z-component, 1st pick
0.4                                 #tpred1z#      %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick
0.6                                 #tdet2z#       %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick
0.2                                 #tpred2z#      %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick
0.001                               #addnoise#     %add noise to seismogram for stable AR prediction
3 0.1 0.5 0.5                       #tsnrz#        %for HOS/AR, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
3.0                                 #pickwinP#     %for initial AIC pick, length of P-pick window [s]
6.0                                 #Precalcwin#   %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick)
0.2                                 #aictsmooth#   %for HOS/AR, take average of samples for smoothing of AIC function [s]
0.1                                 #tsmoothP#     %for HOS/AR, take average of samples for smoothing CF [s]
0.001                               #ausP#         %for HOS/AR, artificial uplift of samples (aus) of CF (P)
1.3                                 #nfacP#        %for HOS/AR, noise factor for noise level determination (P)
#H-components#
ARH                                 #algoS#        %choose algorithm for S-onset determination (ARH or AR3)
0.8                                 #tdet1h#       %for HOS/AR, length of AR determination window [s], H-components, 1st pick
0.4                                 #tpred1h#      %for HOS/AR, length of AR prediction window [s], H-components, 1st pick
0.6                                 #tdet2h#       %for HOS/AR, length of AR determination window [s], H-components, 2nd pick
0.3                                 #tpred2h#      %for HOS/AR, length of AR prediction window [s], H-components, 2nd pick
4                                   #Sarorder#     %for AR-picker, order of AR process of H-components
5.0                                 #Srecalcwin#   %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H)
3.0                                 #pickwinS#     %for initial AIC pick, length of S-pick window [s]
2 0.2 1.5 0.5                       #tsnrh#        %for ARH/AR3, window lengths for SNR and slope estimation [tnoise, tsafety, tsignal, tslope] [s]
0.5                                 #aictsmoothS#  %for AIC-picker, take average of samples for smoothing of AIC function [s]
0.7                                 #tsmoothS#     %for AR-picker, take average of samples for smoothing CF [s] (S)
0.9                                 #ausS#         %for HOS/AR, artificial uplift of samples (aus) of CF (S)
1.5                                 #nfacS#        %for AR-picker, noise factor for noise level determination (S)
%first-motion picker%
1                                   #minfmweight#  %minimum required P weight for first-motion determination
2                                   #minFMSNR#     %minimum required SNR for first-motion determination
0.2                                 #fmpickwin#    %pick window around P onset for calculating zero crossings
%quality assessment%
#initial AIC onset#
0.01 0.02 0.04 0.08                 #timeerrorsP#  %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P
0.04 0.08 0.16 0.32                 #timeerrorsS#  %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S
4                                   #minAICPslope# %below this slope [counts/s] the initial P pick is rejected
1.2                                 #minAICPSNR#   %below this SNR the initial P pick is rejected
2                                   #minAICSslope# %below this slope [counts/s] the initial S pick is rejected
1.5                                 #minAICSSNR#   %below this SNR the initial S pick is rejected
#check duration of signal using envelope function#
3                                   #minsiglength# %minimum required length of signal [s]
1.0                                 #noisefactor#  %noiselevel*noisefactor=threshold
40                                  #minpercent#   %required percentage of samples above threshold
#check for spuriously picked S onsets#
2.0                                 #zfac#         %P amplitude must exceed at least zfac times RMS S amplitude
#check statistics of P onsets#
2.5                                 #mdttolerance# %maximum allowed deviation of P picks from median [s]
#wadati check#
1.0                                 #wdttolerance# %maximum allowed deviation from Wadati diagram
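The two-number `#Qp#` entry above ("300 0.8") encodes a frequency-dependent quality factor Q(f) = Qp * f^a, as its comment indicates; a minimal sketch of evaluating it (illustrative, not PyLoT's code):

    # Minimal sketch: evaluate the frequency-dependent quality factor
    # Q(f) = Qp * f**a for the entry '300 0.8' above. Illustrative only.
    def quality_factor(freq, qp=300.0, a=0.8):
        """Return Q(f) = Qp * f**a for a given frequency [Hz]."""
        return qp * freq ** a

    print(quality_factor(10.0))  # Q at 10 Hz -> ~1893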
| 
inputs/autoPyLoT_regional.in (100 lines, normal file)

@@ -0,0 +1,100 @@
| %This is a parameter input file for autoPyLoT. | ||||
| %All main and special settings regarding data handling | ||||
| %and picking are to be set here! | ||||
| %Parameters are optimized for regional data sets! | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| 
 | ||||
| #main settings# | ||||
| /DATA/Egelados                      #rootpath#     %project path | ||||
| EVENT_DATA/LOCAL		    #datapath#     %data path | ||||
| 2006.01_Nisyros                     #database#     %name of data base | ||||
| e1412.008.06                        #eventID#      %event ID for single event processing | ||||
| /DATA/Egelados/STAT_INFO            #invdir#       %full path to inventory or dataless-seed file | ||||
| PILOT				    #datastructure#  %choose data structure | ||||
| 0                                   #iplot#        %flag for plotting: 0 none, 1, partly, >1 everything | ||||
| True                                #apverbose#    %choose 'True' or 'False' for terminal output | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| #NLLoc settings# | ||||
| /home/ludger/NLLOC                  #nllocbin#     %path to NLLoc executable | ||||
| /home/ludger/NLLOC/Insheim          #nllocroot#    %root of NLLoc-processing directory | ||||
| AUTOPHASES.obs                      #phasefile#    %name of autoPyLoT-output phase file for NLLoc | ||||
|                                                    %(in nllocroot/obs) | ||||
| Insheim_min1d2015_auto.in           #ctrfile#      %name of autoPyLoT-output control file for NLLoc | ||||
|                                                    %(in nllocroot/run) | ||||
| ttime                               #ttpatter#     %pattern of NLLoc ttimes from grid | ||||
|                                                    %(in nllocroot/times) | ||||
| AUTOLOC_nlloc                       #outpatter#    %pattern of NLLoc-output file | ||||
|                                                    %(returns 'eventID_outpatter') | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| #parameters for seismic moment estimation# | ||||
| 3530                                #vp#           %average P-wave velocity | ||||
| 2700                                #rho#          %average rock density [kg/m^3] | ||||
| 1000f**0.8                          #Qp#           %quality factor for P waves (Qp*f^a) | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| AUTOFOCMEC_AIC_HOS4_ARH.in          #focmecin#     %name of focmec input file containing derived polarities | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| #common settings picker# | ||||
| 20                                  #pstart#       %start time [s] for calculating CF for P-picking | ||||
| 100                                 #pstop#        %end time [s] for calculating CF for P-picking | ||||
| 1.0                                 #sstart#       %start time [s] after or before(-) P-onset for calculating CF for S-picking | ||||
| 100                                 #sstop#        %end time [s] after P-onset for calculating CF for S-picking | ||||
| 3 10                                #bpz1#         %lower/upper corner freq. of first band pass filter Z-comp. [Hz] | ||||
| 3 12                                #bpz2#         %lower/upper corner freq. of second band pass filter Z-comp. [Hz] | ||||
| 3 8                                 #bph1#         %lower/upper corner freq. of first band pass filter H-comp. [Hz] | ||||
| 3 6                                 #bph2#         %lower/upper corner freq. of second band pass filter H-comp. [Hz] | ||||
| #special settings for calculating CF# | ||||
| %!!Be careful when editing the following!! | ||||
| #Z-component# | ||||
| HOS                                 #algoP#        %choose algorithm for P-onset determination (HOS, ARZ, or AR3) | ||||
| 7                                   #tlta#         %for HOS-/AR-AIC-picker, length of LTA window [s]  | ||||
| 4                                   #hosorder#     %for HOS-picker, order of Higher Order Statistics | ||||
| 2                                   #Parorder#     %for AR-picker, order of AR process of Z-component | ||||
| 1.2                                 #tdet1z#       %for AR-picker, length of AR determination window [s] for Z-component, 1st pick | ||||
| 0.4                                 #tpred1z#      %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick | ||||
| 0.6                                 #tdet2z#       %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick | ||||
| 0.2                                 #tpred2z#      %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick | ||||
| 0.001                               #addnoise#     %add noise to seismogram for stable AR prediction | ||||
| 5 0.2 3.0 1.5                       #tsnrz#        %for HOS/AR, window lengths for SNR-and slope estimation [tnoise,tsafetey,tsignal,tslope] [s] | ||||
| 3                                   #pickwinP#     %for initial AIC and refined pick, length of P-pick window [s] | ||||
| 8                                   #Precalcwin#   %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick) | ||||
| 1.0                                 #aictsmooth#   %for HOS/AR, take average of samples for smoothing of AIC-function [s] | ||||
| 0.3                                 #tsmoothP#     %for HOS/AR, take average of samples for smoothing CF [s] | ||||
| 0.3                                 #ausP#         %for HOS/AR, artificial uplift of samples (aus) of CF (P) | ||||
| 1.3                                 #nfacP#        %for HOS/AR, noise factor for noise level determination (P) | ||||
| #H-components# | ||||
| ARH                                 #algoS#        %choose algorithm for S-onset determination (ARH or AR3) | ||||
| 0.8                                 #tdet1h#       %for HOS/AR, length of AR-determination window [s], H-components, 1st pick | ||||
| 0.4                                 #tpred1h#      %for HOS/AR, length of AR-prediction window [s], H-components, 1st pick | ||||
| 0.6                                 #tdet2h#       %for HOS/AR, length of AR-determinaton window [s], H-components, 2nd pick | ||||
| 0.3                                 #tpred2h#      %for HOS/AR, length of AR-prediction window [s], H-components, 2nd pick | ||||
| 4                                   #Sarorder#     %for AR-picker, order of AR process of H-components | ||||
| 10                                  #Srecalcwin#   %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H) | ||||
| 25                                  #pickwinS#     %for initial AIC and refined pick, length of S-pick window [s] | ||||
| 5 0.2 3.0 3.0                       #tsnrh#        %for ARH/AR3, window lengths for SNR and slope estimation [tnoise,tsafety,tsignal,tslope] [s] | ||||
| 3.5                                 #aictsmoothS#  %for AIC-picker, take average of samples for smoothing of AIC-function [s] | ||||
| 1.0                                 #tsmoothS#     %for AR-picker, take average of samples for smoothing CF [s] (S) | ||||
| 0.2                                 #ausS#         %for HOS/AR, artificial uplift of samples (aus) of CF (S) | ||||
| 1.5                                 #nfacS#        %for AR-picker, noise factor for noise level determination (S) | ||||
| %first-motion picker% | ||||
| 1                                   #minfmweight#  %minimum required P weight for first-motion determination | ||||
| 2                                   #minFMSNR#     %minimum required SNR for first-motion determination | ||||
| 6.0                                 #fmpickwin#    %pick window around P onset for calculating zero crossings | ||||
| %quality assessment% | ||||
| #initial AIC onset# | ||||
| 0.04 0.08 0.16 0.32                 #timeerrorsP#   %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P | ||||
| 0.04 0.08 0.16 0.32                 #timeerrorsS#   %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S | ||||
| 3                                   #minAICPslope#  %below this slope [counts/s] the initial P pick is rejected | ||||
| 1.2                                 #minAICPSNR#    %below this SNR the initial P pick is rejected | ||||
| 5                                   #minAICSslope#  %below this slope [counts/s] the initial S pick is rejected | ||||
| 2.5                                 #minAICSSNR#    %below this SNR the initial S pick is rejected | ||||
| #check duration of signal using envelope function# | ||||
| 30                                  #minsiglength# %minimum required length of signal [s] | ||||
| 2.5                                 #noisefactor#  %noiselevel*noisefactor=threshold | ||||
| 60                                  #minpercent#   %required percentage of samples higher than threshold | ||||
| #check for spuriously picked S-onsets# | ||||
| 0.5                                 #zfac#         %P-amplitude must exceed at least zfac times RMS-S amplitude  | ||||
| #check statistics of P onsets# | ||||
| 45                                  #mdttolerance#  %maximum allowed deviation of P picks from median [s] | ||||
| #wadati check# | ||||
| 3.0                                 #wdttolerance# %maximum allowed deviation from Wadati-diagram | ||||
| 
 | ||||
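The four values of #timeerrorsP#/#timeerrorsS# above define the discrete pick weighting scheme used during quality assessment. As a minimal sketch (the helper below is hypothetical and not part of PyLoT), an estimated symmetric pick error in seconds could be mapped onto the weights 0-3 like this:

    def weight_from_error(spe, timeerrors=(0.04, 0.08, 0.16, 0.32)):
        """Map a symmetric pick error [s] onto the discrete weights 0-3."""
        for weight, bound in enumerate(timeerrors):
            if spe <= bound:
                return weight
        return 4  # worse than the largest bound: reject

    weight_from_error(0.05)  # -> 1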
inputs/filter.in (Normal file, 2 lines)
						| @ -0,0 +1,2 @@ | ||||
| P bandpass 4 2.0 20.0 | ||||
| S bandpass 4 2.0 15.0 | ||||
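Each line of filter.in defines the phase, the filter type, the number of corners, and the lower/upper corner frequencies [Hz]. Assuming a standard ObsPy stream, the P entry corresponds to a filter call like the following sketch (the waveform file name is a placeholder):

    from obspy import read

    st = read('mydata.mseed')  # hypothetical waveform file
    # P entry above: bandpass, 4 corners, 2.0-20.0 Hz
    st.filter('bandpass', freqmin=2.0, freqmax=20.0, corners=4, zerophase=False)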
inputs/pylot.in (Normal file, 98 lines)
						| @ -0,0 +1,98 @@ | ||||
| %This is an example parameter input file for PyLoT. | ||||
| %All main and special settings regarding data handling | ||||
| %and picking are to be set here! | ||||
| %Parameters shown here are optimized for local data sets! | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| #main settings# | ||||
| /data/Geothermie/Insheim            #rootpath#      %project path | ||||
| EVENT_DATA/LOCAL                    #datapath#      %data path | ||||
| 2013.02_Insheim                     #database#      %name of database | ||||
| e0019.048.13                        #eventID#       %event ID for single event processing | ||||
| /data/Geothermie/Insheim/STAT_INFO  #invdir#        %full path to inventory or dataless-seed file | ||||
| PILOT                               #datastructure# %choose data structure | ||||
| 0                                   #iplot#         %flag for plotting: 0 none, 1 partly, >1 everything | ||||
| True                                #apverbose#     %choose 'True' or 'False' for terminal output | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| #NLLoc settings# | ||||
| /progs/bin                          #nllocbin#      %path to NLLoc executable | ||||
| /data/Geothermie/Insheim/LOCALISATION/NLLoc          #nllocroot#     %root of NLLoc-processing directory | ||||
| AUTOPHASES.obs                      #phasefile#     %name of autoPyLoT-output phase file for NLLoc | ||||
|                                                     %(in nllocroot/obs) | ||||
| Insheim_min1d2015.in                #ctrfile#       %name of PyLoT-output control file for NLLoc | ||||
|                                                     %(in nllocroot/run) | ||||
| ttime                               #ttpatter#      %pattern of NLLoc ttimes from grid | ||||
|                                                     %(in nllocroot/times) | ||||
| AUTOLOC_nlloc                       #outpatter#     %pattern of NLLoc-output file | ||||
|                                                     %(returns 'eventID_outpatter') | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| #parameters for seismic moment estimation# | ||||
| 3530                                #vp#            %average P-wave velocity [m/s] | ||||
| 2500                                #rho#           %average rock density [kg/m^3] | ||||
| 300 0.8                             #Qp#            %quality factor for P waves (Qp*f^a); list(Qp, a) | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| AUTOFOCMEC_AIC_HOS4_ARH.in          #focmecin#      %name of focmec input file containing derived polarities | ||||
| %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% | ||||
| #common settings picker# | ||||
| 15.0                                #pstart#        %start time [s] for calculating CF for P-picking | ||||
| 60.0                                #pstop#         %end time [s] for calculating CF for P-picking | ||||
| -1.0                                #sstart#        %start time [s] relative to P-onset for calculating CF for S-picking | ||||
| 10.0                                #sstop#         %end time [s] after P-onset for calculating CF for S-picking | ||||
| 2 20                                #bpz1#          %lower/upper corner freq. of first band pass filter Z-comp. [Hz] | ||||
| 2 30                                #bpz2#          %lower/upper corner freq. of second band pass filter Z-comp. [Hz] | ||||
| 2 15                                #bph1#          %lower/upper corner freq. of first band pass filter H-comp. [Hz] | ||||
| 2 20                                #bph2#          %lower/upper corner freq. of second band pass filter H-comp. [Hz] | ||||
| #special settings for calculating CF# | ||||
| %!!Edit the following only if you know what you are doing!!% | ||||
| #Z-component# | ||||
| HOS                                 #algoP#         %choose algorithm for P-onset determination (HOS, ARZ, or AR3) | ||||
| 7.0                                 #tlta#          %for HOS-/AR-AIC-picker, length of LTA window [s] | ||||
| 4                                   #hosorder#      %for HOS-picker, order of Higher Order Statistics | ||||
| 2                                   #Parorder#      %for AR-picker, order of AR process of Z-component | ||||
| 1.2                                 #tdet1z#        %for AR-picker, length of AR determination window [s] for Z-component, 1st pick | ||||
| 0.4                                 #tpred1z#       %for AR-picker, length of AR prediction window [s] for Z-component, 1st pick | ||||
| 0.6                                 #tdet2z#        %for AR-picker, length of AR determination window [s] for Z-component, 2nd pick | ||||
| 0.2                                 #tpred2z#       %for AR-picker, length of AR prediction window [s] for Z-component, 2nd pick | ||||
| 0.001                               #addnoise#      %add noise to seismogram for stable AR prediction | ||||
| 3 0.1 0.5 0.5                       #tsnrz#         %for HOS/AR, window lengths for SNR and slope estimation [tnoise,tsafety,tsignal,tslope] [s] | ||||
| 3.0                                 #pickwinP#      %for initial AIC pick, length of P-pick window [s] | ||||
| 6.0                                 #Precalcwin#    %for HOS/AR, window length [s] for recalculation of CF (relative to 1st pick) | ||||
| 0.2                                 #aictsmooth#    %for HOS/AR, take average of samples for smoothing of AIC-function [s] | ||||
| 0.1                                 #tsmoothP#      %for HOS/AR, take average of samples for smoothing CF [s] | ||||
| 0.001                               #ausP#          %for HOS/AR, artificial uplift of samples (aus) of CF (P) | ||||
| 1.3                                 #nfacP#         %for HOS/AR, noise factor for noise level determination (P) | ||||
| #H-components# | ||||
| ARH                                 #algoS#         %choose algorithm for S-onset determination (ARH or AR3) | ||||
| 0.8                                 #tdet1h#        %for HOS/AR, length of AR-determination window [s], H-components, 1st pick | ||||
| 0.4                                 #tpred1h#       %for HOS/AR, length of AR-prediction window [s], H-components, 1st pick | ||||
| 0.6                                 #tdet2h#        %for HOS/AR, length of AR-determination window [s], H-components, 2nd pick | ||||
| 0.3                                 #tpred2h#       %for HOS/AR, length of AR-prediction window [s], H-components, 2nd pick | ||||
| 4                                   #Sarorder#      %for AR-picker, order of AR process of H-components | ||||
| 5.0                                 #Srecalcwin#    %for AR-picker, window length [s] for recalculation of CF (2nd pick) (H) | ||||
| 3.0                                 #pickwinS#      %for initial AIC pick, length of S-pick window [s] | ||||
| 2 0.2 1.5 0.5                       #tsnrh#         %for ARH/AR3, window lengths for SNR and slope estimation [tnoise,tsafety,tsignal,tslope] [s] | ||||
| 0.5                                 #aictsmoothS#   %for AIC-picker, take average of samples for smoothing of AIC-function [s] | ||||
| 0.7                                 #tsmoothS#      %for AR-picker, take average of samples for smoothing CF [s] (S) | ||||
| 0.9                                 #ausS#          %for HOS/AR, artificial uplift of samples (aus) of CF (S) | ||||
| 1.5                                 #nfacS#         %for AR-picker, noise factor for noise level determination (S) | ||||
| %first-motion picker% | ||||
| 1                                   #minfmweight#   %minimum required P weight for first-motion determination | ||||
| 2                                   #minFMSNR#      %minimum required SNR for first-motion determination | ||||
| 0.2                                 #fmpickwin#     %pick window around P onset for calculating zero crossings | ||||
| %quality assessment% | ||||
| #initial AIC onset# | ||||
| 0.01 0.02 0.04 0.08                 #timeerrorsP#   %discrete time errors [s] corresponding to picking weights [0 1 2 3] for P | ||||
| 0.04 0.08 0.16 0.32                 #timeerrorsS#   %discrete time errors [s] corresponding to picking weights [0 1 2 3] for S | ||||
| 4                                   #minAICPslope#  %below this slope [counts/s] the initial P pick is rejected | ||||
| 1.2                                 #minAICPSNR#    %below this SNR the initial P pick is rejected | ||||
| 2                                   #minAICSslope#  %below this slope [counts/s] the initial S pick is rejected | ||||
| 1.5                                 #minAICSSNR#    %below this SNR the initial S pick is rejected | ||||
| #check duration of signal using envelope function# | ||||
| 3                                   #minsiglength#  %minimum required length of signal [s] | ||||
| 1.0                                 #noisefactor#   %noiselevel*noisefactor=threshold | ||||
| 40                                  #minpercent#    %required percentage of samples higher than threshold | ||||
| #check for spuriously picked S-onsets# | ||||
| 2.0                                 #zfac#          %P-amplitude must exceed at least zfac times RMS-S amplitude | ||||
| #check statistics of P onsets# | ||||
| 2.5                                 #mdttolerance#  %maximum allowed deviation of P picks from median [s] | ||||
| #wadati check# | ||||
| 1.0                                 #wdttolerance#  %maximum allowed deviation from Wadati-diagram | ||||
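Every parameter line in pylot.in follows the pattern "value  #key#  %comment". A minimal sketch of how such a line can be split (the helper is hypothetical; PyLoT's actual reader may differ):

    def parse_parameter_line(line):
        """Return (key, value) for a 'value #key# %comment' line, else None."""
        line = line.split('%', 1)[0]  # drop the trailing %comment
        parts = line.split('#')
        if len(parts) < 3 or not parts[0].strip():
            return None  # separator, comment-only or section-header line
        return parts[1].strip(), parts[0].strip()

    parse_parameter_line('3530    #vp#    %average P-wave velocity [m/s]')
    # -> ('vp', '3530')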
inputs/richter_scaling.data (Normal file, 53 lines)
						| @ -0,0 +1,53 @@ | ||||
|  0 1.4 | ||||
|  10 1.5 | ||||
|  20 1.7 | ||||
|  25 1.9 | ||||
|  30 2.1 | ||||
|  35 2.3 | ||||
|  40 2.4 | ||||
|  45 2.5 | ||||
|  50 2.6 | ||||
|  60 2.8 | ||||
|  70 2.8 | ||||
|  75 2.9 | ||||
|  85 2.9 | ||||
|  90 3.0 | ||||
| 100 3.0 | ||||
| 110 3.1 | ||||
| 120 3.1 | ||||
| 130 3.2 | ||||
| 140 3.2 | ||||
| 150 3.3 | ||||
| 160 3.3 | ||||
| 170 3.4 | ||||
| 180 3.4 | ||||
| 190 3.5 | ||||
| 200 3.5 | ||||
| 210 3.6 | ||||
| 230 3.7 | ||||
| 240 3.7 | ||||
| 250 3.8 | ||||
| 260 3.8 | ||||
| 270 3.9 | ||||
| 280 3.9 | ||||
| 290 4.0 | ||||
| 300 4.0 | ||||
| 310 4.1 | ||||
| 320 4.2 | ||||
| 330 4.2 | ||||
| 340 4.2 | ||||
| 350 4.3 | ||||
| 360 4.3 | ||||
| 370 4.3 | ||||
| 380 4.4 | ||||
| 390 4.4 | ||||
| 400 4.5 | ||||
| 430 4.6 | ||||
| 470 4.7 | ||||
| 510 4.8 | ||||
| 560 4.9 | ||||
| 600 5.1 | ||||
| 700 5.2 | ||||
| 800 5.4 | ||||
| 900 5.5 | ||||
| 1000 5.7 | ||||
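The two columns are hypocentral distance [km] and the corresponding Richter magnitude scaling correction. PyLoT fits a curve to this table (see richter_magnitude_scaling in pylot/core/analysis/magnitude.py below); a simpler sketch using plain linear interpolation, assuming the file is available in the working directory:

    import numpy as np

    relation = np.loadtxt('richter_scaling.data')  # columns: distance [km], correction
    correction = np.interp(35.0, relation[:, 0], relation[:, 1])
    # -> 2.3 at 35 km, matching the table above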
makePyLoT.py (Normal file, 223 lines)
						| @ -0,0 +1,223 @@ | ||||
| #!/usr/bin/env python | ||||
| # encoding: utf-8 | ||||
| from __future__ import print_function | ||||
| 
 | ||||
| """ | ||||
| makePyLoT -- build and install PyLoT | ||||
| 
 | ||||
| makePyLoT is a Python make file that establishes the folder structure | ||||
| and meets the installation prerequisites | ||||
| 
 | ||||
| It defines | ||||
| :class CLIError: | ||||
| :method main: | ||||
| 
 | ||||
| :author:     Sebastian Wehling-Benatelli | ||||
| 
 | ||||
| :copyright:  2014 MAGS2 EP3 Working Group. All rights reserved. | ||||
| 
 | ||||
| :license:    GNU Lesser General Public License, Version 3 | ||||
|     (http://www.gnu.org/copyleft/lesser.html) | ||||
| 
 | ||||
| :contact:    sebastian.wehling@rub.de | ||||
| 
 | ||||
| """ | ||||
| 
 | ||||
| import glob | ||||
| import os | ||||
| import sys | ||||
| import shutil | ||||
| import copy | ||||
| 
 | ||||
| from argparse import ArgumentParser | ||||
| from argparse import RawDescriptionHelpFormatter | ||||
| 
 | ||||
| __all__ = [] | ||||
| __version__ = 0.1 | ||||
| __date__ = '2014-11-26' | ||||
| __updated__ = '2016-04-28' | ||||
| 
 | ||||
| DEBUG = 0 | ||||
| TESTRUN = 0 | ||||
| PROFILE = 0 | ||||
| 
 | ||||
| 
 | ||||
| class CLIError(Exception): | ||||
|     """Generic exception to raise and log different fatal errors.""" | ||||
| 
 | ||||
|     def __init__(self, msg): | ||||
|         super(CLIError, self).__init__(msg) | ||||
|         self.msg = "E: %s" % msg | ||||
| 
 | ||||
|     def __str__(self): | ||||
|         return self.msg | ||||
| 
 | ||||
|     def __unicode__(self): | ||||
|         return self.msg | ||||
| 
 | ||||
| 
 | ||||
| def main(argv=None):  # IGNORE:C0111 | ||||
|     '''Command line options.''' | ||||
| 
 | ||||
|     if argv is None: | ||||
|         argv = sys.argv | ||||
|     else: | ||||
|         sys.argv.extend(argv) | ||||
| 
 | ||||
|     program_name = os.path.basename(sys.argv[0]) | ||||
|     program_version = "v%s" % __version__ | ||||
|     program_build_date = str(__updated__) | ||||
|     program_version_message = 'makePyLoT %s (%s)' % ( | ||||
|         program_version, program_build_date) | ||||
|     program_shortdesc = __import__('__main__').__doc__.split("\n")[1] | ||||
|     program_license = '''{0:s} | ||||
| 
 | ||||
|     Created by Sebastian Wehling-Benatelli on {1:s}. | ||||
|     Copyright 2014 MAGS2 EP3 Working Group. All rights reserved. | ||||
| 
 | ||||
|     GNU Lesser General Public License, Version 3 | ||||
|     (http://www.gnu.org/copyleft/lesser.html) | ||||
| 
 | ||||
|     Distributed on an "AS IS" basis without warranties | ||||
|     or conditions of any kind, either express or implied. | ||||
| 
 | ||||
|     USAGE | ||||
|     '''.format(program_shortdesc, str(__date__)) | ||||
| 
 | ||||
|     try: | ||||
|         # Setup argument parser | ||||
|         parser = ArgumentParser(description=program_license, | ||||
|                                 formatter_class=RawDescriptionHelpFormatter) | ||||
|         parser.add_argument("-b", "--build", dest="build", action="store_true", | ||||
|                             help="build PyLoT") | ||||
|         parser.add_argument("-v", "--verbose", dest="verbose", action="count", | ||||
|                             help="set verbosity level") | ||||
|         parser.add_argument("-i", "--install", dest="install", | ||||
|                             action="store_true", | ||||
|                             help="install PyLoT on the system") | ||||
|         parser.add_argument("-d", "--directory", dest="directory", | ||||
|                             help="installation directory", metavar="RE") | ||||
|         parser.add_argument('-V', '--version', action='version', | ||||
|                             version=program_version_message) | ||||
| 
 | ||||
|         # Process arguments | ||||
|         args = parser.parse_args() | ||||
| 
 | ||||
|         verbose = args.verbose | ||||
|         build = args.build | ||||
|         install = args.install | ||||
|         directory = args.directory | ||||
| 
 | ||||
|         if verbose > 0: | ||||
|             print("Verbose mode on") | ||||
|         if install and not directory: | ||||
|             raise CLIError("""Trying to install without appropriate | ||||
|                            destination; please specify an installation | ||||
|                            directory!""") | ||||
|         if build and install: | ||||
|             print("Building and installing PyLoT ...\n") | ||||
|             buildPyLoT(verbose) | ||||
|             installPyLoT(verbose) | ||||
|         elif build and not install: | ||||
|             print("Building PyLoT without installing! Please wait ...\n") | ||||
|             buildPyLoT(verbose) | ||||
|         cleanUp() | ||||
|         return 0 | ||||
|     except KeyboardInterrupt: | ||||
|         cleanUp(1) | ||||
|         return 0 | ||||
|     except Exception as e: | ||||
|         if DEBUG or TESTRUN: | ||||
|             raise e | ||||
|         indent = len(program_name) * " " | ||||
|         sys.stderr.write(program_name + ": " + repr(e) + "\n") | ||||
|         sys.stderr.write(indent + "  for help use --help") | ||||
|         return 2 | ||||
| 
 | ||||
| 
 | ||||
| def buildPyLoT(verbosity=None): | ||||
|     system = sys.platform | ||||
|     if verbosity > 1: | ||||
|         msg = ("... on system: {0}\n" | ||||
|                "\n" | ||||
|                "              Current working directory: {1}\n" | ||||
|                ).format(system, os.getcwd()) | ||||
|         print(msg) | ||||
|     if system.startswith(('win', 'microsoft')): | ||||
|         raise CLIError( | ||||
|             "building on Windows system not tested yet; implementation pending") | ||||
|     elif system == 'darwin': | ||||
|         # create a symbolic link to the desired python interpreter in order to | ||||
|         # display the right application name | ||||
|         for path in os.getenv('PATH').split(':'): | ||||
|             found = glob.glob(os.path.join(path, 'python')) | ||||
|             if found: | ||||
|                 os.symlink(found[0], './PyLoT') | ||||
|                 break | ||||
| 
 | ||||
| 
 | ||||
| def installPyLoT(verbosity=None): | ||||
|     files_to_copy = {'autoPyLoT_local.in':['~', '.pylot'], | ||||
|                      'autoPyLoT_regional.in':['~', '.pylot'], | ||||
|                      'filter.in':['~', '.pylot']} | ||||
|     if verbosity > 0: | ||||
|         print ('starting installation of PyLoT ...') | ||||
|     if verbosity > 1: | ||||
|         print ('copying input files into destination folder ...') | ||||
|     ans = raw_input('please specify scope of interest ' | ||||
|                     '([0]=local, 1=regional): ') or '0' | ||||
|     ans = 'local' if int(ans) == 0 else 'regional' | ||||
|     link_dest = [] | ||||
|     for file, destination in files_to_copy.items(): | ||||
|         link_file = ans in file | ||||
|         if link_file: | ||||
|             link_dest = copy.deepcopy(destination) | ||||
|             link_dest.append('autoPyLoT.in') | ||||
|             link_dest = os.path.expanduser(os.path.join(*link_dest)) | ||||
|         destination.append(file) | ||||
|         destination = os.path.expanduser(os.path.join(*destination)) | ||||
|         srcfile = os.path.join('inputs', file) | ||||
|         assert not os.path.isabs(srcfile), 'source files seem to be ' \ | ||||
|                                            'corrupted ...' | ||||
|         if verbosity > 1: | ||||
|             print ('copying file {file} to folder {dest}'.format(file=file, dest=destination)) | ||||
|         shutil.copyfile(srcfile, destination) | ||||
|         if link_file: | ||||
|             if verbosity: | ||||
|                 print('linking input file for autoPyLoT ...') | ||||
|             os.symlink(destination, link_dest) | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| def cleanUp(verbosity=0): | ||||
|     if verbosity >= 1: | ||||
|         print('cleaning up build files...') | ||||
|     if sys.platform == 'darwin': | ||||
|         os.remove('./PyLoT') | ||||
| 
 | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     if DEBUG: | ||||
|         sys.argv.append("-h") | ||||
|         sys.argv.append("-v") | ||||
|     if TESTRUN: | ||||
|         import doctest | ||||
| 
 | ||||
|         doctest.testmod() | ||||
|     if PROFILE: | ||||
|         import cProfile | ||||
|         import pstats | ||||
| 
 | ||||
|         profile_filename = 'makePyLoT_profile.txt' | ||||
|         cProfile.run('main()', profile_filename) | ||||
|         statsfile = open("profile_stats.txt", "wb") | ||||
|         p = pstats.Stats(profile_filename, stream=statsfile) | ||||
|         stats = p.strip_dirs().sort_stats('cumulative') | ||||
|         stats.print_stats() | ||||
|         statsfile.close() | ||||
|         sys.exit(0) | ||||
|     sys.exit(main()) | ||||
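Assuming the script is run from the repository root, a combined build and install invocation would look like this (the target directory is an arbitrary example):

    python makePyLoT.py --build --install --directory ~/pylot --verbose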
pylot/PyLoT.ico (Binary file, 2.2 KiB)
pylot/RELEASE-VERSION (Normal file, 1 line)
						| @ -0,0 +1 @@ | ||||
| 0.1a | ||||
pylot/__init__.py (Executable file, 27 lines)
						| @ -0,0 +1,27 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| # -------------------------------------------------------- | ||||
| # Purpose: Convenience imports for PyLoT | ||||
| # | ||||
| ''' | ||||
| ================================================ | ||||
| PyLoT - the Python picking and Localization Tool | ||||
| ================================================ | ||||
| 
 | ||||
| This Python library contains a graphical user interface for picking | ||||
| seismic phases. This software needs ObsPy (http://github.com/obspy/obspy/wiki) | ||||
| and the Qt4 libraries to be installed first. | ||||
| 
 | ||||
| PILOT has been developed in Mathworks' MatLab. In order to distribute | ||||
| PILOT without facing portability problems, it has been decided to | ||||
| redevelop the software package in Python. The great work of the ObsPy | ||||
| group allows easy handling of a bunch of seismic data and PyLoT will | ||||
| benefit a lot compared to the former MatLab version. | ||||
| 
 | ||||
| The development of PyLoT is part of the joint research project MAGS2. | ||||
| 
 | ||||
| :copyright: | ||||
|     The PyLoT Development Team | ||||
| :license: | ||||
|     GNU Lesser General Public License, Version 3 | ||||
|     (http://www.gnu.org/copyleft/lesser.html) | ||||
| ''' | ||||
pylot/core/__init__.py (Executable file, 1 line)
						| @ -0,0 +1 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
pylot/core/analysis/__init__.py (Normal file, 1 line)
						| @ -0,0 +1 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
pylot/core/analysis/magnitude.py (Normal file, 694 lines)
						| @ -0,0 +1,694 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
| Created autumn/winter 2015. | ||||
| 
 | ||||
| :author: Ludger Küperkoch / MAGS2 EP3 working group | ||||
| """ | ||||
| import os | ||||
| 
 | ||||
| import matplotlib.pyplot as plt | ||||
| import numpy as np | ||||
| import obspy.core.event as ope | ||||
| from obspy.geodetics import degrees2kilometers | ||||
| from scipy import integrate, signal | ||||
| from scipy.optimize import curve_fit | ||||
| 
 | ||||
| from pylot.core.pick.utils import getsignalwin, crossings_nonzero_all, \ | ||||
|     select_for_phase | ||||
| from pylot.core.util.utils import common_range, fit_curve | ||||
| 
 | ||||
| 
 | ||||
| def richter_magnitude_scaling(delta): | ||||
|     relation = np.loadtxt(os.path.join(os.path.expanduser('~'), | ||||
|                                        '.pylot', 'richter_scaling.data')) | ||||
|     # prepare spline interpolation to calculate return value | ||||
|     func, params = fit_curve(relation[:, 0], relation[:, 1]) | ||||
|     return func(delta, params) | ||||
| 
 | ||||
| 
 | ||||
| class Magnitude(object): | ||||
|     """ | ||||
|     Base class object for Magnitude calculation within PyLoT. | ||||
|     """ | ||||
| 
 | ||||
|     def __init__(self, stream, event, verbosity=False, iplot=0): | ||||
|         self._type = "M" | ||||
|         self._plot_flag = iplot | ||||
|         self._verbosity = verbosity | ||||
|         self._event = event | ||||
|         self._stream = stream | ||||
|         self._magnitudes = dict() | ||||
| 
 | ||||
|     def __str__(self): | ||||
|         s = 'number of stations used: {0}\n'.format(len(self.magnitudes)) | ||||
|         s += '\tstation\tmagnitude\n' | ||||
|         for station, mag in self.magnitudes.items(): | ||||
|             s += '\t{0}\t{1}\n'.format(station, mag) | ||||
|         return s | ||||
| 
 | ||||
|     def __nonzero__(self): | ||||
|         return bool(self.magnitudes) | ||||
| 
 | ||||
|     @property | ||||
|     def type(self): | ||||
|         return self._type | ||||
| 
 | ||||
|     @property | ||||
|     def plot_flag(self): | ||||
|         return self._plot_flag | ||||
| 
 | ||||
|     @plot_flag.setter | ||||
|     def plot_flag(self, value): | ||||
|         self._plot_flag = value | ||||
| 
 | ||||
|     @property | ||||
|     def verbose(self): | ||||
|         return self._verbosity | ||||
| 
 | ||||
|     @verbose.setter | ||||
|     def verbose(self, value): | ||||
|         if not isinstance(value, bool): | ||||
|             print('WARNING: only boolean values accepted...\n') | ||||
|             value = bool(value) | ||||
|         self._verbosity = value | ||||
| 
 | ||||
|     @property | ||||
|     def stream(self): | ||||
|         return self._stream | ||||
| 
 | ||||
|     @stream.setter | ||||
|     def stream(self, value): | ||||
|         self._stream = value | ||||
| 
 | ||||
|     @property | ||||
|     def event(self): | ||||
|         return self._event | ||||
| 
 | ||||
|     @property | ||||
|     def origin_id(self): | ||||
|         return self._event.origins[0].resource_id | ||||
| 
 | ||||
|     @property | ||||
|     def arrivals(self): | ||||
|         return self._event.origins[0].arrivals | ||||
| 
 | ||||
|     @property | ||||
|     def magnitudes(self): | ||||
|         return self._magnitudes | ||||
| 
 | ||||
|     @magnitudes.setter | ||||
|     def magnitudes(self, value): | ||||
|         """ | ||||
|         takes a tuple and saves the key value pair to private | ||||
|         attribute _magnitudes | ||||
|         :param value: station, magnitude value pair | ||||
|         :type value: tuple or list | ||||
|         :return: | ||||
|         """ | ||||
|         station, magnitude = value | ||||
|         self._magnitudes[station] = magnitude | ||||
| 
 | ||||
|     def calc(self): | ||||
|         pass | ||||
| 
 | ||||
|     def updated_event(self): | ||||
|         self.event.magnitudes.append(self.net_magnitude()) | ||||
|         return self.event | ||||
| 
 | ||||
|     def net_magnitude(self): | ||||
|         if self: | ||||
|             # TODO if an average Magnitude instead of the median is calculated | ||||
|             # StationMagnitudeContributions should be added to the returned | ||||
|             # Magnitude object | ||||
|             # mag_error => weights (magnitude error estimate from peak_to_peak, calcsourcespec?) | ||||
|             # weights => StationMagnitdeContribution | ||||
|             mag = ope.Magnitude( | ||||
|                 mag=np.median([M.mag for M in self.magnitudes.values()]), | ||||
|                 magnitude_type=self.type, | ||||
|                 origin_id=self.origin_id, | ||||
|                 station_count=len(self.magnitudes), | ||||
|                 azimuthal_gap=self.origin_id.get_referred_object().quality.azimuthal_gap) | ||||
|             return mag | ||||
|         return None | ||||
| 
 | ||||
| 
 | ||||
| class RichterMagnitude(Magnitude): | ||||
|     """ | ||||
|     Method to derive peak-to-peak amplitude as seen on a Wood-Anderson | ||||
|     seismograph. Has to be derived from instrument-corrected traces! | ||||
|     """ | ||||
| 
 | ||||
|     # poles, zeros and sensitivity of WA seismograph | ||||
|     # (see Uhrhammer & Collins, 1990, BSSA, pp. 702-716) | ||||
|     _paz = { | ||||
|         'poles': [-5.4978 + 5.6089j, -5.4978 - 5.6089j],  # conjugate pair, left half-plane | ||||
|         'zeros': [0j, 0j], | ||||
|         'gain': 2080, | ||||
|         'sensitivity': 1 | ||||
|     } | ||||
| 
 | ||||
| 
 | ||||
|     def __init__(self, stream, event, calc_win, verbosity=False, iplot=0): | ||||
|         super(RichterMagnitude, self).__init__(stream, event, verbosity, iplot) | ||||
| 
 | ||||
|         self._amplitudes = dict()  # per-instance dict, avoids shared class-level state | ||||
|         self._calc_win = calc_win | ||||
|         self._type = 'ML' | ||||
|         self.calc() | ||||
| 
 | ||||
|     @property | ||||
|     def calc_win(self): | ||||
|         return self._calc_win | ||||
| 
 | ||||
|     @calc_win.setter | ||||
|     def calc_win(self, value): | ||||
|         self._calc_win = value | ||||
| 
 | ||||
|     @property | ||||
|     def amplitudes(self): | ||||
|         return self._amplitudes | ||||
| 
 | ||||
|     @amplitudes.setter | ||||
|     def amplitudes(self, value): | ||||
|         station, a0 = value | ||||
|         self._amplitudes[station] = a0 | ||||
| 
 | ||||
|     def peak_to_peak(self, st, t0): | ||||
| 
 | ||||
|         # simulate Wood-Anderson response | ||||
|         st.simulate(paz_remove=None, paz_simulate=self._paz) | ||||
| 
 | ||||
|         # trim waveform to common range | ||||
|         stime, etime = common_range(st) | ||||
|         st.trim(stime, etime) | ||||
| 
 | ||||
|         # get time delta from waveform data | ||||
|         dt = st[0].stats.delta | ||||
| 
 | ||||
|         power = [np.power(tr.data, 2) for tr in st if tr.stats.channel[-1] not | ||||
|                  in 'Z3'] | ||||
|         if len(power) != 2: | ||||
|             raise ValueError('Wood-Anderson amplitude definition only valid for ' | ||||
|                              'two horizontals: {0} given'.format(len(power))) | ||||
|         power_sum = power[0] + power[1] | ||||
|         # | ||||
|         sqH = np.sqrt(power_sum) | ||||
| 
 | ||||
|         # get time array | ||||
|         th = np.arange(0, len(sqH) * dt, dt) | ||||
|         # get maximum peak within pick window | ||||
|         iwin = getsignalwin(th, t0 - stime, self.calc_win) | ||||
|         wapp = np.max(sqH[iwin]) | ||||
|         if self.verbose: | ||||
|             print("Determined Wood-Anderson peak-to-peak amplitude: {0} " | ||||
|                   "mm".format(wapp)) | ||||
| 
 | ||||
|         # check for plot flag (for debugging only) | ||||
|         if self.plot_flag > 1: | ||||
|             st.plot() | ||||
|             f = plt.figure(2) | ||||
|             plt.plot(th, sqH) | ||||
|             plt.plot(th[iwin], sqH[iwin], 'g') | ||||
|             plt.plot([t0, t0], [0, max(sqH)], 'r', linewidth=2) | ||||
|             plt.title( | ||||
|                 'Station %s, RMS Horizontal Traces, WA-peak-to-peak=%4.1f mm' \ | ||||
|                 % (st[0].stats.station, wapp)) | ||||
|             plt.xlabel('Time [s]') | ||||
|             plt.ylabel('Displacement [mm]') | ||||
|             plt.show() | ||||
|             raw_input() | ||||
|             plt.close(f) | ||||
| 
 | ||||
|         return wapp | ||||
| 
 | ||||
|     def calc(self): | ||||
|         for a in self.arrivals: | ||||
|             if a.phase not in 'sS': | ||||
|                 continue | ||||
|             pick = a.pick_id.get_referred_object() | ||||
|             station = pick.waveform_id.station_code | ||||
|             wf = select_for_phase(self.stream.select( | ||||
|                 station=station), a.phase) | ||||
|             if not wf: | ||||
|                 if self.verbose: | ||||
|                     print( | ||||
|                     'WARNING: no waveform data found for station {0}'.format( | ||||
|                         station)) | ||||
|                 continue | ||||
|             delta = degrees2kilometers(a.distance) | ||||
|             onset = pick.time | ||||
|             a0 = self.peak_to_peak(wf, onset) | ||||
|             amplitude = ope.Amplitude(generic_amplitude=a0 * 1e-3) | ||||
|             amplitude.unit = 'm' | ||||
|             amplitude.category = 'point' | ||||
|             amplitude.waveform_id = pick.waveform_id | ||||
|             amplitude.magnitude_hint = self.type | ||||
|             amplitude.pick_id = pick.resource_id | ||||
|             amplitude.type = 'AML' | ||||
|             self.event.amplitudes.append(amplitude) | ||||
|             self.amplitudes = (station, amplitude) | ||||
|             # using standard Gutenberg-Richter relation | ||||
|             # TODO make the ML calculation more flexible by allowing | ||||
|             # use of custom relation functions | ||||
|             magnitude = ope.StationMagnitude( | ||||
|                 mag=np.log10(a0) + richter_magnitude_scaling(delta)) | ||||
|             magnitude.origin_id = self.origin_id | ||||
|             magnitude.waveform_id = pick.waveform_id | ||||
|             magnitude.amplitude_id = amplitude.resource_id | ||||
|             magnitude.station_magnitude_type = self.type | ||||
|             self.event.station_magnitudes.append(magnitude) | ||||
|             self.magnitudes = (station, magnitude) | ||||
| 
 | ||||
| 
 | ||||
| class MomentMagnitude(Magnitude): | ||||
|     ''' | ||||
|     Method to calculate seismic moment Mo and moment magnitude Mw. | ||||
|     Requires results of function calcsourcespec for calculating plateau w0 | ||||
|     and corner frequency fc of the source spectrum. Uses | ||||
|     subfunction calcMoMw. Returns modified dictionary of picks including | ||||
|     Dc-value, corner frequency fc, seismic moment Mo and | ||||
|     corresponding moment magnitude Mw. | ||||
|     ''' | ||||
| 
 | ||||
| 
 | ||||
|     def __init__(self, stream, event, vp, Qp, density, verbosity=False, | ||||
|                  iplot=False): | ||||
|         super(MomentMagnitude, self).__init__(stream, event, verbosity, iplot) | ||||
| 
 | ||||
|         self._props = dict()  # per-instance dict, avoids shared class-level state | ||||
|         self._vp = vp | ||||
|         self._Qp = Qp | ||||
|         self._density = density | ||||
|         self._type = 'Mw' | ||||
|         self.calc() | ||||
| 
 | ||||
|     @property | ||||
|     def p_velocity(self): | ||||
|         return self._vp | ||||
| 
 | ||||
|     @property | ||||
|     def p_attenuation(self): | ||||
|         return self._Qp | ||||
| 
 | ||||
|     @property | ||||
|     def rock_density(self): | ||||
|         return self._density | ||||
| 
 | ||||
|     @property | ||||
|     def moment_props(self): | ||||
|         return self._props | ||||
| 
 | ||||
|     @moment_props.setter | ||||
|     def moment_props(self, value): | ||||
|         station, props = value | ||||
|         self._props[station] = props | ||||
| 
 | ||||
|     @property | ||||
|     def seismic_moment(self): | ||||
|         return self._m0 | ||||
| 
 | ||||
|     @seismic_moment.setter | ||||
|     def seismic_moment(self, value): | ||||
|         self._m0 = value | ||||
| 
 | ||||
|     def calc(self): | ||||
|         for a in self.arrivals: | ||||
|             if a.phase not in 'pP': | ||||
|                 continue | ||||
|             pick = a.pick_id.get_referred_object() | ||||
|             station = pick.waveform_id.station_code | ||||
|             wf = select_for_phase(self.stream.select( | ||||
|                 station=station), a.phase) | ||||
|             if not wf: | ||||
|                 continue | ||||
|             onset = pick.time | ||||
|             distance = degrees2kilometers(a.distance) | ||||
|             azimuth = a.azimuth | ||||
|             incidence = a.takeoff_angle | ||||
|             w0, fc = calcsourcespec(wf, onset, self.p_velocity, distance, | ||||
|                                     azimuth, | ||||
|                                     incidence, self.p_attenuation, | ||||
|                                     self.plot_flag, self.verbose) | ||||
|             if w0 is None or fc is None: | ||||
|                 if self.verbose: | ||||
|                     print("WARNING: insufficient frequency information") | ||||
|                 continue | ||||
|             wf = select_for_phase(wf, "P") | ||||
|             m0, mw = calcMoMw(wf, w0, self.rock_density, self.p_velocity, | ||||
|                               distance, self.verbose) | ||||
|             self.moment_props = (station, dict(w0=w0, fc=fc, Mo=m0)) | ||||
|             magnitude = ope.StationMagnitude(mag=mw) | ||||
|             magnitude.origin_id = self.origin_id | ||||
|             magnitude.waveform_id = pick.waveform_id | ||||
|             magnitude.station_magnitude_type = self.type | ||||
|             self.event.station_magnitudes.append(magnitude) | ||||
|             self.magnitudes = (station, magnitude) | ||||
| 
 | ||||
| 
 | ||||
| def calcMoMw(wfstream, w0, rho, vp, delta, verbosity=False): | ||||
|     ''' | ||||
|     Subfunction of run_calcMoMw to calculate individual | ||||
|     seismic moments and corresponding moment magnitudes. | ||||
| 
 | ||||
|     :param: wfstream | ||||
|     :type:  `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|     :param: w0, height of plateau of source spectrum | ||||
|     :type:  float | ||||
| 
 | ||||
|     :param: rho, rock density [kg/m³] | ||||
|     :type:  integer | ||||
| 
 | ||||
|     :param: vp, average P-wave velocity [m/s] | ||||
|     :type:  float | ||||
| 
 | ||||
|     :param: delta, hypocentral distance [km] | ||||
|     :type:  integer | ||||
|     ''' | ||||
| 
 | ||||
|     tr = wfstream[0] | ||||
|     delta = delta * 1000  # hypocentral distance in [m] | ||||
| 
 | ||||
|     if verbosity: | ||||
|         print( | ||||
|         "calcMoMw: Calculating seismic moment Mo and moment magnitude Mw for station {0} ...".format( | ||||
|             tr.stats.station)) | ||||
| 
 | ||||
|     # additional common parameters for calculating Mo | ||||
|     rP = 2 / np.sqrt( | ||||
|         15)  # average radiation pattern of P waves (Aki & Richards, 1980) | ||||
|     freesurf = 2.0  # free surface correction, assuming vertical incidence | ||||
| 
 | ||||
|     Mo = w0 * 4 * np.pi * rho * np.power(vp, 3) * delta / (rP * freesurf) | ||||
| 
 | ||||
|     # Mw = np.log10(Mo * 1e07) * 2 / 3 - 10.7 # after Hanks & Kanamori (1979), defined for [dyn*cm]! | ||||
|     Mw = np.log10(Mo) * 2 / 3 - 6.03  # equivalent form of the above for metric units [Nm] | ||||
| 
 | ||||
|     if verbosity: | ||||
|         print( | ||||
|         "calcMoMw: Calculated seismic moment Mo = {0} Nm => Mw = {1:3.1f} ".format( | ||||
|             Mo, Mw)) | ||||
| 
 | ||||
|     return Mo, Mw | ||||
| 
 | ||||
| 
 | ||||
| def calcsourcespec(wfstream, onset, vp, delta, azimuth, incidence, | ||||
|                    qp, iplot=0, verbosity=False): | ||||
|     ''' | ||||
|     Subfunction to calculate the source spectrum and to derive from that the plateau | ||||
|     (usually called omega0) and the corner frequency assuming Aki's omega-square | ||||
|     source model. Has to be derived from instrument-corrected displacement traces, | ||||
|     thus restitution and integration are necessary! Integrated traces are rotated | ||||
|     into ray-coordinate system ZNE => LQT using ObsPy's rotate module! | ||||
| 
 | ||||
|     :param: wfstream (corrected for instrument) | ||||
|     :type:  `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|     :param: onset, P-phase onset time | ||||
|     :type: float | ||||
| 
 | ||||
|     :param: vp, average P-wave velocity [m/s] | ||||
|     :type:  float | ||||
| 
 | ||||
|     :param: delta, hypocentral distance [km] | ||||
|     :type:  integer | ||||
| 
 | ||||
|     :param: azimuth | ||||
|     :type:  integer | ||||
| 
 | ||||
|     :param: incidence | ||||
|     :type:  integer | ||||
| 
 | ||||
|     :param: Qp, quality factor for P-waves | ||||
|     :type:  integer | ||||
| 
 | ||||
|     :param: iplot, show results (iplot>1) or not (iplot<1) | ||||
|     :type:  integer | ||||
|     ''' | ||||
|     if verbosity: | ||||
|         print ("Calculating source spectrum ....") | ||||
| 
 | ||||
|     # get Q value | ||||
|     Q, A = qp | ||||
| 
 | ||||
|     dist = delta * 1000  # hypocentral distance in [m] | ||||
| 
 | ||||
|     fc = None | ||||
|     w0 = None | ||||
| 
 | ||||
|     zdat = select_for_phase(wfstream, "P") | ||||
| 
 | ||||
|     dt = zdat[0].stats.delta | ||||
| 
 | ||||
|     freq = zdat[0].stats.sampling_rate | ||||
| 
 | ||||
|     # trim traces to common range (for rotation) | ||||
|     trstart, trend = common_range(wfstream) | ||||
|     wfstream.trim(trstart, trend) | ||||
| 
 | ||||
|     # rotate into LQT (ray-coordinate) system using ObsPy's rotate | ||||
|     # L: P-wave direction | ||||
|     # Q: SV-wave direction | ||||
|     # T: SH-wave direction | ||||
|     LQT = wfstream.rotate('ZNE->LQT', azimuth, incidence) | ||||
|     ldat = LQT.select(component="L") | ||||
|     if len(ldat) == 0: | ||||
|         # if horizontal channels are 2 and 3 | ||||
|         # no azimuth information is available and thus no | ||||
|         # rotation is possible! | ||||
|         if verbosity: | ||||
|             print("calcsourcespec: Azimuth information is missing, " | ||||
|                   "no rotation of components possible!") | ||||
|         ldat = LQT.select(component="Z") | ||||
| 
 | ||||
|     # integrate to displacement | ||||
|     # unrotated vertical component (for comparison) | ||||
|     inttrz = signal.detrend(integrate.cumtrapz(zdat[0].data, None, dt)) | ||||
| 
 | ||||
|     # rotated component Z => L | ||||
|     Ldat = signal.detrend(integrate.cumtrapz(ldat[0].data, None, dt)) | ||||
| 
 | ||||
|     # get window after P pulse for | ||||
|     # calculating source spectrum | ||||
|     rel_onset = onset - trstart | ||||
|     impickP = int(rel_onset * freq) | ||||
|     wfzc = Ldat[impickP: len(Ldat) - 1] | ||||
|     # get time array | ||||
|     t = np.arange(0, len(inttrz) * dt, dt) | ||||
|     # calculate spectrum using only first cycles of | ||||
|     # waveform after P onset! | ||||
|     zc = crossings_nonzero_all(wfzc) | ||||
|     if np.size(zc) == 0 or len(zc) <= 3: | ||||
|         if verbosity: | ||||
|             print ("calcsourcespec: Something is wrong with the waveform, " | ||||
|                    "no zero crossings derived!\n") | ||||
|             print ("No calculation of source spectrum possible!") | ||||
|         plotflag = 0 | ||||
|     else: | ||||
|         plotflag = 1 | ||||
|         index = min([3, len(zc) - 1]) | ||||
|         calcwin = (zc[index] - zc[0]) * dt | ||||
|         iwin = getsignalwin(t, rel_onset, calcwin) | ||||
|         xdat = Ldat[iwin] | ||||
| 
 | ||||
|         # fft | ||||
|         fny = freq / 2 | ||||
|         l = len(xdat) / freq | ||||
|         # number of fft bins after Bath | ||||
|         n = freq * l | ||||
|         # find next power of 2 of data length | ||||
|         m = pow(2, np.ceil(np.log(len(xdat)) / np.log(2))) | ||||
|         N = int(np.power(m, 2)) | ||||
|         y = dt * np.fft.fft(xdat, N) | ||||
|         Y = abs(y[: N / 2]) | ||||
|         L = (N - 1) / freq | ||||
|         f = np.arange(0, fny, 1 / L) | ||||
| 
 | ||||
|         # remove zero-frequency and frequencies above | ||||
|         # corner frequency of seismometer (assumed | ||||
|         # to be 100 Hz) | ||||
|         fi = np.where((f >= 1) & (f < 100)) | ||||
|         F = f[fi] | ||||
|         YY = Y[fi] | ||||
| 
 | ||||
|         # correction for attenuation | ||||
|         wa = 2 * np.pi * F  # angular frequency | ||||
|         D = np.exp((wa * dist) / (2 * vp * Q * F ** A)) | ||||
|         YYcor = YY.real * D | ||||
| 
 | ||||
|         # get plateau (DC value) and corner frequency | ||||
|         # initial guess of plateau | ||||
|         w0in = np.mean(YYcor[0:100]) | ||||
|         # initial guess of corner frequency | ||||
|         # where spectral level reached 50% of flat level | ||||
|         iin = np.where(YYcor >= 0.5 * w0in) | ||||
|         Fcin = F[iin[0][np.size(iin) - 1]] | ||||
| 
 | ||||
|         # use of scipy's curve_fit optimization function | ||||
|         fit = synthsourcespec(F, w0in, Fcin) | ||||
|         [optspecfit, _] = curve_fit(synthsourcespec, F, YYcor, [w0in, Fcin]) | ||||
|         w01 = optspecfit[0] | ||||
|         fc1 = optspecfit[1] | ||||
|         if verbosity: | ||||
|             print ("calcsourcespec: Determined w0-value: %e m/Hz, \n" | ||||
|                    "Determined corner frequency: %f Hz" % (w01, fc1)) | ||||
| 
 | ||||
|         # use of conventional fitting | ||||
|         [w02, fc2] = fitSourceModel(F, YYcor, Fcin, iplot, verbosity) | ||||
| 
 | ||||
|         # get w0 and fc as median of both | ||||
|         # source spectrum fits | ||||
|         w0 = np.median([w01, w02]) | ||||
|         fc = np.median([fc1, fc2]) | ||||
|         if verbosity: | ||||
|             print("calcsourcespec: Using w0-value = %e m/Hz and fc = %f Hz" % ( | ||||
|             w0, fc)) | ||||
| 
 | ||||
|     if iplot > 1: | ||||
|         f1 = plt.figure() | ||||
|         tLdat = np.arange(0, len(Ldat) * dt, dt) | ||||
|         plt.subplot(2, 1, 1) | ||||
|         # show displacement in mm | ||||
|         p1, = plt.plot(t, np.multiply(inttrz, 1000), 'k') | ||||
|         p2, = plt.plot(tLdat, np.multiply(Ldat, 1000)) | ||||
|         plt.legend([p1, p2], ['Displacement', 'Rotated Displacement']) | ||||
|         if plotflag == 1: | ||||
|             plt.plot(t[iwin], np.multiply(xdat, 1000), 'g') | ||||
|             plt.title('Seismogram and P Pulse, Station %s-%s' \ | ||||
|                       % (zdat[0].stats.station, zdat[0].stats.channel)) | ||||
|         else: | ||||
|             plt.title('Seismogram, Station %s-%s' \ | ||||
|                       % (zdat[0].stats.station, zdat[0].stats.channel)) | ||||
|         plt.xlabel('Time since %s' % zdat[0].stats.starttime) | ||||
|         plt.ylabel('Displacement [mm]') | ||||
| 
 | ||||
|         if plotflag == 1: | ||||
|             plt.subplot(2, 1, 2) | ||||
|             p1, = plt.loglog(f, Y.real, 'k') | ||||
|             p2, = plt.loglog(F, YY.real) | ||||
|             p3, = plt.loglog(F, YYcor, 'r') | ||||
|             p4, = plt.loglog(F, fit, 'g') | ||||
|             plt.loglog([fc, fc], [w0 / 100, w0], 'g') | ||||
|             plt.legend([p1, p2, p3, p4], ['Raw Spectrum', \ | ||||
|                                           'Used Raw Spectrum', \ | ||||
|                                           'Q-Corrected Spectrum', \ | ||||
|                                           'Fit to Spectrum']) | ||||
|             plt.title('Source Spectrum from P Pulse, w0=%e m/Hz, fc=%6.2f Hz' \ | ||||
|                       % (w0, fc)) | ||||
|             plt.xlabel('Frequency [Hz]') | ||||
|             plt.ylabel('Amplitude [m/Hz]') | ||||
|             plt.grid() | ||||
|         plt.show() | ||||
|         raw_input() | ||||
|         plt.close(f1) | ||||
| 
 | ||||
|     return w0, fc | ||||
| 
 | ||||
| 
 | ||||
| def synthsourcespec(f, omega0, fcorner): | ||||
|     ''' | ||||
|     Calculates synthetic source spectrum from given plateau and corner | ||||
|     frequency assuming Aki's omega-square model. | ||||
| 
 | ||||
|     :param: f, frequencies | ||||
|     :type:  array | ||||
| 
 | ||||
|     :param: omega0, DC-value (plateau) of source spectrum | ||||
|     :type:  float | ||||
| 
 | ||||
|     :param: fcorner, corner frequency of source spectrum | ||||
|     :type:  float | ||||
|     ''' | ||||
| 
 | ||||
|     # ssp = omega0 / (pow(2, (1 + f / fcorner))) | ||||
|     ssp = omega0 / (1 + np.power(f / fcorner, 2)) | ||||
| 
 | ||||
|     return ssp | ||||
| 
 | ||||
| 
 | ||||
| def fitSourceModel(f, S, fc0, iplot, verbosity=False): | ||||
|     ''' | ||||
|     Calculates synthetic source spectrum by varying corner frequency fc. | ||||
|     Returns best approximated plateau omega0 and corner frequency, i.e. with least | ||||
|     common standard deviations. | ||||
| 
 | ||||
|     :param:  f, frequencies | ||||
|     :type:   array | ||||
| 
 | ||||
|     :param:  S, observed source spectrum | ||||
|     :type:   array | ||||
| 
 | ||||
|     :param:  fc0, initial corner frequency | ||||
|     :type:   float | ||||
|     ''' | ||||
| 
 | ||||
|     w0 = [] | ||||
|     stdw0 = [] | ||||
|     fc = [] | ||||
|     stdfc = [] | ||||
|     STD = [] | ||||
| 
 | ||||
|     # get window around initial corner frequency for trials | ||||
|     fcstopl = fc0 - max(1, len(f) / 10) | ||||
|     il = np.argmin(abs(f - fcstopl)) | ||||
|     fcstopl = f[il] | ||||
|     fcstopr = fc0 + min(len(f), len(f) / 10) | ||||
|     ir = np.argmin(abs(f - fcstopr)) | ||||
|     fcstopr = f[ir] | ||||
|     iF = np.where((f >= fcstopl) & (f <= fcstopr)) | ||||
| 
 | ||||
|     # vary corner frequency around initial point | ||||
|     for i in range(il, ir): | ||||
|         FC = f[i] | ||||
|         indexdc = np.where((f > 0) & (f <= FC)) | ||||
|         dc = np.mean(S[indexdc]) | ||||
|         stddc = np.std(dc - S[indexdc]) | ||||
|         w0.append(dc) | ||||
|         stdw0.append(stddc) | ||||
|         fc.append(FC) | ||||
|         # slope | ||||
|         indexfc = np.where((f >= FC) & (f <= fcstopr)) | ||||
|         yi = dc / (1 + (f[indexfc] / FC) ** 2) | ||||
|         stdFC = np.std(yi - S[indexfc]) | ||||
|         stdfc.append(stdFC) | ||||
|         STD.append(stddc + stdFC) | ||||
| 
 | ||||
|     # get best found w0 and fc from minimum | ||||
|     if len(STD) > 0: | ||||
|         fc = fc[np.argmin(STD)] | ||||
|         w0 = w0[np.argmin(STD)] | ||||
|     else: | ||||
|         fc = fc0 | ||||
|         w0 = max(S) | ||||
|     if verbosity: | ||||
|         print( | ||||
|         "fitSourceModel: best fc: {0} Hz, best w0: {1} m/Hz".format(fc, w0)) | ||||
| 
 | ||||
|     if iplot > 1: | ||||
|         plt.figure(iplot) | ||||
|         plt.loglog(f, S, 'k') | ||||
|         plt.loglog([f[0], fc], [w0, w0], 'g') | ||||
|         plt.loglog([fc, fc], [w0 / 100, w0], 'g') | ||||
|         plt.title('Calculated Source Spectrum, Omega0=%e m/Hz, fc=%6.2f Hz' \ | ||||
|                   % (w0, fc)) | ||||
|         plt.xlabel('Frequency [Hz]') | ||||
|         plt.ylabel('Amplitude [m/Hz]') | ||||
|         plt.grid() | ||||
|         plt.figure(iplot + 1) | ||||
|         plt.subplot(311) | ||||
|         plt.plot(f[il:ir], STD, '*') | ||||
|         plt.title('Common Standard Deviations') | ||||
|         plt.xticks([]) | ||||
|         plt.subplot(312) | ||||
|         plt.plot(f[il:ir], stdw0, '*') | ||||
|         plt.title('Standard Deviations of w0-Values') | ||||
|         plt.xticks([]) | ||||
|         plt.subplot(313) | ||||
|         plt.plot(f[il:ir], stdfc, '*') | ||||
|         plt.title('Standard Deviations of Corner Frequencies') | ||||
|         plt.xlabel('Corner Frequencies [Hz]') | ||||
|         plt.show() | ||||
|         raw_input() | ||||
|         plt.close() | ||||
| 
 | ||||
|     return w0, fc | ||||
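As a quick sanity check of the moment-to-magnitude conversion used in calcMoMw (Hanks & Kanamori, 1979, with the constant rewritten for metric units), a hypothetical seismic moment of 1e13 Nm yields:

    import numpy as np

    Mo = 1.0e13  # seismic moment [Nm], hypothetical value
    Mw = np.log10(Mo) * 2 / 3 - 6.03  # Hanks & Kanamori (1979) for [Nm]
    # -> Mw = 2/3 * 13 - 6.03 = 2.64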
pylot/core/io/__init__.py (Normal file, 1 line)
						| @ -0,0 +1 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
pylot/core/io/data.py (Normal file, 601 lines)
						| @ -0,0 +1,601 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import copy | ||||
| import os | ||||
| from obspy import read_events | ||||
| from obspy.core import read, Stream, UTCDateTime | ||||
| from obspy.core.event import Event | ||||
| 
 | ||||
| from pylot.core.io.phases import readPILOTEvent, picks_from_picksdict, \ | ||||
|     picksdict_from_pilot, merge_picks | ||||
| from pylot.core.util.errors import FormatError, OverwriteError | ||||
| from pylot.core.util.utils import fnConstructor, full_range | ||||
| 
 | ||||
| 
 | ||||
| class Data(object): | ||||
|     """ | ||||
|     Data container with attributes wfdata holding ~obspy.core.stream. | ||||
| 
 | ||||
|     :type parent: PySide.QtGui.QWidget object, optional | ||||
|     :param parent: A PySide.QtGui.QWidget object utilized when | ||||
|     called by a GUI to display a PySide.QtGui.QMessageBox instead of printing | ||||
|     to standard out. | ||||
|     :type evtdata: ~obspy.core.event.Event object, optional | ||||
|     :param evtdata: ~obspy.core.event.Event object containing all derived or | ||||
|     loaded event data; container object holding, e.g., phase arrivals. | ||||
|     """ | ||||
| 
 | ||||
|     def __init__(self, parent=None, evtdata=None): | ||||
|         self._parent = parent | ||||
|         if self.getParent(): | ||||
|             self.comp = parent.getComponent() | ||||
|         else: | ||||
|             self.comp = 'Z' | ||||
|         self.wfdata = Stream()  # always initialize the waveform container | ||||
|         self._new = False | ||||
|         if isinstance(evtdata, Event): | ||||
|             pass | ||||
|         elif isinstance(evtdata, dict): | ||||
|             evt = readPILOTEvent(**evtdata) | ||||
|             evtdata = evt | ||||
|         elif isinstance(evtdata, basestring): | ||||
|             try: | ||||
|                 cat = read_events(evtdata) | ||||
|                 if len(cat) != 1: | ||||
|                     raise ValueError('ambiguous event information for file: ' | ||||
|                                      '{file}'.format(file=evtdata)) | ||||
|                 evtdata = cat[0] | ||||
|             except TypeError as e: | ||||
|                 if 'Unknown format for file' in e.message: | ||||
|                     if 'PHASES' in evtdata: | ||||
|                         picks = picksdict_from_pilot(evtdata) | ||||
|                         evtdata = Event() | ||||
|                         evtdata.picks = picks_from_picksdict(picks) | ||||
|                     elif 'LOC' in evtdata: | ||||
|                         raise NotImplementedError('PILOT location information ' | ||||
|                                                   'read support not yet ' | ||||
|                                                   'implemented.') | ||||
|                     else: | ||||
|                         raise e | ||||
|                 else: | ||||
|                     raise e | ||||
|         else:  # create an empty Event object | ||||
|             self.setNew() | ||||
|             evtdata = Event() | ||||
|             evtdata.picks = [] | ||||
|         self.evtdata = evtdata | ||||
|         self.wforiginal = None | ||||
|         self.cuttimes = None | ||||
|         self.dirty = False | ||||
| 
 | ||||
|     def __str__(self): | ||||
|         return str(self.wfdata) | ||||
| 
 | ||||
|     def __add__(self, other): | ||||
|         assert isinstance(other, Data), "operands must be of same type 'Data'" | ||||
|         if other.isNew() and not self.isNew(): | ||||
|             picks_to_add = other.get_evt_data().picks | ||||
|             old_picks = self.get_evt_data().picks | ||||
|             for pick in picks_to_add: | ||||
|                 if pick not in old_picks: | ||||
|                     old_picks.append(pick) | ||||
|         elif not other.isNew() and self.isNew(): | ||||
|             new = other + self | ||||
|             self.evtdata = new.get_evt_data() | ||||
|         elif self.isNew() and other.isNew(): | ||||
|             pass | ||||
|         elif self.get_evt_data().get('id') == other.get_evt_data().get('id'): | ||||
|             other.setNew() | ||||
|             return self + other | ||||
|         else: | ||||
|             raise ValueError("both Data objects have differing " | ||||
|                              "unique Event identifiers") | ||||
|         return self | ||||
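| 
 | ||||
|     # Usage sketch (illustrative; both operands are hypothetical Data | ||||
|     # instances referring to the same event): | ||||
|     # | ||||
|     #     merged = loaded_data + new_data  # appends picks not yet present | ||||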
| 
 | ||||
|     def getPicksStr(self): | ||||
|         picks_str = '' | ||||
|         for pick in self.get_evt_data().picks: | ||||
|             picks_str += str(pick) + '\n' | ||||
|         return picks_str | ||||
| 
 | ||||
|     def getParent(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self._parent | ||||
| 
 | ||||
|     def isNew(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self._new | ||||
| 
 | ||||
|     def setNew(self): | ||||
|         self._new = True | ||||
| 
 | ||||
|     def getCutTimes(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         if self.cuttimes is None: | ||||
|             self.updateCutTimes() | ||||
|         return self.cuttimes | ||||
| 
 | ||||
|     def updateCutTimes(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         """ | ||||
|         self.cuttimes = full_range(self.getWFData()) | ||||
| 
 | ||||
|     def getEventFileName(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         ID = self.getID() | ||||
|         # handle forbidden filenames especially on windows systems | ||||
|         return fnConstructor(str(ID)) | ||||
| 
 | ||||
|     def exportEvent(self, fnout, fnext='.xml'): | ||||
| 
 | ||||
|         """ | ||||
| 
 | ||||
|         :param fnout: | ||||
|         :param fnext: | ||||
|         :raise KeyError: | ||||
|         """ | ||||
|         from pylot.core.util.defaults import OUTPUTFORMATS | ||||
| 
 | ||||
|         try: | ||||
|             evtformat = OUTPUTFORMATS[fnext] | ||||
|         except KeyError as e: | ||||
|             errmsg = '{0}; selected file extension {1} not ' \ | ||||
|                      'supported'.format(e, fnext) | ||||
|             raise FormatError(errmsg) | ||||
| 
 | ||||
|         # try exporting event via ObsPy | ||||
|         try: | ||||
|             self.get_evt_data().write(fnout + fnext, format=evtformat) | ||||
|         except KeyError as e: | ||||
|             raise KeyError('''{0} export format | ||||
|                               not implemented: {1}'''.format(evtformat, e)) | ||||
| 
 | ||||
|     def getComp(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self.comp | ||||
| 
 | ||||
|     def getID(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         try: | ||||
|             return self.evtdata.get('resource_id').id | ||||
|         except: | ||||
|             return None | ||||
| 
 | ||||
|     def filterWFData(self, kwargs): | ||||
|         """ | ||||
| 
 | ||||
|         :param kwargs: | ||||
|         """ | ||||
|         self.getWFData().filter(**kwargs) | ||||
|         self.dirty = True | ||||
| 
 | ||||
|     def setWFData(self, fnames): | ||||
|         """ | ||||
| 
 | ||||
|         :param fnames: | ||||
|         """ | ||||
|         self.wfdata = Stream() | ||||
|         self.wforiginal = None | ||||
|         if fnames is not None: | ||||
|             self.appendWFData(fnames) | ||||
|         else: | ||||
|             return False | ||||
|         self.wforiginal = self.getWFData().copy() | ||||
|         self.dirty = False | ||||
|         return True | ||||
| 
 | ||||
|     def appendWFData(self, fnames): | ||||
|         """ | ||||
| 
 | ||||
|         :param fnames: | ||||
|         """ | ||||
|         assert isinstance(fnames, list), "input parameter 'fnames' is " \ | ||||
|                                          "supposed to be of type 'list' " \ | ||||
|                                          "but is actually" \ | ||||
|                                          " {0}".format(type(fnames)) | ||||
|         if self.dirty: | ||||
|             self.resetWFData() | ||||
| 
 | ||||
|         warnmsg = '' | ||||
|         for fname in fnames: | ||||
|             try: | ||||
|                 self.wfdata += read(fname) | ||||
|             except TypeError: | ||||
|                 try: | ||||
|                     self.wfdata += read(fname, format='GSE2') | ||||
|                 except Exception as e: | ||||
|                     warnmsg += '{0}\n{1}\n'.format(fname, e) | ||||
|         if warnmsg: | ||||
|             warnmsg = 'WARNING: unable to read\n' + warnmsg | ||||
|             print(warnmsg) | ||||
| 
 | ||||
|     def getWFData(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self.wfdata | ||||
| 
 | ||||
|     def getOriginalWFData(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self.wforiginal | ||||
| 
 | ||||
|     def resetWFData(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         """ | ||||
|         self.wfdata = self.getOriginalWFData().copy() | ||||
|         self.dirty = False | ||||
| 
 | ||||
|     def resetPicks(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         """ | ||||
|         self.get_evt_data().picks = [] | ||||
| 
 | ||||
|     def get_evt_data(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self.evtdata | ||||
| 
 | ||||
|     def setEvtData(self, event): | ||||
|         self.evtdata = event | ||||
| 
 | ||||
|     def applyEVTData(self, data, type='pick', authority_id='rub'): | ||||
| 
 | ||||
|         """ | ||||
| 
 | ||||
|         :param data: | ||||
|         :param type: | ||||
|         :param authority_id: | ||||
|         :raise OverwriteError: | ||||
|         """ | ||||
| 
 | ||||
|         def applyPicks(picks): | ||||
|             """ | ||||
|             Creates ObsPy pick objects and append it to the picks list from the | ||||
|             PyLoT dictionary contain all picks. | ||||
|             :param picks: | ||||
|             :raise OverwriteError: raises an OverwriteError if the picks list is | ||||
|              not empty. The GUI will then ask for a decision. | ||||
|             """ | ||||
| 
 | ||||
|             #firstonset = find_firstonset(picks) | ||||
|             if self.get_evt_data().picks: | ||||
|                 raise OverwriteError('Actual picks would be overwritten!') | ||||
|             else: | ||||
|                 picks = picks_from_picksdict(picks) | ||||
|             self.get_evt_data().picks = picks | ||||
|             # if 'smi:local' in self.getID() and firstonset: | ||||
|             #     fonset_str = firstonset.strftime('%Y_%m_%d_%H_%M_%S') | ||||
|             #     ID = ResourceIdentifier('event/' + fonset_str) | ||||
|             #     ID.convertIDToQuakeMLURI(authority_id=authority_id) | ||||
|             #     self.get_evt_data().resource_id = ID | ||||
| 
 | ||||
| 
 | ||||
|         def applyEvent(event): | ||||
|             """ | ||||
|             takes an `obspy.core.event.Event` object and applies all new | ||||
|             information on the event to the actual data | ||||
|             :param event: | ||||
|             """ | ||||
|             if not self.isNew(): | ||||
|                 self.setEvtData(event) | ||||
|             else: | ||||
|                 # prevent overwriting original pick information | ||||
|                 picks = copy.deepcopy(self.get_evt_data().picks) | ||||
|                 event = merge_picks(event, picks) | ||||
|                 # apply event information from location | ||||
|                 self.get_evt_data().update(event) | ||||
| 
 | ||||
|         applydata = {'pick': applyPicks, | ||||
|                      'event': applyEvent} | ||||
| 
 | ||||
|         applydata[type](data) | ||||
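| 
 | ||||
|     # Usage sketch (illustrative; file name and picks dictionary are | ||||
|     # hypothetical): | ||||
|     # | ||||
|     #     data = Data()                           # empty container | ||||
|     #     data.setWFData(['e0001.015.16.gse'])    # read waveform files | ||||
|     #     data.applyEVTData(picks, type='pick')   # attach a PyLoT picks dict | ||||
|     #     data.exportEvent('e0001.015.16', fnext='.xml') | ||||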
| 
 | ||||
| 
 | ||||
| class GenericDataStructure(object): | ||||
|     """ | ||||
|     GenericDataBase type holds all information about the current data- | ||||
|     base working on. | ||||
|     """ | ||||
| 
 | ||||
|     def __init__(self, **kwargs): | ||||
| 
 | ||||
|         self.allowedFields = [] | ||||
|         self.expandFields = ['root'] | ||||
|         self.dsFields = {} | ||||
| 
 | ||||
|         self.modifyFields(**kwargs) | ||||
| 
 | ||||
|     def modifyFields(self, **kwargs): | ||||
| 
 | ||||
|         """ | ||||
| 
 | ||||
|         :param kwargs: | ||||
|         """ | ||||
|         assert isinstance(kwargs, dict), 'dictionary type object expected' | ||||
| 
 | ||||
|         if not self.extraAllowed(): | ||||
|             kwargs = self.updateNotAllowed(kwargs) | ||||
| 
 | ||||
|         for key, value in kwargs.items(): | ||||
|             key = str(key).lower() | ||||
|             if value is not None: | ||||
|                 if type(value) not in (str, int, float): | ||||
|                     for n, val in enumerate(value): | ||||
|                         value[n] = str(val) | ||||
|                 else: | ||||
|                     value = str(value) | ||||
|             try: | ||||
|                 self.setFieldValue(key, value) | ||||
|             except KeyError as e: | ||||
|                 errmsg = '' | ||||
|                 errmsg += 'WARNING:\n' | ||||
|                 errmsg += 'unable to set values for datastructure fields\n' | ||||
|                 errmsg += '%s; desired value was: %s\n' % (e, value) | ||||
|                 print(errmsg) | ||||
| 
 | ||||
|     def isField(self, key): | ||||
|         """ | ||||
| 
 | ||||
|         :param key: | ||||
|         :return: | ||||
|         """ | ||||
|         return key in self.getFields() | ||||
| 
 | ||||
|     def getFieldValue(self, key): | ||||
|         """ | ||||
| 
 | ||||
|         :param key: | ||||
|         :return: | ||||
|         """ | ||||
|         if self.isField(key): | ||||
|             return self.getFields()[key] | ||||
|         else: | ||||
|             return | ||||
| 
 | ||||
|     def setFieldValue(self, key, value): | ||||
|         """ | ||||
| 
 | ||||
|         :param key: | ||||
|         :param value: | ||||
|         :raise KeyError: | ||||
|         """ | ||||
|         if not self.extraAllowed() and key not in self.getAllowed(): | ||||
|             raise KeyError | ||||
|         else: | ||||
|             if not self.isField(key): | ||||
|                 print('creating new field "%s"' % key) | ||||
|             self.getFields()[key] = value | ||||
| 
 | ||||
|     def getFields(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self.dsFields | ||||
| 
 | ||||
|     def getExpandFields(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self.expandFields | ||||
| 
 | ||||
|     def setExpandFields(self, keys): | ||||
|         """ | ||||
| 
 | ||||
|         :param keys: | ||||
|         """ | ||||
|         expandFields = [] | ||||
|         for key in keys: | ||||
|             if self.isField(key): | ||||
|                 expandFields.append(key) | ||||
|         self.expandFields = expandFields | ||||
| 
 | ||||
|     def getAllowed(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return self.allowedFields | ||||
| 
 | ||||
|     def extraAllowed(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return not self.allowedFields | ||||
| 
 | ||||
|     def updateNotAllowed(self, kwargs): | ||||
|         """ | ||||
| 
 | ||||
|         :param kwargs: | ||||
|         :return: | ||||
|         """ | ||||
|         for key in kwargs: | ||||
|             if key not in self.getAllowed(): | ||||
|                 kwargs.__delitem__(key) | ||||
|         return kwargs | ||||
| 
 | ||||
|     def hasSuffix(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         try: | ||||
|             self.getFieldValue('suffix') | ||||
|         except KeyError: | ||||
|             return False | ||||
|         else: | ||||
|             if self.getFieldValue('suffix'): | ||||
|                 return True | ||||
|         return False | ||||
| 
 | ||||
|     def expandDataPath(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         expandList = [] | ||||
|         for item in self.getExpandFields(): | ||||
|             expandList.append(self.getFieldValue(item)) | ||||
|         if self.hasSuffix(): | ||||
|             expandList.append('*%s' % self.getFieldValue('suffix')) | ||||
|         return os.path.join(*expandList) | ||||
| 
 | ||||
|     def getCatalogName(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         return os.path.join(self.getFieldValue('root'), 'catalog.qml') | ||||
| 
 | ||||
| 
 | ||||
| class PilotDataStructure(GenericDataStructure): | ||||
|     """ | ||||
|     Object containing the data access information for the old PILOT data | ||||
|     structure. | ||||
|     """ | ||||
| 
 | ||||
|     def __init__(self, **fields): | ||||
|         if not fields: | ||||
|             fields = {'database': '2006.01', | ||||
|                       'root': '/data/Egelados/EVENT_DATA/LOCAL'} | ||||
| 
 | ||||
|         GenericDataStructure.__init__(self, **fields) | ||||
| 
 | ||||
|         self.setExpandFields(['root', 'database']) | ||||
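| 
 | ||||
|     # Usage sketch (illustrative): with the default fields the data path | ||||
|     # expands to root/database: | ||||
|     # | ||||
|     #     pds = PilotDataStructure() | ||||
|     #     pds.expandDataPath()  # '/data/Egelados/EVENT_DATA/LOCAL/2006.01' | ||||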
| 
 | ||||
| 
 | ||||
| class SeiscompDataStructure(GenericDataStructure): | ||||
|     """ | ||||
|     Dictionary containing the data access information for an SDS data archive: | ||||
| 
 | ||||
|     :param str dataType: Desired data type. Default: ``'waveform'`` | ||||
|     :param sdate, edate: Either date string or an instance of | ||||
|          :class:`obspy.core.utcdatetime.UTCDateTime`. Default: ``None`` | ||||
|     :type sdate, edate: str or UTCDateTime or None | ||||
|     """ | ||||
| 
 | ||||
|     def __init__(self, rootpath='/data/SDS', dataformat='MSEED', | ||||
|                  filesuffix=None, **kwargs): | ||||
|         super(SeiscompDataStructure, self).__init__() | ||||
| 
 | ||||
|         # default time span: half a year back from now (the epoch offset of | ||||
|         # 1970-07-01 equals half a year in seconds) | ||||
|         edate = UTCDateTime() | ||||
|         halfyear = UTCDateTime('1970-07-01') | ||||
|         sdate = UTCDateTime(edate - halfyear) | ||||
|         del halfyear | ||||
| 
 | ||||
|         year = '' | ||||
|         if not edate.year == sdate.year: | ||||
|             nyears = edate.year - sdate.year | ||||
|             for yr in range(nyears): | ||||
|                 year += '{0:04d},'.format(sdate.year + yr) | ||||
|             year = '{' + year[:-1] + '}' | ||||
|         else: | ||||
|             year = '{0:04d}'.format(sdate.year) | ||||
| 
 | ||||
|         # SDS fields' default values | ||||
|         # definitions from | ||||
|         # http://www.seiscomp3.org/wiki/doc/applications/slarchive/SDS | ||||
| 
 | ||||
|         self.dsFields = {'root': '/data/SDS', 'YEAR': year, 'NET': '??', | ||||
|                          'STA': '????', 'CHAN': 'HH?', 'TYPE': 'D', 'LOC': '', | ||||
|                          'DAY': '{0:03d}'.format(sdate.julday) | ||||
|                          } | ||||
|         self.modifyFields(**kwargs) | ||||
| 
 | ||||
|     def modifyFields(self, **kwargs): | ||||
|         """ | ||||
|         Set SDS fields from keyword arguments, converting values to strings. | ||||
| 
 | ||||
|         :param kwargs: field names and their desired values | ||||
|         """ | ||||
|         if kwargs and isinstance(kwargs, dict): | ||||
|             for key, value in kwargs.iteritems(): | ||||
|                 key = str(key) | ||||
|                 if type(value) not in (str, int, float): | ||||
|                     for n, val in enumerate(value): | ||||
|                         value[n] = str(val) | ||||
|                 else: | ||||
|                     value = str(value) | ||||
|                 try: | ||||
|                     self.setFieldValue(key, value) | ||||
|                 except KeyError as e: | ||||
|                     errmsg = '' | ||||
|                     errmsg += 'WARNING:\n' | ||||
|                     errmsg += 'unable to set values for SDS fields\n' | ||||
|                     errmsg += '%s; desired value was: %s\n' % (e, value) | ||||
|                     print(errmsg) | ||||
| 
 | ||||
|     def setFieldValue(self, key, value): | ||||
|         """ | ||||
| 
 | ||||
|         :param key: | ||||
|         :param value: | ||||
|         """ | ||||
|         if self.isField(key): | ||||
|             self.getFields()[key] = value | ||||
|         else: | ||||
|             print('Warning: trying to set value of non-existent field ' | ||||
|                   '{field}'.format(field=key)) | ||||
| 
 | ||||
|     def expandDataPath(self): | ||||
|         """ | ||||
| 
 | ||||
| 
 | ||||
|         :return: | ||||
|         """ | ||||
|         fullChan = '{0}.{1}'.format(self.getFields()['CHAN'], self.getType()) | ||||
|         dataPath = os.path.join(self.getFields()['SDSdir'], | ||||
|                                 self.getFields()['YEAR'], | ||||
|                                 self.getFields()['NET'], | ||||
|                                 self.getFields()['STA'], | ||||
|                                 fullChan, | ||||
|                                 '*{0}'.format(self.getFields()['DAY'])) | ||||
|         return dataPath | ||||
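| 
 | ||||
| # Usage sketch (illustrative; YEAR and DAY default to values derived from | ||||
| # the current date, so the resulting pattern varies): | ||||
| # | ||||
| #     sds = SeiscompDataStructure() | ||||
| #     sds.expandDataPath()  # e.g. '/data/SDS/2016/??/????/HH?.D/*168' | ||||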
pylot/core/io/inputs.py (new file, 246 lines)
| @@ -0,0 +1,246 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| from pylot.core.util.errors import ParameterError | ||||
| 
 | ||||
| 
 | ||||
| class AutoPickParameter(object): | ||||
|     ''' | ||||
|     AutoPickParameter is a parameter-type object capable of reading and/or | ||||
|     writing parameter ASCII files. | ||||
| 
 | ||||
|     :param fnin: str, file name of the input file | ||||
| 
 | ||||
|     Parameters are given for example as follows: | ||||
|     ==========  ==========  ======================================= | ||||
|     Name        Value       Comment | ||||
|     ==========  ==========  ======================================= | ||||
|     phl         S           # phaselabel | ||||
|     ff1         0.1         # freqmin | ||||
|     ff2         0.5         # freqmax | ||||
|     tdet        6.875       # det-window_(s)_for_ar | ||||
|     tpred       2.5         # pred-window_(s)_for_ar | ||||
|     order       4           # order_of_ar | ||||
|     fnoise      0           # noise_level_for_ar | ||||
|     suppp       7           # envelopecoeff | ||||
|     tolt        300         # (s)time around arrival time | ||||
|     f1tpwt      4           # propfact_minfreq_secondtaper | ||||
|     pickwindow  9           # length_of_pick_window | ||||
|     w1          1           # length_of_smoothing_window | ||||
|     w2          0.37        # cf(i-1)*(1+peps)_for_local_min | ||||
|     w3          0.25        # cf(i-1)*(1+peps)_for_local_min | ||||
|     tslope      0.8;2       # slope_det_window_loc_glob | ||||
|     aerr        30;60       # adjusted_error_slope_fitting_loc_glob | ||||
|     tsn         20;5;20;10  # length_signal_window_S/N | ||||
|     proPh       Sn          # nextprominentphase | ||||
|     ==========  ==========  ======================================= | ||||
|     ''' | ||||
| 
 | ||||
|     def __init__(self, fnin=None, fnout=None, verbosity=0, **kwargs): | ||||
|         ''' | ||||
|         Initialize parameter object: | ||||
| 
 | ||||
|         read the content of an ASCII file and form a type-consistent | ||||
|         dictionary containing all parameters. | ||||
|         ''' | ||||
| 
 | ||||
|         self.__filename = fnin | ||||
|         parFileCont = {} | ||||
|         # io from parsed arguments alternatively | ||||
|         for key, val in kwargs.items(): | ||||
|             parFileCont[key] = val | ||||
| 
 | ||||
|         if self.__filename is not None: | ||||
|             inputFile = open(self.__filename, 'r') | ||||
|         else: | ||||
|             return | ||||
|         try: | ||||
|             lines = inputFile.readlines() | ||||
|             for line in lines: | ||||
|                 parspl = line.split('\t')[:2] | ||||
|                 parFileCont[parspl[0].strip()] = parspl[1] | ||||
|         except IndexError as e: | ||||
|             if verbosity > 0: | ||||
|                 self._printParameterError(e) | ||||
|             inputFile.seek(0) | ||||
|             lines = inputFile.readlines() | ||||
|             for line in lines: | ||||
|                 if not line.startswith(('#', '%', '\n', ' ')): | ||||
|                     parspl = line.split('#')[:2] | ||||
|                     parFileCont[parspl[1].strip()] = parspl[0].strip() | ||||
|         for key, value in parFileCont.items(): | ||||
|             try: | ||||
|                 val = int(value) | ||||
|             except (ValueError, TypeError): | ||||
|                 try: | ||||
|                     val = float(value) | ||||
|                 except (ValueError, TypeError): | ||||
|                     if len(value.split(' ')) > 1: | ||||
|                         vallist = value.strip().split(' ') | ||||
|                         val = [] | ||||
|                         for val0 in vallist: | ||||
|                             val0 = float(val0) | ||||
|                             val.append(val0) | ||||
|                     else: | ||||
|                         val = str(value.strip()) | ||||
|             parFileCont[key] = val | ||||
|         self.__parameter = parFileCont | ||||
| 
 | ||||
|         if fnout: | ||||
|             self.export2File(fnout) | ||||
| 
 | ||||
|     # Human-readable string representation of the object | ||||
|     def __str__(self): | ||||
|         string = '' | ||||
|         string += 'Automated picking parameter:\n\n' | ||||
|         if self.__parameter: | ||||
|             for key, value in self.iteritems(): | ||||
|                 string += '%s:\t\t%s\n' % (key, value) | ||||
|         else: | ||||
|             string += 'Empty parameter dictionary.' | ||||
|         return string | ||||
| 
 | ||||
|     # String representation of the object | ||||
|     def __repr__(self): | ||||
|         return "AutoPickParameter('%s')" % self.__filename | ||||
| 
 | ||||
|     # Boolean test | ||||
|     def __nonzero__(self): | ||||
|         return bool(self.__parameter) | ||||
| 
 | ||||
|     def __getitem__(self, key): | ||||
|         return self.__parameter[key] | ||||
| 
 | ||||
|     def __setitem__(self, key, value): | ||||
|         self.__parameter[key] = value | ||||
| 
 | ||||
|     def __delitem__(self, key): | ||||
|         del self.__parameter[key] | ||||
| 
 | ||||
|     def __iter__(self): | ||||
|         return iter(self.__parameter) | ||||
| 
 | ||||
|     def __len__(self): | ||||
|         return len(self.__parameter.keys()) | ||||
| 
 | ||||
|     def iteritems(self): | ||||
|         for key, value in self.__parameter.items(): | ||||
|             yield key, value | ||||
| 
 | ||||
|     def hasParam(self, parameter): | ||||
|         return parameter in self.__parameter | ||||
| 
 | ||||
|     def get(self, *args): | ||||
|         try: | ||||
|             for param in args: | ||||
|                 try: | ||||
|                     return self.__getitem__(param) | ||||
|                 except KeyError as e: | ||||
|                     self._printParameterError(e) | ||||
|                     raise ParameterError(e) | ||||
|         except TypeError: | ||||
|             try: | ||||
|                 return self.__getitem__(args) | ||||
|             except KeyError as e: | ||||
|                 self._printParameterError(e) | ||||
|                 raise ParameterError(e) | ||||
| 
 | ||||
|     def setParam(self, **kwargs): | ||||
|         for param, value in kwargs.items(): | ||||
|             self.__setitem__(param, value) | ||||
|             # print(self) | ||||
| 
 | ||||
|     @staticmethod | ||||
|     def _printParameterError(errmsg): | ||||
|         print('ParameterError:\n non-existent parameter %s' % errmsg) | ||||
| 
 | ||||
|     def export2File(self, fnout): | ||||
|         fid_out = open(fnout, 'w') | ||||
|         lines = [] | ||||
|         for key, value in self.iteritems(): | ||||
|             # terminate each line so the file can be parsed again | ||||
|             lines.append('{key}\t{value}\n'.format(key=key, value=value)) | ||||
|         fid_out.writelines(lines) | ||||
|         fid_out.close() | ||||
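| 
 | ||||
| # Usage sketch (illustrative; 'autoPyLoT.in' stands for any parameter file | ||||
| # in the format described in the class docstring): | ||||
| # | ||||
| #     params = AutoPickParameter('autoPyLoT.in') | ||||
| #     tdet = params.get('tdet')     # typed access, e.g. 6.875 | ||||
| #     params.setParam(tdet=8.0)     # override a value | ||||
| #     params.export2File('autoPyLoT_mod.in') | ||||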
| 
 | ||||
| 
 | ||||
| class FilterOptions(object): | ||||
|     ''' | ||||
|     FilterOptions is a parameter object type providing Butterworth filter | ||||
|     option parameters for PyLoT. Its easy-to-access properties help to manage | ||||
|     file-based as well as GUI-based parameter manipulation. | ||||
| 
 | ||||
|     :type filtertype: str, optional | ||||
|     :param filtertype: String containing the desired filter type. For | ||||
|     information about the supported filter types see the _`Supported Filter` | ||||
|     section below. | ||||
| 
 | ||||
|     :type freq: list, optional | ||||
|     :param freq: list of float(s) describing the cutoff limits of the filter | ||||
| 
 | ||||
|     :type order: int, optional | ||||
|     :param order: Integer value describing the order of the desired Butterworth | ||||
|     filter. | ||||
| 
 | ||||
|     .. rubric:: _`Supported Filter` | ||||
| 
 | ||||
|         ``'bandpass'`` | ||||
|             Butterworth-Bandpass | ||||
| 
 | ||||
|         ``'bandstop'`` | ||||
|             Butterworth-Bandstop | ||||
| 
 | ||||
|         ``'lowpass'`` | ||||
|             Butterworth-Lowpass | ||||
| 
 | ||||
|         ``'highpass'`` | ||||
|             Butterworth-Highpass | ||||
|     ''' | ||||
| 
 | ||||
|     def __init__(self, filtertype='bandpass', freq=[2., 5.], order=3, | ||||
|                  **kwargs): | ||||
|         self._order = order | ||||
|         self._filtertype = filtertype | ||||
|         self._freq = freq | ||||
| 
 | ||||
|     def __str__(self): | ||||
|         hrs = '''\n\tFilter parameter:\n | ||||
|         Type:\t\t{ftype}\n | ||||
|         Frequencies:\t{freq}\n | ||||
|         Order:\t\t{order}\n | ||||
|         '''.format(ftype=self.getFilterType(), | ||||
|                    freq=self.getFreq(), | ||||
|                    order=self.getOrder()) | ||||
|         return hrs | ||||
| 
 | ||||
|     def __nonzero__(self): | ||||
|         return bool(self.getFilterType()) | ||||
| 
 | ||||
|     def parseFilterOptions(self): | ||||
|         if self: | ||||
|             robject = {'type': self.getFilterType(), 'corners': self.getOrder()} | ||||
|             if len(self.getFreq()) > 1: | ||||
|                 robject['freqmin'] = self.getFreq()[0] | ||||
|                 robject['freqmax'] = self.getFreq()[1] | ||||
|             else: | ||||
|                 robject['freq'] = self.getFreq() if type(self.getFreq()) is \ | ||||
|                                                     float else self.getFreq()[0] | ||||
|             return robject | ||||
|         return None | ||||
| 
 | ||||
|     def getFreq(self): | ||||
|         return self._freq | ||||
| 
 | ||||
|     def setFreq(self, freq): | ||||
|         self._freq = freq | ||||
| 
 | ||||
|     def getOrder(self): | ||||
|         return self._order | ||||
| 
 | ||||
|     def setOrder(self, order): | ||||
|         self._order = order | ||||
| 
 | ||||
|     def getFilterType(self): | ||||
|         return self._filtertype | ||||
| 
 | ||||
|     def setFilterType(self, filtertype): | ||||
|         self._filtertype = filtertype | ||||
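| 
 | ||||
| # Usage sketch (illustrative; st stands for any obspy Stream): the parsed | ||||
| # options map directly onto Stream.filter's keyword arguments: | ||||
| # | ||||
| #     fopts = FilterOptions(filtertype='bandpass', freq=[2., 10.], order=4) | ||||
| #     st.filter(**fopts.parseFilterOptions()) | ||||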
pylot/core/io/location.py (new file, 220 lines)
| @@ -0,0 +1,220 @@ | ||||
| from obspy import UTCDateTime | ||||
| from obspy.core import event as ope | ||||
| 
 | ||||
| from pylot.core.util.utils import getLogin, getHash | ||||
| 
 | ||||
| 
 | ||||
| def create_amplitude(pickID, amp, unit, category, cinfo): | ||||
|     ''' | ||||
|     create_amplitude - function to create an ObsPy Amplitude object | ||||
| 
 | ||||
|     :param pickID: resource identifier of the pick the amplitude refers to | ||||
|     :param amp: generic amplitude value | ||||
|     :param unit: amplitude unit (e.g. 'm') | ||||
|     :param category: amplitude category (e.g. 'point') | ||||
|     :param cinfo: creation info of the returned object | ||||
|     :return: An ObsPy :class: `~obspy.core.event.Amplitude` object | ||||
|     ''' | ||||
|     amplitude = ope.Amplitude() | ||||
|     amplitude.creation_info = cinfo | ||||
|     amplitude.generic_amplitude = amp | ||||
|     amplitude.unit = ope.AmplitudeUnit(unit) | ||||
|     amplitude.category = ope.AmplitudeCategory(category) | ||||
|     amplitude.pick_id = pickID | ||||
|     return amplitude | ||||
| 
 | ||||
| 
 | ||||
| def create_arrival(pickresID, cinfo, phase, azimuth=None, dist=None): | ||||
|     ''' | ||||
|     create_arrival - function to create an ObsPy Arrival | ||||
| 
 | ||||
|     :param pickresID: Resource identifier of the created pick | ||||
|     :type pickresID: :class: `~obspy.core.event.ResourceIdentifier` object | ||||
|     :param cinfo: An ObsPy :class: `~obspy.core.event.CreationInfo` object | ||||
|     holding information on the creation of the returned object | ||||
|     :type cinfo: :class: `~obspy.core.event.CreationInfo` object | ||||
|     :param phase: name of the arrival's seismic phase | ||||
|     :type phase: str | ||||
|     :param azimuth: azimuth between source and receiver | ||||
|     :type azimuth: float or int, optional | ||||
|     :param dist: distance between source and receiver | ||||
|     :type dist: float or int, optional | ||||
|     :return: An ObsPy :class: `~obspy.core.event.Arrival` object | ||||
|     ''' | ||||
|     arrival = ope.Arrival() | ||||
|     arrival.creation_info = cinfo | ||||
|     arrival.pick_id = pickresID | ||||
|     arrival.phase = phase | ||||
|     if azimuth is not None: | ||||
|         arrival.azimuth = float(azimuth) if azimuth > -180 else azimuth + 360. | ||||
|     else: | ||||
|         arrival.azimuth = azimuth | ||||
|     arrival.distance = dist | ||||
|     return arrival | ||||
| 
 | ||||
| 
 | ||||
| def create_creation_info(agency_id=None, creation_time=None, author=None): | ||||
|     ''' | ||||
|     create_creation_info - function to create an ObsPy CreationInfo object | ||||
| 
 | ||||
|     :param agency_id: name of the institution carrying out the processing | ||||
|     :param creation_time: creation time; defaults to the current time | ||||
|     :param author: author name; defaults to the current login name | ||||
|     :return: An ObsPy :class: `~obspy.core.event.CreationInfo` object | ||||
|     ''' | ||||
|     if author is None: | ||||
|         author = getLogin() | ||||
|     if creation_time is None: | ||||
|         creation_time = UTCDateTime() | ||||
|     return ope.CreationInfo(agency_id=agency_id, author=author, | ||||
|                             creation_time=creation_time) | ||||
| 
 | ||||
| 
 | ||||
| def create_event(origintime, cinfo, originloc=None, etype='earthquake', | ||||
|                  resID=None, authority_id=None): | ||||
|     ''' | ||||
|     create_event - function to create an ObsPy Event | ||||
| 
 | ||||
|     :param origintime: the event's origin time | ||||
|     :type origintime: :class: `~obspy.core.utcdatetime.UTCDateTime` object | ||||
|     :param cinfo: An ObsPy :class: `~obspy.core.event.CreationInfo` object | ||||
|         holding information on the creation of the returned object | ||||
|     :type cinfo: :class: `~obspy.core.event.CreationInfo` object | ||||
|     :param originloc: tuple containing the location of the origin | ||||
|         (LAT, LON, DEP) affiliated with the event which is created | ||||
|     :type originloc: tuple, list | ||||
|     :param etype: event type string, converted via ObsPy to a valid event | ||||
|         type. | ||||
|     :type etype: str | ||||
|     :param resID: Resource identifier of the created event | ||||
|     :type resID: :class: `~obspy.core.event.ResourceIdentifier` object, str | ||||
|     :param authority_id: name of the institution carrying out the processing | ||||
|     :type authority_id: str | ||||
|     :return: An ObsPy :class: `~obspy.core.event.Event` object | ||||
|     ''' | ||||
| 
 | ||||
|     if originloc is not None: | ||||
|         o = create_origin(origintime, cinfo, | ||||
|                           originloc[0], originloc[1], originloc[2]) | ||||
|     else: | ||||
|         o = None | ||||
|     if not resID: | ||||
|         resID = create_resourceID(origintime, etype, authority_id) | ||||
|     elif isinstance(resID, str): | ||||
|         resID = create_resourceID(origintime, etype, authority_id, resID) | ||||
|     elif not isinstance(resID, ope.ResourceIdentifier): | ||||
|         raise TypeError("unsupported type(resID) for resource identifier " | ||||
|                         "generation: %s" % type(resID)) | ||||
|     event = ope.Event(resource_id=resID) | ||||
|     event.creation_info = cinfo | ||||
|     event.event_type = etype | ||||
|     if o: | ||||
|         event.origins = [o] | ||||
|     return event | ||||
| 
 | ||||
| 
 | ||||
| def create_magnitude(originID, cinfo): | ||||
|     ''' | ||||
|     create_magnitude - function to create an ObsPy Magnitude object | ||||
| 
 | ||||
|     :param originID: resource identifier of the origin the magnitude refers to | ||||
|     :param cinfo: creation info of the returned object | ||||
|     :return: An ObsPy :class: `~obspy.core.event.Magnitude` object | ||||
|     ''' | ||||
|     magnitude = ope.Magnitude() | ||||
|     magnitude.creation_info = cinfo | ||||
|     magnitude.origin_id = originID | ||||
|     return magnitude | ||||
| 
 | ||||
| 
 | ||||
| def create_origin(origintime, cinfo, latitude, longitude, depth): | ||||
|     ''' | ||||
|     create_origin - function to create an ObsPy Origin | ||||
|     :param origintime: the origin's time of occurrence | ||||
|     :type origintime: :class: `~obspy.core.utcdatetime.UTCDateTime` object | ||||
|     :param cinfo: creation info of the returned object | ||||
|     :type cinfo: :class: `~obspy.core.event.CreationInfo` object | ||||
|     :param latitude: latitude in decimal degrees of the origin's location | ||||
|     :type latitude: float | ||||
|     :param longitude: longitude in decimal degrees of the origin's location | ||||
|     :type longitude: float | ||||
|     :param depth: hypocentral depth of the origin | ||||
|     :type depth: float | ||||
|     :return: An ObsPy :class: `~obspy.core.event.Origin` object | ||||
|     ''' | ||||
| 
 | ||||
|     assert isinstance(origintime, UTCDateTime), "origintime has to be " \ | ||||
|                                                 "a UTCDateTime object, but " \ | ||||
|                                                 "actually is of type " \ | ||||
|                                                 "'%s'" % type(origintime) | ||||
| 
 | ||||
|     origin = ope.Origin() | ||||
|     origin.time = origintime | ||||
|     origin.creation_info = cinfo | ||||
|     origin.latitude = latitude | ||||
|     origin.longitude = longitude | ||||
|     origin.depth = depth | ||||
|     return origin | ||||
| 
 | ||||
| 
 | ||||
| def create_pick(origintime, picknum, picktime, eventnum, cinfo, phase, station, | ||||
|                 wfseedstr, authority_id): | ||||
|     ''' | ||||
|     create_pick - function to create an ObsPy Pick | ||||
| 
 | ||||
|     :param origintime: time used to hash the pick's resource identifier | ||||
|     :type origintime: :class: `~obspy.core.utcdatetime.UTCDateTime` object | ||||
|     :param picknum: number of the created pick | ||||
|     :type picknum: int | ||||
|     :param picktime: onset time of the pick | ||||
|     :type picktime: :class: `~obspy.core.utcdatetime.UTCDateTime` object | ||||
|     :param eventnum: human-readable event identifier | ||||
|     :type eventnum: str | ||||
|     :param cinfo: An ObsPy :class: `~obspy.core.event.CreationInfo` object | ||||
|         holding information on the creation of the returned object | ||||
|     :type cinfo: :class: `~obspy.core.event.CreationInfo` object | ||||
|     :param phase: name of the picked seismic phase | ||||
|     :type phase: str | ||||
|     :param station: name of the station at which the seismic phase has been | ||||
|         picked | ||||
|     :type station: str | ||||
|     :param wfseedstr: A SEED formatted string of the form | ||||
|         network.station.location.channel in order to set a referenced waveform | ||||
|     :type wfseedstr: str, SEED formatted | ||||
|     :param authority_id: name of the institution carrying out the processing | ||||
|     :type authority_id: str | ||||
|     :return: An ObsPy :class: `~obspy.core.event.Pick` object | ||||
|     ''' | ||||
|     pickID = eventnum + '_' + station.strip() + '/{0:03d}'.format(picknum) | ||||
|     pickresID = create_resourceID(origintime, 'pick', authority_id, pickID) | ||||
|     pick = ope.Pick() | ||||
|     pick.resource_id = pickresID | ||||
|     pick.time = picktime | ||||
|     pick.creation_info = cinfo | ||||
|     pick.phase_hint = phase | ||||
|     pick.waveform_id = ope.ResourceIdentifier(id=wfseedstr, prefix='file:/') | ||||
|     return pick | ||||
| 
 | ||||
| 
 | ||||
| def create_resourceID(timetohash, restype, authority_id=None, hrstr=None): | ||||
|     ''' | ||||
|     create_resourceID - function to create an ObsPy ResourceIdentifier | ||||
| 
 | ||||
|     :param timetohash: time whose hash becomes part of the identifier | ||||
|     :type timetohash: :class: `~obspy.core.utcdatetime.UTCDateTime` object | ||||
|     :param restype: type of the resource, e.g. 'orig', 'earthquake' ... | ||||
|     :type restype: str | ||||
|     :param authority_id: name of the institution carrying out the processing | ||||
|     :type authority_id: str, optional | ||||
|     :param hrstr: human-readable string used instead of the hash if given | ||||
|     :type hrstr: str, optional | ||||
|     :return: An ObsPy :class: `~obspy.core.event.ResourceIdentifier` object | ||||
|     ''' | ||||
|     assert isinstance(timetohash, UTCDateTime), "'timetohash' is not an " \ | ||||
|                                                 "ObsPy UTCDateTime object" | ||||
|     hid = getHash(timetohash) | ||||
|     if hrstr is None: | ||||
|         resID = ope.ResourceIdentifier(restype + '/' + hid[0:6]) | ||||
|     else: | ||||
|         resID = ope.ResourceIdentifier(restype + '/' + hrstr) | ||||
|     if authority_id is not None: | ||||
|         resID.convertIDToQuakeMLURI(authority_id=authority_id) | ||||
|     return resID | ||||
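| 
 | ||||
| # Usage sketch (illustrative values): the factory functions are meant to be | ||||
| # chained when assembling a new event: | ||||
| # | ||||
| #     cinfo = create_creation_info(agency_id='RUB') | ||||
| #     t0 = UTCDateTime(2016, 4, 1, 12, 0, 0) | ||||
| #     event = create_event(t0, cinfo, originloc=(51.4, 7.2, 10.), | ||||
| #                          authority_id='rub') | ||||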
pylot/core/io/phases.py (new file, 570 lines)
| @@ -0,0 +1,570 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import glob | ||||
| import obspy.core.event as ope | ||||
| import os | ||||
| import scipy.io as sio | ||||
| import warnings | ||||
| from obspy.core import UTCDateTime | ||||
| 
 | ||||
| from pylot.core.io.inputs import AutoPickParameter | ||||
| from pylot.core.io.location import create_arrival, create_event, \ | ||||
|     create_magnitude, create_origin, create_pick | ||||
| from pylot.core.pick.utils import select_for_phase | ||||
| from pylot.core.util.utils import getOwner, full_range, four_digits | ||||
| 
 | ||||
| 
 | ||||
| def add_amplitudes(event, amplitudes): | ||||
|     amplitude_list = [] | ||||
|     for pick in event.picks: | ||||
|         try: | ||||
|             a0 = amplitudes[pick.waveform_id.station_code] | ||||
|             amplitude = ope.Amplitude(generic_amplitude=a0 * 1e-3) | ||||
|             amplitude.unit = 'm' | ||||
|             amplitude.category = 'point' | ||||
|             amplitude.waveform_id = pick.waveform_id | ||||
|             amplitude.magnitude_hint = 'ML' | ||||
|             amplitude.pick_id = pick.resource_id | ||||
|             amplitude.type = 'AML' | ||||
|             amplitude_list.append(amplitude) | ||||
|         except KeyError: | ||||
|             continue | ||||
|     event.amplitudes = amplitude_list | ||||
|     return event | ||||
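| 
 | ||||
| # Usage sketch (illustrative; station codes and values are hypothetical). | ||||
| # The 1e-3 factor above implies the input amplitudes are expected in mm: | ||||
| # | ||||
| #     event = add_amplitudes(event, {'STA01': 0.42, 'STA02': 1.7}) | ||||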
| 
 | ||||
| def readPILOTEvent(phasfn=None, locfn=None, authority_id='RUB', **kwargs): | ||||
|     """ | ||||
|     readPILOTEvent - function | ||||
| 
 | ||||
|     Reads Matlab PHASES and LOC files written by Matlab versions of PILOT and | ||||
|     converts the data into an ObsPy Event object which is returned to the | ||||
|     calling program. | ||||
| 
 | ||||
|     :rtype: ~obspy.core.event.Event | ||||
|     :param phasfn: filename of the old PILOT Matlab PHASES file | ||||
|     :param locfn: filename of the old PILOT Matlab LOC file | ||||
|     :param authority_id: name of the institution carrying out the processing | ||||
|     :param kwargs: additional keyword arguments passed through | ||||
|     :return event: event object containing event and phase information | ||||
|     """ | ||||
|     if phasfn is not None and os.path.isfile(phasfn): | ||||
|         phases = sio.loadmat(phasfn) | ||||
|         phasctime = UTCDateTime(os.path.getmtime(phasfn)) | ||||
|         phasauthor = getOwner(phasfn) | ||||
|     else: | ||||
|         phases = None | ||||
|         phasctime = None | ||||
|         phasauthor = None | ||||
|     if locfn is not None and os.path.isfile(locfn): | ||||
|         loc = sio.loadmat(locfn) | ||||
|         locctime = UTCDateTime(os.path.getmtime(locfn)) | ||||
|         locauthor = getOwner(locfn) | ||||
|     else: | ||||
|         loc = None | ||||
|         locctime = None | ||||
|         locauthor = None | ||||
|     pickcinfo = ope.CreationInfo(agency_id=authority_id, | ||||
|                                  author=phasauthor, | ||||
|                                  creation_time=phasctime) | ||||
|     loccinfo = ope.CreationInfo(agency_id=authority_id, | ||||
|                                 author=locauthor, | ||||
|                                 creation_time=locctime) | ||||
| 
 | ||||
|     eventNum = str(loc['ID'][0]) | ||||
| 
 | ||||
|     # retrieve eventID for the actual database | ||||
|     idsplit = eventNum.split('.') | ||||
| 
 | ||||
|     # retrieve date information | ||||
|     julday = int(idsplit[1]) | ||||
|     year = int(idsplit[2]) | ||||
|     hour = int(loc['hh']) | ||||
|     minute = int(loc['mm']) | ||||
|     second = int(loc['ss']) | ||||
| 
 | ||||
|     year = four_digits(year) | ||||
| 
 | ||||
|     eventDate = UTCDateTime(year=year, julday=julday, hour=hour, | ||||
|                             minute=minute, second=second) | ||||
| 
 | ||||
|     stations = [stat for stat in phases['stat'][0:-1:3]] | ||||
| 
 | ||||
|     lat = float(loc['LAT']) | ||||
|     lon = float(loc['LON']) | ||||
|     dep = float(loc['DEP']) | ||||
| 
 | ||||
|     event = create_event(eventDate, loccinfo, originloc=(lat, lon, dep), | ||||
|                          etype='earthquake', resID=eventNum, | ||||
|                          authority_id=authority_id) | ||||
| 
 | ||||
|     picks = picksdict_from_pilot(phasfn) | ||||
| 
 | ||||
|     event.picks = picks_from_picksdict(picks, creation_info=pickcinfo) | ||||
| 
 | ||||
|     if event.origins: | ||||
|         origin = event.origins[0] | ||||
|         magnitude = create_magnitude(origin.get('id'), loccinfo) | ||||
|         magnitude.mag = float(loc['Mnet']) | ||||
|         magnitude.magnitude_type = 'Ml' | ||||
|         event.magnitudes.append(magnitude) | ||||
|     return event | ||||
| 
 | ||||
| 
 | ||||
| def picksdict_from_pilot(fn): | ||||
|     from pylot.core.util.defaults import TIMEERROR_DEFAULTS | ||||
|     picks = dict() | ||||
|     phases_pilot = sio.loadmat(fn) | ||||
|     stations = stations_from_pilot(phases_pilot['stat']) | ||||
|     params = AutoPickParameter(TIMEERROR_DEFAULTS) | ||||
|     timeerrors = dict(P=params.get('timeerrorsP'), | ||||
|                       S=params.get('timeerrorsS')) | ||||
|     for n, station in enumerate(stations): | ||||
|         phases = dict() | ||||
|         for onset_name in 'PS': | ||||
|             onset_label = '{0}time'.format(onset_name) | ||||
|             pick = phases_pilot[onset_label][n] | ||||
|             if not pick[0]: | ||||
|                 continue | ||||
|             pick = convert_pilot_times(pick) | ||||
|             uncertainty_label = '{0}weight'.format(onset_name.lower()) | ||||
|             ierror = phases_pilot[uncertainty_label][0, n] | ||||
|             try: | ||||
|                 spe = timeerrors[onset_name][ierror] | ||||
|             except IndexError as e: | ||||
|                 print(e.message + '\ntaking twice the largest default error value') | ||||
|                 spe = timeerrors[onset_name][-1] * 2 | ||||
|             phases[onset_name] = dict(mpp=pick, spe=spe, weight=ierror) | ||||
|         picks[station] = phases | ||||
| 
 | ||||
|     return picks | ||||
| 
 | ||||
| 
 | ||||
| def stations_from_pilot(stat_array): | ||||
|     stations = list() | ||||
|     cur_stat = None | ||||
|     for stat in stat_array: | ||||
|         stat = stat.strip() | ||||
|         if stat == cur_stat: | ||||
|             continue | ||||
|         cur_stat = stat | ||||
|         if stat not in stations: | ||||
|             stations.append(stat) | ||||
|         else: | ||||
|             warnings.warn('station {0} listed at least twice, might corrupt ' | ||||
|                           'phase times'.format(stat), RuntimeWarning) | ||||
| 
 | ||||
|     return stations | ||||
| 
 | ||||
| 
 | ||||
| def convert_pilot_times(time_array): | ||||
|     times = [int(time) for time in time_array] | ||||
|     microseconds = int((time_array[-1] - times[-1]) * 1e6) | ||||
|     times.append(microseconds) | ||||
|     return UTCDateTime(*times) | ||||
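| 
 | ||||
| # Usage sketch (illustrative): a PILOT time array lists year, month, day, | ||||
| # hour, minute and a fractional second; the fraction is split off into | ||||
| # microseconds: | ||||
| # | ||||
| #     convert_pilot_times([2006., 1., 15., 4., 33., 12.25]) | ||||
| #     # -> UTCDateTime(2006, 1, 15, 4, 33, 12, 250000) | ||||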
| 
 | ||||
| 
 | ||||
| def picksdict_from_obs(fn): | ||||
|     picks = dict() | ||||
|     station_name = str() | ||||
|     for line in open(fn, 'r'): | ||||
|         if line.startswith('#'): | ||||
|             continue | ||||
|         else: | ||||
|             phase_line = line.split() | ||||
|             if not station_name == phase_line[0]: | ||||
|                 phase = dict() | ||||
|             station_name = phase_line[0] | ||||
|             phase_name = phase_line[4].upper() | ||||
|             pick = UTCDateTime(phase_line[6] + phase_line[7] + phase_line[8]) | ||||
|             phase[phase_name] = dict(mpp=pick, fm=phase_line[5]) | ||||
|             picks[station_name] = phase | ||||
|     return picks | ||||
| 
 | ||||
| 
 | ||||
| def picksdict_from_picks(evt): | ||||
|     """ | ||||
|     Takes an Event object and returns the pick dictionary commonly used | ||||
|     within PyLoT. | ||||
|     :param evt: Event object containing all available pick information | ||||
|     :type evt: `~obspy.core.event.Event` | ||||
|     :return: pick dictionary | ||||
|     """ | ||||
|     picks = {} | ||||
|     for pick in evt.picks: | ||||
|         phase = {} | ||||
|         station = pick.waveform_id.station_code | ||||
|         try: | ||||
|             onsets = picks[station] | ||||
|         except KeyError as e: | ||||
|             #print(e) | ||||
|             onsets = {} | ||||
|         mpp = pick.time | ||||
|         spe = pick.time_errors.uncertainty | ||||
|         try: | ||||
|             lpp = mpp + pick.time_errors.upper_uncertainty | ||||
|             epp = mpp - pick.time_errors.lower_uncertainty | ||||
|         except TypeError as e: | ||||
|             msg = e.message + ',\n falling back to symmetric uncertainties' | ||||
|             warnings.warn(msg) | ||||
|             lpp = mpp + spe | ||||
|             epp = mpp - spe | ||||
|         phase['mpp'] = mpp | ||||
|         phase['epp'] = epp | ||||
|         phase['lpp'] = lpp | ||||
|         phase['spe'] = spe | ||||
|         try: | ||||
|             picker = str(pick.method_id) | ||||
|             if picker.startswith('smi:local/'): | ||||
|                 picker = picker.split('smi:local/')[1] | ||||
|             phase['picker'] = picker | ||||
|         except IndexError: | ||||
|             pass | ||||
| 
 | ||||
|         onsets[pick.phase_hint] = phase.copy() | ||||
|         picks[station] = onsets.copy() | ||||
|     return picks | ||||
| 
 | ||||
| def picks_from_picksdict(picks, creation_info=None): | ||||
|     picks_list = list() | ||||
|     for station, onsets in picks.items(): | ||||
|         for label, phase in onsets.items(): | ||||
|             if not isinstance(phase, dict): | ||||
|                 continue | ||||
|             onset = phase['mpp'] | ||||
|             pick = ope.Pick() | ||||
|             if creation_info: | ||||
|                 pick.creation_info = creation_info | ||||
|             pick.time = onset | ||||
|             error = phase['spe'] | ||||
|             pick.time_errors.uncertainty = error | ||||
|             try: | ||||
|                 epp = phase['epp'] | ||||
|                 lpp = phase['lpp'] | ||||
|                 pick.time_errors.lower_uncertainty = onset - epp | ||||
|                 pick.time_errors.upper_uncertainty = lpp - onset | ||||
|             except KeyError as e: | ||||
|                 warnings.warn(e.message, RuntimeWarning) | ||||
|             try: | ||||
|                 picker = phase['picker'] | ||||
|             except KeyError as e: | ||||
|                 warnings.warn(e.message, RuntimeWarning) | ||||
|                 picker = 'Unknown' | ||||
|             pick.phase_hint = label | ||||
|             pick.method_id = ope.ResourceIdentifier(id=picker) | ||||
|             pick.waveform_id = ope.WaveformStreamID(station_code=station) | ||||
|             try: | ||||
|                 polarity = phase['fm'] | ||||
|                 if polarity in ('U', '+'): | ||||
|                     pick.polarity = 'positive' | ||||
|                 elif polarity in ('D', '-'): | ||||
|                     pick.polarity = 'negative' | ||||
|                 else: | ||||
|                     pick.polarity = 'undecidable' | ||||
|             except KeyError as e: | ||||
|                 if 'fm' in e.message: # no polarity information found for this phase | ||||
|                     pass | ||||
|                 else: | ||||
|                     raise e | ||||
|             picks_list.append(pick) | ||||
|     return picks_list | ||||
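| 
 | ||||
| # Usage sketch (illustrative): the expected PyLoT picks dictionary is keyed | ||||
| # by station and phase label: | ||||
| # | ||||
| #     picks = {'STA01': {'P': dict(mpp=UTCDateTime(), spe=0.02, | ||||
| #                                  picker='autoPyLoT')}} | ||||
| #     pick_objects = picks_from_picksdict(picks) | ||||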
| 
 | ||||
| 
 | ||||
| def reassess_pilot_db(root_dir, db_dir, out_dir=None, fn_param=None, verbosity=0): | ||||
|     import glob | ||||
| 
 | ||||
|     db_root = os.path.join(root_dir, db_dir) | ||||
|     evt_list = glob.glob1(db_root,'e????.???.??') | ||||
| 
 | ||||
|     for evt in evt_list: | ||||
|         if verbosity > 0: | ||||
|             print('Reassessing event {0}'.format(evt)) | ||||
|         reassess_pilot_event(root_dir, db_dir, evt, out_dir, fn_param, verbosity) | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| def reassess_pilot_event(root_dir, db_dir, event_id, out_dir=None, fn_param=None, verbosity=0): | ||||
|     from obspy import read | ||||
| 
 | ||||
|     from pylot.core.io.inputs import AutoPickParameter | ||||
|     from pylot.core.pick.utils import earllatepicker | ||||
| 
 | ||||
|     if fn_param is None: | ||||
|         import pylot.core.util.defaults as defaults | ||||
|         fn_param = defaults.AUTOMATIC_DEFAULTS | ||||
| 
 | ||||
|     default = AutoPickParameter(fn_param, verbosity) | ||||
| 
 | ||||
|     search_base = os.path.join(root_dir, db_dir, event_id) | ||||
|     phases_file = glob.glob(os.path.join(search_base, 'PHASES.mat')) | ||||
|     if not phases_file: | ||||
|         return | ||||
|     if verbosity > 1: | ||||
|         print('Opening PILOT phases file: {fn}'.format(fn=phases_file[0])) | ||||
|     picks_dict = picksdict_from_pilot(phases_file[0]) | ||||
|     if verbosity > 0: | ||||
|         print('Dictionary read from PHASES.mat:\n{0}'.format(picks_dict)) | ||||
|     datacheck = list() | ||||
|     info = None | ||||
|     for station in picks_dict.keys(): | ||||
|         fn_pattern = os.path.join(search_base, '{0}*'.format(station)) | ||||
|         try: | ||||
|             st = read(fn_pattern) | ||||
|         except TypeError as e: | ||||
|             if 'Unknown format for file' in e.message: | ||||
|                 try: | ||||
|                     st = read(fn_pattern, format='GSE2') | ||||
|                 except ValueError as e: | ||||
|                     if e.message == 'second must be in 0..59': | ||||
|                         info = 'A known Error was raised. Please find the list of corrupted files and double-check these files.' | ||||
|                         datacheck.append(fn_pattern + ' (time info)\n') | ||||
|                         continue | ||||
|                     else: | ||||
|                         raise ValueError(e.message) | ||||
|                 except Exception as e: | ||||
|                     if 'No file matching file pattern:' in e.message: | ||||
|                         if verbosity > 0: | ||||
|                             warnings.warn('no waveform data found for station {station}'.format(station=station), RuntimeWarning) | ||||
|                         datacheck.append(fn_pattern + ' (no data)\n') | ||||
|                         continue | ||||
|                     else: | ||||
|                         raise e | ||||
|             else: | ||||
|                 raise e | ||||
|         for phase in picks_dict[station].keys(): | ||||
|             try: | ||||
|                 mpp = picks_dict[station][phase]['mpp'] | ||||
|             except KeyError as e: | ||||
|                 print(e.message, station) | ||||
|                 continue | ||||
|             sel_st = select_for_phase(st, phase) | ||||
|             if not sel_st: | ||||
|                 msg = 'no waveform data found for station {station}'.format(station=station) | ||||
|                 warnings.warn(msg, RuntimeWarning) | ||||
|                 continue | ||||
|             stime, etime = full_range(sel_st) | ||||
|             rel_pick = mpp - stime | ||||
|             epp, lpp, spe = earllatepicker(sel_st, | ||||
|                                            default.get('nfac{0}'.format(phase)), | ||||
|                                            default.get('tsnrz' if phase == 'P' else 'tsnrh'), | ||||
|                                            Pick1=rel_pick, | ||||
|                                            iplot=None, | ||||
|                                            stealth_mode=True) | ||||
|             if epp is None or lpp is None: | ||||
|                 continue | ||||
|             epp = stime + epp | ||||
|             lpp = stime + lpp | ||||
|             min_diff = 3 * st[0].stats.delta | ||||
|             if lpp - mpp < min_diff: | ||||
|                 lpp = mpp + min_diff | ||||
|             if mpp - epp < min_diff: | ||||
|                 epp = mpp - min_diff | ||||
|             picks_dict[station][phase] = dict(epp=epp, mpp=mpp, lpp=lpp, spe=spe) | ||||
|     if datacheck: | ||||
|         if info and verbosity > 0: | ||||
|             print(info + ': {0}'.format(search_base)) | ||||
|         with open(os.path.join(search_base, 'datacheck_list'), 'w') as fncheck: | ||||
|             fncheck.writelines(datacheck) | ||||
|         del datacheck | ||||
|     # create Event object for export | ||||
|     evt = ope.Event(resource_id=event_id) | ||||
|     evt.picks = picks_from_picksdict(picks_dict) | ||||
|     # write phase information to file | ||||
|     if not out_dir: | ||||
|         fnout_prefix = os.path.join(root_dir, db_dir, event_id, '{0}.'.format(event_id)) | ||||
|     else: | ||||
|         out_dir = os.path.join(out_dir, db_dir) | ||||
|         if not os.path.isdir(out_dir): | ||||
|             os.makedirs(out_dir) | ||||
|         fnout_prefix = os.path.join(out_dir, '{0}.'.format(event_id)) | ||||
|     evt.write(fnout_prefix + 'xml', format='QUAKEML') | ||||
|     #evt.write(fnout_prefix + 'cnv', format='VELEST') | ||||
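For orientation, a minimal sketch of how this reassessment routine might be driven from a script; the archive root, database name, and event IDs below are hypothetical:

    from pylot.core.io.phases import reassess_pilot_event

    root = '/data/PILOT'         # hypothetical archive root
    database = 'MAGS2_DB'        # hypothetical database directory
    for event_id in ['e0001.015.12', 'e0002.021.12']:  # hypothetical event IDs
        # re-evaluates the PILOT picks and writes <event_id>.xml (QuakeML)
        reassess_pilot_event(root, database, event_id, verbosity=1)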
| 
 | ||||
| 
 | ||||
| def writephases(arrivals, fformat, filename): | ||||
|     """ | ||||
|     Function of methods to write phases to the following standard file | ||||
|     formats used for locating earthquakes: | ||||
| 
 | ||||
|     HYPO71, NLLoc, VELEST, HYPOSAT, and hypoDD | ||||
| 
 | ||||
|     :param: arrivals | ||||
|     :type: dictionary containing all phase information including | ||||
|            station ID, phase, first motion, weight (uncertainty), | ||||
|            .... | ||||
| 
 | ||||
|     :param: fformat | ||||
|     :type:  string, chosen file format (location routine), | ||||
|             choose between NLLoc, HYPO71, HYPOSAT, VELEST, | ||||
|             HYPOINVERSE, and hypoDD | ||||
| 
 | ||||
|     :param: filename, full path and name of phase file | ||||
|     :type: string | ||||
|     """ | ||||
| 
 | ||||
|     if fformat == 'NLLoc': | ||||
|         print ("Writing phases to %s for NLLoc" % filename) | ||||
|         fid = open("%s" % filename, 'w') | ||||
|         # write header | ||||
|         fid.write('# EQEVENT:  Label: EQ001  Loc:  X 0.00  Y 0.00  Z 10.00  OT 0.00 \n') | ||||
|         for key in arrivals: | ||||
|             # P onsets | ||||
|             if arrivals[key]['P']: | ||||
|                 try: | ||||
|                     fm = arrivals[key]['P']['fm'] | ||||
|                 except KeyError as e: | ||||
|                     print(e) | ||||
|                     fm = None | ||||
|                 if fm is None: | ||||
|                     fm = '?' | ||||
|                 onset = arrivals[key]['P']['mpp'] | ||||
|                 year = onset.year | ||||
|                 month = onset.month | ||||
|                 day = onset.day | ||||
|                 hh = onset.hour | ||||
|                 mm = onset.minute | ||||
|                 ss = onset.second | ||||
|                 ms = onset.microsecond | ||||
|                 ss_ms = ss + ms / 1000000.0 | ||||
|                 pweight = 1 # use pick | ||||
|                 try: | ||||
|                     if arrivals[key]['P']['weight'] >= 4: | ||||
|                         pweight = 0  # do not use pick | ||||
|                 except KeyError as e: | ||||
|                     print(str(e) + '; no weight set during processing') | ||||
|                 fid.write('%s ? ? ? P   %s %d%02d%02d %02d%02d %7.4f GAU 0 0 0 0 %d \n' % (key, | ||||
|                                                                                            fm, | ||||
|                                                                                            year, | ||||
|                                                                                            month, | ||||
|                                                                                            day, | ||||
|                                                                                            hh, | ||||
|                                                                                            mm, | ||||
|                                                                                            ss_ms, | ||||
|                                                                                            pweight)) | ||||
|             # S onsets | ||||
|             if 'S' in arrivals[key] and arrivals[key]['S']: | ||||
|                 fm = '?' | ||||
|                 onset = arrivals[key]['S']['mpp'] | ||||
|                 year = onset.year | ||||
|                 month = onset.month | ||||
|                 day = onset.day | ||||
|                 hh = onset.hour | ||||
|                 mm = onset.minute | ||||
|                 ss = onset.second | ||||
|                 ms = onset.microsecond | ||||
|                 ss_ms = ss + ms / 1000000.0 | ||||
|                 sweight = 1 # use pick | ||||
|                 try: | ||||
|                     if arrivals[key]['S']['weight'] >= 4: | ||||
|                         sweight = 0  # do not use pick | ||||
|                 except KeyError as e: | ||||
|                     print(str(e) + '; no weight set during processing') | ||||
|                 fid.write('%s ? ? ? S   %s %d%02d%02d %02d%02d %7.4f GAU 0 0 0 0 %d \n' % (key, | ||||
|                                                                                            fm, | ||||
|                                                                                            year, | ||||
|                                                                                            month, | ||||
|                                                                                            day, | ||||
|                                                                                            hh, | ||||
|                                                                                            mm, | ||||
|                                                                                            ss_ms, | ||||
|                                                                                            sweight)) | ||||
| 
 | ||||
|         fid.close() | ||||
|     elif fformat == 'HYPO71': | ||||
|         print ("Writing phases to %s for HYPO71" % filename) | ||||
|         fid = open("%s" % filename, 'w') | ||||
|         # write header | ||||
|         fid.write('                                                              EQ001\n') | ||||
|         for key in arrivals: | ||||
|             if arrivals[key]['P']['weight'] < 4: | ||||
|                 Ponset = arrivals[key]['P']['mpp'] | ||||
|                 Sonset = arrivals[key]['S']['mpp'] | ||||
|                 pweight = arrivals[key]['P']['weight'] | ||||
|                 sweight = arrivals[key]['S']['weight'] | ||||
|                 fm = arrivals[key]['P']['fm'] | ||||
|                 if fm is None: | ||||
|                     fm = '-' | ||||
|                 Ao = arrivals[key]['S']['Ao'] | ||||
|                 if Ao is None: | ||||
|                     Ao = '' | ||||
|                 else: | ||||
|                     Ao = str('%7.2f' % Ao) | ||||
|                 year = Ponset.year | ||||
|                 if year >= 2000: | ||||
|                     year = year - 2000 | ||||
|                 else: | ||||
|                     year = year - 1900 | ||||
|                 month = Ponset.month | ||||
|                 day = Ponset.day | ||||
|                 hh = Ponset.hour | ||||
|                 mm = Ponset.minute | ||||
|                 ss = Ponset.second | ||||
|                 ms = Ponset.microsecond | ||||
|                 ss_ms = ss + ms / 1000000.0 | ||||
|                 if pweight < 2: | ||||
|                     pstr = 'I' | ||||
|                 elif pweight >= 2: | ||||
|                     pstr = 'E' | ||||
|                 if arrivals[key]['S']['weight'] < 4: | ||||
|                     Sss = Sonset.second | ||||
|                     Sms = Sonset.microsecond | ||||
|                     Sss_ms = Sss + Sms / 1000000.0 | ||||
|                     Sss_ms = str('%5.02f' % Sss_ms) | ||||
|                     if sweight < 2: | ||||
|                         sstr = 'I' | ||||
|                     elif sweight >= 2: | ||||
|                         sstr = 'E' | ||||
|                     fid.write('%s%sP%s%d %02d%02d%02d%02d%02d%5.2f       %s%sS %d   %s\n' % (key, | ||||
|                                                                                              pstr, | ||||
|                                                                                              fm, | ||||
|                                                                                              pweight, | ||||
|                                                                                              year, | ||||
|                                                                                              month, | ||||
|                                                                                              day, | ||||
|                                                                                              hh, | ||||
|                                                                                              mm, | ||||
|                                                                                              ss_ms, | ||||
|                                                                                              Sss_ms, | ||||
|                                                                                              sstr, | ||||
|                                                                                              sweight, | ||||
|                                                                                              Ao)) | ||||
|                 else: | ||||
|                     fid.write('%s%sP%s%d %02d%02d%02d%02d%02d%5.2f                  %s\n' % (key, | ||||
|                                                                                              pstr, | ||||
|                                                                                              fm, | ||||
|                                                                                              pweight, | ||||
|                                                                                              year, | ||||
|                                                                                              month, | ||||
|                                                                                              day, | ||||
|                                                                                              hh, | ||||
|                                                                                              mm, | ||||
|                                                                                              ss_ms, | ||||
|                                                                                              Ao)) | ||||
| 
 | ||||
|         fid.close() | ||||
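To make the expected layout of <arrivals> concrete, here is a minimal sketch with a single hypothetical station; station name, onset times, and the output file name are made up:

    from obspy import UTCDateTime

    from pylot.core.io.phases import writephases

    arrivals = {'ST01': {'P': {'mpp': UTCDateTime(2015, 3, 10, 12, 0, 1, 250000),
                               'fm': 'U', 'weight': 0},
                         'S': {'mpp': UTCDateTime(2015, 3, 10, 12, 0, 3, 500000),
                               'weight': 1}}}
    writephases(arrivals, 'NLLoc', 'e0001.obs')
    # e0001.obs then contains, after the EQEVENT header:
    # ST01 ? ? ? P   U 20150310 1200  1.2500 GAU 0 0 0 0 1
    # ST01 ? ? ? S   ? 20150310 1200  3.5000 GAU 0 0 0 0 1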
| 
 | ||||
| 
 | ||||
| def merge_picks(event, picks): | ||||
|     """ | ||||
|     Takes an event object and a list of picks, searches for matching | ||||
|     entries by comparing station name and phase_hint, and overwrites the | ||||
|     time and time_errors values of the event's picks with those from the | ||||
|     given picks without changing the resource identifiers. | ||||
|     :param event: `obspy.core.event.Event` object (e.g. from NLLoc output) | ||||
|     :param picks: list of `obspy.core.event.Pick` objects containing the | ||||
|     original time and time_errors values | ||||
|     :return: merged `obspy.core.event.Event` object | ||||
|     """ | ||||
|     for pick in picks: | ||||
|         time = pick.time | ||||
|         err = pick.time_errors | ||||
|         phase = pick.phase_hint | ||||
|         station = pick.waveform_id.station_code | ||||
|         for p in event.picks: | ||||
|             if p.waveform_id.station_code == station and p.phase_hint == phase: | ||||
|                 p.time, p.time_errors = time, err | ||||
|         del time, err, phase, station | ||||
|     return event | ||||
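A short usage sketch for merge_picks, e.g. to restore the full-precision pick times after a NonLinLoc run; both file names are hypothetical:

    from obspy import read_events

    from pylot.core.io.phases import merge_picks

    event = read_events('loc/e0001.grid0.loc.hyp')[0]    # relocated event
    original = read_events('picks/e0001.xml')[0].picks   # original picks
    event = merge_picks(event, original)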
							
								
								
									
pylot/core/loc/__init__.py (new file, 2 lines)
| @@ -0,0 +1,2 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
							
								
								
									
pylot/core/loc/hsat.py (new file, 21 lines)
| @@ -0,0 +1,21 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| from pylot.core.io.phases import writephases | ||||
| from pylot.core.util.version import get_git_version as _getVersionString | ||||
| 
 | ||||
| __version__ = _getVersionString() | ||||
| 
 | ||||
| def export(picks, fnout): | ||||
|     ''' | ||||
|     Takes a <picks> dictionary and exports the picking data to a | ||||
|     HYPO71 phase file <phasefile> without creating an ObsPy event object. | ||||
| 
 | ||||
|     :param picks: picking data dictionary | ||||
|     :type picks: dict | ||||
| 
 | ||||
|     :param fnout: complete path to the exporting obs file | ||||
|     :type fnout: str | ||||
|     ''' | ||||
|     # write phases to HYPO71 phase file | ||||
|     writephases(picks, 'HYPO71', fnout) | ||||
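A minimal usage sketch, assuming a picks dictionary in the layout expected by writephases; note that the HYPO71 writer additionally requires a first motion 'fm' for P and an amplitude 'Ao' for S (station, times, and paths are hypothetical):

    from obspy import UTCDateTime

    from pylot.core.loc import hsat

    picks = {'ST01': {'P': {'mpp': UTCDateTime(2015, 3, 10, 12, 0, 1),
                            'fm': 'U', 'weight': 0},
                      'S': {'mpp': UTCDateTime(2015, 3, 10, 12, 0, 3),
                            'weight': 1, 'Ao': 1.2}}}
    hsat.export(picks, '/tmp/e0001.pha')  # writes a HYPO71 phase file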
							
								
								
									
pylot/core/loc/nll.py (new file, 99 lines)
| @@ -0,0 +1,99 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import subprocess | ||||
| import os | ||||
| import glob | ||||
| from obspy import read_events | ||||
| from pylot.core.io.phases import writephases | ||||
| from pylot.core.util.utils import getPatternLine, runProgram, which | ||||
| from pylot.core.util.version import get_git_version as _getVersionString | ||||
| 
 | ||||
| __version__ = _getVersionString() | ||||
| 
 | ||||
| class NLLocError(EnvironmentError): | ||||
|     pass | ||||
| 
 | ||||
| def export(picks, fnout): | ||||
|     ''' | ||||
|     Takes a <picks> dictionary and exports the picking data to a | ||||
|     NLLoc-obs phase file <phasefile> without creating an ObsPy event object. | ||||
| 
 | ||||
|     :param picks: picking data dictionary | ||||
|     :type picks: dict | ||||
| 
 | ||||
|     :param fnout: complete path to the exporting obs file | ||||
|     :type fnout: str | ||||
|     ''' | ||||
|     # write phases to NLLoc-phase file | ||||
|     writephases(picks, 'NLLoc', fnout) | ||||
| 
 | ||||
| 
 | ||||
| def modify_inputs(ctrfn, root, nllocoutn, phasefn, tttn): | ||||
|     ''' | ||||
|     :param ctrfn: name of the NLLoc control file | ||||
|     :type ctrfn: str | ||||
| 
|     :param root: root path of the NLLoc working directory | ||||
|     :type root: str | ||||
| 
|     :param nllocoutn: name of the NLLoc location output file | ||||
|     :type nllocoutn: str | ||||
| 
|     :param phasefn: name of the NLLoc input phase file | ||||
|     :type phasefn: str | ||||
| 
|     :param tttn: pattern of precalculated NLLoc travel-time tables | ||||
|     :type tttn: str | ||||
|     ''' | ||||
|     # For locating the event the NLLoc-control file has to be modified! | ||||
|     # create comment line for NLLoc-control file NLLoc-output file | ||||
|     ctrfile = os.path.join(root, 'run', ctrfn) | ||||
|     nllocout = os.path.join(root, 'loc', nllocoutn) | ||||
|     phasefile = os.path.join(root, 'obs', phasefn) | ||||
|     tttable = os.path.join(root, 'time', tttn) | ||||
|     locfiles = 'LOCFILES %s NLLOC_OBS %s %s 0\n' % (phasefile, tttable, nllocout) | ||||
| 
 | ||||
|     # modification of NLLoc-control file | ||||
|     print ("Modifying  NLLoc-control file %s ..." % ctrfile) | ||||
|     curlocfiles = getPatternLine(ctrfile, 'LOCFILES') | ||||
|     with open(ctrfile, 'r') as nllfile: | ||||
|         filedata = nllfile.read() | ||||
|     if locfiles not in filedata: | ||||
|         # replace the old LOCFILES command with the new one | ||||
|         filedata = filedata.replace(curlocfiles, locfiles) | ||||
|         with open(ctrfile, 'w') as nllfile: | ||||
|             nllfile.write(filedata) | ||||
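For illustration, with hypothetical file and directory names the call and the rewritten LOCFILES statement would look like this:

    from pylot.core.loc.nll import modify_inputs

    modify_inputs('pylot_sample.in', '/data/NLLoc', 'e0001',
                  'e0001.obs', 'ttime/pylot')
    # rewrites the LOCFILES line in /data/NLLoc/run/pylot_sample.in to:
    # LOCFILES /data/NLLoc/obs/e0001.obs NLLOC_OBS /data/NLLoc/time/ttime/pylot /data/NLLoc/loc/e0001 0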
| 
 | ||||
| 
 | ||||
| def locate(fnin): | ||||
|     """ | ||||
|     Locates the event by running an external NonLinLoc installation | ||||
|     on the given control file. | ||||
|     :param fnin: full path to the NLLoc control file | ||||
|     :type fnin: str | ||||
|     """ | ||||
| 
 | ||||
|     exe_path = which('NLLoc') | ||||
|     if exe_path is None: | ||||
|         raise NLLocError('NonLinLoc executable not found; check your ' | ||||
|                          'environment variables') | ||||
| 
 | ||||
|     # locate the event utilizing external NonLinLoc installation | ||||
|     try: | ||||
|         runProgram(exe_path, fnin) | ||||
|     except subprocess.CalledProcessError as e: | ||||
|         raise RuntimeError(e.output) | ||||
| 
 | ||||
| 
 | ||||
| def read_location(fn): | ||||
|     # find the (unique) NLLoc hypocenter-phase file for this location run | ||||
|     path, fname = os.path.split(fn) | ||||
|     matches = glob.glob1(path, fname + '.[0-9]*.grid0.loc.hyp') | ||||
|     if not matches: | ||||
|         raise IOError('no location file found matching {0}'.format(fn)) | ||||
|     if len(matches) > 1: | ||||
|         raise IOError('ambiguous location name {0}'.format(matches)) | ||||
|     fn = os.path.join(path, matches[0]) | ||||
|     return read_events(fn)[0] | ||||
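Putting the module together, a hypothetical end-to-end run might look as follows (station name, onset time, and all paths are made up):

    from obspy import UTCDateTime

    from pylot.core.loc import nll

    picks = {'ST01': {'P': {'mpp': UTCDateTime(2015, 3, 10, 12, 0, 1),
                            'fm': 'U', 'weight': 0}}}
    nll.export(picks, '/data/NLLoc/obs/e0001.obs')           # write phase file
    nll.modify_inputs('pylot_sample.in', '/data/NLLoc',      # point control file
                      'e0001', 'e0001.obs', 'ttime/pylot')   # at the new data
    nll.locate('/data/NLLoc/run/pylot_sample.in')            # run NonLinLoc
    event = nll.read_location('/data/NLLoc/loc/e0001')       # read hypocenter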
| 
 | ||||
| 
 | ||||
| if __name__ == '__main__': | ||||
|     pass | ||||
							
								
								
									
pylot/core/loc/velest.py (new file, 2 lines)
| @@ -0,0 +1,2 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
							
								
								
									
pylot/core/pick/__init__.py (new file, 2 lines)
| @@ -0,0 +1,2 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| # | ||||
							
								
								
									
pylot/core/pick/autopick.py (new executable file, 882 lines)
| @@ -0,0 +1,882 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| """ | ||||
| Functions to run automated picking algorithms using AIC, | ||||
| HOS and AR prediction. Uses the CharacteristicFunction and Picker | ||||
| classes as well as the utility functions of pylot.core.pick.utils. | ||||
| 
 | ||||
| :author: MAGS2 EP3 working group / Ludger Kueperkoch | ||||
| """ | ||||
| 
 | ||||
| import matplotlib.pyplot as plt | ||||
| import numpy as np | ||||
| from pylot.core.io.inputs import AutoPickParameter | ||||
| from pylot.core.pick.picker import AICPicker, PragPicker | ||||
| from pylot.core.pick.charfuns import CharacteristicFunction | ||||
| from pylot.core.pick.charfuns import HOScf, AICcf, ARZcf, ARHcf, AR3Ccf | ||||
| from pylot.core.pick.utils import checksignallength, checkZ4S, earllatepicker, \ | ||||
|     getSNR, fmpicker, checkPonsets, wadaticheck | ||||
| from pylot.core.util.utils import getPatternLine | ||||
| from pylot.core.io.data import Data | ||||
| 
 | ||||
| 
 | ||||
| def autopickevent(data, param): | ||||
|     stations = [] | ||||
|     all_onsets = {} | ||||
| 
 | ||||
|     # get some parameters for quality control from | ||||
|     # parameter input file (usually autoPyLoT.in). | ||||
|     wdttolerance = param.get('wdttolerance') | ||||
|     mdttolerance = param.get('mdttolerance') | ||||
|     iplot = param.get('iplot') | ||||
|     apverbose = param.get('apverbose') | ||||
|     for trace in data: | ||||
|         station = trace.stats.station | ||||
|         if station not in stations: | ||||
|             stations.append(station) | ||||
| 
 | ||||
|     for station in stations: | ||||
|         topick = data.select(station=station) | ||||
|         all_onsets[station] = autopickstation(topick, param, verbose=apverbose) | ||||
| 
 | ||||
|     # quality control | ||||
|     # median check and jackknife on P-onset times | ||||
|     jk_checked_onsets = checkPonsets(all_onsets, mdttolerance, iplot) | ||||
|     # check S-P times (Wadati) | ||||
|     return wadaticheck(jk_checked_onsets, wdttolerance, iplot) | ||||
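A sketch of how autopickevent might be driven; the waveform path and parameter file are hypothetical, and AutoPickParameter is instantiated with the parameter file name and a verbosity level, as in reassess_pilot_event above:

    from obspy import read

    from pylot.core.io.inputs import AutoPickParameter
    from pylot.core.pick.autopick import autopickevent

    param = AutoPickParameter('autoPyLoT.in', 1)  # parameter file, verbosity
    data = read('/data/e0001/*.mseed')            # all traces of one event
    all_picks = autopickevent(data, param)        # dict: station -> onsets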
| 
 | ||||
| 
 | ||||
| def autopickstation(wfstream, pickparam, verbose=False): | ||||
|     """ | ||||
|     :param wfstream: `~obspy.core.stream.Stream`  containing waveform | ||||
|     :type wfstream: obspy.core.stream.Stream | ||||
| 
 | ||||
|     :param pickparam: container of picking parameters from input file, | ||||
|            usually autoPyLoT.in | ||||
|     :type pickparam: AutoPickParameter | ||||
|     :param verbose: flag to print detailed processing messages | ||||
|     :type verbose: bool | ||||
| 
 | ||||
|     """ | ||||
| 
 | ||||
|     # declaring pickparam variables (only for convenience) | ||||
|     # read your autoPyLoT.in for details! | ||||
| 
 | ||||
|     # special parameters for P picking | ||||
|     algoP = pickparam.get('algoP') | ||||
|     iplot = pickparam.get('iplot') | ||||
|     pstart = pickparam.get('pstart') | ||||
|     pstop = pickparam.get('pstop') | ||||
|     thosmw = pickparam.get('tlta') | ||||
|     tsnrz = pickparam.get('tsnrz') | ||||
|     hosorder = pickparam.get('hosorder') | ||||
|     bpz1 = pickparam.get('bpz1') | ||||
|     bpz2 = pickparam.get('bpz2') | ||||
|     pickwinP = pickparam.get('pickwinP') | ||||
|     tsmoothP = pickparam.get('tsmoothP') | ||||
|     ausP = pickparam.get('ausP') | ||||
|     nfacP = pickparam.get('nfacP') | ||||
|     tpred1z = pickparam.get('tpred1z') | ||||
|     tdet1z = pickparam.get('tdet1z') | ||||
|     Parorder = pickparam.get('Parorder') | ||||
|     addnoise = pickparam.get('addnoise') | ||||
|     Precalcwin = pickparam.get('Precalcwin') | ||||
|     minAICPslope = pickparam.get('minAICPslope') | ||||
|     minAICPSNR = pickparam.get('minAICPSNR') | ||||
|     timeerrorsP = pickparam.get('timeerrorsP') | ||||
|     # special parameters for S picking | ||||
|     algoS = pickparam.get('algoS') | ||||
|     sstart = pickparam.get('sstart') | ||||
|     sstop = pickparam.get('sstop') | ||||
|     bph1 = pickparam.get('bph1') | ||||
|     bph2 = pickparam.get('bph2') | ||||
|     tsnrh = pickparam.get('tsnrh') | ||||
|     pickwinS = pickparam.get('pickwinS') | ||||
|     tpred1h = pickparam.get('tpred1h') | ||||
|     tdet1h = pickparam.get('tdet1h') | ||||
|     tpred2h = pickparam.get('tpred2h') | ||||
|     tdet2h = pickparam.get('tdet2h') | ||||
|     Sarorder = pickparam.get('Sarorder') | ||||
|     aictsmoothS = pickparam.get('aictsmoothS') | ||||
|     tsmoothS = pickparam.get('tsmoothS') | ||||
|     ausS = pickparam.get('ausS') | ||||
|     minAICSslope = pickparam.get('minAICSslope') | ||||
|     minAICSSNR = pickparam.get('minAICSSNR') | ||||
|     Srecalcwin = pickparam.get('Srecalcwin') | ||||
|     nfacS = pickparam.get('nfacS') | ||||
|     timeerrorsS = pickparam.get('timeerrorsS') | ||||
|     # parameters for first-motion determination | ||||
|     minFMSNR = pickparam.get('minFMSNR') | ||||
|     fmpickwin = pickparam.get('fmpickwin') | ||||
|     minfmweight = pickparam.get('minfmweight') | ||||
|     # parameters for checking signal length | ||||
|     minsiglength = pickparam.get('minsiglength') | ||||
|     minpercent = pickparam.get('minpercent') | ||||
|     nfacsl = pickparam.get('noisefactor') | ||||
|     # parameter to check for spuriously picked S onset | ||||
|     zfac = pickparam.get('zfac') | ||||
|     # path to inventory-, dataless- or resp-files | ||||
| 
 | ||||
|     # initialize output | ||||
|     Pweight = 4  # weight for P onset | ||||
|     Sweight = 4  # weight for S onset | ||||
|     FM = 'N'  # first motion (polarity) | ||||
|     SNRP = None  # signal-to-noise ratio of P onset | ||||
|     SNRPdB = None  # signal-to-noise ratio of P onset [dB] | ||||
|     SNRS = None  # signal-to-noise ratio of S onset | ||||
|     SNRSdB = None  # signal-to-noise ratio of S onset [dB] | ||||
|     mpickP = None  # most likely P onset | ||||
|     lpickP = None  # latest possible P onset | ||||
|     epickP = None  # earliest possible P onset | ||||
|     mpickS = None  # most likely S onset | ||||
|     lpickS = None  # latest possible S onset | ||||
|     epickS = None  # earliest possible S onset | ||||
|     Perror = None  # symmetrized picking error P onset | ||||
|     Serror = None  # symmetrized picking error S onset | ||||
| 
 | ||||
|     aicSflag = 0 | ||||
|     aicPflag = 0 | ||||
|     Pflag = 0 | ||||
|     Sflag = 0 | ||||
|     Pmarker = [] | ||||
|     Ao = None  # Wood-Anderson peak-to-peak amplitude | ||||
|     picker = 'autoPyLoT'  # name of the picking program | ||||
| 
 | ||||
|     # split components | ||||
|     zdat = wfstream.select(component="Z") | ||||
|     if len(zdat) == 0:  # check for other components | ||||
|         zdat = wfstream.select(component="3") | ||||
|     edat = wfstream.select(component="E") | ||||
|     if len(edat) == 0:  # check for other components | ||||
|         edat = wfstream.select(component="2") | ||||
|     ndat = wfstream.select(component="N") | ||||
|     if len(ndat) == 0:  # check for other components | ||||
|         ndat = wfstream.select(component="1") | ||||
| 
 | ||||
|     if (algoP == 'HOS' or algoP == 'ARZ') and len(zdat) > 0: | ||||
|         msg = '##########################################\nautopickstation:' \ | ||||
|               ' Working on P onset of station {station}\nFiltering vertical ' \ | ||||
|               'trace ...\n{data}'.format(station=zdat[0].stats.station, | ||||
|                                          data=str(zdat)) | ||||
|         if verbose: print(msg) | ||||
|         z_copy = zdat.copy() | ||||
|         # filter and taper data | ||||
|         tr_filt = zdat[0].copy() | ||||
|         tr_filt.filter('bandpass', freqmin=bpz1[0], freqmax=bpz1[1], | ||||
|                        zerophase=False) | ||||
|         tr_filt.taper(max_percentage=0.05, type='hann') | ||||
|         z_copy[0].data = tr_filt.data | ||||
|         ############################################################## | ||||
|         # check length of waveform and compare with cut times | ||||
|         Lc = pstop - pstart | ||||
|         Lwf = zdat[0].stats.endtime - zdat[0].stats.starttime | ||||
|         Ldiff = Lwf - Lc | ||||
|         if Ldiff < 0: | ||||
|             msg = 'autopickstation: Cutting times are too large for actual ' \ | ||||
|                   'waveform!\nUsing entire waveform instead!' | ||||
|             if verbose: print(msg) | ||||
|             pstart = 0 | ||||
|             pstop = len(zdat[0].data) * zdat[0].stats.delta | ||||
|         cuttimes = [pstart, pstop] | ||||
|         cf1 = None | ||||
|         if algoP == 'HOS': | ||||
|             # calculate HOS-CF using subclass HOScf of class | ||||
|             # CharacteristicFunction | ||||
|             cf1 = HOScf(z_copy, cuttimes, thosmw, hosorder)  # instance of HOScf | ||||
|         elif algoP == 'ARZ': | ||||
|             # calculate ARZ-CF using subclass ARZcf of class | ||||
|             # CharacteristicFunction | ||||
|             cf1 = ARZcf(z_copy, cuttimes, tpred1z, Parorder, tdet1z, | ||||
|                         addnoise)  # instance of ARZcf | ||||
|         ############################################################## | ||||
|         # calculate AIC-HOS-CF using subclass AICcf of class | ||||
|         # CharacteristicFunction | ||||
|         # class needs stream object => build it | ||||
|         assert isinstance(cf1, CharacteristicFunction), 'cf1 is not set ' \ | ||||
|                                                         'correctly: maybe the algorithm name ({algoP}) is ' \ | ||||
|                                                         'corrupted'.format( | ||||
|             algoP=algoP) | ||||
|         tr_aic = tr_filt.copy() | ||||
|         tr_aic.data = cf1.getCF() | ||||
|         z_copy[0].data = tr_aic.data | ||||
|         aiccf = AICcf(z_copy, cuttimes)  # instance of AICcf | ||||
|         ############################################################## | ||||
|         # get preliminary onset time from AIC-HOS-CF using subclass AICPicker | ||||
|         # of class AutoPicking | ||||
|         aicpick = AICPicker(aiccf, tsnrz, pickwinP, iplot, None, tsmoothP) | ||||
|         ############################################################## | ||||
|         if aicpick.getpick() is not None: | ||||
|             # check signal length to detect spuriously picked noise peaks | ||||
|             # use all available components to avoid skipping correct picks | ||||
|             # on vertical traces with weak P coda | ||||
|             z_copy[0].data = tr_filt.data | ||||
|             zne = z_copy | ||||
|             if len(ndat) == 0 or len(edat) == 0: | ||||
|                 msg = 'One or more horizontal component(s) missing!\nSignal ' \ | ||||
|                       'length only checked on vertical component!\n' \ | ||||
|                       'Decreasing minsiglength from {0} to ' \ | ||||
|                       '{1}'.format(minsiglength, minsiglength / 2) | ||||
|                 if verbose: print(msg) | ||||
|                 Pflag = checksignallength(zne, aicpick.getpick(), tsnrz, | ||||
|                                           minsiglength / 2, | ||||
|                                           nfacsl, minpercent, iplot) | ||||
|             else: | ||||
|                 # filter and taper horizontal traces | ||||
|                 trH1_filt = edat.copy() | ||||
|                 trH2_filt = ndat.copy() | ||||
|                 trH1_filt.filter('bandpass', freqmin=bph1[0], | ||||
|                                  freqmax=bph1[1], | ||||
|                                  zerophase=False) | ||||
|                 trH2_filt.filter('bandpass', freqmin=bph1[0], | ||||
|                                  freqmax=bph1[1], | ||||
|                                  zerophase=False) | ||||
|                 trH1_filt.taper(max_percentage=0.05, type='hann') | ||||
|                 trH2_filt.taper(max_percentage=0.05, type='hann') | ||||
|                 zne += trH1_filt | ||||
|                 zne += trH2_filt | ||||
|                 Pflag = checksignallength(zne, aicpick.getpick(), tsnrz, | ||||
|                                           minsiglength, | ||||
|                                           nfacsl, minpercent, iplot) | ||||
| 
 | ||||
|             if Pflag == 1: | ||||
|                 # check for spuriously picked S onset | ||||
|                 # both horizontal traces needed | ||||
|                 if len(ndat) == 0 or len(edat) == 0: | ||||
|                     msg = 'One or more horizontal components missing!\n' \ | ||||
|                           'Skipping control function checkZ4S.' | ||||
|                     if verbose: print(msg) | ||||
|                 else: | ||||
|                     Pflag = checkZ4S(zne, aicpick.getpick(), zfac, | ||||
|                                      tsnrz[3], iplot) | ||||
|                     if Pflag == 0: | ||||
|                         Pmarker = 'SinsteadP' | ||||
|                         Pweight = 9 | ||||
|             else: | ||||
|                 Pmarker = 'shortsignallength' | ||||
|                 Pweight = 9 | ||||
|         ############################################################## | ||||
|         # go on with processing if AIC onset passes quality control | ||||
|         if (aicpick.getSlope() >= minAICPslope and | ||||
|                     aicpick.getSNR() >= minAICPSNR and Pflag == 1): | ||||
|             aicPflag = 1 | ||||
|             msg = 'AIC P-pick passes quality control: Slope: {0} counts/s, ' \ | ||||
|                   'SNR: {1}\nGo on with refined picking ...\n' \ | ||||
|                   'autopickstation: re-filtering vertical trace ' \ | ||||
|                   '...'.format(aicpick.getSlope(), aicpick.getSNR()) | ||||
|             if verbose: print(msg) | ||||
|             # re-filter waveform with larger bandpass | ||||
|             z_copy = zdat.copy() | ||||
|             tr_filt = zdat[0].copy() | ||||
|             tr_filt.filter('bandpass', freqmin=bpz2[0], freqmax=bpz2[1], | ||||
|                            zerophase=False) | ||||
|             tr_filt.taper(max_percentage=0.05, type='hann') | ||||
|             z_copy[0].data = tr_filt.data | ||||
|             ############################################################# | ||||
|             # re-calculate CF from re-filtered trace in vicinity of initial | ||||
|             # onset | ||||
|             cuttimes2 = [round(max([aicpick.getpick() - Precalcwin, 0])), | ||||
|                          round(min([len(zdat[0].data) * zdat[0].stats.delta, | ||||
|                                     aicpick.getpick() + Precalcwin]))] | ||||
|             cf2 = None | ||||
|             if algoP == 'HOS': | ||||
|                 # calculate HOS-CF using subclass HOScf of class | ||||
|                 # CharacteristicFunction | ||||
|                 cf2 = HOScf(z_copy, cuttimes2, thosmw, | ||||
|                             hosorder)  # instance of HOScf | ||||
|             elif algoP == 'ARZ': | ||||
|                 # calculate ARZ-CF using subclass ARZcf of class | ||||
|                 # CharacteristicFunction | ||||
|                 cf2 = ARZcf(z_copy, cuttimes2, tpred1z, Parorder, tdet1z, | ||||
|                             addnoise)  # instance of ARZcf | ||||
|             ############################################################## | ||||
|             # get refined onset time from CF2 using class Picker | ||||
|             assert isinstance(cf2, CharacteristicFunction), 'cf2 is not set ' \ | ||||
|                                                             'correctly: maybe the algorithm name ({algoP}) is ' \ | ||||
|                                                             'corrupted'.format( | ||||
|                 algoP=algoP) | ||||
|             refPpick = PragPicker(cf2, tsnrz, pickwinP, iplot, ausP, tsmoothP, | ||||
|                                   aicpick.getpick()) | ||||
|             mpickP = refPpick.getpick() | ||||
|             ############################################################# | ||||
|             if mpickP is not None: | ||||
|                 # quality assessment | ||||
|                 # get earliest/latest possible pick and symmetrized uncertainty | ||||
|                 [epickP, lpickP, Perror] = earllatepicker(z_copy, nfacP, tsnrz, | ||||
|                                                           mpickP, iplot) | ||||
| 
 | ||||
|                 # get SNR | ||||
|                 [SNRP, SNRPdB, Pnoiselevel] = getSNR(z_copy, tsnrz, mpickP) | ||||
| 
 | ||||
|                 # weight P-onset using symmetric error | ||||
|                 if Perror <= timeerrorsP[0]: | ||||
|                     Pweight = 0 | ||||
|                 elif timeerrorsP[0] < Perror <= timeerrorsP[1]: | ||||
|                     Pweight = 1 | ||||
|                 elif timeerrorsP[1] < Perror <= timeerrorsP[2]: | ||||
|                     Pweight = 2 | ||||
|                 elif timeerrorsP[2] < Perror <= timeerrorsP[3]: | ||||
|                     Pweight = 3 | ||||
|                 elif Perror > timeerrorsP[3]: | ||||
|                     Pweight = 4 | ||||
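|                 # e.g. with timeerrorsP = (0.02, 0.04, 0.08, 0.16) s | ||||
|                 # (hypothetical values), a symmetrized error of 0.05 s | ||||
|                 # falls between timeerrorsP[1] and timeerrorsP[2] and | ||||
|                 # therefore yields Pweight = 2 | ||||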
| 
 | ||||
|                 ############################################################## | ||||
|                 # get first motion of P onset | ||||
|                 # certain quality required | ||||
|                 if Pweight <= minfmweight and SNRP >= minFMSNR: | ||||
|                     FM = fmpicker(zdat, z_copy, fmpickwin, mpickP, iplot) | ||||
|                 else: | ||||
|                     FM = 'N' | ||||
| 
 | ||||
|                 msg = "autopickstation: P-weight: {0}, " \ | ||||
|                       "SNR: {1}, SNR[dB]: {2}, Polarity: {3}".format(Pweight, | ||||
|                                                                      SNRP, | ||||
|                                                                      SNRPdB, | ||||
|                                                                      FM) | ||||
|                 print(msg) | ||||
|                 Sflag = 1 | ||||
| 
 | ||||
|         else: | ||||
|             msg = 'Bad initial (AIC) P-pick, skipping this onset!\n' \ | ||||
|                   'AIC-SNR={0}, AIC-Slope={1}counts/s\n' \ | ||||
|                   '(min. AIC-SNR={2}, ' \ | ||||
|                   'min. AIC-Slope={3}counts/s)'.format(aicpick.getSNR(), | ||||
|                                                        aicpick.getSlope(), | ||||
|                                                        minAICPSNR, | ||||
|                                                        minAICPslope) | ||||
|             if verbose: print(msg) | ||||
|             Sflag = 0 | ||||
| 
 | ||||
|     else: | ||||
|         print('autopickstation: No vertical component data available, ' | ||||
|               'skipping station!') | ||||
| 
 | ||||
|     if edat is not None and ndat is not None and len(edat) > 0 and len( | ||||
|             ndat) > 0 and Pweight < 4: | ||||
|         msg = 'Go on picking S onset ...\n' \ | ||||
|               '##################################################\n' \ | ||||
|               'Working on S onset of station {0}\nFiltering horizontal ' \ | ||||
|               'traces ...'.format(edat[0].stats.station) | ||||
|         if verbose: print(msg) | ||||
|         # determine time window for calculating CF after P onset | ||||
|         cuttimesh = [round(max([mpickP + sstart, 0])), | ||||
|                      round(min([mpickP + sstop, Lwf]))] | ||||
| 
 | ||||
|         if algoS == 'ARH': | ||||
|             # re-create stream object including both horizontal components | ||||
|             hdat = edat.copy() | ||||
|             hdat += ndat | ||||
|             h_copy = hdat.copy() | ||||
|             # filter and taper data | ||||
|             trH1_filt = hdat[0].copy() | ||||
|             trH2_filt = hdat[1].copy() | ||||
|             trH1_filt.filter('bandpass', freqmin=bph1[0], freqmax=bph1[1], | ||||
|                              zerophase=False) | ||||
|             trH2_filt.filter('bandpass', freqmin=bph1[0], freqmax=bph1[1], | ||||
|                              zerophase=False) | ||||
|             trH1_filt.taper(max_percentage=0.05, type='hann') | ||||
|             trH2_filt.taper(max_percentage=0.05, type='hann') | ||||
|             h_copy[0].data = trH1_filt.data | ||||
|             h_copy[1].data = trH2_filt.data | ||||
|         elif algoS == 'AR3': | ||||
|             # re-create stream object including all components | ||||
|             hdat = zdat.copy() | ||||
|             hdat += edat | ||||
|             hdat += ndat | ||||
|             h_copy = hdat.copy() | ||||
|             # filter and taper data | ||||
|             trH1_filt = hdat[0].copy() | ||||
|             trH2_filt = hdat[1].copy() | ||||
|             trH3_filt = hdat[2].copy() | ||||
|             trH1_filt.filter('bandpass', freqmin=bph1[0], freqmax=bph1[1], | ||||
|                              zerophase=False) | ||||
|             trH2_filt.filter('bandpass', freqmin=bph1[0], freqmax=bph1[1], | ||||
|                              zerophase=False) | ||||
|             trH3_filt.filter('bandpass', freqmin=bph1[0], freqmax=bph1[1], | ||||
|                              zerophase=False) | ||||
|             trH1_filt.taper(max_percentage=0.05, type='hann') | ||||
|             trH2_filt.taper(max_percentage=0.05, type='hann') | ||||
|             trH3_filt.taper(max_percentage=0.05, type='hann') | ||||
|             h_copy[0].data = trH1_filt.data | ||||
|             h_copy[1].data = trH2_filt.data | ||||
|             h_copy[2].data = trH3_filt.data | ||||
|         ############################################################## | ||||
|         if algoS == 'ARH': | ||||
|             # calculate ARH-CF using subclass ARHcf of class | ||||
|             # CharacteristicFunction | ||||
|             arhcf1 = ARHcf(h_copy, cuttimesh, tpred1h, Sarorder, tdet1h, | ||||
|                            addnoise)  # instance of ARHcf | ||||
|         elif algoS == 'AR3': | ||||
|             # calculate AR3C-CF using subclass AR3Ccf of class | ||||
|             # CharacteristicFunction | ||||
|             arhcf1 = AR3Ccf(h_copy, cuttimesh, tpred1h, Sarorder, tdet1h, | ||||
|                             addnoise)  # instance of AR3Ccf | ||||
|         ############################################################## | ||||
|         # calculate AIC-ARH-CF using subclass AICcf of class | ||||
|         # CharacteristicFunction | ||||
|         # class needs stream object => build it | ||||
|         tr_arhaic = trH1_filt.copy() | ||||
|         tr_arhaic.data = arhcf1.getCF() | ||||
|         h_copy[0].data = tr_arhaic.data | ||||
|         # calculate ARH-AIC-CF | ||||
|         haiccf = AICcf(h_copy, cuttimesh)  # instance of AICcf | ||||
|         ############################################################## | ||||
|         # get preliminary onset time from AIC-ARH-CF using subclass AICPicker | ||||
|         # of class AutoPicking | ||||
|         aicarhpick = AICPicker(haiccf, tsnrh, pickwinS, iplot, None, | ||||
|                                aictsmoothS) | ||||
|         ############################################################### | ||||
|         # go on with processing if AIC onset passes quality control | ||||
|         if (aicarhpick.getSlope() >= minAICSslope and | ||||
|                     aicarhpick.getSNR() >= minAICSSNR and | ||||
|                     aicarhpick.getpick() is not None): | ||||
|             aicSflag = 1 | ||||
|             msg = 'AIC S-pick passes quality control: Slope: {0} counts/s, ' \ | ||||
|                   'SNR: {1}\nGo on with refined picking ...\n' \ | ||||
|                   'autopickstation: re-filtering horizontal traces ' \ | ||||
|                   '...'.format(aicarhpick.getSlope(), aicarhpick.getSNR()) | ||||
|             if verbose: print(msg) | ||||
|             # re-calculate CF from re-filtered trace in vicinity of initial | ||||
|             # onset | ||||
|             cuttimesh2 = [round(aicarhpick.getpick() - Srecalcwin), | ||||
|                           round(aicarhpick.getpick() + Srecalcwin)] | ||||
|             # re-filter waveform with larger bandpass | ||||
|             h_copy = hdat.copy() | ||||
|             # filter and taper data | ||||
|             if algoS == 'ARH': | ||||
|                 trH1_filt = hdat[0].copy() | ||||
|                 trH2_filt = hdat[1].copy() | ||||
|                 trH1_filt.filter('bandpass', freqmin=bph2[0], freqmax=bph2[1], | ||||
|                                  zerophase=False) | ||||
|                 trH2_filt.filter('bandpass', freqmin=bph2[0], freqmax=bph2[1], | ||||
|                                  zerophase=False) | ||||
|                 trH1_filt.taper(max_percentage=0.05, type='hann') | ||||
|                 trH2_filt.taper(max_percentage=0.05, type='hann') | ||||
|                 h_copy[0].data = trH1_filt.data | ||||
|                 h_copy[1].data = trH2_filt.data | ||||
|                 ############################################################# | ||||
|                 arhcf2 = ARHcf(h_copy, cuttimesh2, tpred2h, Sarorder, tdet2h, | ||||
|                                addnoise)  # instance of ARHcf | ||||
|             elif algoS == 'AR3': | ||||
|                 trH1_filt = hdat[0].copy() | ||||
|                 trH2_filt = hdat[1].copy() | ||||
|                 trH3_filt = hdat[2].copy() | ||||
|                 trH1_filt.filter('bandpass', freqmin=bph2[0], freqmax=bph2[1], | ||||
|                                  zerophase=False) | ||||
|                 trH2_filt.filter('bandpass', freqmin=bph2[0], freqmax=bph2[1], | ||||
|                                  zerophase=False) | ||||
|                 trH3_filt.filter('bandpass', freqmin=bph2[0], freqmax=bph2[1], | ||||
|                                  zerophase=False) | ||||
|                 trH1_filt.taper(max_percentage=0.05, type='hann') | ||||
|                 trH2_filt.taper(max_percentage=0.05, type='hann') | ||||
|                 trH3_filt.taper(max_percentage=0.05, type='hann') | ||||
|                 h_copy[0].data = trH1_filt.data | ||||
|                 h_copy[1].data = trH2_filt.data | ||||
|                 h_copy[2].data = trH3_filt.data | ||||
|                 ############################################################# | ||||
|                 arhcf2 = AR3Ccf(h_copy, cuttimesh2, tpred2h, Sarorder, tdet2h, | ||||
|                                 addnoise)  # instance of AR3Ccf | ||||
| 
 | ||||
|             # get refined onset time from CF2 using class Picker | ||||
|             refSpick = PragPicker(arhcf2, tsnrh, pickwinS, iplot, ausS, | ||||
|                                   tsmoothS, aicarhpick.getpick()) | ||||
|             mpickS = refSpick.getpick() | ||||
|             ############################################################# | ||||
|             if mpickS is not None: | ||||
|                 # quality assessment | ||||
|                 # get earliest/latest possible pick and symmetrized uncertainty | ||||
|                 h_copy[0].data = trH1_filt.data | ||||
|                 [epickS1, lpickS1, Serror1] = earllatepicker(h_copy, nfacS, | ||||
|                                                              tsnrh, | ||||
|                                                              mpickS, iplot) | ||||
| 
 | ||||
|                 h_copy[0].data = trH2_filt.data | ||||
|                 [epickS2, lpickS2, Serror2] = earllatepicker(h_copy, nfacS, | ||||
|                                                              tsnrh, | ||||
|                                                              mpickS, iplot) | ||||
|                 if epickS1 is not None and epickS2 is not None: | ||||
|                     if algoS == 'ARH': | ||||
|                         # get earliest pick of both earliest possible picks | ||||
|                         epick = [epickS1, epickS2] | ||||
|                         lpick = [lpickS1, lpickS2] | ||||
|                         pickerr = [Serror1, Serror2] | ||||
|                         if epickS1 is None and epickS2 is not None: | ||||
|                             ipick = 1 | ||||
|                         elif epickS1 is not None and epickS2 is None: | ||||
|                             ipick = 0 | ||||
|                         elif epickS1 is not None and epickS2 is not None: | ||||
|                             ipick = np.argmin([epickS1, epickS2]) | ||||
|                     elif algoS == 'AR3': | ||||
|                         [epickS3, lpickS3, Serror3] = earllatepicker(h_copy, | ||||
|                                                                      nfacS, | ||||
|                                                                      tsnrh, | ||||
|                                                                      mpickS, | ||||
|                                                                      iplot) | ||||
|                         # get earliest pick of all three picks | ||||
|                         epick = [epickS1, epickS2, epickS3] | ||||
|                         lpick = [lpickS1, lpickS2, lpickS3] | ||||
|                         pickerr = [Serror1, Serror2, Serror3] | ||||
|                         if epickS1 is None and epickS2 is not None \ | ||||
|                                 and epickS3 is not None: | ||||
|                             # shift indices: epick[0] (epickS1) is missing | ||||
|                             ipick = 1 + np.argmin([epickS2, epickS3]) | ||||
|                         elif epickS1 is not None and epickS2 is None \ | ||||
|                                 and epickS3 is not None: | ||||
|                             # map argmin over [epickS1, epickS3] to 0 and 2 | ||||
|                             ipick = [0, 2][np.argmin([epickS1, epickS3])] | ||||
|                         elif epickS1 is not None and epickS2 is not None \ | ||||
|                                 and epickS3 is None: | ||||
|                             ipick = np.argmin([epickS1, epickS2]) | ||||
|                         elif epickS1 is not None and epickS2 is not None \ | ||||
|                                 and epickS3 is not None: | ||||
|                             ipick = np.argmin([epickS1, epickS2, epickS3]) | ||||
| 
 | ||||
|                     epickS = epick[ipick] | ||||
|                     lpickS = lpick[ipick] | ||||
|                     Serror = pickerr[ipick] | ||||
| 
 | ||||
|                     # get SNR | ||||
|                     [SNRS, SNRSdB, Snoiselevel] = getSNR(h_copy, tsnrh, mpickS) | ||||
| 
 | ||||
|                     # weight S-onset using symmetric error | ||||
|                     if Serror <= timeerrorsS[0]: | ||||
|                         Sweight = 0 | ||||
|                     elif timeerrorsS[0] < Serror <= timeerrorsS[1]: | ||||
|                         Sweight = 1 | ||||
|                     elif timeerrorsS[1] < Serror <= timeerrorsS[2]: | ||||
|                         Sweight = 2 | ||||
|                     elif timeerrorsS[2] < Serror <= timeerrorsS[3]: | ||||
|                         Sweight = 3 | ||||
|                     elif Serror > timeerrorsS[3]: | ||||
|                         Sweight = 4 | ||||
| 
 | ||||
|                     print('autopickstation: S-weight: {0}, SNR: {1}, ' | ||||
|                           'SNR[dB]: {2}\n' | ||||
|                           '################################################' | ||||
|                           ''.format(Sweight, SNRS, SNRSdB)) | ||||
|                 ################################################################ | ||||
|                 # get Wood-Anderson peak-to-peak amplitude | ||||
|                 # initialize Data object | ||||
|                 data = Data() | ||||
|                 # re-create stream object including both horizontal components | ||||
|                 hdat = edat.copy() | ||||
|                 hdat += ndat | ||||
|         else: | ||||
|             msg = 'Bad initial (AIC) S-pick, skipping this onset!\n' \ | ||||
|                   'AIC-SNR={0}, AIC-Slope={1}counts/s\n' \ | ||||
|                   '(min. AIC-SNR={2}, ' \ | ||||
|                   'min. AIC-Slope={3}counts/s)\n' \ | ||||
|                   '################################################' \ | ||||
|                   ''.format(aicarhpick.getSNR(), | ||||
|                             aicarhpick.getSlope(), | ||||
|                             minAICSSNR, | ||||
|                             minAICSslope) | ||||
|             if verbose: print(msg) | ||||
| 
 | ||||
|             ############################################################ | ||||
|             # get Wood-Anderson peak-to-peak amplitude | ||||
|             # initialize Data object | ||||
|             data = Data() | ||||
|             # re-create stream object including both horizontal components | ||||
|             hdat = edat.copy() | ||||
|             hdat += ndat | ||||
|     else: | ||||
|         print('autopickstation: No horizontal component data available or ' | ||||
|               'bad P onset, skipping S picking!') | ||||
| 
 | ||||
|     ############################################################## | ||||
|     if iplot > 0: | ||||
|         # plot vertical trace | ||||
|         plt.figure() | ||||
|         plt.subplot(3, 1, 1) | ||||
|         tdata = np.arange(0, zdat[0].stats.npts / tr_filt.stats.sampling_rate, | ||||
|                           tr_filt.stats.delta) | ||||
|         # make sure both arrays have the same length (np.arange can be off by one sample) | ||||
|         wfldiff = len(tr_filt.data) - len(tdata) | ||||
|         if wfldiff < 0: | ||||
|             tdata = tdata[0:len(tdata) - abs(wfldiff)] | ||||
|         p1, = plt.plot(tdata, tr_filt.data / max(tr_filt.data), 'k') | ||||
|         if Pweight < 4: | ||||
|             p2, = plt.plot(cf1.getTimeArray(), cf1.getCF() / max(cf1.getCF()), | ||||
|                            'b') | ||||
|             if aicPflag == 1: | ||||
|                 p3, = plt.plot(cf2.getTimeArray(), | ||||
|                                cf2.getCF() / max(cf2.getCF()), 'm') | ||||
|                 p4, = plt.plot([aicpick.getpick(), aicpick.getpick()], [-1, 1], | ||||
|                                'r') | ||||
|                 plt.plot([aicpick.getpick() - 0.5, aicpick.getpick() + 0.5], | ||||
|                          [1, 1], 'r') | ||||
|                 plt.plot([aicpick.getpick() - 0.5, aicpick.getpick() + 0.5], | ||||
|                          [-1, -1], 'r') | ||||
|                 p5, = plt.plot([refPpick.getpick(), refPpick.getpick()], | ||||
|                                [-1.3, 1.3], 'r', linewidth=2) | ||||
|                 plt.plot([refPpick.getpick() - 0.5, refPpick.getpick() + 0.5], | ||||
|                          [1.3, 1.3], 'r', linewidth=2) | ||||
|                 plt.plot([refPpick.getpick() - 0.5, refPpick.getpick() + 0.5], | ||||
|                          [-1.3, -1.3], 'r', linewidth=2) | ||||
|                 plt.plot([lpickP, lpickP], [-1.1, 1.1], 'r--') | ||||
|                 plt.plot([epickP, epickP], [-1.1, 1.1], 'r--') | ||||
|                 plt.legend([p1, p2, p3, p4, p5], | ||||
|                            ['Data', 'CF1', 'CF2', 'Initial P Onset', | ||||
|                             'Final P Pick']) | ||||
|                 plt.title('%s, %s, P Weight=%d, SNR=%7.2f, SNR[dB]=%7.2f ' | ||||
|                           'Polarity: %s' % (tr_filt.stats.station, | ||||
|                                             tr_filt.stats.channel, | ||||
|                                             Pweight, | ||||
|                                             SNRP, | ||||
|                                             SNRPdB, | ||||
|                                             FM)) | ||||
|             else: | ||||
|                 plt.legend([p1, p2], ['Data', 'CF1']) | ||||
|                 plt.title('%s, P Weight=%d, SNR=None, ' | ||||
|                           'SNRdB=None' % (tr_filt.stats.channel, Pweight)) | ||||
|         else: | ||||
|             plt.title('%s, %s, P Weight=%d' % (tr_filt.stats.station, | ||||
|                                                tr_filt.stats.channel, | ||||
|                                                Pweight)) | ||||
| 
 | ||||
|         plt.yticks([]) | ||||
|         plt.ylim([-1.5, 1.5]) | ||||
|         plt.ylabel('Normalized Counts') | ||||
|         plt.suptitle(tr_filt.stats.starttime) | ||||
| 
 | ||||
|         if len(edat[0]) > 1 and len(ndat[0]) > 1 and Sflag == 1: | ||||
|             # plot horizontal traces | ||||
|             plt.subplot(3, 1, 2) | ||||
|             th1data = np.arange(0, | ||||
|                                 trH1_filt.stats.npts / | ||||
|                                 trH1_filt.stats.sampling_rate, | ||||
|                                 trH1_filt.stats.delta) | ||||
|             # make sure both arrays have the same length (np.arange can be off by one sample) | ||||
|             wfldiff = len(trH1_filt.data) - len(th1data) | ||||
|             if wfldiff < 0: | ||||
|                 th1data = th1data[0:len(th1data) - abs(wfldiff)] | ||||
|             p21, = plt.plot(th1data, trH1_filt.data / max(trH1_filt.data), 'k') | ||||
|             if Pweight < 4: | ||||
|                 p22, = plt.plot(arhcf1.getTimeArray(), | ||||
|                                 arhcf1.getCF() / max(arhcf1.getCF()), 'b') | ||||
|                 if aicSflag == 1: | ||||
|                     p23, = plt.plot(arhcf2.getTimeArray(), | ||||
|                                     arhcf2.getCF() / max(arhcf2.getCF()), 'm') | ||||
|                     p24, = plt.plot( | ||||
|                         [aicarhpick.getpick(), aicarhpick.getpick()], | ||||
|                         [-1, 1], 'g') | ||||
|                     plt.plot( | ||||
|                         [aicarhpick.getpick() - 0.5, | ||||
|                          aicarhpick.getpick() + 0.5], | ||||
|                         [1, 1], 'g') | ||||
|                     plt.plot( | ||||
|                         [aicarhpick.getpick() - 0.5, | ||||
|                          aicarhpick.getpick() + 0.5], | ||||
|                         [-1, -1], 'g') | ||||
|                     p25, = plt.plot([refSpick.getpick(), refSpick.getpick()], | ||||
|                                     [-1.3, 1.3], 'g', linewidth=2) | ||||
|                     plt.plot( | ||||
|                         [refSpick.getpick() - 0.5, refSpick.getpick() + 0.5], | ||||
|                         [1.3, 1.3], 'g', linewidth=2) | ||||
|                     plt.plot( | ||||
|                         [refSpick.getpick() - 0.5, refSpick.getpick() + 0.5], | ||||
|                         [-1.3, -1.3], 'g', linewidth=2) | ||||
|                     plt.plot([lpickS, lpickS], [-1.1, 1.1], 'g--') | ||||
|                     plt.plot([epickS, epickS], [-1.1, 1.1], 'g--') | ||||
|                     plt.legend([p21, p22, p23, p24, p25], | ||||
|                                ['Data', 'CF1', 'CF2', 'Initial S Onset', | ||||
|                                 'Final S Pick']) | ||||
|                     plt.title('%s, S Weight=%d, SNR=%7.2f, SNR[dB]=%7.2f' % ( | ||||
|                         trH1_filt.stats.channel, | ||||
|                         Sweight, SNRS, SNRSdB)) | ||||
|                 else: | ||||
|                     plt.legend([p21, p22], ['Data', 'CF1']) | ||||
|                     plt.title('%s, S Weight=%d, SNR=None, SNR[dB]=None' % ( | ||||
|                         trH1_filt.stats.channel, Sweight)) | ||||
|             plt.yticks([]) | ||||
|             plt.ylim([-1.5, 1.5]) | ||||
|             plt.ylabel('Normalized Counts') | ||||
|             plt.suptitle(trH1_filt.stats.starttime) | ||||
| 
 | ||||
|             plt.subplot(3, 1, 3) | ||||
|             th2data = np.arange(0, | ||||
|                                 trH2_filt.stats.npts / | ||||
|                                 trH2_filt.stats.sampling_rate, | ||||
|                                 trH2_filt.stats.delta) | ||||
|             # ensure time and data arrays have equal length (they may | ||||
|             # differ by one sample due to rounding of npts/sampling_rate) | ||||
|             wfldiff = len(trH2_filt.data) - len(th2data) | ||||
|             if wfldiff < 0: | ||||
|                 th2data = th2data[0:len(th2data) - abs(wfldiff)] | ||||
|             plt.plot(th2data, trH2_filt.data / max(trH2_filt.data), 'k') | ||||
|             if Pweight < 4: | ||||
|                 p22, = plt.plot(arhcf1.getTimeArray(), | ||||
|                                 arhcf1.getCF() / max(arhcf1.getCF()), 'b') | ||||
|                 if aicSflag == 1: | ||||
|                     p23, = plt.plot(arhcf2.getTimeArray(), | ||||
|                                     arhcf2.getCF() / max(arhcf2.getCF()), 'm') | ||||
|                     p24, = plt.plot( | ||||
|                         [aicarhpick.getpick(), aicarhpick.getpick()], | ||||
|                         [-1, 1], 'g') | ||||
|                     plt.plot( | ||||
|                         [aicarhpick.getpick() - 0.5, | ||||
|                          aicarhpick.getpick() + 0.5], | ||||
|                         [1, 1], 'g') | ||||
|                     plt.plot( | ||||
|                         [aicarhpick.getpick() - 0.5, | ||||
|                          aicarhpick.getpick() + 0.5], | ||||
|                         [-1, -1], 'g') | ||||
|                     p25, = plt.plot([refSpick.getpick(), refSpick.getpick()], | ||||
|                                     [-1.3, 1.3], 'g', linewidth=2) | ||||
|                     plt.plot( | ||||
|                         [refSpick.getpick() - 0.5, refSpick.getpick() + 0.5], | ||||
|                         [1.3, 1.3], 'g', linewidth=2) | ||||
|                     plt.plot( | ||||
|                         [refSpick.getpick() - 0.5, refSpick.getpick() + 0.5], | ||||
|                         [-1.3, -1.3], 'g', linewidth=2) | ||||
|                     plt.plot([lpickS, lpickS], [-1.1, 1.1], 'g--') | ||||
|                     plt.plot([epickS, epickS], [-1.1, 1.1], 'g--') | ||||
|                     plt.legend([p21, p22, p23, p24, p25], | ||||
|                                ['Data', 'CF1', 'CF2', 'Initial S Onset', | ||||
|                                 'Final S Pick']) | ||||
|                 else: | ||||
|                     plt.legend([p21, p22], ['Data', 'CF1']) | ||||
|             plt.yticks([]) | ||||
|             plt.ylim([-1.5, 1.5]) | ||||
|             plt.xlabel('Time [s] after %s' % tr_filt.stats.starttime) | ||||
|             plt.ylabel('Normalized Counts') | ||||
|             plt.title(trH2_filt.stats.channel) | ||||
|             plt.show() | ||||
|             raw_input()  # Python 2 builtin; wait for <Enter> so the figure stays open | ||||
|             plt.close() | ||||
|     ########################################################################## | ||||
|     # calculate "real" onset times | ||||
|     if lpickP is not None and lpickP == mpickP: | ||||
|         lpickP += timeerrorsP[0] | ||||
|     if epickP is not None and epickP == mpickP: | ||||
|         epickP -= timeerrorsP[0] | ||||
|     if mpickP is not None and epickP is not None and lpickP is not None: | ||||
|         lpickP = zdat[0].stats.starttime + lpickP | ||||
|         epickP = zdat[0].stats.starttime + epickP | ||||
|         mpickP = zdat[0].stats.starttime + mpickP | ||||
|     else: | ||||
|         # dummy values (start of seismic trace) in order to derive | ||||
|         # theoretical onset times for iterative picking | ||||
|         lpickP = zdat[0].stats.starttime + timeerrorsP[3] | ||||
|         epickP = zdat[0].stats.starttime - timeerrorsP[3] | ||||
|         mpickP = zdat[0].stats.starttime | ||||
| 
 | ||||
|     if lpickS is not None and lpickS == mpickS: | ||||
|         lpickS += timeerrorsS[0] | ||||
|     if epickS is not None and epickS == mpickS: | ||||
|         epickS -= timeerrorsS[0] | ||||
|     if mpickS is not None and epickS is not None and lpickS is not None: | ||||
|         lpickS = edat[0].stats.starttime + lpickS | ||||
|         epickS = edat[0].stats.starttime + epickS | ||||
|         mpickS = edat[0].stats.starttime + mpickS | ||||
|     else: | ||||
|         # dummy values (start of seismic trace) in order to derive | ||||
|         # theoretical onset times for iterative picking | ||||
|         lpickS = edat[0].stats.starttime + timeerrorsS[3] | ||||
|         epickS = edat[0].stats.starttime - timeerrorsS[3] | ||||
|         mpickS = edat[0].stats.starttime | ||||
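|     # note: adding a float to an obspy UTCDateTime shifts it by that many | ||||
|     # seconds, e.g. UTCDateTime(2015, 1, 1) + 12.3 == 00:00:12.300000 on | ||||
|     # that day; this is how the relative picks above become absolute times | ||||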
| 
 | ||||
|     # create dictionary | ||||
|     # for P phase | ||||
|     ppick = dict(lpp=lpickP, epp=epickP, mpp=mpickP, spe=Perror, snr=SNRP, | ||||
|                  snrdb=SNRPdB, weight=Pweight, fm=FM, w0=None, fc=None, Mo=None, | ||||
|                  Mw=None, picker=picker, marked=Pmarker) | ||||
|     # add S phase | ||||
|     spick = dict(lpp=lpickS, epp=epickS, mpp=mpickS, spe=Serror, snr=SNRS, | ||||
|                  snrdb=SNRSdB, weight=Sweight, fm=None, picker=picker, Ao=Ao) | ||||
|     # merge picks into returning dictionary | ||||
|     picks = dict(P=ppick, S=spick) | ||||
|     return picks | ||||
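| # A minimal usage sketch (not part of the original source): the returned | ||||
| # dictionary is keyed by phase, e.g. | ||||
| # | ||||
| #     picks = autopickstation(wf, pickparameter) | ||||
| #     print(picks['P']['mpp'])   # most likely P onset (UTCDateTime) | ||||
| #     print(picks['P']['spe'])   # symmetric pick error in seconds | ||||
| # | ||||
| # where `wf` is an obspy Stream with the station's waveforms and | ||||
| # `pickparameter` holds the parameters read from autoPyLoT.in. | ||||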
| 
 | ||||
| 
 | ||||
| def iteratepicker(wf, NLLocfile, picks, badpicks, pickparameter): | ||||
|     ''' | ||||
|     Repicking of bad onsets. Uses theoretical onset times from NLLoc-location file. | ||||
| 
 | ||||
|     :param wf: waveform, obspy stream object | ||||
| 
 | ||||
|     :param NLLocfile: path/name of NLLoc-location file | ||||
| 
 | ||||
|     :param picks: dictionary of available onset times | ||||
| 
 | ||||
|     :param badpicks: picks to be repicked | ||||
| 
 | ||||
|     :param pickparameter: picking parameters from autoPyLoT-input file | ||||
|     ''' | ||||
| 
 | ||||
|     msg = '#######################################################\n' \ | ||||
|           'autoPyLoT: Found {0} bad onsets at station(s) {1}, ' \ | ||||
|           'starting re-picking them ...'.format(len(badpicks), badpicks) | ||||
|     print(msg) | ||||
| 
 | ||||
|     newpicks = {} | ||||
|     for i in range(0, len(badpicks)): | ||||
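|         # build the search pattern for this station's P-phase line in the | ||||
|         # NLLoc phase file; the padding depends on the station name length | ||||
|         # because NLLoc writes fixed-width columns | ||||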
|         if len(badpicks[i][0]) > 4: | ||||
|             Ppattern = '%s  ?    ?    ? P' % badpicks[i][0] | ||||
|         elif len(badpicks[i][0]) == 4: | ||||
|             Ppattern = '%s   ?    ?    ? P' % badpicks[i][0] | ||||
|         elif len(badpicks[i][0]) < 4: | ||||
|             Ppattern = '%s    ?    ?    ? P' % badpicks[i][0] | ||||
|         nllocline = getPatternLine(NLLocfile, Ppattern) | ||||
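|         # the travel-time residual (Res) is the 17th whitespace-separated | ||||
|         # field of a NLLoc phase line, hence index 16 | ||||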
|         res = nllocline.split(None)[16] | ||||
|         # get theoretical P-onset time from residuum | ||||
|         badpicks[i][1] = picks[badpicks[i][0]]['P']['mpp'] - float(res) | ||||
| 
 | ||||
|         # get corresponding waveform stream | ||||
|         msg = '#######################################################\n' \ | ||||
|               'iteratepicker: Re-picking station {0}'.format(badpicks[i][0]) | ||||
|         print(msg) | ||||
|         wf2pick = wf.select(station=badpicks[i][0]) | ||||
| 
 | ||||
|         # modify some picking parameters | ||||
|         pstart_old = pickparameter.get('pstart') | ||||
|         pstop_old = pickparameter.get('pstop') | ||||
|         sstop_old = pickparameter.get('sstop') | ||||
|         pickwinP_old = pickparameter.get('pickwinP') | ||||
|         Precalcwin_old = pickparameter.get('Precalcwin') | ||||
|         noisefactor_old = pickparameter.get('noisefactor') | ||||
|         zfac_old = pickparameter.get('zfac') | ||||
|         pickparameter.setParam( | ||||
|             pstart=max([0, badpicks[i][1] - wf2pick[0].stats.starttime \ | ||||
|                         - pickparameter.get('tlta')])) | ||||
|         pickparameter.setParam(pstop=pickparameter.get('pstart') + \ | ||||
|                                      (3 * pickparameter.get('tlta'))) | ||||
|         pickparameter.setParam(sstop=pickparameter.get('sstop') / 2) | ||||
|         pickparameter.setParam(pickwinP=pickparameter.get('pickwinP') / 2) | ||||
|         pickparameter.setParam( | ||||
|             Precalcwin=pickparameter.get('Precalcwin') / 2) | ||||
|         pickparameter.setParam(noisefactor=1.0) | ||||
|         pickparameter.setParam(zfac=1.0) | ||||
|         print( | ||||
|             "iteratepicker: The following picking parameters have been modified for iterative picking:") | ||||
|         print( | ||||
|             "pstart: %fs => %fs" % (pstart_old, pickparameter.get('pstart'))) | ||||
|         print( | ||||
|             "pstop: %fs => %fs" % (pstop_old, pickparameter.get('pstop'))) | ||||
|         print( | ||||
|             "sstop: %fs => %fs" % (sstop_old, pickparameter.get('sstop'))) | ||||
|         print("pickwinP: %fs => %fs" % ( | ||||
|             pickwinP_old, pickparameter.get('pickwinP'))) | ||||
|         print("Precalcwin: %fs => %fs" % ( | ||||
|             Precalcwin_old, pickparameter.get('Precalcwin'))) | ||||
|         print("noisefactor: %f => %f" % ( | ||||
|             noisefactor_old, pickparameter.get('noisefactor'))) | ||||
|         print("zfac: %f => %f" % (zfac_old, pickparameter.get('zfac'))) | ||||
| 
 | ||||
|         # repick station | ||||
|         newpicks = autopickstation(wf2pick, pickparameter) | ||||
| 
 | ||||
|         # replace old dictionary with new one | ||||
|         picks[badpicks[i][0]] = newpicks | ||||
| 
 | ||||
|         # reset temporary change of picking parameters | ||||
|         print("iteratepicker: Resetting picking parameters ...") | ||||
|         pickparameter.setParam(pstart=pstart_old) | ||||
|         pickparameter.setParam(pstop=pstop_old) | ||||
|         pickparameter.setParam(sstop=sstop_old) | ||||
|         pickparameter.setParam(pickwinP=pickwinP_old) | ||||
|         pickparameter.setParam(Precalcwin=Precalcwin_old) | ||||
|         pickparameter.setParam(noisefactor=noisefactor_old) | ||||
|         pickparameter.setParam(zfac=zfac_old) | ||||
| 
 | ||||
|     return picks | ||||
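| # A minimal calling sketch (not part of the original source); `wf`, `picks` | ||||
| # and `pickparameter` are as in autopickstation, the station names and the | ||||
| # NLLoc file name are examples only: | ||||
| # | ||||
| #     badpicks = [['ST01', None], ['ST02', None]]  # onsets re-derived inside | ||||
| #     picks = iteratepicker(wf, 'loc/e0001.NLLoc.hyp', picks, badpicks, | ||||
| #                           pickparameter) | ||||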
							
								
								
									
pylot/core/pick/charfuns.py (new file, 710 lines)
| @@ -0,0 +1,710 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
| Created Oct/Nov 2014 | ||||
| 
 | ||||
| Implementation of the Characteristic Functions (CF) published and described in: | ||||
| 
 | ||||
| Kueperkoch, L., Meier, T., Lee, J., Friederich, W., & EGELADOS Working Group, 2010: | ||||
| Automated determination of P-phase arrival times at regional and local distances | ||||
| using higher order statistics, Geophys. J. Int., 181, 1159-1170 | ||||
| 
 | ||||
| Kueperkoch, L., Meier, T., Bruestle, A., Lee, J., Friederich, W., & EGELADOS | ||||
| Working Group, 2012: Automated determination of S-phase arrival times using | ||||
| autoregressive prediction: application to local and regional distances, Geophys. J. Int., | ||||
| 188, 687-702. | ||||
| 
 | ||||
| :author: MAGS2 EP3 working group | ||||
| """ | ||||
| 
 | ||||
| import matplotlib.pyplot as plt | ||||
| import numpy as np | ||||
| from obspy.core import Stream | ||||
| 
 | ||||
| 
 | ||||
| class CharacteristicFunction(object): | ||||
|     ''' | ||||
|     SuperClass for different types of characteristic functions. | ||||
|     ''' | ||||
| 
 | ||||
|     def __init__(self, data, cut, t2=None, order=None, t1=None, fnoise=None, stealthMode=False): | ||||
|         ''' | ||||
|         Initialize data type object with information from the original | ||||
|         Seismogram. | ||||
| 
 | ||||
|         :param: data | ||||
|         :type: `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|         :param: cut | ||||
|         :type: tuple | ||||
| 
 | ||||
|         :param: t2 | ||||
|         :type: float | ||||
| 
 | ||||
|         :param: order | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: t1 | ||||
|         :type: float (optional, only for AR) | ||||
| 
 | ||||
|         :param: fnoise | ||||
|         :type: float (optional, only for AR) | ||||
|         ''' | ||||
| 
 | ||||
|         assert isinstance(data, Stream), "%s is not a stream object" % str(data) | ||||
| 
 | ||||
|         self.orig_data = data | ||||
|         self.dt = self.orig_data[0].stats.delta | ||||
|         self.setCut(cut) | ||||
|         self.setTime1(t1) | ||||
|         self.setTime2(t2) | ||||
|         self.setOrder(order) | ||||
|         self.setFnoise(fnoise) | ||||
|         self.setARdetStep(t2) | ||||
|         # initialise state *before* calculating the CF, since the calcCF | ||||
|         # implementations of the subclasses rely on these attributes | ||||
|         self.arpara = np.array([]) | ||||
|         self.xpred = np.array([]) | ||||
|         self._stealthMode = stealthMode | ||||
|         self.calcCF(self.getDataArray()) | ||||
| 
 | ||||
|     def __str__(self): | ||||
|         return '''\n\t{name} object:\n | ||||
|         Cut:\t\t{cut}\n | ||||
|         t1:\t{t1}\n | ||||
|         t2:\t{t2}\n | ||||
|         Order:\t\t{order}\n | ||||
|         Fnoise:\t{fnoise}\n | ||||
|         ARdetStep:\t{ardetstep}\n | ||||
|         '''.format(name=type(self).__name__, | ||||
|                    cut=self.getCut(), | ||||
|                    t1=self.getTime1(), | ||||
|                    t2=self.getTime2(), | ||||
|                    order=self.getOrder(), | ||||
|                    fnoise=self.getFnoise(), | ||||
|                    ardetstep=self.getARdetStep()[0]) | ||||
| 
 | ||||
|     def getCut(self): | ||||
|         return self.cut | ||||
| 
 | ||||
|     def setCut(self, cut): | ||||
|         self.cut = cut | ||||
| 
 | ||||
|     def getTime1(self): | ||||
|         return self.t1 | ||||
| 
 | ||||
|     def setTime1(self, t1): | ||||
|         self.t1 = t1 | ||||
| 
 | ||||
|     def getTime2(self): | ||||
|         return self.t2 | ||||
| 
 | ||||
|     def setTime2(self, t2): | ||||
|         self.t2 = t2 | ||||
| 
 | ||||
|     def getARdetStep(self): | ||||
|         return self.ARdetStep | ||||
| 
 | ||||
|     def setARdetStep(self, t2): | ||||
|         if t2: | ||||
|             # update step for the AR coefficients: [seconds, samples] | ||||
|             self.ARdetStep = [t2 / 4, | ||||
|                               int(np.ceil(self.getTime2() / self.getIncrement()) / 4)] | ||||
| 
 | ||||
|     def getOrder(self): | ||||
|         return self.order | ||||
| 
 | ||||
|     def setOrder(self, order): | ||||
|         self.order = order | ||||
| 
 | ||||
|     def getIncrement(self): | ||||
|         """ | ||||
|         :rtype : int | ||||
|         """ | ||||
|         return self.dt | ||||
| 
 | ||||
|     def getTimeArray(self): | ||||
|         incr = self.getIncrement() | ||||
|         self.TimeArray = np.arange(0, len(self.getCF()) * incr, incr) + self.getCut()[0] | ||||
|         return self.TimeArray | ||||
| 
 | ||||
|     def getFnoise(self): | ||||
|         return self.fnoise | ||||
| 
 | ||||
|     def setFnoise(self, fnoise): | ||||
|         self.fnoise = fnoise | ||||
| 
 | ||||
|     def getCF(self): | ||||
|         return self.cf | ||||
| 
 | ||||
|     def getXCF(self): | ||||
|         return self.xcf | ||||
| 
 | ||||
|     def _getStealthMode(self): | ||||
|         return self._stealthMode | ||||
| 
 | ||||
|     def getDataArray(self, cut=None): | ||||
|         ''' | ||||
|         If cut times are given, the time series is cut from cut[0] (start | ||||
|         time) to cut[1] (stop time) so that the CF is calculated only for | ||||
|         the part of the trace where the signal is expected. | ||||
|         :param: cut, cutting window (start, stop) in seconds | ||||
|         :type: tuple | ||||
|         ''' | ||||
|         if cut is not None: | ||||
|             if len(self.orig_data) == 1: | ||||
|                 if self.cut[0] == 0 and self.cut[1] == 0: | ||||
|                     start = 0 | ||||
|                     stop = len(self.orig_data[0]) | ||||
|                 elif self.cut[0] == 0 and self.cut[1] != 0: | ||||
|                     start = 0 | ||||
|                     stop = self.cut[1] / self.dt | ||||
|                 else: | ||||
|                     start = self.cut[0] / self.dt | ||||
|                     stop = self.cut[1] / self.dt | ||||
|                 zz = self.orig_data.copy() | ||||
|                 z1 = zz[0].copy() | ||||
|                 zz[0].data = z1.data[int(start):int(stop)] | ||||
|                 data = zz | ||||
|                 return data | ||||
|             elif len(self.orig_data) == 2: | ||||
|                 if self.cut[0] == 0 and self.cut[1] == 0: | ||||
|                     start = 0 | ||||
|                     stop = min([len(self.orig_data[0]), len(self.orig_data[1])]) | ||||
|                 elif self.cut[0] == 0 and self.cut[1] != 0: | ||||
|                     start = 0 | ||||
|                     stop = min([self.cut[1] / self.dt, len(self.orig_data[0]), | ||||
|                                 len(self.orig_data[1])]) | ||||
|                 else: | ||||
|                     start = max([0, self.cut[0] / self.dt]) | ||||
|                     stop = min([self.cut[1] / self.dt, len(self.orig_data[0]), | ||||
|                                 len(self.orig_data[1])]) | ||||
|                 hh = self.orig_data.copy() | ||||
|                 h1 = hh[0].copy() | ||||
|                 h2 = hh[1].copy() | ||||
|                 hh[0].data = h1.data[int(start):int(stop)] | ||||
|                 hh[1].data = h2.data[int(start):int(stop)] | ||||
|                 data = hh | ||||
|                 return data | ||||
|             elif len(self.orig_data) == 3: | ||||
|                 if self.cut[0] == 0 and self.cut[1] == 0: | ||||
|                     start = 0 | ||||
|                     stop = min([len(self.orig_data[0]), len(self.orig_data[1]), | ||||
|                                 len(self.orig_data[2])]) | ||||
|                 elif self.cut[0] == 0 and self.cut[1] != 0: | ||||
|                     start = 0 | ||||
|                     stop = min([self.cut[1] / self.dt, len(self.orig_data[0]), | ||||
|                                 len(self.orig_data[1]), len(self.orig_data[2])]) | ||||
|                 else: | ||||
|                     start = max([0, self.cut[0] / self.dt]) | ||||
|                     stop = min([self.cut[1] / self.dt, len(self.orig_data[0]), | ||||
|                                 len(self.orig_data[1]), len(self.orig_data[2])]) | ||||
|                 hh = self.orig_data.copy() | ||||
|                 h1 = hh[0].copy() | ||||
|                 h2 = hh[1].copy() | ||||
|                 h3 = hh[2].copy() | ||||
|                 hh[0].data = h1.data[int(start):int(stop)] | ||||
|                 hh[1].data = h2.data[int(start):int(stop)] | ||||
|                 hh[2].data = h3.data[int(start):int(stop)] | ||||
|                 data = hh | ||||
|                 return data | ||||
|         else: | ||||
|             data = self.orig_data.copy() | ||||
|             return data | ||||
| 
 | ||||
|     def calcCF(self, data=None): | ||||
|         self.cf = data | ||||
| 
 | ||||
| 
 | ||||
| class AICcf(CharacteristicFunction): | ||||
|     ''' | ||||
|     Class to calculate the Akaike Information Criterion (AIC) after | ||||
|     Maeda (1985). | ||||
|     :param: data, time series (either seismogram or CF) | ||||
|     :type: tuple | ||||
| 
 | ||||
|     Output: AIC function | ||||
|     ''' | ||||
| 
 | ||||
|     def calcCF(self, data): | ||||
| 
 | ||||
|         # if self._getStealthMode() is False: | ||||
|         #    print 'Calculating AIC ...' | ||||
|         x = self.getDataArray() | ||||
|         xnp = x[0].data | ||||
|         # replace NaN values by zeros | ||||
|         nn = np.isnan(xnp) | ||||
|         if nn.any(): | ||||
|             xnp[nn] = 0 | ||||
|         datlen = len(xnp) | ||||
|         k = np.arange(1, datlen) | ||||
|         cf = np.zeros(datlen) | ||||
|         cumsumcf = np.cumsum(np.power(xnp, 2)) | ||||
|         i = np.where(cumsumcf == 0) | ||||
|         cumsumcf[i] = np.finfo(np.float64).eps | ||||
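|         # Maeda (1985): AIC(k) = k * log(var(x[1:k])) | ||||
|         #               + (N - k - 1) * log(var(x[k+1:N])), | ||||
|         # evaluated below via cumulative sums of the squared amplitudes | ||||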
|         cf[k] = ((k - 1) * np.log(cumsumcf[k] / k) + (datlen - k + 1) * | ||||
|                  np.log((cumsumcf[datlen - 1] - cumsumcf[k - 1]) / (datlen - k + 1))) | ||||
|         cf[0] = cf[1] | ||||
|         # zero out infinite values (caused by zero-energy windows) | ||||
|         cf[np.isinf(cf)] = 0 | ||||
| 
 | ||||
|         self.cf = cf - np.mean(cf) | ||||
|         self.xcf = x | ||||
| 
 | ||||
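| # --- illustrative usage sketch, not part of the original module --- | ||||
| # synthetic trace: 10 s of noise followed by 5 s of stronger "signal"; | ||||
| # the AIC minimum marks the onset (here near t = 10 s): | ||||
| # | ||||
| #     from obspy import Trace, Stream | ||||
| #     tr = Trace(data=np.concatenate((np.random.randn(1000), | ||||
| #                                     5. * np.random.randn(500)))) | ||||
| #     tr.stats.delta = 0.01 | ||||
| #     st = Stream(traces=[tr]) | ||||
| #     aic = AICcf(st, cut=(0, 0)) | ||||
| #     t_onset = aic.getTimeArray()[aic.getCF().argmin()] | ||||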
| 
 | ||||
| class HOScf(CharacteristicFunction): | ||||
|     ''' | ||||
|     Function to calculate skewness (statistics of order 3) or kurtosis | ||||
|     (statistics of order 4), using one long moving window, as published | ||||
|     in Kueperkoch et al. (2010). | ||||
|     ''' | ||||
| 
 | ||||
|     def calcCF(self, data): | ||||
| 
 | ||||
|         x = self.getDataArray(self.getCut()) | ||||
|         xnp = x[0].data | ||||
|         nn = np.isnan(xnp) | ||||
|         if nn.any(): | ||||
|             xnp[nn] = 0 | ||||
|         if self.getOrder() == 3:  # this is skewness | ||||
|             # if self._getStealthMode() is False: | ||||
|             #    print 'Calculating skewness ...' | ||||
|             y = np.power(xnp, 3) | ||||
|             y1 = np.power(xnp, 2) | ||||
|         elif self.getOrder() == 4:  # this is kurtosis | ||||
|             # if self._getStealthMode() is False: | ||||
|             #    print 'Calculating kurtosis ...' | ||||
|             y = np.power(xnp, 4) | ||||
|             y1 = np.power(xnp, 2) | ||||
|         else: | ||||
|             raise ValueError('HOScf: order must be 3 (skewness) ' | ||||
|                              'or 4 (kurtosis)') | ||||
| 
 | ||||
|         # Initialisation | ||||
|         # t2: long term moving window | ||||
|         ilta = int(round(self.getTime2() / self.getIncrement())) | ||||
|         lta = y[0] | ||||
|         lta1 = y1[0] | ||||
|         # moving windows | ||||
|         LTA = np.zeros(len(xnp)) | ||||
|         for j in range(0, len(xnp)): | ||||
|             if j < 4: | ||||
|                 LTA[j] = 0 | ||||
|             elif j <= ilta: | ||||
|                 lta = (y[j] + lta * (j - 1)) / j | ||||
|                 lta1 = (y1[j] + lta1 * (j - 1)) / j | ||||
|             else: | ||||
|                 lta = (y[j] - y[j - ilta]) / ilta + lta | ||||
|                 lta1 = (y1[j] - y1[j - ilta]) / ilta + lta1 | ||||
|             # define LTA | ||||
|             if self.getOrder() == 3: | ||||
|                 LTA[j] = lta / np.power(lta1, 1.5) | ||||
|             elif self.getOrder() == 4: | ||||
|                 LTA[j] = lta / np.power(lta1, 2) | ||||
| 
 | ||||
|         nn = np.isnan(LTA) | ||||
|         if nn.any(): | ||||
|             LTA[nn] = 0 | ||||
|         self.cf = LTA | ||||
|         self.xcf = x | ||||
| 
 | ||||
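| # --- illustrative usage sketch, not part of the original module --- | ||||
| # kurtosis CF (order=4) with a 1 s moving window on the synthetic stream | ||||
| # `st` from the AICcf sketch above: | ||||
| # | ||||
| #     hos = HOScf(st, cut=(0, 0), t2=1.0, order=4) | ||||
| #     plt.plot(hos.getTimeArray(), hos.getCF())  # CF rises sharply at onset | ||||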
| 
 | ||||
| class ARZcf(CharacteristicFunction): | ||||
|     def calcCF(self, data): | ||||
| 
 | ||||
|         print('Calculating AR-prediction error from single trace ...') | ||||
|         x = self.getDataArray(self.getCut()) | ||||
|         xnp = x[0].data | ||||
|         nn = np.isnan(xnp) | ||||
|         if nn.any(): | ||||
|             xnp[nn] = 0 | ||||
|         # some parameters needed | ||||
|         # add noise to time series | ||||
|         xnoise = xnp + np.random.normal(0.0, 1.0, len(xnp)) * self.getFnoise() * max(abs(xnp)) | ||||
|         tend = len(xnp) | ||||
|         # Time1: length of AR-determination window [sec] | ||||
|         # Time2: length of AR-prediction window [sec] | ||||
|         ldet = int(round(self.getTime1() / self.getIncrement()))  # length of AR-determination window [samples] | ||||
|         lpred = int(np.ceil(self.getTime2() / self.getIncrement()))  # length of AR-prediction window [samples] | ||||
| 
 | ||||
|         cf = np.zeros(len(xnp)) | ||||
|         loopstep = self.getARdetStep() | ||||
|         arcalci = ldet + self.getOrder()  # AR-calculation index | ||||
|         for i in range(ldet + self.getOrder(), tend - lpred - 1): | ||||
|             if i == arcalci: | ||||
|                 # determination of AR coefficients | ||||
|                 # to speed up calculation, the AR coefficients are recalculated only every loopstep[1] samples | ||||
|                 self.arDetZ(xnoise, self.getOrder(), i - ldet, i) | ||||
|                 arcalci = arcalci + loopstep[1] | ||||
|             # AR prediction of waveform using calculated AR coefficients | ||||
|             self.arPredZ(xnp, self.arpara, i + 1, lpred) | ||||
|             # prediction error = CF | ||||
|             cf[i + lpred - 1] = np.sqrt(np.sum(np.power(self.xpred[i:i + lpred - 1] - xnp[i:i + lpred - 1], 2)) / lpred) | ||||
|         nn = np.isnan(cf) | ||||
|         if nn.any(): | ||||
|             cf[nn] = 0 | ||||
|         # remove zeros and artefacts | ||||
|         tap = np.hanning(len(cf)) | ||||
|         cf = tap * cf | ||||
|         io = np.where(cf == 0) | ||||
|         ino = np.where(cf > 0) | ||||
|         cf[io] = cf[ino[0][0]] | ||||
| 
 | ||||
|         self.cf = cf | ||||
|         self.xcf = x | ||||
| 
 | ||||
|     def arDetZ(self, data, order, rind, ldet): | ||||
|         ''' | ||||
|         Function to calculate AR parameters arpara after Thomas Meier (CAU), published | ||||
|         in Kueperkoch et al. (2012). This function solves the system of linear | ||||
|         equations (SLE) using the Moore-Penrose inverse, i.e. the least-squares | ||||
|         approach. | ||||
|         :param: data, time series to calculate AR parameters from | ||||
|         :type: array | ||||
| 
 | ||||
|         :param: order, order of AR process | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: rind, first running summation index | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: ldet, length of AR-determination window (=end of summation index) | ||||
|         :type: int | ||||
| 
 | ||||
|         Output: AR parameters arpara | ||||
|         ''' | ||||
| 
 | ||||
|         # recursive calculation of the data vector (right-hand side of eq. 6.5 in Kueperkoch et al., 2012) | ||||
|         rhs = np.zeros(self.getOrder()) | ||||
|         for k in range(0, self.getOrder()): | ||||
|             for i in range(rind, ldet + 1): | ||||
|                 ki = k + 1 | ||||
|                 rhs[k] = rhs[k] + data[i] * data[i - ki] | ||||
| 
 | ||||
|         # recursive calculation of the data array (second sum on the left-hand side of eq. 6.5 in Kueperkoch et al., 2012) | ||||
|         A = np.zeros((self.getOrder(), self.getOrder())) | ||||
|         for k in range(1, self.getOrder() + 1): | ||||
|             for j in range(1, k + 1): | ||||
|                 for i in range(rind, ldet + 1): | ||||
|                     ki = k - 1 | ||||
|                     ji = j - 1 | ||||
|                     A[ki, ji] = A[ki, ji] + data[i - j] * data[i - k] | ||||
| 
 | ||||
|                 A[ji, ki] = A[ki, ji] | ||||
| 
 | ||||
|         # apply Moore-Penrose inverse for SVD yielding the AR-parameters | ||||
|         self.arpara = np.dot(np.linalg.pinv(A), rhs) | ||||
| 
 | ||||
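|     # note: np.dot(np.linalg.pinv(A), rhs) solves A x = rhs in the | ||||
|     # least-squares sense; np.linalg.lstsq(A, rhs)[0] would be an | ||||
|     # equivalent alternative (a remark, not part of the original code) | ||||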
|     def arPredZ(self, data, arpara, rind, lpred): | ||||
|         ''' | ||||
|         Function to predict waveform, assuming an autoregressive process of order | ||||
|         p (=size(arpara)), with AR parameters arpara calculated in arDet. After | ||||
|         Thomas Meier (CAU), published in Kueperkoch et al. (2012). | ||||
|         :param: data, time series to be predicted | ||||
|         :type: array | ||||
| 
 | ||||
|         :param: arpara, AR parameters | ||||
|         :type: float | ||||
| 
 | ||||
|         :param: rind, first running summation index | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: lpred, length of prediction window (=end of summation index) | ||||
|         :type: int | ||||
| 
 | ||||
|         Output: predicted waveform z | ||||
|         ''' | ||||
|         # be sure of the summation indeces | ||||
|         if rind < len(arpara): | ||||
|             rind = len(arpara) | ||||
|         if rind > len(data) - lpred: | ||||
|             rind = len(data) - lpred | ||||
|         if lpred < 1: | ||||
|             lpred = 1 | ||||
|         if lpred > len(data) - 2: | ||||
|             lpred = len(data) - 2 | ||||
| 
 | ||||
|         z = np.append(data[0:rind], np.zeros(lpred)) | ||||
|         for i in range(rind, rind + lpred): | ||||
|             for j in range(1, len(arpara) + 1): | ||||
|                 ji = j - 1 | ||||
|                 z[i] = z[i] + arpara[ji] * z[i - j] | ||||
| 
 | ||||
|         self.xpred = z | ||||
| 
 | ||||
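| # --- illustrative usage sketch, not part of the original module --- | ||||
| # AR prediction-error CF on the vertical trace; all values are examples: | ||||
| # 0.5 s determination window (t1), 0.2 s prediction window (t2), AR order 2 | ||||
| # and 10 % of the data maximum as stabilising noise (fnoise): | ||||
| # | ||||
| #     arz = ARZcf(st, cut=(0, 0), t1=0.5, t2=0.2, order=2, fnoise=0.1) | ||||
| #     plt.plot(arz.getTimeArray(), arz.getCF()) | ||||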
| 
 | ||||
| class ARHcf(CharacteristicFunction): | ||||
|     def calcCF(self, data): | ||||
| 
 | ||||
|         print('Calculating AR-prediction error from both horizontal traces ...') | ||||
| 
 | ||||
|         xnp = self.getDataArray(self.getCut()) | ||||
|         n0 = np.isnan(xnp[0].data) | ||||
|         if n0.any(): | ||||
|             xnp[0].data[n0] = 0 | ||||
|         n1 = np.isnan(xnp[1].data) | ||||
|         if n1.any(): | ||||
|             xnp[1].data[n1] = 0 | ||||
| 
 | ||||
|         # some parameters needed | ||||
|         # add noise to time series | ||||
|         xenoise = xnp[0].data + np.random.normal(0.0, 1.0, len(xnp[0].data)) * self.getFnoise() * max(abs(xnp[0].data)) | ||||
|         xnnoise = xnp[1].data + np.random.normal(0.0, 1.0, len(xnp[1].data)) * self.getFnoise() * max(abs(xnp[1].data)) | ||||
|         Xnoise = np.array([xenoise.tolist(), xnnoise.tolist()]) | ||||
|         tend = len(xnp[0].data) | ||||
|         # Time1: length of AR-determination window [sec] | ||||
|         # Time2: length of AR-prediction window [sec] | ||||
|         ldet = int(round(self.getTime1() / self.getIncrement()))  # length of AR-determination window [samples] | ||||
|         lpred = int(np.ceil(self.getTime2() / self.getIncrement()))  # length of AR-prediction window [samples] | ||||
| 
 | ||||
|         cf = np.zeros(len(xenoise)) | ||||
|         loopstep = self.getARdetStep() | ||||
|         arcalci = lpred + self.getOrder() - 1  # AR-calculation index | ||||
|         # arcalci = ldet + self.getOrder() - 1 #AR-calculation index | ||||
|         for i in range(lpred + self.getOrder() - 1, tend - 2 * lpred + 1): | ||||
|             if i == arcalci: | ||||
|                 # determination of AR coefficients | ||||
|                 # to speed up calculation, the AR coefficients are recalculated only every loopstep[1] samples | ||||
|                 self.arDetH(Xnoise, self.getOrder(), i - ldet, i) | ||||
|                 arcalci = arcalci + loopstep[1] | ||||
|             # AR prediction of waveform using calculated AR coefficients | ||||
|             self.arPredH(xnp, self.arpara, i + 1, lpred) | ||||
|             # prediction error = CF | ||||
|             cf[i + lpred] = np.sqrt(np.sum(np.power(self.xpred[0][i:i + lpred] - xnp[0][i:i + lpred], 2) \ | ||||
|                                            + np.power(self.xpred[1][i:i + lpred] - xnp[1][i:i + lpred], 2)) / ( | ||||
|                                     2 * lpred)) | ||||
|         nn = np.isnan(cf) | ||||
|         if nn.any(): | ||||
|             cf[nn] = 0 | ||||
|         # remove zeros and artefacts | ||||
|         tap = np.hanning(len(cf)) | ||||
|         cf = tap * cf | ||||
|         io = np.where(cf == 0) | ||||
|         ino = np.where(cf > 0) | ||||
|         cf[io] = cf[ino[0][0]] | ||||
| 
 | ||||
|         self.cf = cf | ||||
|         self.xcf = xnp | ||||
| 
 | ||||
|     def arDetH(self, data, order, rind, ldet): | ||||
|         ''' | ||||
|         Function to calculate AR parameters arpara after Thomas Meier (CAU), published | ||||
|         in  Kueperkoch et al. (2012). This function solves SLE using the Moore- | ||||
|         Penrose inverse, i.e. the least-squares approach. "data" is a structured array. | ||||
|         AR parameters are calculated based on both horizontal components in order | ||||
|         to account for polarization. | ||||
|         :param: data, horizontal component seismograms to calculate AR parameters from | ||||
|         :type: structured array | ||||
| 
 | ||||
|         :param: order, order of AR process | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: rind, first running summation index | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: ldet, length of AR-determination window (=end of summation index) | ||||
|         :type: int | ||||
| 
 | ||||
|         Output: AR parameters arpara | ||||
|         ''' | ||||
| 
 | ||||
|         # recursive calculation of the data vector (right-hand side of eq. 6.5 in Kueperkoch et al., 2012) | ||||
|         rhs = np.zeros(self.getOrder()) | ||||
|         for k in range(0, self.getOrder()): | ||||
|             for i in range(rind, ldet): | ||||
|                 # use lags 1 ... order (k + 1), consistent with arDetZ | ||||
|                 rhs[k] = rhs[k] + data[0, i] * data[0, i - (k + 1)] \ | ||||
|                          + data[1, i] * data[1, i - (k + 1)] | ||||
| 
 | ||||
|         # recursive calculation of the data array (second sum on the left-hand side of eq. 6.5 in Kueperkoch et al., 2012) | ||||
|         A = np.zeros((self.getOrder(), self.getOrder())) | ||||
|         for k in range(1, self.getOrder() + 1): | ||||
|             for j in range(1, k + 1): | ||||
|                 for i in range(rind, ldet): | ||||
|                     ki = k - 1 | ||||
|                     ji = j - 1 | ||||
|                     # lags j and k (1 ... order), consistent with arDetZ | ||||
|                     A[ki, ji] = A[ki, ji] + data[0, i - j] * data[0, i - k] \ | ||||
|                                 + data[1, i - j] * data[1, i - k] | ||||
| 
 | ||||
|                 A[ji, ki] = A[ki, ji] | ||||
| 
 | ||||
|         # apply Moore-Penrose inverse for SVD yielding the AR-parameters | ||||
|         self.arpara = np.dot(np.linalg.pinv(A), rhs) | ||||
| 
 | ||||
|     def arPredH(self, data, arpara, rind, lpred): | ||||
|         ''' | ||||
|         Function to predict waveform, assuming an autoregressive process of order | ||||
|         p (=size(arpara)), with AR parameters arpara calculated in arDet. After | ||||
|         Thomas Meier (CAU), published in Kueperkoch et al. (2012). | ||||
|         :param: data, horizontal component seismograms to be predicted | ||||
|         :type: structured array | ||||
| 
 | ||||
|         :param: arpara, AR parameters | ||||
|         :type: float | ||||
| 
 | ||||
|         :param: rind, first running summation index | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: lpred, length of prediction window (=end of summation index) | ||||
|         :type: int | ||||
| 
 | ||||
|         Output: predicted waveform z | ||||
|         :type: structured array | ||||
|         ''' | ||||
|         # be sure of the summation indeces | ||||
|         if rind < len(arpara) + 1: | ||||
|             rind = len(arpara) + 1 | ||||
|         if rind > len(data[0]) - lpred + 1: | ||||
|             rind = len(data[0]) - lpred + 1 | ||||
|         if lpred < 1: | ||||
|             lpred = 1 | ||||
|         if lpred > len(data[0]) - 1: | ||||
|             lpred = len(data[0]) - 1 | ||||
| 
 | ||||
|         z1 = np.append(data[0][0:rind], np.zeros(lpred)) | ||||
|         z2 = np.append(data[1][0:rind], np.zeros(lpred)) | ||||
|         for i in range(rind, rind + lpred): | ||||
|             for j in range(1, len(arpara) + 1): | ||||
|                 ji = j - 1 | ||||
|                 # lag j (not j - 1), as in arPredZ | ||||
|                 z1[i] = z1[i] + arpara[ji] * z1[i - j] | ||||
|                 z2[i] = z2[i] + arpara[ji] * z2[i - j] | ||||
| 
 | ||||
|         z = np.array([z1.tolist(), z2.tolist()]) | ||||
|         self.xpred = z | ||||
| 
 | ||||
| 
 | ||||
| class AR3Ccf(CharacteristicFunction): | ||||
|     def calcCF(self, data): | ||||
| 
 | ||||
|         print('Calculating AR-prediction error from all 3 components ...') | ||||
| 
 | ||||
|         xnp = self.getDataArray(self.getCut()) | ||||
|         n0 = np.isnan(xnp[0].data) | ||||
|         if n0.any(): | ||||
|             xnp[0].data[n0] = 0 | ||||
|         n1 = np.isnan(xnp[1].data) | ||||
|         if n1.any(): | ||||
|             xnp[1].data[n1] = 0 | ||||
|         n2 = np.isnan(xnp[2].data) | ||||
|         if n2.any(): | ||||
|             xnp[2].data[n2] = 0 | ||||
| 
 | ||||
|         # some parameters needed | ||||
|         # add noise to time series | ||||
|         xenoise = xnp[0].data + np.random.normal(0.0, 1.0, len(xnp[0].data)) * self.getFnoise() * max(abs(xnp[0].data)) | ||||
|         xnnoise = xnp[1].data + np.random.normal(0.0, 1.0, len(xnp[1].data)) * self.getFnoise() * max(abs(xnp[1].data)) | ||||
|         xznoise = xnp[2].data + np.random.normal(0.0, 1.0, len(xnp[2].data)) * self.getFnoise() * max(abs(xnp[2].data)) | ||||
|         Xnoise = np.array([xenoise.tolist(), xnnoise.tolist(), xznoise.tolist()]) | ||||
|         tend = len(xnp[0].data) | ||||
|         # Time1: length of AR-determination window [sec] | ||||
|         # Time2: length of AR-prediction window [sec] | ||||
|         ldet = int(round(self.getTime1() / self.getIncrement()))  # length of AR-determination window [samples] | ||||
|         lpred = int(np.ceil(self.getTime2() / self.getIncrement()))  # length of AR-prediction window [samples] | ||||
| 
 | ||||
|         cf = np.zeros(len(xenoise)) | ||||
|         loopstep = self.getARdetStep() | ||||
|         arcalci = ldet + self.getOrder() - 1  # AR-calculation index | ||||
|         for i in range(ldet + self.getOrder() - 1, tend - 2 * lpred + 1): | ||||
|             if i == arcalci: | ||||
|                 # determination of AR coefficients | ||||
|                 # to speed up calculation, the AR coefficients are recalculated only every loopstep[1] samples | ||||
|                 self.arDet3C(Xnoise, self.getOrder(), i - ldet, i) | ||||
|                 arcalci = arcalci + loopstep[1] | ||||
| 
 | ||||
|             # AR prediction of waveform using calculated AR coefficients | ||||
|             self.arPred3C(xnp, self.arpara, i + 1, lpred) | ||||
|             # prediction error = CF | ||||
|             cf[i + lpred] = np.sqrt(np.sum(np.power(self.xpred[0][i:i + lpred] - xnp[0][i:i + lpred], 2) \ | ||||
|                                            + np.power(self.xpred[1][i:i + lpred] - xnp[1][i:i + lpred], 2) \ | ||||
|                                            + np.power(self.xpred[2][i:i + lpred] - xnp[2][i:i + lpred], 2)) / ( | ||||
|                                     3 * lpred)) | ||||
|         nn = np.isnan(cf) | ||||
|         if nn.any(): | ||||
|             cf[nn] = 0 | ||||
|         # remove zeros and artefacts | ||||
|         tap = np.hanning(len(cf)) | ||||
|         cf = tap * cf | ||||
|         io = np.where(cf == 0) | ||||
|         ino = np.where(cf > 0) | ||||
|         cf[io] = cf[ino[0][0]] | ||||
| 
 | ||||
|         self.cf = cf | ||||
|         self.xcf = xnp | ||||
| 
 | ||||
|     def arDet3C(self, data, order, rind, ldet): | ||||
|         ''' | ||||
|         Function to calculate AR parameters arpara after Thomas Meier (CAU), published | ||||
|         in  Kueperkoch et al. (2012). This function solves SLE using the Moore- | ||||
|         Penrose inverse, i.e. the least-squares approach. "data" is a structured array. | ||||
|         AR parameters are calculated based on both horizontal components and | ||||
|         the vertical component. | ||||
|         :param: data, horizontal and vertical component seismograms to calculate AR parameters from | ||||
|         :type: structured array | ||||
| 
 | ||||
|         :param: order, order of AR process | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: rind, first running summation index | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: ldet, length of AR-determination window (=end of summation index) | ||||
|         :type: int | ||||
| 
 | ||||
|         Output: AR parameters arpara | ||||
|         ''' | ||||
| 
 | ||||
|         # recursive calculation of the data vector (right-hand side of eq. 6.5 in Kueperkoch et al., 2012) | ||||
|         rhs = np.zeros(self.getOrder()) | ||||
|         for k in range(0, self.getOrder()): | ||||
|             for i in range(rind, ldet): | ||||
|                 # use lags 1 ... order (k + 1), consistent with arDetZ | ||||
|                 rhs[k] = rhs[k] + data[0, i] * data[0, i - (k + 1)] \ | ||||
|                          + data[1, i] * data[1, i - (k + 1)] \ | ||||
|                          + data[2, i] * data[2, i - (k + 1)] | ||||
| 
 | ||||
|         # recursive calculation of the data array (second sum on the left-hand side of eq. 6.5 in Kueperkoch et al., 2012) | ||||
|         A = np.zeros((self.getOrder(), self.getOrder())) | ||||
|         for k in range(1, self.getOrder() + 1): | ||||
|             for j in range(1, k + 1): | ||||
|                 for i in range(rind, ldet): | ||||
|                     ki = k - 1 | ||||
|                     ji = j - 1 | ||||
|                     # lags j and k (1 ... order), consistent with arDetZ | ||||
|                     A[ki, ji] = A[ki, ji] + data[0, i - j] * data[0, i - k] \ | ||||
|                                 + data[1, i - j] * data[1, i - k] \ | ||||
|                                 + data[2, i - j] * data[2, i - k] | ||||
| 
 | ||||
|                 A[ji, ki] = A[ki, ji] | ||||
| 
 | ||||
|         # apply Moore-Penrose inverse for SVD yielding the AR-parameters | ||||
|         self.arpara = np.dot(np.linalg.pinv(A), rhs) | ||||
| 
 | ||||
|     def arPred3C(self, data, arpara, rind, lpred): | ||||
|         ''' | ||||
|         Function to predict waveform, assuming an autoregressive process of order | ||||
|         p (=size(arpara)), with AR parameters arpara calculated in arDet3C. After | ||||
|         Thomas Meier (CAU), published in Kueperkoch et al. (2012). | ||||
|         :param: data, horizontal and vertical component seismograms to be predicted | ||||
|         :type: structured array | ||||
| 
 | ||||
|         :param: arpara, AR parameters | ||||
|         :type: float | ||||
| 
 | ||||
|         :param: rind, first running summation index | ||||
|         :type: int | ||||
| 
 | ||||
|         :param: lpred, length of prediction window (=end of summation index) | ||||
|         :type: int | ||||
| 
 | ||||
|         Output: predicted waveform z | ||||
|         :type: structured array | ||||
|         ''' | ||||
|         # be sure of the summation indeces | ||||
|         if rind < len(arpara) + 1: | ||||
|             rind = len(arpara) + 1 | ||||
|         if rind > len(data[0]) - lpred + 1: | ||||
|             rind = len(data[0]) - lpred + 1 | ||||
|         if lpred < 1: | ||||
|             lpred = 1 | ||||
|         if lpred > len(data[0]) - 1: | ||||
|             lpred = len(data[0]) - 1 | ||||
| 
 | ||||
|         z1 = np.append(data[0][0:rind], np.zeros(lpred)) | ||||
|         z2 = np.append(data[1][0:rind], np.zeros(lpred)) | ||||
|         z3 = np.append(data[2][0:rind], np.zeros(lpred)) | ||||
|         for i in range(rind, rind + lpred): | ||||
|             for j in range(1, len(arpara) + 1): | ||||
|                 ji = j - 1 | ||||
|                 # lag j (not j - 1), as in arPredZ | ||||
|                 z1[i] = z1[i] + arpara[ji] * z1[i - j] | ||||
|                 z2[i] = z2[i] + arpara[ji] * z2[i - j] | ||||
|                 z3[i] = z3[i] + arpara[ji] * z3[i - j] | ||||
| 
 | ||||
|         z = np.array([z1.tolist(), z2.tolist(), z3.tolist()]) | ||||
|         self.xpred = z | ||||
							
								
								
									
pylot/core/pick/compare.py (new file, 496 lines)
| @@ -0,0 +1,496 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import copy | ||||
| import operator | ||||
| import os | ||||
| import numpy as np | ||||
| import glob | ||||
| import matplotlib.pyplot as plt | ||||
| 
 | ||||
| from obspy import read_events | ||||
| 
 | ||||
| from pylot.core.io.phases import picksdict_from_picks | ||||
| from pylot.core.util.pdf import ProbabilityDensityFunction | ||||
| from pylot.core.util.utils import find_in_list | ||||
| from pylot.core.util.version import get_git_version as _getVersionString | ||||
| 
 | ||||
| __version__ = _getVersionString() | ||||
| __author__ = 'sebastianw' | ||||
| 
 | ||||
| 
 | ||||
| class Comparison(object): | ||||
|     """ | ||||
|     A Comparison object contains information on the evaluated picks' | ||||
|     probability density functions and compares two pick sets by computing | ||||
|     the difference of their pdfs. The results can be displayed as histograms | ||||
|     showing their properties. | ||||
|     """ | ||||
| 
 | ||||
|     def __init__(self, **kwargs): | ||||
|         names = list() | ||||
|         self._pdfs = dict() | ||||
|         for name, fn in kwargs.items(): | ||||
|             if isinstance(fn, PDFDictionary): | ||||
|                 self._pdfs[name] = fn | ||||
|             elif isinstance(fn, dict): | ||||
|                 self._pdfs[name] = PDFDictionary(fn) | ||||
|             else: | ||||
|                 self._pdfs[name] = PDFDictionary.from_quakeml(fn) | ||||
|             names.append(name) | ||||
|         if len(names) != 2: | ||||
|             raise ValueError('Comparison is only defined for exactly two ' | ||||
|                              'arguments!') | ||||
|         self._names = names | ||||
|         self._compare = self.compare_picksets() | ||||
| 
 | ||||
|     def __nonzero__(self): | ||||
|         if not len(self.names) == 2 or not self._pdfs: | ||||
|             return False | ||||
|         return True | ||||
| 
 | ||||
|     def get(self, name): | ||||
|         return self._pdfs[name] | ||||
| 
 | ||||
|     @property | ||||
|     def names(self): | ||||
|         return self._names | ||||
| 
 | ||||
|     @names.setter | ||||
|     def names(self, names): | ||||
|         assert isinstance(names, list) and len(names) == 2, \ | ||||
|             'variable "names" is either not a list or its ' \ | ||||
|             'length is not 2: names: {names}'.format(names=names) | ||||
|         self._names = names | ||||
| 
 | ||||
|     @property | ||||
|     def comparison(self): | ||||
|         return self._compare | ||||
| 
 | ||||
|     @property | ||||
|     def stations(self): | ||||
|         return self.comparison.keys() | ||||
| 
 | ||||
|     @property | ||||
|     def nstations(self): | ||||
|         return len(self.stations) | ||||
| 
 | ||||
|     def compare_picksets(self, type='exp'): | ||||
|         """ | ||||
|         Compare the two picksets of this Comparison object and return a | ||||
|         dictionary compiling the results. Comparison is carried out with the | ||||
|         help of pdf representations of the picks and a probabilistic approach | ||||
|         to the time difference of two onset measurements. | ||||
|         :param type: type of the pdf representation used for the picks | ||||
|          (default: 'exp') | ||||
|         :type type: str | ||||
|         :return: dictionary containing the resulting comparison pdfs for all picks | ||||
|         :rtype: dict | ||||
|         """ | ||||
|         compare_pdfs = dict() | ||||
| 
 | ||||
|         pdf_a = self.get(self.names[0]).generate_pdf_data(type) | ||||
|         pdf_b = self.get(self.names[1]).generate_pdf_data(type) | ||||
| 
 | ||||
|         for station, phases in pdf_a.items(): | ||||
|             if station in pdf_b.keys(): | ||||
|                 compare_pdf = dict() | ||||
|                 for phase in phases: | ||||
|                     if phase in pdf_b[station].keys(): | ||||
|                         compare_pdf[phase] = phases[phase] - pdf_b[station][ | ||||
|                             phase] | ||||
|                 if compare_pdf:  # skip stations without common phases | ||||
|                     compare_pdfs[station] = compare_pdf | ||||
| 
 | ||||
|         return compare_pdfs | ||||
| 
 | ||||
|     def plot(self, stations=None): | ||||
|         if stations is None: | ||||
|             nstations = self.nstations | ||||
|             stations = self.stations | ||||
|         else: | ||||
|             nstations = len(stations) | ||||
|         istations = range(nstations) | ||||
|         fig, axarr = plt.subplots(nstations, 2, sharex='col', sharey='row') | ||||
| 
 | ||||
|         for n in istations: | ||||
|             station = stations[n] | ||||
|             if station not in self.comparison.keys(): | ||||
|                 continue | ||||
|             compare_pdf = self.comparison[station] | ||||
|             for l, phase in enumerate(compare_pdf.keys()): | ||||
|                 axarr[n, l].plot(compare_pdf[phase].axis, | ||||
|                                  compare_pdf[phase].data) | ||||
|                 if n == 0: | ||||
|                     axarr[n, l].set_title(phase) | ||||
|                 if l == 0: | ||||
|                     axann = axarr[n, l].annotate(station, xy=(.05, .5), | ||||
|                                                  xycoords='axes fraction') | ||||
|                     bbox_props = dict(boxstyle='round', facecolor='lightgrey', | ||||
|                                       alpha=.7) | ||||
|                     axann.set_bbox(bbox_props) | ||||
|                 if n == int(np.median(istations)) and l == 0: | ||||
|                     label = 'probability density (qualitative)' | ||||
|                     axarr[n, l].set_ylabel(label) | ||||
|         plt.setp([a.get_xticklabels() for a in axarr[0, :]], visible=False) | ||||
|         plt.setp([a.get_yticklabels() for a in axarr[:, 1]], visible=False) | ||||
|         plt.setp([a.get_yticklabels() for a in axarr[:, 0]], visible=False) | ||||
| 
 | ||||
|         plt.show() | ||||
| 
 | ||||
|     def get_all(self, phasename): | ||||
|         pdf_dict = self.comparison | ||||
|         rlist = list() | ||||
|         for phases in pdf_dict.values(): | ||||
|             try: | ||||
|                 rlist.append(phases[phasename]) | ||||
|             except KeyError: | ||||
|                 continue | ||||
|         return rlist | ||||
| 
 | ||||
|     def get_array(self, phase, method_name): | ||||
|         method = operator.methodcaller(method_name) | ||||
|         pdf_list = self.get_all(phase) | ||||
|         rarray = list(map(method, pdf_list))  # list() for Python 3 compatibility | ||||
|         return np.array(rarray) | ||||
| 
 | ||||
|     def get_expectation_array(self, phase): | ||||
|         return self.get_array(phase, 'expectation') | ||||
| 
 | ||||
|     def get_std_array(self, phase): | ||||
|         return self.get_array(phase, 'standard_deviation') | ||||
| 
 | ||||
|     def hist_expectation(self, phases='all', bins=20, normed=False): | ||||
|         phases = phases.strip() | ||||
|         if phases.find('all') == 0: | ||||
|             phases = 'ps' | ||||
|         phases = phases.upper() | ||||
|         nsp = len(phases) | ||||
|         fig, axarray = plt.subplots(1, nsp, sharey=True) | ||||
|         for n, phase in enumerate(phases): | ||||
|             ax = axarray[n] | ||||
|             data = self.get_expectation_array(phase) | ||||
|             xlims = [min(data), max(data)] | ||||
|             ax.hist(data, range=xlims, bins=bins, normed=normed) | ||||
|             title_str = 'phase: {0}, samples: {1}'.format(phase, len(data)) | ||||
|             ax.set_title(title_str) | ||||
|             ax.set_xlabel('expectation [s]') | ||||
|             if n == 0: | ||||
|                 ax.set_ylabel('abundance [-]') | ||||
|         plt.setp([a.get_yticklabels() for a in axarray[1:]], visible=False) | ||||
|         plt.show() | ||||
| 
 | ||||
|     def hist_standard_deviation(self, phases='all', bins=20, normed=False): | ||||
|         phases = phases.strip() | ||||
|         if phases.find('all') == 0: | ||||
|             phases = 'ps' | ||||
|         phases = phases.upper() | ||||
|         nsp = len(phases) | ||||
|         fig, axarray = plt.subplots(1, nsp, sharey=True) | ||||
|         for n, phase in enumerate(phases): | ||||
|             ax = axarray[n] | ||||
|             data = self.get_std_array(phase) | ||||
|             xlims = [min(data), max(data)] | ||||
|             ax.hist(data, range=xlims, bins=bins, normed=normed) | ||||
|             title_str = 'phase: {0}, samples: {1}'.format(phase, len(data)) | ||||
|             ax.set_title(title_str) | ||||
|             ax.set_xlabel('standard deviation [s]') | ||||
|             if n == 0: | ||||
|                 ax.set_ylabel('abundance [-]') | ||||
|         plt.setp([a.get_yticklabels() for a in axarray[1:]], visible=False) | ||||
|         plt.show() | ||||
| 
 | ||||
|     def hist(self, type='std'): | ||||
|         pass | ||||
| 
 | ||||
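| # --- illustrative usage sketch, not part of the original module --- | ||||
| # compare a manual against an automatic pick set; the QuakeML file names | ||||
| # and the keyword names are examples only: | ||||
| # | ||||
| #     comp = Comparison(manual='e0001.manual.xml', auto='e0001.auto.xml') | ||||
| #     comp.plot()               # per-station pdfs of the pick differences | ||||
| #     comp.hist_expectation()   # histograms of the expected differences | ||||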
| 
 | ||||
| class PDFDictionary(object): | ||||
|     """ | ||||
|     A PDFDictionary is a dictionary like object containing structured data on | ||||
|     the probability density function of seismic phase onsets. | ||||
|     """ | ||||
| 
 | ||||
|     def __init__(self, data): | ||||
|         self._pickdata = data | ||||
|         self._pdfdata = self.generate_pdf_data() | ||||
| 
 | ||||
|     def __nonzero__(self): | ||||
|         if len(self.pick_data) < 1: | ||||
|             return False | ||||
|         else: | ||||
|             return True | ||||
| 
 | ||||
|     def __getitem__(self, item): | ||||
|         return self.pdf_data[item] | ||||
| 
 | ||||
|     @property | ||||
|     def pdf_data(self): | ||||
|         return self._pdfdata | ||||
| 
 | ||||
|     @pdf_data.setter | ||||
|     def pdf_data(self, data): | ||||
|         self._pdfdata = data | ||||
| 
 | ||||
|     @property | ||||
|     def pick_data(self): | ||||
|         return self._pickdata | ||||
| 
 | ||||
|     @pick_data.setter | ||||
|     def pick_data(self, data): | ||||
|         self._pickdata = data | ||||
| 
 | ||||
|     @property | ||||
|     def stations(self): | ||||
|         return self.pick_data.keys() | ||||
| 
 | ||||
|     @property | ||||
|     def nstations(self): | ||||
|         return len(self.stations) | ||||
| 
 | ||||
|     @classmethod | ||||
|     def from_quakeml(cls, fn): | ||||
|         cat = read_events(fn) | ||||
|         if len(cat) > 1: | ||||
|             raise NotImplementedError('reading more than one event at the same ' | ||||
|                                       'time is not implemented yet! Sorry!') | ||||
|         return cls(picksdict_from_picks(cat[0])) | ||||
| 
 | ||||
|     def get_all(self, phase): | ||||
|         rlist = list() | ||||
|         for phases in self.pdf_data.values(): | ||||
|             try: | ||||
|                 rlist.append(phases[phase]) | ||||
|             except KeyError: | ||||
|                 continue | ||||
|         return rlist | ||||
| 
 | ||||
|     def generate_pdf_data(self, type='exp'): | ||||
|         """ | ||||
|         Returns a probability density function dictionary containing the | ||||
|         representation of the actual pick_data. | ||||
|         :param type: type of the returned | ||||
|          `~pylot.core.util.pdf.ProbabilityDensityFunction` object | ||||
|         :type type: str | ||||
|         :return: a dictionary containing the picks represented as pdfs | ||||
|         """ | ||||
| 
 | ||||
|         pdf_picks = copy.deepcopy(self.pick_data) | ||||
| 
 | ||||
|         for station, phases in pdf_picks.items(): | ||||
|             for phase, values in phases.items(): | ||||
|                 if phase not in 'PS': | ||||
|                     continue | ||||
|                 phases[phase] = ProbabilityDensityFunction.from_pick( | ||||
|                     values['epp'], | ||||
|                     values['mpp'], | ||||
|                     values['lpp'], | ||||
|                     type=type) | ||||
| 
 | ||||
|         return pdf_picks | ||||
| 
 | ||||
|     def plot(self, stations=None): | ||||
|         ''' | ||||
|         plots the probability density functions for either the desired | ||||
|         STATIONS or all available data | ||||
|         :param stations: list of stations to be plotted | ||||
|         :type stations: list | ||||
|         :return: matplotlib figure object containing the plot | ||||
|         ''' | ||||
|         assert stations is None or isinstance(stations, list), \ | ||||
|             'parameter stations should be a list not {0}'.format(type(stations)) | ||||
|         if not stations: | ||||
|             nstations = self.nstations | ||||
|             stations = self.stations | ||||
|         else: | ||||
|             nstations = len(stations) | ||||
| 
 | ||||
|         istations = range(nstations) | ||||
|         fig, axarr = plt.subplots(nstations, 2, sharex='col', sharey='row') | ||||
|         hide_labels = True | ||||
| 
 | ||||
|         for n in istations: | ||||
|             station = stations[n] | ||||
|             pdfs = self.pdf_data[station] | ||||
|             for l, phase in enumerate(pdfs.keys()): | ||||
|                 try: | ||||
|                     axarr[n, l].plot(pdfs[phase].axis, pdfs[phase].data()) | ||||
|                     if n == 0: | ||||
|                         axarr[n, l].set_title(phase) | ||||
|                     if l == 0: | ||||
|                         axann = axarr[n, l].annotate(station, xy=(.05, .5), | ||||
|                                                      xycoords='axes fraction') | ||||
|                         bbox_props = dict(boxstyle='round', facecolor='lightgrey', | ||||
|                                           alpha=.7) | ||||
|                         axann.set_bbox(bbox_props) | ||||
|                     if n == int(np.median(istations)) and l == 0: | ||||
|                         label = 'probability density (qualitative)' | ||||
|                         axarr[n, l].set_ylabel(label) | ||||
|                 except IndexError as e: | ||||
|                     print('trying aligned plotting\n{0}'.format(e)) | ||||
|                     hide_labels = False | ||||
|                     axarr[l].plot(pdfs[phase].axis, pdfs[phase].data()) | ||||
|                     axarr[l].set_title(phase) | ||||
|                     if l == 0: | ||||
|                         axann = axarr[l].annotate(station, xy=(.05, .5), | ||||
|                                                   xycoords='axes fraction') | ||||
|                         bbox_props = dict(boxstyle='round', facecolor='lightgrey', | ||||
|                                           alpha=.7) | ||||
|                         axann.set_bbox(bbox_props) | ||||
|         if hide_labels: | ||||
|             plt.setp([a.get_xticklabels() for a in axarr[0, :]], visible=False) | ||||
|             plt.setp([a.get_yticklabels() for a in axarr[:, 1]], visible=False) | ||||
|             plt.setp([a.get_yticklabels() for a in axarr[:, 0]], visible=False) | ||||
| 
 | ||||
|         return fig | ||||
| 
 | ||||
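| # A minimal usage sketch (the file name 'event.xml' is hypothetical): read | ||||
| # picks from QuakeML, report the number of picked stations and plot the PDFs: | ||||
| # | ||||
| #     pdf_dict = PDFDictionary.from_quakeml('event.xml') | ||||
| #     print('{0} stations picked'.format(pdf_dict.nstations)) | ||||
| #     fig = pdf_dict.plot() | ||||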
| 
 | ||||
| class PDFstatistics(object): | ||||
|     """ | ||||
|     This object can be used to get various statistical values from probability density functions. | ||||
|     Takes a path as argument. | ||||
|     """ | ||||
| 
 | ||||
| 
 | ||||
|     def __init__(self, directory): | ||||
|         """Initiates some values needed when dealing with pdfs later""" | ||||
|         self._rootdir = directory | ||||
|         self._evtlist = list() | ||||
|         self._rphase = None | ||||
|         self.make_fnlist() | ||||
| 
 | ||||
|     def make_fnlist(self, fn_pattern='*.xml'): | ||||
|         """ | ||||
|         Takes a file pattern and recursively searches the object's root path for matching files. | ||||
|         :param fn_pattern: A pattern that can identify all datafiles. Default Value = '*.xml' | ||||
|         :type  fn_pattern: string | ||||
|         :return: creates a list of events saved in the PDFstatistics object. | ||||
|         """ | ||||
|         evtlist = list() | ||||
|         for root, _, files in os.walk(self.root): | ||||
|             for file in files: | ||||
|                 if file.endswith(fn_pattern[1:]): | ||||
|                     evtlist.append(os.path.join(root, file)) | ||||
|         self._evtlist = evtlist | ||||
| 
 | ||||
|     def __iter__(self): | ||||
|         for evt in self._evtlist: | ||||
|             yield PDFDictionary.from_quakeml(evt) | ||||
| 
 | ||||
|     def __getitem__(self, item): | ||||
|         evt = find_in_list(self._evtlist, item) | ||||
|         if evt: | ||||
|             return PDFDictionary.from_quakeml(evt) | ||||
|         return None | ||||
| 
 | ||||
|     @property | ||||
|     def root(self): | ||||
|         return self._rootdir | ||||
| 
 | ||||
|     @root.setter | ||||
|     def root(self, value): | ||||
|         if os.path.exists(value): | ||||
|             self._rootdir = value | ||||
|         else: | ||||
|             raise ValueError("path doesn't exist: %s" % value) | ||||
| 
 | ||||
|     @property | ||||
|     def curphase(self): | ||||
|         """ | ||||
|         return the current phase type of interest | ||||
|         :return: current phase | ||||
|         """ | ||||
|         return self._rphase | ||||
| 
 | ||||
|     @curphase.setter | ||||
|     def curphase(self, type): | ||||
|         """ | ||||
|         setter method for property curphase | ||||
|         :param type: specify the phase type of interest | ||||
|         :type  type: string ('p' or 's') | ||||
|         :return: - | ||||
|         """ | ||||
|         if type.upper() not in 'PS': | ||||
|             raise ValueError("phase type must be either 'P' or 'S'!") | ||||
|         else: | ||||
|             self._rphase = type.upper() | ||||
| 
 | ||||
|     def get(self, property='std', value=None): | ||||
|         """ | ||||
|         takes a property string and a probability value and returns all | ||||
|         values of that property for the current phase of interest | ||||
|         :func:`self.curphase` | ||||
| 
 | ||||
|         :param property: property name (default: 'std') | ||||
|         :type property: str | ||||
|         :param value: probability value :math:`\alpha` | ||||
|         :type  value: float | ||||
|         :return: list containing all values of that property | ||||
|         """ | ||||
|         assert isinstance(self.curphase, | ||||
|                           str), 'phase has to be set before being ' \ | ||||
|                                 'able to iterate over items...' | ||||
|         rlist = [] | ||||
|         method_options = dict(STD='standard_deviation', | ||||
|                               Q='quantile', | ||||
|                               QD='quantile_distance', | ||||
|                               QDF='quantile_dist_frac') | ||||
| 
 | ||||
|         # create method caller for easy mapping | ||||
|         if property.upper() == 'STD': | ||||
|             method = operator.methodcaller(method_options[property.upper()]) | ||||
|         elif value is not None: | ||||
|             try: | ||||
|                 method = operator.methodcaller(method_options[property.upper()], | ||||
|                                                value) | ||||
|             except KeyError: | ||||
|                 raise KeyError('unknown property: {0}'.format(property.upper())) | ||||
|         else: | ||||
|             raise ValueError("for call to method {0} value has to be " | ||||
|                              "defined but is 'None' ".format(method_options[ | ||||
|                                                              property.upper()])) | ||||
| 
 | ||||
|         for pdf_dict in self: | ||||
|             # create worklist | ||||
|             wlist = pdf_dict.get_all(self.curphase) | ||||
|             # map method calls to object in worklist | ||||
|             rlist += map(method, wlist) | ||||
| 
 | ||||
|         return rlist | ||||
| 
 | ||||
|     def writeThetaToFile(self, array, out_dir): | ||||
|         """ | ||||
|         Method to write array-like data to file. Useful since acquiring the | ||||
|         values can take a serious amount of time when dealing with large | ||||
|         databases. | ||||
|         :param array: List of values. | ||||
|         :type  array: list | ||||
|         :param out_dir: Path to save the file to, including the file name. | ||||
|         :type  out_dir: str | ||||
|         :return: Saves a file at the given output path. | ||||
|         """ | ||||
|         with open(out_dir, 'w') as fid: | ||||
|             for val in array: | ||||
|                 fid.write(str(val) + '\n') | ||||
| 
 | ||||
| 
 | ||||
| def main(): | ||||
|     root_dir = '/home/sebastianp/Codetesting/xmls/' | ||||
|     Insheim = PDFstatistics(root_dir) | ||||
|     Insheim.curphase = 'p' | ||||
|     qdlist = Insheim.get('qdf', 0.2) | ||||
|     print(qdlist) | ||||
| 
 | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     import cProfile | ||||
| 
 | ||||
|     pr = cProfile.Profile() | ||||
|     pr.enable() | ||||
|     main() | ||||
|     pr.disable() | ||||
|     # after your program ends | ||||
|     pr.print_stats(sort="calls") | ||||

399	pylot/core/pick/picker.py	Normal file
						| @ -0,0 +1,399 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
| Created Dec 2014 to Feb 2015 | ||||
| Implementation of the automated picking algorithms published and described in: | ||||
| 
 | ||||
| Kueperkoch, L., Meier, T., Lee, J., Friederich, W., & Egelados Working Group, 2010: | ||||
| Automated determination of P-phase arrival times at regional and local distances | ||||
| using higher order statistics, Geophys. J. Int., 181, 1159-1170 | ||||
| 
 | ||||
| Kueperkoch, L., Meier, T., Bruestle, A., Lee, J., Friederich, W., & Egelados | ||||
| Working Group, 2012: Automated determination of S-phase arrival times using | ||||
|     autoregressive prediction: application to local and regional distances, Geophys. J. Int., | ||||
| 188, 687-702. | ||||
| 
 | ||||
| The picks with the above described algorithms are assumed to be the most likely picks. | ||||
| For each most likely pick the corresponding earliest and latest possible picks are | ||||
| calculated after Diehl & Kissling (2009). | ||||
| 
 | ||||
| :author: MAGS2 EP3 working group / Ludger Kueperkoch | ||||
| """ | ||||
| 
 | ||||
| import numpy as np | ||||
| import matplotlib.pyplot as plt | ||||
| from pylot.core.pick.utils import getnoisewin, getsignalwin | ||||
| from pylot.core.pick.charfuns import CharacteristicFunction | ||||
| import warnings | ||||
| 
 | ||||
| 
 | ||||
| class AutoPicker(object): | ||||
|     ''' | ||||
|     Superclass of the different automated picking algorithms, applied to a CF | ||||
|     determined using AIC, HOS, or AR prediction. | ||||
|     ''' | ||||
| 
 | ||||
|     warnings.simplefilter('ignore') | ||||
| 
 | ||||
|     def __init__(self, cf, TSNR, PickWindow, iplot=None, aus=None, Tsmooth=None, Pick1=None): | ||||
|         ''' | ||||
|         :param: cf, characteristic function, on which the picking algorithm is applied | ||||
|         :type: `~pylot.core.pick.CharFuns.CharacteristicFunction` object | ||||
| 
 | ||||
|         :param: TSNR, length of time windows around pick used to determine SNR [s] | ||||
|         :type: tuple (T_noise, T_gap, T_signal) | ||||
| 
 | ||||
|         :param: PickWindow, length of pick window [s] | ||||
|         :type: float | ||||
| 
 | ||||
|         :param: iplot, no. of figure window for plotting interim results | ||||
|         :type: integer | ||||
| 
 | ||||
|         :param: aus ("artificial uplift of samples"), find local minimum at i if aic(i-1)*(1+aus) >= aic(i) | ||||
|         :type: float | ||||
| 
 | ||||
|         :param: Tsmooth, length of moving smoothing window to calculate smoothed CF [s] | ||||
|         :type: float | ||||
| 
 | ||||
|         :param: Pick1, initial (preliminary) onset time, starting point for PragPicker and | ||||
|          EarlLatePicker | ||||
|         :type: float | ||||
| 
 | ||||
|         ''' | ||||
| 
 | ||||
|         assert isinstance(cf, CharacteristicFunction), "%s is not a CharacteristicFunction object" % str(cf) | ||||
| 
 | ||||
|         self.cf = cf.getCF() | ||||
|         self.Tcf = cf.getTimeArray() | ||||
|         self.Data = cf.getXCF() | ||||
|         self.dt = cf.getIncrement() | ||||
|         self.setTSNR(TSNR) | ||||
|         self.setPickWindow(PickWindow) | ||||
|         self.setiplot(iplot) | ||||
|         self.setaus(aus) | ||||
|         self.setTsmooth(Tsmooth) | ||||
|         self.setpick1(Pick1) | ||||
|         self.calcPick() | ||||
| 
 | ||||
|     def __str__(self): | ||||
|         return '''\n\t{name} object:\n | ||||
|         TSNR:\t\t\t{TSNR}\n | ||||
|         PickWindow:\t{PickWindow}\n | ||||
|         aus:\t{aus}\n | ||||
|         Tsmooth:\t{Tsmooth}\n | ||||
|         Pick1:\t{Pick1}\n | ||||
|         '''.format(name=type(self).__name__, | ||||
|                    TSNR=self.getTSNR(), | ||||
|                    PickWindow=self.getPickWindow(), | ||||
|                    aus=self.getaus(), | ||||
|                    Tsmooth=self.getTsmooth(), | ||||
|                    Pick1=self.getpick1()) | ||||
| 
 | ||||
|     def getTSNR(self): | ||||
|         return self.TSNR | ||||
| 
 | ||||
|     def setTSNR(self, TSNR): | ||||
|         self.TSNR = TSNR | ||||
| 
 | ||||
|     def getPickWindow(self): | ||||
|         return self.PickWindow | ||||
| 
 | ||||
|     def setPickWindow(self, PickWindow): | ||||
|         self.PickWindow = PickWindow | ||||
| 
 | ||||
|     def getaus(self): | ||||
|         return self.aus | ||||
| 
 | ||||
|     def setaus(self, aus): | ||||
|         self.aus = aus | ||||
| 
 | ||||
|     def setTsmooth(self, Tsmooth): | ||||
|         self.Tsmooth = Tsmooth | ||||
| 
 | ||||
|     def getTsmooth(self): | ||||
|         return self.Tsmooth | ||||
| 
 | ||||
|     def getpick(self): | ||||
|         return self.Pick | ||||
| 
 | ||||
|     def getSNR(self): | ||||
|         return self.SNR | ||||
| 
 | ||||
|     def getSlope(self): | ||||
|         return self.slope | ||||
| 
 | ||||
|     def getiplot(self): | ||||
|         return self.iplot | ||||
| 
 | ||||
|     def setiplot(self, iplot): | ||||
|         self.iplot = iplot | ||||
| 
 | ||||
|     def getpick1(self): | ||||
|         return self.Pick1 | ||||
| 
 | ||||
|     def setpick1(self, Pick1): | ||||
|         self.Pick1 = Pick1 | ||||
| 
 | ||||
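|     # calcPick is a stub here; concrete pickers (AICPicker, PragPicker) | ||||
|     # override it, and __init__ invokes it once all parameters are set | ||||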
|     def calcPick(self): | ||||
|         self.Pick = None | ||||
| 
 | ||||
| 
 | ||||
| class AICPicker(AutoPicker): | ||||
|     ''' | ||||
|     Method to derive the onset time of an arriving phase based on CF | ||||
|     derived from AIC. In order to get an impression of the quality of this initial pick, | ||||
|     a quality assessment is applied based on SNR and slope determination derived from the CF, | ||||
|     from which the AIC has been calculated. | ||||
|     ''' | ||||
| 
 | ||||
|     def calcPick(self): | ||||
| 
 | ||||
|         print('AICPicker: Get initial onset time (pick) from AIC-CF ...') | ||||
| 
 | ||||
|         self.Pick = None | ||||
|         self.slope = None | ||||
|         self.SNR = None | ||||
|         # find NaN's | ||||
|         nn = np.isnan(self.cf) | ||||
|         if nn.any(): | ||||
|             self.cf[nn] = 0 | ||||
|         # taper AIC-CF to get rid of side maxima | ||||
|         tap = np.hanning(len(self.cf)) | ||||
|         aic = tap * self.cf + max(abs(self.cf)) | ||||
|         # smooth AIC-CF | ||||
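|         # (recursive moving average: aicsmooth[i] equals the mean over the | ||||
|         # last ismooth samples once i > ismooth) | ||||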
|         ismooth = int(round(self.Tsmooth / self.dt)) | ||||
|         aicsmooth = np.zeros(len(aic)) | ||||
|         if len(aic) < ismooth: | ||||
|             print('AICPicker: Tsmooth larger than CF!') | ||||
|             return | ||||
|         else: | ||||
|             for i in range(1, len(aic)): | ||||
|                 if i > ismooth: | ||||
|                     ii1 = i - ismooth | ||||
|                     aicsmooth[i] = aicsmooth[i - 1] + (aic[i] - aic[ii1]) / ismooth | ||||
|                 else: | ||||
|                     aicsmooth[i] = np.mean(aic[1: i]) | ||||
|         # remove offset | ||||
|         offset = abs(min(aic) - min(aicsmooth)) | ||||
|         aicsmooth = aicsmooth - offset | ||||
|         # get maximum of 1st derivative of AIC-CF (more stable!) as starting point | ||||
|         diffcf = np.diff(aicsmooth) | ||||
|         # find NaN's | ||||
|         nn = np.isnan(diffcf) | ||||
|         if nn.any(): | ||||
|             diffcf[nn] = 0 | ||||
|         # taper CF to get rid of side maxima | ||||
|         tap = np.hanning(len(diffcf)) | ||||
|         diffcf = tap * diffcf * max(abs(aicsmooth)) | ||||
|         icfmax = np.argmax(diffcf) | ||||
| 
 | ||||
|         # find minimum in AIC-CF in front of the maximum | ||||
|         lpickwindow = int(round(self.PickWindow / self.dt)) | ||||
|         for i in range(icfmax - 1, max([icfmax - lpickwindow, 2]), -1): | ||||
|             if aicsmooth[i - 1] >= aicsmooth[i]: | ||||
|                 self.Pick = self.Tcf[i] | ||||
|                 break | ||||
|         # if no minimum could be found: | ||||
|         # search in 1st derivative of AIC-CF | ||||
|         if self.Pick is None: | ||||
|             for i in range(icfmax - 1, max([icfmax - lpickwindow, 2]), -1): | ||||
|                 if diffcf[i - 1] >= diffcf[i]: | ||||
|                     self.Pick = self.Tcf[i] | ||||
|                     break | ||||
| 
 | ||||
|         # quality assessment using SNR and slope from CF | ||||
|         if self.Pick is not None: | ||||
|             # get noise window | ||||
|             inoise = getnoisewin(self.Tcf, self.Pick, self.TSNR[0], self.TSNR[1]) | ||||
|             # check, if these are counts or m/s, important for slope estimation! | ||||
|             # this is quick and dirty, better solution? | ||||
|             if max(self.Data[0].data) < 1e-3: | ||||
|                 self.Data[0].data = self.Data[0].data * 1000000 | ||||
|             # get signal window | ||||
|             isignal = getsignalwin(self.Tcf, self.Pick, self.TSNR[2]) | ||||
|             # calculate SNR from CF | ||||
|             self.SNR = max(abs(aic[isignal] - np.mean(aic[isignal]))) / \ | ||||
|                        max(abs(aic[inoise] - np.mean(aic[inoise]))) | ||||
|             # calculate slope from CF after initial pick | ||||
|             # get slope window | ||||
|             tslope = self.TSNR[3]  # slope determination window | ||||
|             islope = np.where((self.Tcf <= min([self.Pick + tslope, self.Tcf[-1]])) \ | ||||
|                               & (self.Tcf >= self.Pick)) | ||||
|             # find maximum within slope determination window | ||||
|             # because slope should be calculated up to the first local minimum only! | ||||
|             imax = np.argmax(self.Data[0].data[islope]) | ||||
|             if imax == 0: | ||||
|                 print('AICPicker: Maximum for slope determination right at the beginning of the window!') | ||||
|                 print('Choose longer slope determination window!') | ||||
|                 if self.iplot > 1: | ||||
|                     p = plt.figure(self.iplot) | ||||
|                     x = self.Data[0].data | ||||
|                     p1, = plt.plot(self.Tcf, x / max(x), 'k') | ||||
|                     p2, = plt.plot(self.Tcf, aicsmooth / max(aicsmooth), 'r') | ||||
|                     plt.legend([p1, p2], ['(HOS-/AR-) Data', 'Smoothed AIC-CF']) | ||||
|                     plt.xlabel('Time [s] since %s' % self.Data[0].stats.starttime) | ||||
|                     plt.yticks([]) | ||||
|                     plt.title(self.Data[0].stats.station) | ||||
|                     plt.show() | ||||
|                     raw_input() | ||||
|                     plt.close(p) | ||||
|                 return | ||||
|             islope = islope[0][0:imax] | ||||
|             dataslope = self.Data[0].data[islope] | ||||
|             # calculate slope as polynomial fit of order 1 | ||||
|             xslope = np.arange(0, len(dataslope), 1) | ||||
|             P = np.polyfit(xslope, dataslope, 1) | ||||
|             datafit = np.polyval(P, xslope) | ||||
|             if datafit[0] >= datafit[len(datafit) - 1]: | ||||
|                 print('AICPicker: Negative slope, bad onset skipped!') | ||||
|                 return | ||||
|             self.slope = 1 / tslope * (datafit[len(dataslope) - 1] - datafit[0]) | ||||
| 
 | ||||
|         else: | ||||
|             self.SNR = None | ||||
|             self.slope = None | ||||
| 
 | ||||
|         if self.iplot > 1: | ||||
|             p = plt.figure(self.iplot) | ||||
|             x = self.Data[0].data | ||||
|             p1, = plt.plot(self.Tcf, x / max(x), 'k') | ||||
|             p2, = plt.plot(self.Tcf, aicsmooth / max(aicsmooth), 'r') | ||||
|             if self.Pick is not None: | ||||
|                 p3, = plt.plot([self.Pick, self.Pick], [-0.1, 0.5], 'b', linewidth=2) | ||||
|                 plt.legend([p1, p2, p3], ['(HOS-/AR-) Data', 'Smoothed AIC-CF', 'AIC-Pick']) | ||||
|             else: | ||||
|                 plt.legend([p1, p2], ['(HOS-/AR-) Data', 'Smoothed AIC-CF']) | ||||
|             plt.xlabel('Time [s] since %s' % self.Data[0].stats.starttime) | ||||
|             plt.yticks([]) | ||||
|             plt.title(self.Data[0].stats.station) | ||||
| 
 | ||||
|             if self.Pick is not None: | ||||
|                 plt.figure(self.iplot + 1) | ||||
|                 p11, = plt.plot(self.Tcf, x, 'k') | ||||
|                 p12, = plt.plot(self.Tcf[inoise], self.Data[0].data[inoise]) | ||||
|                 p13, = plt.plot(self.Tcf[isignal], self.Data[0].data[isignal], 'r') | ||||
|                 p14, = plt.plot(self.Tcf[islope], dataslope, 'g--') | ||||
|                 p15, = plt.plot(self.Tcf[islope], datafit, 'g', linewidth=2) | ||||
|                 plt.legend([p11, p12, p13, p14, p15], | ||||
|                            ['Data', 'Noise Window', 'Signal Window', 'Slope Window', 'Slope'], | ||||
|                            loc='best') | ||||
|                 plt.title('Station %s, SNR=%7.2f, Slope= %12.2f counts/s' % (self.Data[0].stats.station, | ||||
|                                                                              self.SNR, self.slope)) | ||||
|                 plt.xlabel('Time [s] since %s' % self.Data[0].stats.starttime) | ||||
|                 plt.ylabel('Counts') | ||||
|                 plt.yticks([]) | ||||
| 
 | ||||
|             plt.show() | ||||
|             raw_input() | ||||
|             plt.close(p) | ||||
| 
 | ||||
|         if self.Pick is None: | ||||
|             print('AICPicker: Could not find minimum, picking window too short?') | ||||
| 
 | ||||
| 
 | ||||
| class PragPicker(AutoPicker): | ||||
|     ''' | ||||
|     Method of pragmatic picking exploiting information given by CF. | ||||
|     ''' | ||||
| 
 | ||||
|     def calcPick(self): | ||||
| 
 | ||||
|         if self.getpick1() is not None: | ||||
|             print('PragPicker: Get most likely pick from HOS- or AR-CF using pragmatic picking algorithm ...') | ||||
| 
 | ||||
|             self.Pick = None | ||||
|             self.SNR = None | ||||
|             self.slope = None | ||||
|             pickflag = 0 | ||||
|             # smooth CF | ||||
|             ismooth = int(round(self.Tsmooth / self.dt)) | ||||
|             cfsmooth = np.zeros(len(self.cf)) | ||||
|             if len(self.cf) < ismooth: | ||||
|                 print('PragPicker: Tsmooth larger than CF!') | ||||
|                 return | ||||
|             else: | ||||
|                 for i in range(1, len(self.cf)): | ||||
|                     if i > ismooth: | ||||
|                         ii1 = i - ismooth | ||||
|                         cfsmooth[i] = cfsmooth[i - 1] + (self.cf[i] - self.cf[ii1]) / ismooth | ||||
|                     else: | ||||
|                         cfsmooth[i] = np.mean(self.cf[1: i]) | ||||
| 
 | ||||
|             # select picking window | ||||
|             # which is centered around tpick1 | ||||
|             ipick = np.where((self.Tcf >= self.getpick1() - self.PickWindow / 2) \ | ||||
|                              & (self.Tcf <= self.getpick1() + self.PickWindow / 2)) | ||||
|             cfipick = self.cf[ipick] - np.mean(self.cf[ipick]) | ||||
|             Tcfpick = self.Tcf[ipick] | ||||
|             cfsmoothipick = cfsmooth[ipick] - np.mean(self.cf[ipick]) | ||||
|             ipick1 = np.argmin(abs(self.Tcf - self.getpick1())) | ||||
|             cfpick1 = 2 * self.cf[ipick1] | ||||
| 
 | ||||
|             # check trend of CF, i.e. differences of CF and adjust aus regarding this trend | ||||
|             # prominent trend: decrease aus | ||||
|             # flat: use given aus | ||||
|             cfdiff = np.diff(cfipick) | ||||
|             i0diff = np.where(cfdiff > 0) | ||||
|             cfdiff = cfdiff[i0diff] | ||||
|             minaus = min(cfdiff * (1 + self.aus)) | ||||
|             aus1 = max([minaus, self.aus]) | ||||
| 
 | ||||
|             # at first we look to the right until the end of the pick window is reached | ||||
|             flagpick_r = 0 | ||||
|             flagpick_l = 0 | ||||
|             cfpick_r = 0 | ||||
|             cfpick_l = 0 | ||||
|             lpickwindow = int(round(self.PickWindow / self.dt)) | ||||
|             for i in range(max(np.insert(ipick, 0, 2)), min([ipick1 + lpickwindow + 1, len(self.cf) - 1])): | ||||
|                 if self.cf[i + 1] > self.cf[i] and self.cf[i - 1] >= self.cf[i]: | ||||
|                     if cfsmooth[i - 1] * (1 + aus1) >= cfsmooth[i]: | ||||
|                         if cfpick1 >= self.cf[i]: | ||||
|                             pick_r = self.Tcf[i] | ||||
|                             self.Pick = pick_r | ||||
|                             flagpick_l = 1 | ||||
|                             cfpick_r = self.cf[i] | ||||
|                             break | ||||
| 
 | ||||
|             # now we look to the left | ||||
|             for i in range(ipick1, max([ipick1 - lpickwindow + 1, 2]), -1): | ||||
|                 if self.cf[i + 1] > self.cf[i] and self.cf[i - 1] >= self.cf[i]: | ||||
|                     if cfsmooth[i - 1] * (1 + aus1) >= cfsmooth[i]: | ||||
|                         if cfpick1 >= self.cf[i]: | ||||
|                             pick_l = self.Tcf[i] | ||||
|                             self.Pick = pick_l | ||||
|                             flagpick_r = 1 | ||||
|                             cfpick_l = self.cf[i] | ||||
|                             break | ||||
| 
 | ||||
|             # now decide which pick: left or right? | ||||
|             if flagpick_l > 0 and flagpick_r > 0 and cfpick_l <= 3 * cfpick_r: | ||||
|                 self.Pick = pick_l | ||||
|                 pickflag = 1 | ||||
|             elif flagpick_l > 0 and flagpick_r > 0 and cfpick_l >= cfpick_r: | ||||
|                 self.Pick = pick_r | ||||
|                 pickflag = 1 | ||||
|             elif flagpick_l == 0 and flagpick_r > 0 and cfpick_l >= cfpick_r: | ||||
|                 self.Pick = pick_l | ||||
|                 pickflag = 1 | ||||
|             else: | ||||
|                 print('PragPicker: Could not find reliable onset!') | ||||
|                 self.Pick = None | ||||
|                 pickflag = 0 | ||||
| 
 | ||||
|             if self.getiplot() > 1: | ||||
|                 p = plt.figure(self.getiplot()) | ||||
|                 p1, = plt.plot(Tcfpick, cfipick, 'k') | ||||
|                 p2, = plt.plot(Tcfpick, cfsmoothipick, 'r') | ||||
|                 if pickflag > 0: | ||||
|                     p3, = plt.plot([self.Pick, self.Pick], [min(cfipick), max(cfipick)], 'b', linewidth=2) | ||||
|                     plt.legend([p1, p2, p3], ['CF', 'Smoothed CF', 'Pick']) | ||||
|                 plt.xlabel('Time [s] since %s' % self.Data[0].stats.starttime) | ||||
|                 plt.yticks([]) | ||||
|                 plt.title(self.Data[0].stats.station) | ||||
|                 plt.show() | ||||
|                 raw_input() | ||||
|                 plt.close(p) | ||||
| 
 | ||||
|         else: | ||||
|             print('PragPicker: No initial onset time given! Check input!') | ||||
|             self.Pick = None | ||||
|             return | ||||

989	pylot/core/pick/utils.py	Normal file
						| @ -0,0 +1,989 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
|    Created Mar/Apr 2015 | ||||
|    Collection of helpful functions for manual and automatic picking. | ||||
| 
 | ||||
|    :author: Ludger Kueperkoch / MAGS2 EP3 working group | ||||
| """ | ||||
| 
 | ||||
| import warnings | ||||
| 
 | ||||
| import matplotlib.pyplot as plt | ||||
| import numpy as np | ||||
| from obspy.core import Stream, UTCDateTime | ||||
| 
 | ||||
| 
 | ||||
| def earllatepicker(X, nfac, TSNR, Pick1, iplot=None, stealth_mode=False): | ||||
|     ''' | ||||
|     Function to derive earliest and latest possible pick after Diehl & Kissling (2009) | ||||
|     as reasonable uncertainties. Latest possible pick is based on noise level, | ||||
|     earliest possible pick is half a signal wavelength in front of most likely | ||||
|     pick given by PragPicker or manually set by analyst. Most likely pick | ||||
|     (initial pick Pick1) must be given. | ||||
| 
 | ||||
|     :param: X, time series (seismogram) | ||||
|     :type:  `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|     :param: nfac (noise factor), nfac times noise level to calculate latest possible pick | ||||
|     :type: int | ||||
| 
 | ||||
|     :param: TSNR, length of time windows around pick used to determine SNR [s] | ||||
|     :type: tuple (T_noise, T_gap, T_signal) | ||||
| 
 | ||||
|     :param: Pick1, initial (most likely) onset time, starting point for earllatepicker | ||||
|     :type: float | ||||
| 
 | ||||
|     :param: iplot, if given, results are plotted in figure(iplot) | ||||
|     :type: int | ||||
|     ''' | ||||
| 
 | ||||
|     assert isinstance(X, Stream), "%s is not a stream object" % str(X) | ||||
| 
 | ||||
|     LPick = None | ||||
|     EPick = None | ||||
|     PickError = None | ||||
|     if stealth_mode is False: | ||||
|         print('earllatepicker: Get earliest and latest possible pick' | ||||
|               ' relative to most likely pick ...') | ||||
| 
 | ||||
|     x = X[0].data | ||||
|     t = np.arange(0, X[0].stats.npts / X[0].stats.sampling_rate, | ||||
|                   X[0].stats.delta) | ||||
|     inoise = getnoisewin(t, Pick1, TSNR[0], TSNR[1]) | ||||
|     # get signal window | ||||
|     isignal = getsignalwin(t, Pick1, TSNR[2]) | ||||
|     # remove mean | ||||
|     x = x - np.mean(x[inoise]) | ||||
|     # calculate noise level | ||||
|     nlevel = np.sqrt(np.mean(np.square(x[inoise]))) * nfac | ||||
|     # get time where signal exceeds nlevel | ||||
|     ilup, = np.where(x[isignal] > nlevel) | ||||
|     ildown, = np.where(x[isignal] < -nlevel) | ||||
|     if not ilup.size and not ildown.size: | ||||
|         if stealth_mode is False: | ||||
|             print ("earllatepicker: Signal lower than noise level!\n" | ||||
|                    "Skip this trace!") | ||||
|         return LPick, EPick, PickError | ||||
|     il = min(np.min(ilup) if ilup.size else float('inf'), | ||||
|              np.min(ildown) if ildown.size else float('inf')) | ||||
|     LPick = t[isignal][il] | ||||
| 
 | ||||
|     # get earliest possible pick | ||||
| 
 | ||||
|     EPick = np.nan | ||||
|     count = 0 | ||||
|     pis = isignal | ||||
| 
 | ||||
|     # if EPick stays NaN the signal window size will be doubled | ||||
|     while np.isnan(EPick): | ||||
|         if count > 0: | ||||
|             if stealth_mode is False: | ||||
|                 print("\nearllatepicker: Doubled signal window size %s time(s) " | ||||
|                       "because of NaN for earliest pick." % count) | ||||
|             isigDoubleWinStart = pis[-1] + 1 | ||||
|             isignalDoubleWin = np.arange(isigDoubleWinStart, | ||||
|                                          isigDoubleWinStart + len(pis)) | ||||
|             if (isigDoubleWinStart + len(pis)) < X[0].data.size: | ||||
|                 pis = np.concatenate((pis, isignalDoubleWin)) | ||||
|             else: | ||||
|                 if stealth_mode is False: | ||||
|                     print("Could not double signal window. Index out of bounds.") | ||||
|                 break | ||||
|         count += 1 | ||||
|         # determine all zero crossings in signal window (demeaned) | ||||
|         zc = crossings_nonzero_all(x[pis] - x[pis].mean()) | ||||
|         # calculate mean half period T0 of signal as the average of the time | ||||
|         # differences between consecutive zero crossings | ||||
|         T0 = np.mean(np.diff(zc)) * X[0].stats.delta  # this is half wave length! | ||||
|         EPick = Pick1 - T0  # half wavelength as suggested by Diehl et al. | ||||
| 
 | ||||
|     # get symmetric pick error as mean from earliest and latest possible pick | ||||
|     # by weighting latest possible pick two times earliest possible pick | ||||
|     diffti_tl = LPick - Pick1 | ||||
|     diffti_te = Pick1 - EPick | ||||
|     PickError = symmetrize_error(diffti_te, diffti_tl) | ||||
| 
 | ||||
|     if iplot > 1: | ||||
|         p = plt.figure(iplot) | ||||
|         p1, = plt.plot(t, x, 'k') | ||||
|         p2, = plt.plot(t[inoise], x[inoise]) | ||||
|         p3, = plt.plot(t[isignal], x[isignal], 'r') | ||||
|         p4, = plt.plot([t[0], t[int(len(t)) - 1]], [nlevel, nlevel], '--k') | ||||
|         p5, = plt.plot(t[isignal[zc]], np.zeros(len(zc)), '*g', | ||||
|                        markersize=14) | ||||
|         plt.legend([p1, p2, p3, p4, p5], | ||||
|                    ['Data', 'Noise Window', 'Signal Window', 'Noise Level', | ||||
|                     'Zero Crossings'], | ||||
|                    loc='best') | ||||
|         plt.plot([t[0], t[int(len(t)) - 1]], [-nlevel, -nlevel], '--k') | ||||
|         plt.plot([Pick1, Pick1], [max(x), -max(x)], 'b', linewidth=2) | ||||
|         plt.plot([LPick, LPick], [max(x) / 2, -max(x) / 2], '--k') | ||||
|         plt.plot([EPick, EPick], [max(x) / 2, -max(x) / 2], '--k') | ||||
|         plt.plot([Pick1 + PickError, Pick1 + PickError], | ||||
|                  [max(x) / 2, -max(x) / 2], 'r--') | ||||
|         plt.plot([Pick1 - PickError, Pick1 - PickError], | ||||
|                  [max(x) / 2, -max(x) / 2], 'r--') | ||||
|         plt.xlabel('Time [s] since %s' % X[0].stats.starttime) | ||||
|         plt.yticks([]) | ||||
|         plt.title( | ||||
|             'Earliest-/Latest Possible/Most Likely Pick & Symmetric Pick Error, %s' % | ||||
|             X[0].stats.station) | ||||
|         plt.show() | ||||
|         raw_input() | ||||
|         plt.close(p) | ||||
| 
 | ||||
|     return EPick, LPick, PickError | ||||
| 
 | ||||
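| # A usage sketch with assumed values (X is an obspy Stream holding the trace, | ||||
| # Pick1 the most likely onset in seconds relative to trace start): | ||||
| # | ||||
| #     epick, lpick, pick_error = earllatepicker(X, nfac=1.5, | ||||
| #                                               TSNR=(8., .2, 1.), Pick1=12.3) | ||||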
| 
 | ||||
| def fmpicker(Xraw, Xfilt, pickwin, Pick, iplot=None): | ||||
|     ''' | ||||
|     Function to derive first motion (polarity) of given phase onset Pick. | ||||
|     Calculation is based on zero crossings determined within time window pickwin | ||||
|     after given onset time. | ||||
| 
 | ||||
|     :param: Xraw, unfiltered time series (seismogram) | ||||
|     :type:  `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|     :param: Xfilt, filtered time series (seismogram) | ||||
|     :type:  `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|     :param: pickwin, time window after onset Pick within which zero crossings are calculated | ||||
|     :type: float | ||||
| 
 | ||||
|     :param: Pick, initial (most likely) onset time, starting point for fmpicker | ||||
|     :type: float | ||||
| 
 | ||||
|     :param: iplot, if given, results are plotted in figure(iplot) | ||||
|     :type: int | ||||
|     ''' | ||||
| 
 | ||||
|     warnings.simplefilter('ignore', np.RankWarning) | ||||
| 
 | ||||
|     assert isinstance(Xraw, Stream), "%s is not a stream object" % str(Xraw) | ||||
|     assert isinstance(Xfilt, Stream), "%s is not a stream object" % str(Xfilt) | ||||
| 
 | ||||
|     FM = None | ||||
|     if Pick is not None: | ||||
|         print ("fmpicker: Get first motion (polarity) of onset using unfiltered seismogram...") | ||||
| 
 | ||||
|         xraw = Xraw[0].data | ||||
|         xfilt = Xfilt[0].data | ||||
|         t = np.arange(0, Xraw[0].stats.npts / Xraw[0].stats.sampling_rate, | ||||
|                       Xraw[0].stats.delta) | ||||
|         # get pick window | ||||
|         ipick = np.where( | ||||
|             (t <= min([Pick + pickwin, t[-1]])) & (t >= Pick)) | ||||
|         # remove mean | ||||
|         xraw[ipick] = xraw[ipick] - np.mean(xraw[ipick]) | ||||
|         xfilt[ipick] = xfilt[ipick] - np.mean(xfilt[ipick]) | ||||
| 
 | ||||
|         # get zero crossings after most likely pick | ||||
|         # initial onset is assumed to be the first zero crossing | ||||
|         # first from unfiltered trace | ||||
|         zc1 = [] | ||||
|         zc1.append(Pick) | ||||
|         index1 = [] | ||||
|         i = 0 | ||||
|         for j in range(ipick[0][1], ipick[0][len(t[ipick]) - 1]): | ||||
|             i = i + 1 | ||||
|             if xraw[j - 1] <= 0 <= xraw[j]: | ||||
|                 zc1.append(t[ipick][i]) | ||||
|                 index1.append(i) | ||||
|             elif xraw[j - 1] > 0 >= xraw[j]: | ||||
|                 zc1.append(t[ipick][i]) | ||||
|                 index1.append(i) | ||||
|             if len(zc1) == 3: | ||||
|                 break | ||||
| 
 | ||||
|         # if time difference between 1st and 2nd zero crossing | ||||
|         # is too short, get time difference between 1st and 3rd | ||||
|         # to derive maximum | ||||
|         if zc1[1] - zc1[0] <= Xraw[0].stats.delta: | ||||
|             li1 = index1[1] | ||||
|         else: | ||||
|             li1 = index1[0] | ||||
|         if np.size(xraw[ipick[0][1]:ipick[0][li1]]) == 0: | ||||
|             print ("fmpicker: Onset on unfiltered trace too emergent for first motion determination!") | ||||
|             P1 = None | ||||
|         else: | ||||
|             imax1 = np.argmax(abs(xraw[ipick[0][1]:ipick[0][li1]])) | ||||
|             if imax1 == 0: | ||||
|                 imax1 = np.argmax(abs(xraw[ipick[0][1]:ipick[0][index1[1]]])) | ||||
|                 if imax1 == 0: | ||||
|                     print ("fmpicker: Zero crossings too close!") | ||||
|                     print ("Skip first motion determination!") | ||||
|                     return FM | ||||
| 
 | ||||
|             islope1 = np.where((t >= Pick) & (t <= Pick + t[imax1])) | ||||
|             # calculate slope as polynomial fit of order 1 | ||||
|             xslope1 = np.arange(0, len(xraw[islope1]), 1) | ||||
|             P1 = np.polyfit(xslope1, xraw[islope1], 1) | ||||
|             datafit1 = np.polyval(P1, xslope1) | ||||
| 
 | ||||
|         # now using filtered trace | ||||
|         # next zero crossings after most likely pick | ||||
|         zc2 = [] | ||||
|         zc2.append(Pick) | ||||
|         index2 = [] | ||||
|         i = 0 | ||||
|         for j in range(ipick[0][1], ipick[0][len(t[ipick]) - 1]): | ||||
|             i = i + 1 | ||||
|             if xfilt[j - 1] <= 0 <= xfilt[j]: | ||||
|                 zc2.append(t[ipick][i]) | ||||
|                 index2.append(i) | ||||
|             elif xfilt[j - 1] > 0 >= xfilt[j]: | ||||
|                 zc2.append(t[ipick][i]) | ||||
|                 index2.append(i) | ||||
|             if len(zc2) == 3: | ||||
|                 break | ||||
| 
 | ||||
|         # if time difference between 1st and 2nd zero crossing | ||||
|         # is too short, get time difference between 1st and 3rd | ||||
|         # to derive maximum | ||||
|         if zc2[1] - zc2[0] <= Xfilt[0].stats.delta: | ||||
|             li2 = index2[1] | ||||
|         else: | ||||
|             li2 = index2[0] | ||||
|         if np.size(xfilt[ipick[0][1]:ipick[0][li2]]) == 0: | ||||
|             print ("fmpicker: Onset on filtered trace too emergent for first motion determination!") | ||||
|             P2 = None | ||||
|         else: | ||||
|             imax2 = np.argmax(abs(xfilt[ipick[0][1]:ipick[0][li2]])) | ||||
|             if imax2 == 0: | ||||
|                 imax2 = np.argmax(abs(xfilt[ipick[0][1]:ipick[0][index2[1]]])) | ||||
|                 if imax2 == 0: | ||||
|                     print ("fmpicker: Zero crossings too close!") | ||||
|                     print ("Skip first motion determination!") | ||||
|                     return FM | ||||
| 
 | ||||
|             islope2 = np.where((t >= Pick) & (t <= Pick + t[imax2])) | ||||
|             # calculate slope as polynomial fit of order 1 | ||||
|             xslope2 = np.arange(0, len(xfilt[islope2]), 1) | ||||
|             P2 = np.polyfit(xslope2, xfilt[islope2], 1) | ||||
|             datafit2 = np.polyval(P2, xslope2) | ||||
| 
 | ||||
|         # compare results | ||||
|         if P1 is not None and P2 is not None: | ||||
|             if P1[0] < 0 and P2[0] < 0: | ||||
|                 FM = 'D' | ||||
|             elif P1[0] >= 0 > P2[0]: | ||||
|                 FM = '-' | ||||
|             elif P1[0] < 0 <= P2[0]: | ||||
|                 FM = '-' | ||||
|             elif P1[0] > 0 and P2[0] > 0: | ||||
|                 FM = 'U' | ||||
|             elif P1[0] <= 0 < P2[0]: | ||||
|                 FM = '+' | ||||
|             elif P1[0] > 0 >= P2[0]: | ||||
|                 FM = '+' | ||||
| 
 | ||||
|         print ("fmpicker: Found polarity %s" % FM) | ||||
| 
 | ||||
|     if iplot > 1: | ||||
|         plt.figure(iplot) | ||||
|         plt.subplot(2, 1, 1) | ||||
|         plt.plot(t, xraw, 'k') | ||||
|         p1, = plt.plot([Pick, Pick], [max(xraw), -max(xraw)], 'b', linewidth=2) | ||||
|         if P1 is not None: | ||||
|             p2, = plt.plot(t[islope1], xraw[islope1]) | ||||
|             p3, = plt.plot(zc1, np.zeros(len(zc1)), '*g', markersize=14) | ||||
|             p4, = plt.plot(t[islope1], datafit1, '--g', linewidth=2) | ||||
|             plt.legend([p1, p2, p3, p4], | ||||
|                        ['Pick', 'Slope Window', 'Zero Crossings', 'Slope'], | ||||
|                        loc='best') | ||||
|             plt.text(Pick + 0.02, max(xraw) / 2, '%s' % FM, fontsize=14) | ||||
|             ax = plt.gca() | ||||
|         plt.yticks([]) | ||||
|         plt.title('First-Motion Determination, %s, Unfiltered Data' % Xraw[ | ||||
|             0].stats.station) | ||||
| 
 | ||||
|         plt.subplot(2, 1, 2) | ||||
|         plt.title('First-Motion Determination, Filtered Data') | ||||
|         plt.plot(t, xfilt, 'k') | ||||
|         p1, = plt.plot([Pick, Pick], [max(xfilt), -max(xfilt)], 'b', | ||||
|                        linewidth=2) | ||||
|         if P2 is not None: | ||||
|             p2, = plt.plot(t[islope2], xfilt[islope2]) | ||||
|             p3, = plt.plot(zc2, np.zeros(len(zc2)), '*g', markersize=14) | ||||
|             p4, = plt.plot(t[islope2], datafit2, '--g', linewidth=2) | ||||
|             plt.text(Pick + 0.02, max(xfilt) / 2, '%s' % FM, fontsize=14) | ||||
|             ax = plt.gca() | ||||
|         plt.xlabel('Time [s] since %s' % Xraw[0].stats.starttime) | ||||
|         plt.yticks([]) | ||||
|         plt.show() | ||||
|         raw_input() | ||||
|         plt.close(iplot) | ||||
| 
 | ||||
|     return FM | ||||
| 
 | ||||
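| # A usage sketch with assumed values (Xraw and Xfilt are the unfiltered and | ||||
| # filtered obspy Streams of the same trace): | ||||
| # | ||||
| #     fm = fmpicker(Xraw, Xfilt, pickwin=0.5, Pick=12.3) | ||||
| #     # fm is one of 'U', 'D', '+', '-' or None | ||||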
| 
 | ||||
| def crossings_nonzero_all(data): | ||||
|     pos = data > 0 | ||||
|     npos = ~pos | ||||
|     return ((pos[:-1] & npos[1:]) | (npos[:-1] & pos[1:])).nonzero()[0] | ||||
| 
 | ||||
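| # Worked example: for data = np.array([1., -1., 2., 3., -4.]) the sign changes | ||||
| # after indices 0, 1 and 3, so crossings_nonzero_all(data) returns | ||||
| # array([0, 1, 3]) | ||||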
| 
 | ||||
| def symmetrize_error(dte, dtl): | ||||
|     """ | ||||
|     takes earliest and latest possible pick and returns the symmetrized pick | ||||
|     uncertainty value | ||||
|     :param dte: relative lower uncertainty | ||||
|     :param dtl: relative upper uncertainty | ||||
|     :return: symmetrized error | ||||
|     """ | ||||
|     return (dte + 2 * dtl) / 3 | ||||
| 
 | ||||
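| # Worked example: dte = 0.2 s and dtl = 0.5 s yield a symmetric pick error of | ||||
| # (0.2 + 2 * 0.5) / 3 = 0.4 s, i.e. the latest possible pick is weighted twice | ||||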
| 
 | ||||
| def getSNR(X, TSNR, t1, tracenum=0): | ||||
|     ''' | ||||
|     Function to calculate SNR of certain part of seismogram relative to | ||||
|     given time (onset) out of given noise and signal windows. A safety gap | ||||
|     between noise and signal part can be set. Returns SNR and SNR [dB] and | ||||
|     noiselevel. | ||||
| 
 | ||||
|     :param: X, time series (seismogram) | ||||
|     :type:  `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|     :param: TSNR, length of time windows [s] around t1 (onset) used to determine SNR | ||||
|     :type: tuple (T_noise, T_gap, T_signal) | ||||
| 
 | ||||
|     :param: t1, initial time (onset) from which noise and signal windows are calculated | ||||
|     :type: float | ||||
|     ''' | ||||
| 
 | ||||
|     assert isinstance(X, Stream), "%s is not a stream object" % str(X) | ||||
| 
 | ||||
|     x = X[tracenum].data | ||||
|     npts = X[tracenum].stats.npts | ||||
|     sr = X[tracenum].stats.sampling_rate | ||||
|     dt = X[tracenum].stats.delta | ||||
|     t = np.arange(0, npts / sr, dt) | ||||
| 
 | ||||
|     # get noise window | ||||
|     inoise = getnoisewin(t, t1, TSNR[0], TSNR[1]) | ||||
| 
 | ||||
|     # get signal window | ||||
|     isignal = getsignalwin(t, t1, TSNR[2]) | ||||
|     if np.size(inoise) < 1: | ||||
|         print ("getSNR: Empty array inoise, check noise window!") | ||||
|         return | ||||
|     elif np.size(isignal) < 1: | ||||
|         print ("getSNR: Empty array isignal, check signal window!") | ||||
|         return | ||||
| 
 | ||||
|     # remove noise-window mean from entire waveform | ||||
|     x = x - np.mean(x[inoise]) | ||||
| 
 | ||||
|     # calculate ratios | ||||
|     # noiselevel = np.sqrt(np.mean(np.square(x[inoise]))) | ||||
|     # signallevel = np.sqrt(np.mean(np.square(x[isignal]))) | ||||
| 
 | ||||
|     noiselevel = np.abs(x[inoise]).max() | ||||
|     signallevel = np.abs(x[isignal]).max() | ||||
| 
 | ||||
|     SNR = signallevel / noiselevel | ||||
|     SNRdB = 10 * np.log10(SNR) | ||||
| 
 | ||||
|     return SNR, SNRdB, noiselevel | ||||
| 
 | ||||
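| # A usage sketch with assumed window lengths (5 s noise window, 0.5 s gap, | ||||
| # 1 s signal window) around an onset t1 in seconds relative to trace start: | ||||
| # | ||||
| #     snr, snr_db, noiselevel = getSNR(X, (5., .5, 1.), t1) | ||||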
| 
 | ||||
| def getnoisewin(t, t1, tnoise, tgap): | ||||
|     ''' | ||||
|     Function to extract indices of data out of time series for noise calculation. | ||||
|     Returns an array of indices. | ||||
| 
 | ||||
|     :param: t, array of time stamps | ||||
|     :type:  numpy array | ||||
| 
 | ||||
|     :param: t1, time relative to which the noise window is extracted | ||||
|     :type: float | ||||
| 
 | ||||
|     :param: tnoise, length of time window [s] for noise part extraction | ||||
|     :type: float | ||||
| 
 | ||||
|     :param: tgap, safety gap between t1 (onset) and noise window to | ||||
|             ensure that the noise window contains no signal | ||||
|     :type: float | ||||
|     ''' | ||||
| 
 | ||||
|     # get noise window | ||||
|     inoise, = np.where((t <= max([t1 - tgap, 0])) \ | ||||
|                        & (t >= max([t1 - tnoise - tgap, 0]))) | ||||
|     if np.size(inoise) < 1: | ||||
|         print ("getnoisewin: Empty array inoise, check noise window!") | ||||
| 
 | ||||
|     return inoise | ||||
| 
 | ||||
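| # Worked example: with t1 = 10.0 s, tnoise = 5.0 s and tgap = 0.5 s the noise | ||||
| # window covers 4.5 s <= t <= 9.5 s, i.e. it ends tgap seconds before onset | ||||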
| 
 | ||||
| def getsignalwin(t, t1, tsignal): | ||||
|     ''' | ||||
|     Function to extract data out of time series for signal level calculation. | ||||
|     Returns an array of indices. | ||||
| 
 | ||||
|     :param: t, array of time stamps | ||||
|     :type:  numpy array | ||||
| 
 | ||||
|     :param: t1, time relative to which the signal window is extracted | ||||
|     :type: float | ||||
| 
 | ||||
|     :param: tsignal, length of time window [s] for signal level calculation | ||||
|     :type: float | ||||
|     ''' | ||||
| 
 | ||||
|     # get signal window | ||||
|     isignal, = np.where((t <= min([t1 + tsignal, t[-1]])) \ | ||||
|                         & (t >= t1)) | ||||
|     if np.size(isignal) < 1: | ||||
|         print ("getsignalwin: Empty array isignal, check signal window!") | ||||
| 
 | ||||
|     return isignal | ||||
| 
 | ||||
| 
 | ||||
| def getResolutionWindow(snr): | ||||
|     """ | ||||
|     Number -> Float | ||||
|     produce half of the time resolution window width from a given SNR | ||||
|     value | ||||
|           SNR >= 3    ->  2 sec    HRW | ||||
|       3 > SNR >= 2    ->  5 sec    MRW | ||||
|       2 > SNR >= 1.5  -> 10 sec    LRW | ||||
|     1.5 > SNR         -> 15 sec   VLRW | ||||
|     see also Diehl et al. 2009 | ||||
| 
 | ||||
|     >>> getResolutionWindow(0.5) | ||||
|     7.5 | ||||
|     >>> getResolutionWindow(1.8) | ||||
|     5.0 | ||||
|     >>> getResolutionWindow(2.3) | ||||
|     2.5 | ||||
|     >>> getResolutionWindow(4) | ||||
|     1.0 | ||||
|     >>> getResolutionWindow(2) | ||||
|     2.5 | ||||
|     """ | ||||
| 
 | ||||
|     res_wins = {'HRW': 2., 'MRW': 5., 'LRW': 10., 'VLRW': 15.} | ||||
| 
 | ||||
|     if snr < 1.5: | ||||
|         time_resolution = res_wins['VLRW'] | ||||
|     elif snr < 2.: | ||||
|         time_resolution = res_wins['LRW'] | ||||
|     elif snr < 3.: | ||||
|         time_resolution = res_wins['MRW'] | ||||
|     else: | ||||
|         time_resolution = res_wins['HRW'] | ||||
| 
 | ||||
|     return time_resolution / 2 | ||||
| 
 | ||||
| 
 | ||||
| def select_for_phase(st, phase): | ||||
|     ''' | ||||
|     takes a Stream object and a phase name and returns that particular component | ||||
|     which presumably shows the chosen PHASE best | ||||
| 
 | ||||
|     :param st: stream object containing one or more component[s] | ||||
|     :type st: `~obspy.core.stream.Stream` | ||||
|     :param phase: label of the phase for which the stream selection is carried | ||||
|         out; 'P' or 'S' | ||||
|     :type phase: str | ||||
|     :return: stream object containing the selected component[s] | ||||
|     ''' | ||||
|     from pylot.core.util.defaults import COMPNAME_MAP | ||||
| 
 | ||||
|     sel_st = Stream() | ||||
|     if phase.upper() == 'P': | ||||
|         comp = 'Z' | ||||
|         alter_comp = COMPNAME_MAP[comp] | ||||
|         sel_st += st.select(component=comp) | ||||
|         sel_st += st.select(component=alter_comp) | ||||
|     elif phase.upper() == 'S': | ||||
|         comps = 'NE' | ||||
|         for comp in comps: | ||||
|             alter_comp = COMPNAME_MAP[comp] | ||||
|             sel_st += st.select(component=comp) | ||||
|             sel_st += st.select(component=alter_comp) | ||||
|     else: | ||||
|         raise TypeError('Unknown phase label: {0}'.format(phase)) | ||||
|     return sel_st | ||||
| 
 | ||||
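| # Usage sketch: select_for_phase(st, 'P') keeps the vertical component(s), | ||||
| # select_for_phase(st, 'S') the horizontals, each plus its COMPNAME_MAP alias | ||||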
| 
 | ||||
| def wadaticheck(pickdic, dttolerance, iplot): | ||||
|     ''' | ||||
|     Function to calculate Wadati-diagram from given P and S onsets in order | ||||
|     to detect S pick outliers. If a certain S-P time deviates by more than | ||||
|     dttolerance from the regression of S-P times, the S pick is marked and downgraded. | ||||
| 
 | ||||
|     : param: pickdic, dictionary containing picks and quality parameters | ||||
|     : type:  dictionary | ||||
| 
 | ||||
|     : param: dttolerance, maximum adjusted deviation of S-P time from | ||||
|              S-P time regression | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: iplot, if iplot > 1, Wadati diagram is shown | ||||
|     : type:  int | ||||
|     ''' | ||||
| 
 | ||||
|     checkedonsets = pickdic | ||||
| 
 | ||||
|     # search for good quality picks and calculate S-P time | ||||
|     Ppicks = [] | ||||
|     Spicks = [] | ||||
|     SPtimes = [] | ||||
|     for key in pickdic: | ||||
|         if pickdic[key]['P']['weight'] < 4 and pickdic[key]['S']['weight'] < 4: | ||||
|             # calculate S-P time | ||||
|             spt = pickdic[key]['S']['mpp'] - pickdic[key]['P']['mpp'] | ||||
|             # add S-P time to dictionary | ||||
|             pickdic[key]['SPt'] = spt | ||||
|             # add P onsets and corresponding S-P times to list | ||||
|             UTCPpick = UTCDateTime(pickdic[key]['P']['mpp']) | ||||
|             UTCSpick = UTCDateTime(pickdic[key]['S']['mpp']) | ||||
|             Ppicks.append(UTCPpick.timestamp) | ||||
|             Spicks.append(UTCSpick.timestamp) | ||||
|             SPtimes.append(spt) | ||||
| 
 | ||||
|     if len(SPtimes) >= 3: | ||||
|         # calculate slope | ||||
|         p1 = np.polyfit(Ppicks, SPtimes, 1) | ||||
|         wdfit = np.polyval(p1, Ppicks) | ||||
|         wfitflag = 0 | ||||
| 
 | ||||
|         # calculate vp/vs ratio before check | ||||
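|         # (Wadati relation: ts - tp = (vp/vs - 1) * (tp - t0), so the | ||||
|         # regression slope p1[0] gives vp/vs = slope + 1) | ||||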
|         vpvsr = p1[0] + 1 | ||||
|         print ("###############################################") | ||||
|         print ("wadaticheck: Average Vp/Vs ratio before check: %f" % vpvsr) | ||||
| 
 | ||||
|         checkedPpicks = [] | ||||
|         checkedSpicks = [] | ||||
|         checkedSPtimes = [] | ||||
|         # calculate deviations from Wadati regression | ||||
|         ii = 0 | ||||
|         ibad = 0 | ||||
|         for key in pickdic: | ||||
|             if 'SPt' in pickdic[key]: | ||||
|                 wddiff = abs(pickdic[key]['SPt'] - wdfit[ii]) | ||||
|                 ii += 1 | ||||
|                 # check if the deviation is larger than the adjusted tolerance | ||||
|                 if wddiff > dttolerance: | ||||
|                     # mark onset and downgrade S-weight to 9 | ||||
|                     # (not used anymore) | ||||
|                     marker = 'badWadatiCheck' | ||||
|                     pickdic[key]['S']['weight'] = 9 | ||||
|                     ibad += 1 | ||||
|                 else: | ||||
|                     marker = 'goodWadatiCheck' | ||||
|                     checkedPpick = UTCDateTime(pickdic[key]['P']['mpp']) | ||||
|                     checkedPpicks.append(checkedPpick.timestamp) | ||||
|                     checkedSpick = UTCDateTime(pickdic[key]['S']['mpp']) | ||||
|                     checkedSpicks.append(checkedSpick.timestamp) | ||||
|                     checkedSPtime = pickdic[key]['S']['mpp'] - pickdic[key]['P']['mpp'] | ||||
|                     checkedSPtimes.append(checkedSPtime) | ||||
| 
 | ||||
|                 pickdic[key]['S']['marked'] = marker | ||||
| 
 | ||||
|         if len(checkedPpicks) >= 3: | ||||
|             # calculate new slope | ||||
|             p2 = np.polyfit(checkedPpicks, checkedSPtimes, 1) | ||||
|             wdfit2 = np.polyval(p2, checkedPpicks) | ||||
| 
 | ||||
|             # calculate vp/vs ratio after check | ||||
|             cvpvsr = p2[0] + 1 | ||||
|             print ("wadaticheck: Average Vp/Vs ratio after check: %f" % cvpvsr) | ||||
|             print ("wadatacheck: Skipped %d S pick(s)" % ibad) | ||||
|         else: | ||||
|             print ("###############################################") | ||||
|             print ("wadatacheck: Not enough checked S-P times available!") | ||||
|             print ("Skip Wadati check!") | ||||
| 
 | ||||
|         checkedonsets = pickdic | ||||
| 
 | ||||
|     else: | ||||
|         print ("wadaticheck: Not enough S-P times available for reliable regression!") | ||||
|         print ("Skip wadati check!") | ||||
|         wfitflag = 1 | ||||
| 
 | ||||
|     # plot results | ||||
|     if iplot > 1: | ||||
|         plt.figure(iplot) | ||||
|         f1, = plt.plot(Ppicks, SPtimes, 'ro') | ||||
|         if wfitflag == 0: | ||||
|             f2, = plt.plot(Ppicks, wdfit, 'k') | ||||
|             f3, = plt.plot(checkedPpicks, checkedSPtimes, 'ko') | ||||
|             f4, = plt.plot(checkedPpicks, wdfit2, 'g') | ||||
|             plt.title('Wadati-Diagram, %d S-P Times, Vp/Vs(raw)=%5.2f, ' \ | ||||
|                       'Vp/Vs(checked)=%5.2f' % (len(SPtimes), vpvsr, cvpvsr)) | ||||
|             plt.legend([f1, f2, f3, f4], ['Skipped S-Picks', 'Wadati 1', | ||||
|                                           'Reliable S-Picks', 'Wadati 2'], loc='best') | ||||
|         else: | ||||
|             plt.title('Wadati-Diagram, %d S-P Times' % len(SPtimes)) | ||||
| 
 | ||||
|         plt.ylabel('S-P Times [s]') | ||||
|         plt.xlabel('P Times [s]') | ||||
|         plt.show() | ||||
|         raw_input() | ||||
|         plt.close(iplot) | ||||
| 
 | ||||
|     return checkedonsets | ||||
| 
 | ||||
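The check above hinges on the classical Wadati relation: for a constant Vp/Vs ratio, the S-P times grow linearly with the P times, so the regression slope plus one recovers Vp/Vs. A minimal sketch with synthetic numbers (all values hypothetical, not part of PyLoT):

    import numpy as np

    t0 = 10.0                                     # hypothetical origin time [s]
    vpvs = 1.73                                   # assumed true Vp/Vs ratio
    Ppicks = t0 + np.array([2.0, 3.5, 5.1, 7.4])  # synthetic P onsets [s]
    SPtimes = (vpvs - 1) * (Ppicks - t0)          # corresponding S-P times [s]

    slope = np.polyfit(Ppicks, SPtimes, 1)[0]
    print("recovered Vp/Vs: %.2f" % (slope + 1))  # -> 1.73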
| 
 | ||||
| def checksignallength(X, pick, TSNR, minsiglength, nfac, minpercent, iplot): | ||||
|     ''' | ||||
|     Function to detect spuriously picked noise peaks. | ||||
|     Uses the RMS trace of all 3 components (if available) to determine | ||||
|     how many samples [per cent] after the P onset are above a certain | ||||
|     threshold, calculated from the noise level times a noise factor. | ||||
| 
 | ||||
|     : param: X, time series (seismogram) | ||||
|     : type:  `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|     : param: pick, initial (AIC) P onset time | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: TSNR, length of time windows around initial pick [s] | ||||
|     : type:  tuple (T_noise, T_gap, T_signal) | ||||
| 
 | ||||
|     : param: minsiglength, minimum required signal length [s] to | ||||
|              declare the pick as a P onset | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: nfac, noise factor (nfac * noise level = threshold) | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: minpercent, minimum required percentage of samples | ||||
|              above calculated threshold | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: iplot, if iplot > 1, results are shown in figure | ||||
|     : type:  int | ||||
|     ''' | ||||
| 
 | ||||
|     assert isinstance(X, Stream), "%s is not a stream object" % str(X) | ||||
| 
 | ||||
|     print ("Checking signal length ...") | ||||
| 
 | ||||
|     if len(X) > 1: | ||||
|         # all three components available | ||||
|         # make sure all components have equal lengths | ||||
|         ilen = min([len(X[0].data), len(X[1].data), len(X[2].data)]) | ||||
|         x1 = X[0][0:ilen] | ||||
|         x2 = X[1][0:ilen] | ||||
|         x3 = X[2][0:ilen] | ||||
|         # get RMS trace | ||||
|         rms = np.sqrt((np.power(x1, 2) + np.power(x2, 2) + np.power(x3, 2)) / 3) | ||||
|     else: | ||||
|         # only one component available | ||||
|         x1 = X[0].data | ||||
|         ilen = len(x1) | ||||
|         rms = np.sqrt(np.power(x1, 2)) | ||||
| 
 | ||||
|     t = np.arange(0, ilen / X[0].stats.sampling_rate, | ||||
|                   X[0].stats.delta) | ||||
| 
 | ||||
|     # get noise window in front of pick plus safety gap | ||||
|     inoise = getnoisewin(t, pick - 0.5, TSNR[0], TSNR[1]) | ||||
|     # get signal window | ||||
|     isignal = getsignalwin(t, pick, minsiglength) | ||||
|     # calculate minimum adjusted signal level | ||||
|     minsiglevel = max(rms[inoise]) * nfac | ||||
|     # minimum adjusted number of samples over minimum signal level | ||||
|     minnum = len(isignal) * minpercent / 100 | ||||
|     # get number of samples above minimum adjusted signal level | ||||
|     numoverthr = len(np.where(rms[isignal] >= minsiglevel)[0]) | ||||
| 
 | ||||
|     if numoverthr >= minnum: | ||||
|         print ("checksignallength: Signal reached required length.") | ||||
|         returnflag = 1 | ||||
|     else: | ||||
|         print ("checksignallength: Signal shorter than required minimum signal length!") | ||||
|         print ("Presumably picked noise peak, pick is rejected!") | ||||
|         print ("(min. signal length required: %s s)" % minsiglength) | ||||
|         returnflag = 0 | ||||
| 
 | ||||
|     if iplot == 2: | ||||
|         plt.figure(iplot) | ||||
|         p1, = plt.plot(t, rms, 'k') | ||||
|         p2, = plt.plot(t[inoise], rms[inoise], 'c') | ||||
|         p3, = plt.plot(t[isignal], rms[isignal], 'r') | ||||
|         p4, = plt.plot([t[isignal[0]], t[isignal[len(isignal) - 1]]], | ||||
|                        [minsiglevel, minsiglevel], 'g', linewidth=2) | ||||
|         p5, = plt.plot([pick, pick], [min(rms), max(rms)], 'b', linewidth=2) | ||||
|         plt.legend([p1, p2, p3, p4, p5], ['RMS Data', 'RMS Noise Window', | ||||
|                                           'RMS Signal Window', 'Minimum Signal Level', | ||||
|                                           'Onset'], loc='best') | ||||
|         plt.xlabel('Time [s] since %s' % X[0].stats.starttime) | ||||
|         plt.ylabel('Counts') | ||||
|         plt.title('Check for Signal Length, Station %s' % X[0].stats.station) | ||||
|         plt.yticks([]) | ||||
|         plt.show() | ||||
|         raw_input() | ||||
|         plt.close(iplot) | ||||
| 
 | ||||
|     return returnflag | ||||
| 
 | ||||
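The acceptance rule coded above can be condensed to a few lines; a minimal sketch with a synthetic RMS trace (noise and signal levels are assumptions for illustration):

    import numpy as np

    rms = np.concatenate([0.1 * np.random.rand(100),   # noise section, level ~0.1
                          1.0 * np.random.rand(200)])  # signal section, level ~1.0
    inoise = np.arange(0, 100)     # noise window indices
    isignal = np.arange(100, 300)  # signal window indices
    nfac, minpercent = 1.3, 30.

    minsiglevel = max(rms[inoise]) * nfac
    minnum = len(isignal) * minpercent / 100
    numoverthr = len(np.where(rms[isignal] >= minsiglevel)[0])
    print("pick accepted: %s" % (numoverthr >= minnum))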
| 
 | ||||
| def checkPonsets(pickdic, dttolerance, iplot): | ||||
|     ''' | ||||
|     Function to check the statistics of the P-onset times: controls the | ||||
|     deviation from the median (maximum allowed deviation = dttolerance) | ||||
|     and applies a jackknife test (a pseudo-bootstrapping technique). | ||||
| 
 | ||||
|     : param: pickdic, dictionary containing picks and quality parameters | ||||
|     : type:  dictionary | ||||
| 
 | ||||
|     : param: dttolerance, maximum adjusted deviation of P-onset time from | ||||
|              median of all P onsets | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: iplot, if iplot > 1, the results of the onset check are shown | ||||
|     : type:  int | ||||
|     ''' | ||||
| 
 | ||||
|     checkedonsets = pickdic | ||||
| 
 | ||||
|     # search for good quality P picks | ||||
|     Ppicks = [] | ||||
|     stations = [] | ||||
|     for key in pickdic: | ||||
|         if pickdic[key]['P']['weight'] < 4: | ||||
|             # add P onsets to list | ||||
|             UTCPpick = UTCDateTime(pickdic[key]['P']['mpp']) | ||||
|             Ppicks.append(UTCPpick.timestamp) | ||||
|             stations.append(key) | ||||
| 
 | ||||
|     # apply jackknife bootstrapping on variance of P onsets | ||||
|     print ("###############################################") | ||||
|     print ("checkPonsets: Apply jackknife bootstrapping on P-onset times ...") | ||||
|     [xjack, PHI_pseudo, PHI_sub] = jackknife(Ppicks, 'VAR', 1) | ||||
|     # get pseudo variances smaller than average variances | ||||
|     # (times safety factor), these picks passed jackknife test | ||||
|     ij = np.where(PHI_pseudo <= 2 * xjack) | ||||
|     # these picks did not pass jackknife test | ||||
|     badjk = np.where(PHI_pseudo > 2 * xjack) | ||||
|     badjkstations = np.array(stations)[badjk] | ||||
|     print ("checkPonsets: %d pick(s) did not pass jackknife test!" % len(badjkstations)) | ||||
| 
 | ||||
|     # calculate median from these picks | ||||
|     pmedian = np.median(np.array(Ppicks)[ij]) | ||||
|     # find picks that deviate less than dttolerance from median | ||||
|     ii = np.where(abs(np.array(Ppicks)[ij] - pmedian) <= dttolerance) | ||||
|     jj = np.where(abs(np.array(Ppicks)[ij] - pmedian) > dttolerance) | ||||
|     igood = ij[0][ii] | ||||
|     ibad = ij[0][jj] | ||||
|     goodstations = np.array(stations)[igood] | ||||
|     badstations = np.array(stations)[ibad] | ||||
| 
 | ||||
|     print ("checkPonsets: %d pick(s) deviate too much from median!" % len(ibad)) | ||||
|     print ("checkPonsets: Skipped %d P pick(s) out of %d" % (len(badstations) \ | ||||
|                                                              + len(badjkstations), len(stations))) | ||||
| 
 | ||||
|     goodmarker = 'goodPonsetcheck' | ||||
|     badmarker = 'badPonsetcheck' | ||||
|     badjkmarker = 'badjkcheck' | ||||
|     for i in range(0, len(goodstations)): | ||||
|         # mark P onset as checked and keep P weight | ||||
|         pickdic[goodstations[i]]['P']['marked'] = goodmarker | ||||
|     for i in range(0, len(badstations)): | ||||
|         # mark P onset and downgrade P weight to 9 | ||||
|         # (not used anymore) | ||||
|         pickdic[badstations[i]]['P']['marked'] = badmarker | ||||
|         pickdic[badstations[i]]['P']['weight'] = 9 | ||||
|     for i in range(0, len(badjkstations)): | ||||
|         # mark P onset and downgrade P weight to 9 | ||||
|         # (not used anymore) | ||||
|         pickdic[badjkstations[i]]['P']['marked'] = badjkmarker | ||||
|         pickdic[badjkstations[i]]['P']['weight'] = 9 | ||||
| 
 | ||||
|     checkedonsets = pickdic | ||||
| 
 | ||||
|     if iplot > 1: | ||||
|         p1, = plt.plot(np.arange(0, len(Ppicks)), Ppicks, 'r+', markersize=14) | ||||
|         p2, = plt.plot(igood, np.array(Ppicks)[igood], 'g*', markersize=14) | ||||
|         p3, = plt.plot([0, len(Ppicks) - 1], [pmedian, pmedian], 'g', | ||||
|                        linewidth=2) | ||||
|         for i in range(0, len(Ppicks)): | ||||
|             plt.text(i, Ppicks[i] + 0.2, stations[i]) | ||||
| 
 | ||||
|         plt.xlabel('Number of P Picks') | ||||
|         plt.ylabel('Onset Time [s] from 1.1.1970') | ||||
|         plt.legend([p1, p2, p3], ['Skipped P Picks', 'Good P Picks', 'Median'], | ||||
|                    loc='best') | ||||
|         plt.title('Check P Onsets') | ||||
|         plt.show() | ||||
|         raw_input() | ||||
| 
 | ||||
|     return checkedonsets | ||||
| 
 | ||||
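The median criterion applied above, reduced to its core; a minimal sketch with hypothetical onset times:

    import numpy as np

    Ppicks = np.array([12.1, 12.3, 12.2, 18.9, 12.25])  # hypothetical onsets [s]
    dttolerance = 1.0

    pmedian = np.median(Ppicks)
    good = np.where(abs(Ppicks - pmedian) <= dttolerance)[0]
    bad = np.where(abs(Ppicks - pmedian) > dttolerance)[0]
    print("good: %s, bad: %s" % (good, bad))  # the 18.9 s outlier lands in 'bad'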
| 
 | ||||
| def jackknife(X, phi, h): | ||||
|     ''' | ||||
|     Function to calculate the jackknife estimator for a given quantity, | ||||
|     a special type of bootstrapping. Returns the jackknife estimator | ||||
|     PHI_jack, the pseudo values PHI_pseudo and the subgroup parameters | ||||
|     PHI_sub. | ||||
| 
 | ||||
|     : param: X, given quantity | ||||
|     : type:  list | ||||
| 
 | ||||
|     : param: phi, chosen estimator, choose between: | ||||
|              "MED" for median | ||||
|              "MEA" for arithmetic mean | ||||
|              "VAR" for variance | ||||
|     : type:  string | ||||
| 
 | ||||
|     : param: h, size of subgroups, optional, default = 1 | ||||
|     : type:  integer | ||||
|     ''' | ||||
| 
 | ||||
|     PHI_jack = None | ||||
|     PHI_pseudo = None | ||||
|     PHI_sub = None | ||||
| 
 | ||||
|     # determine number of subgroups; len(X) must be divisible by h | ||||
|     g = len(X) // h | ||||
| 
 | ||||
|     if len(X) % h != 0: | ||||
|         print ("jackknife: Cannot divide quantity X in equal sized subgroups!") | ||||
|         print ("Choose another size for subgroups!") | ||||
|         return PHI_jack, PHI_pseudo, PHI_sub | ||||
|     else: | ||||
|         # estimator of undisturbed spot check | ||||
|         if phi == 'MEA': | ||||
|             phi_sc = np.mean(X) | ||||
|         elif phi == 'VAR': | ||||
|             phi_sc = np.var(X) | ||||
|         elif phi == 'MED': | ||||
|             phi_sc = np.median(X) | ||||
| 
 | ||||
|         # estimators of subgroups | ||||
|         PHI_pseudo = [] | ||||
|         PHI_sub = [] | ||||
|         for i in range(0, g): | ||||
|             # subgroup i, remove i-th sample | ||||
|             xx = X[:] | ||||
|             del xx[i] | ||||
|             # calculate estimators of disturbed spot check | ||||
|             if phi == 'MEA': | ||||
|                 phi_sub = np.mean(xx) | ||||
|             elif phi == 'VAR': | ||||
|                 phi_sub = np.var(xx) | ||||
|             elif phi == 'MED': | ||||
|                 phi_sub = np.median(xx) | ||||
| 
 | ||||
|             PHI_sub.append(phi_sub) | ||||
|             # pseudo values | ||||
|             phi_pseudo = g * phi_sc - ((g - 1) * phi_sub) | ||||
|             PHI_pseudo.append(phi_pseudo) | ||||
|         # jackknife estimator | ||||
|         PHI_jack = np.mean(PHI_pseudo) | ||||
| 
 | ||||
|     return PHI_jack, PHI_pseudo, PHI_sub | ||||
| 
 | ||||
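Usage sketch for the function above with h = 1 (leave-one-out), assuming jackknife() is importable from this module:

    X = [0.10, 0.12, 0.11, 0.13, 0.09]  # hypothetical sample
    PHI_jack, PHI_pseudo, PHI_sub = jackknife(X, 'VAR', 1)
    print("jackknife variance estimate: %s" % PHI_jack)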
| 
 | ||||
| def checkZ4S(X, pick, zfac, checkwin, iplot): | ||||
|     ''' | ||||
|     Function to compare energy content of vertical trace with | ||||
|     energy content of horizontal traces to detect spuriously | ||||
|     picked S onsets instead of P onsets. Usually, the P coda shows | ||||
|     larger longitudinal energy on the vertical trace than on the | ||||
|     horizontal traces, whereas the transversal energy within the | ||||
|     S coda is larger on the horizontal traces. Be careful: there | ||||
|     are special circumstances where this is not the case! | ||||
| 
 | ||||
|     : param: X, filtered(!) time series, three traces | ||||
|     : type:  `~obspy.core.stream.Stream` | ||||
| 
 | ||||
|     : param: pick, initial (AIC) P onset time | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: zfac, factor for threshold determination, | ||||
|              vertical energy must exceed coda level times zfac | ||||
|              to declare a pick as P onset | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: checkwin, window length [s] for calculating P-coda | ||||
|              energy content | ||||
|     : type:  float | ||||
| 
 | ||||
|     : param: iplot, if iplot > 1, energy content and threshold | ||||
|              are shown | ||||
|     : type:  int | ||||
|     ''' | ||||
| 
 | ||||
|     assert isinstance(X, Stream), "%s is not a stream object" % str(X) | ||||
| 
 | ||||
|     print ("Check for spuriously picked S onset instead of P onset ...") | ||||
| 
 | ||||
|     returnflag = 0 | ||||
| 
 | ||||
|     # split components | ||||
|     zdat = X.select(component="Z") | ||||
|     if len(zdat) == 0:  # check for other components | ||||
|         zdat = X.select(component="3") | ||||
|     edat = X.select(component="E") | ||||
|     if len(edat) == 0:  # check for other components | ||||
|         edat = X.select(component="2") | ||||
|     ndat = X.select(component="N") | ||||
|     if len(ndat) == 0:  # check for other components | ||||
|         ndat = X.select(component="1") | ||||
| 
 | ||||
|     z = zdat[0].data | ||||
|     tz = np.arange(0, zdat[0].stats.npts / zdat[0].stats.sampling_rate, | ||||
|                    zdat[0].stats.delta) | ||||
| 
 | ||||
|     # calculate RMS trace from vertical component | ||||
|     absz = np.sqrt(np.power(z, 2)) | ||||
|     # calculate RMS trace from both horizontal traces | ||||
|     # make sure both traces have equal lengths | ||||
|     lene = len(edat[0].data) | ||||
|     lenn = len(ndat[0].data) | ||||
|     minlen = min([lene, lenn]) | ||||
|     absen = np.sqrt(np.power(edat[0].data[0:minlen], 2) \ | ||||
|                     + np.power(ndat[0].data[0:minlen], 2)) | ||||
| 
 | ||||
|     # get signal window | ||||
|     isignal = getsignalwin(tz, pick, checkwin) | ||||
| 
 | ||||
|     # calculate energy levels | ||||
|     zcodalevel = max(absz[isignal]) | ||||
|     encodalevel = max(absen[isignal]) | ||||
| 
 | ||||
|     # calculate threshold | ||||
|     minsiglevel = encodalevel * zfac | ||||
| 
 | ||||
|     # vertical P-coda level must exceed horizontal P-coda level | ||||
|     # zfac times encodalevel | ||||
|     if zcodalevel < minsiglevel: | ||||
|         print ("checkZ4S: Maybe S onset? Skip this P pick!") | ||||
|     else: | ||||
|         print ("checkZ4S: P onset passes checkZ4S test!") | ||||
|         returnflag = 1 | ||||
| 
 | ||||
|     if iplot > 1: | ||||
|         te = np.arange(0, edat[0].stats.npts / edat[0].stats.sampling_rate, | ||||
|                        edat[0].stats.delta) | ||||
|         tn = np.arange(0, ndat[0].stats.npts / ndat[0].stats.sampling_rate, | ||||
|                        ndat[0].stats.delta) | ||||
|         plt.plot(tz, z / max(z), 'k') | ||||
|         plt.plot(tz[isignal], z[isignal] / max(z), 'r') | ||||
|         plt.plot(te, edat[0].data / max(edat[0].data) + 1, 'k') | ||||
|         plt.plot(te[isignal], edat[0].data[isignal] / max(edat[0].data) + 1, 'r') | ||||
|         plt.plot(tn, ndat[0].data / max(ndat[0].data) + 2, 'k') | ||||
|         plt.plot(tn[isignal], ndat[0].data[isignal] / max(ndat[0].data) + 2, 'r') | ||||
|         plt.plot([tz[isignal[0]], tz[isignal[len(isignal) - 1]]], | ||||
|                  [minsiglevel / max(z), minsiglevel / max(z)], 'g', | ||||
|                  linewidth=2) | ||||
|         plt.xlabel('Time [s] since %s' % zdat[0].stats.starttime) | ||||
|         plt.ylabel('Normalized Counts') | ||||
|         plt.yticks([0, 1, 2], [zdat[0].stats.channel, edat[0].stats.channel, | ||||
|                                ndat[0].stats.channel]) | ||||
|         plt.title('CheckZ4S, Station %s' % zdat[0].stats.station) | ||||
|         plt.show() | ||||
|         raw_input() | ||||
| 
 | ||||
|     return returnflag | ||||
| 
 | ||||
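The decision rule of checkZ4S reduces to a single comparison; a sketch with assumed amplitude levels (placeholder numbers):

    zfac = 1.5
    zcodalevel = 4.2   # assumed max. |amplitude| of the vertical P coda
    encodalevel = 1.8  # assumed max. |amplitude| of the horizontal traces

    # the pick passes if the vertical level exceeds zfac times the horizontal level
    print("P onset accepted: %s" % (zcodalevel >= encodalevel * zfac))  # True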
| 
 | ||||
| if __name__ == '__main__': | ||||
|     import doctest | ||||
| 
 | ||||
|     doctest.testmod() | ||||
pylot/core/util/__init__.py (2 lines, Executable file)
						| @ -0,0 +1,2 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| from pylot.core.util.version import get_git_version as _getVersionString | ||||
pylot/core/util/connection.py (13 lines, Normal file)
						| @ -0,0 +1,13 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import urllib2 | ||||
| 
 | ||||
| 
 | ||||
| def checkurl(url='https://ariadne.geophysik.rub.de/trac/PyLoT'): | ||||
|     try: | ||||
|         urllib2.urlopen(url, timeout=1) | ||||
|         return True | ||||
|     except urllib2.URLError: | ||||
|         pass | ||||
|     return False | ||||
pylot/core/util/dataprocessing.py (323 lines, Normal file)
						| @ -0,0 +1,323 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import os | ||||
| import glob | ||||
| import sys | ||||
| 
 | ||||
| import numpy as np | ||||
| 
 | ||||
| from obspy import UTCDateTime, read_inventory, read | ||||
| from obspy.io.xseed import Parser | ||||
| from pylot.core.util.utils import key_for_set_value, find_in_list, \ | ||||
|     remove_underscores | ||||
| 
 | ||||
| 
 | ||||
| def time_from_header(header): | ||||
|     """ | ||||
|     Function takes the second line of a .gse file and extracts the date and time from it. | ||||
|     :param header: second line from .gse file | ||||
|     :type header: string | ||||
|     :return: a list of integers of form [year, month, day, hour, minute, second, microsecond] | ||||
|     """ | ||||
|     timeline = header.split(' ') | ||||
|     time = timeline[1].split('/') + timeline[2].split(':') | ||||
|     time = time[:-1] + time[-1].split('.') | ||||
|     return [int(t) for t in time] | ||||
| 
 | ||||
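Usage sketch; the concrete header line below is only an assumption for illustration, the function merely relies on fields 1 and 2 holding the date and the time:

    header = "WID2 2015/12/24 01:02:03.123 STAT HHZ"  # hypothetical GSE header line
    print(time_from_header(header))
    # -> [2015, 12, 24, 1, 2, 3, 123]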
| 
 | ||||
| def check_time(datetime): | ||||
|     """ | ||||
|     Function takes in date and time as a list and validates its values by trying to create a UTCDateTime object from it | ||||
|     :param datetime: list of integers [year, month, day, hour, minute, second, microsecond] | ||||
|     :type datetime: list | ||||
|     :return: returns True if the values are within the valid range, False otherwise | ||||
| 
 | ||||
|     >>> check_time([1999, 01, 01, 23, 59, 59, 999000]) | ||||
|     True | ||||
|     >>> check_time([1999, 01, 01, 23, 59, 60, 999000]) | ||||
|     False | ||||
|     >>> check_time([1999, 01, 01, 23, 59, 59, 1000000]) | ||||
|     False | ||||
|     >>> check_time([1999, 01, 01, 23, 60, 59, 999000]) | ||||
|     False | ||||
|     >>> check_time([1999, 01, 01, 23, 60, 59, 999000]) | ||||
|     False | ||||
|     >>> check_time([1999, 01, 01, 24, 59, 59, 999000]) | ||||
|     False | ||||
|     >>> check_time([1999, 01, 31, 23, 59, 59, 999000]) | ||||
|     True | ||||
|     >>> check_time([1999, 02, 30, 23, 59, 59, 999000]) | ||||
|     False | ||||
|     >>> check_time([1999, 02, 29, 23, 59, 59, 999000]) | ||||
|     False | ||||
|     >>> check_time([2000, 02, 29, 23, 59, 59, 999000]) | ||||
|     True | ||||
|     >>> check_time([2000, 13, 29, 23, 59, 59, 999000]) | ||||
|     False | ||||
|     """ | ||||
|     try: | ||||
|         UTCDateTime(*datetime) | ||||
|         return True | ||||
|     except ValueError: | ||||
|         return False | ||||
| 
 | ||||
| 
 | ||||
| def get_file_list(root_dir): | ||||
|     """ | ||||
|     Function collects all *.gse files from the given directory. | ||||
|     :param root_dir: a directory containing the .gse files | ||||
|     :type root_dir: string | ||||
|     :return: returns a list of filenames (without path to them) | ||||
|     """ | ||||
|     file_list = glob.glob1(root_dir, '*.gse') | ||||
|     return file_list | ||||
| 
 | ||||
| 
 | ||||
| def checks_station_second(datetime, file): | ||||
|     """ | ||||
|     Function checks whether the 'second' entry of the given list was erroneously | ||||
|     set to 60 and corrects the time if so. Can only correct the time if no date | ||||
|     change would be necessary. | ||||
|     :param datetime: [year, month, day, hour, minute, second, microsecond] | ||||
|     :return: returns the input with the correct value for second | ||||
|     """ | ||||
|     if datetime[5] == 60: | ||||
|         if datetime[4] == 59: | ||||
|             if datetime[3] == 23: | ||||
|                 err_msg = 'Date should be next day. ' \ | ||||
|                           'File not changed: {0}'.format(file) | ||||
|                 raise ValueError(err_msg) | ||||
|             else: | ||||
|                 datetime[3] += 1 | ||||
|                 datetime[4] = 0 | ||||
|                 datetime[5] = 0 | ||||
|         else: | ||||
|             datetime[4] += 1 | ||||
|             datetime[5] = 0 | ||||
|     return datetime | ||||
| 
 | ||||
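Usage sketch: a '60 s' entry is rolled over into the next minute, provided no date change is required:

    dt = [1999, 1, 1, 13, 59, 60, 0]  # hypothetical header time with second == 60
    print(checks_station_second(dt, 'example.gse'))
    # -> [1999, 1, 1, 14, 0, 0, 0]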
| 
 | ||||
| def make_time_line(line, datetime): | ||||
|     """ | ||||
|     Function takes in the original line from a .gse file and a list of date and | ||||
|     time values to make a new line with corrected date and time. | ||||
|     :param line: second line from .gse file. | ||||
|     :type line: string | ||||
|     :param datetime: list of integers [year, month, day, hour, minute, second, microsecond] | ||||
|     :type datetime: list | ||||
|     :return: returns a string to write it into a file. | ||||
|     """ | ||||
|     ins_form = '{0:02d}:{1:02d}:{2:02d}.{3:03d}' | ||||
|     insertion = ins_form.format(int(datetime[3]), | ||||
|                                 int(datetime[4]), | ||||
|                                 int(datetime[5]), | ||||
|                                 int(datetime[6] * 1e-3)) | ||||
|     newline = line[:16] + insertion + line[28:] | ||||
|     return newline | ||||
| 
 | ||||
| 
 | ||||
| def evt_head_check(root_dir, out_dir = None): | ||||
|     """ | ||||
|     A function to make sure that an arbitrary number of .gse files have correct values in their header. | ||||
|     :param root_dir: a directory leading to the .gse files. | ||||
|     :type root_dir: string | ||||
|     :param out_dir: a directory to store the new files somewhere else. | ||||
|     :return: returns nothing | ||||
|     """ | ||||
|     if not out_dir: | ||||
|         print('WARNING files are going to be overwritten!') | ||||
|         inp = str(raw_input('Continue? [y/N]')) | ||||
|         if not inp == 'y': | ||||
|             sys.exit() | ||||
|     filelist = get_file_list(root_dir) | ||||
|     nfiles = 0 | ||||
|     for file in filelist: | ||||
|         infile = open(os.path.join(root_dir, file), 'r') | ||||
|         lines = infile.readlines() | ||||
|         infile.close() | ||||
|         datetime = time_from_header(lines[1]) | ||||
|         if check_time(datetime): | ||||
|             continue | ||||
|         else: | ||||
|             nfiles += 1 | ||||
|             datetime = checks_station_second(datetime, file) | ||||
|             print('writing ' + file) | ||||
|             # write File | ||||
|             lines[1] = make_time_line(lines[1], datetime) | ||||
|             if not out_dir: | ||||
|                 out = open(os.path.join(root_dir, file), 'w') | ||||
|                 out.writelines(lines) | ||||
|                 out.close() | ||||
|             else: | ||||
|                 out = open(os.path.join(out_dir, file), 'w') | ||||
|                 out.writelines(lines) | ||||
|                 out.close() | ||||
|     print('{0} file(s) changed'.format(nfiles)) | ||||
| 
 | ||||
| 
 | ||||
| def read_metadata(path_to_inventory): | ||||
|     """ | ||||
|     take path_to_inventory and return either the corresponding list of files | ||||
|     found or the Parser object for a network dataless seed volume to prevent | ||||
|     read overhead for large dataless seed volumes | ||||
|     :param path_to_inventory: | ||||
|     :return: tuple containing either a list of files or an `obspy.io.xseed.Parser` | ||||
|     object, and the inventory type found | ||||
|     :rtype: tuple | ||||
|     """ | ||||
|     dlfile = list() | ||||
|     invfile = list() | ||||
|     respfile = list() | ||||
|     inv = dict(dless=dlfile, xml=invfile, resp=respfile) | ||||
|     if os.path.isfile(path_to_inventory): | ||||
|         ext = os.path.splitext(path_to_inventory)[1].split('.')[1] | ||||
|         inv[ext] += [path_to_inventory] | ||||
|     else: | ||||
|         for ext in inv.keys(): | ||||
|             inv[ext] += glob.glob1(path_to_inventory, '*.{0}'.format(ext)) | ||||
| 
 | ||||
|     invtype = key_for_set_value(inv) | ||||
| 
 | ||||
|     if invtype is None: | ||||
|         raise IOError("Neither dataless-SEED file, inventory-xml file nor " | ||||
|                       "RESP-file found!") | ||||
|     elif invtype == 'dless':  # prevent multiple read of large dlsv | ||||
|         print("Reading metadata information from dataless-SEED file ...") | ||||
|         if len(inv[invtype]) == 1: | ||||
|             fullpath_inv = os.path.join(path_to_inventory, inv[invtype][0]) | ||||
|             robj = Parser(fullpath_inv) | ||||
|         else: | ||||
|             robj = inv[invtype] | ||||
|     else: | ||||
|         print("Reading metadata information from inventory-xml file ...") | ||||
|         robj = inv[invtype] | ||||
|     return invtype, robj | ||||
| 
 | ||||
| 
 | ||||
| def restitute_data(data, invtype, inobj, unit='VEL', force=False): | ||||
|     """ | ||||
|     takes a data stream and a path_to_inventory and returns the corrected | ||||
|     waveform data stream | ||||
|     :param data: seismic data stream | ||||
|     :param invtype: type of found metadata | ||||
|     :param inobj: either list of metadata files or `obspy.io.xseed.Parser` | ||||
|     object | ||||
|     :param unit: unit to correct for (default: 'VEL') | ||||
|     :param force: force restitution for already corrected traces (default: | ||||
|     False) | ||||
|     :return: corrected data stream | ||||
|     """ | ||||
| 
 | ||||
|     restflag = list() | ||||
| 
 | ||||
|     data = remove_underscores(data) | ||||
| 
 | ||||
|     # loop over traces | ||||
|     for tr in data: | ||||
|         seed_id = tr.get_id() | ||||
|         # check, whether this trace has already been corrected | ||||
|         if 'processing' in tr.stats.keys() \ | ||||
|                 and np.any(['remove' in p for p in tr.stats.processing]) \ | ||||
|                 and not force: | ||||
|             print("Trace {0} has already been corrected!".format(seed_id)) | ||||
|             continue | ||||
|         stime = tr.stats.starttime | ||||
|         prefilt = get_prefilt(tr) | ||||
|         if invtype == 'resp': | ||||
|             fresp = find_in_list(inobj, seed_id) | ||||
|             if not fresp: | ||||
|                 raise IOError('no response file found ' | ||||
|                               'for trace {0}'.format(seed_id)) | ||||
|             fname = fresp | ||||
|             seedresp = dict(filename=fname, | ||||
|                             date=stime, | ||||
|                             units=unit) | ||||
|             kwargs = dict(paz_remove=None, pre_filt=prefilt, seedresp=seedresp) | ||||
|         elif invtype == 'dless': | ||||
|             if type(inobj) is list: | ||||
|                 fname = Parser(find_in_list(inobj, seed_id)) | ||||
|             else: | ||||
|                 fname = inobj | ||||
|             seedresp = dict(filename=fname, | ||||
|                             date=stime, | ||||
|                             units=unit) | ||||
|             kwargs = dict(pre_filt=prefilt, seedresp=seedresp) | ||||
|         elif invtype == 'xml': | ||||
|             invlist = inobj | ||||
|             if len(invlist) > 1: | ||||
|                 finv = find_in_list(invlist, seed_id) | ||||
|             else: | ||||
|                 finv = invlist[0] | ||||
|             inventory = read_inventory(finv, format='STATIONXML') | ||||
|         else: | ||||
|             data.remove(tr) | ||||
|             continue | ||||
|         # apply restitution to data | ||||
|         try: | ||||
|             if invtype in ['resp', 'dless']: | ||||
|                 tr.simulate(**kwargs) | ||||
|             else: | ||||
|                 tr.attach_response(inventory) | ||||
|                 tr.remove_response(output=unit, | ||||
|                                    pre_filt=prefilt) | ||||
|         except ValueError as e: | ||||
|             msg0 = 'Response for {0} not found in Parser'.format(seed_id) | ||||
|             msg1 = 'evalresp failed to calculate response' | ||||
|             # re-raise only if it is neither of the two known, tolerable errors | ||||
|             if msg0 not in e.message and msg1 not in e.message: | ||||
|                 raise | ||||
|             else: | ||||
|                 # restitution done to copies of data thus deleting traces | ||||
|                 # that failed should not be a problem | ||||
|                 data.remove(tr) | ||||
|                 continue | ||||
|         restflag.append(True) | ||||
|     # check if ALL traces could be restituted; for large datasets it is | ||||
|     # better to try restitution on smaller subsets (e.g. station by station) | ||||
|     if len(restflag) > 0: | ||||
|         restflag = bool(np.all(restflag)) | ||||
|     else: | ||||
|         restflag = False | ||||
|     return data, restflag | ||||
| 
 | ||||
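A typical call sequence for the two functions above might look as follows (paths are placeholders): read the metadata once, then restitute a whole stream to velocity.

    from obspy import read

    invtype, inobj = read_metadata('/path/to/inventory')  # placeholder path
    st = read('/path/to/waveforms')                       # placeholder path
    st, restflag = restitute_data(st, invtype, inobj, unit='VEL')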
| 
 | ||||
| def get_prefilt(trace, tlow=(0.5, 0.9), thi=(5., 2.), verbosity=0): | ||||
|     """ | ||||
|     takes a `obspy.core.stream.Trace` object, taper parameters tlow and thi and | ||||
|     returns the pre-filtering corner frequencies for the cosine taper for | ||||
|     further processing | ||||
|     :param trace: seismic data trace | ||||
|     :type trace: `obspy.core.stream.Trace` | ||||
|     :param tlow: tuple or list containing the desired lower corner | ||||
|     frequencies for a cosine taper | ||||
|     :type tlow: tuple or list | ||||
|     :param thi: tuple or list containing the percentage values of the | ||||
|     Nyquist frequency for the desired upper corner frequencies of the cosine | ||||
|     taper | ||||
|     :type thi: tuple or list | ||||
|     :param verbosity: verbosity level | ||||
|     :type verbosity: int | ||||
|     :return: pre-filt cosine taper corner frequencies | ||||
|     :rtype: tuple | ||||
| 
 | ||||
|     .. example:: | ||||
| 
 | ||||
|     >>> st = read() | ||||
|     >>> get_prefilt(st[0]) | ||||
|     (0.5, 0.9, 47.5, 49.0) | ||||
|     """ | ||||
|     if verbosity: | ||||
|         print("Calculating pre-filter values for %s, %s ..." % ( | ||||
|             trace.stats.station, trace.stats.channel)) | ||||
|     # get corner frequencies for pre-filtering | ||||
|     fny = trace.stats.sampling_rate / 2 | ||||
|     fc21 = fny - (fny * thi[0]/100.) | ||||
|     fc22 = fny - (fny * thi[1]/100.) | ||||
|     return (tlow[0], tlow[1], fc21, fc22) | ||||
| 
 | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     import doctest | ||||
| 
 | ||||
|     doctest.testmod() | ||||
pylot/core/util/defaults.py (65 lines, Normal file)
						| @ -0,0 +1,65 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
| Created on Wed Feb 26 12:31:25 2014 | ||||
| 
 | ||||
| @author: sebastianw | ||||
| """ | ||||
| 
 | ||||
| import os | ||||
| from pylot.core.loc import nll | ||||
| from pylot.core.loc import hsat | ||||
| from pylot.core.loc import velest | ||||
| 
 | ||||
| 
 | ||||
| def readFilterInformation(fname): | ||||
|     def convert2FreqRange(*args): | ||||
|         if len(args) > 1: | ||||
|             return [float(arg) for arg in args] | ||||
|         elif len(args) == 1: | ||||
|             return float(args[0]) | ||||
|         return None | ||||
| 
 | ||||
|     filter_file = open(fname, 'r') | ||||
|     filter_information = dict() | ||||
|     for filter_line in filter_file.readlines(): | ||||
|         filter_line = filter_line.split(' ') | ||||
|         for n, pos in enumerate(filter_line): | ||||
|             if pos == '\n': | ||||
|                 filter_line[n] = '' | ||||
|         # an empty filter type marks an unfiltered phase | ||||
|         filter_information[filter_line[0]] = { | ||||
|             'filtertype': filter_line[1] if filter_line[1] else None, | ||||
|             'order': int(filter_line[2]) if filter_line[1] else None, | ||||
|             'freq': convert2FreqRange(*filter_line[3:]) if filter_line[1] else None} | ||||
|     return filter_information | ||||
| 
 | ||||
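From the parser above one can infer the line format that filter.in is expected to follow; the concrete example line is an assumption for illustration:

    # one line per phase: <key> <filtertype> <order> <freq(s)>
    # e.g. the (hypothetical) line
    #     P bandpass 4 2.0 20.0
    # yields
    #     {'P': {'filtertype': 'bandpass', 'order': 4, 'freq': [2.0, 20.0]}}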
| 
 | ||||
| FILTERDEFAULTS = readFilterInformation(os.path.join(os.path.expanduser('~'), | ||||
|                                                     '.pylot', | ||||
|                                                     'filter.in')) | ||||
| 
 | ||||
| AUTOMATIC_DEFAULTS = os.path.join(os.path.expanduser('~'), | ||||
|                                   '.pylot', | ||||
|                                   'autoPyLoT.in') | ||||
| 
 | ||||
| TIMEERROR_DEFAULTS = os.path.join(os.path.expanduser('~'), | ||||
|                                   '.pylot', | ||||
|                                   'PILOT_TimeErrors.in') | ||||
| 
 | ||||
| OUTPUTFORMATS = {'.xml': 'QUAKEML', | ||||
|                  '.cnv': 'CNV', | ||||
|                  '.obs': 'NLLOC_OBS'} | ||||
| 
 | ||||
| LOCTOOLS = dict(nll=nll, hsat=hsat, velest=velest) | ||||
| 
 | ||||
| COMPPOSITION_MAP = dict(Z=2, N=1, E=0) | ||||
| COMPPOSITION_MAP['1'] = 1 | ||||
| COMPPOSITION_MAP['2'] = 0 | ||||
| COMPPOSITION_MAP['3'] = 2 | ||||
| 
 | ||||
| COMPNAME_MAP = dict(Z='3', N='1', E='2') | ||||
pylot/core/util/errors.py (29 lines, Normal file)
						| @ -0,0 +1,29 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
| Created on Thu Mar 20 09:47:04 2014 | ||||
| 
 | ||||
| @author: sebastianw | ||||
| """ | ||||
| 
 | ||||
| 
 | ||||
| class OptionsError(Exception): | ||||
|     pass | ||||
| 
 | ||||
| 
 | ||||
| class FormatError(Exception): | ||||
|     pass | ||||
| 
 | ||||
| 
 | ||||
| class DatastructureError(Exception): | ||||
|     pass | ||||
| 
 | ||||
| 
 | ||||
| class OverwriteError(IOError): | ||||
|     pass | ||||
| 
 | ||||
| 
 | ||||
| class ParameterError(Exception): | ||||
|     pass | ||||
| 
 | ||||
| class ProcessingError(RuntimeError): | ||||
|     pass | ||||
pylot/core/util/pdf.py (476 lines, Normal file)
						| @ -0,0 +1,476 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import warnings | ||||
| import numpy as np | ||||
| from obspy import UTCDateTime | ||||
| from pylot.core.util.utils import fit_curve, find_nearest, clims | ||||
| from pylot.core.util.version import get_git_version as _getVersionString | ||||
| 
 | ||||
| __version__ = _getVersionString() | ||||
| __author__ = 'sebastianw' | ||||
| 
 | ||||
| def create_axis(x0, incr, npts): | ||||
|     # equidistant axis of npts samples starting at x0 with spacing incr | ||||
|     return x0 + incr * np.arange(npts) | ||||
| 
 | ||||
| def gauss_parameter(te, tm, tl, eta): | ||||
|     ''' | ||||
|     takes three onset times and returns the parameters sig1, sig2, a1 and a2 | ||||
|     to represent the pick as a probability density function (PDF) with two | ||||
|     Gauss branches | ||||
|     :param te: | ||||
|     :param tm: | ||||
|     :param tl: | ||||
|     :param eta: | ||||
|     :return: | ||||
|     ''' | ||||
| 
 | ||||
|     sig1 = (tm - te) / np.sqrt(2 * np.log(1 / eta)) | ||||
|     sig2 = (tl - tm) / np.sqrt(2 * np.log(1 / eta)) | ||||
| 
 | ||||
|     a1 = 2 / (1 + sig2 / sig1) | ||||
|     a2 = 2 / (1 + sig1 / sig2) | ||||
| 
 | ||||
|     return tm, sig1, sig2, a1, a2 | ||||
| 
 | ||||
| 
 | ||||
| def exp_parameter(te, tm, tl, eta): | ||||
|     ''' | ||||
|     takes three onset times te, tm and tl and returns the parameters sig1, | ||||
|     sig2 and a to represent the pick as a probability density function (PDF) | ||||
|     with two exponential decay branches | ||||
|     :param te: | ||||
|     :param tm: | ||||
|     :param tl: | ||||
|     :param eta: | ||||
|     :return: | ||||
|     ''' | ||||
| 
 | ||||
|     sig1 = np.log(eta) / (te - tm) | ||||
|     sig2 = np.log(eta) / (tm - tl) | ||||
|     a = 1 / (1 / sig1 + 1 / sig2) | ||||
| 
 | ||||
|     return tm, sig1, sig2, a | ||||
| 
 | ||||
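The branch parameters above follow from one condition: the PDF must have decayed to eta times its maximum at the earliest (te) and latest (tl) possible onset. Solving this condition for the left branch gives (the sig2 expressions follow analogously from the condition at tl):

    e^{-\frac{1}{2}\left(\frac{t_e - t_m}{\sigma_1}\right)^2} = \eta
        \;\Rightarrow\; \sigma_1^{\mathrm{Gauss}} = \frac{t_m - t_e}{\sqrt{2 \ln(1/\eta)}}

    e^{\sigma_1 (t_e - t_m)} = \eta
        \;\Rightarrow\; \sigma_1^{\mathrm{exp}} = \frac{\ln \eta}{t_e - t_m}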
| 
 | ||||
| def gauss_branches(k, (mu, sig1, sig2, a1, a2)): | ||||
|     ''' | ||||
|     function gauss_branches takes an axis k, a center value mu, two sigma | ||||
|     values sig1 and sig2 and two scaling factors a1 and a2 and returns a | ||||
|     list containing the values of a probability density function (PDF) | ||||
|     consisting of Gauss branches | ||||
|     :param k: | ||||
|     :type k: | ||||
|     :param mu: | ||||
|     :type mu: | ||||
|     :param sig1: | ||||
|     :type sig1: | ||||
|     :param sig2: | ||||
|     :type sig2: | ||||
|     :param a1: | ||||
|     :type a1: | ||||
|     :param a2: | ||||
|     :returns fun_vals: list with function values along axis k | ||||
|     ''' | ||||
| 
 | ||||
|     def _func(k, mu, sig1, sig2, a1, a2): | ||||
|         if k < mu: | ||||
|             rval = a1 * 1 / (np.sqrt(2 * np.pi) * sig1) * np.exp(-((k - mu) / sig1) ** 2 / 2) | ||||
|         else: | ||||
|             rval = a2 * 1 / (np.sqrt(2 * np.pi) * sig2) * np.exp(-((k - mu) / | ||||
|                                                                    sig2) ** 2 / 2) | ||||
|         return rval | ||||
| 
 | ||||
|     try: | ||||
|         return [_func(x, mu, sig1, sig2, a1, a2) for x in iter(k)] | ||||
|     except TypeError: | ||||
|         return _func(k, mu, sig1, sig2, a1, a2) | ||||
| 
 | ||||
| 
 | ||||
| def exp_branches(k, (mu, sig1, sig2, a)): | ||||
|     ''' | ||||
|     function exp_branches takes an axis k, a center value mu, two sigma | ||||
|     values sig1 and sig2 and a scaling factor a and returns a | ||||
|     list containing the values of a probability density function (PDF) | ||||
|     consisting of exponential decay branches | ||||
|     :param k: | ||||
|     :param mu: | ||||
|     :param sig1: | ||||
|     :param sig2: | ||||
|     :param a: | ||||
|     :returns fun_vals: list with function values along axis k | ||||
|     ''' | ||||
| 
 | ||||
|     def _func(k, mu, sig1, sig2, a): | ||||
|         mu = float(mu) | ||||
|         if k < mu: | ||||
|             rval = a * np.exp(sig1 * (k - mu)) | ||||
|         else: | ||||
|             rval = a * np.exp(-sig2 * (k - mu)) | ||||
|         return rval | ||||
| 
 | ||||
|     try: | ||||
|         return [_func(x, mu, sig1, sig2, a) for x in iter(k)] | ||||
|     except TypeError: | ||||
|         return _func(k, mu, sig1, sig2, a) | ||||
| 
 | ||||
| 
 | ||||
| # define container dictionaries for different types of pdfs | ||||
| parameter = dict(gauss=gauss_parameter, exp=exp_parameter) | ||||
| branches = dict(gauss=gauss_branches, exp=exp_branches) | ||||
| 
 | ||||
| 
 | ||||
| class ProbabilityDensityFunction(object): | ||||
|     ''' | ||||
|     A probability density function toolkit. | ||||
|     ''' | ||||
| 
 | ||||
|     version = __version__ | ||||
| 
 | ||||
|     def __init__(self, x0, incr, npts, pdf, mu, params, eta=0.01): | ||||
|         self.x0 = x0 | ||||
|         self.incr = incr | ||||
|         self.npts = npts | ||||
|         self.axis = create_axis(x0, incr, npts) | ||||
|         self.mu = mu | ||||
|         self.eta = eta | ||||
|         self._pdf = pdf | ||||
|         self.params = params | ||||
| 
 | ||||
|     def __add__(self, other): | ||||
| 
 | ||||
|         assert self.eta == other.eta, 'decline factors differ; please use equally defined PDFs for comparison' | ||||
| 
 | ||||
|         eta = self.eta | ||||
| 
 | ||||
|         x0, incr, npts = self.commonparameter(other) | ||||
| 
 | ||||
|         axis = create_axis(x0, incr, npts) | ||||
|         pdf_self = np.array(self.data(axis)) | ||||
|         pdf_other = np.array(other.data(axis)) | ||||
| 
 | ||||
|         pdf = np.convolve(pdf_self, pdf_other, 'full') * incr | ||||
| 
 | ||||
|         # shift axis values for correct plotting | ||||
|         npts = pdf.size | ||||
|         x0 *= 2 | ||||
|         axis = create_axis(x0, incr, npts) | ||||
|         mu = axis[np.where(pdf == max(pdf))][0] | ||||
| 
 | ||||
|         func, params = fit_curve(axis, pdf) | ||||
| 
 | ||||
|         return ProbabilityDensityFunction(x0, incr, npts, func, mu, | ||||
|                                           params, eta) | ||||
| 
 | ||||
|     def __sub__(self, other): | ||||
| 
 | ||||
|         assert self.eta == other.eta, 'decline factors differ; please use equally defined PDFs for comparison' | ||||
| 
 | ||||
|         eta = self.eta | ||||
| 
 | ||||
|         x0, incr, npts = self.commonparameter(other) | ||||
| 
 | ||||
|         axis = create_axis(x0, incr, npts) | ||||
|         pdf_self = np.array(self.data(axis)) | ||||
|         pdf_other = np.array(other.data(axis)) | ||||
| 
 | ||||
|         pdf = np.correlate(pdf_self, pdf_other, 'full') * incr | ||||
| 
 | ||||
|         # shift axis values for correct plotting | ||||
|         npts = len(pdf) | ||||
|         midpoint = npts / 2 | ||||
|         x0 = -incr * midpoint | ||||
|         axis = create_axis(x0, incr, npts) | ||||
|         mu = axis[np.where(pdf == max(pdf))][0] | ||||
| 
 | ||||
|         func, params = fit_curve(axis, pdf) | ||||
| 
 | ||||
|         return ProbabilityDensityFunction(x0, incr, npts, func, mu, | ||||
|                                           params, eta) | ||||
| 
 | ||||
|     def __nonzero__(self): | ||||
|         prec = self.precision(self.incr) | ||||
|         data = np.array(self.data()) | ||||
|         gtzero = np.all(data >= 0) | ||||
|         probone = bool(np.round(self.prob_gt_val(self.axis[0]), prec) == 1.) | ||||
|         return bool(gtzero and probone) | ||||
| 
 | ||||
|     def __str__(self): | ||||
|         return str([self.data()]) | ||||
| 
 | ||||
|     @staticmethod | ||||
|     def precision(incr): | ||||
|         prec = int(np.ceil(np.abs(np.log10(incr)))) - 2 | ||||
|         return prec if prec >= 0 else 0 | ||||
| 
 | ||||
|     def data(self, value=None): | ||||
|         if value is None: | ||||
|             return self._pdf(self.axis, self.params) | ||||
|         return self._pdf(value, self.params) | ||||
| 
 | ||||
|     @property | ||||
|     def eta(self): | ||||
|         return self._eta | ||||
| 
 | ||||
|     @eta.setter | ||||
|     def eta(self, value): | ||||
|         self._eta = value | ||||
| 
 | ||||
|     @property | ||||
|     def mu(self): | ||||
|         return self._mu | ||||
| 
 | ||||
|     @mu.setter | ||||
|     def mu(self, mu): | ||||
|         self._mu = mu | ||||
| 
 | ||||
|     @property | ||||
|     def axis(self): | ||||
|         return self._x | ||||
| 
 | ||||
|     @axis.setter | ||||
|     def axis(self, x): | ||||
|         self._x = np.array(x) | ||||
| 
 | ||||
|     @classmethod | ||||
|     def from_pick(cls, lbound, barycentre, rbound, incr=0.001, decfact=0.01, | ||||
|                   type='gauss'): | ||||
|         ''' | ||||
|         Initialize a new ProbabilityDensityFunction object. | ||||
|         Takes incr, lbound, barycentre and rbound to derive x0 and the number | ||||
|         of points npts for the axis vector. | ||||
|         Maximum density | ||||
|         is given at the barycentre and on the boundaries the function has | ||||
|         declined to decfact times the maximum value. Integration of the | ||||
|         function over a particular interval gives the probability for the | ||||
|         variable value to be in that interval. | ||||
|         ''' | ||||
| 
 | ||||
|         # derive adequate window of definition | ||||
|         margin = 2. * np.max([barycentre - lbound, rbound - barycentre]) | ||||
| 
 | ||||
|         # find midpoint accounting also for `~obspy.UTCDateTime` object usage | ||||
|         try: | ||||
|             midpoint = (rbound + lbound) / 2 | ||||
|         except TypeError: | ||||
|             try: | ||||
|                 midpoint = (rbound + float(lbound)) / 2 | ||||
|             except TypeError: | ||||
|                 midpoint = float(rbound + float(lbound)) / 2 | ||||
| 
 | ||||
|         # find x0 on a grid point and sufficient npts | ||||
|         was_datetime = None | ||||
|         if isinstance(barycentre, UTCDateTime): | ||||
|             barycentre = float(barycentre) | ||||
|             was_datetime = True | ||||
|         n = int(np.ceil((barycentre - midpoint) / incr)) | ||||
|         m = int(np.ceil((margin / incr))) | ||||
|         midpoint = barycentre - n * incr | ||||
|         margin = m * incr | ||||
|         x0 = midpoint - margin | ||||
|         npts = 2 * m | ||||
| 
 | ||||
|         if was_datetime: | ||||
|             barycentre = UTCDateTime(barycentre) | ||||
| 
 | ||||
|         # calculate parameter for pdf representing function | ||||
|         params = parameter[type](lbound, barycentre, rbound, decfact) | ||||
| 
 | ||||
|         # select pdf type | ||||
|         pdf = branches[type] | ||||
| 
 | ||||
|         # return the object | ||||
|         return ProbabilityDensityFunction(x0, incr, npts, pdf, barycentre, | ||||
|                                           params, decfact) | ||||
| 
 | ||||
|     def broadcast(self, pdf, si, ei, data): | ||||
|         try: | ||||
|             pdf[si:ei] = data | ||||
|         except ValueError as e: | ||||
|             warnings.warn(str(e), Warning) | ||||
|             return self.broadcast(pdf, si, ei, data[:-1]) | ||||
|         return pdf | ||||
| 
 | ||||
|     def expectation(self): | ||||
|         ''' | ||||
|         returns the expectation value of the actual pdf object | ||||
| 
 | ||||
|         .. math:: | ||||
|                 \mu_{\Delta t} = \int\limits_{-\infty}^\infty x \cdot f(x)\,dx | ||||
| 
 | ||||
|         :return float: rval | ||||
|         ''' | ||||
| 
 | ||||
|         rval = 0 | ||||
|         for x in self.axis: | ||||
|             rval += x * self.data(x) | ||||
|         return rval * self.incr | ||||
| 
 | ||||
|     def standard_deviation(self): | ||||
|         # note: returns the second moment about mu (no square root is taken) | ||||
|         mu = self.mu | ||||
|         rval = 0 | ||||
|         for x in self.axis: | ||||
|             rval += (x - float(mu)) ** 2 * self.data(x) | ||||
|         return rval * self.incr | ||||
| 
 | ||||
|     def prob_lt_val(self, value): | ||||
|         if value <= self.axis[0] or value > self.axis[-1]: | ||||
|             raise ValueError('value out of bounds: {0}'.format(value)) | ||||
|         return self.prob_limits((self.axis[0], value)) | ||||
| 
 | ||||
|     def prob_gt_val(self, value): | ||||
|         if value < self.axis[0] or value >= self.axis[-1]: | ||||
|             raise ValueError('value out of bounds: {0}'.format(value)) | ||||
|         return self.prob_limits((value, self.axis[-1])) | ||||
| 
 | ||||
|     def prob_limits(self, limits, oversampling=1.): | ||||
|         sampling = self.incr / oversampling | ||||
|         lim = np.arange(limits[0], limits[1], sampling) | ||||
|         data = self.data(lim) | ||||
|         min_est, max_est = 0., 0. | ||||
|         for n in range(len(data) - 1): | ||||
|             min_est += min(data[n], data[n + 1]) | ||||
|             max_est += max(data[n], data[n + 1]) | ||||
|         return (min_est + max_est) / 2. * sampling | ||||
| 
 | ||||
|     def prob_val(self, value): | ||||
|         if not (self.axis[0] <= value <= self.axis[-1]): | ||||
|             warnings.warn('{0} not on axis'.format(value)) | ||||
|             return None | ||||
|         return self.data(value) * self.incr | ||||
| 
 | ||||
|     def quantile(self, prob_value, eps=0.01): | ||||
|         ''' | ||||
| 
 | ||||
|         :param prob_value: | ||||
|         :param eps: | ||||
|         :return: | ||||
|         ''' | ||||
|         l = self.axis[0] | ||||
|         r = self.axis[-1] | ||||
|         m = (r + l) / 2 | ||||
|         diff = prob_value - self.prob_lt_val(m) | ||||
|         while abs(diff) > eps and ((r - l) > self.incr): | ||||
|             if diff > 0: | ||||
|                 l = m | ||||
|             else: | ||||
|                 r = m | ||||
|             m = (r + l) / 2 | ||||
|             diff = prob_value - self.prob_lt_val(m) | ||||
|         return m | ||||
| 
 | ||||
|     def quantile_distance(self, prob_value): | ||||
|         """ | ||||
|         takes a probability value and returns the distance | ||||
|         between two complementary quantiles | ||||
| 
 | ||||
|         .. math:: | ||||
| 
 | ||||
|             QA_\alpha = Q(1 - \alpha) - Q(\alpha) | ||||
| 
 | ||||
|         :param prob_value: probability value :math:\alpha | ||||
|         :type  prob_value: float | ||||
|         :return: quantile distance | ||||
|         """ | ||||
|         if 0 >= prob_value or prob_value >= 0.5: | ||||
|             raise ValueError('Value out of range.') | ||||
|         ql = self.quantile(prob_value) | ||||
|         qu = self.quantile(1 - prob_value) | ||||
|         return qu - ql | ||||
| 
 | ||||
| 
 | ||||
|     def quantile_dist_frac(self, x): | ||||
|         """ | ||||
|         takes a probability value and returns the fraction of two | ||||
|         corresponding quantile distances ( | ||||
|         :func:`pylot.core.util.pdf.ProbabilityDensityFunction | ||||
|         #quantile_distance`) | ||||
| 
 | ||||
|         .. math:: | ||||
| 
 | ||||
|             Q\Theta_\alpha = \frac{QA(0.5 - \alpha)}{QA(\alpha)} | ||||
| 
 | ||||
|         :param x: probability value :math:\alpha | ||||
|         :return: quantile distance fraction | ||||
|         """ | ||||
|         if x <= 0 or x >= 0.25: | ||||
|             raise ValueError('Value out of range.') | ||||
|         return self.quantile_distance(0.5-x)/self.quantile_distance(x) | ||||
| 
 | ||||
| 
 | ||||
|     def plot(self, label=None): | ||||
|         import matplotlib.pyplot as plt | ||||
| 
 | ||||
|         plt.plot(self.axis, self.data()) | ||||
|         plt.xlabel('x') | ||||
|         plt.ylabel('f(x)') | ||||
|         plt.autoscale(axis='x', tight=True) | ||||
|         if self: | ||||
|             title_str = 'Probability density function ' | ||||
|             if label: | ||||
|                 title_str += label | ||||
|             title_str = title_str.strip() | ||||
|         else: | ||||
|             title_str = 'Function not suitable as probability density function' | ||||
|         plt.title(title_str) | ||||
|         plt.show() | ||||
| 
 | ||||
|     def limits(self): | ||||
|         l1 = self.x0 | ||||
|         r1 = l1 + self.incr * self.npts | ||||
| 
 | ||||
|         return l1, r1 | ||||
| 
 | ||||
|     def cincr(self, other): | ||||
|         if not self.incr == other.incr: | ||||
|             raise NotImplementedError( | ||||
|                 'Upsampling of the lower sampled PDF not implemented yet!') | ||||
|         else: | ||||
|             return self.incr | ||||
| 
 | ||||
|     def commonlimits(self, incr, other, max_npts=1e5): | ||||
|         ''' | ||||
|         Takes an increment incr and the limits of this and another PDF and | ||||
|         returns the leftmost limit and the minimum number of points needed | ||||
|         to cover the whole given interval. | ||||
|         :param incr: | ||||
|         :param other: | ||||
|         :param max_npts: | ||||
|         :return: | ||||
|         ''' | ||||
| 
 | ||||
|         x0, r = clims(self.limits(), other.limits()) | ||||
| 
 | ||||
|         # calculate index for rounding | ||||
|         ri = self.precision(incr) | ||||
| 
 | ||||
|         npts = int(round(r - x0, ri) // incr) | ||||
| 
 | ||||
|         if npts > max_npts: | ||||
|             raise ValueError('Maximum number of points exceeded:\n' | ||||
|                              'max_npts - %d\n' | ||||
|                              'npts - %d\n' % (max_npts, npts)) | ||||
| 
 | ||||
|         npts = np.max([npts, self.npts, other.npts]) | ||||
| 
 | ||||
|         if npts < self.npts or npts < other.npts: | ||||
|             raise ValueError('new npts is too small') | ||||
| 
 | ||||
|         return x0, npts | ||||
| 
 | ||||
|     def commonparameter(self, other): | ||||
|         assert isinstance(other, ProbabilityDensityFunction), \ | ||||
|             'both operands must be of type ProbabilityDensityFunction' | ||||
| 
 | ||||
|         incr = self.cincr(other) | ||||
| 
 | ||||
|         x0, npts = self.commonlimits(incr, other) | ||||
| 
 | ||||
|         return x0, incr, npts | ||||
| 
 | ||||
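Usage sketch for the class above (onset times are placeholder floats): construct a PDF from the earliest, most likely and latest onset of a pick and query it.

    pdf = ProbabilityDensityFunction.from_pick(0.9, 1.0, 1.2, type='exp')
    print(pdf.expectation())            # expected onset time
    print(pdf.quantile_distance(0.25))  # spread between the 0.25 and 0.75 quantiles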
pylot/core/util/plotting.py (57 lines, Normal file)
						| @ -0,0 +1,57 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import matplotlib.pyplot as plt | ||||
| 
 | ||||
| def create_bin_list(l_boundary, u_boundary, nbins=100): | ||||
|     """ | ||||
|     takes two boundaries and a number of bins and creates a list of bins for | ||||
|     histogram plotting | ||||
|     :param l_boundary: Any number. | ||||
|     :type  l_boundary: float | ||||
|     :param u_boundary: Any number that is greater than l_boundary. | ||||
|     :type  u_boundary: float | ||||
|     :param nbins: Any positive integer. | ||||
|     :type  nbins: int | ||||
|     :return: A list of equidistant bins. | ||||
|     """ | ||||
|     if u_boundary <= l_boundary: | ||||
|         raise ValueError('Upper boundary must be greater than lower!') | ||||
|     elif nbins <= 0: | ||||
|         raise ValueError('Number of bins is not valid.') | ||||
|     binlist = [] | ||||
|     for i in range(nbins): | ||||
|         binlist.append(l_boundary + i * (u_boundary - l_boundary) / nbins) | ||||
|     return binlist | ||||
| 
 | ||||
| 
 | ||||
| def histplot(array, binlist, xlab='Values', | ||||
|              ylab='Frequency', title=None, fnout=None): | ||||
|     """ | ||||
|     function to quickly show the distribution of some data. Takes array-like | ||||
|     data and a list of bins. Editing details or inserting a legend is not possible. | ||||
|     :param array: List of values. | ||||
|     :type  array: Array like | ||||
|     :param binlist: List of bins. | ||||
|     :type  binlist: list | ||||
|     :param xlab: A label for the x-axis. | ||||
|     :type  xlab: str | ||||
|     :param ylab: A label for the y-axis. | ||||
|     :type  ylab: str | ||||
|     :param title: A title for the plot. | ||||
|     :type  title: str | ||||
|     :param fnout: A path to save the plot instead of showing it. | ||||
|                   Has to contain filename and type. Like: 'path/to/file.png' | ||||
|     :type  fnout: str | ||||
|     :return: - | ||||
|     """ | ||||
| 
 | ||||
|     plt.hist(array, bins=binlist) | ||||
|     plt.xlabel(xlab) | ||||
|     plt.ylabel(ylab) | ||||
|     if title: | ||||
|         plt.title(title) | ||||
|     if fnout: | ||||
|         plt.savefig(fnout) | ||||
|     else: | ||||
|         plt.show() | ||||
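A minimal usage sketch for the two helpers above; the random test data are made up for illustration:

    import numpy as np
    from pylot.core.util.plotting import create_bin_list, histplot

    residuals = np.random.normal(0., 0.1, 1000)  # illustrative pick residuals
    bins = create_bin_list(-0.5, 0.5, nbins=50)
    histplot(residuals, bins, xlab='Residual [s]', title='Pick residuals')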
12  pylot/core/util/structure.py  (Normal file)
						| @ -0,0 +1,12 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
| Created on Wed Jan 26 17:47:25 2015 | ||||
| 
 | ||||
| @author: sebastianw | ||||
| """ | ||||
| 
 | ||||
| from pylot.core.io.data import SeiscompDataStructure, PilotDataStructure | ||||
| 
 | ||||
| DATASTRUCTURE = {'PILOT': PilotDataStructure, 'SeisComP': SeiscompDataStructure, | ||||
|                  None: None} | ||||
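The mapping above lets the GUI pick a data structure class from a plain string, e.g. one stored in the settings; a minimal sketch (whether the classes take constructor arguments is not shown in this diff):

    from pylot.core.util.structure import DATASTRUCTURE

    structure_name = 'PILOT'  # could also be 'SeisComP' or None
    cls = DATASTRUCTURE[structure_name]
    datastructure = cls() if cls else None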
33  pylot/core/util/thread.py  (Normal file)
						| @ -0,0 +1,33 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| import sys | ||||
| from PySide.QtCore import QThread, Signal | ||||
| 
 | ||||
| 
 | ||||
| class AutoPickThread(QThread): | ||||
|     message = Signal(str) | ||||
|     finished = Signal() | ||||
| 
 | ||||
|     def __init__(self, parent, func, data, param): | ||||
|         super(AutoPickThread, self).__init__() | ||||
|         self.setParent(parent) | ||||
|         self.func = func | ||||
|         self.data = data | ||||
|         self.param = param | ||||
| 
 | ||||
|     def run(self): | ||||
|         sys.stdout = self | ||||
| 
 | ||||
|         picks = self.func(self.data, self.param) | ||||
| 
 | ||||
|         print("Autopicking finished!\n") | ||||
| 
 | ||||
|         try: | ||||
|             for station in picks: | ||||
|                 self.parent().addPicks(station, picks[station], type='auto') | ||||
|         except AttributeError: | ||||
|             print(picks) | ||||
|         sys.stdout = sys.__stdout__ | ||||
|         self.finished.emit() | ||||
| 
 | ||||
|     def write(self, text): | ||||
|         self.message.emit(text) | ||||
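A sketch of how the thread is meant to be driven from the GUI; `main_window`, `autopickevent`, `wfdata` and `parameter` are placeholders, and the parent is expected to provide the `addPicks` slot used in `run`:

    # illustrative only: all names below are placeholders, not part of this diff
    thread = AutoPickThread(parent=main_window, func=autopickevent,
                            data=wfdata, param=parameter)
    thread.message.connect(main_window.addToLog)  # hypothetical logging slot
    thread.start()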
524  pylot/core/util/utils.py  (Normal file)
						| @ -0,0 +1,524 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import hashlib | ||||
| import numpy as np | ||||
| from scipy.interpolate import splrep, splev | ||||
| import os | ||||
| import pwd | ||||
| import re | ||||
| import subprocess | ||||
| from obspy import UTCDateTime, read | ||||
| from pylot.core.io.inputs import AutoPickParameter | ||||
| 
 | ||||
| 
 | ||||
| def _pickle_method(m): | ||||
|     if m.im_self is None: | ||||
|         return getattr, (m.im_class, m.im_func.func_name) | ||||
|     else: | ||||
|         return getattr, (m.im_self, m.im_func.func_name) | ||||
| 
 | ||||
| def fit_curve(x, y): | ||||
|     return splev, splrep(x, y) | ||||
| 
 | ||||
| def getindexbounds(f, eta): | ||||
|     mi = f.argmax() | ||||
|     m = max(f) | ||||
|     b = m * eta | ||||
|     l = find_nearest(f[:mi], b) | ||||
|     u = find_nearest(f[mi:], b) + mi | ||||
|     return mi, l, u | ||||
| 
 | ||||
| def worker(func, input, cores='max', async=False): | ||||
|     import multiprocessing | ||||
| 
 | ||||
|     if cores == 'max': | ||||
|         cores = multiprocessing.cpu_count() | ||||
| 
 | ||||
|     pool = multiprocessing.Pool(cores) | ||||
|     if async: | ||||
|         result = pool.map_async(func, input) | ||||
|     else: | ||||
|         result = pool.map(func, input) | ||||
|     pool.close() | ||||
|     return result | ||||
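| # usage sketch (illustrative only): apply a function to a list of inputs on | ||||
| # all available cores, e.g. | ||||
| #     results = worker(some_picking_func, station_list, cores='max') | ||||
| # with async=True a multiprocessing AsyncResult is returned instead, whose | ||||
| # entries become available via results.get() | ||||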
| 
 | ||||
| def clims(lim1, lim2): | ||||
|     """ | ||||
|     takes two pairs of limits and returns one pair of common limts | ||||
|     :param lim1: | ||||
|     :param lim2: | ||||
|     :return: | ||||
| 
 | ||||
|     >>> clims([0, 4], [1, 3]) | ||||
|     [0, 4] | ||||
|     >>> clims([1, 4], [0, 3]) | ||||
|     [0, 4] | ||||
|     >>> clims([1, 3], [0, 4]) | ||||
|     [0, 4] | ||||
|     >>> clims([0, 3], [1, 4]) | ||||
|     [0, 4] | ||||
|     >>> clims([0, 3], [0, 4]) | ||||
|     [0, 4] | ||||
|     >>> clims([1, 4], [0, 4]) | ||||
|     [0, 4] | ||||
|     >>> clims([0, 4], [0, 4]) | ||||
|     [0, 4] | ||||
|     >>> clims([0, 4], [1, 4]) | ||||
|     [0, 4] | ||||
|     >>> clims([0, 4], [0, 3]) | ||||
|     [0, 4] | ||||
|     """ | ||||
|     lim = [None, None] | ||||
|     if lim1[0] < lim2[0]: | ||||
|         lim[0] = lim1[0] | ||||
|     else: | ||||
|         lim[0] = lim2[0] | ||||
|     if lim1[1] > lim2[1]: | ||||
|         lim[1] = lim1[1] | ||||
|     else: | ||||
|         lim[1] = lim2[1] | ||||
|     return lim | ||||
| 
 | ||||
| 
 | ||||
| def demeanTrace(trace, window): | ||||
|     """ | ||||
|     takes a trace object and returns the same trace object but with data | ||||
|     demeaned within a certain time window | ||||
|     :param trace: waveform trace object | ||||
|     :type trace: `~obspy.core.stream.Trace` | ||||
|     :param window: slice object selecting the samples used to compute the mean | ||||
|     :type window: tuple | ||||
|     :return: trace | ||||
|     :rtype: `~obspy.core.stream.Trace` | ||||
|     """ | ||||
|     trace.data -= trace.data[window].mean() | ||||
|     return trace | ||||
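| # usage sketch (illustrative): remove the mean of the first 500 samples | ||||
| #     tr = demeanTrace(st[0], slice(0, 500)) | ||||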
| 
 | ||||
| 
 | ||||
| def findComboBoxIndex(combo_box, val): | ||||
|     """ | ||||
|     Function findComboBoxIndex takes a QComboBox object and a string and | ||||
|     returns either 0 or the index throughout all QComboBox items. | ||||
|     :param combo_box: Combo box object. | ||||
|     :type combo_box: `~QComboBox` | ||||
|     :param val: Name of a combo box to search for. | ||||
|     :type val: basestring | ||||
|     :return: index value of item with name val or 0 | ||||
|     """ | ||||
|     index = combo_box.findText(val) | ||||
|     return index if index != -1 else 0 | ||||
| 
 | ||||
| def find_in_list(lst, pattern): | ||||
|     """ | ||||
|     takes a list of strings and a string and returns the first list item | ||||
|     matching the string pattern | ||||
|     :param lst: list to search in | ||||
|     :param pattern: pattern to search for | ||||
|     :return: first list item containing pattern | ||||
| 
 | ||||
|     .. example:: | ||||
| 
 | ||||
|     >>> l = ['/dir/e1234.123.12', '/dir/e2345.123.12', 'abc123', 'def456'] | ||||
|     >>> find_in_list(l, 'dir') | ||||
|     '/dir/e1234.123.12' | ||||
|     >>> find_in_list(l, 'e1234') | ||||
|     '/dir/e1234.123.12' | ||||
|     >>> find_in_list(l, 'e2') | ||||
|     '/dir/e2345.123.12' | ||||
|     >>> find_in_list(l, 'ABC') | ||||
|     'abc123' | ||||
|     >>> find_in_list(l, 'f456') | ||||
|     'def456' | ||||
|     >>> find_in_list(l, 'gurke') | ||||
| 
 | ||||
|     """ | ||||
|     rlist = [s for s in lst if pattern.lower() in s.lower()] | ||||
|     if rlist: | ||||
|         return rlist[0] | ||||
|     return None | ||||
| 
 | ||||
| def find_nearest(array, value): | ||||
|     ''' | ||||
|     function find_nearest takes an array and a value and returns the | ||||
|     index of the nearest value found in the array | ||||
|     :param array: array containing values | ||||
|     :type array: `~numpy.ndarray` | ||||
|     :param value: number searched for | ||||
|     :return: index of the array item being nearest to the value | ||||
| 
 | ||||
|     >>> a = np.array([ 1.80339578, -0.72546654,  0.95769195, -0.98320759, 0.85922623]) | ||||
|     >>> find_nearest(a, 1.3) | ||||
|     2 | ||||
|     >>> find_nearest(a, 0) | ||||
|     1 | ||||
|     >>> find_nearest(a, 2) | ||||
|     0 | ||||
|     >>> find_nearest(a, -1) | ||||
|     3 | ||||
|     >>> a = np.array([ 1.1, -0.7,  0.9, -0.9, 0.8]) | ||||
|     >>> find_nearest(a, 0.849) | ||||
|     4 | ||||
|     ''' | ||||
|     return (np.abs(array - value)).argmin() | ||||
| 
 | ||||
| 
 | ||||
| def fnConstructor(s): | ||||
|     ''' | ||||
|     takes a string and returns a valid filename (especially on windows machines) | ||||
|     :param s: desired filename | ||||
|     :type s: str | ||||
|     :return: valid filename | ||||
|     ''' | ||||
|     if type(s) is str: | ||||
|         s = s.split(':')[-1] | ||||
|     else: | ||||
|         s = getHash(UTCDateTime()) | ||||
| 
 | ||||
|     badchars = re.compile(r'[^A-Za-z0-9_. ]+|^\.|\.$|^ | $|^$') | ||||
|     badsuffix = re.compile(r'(aux|com[1-9]|con|lpt[1-9]|prn)(\.|$)') | ||||
| 
 | ||||
|     fn = badchars.sub('_', s) | ||||
| 
 | ||||
|     if badsuffix.match(fn): | ||||
|         fn = '_' + fn | ||||
|     return fn | ||||
| 
 | ||||
| 
 | ||||
| def four_digits(year): | ||||
|     """ | ||||
|     takes a two digit year integer and returns the correct four digit equivalent | ||||
|     from the last 100 years | ||||
|     :param year: two digit year | ||||
|     :type year: int | ||||
|     :return: corresponding four digit year | ||||
| 
 | ||||
|     >>> four_digits(20) | ||||
|     1920 | ||||
|     >>> four_digits(16) | ||||
|     2016 | ||||
|     >>> four_digits(00) | ||||
|     2000 | ||||
|     """ | ||||
|     if year + 2000 <= UTCDateTime.utcnow().year: | ||||
|         year += 2000 | ||||
|     else: | ||||
|         year += 1900 | ||||
|     return year | ||||
| 
 | ||||
| 
 | ||||
| def common_range(stream): | ||||
|     ''' | ||||
|     takes a stream object and returns the latest start and the earliest end | ||||
|     time of all contained trace objects, i.e. the time span covered by | ||||
|     every single trace | ||||
|     :param stream: seismological data stream | ||||
|     :type stream: `~obspy.core.stream.Stream` | ||||
|     :return: latest (maximum) start time and earliest (minimum) end time | ||||
|     ''' | ||||
|     max_start = None | ||||
|     min_end = None | ||||
|     for trace in stream: | ||||
|         if max_start is None or trace.stats.starttime > max_start: | ||||
|             max_start = trace.stats.starttime | ||||
|         if min_end is None or trace.stats.endtime < min_end: | ||||
|             min_end = trace.stats.endtime | ||||
|     return max_start, min_end | ||||
| 
 | ||||
| 
 | ||||
| def full_range(stream): | ||||
|     ''' | ||||
|     takes a stream object and returns the earliest start and the latest end | ||||
|     time of all contained trace objects | ||||
|     :param stream: seismological data stream | ||||
|     :type stream: `~obspy.core.stream.Stream` | ||||
|     :return: minimum start time and maximum end time | ||||
|     ''' | ||||
|     min_start = UTCDateTime() | ||||
|     max_end = None | ||||
|     for trace in stream: | ||||
|         if trace.stats.starttime < min_start: | ||||
|             min_start = trace.stats.starttime | ||||
|         if max_end is None or trace.stats.endtime > max_end: | ||||
|             max_end = trace.stats.endtime | ||||
|     return min_start, max_end | ||||
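| # usage sketch (illustrative): trim all traces to their common time window | ||||
| #     from obspy import read | ||||
| #     st = read() | ||||
| #     tstart, tend = common_range(st) | ||||
| #     st.trim(tstart, tend) | ||||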
| 
 | ||||
| 
 | ||||
| def getHash(time): | ||||
|     ''' | ||||
|     takes a time object and returns the corresponding SHA1 hash of the | ||||
|     formatted date string | ||||
|     :param time: time object for which a hash should be calculated | ||||
|     :type time: :class: `~obspy.core.utcdatetime.UTCDateTime` object | ||||
|     :return: str | ||||
|     ''' | ||||
|     hg = hashlib.sha1() | ||||
|     hg.update(time.strftime('%Y-%m-%d %H:%M:%S.%f')) | ||||
|     return hg.hexdigest() | ||||
| 
 | ||||
| 
 | ||||
| def getLogin(): | ||||
|     ''' | ||||
|     returns the actual user's login ID | ||||
|     :return: login ID | ||||
|     ''' | ||||
|     return pwd.getpwuid(os.getuid())[0] | ||||
| 
 | ||||
| 
 | ||||
| def getOwner(fn): | ||||
|     ''' | ||||
|     takes a filename and returns the login ID of the actual owner of the file | ||||
|     :param fn: filename of the file tested | ||||
|     :type fn: str | ||||
|     :return: login ID of the file's owner | ||||
|     ''' | ||||
|     return pwd.getpwuid(os.stat(fn).st_uid).pw_name | ||||
| 
 | ||||
| 
 | ||||
| def getPatternLine(fn, pattern): | ||||
|     """ | ||||
|     takes a file name and a pattern string to search for in the file and | ||||
|     returns the first line which contains the pattern string otherwise 'None' | ||||
|     :param fn: file name | ||||
|     :type fn: str | ||||
|     :param pattern: pattern string to search for | ||||
|     :type pattern: str | ||||
|     :return: the complete line containing the pattern string or None | ||||
| 
 | ||||
|     >>> getPatternLine('utils.py', 'python') | ||||
|     '#!/usr/bin/env python\\n' | ||||
|     >>> print(getPatternLine('version.py', 'palindrome')) | ||||
|     None | ||||
|     """ | ||||
|     # use a context manager so the file is closed on every code path | ||||
|     with open(fn, 'r') as fobj: | ||||
|         for line in fobj: | ||||
|             if pattern in line: | ||||
|                 return line | ||||
|     return None | ||||
| 
 | ||||
| def is_executable(fn): | ||||
|     """ | ||||
|     takes a filename and returns True if the file is executable on the system | ||||
|     and False otherwise | ||||
|     :param fn: path to the file to be tested | ||||
|     :return: True or False | ||||
|     """ | ||||
|     return os.path.isfile(fn) and os.access(fn, os.X_OK) | ||||
| 
 | ||||
| 
 | ||||
| def isSorted(iterable): | ||||
|     ''' | ||||
|     takes an iterable and returns 'True' if the items are in order otherwise | ||||
|     'False' | ||||
|     :param iterable: an iterable object | ||||
|     :type iterable: | ||||
|     :return: Boolean | ||||
| 
 | ||||
|     >>> isSorted(1) | ||||
|     Traceback (most recent call last): | ||||
|     ... | ||||
|     AssertionError: object is not iterable; object: 1 | ||||
|     >>> isSorted([1,2,3,4]) | ||||
|     True | ||||
|     >>> isSorted('abcd') | ||||
|     True | ||||
|     >>> isSorted('bcad') | ||||
|     False | ||||
|     >>> isSorted([2,3,1,4]) | ||||
|     False | ||||
|     ''' | ||||
|     assert isIterable(iterable), 'object is not iterable; object: {' \ | ||||
|                                  '0}'.format(iterable) | ||||
|     if type(iterable) is str: | ||||
|         iterable = [s for s in iterable] | ||||
|     return sorted(iterable) == iterable | ||||
| 
 | ||||
| 
 | ||||
| def isIterable(obj): | ||||
|     """ | ||||
|     takes a python object and returns 'True' if the object is iterable and | ||||
|     'False' otherwise | ||||
|     :param obj: a python object | ||||
|     :return: True or False | ||||
|     """ | ||||
|     try: | ||||
|         iter(obj) | ||||
|     except TypeError: | ||||
|         return False | ||||
|     return True | ||||
| 
 | ||||
| 
 | ||||
| def key_for_set_value(d): | ||||
|     """ | ||||
|     takes a dictionary and returns the first key whose value evaluates to | ||||
|     True | ||||
|     :param d: dictionary containing values | ||||
|     :type d: dict | ||||
|     :return: key to the first non-False value found; None if no value's | ||||
|     boolean equals True | ||||
|     """ | ||||
|     r = None | ||||
|     for k, v in d.items(): | ||||
|         if v: | ||||
|             return k | ||||
|     return r | ||||
| 
 | ||||
| 
 | ||||
| def prepTimeAxis(stime, trace): | ||||
|     ''' | ||||
|     takes a starttime and a trace object and returns a valid time axis for | ||||
|     plotting | ||||
|     :param stime: start time of the actual seismogram as UTCDateTime | ||||
|     :param trace: seismic trace object | ||||
|     :return: valid numpy array with time stamps for plotting | ||||
|     ''' | ||||
|     nsamp = trace.stats.npts | ||||
|     srate = trace.stats.sampling_rate | ||||
|     tincr = trace.stats.delta | ||||
|     etime = stime + nsamp / srate | ||||
|     time_ax = np.arange(stime, etime, tincr) | ||||
|     if len(time_ax) < nsamp: | ||||
|         print('extending time axis by one sample') | ||||
|         time_ax = np.arange(stime, etime + tincr, tincr) | ||||
|     elif len(time_ax) > nsamp: | ||||
|         print('shortening time axis by one sample') | ||||
|         time_ax = np.arange(stime, etime - tincr, tincr) | ||||
|     if len(time_ax) != nsamp: | ||||
|         raise ValueError('{0} samples of data \n ' | ||||
|                          '{1} length of time vector \n' | ||||
|                          'delta: {2}'.format(nsamp, len(time_ax), tincr)) | ||||
|     return time_ax | ||||
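| # usage sketch (illustrative; assumes the start time is passed as a float | ||||
| # timestamp rather than a UTCDateTime): | ||||
| #     time_ax = prepTimeAxis(tr.stats.starttime.timestamp, tr) | ||||
| #     plt.plot(time_ax, tr.data) | ||||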
| 
 | ||||
| 
 | ||||
| def find_horizontals(data): | ||||
|     """ | ||||
|     takes `obspy.core.stream.Stream` object and returns a list containing the component labels of the horizontal components available | ||||
|     :param data: waveform data | ||||
|     :type data: `obspy.core.stream.Stream` | ||||
|     :return: components list | ||||
|     :rtype: list | ||||
| 
 | ||||
|     .. example:: | ||||
| 
 | ||||
|     >>> st = read() | ||||
|     >>> find_horizontals(st) | ||||
|     [u'N', u'E'] | ||||
|     """ | ||||
|     rval = [] | ||||
|     for tr in data: | ||||
|         if tr.stats.channel[-1].upper() in ['Z', '3']: | ||||
|             continue | ||||
|         else: | ||||
|             rval.append(tr.stats.channel[-1].upper()) | ||||
|     return rval | ||||
| 
 | ||||
| 
 | ||||
| def remove_underscores(data): | ||||
|     """ | ||||
|     takes a `obspy.core.stream.Stream` object and removes all underscores | ||||
|     from station names | ||||
|     :param data: stream of seismic data | ||||
|     :type data: `obspy.core.stream.Stream` | ||||
|     :return: data stream | ||||
|     """ | ||||
|     for tr in data: | ||||
|         # remove underscores | ||||
|         tr.stats.station = tr.stats.station.strip('_') | ||||
|     return data | ||||
| 
 | ||||
| 
 | ||||
| def scaleWFData(data, factor=None, components='all'): | ||||
|     """ | ||||
|     produce scaled waveforms from given waveform data and a scaling factor, | ||||
|     waveform may be selected by their components name | ||||
|     :param data: waveform data to be scaled | ||||
|     :type data: `~obspy.core.stream.Stream` object | ||||
|     :param factor: scaling factor | ||||
|     :type factor: float | ||||
|     :param components: components labels for the traces in data to be scaled by | ||||
|      the scaling factor (optional, default: 'all') | ||||
|     :type components: tuple | ||||
|     :return:  scaled waveform data | ||||
|     :rtype: `~obspy.core.stream.Stream` object | ||||
|     """ | ||||
|     if components != 'all': | ||||
|         for comp in components: | ||||
|             if factor is None: | ||||
|                 max_val = np.max(np.abs(data.select(component=comp)[0].data)) | ||||
|                 data.select(component=comp)[0].data /= 2 * max_val | ||||
|             else: | ||||
|                 data.select(component=comp)[0].data /= 2 * factor | ||||
|     else: | ||||
|         for tr in data: | ||||
|             if factor is None: | ||||
|                 max_val = float(np.max(np.abs(tr.data))) | ||||
|                 tr.data /= 2 * max_val | ||||
|             else: | ||||
|                 tr.data /= 2 * factor | ||||
| 
 | ||||
|     return data | ||||
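| # usage sketch (illustrative): scale only the horizontal components to half | ||||
| # of their maximum amplitude | ||||
| #     st = scaleWFData(st, components=('N', 'E')) | ||||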
| 
 | ||||
| 
 | ||||
| def runProgram(cmd, parameter=None): | ||||
|     """ | ||||
|     run an external program specified by cmd with parameters input returning the | ||||
|     stdout output | ||||
|     :param cmd: name of the command to run | ||||
|     :type cmd: str | ||||
|     :param parameter: filename of parameter file  or parameter string | ||||
|     :type parameter: str | ||||
|     :return: stdout output | ||||
|     :rtype: str | ||||
|     """ | ||||
| 
 | ||||
|     if parameter: | ||||
|         cmd = cmd.strip() | ||||
|         cmd += ' %s 2>&1' % parameter | ||||
| 
 ||||
|     # check_output returns the command's stdout; return it as documented | ||||
|     return subprocess.check_output('{} | tee /dev/stderr'.format(cmd), | ||||
|                                    shell=True) | ||||
| 
 | ||||
| def which(program): | ||||
|     """ | ||||
|     takes a program name and returns the full path to the executable or None | ||||
|     modified after: http://stackoverflow.com/questions/377017/test-if-executable-exists-in-python | ||||
|     :param program: name of the desired external program | ||||
|     :return: full path of the executable file | ||||
|     """ | ||||
|     try: | ||||
|         from PySide.QtCore import QSettings | ||||
|         settings = QSettings() | ||||
|         for key in settings.allKeys(): | ||||
|             if 'binPath' in key: | ||||
|                 os.environ['PATH'] += ':{0}'.format(settings.value(key)) | ||||
|         bpath = os.path.join(os.path.expanduser('~'), '.pylot', 'autoPyLoT.in') | ||||
|         if os.path.exists(bpath): | ||||
|             nllocpath = ":" + AutoPickParameter(bpath).get('nllocbin') | ||||
|             os.environ['PATH'] += nllocpath | ||||
|     except ImportError as e: | ||||
|         print(e.message) | ||||
| 
 | ||||
|     def is_exe(fpath): | ||||
|         return os.path.exists(fpath) and os.access(fpath, os.X_OK) | ||||
| 
 | ||||
|     def ext_candidates(fpath): | ||||
|         yield fpath | ||||
|         for ext in os.environ.get("PATHEXT", "").split(os.pathsep): | ||||
|             yield fpath + ext | ||||
| 
 | ||||
|     fpath, fname = os.path.split(program) | ||||
|     if fpath: | ||||
|         if is_exe(program): | ||||
|             return program | ||||
|     else: | ||||
|         for path in os.environ["PATH"].split(os.pathsep): | ||||
|             exe_file = os.path.join(path, program) | ||||
|             for candidate in ext_candidates(exe_file): | ||||
|                 if is_exe(candidate): | ||||
|                     return candidate | ||||
| 
 | ||||
|     return None | ||||
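| # usage sketch (illustrative): locate the NonLinLoc binary and run it on a | ||||
| # control file; 'NLLoc' and the control file path are placeholders | ||||
| #     nlloc = which('NLLoc') | ||||
| #     if nlloc: | ||||
| #         output = runProgram(nlloc, parameter='path/to/nlloc.ctrl') | ||||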
| 
 | ||||
| if __name__ == "__main__": | ||||
|     import doctest | ||||
| 
 | ||||
|     doctest.testmod() | ||||
114  pylot/core/util/version.py  (Executable file)
						| @ -0,0 +1,114 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| # Author: Douglas Creager <dcreager@dcreager.net> | ||||
| # This file is placed into the public domain. | ||||
| 
 | ||||
| # Calculates the current version number.  If possible, this is the | ||||
| # output of “git describe”, modified to conform to the versioning | ||||
| # scheme that setuptools uses.  If “git describe” returns an error | ||||
| # (most likely because we're in an unpacked copy of a release tarball, | ||||
| # rather than in a git working copy), then we fall back on reading the | ||||
| # contents of the RELEASE-VERSION file. | ||||
| # | ||||
| # To use this script, simply import it in your setup.py file, and use the | ||||
| # results of get_git_version() as your package version: | ||||
| # | ||||
| # from version import * | ||||
| # | ||||
| # setup( | ||||
| #     version=get_git_version(), | ||||
| #     . | ||||
| #     . | ||||
| #     . | ||||
| # ) | ||||
| # | ||||
| # This will automatically update the RELEASE-VERSION file, if | ||||
| # necessary.  Note that the RELEASE-VERSION file should *not* be | ||||
| # checked into git; please add it to your top-level .gitignore file. | ||||
| # | ||||
| # You'll probably want to distribute the RELEASE-VERSION file in your | ||||
| # sdist tarballs; to do this, just create a MANIFEST.in file that | ||||
| # contains the following line: | ||||
| # | ||||
| #   include RELEASE-VERSION | ||||
| 
 | ||||
| from __future__ import print_function | ||||
| 
 | ||||
| __all__ = "get_git_version" | ||||
| 
 | ||||
| # NO IMPORTS FROM PYLOT IN THIS FILE! (file gets used at installation time) | ||||
| import os | ||||
| import inspect | ||||
| from subprocess import Popen, PIPE | ||||
| 
 | ||||
| # NO IMPORTS FROM PYLOT IN THIS FILE! (file gets used at installation time) | ||||
| 
 | ||||
| script_dir = os.path.abspath(os.path.dirname(inspect.getfile( | ||||
|     inspect.currentframe()))) | ||||
| PYLOT_ROOT = os.path.abspath(os.path.join(script_dir, os.pardir, | ||||
|                                           os.pardir, os.pardir)) | ||||
| VERSION_FILE = os.path.join(PYLOT_ROOT, "pylot", "RELEASE-VERSION") | ||||
| 
 | ||||
| 
 | ||||
| def call_git_describe(abbrev=4): | ||||
|     try: | ||||
|         p = Popen(['git', 'rev-parse', '--show-toplevel'], | ||||
|                   cwd=PYLOT_ROOT, stdout=PIPE, stderr=PIPE) | ||||
|         p.stderr.close() | ||||
|         path = p.stdout.readlines()[0].strip() | ||||
|     except: | ||||
|         return None | ||||
|     if os.path.normpath(path) != PYLOT_ROOT: | ||||
|         return None | ||||
|     try: | ||||
|         p = Popen(['git', 'describe', '--dirty', '--abbrev=%d' % abbrev, | ||||
|                    '--always'], | ||||
|                   cwd=PYLOT_ROOT, stdout=PIPE, stderr=PIPE) | ||||
|         p.stderr.close() | ||||
|         line = p.stdout.readlines()[0] | ||||
|         # (this line prevents official releases) | ||||
|         if "-" not in line and "." not in line: | ||||
|             line = "0.0.0-g%s" % line | ||||
|         return line.strip() | ||||
|     except: | ||||
|         return None | ||||
| 
 | ||||
| 
 | ||||
| def read_release_version(): | ||||
|     try: | ||||
|         version = open(VERSION_FILE, "r").readlines()[0] | ||||
|         return version.strip() | ||||
|     except: | ||||
|         return None | ||||
| 
 | ||||
| 
 | ||||
| def write_release_version(version): | ||||
|     open(VERSION_FILE, "w").write("%s\n" % version) | ||||
| 
 | ||||
| 
 | ||||
| def get_git_version(abbrev=4): | ||||
|     # Read in the version that's currently in RELEASE-VERSION. | ||||
|     release_version = read_release_version() | ||||
| 
 | ||||
|     # First try to get the current version using “git describe”. | ||||
|     version = call_git_describe(abbrev) | ||||
| 
 | ||||
|     # If that doesn't work, fall back on the value that's in | ||||
|     # RELEASE-VERSION. | ||||
|     if version is None: | ||||
|         version = release_version | ||||
| 
 | ||||
|     # If we still don't have anything, that's an error. | ||||
|     if version is None: | ||||
|         return '0.0.0-tar/zipball' | ||||
| 
 | ||||
|     # If the current version is different from what's in the | ||||
|     # RELEASE-VERSION file, update the file to be current. | ||||
|     if version != release_version: | ||||
|         write_release_version(version) | ||||
| 
 | ||||
|     # Finally, return the current version. | ||||
|     return version | ||||
| 
 | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     print(get_git_version()) | ||||
1636  pylot/core/util/widgets.py  (Normal file)
0  pylot/testing/__init__.py  (Normal file)
12  pylot/testing/testHelpForm.py  (Executable file)
						| @ -0,0 +1,12 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import sys | ||||
| from PySide.QtGui import QApplication | ||||
| from pylot.core.util.widgets import HelpForm | ||||
| 
 | ||||
| app = QApplication(sys.argv) | ||||
| 
 | ||||
| win = HelpForm() | ||||
| win.show() | ||||
| app.exec_() | ||||
20  pylot/testing/testPickDlg.py  (Executable file)
						| @ -0,0 +1,20 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import sys | ||||
| import matplotlib | ||||
| 
 | ||||
| matplotlib.use('Qt4Agg') | ||||
| matplotlib.rcParams['backend.qt4'] = 'PySide' | ||||
| 
 | ||||
| from PySide.QtGui import QApplication | ||||
| from obspy.core import read | ||||
| from pylot.core.util.widgets import PickDlg | ||||
| import icons_rc | ||||
| 
 | ||||
| app = QApplication(sys.argv) | ||||
| 
 | ||||
| data = read() | ||||
| win = PickDlg(data=data) | ||||
| win.show() | ||||
| app.exec_() | ||||
12  pylot/testing/testPropDlg.py  (Executable file)
						| @ -0,0 +1,12 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import sys | ||||
| from PySide.QtGui import QApplication | ||||
| from pylot.core.util.widgets import PropertiesDlg | ||||
| 
 | ||||
| app = QApplication(sys.argv) | ||||
| 
 | ||||
| win = PropertiesDlg() | ||||
| win.show() | ||||
| app.exec_() | ||||
17  pylot/testing/testUIcomponents.py  (Executable file)
						| @ -0,0 +1,17 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| 
 | ||||
| import sys, time | ||||
| from PySide.QtGui import QApplication | ||||
| from pylot.core.util.widgets import FilterOptionsDialog, PropertiesDlg, HelpForm | ||||
| 
 | ||||
| dialogs = [FilterOptionsDialog, PropertiesDlg, HelpForm] | ||||
| 
 | ||||
| app = QApplication(sys.argv) | ||||
| 
 | ||||
| for dlg in dialogs: | ||||
|     win = dlg() | ||||
|     win.show() | ||||
|     time.sleep(1) | ||||
|     win.destroy() | ||||
27  pylot/testing/testUtils.py  (Normal file)
						| @ -0,0 +1,27 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| ''' | ||||
| Created on 10.11.2014 | ||||
| 
 | ||||
| @author: sebastianw | ||||
| ''' | ||||
| import unittest | ||||
| 
 | ||||
| 
 | ||||
| class Test(unittest.TestCase): | ||||
| 
 | ||||
| 
 | ||||
|     def setUp(self): | ||||
|         pass | ||||
| 
 | ||||
| 
 | ||||
|     def tearDown(self): | ||||
|         pass | ||||
| 
 | ||||
| 
 | ||||
|     def testName(self): | ||||
|         pass | ||||
| 
 | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     #import sys;sys.argv = ['', 'Test.testName'] | ||||
|     unittest.main() | ||||
19  pylot/testing/testWidgets.py  (Normal file)
						| @ -0,0 +1,19 @@ | ||||
| # -*- coding: utf-8 -*- | ||||
| ''' | ||||
| Created on 10.11.2014 | ||||
| 
 | ||||
| @author: sebastianw | ||||
| ''' | ||||
| import unittest | ||||
| 
 | ||||
| 
 | ||||
| class Test(unittest.TestCase): | ||||
| 
 | ||||
| 
 | ||||
|     def testName(self): | ||||
|         pass | ||||
| 
 | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     #import sys;sys.argv = ['', 'Test.testName'] | ||||
|     unittest.main() | ||||
303  pylot/testing/test_autopick.py  (Executable file)
						| @ -0,0 +1,303 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| """ | ||||
|    Script to run autoPyLoT-script "makeCF.py". | ||||
|    Only for test purposes! | ||||
| """ | ||||
| 
 | ||||
| from obspy.core import read | ||||
| import matplotlib.pyplot as plt | ||||
| import numpy as np | ||||
| from pylot.core.pick.charfuns import * | ||||
| from pylot.core.pick.picker import * | ||||
| import glob | ||||
| import argparse | ||||
| 
 | ||||
| def run_makeCF(project, database, event, iplot, station=None): | ||||
|     #parameters for CF calculation | ||||
|     t2 = 7                   #length of moving window for HOS calculation [sec] | ||||
|     p = 4                    #order of HOS | ||||
|     cuttimes = [10, 50]      #start and end time for CF calculation | ||||
|     bpz = [2, 30]            #corner frequencies of bandpass filter, vertical component | ||||
|     bph = [2, 15]            #corner frequencies of bandpass filter, horizontal components | ||||
|     tdetz= 1.2               #length of AR-determination window [sec], vertical component | ||||
|     tdeth= 0.8               #length of AR-determination window [sec], horizontal components | ||||
|     tpredz = 0.4             #length of AR-prediction window [sec], vertical component | ||||
|     tpredh = 0.4             #length of AR-prediction window [sec], horizontal components | ||||
|     addnoise = 0.001         #add noise to seismogram for stable AR prediction | ||||
|     arzorder = 2             #chosen order of AR process, vertical component | ||||
|     arhorder = 4             #chosen order of AR process, horizontal components | ||||
|     TSNRhos = [5, 0.5, 1, 0.1] #window lengths [s] for calculating SNR for earliest/latest pick and quality assessment | ||||
|                              #from HOS-CF [noise window, safety gap, signal window, slope determination window] | ||||
|     TSNRarz = [5, 0.5, 1, 0.5] #window lengths [s] for calculating SNR for earliest/latest pick and quality assessment | ||||
|                              #from ARZ-CF | ||||
|     #get waveform data | ||||
|     if station: | ||||
|        dpz = '/data/%s/EVENT_DATA/LOCAL/%s/%s/%s*HZ.msd' % (project, database, event, station) | ||||
|        dpe = '/data/%s/EVENT_DATA/LOCAL/%s/%s/%s*HE.msd' % (project, database, event, station) | ||||
|        dpn = '/data/%s/EVENT_DATA/LOCAL/%s/%s/%s*HN.msd' % (project, database, event, station) | ||||
|        #dpz = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*_z.gse' % (project, database, event, station) | ||||
|        #dpe = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*_e.gse' % (project, database, event, station) | ||||
|        #dpn = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*_n.gse' % (project, database, event, station) | ||||
|     else: | ||||
|         # dpz = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/*_z.gse' % (project, database, event) | ||||
|         # dpe = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/*_e.gse' % (project, database, event) | ||||
|         # dpn = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/*_n.gse' % (project, database, event) | ||||
|         dpz = '/data/%s/EVENT_DATA/LOCAL/%s/%s/*HZ.msd' % (project, database, event) | ||||
|         dpe = '/data/%s/EVENT_DATA/LOCAL/%s/%s/*HE.msd' % (project, database, event) | ||||
|         dpn = '/data/%s/EVENT_DATA/LOCAL/%s/%s/*HN.msd' % (project, database, event) | ||||
|     wfzfiles = glob.glob(dpz) | ||||
|     wfefiles = glob.glob(dpe) | ||||
|     wfnfiles = glob.glob(dpn) | ||||
|     if wfzfiles: | ||||
|        for i in range(len(wfzfiles)): | ||||
|           print('Vertical component data found ...') | ||||
|           print(wfzfiles[i]) | ||||
|           st = read('%s' % wfzfiles[i]) | ||||
|           st_copy = st.copy() | ||||
|           #filter and taper data | ||||
|           tr_filt = st[0].copy() | ||||
|           tr_filt.filter('bandpass', freqmin=bpz[0], freqmax=bpz[1], zerophase=False) | ||||
|           tr_filt.taper(max_percentage=0.05, type='hann') | ||||
|           st_copy[0].data = tr_filt.data | ||||
|           ############################################################## | ||||
|           #calculate HOS-CF using subclass HOScf of class CharacteristicFunction | ||||
|           hoscf = HOScf(st_copy, cuttimes, t2, p) #instance of HOScf | ||||
|           ############################################################## | ||||
|           #calculate AIC-HOS-CF using subclass AICcf of class CharacteristicFunction | ||||
|           #class needs stream object => build it | ||||
|           tr_aic = tr_filt.copy() | ||||
|           tr_aic.data = hoscf.getCF() | ||||
|           st_copy[0].data = tr_aic.data | ||||
|           aiccf = AICcf(st_copy, cuttimes) #instance of AICcf | ||||
|           ############################################################## | ||||
|           #get preliminary onset time from AIC-HOS-CF using subclass AICPicker of class AutoPicking | ||||
|           aicpick = AICPicker(aiccf, None, TSNRhos, 3, 10, None, 0.1) | ||||
|           ############################################################## | ||||
|           #get refined onset time from HOS-CF using class Picker | ||||
|           hospick = PragPicker(hoscf, None, TSNRhos, 2, 10, 0.001, 0.2, aicpick.getpick()) | ||||
|           #get earliest and latest possible picks | ||||
|           hosELpick = EarlLatePicker(hoscf, 1.5, TSNRhos, None, 10, None, None, hospick.getpick()) | ||||
|           ############################################################## | ||||
|           #calculate ARZ-CF using subclass ARZcf of class CharacteristicFunction | ||||
|           #get stream object of filtered data | ||||
|           st_copy[0].data = tr_filt.data | ||||
|           arzcf = ARZcf(st_copy, cuttimes, tpredz, arzorder, tdetz, addnoise) #instance of ARZcf | ||||
|           ############################################################## | ||||
|           #calculate AIC-ARZ-CF using subclass AICcf of class CharacteristicFunction | ||||
|           #class needs stream object => build it | ||||
|           tr_arzaic = tr_filt.copy() | ||||
|           tr_arzaic.data = arzcf.getCF() | ||||
|           st_copy[0].data = tr_arzaic.data | ||||
|           araiccf = AICcf(st_copy, cuttimes, tpredz, 0, tdetz) #instance of AICcf | ||||
|           ############################################################## | ||||
|           #get onset time from AIC-ARZ-CF using subclass AICPicker of class AutoPicking | ||||
|           aicarzpick = AICPicker(araiccf, 1.5, TSNRarz, 2, 10, None, 0.1) | ||||
|           ############################################################## | ||||
|           #get refined onset time from ARZ-CF using class Picker | ||||
|           arzpick = PragPicker(arzcf, 1.5, TSNRarz, 2.0, 10, 0.1, 0.05, aicarzpick.getpick()) | ||||
|           #get earliest and latest possible picks | ||||
|           arzELpick = EarlLatePicker(arzcf, 1.5, TSNRarz, None, 10, None, None, arzpick.getpick()) | ||||
|     else: | ||||
|         print('No vertical component data found!') | ||||
| 
 | ||||
|     if wfefiles and wfnfiles: | ||||
|        for i in range(len(wfefiles)): | ||||
|           print('Horizontal component data found ...') | ||||
|           print(wfefiles[i]) | ||||
|           print(wfnfiles[i]) | ||||
|           #merge streams | ||||
|           H = read('%s' % wfefiles[i]) | ||||
|           H += read('%s' % wfnfiles[i]) | ||||
|           H_copy = H.copy() | ||||
|           #filter and taper data | ||||
|           trH1_filt = H[0].copy() | ||||
|           trH2_filt = H[1].copy() | ||||
|           trH1_filt.filter('bandpass', freqmin=bph[0], freqmax=bph[1], zerophase=False) | ||||
|           trH2_filt.filter('bandpass', freqmin=bph[0], freqmax=bph[1], zerophase=False) | ||||
|           trH1_filt.taper(max_percentage=0.05, type='hann') | ||||
|           trH2_filt.taper(max_percentage=0.05, type='hann') | ||||
|           H_copy[0].data = trH1_filt.data | ||||
|           H_copy[1].data = trH2_filt.data | ||||
| 
 | ||||
|           ############################################################## | ||||
|           #calculate ARH-CF using subclass ARHcf of class CharacteristicFunction | ||||
|           arhcf = ARHcf(H_copy, cuttimes, tpredh, arhorder, tdeth, addnoise) #instance of ARHcf | ||||
|           ############################################################## | ||||
|           #calculate AIC-ARH-CF using subclass AICcf of class CharacteristicFunction | ||||
|           #class needs stream object => build it | ||||
|           tr_arhaic = trH1_filt.copy() | ||||
|           tr_arhaic.data = arhcf.getCF() | ||||
|           H_copy[0].data = tr_arhaic.data | ||||
|           #calculate ARH-AIC-CF | ||||
|           arhaiccf = AICcf(H_copy, cuttimes, tpredh, 0, tdeth) #instance of AICcf | ||||
|           ############################################################## | ||||
|           #get onset time from AIC-ARH-CF using subclass AICPicker of class AutoPicking | ||||
|           aicarhpick = AICPicker(arhaiccf, 1.5, TSNRarz, 4, 10, None, 0.1) | ||||
|           ############################################################### | ||||
|           #get refined onset time from ARH-CF using class Picker | ||||
|           arhpick = PragPicker(arhcf, 1.5, TSNRarz, 2.5, 10, 0.1, 0.05, aicarhpick.getpick()) | ||||
|           #get earliest and latest possible picks | ||||
|           arhELpick = EarlLatePicker(arhcf, 1.5, TSNRarz, None, 10, None, None, arhpick.getpick()) | ||||
| 
 | ||||
|           #create stream with 3 traces | ||||
|           #merge streams | ||||
|           AllC = read('%s' % wfefiles[i]) | ||||
|           AllC += read('%s' % wfnfiles[i]) | ||||
|           AllC += read('%s' % wfzfiles[i]) | ||||
|           #filter and taper data | ||||
|           All1_filt = AllC[0].copy() | ||||
|           All2_filt = AllC[1].copy() | ||||
|           All3_filt = AllC[2].copy() | ||||
|           All1_filt.filter('bandpass', freqmin=bph[0], freqmax=bph[1], zerophase=False) | ||||
|           All2_filt.filter('bandpass', freqmin=bph[0], freqmax=bph[1], zerophase=False) | ||||
|           All3_filt.filter('bandpass', freqmin=bpz[0], freqmax=bpz[1], zerophase=False) | ||||
|           All1_filt.taper(max_percentage=0.05, type='hann') | ||||
|           All2_filt.taper(max_percentage=0.05, type='hann') | ||||
|           All3_filt.taper(max_percentage=0.05, type='hann') | ||||
|           AllC[0].data = All1_filt.data | ||||
|           AllC[1].data = All2_filt.data | ||||
|           AllC[2].data = All3_filt.data | ||||
|           #calculate AR3C-CF using subclass AR3Ccf of class CharacteristicFunction | ||||
|           ar3ccf = AR3Ccf(AllC, cuttimes, tpredz, arhorder, tdetz, addnoise) #instance of AR3Ccf | ||||
|           #get earliest and latest possible pick from initial ARH-pick | ||||
|           ar3cELpick = EarlLatePicker(ar3ccf, 1.5, TSNRarz, None, 10, None, None, arhpick.getpick()) | ||||
|           ############################################################## | ||||
|           if iplot: | ||||
|              #plot vertical trace | ||||
|              plt.figure() | ||||
|              tr = st[0] | ||||
|              tdata = np.arange(0, tr.stats.npts / tr.stats.sampling_rate, tr.stats.delta) | ||||
|              p1, = plt.plot(tdata, tr_filt.data/max(tr_filt.data), 'k') | ||||
|              p2, = plt.plot(hoscf.getTimeArray(), hoscf.getCF() / max(hoscf.getCF()), 'r') | ||||
|              p3, = plt.plot(aiccf.getTimeArray(), aiccf.getCF()/max(aiccf.getCF()), 'b') | ||||
|              p4, = plt.plot(arzcf.getTimeArray(), arzcf.getCF()/max(arzcf.getCF()), 'g') | ||||
|              p5, = plt.plot(araiccf.getTimeArray(), araiccf.getCF()/max(araiccf.getCF()), 'y') | ||||
|              plt.plot([aicpick.getpick(), aicpick.getpick()], [-1, 1], 'b--') | ||||
|              plt.plot([aicpick.getpick()-0.5, aicpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([aicpick.getpick()-0.5, aicpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([hospick.getpick(), hospick.getpick()], [-1.3, 1.3], 'r', linewidth=2) | ||||
|              plt.plot([hospick.getpick()-0.5, hospick.getpick()+0.5], [1.3, 1.3], 'r') | ||||
|              plt.plot([hospick.getpick()-0.5, hospick.getpick()+0.5], [-1.3, -1.3], 'r') | ||||
|              plt.plot([hosELpick.getLpick(), hosELpick.getLpick()], [-1.1, 1.1], 'r--') | ||||
|              plt.plot([hosELpick.getEpick(), hosELpick.getEpick()], [-1.1, 1.1], 'r--') | ||||
|              plt.plot([aicarzpick.getpick(), aicarzpick.getpick()], [-1.2, 1.2], 'y', linewidth=2) | ||||
|              plt.plot([aicarzpick.getpick()-0.5, aicarzpick.getpick()+0.5], [1.2, 1.2], 'y') | ||||
|              plt.plot([aicarzpick.getpick()-0.5, aicarzpick.getpick()+0.5], [-1.2, -1.2], 'y') | ||||
|              plt.plot([arzpick.getpick(), arzpick.getpick()], [-1.4, 1.4], 'g', linewidth=2) | ||||
|              plt.plot([arzpick.getpick()-0.5, arzpick.getpick()+0.5], [1.4, 1.4], 'g') | ||||
|              plt.plot([arzpick.getpick()-0.5, arzpick.getpick()+0.5], [-1.4, -1.4], 'g') | ||||
|              plt.plot([arzELpick.getLpick(), arzELpick.getLpick()], [-1.2, 1.2], 'g--') | ||||
|              plt.plot([arzELpick.getEpick(), arzELpick.getEpick()], [-1.2, 1.2], 'g--') | ||||
|              plt.yticks([]) | ||||
|              plt.ylim([-1.5, 1.5]) | ||||
|              plt.xlabel('Time [s]') | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title('%s, %s, CF-SNR=%7.2f, CF-Slope=%12.2f' % (tr.stats.station, \ | ||||
|                         tr.stats.channel, aicpick.getSNR(), aicpick.getSlope())) | ||||
|              plt.suptitle(tr.stats.starttime) | ||||
|              plt.legend([p1, p2, p3, p4, p5], ['Data', 'HOS-CF', 'HOSAIC-CF', 'ARZ-CF', 'ARZAIC-CF']) | ||||
|              #plot horizontal traces | ||||
|              plt.figure(2) | ||||
|              plt.subplot(2,1,1) | ||||
|              tsteph = tpredh / 4 | ||||
|              th1data = np.arange(0, trH1_filt.stats.npts / trH1_filt.stats.sampling_rate, trH1_filt.stats.delta) | ||||
|              th2data = np.arange(0, trH2_filt.stats.npts / trH2_filt.stats.sampling_rate, trH2_filt.stats.delta) | ||||
|              tarhcf = np.arange(0, len(arhcf.getCF()) * tsteph, tsteph) + cuttimes[0] + tdeth +tpredh | ||||
|              p21, = plt.plot(th1data, trH1_filt.data/max(trH1_filt.data), 'k') | ||||
|              p22, = plt.plot(arhcf.getTimeArray(), arhcf.getCF()/max(arhcf.getCF()), 'r') | ||||
|              p23, = plt.plot(arhaiccf.getTimeArray(), arhaiccf.getCF()/max(arhaiccf.getCF())) | ||||
|              plt.plot([aicarhpick.getpick(), aicarhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([aicarhpick.getpick()-0.5, aicarhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([aicarhpick.getpick()-0.5, aicarhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'r') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'r') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'r') | ||||
|              plt.plot([arhELpick.getLpick(), arhELpick.getLpick()], [-0.8, 0.8], 'r--') | ||||
|              plt.plot([arhELpick.getEpick(), arhELpick.getEpick()], [-0.8, 0.8], 'r--') | ||||
|              plt.plot([arhpick.getpick() + arhELpick.getPickError(), arhpick.getpick() + arhELpick.getPickError()], \ | ||||
|                        [-0.2, 0.2], 'r--') | ||||
|              plt.plot([arhpick.getpick() - arhELpick.getPickError(), arhpick.getpick() - arhELpick.getPickError()], \ | ||||
|                        [-0.2, 0.2], 'r--') | ||||
|              plt.yticks([]) | ||||
|              plt.ylim([-1.5, 1.5]) | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title([trH1_filt.stats.station, trH1_filt.stats.channel]) | ||||
|              plt.suptitle(trH1_filt.stats.starttime) | ||||
|              plt.legend([p21, p22, p23], ['Data', 'ARH-CF', 'ARHAIC-CF']) | ||||
|              plt.subplot(2,1,2) | ||||
|              plt.plot(th2data, trH2_filt.data/max(trH2_filt.data), 'k') | ||||
|              plt.plot(arhcf.getTimeArray(), arhcf.getCF()/max(arhcf.getCF()), 'r') | ||||
|              plt.plot(arhaiccf.getTimeArray(), arhaiccf.getCF()/max(arhaiccf.getCF())) | ||||
|              plt.plot([aicarhpick.getpick(), aicarhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([aicarhpick.getpick()-0.5, aicarhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([aicarhpick.getpick()-0.5, aicarhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'r') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'r') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'r') | ||||
|              plt.plot([arhELpick.getLpick(), arhELpick.getLpick()], [-0.8, 0.8], 'r--') | ||||
|              plt.plot([arhELpick.getEpick(), arhELpick.getEpick()], [-0.8, 0.8], 'r--') | ||||
|              plt.plot([arhpick.getpick() + arhELpick.getPickError(), arhpick.getpick() + arhELpick.getPickError()], \ | ||||
|                        [-0.2, 0.2], 'r--') | ||||
|              plt.plot([arhpick.getpick() - arhELpick.getPickError(), arhpick.getpick() - arhELpick.getPickError()], \ | ||||
|                        [-0.2, 0.2], 'r--') | ||||
|              plt.title([trH2_filt.stats.station, trH2_filt.stats.channel]) | ||||
|              plt.yticks([]) | ||||
|              plt.ylim([-1.5, 1.5]) | ||||
|              plt.xlabel('Time [s]') | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              #plot 3-component window | ||||
|              plt.figure(3) | ||||
|              plt.subplot(3,1,1) | ||||
|              p31, = plt.plot(tdata, tr_filt.data/max(tr_filt.data), 'k') | ||||
|              p32, = plt.plot(ar3ccf.getTimeArray(), ar3ccf.getCF()/max(ar3ccf.getCF()), 'r') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([ar3cELpick.getLpick(), ar3cELpick.getLpick()], [-0.8, 0.8], 'b--') | ||||
|              plt.plot([ar3cELpick.getEpick(), ar3cELpick.getEpick()], [-0.8, 0.8], 'b--') | ||||
|              plt.yticks([]) | ||||
|              plt.xticks([]) | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title([tr.stats.station, tr.stats.channel]) | ||||
|              plt.suptitle(trH1_filt.stats.starttime) | ||||
|              plt.legend([p31, p32], ['Data', 'AR3C-CF']) | ||||
|              plt.subplot(3,1,2) | ||||
|              plt.plot(th1data, trH1_filt.data/max(trH1_filt.data), 'k') | ||||
|              plt.plot(ar3ccf.getTimeArray(), ar3ccf.getCF()/max(ar3ccf.getCF()), 'r') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([ar3cELpick.getLpick(), ar3cELpick.getLpick()], [-0.8, 0.8], 'b--') | ||||
|              plt.plot([ar3cELpick.getEpick(), ar3cELpick.getEpick()], [-0.8, 0.8], 'b--') | ||||
|              plt.yticks([]) | ||||
|              plt.xticks([]) | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title([trH1_filt.stats.station, trH1_filt.stats.channel]) | ||||
|              plt.subplot(3,1,3) | ||||
|              plt.plot(th2data, trH2_filt.data/max(trH2_filt.data), 'k') | ||||
|              plt.plot(ar3ccf.getTimeArray(), ar3ccf.getCF()/max(ar3ccf.getCF()), 'r') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([ar3cELpick.getLpick(), ar3cELpick.getLpick()], [-0.8, 0.8], 'b--') | ||||
|              plt.plot([ar3cELpick.getEpick(), ar3cELpick.getEpick()], [-0.8, 0.8], 'b--') | ||||
|              plt.yticks([]) | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title([trH2_filt.stats.station, trH2_filt.stats.channel]) | ||||
|              plt.xlabel('Time [s]') | ||||
|              plt.show() | ||||
|              raw_input() | ||||
|              plt.close() | ||||
| 
 | ||||
| if __name__ == '__main__': | ||||
|     parser = argparse.ArgumentParser() | ||||
|     parser.add_argument('--project', type=str, help='project name (e.g. Insheim)') | ||||
|     parser.add_argument('--database', type=str, help='event data base (e.g. 2014.09_Insheim)') | ||||
|     parser.add_argument('--event', type=str, help='event ID (e.g. e0010.015.14)') | ||||
|     parser.add_argument('--iplot', help='anything, if set, figure occurs') | ||||
|     parser.add_argument('--station', type=str, help='Station ID (e.g. INS3) (optional)') | ||||
|     args = parser.parse_args() | ||||
| 
 | ||||
|     run_makeCF(args.project, args.database, args.event, args.iplot, args.station) | ||||
7  pylot/testing/test_pdf.py  (Normal file)
						| @ -0,0 +1,7 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| from pylot.core.util.pdf import ProbabilityDensityFunction | ||||
| pdf = ProbabilityDensityFunction.from_pick(0.34, 0.5, 0.54, type='exp') | ||||
| pdf2 = ProbabilityDensityFunction.from_pick(0.34, 0.5, 0.54, type='exp') | ||||
| diff = pdf - pdf2 | ||||
15  scripts/pylot-noisewindow.py  (Executable file)
						| @ -0,0 +1,15 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import argparse | ||||
| import numpy | ||||
| from pylot.core.pick.utils import getnoisewin | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     parser = argparse.ArgumentParser() | ||||
|     parser.add_argument('--t', type=numpy.array, help='numpy array of time stamps') | ||||
|     parser.add_argument('--t1', type=float, help='time relative to which the noise window is extracted') | ||||
|     parser.add_argument('--tnoise', type=float, help='length of time window [s] for noise part extraction') | ||||
|     parser.add_argument('--tgap', type=float, help='safety gap between signal (t1=onset) and noise') | ||||
|     args = parser.parse_args() | ||||
|     getnoisewin(args.t, args.t1, args.tnoise, args.tgap) | ||||
28  scripts/pylot-pick-earliest-latest.py  (Executable file)
						| @ -0,0 +1,28 @@ | ||||
| #!/usr/bin/python | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
|    Created Mar 2015 | ||||
|    Transcription of the recipe of Diehl et al. (2009) for consistent phase | ||||
|    picking. For a given initial (the most likely) pick, the corresponding earliest | ||||
|    and latest possible pick is calculated based on noise measurements in front of | ||||
|    the most likely pick and signal wavelength derived from zero crossings. | ||||
| 
 | ||||
|    :author: Ludger Kueperkoch / MAGS2 EP3 working group | ||||
| """ | ||||
| 
 | ||||
| import argparse | ||||
| import obspy | ||||
| from pylot.core.pick.utils import earllatepicker | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     parser = argparse.ArgumentParser() | ||||
|     parser.add_argument('--X', type=obspy.core.stream.Stream, | ||||
|                         help='time series (seismogram) read with obspy module read') | ||||
|     parser.add_argument('--nfac', type=int, | ||||
|                         help='(noise factor), nfac times noise level to calculate latest possible pick') | ||||
|     parser.add_argument('--TSNR', type=tuple, help='length of time windows around pick used to determine SNR \ | ||||
|                         [s] (Tnoise, Tgap, Tsignal)') | ||||
|     parser.add_argument('--Pick1', type=float, help='Onset time of most likely pick') | ||||
|     parser.add_argument('--iplot', type=int, help='if set, figure no. iplot occurs') | ||||
|     args = parser.parse_args() | ||||
|     earllatepicker(args.X, args.nfac, args.TSNR, args.Pick1, args.iplot) | ||||
scripts/pylot-pick-firstmotion.py (new executable file, 24 lines)
						| @ -0,0 +1,24 @@ | ||||
| #!/usr/bin/python | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
|    Created Mar 2015 | ||||
|    Function to derive first motion (polarity) for given phase onset based on zero crossings. | ||||
| 
 | ||||
|    :author: MAGS2 EP3 working group / Ludger Kueperkoch | ||||
| """ | ||||
| 
 | ||||
| import argparse | ||||
| import obspy | ||||
| from pylot.core.pick.utils import fmpicker | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     parser = argparse.ArgumentParser() | ||||
|     parser.add_argument('--Xraw', type=obspy.read, | ||||
|                         help='path to the unfiltered time series (seismogram), read with the obspy read function') | ||||
|     parser.add_argument('--Xfilt', type=obspy.read, | ||||
|                         help='path to the filtered time series (seismogram), read with the obspy read function') | ||||
|     parser.add_argument('--pickwin', type=float, help='length of pick window [s] for first motion determination') | ||||
|     parser.add_argument('--Pick', type=float, help='onset time of most likely pick') | ||||
|     parser.add_argument('--iplot', type=int, help='if set, figure no. iplot occurs') | ||||
|     args = parser.parse_args() | ||||
|     print fmpicker(args.Xraw, args.Xfilt, args.pickwin, args.Pick, args.iplot) | ||||
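fmpicker is likewise exercised in scripts/run_makeCF.py below, where it receives the raw stream and a filtered copy; a sketch with placeholder file name, filter corners and onset time:

    import obspy
    from pylot.core.pick.utils import fmpicker

    raw = obspy.read('station_z.msd')  # placeholder file name
    filt = raw.copy()
    filt.filter('bandpass', freqmin=2, freqmax=30, zerophase=False)
    # 0.2 s pick window around the onset at 14.3 s, no plot
    polarity = fmpicker(raw, filt, 0.2, 14.3, 0)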
scripts/pylot-reasses-pilot-db.py (new executable file, 41 lines)
						| @ -0,0 +1,41 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import argparse | ||||
| 
 | ||||
| from pylot.core.util.version import get_git_version as _getVersionString | ||||
| from pylot.core.io.phases import reassess_pilot_db | ||||
| 
 | ||||
| __version__ = _getVersionString() | ||||
| __author__ = 'S. Wehling-Benatelli' | ||||
| 
 | ||||
| if __name__ == '__main__': | ||||
|     parser = argparse.ArgumentParser( | ||||
|         description='reassess old PILOT event data base in terms of consistent ' | ||||
|                     'automatic uncertainty estimation', | ||||
|         epilog='Script written by {author} as part of PyLoT version' | ||||
|                ' {version}\n'.format(author=__author__, | ||||
|                                      version=__version__) | ||||
|     ) | ||||
| 
 | ||||
|     parser.add_argument( | ||||
|         'root', type=str, help='specifies the root directory' | ||||
|     ) | ||||
|     parser.add_argument( | ||||
|         'db', type=str, help='specifies the database name' | ||||
|     ) | ||||
|     parser.add_argument( | ||||
|         '--output', '-o', type=str, help='path to the output directory', | ||||
|         dest='output' | ||||
|     ) | ||||
|     parser.add_argument( | ||||
|         '--parameterfile', '-p', type=str, | ||||
|         help='full path to the parameterfile', dest='parfile' | ||||
|     ) | ||||
|     parser.add_argument( | ||||
|         '--verbosity', '-v', action='count', help='increase output verbosity', | ||||
|         default=0, dest='verbosity' | ||||
|     ) | ||||
| 
 | ||||
|     args = parser.parse_args() | ||||
|     reassess_pilot_db(args.root, args.db, args.output, args.parfile, args.verbosity) | ||||
scripts/pylot-reasses-pilot-event.py (new executable file, 38 lines)
						| @ -0,0 +1,38 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import argparse | ||||
| 
 | ||||
| from pylot.core.util.version import get_git_version as _getVersionString | ||||
| from pylot.core.io.phases import reassess_pilot_event | ||||
| 
 | ||||
| __version__ = _getVersionString() | ||||
| __author__ = 'S. Wehling-Benatelli' | ||||
| 
 | ||||
| if __name__ == '__main__': | ||||
|     parser = argparse.ArgumentParser( | ||||
|         description='reassess old PILOT event data in terms of consistent ' | ||||
|                     'automatic uncertainty estimation', | ||||
|         epilog='Script written by {author} as part of PyLoT version' | ||||
|                ' {version}\n'.format(author=__author__, | ||||
|                                      version=__version__) | ||||
|     ) | ||||
| 
 | ||||
|     parser.add_argument( | ||||
|         'root', type=str, help='specifies the root directory' | ||||
|     ) | ||||
|     parser.add_argument( | ||||
|         'db', type=str, help='specifies the database name' | ||||
|     ) | ||||
|     parser.add_argument( | ||||
|         'id', type=str, help='PILOT event identifier' | ||||
|     ) | ||||
|     parser.add_argument( | ||||
|         '--output', '-o', type=str, help='path to the output directory', dest='output' | ||||
|     ) | ||||
|     parser.add_argument( | ||||
|         '--parameterfile', '-p', type=str, help='full path to the parameterfile', dest='parfile' | ||||
|     ) | ||||
| 
 | ||||
|     args = parser.parse_args() | ||||
|     reassess_pilot_event(args.root, args.db, args.id, args.output, args.parfile) | ||||
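Both reassessment routines can also be driven from Python; the root directory, database name, event ID and file paths below are placeholders (the example names follow the help texts in scripts/run_makeCF.py):

    from pylot.core.io.phases import reassess_pilot_db, reassess_pilot_event

    # rework a whole PILOT database ...
    reassess_pilot_db('/DATA/Insheim', '2014.09_Insheim',
                      'output-dir', 'path-to-parameterfile', 0)
    # ... or a single event
    reassess_pilot_event('/DATA/Insheim', '2014.09_Insheim', 'e0010.015.14',
                         'output-dir', 'path-to-parameterfile')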
scripts/pylot-signalwindow.py (new executable file, 14 lines)
						| @ -0,0 +1,14 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| import argparse | ||||
| import numpy | ||||
| from pylot.core.pick.utils import getsignalwin | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     parser = argparse.ArgumentParser() | ||||
|     parser.add_argument('--t', type=float, nargs='+', help='time stamps (converted to a numpy array)') | ||||
|     parser.add_argument('--t1', type=float, help='time relative to which the signal window is extracted') | ||||
|     parser.add_argument('--tsignal', type=float, help='length of time window [s] for signal part extraction') | ||||
|     args = parser.parse_args() | ||||
|     print getsignalwin(numpy.array(args.t), args.t1, args.tsignal) | ||||
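Taken together, the two window helpers allow a hand-made SNR estimate, assuming both return sample indices into t; the waveform here is random noise purely for illustration (scripts/pylot-snr.py below wraps this up in getSNR):

    import numpy as np
    from pylot.core.pick.utils import getnoisewin, getsignalwin

    t = np.arange(0, 30, 0.01)             # time stamps, 100 Hz
    x = np.random.randn(len(t))            # placeholder waveform
    inoise = getnoisewin(t, 14.3, 5, 0.5)  # 5 s of noise, 0.5 s gap before onset
    isignal = getsignalwin(t, 14.3, 1)     # 1 s of signal from the onset
    snr = np.mean(x[isignal] ** 2) / np.mean(x[inoise] ** 2)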
scripts/pylot-snr.py (new executable file, 30 lines)
						| @ -0,0 +1,30 @@ | ||||
| #!/usr/bin/python | ||||
| # -*- coding: utf-8 -*- | ||||
| """ | ||||
|    Created Mar/Apr 2015 | ||||
|    Function to calculate the SNR of a certain part of a seismogram relative | ||||
|    to a given time. Returns SNR and SNR [dB]. | ||||
| 
 | ||||
|    :author: Ludger Kueperkoch / MAGS2 EP3 working group | ||||
| """ | ||||
| 
 | ||||
| import argparse | ||||
| import obspy | ||||
| from pylot.core.pick.utils import getSNR | ||||
| 
 | ||||
| if __name__ == "__main__": | ||||
|     parser = argparse.ArgumentParser() | ||||
|     parser.add_argument('--data', '-d', type=obspy.read, | ||||
|                         help='path to a time series (seismogram), read with ' | ||||
|                              'the obspy read function', | ||||
|                         dest='data') | ||||
|     parser.add_argument('--tsnr', '-s', | ||||
|                         type=lambda s: tuple(float(v) for v in s.split(',')), | ||||
|                         help='length of time windows around pick used to ' | ||||
|                              'determine SNR [s], comma separated ' | ||||
|                              '(Tnoise, Tgap, Tsignal)', | ||||
|                         dest='tsnr') | ||||
|     parser.add_argument('--time', '-t', type=float, | ||||
|                         help='initial time from which noise and signal windows ' | ||||
|                              'are calculated', | ||||
|                         dest='time') | ||||
|     args = parser.parse_args() | ||||
|     print getSNR(args.data, args.tsnr, args.time) | ||||
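Mirroring the call in scripts/run_makeCF.py below, getSNR returns the linear SNR together with its dB value; the file name and pick time are placeholders:

    import obspy
    from pylot.core.pick.utils import getSNR

    st = obspy.read('station_z.msd')  # placeholder file name
    # windows (Tnoise, Tgap, Tsignal), most likely onset at 14.3 s
    SNR, SNRdB = getSNR(st, (5, 0.5, 1), 14.3)
    print 'SNR:', SNR, 'SNR [dB]:', SNRdB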
scripts/run_makeCF.py (new executable file, 307 lines)
						| @ -0,0 +1,307 @@ | ||||
| #!/usr/bin/python | ||||
| # -*- coding: utf-8 -*- | ||||
| 
 | ||||
| """ | ||||
|    Test script for the characteristic function (CF) calculation and the | ||||
|    automatic picking routines of autoPyLoT. Only for test purposes! | ||||
| """ | ||||
| 
 | ||||
| from obspy.core import read | ||||
| import matplotlib.pyplot as plt | ||||
| import numpy as np | ||||
| from pylot.core.pick.charfuns import CharacteristicFunction, HOScf, AICcf, \ | ||||
|     ARZcf, ARHcf, AR3Ccf | ||||
| from pylot.core.pick.picker import AutoPicker, AICPicker, PragPicker | ||||
| from pylot.core.pick.utils import * | ||||
| import glob | ||||
| import argparse | ||||
| 
 | ||||
| def run_makeCF(project, database, event, iplot, station=None): | ||||
|     #parameters for CF calculation | ||||
|     t2 = 7                   #length of moving window for HOS calculation [sec] | ||||
|     p = 4                    #order of HOS | ||||
|     cuttimes = [10, 50]      #start and end time for CF calculation | ||||
|     bpz = [2, 30]            #corner frequencies of bandpass filter, vertical component | ||||
|     bph = [2, 15]            #corner frequencies of bandpass filter, horizontal components | ||||
|     tdetz= 1.2               #length of AR-determination window [sec], vertical component | ||||
|     tdeth= 0.8               #length of AR-determination window [sec], horizontal components | ||||
|     tpredz = 0.4             #length of AR-prediction window [sec], vertical component | ||||
|     tpredh = 0.4             #length of AR-prediction window [sec], horizontal components | ||||
|     addnoise = 0.001         #add noise to seismogram for stable AR prediction | ||||
|     arzorder = 2             #chosen order of AR process, vertical component | ||||
|     arhorder = 4             #chosen order of AR process, horizontal components | ||||
|     TSNRhos = [5, 0.5, 1, .6] #window lengths [s] for calculating SNR for earliest/latest pick and quality assessment | ||||
|                              #from HOS-CF [noise window, safety gap, signal window, slope determination window] | ||||
|     TSNRarz = [5, 0.5, 1, 1.0] #window lengths [s] for calculating SNR for earliest/latest pick and quality assessment | ||||
|                              #from ARZ-CF | ||||
|     #get waveform data | ||||
|     if station: | ||||
|        dpz = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*HZ.msd' % (project, database, event, station) | ||||
|        dpe = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*HE.msd' % (project, database, event, station) | ||||
|        dpn = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*HN.msd' % (project, database, event, station) | ||||
|        #dpz = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*_z.gse' % (project, database, event, station) | ||||
|        #dpe = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*_e.gse' % (project, database, event, station) | ||||
|        #dpn = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/%s*_n.gse' % (project, database, event, station) | ||||
|     else: | ||||
|        dpz = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/*HZ.msd' % (project, database, event) | ||||
|        dpe = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/*HE.msd' % (project, database, event) | ||||
|        dpn = '/DATA/%s/EVENT_DATA/LOCAL/%s/%s/*HN.msd' % (project, database, event) | ||||
|     wfzfiles = glob.glob(dpz) | ||||
|     wfefiles = glob.glob(dpe) | ||||
|     wfnfiles = glob.glob(dpn) | ||||
|     if wfzfiles: | ||||
|        for i in range(len(wfzfiles)): | ||||
|           print 'Vertical component data found ...' | ||||
|           print wfzfiles[i] | ||||
|           st = read('%s' % wfzfiles[i]) | ||||
|           st_copy = st.copy() | ||||
|           #filter and taper data | ||||
|           tr_filt = st[0].copy() | ||||
|           tr_filt.filter('bandpass', freqmin=bpz[0], freqmax=bpz[1], zerophase=False) | ||||
|           tr_filt.taper(max_percentage=0.05, type='hann') | ||||
|           st_copy[0].data = tr_filt.data | ||||
|           ############################################################## | ||||
|           #calculate HOS-CF using subclass HOScf of class CharacteristicFunction | ||||
|           hoscf = HOScf(st_copy, cuttimes, t2, p) #instance of HOScf | ||||
|           ############################################################## | ||||
|           #calculate AIC-HOS-CF using subclass AICcf of class CharacteristicFunction | ||||
|           #class needs stream object => build it | ||||
|           tr_aic = tr_filt.copy() | ||||
|           tr_aic.data = hoscf.getCF() | ||||
|           st_copy[0].data = tr_aic.data | ||||
|           aiccf = AICcf(st_copy, cuttimes) #instance of AICcf | ||||
|           ############################################################## | ||||
|           #get preliminary onset time from AIC-HOS-CF using subclass AICPicker of class AutoPicker | ||||
|           aicpick = AICPicker(aiccf, TSNRhos, 3, 10, None, 0.1) | ||||
|           ############################################################## | ||||
|           #get refined onset time from HOS-CF using class Picker | ||||
|           hospick = PragPicker(hoscf, TSNRhos, 2, 10, 0.001, 0.2, aicpick.getpick()) | ||||
|           ############################################################# | ||||
|           #get earliest and latest possible picks | ||||
|           st_copy[0].data = tr_filt.data | ||||
|           [lpickhos, epickhos, pickerrhos] = earllatepicker(st_copy, 1.5, TSNRhos, hospick.getpick(), 10) | ||||
|           ############################################################# | ||||
|           #get SNR | ||||
|           [SNR, SNRdB] = getSNR(st_copy, TSNRhos, hospick.getpick()) | ||||
|           print 'SNR:', SNR, 'SNR[dB]:', SNRdB | ||||
|           ########################################################## | ||||
|           #get first motion of onset | ||||
|           hosfm = fmpicker(st, st_copy, 0.2, hospick.getpick(), 11) | ||||
|           ############################################################## | ||||
|           #calculate ARZ-CF using subclass ARZcf of class CharacteristicFunction | ||||
|           arzcf = ARZcf(st, cuttimes, tpredz, arzorder, tdetz, addnoise) #instance of ARZcf | ||||
|           ############################################################## | ||||
|           #calculate AIC-ARZ-CF using subclass AICcf of class CharacteristicFunction | ||||
|           #class needs stream object => build it | ||||
|           tr_arzaic = tr_filt.copy() | ||||
|           tr_arzaic.data = arzcf.getCF() | ||||
|           st_copy[0].data = tr_arzaic.data | ||||
|           araiccf = AICcf(st_copy, cuttimes, tpredz, 0, tdetz) #instance of AICcf | ||||
|           ############################################################## | ||||
|           #get onset time from AIC-ARZ-CF using subclass AICPicker of class AutoPicker | ||||
|           aicarzpick = AICPicker(araiccf, TSNRarz, 2, 10, None, 0.1) | ||||
|           ############################################################## | ||||
|           #get refined onset time from ARZ-CF using class Picker | ||||
|           arzpick = PragPicker(arzcf, TSNRarz, 2.0, 10, 0.1, 0.05, aicarzpick.getpick()) | ||||
|           #get earliest and latest possible picks | ||||
|           st_copy[0].data = tr_filt.data | ||||
|           [lpickarz, epickarz, pickerrarz] = earllatepicker(st_copy, 1.5, TSNRarz, arzpick.getpick(), 10) | ||||
|     else: | ||||
|        print 'No vertical component data found!' | ||||
| 
 | ||||
|     if wfefiles and wfnfiles: | ||||
|        for i in range(len(wfefiles)): | ||||
|           print 'Horizontal component data found ...' | ||||
|           print wfefiles[i] | ||||
|           print wfnfiles[i] | ||||
|           #merge streams | ||||
|           H = read('%s' % wfefiles[i]) | ||||
|           H += read('%s' % wfnfiles[i]) | ||||
|           H_copy = H.copy() | ||||
|           #filter and taper data | ||||
|           trH1_filt = H[0].copy() | ||||
|           trH2_filt = H[1].copy() | ||||
|           trH1_filt.filter('bandpass', freqmin=bph[0], freqmax=bph[1], zerophase=False) | ||||
|           trH2_filt.filter('bandpass', freqmin=bph[0], freqmax=bph[1], zerophase=False) | ||||
|           trH1_filt.taper(max_percentage=0.05, type='hann') | ||||
|           trH2_filt.taper(max_percentage=0.05, type='hann') | ||||
|           H_copy[0].data = trH1_filt.data | ||||
|           H_copy[1].data = trH2_filt.data | ||||
| 
 | ||||
|           ############################################################## | ||||
|           #calculate ARH-CF using subclass ARHcf of class CharacteristicFunction | ||||
|           arhcf = ARHcf(H_copy, cuttimes, tpredh, arhorder, tdeth, addnoise) #instance of ARHcf | ||||
|           ############################################################## | ||||
|           #calculate AIC-ARH-CF using subclass AICcf of class CharacteristicFunction | ||||
|           #class needs stream object => build it | ||||
|           tr_arhaic = trH1_filt.copy() | ||||
|           tr_arhaic.data = arhcf.getCF() | ||||
|           H_copy[0].data = tr_arhaic.data | ||||
|           #calculate ARH-AIC-CF | ||||
|           arhaiccf = AICcf(H_copy, cuttimes, tpredh, 0, tdeth) #instance of AICcf | ||||
|           ############################################################## | ||||
|           #get onset time from AIC-ARH-CF using subclass AICPicker of class AutoPicker | ||||
|           aicarhpick = AICPicker(arhaiccf, TSNRarz, 4, 10, None, 0.1) | ||||
|           ############################################################### | ||||
|           #get refined onset time from ARH-CF using class Picker | ||||
|           arhpick = PragPicker(arhcf, TSNRarz, 2.5, 10, 0.1, 0.05, aicarhpick.getpick()) | ||||
|           #get earliest and latest possible picks | ||||
|           H_copy[0].data = trH1_filt.data | ||||
|           [lpickarh1, epickarh1, pickerrarh1] = earllatepicker(H_copy, 1.5, TSNRarz, arhpick.getpick(), 10) | ||||
|           H_copy[0].data = trH2_filt.data | ||||
|           [lpickarh2, epickarh2, pickerrarh2] = earllatepicker(H_copy, 1.5, TSNRarz, arhpick.getpick(), 10) | ||||
|           #get earliest pick of both earliest possible picks | ||||
|           epick = [epickarh1, epickarh2] | ||||
|           lpick = [lpickarh1, lpickarh2] | ||||
|           pickerr = [pickerrarh1, pickerrarh2] | ||||
|           ipick = np.argmin([epickarh1, epickarh2]) | ||||
|           epickarh = epick[ipick] | ||||
|           lpickarh = lpick[ipick] | ||||
|           pickerrarh = pickerr[ipick] | ||||
| 
 | ||||
|           #create stream with 3 traces | ||||
|           #merge streams | ||||
|           AllC = read('%s' % wfefiles[i]) | ||||
|           AllC += read('%s' % wfnfiles[i]) | ||||
|           AllC += read('%s' % wfzfiles[i]) | ||||
|           #filter and taper data | ||||
|           All1_filt = AllC[0].copy() | ||||
|           All2_filt = AllC[1].copy() | ||||
|           All3_filt = AllC[2].copy() | ||||
|           All1_filt.filter('bandpass', freqmin=bph[0], freqmax=bph[1], zerophase=False) | ||||
|           All2_filt.filter('bandpass', freqmin=bph[0], freqmax=bph[1], zerophase=False) | ||||
|           All3_filt.filter('bandpass', freqmin=bpz[0], freqmax=bpz[1], zerophase=False) | ||||
|           All1_filt.taper(max_percentage=0.05, type='hann') | ||||
|           All2_filt.taper(max_percentage=0.05, type='hann') | ||||
|           All3_filt.taper(max_percentage=0.05, type='hann') | ||||
|           AllC[0].data = All1_filt.data | ||||
|           AllC[1].data = All2_filt.data | ||||
|           AllC[2].data = All3_filt.data | ||||
|           #calculate AR3C-CF using subclass AR3Ccf of class CharacteristicFunction | ||||
|           ar3ccf = AR3Ccf(AllC, cuttimes, tpredz, arhorder, tdetz, addnoise) #instance of AR3Ccf | ||||
|           ############################################################## | ||||
|           if iplot: | ||||
|              #plot vertical trace | ||||
|              plt.figure() | ||||
|              tr = st[0] | ||||
|              tdata = np.arange(0, tr.stats.npts / tr.stats.sampling_rate, tr.stats.delta) | ||||
|              p1, = plt.plot(tdata, tr_filt.data/max(tr_filt.data), 'k') | ||||
|              p2, = plt.plot(hoscf.getTimeArray(), hoscf.getCF() / max(hoscf.getCF()), 'r') | ||||
|              p3, = plt.plot(aiccf.getTimeArray(), aiccf.getCF()/max(aiccf.getCF()), 'b') | ||||
|              p4, = plt.plot(arzcf.getTimeArray(), arzcf.getCF()/max(arzcf.getCF()), 'g') | ||||
|              p5, = plt.plot(araiccf.getTimeArray(), araiccf.getCF()/max(araiccf.getCF()), 'y') | ||||
|              plt.plot([aicpick.getpick(), aicpick.getpick()], [-1, 1], 'b--') | ||||
|              plt.plot([aicpick.getpick()-0.5, aicpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([aicpick.getpick()-0.5, aicpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([hospick.getpick(), hospick.getpick()], [-1.3, 1.3], 'r', linewidth=2) | ||||
|              plt.plot([hospick.getpick()-0.5, hospick.getpick()+0.5], [1.3, 1.3], 'r') | ||||
|              plt.plot([hospick.getpick()-0.5, hospick.getpick()+0.5], [-1.3, -1.3], 'r') | ||||
|              plt.plot([lpickhos, lpickhos], [-1.1, 1.1], 'r--') | ||||
|              plt.plot([epickhos, epickhos], [-1.1, 1.1], 'r--') | ||||
|              plt.plot([aicarzpick.getpick(), aicarzpick.getpick()], [-1.2, 1.2], 'y', linewidth=2) | ||||
|              plt.plot([aicarzpick.getpick()-0.5, aicarzpick.getpick()+0.5], [1.2, 1.2], 'y') | ||||
|              plt.plot([aicarzpick.getpick()-0.5, aicarzpick.getpick()+0.5], [-1.2, -1.2], 'y') | ||||
|              plt.plot([arzpick.getpick(), arzpick.getpick()], [-1.4, 1.4], 'g', linewidth=2) | ||||
|              plt.plot([arzpick.getpick()-0.5, arzpick.getpick()+0.5], [1.4, 1.4], 'g') | ||||
|              plt.plot([arzpick.getpick()-0.5, arzpick.getpick()+0.5], [-1.4, -1.4], 'g') | ||||
|              plt.plot([lpickarz, lpickarz], [-1.2, 1.2], 'g--') | ||||
|              plt.plot([epickarz, epickarz], [-1.2, 1.2], 'g--') | ||||
|              plt.yticks([]) | ||||
|              plt.ylim([-1.5, 1.5]) | ||||
|              plt.xlabel('Time [s]') | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title('%s, %s, CF-SNR=%7.2f, CF-Slope=%12.2f' % (tr.stats.station, | ||||
|                                                                   tr.stats.channel, aicpick.getSNR(), aicpick.getSlope())) | ||||
|              plt.suptitle(tr.stats.starttime) | ||||
|              plt.legend([p1, p2, p3, p4, p5], ['Data', 'HOS-CF', 'HOSAIC-CF', 'ARZ-CF', 'ARZAIC-CF']) | ||||
|              #plot horizontal traces | ||||
|              plt.figure(2) | ||||
|              plt.subplot(2,1,1) | ||||
|              tsteph = tpredh / 4 | ||||
|              th1data = np.arange(0, trH1_filt.stats.npts / trH1_filt.stats.sampling_rate, trH1_filt.stats.delta) | ||||
|              th2data = np.arange(0, trH2_filt.stats.npts / trH2_filt.stats.sampling_rate, trH2_filt.stats.delta) | ||||
|              tarhcf = np.arange(0, len(arhcf.getCF()) * tsteph, tsteph) + cuttimes[0] + tdeth + tpredh | ||||
|              p21, = plt.plot(th1data, trH1_filt.data/max(trH1_filt.data), 'k') | ||||
|              p22, = plt.plot(arhcf.getTimeArray(), arhcf.getCF()/max(arhcf.getCF()), 'r') | ||||
|              p23, = plt.plot(arhaiccf.getTimeArray(), arhaiccf.getCF()/max(arhaiccf.getCF())) | ||||
|              plt.plot([aicarhpick.getpick(), aicarhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([aicarhpick.getpick()-0.5, aicarhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([aicarhpick.getpick()-0.5, aicarhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'r') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'r') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'r') | ||||
|              plt.plot([lpickarh, lpickarh], [-0.8, 0.8], 'r--') | ||||
|              plt.plot([epickarh, epickarh], [-0.8, 0.8], 'r--') | ||||
|              plt.plot([arhpick.getpick() + pickerrarh, arhpick.getpick() + pickerrarh], [-0.2, 0.2], 'r--') | ||||
|              plt.plot([arhpick.getpick() - pickerrarh, arhpick.getpick() - pickerrarh], [-0.2, 0.2], 'r--') | ||||
|              plt.yticks([]) | ||||
|              plt.ylim([-1.5, 1.5]) | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title([trH1_filt.stats.station, trH1_filt.stats.channel]) | ||||
|              plt.suptitle(trH1_filt.stats.starttime) | ||||
|              plt.legend([p21, p22, p23], ['Data', 'ARH-CF', 'ARHAIC-CF']) | ||||
|              plt.subplot(2,1,2) | ||||
|              plt.plot(th2data, trH2_filt.data/max(trH2_filt.data), 'k') | ||||
|              plt.plot(arhcf.getTimeArray(), arhcf.getCF()/max(arhcf.getCF()), 'r') | ||||
|              plt.plot(arhaiccf.getTimeArray(), arhaiccf.getCF()/max(arhaiccf.getCF())) | ||||
|              plt.plot([aicarhpick.getpick(), aicarhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([aicarhpick.getpick()-0.5, aicarhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.plot([aicarhpick.getpick()-0.5, aicarhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'r') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'r') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'r') | ||||
|              plt.plot([lpickarh, lpickarh], [-0.8, 0.8], 'r--') | ||||
|              plt.plot([epickarh, epickarh], [-0.8, 0.8], 'r--') | ||||
|              plt.plot([arhpick.getpick() + pickerrarh, arhpick.getpick() + pickerrarh], [-0.2, 0.2], 'r--') | ||||
|              plt.plot([arhpick.getpick() - pickerrarh, arhpick.getpick() - pickerrarh], [-0.2, 0.2], 'r--') | ||||
|              plt.title([trH2_filt.stats.station, trH2_filt.stats.channel]) | ||||
|              plt.yticks([]) | ||||
|              plt.ylim([-1.5, 1.5]) | ||||
|              plt.xlabel('Time [s]') | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              #plot 3-component window | ||||
|              plt.figure(3) | ||||
|              plt.subplot(3,1,1) | ||||
|              p31, = plt.plot(tdata, tr_filt.data/max(tr_filt.data), 'k') | ||||
|              p32, = plt.plot(ar3ccf.getTimeArray(), ar3ccf.getCF()/max(ar3ccf.getCF()), 'r') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.yticks([]) | ||||
|              plt.xticks([]) | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title([tr.stats.station, tr.stats.channel]) | ||||
|              plt.suptitle(trH1_filt.stats.starttime) | ||||
|              plt.legend([p31, p32], ['Data', 'AR3C-CF']) | ||||
|              plt.subplot(3,1,2) | ||||
|              plt.plot(th1data, trH1_filt.data/max(trH1_filt.data), 'k') | ||||
|              plt.plot(ar3ccf.getTimeArray(), ar3ccf.getCF()/max(ar3ccf.getCF()), 'r') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.yticks([]) | ||||
|              plt.xticks([]) | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title([trH1_filt.stats.station, trH1_filt.stats.channel]) | ||||
|              plt.subplot(3,1,3) | ||||
|              plt.plot(th2data, trH2_filt.data/max(trH2_filt.data), 'k') | ||||
|              plt.plot(ar3ccf.getTimeArray(), ar3ccf.getCF()/max(ar3ccf.getCF()), 'r') | ||||
|              plt.plot([arhpick.getpick(), arhpick.getpick()], [-1, 1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [-1, -1], 'b') | ||||
|              plt.plot([arhpick.getpick()-0.5, arhpick.getpick()+0.5], [1, 1], 'b') | ||||
|              plt.yticks([]) | ||||
|              plt.ylabel('Normalized Counts') | ||||
|              plt.title([trH2_filt.stats.station, trH2_filt.stats.channel]) | ||||
|              plt.xlabel('Time [s]') | ||||
|              plt.show() | ||||
|              raw_input() | ||||
|              plt.close() | ||||
| 
 | ||||
| parser = argparse.ArgumentParser() | ||||
| parser.add_argument('--project', type=str, help='project name (e.g. Insheim)') | ||||
| parser.add_argument('--database', type=str, help='event data base (e.g. 2014.09_Insheim)') | ||||
| parser.add_argument('--event', type=str, help='event ID (e.g. e0010.015.14)') | ||||
| parser.add_argument('--iplot', help='if set (any value), figures are shown') | ||||
| parser.add_argument('--station', type=str, help='Station ID (e.g. INS3) (optional)') | ||||
| args = parser.parse_args() | ||||
| 
 | ||||
| run_makeCF(args.project, args.database, args.event, args.iplot, args.station) | ||||
setup.py (new file, 17 lines)
						| @ -0,0 +1,17 @@ | ||||
| #!/usr/bin/env python | ||||
| # -*- coding: utf-8 -*- | ||||
| from distutils.core import setup | ||||
| 
 | ||||
| setup( | ||||
|     name='PyLoT', | ||||
|     version='0.1a1', | ||||
|     packages=['pylot', 'pylot.core', 'pylot.core.loc', 'pylot.core.pick', | ||||
|               'pylot.core.io', 'pylot.core.util', 'pylot.core.active', | ||||
|               'pylot.core.analysis', 'pylot.testing'], | ||||
|     requires=['obspy', 'PySide'], | ||||
|     url='dummy', | ||||
|     license='LGPLv3', | ||||
|     author='Sebastian Wehling-Benatelli', | ||||
|     author_email='sebastian.wehling@rub.de', | ||||
|     description='Comprehensive Python picking and Location Toolbox for seismological data.' | ||||
| ) | ||||
splash/splash.png (new binary file)
						| After Width: | Height: | Size: 40 KiB |