Thursday, July 23, 2015

Can you find the fault "hosting" this "earthquake"?



I like the picturesque metaphor of faults "hosting" earthquakes. Faults as metaphors: I went into this in my last book, perhaps too deeply.

However, do faults really cause anything? Or are they results, not causes? I hold that the foregoing video clip, which can profitably be watched several times, opens a door to a world of physics in which certain premises and statistics of the SSHAC process, set forth in naked naivete in the defining NRC SSHAC document, simply disappear. By SSHAC premises, I mean the following axiom from that document:
It is universally true that earthquakes are the result of differential slip on faults. However, in many areas, such as most of the eastern U.S., the identification of the causative faults giving rise to seismicity is problematic. To accommodate this uncertainty in fault location... (SSHAC 1997, NUREG/CR-6372)
The SSHAC process urges, not surprisingly given its adherence to this non-mathematical doctrine, even greater and better-funded attempts to seek "causative" faults in the hidden depths. All of this follows a fifty-year-old idea that earthquake forecasting can be carried out using a model such as:

Find faults →
characterize earthquake style and magnitude →
determine distance and geological transmission path to site →
import and adjust various seismic recordings from other sites.

The probabilistic version of this procedure, originally proposed for Central Mexico by Emilio Rosenblueth and Luis Esteva, mainly with distant earthquakes and long-period ground motions in mind, was elaborated in the mid-1960s by the engineer C. Allin Cornell. It subsequently accrued other paste-in assumptions relating to inter-earthquake temporal probability theories, site effects, and so on. Geologists loved the idea: it lent quantitative prestige to their poetical approach to nature. It is hard to imagine now that we once considered all this highly sophisticated, but in the post-Fukushima era we may be permitted to ask whether it provides valid guidance. Some geophysicists imply it may not. Consider the implications of the following:
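To see the shape of that calculation, here is a minimal Cornell-style sketch in Python. Everything in it (the single areal source, the truncated Gutenberg-Richter magnitudes, the annual rate, the attenuation coefficients and their scatter) is invented for illustration; it reproduces the logic of the recipe, not any particular SSHAC study.

```python
import numpy as np

# Toy Cornell-style hazard calculation; every number is hypothetical.
rng = np.random.default_rng(0)
n_sims = 200_000

# Steps 1-2: "find the fault" and characterize its seismicity.
m_min, m_max, b_value = 5.0, 7.5, 1.0
annual_rate = 0.02          # assumed events per year above m_min

# Sample magnitudes from a truncated Gutenberg-Richter distribution.
beta = b_value * np.log(10.0)
u = rng.random(n_sims)
m = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

# Step 3: source-to-site distance (assumed uniform between 10 and 60 km).
r = rng.uniform(10.0, 60.0, n_sims)

# Step 4: "import and adjust recordings" becomes a generic attenuation law
# ln(PGA[g]) = a + b*M - c*ln(R + d) plus lognormal scatter (coefficients invented).
ln_pga = -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0) + rng.normal(0.0, 0.6, n_sims)
pga = np.exp(ln_pga)

# Hazard curve: annual rate of exceeding each ground-motion level.
for level in [0.05, 0.1, 0.2, 0.4]:
    p_exceed = np.mean(pga > level)
    print(f"PGA > {level:.2f} g: annual rate ~ {annual_rate * p_exceed:.2e}")
```

The point is only the shape of the arithmetic: an assumed rate of events multiplied by an assumed probability of exceedance given an event, with each "causative" step supplying its own assumed distribution.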




Should we really be surprised at doubts about using a single number, founded on Professor Richter's solitary star-gazing habits and on signals recorded at great distance, to characterize ground vibrations in the immediate vicinity of an inferred epicenter of an M6 earthquake? No more surprised than at doubts about a Boston Red Sox bettor relying either on the distant roar of the crowd or (to approach it from the opposite direction) on the view from bleacher seats populated by home-town fans.
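A rough way to put a number on that doubt: assuming the same invented attenuation form as in the sketch above, with a typical lognormal scatter of 0.6 natural-log units, a single "M 6" label at close range still leaves roughly a factor of exp(2*0.6), about three, between the 16th and 84th percentile motions.

```python
import numpy as np

# Spread of near-field PGA for a fixed "M6" label, using the invented
# attenuation form from the sketch above with sigma_lnPGA = 0.6.
rng = np.random.default_rng(1)
m, r_km, sigma = 6.0, 5.0, 0.6

ln_median = -3.5 + 0.9 * m - 1.2 * np.log(r_km + 10.0)
samples = np.exp(ln_median + rng.normal(0.0, sigma, 100_000))

p16, p50, p84 = np.percentile(samples, [16, 50, 84])
print(f"near-field PGA, 16/50/84th percentiles ~ {p16:.2f} / {p50:.2f} / {p84:.2f} g")
print(f"84th/16th ratio ~ {p84 / p16:.1f}x for the same single magnitude number")
```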

And as for that idealized "causative fault", consider the video clip at the top of the page. Does the reader find such a distinct fault there?






(Add to this later: the question addressed in stock descriptions of probabilistic seismic hazard analyses, as compared favorably with "deterministic" straw-man methods embodying complete ignorance of how engineers actually make design decisions. Add here too a Monte Carlo analysis showing how breaking the problem down into "causative" steps increases rather than decreases the uncertainty. Add further how the assumption of characteristic earthquakes based on similar Newtonian correlations was the fatal misconception at Fukushima in 2011. The questions are piling up here like traffic on the San Diego Freeway.)
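As a placeholder for that Monte Carlo point, a toy sketch with invented step uncertainties: when the final estimate is a product of independently uncertain "causative" factors, the log-spreads add in quadrature, so each added step widens the total rather than narrowing it.

```python
import numpy as np

# Toy Monte Carlo: the final estimate is a product of step factors, each with
# its own assumed lognormal uncertainty (fault identification, magnitude,
# path/attenuation, site response). All sigmas are hypothetical.
rng = np.random.default_rng(2)
n = 100_000
step_sigmas = [0.3, 0.4, 0.5, 0.3]

total = np.ones(n)
for sigma in step_sigmas:
    total *= rng.lognormal(mean=0.0, sigma=sigma, size=n)
    print(f"after step with sigma={sigma}: total ln-spread = {np.std(np.log(total)):.2f}")

# The ln-spreads add in quadrature: sqrt(0.3^2 + 0.4^2 + 0.5^2 + 0.3^2) ~ 0.77.
```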