Mega-earthquakes may be more likely due to ‘stable’ zones, say researchers

Larger earthquakes may be more likely due to so-called “stable” zones, according to researchers from the California Institute of Technology (Caltech).

Previous research has suggested that “stable” zones, where fault segments slip slowly and steadily, act as barriers to fast-slipping, shake-producing earthquake ruptures; such zones aren’t typically thought to be capable of hosting rapid earthquake-producing slip. A new study by Caltech researchers and the Japan Agency for Marine-Earth Science and Technology, however, suggests otherwise.

“What we have found, based on laboratory data about rock behavior, is that such supposedly stable segments can behave differently when an earthquake rupture penetrates into them. Instead of arresting the rupture as expected, they can actually join in and hence make earthquakes much larger than anticipated,” says Nadia Lapusta, professor of mechanical engineering and geophysics at Caltech and coauthor of the study.

Lapusta and her coauthor, Hiroyuki Noda, think this is what happened during the 2011 magnitude 9.0 Tohoku-Oki earthquake, which triggered a massive tsunami that killed thousands of people.

Fault slip results from the interaction between the stresses acting on the fault and friction. Stress and friction are determined by a number of factors, including the behavior of the fluids permeating the rocks in the Earth’s crust. Lapusta and her research team devised new earthquake fault models that blend laboratory-based knowledge of complex friction laws and fluid behavior, and created computational procedures that allow scientists to numerically simulate how those model faults will act under stress.

“The uniqueness of our approach is that we aim to reproduce the entire range of observed fault behaviors—earthquake nucleation, dynamic rupture, postseismic slip, interseismic deformation, patterns of large earthquakes—within the same physical model; other approaches typically focus only on some of these phenomena,” says Lapusta.
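The article does not spell out which friction laws are built into the model, but laboratory-derived fault models of this kind are commonly formulated with rate-and-state friction. The short Python sketch below is a schematic illustration of that general idea, not the researchers’ code; the symbols (a, b, D_c, V_0) are conventional and the parameter values are purely illustrative.

import numpy as np

MU_0 = 0.6    # reference friction coefficient (illustrative)
V_0 = 1e-6    # reference slip rate in m/s (illustrative)
D_C = 1e-4    # characteristic slip distance in m (illustrative)

def friction(V, theta, a, b):
    # Dieterich-Ruina rate-and-state friction: depends on slip rate V and a state variable theta
    return MU_0 + a * np.log(V / V_0) + b * np.log(V_0 * theta / D_C)

def steady_state_friction(V, a, b):
    # At steady state theta = D_C / V, so friction depends on slip rate alone
    return MU_0 + (a - b) * np.log(V / V_0)

# a - b < 0: velocity-weakening (friction drops as slip speeds up; earthquake-prone)
# a - b > 0: velocity-strengthening (friction rises with slip rate; stable creep)
for label, a, b in [("velocity-weakening", 0.010, 0.015),
                    ("velocity-strengthening", 0.015, 0.010)]:
    slow = steady_state_friction(1e-9, a, b)
    fast = steady_state_friction(1e-3, a, b)
    print(f"{label}: mu at 1 nm/s = {slow:.3f}, at 1 mm/s = {fast:.3f}")

In formulations like this, the sign of a minus b decides whether a fault patch tends to nucleate earthquakes or to creep stably, which is the distinction at the heart of the new study.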

To make the new earthquake fault models as realistic as possible, the researchers assigned the model faults properties based on previous laboratory experiments on rock materials from a real-life fault zone.

“In that experimental work, rock materials from boreholes cutting through two different parts of the fault were studied, and their properties were found to be conceptually different,” says Lapusta. “One of them had so-called velocity-weakening friction properties, characteristic of earthquake-producing fault segments, and the other one had velocity-strengthening friction, the kind that tends to produce stable creeping behavior under tectonic loading. However, these ‘stable’ samples were found to be much more susceptible to dynamic weakening during rapid earthquake-type motions, due to shear heating.”
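To make the behavior Lapusta describes concrete, the schematic Python sketch below (an illustration, not taken from the study) models a velocity-strengthening segment that creeps stably at slow slip rates but loses most of its frictional strength once slip exceeds a hypothetical weakening rate V_w, standing in for dynamic weakening by shear heating. All parameter values are illustrative.

import numpy as np

def slow_steady_friction(V, a_minus_b=0.005, mu_0=0.6, V_0=1e-6):
    # Velocity-strengthening steady state (a - b > 0): friction rises with slip rate
    return mu_0 + a_minus_b * np.log(V / V_0)

def fast_weakened_friction(V, V_w=0.1, mu_w=0.2, a_minus_b=0.005):
    # Above a weakening slip rate V_w (in m/s), shear heating drives friction toward a
    # much lower value mu_w; below V_w, ordinary rate-and-state behavior applies.
    mu_slow = slow_steady_friction(V, a_minus_b)
    return np.where(V > V_w, mu_w + (mu_slow - mu_w) * V_w / V, mu_slow)

# Creeping at nanometers per second the segment looks "stable" (friction about 0.57),
# but driven at about 1 m/s by a penetrating rupture its friction collapses to about 0.25,
# so it can join the rupture instead of stopping it.
for V in (1e-9, 1e-6, 1.0):
    print(f"V = {V:g} m/s -> friction = {float(fast_weakened_friction(V)):.2f}")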

The researchers discovered that the “stable” zones did not always stop earthquakes in their models. Occasionally, a dynamic rupture would penetrate such a zone in just the right way to activate dynamic weakening, leading to massive slip. They think this is what occurred during the 1999 magnitude 7.6 Chi-Chi earthquake in Taiwan, whose biggest slip took place in what was thought to be a “stable” zone.

This finding implies that mega-earthquakes are possible in many areas of the world where fault segments had been considered stable, the researchers say. Given this conclusion, Lapusta believes that the seismic hazard of the southern and northern parts of California’s San Andreas Fault needs to be reassessed.

The infamous San Andreas Fault gained notoriety in 1906 when sudden displacement along the fault generated the great San Francisco earthquake and fire. According to the U.S. Geological Survey, the San Andreas is the “master” fault of an elaborate fault network that cuts through rocks of the California coastal region.

“Lapusta and Noda’s realistic earthquake fault models are critical to our understanding of earthquakes—knowledge that is essential to reducing the potential catastrophic consequences of seismic hazards,” says Ares Rosakis, chair of Caltech’s Division of Engineering and Applied Science. “This work beautifully illustrates the way that fundamental, interdisciplinary research in the mechanics of seismology at Caltech is having a positive impact on society.”

“We find that the model qualitatively reproduces the behavior of the 2011 magnitude 9.0 Tohoku-Oki earthquake as well, with the largest slip occurring in a place that may have been creeping before the event,” says Lapusta. “All of this suggests that the underlying physical model, although based on lab measurements from a different fault, may be qualitatively valid for the area of the great Tohoku-Oki earthquake, giving us a glimpse into the mechanics and physics of that extraordinary event.”

Given the risk that such “extreme events” could recur around the world, the researchers think the new earthquake fault models may be useful for examining future earthquake scenarios in a given region.

Lapusta says that the models may also be appropriate for studying how earthquakes are affected by man-made disturbances such as geothermal energy harvesting and CO2 sequestration.

“We plan to further develop the modeling to incorporate realistic fault geometries of specific well-instrumented regions, like Southern California and Japan, to better understand their seismic hazard,” she adds.

The study’s findings were described on January 9 in the journal Nature.
