From hot and strong to cold and weak: How ‘Refrigeration Weakening’ facilitates strain localization during subduction initiation
- 1Utrecht University, Department of Earth Sciences, Utrecht, the Netherlands (a.j.kotowski@uu.nl)
- 2University of Southern California, Department of Earth Sciences, Los Angeles, California, USA
- 3McGill University, Department of Earth and Planetary Sciences, Montréal, Québec, Canada
Temperature predictably controls mineral stability during metamorphism and exerts a first-order influence on rock viscosity during deformation, but the co-evolution of temperature-dependent metamorphic mineralogy and rock rheology is poorly understood. Thermally controlled metamorphic-mechanical feedbacks should therefore drive strain localization during plate boundary formation. However, geologic observations from horizontally forced, newly destructive, intra-oceanic subduction plate boundaries consistently present a rheological paradox: hot rocks are strong and cold rocks are weak.
Observations from geo-/thermochronology and metamorphic thermobarometry demonstrate that, following a stage of initially slow, forced convergence along a hot plate interface when the slab tip resists downward translation, the lower plate suddenly collapses into the mantle. After this catastrophic collapse, subduction becomes self-sustaining (i.e., slab pull is established), stable, and cold. The physical mechanism triggering collapse is unknown, though geological studies show that collapse occurs contemporaneously with rapid cooling of the nascent plate contact and depression of slab-parallel isotherms, or “refrigeration”. These observations imply that hot rocks are strong (i.e., viscously sticky) and cold rocks are weak (i.e., viscously lubricating). It is puzzling that the resistance phase is characterized by the highest-temperature metamorphism, since high-temperature rocks are commonly viscously weak; furthermore, plate boundary “unzipping” and lithosphere-scale localization occur only during the rapid refrigeration phase, when rocks should, to a first order, be stiffer and less deformable. What mechanism(s) overcome stiffening, such that a temperature decrease leads to rock weakening, strain localization, and successful subduction initiation?
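The first-order expectation that cooling stiffens rock follows from the Arrhenius temperature dependence of power-law creep. The sketch below illustrates this expectation only; the flow-law parameters (pre-exponential constant, stress exponent, activation energy, strain rate) are assumed, roughly quartzite-like values chosen for illustration, not results from this study.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dislocation_creep_viscosity(T_kelvin, strain_rate=1e-14,
                                A=10**-11.2, n=4.0, Q=135e3):
    """Effective viscosity [Pa s] from a power-law creep flow law,
    strain_rate = A * sigma**n * exp(-Q / (R * T)), with sigma in MPa.
    All parameter values are illustrative assumptions."""
    sigma_mpa = (strain_rate / (A * math.exp(-Q / (R * T_kelvin)))) ** (1.0 / n)
    return sigma_mpa * 1e6 / (2.0 * strain_rate)  # convert MPa -> Pa

# At fixed mineralogy and strain rate, cooling raises the effective viscosity:
eta_hot = dislocation_creep_viscosity(900.0)   # ~630 °C interface
eta_cold = dislocation_creep_viscosity(700.0)  # ~430 °C interface
print(f"hot: {eta_hot:.1e} Pa s, cold: {eta_cold:.1e} Pa s")
```

With these assumed values the colder rock is a few times more viscous, which is the stiffening that any cold-weakening mechanism must overcome.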
Here, we compare microstructures from metamorphic rocks that formed along ancient hot, warm, and cold plate interfaces, corresponding to different stages in the evolution of a subduction zone from initiation to self-sustaining subduction. We conclude that refrigeration drives changes in metamorphic mineral stability, the distribution of strain-accommodating minerals, fluid content, and deformation mechanisms, which together dramatically weaken the developing plate boundary. We use flow laws and paleopiezometry to bracket ductile rock strength and demonstrate that cooling at the time of inferred slab collapse causes a viscosity reduction of at least three orders of magnitude, from ~10²¹ to ~10¹⁸ Pa·s, the most drastic change in viscosity over a subduction zone’s lifetime. The viscosity reduction leads to a series of dynamic feedbacks, namely: interface decoupling and accelerated plate rates; increased and sustained interface cooling; and stabilization of a wetter, weaker interface. Our ‘refrigeration weakening’ hypothesis embodies a coupled metamorphic-mechanical process that is inherent to the evolution of common oceanic crust, and successfully explains observed changes in upper plate stress state, inferred slab velocity, and the timing of proto-forearc seafloor spreading. If true, this mechanism motivates several new testable hypotheses regarding the dynamics of subduction zone development as a function of temperature changes in the modern Earth and throughout Earth’s history.
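The flow-law and paleopiezometry bracketing described above can be sketched in a few lines: a recrystallized-grain-size piezometer converts a measured grain size to differential stress, and combining that stress with an independently estimated strain rate gives an effective viscosity. The piezometer coefficients, grain sizes, and strain rates below are hypothetical placeholders of a quartz-like form, not the study’s actual calibrations or data.

```python
def piezometer_stress(grain_size_um, B=10**3.56, p=1.26):
    """Differential stress [MPa] from recrystallized grain size [um],
    using a piezometer of the form d = B * sigma**(-p).
    Coefficients here are quartz-like values assumed for illustration."""
    return (grain_size_um / B) ** (-1.0 / p)

def effective_viscosity(sigma_mpa, strain_rate):
    """eta = sigma / (2 * strain_rate), returned in Pa s."""
    return sigma_mpa * 1e6 / (2.0 * strain_rate)

# Hot, slowly converging interface: coarse grains, slow assumed strain rate.
eta_hot = effective_viscosity(piezometer_stress(100.0), 1e-14)

# Cold, decoupled interface: finer grains but much faster, localized strain.
eta_cold = effective_viscosity(piezometer_stress(10.0), 1e-10)

print(f"hot ~{eta_hot:.0e} Pa s, cold ~{eta_cold:.0e} Pa s, "
      f"drop ~{eta_hot / eta_cold:.0f}x")
```

With these placeholder inputs the hot interface sits near 10²¹ Pa·s and the cold one several orders of magnitude lower, showing how a large viscosity drop can emerge even though the piezometer assigns higher stress to the finer-grained cold rocks: the faster, localized strain rate dominates.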
How to cite: Kotowski, A., Seyler, C., and Kirkpatrick, J.: From hot and strong to cold and weak: How ‘Refrigeration Weakening’ facilitates strain localization during subduction initiation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12666, https://doi.org/10.5194/egusphere-egu24-12666, 2024.