Exposing Bedrock: A Marker of Stability or Instability?
This piece was featured in the October 2020 edition of Reflectance.
Where Earth is not covered by water, it is usually covered by a life-sustaining mix of mineral and organic matter called soil. Despite the ubiquity of soil, some of the most inspiring landscapes are those where the ‘skin’ of the Earth has been stripped away (think bedrock gorges and glaciated peaks). The stark absence of soil in these settings is what makes them unique, alluring, and beautiful. Understanding how they formed requires thinking across at least seven orders of magnitude in time: a landslide can strip a hillslope in a few minutes, while a meter of soil can take many millennia to form. Knowing how and where soil persists is essential to understanding ecosystems, agriculture, and natural hazards.
The loss of soil is often attributed to disturbance, but can patchy soil cover be stable in the statistical sense? In Project Erosion, we are pondering these questions through the lens of the soils, tors (isolated bedrock outcrops), and rocky cliffs of the Colorado Front Range below the glacial limit. Incised canyons are dominated by steep, rocky cliffs with trees seemingly growing on bare rock. Elsewhere, soil cover is more extensive, yet soils remain thin and patchy. Our research suggests that soil patchiness is a stable feature of the landscape, driven by relatively small differences in the pace and pattern of bedrock river incision.
Can we relate fractional soil cover to process? Insights from numerical models that treat trees as agents with simple rules for sapling recruitment, growth, and death suggest we can. We ground these model results in reality using drone-based photogrammetry and multi-spectral lidar data to characterize how bedrock exposure varies as a function of landscape position and time. So the next time you drive up into the mountains, ask yourself why and where soil is absent, and whether the bedrock you see is a stochastic accident or a persistent feature of the landscape.
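The tree-as-agent idea can be sketched in a few lines. The toy model below is a minimal illustration under invented assumptions, not our actual research code: the 1-D grid, the rates, and the thresholds are all hypothetical placeholders. It captures only the rule structure described above: saplings recruit where soil is deep enough, trees age and die stochastically, treefall locally strips soil, and soil slowly forms from bedrock.

```python
import random

# Hypothetical rates and thresholds, chosen only for illustration.
SOIL_PRODUCTION = 0.001   # m of soil produced from bedrock per cell per step
RECRUIT_DEPTH = 0.1       # minimum soil depth (m) for sapling recruitment
RECRUIT_PROB = 0.1        # per-step chance a sapling recruits on viable soil
DEATH_PROB = 0.02         # per-step chance an established tree dies
TREEFALL_LOSS = 0.2       # m of soil removed locally when a tree topples

def step(soil, trees, rng):
    """Advance soil depths (m) and tree ages (steps) by one time step."""
    for i in range(len(soil)):
        soil[i] += SOIL_PRODUCTION                   # slow soil production
        if trees[i] is None:
            # recruitment: saplings need a minimum soil depth
            if soil[i] >= RECRUIT_DEPTH and rng.random() < RECRUIT_PROB:
                trees[i] = 0
        else:
            trees[i] += 1                            # growth (aging)
            if rng.random() < DEATH_PROB:
                trees[i] = None                      # death and treefall
                soil[i] = max(0.0, soil[i] - TREEFALL_LOSS)

def bedrock_fraction(soil, threshold=0.01):
    """Fraction of cells with effectively no soil cover."""
    return sum(1 for d in soil if d < threshold) / len(soil)

rng = random.Random(42)
soil = [rng.uniform(0.0, 0.3) for _ in range(200)]   # initial patchy soil
trees = [None] * len(soil)
for _ in range(2000):
    step(soil, trees, rng)
print(f"bedrock exposure after 2000 steps: {bedrock_fraction(soil):.2f}")
```

Even a sketch this simple reaches a statistically steady bedrock fraction set by the balance of soil production and disturbance, which is the sense in which patchy cover can be "stable" without any cell being permanently bare.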