Erosion response of a disturbed sagebrush steppe hillslope
- Goff, B.F., Bent, G.C., Hart, G.E.
- Journal of Environmental Quality 1993 v.22 no.4 pp. 698-709
- steppes, Artemisia tridentata, rill erosion, rain, losses from soil, rain intensity, Idaho
- Land management activities that disrupt surface vegetation cover pose a serious threat to the long-term stability of buried-waste sites located within the semiarid sagebrush (Artemisia tridentata Nutt.) steppe region of the northwestern USA. In this study, we evaluated the erosion response of a sagebrush hillslope subjected to three vegetation cover treatments: natural (undisturbed), bare (plant canopy and litter cover removed), and clipped (canopy removed). A rotating-boom rainfall simulator was used to apply rain at 60 or 120 mm/h intensities to runoff plots (3.0 m by 10.7 m) under dry, wet, and very wet antecedent moisture conditions, during two late-summer seasons and one early-summer season. Supplemental overland flow was added at the upper end of each plot to simulate increased slope length during very wet runs. Maximum soil loss rates on the natural, clipped, and bare treatments were, respectively, 1, 5, and 216 mg/m2 per s at the 60 mm/h rainfall intensity, and 13, 79, and 1473 mg/m2 per s at the 120 mm/h rainfall intensity. Cumulative soil loss was typically 100 to 1000 times greater on the bare treatment than on the natural or clipped treatments. Increases in simulated slope length produced a near-linear increase in soil loss from the bare treatment plots (about 0.02 g/m2 per s of soil loss per m of slope length) up to about 30 m, after which the effect of slope length declined. Surface crust development and mound-intermound microtopography played important roles in governing soil detachment and transport on the hillslope. Despite high rainfall intensity and surface runoff rates, rill erosion was negligible on both the undisturbed and disturbed portions of the hillslope.
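The abstract's slope-length result (a near-linear gain of about 0.02 g/m2 per s of soil loss per metre of slope length on bare plots, weakening beyond roughly 30 m) can be sketched as a simple piecewise-linear relation. This is an illustrative reading of the reported numbers only, not the authors' fitted model; the function name, the base-rate parameter, and the hard cap at the 30 m breakpoint are assumptions for demonstration.

```python
def bare_plot_soil_loss_rate(base_rate_g_m2_s: float,
                             slope_length_m: float,
                             k: float = 0.02,
                             breakpoint_m: float = 30.0) -> float:
    """Illustrative soil loss rate (g/m2 per s) on a bare plot vs. slope length.

    base_rate_g_m2_s : loss rate at negligible slope length (hypothetical input)
    k                : ~0.02 g/m2 per s added per metre of slope (from the abstract)
    breakpoint_m     : beyond ~30 m the abstract reports a declining slope-length
                       effect; capping the linear term here is an assumption,
                       not the study's actual post-30 m behaviour.
    """
    effective_length = min(slope_length_m, breakpoint_m)
    return base_rate_g_m2_s + k * effective_length


# Example: with a hypothetical base rate of 0.1 g/m2 per s, a 10 m slope adds
# 0.2 g/m2 per s, while 30 m and 50 m slopes give the same capped value.
print(bare_plot_soil_loss_rate(0.1, 10.0))
print(bare_plot_soil_loss_rate(0.1, 50.0))
```

Under this sketch, doubling slope length from 10 m to 20 m adds another 0.2 g/m2 per s, consistent with the near-linear trend reported up to 30 m.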