I am trying to model a small (<1 km²) catchment that drains into a sediment dam. The dam starts pumping water out at 80 L/s once the water depth reaches 2 m. I referred to this post, which explains how to set up the S-Q table to simulate a pump turning on at a specific water level.
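For anyone unfamiliar with the trick, here is a minimal sketch of how such a stepped S-Q table behaves. All the numbers except the 80 L/s pump rate are made up for illustration (in particular, the storage corresponding to 2 m depth is assumed, since it depends on the actual H-S curve):

```python
import numpy as np

PUMP_ON_STORAGE = 20_000.0   # m^3 at 2 m depth -- assumed value, for illustration only
PUMP_RATE = 0.08             # m^3/s (80 L/s)

# Two nearly coincident points at the trigger storage give a sharp
# step from zero outflow to the pump rate.
sq_table = np.array([
    [0.0,                    0.0],
    [PUMP_ON_STORAGE - 1.0,  0.0],        # just below trigger: no outflow
    [PUMP_ON_STORAGE,        PUMP_RATE],  # at/above trigger: pump on
    [200_000.0,              PUMP_RATE],  # constant pump rate above trigger
])

def outflow(storage_m3: float) -> float:
    """Linearly interpolate outflow from the S-Q table."""
    return float(np.interp(storage_m3, sq_table[:, 0], sq_table[:, 1]))

print(outflow(10_000.0))  # below trigger -> 0.0
print(outflow(50_000.0))  # above trigger -> 0.08
```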
The dam has a capacity of 190 ML, and when running the RORB model for the 1% AEP event we found that it only reaches about 47% of capacity at peak. This was tested across all temporal patterns and all storm durations.
However, I received an updated dam design and as such needed to update the H-S table in the catchment file. Nothing else was changed except the H-S table and the level in the S-Q table at which "pumping" starts. The new dam, which has a capacity of 170 ML, now produces errors in the results saying that the level has exceeded the input storage curve, which suggests a spill, in many different scenarios.
How is it that, with just a small change in dam capacity, the results go from a peak of 47% of capacity to spilling in almost every scenario?
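A quick back-of-the-envelope check using only the numbers above suggests the capacity reduction alone should not explain it, which makes me suspect the error relates to the new H-S table itself (e.g. its top entry) rather than genuine lack of storage:

```python
OLD_CAPACITY_ML = 190.0
NEW_CAPACITY_ML = 170.0
PEAK_FRACTION_OLD = 0.47

# Peak storage reached under the old design (all patterns/durations)
peak_storage_ml = PEAK_FRACTION_OLD * OLD_CAPACITY_ML
print(peak_storage_ml)                    # ~89.3 ML

# Fraction of the NEW capacity that same peak storage would occupy
print(peak_storage_ml / NEW_CAPACITY_ML)  # ~0.53, i.e. well short of full
```

So unless the inflows changed, roughly 89 ML of peak storage should still sit at only about 53% of the new 170 ML capacity.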