When modelling complex catchments, my PC runs out of memory

When a complex modelling scenario is run, the software can run out of memory for temporary data, or simply take a very long time to complete. The imported data node can assist here.

A 'complex' or large model could be defined as follows (a rough estimate of the data volume this implies is sketched after the list):

  1. A large number of source and treatment nodes - more than 100 nodes;
  2. A long time series of data - greater than 20 years; and
  3. A short time step - 6 minutes.
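
To get a feel for why these thresholds matter, the Python sketch below estimates how many time steps, and roughly how much raw result data, a model of this size implies. The per-node storage assumptions (four stored quantities at 8 bytes each) are hypothetical and purely illustrative; MUSIC's actual internal storage will differ.

    # Rough, illustrative estimate of the data volume a "complex" model implies.
    # The per-node storage assumptions are hypothetical, not MUSIC's real layout.
    years = 20
    steps_per_day = 24 * 60 // 6          # 6-minute time step -> 240 steps per day
    steps = years * 365 * steps_per_day   # about 1.75 million time steps
    nodes = 100
    quantities_per_node = 4               # e.g. flow, TSS, TP, TN (assumed)
    bytes_total = steps * nodes * quantities_per_node * 8
    print(f"{steps:,} time steps, roughly {bytes_total / 1e9:.1f} GB of raw results")

Even under these conservative assumptions the model generates on the order of 1.75 million time steps and several gigabytes of results, which is why memory and run time become limiting.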
Memory management issues are also affected by the hardware of the PC, so ensure that the PC used to run the model has sufficient processing power and memory.

The approach is to separate the model into smaller sub-catchment models and then import the results of each back into a “full” model, which may simply be a group of imported data nodes.

Suppose the initial model is shown to be too complex (note that the images on this page are simplified for the purposes of this explanation). Figure 1 shows how it might be divided into potential subcatchments.


Figure 1 Potential Subcatchments


In the example above, three logical subcatchments would be appropriate. Each of these would be set up as an individual subcatchment model (as per the example in the multiple rainfall locations tip) and its results exported into a final “full” model. Using the above model as an example, the result would look like the model shown in Figure 2.


Figure 2 “Full” Catchment Model

The model in Figure 2 is considerably simplified, from 18 nodes down to 8, so each model should run considerably faster, or at least within the limitations of the computer being used. As with the multiple rainfall gauge models, these sub-models must use the same time step and time period as the original (or “full”) model. In this way MUSIC can be used to model very complex systems by simply breaking them down into smaller sub-models and joining them up using the imported data node.
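
As a hedged illustration of that last requirement, the Python sketch below checks that a set of exported sub-model time series share the same time step and period before they are combined for the “full” model. The file names, and the assumption that each export is a date-indexed CSV, are hypothetical; MUSIC's actual export format may differ.

    import pandas as pd

    # Hypothetical file names for the three sub-catchment exports; adjust to
    # match however the sub-model results were actually exported.
    files = ["subcatchment_A.csv", "subcatchment_B.csv", "subcatchment_C.csv"]

    # Read each export as a date-indexed table (assumed CSV layout).
    exports = {path: pd.read_csv(path, index_col=0, parse_dates=True) for path in files}

    # Every sub-model must share the same time step and the same start/end dates
    # before its results can feed the "full" model via imported data nodes.
    reference = exports[files[0]]
    ref_step = reference.index[1] - reference.index[0]
    for path, df in exports.items():
        step = df.index[1] - df.index[0]
        if step != ref_step:
            raise ValueError(f"{path}: time step {step} differs from {ref_step}")
        if (df.index[0], df.index[-1]) != (reference.index[0], reference.index[-1]):
            raise ValueError(f"{path}: time period differs from the other sub-models")

    # If the checks pass, the exports can be combined side by side for inspection.
    combined = pd.concat(exports.values(), axis=1, keys=files)
    print(f"All exports aligned: {len(combined):,} time steps at {ref_step} resolution")

A check like this is worth running before setting up the imported data nodes, because a mismatch in time step or period between sub-models will not be resolved inside the “full” model.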
