Modelling tool user interfaces
The tables below compare the main features of the selected modelling tools' user interfaces that relate to their ease of use. These include approximate comparisons of typical model run times and the computing power needed to run them, as well as how easy it is to export and view various model outputs and to test different parameter value options for sensitivity analyses and/or calibration. (More specifics about what outputs each tool can produce are covered here.)
What can be achieved in the time available for a modelling project is influenced by the combination of how long it takes to set up a model (including preparing input data in the required format, setting up the model structure, and entering parameter values), how long the model takes to run, how long it takes to access the model outputs of interest, and how long it takes to test and refine the model. Some modelling tools run very quickly, but take a relatively long time to set up and lack an efficient way to change and test multiple parameter value options, which makes calibration a time-consuming, laborious, manual process. Other tools may take a long time to run, but can be set up to do a number of parameter-testing runs and even scenario runs at once, allowing the modeller to attend to other work in the meantime (though they may have to do so on another computer if the model requires a lot of computing power!).
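For tools without a built-in batch-run facility, this kind of unattended parameter testing can sometimes be approximated with a small script, provided the model can be driven from the command line. Below is a minimal Python sketch of such a loop; the executable name (model.exe), the parameter-file template, and the {K} placeholder token are all hypothetical and would need to be adapted to the conventions of the specific tool. The runs are launched sequentially here, but they could equally be launched in parallel.

```python
# Minimal sketch (hypothetical file and executable names): launch a series of
# parameter-testing runs of a command-line modelling tool, one value at a time.
import subprocess
from pathlib import Path

BASE_PARAM_FILE = Path("params_base.txt")  # hypothetical template parameter file
TEST_VALUES = [0.10, 0.15, 0.20, 0.25]     # candidate values for one parameter

for value in TEST_VALUES:
    run_dir = Path(f"run_k{value:.2f}")
    run_dir.mkdir(exist_ok=True)

    # Write a copy of the parameter file with the test value substituted.
    # The placeholder token "{K}" marks where the value goes in the template.
    text = BASE_PARAM_FILE.read_text().replace("{K}", f"{value:.2f}")
    (run_dir / "params.txt").write_text(text)

    # Launch the model run (executable name is hypothetical and assumed to be
    # on the PATH); each run writes its outputs into its own folder.
    subprocess.run(["model.exe", "params.txt"], cwd=run_dir, check=True)
```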
Ease and efficiency of software use is also influenced by the documentation and support available for the tool. How accessible and user-friendly these are can determine whether users get stuck troubleshooting for long periods or inadvertently model things incorrectly. User impressions of the ease of use of tool interfaces, documentation, and support are provided below. Links to online documentation and support resources for these tools are provided here.
Interface comparison
| Interface characteristic | WRSM-Pitman | SPATSIM-Pitman | ACRU4 | SWAT2012 | MIKE-SHE |
|---|---|---|---|---|---|
| Graphical user interface (vs code prompt) | yes | yes | yes | yes | yes |
| Catchment map display (visualise linkages) | no | yes | no | yes | yes |
| **Model run times** | | | | | |
| Estimated model run time for a 30-year run, ~300 km² catchment (note: will depend on model set-up complexity & computing power!) | seconds to minutes | seconds to minutes | seconds to minutes | tens of minutes | hours |
| **Computing resources needed** | | | | | |
| Comparative rating of computing power needed to achieve workable run times | light | light | light | medium | intensive (needs a good GPU) |
| **Model set-up ease & efficiency** | | | | | |
| Automated creation of model units & connections from map inputs (vs fully manual creation) | no | no | no | yes | yes |
| Input parameter values and change values for batches of model units (e.g., all HRUs of a cover type) | (limited) | yes | no | yes | yes |
| In-built database of suggested parameter values (e.g., for common vegetation types, soil types, etc.) | no | no | yes | yes | no |
| User can build own parameter databases for use across multiple models | no | (limited) | no | yes | yes |
| **Model set-up transparency (i.e., is it very obvious what the model is doing/assuming?)** | | | | | |
| Interface makes the user interact with every component & parameter entry option during model set-up (vs having default parameter values pre-entered & not forcing the user to view them) | yes | yes | yes | no | yes |
| Tool checks connection errors | (limited) | yes | (limited) | yes | yes |
| **Batch runs & calibration tools** | | | | | |
| Facility for batch runs, parameter sensitivity analyses, uncertainty analyses & auto-calibration | no | yes | no | yes | yes |
| **Accessing model output** | | | | | |
| Output viewer tool for streamflow | yes | yes | yes | yes | yes |
| Output viewer tool for water balance fluxes and stores | (limited) | yes | no | (limited) | yes |
| All water balance components calculated by the model can be exported | no | no | yes | yes | yes |
| Batch export of water balance fluxes for the model's basic spatial units | no | yes | yes | yes | yes |
| Automated extraction of water balance fluxes for different spatial scales (e.g., by cover class area, by subcatchment, full catchment) | no | no | no | (limited) | yes |
Formats of input and output data
The table below gives some basic information about the file formats used for model inputs and outputs across the different modelling tools, to give a general impression of what is required to work with them. This is a very rough overview: one has to work with the user manuals, tutorials, and/or pre-existing demonstration models and data to understand the various formatting requirements and file types used across the inputs and outputs of a specific software tool.
For large or complex model set-ups that will have many different inputs (e.g., different input rainfall timeseries for several different points across the modelled area), it is highly recommended to use coding tools like R or Python to prepare the input files: getting many files into the specific formatting required by the modelling software is time-consuming by hand, and most tools do not have in-built conversion utilities.
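As a concrete illustration, the minimal Python sketch below batch-converts plain CSV files of daily rainfall into a fixed-width text layout of the kind many of these tools expect. The file names, the column names (date, rain_mm), and the year/month/day plus value layout are hypothetical examples; the real formatting rules must come from the relevant user manual.

```python
# Minimal sketch (hypothetical file names and layout): batch-convert daily
# rainfall CSVs into a fixed-width text format for a modelling tool.
import glob

import pandas as pd

for csv_path in glob.glob("rainfall_station_*.csv"):  # hypothetical file set
    df = pd.read_csv(csv_path, parse_dates=["date"])  # columns: date, rain_mm

    out_path = csv_path.replace(".csv", ".txt")
    with open(out_path, "w") as out:
        for row in df.itertuples():
            # Example layout: "YYYY MM DD" followed by the rainfall value in
            # mm to one decimal place, right-aligned in 8 characters.
            out.write(f"{row.date:%Y %m %d}{row.rain_mm:8.1f}\n")
```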
| Data type | WRSM-Pitman | SPATSIM-Pitman | ACRU4 | SWAT2012 | MIKE-SHE |
|---|---|---|---|---|---|
| Timeseries data | Specially formatted text files (special file extensions) | Specially formatted text files (.txt) | Specially formatted ASCII text files (.txt) & .DBF files | Specially formatted text files (.txt) & Access database files | Software-specific .dfs0 file format |
| Spatial data | N/A (spatial data is not directly input into the software) | N/A (spatial data is not directly input into the software) | N/A (spatial data is not directly input into the software) | Standard GIS shapefiles and grid/raster files (GeoTIFF, grid) | Software-specific .dfs2 and .dfs3 file formats in general; a few inputs allow standard shapefiles |
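For the software-specific binary formats, vendor or community Python packages can sometimes bridge the gap; for example, DHI publishes the open-source mikeio package for working with dfs files. A minimal sketch, assuming mikeio is installed and a .dfs0 result file is at hand (the file name here is a hypothetical placeholder), might look like this:

```python
# Minimal sketch: read a MIKE-SHE .dfs0 timeseries result into a pandas
# DataFrame using DHI's open-source mikeio package (pip install mikeio).
import mikeio

ds = mikeio.read("detailed_timeseries.dfs0")  # hypothetical file; returns a Dataset
df = ds.to_dataframe()                        # one column per stored item
print(df.head())
```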
User impressions of 'ease-of-use' (modeller survey)
A brief survey for hydrological modellers was distributed via the South African Hydrology Society (SAHS) as part of the model intercomparison project (2019-2021). Participants were asked to rank the ease of use of the software user interface, its documentation, and the available support for any modelling tools they were familiar with, on a scale of 1-5 in which: 1 = poor, 3 = satisfactory, 5 = excellent.
There was a very wide range of scores assigned for each tool across the respondents, showing that different people experience the tools differently!
Both the average and the range of scores assigned are presented below.
(Links to online documentation and support resources for these tools are provided here.)
| Survey data | WRSM-Pitman | SPATSIM-Pitman | ACRU4 | SWAT2012 | MIKE-SHE |
|---|---|---|---|---|---|
| Number of users answering survey | 13 | 14 | 19 | 9 | 8 |
| Average ease-of-use score (1 = poor to 5 = excellent) | 3.8 | 3.4 | 3.4 | 3.9 | 3.0 |
| Range of scores assigned (1 = poor to 5 = excellent) | 2 - 5 | 2 - 5 | 1 - 5 | 3 - 5 | 1 - 5 |