This is probably a very underused method in the fields of mental health and therapy. The principle is as simple as the name: you build a model of what you think may be happening, accepting that it will always be a profoundly simplified model of real life, but you construct it so that it might reveal something interesting that could then be explored further, either in further simulations or in real life studies of any sort.
I find it hard to envisage what a purely qualitative simulation would be; I think such things belong to the very important realm of “thought experiments” (Gedankenexperiment) that have been so vital in the physical sciences and, to my mind, bizarrely neglected in much of psychology and MH work. So for my purposes here I am assuming that simulations are always quantitative.
Details #
One very simple simulation formed a fundamental part of a paper I co-authored: Barkham, M., Connell, J., Miles, J. N. V., Evans, C., Stiles, W. B., Margison, F., & Mellor-Clark, J. (2006). Dose-effect relations and responsive regulation of treatment duration: The good enough level. Journal of Consulting and Clinical Psychology, 74(1), 160–167. https://doi.org/10.1037/0022-006x.74.1.160 (not open access, contact me if you want a copy). I for one certainly don’t think that the “good enough level” is a sufficient or complete model of how and why some clients opt out of therapies before a planned duration; however, simulations in our arena should never aspire to be complete models of real life. What that simulation showed was that if clients coming into therapy improved at a steady (“linear”) rate per week/session, but differed in what that rate was, some improving fast and some more slowly, and if they opted out of therapy when they reached a certain level of improvement, then even though the process for each individual client was linear, their aggregate improvement would show curvilinear change, not linear change. That probably helped researchers think a bit more about “dose response curves” for therapies.
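Purely to illustrate that idea (this is not the actual simulation from the paper: the number of clients, the improvement rates and the “good enough” threshold here are all invented for the illustration), here is a tiny sketch. Each simulated client improves linearly at their own constant rate and stops changing once they reach the threshold, yet the mean improvement across all clients rises quickly at first and then flattens out: a curvilinear aggregate curve from purely linear individual change.

```python
# A tiny illustrative simulation (not the one from the paper; the
# numbers, rates and threshold are invented for illustration).
import random

random.seed(1)

N_CLIENTS = 1000
GOOD_ENOUGH = 10.0    # invented "good enough" improvement level
MAX_SESSIONS = 20

# each client gets their own constant improvement rate per session
rates = [random.uniform(0.5, 3.0) for _ in range(N_CLIENTS)]

for session in range(1, MAX_SESSIONS + 1):
    # linear change per client until they reach the good enough level,
    # at which point they leave therapy and their score stays put
    improvements = [min(rate * session, GOOD_ENOUGH) for rate in rates]
    mean_improvement = sum(improvements) / len(improvements)
    print(f"session {session:2d}: mean improvement = {mean_improvement:5.2f}")
```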
That also illustrates a very important issue for any process: the distinction between linear and non-linear processes/functions. If a function or process is “linear”, i.e. can, at its simplest, be represented as an equation like this:
y = a*x + c
i.e. that y, the dependent variable of interest, is made up of a constant (c) plus some multiple of some other variable (x), the multiplier (a) also being a constant, then, even for much more complex linear processes than that, perhaps involving multiple predictor (x) variables, simulation is fairly easy and stable, even when the processes are “iterative”: influenced by their previous state. However, if processes are iterative and non-linear, they can be mathematically “chaotic” and very hard to simulate, or rather, easy to simulate but likely to behave in sometimes profoundly surprising ways.
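To make that concrete, here is another little sketch. The linear update is just y = a*x + c applied repeatedly; for the non-linear one I have used the logistic map, a standard textbook example of a chaotic iterative process (my choice of example here, nothing to do with the paper). Two almost identical starting values stay almost identical under the linear process but drift completely apart under the chaotic one.

```python
# A minimal sketch of the linear/non-linear distinction for iterative
# processes: the linear update behaves predictably, the logistic map
# (a standard textbook example of chaos) does not.

def linear_step(x, a=0.9, c=1.0):
    # y = a*x + c : a linear update, settles smoothly towards c/(1-a)
    return a * x + c

def logistic_step(x, r=4.0):
    # x -> r*x*(1-x) : non-linear; chaotic on (0, 1) when r = 4.0
    return r * x * (1.0 - x)

# two nearly identical starting states for each process
x_lin1, x_lin2 = 0.2, 0.2001
x_log1, x_log2 = 0.2, 0.2001

for i in range(1, 41):
    x_lin1, x_lin2 = linear_step(x_lin1), linear_step(x_lin2)
    x_log1, x_log2 = logistic_step(x_log1), logistic_step(x_log2)
    if i % 10 == 0:
        print(f"iteration {i:2d}: "
              f"linear gap = {abs(x_lin1 - x_lin2):.6f}, "
              f"logistic gap = {abs(x_log1 - x_log2):.6f}")
```

The gap between the two linear trajectories shrinks towards zero; the gap between the two logistic trajectories quickly becomes as large as the values themselves, which is the “profoundly surprising” sensitivity to starting conditions that marks chaotic processes.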
The model in our paper was of linear change for individuals: that their state was changing steadily with time. More on these issues of linearity and of chaos in other entries (to come!)
Try also #
Chaos theory
Thought experiment
Linear versus non-linear functions/processes
Opting out, “dropping out”, attrition
Chapters #
Fairly simple simulations generated the data for plots in Chapters 5, 7 and 9.
Online resources #
None envisaged!
Dates #
First created 12.viii.23