## A hierarchical Bayesian state trace analysis for assessing monotonicity while factoring out subject, item, and trial level dependencies
Zbl 1431.62555

Summary: State trace analyses assess the latent dimensionality of a cognitive process by asking whether the means of two dependent variables conform to a monotonic function across a set of conditions. Using an assumption of independence between the measures, recently proposed statistical tests address bivariate measurement error, allowing both frequentist and Bayesian analyses of monotonicity (e.g., C. P. Davis-Stober et al. [J. Math. Psychol. 72, 116–129 (2016; Zbl 1359.62499)]; M. L. Kalish et al. [J. Math. Psychol. 70, 1–11 (2016; Zbl 1359.62502)]). However, statistical inference can be biased by unacknowledged dependencies between the measures, particularly when the data are insufficient to overwhelm an incorrect prior assumption of independence. To address this limitation, we developed a hierarchical Bayesian model that explicitly models the separate roles of subject-, item-, and trial-level dependencies between two measures. Monotonicity is then assessed by fitting separate models that do or do not allow a non-monotonic relation between the condition effects (i.e., same versus different rank orders). The widely applicable information criterion (WAIC) and pseudo-Bayesian model averaging – both cross-validation measures of model fit – are used for model comparison, providing an inferential conclusion regarding the dimensionality of the latent psychological space. We validated this new state trace analysis technique with model recovery simulation studies, which assumed different ground truths regarding monotonicity and the direction and magnitude of the subject- and trial-level dependence. We also provide an example application of this new technique to a visual object learning study that compared performance on a visual retrieval task (forced-choice part recognition) with performance on a verbal retrieval task (cued recall).
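The model-comparison step described above can be sketched compactly. The following is a minimal NumPy illustration (not the authors' implementation), assuming each fitted model supplies an S-draws-by-N-observations pointwise log-likelihood matrix, as Stan or brms can produce; the weights here use a plain softmax of the elpd values, omitting the Bayesian-bootstrap regularization that the loo package applies for pseudo-BMA+:

```python
import numpy as np

def _logsumexp(a, axis):
    # Numerically stable log-sum-exp along `axis`.
    m = a.max(axis=axis, keepdims=True)
    return (m + np.log(np.exp(a - m).sum(axis=axis, keepdims=True))).squeeze(axis)

def waic(log_lik):
    """elpd_WAIC and WAIC from an (S draws x N observations) log-likelihood matrix.

    lppd sums, over observations, the log of the posterior-mean likelihood;
    p_waic sums the posterior variances of the pointwise log-likelihood
    (Watanabe, 2010; Vehtari, Gelman, & Gabry, 2017).
    """
    S = log_lik.shape[0]
    lppd = np.sum(_logsumexp(log_lik, axis=0) - np.log(S))
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    elpd = lppd - p_waic
    return elpd, -2.0 * elpd  # elpd_waic, and WAIC on the deviance scale

def pseudo_bma_weights(elpds):
    """Softmax of per-model elpd estimates: w_k proportional to exp(elpd_k)."""
    e = np.asarray(elpds, dtype=float)
    w = np.exp(e - e.max())  # subtract the max for numerical stability
    return w / w.sum()

# Hypothetical comparison of a "monotonic" and a "non-monotonic" model:
# the model with higher elpd (better out-of-sample fit) gets more weight.
rng = np.random.default_rng(1)
ll_mono = rng.normal(-1.0, 0.05, size=(400, 50))
ll_non = rng.normal(-1.5, 0.05, size=(400, 50))
elpd_mono, _ = waic(ll_mono)
elpd_non, _ = waic(ll_non)
weights = pseudo_bma_weights([elpd_mono, elpd_non])
```

In this sketch the inferential conclusion about dimensionality would be read off `weights`: a weight near 1 for the monotonic model favors a unidimensional latent space.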

### MSC:

- 62P15 Applications of statistics to psychology
- 62F15 Bayesian inference
- 60E15 Inequalities; stochastic orderings
- 62B10 Statistical aspects of information-theoretic topics

### Software:

brms

### References:

[1] Akaike, H., On the likelihood of a time series model, The Statistician, 27, 3/4, 217 (1978)

[2] Bamber, D., State-trace analysis: A method of testing simple theories of causation, Journal of Mathematical Psychology, 19, 2, 137-181 (1979)

[3] Betancourt, M. (2017). A conceptual introduction to Hamiltonian Monte Carlo. Retrieved from http://arxiv.org/abs/1701.02434

[4] Betancourt, M.; Girolami, M., Hamiltonian Monte Carlo for hierarchical models, (Upadhyay, S. K.; Singh, U.; Dey, D. K.; Loganathan, A., Current trends in Bayesian methodology with applications (2015), CRC Press), Retrieved from http://arxiv.org/abs/1312.0906

[5] Bürkner, P.-C., brms: An R package for Bayesian multilevel models using Stan, Journal of Statistical Software, 80, 1 (2017)

[6] Bürkner, P.-C., & Charpentier, E. (preprint). Monotonic effects: A principled approach for including ordinal predictors in regression models. PsyArXiv Preprints, 1-20. http://dx.doi.org/10.31234/OSF.IO/9QKHJ

[7] Davis-Stober, C. P.; Morey, R. D.; Gretton, M.; Heathcote, A., Bayes factors for state-trace analysis, Journal of Mathematical Psychology, 72, 116-129 (2016) · Zbl 1359.62499

[8] Duane, S.; Kennedy, A. D.; Pendleton, B. J.; Roweth, D., Hybrid Monte Carlo, Physics Letters B, 195, 2, 216-222 (1987)

[9] Dunn, J. C., The dimensionality of the remember-know task: A state-trace analysis, Psychological Review, 115, 2, 426-446 (2008)

[10] Dunn, J. C.; James, R. N., Signed difference analysis: Theory and application, Journal of Mathematical Psychology, 47, 4, 389-416 (2003) · Zbl 1077.91041

[11] Dunn, J. C.; Kalish, M. L., State-trace analysis (2018), Springer

[12] Dunn, J. C.; Kalish, M. L.; Newell, B. R., State-trace analysis can be an appropriate tool for assessing the number of cognitive systems: A reply to Ashby (2014), Psychonomic Bulletin & Review, 21, 4, 947-954 (2014)

[13] Dunn, J. C.; Kirsner, K., Discovering functionally independent mental processes: The principle of reversed association, Psychological Review, 95, 1, 91-101 (1988)

[14] Gelman, A., How do we choose our default methods?, (Past, present, and future of statistical science (2014), Chapman and Hall/CRC), 293-301

[15] Gelman, A.; Rubin, D. B., Inference from iterative simulation using multiple sequences, Statistical Science, 7, 4, 457-472 (1992) · Zbl 1386.65060

[16] Greene, W. H., Econometric analysis (2017), Pearson

[17] Jang, Y.; Lee, H.; Huber, D. E., How many dimensions underlie judgments of learning and recall redux: Consideration of recall latency reveals a previously hidden non-monotonicity, Journal of Mathematical Psychology (2019)

[18] Kalish, M. L.; Dunn, J. C.; Burdakov, O. P.; Sysoev, O., A statistical test of the equality of latent orders, Journal of Mathematical Psychology, 70, 1-11 (2016) · Zbl 1359.62502

[19] Kruschke, J. K. (2014). Doing Bayesian data analysis: A tutorial with R, JAGS, and Stan (2nd ed.). http://dx.doi.org/10.1016/B978-0-12-405888-0.09999-2

[20] Lewandowski, D.; Kurowicka, D.; Joe, H., Generating random correlation matrices based on vines and extended onion method, Journal of Multivariate Analysis, 100, 9, 1989-2001 (2009) · Zbl 1170.62042

[21] Loftus, G. R.; Oberg, M. A.; Dillon, A. M., Linear theory, dimensional theory, and the face-inversion effect, Psychological Review, 111, 4, 835-863 (2004)

[22] Macmillan, N. A.; Creelman, C. D., Detection theory: A user's guide (2005), Lawrence Erlbaum Associates: Mahwah, New Jersey

[23] Morey, R. D., Confidence intervals from normalized data: A correction to Cousineau (2005), Tutorials in Quantitative Methods for Psychology, 4, 2, 61-64 (2008)

[24] Neal, R. M. (1996). Bayesian learning for neural networks. http://dx.doi.org/10.1007/978-1-4612-0745-0 · Zbl 0888.62021

[25] Pan, K., An analytical expression for bivariate normal distribution, SSRN Electronic Journal (2017)

[26] Pratte, M. S.; Rouder, J. N., Assessing the dissociability of recollection and familiarity in recognition memory, Journal of Experimental Psychology: Learning, Memory, and Cognition, 38, 6, 1591-1607 (2012)

[27] Prince, M.; Brown, S.; Heathcote, A., The design and analysis of state-trace experiments, Psychological Methods, 17, 1, 78-99 (2012)

[28] Rouder, J. N.; Lu, J., An introduction to Bayesian hierarchical models with an application in the theory of signal detection, Psychonomic Bulletin & Review, 12, 4, 573-604 (2005)

[29] Sadil, P., Potter, K., Huber, D. E., & Cowell, R. A. (submitted). Connecting the dots without top-down knowledge: Evidence for the separability of levels within the visual processing hierarchy.

[30] Stan Development Team. (2017a). RStan: the R interface to Stan. Retrieved from http://mc-stan.org

[31] Stan Development Team. (2017b). Stan modeling language users guide and reference manual. Retrieved from http://mc-stan.org

[32] Tsuchiya, N.; Koch, C., Continuous flash suppression reduces negative afterimages, Nature Neuroscience, 8, 8, 1096-1101 (2005)

[33] Vehtari, A., Gabry, J., Yao, Y., & Gelman, A. (2018). loo: Efficient leave-one-out cross-validation and WAIC for Bayesian models. Retrieved from https://cran.r-project.org/package=loo · Zbl 1505.62409

[34] Vehtari, A.; Gelman, A.; Gabry, J., Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC, Statistics and Computing, 27, 5, 1413-1432 (2017) · Zbl 1505.62408

[35] Wagenmakers, E.; Farrell, S., AIC model selection using Akaike weights, Psychonomic Bulletin & Review, 11, 1, 192-196 (2004)

[36] Watanabe, S., Asymptotic equivalence of Bayes cross-validation and widely applicable information criterion in singular learning theory, Journal of Machine Learning Research, 11, 3571-3594 (2010) · Zbl 1242.62024

[37] Yao, Y.; Vehtari, A.; Simpson, D.; Gelman, A., Using stacking to average Bayesian predictive distributions (with discussion), Bayesian Analysis, 13, 3, 917-1003 (2018) · Zbl 1407.62090

This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.