Conference Paper 2020

The Psychology of Human Entropy Intuitions

A variety of conceptualizations of psychological uncertainty exist. From an information-theoretic perspective, probabilistic uncertainty can be formalized as mathematical entropy. Cognitive emotion theories posit that uncertainty appraisals and motivation to reduce uncertainty are modulated by emotional state. Yet little is known about how people evaluate probabilistic uncertainty, and about how emotional state modulates people's evaluations of probabilistic uncertainty and behavior to reduce probabilistic uncertainty. We tested intuitive entropy evaluations and entropy reduction strategies across four emotion conditions in the Entropy Mastermind game. We used the unified Sharma-Mittal space of entropy measures to quantify participants' entropy evaluations. Results suggest that many people use a heuristic strategy, focusing on the number of possible outcomes, irrespective of the probabilities in the probability distribution. This result is surprising, given that previous work suggested that people are very sensitive to the maximum probability when choosing queries on probabilistic classification tasks. Emotion induction generally increased participants' heuristic assessment. The uncertainty associated with emotional states also affected game play: participants needed fewer queries and spent less time on games in high-uncertainty than in low-uncertainty emotional states. Yet entropy perceptions were not related to subjectively reported uncertainty, numeracy or entropy knowledge, suggesting that entropy perceptions may form an independent psychological construct.
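The Sharma-Mittal family referenced in the abstract is a two-parameter generalization that unifies Shannon, Rényi, and Tsallis entropies. The following is a minimal illustrative sketch, not code from the paper; the parameter names `r` (order) and `t` (degree) follow common conventions for this family:

```python
import math

def sharma_mittal(p, r, t, eps=1e-9):
    """Sharma-Mittal entropy of a probability distribution p (natural log units).

    H_{r,t}(p) = [ (sum_i p_i^r)^((1-t)/(1-r)) - 1 ] / (1-t)

    Special cases: r,t -> 1 gives Shannon; t -> 1 gives Renyi; r = t gives Tsallis.
    """
    s = sum(pi ** r for pi in p if pi > 0)
    shannon = -sum(pi * math.log(pi) for pi in p if pi > 0)
    if abs(r - 1) < eps and abs(t - 1) < eps:
        return shannon                          # Shannon limit
    if abs(t - 1) < eps:
        return math.log(s) / (1 - r)            # Renyi limit
    if abs(r - 1) < eps:
        return (math.exp((1 - t) * shannon) - 1) / (1 - t)  # r -> 1 limit
    return (s ** ((1 - t) / (1 - r)) - 1) / (1 - t)

# For a uniform distribution over 4 outcomes, the Shannon case gives ln 4,
# and the Tsallis case (r = t = 2) gives (1 - sum p^2) / (t - 1) = 0.75.
uniform = [0.25] * 4
print(sharma_mittal(uniform, 1, 1))  # Shannon
print(sharma_mittal(uniform, 2, 2))  # Tsallis, order/degree 2
```

Varying `r` and `t` shifts how much the measure weights the number of possible outcomes versus the shape of the distribution, which is what lets the authors locate participants' intuitive entropy judgments within this space.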

Author(s): Bertram, L and Schulz, E and Hofer, M and Nelson, JD
Book Title: 42nd Annual Meeting of the Cognitive Science Society (CogSci 2020): Developing a Mind: Learning in Humans, Animals, and Machines
Pages: 1457--1463
Year: 2020
Publisher: Curran
Bibtex Type: Conference Paper (inproceedings)
Address: Toronto, Canada
DOI: 10.23668/psycharchives.3016
Electronic Archiving: grant_archive

BibTeX

@inproceedings{item_3238549,
  title = {{The Psychology of Human Entropy Intuitions}},
  booktitle = {{42nd Annual Meeting of the Cognitive Science Society (CogSci 2020): Developing a Mind: Learning in Humans, Animals, and Machines}},
  abstract = {{A variety of conceptualizations of psychological uncertainty exist. From an information-theoretic perspective, probabilistic uncertainty can be formalized as mathematical entropy. Cognitive emotion theories posit that uncertainty appraisals and motivation to reduce uncertainty are modulated by emotional state. Yet little is known about how people evaluate probabilistic uncertainty, and about how emotional state modulates people's evaluations of probabilistic uncertainty and behavior to reduce probabilistic uncertainty. We tested intuitive entropy evaluations and entropy reduction strategies across four emotion conditions in the Entropy Mastermind game. We used the unified Sharma-Mittal space of entropy measures to quantify participants' entropy evaluations. Results suggest that many people use a heuristic strategy, focusing on the number of possible outcomes, irrespective of the probabilities in the probability distribution. This result is surprising, given that previous work suggested that people are very sensitive to the maximum probability when choosing queries on probabilistic classification tasks. Emotion induction generally increased participants' heuristic assessment. The uncertainty associated with emotional states also affected game play: participants needed fewer queries and spent less time on games in high-uncertainty than in low-uncertainty emotional states. Yet entropy perceptions were not related to subjectively reported uncertainty, numeracy or entropy knowledge, suggesting that entropy perceptions may form an independent psychological construct.}},
  pages = {1457--1463},
  publisher = {Curran},
  address = {Toronto, Canada},
  year = {2020},
  slug = {item_3238549},
  author = {Bertram, L and Schulz, E and Hofer, M and Nelson, JD}
}