Associative learning shapes visual discrimination in a web-based classical conditioning task

Sci Rep. 2021 Aug 3;11(1):15762. doi: 10.1038/s41598-021-95200-6.

Abstract

Threat detection plays a vital role in adapting behavior to changing environments. A fundamental mechanism for improving threat detection is learning to differentiate between stimuli that predict danger and stimuli that predict safety. Accordingly, aversive learning should lead to enhanced sensory discrimination of danger and safety cues. However, psychophysical studies of visual and auditory perception after aversive learning show divergent findings, with both enhanced and impaired discrimination having been reported. Therefore, the aim of this web-based study was to examine the impact of aversive learning on a continuous measure of visual discrimination. To this end, 205 participants completed a visual discrimination task using differently oriented grating stimuli before and after undergoing a differential fear conditioning paradigm. Participants saw either unpleasant or neutral pictures as unconditioned stimuli (US). Results demonstrated sharpened visual discrimination for the US-associated conditioned stimulus (CS+), but not for the unpaired conditioned stimulus (CS-). Importantly, this effect emerged irrespective of the valence of the US. These findings suggest that associative learning increases stimulus salience, which facilitates perceptual discrimination in order to prioritize attentional deployment.
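The design implies comparing discrimination performance around the CS+ and CS- orientations before and after conditioning. The following is a minimal, illustrative sketch (not the authors' analysis code) of how such a pre/post comparison could be quantified, assuming a simple accuracy-based discrimination measure and hypothetical trial data:

    # Illustrative sketch only: the data layout, column meanings, and the
    # accuracy-based measure are assumptions for demonstration, not the
    # published analysis pipeline.
    import numpy as np

    def discrimination_accuracy(responses, correct):
        """Proportion of correct orientation judgments."""
        return float(np.mean(np.asarray(responses) == np.asarray(correct)))

    def learning_effect(pre_trials, post_trials):
        """Post- minus pre-conditioning discrimination accuracy per CS type.

        pre_trials / post_trials: dicts mapping 'CS+' / 'CS-' to
        (responses, correct_answers) tuples from the discrimination task.
        """
        return {
            cs: discrimination_accuracy(*post_trials[cs])
                - discrimination_accuracy(*pre_trials[cs])
            for cs in ('CS+', 'CS-')
        }

    # Toy data: discrimination around the CS+ orientation improves after
    # conditioning, while the CS- orientation stays at baseline.
    pre = {'CS+': ([1, 0, 1, 0], [1, 1, 1, 1]),
           'CS-': ([1, 0, 1, 0], [1, 1, 1, 1])}
    post = {'CS+': ([1, 1, 1, 0], [1, 1, 1, 1]),
            'CS-': ([0, 1, 1, 0], [1, 1, 1, 1])}
    print(learning_effect(pre, post))  # {'CS+': 0.25, 'CS-': 0.0}

In this toy example, a positive change for the CS+ together with a near-zero change for the CS- would correspond to the selective sharpening of discrimination reported in the abstract.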

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Association Learning / physiology*
  • Brain Mapping
  • Conditioning, Classical / physiology*
  • Discrimination, Psychological / physiology*
  • Female
  • Humans
  • Internet / statistics & numerical data*
  • Male
  • Visual Perception / physiology*
  • Young Adult