<div dir="ltr"><p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><b>Call for Papers in <i>Attention, Perception, &
Psychophysics</i> Special Issue</b><b><span lang="EN-US">-</span> Neural underpinnings of
attention in the real world: Co-registration of eye movements and EEG</b></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><b> </b></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif">Visual attention is critical for many real-world cognitive
tasks that have important consequences for our daily lives (e.g., reading,
visual search, scene perception). Eye movements play a critical role in such
active tasks, but historically, cognitive neuroscience methods (e.g., EEG<span lang="EN-US">/MEG</span>) have required research
participants to refrain from moving their eyes to reduce artifacts and
confounds in neural measurements. However, restricting eye movements limits our
inferential capabilities, especially with respect to understanding the
real-world implications that motivate the research in the first place. Recent
technological innovations are helping to alleviate this problem. One major
innovation is the ability to co-register brain activity (e.g., EEG<span lang="EN-US"> or MEG</span>) to eye movement
behavior (via eye tracking) as people freely move their eyes. Such technical
innovation allows for the simultaneous study of behavioral and neural measures
of visual and cognitive processes in naturalistic free-viewing scenarios,
moving beyond the constraints of traditional laboratory paradigms. However,
there are both technical and conceptual challenges underlying the use of these
methods. Moreover, because much of the prior eye movement and
electrophysiological research has been developed in largely independent
research areas – each with their own theories, foci, and best practices – there
remains a major challenge in integrating these long-siloed domains.</p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"> </p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif">The goal of this special issue is to feature
the latest empirical, theoretical, and methodological work on eye-movement and
EEG co-registration in cognitive science. We particularly invite contributions
that take a naturalistic approach to active vision and/or aim to bridge the
“attention” gap across multiple research areas (e.g., studying commonalities
and differences in visual <span lang="EN-US">processing
and </span>attention across reading and scene perception). Contributions may
include:</p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif">A. Original empirical research using co-registration methods</p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif">B. Original empirical research using a single methodology
but that has direct implications for (or challenges to) co-registration
research</p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif">C. Opinionated review of research with implications for co-registration
research</p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif">D. Methodological or “best practices” contributions for
conducting and reporting co-registration research</p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"> </p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif">All submissions will undergo normal, full peer review,
maintaining the same high editorial standards for regular submissions to <i>Attention,
Perception, & Psychophysics</i>. The deadline for submissions
is March 15, 2023 with a target publication date of December,
2023. We invite those interested in a possible submission to contact one
of the editors: Elizabeth Schotter, Brennan Payne, or David Melcher.</p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"> </p></div>