<div dir="ltr"><p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-family:Arial,sans-serif;font-size:10pt">The ERC project “</span><b style="font-family:Arial,sans-serif;font-size:10pt">CRACK- Cracking the Neural Code of
Human Object Vision”</b><span style="font-family:Arial,sans-serif;font-size:10pt"> addresses open questions of where, when and how human
neural activity enables visual cognition.</span><br></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> </span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">We are offering <b>three PostDoc positions</b> in <b>Radek
Cichy’s </b></span><a href="https://www.ewi-psy.fu-berlin.de/en/einrichtungen/arbeitsbereiche/neural_dyn_of_vis_cog/index.html" style="color:rgb(5,99,193)" target="_blank"><b><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">Neural Dynamics of Visual Cognition</span></b><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">
<b>Group</b></span></a><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> at the Department of Education and
Psychology.</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> </span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">The positions cover <b>two research directions</b>:</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> </span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><strong><i><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">1) 7T Layer-fMRI for resolving information
flow (Reference code: CICHY-ERC-7T)</span></i></strong><span style="font-family:Arial,sans-serif"></span></p>
<p style="margin:0cm 0cm 8.25pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:"Times New Roman",serif"><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">We will distinguish, map and understand the role
of feedforward and feedback information flow in the human cortex for visual
cognition. We will do this by resolving neural activity in space at the level
of cortical layers using 7T fMRI, and in time through time-frequency
decomposition of M/EEG data (e.g. </span><span style="color:black"><a href="https://doi.org/10.1016/j.cub.2020.04.074" title="Siying's paper" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Xie et al. 2020, </span><em><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204);text-decoration-line:none">Curr Biol</span></em></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">), and their integration (</span><span style="color:black"><a href="https://doi.org/10.1038/nn.3635" title="Nat Neuro original paper" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Cichy et
al., 2014 </span><em><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204);text-decoration-line:none">Nat Neuro</span></em></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(102,102,102)">; </span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">summarized in </span><span style="color:black"><a href="https://doi.org/10.1016/j.neuron.2020.07.001" title="Neuron review" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Cichy
& Oliva, 2020 </span><em><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204);text-decoration-line:none">Neuron</span></em></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">).</span><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(102,102,102)"></span></p>
<p style="margin:0cm 0cm 8.25pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:"Times New Roman",serif"><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">This work builds on an established
collaboration with <a href="https://www.cbs.mpg.de/mitarbeiter/weiskopf">Nik Weiskopf</a> (MPI for Human Cognitive and Brain
Sciences) and </span><span style="color:black"><a href="https://www.bmmr.ovgu.de/mm/en/" title="Oliver's webpage" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Oliver
Speck</span></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(102,102,102)"> </span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">(LIN/OVGU).</span><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(102,102,102)"></span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><strong><i><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">2) Deep learning models of visual cortex (Reference code:
CICHY-ERC-deep-learn)</span></i></strong><span style="font-family:Arial,sans-serif"></span></p>
<p style="margin:0cm 0cm 8.25pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:"Times New Roman",serif"><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">We will develop, compare and use artificial
neural networks as models to better understand the neural code for visual content. An example of foundational work is</span><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(102,102,102)"> </span><span style="color:black"><a href="https://www.nature.com/articles/srep27755" title="DNN brain comparison" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Cichy et
al., 2016 </span><em><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204);text-decoration-line:none">Sci Rep</span></em></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">; a high-level perspective on our approach is summarized in</span><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(102,102,102)"> </span><span style="color:black"><a href="https://doi.org/10.1016/j.tics.2019.01.009" title="TICS review on models" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Cichy
& Kaiser 2019, </span><em><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204);text-decoration-line:none">Trends
Cogn Sci</span></em></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">.</span></p>
<p style="margin:0cm 0cm 8.25pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:"Times New Roman",serif"><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">This work is likewise part of an established collaboration with </span><span style="color:black"><a href="https://scholar.google.com/citations?user=FNhl50sAAAAJ&hl=en" title="Aude's GS page" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Aude Oliva</span></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black"> (CSAIL/MIT) and </span><span style="color:black"><a href="http://www.cvai.cs.uni-frankfurt.de/index.html" title="Gemma's webpage" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Gemma
Roig</span></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(102,102,102)"> </span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">(Goethe University Frankfurt). An example of
collaborative work in this spirit is the</span><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(102,102,102)"> </span><span style="color:black"><a href="http://algonauts.csail.mit.edu/" title="Algonauts Website" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">Algonauts
Project</span></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black">, which will continue as part
of </span><span style="color:black"><a href="https://ccneuro.org/2020/" title="CCN webpage" style="color:rgb(5,99,193)" target="_blank"><span style="font-size:10pt;font-family:Arial,sans-serif;color:rgb(0,102,204)">CCN</span></a></span><span style="font-size:10pt;font-family:Arial,sans-serif;color:black"> in 2021.</span></p><p style="margin:0cm 0cm 8.25pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:"Times New Roman",serif"><span style="font-size:10pt;font-family:Arial,sans-serif;color:black"><br></span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">Funding is available immediately and runs until
30 April 2024 at the latest (the end of the funding period).</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> </span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">Please find <b>further details</b> about the positions,
the lab in general, and how to apply </span><a href="https://www.ewi-psy.fu-berlin.de/en/einrichtungen/arbeitsbereiche/neural_dyn_of_vis_cog/ERC-postdoc-call/index.html" style="color:rgb(5,99,193)" target="_blank"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">here</span></a><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">.</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> </span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">The application deadline is September 27<sup>th</sup>.
For informal inquiries, please contact </span><a href="mailto:rmcichy@zedat.fu-berlin.de" style="color:rgb(5,99,193)" target="_blank"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">me</span></a><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> directly.</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> </span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">Best,</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">Radek</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif"> </span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">ERC StG / Emmy Noether Group Leader</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="EN-US" style="font-size:10pt;font-family:Arial,sans-serif">Department of Education and Psychology</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="DE" style="font-size:10pt;font-family:Arial,sans-serif">Freie Universität Berlin</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="DE" style="font-size:10pt;font-family:Arial,sans-serif">Germany</span></p>
<p class="MsoNormal" style="margin:0cm;font-size:12pt;font-family:Calibri,sans-serif"><span lang="DE" style="font-size:10pt;font-family:Arial,sans-serif"><a href="mailto:rmcichy@zedat.fu-berlin.de" style="color:rgb(5,99,193)" target="_blank">rmcichy@zedat.fu-berlin.de</a></span></p>
</div>