<div dir="ltr"><div class="gmail_default" style="font-family:trebuchet ms,sans-serif">Dear All,</div><div class="gmail_default" style="font-family:trebuchet ms,sans-serif"><br></div><div class="gmail_default" style="font-family:trebuchet ms,sans-serif">Our lab is hiring a full-time RA position in an NIH-funded project on computational modeling of human material perception. </div><div class="gmail_default" style="font-family:trebuchet ms,sans-serif"><br></div><div class="gmail_default" style="font-family:trebuchet ms,sans-serif">Please see the full ad below and I am happy to answer any questions.</div><div class="gmail_default" style="font-family:trebuchet ms,sans-serif"><br>Best,</div><div class="gmail_default" style="font-family:trebuchet ms,sans-serif"><br>Bei </div><div class="gmail_default" style="font-family:trebuchet ms,sans-serif"><br></div><div class="gmail_default" style="font-family:trebuchet ms,sans-serif"><span id="gmail-docs-internal-guid-f0a4341d-7fff-00f9-aac0-b38ed2d63bec"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Garamond,serif;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> </span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Position Overview</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> </span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">The </span><a href="https://sites.google.com/site/beixiao/" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline">Xiao Computational Perception Lab</span></a><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> in the </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Department of Computer Science</span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> at American University is seeking a full-time Research Assistant/ Lab Technician for an NIH-funded project on the computational modeling of human intuitive physics, material  perception, and immersive material perception in VR/AR.</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span 
style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> </span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Job Description  </span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">The RA is to pursue research projects of his/her own as well as provide support for research carried out in the Xiao lab. </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Possible duties include:</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">  </span></p><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Python coding for experiment interface and behavioral data analysis</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Collecting data for psychophysical experiments</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Building VR/AR experimental interfaces with Unity3D</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span 
style="font-size:12pt;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Training machine learning models</span></p></li></ul><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">This is an ideal position for someone interested in gaining research experience in perception science and computational modeling before applying to graduate school or an industrial research position. </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">The position comes with a salary and full benefits. Full-time staff at AU can take computer science courses for tuition remission.  This position is initially for a </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">one-year contract and can be extended</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">. Starting date is</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline"> September 1st, 2023, or soon after</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">. </span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Position Requirements:</span></p><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">The ideal candidate should have a Bachelor's degree in neuroscience, psychology, computer science, engineering, or a related field. 
</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">The candidate should have strong programming skills in Python and is familiar with Numpy, Pandas, and other numerical libraries.  Having experience deep learning with PyTorch is a plus.  </span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Experience with statistical methods (linear models, multivariate analysis, etc.).</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Experience with psychophysics research is not required but would be useful. </span></p></li></ul><br><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">The Lab and Facility</span></p><p dir="ltr" style="line-height:1.2;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Xiao Lab studies both human and computer vision with an emphasis on material perception and recognition. 

The Lab and Facility

The Xiao Lab studies both human and computer vision, with an emphasis on material perception and recognition. The lab currently has several ongoing research projects:

- Learning latent representations of human perception of material properties
- Material and object perception in infants and children with behavioral and EEG methods (in collaboration with Dr. Laurie Bayet, https://www.american.edu/profiles/faculty/bayet.cfm)
- Volumetric Capture Studio
- Uncertainty estimation in few-shot learning for text classification
- Prediction of clinical trial outcomes with human experts and machine learning models

The Xiao Lab is located in a state-of-the-art technology building, which is home to computer science, physics, applied math, and a design-and-build lab. The lab has high-performance GPU workstations, haptic Phantom devices, VR headsets, and 3D printers.

Washington, DC, is the US capital and has a vibrant computational cognition and computer vision research scene (e.g., NIH, NIST, Johns Hopkins University, George Washington University, and the University of Maryland).
</span></p><p dir="ltr" style="line-height:1.2;margin-top:0pt;margin-bottom:0pt"><br></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">How to apply</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Please submit your application, including a CV,  and a cover letter describing your background, experience, and motivation - preferably in PDF format, and the names of two references that have agreed to be contacted. Please apply </span><span style="font-size:12pt;font-family:Arial;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">to </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,255);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Prof. Bei Xiao at </span><a href="mailto:bxiao@american.edu" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline">bxiao@american.edu</span></a><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,255);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">.</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Representative Recent Publications:  </span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><br></p><p dir="ltr" style="line-height:1.38;margin-left:36pt;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">1.</span><span style="font-size:7pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> </span><span style="font-size:7pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"><span class="gmail-Apple-tab-span" style="text-wrap: nowrap;">        </span></span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Liao, C, Sawayama, M, </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Xiao, B.</span><span 
style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">  (2023) </span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Unsupervised learning reveals interpretable latent representations for translucency perception</span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">. </span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">PLOS Computational Biology</span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">. Feb 8, 2023. </span><a href="https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1010878" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,255);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline">PDF.</span></a><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> </span></p><p dir="ltr" style="line-height:1.38;margin-left:36pt;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">2.       Zhang, X, Lei, S, Alhamadni, A, Chen, F, </span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Xiao, B</span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">, and Lu, CT. (2023) CLUR: Uncertainty Estimation for Few-Shot Text Classification with Contrastive Learning.  </span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">ACM SIGKDD</span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> 2023.  PDF upon request. 
</span></p><p dir="ltr" style="line-height:1.38;margin-left:36pt;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">2.</span><span style="font-size:7pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> </span><span style="font-size:7pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"><span class="gmail-Apple-tab-span" style="text-wrap: nowrap;">  </span></span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Liao, C, Sawayama, M, </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Xiao, B. </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> (2022) Crystal or Jelly? Effect of Color on the Perception of Translucent Materials with Photographs of Real-world Objects</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">.</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Journal of Vision</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">. 
</span><a href="https://jov.arvojournals.org/Article.aspx?articleid=2778489" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,255);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline">PDF</span></a><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">.</span></p><p dir="ltr" style="line-height:1.38;margin-left:36pt;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">3.</span><span style="font-size:7pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"> </span><span style="font-size:7pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline"><span class="gmail-Apple-tab-span" style="text-wrap: nowrap;"> </span></span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">He, J. Zhang, X., Shuo L. Wang, S, Huang, Q., Lu, C-T, </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Xiao, B</span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">. (2022) Semantic Editing On Segmentation Map Via Multi-Expansion Loss. </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline">Neurocomputing. 501,306-317. 
</span><a href="https://arxiv.org/abs/2010.08128" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,255);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline">PDF.</span></a></p></span><br class="gmail-Apple-interchange-newline"></div><div><br></div><span class="gmail_signature_prefix">-- </span><br><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div><font face="garamond, serif" size="2">Bei Xiao, PhD<br>Associate Professor </font><div><font face="garamond, serif" size="2">Computer Science & Center for Behavioral Neuroscience</font></div><div><font face="garamond, serif" size="2">American University, Washington DC</font></div><div><font face="garamond, serif" size="2"><br></font></div><div><font><font face="garamond, serif" size="2">Homepage: <a href="https://sites.google.com/site/beixiao/" style="color:rgb(17,85,204)" target="_blank">https://sites.google.com/site/beixiao/</a></font><br></font></div></div><div><br></div><div><br></div><div><br></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div>