[visionlist] Software for automatic detection of face touching

John Neuhoff JNEUHOFF at wooster.edu
Sat Mar 21 17:58:08 -04 2020


There's an app for that (and a hardware wristband as well). It's the "Immutouch" wristband:

https://techcrunch.com/2020/03/09/dont-immutouch/


___________________________
John G. Neuhoff
Professor, Department of Psychology
The College of Wooster
Chair, Auditory Perception & Cognition Society
http://jneuhoff.com

________________________________
From: visionlist <visionlist-bounces at visionscience.com> on behalf of visionlist-request at visionscience.com <visionlist-request at visionscience.com>
Sent: Saturday, March 21, 2020 2:15 PM
To: visionlist at visionscience.com <visionlist at visionscience.com>
Subject: visionlist Digest, Vol 39, Issue 18

Send visionlist mailing list submissions to
        visionlist at visionscience.com

To subscribe or unsubscribe via the World Wide Web, visit
        http://visionscience.com/mailman/listinfo/visionlist_visionscience.com

or, via email, send a message with subject or body 'help' to
        visionlist-request at visionscience.com

You can reach the person managing the list at
        visionlist-owner at visionscience.com

When replying, please edit your Subject line so it is more specific
than "Re: Contents of visionlist digest..."


Today's Topics:

   1. COVID-19 and software for automatic detection of face
      touching (James Pomerantz)
   2. Trying to demonstrate stereoscopic vision remotely
      (Lester Loschky)
   3. Re: [cvnet] Trying to demonstrate stereoscopic vision
      remotely (Maarten Wijntjes - IO)
   4. Re: [cvnet] Trying to demonstrate stereoscopic vision
      remotely (Lester Loschky)
   5. IMRF 2020 postponed to 16.-19. Oct. 2020 (Marc Ernst)
   6. Re: [cvnet] Trying to demonstrate stereoscopic vision
      remotely (Ben Backus)


----------------------------------------------------------------------

Message: 1
Date: Fri, 20 Mar 2020 17:20:10 -0500
From: James Pomerantz <pomeran at rice.edu>
To: visionlist at visionscience.com
Subject: [visionlist] COVID-19 and software for automatic detection of
        face touching
Message-ID: <5372163b-fd5e-62eb-5aae-2887212a49a6 at rice.edu>
Content-Type: text/plain; charset="utf-8"; Format="flowed"

Dear Colleagues,

Can someone put me in contact with any researchers who might know about
smartphone software capable of detecting hands touching the face? That
is certainly a primary pathway for the spread of COVID-19. I'm working
with behavioral scientists who hope to extinguish face touching through
established techniques such as operant conditioning. Software that
automated detection could speed up their work considerably. Please
forward contact information for anyone you think might help. This is
obviously a matter of urgency if we are to slow the pandemic
transmission rate and flatten the curve.
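(The core of such software is simple once a tracker supplies landmarks. A minimal sketch of the detection logic, assuming hand and face landmark coordinates are already provided by some hand/face tracker; the function name and threshold are hypothetical, not any particular app's API:

```python
import math

def touches_face(hand_points, face_points, threshold=30.0):
    """Flag a face touch when any tracked hand landmark comes within
    `threshold` (here, pixels) of any tracked face landmark.

    hand_points, face_points: iterables of (x, y) tuples, as produced
    by whatever hand/face tracker the phone app uses (assumed input).
    """
    for hx, hy in hand_points:
        for fx, fy in face_points:
            if math.hypot(hx - fx, hy - fy) <= threshold:
                return True
    return False
```

In practice the threshold would be calibrated per camera and user, and a real app would debounce over several frames before logging a touch event.)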

Thanks, Jim Pomerantz, pomeran at rice.edu.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200320/632bfc19/attachment-0001.html>

------------------------------

Message: 2
Date: Fri, 20 Mar 2020 21:45:14 -0500
From: Lester Loschky <loschky at ksu.edu>
To: visionlist at visionscience.com, cvnet <cvnet at mail.ewind.com>
Subject: [visionlist] Trying to demonstrate stereoscopic vision
        remotely
Message-ID:
        <CAF+FnwvKpQz2GvcLSLS1GqMK5TxutG1-bYcDrrF4rLA88dQBfA at mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

Hi Everybody,

If you teach Sensation and Perception, and are currently preparing to teach
it remotely, you may have the same question I have: how can you demonstrate
stereovision remotely?

As preface, the following are methods I have used in in-person classes to
demonstrate stereo vision:

   1. an actual stereoscope and example stereoimages to share with students
   (including the classic Julesz square tile random-dot-stereogram image)
   2. example stereoscopic lenticular lens images to share with students
   3. red/green anaglyph images with sets of cardboard & plastic red/green
   anaglyph glasses
   4. Google Cardboard plus cell phone to share with students
   5. random dot autostereographic images
   6. touching two pen tips together using two eyes versus one eye
   7. learning about crossed vs. uncrossed disparity using two fingers at
   different distances

Unfortunately, my students don't uniformly have access to the apparatuses
required for 1-4 above.
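(One workaround for #1: the Julesz random-dot stereogram itself is easy to generate, so students with any free-fusion or cross-fusion trick can make their own pairs. A sketch assuming numpy; the sizes and disparity are arbitrary choices:

```python
import numpy as np

def random_dot_pair(size=200, square=60, shift=6, seed=0):
    """Generate a Julesz-style random-dot stereogram pair.

    The left image is random black/white dots; the right image is a
    copy with a central square region shifted horizontally by `shift`
    pixels, with the strip the shift uncovers refilled by fresh random
    dots. Fused, the square appears at a different depth than the
    surround, though neither image alone contains any visible shape.
    """
    rng = np.random.default_rng(seed)
    left = rng.integers(0, 2, (size, size), dtype=np.uint8) * 255
    right = left.copy()
    lo = (size - square) // 2
    hi = lo + square
    # shift the central square sideways in the right eye's image
    right[lo:hi, lo - shift:hi - shift] = left[lo:hi, lo:hi]
    # fill the uncovered strip with new random dots
    right[lo:hi, hi - shift:hi] = rng.integers(
        0, 2, (square, shift), dtype=np.uint8) * 255
    return left, right
```

The two images can then be saved side by side for free fusion, or fed into an anaglyph compositor for students who have glasses.)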

Re. # 3 (red/green anaglyph images), I've thought of having students order
a single pair of red/green anaglyph glasses online.  However, it appears
that the cardboard and plastic ones can only be purchased in bulk. (I guess
they're too cheap to sell individually.)  They also might not arrive in
time, but students could still enjoy them once they get them.
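(For students who do get glasses, composing an anaglyph from any stereo pair is a one-liner worth showing. A sketch assuming numpy and a left/right RGB pair; this is the common red/cyan scheme, which also works tolerably with red/green glasses:

```python
import numpy as np

def make_anaglyph(left, right):
    """Compose a red/cyan anaglyph from a stereo pair.

    left, right: H x W x 3 uint8 RGB arrays of the same shape.
    The red channel comes from the left eye's image; green and blue
    come from the right eye's, so the colored filters route each
    view to the intended eye.
    """
    out = right.copy()
    out[..., 0] = left[..., 0]  # red channel from the left view
    return out
```

Grayscale source images work best, since color content in the originals produces retinal rivalry.)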

Re. #4 (Google Cardboard), I recall getting a free Google Cardboard from
the NYTimes several years ago.  However, they are now no cheaper than $5
(Irisu, of India), and likely wouldn't arrive in time.

Regarding option #5 (random-dot autostereograms), I have found that since
seeing random dot autostereographic images in depth requires perceptual
learning, a large proportion of students don't manage to learn (within the
short time period given in a single class period).  (Of course, many
students may have a lot of time on their hands now, so they might keep at
it long enough to learn to perceive them.  But there will definitely be a
good proportion of students who don't try long enough to learn, and so
don't get it.)

#6 (touching two pen tips together) is definitely something that can be
done remotely.  However, it doesn't have the "Wow!" factor of other
demonstrations.  It is more of an "oh, really..." experience to realize how
much worse you are with one eye than two.

#7 (using two fingers at different distances to teach crossed vs. uncrossed
disparity) can definitely be done remotely.  It is very educational, but
again does not have the "Wow" factor.

There is also the finger "hot dog" illusion, which can be done remotely.
It is interesting, but quite different from all of the others in that
stereoscopic depth perception is not involved.

For the related phenomenon of motion parallax, "wiggle vision" is a very
nice demonstration:
http://www.well.com/user/jimg/stereo/stereo_gate.html
https://www.3dwiggle.com/2016/06/28/5-wigglegrams-you-need-to-see-before-you-die/

Of course, depth perception from motion parallax is importantly
theoretically related to stereoscopic vision (both involve two different
images from two different views, one seen over time (and only needing one
eye)--motion parallax--and the other seen simultaneously (and requiring two
eyes)--stereovision).  But it is not the same as stereoscopic vision, so is
a separate but related issue.

For the related phenomenon of binocular disparity, there is the famous
"hole in your hand" illusion using a cardboard paper towel roll.  If
students have a spare cardboard paper towel roll, they can do this
remotely.  But, again, it is a theoretically related but separate issue.

Any other suggestions would be appreciated.

Best wishes,

Les
--
Lester Loschky
Professor
Associate Director, Cognitive and Neurobiological Approaches to Plasticity
Center
Department of Psychological Sciences
471 Bluemont Hall
1114 Mid-Campus Dr North
Kansas State University
Manhattan, KS  66506-5302
email: loschky at ksu.edu
research page: https://www.k-state.edu/psych/research/loschkylester.html
lab page: http://www.k-state.edu/psych/vcl/index.html
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200320/0d350711/attachment-0001.html>

------------------------------

Message: 3
Date: Sat, 21 Mar 2020 08:21:10 +0000
From: Maarten Wijntjes - IO <M.W.A.Wijntjes at tudelft.nl>
To: Christopher Tyler <cwtyler2020 at gmail.com>
Cc: Lester Loschky <loschky at ksu.edu>, "visionlist at visionscience.com"
        <visionlist at visionscience.com>, cvnet <cvnet at mail.ewind.com>
Subject: Re: [visionlist] [cvnet] Trying to demonstrate stereoscopic
        vision remotely
Message-ID: <D068C7CD-6A99-4E2D-A2C4-C33968287944 at tudelft.nl>
Content-Type: text/plain; charset="utf-8"

In addition, it could be the right moment to discuss monocular stereopsis. Check Ames' publication about this; he offers various solutions, not all easily achievable at home, but looking through an aperture should work. Of course a synopter (not mentioned by Ames) would have been great, so after the pandemic you should order some with me ;). What is cool and doable is looking through a make-up mirror at a picture on your smartphone (or a postcard, if available). It gives the effect of a snapscope and is related to the zograscope and graphoscope (which you can also simulate by looking through a big loupe).

Of course, the Pulfrich effect is easily demonstrated if students have sunglasses at hand.

About the 3D wiggle: I remember once seeing a museum website where visitors could help convert the 3D photo collection into GIFs showing the wiggle effect. It is quite essential to get the alignment right; otherwise you get headache-inducing GIFs. But I could not find the website; it would be a nice pastime now.

Good luck!
Maarten

References
Ames, A. (1925). The illusion of depth from single pictures. JOSA, 10(2), 137-148.

Vishwanath, D., & Hibbard, P. B. (2013). Seeing in 3-D with just one eye: Stereopsis without binocular vision. Psychological science, 24(9), 1673-1685.

Koenderink, J.J., Wijntjes, M.W.A., & van Doorn, A.J. (2013). Zograscopic viewing. i-Perception, 4(3), 192-206.


On 21 Mar 2020, at 05:50, Christopher Tyler via cvnet <cvnet at lawton.ewind.com<mailto:cvnet at lawton.ewind.com>> wrote:

You could try autostereograms, such as the examples on my Scholarpedia page <http://www.scholarpedia.org/article/Autostereogram>.  Not everyone can get them, but they're pretty effective for those who can.

All the best,
Christopher

On Fri, Mar 20, 2020 at 9:35 PM Lester Loschky <loschky at ksu.edu<mailto:loschky at ksu.edu>> wrote:
Hi Everybody,

If you teach Sensation and Perception, and are currently preparing to teach it remotely, you may have the same question I have: how can you demonstrate stereovision remotely?

...
_______________________________________________
cvnet mailing list
cvnet at lawton.ewind.com<mailto:cvnet at lawton.ewind.com>
https://lawton.ewind.com/mailman/listinfo/cvnet


-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200321/1c611e08/attachment-0001.html>

------------------------------

Message: 4
Date: Sat, 21 Mar 2020 01:06:15 -0500
From: Lester Loschky <loschky at ksu.edu>
To: Christopher Tyler <cwtyler2020 at gmail.com>
Cc: visionlist at visionscience.com, cvnet <cvnet at mail.ewind.com>
Subject: Re: [visionlist] [cvnet] Trying to demonstrate stereoscopic
        vision remotely
Message-ID:
        <CAF+FnwuUYGDF9dUDhsTYaEBqCwodeAscFxqCaeEb-MkioZWoZA at mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

Thanks, Christopher!  I had listed random dot autostereograms as one option,
and noted, as you did, that many people unfortunately have great difficulty
learning to under-converge or over-converge.  However, your Scholarpedia
page is extremely informative.  I learned a lot from it.  I
(embarrassingly) hadn't realized the seminal role you played in the
development of the algorithms for them.  I'll probably give some
autostereograms to the students.  The students who manage to learn to
perceive them will enjoy them.
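(For anyone who wants to hand students home-made examples, the standard row-by-row pixel-linking construction is short enough to share. A sketch in the spirit of the algorithms that page describes, assuming numpy and a grayscale depth map; the pattern width and shift range are arbitrary choices, not anyone's published parameters:

```python
import numpy as np

def autostereogram(depth, pattern_width=60, max_shift=20, seed=0):
    """Render a single-image random-dot autostereogram.

    depth: 2-D float array in [0, 1], larger meaning nearer.
    Each output pixel copies the pixel `pattern_width - shift`
    columns to its left, where shift grows with depth, so the
    repeating texture carries the disparity signal when the eyes
    decouple convergence from the picture plane.
    """
    rng = np.random.default_rng(seed)
    h, w = depth.shape
    out = rng.integers(0, 2, (h, w), dtype=np.uint8) * 255
    for y in range(h):
        for x in range(pattern_width, w):
            shift = int(depth[y, x] * max_shift)
            out[y, x] = out[y, x - pattern_width + shift]
    return out
```

A flat depth map yields a plain repeating texture; putting a square or a gradient in the depth map makes it pop out once fused.)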

BTW, from an artistic point of view, some of my favorites are by Shiro
Nakayama, from the Super Stereogram books you mentioned.

Best wishes,

Les

On Fri, Mar 20, 2020 at 11:51 PM Christopher Tyler <cwtyler2020 at gmail.com>
wrote:

> You could try autostereograms, such as the examples on my Scholarpedia
> page <http://www.scholarpedia.org/article/Autostereogram>.  Not everyone
> can get them, but they're pretty effective for those who can.
>
> All the best,
> Christopher
>
> On Fri, Mar 20, 2020 at 9:35 PM Lester Loschky <loschky at ksu.edu> wrote:
>
>> Hi Everybody,
>>
>> If you teach Sensation and Perception, and are currently preparing to
>> teach it remotely, you may have the same question I have: how can you
>> demonstrate stereovision remotely?
>>
>> ...
>

--
Lester Loschky
Professor
Associate Director, Cognitive and Neurobiological Approaches to Plasticity
Center
Department of Psychological Sciences
471 Bluemont Hall
1114 Mid-Campus Dr North
Kansas State University
Manhattan, KS  66506-5302
email: loschky at ksu.edu
research page: https://www.k-state.edu/psych/research/loschkylester.html
lab page: http://www.k-state.edu/psych/vcl/index.html
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200321/8d01d409/attachment-0001.html>

------------------------------

Message: 5
Date: Sat, 21 Mar 2020 17:21:45 +0100
From: Marc Ernst <marc.ernst at uni-ulm.de>
To: visionlist at visionscience.com
Subject: [visionlist] IMRF 2020 postponed to 16.-19. Oct. 2020
Message-ID: <AC44E377-B1B4-49F6-8026-4A24D86F8C9E at uni-ulm.de>
Content-Type: text/plain; charset="us-ascii"

******   IMRF 2020 postponed to 16.-19. Oct. 2020   *******


Unfortunately, the COVID-19 situation is quickly getting worse, and it is unlikely to have recovered by June to the point where it is safe for our colleagues to travel around the world. We have therefore decided to postpone the International Multisensory Research Forum - IMRF 2020 - from June to October 16-19, 2020, and we truly hope things will be back to normal by then.

We are of course aware that this may pose problems for some who already have other plans for October. Your research may also develop in the coming months despite the presently tense situation. We are therefore reopening the abstract submission portal, so you can revise your abstracts or submit new ones. If you cannot make the October date, you can of course withdraw your abstract or make one of your coauthors the presenter of the research.

The new abstract submission deadline is August 14, 2020.
You can submit or alter your abstracts at any moment until this date. Reviewing of all abstracts, including the ones already submitted, will follow this date.

The same date applies for adapting the symposium proposals.

Registration will stay open, and the early registration deadline will be moved to September 10, 2020.

Paid registrations will remain valid. Of course, we will reimburse you if you have already paid but cannot make the October dates; please let us know by April 20 if this is the case. If you have not paid your registration yet, you can do so at any moment from now on: early registration until Sept. 10, late registration until Oct. 10, or on-site registration.

I hope to see you in Ulm in October. Be safe and stay healthy!!!!


Best,
Marc Ernst on behalf of the IMRF organizing team.


---------------------------------------------------------
Prof. Dr. Marc Ernst

Appl. Cognitive Psychology
Faculty for Computer Science, Engineering, and Psychology
Ulm University
Albert-Einstein-Allee 43
89081 Ulm

room:    43.3.103
phone:           +49-731-50 320 50
sekr:    +49-731-50 320 51
fax:             +49-731-50 320 59
mobile:  +49-152-22 543 156
marc.ernst at uni-ulm.de <mailto:marc.ernst at uni-ulm.de>





-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200321/22ab525d/attachment-0001.html>

------------------------------

Message: 6
Date: Sat, 21 Mar 2020 13:49:13 -0700
From: Ben Backus <ben.backus at gmail.com>
To: Lester Loschky <loschky at ksu.edu>
Cc: Christopher Tyler <cwtyler2020 at gmail.com>, cvnet
        <cvnet at mail.ewind.com>,  visionlist at visionscience.com
Subject: Re: [visionlist] [cvnet] Trying to demonstrate stereoscopic
        vision remotely
Message-ID:
        <CAGnM2QTyb0V1q12ONqaMaovYP5YOoUOh69mz-JhVG-N8v8GkHg at mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

Les et al,
How about something like this?
Please use the attached powerpoint file any way you like.


[image: image.png]


On Sat, Mar 21, 2020 at 9:21 AM Lester Loschky <loschky at ksu.edu> wrote:

> Thanks, Christopher!  I had listed random dot autosterograms as one
> option, and noted, as you did, that many people unfortunately have great
> difficulty learning to under-converge or over-converge.  However, your
>
 ...
On Fri, Mar 20, 2020 at 11:51 PM Christopher Tyler <cwtyler2020 at gmail.com>
wrote:

>> You could try autostereograms, such as the examples on my Scholarpedia
>> page <http://www.scholarpedia.org/article/Autostereogram>.  Not everyone
>> can get them, but they're pretty effective for those who can.
>>
>> On Fri, Mar 20, 2020 at 9:35 PM Lester Loschky <loschky at ksu.edu> wrote:
>>
>>> Hi Everybody,
>>>
>>> If you teach Sensation and Perception, and are currently preparing to
>>> teach it remotely, you may have the same question I have: how can you
>>> demonstrate stereovision remotely?
>>>
>> ...

--
Ben Backus
CSO, Vivid Vision Inc (ben at seevividly.com)
Assoc. Prof. Emeritus, SUNY College of Optometry (bbackus at sunyopt.edu)
<http://poseidon.sunyopt.edu/Backuslab/>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200321/072133e8/attachment.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: image.png
Type: image/png
Size: 22528 bytes
Desc: not available
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200321/072133e8/attachment.png>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Stereo Demo Backus Vivid Vision.pptx
Type: application/octet-stream
Size: 40330 bytes
Desc: not available
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200321/072133e8/attachment.obj>

------------------------------

Subject: Digest Footer

_______________________________________________
visionlist mailing list
visionlist at visionscience.com
http://visionscience.com/mailman/listinfo/visionlist_visionscience.com


------------------------------

End of visionlist Digest, Vol 39, Issue 18
******************************************
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200321/19fec165/attachment.html>

