[visionlist] [cvnet] Trying to demonstrate stereoscopic vision remotely
Lester Loschky
loschky at ksu.edu
Sat Mar 21 23:43:28 -04 2020
Fascinating, Maarten! I had never heard of monocular stereopsis. Thanks for
sharing!
Best wishes,
Les
On Sat, Mar 21, 2020 at 3:21 AM Maarten Wijntjes - IO <
M.W.A.Wijntjes at tudelft.nl> wrote:
> In addition, it could be the right moment to discuss *monocular stereopsis*.
> Check Ames’ publication about this; he offers various solutions, not all
> easily achievable at home, but looking through an *aperture* should work.
> Of course a *synopter* (not mentioned by Ames) would have been great, so
> after the pandemic you should order some with me ;). What is cool and
> doable is looking through a make-up mirror at a picture on your smartphone
> (or a postcard, if available). It gives the effect of a snapscope and is
> related to the zograscope and graphoscope (which you can also simulate by
> looking through a big loupe).
>
> Of course the *Pulfrich effect* is easily demonstrated if students have
> sunglasses at hand.
>
> About the *3D Wiggle*: I remember I once saw a museum website where
> visitors could help convert the 3D photo collection into GIFs that show
> the wiggle effect. It is quite essential to get the alignment right;
> otherwise you get headache-inducing GIFs. But I could not find the
> website… it would be a nice pastime now.
>
> Good luck!
> Maarten
>
> References
> Ames, A. (1925). The illusion of depth from single pictures. *JOSA*, *10*(2),
> 137-148.
>
> Vishwanath, D., & Hibbard, P. B. (2013). Seeing in 3-D with just one eye:
> Stereopsis without binocular vision. *Psychological Science*, *24*(9),
> 1673-1685.
>
> Koenderink, J. J., Wijntjes, M. W. A., & van Doorn, A. J. (2013). Zograscopic
> viewing. *i-Perception*, *4*(3), 192-206.
>
>
> On 21 Mar 2020, at 05:50, Christopher Tyler via cvnet <
> cvnet at lawton.ewind.com> wrote:
>
> You could try autostereograms, such as the examples on my Scholarpedia
> page <http://www.scholarpedia.org/article/Autostereogram>.
> Not everyone can get them, but they're pretty effective for those who can.
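For anyone who wants to roll their own, below is a minimal sketch of the naive "copy each pixel from one depth-dependent separation to its left" construction (not the full published autostereogram algorithm), assuming Python with numpy and Pillow; the image size, eye-separation value, and floating-square depth map are arbitrary illustrative choices.

```python
import numpy as np
from PIL import Image

W, H = 600, 400
E = 100                      # assumed eye separation in pixels (illustrative)
rng = np.random.default_rng(0)

# Depth map in [0, 1]: a square floating in front of a flat background.
depth = np.zeros((H, W))
depth[150:250, 250:350] = 0.4

# Start from random black/white dots, then copy each pixel from the pixel
# one separation to its left; nearer points get a smaller separation.
img = (rng.integers(0, 2, size=(H, W)) * 255).astype(np.uint8)
for y in range(H):
    for x in range(W):
        sep = int(E * (1 - 0.5 * depth[y, x]))
        if x >= sep:
            img[y, x] = img[y, x - sep]

# View by diverging the eyes (wall-eyed free fusion).
Image.fromarray(img, mode="L").save("autostereogram.png")
```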
>
> All the best,
> Christopher
>
> On Fri, Mar 20, 2020 at 9:35 PM Lester Loschky <loschky at ksu.edu> wrote:
>
>> Hi Everybody,
>>
>> If you teach Sensation and Perception, and are currently preparing to
>> teach it remotely, you may have the same question I have: how can you
>> demonstrate stereovision remotely?
>>
>> As a preface, the following are methods I have used in in-person classes
>> to demonstrate stereo vision:
>>
>> 1. an actual stereoscope and example stereo images to share with
>> students (including the classic Julesz random-dot stereogram with its
>> floating central square; see the sketch after this list)
>> 2. example stereoscopic lenticular-lens images to share with students
>> 3. red/green anaglyph images with sets of cardboard and plastic
>> red/green anaglyph glasses
>> 4. Google Cardboard plus a cell phone to share with students
>> 5. random-dot autostereogram images
>> 6. touching two pen tips together using two eyes versus one eye
>> 7. learning about crossed vs. uncrossed disparity using two fingers
>> held at different distances
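For item 1, a Julesz-style random-dot stereogram pair is easy to generate digitally if no physical stereoscope is available. A minimal sketch, assuming Python with numpy and Pillow; the image size, square position, and 8-pixel shift are arbitrary example values, and flipping the sign of the shift changes whether the square floats in front of or behind the background when the pair is fused.

```python
import numpy as np
from PIL import Image

rng = np.random.default_rng(1)
H, W, shift = 256, 256, 8                      # shift (disparity) in pixels

# Identical random-dot carpet for both eyes.
base = (rng.integers(0, 2, size=(H, W)) * 255).astype(np.uint8)
left, right = base.copy(), base.copy()

# Shift a central square sideways in the right eye's image only.
y0, y1, x0, x1 = 80, 176, 80, 176
right[y0:y1, x0 - shift:x1 - shift] = base[y0:y1, x0:x1]

# Fill the strip uncovered by the shift with fresh, uncorrelated dots.
right[y0:y1, x1 - shift:x1] = (
    rng.integers(0, 2, size=(y1 - y0, shift)) * 255
).astype(np.uint8)

Image.fromarray(left, "L").save("rds_left.png")
Image.fromarray(right, "L").save("rds_right.png")
```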
>>
>> Unfortunately, my students don't uniformly have access to the apparatuses
>> required for 1-4 above.
>>
>> Re. #3 (red/green anaglyph images), I've thought of having students
>> order a single pair of red/green anaglyph glasses online. However, it
>> appears that the cardboard and plastic ones can only be purchased in bulk.
>> (I guess they're too cheap to sell individually.) They also might not
>> arrive in time, but students could still enjoy them once they get them.
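For students who do eventually get glasses, an instructor can also make custom anaglyphs from any aligned stereo pair. A minimal sketch, again assuming Python with numpy and Pillow; the filenames are hypothetical, and for strict red/green glasses you could additionally zero out the blue channel.

```python
import numpy as np
from PIL import Image

# Hypothetical filenames: any aligned left/right pair of the same size works.
left = np.asarray(Image.open("left.jpg").convert("RGB"))
right = np.asarray(Image.open("right.jpg").convert("RGB"))

# Red channel from the left eye's view; green and blue from the right eye's.
anaglyph = np.empty_like(left)
anaglyph[..., 0] = left[..., 0]
anaglyph[..., 1:] = right[..., 1:]

Image.fromarray(anaglyph).save("anaglyph.png")
```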
>>
>> Re. #4 (Google Cardboard), I recall getting a free Google Cardboard from
>> the NYTimes several years ago. However, the cheapest I can find now is
>> about $5 (Irisu, of India), and it likely wouldn't arrive in time.
>>
>> Regarding option #5 (random-dot autostereograms), I have found that
>> because seeing random-dot autostereograms in depth requires perceptual
>> learning, a large proportion of students don't manage to learn within the
>> short time available in a single class period. (Of course, many students
>> may have a lot of time on their hands now, so they might keep at it long
>> enough to learn to perceive them. But there will certainly be a good
>> proportion of students who don't try long enough, and so don't get it.)
>>
>> #6 (touching two pen tips together) is definitely something that can be
>> done remotely. However, it doesn't have the "Wow!" factor of other
>> demonstrations. It is more of an "oh, really..." experience to realize how
>> much worse you are with one eye than two.
>>
>> #7 (using two fingers at different distances to teach crossed vs.
>> uncrossed disparity) can definitely be done remotely. It is very
>> educational, but again does not have the "Wow" factor.
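If a concrete number helps this demonstration land, the relative disparity between the two fingers can be worked out with the usual small-angle approximation; the interocular distance and finger distances below are just example values.

```python
import math

IOD = 0.063                 # example interocular distance in metres
z_near, z_far = 0.20, 0.40  # example finger distances in metres

# Relative binocular disparity (small-angle approximation, in radians):
# eta ~= IOD * (1/z_near - 1/z_far)
eta = IOD * (1 / z_near - 1 / z_far)
print(f"relative disparity ~ {math.degrees(eta):.1f} deg")   # about 9.0 deg

# Fixating the far finger, the near finger carries crossed disparity;
# fixating the near finger, the far finger carries uncrossed disparity.
```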
>>
>> There is also the finger "hot dog" illusion, which can be done remotely.
>> It is interesting, but quite different from all of the others in that
>> stereoscopic depth perception is not involved.
>>
>> For the related phenomenon of motion parallax, "wiggle vision" is a very
>> nice demonstration:
>> http://www.well.com/user/jimg/stereo/stereo_gate.html
>>
>> https://www.3dwiggle.com/2016/06/28/5-wigglegrams-you-need-to-see-before-you-die/
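Making a wiggle GIF from one's own left/right pair is also a doable remote exercise. A minimal sketch using Pillow; the filenames are hypothetical, and the pair should be cropped or shifted into rough alignment first to avoid the headache effect mentioned above.

```python
from PIL import Image

# Hypothetical filenames: any roughly aligned left/right stereo pair.
frames = [Image.open("left.jpg"), Image.open("right.jpg")]

# Alternate the two views several times per second, looping forever.
frames[0].save(
    "wiggle.gif",
    save_all=True,
    append_images=frames[1:],
    duration=150,   # milliseconds per frame
    loop=0,         # 0 = loop indefinitely
)
```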
>>
>> Of course, depth perception from motion parallax is theoretically related
>> to stereoscopic vision in an important way: both involve two different
>> images from two different viewpoints, one seen over time and requiring
>> only one eye (motion parallax), the other seen simultaneously and
>> requiring two eyes (stereo vision). But it is not the same as stereoscopic
>> vision, so it is a separate but related issue.
>>
>> For the related phenomenon of binocular combination, there is the famous
>> "hole in your hand" illusion using a cardboard paper-towel roll. If
>> students have a spare paper-towel roll, they can do this remotely. But,
>> again, it is theoretically related yet a separate issue.
>>
>> Any other suggestions would be appreciated.
>>
>> Best wishes,
>>
>> Les
--
Lester Loschky
Professor
Associate Director, Cognitive and Neurobiological Approaches to Plasticity
Center
Department of Psychological Sciences
471 Bluemont Hall
1114 Mid-Campus Dr North
Kansas State University
Manhattan, KS 66506-5302
email: loschky at ksu.edu
research page: https://www.k-state.edu/psych/research/loschkylester.html
lab page: http://www.k-state.edu/psych/vcl/index.html