[visionlist] Matlab --> online Experiments
jon.peirce at gmail.com
Sat May 2 13:11:21 -04 2020
Sorry, I'm a bit late to this thread, but I wanted to expand on the
discussion and give some more info about running studies online.
We (the PsychoPy team) have put a lot of work in over the last 2-3 years,
initially funded by the Wellcome Trust and now by the Chan Zuckerberg
Initiative, extending the Builder interface to generate scripts in either
syntax. At this point I don't think I'm being biased when I say that
PsychoPy is your best bet for creating a single study that can run both in
the lab (maximal precision) and online (maximal portability) with minimal
translation.
PsychoPy has excellent timing in both environments.
In the lab it achieves sub-ms keyboard and audio timing (it actually
uses the same keyboard and audio C code as PTB, since Mario ported that
to Python). It also has best-in-class timing online: it's the only
package currently able to produce RT timing precision under 3.5 ms on all
browsers (in conjunction with a LabHacker Millikey button box).
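To give a sense of why the software clock itself is rarely the limiting
factor in RT measurement, here is a small standard-library sketch (purely
illustrative, not PsychoPy's internals) that estimates the resolution of
Python's high-resolution timer:

```python
# Hypothetical sketch: measure the smallest observable tick of
# time.perf_counter(), the kind of clock used for timestamping responses.
import time
import statistics

def clock_resolution(samples=10_000):
    """Return (best, typical) observable clock tick in seconds."""
    deltas = []
    for _ in range(samples):
        t0 = time.perf_counter()
        t1 = time.perf_counter()
        while t1 == t0:          # spin until the clock advances
            t1 = time.perf_counter()
        deltas.append(t1 - t0)
    return min(deltas), statistics.median(deltas)

best, typical = clock_resolution()
# On most systems perf_counter resolves far below 1 ms, so the dominant
# error online comes from the browser/OS event path, not the clock itself.
```

This is why the remaining online-timing error budget is spent on keyboard
polling and browser event delivery rather than on timestamping.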
For those who are interested, I've spelled out below some of the issues
that needed solving for all this to be possible, and a few that still
aren't solved.
Hope the info is helpful.
Best wishes all,
*Porting the /lib/ to JS*
As Dee points out, the functions themselves need to have been ported to
JS, not just the syntax of your script converted. Translating your PTB
script to JS syntax is no use because the Screen functions haven't been
ported to JS. PTB's key selling point, that a lot of the critical
functions run as compiled C code, is probably also the thing that will
make it hard to port to JS.
A substantial portion of the PsychoPy lib has now been ported to JS. Not
all stimuli are there but many are, and the calls to use them are
generally the same as in the Python lib; the current status is tracked at
https://www.psychopy.org/online/status.html. In particular, you can time
your stimulus presentation by number of frames, and you can update
stimulus parameters on every frame! Timing can't currently be *as good*
as lab-based libraries (and don't listen to anybody telling you
otherwise) but it is much better than most people imagine.
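As an illustration of what frame-based timing means in practice (the names
below are mine, not the PsychoPy/PsychoJS API), the scheduling logic
amounts to converting nominal durations into whole refresh counts:

```python
# Minimal sketch of frame-based stimulus timing: onset and offset are
# scheduled by counting screen refreshes rather than by wall-clock time.
# REFRESH_RATE and the function names are illustrative assumptions.

REFRESH_RATE = 60  # Hz, assumed monitor refresh

def secs_to_frames(duration_s, refresh_hz=REFRESH_RATE):
    """Convert a nominal duration to a whole number of frames."""
    return round(duration_s * refresh_hz)

def stimulus_visible(frame_n, onset_s, duration_s):
    """Should the stimulus be drawn on this frame?"""
    onset_f = secs_to_frames(onset_s)
    offset_f = onset_f + secs_to_frames(duration_s)
    return onset_f <= frame_n < offset_f

# A 100 ms stimulus with a 0.5 s onset at 60 Hz occupies exactly
# frames 30..35:
frames_shown = [n for n in range(120) if stimulus_visible(n, 0.5, 0.1)]
```

Counting frames sidesteps the fact that a 100 ms request can never land
between refreshes: the stimulus is always drawn for a whole number of them.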
PsychoJS (the JS port of the lib) uses WebGL where available, which
allows a large range of hardware-accelerated graphics, including GL
shaders, to be used so we should ultimately be able to create all your
favourite stimuli online. :-)
*Generating the JS experiment /script/*
PsychoPy Builder can also output your experiment in either format: the
Python script or the JS files. Python Code Components within the Builder
experiment can even be auto-translated to JS format! Again, you can't
use functions that don't exist, but the syntax converts remarkably well,
so you can augment your study with conditionals, loops and even custom
functions and these will all be translated.
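For example, the following is the kind of Python you might put in a Code
Component; it uses only constructs (conditionals, string formatting, a
small custom function) that translate cleanly to JS. The names are
illustrative, not from a real experiment:

```python
# Hypothetical Code Component logic: pick feedback text from the
# participant's response. Plain conditionals and helper functions like
# this are the style that auto-translates well to JS.

def feedback_message(resp_key, correct_key, rt):
    """Choose feedback text based on the response and reaction time."""
    if resp_key is None:
        return "Too slow!"
    elif resp_key == correct_key:
        return "Correct ({:.0f} ms)".format(rt * 1000)
    else:
        return "Wrong key"

msg = feedback_message("left", "left", 0.432)  # -> "Correct (432 ms)"
```

Code that leans on Python-only libraries, by contrast, has no JS
counterpart to translate to, which is the "functions that don't exist"
caveat above.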
You can't take a hand-written script and auto-convert that. The
structural changes required for a full script mean that isn't possible
(and probably never will be).
That said, creating a study in Builder is really fast, so starting from
scratch isn't such a big deal, and Builder is the approach we recommend
even for skilled programmers anyway. It typically means fewer mistakes,
better timing and more future-proof experiments ("future you" can still
understand the study when it's graphical rather than 1000 lines of code).
*What isn't done*
The downside for vision scientists is that we expected this community to
be the last to want to go online, so the psychophysics features have been
the last to be implemented. In particular, gamma correction hasn't been
done (although it could be, using WebGL shaders) and we haven't ported the
Gratings and similar stimuli yet (mostly because these aren't that useful
if gamma isn't linearised). We also haven't yet ported over the
All these things can be done and we will get to them in due course but,
also, if you have JS skills you could jump in and help us fill the gaps!
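For those unfamiliar with what gamma linearisation involves, here is a
sketch of the underlying arithmetic (illustrative only; in PsychoJS this
would happen in a WebGL shader rather than a Python lookup table):

```python
# Gamma correction in software: to linearise a display with gamma g,
# requested intensities are passed through the inverse power function
# before being sent to the screen. The function name and 8-bit LUT
# approach are illustrative assumptions, not PsychoPy's implementation.

def build_inverse_gamma_lut(gamma, levels=256):
    """8-bit lookup table mapping desired linear intensity -> DAC value."""
    lut = []
    for i in range(levels):
        linear = i / (levels - 1)       # desired luminance, 0..1
        dac = linear ** (1.0 / gamma)   # pre-compensate the display's gamma
        lut.append(round(dac * (levels - 1)))
    return lut

lut = build_inverse_gamma_lut(gamma=2.2)
# Mid-grey maps to a higher DAC value because the display's power-law
# response would otherwise render it too dark.
```

Without this step a "50% contrast" grating is not actually 50% contrast on
screen, which is why the Gratings stimuli wait on gamma support.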
> On Fri, Apr 24, 2020 at 8:23 AM Sauter Marian <marian.sauter at unibw.de> wrote:
> Dear all,
> for those of you still seeking recommendations on how to get your
> experiments online, our article just got published open access:
> Sauter, M.; Draschkow, D.; Mack, W. *Building, Hosting and
> Recruiting: A Brief Introduction to Running Behavioral Experiments
> Online*. /Brain Sci./ *2020*, /10/, 251.
> Best wishes,
> *Dr. Marian Sauter*
> /Research Associate/
> Universität der Bundeswehr München
> Department für Psychologie - Professur für Allgemeine Psychologie
> Building 161 / Room 1016
> Werner-Heisenberg-Weg 39
> 85577 Neubiberg
> marian.sauter at unibw.de
> *From:* visionlist <visionlist-bounces at visionscience.com> on behalf of
> Caspar Goeke <caspar.goeke at gmail.com>
> *Sent:* Tuesday, 21 April 2020 08:01
> *To:* visionlist at visionscience.com
> *Subject:* Re: [visionlist] Matlab --> online Experiments
> Hello everyone,
> After finishing my PhD in Cognitive Science I created a platform
> for creating and running online psychology experiments
> (https://www.labvanced.com); it's easy to start and has a very
> powerful UI. Our source code is open and available at
> https://github.com/Labvanced/. We also have an open-access
> experiment library containing hundreds of templates for all kinds
> of experiments, free for everyone to import and share
> (https://www.labvanced.com/expLibrary.html). I hope we can be of
> value to some of you in this crisis.
> Best, Caspar
>> Dear Maarten + All:
>> Yes, a few of my phenomena pages are based on it
>> In my experience its biggest plus is that it is very easy, very approachable, thus very rapidly you can get astounding results.
>> What I found lacking is a good GUI (but then I'm possibly spoiled by <https://michaelbach.de/ot/-misc/cappuccino/>), a little in timing, some deficiencies in sound*, and a possibly slow 3D implementation (but then, again, this <https://michaelbach.de/ot/sze-silhouette/> does put a heavy demand on on-line 3D processing).
>>> … Anyhow, here (https://arxiv.org/abs/2004.08198) is a preprint, …
>> Impressive, thank you.
>> Clearly p5.js has lots going for it.
>> Best, Michael
> visionlist mailing list
> visionlist at visionscience.com
University of Nottingham