Over the past decade or so I have developed a series of 80+ demos for programming behavioural experiments in Matlab (or Octave) and Psychtoolbox. These start from the complete basics and walk through more complex topics such as immersive virtual reality and programming full experiments. The demos are highly popular and are used worldwide for both research and teaching.
The demos can be found on a separate page here. I am in the process of developing some YouTube content for teaching Matlab and Psychtoolbox, and also plan to make a comparable resource for Python and PsychoPy. A (much requested) tutorial paper will be submitted later in the year to provide a means by which to cite the demos.
If you have any suggestions for extensions to the demos or spot any errors, please do contact me. Similarly, I am always happy to hear from people who have found this resource useful.
Solid Sight Database and Code
Together with Dr Paul Hibbard, Dr Rebecca Hornsey and Dr David Hunter, we have created a database of high-resolution 3D scans of everyday objects (e.g., fruit and vegetables). An example can be seen in the movie below, which is rendered in Matlab and Psychtoolbox using functions from libigl and the gptoolbox.
The aim of the database and associated code is to give researchers the ability to easily design experiments using highly realistic 3D scans of objects. Most research in perceptual psychophysics (and elsewhere) is conducted with highly simplified stimuli which bear little resemblance to anything we would encounter in everyday life. This calls into question our ability to generalise our findings to real-world scenarios.
The database and code allow one to parametrically manipulate and vary highly realistic stimuli, thus maintaining high ecological validity but also high levels of experimental control. We are in the process of writing up a paper which will introduce the database and code (at the same time as making it freely available to all).
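As a rough illustration of what "parametric manipulation" of a scanned mesh can look like, here is a minimal Matlab/Octave sketch. It is not code from the database itself: the vertex and face arrays below stand in for a real scan (which might be loaded with, e.g., gptoolbox's `readOBJ`), and the single stretch parameter is a hypothetical experimental variable.

```matlab
% Hypothetical sketch: vary one shape parameter of a mesh while
% keeping its topology (the face list) fixed. A tiny synthetic
% tetrahedron stands in for a high-resolution scan; with a real
% object, V (N x 3 vertices) and F (M x 3 faces) would come from
% a file loader such as gptoolbox's readOBJ.
V = [0 0 0; 1 0 0; 0 1 0; 0 0 1];
F = [1 2 3; 1 2 4; 1 3 4; 2 3 4];

stretch = 1.5;                   % free experimental parameter

% Centre the object, stretch along z to vary apparent elongation,
% then translate back. Requires implicit broadcasting
% (Matlab >= R2016b, or Octave).
c    = mean(V, 1);               % centroid
Vc   = V - c;                    % centre at origin
Vc(:, 3) = Vc(:, 3) * stretch;   % scale along z only
Vnew = Vc + c;                   % restore original position
```

Because only the vertex positions change, the same face list `F` (and any textures or per-vertex data) can be reused across all parameter values, which is what keeps such manipulations well controlled.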
Nectarine rendered from the Solid Sight Database and code