2020-03-24 10:14:40

Preliminaries

Check-in

  • How are things going?

Announcements

Today’s topics

  • Tools for reproducible data-gathering
  • Deeper dive into PsychoPy
  • In-class jsPsych quests

Tools for reproducible data-gathering

Tools

E-Prime

Psychophysics Toolbox

  • MATLAB-based, but also runs under GNU Octave (an open-source alternative to MATLAB)
  • Runs on Linux, Windows, Mac OS X
    • Linux (Ubuntu) now recommended

PsychoPy

jsPsych

MTurk

Other web-based platforms (survey research)

Shiny

Challenges to reproducibility

  • Commercial vs. non-commercial tools
  • Computers, operating systems, and software versions vary and change
  • Hard to ensure scripts run identically on different hardware and software
  • Desktop-based (better timing) vs. web-based (less hardware/software dependent)
  • Programming practices vary

Partial solutions

  • Use Open Science Framework (OSF) or Databrary as a project “hub”
  • Future: “containerize” experiment apps

Self-assessment

  • What tool(s) are you using now?
  • Do you use shared code? What was the source?
  • Do you alter shared display code? How do you document the changes?
  • Do you share display code? How?
  • Barriers to greater sharing?

Deeper dive into PsychoPy

Installation

Demos/basic/hello_world.py

"""
Demo: show a very basic program: hello world
"""

from __future__ import absolute_import, division, print_function

# Import key parts of the PsychoPy library:
from psychopy import visual, core

# Create a visual window:
win = visual.Window()

# Create (but not yet display) some text:
msg1 = visual.TextStim(win, text=u"Hello world!")  # default position = centered
msg2 = visual.TextStim(win, text=u"\u00A1Hola mundo!", pos=(0, -0.3))

# Draw the text to the hidden visual buffer:
msg1.draw()
msg2.draw()

# Show the hidden buffer--everything that has been drawn since the last win.flip():
win.flip()

# Wait 3 seconds so people can see the message, then exit gracefully:
core.wait(3)

win.close()
core.quit()

# The contents of this file are in the public domain.

Demos/basic/face_jpg.py

#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""
This demo shows different ways of presenting images using visual.ImageStim and
visual.GratingStim. It introduces some of the many attributes of these stimulus
types.
"""

from __future__ import division

# Import the modules that we need in this script
from psychopy import core, visual, event

# Create a window to draw in
win = visual.Window(size=(600, 600), color='black')

# An image using ImageStim (start with a slight tilt so the rotation update below has an effect).
image = visual.ImageStim(win, image='face.jpg', ori=1.0)

# We can also use the image as a mask (mask="face.jpg") for other stimuli!
grating = visual.GratingStim(win,
    pos=(-0.5, 0),
    tex='sin',
    mask='face.jpg',
    color='green')
grating.size = (0.5, 0.5)  # attributes can be changed after initialization
grating.sf = 1.0

# Initiate clock to keep track of time
clock = core.Clock()
while clock.getTime() < 12 and not event.getKeys():
    # Set dynamic attributes. There are many possibilities,
    # so look at the documentation and try playing around here.
    grating.phase += 0.01  # Advance phase by 1/100th of a cycle
    grating.pos += (0.001, 0)  # Advance on x but not y
    image.ori *= 1.01  # Accelerating rotation (1% more tilt every frame)
    image.size -= 0.001  # Decrease size uniformly on x and y
    if image.opacity >= 0:  # attributes can be referenced
        image.opacity -= 0.001  # Decrease opacity

    # Show the result of all the above
    image.draw()
    grating.draw()
    win.flip()

win.close()
core.quit()

# The contents of this file are in the public domain.

Rick’s PsychoPy advice

  • Study the demos; tweak them.
  • Choose a top-down (Builder GUI) or bottom-up (script-first) approach at the outset.
  • Make experimental parameters explicit.
  • Design your data outputs for reproducible post-processing (see the sketch after this list).
    • Think: ‘What data frame am I going to want to have?’
  • Write and test your import & cleaning code as you finalize your study.
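
As a concrete illustration of the last three bullets, a script-first PsychoPy study can gather its parameters in one dictionary at the top of the file and write one tidy row per trial, so the output reads straight into a data frame. The sketch below uses only the Python standard library; the parameter names, file names, and run_trial() placeholder are hypothetical, and PsychoPy's own data classes (e.g., TrialHandler, ExperimentHandler) provide similar bookkeeping.

"""Sketch: explicit parameters and a tidy, one-row-per-trial data file."""
import csv
import random

# All experimental parameters in one explicit, documentable place (values are made up).
PARAMS = {
    'n_trials': 20,
    'stim_duration_s': 0.5,
    'conditions': ['congruent', 'incongruent'],
    'output_file': 'sub-01_task-demo.csv',
}

# Decide up front what the data frame should look like: one row per trial.
FIELDNAMES = ['participant', 'trial', 'condition', 'response', 'rt']

def run_trial(condition, duration_s):
    """Placeholder for the PsychoPy display/response code; returns fake data here."""
    return 'left', round(random.uniform(0.3, 0.9), 3)

with open(PARAMS['output_file'], 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
    writer.writeheader()
    for trial in range(PARAMS['n_trials']):
        condition = random.choice(PARAMS['conditions'])
        response, rt = run_trial(condition, PARAMS['stim_duration_s'])
        writer.writerow({'participant': 'sub-01', 'trial': trial,
                         'condition': condition, 'response': response, 'rt': rt})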

Your turn

Goals

Preliminaries

A basic web page

<!doctype html>
<html>
    <head>
        <title>My page</title>
    </head>
    <body>
    <h1>This is a top-level header.</h1>
    <h2>This is a second level header.</h2>
    <p>This is a paragraph.</p>
    </body>
</html>

  • Nested tags: <html></html>, <head></head>, <body></body>
  • Resources: links (<a></a>), images (<img>), video (<video></video>), etc.
  • Tag + src + attribute syntax: <img src="https://upload.wikimedia.org/wikipedia/commons/thumb/6/61/HTML5_logo_and_wordmark.svg/200px-HTML5_logo_and_wordmark.svg.png" width="200">

More on web page anatomy and physiology

  • JavaScript
    • Programming language for web pages
    • Frameworks/libraries are collections of useful commands
  • Web application framework
    • Integration with other resources and services (e.g., databases)

Hello, World!

<!DOCTYPE html>
<html>
    <head>
        <title>My experiment</title>
        <script src="jspsych-6.1.0/jspsych.js"></script>
        <script src="jspsych-6.1.0/plugins/jspsych-html-keyboard-response.js"></script>
        <link href="jspsych-6.1.0/css/jspsych.css" rel="stylesheet" type="text/css">
    </head>
    <body></body>
    
    <script>
    var hello_trial = {
        type: 'html-keyboard-response',
        stimulus: 'Hello world!'
    }

    jsPsych.init({
        timeline: [hello_trial]
    })
    </script>
</html>

Loading JavaScript libraries

<!-- From local directories -->
<script src="jspsych-6.1.0/jspsych.js"></script>

Loading jsPsych CSS

<link href="jspsych-6.1.0/css/jspsych.css" rel="stylesheet" type="text/css">

jsPsych code to show message

<script>
  var hello_trial = {
    type: 'html-keyboard-response',
    stimulus: 'Hello world!'
  }

  jsPsych.init({
    timeline: [hello_trial]
  })
</script>

jsPsych produces data files in JSON format

Here’s what the data look like in JavaScript Object Notation (JSON)

 {
  "rt": 1219,
  "stimulus": "img/orange.png",
  "key_press": 70,
  "response": "no-go",
  "trial_type": "single-stim",
  "trial_index": 2,
  "time_elapsed": 13924,
  "internal_node_id": "0.0-2.0-0.0",
  "correct": false
 },
 {
  "rt": -1,
  "stimulus": "img/orange.png",
  "key_press": -1,
  "response": "no-go",
  "trial_type": "single-stim",
  "trial_index": 3,
  "time_elapsed": 16305,
  "internal_node_id": "0.0-2.0-1.0",
  "correct": true
 },
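
Records like these flatten naturally into one row per trial for post-processing. Below is a minimal sketch using only the Python standard library, assuming the records have been saved to a file as a JSON array; the file names are hypothetical.

import csv
import json

# Read the JSON array of trial records that jsPsych produced.
with open('experiment_data.json') as f:
    trials = json.load(f)  # a list of dicts, one per trial

# Flatten to a CSV with one row per trial, ready to read as a data frame.
fieldnames = sorted({key for trial in trials for key in trial})
with open('experiment_data.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(trials)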

Downside of jsPsych

  • If running locally, you must copy/download the data file manually
  • If running on a server, you must send the data to the server and store it yourself (see the sketch after this list)
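
Neither step is handled by jsPsych itself. If you do run on your own server, one common pattern is to POST the string returned by jsPsych.data.get().json() from the browser to a small server-side endpoint that writes it to disk. The sketch below is a minimal illustration of such an endpoint using Python's standard library; the port, path handling, and file naming are assumptions, and a real deployment would add authentication, validation, and HTTPS.

"""Sketch: a tiny endpoint that stores POSTed jsPsych data as JSON files."""
import json
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class SaveDataHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the request body (the JSON string sent by the browser).
        length = int(self.headers.get('Content-Length', 0))
        body = self.rfile.read(length).decode('utf-8')
        data = json.loads(body)  # fails loudly if the payload is not valid JSON

        # Write one timestamped file per submission.
        filename = 'data_{}.json'.format(int(time.time()))
        with open(filename, 'w') as out:
            json.dump(data, out, indent=2)

        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'saved')

if __name__ == '__main__':
    HTTPServer(('localhost', 8000), SaveDataHandler).serve_forever()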

Closing the loop

Questions?

In-class quests

  • Work through jsPsych Hello, World! demo.
  • Work through jsPsych simple RT demo.

Next time…

  • Gathering info from afar: Using APIs

Resources

Software

This talk was produced on 2020-03-24 in RStudio using R Markdown. The code and materials used to generate the slides may be found at https://github.com/psu-psychology/psy-525-reproducible-research-2020. Information about the R Session that produced the code is as follows:

## R version 3.6.2 (2019-12-12)
## Platform: x86_64-apple-darwin15.6.0 (64-bit)
## Running under: macOS Mojave 10.14.6
## 
## Matrix products: default
## BLAS:   /Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRblas.0.dylib
## LAPACK: /Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib
## 
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
## 
## attached base packages:
## [1] stats     graphics  grDevices utils     datasets  methods   base     
## 
## loaded via a namespace (and not attached):
##  [1] compiler_3.6.2  magrittr_1.5    tools_3.6.2     htmltools_0.4.0
##  [5] yaml_2.2.1      Rcpp_1.0.3      stringi_1.4.6   rmarkdown_2.1  
##  [9] knitr_1.28      stringr_1.4.0   xfun_0.12       digest_0.6.25  
## [13] rlang_0.4.5     evaluate_0.14

References

Bridges, D., Pitiot, A., MacAskill, M. R., & Peirce, J. W. (2020, January). The timing mega-study: Comparing a range of experiment generators, both lab-based and online. PsyArXiv. https://doi.org/10.31234/osf.io/d6nu5

Gilmore, R. O., & Adolph, K. E. (2017). Video can make science more open, transparent, robust, and reproducible. Retrieved from http://osf.io/3kvp7

Kominsky, J. F. (2019). PyHab: Open-source real time infant gaze coding and stimulus presentation software. Infant Behavior & Development, 54, 114–119. https://doi.org/10.1016/j.infbeh.2018.11.006

MacWhinney, B., St James, J., Schunn, C., Li, P., & Schneider, W. (2001). STEP–a system for teaching experimental psychology using E-Prime. Behavior Research Methods, Instruments, & Computers: A Journal of the Psychonomic Society, Inc, 33(2), 287–296. https://doi.org/10.3758/bf03195379

Tran, M., Cabral, L., Patel, R., & Cusack, R. (2017). Online recruitment and testing of infants with mechanical turk. Journal of Experimental Child Psychology, 156, 168–178. https://doi.org/10.1016/j.jecp.2016.12.003

Vasilevsky, N. A., Brush, M. H., Paddock, H., Ponting, L., Tripathy, S. J., Larocca, G. M., & Haendel, M. A. (2013). On the reproducibility of science: Unique identification of research resources in the biomedical literature. PeerJ, 1, e148. https://doi.org/10.7717/peerj.148