---
title: "Ngiam - Color Perceptual Reproduction"
author: "William XQ Ngiam"
date: "`r Sys.Date()`"
output: html_document
---
## Overview

This file is associated with the preprint titled ["Similarity judgments and visual working memory do not share the same cognitive representation"](https://osf.io/preprints/psyarxiv/fm9vz_v1) by William XQ Ngiam (Adelaide University) and Michael D Lee (University of California, Irvine). This research project models the underlying psychological space in perceptual tasks, working memory tasks, and similarity judgment tasks.

The following model recovers the underlying psychological space for the color **perceptual reproduction** task in [Tomic and Bays (2024)](https://psycnet.apa.org/doi/10.1037/xlm0001172). In this task, a color hue drawn from a circle in CIELab space (a = 20, b = 20, R = 50) is shown to the subject and remains on the screen. The subject reproduces the color as closely as possible.

The following code builds a Bayesian Thurstonian model using JAGS. The Thurstonian model attempts to recover the representation of the colors, much like multidimensional scaling. The recovered representation could then be used to predict performance on other tasks, such as a working memory task with the same colors.
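The model specification itself lives in the separate text file listed under Requirements. As a rough orientation only, a minimal Thurstonian model of this task might look like the following sketch, which assumes each response is normally distributed around the presented stimulus location with a single shared noise parameter (the actual model file may differ):

```
model{
  # Weakly informative priors: stimulus locations and the shared
  # response noise both live on the circle's range (0, 2*pi)
  for (i in 1:nStimuli){
    mu[i] ~ dunif(0, 6.283185)
  }
  sigma ~ dunif(0, 6.283185)
  # Each response is normal around the location of the presented
  # stimulus; JAGS parameterizes dnorm by precision, not variance
  for (t in 1:nTrials){
    y[t] ~ dnorm(mu[s[t]], 1/(sigma*sigma))
  }
}
```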
### Requirements

The following files should be saved relative to your working directory, in the locations the code below expects:

* The data file ("colorPercData_clean.csv"), in a `data` subfolder - [downloadable here](https://github.com/WilliamNgiam/colorModeling/blob/main/data/oriPercData_clean.csv).
* The model text file ("colorPerceptualReproduction_jags.txt"), in a `models` subfolder - [downloadable here](https://github.com/WilliamNgiam/colorModeling/blob/main/perceptualReproduction/colorPerceptualReproduction_jags.txt).
* The color values file ("colorRGBvalues.csv"), in the working directory itself - [downloadable here](https://github.com/WilliamNgiam/colorModeling/blob/main/colorRGBvalues.csv).

You will also need JAGS installed on your device; the installer is available at [https://sourceforge.net/projects/mcmc-jags/files/](https://sourceforge.net/projects/mcmc-jags/files/). The following code was built using JAGS 4.3.2.
### Setup

```{r setup}
# Load required packages
library(R2jags)    # v0.8-9
library(tidyverse) # v2.0.0
library(coda)      # v0.19-4.1
library(ggmcmc)    # v1.5.1.2

# Provide R session output
sessionInfo()

# Set RNG seed for reproducibility
set.seed(123)
```
### Load data

```{r read data, warning = FALSE, message = FALSE}
# Load in the Tomic and Bays (2024) dataset
percData <- read_csv(paste0(getwd(), "/data/colorPercData_clean.csv"))

# Create intervals to match the Tomic and Bays (2024) perceptual task
binIntervals = seq(0, 2*pi, length.out = 73)

# Assign indices to the closest interval for the target (stimulus) values and response values
percData <- percData %>%
  mutate(response = round(response, digits = 5),
         target = round(target, digits = 5)) %>%
  # Make all values positive
  mutate(response = case_when(response < 0 ~ response + 2*pi,
                              .default = response),
         target = case_when(target < 0 ~ target + 2*pi,
                            .default = target)) %>%
  # Adjust response values to account for circular wrap-around
  mutate(response = case_when(abs(response-target) > abs(response-target-2*pi) ~ response - 2*pi,
                              .default = response)) %>%
  mutate(response = case_when(abs(target-response) > abs(target-response-2*pi) ~ response + 2*pi,
                              .default = response)) %>%
  rowwise() %>%
  # Match target values to the intervals on the circle
  mutate(targetIndex = which.min(abs(target - binIntervals))) %>%
  # Target values closest to 2*pi are re-indexed to the first bin (0 radians), where the circle wraps around
  mutate(targetIndex = case_when(targetIndex == 73 ~ 1,
                                 .default = targetIndex))

# Set up the true values to be recovered
muTruth = binIntervals[-73] # Remove the last interval because that is where the circle wraps back to 0
sigmaTruth = 0.01

# Assign data to vectors
s <- as.integer(percData$targetIndex)
y <- as.numeric(percData$response)
nTrials <- as.integer(length(percData$response))
nStimuli <- as.integer(length(muTruth))
```
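The wrap-around adjustment above can be checked on a toy case: a response just past 0 to a target just below 2π should score as a small positive error, not a near-full-circle one. A minimal standalone sketch (not part of the pipeline; the values are made up for illustration):

```r
# Toy example: target near the top of the circle, response just past 0
target   <- 2*pi - 0.1
response <- 0.05

# Same unwrapping rule as above: shift the response by 2*pi
# when that brings it closer to the target
if (abs(target - response) > abs(target - response - 2*pi)) {
  response <- response + 2*pi
}
response - target  # a small positive error (0.15), not roughly -6.13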
### Prepare data for JAGS

```{r prepare data for jags}
data.jags <- list("s", "y", "nTrials", "nStimuli")
bayes.mod.params <- c("mu", "sigma")
bayes.mod.inits <- function() {
  list("sigma" = runif(1)*2*pi)
}
```
### Run model fit

```{r run model fit}
bayes.mod.fit <- jags(data = data.jags,
                      inits = bayes.mod.inits,
                      parameters.to.save = bayes.mod.params,
                      n.chains = 8,
                      n.iter = 8000,
                      n.burnin = 2000,
                      model.file = paste0(getwd(), "/models/colorPerceptualReproduction_jags.txt"))
```
### Evaluate model
```{r evaluate model}
print(bayes.mod.fit)
```
```{r evaluate model as mcmc}
bayes.mod.fit.mcmc <- as.mcmc(bayes.mod.fit)
summary(bayes.mod.fit.mcmc)
```
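Beyond the summaries above, chain convergence can be checked with the Gelman-Rubin diagnostic from the already-loaded `coda` package; point estimates near 1 indicate the eight chains have mixed:

```r
# Gelman-Rubin R-hat for each monitored parameter;
# multivariate = FALSE reports the univariate diagnostic per parameter
gelman.diag(bayes.mod.fit.mcmc, multivariate = FALSE)
```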
### Plots

```{r plot model}
# Convert into a ggmcmc object
bayes.mod.fit.gg <- ggs(bayes.mod.fit.mcmc)
catplot <- ggs_caterpillar(bayes.mod.fit.gg,
                           thick_ci = c(0.45, 0.55),
                           family = "mu",
                           sort = FALSE)

# Show original caterpillar plot
catplot

# Update points to reflect stimulus colors
colors <- read_csv("colorRGBvalues.csv",
                   col_names = FALSE) %>%
  slice(-1) %>%    # Drop the header row that was read in as data
  type_convert()   # Re-parse the remaining rows as numeric
colnames(colors) <- c("R", "G", "B")
colors <- colors %>%
  mutate(hex = rgb(R, G, B))
catdata <- catplot$data %>%  # ggplot objects are S3 lists, so use $ rather than @
  mutate(color = colors$hex)
ggplot(data = catdata) +
  geom_segment(mapping = aes(x = low,
                             y = Parameter,
                             xend = high,
                             yend = Parameter),
               colour = catdata$color) +
  geom_point(mapping = aes(x = median,
                           y = Parameter),
             colour = catdata$color) +
  # Reference line for perfect recovery of the stimulus values
  geom_abline(intercept = 1,
              slope = 36/pi,
              linewidth = 1,
              alpha = 0.3) +
  theme_minimal() +
  theme(aspect.ratio = 1) +  # set after theme_minimal() so it is not overridden
  scale_x_continuous(limits = c(0, 2*pi))

# Posterior density of the shared sigma parameter
ggs_density(bayes.mod.fit.gg,
            family = "sigma")
```