---
hide:
  - navigation
---

# :edia-edia_rounded: EDIA

## A Unity XR toolbox for research

!!! note "EDIA provides a set of modules (Unity packages) that facilitate the design and conduct of experimental research in XR."

## Key features

<div class="grid cards" markdown>

- :simple-instructure:{ .lg .middle } __Structure your experiment__

    ---

    Define the information for your experiment at the level of granularity you need: from sessions to blocks to trials to single steps within trials. EDIA builds upon and extends the functionality of the [UXF — Unity Experiment Framework](https://github.com/immersivecognition/unity-experiment-framework/){:target="_blank"} (Brookes et al., [2020](https://github.com/immersivecognition/unity-experiment-framework/){:target="_blank"}) package.

    [:octicons-arrow-right-24: Getting started](gettingstarted.md)

- :material-folder-multiple-plus-outline:{ .lg .middle } __Manage it with config files__

    ---

    Use human-readable config files to provide and change central information for your experiment without recompiling or touching your Unity code.

    [:octicons-arrow-right-24: Overview](https://mind-body-emotion.notion.site/Config-Files-1cb03dd4773f8121b74ccd4b6a95ab7c){:target="_blank"}

- :material-eye:{ .lg .middle } __Unified eye tracking integration__

    ---

    EDIA integrates eye tracking for multiple headsets (HTC Vive, Varjo, Meta Quest Pro, Pico), parsing the eye tracker output for you at its original sampling rate and providing it via a standardized interface.

    [:octicons-arrow-right-24: :edia-Edia_Eye_Orange: `EDIA Eye`](https://mind-body-emotion.notion.site/EDIA-Eye-1e703dd4773f80ea8cfcd75bd87c004b){:target="_blank"}

- :material-remote:{ .lg .middle } __Remotely control mobile XR experiments__

    ---

    Control experiments running on a mobile XR headset (e.g., Meta Quest) from an external device and stream what the participant is seeing.

    [:octicons-arrow-right-24: :edia-Edia_Rcas_Cyan: `EDIA RCAS`](https://mind-body-emotion.notion.site/EDIA-Remote-9dfde97f593e4221bee4630ab3284d4e){:target="_blank"}

- :fontawesome-regular-pen-to-square:{ .lg .middle } __Automatically log relevant data__

    ---

    Use pre-configured tools to log different kinds of data: participant behavior and movements, experimental variables, eye tracking data, ...

    [:octicons-arrow-right-24: Getting started](https://mind-body-emotion.notion.site/Logging-the-results-1cb03dd4773f81b196b0f164eb1a67be){:target="_blank"}

- :material-remote:{ .lg .middle } __Synchronize with external data__

    ---

    Use the [LabStreamingLayer protocol](https://labstreaminglayer.org/){:target="_blank"} to synchronize your experiment with other data streams (e.g., EEG, fNIRS, ...).

    [:octicons-arrow-right-24: :edia-Edia_Lsl_Yellow: `EDIA LSL`](https://mind-body-emotion.notion.site/EDIA-LSL-d0a26b0a043d408dbe353221242296b6){:target="_blank"}

</div>

## Getting started

Read our [getting started guide](gettingstarted.md).
To install the EDIA modules into your Unity project, we recommend using our `EDIA Installer` ([:fontawesome-solid-download: download](https://github.com/edia-toolbox/edia_installer/releases/latest/download/EdiaInstaller.unitypackage), [:fontawesome-solid-book: README](https://github.com/edia-toolbox/edia_installer){:target="_blank"}, [:fontawesome-brands-github: source](https://github.com/edia-toolbox/edia_installer){:target="_blank"}).

<div id="edia-modules"></div>

## Modules

=== ":edia-edia_rounded: EDIA Core"

    [{ width=100 align=left }](https://github.com/edia-toolbox/edia_core){:target="_blank"}

    * The 🖤 heart of the EDIA toolbox.
    * Structure your experiment: `Sessions` <> `Blocks` <> `Trials` <> `Trial Steps`
    * Use logically nested config files (JSON) to manage the compiled experiment.
    * Send messages to the XR user.
    * Experimenter interface.
    * Download from :fontawesome-brands-github: → [:edia-edia_rounded: `EDIA Core`](https://github.com/edia-toolbox/edia_core){:target="_blank"}
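
    The session → block → trial nesting can be pictured with a config sketch like the following (the field names here are a hypothetical illustration, not the actual EDIA Core schema — see the repository for the real format):

    ```json
    {
      "session": {
        "id": "S01",
        "blocks": [
          {
            "name": "practice",
            "trials": [
              { "condition": "A", "steps": ["fixation", "stimulus", "response"] }
            ]
          }
        ]
      }
    }
    ```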


=== ":edia-Edia_Lsl_Yellow: EDIA LSL"

    [{ width=100 align=left }](https://github.com/edia-toolbox/edia_lsl){:target="_blank"}

    1. Use the [LabStreamingLayer](https://labstreaminglayer.org/){:target="_blank"} protocol to synchronize your data.
    2. The :edia-Edia_Lsl_Yellow: `EDIA LSL` module is a convenience wrapper and extension of the [LSL4Unity](https://github.com/labstreaminglayer/LSL4Unity){:target="_blank"} package.
    3. It provides prefabs and scripts which allow you to
        1. send precisely timed triggers with a single command, and
        2. stream data (e.g., eye tracking or movement data) in world or local coordinates.
    4. Download from :fontawesome-brands-github: → [:edia-Edia_Lsl_Yellow: `EDIA LSL`](https://github.com/edia-toolbox/edia_lsl){:target="_blank"}
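
    The "single command" trigger workflow can be sketched with a minimal marker outlet. Note this uses the raw liblsl C# binding that LSL4Unity ships, not the EDIA LSL prefabs themselves (whose exact API is documented in the repository), so treat it as an assumption-laden sketch:

    ```csharp
    using LSL; // liblsl C# binding, bundled with LSL4Unity

    public class MarkerSender
    {
        private readonly liblsl.StreamOutlet outlet;

        public MarkerSender()
        {
            // Single-channel, irregular-rate string stream: the conventional LSL marker stream.
            var info = new liblsl.StreamInfo(
                "ExperimentMarkers", "Markers", 1,
                liblsl.IRREGULAR_RATE, liblsl.channel_format_t.cf_string,
                "edia-markers-uid");
            outlet = new liblsl.StreamOutlet(info);
        }

        public void SendTrigger(string marker)
        {
            // One call per event; liblsl timestamps the sample on push.
            outlet.push_sample(new[] { marker });
        }
    }
    ```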

=== ":edia-Edia_Rcas_Cyan: EDIA RCAS"

    [{ width=100 align=left }](https://github.com/edia-toolbox/edia_rcas){:target="_blank"}

    1. <span style="color: cyan;">**R**</span>emote <span style="color: cyan;">**C**</span>ontrol <span style="color: cyan;">**A**</span>nd <span style="color: cyan;">**S**</span>treaming.
    2. Interface with your experiment while it runs on a mobile headset.
    3. Load config files from another device.
    4. Send commands to proceed in the experiment.
    5. Stream the headset view to the experimenter's device (*limited).
    6. Download from :fontawesome-brands-github: → [:edia-Edia_Rcas_Cyan: `EDIA RCAS`](https://github.com/edia-toolbox/edia_rcas){:target="_blank"}

=== ":edia-Edia_Eye_Orange: EDIA Eye"

    [{ width=100 align=left }](https://github.com/edia-toolbox/edia_eye){:target="_blank"}

    * Central :eye: eye tracking package of the EDIA toolbox.
    * Provides a unified interface for accessing eye tracking data.
    * Example visualizations of eye tracking data: gaze, eye openness, ...
    * Interface for logging eye tracking data via [:edia-Edia_Lsl_Yellow: `EDIA LSL`](https://github.com/edia-toolbox/edia_lsl){:target="_blank"} or writing it to disk.
    * Download from :fontawesome-brands-github: → [:edia-Edia_Eye_Orange: `EDIA Eye`](https://github.com/edia-toolbox/edia_eye){:target="_blank"}


=== ":edia-Edia_Eye_Orange: EDIA Eye Submodules"

    { width=100 align=left }

    Separate submodules (packages) parse the eye tracking data from each supported device into the unified EDIA eye tracking structure.

    **Supported headsets:**

    <div class="grid" markdown>

    [:fontawesome-brands-meta: Meta Quest Pro](https://github.com/edia-toolbox/edia_eye_quest){:target="_blank"}
    { .card }

    [:brands-vive: HTC Vive Pro Eye](https://github.com/edia-toolbox/edia_eye_vive){:target="_blank"}
    { .card }

    [:brands-varjo_bg: Varjo Aero](https://github.com/edia-toolbox/edia_eye_varjo){:target="_blank"}
    { .card }

    [:brands-pico_bg: PICO 4 Enterprise](https://github.com/edia-toolbox/edia_eye_pico){:target="_blank"}
    { .card }

    </div>

## Get involved

If you want to use EDIA or start contributing, please [reach out to us](contact.md).