Commit e1ef95b

Add citation with publication
1 parent 9aaa88e commit e1ef95b

File tree

1 file changed: +24 −0 lines changed

1 file changed

+24
-0
lines changed

README.md

Lines changed: 24 additions & 0 deletions
@@ -2,6 +2,30 @@
 
 Author: Evan Murray
 
+## Citation
+
+[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.15270938.svg)](https://doi.org/10.5281/zenodo.15270938)
+```
+@inproceedings{murray_sckinect_2025,
+  address = {Johns Hopkins University Bloomberg Center},
+  title = {{SCKinect}: {A} {SuperCollider} plugin for the {Kinect}},
+  copyright = {Creative Commons Attribution 4.0 International},
+  shorttitle = {{SCKinect}},
+  url = {https://zenodo.org/doi/10.5281/zenodo.15270938},
+  doi = {10.5281/ZENODO.15270938},
+  abstract = {SCKinect is a SuperCollider plugin that allows users to interact with a Kinect sensor. Its core implementation contains a unit generator called 'Kinect', designed to output motion-tracking data to control buses. The plugin also includes commands, facilitating interaction with Kinect devices through the interpreter. The interpreted nature of SuperCollider and the server-language duality allow multimedia enthusiasts to efficiently communicate with technical rendering systems. This is perfect for live performances and interactive installations. With the addition of this plugin, performers can interact with the Kinect directly in SuperCollider with low latency. This paper will cover the implementation of the plugin and its potential applications.},
+  language = {en},
+  urldate = {2025-04-23},
+  publisher = {Zenodo},
+  author = {Murray, Evan and von Coler, Henrik},
+  month = apr,
+  year = {2025},
+  note = {Publisher: Zenodo. Version Number: 1.0.0},
+  keywords = {Audio-visual presentation, Computer vision, Feedback, Sensory, Machine Learning/classification, Motion Capture},
+}
+```
+
 ## What is This Project?
 
 SCKinect is a bridge between physical movement and sound generation. It allows you to:
