Portions of this ongoing work have been published as "Endowing Neural Language Learners with Human-like Biases: A Case Study on Dependency Length Minimization" at LREC-COLING 2024 and presented as a poster, "Neural-agent Language Learning and Communication: Emergence of Dependency Length Minimization", at CogSci 2024.
The implementation is based on the NeLLCom framework and EGG toolkit.
- Install the EGG toolkit;
- Move to the EGG game design folder:

```
cd EGG/egg/zoo
```

- Clone the current repo into the EGG game design folder:

```
git clone https://github.com/yuqing0304/DLM_exp.git
cd DLM_exp
```

- Then we can run a game, for example, communicating with an impatient listener (`Impa`) with the RNN architecture (`rnn`) using the verb-final subject-modified language (`finalSM`) of the half meaning space (`half`):

```
cd DLM_exp_main2/DLM_halffinalSMrnnImpa
sbatch run.sh
```
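The game folder name encodes the experimental configuration. A minimal sketch of the naming pattern, with the component order inferred from the single example above (this helper is illustrative, not part of the repo):

```python
# Hypothetical helper illustrating the game-folder naming pattern
# inferred from the example above (half + finalSM + rnn + Impa).
def game_folder(meaning_space: str, language: str, architecture: str, listener: str) -> str:
    """Compose a DLM game folder name, e.g. 'DLM_halffinalSMrnnImpa'."""
    return f"DLM_{meaning_space}{language}{architecture}{listener}"

print(game_folder("half", "finalSM", "rnn", "Impa"))  # -> DLM_halffinalSMrnnImpa
```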
- `speaker_hidden_size`/`listener_hidden_size`: size of the hidden layers in the speaker/listener networks.
- `meaning_embedding_dim`/`listener_embedding_size`: embedding size in the speaker/listener networks.
- `word_dropout_p`: word dropout rate for the input, interpreted as noise.
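As a sketch of how these hyperparameters might be exposed on the command line (the argument names follow the list above, but the default values are placeholders, not the repo's actual settings):

```python
import argparse

# Illustrative parser for the hyperparameters listed above;
# the default values are placeholders, not the repo's actual settings.
def get_params() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser()
    parser.add_argument("--speaker_hidden_size", type=int, default=128,
                        help="Size of the hidden layers in the speaker network")
    parser.add_argument("--listener_hidden_size", type=int, default=128,
                        help="Size of the hidden layers in the listener network")
    parser.add_argument("--meaning_embedding_dim", type=int, default=32,
                        help="Embedding size in the speaker network")
    parser.add_argument("--listener_embedding_size", type=int, default=32,
                        help="Embedding size in the listener network")
    parser.add_argument("--word_dropout_p", type=float, default=0.0,
                        help="Word dropout rate for the input, interpreted as noise")
    return parser

if __name__ == "__main__":
    print(get_params().parse_args())
```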
If you find this study useful in your research, please cite this paper:
```bibtex
@inproceedings{zhang-etal-2024-endowing,
    title = "Endowing Neural Language Learners with Human-like Biases: A Case Study on Dependency Length Minimization",
    author = "Zhang, Yuqing and
      Verhoef, Tessa and
      van Noord, Gertjan and
      Bisazza, Arianna",
    editor = "Calzolari, Nicoletta and
      Kan, Min-Yen and
      Hoste, Veronique and
      Lenci, Alessandro and
      Sakti, Sakriani and
      Xue, Nianwen",
    booktitle = "Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)",
    month = may,
    year = "2024",
    address = "Torino, Italia",
    publisher = "ELRA and ICCL",
    url = "https://aclanthology.org/2024.lrec-main.516/",
    pages = "5819--5832"
}
```