
Publications

Chapter in an edited book

Linzbach, Stephan, Dimitar Dimitrov, Laura Kallmeyer, Kilian Evang, Hajira Jabeen, and Stefan Dietze. 2024. "The Impact of Prompt Syntax and Supplementary Information on Knowledge Retrieval from Pretrained Language Models." In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2024), Mexico City, Mexico: Association for Computational Linguistics. https://arxiv.org/html/2404.01992v1.

Linzbach, Stephan, Tim Tressel, Laura Kallmeyer, Stefan Dietze, and Hajira Jabeen. 2023. "Decoding Prompt Syntax: Analysing its Impact on Knowledge Retrieval in Large Language Models." In WWW '23 Companion: Companion Proceedings of the ACM Web Conference 2023, edited by Ying Ding, Jie Tang, Juan Sequeda, Lora Aroyo, Carlos Castillo, and Geert-Jan Houben, 1145–1149. New York: ACM. https://doi.org/10.1145/3543873.3587655.

Working and discussion paper

Dietze, Stefan, Hajira Jabeen, Laura Kallmeyer, and Stephan Linzbach. 2023. "Towards syntax-aware pretraining and prompt engineering for knowledge retrieval from large language models." In Joint Proceedings of the KBC-LM Workshop and the LM-KBC Challenge @ ISWC 2023, CEUR Workshop Proceedings Vol-3577. urn:nbn:de:0074-3577-1.

Presentation at a conference

Biswas, Debanjali, Stephan Linzbach, Dimitar Dimitrov, Hajira Jabeen, and Stefan Dietze. 2023. "Broadening BERT vocabulary for Knowledge Graph Construction using Wikipedia2Vec: Poster Presentation." In Joint Proceedings of the KBC-LM Workshop and the LM-KBC Challenge @ ISWC 2023, 2023-11-06. https://www.ceur-ws.org/Vol-3577/paper6.pdf.