Lukas Lange
Bosch Center for Artificial Intelligence
Verified email at de.bosch.com
Title
Cited by
Year
A survey on recent approaches for natural language processing in low-resource scenarios
MA Hedderich, L Lange, H Adel, J Strötgen, D Klakow
arXiv preprint arXiv:2010.12309, 2020
333 · 2020
The SOFC-exp corpus and neural approaches to information extraction in the materials science domain
A Friedrich, H Adel, F Tomazic, J Hingerl, R Benteau, A Maruscyk, ...
arXiv preprint arXiv:2006.03039, 2020
82 · 2020
Adversarial alignment of multilingual models for extracting temporal expressions from text
L Lange, A Iurshina, H Adel, J Strötgen
arXiv preprint arXiv:2005.09392, 2020
36 · 2020
DelucionQA: Detecting hallucinations in domain-specific question answering
M Sadat, Z Zhou, L Lange, J Araki, A Gundroo, B Wang, RR Menon, ...
arXiv preprint arXiv:2312.05200, 2023
31 · 2023
NLNDE: The neither-language-nor-domain-experts' way of Spanish medical document de-identification
L Lange, H Adel, J Strötgen
arXiv preprint arXiv:2007.01030, 2020
25 · 2020
ANEA: distant supervision for low-resource named entity recognition
MA Hedderich, L Lange, D Klakow
arXiv preprint arXiv:2102.13129, 2021
23* · 2021
CLIN-X: pre-trained language models and a study on cross-task transfer for concept extraction in the clinical domain
L Lange, H Adel, J Strötgen, D Klakow
Bioinformatics 38 (12), 3267-3274, 2022
17 · 2022
KRAUTS: A German temporally annotated news corpus
J Strötgen, AL Minard, L Lange, M Speranza, B Magnini
Proceedings of the Eleventh International Conference on Language Resources …, 2018
16 · 2018
Feature-dependent confusion matrices for low-resource NER labeling with noisy labels
L Lange, MA Hedderich, D Klakow
arXiv preprint arXiv:1910.06061, 2019
14 · 2019
NLNDE at SemEval-2023 Task 12: Adaptive pretraining and source language selection for low-resource multilingual sentiment analysis
M Wang, H Adel, L Lange, J Strötgen, H Schütze
arXiv preprint arXiv:2305.00090, 2023
13 · 2023
SwitchPrompt: Learning domain-specific gated soft prompts for classification in low-resource domains
K Goswami, L Lange, J Araki, H Adel
arXiv preprint arXiv:2302.06868, 2023
11 · 2023
Boosting transformers for job expression extraction and classification in a low-resource setting
L Lange, H Adel, J Strötgen
arXiv preprint arXiv:2109.08597, 2021
11 · 2021
To share or not to share: Predicting sets of sources for model transfer learning
L Lange, J Strötgen, H Adel, D Klakow
arXiv preprint arXiv:2104.08078, 2021
11 · 2021
NLNDE: enhancing neural sequence taggers with attention and noisy channel for robust pharmacological entity detection
L Lange, H Adel, J Strötgen
arXiv preprint arXiv:2007.01022, 2020
11 · 2020
Closing the gap: Joint de-identification and concept extraction in the clinical domain
L Lange, H Adel, J Strötgen
arXiv preprint arXiv:2005.09397, 2020
11 · 2020
FAME: Feature-based adversarial meta-embeddings for robust input representations
L Lange, H Adel, J Strötgen, D Klakow
arXiv preprint arXiv:2010.12305, 2020
10* · 2020
Multilingual normalization of temporal expressions with masked language models
L Lange, J Strötgen, H Adel, D Klakow
arXiv preprint arXiv:2205.10399, 2022
8 · 2022
On the choice of auxiliary languages for improved sequence tagging
L Lange, H Adel, J Strötgen
arXiv preprint arXiv:2005.09389, 2020
8 · 2020
Rehearsal-Free Modular and Compositional Continual Learning for Language Models
M Wang, H Adel, L Lange, J Strötgen, H Schütze
arXiv preprint arXiv:2404.00790, 2024
7 · 2024
NLNDE at CANTEMIST: neural sequence labeling and parsing approaches for clinical concept extraction
L Lange, X Dai, H Adel, J Strötgen
arXiv preprint arXiv:2010.12322, 2020
6 · 2020
Articles 1–20