Giovanni Monea

Do Llamas Work in English? On the Latent Language of Multilingual Transformers
C Wendler, V Veselovsky, G Monea, R West
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics, 2024
Cited by: 78 · Year: 2024

PaSS: Parallel Speculative Sampling
G Monea, A Joulin, E Grave
arXiv preprint arXiv:2311.13581, 2023
Cited by: 34 · Year: 2023

A Glitch in the Matrix? Locating and Detecting Language Model Grounding with Fakepedia
G Monea, M Peyrard, M Josifoski, V Chaudhary, J Eisner, E Kıcıman, ...
arXiv preprint arXiv:2312.02073, 2023
Cited by: 9 · Year: 2023

How Do Llamas Process Multilingual Text? A Latent Exploration Through Activation Patching
C Dumas, V Veselovsky, G Monea, R West, C Wendler
ICML 2024 Workshop on Mechanistic Interpretability, 2024
Cited by: 6 · Year: 2024

LLMs Are In-Context Reinforcement Learners
G Monea, A Bosselut, K Brantley, Y Artzi
Cited by: 5 · Year: 2024

Separating Tongue from Thought: Activation Patching Reveals Language-Agnostic Concept Representations in Transformers
C Dumas, C Wendler, V Veselovsky, G Monea, R West
arXiv preprint arXiv:2411.08745, 2024
Cited by: 1 · Year: 2024

Controllable Context Sensitivity and the Knob Behind It
J Minder, K Du, N Stoehr, G Monea, C Wendler, R West, R Cotterell
arXiv preprint arXiv:2411.07404, 2024
Year: 2024