Integrating Multimodal Information in Large Pretrained Transformers

Recent Transformer-based contextual word representations, including BERT and XLNet, have shown state-of-the-art performance in multiple disciplines within NLP. Fine-tuning the trained contextual models on task-specific datasets has been the key to achieving superior performance downstream. While fin...

Bibliographic Details
Published in: Proc Conf Assoc Comput Linguist Meet
Main Authors: Rahman, Wasifur; Hasan, Md. Kamrul; Lee, Sangwu; Zadeh, Amir; Mao, Chengfeng; Morency, Louis-Philippe; Hoque, Ehsan
Format: Article
Language: English
Published: 2020
Subjects:
Online Access: https://ncbi.nlm.nih.gov/pmc/articles/PMC8005298/
https://ncbi.nlm.nih.gov/pubmed/33782629
http://dx.doi.org/10.18653/v1/2020.acl-main.214