Conference item

Information extraction from Swedish medical prescriptions with sig-transformer encoder

Abstract:

Relying on large pretrained language models such as Bidirectional Encoder Representations from Transformers (BERT) for encoding and adding a simple prediction layer has led to impressive performance in many clinical natural language processing (NLP) tasks. In this work, we present a novel extension to the Transformer architecture by incorporating the signature transform into the self-attention model. This architecture is added between the embedding and prediction layers. Experiments on a new Swedish...
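The abstract describes a signature-transform block combined with self-attention, inserted between the embedding layer and the prediction layer. The sketch below is a minimal, hypothetical illustration of that idea, assuming PyTorch and the signatory package; the class name SigSelfAttentionBlock, the depth-2 signature, and the projection back to model width are illustrative choices, not the architecture specified in the paper.

```python
# Minimal, hypothetical sketch: self-attention followed by a signature
# transform, sitting between an embedding layer (e.g. BERT output) and a
# token-level prediction layer. Assumes PyTorch and the `signatory`
# package; this is an illustration, not the paper's exact architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F
import signatory


class SigSelfAttentionBlock(nn.Module):
    def __init__(self, d_model: int, n_heads: int = 4, sig_depth: int = 2):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.sig_depth = sig_depth
        # Size of a depth-`sig_depth` signature of a d_model-dimensional path.
        sig_dim = signatory.signature_channels(d_model, sig_depth)
        self.proj = nn.Linear(sig_dim, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) token embeddings.
        attended, _ = self.attn(x, x, x)
        # Treat the attended sequence as a path; stream=True returns the
        # signature of every prefix, i.e. one vector per position except
        # the first, with shape (batch, seq_len - 1, sig_dim).
        sig = signatory.signature(attended, self.sig_depth, stream=True)
        # Pad one step at the front so the sequence length is preserved.
        sig = F.pad(sig, (0, 0, 1, 0))
        return self.proj(sig)  # (batch, seq_len, d_model)


if __name__ == "__main__":
    block = SigSelfAttentionBlock(d_model=64)
    tokens = torch.randn(2, 10, 64)   # stand-in for embedded prescription text
    print(block(tokens).shape)        # torch.Size([2, 10, 64])
```

Taking the signature with stream=True keeps one vector per token position, so a downstream token-level prediction layer can consume the output exactly as it would consume plain encoder states.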

Publication status:
Published
Peer review status:
Peer reviewed

Publisher copy:
10.18653/v1/2020.clinicalnlp-1.5

Authors


Institution:
University of Oxford
Division:
MPLS
Department:
Mathematical Institute
Role:
Author

Institution:
University of Oxford
Division:
MSD
Department:
Psychiatry
Role:
Author

Institution:
University of Oxford
Division:
MPLS
Department:
Mathematical Institute
Role:
Author

Institution:
University of Oxford
Division:
MSD
Department:
Psychiatry
Role:
Author
Funder
Name:
Medical Research Council
Grant:
MC_PC_17215
Publisher:
Association for Computational Linguistics
Journal:
ACL Anthology
Pages:
41-54
Publication date:
2020-11-01
Acceptance date:
2020-09-29
Event title:
3rd Clinical Natural Language Processing Workshop (ClinicalNLP 2020)
Event website:
https://clinical-nlp.github.io/2020/
Event start date:
2020-11-19
Event end date:
2020-11-19
DOI:
10.18653/v1/2020.clinicalnlp-1.5
Language:
English
Keywords:
Pubs id:
1138000
Local pid:
pubs:1138000
Deposit date:
2020-10-16
