
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
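To make the idea concrete, below is a minimal sketch of skip-gram with negative sampling (the training objective behind Word2Vec) on a toy corpus. The corpus, vocabulary handling, and all hyperparameters here are illustrative assumptions, not the reference implementation; real systems train on large corpora with subsampling and optimized code.

```python
# Minimal sketch of skip-gram with negative sampling (SGNS).
# Toy corpus and hyperparameters are illustrative only.
import numpy as np

corpus = ("the cat sat on the mat the dog sat on the rug "
          "the cat chased the dog").split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, dim, window, neg_k = len(vocab), 16, 2, 5

rng = np.random.default_rng(0)
W_in = rng.normal(0, 0.1, (V, dim))   # center-word vectors (the embeddings)
W_out = rng.normal(0, 0.1, (V, dim))  # context-word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.05
for epoch in range(200):
    for pos, word in enumerate(corpus):
        c = idx[word]
        lo, hi = max(0, pos - window), min(len(corpus), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            o = idx[corpus[ctx_pos]]
            # one observed (positive) pair plus neg_k random negatives
            targets = [o] + list(rng.integers(0, V, neg_k))
            labels = [1.0] + [0.0] * neg_k
            for t, label in zip(targets, labels):
                score = sigmoid(W_in[c] @ W_out[t])
                grad = score - label          # d(loss)/d(score)
                g_out = grad * W_in[c]
                W_in[c] -= lr * grad * W_out[t]
                W_out[t] -= lr * g_out

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words appearing in similar contexts should receive similar vectors.
sim = cosine(W_in[idx["cat"]], W_in[idx["dog"]])
```

After training, `W_in` is the embedding matrix: each row is a dense real-valued vector for one vocabulary word, and similarity between words can be measured by cosine similarity between their rows.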

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1781–1790 of 4002 papers

Title | Hype
Recognizing UMLS Semantic Types with Deep Learning | 0
Neural Dependency Parsing of Biomedical Text: TurkuNLP entry in the CRAFT Structural Annotation Task | 0
Contextualized context2vec | 0
Context-Aware Neural Machine Translation Decoding | 0
Comparing the Intrinsic Performance of Clinical Concept Embeddings by Their Field of Medicine | 0
VSP at PharmaCoNER 2019: Recognition of Pharmacological Substances, Compounds and Proteins with Recurrent Neural Networks in Spanish Clinical Cases | 0
A Margin-based Loss with Synthetic Negative Samples for Continuous-output Machine Translation | 0
BOUN-ISIK Participation: An Unsupervised Approach for the Named Entity Normalization and Relation Extraction of Bacteria Biotopes | 0
BioReddit: Word Embeddings for User-Generated Biomedical NLP | 0
IxaMed at PharmacoNER Challenge 2019 | 0
Page 179 of 401

No leaderboard results yet.