
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
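
As a concrete illustration of this word-to-vector mapping, here is a minimal sketch in Python; the vocabulary, dimensionality, and (random) vectors are hypothetical stand-ins for what a trained model would actually learn.

```python
import numpy as np

# Toy illustration of the word -> vector mapping: each word in a small
# hypothetical vocabulary is assigned a dense real-valued vector.
# A trained model would learn these vectors; here they are random.
vocab = ["king", "queen", "man", "woman"]   # assumed toy vocabulary
dim = 4                                     # assumed embedding dimension
rng = np.random.default_rng(0)
embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine(u, v):
    # Cosine similarity: the standard way to compare embedding vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))
```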

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches trained on an NLP task such as language modeling or document classification.
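
As one hedged example of such a technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (an assumed choice; the page does not prescribe an implementation). The corpus and hyperparameters are toy values, and gensim ≥ 4.0 is assumed for the vector_size argument.

```python
from gensim.models import Word2Vec

# A tiny toy corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

# Skip-gram Word2Vec (sg=1); all hyperparameters are illustrative.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=100)

vector = model.wv["queen"]               # learned 50-dimensional vector
print(model.wv.most_similar("queen"))    # nearest neighbours in embedding space
```

With a realistic corpus, nearest neighbours in this learned space tend to group semantically related words, which is what the evaluation papers listed below probe.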

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1581–1590 of 4002

All papers on this page have a Hype score of 0:

- Finki at SemEval-2016 Task 4: Deep Learning Architecture for Twitter Sentiment Analysis
- Community Evaluation and Exchange of Word Vectors at wordvectors.org
- Firearms and Tigers are Dangerous, Kitchen Knives and Zebras are Not: Testing whether Word Embeddings Can Tell
- First Bilingual Word Embeddings for te reo Māori and English: Towards Code-switching Detection in a Low-resourced setting
- Fitting Semantic Relations to Word Embeddings
- FKIE_itf_2021 at CASE 2021 Task 1: Using Small Densely Fully Connected Neural Nets for Event Detection and Clustering
- Detecting Cross-Lingual Plagiarism Using Simulated Word Embeddings
- Flexible and Scalable State Tracking Framework for Goal-Oriented Dialogue Systems
- BERT's Conceptual Cartography: Mapping the Landscapes of Meaning
- Affordance Extraction and Inference based on Semantic Role Labeling

No leaderboard results yet.