SOTAVerified

Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
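As a minimal sketch of the idea (not any specific paper's method), the snippet below builds count-based embeddings for a toy corpus: it counts word co-occurrences in a small context window and factorizes the matrix with truncated SVD, an LSA-style approach that, like Word2Vec or GloVe, maps each word to a dense real-valued vector. The corpus, window size, and embedding dimension are illustrative choices.

```python
import numpy as np

# Toy corpus; each sentence is a list of tokens (illustrative data).
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]

# Build the vocabulary and a word-to-index map.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a symmetric context window of 2.
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

# Factorize with truncated SVD to get dense embeddings (LSA-style).
U, S, _ = np.linalg.svd(counts)
dim = 4  # embedding dimensionality, chosen arbitrarily for the toy example
emb = U[:, :dim] * S[:dim]  # one dense vector per vocabulary word

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

# Words appearing in similar contexts end up with similar vectors.
sim = cosine(emb[idx["cat"]], emb[idx["dog"]])
```

Neural methods such as Word2Vec learn comparable vectors by gradient descent on a prediction task rather than by explicit factorization, but the output is the same kind of object: a real-valued vector per word.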

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1581-1590 of 4002 papers

Title | Status | Hype
Finki at SemEval-2016 Task 4: Deep Learning Architecture for Twitter Sentiment Analysis | | 0
Extrapolating Binder Style Word Embeddings to New Words | | 0
Firearms and Tigers are Dangerous, Kitchen Knives and Zebras are Not: Testing whether Word Embeddings Can Tell | | 0
First Bilingual Word Embeddings for te reo Māori and English: Towards Code-switching Detection in a Low-resourced setting | | 0
Combining Acoustics, Content and Interaction Features to Find Hot Spots in Meetings | | 0
FKIE_itf_2021 at CASE 2021 Task 1: Using Small Densely Fully Connected Neural Nets for Event Detection and Clustering | | 0
Extractive Summarization using Continuous Vector Space Models | | 0
Extracting UMLS Concepts from Medical Text Using General and Domain-Specific Deep Learning Models | | 0
Combination of Domain Knowledge and Deep Learning for Sentiment Analysis of Short and Informal Messages on Social Media | | 0
A Sense-Topic Model for Word Sense Induction with Unsupervised Data Enrichment | | 0
Page 159 of 401

No leaderboard results yet.