
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
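For illustration, here is a minimal sketch of learning word embeddings with the skip-gram variant of Word2Vec via the gensim library (assumes gensim >= 4.0; the toy corpus and hyperparameters are illustrative assumptions, not taken from any paper listed below):

```python
# Minimal Word2Vec (skip-gram) sketch using gensim >= 4.0.
# The corpus here is a toy example; real training uses large tokenized corpora.
from gensim.models import Word2Vec

sentences = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "mat"],
]

# sg=1 selects skip-gram; vector_size sets the embedding dimensionality.
model = Word2Vec(
    sentences,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=200,
)

vec = model.wv["king"]        # each word maps to a 50-dimensional real-valued vector
print(vec.shape)              # (50,)

# Nearest neighbors by cosine similarity in the learned vector space.
print(model.wv.most_similar("cat", topn=2))
```

After training, semantically related words (here, "cat" and "dog") should end up with nearby vectors, which is the property downstream NLP tasks exploit.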

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 561–570 of 4002 papers

Title | Status | Hype
Arabic POS Tagging: Don't Abandon Feature Engineering Just Yet | | 0
Arabic aspect sentiment polarity classification using BERT | | 0
A Linear Dynamical System Model for Text | | 0
A Question Answering Approach for Emotion Cause Extraction | | 0
Adapted Sentiment Similarity Seed Words For French Tweets' Polarity Classification | | 0
Misinforming LLMs: vulnerabilities, challenges and opportunities | | 0
Breaking Down Word Semantics from Pre-trained Language Models through Layer-wise Dimension Selection | | 0
Alignment-free Cross-lingual Semantic Role Labeling | | 0
News and Load: A Quantitative Exploration of Natural Language Processing Applications for Forecasting Day-ahead Electricity System Demand | | 0
A Case Study to Reveal if an Area of Interest has a Trend in Ongoing Tweets Using Word and Sentence Embeddings | | 0
Page 57 of 401

No leaderboard results yet.