
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
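
As an illustration of the idea (not tied to any particular embedding method; the three-dimensional vectors below are invented for the example), the sketch maps a few words to real-valued vectors and compares them with cosine similarity, the standard measure of closeness in embedding space:

```python
import numpy as np

# Toy 3-dimensional embeddings; real models use tens to hundreds of
# dimensions and learn the values from data. These numbers are made up.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.15]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Semantically related words end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.997)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low  (~0.31)
```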

Techniques for learning word embeddings include Word2Vec, which trains a shallow neural network on a context-prediction task, and GloVe, which factorizes global word co-occurrence statistics; embeddings can also be learned as a by-product of training a neural network on an NLP task such as language modeling or document classification.
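
As a concrete example, a Word2Vec model can be trained in a few lines with the gensim library (a minimal sketch assuming gensim 4.x; the toy corpus and parameter values are placeholders, not a recommended configuration):

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences. A real corpus would
# contain millions of sentences rather than this placeholder.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "builds", "vectors", "from", "cooccurrence", "counts"],
]

# Train a skip-gram Word2Vec model (sg=1); vector_size is the embedding
# dimensionality, window the context size. min_count=1 keeps every token
# of this tiny corpus in the vocabulary.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)

vector = model.wv["embeddings"]          # the learned 50-dimensional vector
similar = model.wv.most_similar("word")  # nearest neighbors by cosine similarity
print(vector.shape, similar[:3])
```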

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2741–2750 of 4002 papers

Title | Hype
Word2net: Deep Representations of Language | 0
Word2rate: training and evaluating multiple word embeddings as statistical transitions | 0
Word2Sense: Sparse Interpretable Word Embeddings | 0
Word-Alignment-Based Segment-Level Machine Translation Evaluation using Word Embeddings | 0
Word and Document Embeddings based on Neural Network Approaches | 0
Word and Phrase Features in Graph Convolutional Network for Automatic Question Classification | 0
Word associations and the distance properties of context-aware word embeddings | 0
Word-Context Character Embeddings for Chinese Word Segmentation | 0
WordDecipher: Enhancing Digital Workspace Communication with Explainable AI for Non-native English Speakers | 0
Word Definitions from Large Language Models | 0
Page 275 of 401

No leaderboard results yet.