
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
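A minimal sketch of how learned embeddings are typically used: each word maps to a dense real-valued vector, and semantic relatedness is scored with cosine similarity. The 3-dimensional vectors below are made-up illustrative values, not output from Word2Vec or GloVe, which learn vectors of hundreds of dimensions from large corpora.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical toy embeddings for illustration only.
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```

In a real pipeline the `embeddings` dictionary would be replaced by vectors produced by a trained model; the similarity computation is unchanged.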

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1531–1540 of 4002 papers

Répliquer et étendre pour l'alsacien « Étiquetage en parties du discours de langues peu dotées par spécialisation des plongements lexicaux » (Replicating and extending for Alsatian: "POS tagging for low-resource languages by adapting word embeddings")
Étude sur le résumé comparatif grâce aux plongements de mots (Comparative summarization study using word embeddings)
Apprentissage de plongements de mots sur des corpus en langue de spécialité : une étude d'impact (Learning word embeddings on domain-specific corpora: an impact study)
Du bon usage d'ingrédients linguistiques spéciaux pour classer des recettes exceptionnelles (Using special linguistic ingredients to classify exceptional recipes)
Word Sense Distance in Human Similarity Judgements and Contextualised Word Embeddings
BERT-based Ensembles for Modeling Disclosure and Support in Conversational Social Media Text
Data Augmentation with Unsupervised Machine Translation Improves the Structural Similarity of Cross-lingual Word Embeddings
Quasi-orthonormal Encoding for Machine Learning Applications
InfiniteWalk: Deep Network Embeddings as Laplacian Embeddings with a Nonlinearity
TIME: Text and Image Mutual-Translation Adversarial Networks
Page 154 of 401
