
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
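As a minimal sketch of this idea (assuming only numpy; the vocabulary and vector values below are made up for illustration, not learned from data), each word maps to a dense real-valued vector, and semantic similarity can be measured as the cosine of the angle between vectors:

```python
import numpy as np

# Illustrative only: in practice these vectors are learned from large corpora.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.16, 0.10]),
    "queen": np.array([0.54, 0.71, -0.12, 0.09]),
    "apple": np.array([-0.31, 0.02, 0.88, 0.45]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up close together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```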

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, often neural network-based, that train on an NLP task such as language modeling or document classification.
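As a hedged sketch of how one such technique is used in practice, the snippet below trains a small skip-gram Word2Vec model with the gensim library (assuming gensim 4.x); the toy corpus and hyperparameters are illustrative only, not a recommended configuration:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences. Real models are trained on
# corpora with millions to billions of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

# Look up the learned vector for a word and query its nearest neighbours.
vector = model.wv["king"]
print(model.wv.most_similar("king", topn=3))
```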

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 291–300 of 4,002 papers

Title | Status | Hype
Analyzing Continuous Semantic Shifts with Diachronic Word Similarity Matrices | Code | 0
Coreference Resolution System for Indonesian Text with Mention Pair Method and Singleton Exclusion using Convolutional Neural Network | Code | 0
Cross-domain Semantic Parsing via Paraphrasing | Code | 0
Cross-lingual Lexical Sememe Prediction | Code | 0
Data-driven models and computational tools for neurolinguistics: a language technology perspective | Code | 0
Contrastive Learning in Distilled Models | Code | 0
Analytical Methods for Interpretable Ultradense Word Embeddings | Code | 0
Contrastive Loss is All You Need to Recover Analogies as Parallel Lines | Code | 0
Contextual String Embeddings for Sequence Labeling | Code | 0
3D-EX: A Unified Dataset of Definitions and Dictionary Examples | Code | 0
