
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
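
To make the definition concrete, here is a minimal sketch of the idea: an embedding is just a lookup table from vocabulary items to dense real-valued vectors, and geometric closeness (e.g. cosine similarity) stands in for semantic similarity. The 4-dimensional vectors below are made up for illustration; real embeddings are learned from data and typically have 50 to 1000+ dimensions.

```python
import numpy as np

# Toy lookup table: word -> dense vector. These values are invented
# for demonstration only; real embeddings are learned from corpora.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.20]),
    "queen": np.array([0.78, 0.70, 0.15, 0.22]),
    "apple": np.array([0.10, 0.05, 0.90, 0.70]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; close to 1.0 means similar."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words are mapped to nearby vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```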

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
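
As a concrete example of the training route, the sketch below fits a small skip-gram Word2Vec model with the gensim library. The four-sentence corpus and all hyperparameters are illustrative only, and GloVe would instead be fit on global co-occurrence counts rather than trained this way.

```python
from gensim.models import Word2Vec

# Tiny hypothetical corpus: a list of tokenized sentences.
# A realistic corpus would contain millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["we", "ate", "an", "apple", "and", "a", "pear"],
    ["she", "ate", "a", "pear", "and", "an", "apple"],
]

# Skip-gram Word2Vec (sg=1): predict context words from the center word.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # embedding dimensionality (illustrative)
    window=2,         # context window size
    min_count=1,      # keep every word in this toy vocabulary
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,       # many passes, since the corpus is tiny
    seed=0,
)

# The learned vector for a word, and its nearest neighbors in the space.
print(model.wv["king"][:5])                  # first 5 dimensions
print(model.wv.most_similar("king", topn=3))
```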

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2261–2270 of 4002 papers

Title | Status | Hype
Improving Cross-Domain Chinese Word Segmentation with Word Embeddings | Code | 0
Russian Language Datasets in the Digital Humanities Domain and Their Evaluation with Word Embeddings | Code | 0
Using Word Embeddings for Visual Data Exploration with Ontodia and Wikidata | | 0
Relation Extraction Datasets in the Digital Humanities Domain and their Evaluation with Word Embeddings | | 0
Using natural language processing techniques to extract information on the properties and functionalities of energetic materials from large text corpora | Code | 0
Efficient Contextual Representation Learning Without Softmax Layer | | 0
A Framework for Decoding Event-Related Potentials from Text | | 0
Still a Pain in the Neck: Evaluating Text Representations on Lexical Composition | Code | 0
SuperTML: Two-Dimensional Word Embedding for the Precognition on Structured Tabular Data | Code | 0
Interpretable Structure-aware Document Encoders with Hierarchical Attention | | 0
Page 227 of 401

No leaderboard results yet.