
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
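
To make the mapping concrete, here is a minimal sketch (plain NumPy, with a toy vocabulary invented for illustration) of an embedding table as a word-to-vector lookup; the random vectors stand in for what training would learn:

```python
import numpy as np

# Toy vocabulary; in practice it is built from a training corpus.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

# Embedding matrix: one real-valued vector (row) per vocabulary word.
# Random here for illustration; training places related words close together.
rng = np.random.default_rng(seed=0)
embeddings = rng.normal(size=(len(vocab), 8))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector: a plain row lookup."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king").shape)                    # (8,)
print(cosine(embed("king"), embed("queen")))  # arbitrary for random vectors
```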

Techniques for learning word embeddings include the neural Word2Vec, the count-based GloVe, and approaches that learn embeddings as a by-product of training a neural network on an NLP task such as language modeling or document classification.
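
As a rough sketch of that workflow, the following trains a skip-gram Word2Vec model with the gensim library (assuming gensim >= 4.0; the three-sentence corpus is invented, so the learned neighbors are not meaningful):

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (real corpora are far larger).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "fits", "embeddings", "to", "cooccurrence", "counts"],
]

# Skip-gram model (sg=1); vector_size sets the embedding dimensionality.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=50,
    seed=0,
)

vector = model.wv["embeddings"]  # learned vector for one word
print(vector.shape)              # (50,)

# Nearest neighbors by cosine similarity in the learned space.
print(model.wv.most_similar("embeddings", topn=3))
```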

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1131–1140 of 4,002 papers

Title | Status | Hype
A Language-Based Approach to Fake News Detection Through Interpretable Features and BRNN | | 0
Multi-SimLex: A Large-Scale Evaluation of Multilingual and Crosslingual Lexical Semantic Similarity | | 0
Graph-based Syntactic Word Embeddings | | 0
Comparison between Voting Classifier and Deep Learning methods for Arabic Dialect Identification | | 0
Demonstration of a Literature Based Discovery System based on Ontologies, Semantic Filters and Word Embeddings for the Raynaud Disease-Fish Oil Rediscovery | | 0
MultiVitaminBooster at PARSEME Shared Task 2020: Combining Window- and Dependency-Based Features with Multilingual Contextualised Word Embeddings for VMWE Detection | | 0
Unmasking Contextual Stereotypes: Measuring and Mitigating BERT’s Gender Bias | Code | 1
Interdependencies of Gender and Race in Contextualized Word Embeddings | | 0
SMM4H Shared Task 2020 - A Hybrid Pipeline for Identifying Prescription Drug Abuse from Twitter: Machine Learning, Deep Learning, and Post-Processing | | 0
Argument from Old Man’s View: Assessing Social Bias in Argumentation | | 0

No leaderboard results yet.