
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
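To make the definition concrete, the following Python sketch maps a tiny vocabulary to vectors and compares words by cosine similarity. The words, the 3-dimensional vectors, and their values are made-up toy assumptions for illustration, not learned embeddings:

```python
import numpy as np

# Toy lookup table: each vocabulary word maps to a real-valued vector.
# Real embeddings are learned from data and have hundreds of dimensions;
# these values are invented purely to illustrate the idea.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should be closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```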

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
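As an illustration of one such technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (4.x API). The two-sentence corpus and the hyperparameter values are assumptions chosen only for demonstration:

```python
from gensim.models import Word2Vec

# A stand-in toy corpus; real training uses large tokenized text collections.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["embeddings"]          # learned vector for a word
print(vec.shape)                      # (50,)
print(model.wv.most_similar("word"))  # nearest neighbours in vector space
```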

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1551-1560 of 4002 papers

Title | Status | Hype
Identifying Cognates in English-Dutch and French-Dutch by means of Orthographic Information and Cross-lingual Word Embeddings |  | 0
Lexicon-Enhancement of Embedding-based Approaches Towards the Detection of Abusive Language |  | 0
Automatic Term Extraction from Newspaper Corpora: Making the Most of Specificity and Common Features |  | 0
Automatic Creation of Correspondence Table of Meaning Tags from Two Dictionaries in One Language Using Bilingual Word Embedding |  | 0
Usability and Accessibility of Bantu Language Dictionaries in the Digital Age: Mobile Access in an Open Environment |  | 0
FRAQUE: a FRAme-based QUEstion-answering system for the Public Administration domain |  | 0
TF-IDF Character N-grams versus Word Embedding-based Models for Fine-grained Event Classification: A Preliminary Study |  | 0
OSACT4 Shared Tasks: Ensembled Stacked Classification for Offensive and Hate Speech in Arabic Tweets |  | 0
Sentiment Analysis for Hinglish Code-mixed Tweets by means of Cross-lingual Word Embeddings |  | 0
LMU Bilingual Dictionary Induction System with Word Surface Similarity Scores for BUCC 2020 |  | 0
Page 156 of 401

No leaderboard results yet.