SOTAVerified

Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
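The mapping above can be made concrete with a toy sketch: each vocabulary word gets a dense real-valued vector, and semantic relatedness is scored with cosine similarity between vectors. The vectors below are hand-picked for illustration, not learned from data.

```python
import numpy as np

# Toy embeddings (hand-chosen, NOT trained): each word maps to a
# vector of real numbers, as in the definition above.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine(u, v):
    """Cosine similarity: dot product of the two unit-normalized vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_royal = cosine(embeddings["king"], embeddings["queen"])
sim_fruit = cosine(embeddings["king"], embeddings["apple"])
print(f"king~queen: {sim_royal:.3f}, king~apple: {sim_fruit:.3f}")
```

With these vectors, "king" scores far closer to "queen" than to "apple", which is the behavior trained embeddings are designed to exhibit for co-occurring or substitutable words.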

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
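A minimal sketch of the skip-gram objective behind Word2Vec: slide a window over a corpus, and train a center-word embedding to predict its context words. This toy version uses a full softmax and plain SGD on a made-up corpus (real Word2Vec uses negative sampling or hierarchical softmax for efficiency); all hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus and vocabulary (illustrative, not real training data).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 8, 2, 0.05

# Input (center-word) and output (context-word) embedding matrices.
W_in = rng.normal(scale=0.1, size=(V, D))
W_out = rng.normal(scale=0.1, size=(V, D))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# (center, context) training pairs from a sliding window.
pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if i != j]

for _ in range(50):                      # a few epochs of SGD
    for c, o in pairs:
        h = W_in[c]                      # center-word embedding
        p = softmax(W_out @ h)           # predicted context distribution
        grad = p.copy()
        grad[o] -= 1.0                   # d(cross-entropy)/d(scores)
        W_in[c] -= lr * (W_out.T @ grad)
        W_out -= lr * np.outer(grad, h)

print(W_in[idx["cat"]][:3])              # learned vector for "cat" (first 3 dims)
```

After training, rows of W_in serve as the word vectors; words appearing in similar contexts end up with similar rows.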

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 551–575 of 4002 papers

- Building Web-Interfaces for Vector Semantic Models with the WebVectors Toolkit
- Case Studies on using Natural Language Processing Techniques in Customer Relationship Management Software
- ArbEngVec: Arabic-English Cross-Lingual Word Embedding Model
- AraWEAT: Multidimensional Analysis of Biases in Arabic Word Embeddings
- All-In-1 at IJCNLP-2017 Task 4: Short Text Classification with One Model for All Languages
- A Rank-Based Similarity Metric for Word Embeddings
- Arabic Textual Entailment with Word Embeddings
- A Linguistically Informed Convolutional Neural Network
- Adapting Neural Machine Translation with Parallel Synthetic Data
- Arabic POS Tagging: Don't Abandon Feature Engineering Just Yet
- Arabic aspect sentiment polarity classification using BERT
- A Linear Dynamical System Model for Text
- A Question Answering Approach for Emotion Cause Extraction
- Adapted Sentiment Similarity Seed Words For French Tweets' Polarity Classification
- Misinforming LLMs: vulnerabilities, challenges and opportunities
- Brundlefly at SemEval-2016 Task 12: Recurrent Neural Networks vs. Joint Inference for Clinical Temporal Information Extraction
- Alignment-free Cross-lingual Semantic Role Labeling
- News and Load: A Quantitative Exploration of Natural Language Processing Applications for Forecasting Day-ahead Electricity System Demand
- A Case Study to Reveal if an Area of Interest has a Trend in Ongoing Tweets Using Word and Sentence Embeddings
- A Progressive Learning Approach to Chinese SRL Using Heterogeneous Data
- A Process for Topic Modelling Via Word Embeddings
- ADAPT at SemEval-2018 Task 9: Skip-Gram Word Embeddings for Unsupervised Hypernym Discovery in Specialised Corpora
- Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation
- A Probabilistic Model for Learning Multi-Prototype Word Embeddings
- A Probabilistic Model for Joint Learning of Word Embeddings from Texts and Images
Page 23 of 161

No leaderboard results yet.