SOTAVerified

Task Arithmetic

A task vector specifies a direction in the weight space of a pre-trained model, such that movement in that direction improves performance on the task. We build task vectors by subtracting the weights of a pre-trained model from the weights of the same model after fine-tuning on a task. We show that these task vectors can be modified and combined through arithmetic operations such as negation and addition, and that the behavior of the resulting model is steered accordingly.
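The arithmetic described above can be sketched in a few lines. This is a minimal illustration, not any paper's reference implementation: weights are represented as flat dicts mapping parameter names to floats (real models use tensor state dicts, but the operations are element-wise and identical), and all function names are illustrative.

```python
# Minimal sketch of task arithmetic, assuming model weights are flat
# dicts of parameter name -> float. All names here are illustrative.

def task_vector(pretrained, finetuned):
    """tau = theta_finetuned - theta_pretrained, per parameter."""
    return {k: finetuned[k] - pretrained[k] for k in pretrained}

def apply_vector(pretrained, vector, scale=1.0):
    """theta_new = theta_pretrained + scale * tau."""
    return {k: pretrained[k] + scale * vector[k] for k in pretrained}

def add_vectors(v1, v2):
    """Addition: combine two tasks into one multi-task edit."""
    return {k: v1[k] + v2[k] for k in v1}

def negate_vector(v):
    """Negation: move away from a task (forgetting/unlearning)."""
    return {k: -v[k] for k in v}

# Toy example: two models fine-tuned from the same base.
base = {"w": 1.0}
ft_a = {"w": 1.5}   # fine-tuned on task A
ft_b = {"w": 0.8}   # fine-tuned on task B

tau_a = task_vector(base, ft_a)            # {"w": 0.5}
tau_b = task_vector(base, ft_b)            # {"w": ~ -0.2}

multi  = apply_vector(base, add_vectors(tau_a, tau_b))  # multi-task model
forget = apply_vector(base, negate_vector(tau_a))       # unlearn task A
```

The `scale` parameter reflects the common practice of tuning a scaling coefficient on held-out data when applying task vectors, since the raw sum of several vectors can overshoot.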

Papers

Showing 1–50 of 61 papers

Title | Status | Hype
Task Singular Vectors: Reducing Task Interference in Model Merging | Code | 2
Language Models are Homer Simpson! Safety Re-Alignment of Fine-tuned Language Models through Task Arithmetic | Code | 2
Editing Models with Task Arithmetic | Code | 2
Localizing Task Information for Improved Model Merging and Compression | Code | 2
Parameter Efficient Multi-task Model Fusion with Partial Linearization | Code | 1
NegMerge: Consensual Weight Negation for Strong Machine Unlearning | Code | 1
Model Merging by Uncertainty-Based Gradient Matching | Code | 1
Merging Multi-Task Models via Weight-Ensembling Mixture of Experts | Code | 1
Concrete Subspace Learning based Interference Elimination for Multi-task Model Fusion | Code | 1
AdaMerging: Adaptive Model Merging for Multi-Task Learning | Code | 1
Localize-and-Stitch: Efficient Model Merging via Sparse Task Arithmetic | Code | 1
An Empirical Study of Multimodal Model Merging | Code | 1
Knowledge Composition using Task Vectors with Learned Anisotropic Scaling | Code | 1
Have You Merged My Model? On The Robustness of Large Language Model IP Protection Methods Against Model Merging | Code | 1
Fine-Tuning Attention Modules Only: Enhancing Weight Disentanglement in Task Arithmetic | Code | 1
Task Arithmetic in the Tangent Space: Improved Editing of Pre-Trained Models | Code | 1
CALM: Consensus-Aware Localized Merging for Multi-Task Learning | Code | 0
Cross-Model Transfer of Task Vectors via Few-Shot Orthogonal Alignment | Code | 0
Efficient Model Editing with Task-Localized Sparse Fine-tuning | Code | 0
Efficient Model Editing with Task Vector Bases: A Theoretical Framework and Scalable Approach | Code | 0
Investigating Task Arithmetic for Zero-Shot Information Retrieval | Code | 0
Leveraging Submodule Linearity Enhances Task Arithmetic Performance in LLMs | Code | 0
Multi-Task Model Merging via Adaptive Weight Disentanglement | Code | 0
No Train but Gain: Language Arithmetic for training-free Language Adapters enhancement | Code | 0
Towards Diverse Device Heterogeneous Federated Learning via Task Arithmetic Knowledge Integration | Code | 0
Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization | - | 0
DuET: Dual Incremental Object Detection via Exemplar-Free Task Arithmetic | - | 0
Layer-Aware Task Arithmetic: Disentangling Task-Specific and Instruction-Following Knowledge | - | 0
Transferring Visual Explainability of Self-Explaining Models through Task Arithmetic | - | 0
Disentangling Task Interference within Neurons: Model Merging in Alignment with Neuronal Mechanisms | - | 0
CultureMERT: Continual Pre-Training for Cross-Cultural Music Representation Learning | - | 0
Mediator: Memory-efficient LLM Merging with Less Parameter Conflicts and Uncertainty Based Routing | - | 0
CAT Merging: A Training-Free Approach for Resolving Conflicts in Model Merging | - | 0
MetaGPT: Merging Large Language Models Using Model Exclusive Task Arithmetic | - | 0
MCU: Improving Machine Unlearning through Mode Connectivity | - | 0
What Matters for Model Merging at Scale? | - | 0
Bias Vector: Mitigating Biases in Language Models with Task Arithmetic Approach | - | 0
Neural Networks Remember More: The Power of Parameter Isolation and Combination | - | 0
When Domain Generalization meets Generalized Category Discovery: An Adaptive Task-Arithmetic Driven Approach | - | 0
On Fairness of Task Arithmetic: The Role of Task Vectors | - | 0
On Giant's Shoulders: Effortless Weak to Strong by Dynamic Logits Fusion | - | 0
OpenThaiGPT 1.6 and R1: Thai-Centric Open Source and Reasoning Large Language Models | - | 0
Beyond Task Vectors: Selective Task Arithmetic Based on Importance Metrics | - | 0
Scalable Strategies for Continual Learning with Replay | - | 0
Single-Input Multi-Output Model Merging: Leveraging Foundation Models for Dense Multi-Task Learning | - | 0
Soup to go: mitigating forgetting during continual learning with model averaging | - | 0
Subspace-Boosted Model Merging | - | 0
Task Arithmetic can Mitigate Synthetic-to-Real Gap in Automatic Speech Recognition | - | 0
Task Arithmetic for Language Expansion in Speech Translation | - | 0
BADTV: Unveiling Backdoor Threats in Third-Party Task Vectors | - | 0
Page 1 of 2

No leaderboard results yet.