SOTAVerified

Task Arithmetic

A task vector specifies a direction in the weight space of a pre-trained model, such that movement in that direction improves performance on the task. We build task vectors by subtracting the weights of a pre-trained model from the weights of the same model after fine-tuning on a task. We show that these task vectors can be modified and combined together through arithmetic operations such as negation and addition, and the behavior of the resulting model is steered accordingly.
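The arithmetic described above can be sketched in a few lines. This is a minimal illustration on flat weight dictionaries with made-up parameter names and values, not code from any referenced paper; real implementations operate on full model state dicts and tune the scaling coefficient on validation data.

```python
# Minimal sketch of task arithmetic on flat weight dictionaries.
# All names and values here are illustrative, not from a real checkpoint.

def task_vector(pretrained, finetuned):
    """tau = theta_finetuned - theta_pretrained, per parameter."""
    return {k: finetuned[k] - pretrained[k] for k in pretrained}

def apply_vector(pretrained, vector, scale=1.0):
    """theta_new = theta_pretrained + scale * tau."""
    return {k: pretrained[k] + scale * vector[k] for k in pretrained}

pre  = {"w": 1.0, "b": 0.5}   # pre-trained weights
ft_a = {"w": 1.4, "b": 0.7}   # after fine-tuning on task A
ft_b = {"w": 0.8, "b": 0.9}   # after fine-tuning on task B

tau_a = task_vector(pre, ft_a)
tau_b = task_vector(pre, ft_b)

# Negation: moving against tau_a degrades (forgets) task A.
forget_a = apply_vector(pre, tau_a, scale=-1.0)

# Addition: summing task vectors steers toward a multi-task model.
tau_sum = {k: tau_a[k] + tau_b[k] for k in tau_a}
multi   = apply_vector(pre, tau_sum, scale=0.5)  # scale is a tunable coefficient
```

In practice the same operations are applied elementwise over every tensor in a model's state dict, and the scaling coefficient is chosen per application.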

Papers

Showing 41–50 of 61 papers

Title | Status | Hype
Knowledge Composition using Task Vectors with Learned Anisotropic Scaling | Code | 1
On Giant's Shoulders: Effortless Weak to Strong by Dynamic Logits Fusion | | 0
MetaGPT: Merging Large Language Models Using Model Exclusive Task Arithmetic | | 0
Task Arithmetic can Mitigate Synthetic-to-Real Gap in Automatic Speech Recognition | | 0
HPE-CogVLM: Advancing Vision Language Models with a Head Pose Grounding Task | | 0
Localizing Task Information for Improved Model Merging and Compression | Code | 2
To Each (Textual Sequence) Its Own: Improving Memorized-Data Unlearning in Large Language Models | | 0
No Train but Gain: Language Arithmetic for training-free Language Adapters enhancement | Code | 0
Have You Merged My Model? On The Robustness of Large Language Model IP Protection Methods Against Model Merging | Code | 1
Ethos: Rectifying Language Models in Orthogonal Parameter Space | | 0

No leaderboard results yet.