Personality Trait Recognition
Benchmark Results
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GPT-4 | Average accuracy (%) | 77.9 | — | Unverified |
| 2 | Llama-3 70B | Average accuracy (%) | 72.2 | — | Unverified |
| 3 | Mixtral 8x22B | Average accuracy (%) | 72.0 | — | Unverified |
| 4 | Claude-3 Opus | Average accuracy (%) | 71.1 | — | Unverified |
| 5 | Claude-3 Sonnet | Average accuracy (%) | 70.9 | — | Unverified |
| 6 | Gemini 1.5 Pro | Average accuracy (%) | 67.5 | — | Unverified |
| 7 | Qwen1.5 110B | Average accuracy (%) | 65.7 | — | Unverified |
| 8 | Gemini 1.0 Pro | Average accuracy (%) | 64.6 | — | Unverified |
| 9 | Claude-3 Haiku | Average accuracy (%) | 64.0 | — | Unverified |
| 10 | Yi 34B | Average accuracy (%) | 57.7 | — | Unverified |

| # | Method | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Ensemble Modeling | Accuracy (%) | 60.24 | — | Unverified |
| 2 | Ontology-Based | Accuracy (%) | 51.42 | — | Unverified |
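Since none of the claimed scores above are independently verified, it can be useful to work with them programmatically, e.g. to re-rank models or compare margins as new verified numbers arrive. A minimal sketch in Python (the `llm_scores` dictionary simply copies the claimed values from the first table; it is not part of any benchmark tooling):

```python
# Claimed average accuracy (%) per model, copied from the leaderboard table above.
llm_scores = {
    "GPT-4": 77.9,
    "Llama-3 70B": 72.2,
    "Mixtral 8x22B": 72.0,
    "Claude-3 Opus": 71.1,
    "Claude-3 Sonnet": 70.9,
    "Gemini 1.5 Pro": 67.5,
    "Qwen1.5 110B": 65.7,
    "Gemini 1.0 Pro": 64.6,
    "Claude-3 Haiku": 64.0,
    "Yi 34B": 57.7,
}

# Sort descending by claimed score to reproduce the ranking.
ranking = sorted(llm_scores.items(), key=lambda kv: kv[1], reverse=True)
for rank, (model, score) in enumerate(ranking, start=1):
    print(f"{rank:2d}. {model:<16} {score:5.1f}")

# Margin between the top two claimed scores.
lead_margin = ranking[0][1] - ranking[1][1]
```

Replacing the dictionary values with verified numbers, once available, regenerates the ranking without touching the rest of the script.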