SOTAVerified

Text-to-Code Generation

Text-to-Code Generation is the task of generating source code from a natural language description.

Source: Text-to-code Generation with TensorFlow, πŸ€— & MBPP

Papers

Showing 1–10 of 20 papers

Title | Status | Hype
Bridging the Gap Between Open-Source and Proprietary LLMs in Table QA | Code | 0
Aligning Crowd-sourced Human Feedback for Reinforcement Learning on Code Generation by Large Language Models | — | 0
Planning-Driven Programming: A Large Language Model Programming Workflow | Code | 1
SparsePO: Controlling Preference Alignment of LLMs via Sparse Token Masks | — | 0
Reranking Laws for Language Generation: A Communication-Theoretic Perspective | — | 0
Can OpenSource beat ChatGPT? -- A Comparative Study of Large Language Models for Text-to-Code Generation | — | 0
Generating Unseen Code Tests In Infinitum | — | 0
InverseCoder: Self-improving Instruction-Tuned Code LLMs with Inverse-Instruct | Code | 1
Can Large Language Models Solve Robot Routing? | Code | 1
Text-to-Code Generation with Modality-relative Pre-training | — | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | CodeT5 | BLEU | 41.48 | — | Unverified
2 | CodeGPT-adapted | BLEU | 32.79 | — | Unverified