SOTAVerified

Text-to-Code Generation

Text-to-code generation is the task of generating executable source code from a natural-language description.

Source: Text-to-code Generation with TensorFlow, 🤗 & MBPP
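The MBPP benchmark referenced above pairs each natural-language prompt with reference test assertions, and a generated program counts as solved only if every assertion passes. A minimal sketch of such an evaluation loop (the prompt, tests, and candidate below are hypothetical illustrations, not actual MBPP items):

```python
def evaluate_candidate(candidate_code: str, tests: list[str]) -> bool:
    """Execute a model-generated program, then run each reference
    test assertion against it; any exception counts as a failure."""
    namespace: dict = {}
    try:
        exec(candidate_code, namespace)   # define the candidate function
        for test in tests:
            exec(test, namespace)         # raises AssertionError on failure
    except Exception:
        return False
    return True

# Hypothetical MBPP-style task: a prompt plus reference assertions.
prompt = "Write a function min_of_two that returns the smaller of two ints."
tests = [
    "assert min_of_two(3, 5) == 3",
    "assert min_of_two(-1, 2) == -1",
]

# A candidate a model might generate for the prompt (hand-written here).
candidate = "def min_of_two(a, b):\n    return a if a < b else b"

print(evaluate_candidate(candidate, tests))  # True: all assertions pass
```

Functional-correctness checks like this are why MBPP results are reported as pass rates rather than the BLEU scores used in the benchmark table further down.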

Papers

Showing 1–20 of 20 papers

| Title | Status | Hype |
| --- | --- | --- |
| Magicoder: Empowering Code Generation with OSS-Instruct | Code | 4 |
| Guiding Language Models of Code with Global Context using Monitors | Code | 2 |
| CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation | Code | 1 |
| CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation | Code | 1 |
| InverseCoder: Self-improving Instruction-Tuned Code LLMs with Inverse-Instruct | Code | 1 |
| StructCoder: Structure-Aware Transformer for Code Generation | Code | 1 |
| Planning-Driven Programming: A Large Language Model Programming Workflow | Code | 1 |
| Can Large Language Models Solve Robot Routing? | Code | 1 |
| Bridging the Gap Between Open-Source and Proprietary LLMs in Table QA | Code | 0 |
| PanGu-Coder: Program Synthesis with Function-Level Language Modeling | Code | 0 |
| Code Execution with Pre-trained Language Models | Code | 0 |
| C3PO: A Lightweight Copying Mechanism for Translating Pseudocode to Code | Code | 0 |
| Text-to-Code Generation with Modality-relative Pre-training | — | 0 |
| Can OpenSource beat ChatGPT? -- A Comparative Study of Large Language Models for Text-to-Code Generation | — | 0 |
| Compilable Neural Code Generation with Compiler Feedback | — | 0 |
| Fine-Tuning Large Language Models for Answering Programming Questions with Code Snippets | — | 0 |
| Generating Unseen Code Tests In Infinitum | — | 0 |
| Reranking Laws for Language Generation: A Communication-Theoretic Perspective | — | 0 |
| SparsePO: Controlling Preference Alignment of LLMs via Sparse Token Masks | — | 0 |
| Aligning Crowd-sourced Human Feedback for Reinforcement Learning on Code Generation by Large Language Models | — | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CodeT5 | BLEU | 41.48 | — | Unverified |
| 2 | CodeGPT-adapted | BLEU | 32.79 | — | Unverified |
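The Claimed scores above are BLEU values, which compare n-gram overlap between generated and reference code. A simplified, stdlib-only sentence-level BLEU can be sketched as below; official benchmark evaluations typically use a smoothed corpus-level variant, so numbers from this sketch will not match the table:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate: str, reference: str, max_n: int = 4) -> float:
    """Simplified sentence-level BLEU: geometric mean of clipped
    n-gram precisions (n = 1..max_n) times a brevity penalty.
    No smoothing, so any zero precision yields a score of 0."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        clipped = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(clipped / total)
    if min(precisions) == 0:
        return 0.0
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

print(bleu("def add ( a , b ) : return a + b",
           "def add ( a , b ) : return a + b"))  # 1.0 for an exact match
```

BLEU rewards surface similarity only; two functionally identical programs with different identifiers can score poorly, which is one reason execution-based metrics like MBPP's pass rate are often preferred for code.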