Constituency Parsing
Constituency parsing extracts a parse tree from a sentence, representing its syntactic structure according to a phrase structure grammar.
Example:
```
            Sentence (S)
                 |
    +------------+------------+
    |                         |
 Noun (N)             Verb Phrase (VP)
    |                         |
  John             +----------+---------+
                   |                    |
               Verb (V)             Noun (N)
                   |                    |
                 sees                 Bill
```
Recent approaches convert the parse tree into a sequence following a depth-first traversal so that sequence-to-sequence models can be applied to it. The linearized version of the above parse tree looks as follows: (S (N) (VP (V) (N))).
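The depth-first linearization can be sketched as a short recursive function. This is an illustrative implementation, not any particular paper's code; the nested-tuple tree encoding and the `linearize` name are assumptions, and the sketch brackets every node uniformly while dropping the terminal words.

```python
# Linearize a constituency tree via depth-first traversal.
# A tree node is a tuple: (label, child, child, ...); leaves are strings.
# Illustrative encoding, not a specific parser's data structure.

def linearize(tree):
    """Return a bracketed string of node labels, depth-first."""
    label, *children = tree
    subtrees = [c for c in children if isinstance(c, tuple)]
    if not subtrees:  # preterminal: emit just its label
        return f"({label})"
    return f"({label} " + " ".join(linearize(c) for c in subtrees) + ")"

# The example tree for "John sees Bill":
tree = ("S",
        ("N", "John"),
        ("VP",
         ("V", "sees"),
         ("N", "Bill")))

print(linearize(tree))  # (S (N) (VP (V) (N)))
```

A sequence-to-sequence model is then trained to emit this bracket string token by token; keeping the terminal words in the brackets is an equally common variant.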
Benchmark Results
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Hashing + XLNet | F1 score | 96.43 | — | Unverified |
| 2 | SAPar + XLNet | F1 score | 96.4 | — | Unverified |
| 3 | Label Attention Layer + HPSG + XLNet | F1 score | 96.38 | — | Unverified |
| 4 | Attach-Juxtapose Parser + XLNet | F1 score | 96.34 | — | Unverified |
| 5 | Head-Driven Phrase Structure Grammar Parsing (Joint) + XLNet | F1 score | 96.33 | — | Unverified |
| 6 | CRF Parser + RoBERTa | F1 score | 96.32 | — | Unverified |
| 7 | Hashing + BERT | F1 score | 96.03 | — | Unverified |
| 8 | NFC + BERT-large | F1 score | 95.92 | — | Unverified |
| 9 | N-ary semi-Markov + BERT-large | F1 score | 95.92 | — | Unverified |
| 10 | Head-Driven Phrase Structure Grammar Parsing (Joint) + BERT | F1 score | 95.84 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Attach-Juxtapose Parser + BERT | F1 score | 93.52 | — | Unverified |
| 2 | SAPar + BERT | F1 score | 92.66 | — | Unverified |
| 3 | N-ary semi-Markov + BERT | F1 score | 92.5 | — | Unverified |
| 4 | Hashing + BERT | F1 score | 92.33 | — | Unverified |
| 5 | CRF Parser + BERT | F1 score | 92.27 | — | Unverified |
| 6 | Kitaev et al. 2019 | F1 score | 91.75 | — | Unverified |
| 7 | CRF Parser | F1 score | 89.8 | — | Unverified |
| 8 | Zhou et al. 2019 | F1 score | 89.4 | — | Unverified |
| 9 | Kitaev et al. 2018 | F1 score | 87.43 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CRF Parser + Electra | F1 score | 91.92 | — | Unverified |
| 2 | CRF Parser + BERT | F1 score | 91.55 | — | Unverified |
| 3 | CRF Parser | F1 score | 88.6 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SAPar | F1 score | 83.26 | — | Unverified |
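The F1 scores above are bracketing F1: labeled constituent spans of the predicted tree are matched against the gold tree, and precision and recall over matched spans are combined. A minimal sketch, assuming a `(label, start, end)` span representation (an illustrative choice, not the format of any specific scorer such as EVALB):

```python
# Bracketing F1: harmonic mean of precision and recall over
# labeled constituent spans (label, start, end).

def bracketing_f1(gold_spans, pred_spans):
    gold, pred = set(gold_spans), set(pred_spans)
    matched = len(gold & pred)
    precision = matched / len(pred) if pred else 0.0
    recall = matched / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Spans for "John sees Bill" vs. a parse that mislabels the VP:
gold = [("S", 0, 3), ("N", 0, 1), ("VP", 1, 3), ("V", 1, 2), ("N", 2, 3)]
pred = [("S", 0, 3), ("N", 0, 1), ("NP", 1, 3), ("V", 1, 2), ("N", 2, 3)]
print(round(bracketing_f1(gold, pred), 2))  # 0.8
```

Standard scorers add details this sketch omits (e.g., ignoring punctuation and certain labels), so published numbers are not directly reproducible from it.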