Navajo and English

From LING073

Latest revision as of 14:49, 8 May 2022

Resources for machine translation between Navajo and English

Developed Resources

  • Lexical selection: https://wikis.swarthmore.edu/ling073/Navajo_and_English/Lexical_selection
  • Contrastive Grammar: https://wikis.swarthmore.edu/ling073/Navajo_and_English/Contrastive_Grammar
  • Structural Transfer: https://wikis.swarthmore.edu/ling073/Navajo_and_English/Structural_transfer

External Resources

  • English Transducer: https://github.com/apertium/apertium-eng
  • Corpus repository: https://github.swarthmore.edu/Ling073-sp22/ling073-nav-eng-corpus

NAV -> ENG Evaluation

Coverage Analysis

  • Monolingual transducer coverage of small corpus: 407 / 1216 (~33.47%)
  • Bilingual transducer coverage of small corpus: 533 / 1345 (~39.63%)
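Coverage figures like the ones above are the share of tokens the transducer can analyse. A minimal sketch of how such a ratio might be computed from analyser output, assuming Apertium's stream conventions (each lexical unit wrapped in `^...$`, unknown tokens prefixed with `*`); the helper name is ours:

```python
# Sketch: naive coverage over Apertium analyser output.
# Units the analyser cannot handle come back prefixed with "*",
# e.g. "^*bikééʼ/*bikééʼ$"; everything else counts as covered.

import re

def coverage(analysed_text):
    # Pull out every ^...$ lexical unit emitted by the analyser.
    units = re.findall(r"\^(.*?)\$", analysed_text)
    known = [u for u in units if not u.startswith("*")]
    return len(known), len(units)

sample = "^Kodi<adv>/Here<adv>$ ^atooʼ<n>/stew<n>$ ^*hólǫ́/*hólǫ́$"
known, total = coverage(sample)
print(known, total)  # 2 of 3 units analysed
```

Dividing the two counts gives ratios like 407 / 1216 above.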

Sentence Evaluation

1.

 Original sentence: Dibé bikééʼ dínááh.
 Intended Translation: Go after the sheep.
 Biltrans Output: ^Dibé<n>/Wood<n>$ ^*bikééʼ/*bikééʼ$ ^dínááh<v>/go<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Wood *bikééʼ #go.

2.

 Original sentence: Nimá dóó nizhéʼé bíighah nídaah.
 Intended Translation: Sit beside your mother and father.
 Biltrans Output: ^Má<n><px2sg>/Mother<n><px2sg>$ ^dóó<cnjcoo>/and<cnjcoo>$ ^zhéʼé<n><px2sg>/father<n><px2sg>$ ^*bíighah/*bíighah$ ^nídaah<v>/sit<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Mother and #father *bíighah #sit.

3.

 Original sentence: Chidí biyiʼ ayóo deesdoi.  
 Intended Translation: It is very hot inside the vehicle.
 Biltrans Output: ^Chidí<n>/Automobile<n>$ ^*biyiʼ/*biyiʼ$ ^ayóo<adv>/remarkably<adv>$ ^deesdoi<v>/hot<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$  
 Translation Output: #Automobile *biyiʼ *ayóo #hot.

4.

 Original sentence: Kodi atooʼ hólǫ́.
 Intended Translation: Here is some stew.
 Biltrans Output: ^Kodi<adv>/Here<adv>$ ^atooʼ<n>/stew<n>$ ^*hólǫ́/*hólǫ́$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: Here #stew *hólǫ́.

5.

 Original sentence: Atooʼ łaʼ naa deeshkááł. 
 Intended Translation: I will give you some stew.
 Biltrans Output: ^Atooʼ<n>/Stew<n>$ ^łaʼ<det>/some<det>$ ^naa<post>/around<pp>$ ^deeshkááł<v>/give<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Stew #some #around #give.

6.

 Original sentence: Wóláchííʼ bighan binaa ałhéénílyeed. 
 Intended Translation: Go around the ant mound.
 Biltrans Output: ^Wóláchííʼ<n>/Ant<n>$ ^ghan<n><px3sg>/house<n><px3sg>$ ^*binaa/*binaa$ ^ałhéénílyeed<v>/go<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Ant #house *binaa #go. 


7.

 Original sentence: Jooł nikídílniihí tsáskʼeh biyaa íímááz. 
 Intended Translation: The basketball rolled underneath the bed.
 Biltrans Output: ^Jooł<n>/Ball<n>$ ^nikídílniihí<adj>/basketball<adj>$ ^tsáskʼeh<n>/bed<n>$ ^*biyaa/*biyaa$ ^íímááz<v>/roll<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #basketball #Ball #bed *biyaa #roll. 

8.

 Original sentence: Shiyázhí, hoghandi naanishísh ałtso íinilaa?
 Intended Translation: My child, did you finish your homework?
 Biltrans Output: ^Yázhí<n><px1sg>/Little<n><px1sg>$^,<cm>/,<cm>$ ^hoghandi<adj>/home<adj>$ ^naanishísh<n>/work<n>$ ^ałtso<adj>/completed<adj>$ ^íinilaa<v>/finish<vblex>$^?<sent>/?<sent>$^.<sent>/.<sent>$
 Translation Output: #Little, #home *naanishísh #completed #finish?

9.

 Original sentence: Shiyázhí, nízhiʼ naaltsoos bikááʼ íníleeh.
 Intended Translation: My child, write your name on the paper.
 Biltrans Output: ^Yázhí<n><px1sg>/Little<n><px1sg>$^,<cm>/,<cm>$ ^*nízhiʼ/*nízhiʼ$ ^naaltsoos<n>/paper<n>$ ^*bikááʼ/*bikááʼ$ ^íníleeh<v>/write<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Little, *nízhiʼ #paper *bikááʼ #write.

10.

 Original sentence: Naaltsoos tsitsʼaaʼ naaltsoos atseedzį́ biiʼ hadéébįįd.
 Intended Translation: The cardboard box is filled with newspapers.
 Biltrans Output: ^Naaltsoos<n>/Paper<n>$ ^*tsitsʼaaʼ/*tsitsʼaaʼ$ ^naaltsoos<n>/paper<n>$ ^*atseedzį́/*atseedzį́$ ^*biiʼ/*biiʼ$ ^hadéébįįd<v>/fill<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Paper #box #paper *atseedzį́ *biiʼ #fill
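In the biltrans lines above, each lexical unit has the form `^source/target$`; a leading `*` marks a form the analyser did not recognise, and `#` in the translation output marks a target lemma the generator could not inflect (standard Apertium stream conventions). A small sketch of pulling source/target pairs out of such a line; the function name is ours:

```python
import re

def parse_biltrans(line):
    # Each unit looks like ^src<tags>/tgt<tags>$; split on the first "/".
    pairs = []
    for unit in re.findall(r"\^(.*?)\$", line):
        src, _, tgt = unit.partition("/")
        pairs.append((src, tgt))
    return pairs

line = "^Kodi<adv>/Here<adv>$ ^atooʼ<n>/stew<n>$ ^*hólǫ́/*hólǫ́$"
for src, tgt in parse_biltrans(line):
    unknown = src.startswith("*")
    print(src, "->", tgt, "(unknown)" if unknown else "")
```

This makes the pattern in the sentences above easy to see: every `*` word passed through untranslated, and every translated lemma that lacked generation rules surfaced with `#`.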

Additions

  • ~20 noun stems
  • 39 twol rules
  • 1 disambiguation rule
  • 1 lexical selection rule
  • 1 transfer rule

Polished RBMT System

  • Stems in transducer: 316
  • Over nav.longer.txt:
    • Coverage: 2437 / 4819 (~50.57%)
  • Over nav.basic.txt:
    • Coverage: 762 / 1364 (~55.87%)
    • Number of words in reference: 760
    • Number of words in test: 1188
    • Percentage of unknown words: 00.00 %
    • Edit distance: 1176
    • Word Error Rate (WER): 154.74 %
    • Position-independent word error rate (PER): 153.68 %
    • Number of position-independent correct words: 20
    • Number of unknown words which were free rides: 0
    • Percentage of unknown words that were free rides: 0%
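The WER and PER figures above are mutually consistent under the usual apertium-eval-translator definitions: WER is edit distance divided by the number of reference words, and PER discounts word order, crediting words that appear in both strings regardless of position. A sketch under that assumption, using the numbers reported above:

```python
# Sketch: reproduce the WER/PER figures above from their components,
# assuming the apertium-eval-translator definitions.

def wer(edit_distance, ref_len):
    # Edit distance as a percentage of the reference length.
    return 100.0 * edit_distance / ref_len

def per(correct, ref_len, test_len):
    # Position-independent error rate: 1 - (correct - surplus) / ref_len,
    # where surplus penalises a test string longer than the reference.
    return 100.0 * (1 - (correct - max(0, test_len - ref_len)) / ref_len)

print(round(wer(1176, 760), 2))      # 154.74
print(round(per(20, 760, 1188), 2))  # 153.68
```

Rates above 100% arise because the edit distance (1176) exceeds the reference length (760), which is expected while most stems still fail analysis or generation.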