Difference between revisions of "Navajo and English"

From LING073

Revision as of 14:24, 8 May 2022

Resources for machine translation between Navajo and English

Additions

  • ~20 noun stems
  • 39 twol rules
  • 1 disambiguation rule
  • 1 lexical selection rule

Developed Resources

External Resources


CHANGE THIS

NAV -> ENG Evaluation

Coverage Analysis

  • Monolingual transducer coverage of small corpus: 407 / 1216 (~33.47%)
  • Bilingual transducer coverage of small corpus: 533 / 1345 (~39.63%)
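
The coverage figures above are just the fraction of corpus tokens the transducer can analyse. A minimal sketch of the arithmetic (the token counts themselves come from running the corpus through the analyser and counting forms that do not come back marked unknown):

```python
# Coverage = analysed tokens / total tokens. The counts below are the
# ones quoted above for the small corpus; obtaining them requires
# running the corpus through the transducer and counting non-* analyses.
def coverage(analysed, total):
    """Return coverage as a fraction of analysed tokens."""
    return analysed / total

print(round(100 * coverage(407, 1216), 2))  # monolingual: 33.47
print(round(100 * coverage(533, 1345), 2))  # bilingual: 39.63
```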

Sentence Evaluation

1.

 Original sentence: Dibé bikééʼ dínááh.
 Intended Translation: Go after the sheep.
 Biltrans Output: ^Dibé<n>/Wood<n>$ ^*bikééʼ/*bikééʼ$ ^dínááh<v>/go<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Wood *bikééʼ #go.

2.

 Original sentence: Nimá dóó nizhéʼé bíighah nídaah.
 Intended Translation: Sit beside your mother and father.
 Biltrans Output: ^Má<n><px2sg>/Mother<n><px2sg>$ ^dóó<cnjcoo>/and<cnjcoo>$ ^zhéʼé<n><px2sg>/father<n><px2sg>$ ^*bíighah/*bíighah$ ^nídaah<v>/sit<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Mother and #father *bíighah #sit.

3.

 Original sentence: Chidí biyiʼ ayóo deesdoi.  
 Intended Translation: It is very hot inside the vehicle.
 Biltrans Output: ^Chidí<n>/Automobile<n>$ ^*biyiʼ/*biyiʼ$ ^ayóo<adv>/remarkably<adv>$ ^deesdoi<v>/hot<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$  
 Translation Output: #Automobile *biyiʼ *ayóo #hot.

4.

 Original sentence: Kodi atooʼ hólǫ́.
 Intended Translation: Here is some stew.
 Biltrans Output: ^Kodi<adv>/Here<adv>$ ^atooʼ<n>/stew<n>$ ^*hólǫ́/*hólǫ́$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: Here #stew *hólǫ́.

5.

 Original sentence: Atooʼ łaʼ naa deeshkááł. 
 Intended Translation: I will give you some stew.
 Biltrans Output: ^Atooʼ<n>/Stew<n>$ ^łaʼ<det>/some<det>$ ^naa<post>/around<pp>$ ^deeshkááł<v>/give<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Stew #some #around #give.

6.

 Original sentence: Wóláchííʼ bighan binaa ałhéénílyeed. 
 Intended Translation: Go around the ant mound.
 Biltrans Output: ^Wóláchííʼ<n>/Ant<n>$ ^ghan<n><px3sg>/house<n><px3sg>$ ^*binaa/*binaa$ ^ałhéénílyeed<v>/go<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Ant #house *binaa #go. 


7.

 Original sentence: Jooł nikídílniihí tsáskʼeh biyaa íímááz. 
 Intended Translation: The basketball rolled underneath the bed.
 Biltrans Output: ^Jooł<n>/Ball<n>$ ^nikídílniihí<adj>/basketball<adj>$ ^tsáskʼeh<n>/bed<n>$ ^*biyaa/*biyaa$ ^íímááz<v>/roll<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #basketball #Ball #bed *biyaa #roll. 

8.

 Original sentence: Shiyázhí, hoghandi naanishísh ałtso íinilaa?
 Intended Translation: My child, did you finish your homework?
 Biltrans Output: ^Yázhí<n><px1sg>/Little<n><px1sg>$^,<cm>/,<cm>$ ^hoghandi<adj>/home<adj>$ ^naanishísh<n>/work<n>$ ^ałtso<adj>/completed<adj>$ ^íinilaa<v>/finish<vblex>$^?<sent>/?<sent>$^.<sent>/.<sent>$
 Translation Output: #Little, #home *naanishísh #completed #finish?

9.

 Original sentence: Shiyázhí, nízhiʼ naaltsoos bikááʼ íníleeh.
 Intended Translation: My child, write your name on the paper.
 Biltrans Output: ^Yázhí<n><px1sg>/Little<n><px1sg>$^,<cm>/,<cm>$ ^*nízhiʼ/*nízhiʼ$ ^naaltsoos<n>/paper<n>$ ^*bikááʼ/*bikááʼ$ ^íníleeh<v>/write<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Little, *nízhiʼ #paper *bikááʼ #write.

10.

 Original sentence: Naaltsoos tsitsʼaaʼ naaltsoos atseedzį́ biiʼ hadéébįįd.
 Intended Translation: The cardboard box is filled with newspapers.
 Biltrans Output: ^Naaltsoos<n>/Paper<n>$ ^*tsitsʼaaʼ/*tsitsʼaaʼ$ ^naaltsoos<n>/paper<n>$ ^*atseedzį́/*atseedzį́$ ^*biiʼ/*biiʼ$ ^hadéébįįd<v>/fill<vblex>$^.<sent>/.<sent>$^.<sent>/.<sent>$
 Translation Output: #Paper #box #paper *atseedzį́ *biiʼ #fill.
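
In the biltrans lines above, each lexical unit has the form `^source/target$`, a leading `*` marks a form the dictionaries do not cover, and a leading `#` in the final output marks a form the generator could not produce. A small sketch of a reader for this stream format (a hypothetical helper for inspection, not part of the actual Apertium pipeline):

```python
import re

# Each lexical unit in a biltrans line is ^source-side/target-side$;
# a leading * marks a form unknown to the analyser or the bilingual
# dictionary. This parser is an illustration of the format only.
LU = re.compile(r'\^([^/$]+)/([^$]+)\$')

def parse_biltrans(line):
    """Return (source, target) pairs for each lexical unit in a line."""
    return LU.findall(line)

units = parse_biltrans("^Kodi<adv>/Here<adv>$ ^atooʼ<n>/stew<n>$ ^*hólǫ́/*hólǫ́$")
for src, tgt in units:
    print(src, '->', tgt, '(unknown)' if src.startswith('*') else '')
```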

Additions

Disambiguation

Structural Transfer

Adding Stems

Polished RBMT System

  • Precision: %
  • Recall: %
  • Coverage over large corpus: 0 / 0 (~0.00)
  • Stems in transducer: 0
  • Over xyz.txt:
    • Word Error Rate (WER): 00.00 %
    • Position-independent word error rate (PER): 00.00 %
    • Percentage of unknown words: 00.00 %
    • Number of position-independent correct words: 0/0
    • Coverage: 0 / 0 (~0.00)
  • Over xyz.corpus.large.txt:
    • Coverage: 0 / 0 (~0.00)
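
Once the evaluation corpus is in place, the WER and PER slots above are typically filled in with Apertium's evaluation script. As a sketch of what those metrics measure (standard definitions assumed here, not the exact script): WER is word-level edit distance normalised by reference length, and PER is its order-insensitive counterpart.

```python
from collections import Counter

def wer(reference, hypothesis):
    """Word Error Rate: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    d = list(range(len(hyp) + 1))  # DP row: distances against empty reference
    for i, r in enumerate(ref, 1):
        prev_diag, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            prev_diag, d[j] = d[j], min(d[j] + 1,              # deletion
                                        d[j - 1] + 1,          # insertion
                                        prev_diag + (r != h))  # substitution
    return d[-1] / len(ref)

def per(reference, hypothesis):
    """Position-independent Error Rate: ignores word order entirely."""
    ref, hyp = reference.split(), hypothesis.split()
    matches = sum((Counter(ref) & Counter(hyp)).values())
    return (max(len(ref), len(hyp)) - matches) / len(ref)

print(wer("go after the sheep", "go the sheep"))        # 0.25
print(per("go after the sheep", "the sheep go after"))  # 0.0
```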