
Commit 7f41190

committed
add training steps
1 parent 4844a53 commit 7f41190

1 file changed (+25 / -1 lines)


README.md

@@ -4,6 +4,31 @@ This is the PyTorch implementation for [Dual Learning for Machine Translation](h
 
 The NMT models used as channels depend heavily on [pcyin/pytorch\_nmt](https://github.com/pcyin/pytorch_nmt).
 
+### Usage
+
+You should prepare these models for the dual learning step:
+- Language Models x 2
+- Translation Models x 2
+
+##### Warm-up Step
+
+- Language Models \
+Check here: [lm/](https://github.com/yistLin/pytorch-dual-learning/tree/master/lm)
+- Translation Models \
+Check here: [nmt/](https://github.com/yistLin/pytorch-dual-learning/tree/master/nmt)
+
+##### Dual Learning Step
+
+During the reinforcement learning process, the language models and translation models are used to compute rewards, and the translation models are updated accordingly. \
+You can find more details in the paper.
+
+- Training \
+You can simply use this [script](https://github.com/yistLin/pytorch-dual-learning/blob/master/train-dual.sh),
+but you have to modify the paths and model names to match your own models.
+- Test \
+To use the trained models, you can simply treat them as [NMT models](https://github.com/pcyin/pytorch_nmt).
+
+
 ### Test (Basic)
 
 Firstly, we trained our basic model with 450K bilingual pairs, which is only 10% of the data, as a warm start. Then, we set up a dual-learning game and trained the two models using the reinforcement technique.
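The dual learning step follows the paper's scheme: a translation sampled from the forward model is scored by the target-side language model (fluency) and by the reverse model's reconstruction likelihood, and the two scores are interpolated into one reward with a small weight α (the paper uses 0.005). A minimal sketch of that reward with stand-in scoring functions; none of these names come from this repository:

```python
import math

# Sketch of the dual-learning reward, with stand-in scorers.
# All names and toy scoring rules here are illustrative, not the repo's code.

ALPHA = 0.005  # small weight on the LM (fluency) reward, as in the paper

def lm_score(sentence):
    """Stand-in for the target-side language-model reward r1."""
    return -0.5 * len(sentence.split())

def reconstruction_logprob(source, back_translation):
    """Stand-in for r2 = log P(source | mid; reverse model): rewards overlap."""
    src, back = set(source.split()), set(back_translation.split())
    overlap = len(src & back) / max(len(src), 1)
    return math.log(overlap + 1e-9)

def dual_reward(source, mid, back_translation, alpha=ALPHA):
    """Interpolated reward r = alpha * r1 + (1 - alpha) * r2."""
    r1 = lm_score(mid)
    r2 = reconstruction_logprob(source, back_translation)
    return alpha * r1 + (1 - alpha) * r2

source = "das ist ein test"
mid = "this is a test"      # sampled from the DE->EN model
back = "das ist ein test"   # decoded back by the EN->DE model
reward = dual_reward(source, mid, back)
```

In the actual training loop this reward scales the policy-gradient updates of both translation models; `train-dual.sh` wires the warm-started models into that loop.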
@@ -52,4 +77,3 @@ Firstly, we trained our basic model with 450K bilingual pair, which is only 10%
 | EN-DE (bleu) | | 21.42 | 21.57 | 21.55 | 21.55 | | | | |
 | DE-EN | 24.69 | 25.90 | 25.89 | 25.91 | 26.03 | 25.94 | 26.02 | 26.18 | 26.20 |
 | DE-EN (bleu) | | 25.96 | 26.25 | 26.22 | 26.18 | | | | |
-
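The warm-up language models from lm/ are reused during dual learning as fluency scorers for sampled translations. As a toy illustration of that scoring interface only (the repository's actual LMs are neural; the add-one-smoothed bigram model below is an assumption for demonstration):

```python
import math
from collections import Counter

# Toy fluency scorer standing in for the warm-up language models.
# It exposes the same interface idea: score a sampled sentence with a
# length-normalized log-probability usable as a reward.

def train_bigram_lm(corpus):
    """Count unigram and bigram frequencies over a list of sentences."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        toks = ["<s>"] + sent.split() + ["</s>"]
        unigrams.update(toks[:-1])
        bigrams.update(zip(toks[:-1], toks[1:]))
    return unigrams, bigrams, len(unigrams)

def lm_logprob(model, sentence):
    """Length-normalized log-probability of a sentence under the bigram LM."""
    unigrams, bigrams, vocab = model
    toks = ["<s>"] + sentence.split() + ["</s>"]
    total = 0.0
    for a, b in zip(toks[:-1], toks[1:]):
        # add-one smoothing keeps unseen bigrams finite
        total += math.log((bigrams[(a, b)] + 1) / (unigrams[a] + vocab))
    return total / (len(toks) - 1)

model = train_bigram_lm(["this is a test", "this is another test"])
```

A fluent word order scores higher than a scrambled one, which is exactly the signal the dual learning step needs from its language models.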
