This model was trained in four stages:

Base model -- 1 GB of semi-structured pretraining data:

- Base pretraining phase 1 (Constant LR, text completion -- 20,000 steps, 2/3 of an epoch)
- Base pretraining phase 2 (Cosine LR, text completion -- 10,000 steps, 1/3 of an epoch)

Merge the LoRA into the instruct model -- 100 MB of structured story-instruct data:

- Story-instruct tune phase 1 (Constant LR, ~1250 steps, 1 epoch)
- Story-instruct tune phase 2 (Cosine LR, ~1250 steps, 1 epoch)
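
Each pair of phases above follows the same pattern: hold the learning rate constant, then decay it with a cosine schedule. A minimal sketch of that piecewise schedule, using the base-pretraining step counts from the list (the `base_lr` and `min_lr` values are illustrative assumptions, not from the training run):

```python
import math

def lr_at_step(step, base_lr=2e-5, constant_steps=20_000,
               cosine_steps=10_000, min_lr=0.0):
    """Phase 1: constant LR. Phase 2: cosine decay from base_lr to min_lr."""
    if step < constant_steps:
        return base_lr
    # Fraction of the cosine phase completed, clamped to [0, 1].
    progress = min((step - constant_steps) / cosine_steps, 1.0)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))

# Constant through phase 1, then smoothly annealed to min_lr by step 30,000.
assert lr_at_step(0) == lr_at_step(19_999)
assert lr_at_step(30_000) <= lr_at_step(25_000) <= lr_at_step(20_000)
```

The story-instruct phases would use the same shape with ~1250 steps per phase.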