---
license: cc-by-nc-4.0
pipeline_tag: text-generation
---

Proctora is a MoE model made of:

- OpenPipe/mistral-ft-optimized-1227 as a base model,
- SanjiWatsuki/Kunoichi-7B as a first expert dedicated to RP tasks,
- samir-fama/SamirGPT-v1 as a second expert for factual answers.
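
Since the card does not include a usage snippet, here is a minimal loading sketch with 🤗 Transformers. The repo id `Karko/Proctora`, the dtype/device settings, and the prompt are my assumptions for illustration, not part of the original card.

```python
# Minimal sketch: load Proctora as a standard causal LM and generate text.
# The repo id below is assumed from this repository's location; adjust if needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Karko/Proctora"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit consumer GPUs
    device_map="auto",
)

prompt = "You are the game master of a short fantasy adventure. Set the opening scene."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```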

I do not yet have any metrics for this model beyond subjective impressions. It was initially made out of curiosity and experimentation.

My goal is to produce a model that excels at being a game master for RPG sessions. However, being dissatisfied with the existing evaluation tool-sets, I decided to create my own (still a WIP as of 01/16/24). Among my collection of small/medium models, Proctora gave me the best results at evaluating the answers produced by other LLMs. So, somewhat to my surprise, I settled on it and gave it a name appropriate to that task.
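
To illustrate that evaluation use, here is a hedged sketch of a judge-style prompt. The prompt wording, the 1–10 scale, and the repo id are illustrative assumptions on my part; they are not the actual WIP evaluation tool-set mentioned above.

```python
# Sketch: ask Proctora to score another model's answer (LLM-as-judge style).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Karko/Proctora",  # hypothetical repo id, same assumption as above
    device_map="auto",
)

question = "The party asks the innkeeper about rumors in town. Describe her reply."
candidate_answer = "The innkeeper shrugs and says nothing interesting ever happens here."

judge_prompt = (
    "You are a strict evaluator of game-master responses.\n"
    f"Question: {question}\n"
    f"Answer: {candidate_answer}\n"
    "Rate the answer from 1 to 10 for creativity and coherence, then justify the score."
)

result = generator(judge_prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])
```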

TLDR: Proctora is a tool for a tool!

I doubt this model will be useful to the community. I am publishing it for the sake of transparency about my creative process.