Tags: Transformers · Safetensors · English

This repository contains a fine-tuned Llama model for extracting recombination examples from scientific abstracts, as described in the paper CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature. The model is a LoRA adapter trained on top of a Llama base model and is intended for the information-extraction task of identifying idea recombinations in scientific text. For detailed usage instructions and to reproduce the paper's results, please refer to the project's GitHub repository.
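
Below is a minimal loading sketch using transformers and PEFT. The base checkpoint ID and the prompt format are assumptions for illustration only; consult the GitHub repository for the exact base model and prompting used in the paper.

```python
# Minimal sketch: apply the LoRA adapter to a Llama base model and run extraction.
# BASE_MODEL and the prompt below are assumptions -- verify against the official repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # assumed base checkpoint
ADAPTER = "noystl/llama-8b-e2e"

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, ADAPTER)
model.eval()

# Illustrative prompt: ask for recombination extraction from a single abstract.
abstract = "..."  # paste a scientific abstract here
prompt = f"Extract the recombination example from the following abstract:\n{abstract}\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```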

BibTeX

@misc{sternlicht2025chimeraknowledgebaseidea,
      title={CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature}, 
      author={Noy Sternlicht and Tom Hope},
      year={2025},
      eprint={2505.20779},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.20779}, 
}
