This repository provides an int8-quantized, ahead-of-time (AoT) compiled binary of Flux.1-Dev.

Follow this gist for details on how the binary was obtained and how to run inference with it.

Model tree for sayakpaul/flux.1-dev-int8-aot-compiled
