AI & ML interests

Training DALL-E with volunteers from all over the Internet using hivemind and dalle-pytorch (NeurIPS 2021 demo)

This organization is a part of the NeurIPS 2021 demonstration "Training Transformers Together".

In this demo, we trained a model similar to OpenAI's DALL-E: a Transformer "language model" that generates images from text descriptions. Training happened collaboratively: volunteers from all over the Internet contributed using whatever hardware was available to them. We used LAION-400M, the world's largest openly available image-text-pair dataset, with 400 million samples. Our model was based on the dalle-pytorch implementation by Phil Wang, with a few tweaks to make it communication-efficient.
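
At a high level, each volunteer peer computes gradients on its own local batch, and the peers' gradients are averaged before every shared optimizer step. The following is a toy sketch of that averaging idea in plain Python; it is not the actual hivemind protocol, and the peer batches and learning rate are made up for illustration:

```python
# Toy illustration of collaborative gradient averaging (NOT the real
# hivemind protocol): each "peer" computes a gradient on its own local
# batch, and the averaged gradient drives one shared SGD update.

def local_gradient(weights, batch):
    """Gradient of mean squared error for a linear model y = w . x."""
    grad = [0.0] * len(weights)
    for x, y in batch:
        pred = sum(w * xi for w, xi in zip(weights, x))
        err = pred - y
        for i, xi in enumerate(x):
            grad[i] += 2 * err * xi / len(batch)
    return grad

def averaged_step(weights, peer_batches, lr=0.1):
    """Average gradients across all peers, then apply one SGD step."""
    grads = [local_gradient(weights, b) for b in peer_batches]
    avg = [sum(g[i] for g in grads) / len(grads) for i in range(len(weights))]
    return [w - lr * g for w, g in zip(weights, avg)]

# Three hypothetical volunteer peers, each holding its own tiny batch.
peers = [
    [([1.0, 0.0], 2.0)],
    [([0.0, 1.0], 3.0)],
    [([1.0, 1.0], 5.0)],
]
weights = [0.0, 0.0]
for _ in range(200):
    weights = averaged_step(weights, peers)
print([round(w, 2) for w in weights])  # converges toward [2.0, 3.0]
```

Because only gradients (not raw data) are exchanged, a peer that joins or leaves mid-run merely changes how many gradients enter the average, which is what lets volunteers with heterogeneous hardware come and go.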

See details about how it works on our website.

This organization gathers people who participated in the collaborative training and provides links to related materials.

The materials below were available during the training run itself:

  • 👉 Starter kits for Google Colab and Kaggle (an easy way to join the training)
  • 👉 Dashboard (the current training state: loss, number of peers, etc.)
  • 👉 Weights & Biases plots for aux peers (aggregating the metrics) and actual trainers (contributing their GPUs)

Feel free to reach us on Discord if you have any questions 🙂
