---
pretty_name: MicroGen3D
tags:
  - GenAI
  - LDM
  - 3d
  - microstructure
  - diffusion-model
  - materials-science
  - synthetic-data
  - voxel
license: mit
datasets:
  - microgen3D
language:
  - en
---

# microgen3D

[Code](https://github.com/baskargroup/MicroGen3D)

## Dataset Summary

microgen3D is a dataset of 3D voxelized microstructures for training, evaluating, and benchmarking generative models, in particular Conditional Latent Diffusion Models (LDMs). It includes both synthetic (Cahn-Hilliard) and experimental microstructures with two or three phases. Voxel grids range from 64³ up to 128×128×64.

The dataset consists of three microstructure types:

- Experimental microstructures
- 2-phase Cahn-Hilliard microstructures
- 3-phase Cahn-Hilliard microstructures

The two Cahn-Hilliard datasets are thresholded versions of the same simulation source. For each dataset type, we also provide pretrained generative model weights, comprising:

- `vae.ckpt` – Variational Autoencoder
- `fp.ckpt` – Feature Predictor
- `ddpm.ckpt` – Denoising Diffusion Probabilistic Model
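
The sketch below is not part of the official code; it is a minimal way to peek inside `sample_data.h5` with `h5py`. The key names stored in the file are not documented in this README, so the snippet simply enumerates whatever the file contains.

```python
# Minimal inspection sketch -- the dataset names inside sample_data.h5 are
# not documented here, so we just list whatever keys and shapes are stored.
import h5py

with h5py.File("data/sample_data.h5", "r") as f:
    for name, item in f.items():
        if isinstance(item, h5py.Dataset):
            # Expect 3D voxel grids such as 64x64x64 or 128x128x64
            print(name, item.shape, item.dtype)
        else:
            print(name, "(group)")
```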

## 📁 Repository Structure

```
microgen3D/
├── data/
│   └── sample_data.h5                 # Experimental or synthetic HDF5 microstructure file
├── models/
│   └── weights/
│       ├── experimental/
│       │   ├── vae.ckpt
│       │   ├── fp.ckpt
│       │   └── ddpm.ckpt
│       ├── two_phase/
│       └── three_phase/
└── ...
```

## 🚀 Quick Start

### 🔧 Setup Instructions

```bash
# 1. Clone the repo
git clone https://github.com/baskargroup/MicroGen3D.git
cd MicroGen3D

# 2. Set up the environment
python -m venv venv
source venv/bin/activate  # On Windows use: venv\Scripts\activate

# 3. Install dependencies
pip install -r requirements.txt
```

```python
# 4. Download the dataset and weights from Hugging Face
# Make sure huggingface_hub is installed and you're logged in: `huggingface-cli login`
from huggingface_hub import hf_hub_download

# Download sample data
hf_hub_download(repo_id="BGLab/microgen3D", filename="sample_data.h5", repo_type="dataset", local_dir="data")

# Download model weights
hf_hub_download(repo_id="BGLab/microgen3D", filename="vae.ckpt", local_dir="models/weights/experimental")
hf_hub_download(repo_id="BGLab/microgen3D", filename="fp.ckpt", local_dir="models/weights/experimental")
hf_hub_download(repo_id="BGLab/microgen3D", filename="ddpm.ckpt", local_dir="models/weights/experimental")
```

## ⚙️ Configuration

### Training Config (`config.yaml`)

- `task`: Auto-generated if left null
- `data_path`: Path to the training dataset (`../data/sample_train.h5`)
- `model_dir`: Directory to save model weights
- `batch_size`: Batch size for training
- `image_shape`: Shape of the 3D images, `[C, D, H, W]`

**VAE Settings:**

- `latent_dim_channels`: Number of latent-space channels
- `kld_loss_weight`: Weight of the KL-divergence loss
- `max_epochs`: Training epochs
- `pretrained`: Whether to use a pretrained VAE
- `pretrained_path`: Path to the pretrained VAE model

**FP Settings:**

- `dropout`: Dropout rate
- `max_epochs`: Training epochs
- `pretrained`: Whether to use a pretrained FP
- `pretrained_path`: Path to the pretrained FP model

**DDPM Settings:**

- `timesteps`: Number of diffusion timesteps
- `n_feat`: Number of feature channels in the U-Net; more channels give the model more capacity
- `learning_rate`: Learning rate
- `max_epochs`: Training epochs
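
For orientation, the sketch below writes an illustrative `config.yaml` using the fields described above. The exact key nesting and the numeric values are assumptions, not the schema guaranteed by `training.py`, so compare against the `config.yaml` shipped with the repository before training.

```python
# Illustrative only: the nesting and values below are assumptions based on
# the field descriptions above, not the exact schema read by training.py.
import yaml

config = {
    "task": None,                        # auto-generated if left null
    "data_path": "../data/sample_train.h5",
    "model_dir": "../models/weights/experimental",
    "batch_size": 4,
    "image_shape": [1, 64, 64, 64],      # [C, D, H, W]
    "vae": {"latent_dim_channels": 4, "kld_loss_weight": 1e-6,
            "max_epochs": 100, "pretrained": False, "pretrained_path": None},
    "fp": {"dropout": 0.1, "max_epochs": 100,
           "pretrained": False, "pretrained_path": None},
    "ddpm": {"timesteps": 1000, "n_feat": 64,
             "learning_rate": 1e-4, "max_epochs": 100},
}

with open("config.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)
```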

### Inference Parameters (`params.yaml`)

- `data_path`: Path to the inference/test dataset (`../data/sample_test.h5`)

**Training (for model init only):**

- `batch_size`, `num_batches`, `num_timesteps`, `learning_rate`, `max_epochs`: Optional parameters used only to initialize the models

**Model:**

- `latent_dim_channels`: Number of latent-space channels
- `n_feat`: Number of feature channels in the U-Net
- `image_shape`: Expected input image shape

**Attributes:**

- List of features/targets to predict:
  - `ABS_f_D`
  - `CT_f_D_tort1`
  - `CT_f_A_tort1`

**Paths:**

- `ddpm_path`: Path to the trained DDPM model
- `vae_path`: Path to the trained VAE model
- `fc_path`: Path to the trained FP model
- `output_dir`: Where to store inference results
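
An illustrative `params.yaml` can be generated the same way. Again, the nesting, checkpoint paths, and values below are assumptions drawn from the field list above rather than the exact schema read by `inference.py`.

```python
# Illustrative only: structure, paths, and values are assumptions based on
# the parameter descriptions above, not the exact schema read by inference.py.
import yaml

params = {
    "data_path": "../data/sample_test.h5",
    "training": {"batch_size": 4, "num_batches": 10, "num_timesteps": 1000,
                 "learning_rate": 1e-4, "max_epochs": 1},    # model init only
    "model": {"latent_dim_channels": 4, "n_feat": 64,
              "image_shape": [1, 64, 64, 64]},
    "attributes": ["ABS_f_D", "CT_f_D_tort1", "CT_f_A_tort1"],
    "paths": {"ddpm_path": "../models/weights/experimental/ddpm.ckpt",
              "vae_path": "../models/weights/experimental/vae.ckpt",
              "fc_path": "../models/weights/experimental/fp.ckpt",
              "output_dir": "../inference/results"},         # hypothetical output folder
}

with open("params.yaml", "w") as f:
    yaml.safe_dump(params, f, sort_keys=False)
```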

## 🏋️ Training

Navigate to the training folder and run:

```bash
cd training
python training.py
```

## 🧠 Inference

After training, switch to the inference folder and run:

```bash
cd ../inference
python inference.py
```

## 📜 Citation

If you use this dataset or the pretrained models, please cite:

```bibtex
@article{baishnab2025microgen3d,
  title={3D Multiphase Heterogeneous Microstructure Generation Using Conditional Latent Diffusion Models},
  author={Baishnab, Nirmal and Herron, Ethan and Balu, Aditya and Sarkar, Soumik and Krishnamurthy, Adarsh and Ganapathysubramanian, Baskar},
  journal={arXiv preprint arXiv:2503.10711},
  year={2025}
}
```

## ⚖️ License

This project is licensed under the MIT License.