
17× Greener: How One AI Model Could’ve Saved 10,590 kg CO₂e

Training AI models in cleaner regions can cut emissions by over 90%. Stable Diffusion's training run could have saved 10,590 kg CO₂e with smarter compute.

Not only does CarbonRunner bring carbon-aware infrastructure to CI/CD workflows on GitHub Actions, it also offers a powerful API to help you train AI/ML models in the lowest-carbon regions.

Instead of defaulting to fixed locations like us-east-1 in N. Virginia, CarbonRunner dynamically finds and shifts workloads to the greenest regions available, with no major infrastructure changes required.
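
To make the idea concrete, here is a minimal Python sketch of carbon-aware region selection: look up the current grid intensity for a set of candidate regions and launch the training job in the cleanest one. The region list, intensity values, and the `launch_training_job` helper are illustrative placeholders, not CarbonRunner's actual API.

```python
from typing import Dict

# Illustrative candidate regions with example grid intensities in gCO2e/kWh.
# In practice these values would come from a live carbon-intensity feed,
# not hard-coded constants.
CANDIDATE_REGIONS: Dict[str, float] = {
    "us-east-1": 430.0,    # N. Virginia, fossil-heavy grid
    "ca-central-1": 30.0,  # hydro-heavy grid
    "eu-north-1": 24.0,    # Nordic grid, largely hydro and wind
}


def pick_greenest_region(intensities: Dict[str, float]) -> str:
    """Return the candidate region with the lowest current carbon intensity."""
    return min(intensities, key=intensities.get)


def launch_training_job(region: str) -> None:
    """Placeholder for whatever actually starts the training run."""
    print(f"Launching training job in {region}")


if __name__ == "__main__":
    greenest = pick_greenest_region(CANDIDATE_REGIONS)
    launch_training_job(greenest)  # -> eu-north-1 with this example data
```

The point is that the scheduling decision is a single lookup made just before the job starts; the training code itself does not change.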


The team behind Stable Diffusion, a popular text-to-image AI model, used the Machine Learning Impact calculator to estimate the footprint of training the model:

  • Hardware: A100 PCIe 40GB
  • Usage: 150,000 hours (Yikes!)
  • Cloud Provider: AWS
  • Region: US-East (N. Virginia)
  • Grid Intensity: 430g CO₂e/kWh
  • Total Emissions: 11,250 kg CO₂e

Had this workload been run with a carbon-aware approach like CarbonRunner, which shifts compute to cleaner grids averaging just 24g CO₂e/kWh, emissions would have been cut 17×, saving 10,590 kg CO₂e, roughly the equivalent of flying from New York to London 11 times.
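
The arithmetic behind these figures is simple enough to check, as the short Python sketch below does. The energy use is not stated above, so it is backed out from the reported total and the US-East grid intensity; treat it as an estimate rather than a published figure.

```python
# Reported figures from the Stable Diffusion estimate above.
TOTAL_EMISSIONS_KG = 11_250   # training emissions in US-East (N. Virginia)
DIRTY_INTENSITY = 430         # gCO2e/kWh, US-East grid
CLEAN_INTENSITY = 24          # gCO2e/kWh, average low-carbon grid

# Back out the implied energy use: kg * 1000 / (g/kWh) = kWh (~26,160 kWh).
energy_kwh = TOTAL_EMISSIONS_KG * 1000 / DIRTY_INTENSITY

# Rerun the same workload on the cleaner grid.
clean_emissions_kg = energy_kwh * CLEAN_INTENSITY / 1000   # ~628 kg
reduction_factor = DIRTY_INTENSITY / CLEAN_INTENSITY       # ~17.9x
saved_kg = TOTAL_EMISSIONS_KG - clean_emissions_kg         # ~10,620 kg

print(f"Reduction factor: {reduction_factor:.1f}x")
print(f"Saved: {saved_kg:,.0f} kg CO2e")
# Rounding the factor down to 17x, as the article does, gives the
# quoted figure of roughly 10,590 kg CO2e saved.
```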




Want to Shift AI Workloads to Cleaner Grids?

Stable Diffusion burned 11,250 kg CO₂e on US-East. The same training on a low-carbon grid would’ve cut emissions by 94%. Clean compute isn’t slower — just smarter and cheaper!