Our Impact
Estimated data-center cooling water avoided when a query runs locally instead of through a traditional AI data center.
Water
Large AI data centers rely on evaporative cooling towers that consume enormous volumes of fresh water. Microsoft's 2023 sustainability report revealed a 34% year-over-year increase in water consumption [2], driven largely by AI workloads. Google's data centers consumed 5.6 billion gallons of water that same year [3].
Researchers at UC Riverside found that a single conversation with GPT-4 uses approximately 500 mL of cooling water — roughly a full water bottle. Treating a conversation as two queries on average, that works out to about 250 mL per query [1].
Eco is designed to use local devices and contributor hardware instead of dedicated AI data-center racks. That can avoid the evaporative cooling footprint this estimate is based on.
≈250 mL per query — data center
data-center cooling avoided
Energy
Data centers currently consume 1–1.5% of global electricity [4]. Goldman Sachs projects that AI workloads alone will drive a 160% increase in data center power demand by 2030 [5].
Eco contributors run on existing consumer hardware — gaming PCs, workstations, and home servers that are already powered on. The marginal energy cost of running inference on a machine that's already drawing power is a fraction of what a dedicated data center rack consumes.
We don't claim energy savings — consumer hardware varies too much for honest estimates. What we can say is that decentralized inference redistributes AI workloads across hardware that already exists, rather than building new power-hungry infrastructure.
1–1.5% of global electricity — data centers
marginal demand — Eco contributors
Decentralization
Centralized AI forces a false trade-off: use a service that collects your data, or don't use AI at all. Eco launches with on-device chat first, plus clearly labeled network privacy modes as contributor access opens. Zero-Knowledge privacy is a research direction, not part of today's launch product.
And because Eco runs on distributed consumer hardware instead of centralized data centers, privacy doesn't come at the cost of building more infrastructure. No new cooling towers. No new server farms. Just the computers that already exist in people's homes.
Decentralized contributor routing is the direction of travel. Today, Eco's strongest launch guarantee is simpler: your first chats can run locally in your browser.
Encrypted
TLS encryption in transit, encryption at rest
Confidential
Hardware-backed enclaves designed so no one else can access your data
Private
Split inference — contributors only see fragments
Zero-Knowledge (coming soon)
Research direction for future privacy guarantees
water_saved = total_queries × 0.25 L

Our methodology is deliberately conservative. We'd rather understate our impact than overclaim it. All source code — including the calculation you see on this page — is open source under AGPL-3.0.
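As a sketch of how the formula above can be computed (the function and constant names here are illustrative, not taken from Eco's actual source):

```typescript
// Illustrative sketch of the published formula; names are assumptions,
// not identifiers from Eco's AGPL-3.0 codebase.
const LITERS_PER_QUERY = 0.25; // ~250 mL of cooling water per data-center query [1]

/** Estimated liters of data-center cooling water avoided. */
function waterSaved(totalQueries: number): number {
  if (!Number.isInteger(totalQueries) || totalQueries < 0) {
    throw new RangeError("totalQueries must be a non-negative integer");
  }
  return totalQueries * LITERS_PER_QUERY;
}

// Example: one million local queries avoid an estimated 250,000 L.
console.log(waterSaved(1_000_000)); // 250000
```

Because 0.25 is exactly representable in binary floating point, the multiplication introduces no rounding error for whole-number query counts.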