eco

Our Impact

250 mL of estimated data-center cooling water avoided when a query runs locally instead of through a traditional AI data center.

Water

Every query to ChatGPT drinks a sip of water.

Large AI data centers rely on evaporative cooling towers that consume enormous volumes of fresh water. Microsoft's 2023 sustainability report revealed a 34% year-over-year increase in water consumption [2], driven largely by AI workloads. Google's data centers consumed 5.6 billion gallons of water that same year [3].

Researchers at UC Riverside found that a single conversation with GPT-4 uses approximately 500 mL of cooling water, roughly a full water bottle. Averaged over a typical two-query exchange, that works out to about 250 mL per query [1].

Eco is designed to use local devices and contributor hardware instead of dedicated AI data-center racks. That can avoid the evaporative cooling footprint this estimate is based on.

250 mL per query (traditional data center)

~0 mL of data-center cooling water (Eco)

Energy

Your GPU was going to be on anyway.

Data centers currently consume 1–1.5% of global electricity [4]. Goldman Sachs projects that AI workloads alone will drive a 160% increase in data center power demand by 2030 [5].

Eco contributors run on existing consumer hardware — gaming PCs, workstations, and home servers that are already powered on. The marginal energy cost of running inference on a machine that's already drawing power is a fraction of what a dedicated data center rack consumes.

We don't claim energy savings — consumer hardware varies too much for honest estimates. What we can say is that decentralized inference redistributes AI workloads across hardware that already exists, rather than building new power-hungry infrastructure.

1.5% of global electricity (data centers)

~0% marginal demand (Eco contributors)

Decentralization

Privacy shouldn’t cost the planet.

Centralized AI forces a false trade-off: use a service that collects your data, or don't use AI at all. Eco launches with on-device chat first, plus clearly labeled network privacy modes as contributor access opens. Zero-Knowledge privacy is a research direction, not part of today's launch product.

And because Eco runs on distributed consumer hardware instead of centralized data centers, privacy doesn't come at the cost of building more infrastructure. No new cooling towers. No new server farms. Just the computers that already exist in people's homes.

Decentralized contributor routing is the direction of travel. Today, Eco's strongest launch guarantee is simpler: your first chats can run locally in your browser.

  1. Encrypted: TLS encryption in transit, with separate encryption at rest
  2. Confidential: hardware-backed enclaves designed so that even node operators cannot read your data
  3. Private: split inference, so each contributor sees only a fragment of a request
  4. Zero-Knowledge (coming soon): a research direction for future privacy guarantees

How we calculate impact

Water savings per query
Each AI query to a traditional data center uses approximately 250 mL of cooling water — the midpoint of the 200–300 mL range identified by researchers at the University of California, Riverside for GPT-4 class models. Eco’s local and contributor paths avoid the data-center evaporative cooling footprint used in this estimate. Individual devices still consume electricity and may have ordinary hardware cooling.
What we count
Only completed inference queries where a contributor generated a full response. Failed, timed-out, and cached queries are excluded from our totals.
What we don’t count
We don’t claim carbon offsets. We don’t estimate energy savings — there are too many variables in consumer hardware configurations to make honest claims. We report only what we can directly measure.
water_saved = total_queries × 0.25 L
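The formula above, combined with the counting rules, can be sketched as a short Python function. The `status` field names here are illustrative placeholders, not Eco's actual schema.

```python
# Sketch of the water-savings calculation described above.
# Query record shape and status values are hypothetical.

LITERS_PER_QUERY = 0.25  # midpoint of the 200-300 mL UC Riverside range


def water_saved_liters(queries):
    """Count only completed inference queries; failed, timed-out,
    and cached queries are excluded, per the methodology above."""
    completed = [q for q in queries if q["status"] == "completed"]
    return len(completed) * LITERS_PER_QUERY


# Example: 3 completed queries, 1 failed, 1 cached
queries = [
    {"status": "completed"},
    {"status": "completed"},
    {"status": "failed"},
    {"status": "cached"},
    {"status": "completed"},
]
print(water_saved_liters(queries))  # 0.75
```

Excluding cached responses keeps the counter conservative: a cached answer never triggered fresh inference, so no data-center water was at stake.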

Our methodology is deliberately conservative. We'd rather understate our impact than overclaim it. All source code — including the calculation you see on this page — is open source under AGPL-3.0.

Verify our methodology

Sources

[1] Li, P. et al. "Making AI Less Thirsty." University of California, Riverside. Communications of the ACM, 2024.
[2] Microsoft Environmental Sustainability Report 2023: 34% year-over-year increase in water consumption.
[3] Associated Press. "Google's data centers consumed 5.6 billion gallons of water in 2023." 2024.
[4] International Energy Agency. "Data Centres and Data Transmission Networks." Electricity 2024 Report.
[5] Goldman Sachs Research. "AI is poised to drive 160% increase in data center power demand." 2024.

Eco launches as a chat-first web product. Trust, legal, and impact pages stay close; desktop, developer, contributor, and governance stories return only when they are ready.
