AI With Values: Our Commitment to Ethical Technology Partners
15th January 2026

When most people think of AI assistants, they think of ChatGPT. It's become the household name, the default answer when someone says "AI." But recent revelations have made it clear that the most popular option isn't always the most ethical one.

At Polyspiral, we've always believed that our choice of tools and partners reflects our values. That's why we power our websites with 100% renewable energy. That's why we've spent 26 years building personal relationships with our clients rather than treating them as tickets in a queue. And that's why, when it comes to AI, we've chosen our technology partners carefully.

The Wake-Up Call

In recent weeks, troubling news has emerged about two of the biggest names in AI:

Grok (owned by Elon Musk's xAI) experienced catastrophic safety failures that allowed users to generate sexualised images of children. When asked for comment, the company responded with an automated message: "Legacy Media Lies." Even after global backlash from governments in Britain, France, India, and Brazil, and from the European Union, Musk claimed he was "not aware of any naked underage images generated by Grok. Literally zero"—despite overwhelming evidence to the contrary.

ChatGPT/OpenAI (led by Sam Altman) has faced its own controversies. While OpenAI maintains stronger technical safeguards than Grok, the timing of recent decisions raises serious questions. Less than 24 hours after successfully lobbying against California legislation requiring AI safety guardrails for minors, Altman announced that ChatGPT would allow adults to create "erotica." When criticised, he defended the decision by saying OpenAI is "not the elected moral police of the world."

California Assemblymember Rebecca Bauer-Kahan said it plainly: "This announcement proves their families were right that AI companies will never regulate themselves. They will always choose profits over children's lives."

These aren't abstract concerns. These are real failures with real consequences for real people, particularly vulnerable children.

Why We Choose Differently

At Polyspiral, we use two AI tools: Claude (by Anthropic) and Microsoft Copilot. Here's why:

Claude (Anthropic)

Claude is built by a company that genuinely prioritises safety from the ground up:

  • 18+ only platform with active enforcement—not just terms of service lip service
  • Text-only by design: Claude deliberately doesn't generate images or videos to prevent misuse
  • Immediate reporting: All child safety violations are reported to the National Center for Missing & Exploited Children, with instant account bans
  • Safety by Design principles: Anthropic signed onto industry-leading child protection commitments and publishes transparent progress reports
  • Constitutional AI: Claude is trained using ethical principles baked into its foundation, not added as an afterthought

Microsoft Copilot

Microsoft brings decades of enterprise-grade security and compliance to AI:

  • Explicit prohibitions against creating child exploitation material or engaging in grooming
  • NCMEC reporting: All violations are reported to the National Center for Missing & Exploited Children
  • Parental controls: Built-in tools for families who want to supervise children's use
  • Enterprise security: The same rigorous security that protects Fortune 500 companies

What About Google Gemini?

Google Gemini has committed to Safety by Design principles and prohibits child sexual abuse material. However, Common Sense Media—a leading nonprofit that rates technology for children—rated both Gemini's "Under 13" and "Teen Experience" platforms as "High Risk." Their assessment found that these appear to be adult versions with filters added, rather than platforms designed for child safety from the ground up. Despite the filters, children can still be exposed to inappropriate material, and the platforms fail to recognise serious mental health symptoms.

While Google's policies exist, the implementation hasn't met the standards we require for our business partners.

Ethics and the Environment: Our Dual Commitment

Just as we're committed to ethical AI partnerships, we're equally committed to environmental responsibility. AI has a significant carbon footprint: training large language models and running queries both consume substantial energy. This is an area of ongoing concern and development.

Our commitment: As more eco-friendly AI solutions are developed, we will evaluate them and switch to those that meet our standards. Ethics isn't just about child safety—it's also about planetary responsibility. We'll continue monitoring:

  • Energy efficiency of AI models and infrastructure
  • Renewable energy usage by AI providers
  • Carbon footprint transparency and offsetting programs
  • Sustainable AI development practices

Just as we moved Polyspiral to 100% renewable energy hosting in 2018, we're prepared to lead the way in choosing environmentally responsible AI tools as they emerge.

ChatGPT Isn't Synonymous with AI

The biggest misconception we encounter is that "AI" means "ChatGPT." It's understandable—OpenAI's marketing has been exceptional. But accepting that conflation is like saying "photocopying" means "Xeroxing" or "searching" means "Googling." It limits your options and obscures the ethical differences between providers.

There are multiple AI assistants, each with different:

  • Safety protocols
  • Environmental footprints
  • Business models
  • Values and priorities
  • Transparency levels

Choosing between them matters.

Why This Matters for Small Businesses

You might wonder: "I'm a small business owner, not a tech company. Why should I care which AI my web designer uses?"

Here's why it matters:

  1. Values alignment: If you've chosen Polyspiral because we share your values (environmental responsibility, personal service, ethical business practices), then you deserve to know we apply those same standards to our tools.
  2. Your reputation: When we create content, optimise your SEO, or build features for your site using AI assistance, you want to know it's been done ethically.
  3. Supporting the right companies: Every business decision is a vote for the kind of world we want. When we choose ethical AI providers, we're voting for companies that prioritise safety over growth-at-any-cost.
  4. The bigger picture: The AI industry is moving incredibly fast, often without adequate oversight. The only brake on harmful practices is businesses and individuals voting with their wallets and choosing ethical providers.

Our Promise

At Polyspiral, ethics matter. That's not just a tagline—it's reflected in:

  • Our switch to 100% renewable energy hosting in 2018
  • Our 26 years of commitment to personal, one-to-one client relationships
  • Our unlimited revisions and comprehensive training
  • Our transparent business practices
  • And now, our careful selection of AI technology partners

We will continue to:

  • Evaluate AI providers based on safety records and ethical practices
  • Prioritise tools built with child safety and environmental responsibility in mind
  • Switch to more eco-friendly AI solutions as they're developed
  • Be transparent with you about the tools we use and why
  • Put values before convenience or cost

Join the Conversation

We'd love to hear your thoughts. Have you considered which AI tools you or your service providers use? Do ethics factor into your technology decisions?

LinkedIn Poll: In light of recent AI safety revelations, which AI assistant do you use in your business?

  • ChatGPT/OpenAI
  • Claude (Anthropic)
  • Microsoft Copilot
  • Google Gemini
  • Other/None

Visit our LinkedIn page to vote and share your perspective.

About Polyspiral

Polyspiral has been creating award-winning websites for small businesses and charities across Suffolk and Essex since 2010, and our sites have been powered by 100% renewable energy since 2018. We believe great web design goes hand-in-hand with ethical business practices—from our environmental commitments to our choice of technology partners.

Need a web designer who shares your values? Get in touch.

Because ethics matter—in hosting, in AI, and in every choice we make.
