
Personal Project
As AI audio and video tools improve, distinguishing real media from fake gets harder. Signal Seal was an attempt to solve that: protect content from AI training, prove authenticity, and detect AI-generated media. I worked on this with a developer colleague; he trained the models, I handled product and design. The technology actually worked. The problem was who we were competing against.
Three core features:
Content Protection
Watermark audio and video files. The watermark is practically imperceptible, but it renders the content useless for AI training.
Authentication
Prove who created something. Detect if content was edited after signing.
AI Detection
Point it at any audio or video file to determine whether it's AI-generated. If the file was signed with Signal Seal, identify the original author.
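The authentication idea can be illustrated with a minimal sketch. This is hypothetical, not Signal Seal's actual scheme: it uses a shared-secret HMAC (a real product would use asymmetric signatures), but it shows the core behavior of binding a creator's key to exact file bytes and detecting any post-signing edit.

```python
import hashlib
import hmac

# Hypothetical creator key; a production system would use an asymmetric key pair
# so verifiers never hold the signing secret.
SECRET = b"creator-signing-key"

def sign(content: bytes) -> str:
    """Bind the creator's key to the exact bytes of the file."""
    return hmac.new(SECRET, content, hashlib.sha256).hexdigest()

def verify(content: bytes, signature: str) -> bool:
    """True only if the content is byte-identical to what was signed."""
    return hmac.compare_digest(sign(content), signature)

original = b"raw audio bytes"
sig = sign(original)
print(verify(original, sig))         # unedited content verifies
print(verify(original + b"!", sig))  # any edit after signing is caught
```

Even a one-byte change flips verification to False, which is what lets a verifier prove the file was edited after signing.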

The results were genuinely impressive. Audio protection broke voice-clone attempts. Video protection made content unlearnable for AI models. Detection worked against most generation tools.
[Screenshot: Desktop, Protect view]

[Screenshot: Mobile, AI Detector view]
Cutting-edge tools like ElevenLabs could still read protected content. The core promise couldn't hold against well-funded labs, and playing cat and mouse with billion-dollar companies isn't a game you win as a side project. The technology worked; we just discovered the true scale of the problem.