Today we’re launching Veri beta.
We rebuilt the system three times based on feedback from our alpha users. The analytics pipeline, the voice matching system, and the entire product architecture all got torn down and rebuilt from scratch because our early users told us what was working and, more importantly, what wasn’t.
Every major feature in this version traces back to a real conversation with a real creator. The workflows in Veri aren’t ones we designed in a vacuum. They’re workflows we designed together with the creators who use them.
The point
There are “AI for creators” tools built to replace the creative process. Entire teams building toward it. They scrape your peers, auto-generate everything, and remove you from the equation. The result is spam, slop, and content that actively damages your brand. That’s not what we’re building and it’s not what creators are asking for.
Here’s our take: AI is not a replacement for taste. It’s a self-scale tool. It lets you do more of what you actually want to do, however you want to do it.
Some creators want AI to generate a first draft they can rip apart and rewrite. Some creators want to write every word themselves and then run it through a multi-model council to pressure test their ideas before they commit production time. Some want research pulled and organized so they can focus on the creative. Some want all of the above depending on the content.
That’s why we rebuilt Veri as an OS, not a single tool. How you use it is up to you, and the combinations are endless. We’re not here to dictate a workflow. We’re here to give you the system and let you decide what your process looks like.
And, we hear your concern. “AI content is slop.” We get it. But we also firmly believe that slop is slop whether you write it with a pencil, a keyboard, or AI. Bad content has always existed. What makes content good is taste, perspective, and a creator who actually has something to say. Veri doesn’t change that equation. It just gives creators with taste better tools to work faster and do more of what they’re already good at.
What we fixed since alpha
Analytics that actually match your dashboards. Our alpha ingestion layer had issues with YouTube’s API. Creators saw numbers that didn’t line up with YT Studio. We fixed that, then rebuilt it again to add cross-platform support. Veri now normalizes YouTube and TikTok metrics in one dashboard. Instagram and X are next.
Next Best Move cards are gone. Alpha had daily Moves cards and they felt like homework. Creators didn’t want an AI telling them what to do, especially when it was trash. So, we listened and scrapped the entire concept. The engine behind it has been rebuilt as an ambient agent that works in the background, surfacing signal from your analytics and content without forcing action items you didn’t ask for. It notices what’s quietly working that you may have missed and proposes ideas when you ask for them.
Voice DNA. The first version of our voice matching was technically impressive and practically lifeless. Scripts came out sounding like a textbook, not like the creator. We scrapped it and rebuilt a five-stage pipeline that captures rhythm, pacing, tone, and how your voice shifts across content types. It’s not perfect yet. But it’s a real step toward “sounds like me.” And, if you’d rather write your own scripts and use Voice DNA as a pressure test against your existing voice profile, that works too.
From tools to OS. Alpha was a collection of disconnected features and surfaces. Beta is an operating system. One persistent memory layer. One context graph. Seven specialized agents that work as a connected pipeline or individually, however you want to use them. The system remembers your channel, your strategy, and your feedback across every session. The more you use it, the smarter it gets about you, because you’re training it on your own data, your own patterns, your own creative decisions.
What’s in the beta
- Unified analytics across YouTube and TikTok with normalized cross-platform metrics
- Veri Chat with persistent memory that learns your channel over time
- Ideas Agent that pulls from your analytics and content history to generate concepts worth exploring
- Concept Check and multi-model council to pressure test ideas and scripts before committing production time
- Research Agent with verified, cited sources
- Script Refiner powered by Voice DNA, for generating drafts or refining ones you wrote yourself
- Thumbnail Generator trained on your visual brand and CTR data
- Title Generator adapted per platform
- Package Agent that bundles everything into a portable doc you can review or iterate on
- Artifacts Library where all content lives with version history
Nothing auto-publishes. Every output is a draft for you to edit, reject, or throw away. Use the full pipeline or pick the pieces that fit how you work.
For our alpha users
If you were one of our first 100 alpha users, check your DMs and emails. We’re sending bonus VeriPoints your way as a thank you for helping us build this. You shaped the product. The beta you’re getting access to today is a direct result of the feedback, the frustration, and the honesty you brought to every session.
We’re turning on new features for active testers as each one ships. The more you use the product and tell us what’s working, the faster it gets better. That’s how we’ve built everything so far and we’re not changing the process.
What’s next
Instagram and X integrations. Ambient Inbox that surfaces ideas from your analytics changes without prompting. Deeper TikTok data coverage. More granular image generation. Semantic search across every artifact and conversation.
Up next on the content side: a series on how creators are actually using Veri. The workflows people are building, the combinations that are working, the things we didn’t expect. A cookbook for what’s possible.
We built the tools creators asked for, in one place, designed around workflows we shaped together. How you use them is up to you.
Connect your channel at joinveri.co.
Frequently Asked Questions
What is Veri’s creator operating system?
Veri is an OS for creators, not a single tool. One persistent memory layer, one context graph, and seven specialized agents that work as a connected pipeline or individually. The system remembers your channel, your strategy, and your feedback across every session.
What changed from Veri alpha to beta?
Analytics now match YouTube Studio numbers with cross-platform support, Next Best Move cards were replaced by an ambient agent that surfaces signal without forcing action items, Voice DNA was rebuilt with a five-stage pipeline for better voice matching, and the product was restructured from disconnected tools into a unified OS.
What features are in the Veri beta?
Unified analytics across YouTube and TikTok, Veri Chat with persistent memory, Ideas Agent, Concept Check with multi-model council, Research Agent with citations, Script Refiner powered by Voice DNA, Thumbnail Generator, Title Generator, Package Agent, and Artifacts Library with version history.
How does Veri handle AI-generated content quality?
Every output is a draft for you to edit, reject, or throw away. Nothing auto-publishes. Veri believes AI is a self-scale tool that lets creators do more of what they want to do, not a replacement for taste and creative judgment.