Every AI tool you use is building a picture of you. What you care about, how you think, what you're good at. Right now, that picture is invisible. You can't see it. You can't fix it. You can't take it with you.
Sourced changes that. It listens along five dimensions — not to classify you, but to understand what you're expressing right now. When enough signals accumulate, the system weaves a portrait from your own words.
The system doesn't put you in a box. It asks five questions about what you're expressing in each moment. One sentence can fire multiple dimensions at once. That's reality, not a bug.
Values, commitments, concerns, conflicts. When two things you care about collide, the system hears that tension — it doesn't flatten it.
Vitality, flow, blockage, depletion. Not what you should feel. What's actually alive in you or stuck.
Beliefs, skills, certainties, doubts. What you know firmly and what you're still working out.
Practices, commitments, aspirations. What you're actively doing versus what you're reaching for.
Stories you've made peace with. Narratives that are shifting. The system notices when your meaning is changing.
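One way to picture the listening described above is as a multi-label tagger: a single utterance can fire several dimensions at once, rather than being filed under exactly one. A minimal Python sketch, using hypothetical dimension names and a toy keyword heuristic standing in for whatever classifier the real system uses:

```python
# Hypothetical dimension names; the product's actual labels may differ.
DIMENSIONS = ("values", "energy", "knowledge", "action", "meaning")

# Toy keyword cues, purely illustrative, not the real model.
CUES = {
    "values": {"care", "matters", "should", "torn"},
    "energy": {"drained", "alive", "stuck", "flow"},
    "knowledge": {"know", "sure", "doubt", "believe"},
    "action": {"doing", "practice", "trying", "plan"},
    "meaning": {"story", "used to", "becoming"},
}

def tag_dimensions(utterance: str) -> set[str]:
    """Return every dimension an utterance fires; multiple hits are expected."""
    text = utterance.lower()
    return {dim for dim, cues in CUES.items() if any(cue in text for cue in cues)}

# One sentence firing four dimensions at once: reality, not a bug.
print(sorted(tag_dimensions(
    "I know this matters, but I feel drained trying to plan it."
)))  # → ['action', 'energy', 'knowledge', 'values']
```

The point of the sketch is the return type: a set, not a single label, so overlapping signals are kept rather than flattened.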
Most AI has two moves: mirror what you said, or push you to the next topic. That feels robotic. A good facilitator has more range, and the moves most systems are missing are precisely the ones that make conversations feel human.
When it understands what you've shared, it mirrors it back with depth. Shows you it heard the real version.
When it's not sure, it says "Got it" and lets you keep going. No fishing. No forcing the next topic.
When it sees a pattern clearly and you're searching, it surfaces what it notices. Gently. You decide what to do with it.
When it's lost, it drops the mask: "I can hear this matters, but I'm not sure what the core issue is. What do you think?"
When you're stuck: what would change? What's the smallest next step? It doesn't pretend paralysis is a mindset problem.
The system reads two things: your phase (settled, searching, or stuck) and its own confidence. Those two inputs determine how it responds. No turn-counting. No checklist pressure.
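The two-input logic above can be sketched as a small decision function over phase and confidence. The mode names and thresholds here are illustrative assumptions, not the system's actual values; what matters is that nothing else feeds the choice, no turn counts, no checklist:

```python
from enum import Enum

class Phase(Enum):
    SETTLED = "settled"
    SEARCHING = "searching"
    STUCK = "stuck"

def choose_move(phase: Phase, confidence: float) -> str:
    """Pick a response mode from phase and the system's own confidence.

    Thresholds (0.3, 0.7) are illustrative, not the real system's.
    """
    if confidence < 0.3:
        # Lost: drop the mask and say so.
        return "admit_uncertainty"
    if confidence < 0.7:
        # Not sure: say "Got it" and let the speaker keep going.
        return "acknowledge"
    if phase is Phase.STUCK:
        # Confident and stuck: ask what would change, smallest next step.
        return "ask_smallest_step"
    if phase is Phase.SEARCHING:
        # Confident and searching: surface the pattern, gently.
        return "surface_pattern"
    # Confident and settled: mirror back with depth.
    return "mirror"
```

For example, `choose_move(Phase.SEARCHING, 0.5)` acknowledges rather than fishing, while `choose_move(Phase.STUCK, 0.9)` moves to the smallest next step.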
Once enough signals accumulate, those moments become a portrait: a composed synthesis of what the system heard, in your language. Not a score. Not a profile. A living document, with evidence you can trace back to the exact moment.
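A portrait that stays traceable could be modeled as a structure in which every theme links back to verbatim quotes with timestamps. A sketch under assumed field names (the real system's schema is not described here):

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    """A signal traced back to the exact moment it was heard."""
    quote: str      # the speaker's own words, verbatim
    timestamp: str  # when it was said
    dimension: str  # which dimension fired

@dataclass
class Portrait:
    """A composed synthesis, not a score: every theme links to evidence."""
    themes: dict[str, list[Evidence]] = field(default_factory=dict)

    def add(self, theme: str, ev: Evidence) -> None:
        self.themes.setdefault(theme, []).append(ev)

    def trace(self, theme: str) -> list[str]:
        """Return the exact quotes behind a theme."""
        return [ev.quote for ev in self.themes.get(theme, [])]

portrait = Portrait()
portrait.add("craft", Evidence(
    "I rebuild things until they feel right.", "2025-01-03T10:12", "action"))
print(portrait.trace("craft"))  # → ['I rebuild things until they feel right.']
```

The design choice this illustrates: the portrait stores no derived scores, only themes pointing at quotes, so any claim can be checked against the words it came from.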
When the AI gets something wrong — and it will — you say “Not quite” and it learns. Correction isn't a complaint. It's a conversation.
Some things are too important to be scored. If something you say matters deeply but doesn't fit the system's categories, you can mark it sacred. The system preserves your exact words without analyzing, classifying, or using them for matching. The limit of measurement is a feature.
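The sacred-mark guarantee amounts to a routing rule at ingestion: marked text is stored verbatim and never enters the analysis path. A minimal sketch with hypothetical names, not the actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Transcript:
    analyzed: list[str] = field(default_factory=list)  # eligible for tagging and matching
    sacred: list[str] = field(default_factory=list)    # preserved verbatim, never analyzed

    def ingest(self, text: str, mark_sacred: bool = False) -> None:
        if mark_sacred:
            # Exact words kept as-is: no classification, no matching.
            self.sacred.append(text)
        else:
            self.analyzed.append(text)

t = Transcript()
t.ingest("I think I finally understand my routine.")
t.ingest("What my grandmother told me the last time we spoke.", mark_sacred=True)
print(len(t.analyzed), len(t.sacred))  # → 1 1
```

The guarantee lives in the branch: nothing downstream ever reads `sacred`, which is the "limit of measurement" made concrete.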
Conversations end. The system quiets. No “Would you like to continue?” No addictive continuity. The highest compliment the system can pay is to become unnecessary, because consolidation only happens in silence.
Your portrait is yours — exportable, portable, truly owned. What you made in the conversation belongs to you, even if you leave.
Sourced is built on the idea that AI should strengthen your thinking, not replace it. Like a bicycle — it goes where you pedal, but you still have to pedal.
Read the full theory →