
The Situation
SonifyMusic set out to unify multiple music platforms into a single AI-driven marketplace for licensing.
The ambition was clear, but there was no foundation yet—nothing the team could build against. There was no shared structure for how the product should function, and the complexity of music licensing wasn’t reflected in the experience. Early agency work existed, but it lacked the domain specificity required to support real licensing workflows.
At that point there was direction, but no system connecting the pieces. Unlike a legacy product, there was no existing structure to refine; the system had to be defined from the ground up.
Why I Was Brought In
The team needed someone who could operate across product, domain, and design—someone who understood music licensing from the inside and could translate that into something a team could build.
I came in to define the product, shape its direction, and establish the system it would sit on. That included product architecture, interaction model, and ultimately how the company would present itself to the industry.
This went beyond improving an interface. It was about creating the structure the product, and the business, would depend on.
Establishing the Foundation
I mapped the full system, defined how the pieces connect, and created a structure that engineering could step into once hired. When the lead developer came on, he wasn’t interpreting disconnected ideas. He was building against a system that already held together.
That decision reduced risk and gave the product a clear starting point.
Art Direction & Brand Foundation
SonifyMusic didn’t begin with a defined product or a cohesive brand. The visual language, interaction patterns, and overall tone of the platform had to be established alongside the product itself.
I developed the brand and design system in parallel with the experience, defining how the product would present itself both functionally and emotionally. This included visual identity, layout systems, interaction patterns, and the overall aesthetic direction that would carry through every part of the platform.
The goal was clarity—consistency was a byproduct. In a system dealing with complex workflows and unfamiliar concepts like AI-assisted licensing, the interface needed to feel intuitive, grounded, and credible to both creators and industry professionals.
Because there was no existing system to extend, every screen contributed to establishing the language of the product. Design decisions weren’t just about usability. They defined how the platform would be perceived by users, partners, and investors.

Defining the System
From there, I built a unified architecture and developed the design system and interaction model in parallel. Because of timeline constraints, wireframes and visual design were created simultaneously, allowing the system to evolve in one continuous pass rather than separate phases.
An agentic AI interaction model became the core of the experience. It wasn't layered on top of the product; it was how users moved through it, handling complexity conversationally while the underlying system maintained structure.
Proof of Concept
Onboarding
Onboarding was one of the first places I focused, because it’s where most music platforms introduce friction.
The standard pattern is familiar: a creator signs up, lands in an empty profile, and begins rebuilding their identity from scratch—bio, songs, metadata, collaborators, credits. It’s repetitive and slows down the moment where the platform should start feeling useful.
I designed onboarding as a conversational experience through an agentic AI interaction model.
Creator Profile and Catalog System
From there, I built the creator profile as a working system rather than a static page.
In addition to identity details, users define:
roles (songwriter, producer, publisher)
genre and stylistic tags
catalog structure and visibility
This allows matching to happen at both the song and creator level.
Instead of generic metadata, the profile captures the information real licensing workflows require: ownership, usage context, and creative attributes. These aren't optional details; they determine whether a track can actually be used.
This level of specificity came directly from domain experience. It’s what allows the platform to move from “music discovery” into “music that can actually be licensed.”
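The profile and catalog structure described above can be sketched as a simple data model. This is an illustrative sketch only; the type and field names are assumptions, not the platform's actual schema.

```typescript
// Hypothetical sketch of the creator profile and catalog data model.
// All names here are illustrative, not SonifyMusic's real schema.

type Role = "songwriter" | "producer" | "publisher";

interface Track {
  title: string;
  ownershipSplit: Record<string, number>; // collaborator -> percentage share
  usageContexts: string[];                // e.g. "tv", "film", "brand"
  visible: boolean;                       // catalog visibility setting
}

interface CreatorProfile {
  name: string;
  roles: Role[];           // songwriter, producer, publisher
  genreTags: string[];     // genre and stylistic tags
  catalog: Track[];        // catalog structure lives on the profile
}

// Licensing-specific fields are load-bearing: a track is only usable when
// ownership is fully accounted for and a usage context is defined.
function isLicensable(track: Track): boolean {
  const totalOwnership = Object.values(track.ownershipSplit)
    .reduce((sum, pct) => sum + pct, 0);
  return totalOwnership === 100 && track.usageContexts.length > 0 && track.visible;
}
```

Modeling ownership and usage context as required, structured fields, rather than free-form metadata, is what lets matching operate at both the song and creator level.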

Entry Point and System Awareness
Once onboarding is complete, users land in the Dashboard.
Because the profile and catalog are already populated, this isn’t a blank experience. It already reflects who the user is and what they can do.
The Dashboard surfaces:
relevant briefs and opportunities
catalog activity
playlists and matches
immediate next actions
It acts as a launch point for multiple workflows without forcing a single path.
The product meets users with context instead of asking them to figure out where to begin. It gives the system awareness of the user before the user has to figure out the system.
Extending the System
As the supply side took shape, a gap became clear. Most creators work in isolation and rarely receive structured feedback on how their music performs in real-world contexts.
I developed a framework that evaluates songs across composition, production, lyrics, genre, and performance.
More importantly, it connects those attributes to opportunity:
alignment with briefs
suitability for TV, film, and brand use
potential artist and label placement
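The evaluation output described above can be sketched as a shape that pairs scores with opportunities. The five dimension names come from the case study; everything else, including the roll-up function, is an illustrative assumption.

```typescript
// Hypothetical sketch of the song evaluation framework's output.
// Dimension names are from the case study; field names are illustrative.

interface SongEvaluation {
  scores: {
    composition: number;
    production: number;
    lyrics: number;
    genre: number;
    performance: number;
  };
  // The framework connects attributes to opportunity rather than
  // stopping at scores.
  opportunities: {
    briefAlignment: string[];   // briefs the song currently fits
    syncSuitability: string[];  // e.g. "tv", "film", "brand"
    placementLeads: string[];   // potential artist and label placements
  };
}

// A simple roll-up: the overall score as the mean of the five dimensions.
function overallScore(evaluation: SongEvaluation): number {
  const values = Object.values(evaluation.scores);
  return values.reduce((sum, v) => sum + v, 0) / values.length;
}
```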
Restructuring Demand
On the demand side, inefficiency showed up at the start. Creative briefs are often loosely defined, which makes them difficult to fulfill.
I designed a guided approach to brief creation: an agentic AI interaction that helps users clarify intent before requests enter the platform. Users define their role, context, and intent through a structured exchange that captures what evaluation and matching need. The result is higher-quality opportunities and a stronger foundation for both.
The brief is constructed in real time as the user moves through the flow. Before submission, the system has enough information to activate matching.
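The guided flow above can be sketched as a brief that fills in field by field, with matching gated on completeness. This is a minimal sketch under assumed names; the actual brief schema and activation logic are the platform's own.

```typescript
// Hypothetical sketch of the guided brief flow.
// Field and function names are illustrative, not the platform's schema.

interface Brief {
  role: string;        // who is asking, e.g. music supervisor, brand, label
  context: string;     // where the music will be used, e.g. "tv", "film"
  intent: string;      // creative direction in the user's own words
  genreTags: string[]; // structured attributes captured during the flow
}

// The brief is built up incrementally as the conversation progresses.
// Matching activates only once every required field is present.
function readyForMatching(brief: Partial<Brief>): brief is Brief {
  return Boolean(
    brief.role &&
    brief.context &&
    brief.intent &&
    brief.genreTags && brief.genreTags.length > 0
  );
}
```

The design point is the gate: loosely defined briefs never reach evaluation and matching, which is what keeps downstream opportunities high-quality.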
From Brief to Fulfillment
Once a brief is posted, Soundsync AI generates a curated set of matches using both audio analysis and structured metadata.
Users can move directly from:
brief creation
to curated matches
to audition and licensing
Playlist System — Curation as a Workflow
Playlists support:
tagging and categorization
performance metrics
collaboration and sharing
direct licensing actions
Soundsync AI match visibility
They allow users to organize, evaluate, and act within the same environment.
Expanding the Platform
Beyond the core flows, the platform required a full supporting system.
I designed:
account and profile management
billing, cart, and checkout
search and filtering systems
a notification system with a defined event matrix
preferences, permissions, and user state
Each part was designed as part of the same system, with consistent behavior across the platform.
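The "defined event matrix" behind the notification system can be sketched as a single table mapping events to delivery channels. The event and channel names below are assumptions for illustration, not the platform's actual matrix.

```typescript
// Hypothetical sketch of a notification event matrix.
// Event and channel names are illustrative.

type Channel = "inApp" | "email";

type PlatformEvent = "briefMatched" | "licenseRequested" | "playlistShared";

// One table defines, per event, which channels fire, so notification
// behavior stays consistent across the platform instead of being
// decided feature by feature.
const notificationMatrix: Record<PlatformEvent, Channel[]> = {
  briefMatched:     ["inApp", "email"],
  licenseRequested: ["inApp", "email"],
  playlistShared:   ["inApp"],
};

function channelsFor(event: PlatformEvent): Channel[] {
  return notificationMatrix[event];
}
```

Centralizing the matrix is the design choice: any feature that emits an event gets consistent delivery behavior for free.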
Enabling Execution
There was no product management layer in place, so I built one alongside the product:
a full pipeline in ClickUp
feature-level workflows
a Kanban execution model
bug tracking and reporting
Every piece of work—from concept to delivery—was structured and visible.
Operating Across Layers

Outcome
Within one year, the platform reached internal launch readiness with a fully defined system across supply, demand, and transaction.
It was positioned for market release and second-round funding, with a standalone application built on the song evaluation framework as an entry point into the ecosystem.
The response from industry professionals, investors, and partners was immediate.
What This Demonstrates
This work reflects how I approach product:
Start with the system.
Align it to real-world constraints.
Design for how people actually work.
Then scale it without losing coherence.
