Reimagining fashion discovery for Gen Z by approaching it as a browsing problem, not a search problem.

This case study documents how I designed a fashion marketplace around real fan behavior, not new search mechanics.

TIMELINE

5 months

(2023-2024)

ROLE

Product Designer

COLLABORATORS

Product Manager, Front/Back-end developer, AI engineer

COMPANY

Teamsparta Inc. (Edtech SMB)

ACHIEVEMENTS

100K+ app downloads: Dangpro reached over 100,000 downloads within its first year.

Improved awareness: Users reported clearer understanding of their blood sugar patterns after health check-up analysis.

More consistent tracking: Users logged blood sugar and meals more regularly compared to before using Dangpro.

Overall mobile app screens

Overall web screens

Situation

Ever paused a show or Instagram — not to search, but to analyze what someone on screen was wearing?

People zoom in. Screenshot. Compare details.
Not to search, but to understand the look, the moment, and the context.

You pause. Zoom in. Screenshot. But where do you even start searching? We’ve all been there: drawn to a look but stuck with a million product pages and zero answers. Fashion search wasn’t built for how we actually see. That’s the gap we set out to close.

Image source: Whisk

Fashion isn’t experienced as a product list. It’s experienced as moments — airport sightings, public appearances, daily looks caught in real life.

Yet most shopping systems still force those moments back into product lists.


WHEN INSPIRATION BECOMES FRUSTRATION

When inspiration turns into manual investigation

Image source: Instagram account (@style_moabom)

Every day, I saw fans screenshot celebrity outfits on Instagram, breaking down looks piece by piece together. Accounts like @style_moabom don’t just identify products. They interpret celebrity fashion in real time — manually, collaboratively, and at scale. What I recognized wasn’t a missing feature. It was an existing behavior that no product was designed to support.

Defining the problem

The desire was never the problem. The system just wasn’t built for how fashion spreads.

Style goes viral before it becomes searchable. Especially for Gen Z, fashion is discovered through people, moments, and screens — not keywords.

But the digital system treats this behavior as noise. Fans jump between screenshots, comments, and external links — while the original moment that sparked desire slowly fades.

How might we

What if fashion discovery started with how people already see, not how they’re forced to search?

The desire was clear: users knew what they wanted; they just didn’t know what to search. Screenshots kept piling up, and intent kept dissolving. So instead of jumping to a feature, we reframed the problem: this wasn’t about helping users search better — it was about meeting them at the moment interest forms. We asked:

How might we help users move from screenshots to understanding a look, without keywords?

How might we turn visual obsession into a structured, shoppable experience?

These questions guided us toward a platform solution, not a single feature.

Asking the real world again

We spoke with Gen Z shoppers. The behavior was already there.

72% take outfit screenshots weekly. Screenshots are how inspiration is saved.

92% couldn’t find the exact item later. Discovery breaks after the moment passes.

67% tried describing outfits in search. Visual intent gets lost in words.

58% gave up mid-search. Friction dissolves purchase intent.


The problem wasn’t inspiration. It was the lack of a system that could carry interest forward.
So I asked: What if discovery didn’t end at inspiration, but naturally evolved into exploration and action?

Hypothesis

Our hypothesis was simple: if fashion discovery starts from how people already browse celebrity outfits, users can move from interest to action without switching tools or re-searching.

We hypothesized that organizing fashion by celebrity appearances and outfits would shorten the path from discovery to purchase.

SOLUTION WALKTHROUGH

I started by anchoring the product in an existing behavior, rather than introducing a new way to search.

Screenshot search: pull-based behavior

Screenshot-based discovery

Users start from an image they already saved. Walaland detects the top and bottom automatically and matches them with real brand products.

This flow does not replace search. It supports the behavior users already have.

01

Users encounter a look on Instagram, TikTok, or in media and save it as a screenshot.

02

They upload the screenshot to Walaland. The system separates outfit components without requiring any text input (a rough sketch of this pipeline follows step 03).


03

Walaland returns purchasable matches from real brands and keeps the user inside the app to continue browsing.
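
Conceptually, this flow is a visual-matching pipeline: detect garment regions, embed each crop, and look up the nearest pre-embedded catalog items. The sketch below is a minimal Python illustration of that idea; the naive detector, random-projection encoder, and in-memory catalog are stand-ins invented for this write-up, not Walaland’s production system.

```python
# Minimal sketch: screenshot -> garment split -> catalog match.
# All three components are illustrative stand-ins, not the real models.
import numpy as np


def split_outfit(screenshot: np.ndarray) -> dict[str, np.ndarray]:
    """Stand-in for the garment detector: naive top/bottom halves."""
    h = screenshot.shape[0] // 2
    return {"top": screenshot[:h], "bottom": screenshot[h:]}


def embed(image: np.ndarray, dim: int = 64) -> np.ndarray:
    """Stand-in for the visual encoder: a fixed random projection."""
    rng = np.random.default_rng(42)  # fixed seed keeps the projection stable
    flat = image.astype(float).reshape(-1)
    vec = rng.standard_normal((dim, flat.size)) @ flat
    return vec / np.linalg.norm(vec)


def match_screenshot(screenshot: np.ndarray,
                     product_names: list[str],
                     product_embeddings: np.ndarray,  # (n_products, dim)
                     k: int = 3) -> dict[str, list[str]]:
    """Screenshot in, purchasable matches per garment out. No text input."""
    results = {}
    for part, crop in split_outfit(screenshot).items():
        # Dot products; cosine similarity if rows are unit-normalised.
        scores = product_embeddings @ embed(crop)
        best = np.argsort(scores)[::-1][:k]  # top-k catalog items
        results[part] = [product_names[i] for i in best]
    return results
```

The key design property is that the catalog side can be computed offline: each brand item is encoded once with the same model, so matching a new screenshot reduces to a nearest-neighbour lookup.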

Style-to-brand redirection: push-based behavior

Once a screenshot becomes a starting point, users need a way to keep exploring without starting over.

Users exploring celeb lookbooks or moodboards can tap on any outfit and get instantly redirected to where that item (or the closest match) lives. No more asking strangers for links and waiting. Just tap and go.
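
The redirect itself is simple once each tag carries the right data. Here is a minimal sketch of that idea; OutfitTag, its fields, and resolve_tap are hypothetical names used for illustration, not Walaland’s actual data model.

```python
# Hypothetical tag structure behind tap-to-redirect; names are
# assumptions for illustration, not Walaland's actual schema.
from dataclasses import dataclass
from typing import Optional


@dataclass
class OutfitTag:
    item_id: str
    exact_url: Optional[str]  # verified brand product page, if identified
    closest_match_url: str    # best visual match, always present


def resolve_tap(tag: OutfitTag) -> str:
    """Where to send the user when they tap a tagged outfit item."""
    # Prefer the exact product; otherwise fall back to the closest match,
    # so a tap never dead-ends in "ask a stranger for the link".
    return tag.exact_url or tag.closest_match_url
```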

Exploring Walaland app

Users open the app and freely browse content. They see photos of outfits worn by celebrities, and can tap to get redirected to sites selling similar items.

Exploring Walaland app

Users can also swipe through short-form videos, similar to Reels, where outfits worn by celebrities are featured and made shoppable.

Designing with AI in the loop

Behind that simplicity, a 38-layer AI engine was working quietly in the background.

I designed the system to surface these signals clearly, without users needing to ask. Instead of dropdowns or filters, the UI simply showed what the AI saw: structured recommendations, top/bottom splits, and scrollable matches by type.


AI understood garment structure

WALA’s backend used a 38-layer deep learning model to scan a screenshot and detect visual signals: tops vs. bottoms, sleeve type, neckline, overall silhouette.

But raw detection alone wasn’t enough.
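
Raw labels had to be shaped into something the interface could render directly. A minimal sketch of what that structured output might look like, with hypothetical field names and example values:

```python
# Hypothetical shape of the structured output that drives the UI.
# Field names and example values are assumptions for illustration only.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class GarmentSignals:
    part: str                       # "top" or "bottom"
    silhouette: str                 # e.g. "oversized", "fitted", "A-line"
    sleeve: Optional[str] = None    # e.g. "long", "short", "sleeveless"
    neckline: Optional[str] = None  # e.g. "crew", "v-neck" (tops only)


@dataclass
class LookBreakdown:
    """One screenshot, split into pieces the UI can render directly:
    a top/bottom split plus scrollable matches grouped by garment type."""
    top: GarmentSignals
    bottom: GarmentSignals
    matches_by_type: dict[str, list[str]] = field(default_factory=dict)
```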

Testing with real users

After finalizing the first version of the product, we tested with Gen Z K-pop fans and listened. Closely.

Two personas revealed one clear insight: not all users shop the same way, but both expect the product to understand their intent. Our hypothesis was simple: if we designed a visual-first experience, both vibe-driven browsers and brand-driven hunters could find what they love, in their own way.

Jubi, 23

Persona 1

Emotion-first user

“I don’t know the brand… I just want more things like this.” Scrolls TikTok and K-drama screenshots daily and loves the feeling of an outfit, not the name. For her, discovery means vibe first, shopping later.

TikTok lover

Moodboard scroller

Screenshot collector

Casual discoverer

Style-driven, not brand-driven

Yasmin, 28

Persona 2

Brand-precise hunter

“If I fall for something, I need to know the exact brand.” Tracks airport fashion and styling tags, and builds moodboards by idol. Wants to buy the real item, not just something close.

Style perfectionist

Airport fashion tracker

Brand ID detective

Idol stylist follower

Exact-match shopper

The result? Both types found what they were looking for, whether it was a mood or a match.

REFLECTION

What stayed with users wasn’t the AI. It was how natural the experience felt.

Looking back, this project taught me:

  • Style is subjective - but intent is universal.

  • Feedback loops are not just UX tools - they build trust.

Design Challenge

Fans were already analyzing celebrity outfits across screenshots and social feeds. The issue wasn’t discovery — it was that this behavior lived outside any shopping system.

Solution

I designed Walaland as a browsing-first marketplace. Instead of asking users to search or describe what they liked, I structured discovery around moments, appearances, and saved images — the way fans were already engaging.

Impact

Fans could browse fashion without breaking the cultural moment. Brands gained earlier, more meaningful exposure through real-world contexts. And the platform supported both B2C engagement and scalable B2B partnerships.