Shop what you see, not what you type - with Walaland's visual-first shopping experience

Walaland is an AI-powered fashion discovery platform, built for a generation that shops through screenshots, not search bars.


How might we

help users find fashion from screenshots, not just keywords?


What if your screenshot knew exactly what you wanted?

What if fashion search felt as effortless as falling in love with a look?




At Walaland, I led product design from 0→1 - translating AI complexity into a joyful, Gen Z–friendly experience. The result? $5.3M raised, 500+ brands onboarded, and a platform that now defines how K-fashion is discovered.

@WALA ICT (Global)


Date

2023.10-2024.12


Role

Worked as a Product Designer

Wearing the founder hat - with a design lens.

• Defined product vision alongside CEO and AI leads
• Designed the end-to-end mobile & web UX in Figma
• Turned AI styling logic (38 layers) into progressive filter UI
• Led cross-functional alignment across AI, PM, and partner teams


Company

@Wala ICT (Global)


Contribution

Turned deep AI complexity into joyful discovery - blending visual intuition with Gen Z clarity.


Situation


Ever paused a show or a social feed just to figure out what they were wearing?

You pause. Zoom in. Screenshot. Then you scroll endlessly, hoping the right product magically appears. We've all been there: drawn to a look, but stuck with a million product pages and zero answers.

Because fashion search wasn't built for how we actually see. That's the gap we set out to close.

Image source: Whisk

When inspiration becomes frustration

And… everyone's asking the same question: Where can I get that?

Image source: Instagram account (@style_moabom)

Every day, people screenshot celebrity outfits on Instagram and ask:

“Where’s this from?” “Can someone ID this shirt?”

Accounts like @style_moabom do their best to answer, but it's manual, slow, and scattered. That's when we realized: this wasn't just a feature waiting to be built.

It was a system users had already improvised - just without the right tools.


Define problem

As we dug deeper, one thing became clear: the desire was there, but the path was broken.

We live in a world where style goes viral before it's searchable.


However, the system leaves them lost - scrolling endlessly, switching apps, asking strangers. The desire is clear. The path isn't.

How might we

That’s when we paused and asked: What if fashion discovery started with how we see,
not how we type?


We had a clear problem:
Users knew what they wanted - they just didn’t know what to search.
Screenshots were piling up, but intent was getting lost.

So instead of jumping to a feature, we first reframed the problem -
then went out and asked:
Are real users actually experiencing this? And how are they handling it today?


How might we

help users find fashion from screenshots, not just keywords?

How might we make search feel more like scrolling Shorts or Reels?




How might we turn visual inspiration into shoppable flows?


These questions became our design compass. Not answers, but provocations - pointing us toward the right kind of solution.


Asking the real world

We talked to 12 Gen Z shoppers -
and here's what we found.

72% - Take outfit screenshots weekly. Screenshots are the default way to save style inspiration.

92% - Could not find the exact product afterward. Search tools fail their visual-first habits.

67% - Tried typing outfit descriptions into search, but results were too broad or irrelevant.

58% - Gave up mid-search. Frustration often blocks purchase intent.

These insights helped us realize: The problem wasn’t inspiration. It was translation. Users knew what they loved - they just didn’t know how to act on it.

So we asked: What if the product could bridge that gap?


Hypothesis

So we framed a new hypothesis:

We believe that replacing keyword-based search with a visual-first discovery experience will increase conversion and engagement if visually driven users can explore and shop from screenshots with ease.

Meet Sophie. She didn’t need more inspiration.
She needed translation.


Solution walkthrough

We started building a product
that listens to the way people see.

Screenshot search - pull-based behavior

From screenshot to shoppable, in under 10 seconds.

You saw the look. You screenshotted it. That’s where Walaland comes in.

Upload the image. We segment the outfit, match it with real brand data, and make it shoppable.

Just like that - from inspiration to action.

01

See it. Screenshot it.

Scrolling Instagram, TikTok, or K-drama -
you spot the perfect look.
And you screenshot it.

Sophie’s watching a BLACKPINK fan edit on YouTube Shorts. Rosé steps out in a black jacket she’s never seen before.
Click. Another screenshot goes into her
“K-style” folder.

02

Upload to Walaland

And upload it to Walaland. Our AI auto-detects top and bottom. No typing needed.

This time, instead of giving up or googling for hours, she opens Walaland, uploads the image. No typing. No guessing.
Just → “That jacket.”

03

Get the look

Get real product matches in seconds, from real brands - and shop the look, just like that.

Walaland detects the top and bottom separately, then shows her real-time matches from K-brands - ready to buy.

Sophie didn’t have to search. She just had to see.
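For the technically curious, here is a minimal sketch of that flow under stated assumptions: segment the screenshot into garments, embed each crop, and rank catalog products by visual similarity. Every name here (segment_garments, embed, Product, shop_the_look) is an illustrative stand-in, not Walaland's actual code, and the stubs are toys standing in for the production vision models.

```python
# Hypothetical sketch of the screenshot-to-shoppable pipeline.
# segment_garments and embed are toy stand-ins for the real
# segmentation and visual-encoder models; Product is an assumed shape.
from dataclasses import dataclass

@dataclass
class Product:
    brand: str
    name: str
    url: str
    embedding: list[float]  # visual feature vector for similarity search

def segment_garments(screenshot: bytes) -> dict[str, bytes]:
    # Toy stand-in: a real system would run garment segmentation here.
    half = len(screenshot) // 2
    return {"top": screenshot[:half], "bottom": screenshot[half:]}

def embed(crop: bytes) -> list[float]:
    # Toy stand-in: a real system would use a learned visual encoder.
    return [crop.count(i) / max(len(crop), 1) for i in range(16)]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / (norm or 1.0)

def shop_the_look(screenshot: bytes, catalog: list[Product]) -> dict[str, list[Product]]:
    """Segment the outfit, embed each garment crop, and return the
    closest catalog matches per slot ('top', 'bottom')."""
    results = {}
    for slot, crop in segment_garments(screenshot).items():
        query = embed(crop)
        ranked = sorted(catalog, key=lambda p: cosine(query, p.embedding), reverse=True)
        results[slot] = ranked[:5]  # the top matches become the shoppable row
    return results
```

The shape of the output mirrors the UI: one scrollable row of matches per detected garment, no typing in between.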

Style-to-brand redirection - push-based behavior

Tag the style. Jump to the brand

Users exploring celeb lookbooks or moodboards can tap on any outfit -
and get instantly redirected to where that item (or the closest match) lives.
No more asking strangers for links and waiting. Just tap and go.
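A rough sketch of how that tap-to-redirect decision might work, assuming a hypothetical tag payload and catalog shape: open the exact product page when the item has been identified, fall back to the nearest visual match, and only then to a keyword search on the tag's label.

```python
# Hypothetical tap-to-redirect resolution; the tag/catalog shapes are
# assumptions for illustration, not Walaland's actual data model.
from typing import Optional

def resolve_tap(tag: dict, catalog: dict[str, str],
                nearest_match: Optional[str] = None) -> str:
    """Return the URL a tapped outfit tag should open.

    tag:           {'product_id': str | None, 'label': str}
    catalog:       product_id -> product URL for onboarded brands
    nearest_match: fallback URL from visual similarity search
    """
    product_id = tag.get("product_id")
    if product_id and product_id in catalog:
        return catalog[product_id]      # exact item: jump straight to the brand
    if nearest_match:
        return nearest_match            # closest match from visual search
    return "/search?q=" + tag["label"].replace(" ", "+")  # last resort
```

So a tag like {'product_id': 'jkt-001', 'label': 'black jacket'} jumps straight to the brand page when the ID is in the catalog, and degrades gracefully when it isn't.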

Exploring Walaland app

Users open the app and freely browse content. They see photos of outfits worn by celebrities, and can tap to get redirected to sites selling similar items.

Exploring Walaland app

Users can also swipe through short-form videos like Reels - where outfits worn by celebrities are featured and made shoppable.

Overall mobile app screens

Designing with AI in the loop

And behind that simplicity? A 38-layer AI engine, working quietly in the background.

I designed the system to surface these signals clearly - without users needing to ask.

Instead of dropdowns or filters, the UI simply showed what the AI saw: structured recommendations, top/bottom splits, and scrollable matches by type.


AI understood garment structure

WALA’s backend used a 38-layer deep learning model to scan a screenshot and detect visual signals:
tops vs bottoms, sleeve type, neckline, overall silhouette.

But raw detection alone wasn’t enough.
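To make that concrete, here is one way those signals could be shaped into the sections the UI renders, with low-confidence detections held back. The Detection fields mirror the signals named above; the grouping logic and the 0.6 threshold are assumptions for illustration, not WALA's actual pipeline.

```python
# Illustrative shaping of raw detection signals into per-garment UI
# sections; field names mirror the text, everything else is assumed.
from dataclasses import dataclass
from collections import defaultdict
from typing import Optional

@dataclass
class Detection:
    garment: str                    # 'top' or 'bottom'
    category: str                   # e.g. 'jacket', 'skirt'
    silhouette: str                 # e.g. 'oversized', 'slim'
    confidence: float
    sleeve: Optional[str] = None    # tops only, e.g. 'long'
    neckline: Optional[str] = None  # tops only, e.g. 'crew'

def build_sections(detections: list[Detection],
                   min_confidence: float = 0.6) -> dict[str, dict]:
    """Group confident detections into UI sections keyed by garment slot,
    so the interface shows what the AI saw instead of asking users to
    pick filters themselves."""
    sections: dict[str, dict] = defaultdict(dict)
    for det in detections:
        if det.confidence < min_confidence:
            continue  # low-confidence signals stay out of the UI
        attrs = sections[det.garment]
        attrs["category"] = det.category
        attrs["silhouette"] = det.silhouette
        if det.sleeve:
            attrs["sleeve"] = det.sleeve
        if det.neckline:
            attrs["neckline"] = det.neckline
    return dict(sections)
```

A detection like Detection('top', 'jacket', 'oversized', 0.91, sleeve='long') surfaces a 'top' section tagged jacket / oversized / long sleeve, which maps directly onto the top/bottom splits and match rows in the screens.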

Testing with real users

After finalizing the first version of the product,
we tested with Gen Z K-pop fans and listened. Closely.

These two personas revealed one clear insight: Not all users shop the same way - but both expect the product to understand their intent.

Our hypothesis was simple: if we designed a visual-first experience, both vibe-driven browsers and brand-driven hunters could find what they love - in their own way.

Jubi, 23

Persona 1

Emotion-first user

“I don’t know the brand… I just want more things like this.”
Scrolls TikTok and K-drama screenshots daily, loves the feeling of an outfit, not the name. For her, discovery means vibe first, shopping later.

TikTok lover

Moodboard scroller

Screenshot collector

Casual discoverer

Style-driven, not brand-driven

Yasmin, 28

Persona 2

Brand-precise hunter

“If I fall for something, I need to know the exact brand.”
Tracks airport fashion and styling tags, builds moodboards by idol. Wants to buy the real item - not just something close.

Style perfectionist

Airport fashion tracker

Brand ID detective

Idol stylist follower

Exact-match shopper

The result?
Both types found what they were looking for, whether it was a mood or a match.

Reflections

What stayed with users wasn't the AI.
It was how natural the experience felt.

What did I really learn from designing WALALAND?

  • Style is subjective - but intent is universal.

  • Feedback loops are not just UX tools - they build trust.

  • Invisible AI works best when it feels intuitive.

Results

We didn’t just build a product.
We surpassed what we set out to prove.

Right after Walaland first launched on the App Store, I tried running the app for the very first time!

What started as a tool to simplify discovery grew into a trusted platform.
With $5.3M raised, 500+ brands onboarded, and real traction from Gen Z users,
it proved more than a product launch: it showed what thoughtful design can achieve.

$5.3M

Funding secured

500+

Global brands onboarded

Got feedback, a project, or an idea in mind?
I’d love to hear it!


© 2025 Kyuri Kim | Made with ♥ in San Francisco