SnapShop

Imagine a mobile app that enables you to impulsively identify and purchase a garment or accessory that you see in real life.

José Cabral (Photographer from Lisbon) - http://oalfaiatelisboeta.com

Why ?

People tend to copy what others wear because we are influenced by our surroundings. Think of celebrities, for example: many people would like to buy exactly what they are wearing. To do so, they search extensively on the internet for that awesome shirt or beautiful hat they saw a celebrity wearing.

There are many websites and blogs focused exactly on this, where a person's whole outfit is manually analyzed in order to provide links to online stores, so that readers can buy those clothes for themselves.

The owners of these websites and blogs are the "man in the middle" in this process, and I imagine that technology and automation together could do it much better and faster in the future.

- Look at this yellow hat in this magazine!
- So awesome, let's find it and buy it.
- How?

So there is an empty space here: from the moment a person sees a special piece of clothing that someone is wearing on the street, in the subway, or on the web, they want the possibility to find the same item or a similar one, and then to buy it instantly for themselves.

How ?

With this application concept, you could easily search for clothes and accessories using a picture you have taken, and buy them right away online. Imagine an app that receives a picture, identifies the garment in it, shows all the available options that are identical or similar to the garment you are interested in, and provides a centralized way to buy it from many different online stores around the world.

User Study

I interviewed two of my friends and asked them a simple question, in order to get fresh feedback from two possible users and to contrast it with my initial interaction idea.

Question: Imagine a mobile app that enables you to identify and purchase a garment that you see in real life. How do you imagine this working on your phone?

Sara Oliveira, 28 years old, Portuguese
“When I imagine this, the idea that comes to my mind is open the app that access my phone, taking a photo and in a few seconds I have what I’m looking for and where is the closest store. If I wouldn’t have a store in 10km around, the app should give me the possibility of purchasing it online directly on app, in a simple way: one screen with: choosing size, add delivery address and payment.”

Riia Hyppönen, 27 years old, Finnish
“Well I imagine it would base on online stores so excluding handmade (etc) items of course.. by taking a picture the app would identify the product and/or maybe other similar ones and give you a list where you can find it and for what price and maybe direct you to the online store :) or you could insert the keywords manually and it tries to search the available options for you, or by scanning the code that is now in every clothes (where are the material details and stuff), but then it must be your friend or family to actually ask to scan the code :D I don’t know.. all that comes to my mind.”

I conducted these interviews after I had started sketching some ideas, and I was surprised that both answers converged significantly with my app interaction idea. That gave me the confidence to continue the concept with that in mind, focusing on these converging points.

Early sketch

I started this concept by brainstorming ideas, hand-drawing them on paper and on a whiteboard. I imagined a very straightforward theoretical solution: capture a picture, analyze the image automatically, show results, and purchase online. But after some sketching I found that the dependency on an automatic image-analysis technology, or the lack of it, would influence a possible interaction solution in terms of development.





App-Flow & Wireframes

To converge and to get an overview, I planned this app flow.

Partial app flow (focused on the main interaction)

This partial flow of the app focuses on the main interaction, the most differentiating one: the path from a user's impulse to identify something to actually buying it.

Screen 3.1 — Alternative Options

I imagined two solutions for the item-identification interaction. In the first one, 3.1 — Highlight (A), the user is invited to highlight in the photo the item they want to find, by drawing with a finger over the photo to reveal where the item is. The second one, 3.1 — Highlight (B), is perhaps the easiest solution for the user, but much heavier in technical terms: I imagined a service that provides the application with a full analysis of the garments and accessories present in an image, giving the user the option to choose which one to search for.
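The difference between the two options can be sketched as two entry points into the same search. This is an illustrative sketch only: the region format, the detection service, and all names are assumptions, and Option B's detector is stubbed with fixed results.

```typescript
// A rectangular region of the photo, e.g. the area the user traced.
type Region = { x: number; y: number; w: number; h: number };
type Detection = { label: string; region: Region };

// Option A: the user's finger-drawn region IS the query; the app
// searches with the cropped area and needs no detection service.
function searchByHighlight(region: Region): string {
  return `search cropped region ${region.w}x${region.h}`;
}

// Option B: a (stubbed) detection service lists every garment found
// in the photo, and the user simply picks one from the list.
function detectGarments(photo: Uint8Array): Detection[] {
  return [
    { label: "yellow hat", region: { x: 10, y: 5, w: 40, h: 30 } },
    { label: "denim jacket", region: { x: 0, y: 40, w: 80, h: 90 } },
  ];
}

const photo = new Uint8Array([]);
// Option A: query with whatever the user circled.
console.log(searchByHighlight({ x: 12, y: 8, w: 36, h: 28 }));
// Option B: let the user choose among the detected items.
const picked = detectGarments(photo)[0];
console.log(`user picked: ${picked.label}`);
```

Option A shifts the effort to the user but needs almost no backend intelligence; Option B shifts it to the service, which matches the trade-off described above.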

User Interface



Thank you for your visit. My name is Miguel Mendes, and I'm working as a Product Designer at Format. You can check my resume here or email me at hello@miguelmendes.net.