Experience Prototypes – E-reality Shopper

So I’ve decided to run with a third-party application for Google Glass that lets you preview furniture you’d buy online in your apartment, to see if it would fit, whether it works with your decor, and how big things actually are. I intend for people to be able to use this for clothes too; we might feel more inclined to buy clothes online if we could get an idea of how they look with our complexion, how they fit, etc.

So for my experience prototype, I went through Amazon with some friends and talked about the things we would absolutely refuse to buy there, to see if it would give me a better idea of what I should be focusing on.

So here’s a list of things we wouldn’t buy (inspired by items Amazon was pushing on Cyber Monday):

  • Hot tubs
  • Clothes, suits, jackets
  • Shoes
  • Watches
  • Wine (unless we knew the brand already)
  • Perfume
  • Bicycles
  • Skis/snowboards
  • Many types of furniture
  • Glasses
  • Mattresses

I think with some of these things (wine, perfume, etc.), Google Glass couldn’t be much help. But for things like watches, bicycles, and furniture, I think it really could. We were talking about how you can buy cheap bikes on Amazon, but that doesn’t make much sense unless you know what frame height you’re supposed to get. If you could see the bike in space next to you and compare it to yourself through Google Glass (and see how high quality the parts are or aren’t), you might change your mind.

[Mockup screenshots: lamp, saved/browse chairs, place black chair, expert chair Photoshop mockup]

I’ve gotten started on some low-fidelity design mockups. I’m working through how I want users to navigate the interface. I imagine people may want to browse Amazon on their computers as well as through Google Glass, which is where the favorites feature comes in: they can save the items they’re interested in and view them on Google Glass later. But I thought they should also be able to browse without having to go back to the computer.
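To make the favorites idea a bit more concrete, here’s a rough sketch of the shared list I have in mind: items saved while browsing on the computer go into one list that the Glass app reads back a page at a time. The class and field names here are just my own placeholders, not anything from a real API.

```python
# Hypothetical sketch of the favorites feature: the desktop browser
# writes items in, and the Glass app reads them out one page at a time.
class FavoritesList:
    def __init__(self):
        self._items = []

    def save(self, name, url):
        """Called from the desktop browser when the user favorites an item."""
        self._items.append({"name": name, "url": url})

    def page(self, number, per_page=4):
        """Called from Glass to show one screenful of favorites at a time."""
        start = number * per_page
        return self._items[start:start + per_page]
```

In a real version the list would live on a server so both devices see the same items; this just shows the shape of the interaction.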

Since Google Glass relies heavily on voice control, I wanted to use that as the main way of navigating the interface. There would be a few learned commands such as “browse <blank>” or “next page”/“previous page” to move through pages of favorites or browse results. To view a specific item, you would read aloud the name shown next to it.
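The voice commands above could be routed something like the sketch below. The command phrases (“browse <blank>”, “next page”, “previous page”, reading an item name aloud) come from my plan; the function name and return format are just placeholders for illustration.

```python
# Minimal sketch of routing a recognized utterance to a navigation
# action. Assumes speech recognition has already produced the text.
def route_command(utterance, favorites, page):
    """Return an (action, payload) pair describing what the UI should do."""
    words = utterance.lower().strip()
    if words.startswith("browse "):
        # "browse <blank>" starts a new product search
        return ("browse", words[len("browse "):])
    if words == "next page":
        return ("page", page + 1)
    if words == "previous page":
        return ("page", max(0, page - 1))
    # Otherwise, treat the utterance as an item name read aloud
    for item in favorites:
        if words == item.lower():
            return ("view", item)
    return ("unknown", utterance)
```

The nice thing about keeping the command set this small is that users only have to learn two or three phrases; everything else is just reading what’s already on screen.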

