Here’s my final project’s presentation! I hope the color spaces render correctly in Preview, because they haven’t been displaying properly in Acrobat.
131212_Project 3 Presentation_72dpi
These are my more developed comps since last Tuesday that show more simplification and more of a clear user flow.
At this point the user would activate the home page via Kinect, which detects when they’re facing the screen. It would take a few seconds to confirm someone’s actually looking so the home page doesn’t appear every time anyone walks by. Navigation’s been simplified since last time with two main nav icons at the top. Some tiles, such as the top and bottom ones, are static tiles that would always appear when an event is approaching. At the bottom, you’ll notice that the “Like” button is now a “High 5” button. It probably wouldn’t be used for tiles, but for main pages, so users aren’t reaching to low and high spots to “high 5” an item.
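As a rough sketch of how that confirmation delay might work (this is purely illustrative — the `PresenceGate` class and the 3-second hold time are my own placeholders, not actual Kinect SDK code), the screen could gate activation on a face being continuously detected for a few seconds:

```python
import time

class PresenceGate:
    """Activates only after a face has been continuously detected for
    `hold_seconds`, so passers-by don't trigger the home page."""

    def __init__(self, hold_seconds=3.0):
        self.hold_seconds = hold_seconds
        self.first_seen = None   # timestamp when the face was first seen
        self.active = False

    def update(self, face_detected, now=None):
        """Call once per tracking frame with the detector's boolean result."""
        now = time.monotonic() if now is None else now
        if not face_detected:
            # Lost the face: reset the timer and deactivate.
            self.first_seen = None
            self.active = False
        elif self.first_seen is None:
            self.first_seen = now
        elif now - self.first_seen >= self.hold_seconds:
            self.active = True
        return self.active
```

A glance from a passer-by resets the timer, so only someone who actually stops and faces the screen would wake it up.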
To activate the settings screen, users would swipe upwards, revealing color adjustments and pattern selections. These basic elements would hopefully give users the feeling of being in control of the content. [Color selection and patterns coming soon!] To get back to the home screen, users would swipe in the opposite direction.
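A minimal sketch of how those up/down swipes might be classified from a tracked hand position (the normalized coordinates and the 0.3 threshold are assumptions, not measured values):

```python
def detect_vertical_swipe(start_y, end_y, threshold=0.3):
    """Classify a tracked-hand movement as a vertical swipe.

    Coordinates are normalized screen space (0..1, y increasing downward);
    the threshold is an arbitrary placeholder to reject small jitters.
    """
    dy = end_y - start_y
    if dy <= -threshold:
        return "up"      # reveal the settings screen
    if dy >= threshold:
        return "down"    # return to the home screen
    return None          # too small to count as a swipe
```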
Content for this page is still in development. A stats page, opened from the stats tile (or should it be a main top nav icon?), would let users see statistics like average number of views, a daily eye-tracking map, etc.
For my experience prototyping, I decided to delve deeper into user research, based on feedback I received last class, while also keeping track of how many people travel past the screen location. The idea has three elements of readability: Attract (far-distance read), Engage (close read), and Connect (letting users explore content).
Throughout the day, most people walking by the art building are students going to and from class. During peak hours (roughly on the hour from 10–2), students from other buildings pass the art building, giving us a wider audience than just art students. The University Pointe MAX stop isn’t very frequented by travelers since it’s the end of the Green Line. However, traffic from the Oregon City Orange Line (aka “the Crime Train”) next year might bring a wider audience as people transfer trains.
A screen with moving content might help more people notice the Art Annex and give the Art Building the feel of PSU’s alternate “cultural entrance.” Right now the building isn’t well recognized by the university, which in turn reflects poorly on PSU’s art department.
LOW FIDELITY COMPS
I should be getting some high-quality photos of the front of the annex soon so I can start mocking these screens up in the window, but until then, I’ll be using vector diagrams. A lot of things are placeholders, such as the content tiles and the background pattern. These comps are very low fidelity, meant to show how content will appear depending on the user’s distance from the screen. Attract is far away, Engage is in front of the screen, and Connect is interacting with the interface.
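To make the distance idea concrete, the three states could be chosen from the sensor’s tracked distance, roughly like this sketch (the 3-meter threshold and the function itself are placeholders I made up for illustration):

```python
def read_state(distance_m, interacting):
    """Map a user's distance from the screen to the three readability
    states: Attract (far read), Engage (close read), Connect (exploring)."""
    if interacting:
        return "Connect"            # user is actively exploring content
    if distance_m is None or distance_m > 3.0:
        return "Attract"            # far read: pattern / event notices
    return "Engage"                 # close read: modular tile grid
```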
Describe what your interface does in 2 sentences or less.
This interface would inform users about the PSU.GD program and allow them to navigate relevant information pertaining to their involvement/experience with PSU.GD. This interface would act as a way to draw attention through a physical and visual representation of the PSU.GD community.
Who would use your interface?
Students walking near the art building, potential future students, members of the graphic design community, and potentially PSU administration.
What would they hope to gain?
Users would discover information on upcoming events, video content, and get a general idea about the PSU.GD atmosphere/environment in a delightful, fun way.
What is the context/environment in which people will use your interface? Would it be used in public/private? Alone or in groups?
The interface will be publicly available to use by people near the Art Annex. It’s best used as a single person experience to draw users into finding out more info about the design program and what happens within the annex, but can be experienced as a group.
What sorts of physical items might a user have to interact with?
The user will be able to interact with the interface using touch and motion-tracking gestures to navigate the data and information displayed on screen. There may also be an option to interact with the screen using your phone, to access information on the go instead of in front of the screen.
What questions do you need answered about your interface to see if it is necessary or effective?
Are these interactions meaningful to you as clients/potential users? What types of interactions might you want to see? Would you rather see playful, game-like interactions, or more informative, “meaningful” interactions?
What information would be more important to you if you were a prospective student? What information would be most important to you as a current student?
For Project 3 I’ve decided to focus on developing the Blue Sky idea for the PSU.GD TV screens, a project briefly discussed in A+D Projects. I’ve been looking into interactive displays that act as informative pieces, and I think there’s actually a lot to learn from non-interactive designs as well. Unlike print pieces, we can benefit from a far read, a mid-distance read, and a close read, since a Kinect or Sony Eye lets us detect user presence. There are so many cool possibilities for this project, and I’ve been stoked to work on it for a while.
So here’s some research I’ve done on similar models:
UO Ford Alumni Center
In collaboration with Second Story, the University of Oregon developed an interactive entrance to the Ford Alumni Center, an experience that celebrates UO’s past while helping cultivate a new generation of students. This piece has a lot of elements I’d love to incorporate into the PSU.GD screens, as they would help entice students into discovering more about PSU.GD. It’s a super cool system that can garner excitement in the community! A lot of the elements used in the Ford Alumni Center are very emotionally driven.
ASICS/NYC Marathon Interactive Wall
Another Second Story piece I was really drawn to was this interactive wall for the NYC Marathon. What drew me most was its attraction element. People walking from the subway may not notice the usual billboards and signs, but something that moves and responds to users can instantly grab attention and interest. There are a lot of experience-based elements in this wall, as it uses wonder and intrigue to grab viewers’ attention and draw them in.
Elements Interactive Wall
I first saw this on Vimeo a few weeks ago, and it instantly drew me in with its cool motion-based detection. I’d prefer motion-based detection over touch-based interactions, and this was a cool example of using motion tracking to create an aesthetically interesting piece.
Sorry for the out-of-order numbers. I wanted each number to fit into one of the three states.
1. PSUGD pattern attracts guests.
2. Kinect facial detection shows interactive elements when a user approaches the screen.
10. Important events dictate what appears during the Attract state, such as Show & Tell, Be Honest, or Workshop notifications.
3. Modular grid appears with user presence, similar to the PSU.GD website.
4. User clicks on a modular tile to reveal more info.
8. Tiles shift with a parallax effect depending on the user’s position.
9. When users are no longer detected, the screen returns to the Attract state.
5. More info on the specific tile appears full screen.
6. User either presses back or swipes in front of the screen to return.
7. Video tile plays video content, like Adam Garcia’s workshop video.
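The flow above can be sketched as a small state machine. This is just an illustration — the state and event names are hypothetical, not actual Kinect SDK calls:

```python
# Transition table for the interaction flow: (current state, event) -> next state.
TRANSITIONS = {
    ("attract", "user_detected"): "engage",   # Kinect sees a face approaching
    ("engage",  "tile_selected"): "connect",  # a tile opens full screen
    ("connect", "back"):          "engage",   # back button or swipe returns
    ("engage",  "user_lost"):     "attract",  # no users detected anymore
    ("connect", "user_lost"):     "attract",
}

def step(state, event):
    """Return the next state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

Keeping the flow in a table like this makes it easy to add events later (e.g. the parallax shift could run continuously inside the `engage` state without changing the table).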
So I watched Oblivion for my homework and I’ve gotta say: what a terrible, terrible movie that steals from a bunch of greater sci-fi films. There’s appropriating and referencing films, and then there’s just not having anything original to say with the genre.
But other than my qualms with the film’s script, it was a visually stunning experience! If we break down the movie, there were three main interaction design pieces (unless you count things like drones having facial and voice recognition): the light table, the bubble ship, and Jack’s HUD on his gun. The two most present (the bubble ship and the gun) were very HUD-like, but I’m going to analyze the more complex interface.
The light table at Jack and Vika’s base was the most complex, and probably the most informative, interface in the film. It had many functions and acted as the base’s main control and communication system with the Tet. It’s such a beautiful interface that you might find something similar on a techie’s monitor tracking their hardware’s status. It makes a ton of sense for a technologically advanced alien race mostly comprised of sentient AI, since it displays such a wide range of data but doesn’t offer much to an outsider. You may recognize a few things on the interface, but overall it would appear as information overload and offer little value information-wise. In the present day, it might have useful applications in a mission command center, where people are trained to understand each bit of data; simplified, it might make a good status system for a person’s house.
While doing research for Project 2, I came across the studio that worked on the interfaces. If you want to see more of Oblivion’s UI, go to Gmunk’s website.
PROJECT 3 IDEAS