HoloGlass

Mixed Reality Art-Viewing App

Research Question

How might art be displayed in mixed reality?

Objective

 

Reimagine an artist's portfolio in mixed reality: design and develop a HoloLens app that museum and gallery curators can use to view artwork.

Team and Timeframe

Our team had 4 designers and 3 developers. I was the Team Lead, the Project Manager, and the main liaison between the designers and the developers. We completed this project in 10 weeks.

Project Outcomes

Background

If the "real world" is on one end of the spectrum and virtual reality is on the other end of the spectrum, mixed reality falls somewhere in between. In mixed reality, a user wears a headset and interacts with both digital objects and the physical world at the same time. Mixed reality is a blending of the digital and physical worlds. 

Reality spectrum.

As team lead, I ensured my team took a human-centered design approach. We started with a strong research phase, defined a problem, brainstormed solutions, prototyped, tested, and ultimately delivered our product at the end of the 10-week timeframe.

Research

I designated a 10-day period of "pure research." I laid out the following 3-pronged research strategy:

1. Learn About Target Users

Our design prompt indicated that art curators would be our target users. But curators don't work in a vacuum; I wanted my team to learn more about the context in which curators work.

2. AR/VR Creation

What kinds of mixed reality apps already exist? How are mixed reality apps designed and created? What kinds of resources are available?

3. AR/VR Industry

What kinds of enterprise mixed reality apps exist? How is AR/VR helping solve business problems?

Goal: Solve a problem facing our users.

User Interviews

We created 3 personas of potential users for our application. I interviewed one person from each category: 

1. Primary Users: Curators Using the App Professionally

I interviewed a curator at the Whitney Museum of American Art. During our interview, I asked her how she plans exhibits and what types of interactions she has with artists while planning a show. I also asked about her workflow, trying to understand which parts work best and which could be improved.

2. Secondary Users: Artists Uploading Their Work

I interviewed an independent artist in Brooklyn, NY. During our interview, I asked him about his general workflow, the kind of contact he has with curators, and how he typically shows them his work. I also went on a studio visit to better understand how his art might be shown to a curator making a similar visit during the exhibit-planning process.

3. Tertiary Users: Museum Employees (non-Curators) 

I spoke to an archivist at an art foundation. I asked her how the foundation interacts with curators who visit to research upcoming exhibits, and how artwork is moved from the foundation to museums for shows.

Pictures from my site visits to an artist's studio and an art foundation.

Research Insights

My team and I wrote the major ideas from our independent research on sticky notes and grouped them by patterns and emerging themes.

A selection of our insight generation wall.

This exercise led us to focus on solving a problem curators face. Specifically, we decided our app could make a useful intervention in a curator's exhibit-planning process. Exhibit planning can be costly and inefficient: curators often depend on low-fidelity copies of the artwork to plan exhibits, and time is wasted coordinating the hanging of artwork before a show, since it is difficult to hang pieces, take them down, and rearrange them until placement and flow are right. A mixed reality app that lets curators plan exhibits efficiently and with minimal physical labor could save money and time.

 

Our HoloLens app places true-to-size 3D renderings of artwork in the exhibit's real environment, making it easy for curators to digitally position pieces in their gallery or museum before settling on the artwork's final placement.
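The core interaction this depends on is anchoring a hologram to the real surface a curator is looking at. Below is a minimal Unity sketch of how such placement could work; the ArtworkPlacer class, the spatial-mapping layer mask, and the air-tap toggle are illustrative assumptions rather than our app's actual implementation.

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;

// Illustrative sketch (not the shipped code): lets a curator slide a
// true-to-size artwork hologram along the room's spatial-mapping mesh
// and lock it in place with an air tap.
public class ArtworkPlacer : MonoBehaviour
{
    // Assumes the spatial-mapping mesh is on a layer included in this mask.
    [SerializeField] private LayerMask surfaceLayer;

    private GestureRecognizer recognizer;
    private bool placing = true;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.Tapped += args => placing = !placing; // air tap toggles placement
        recognizer.StartCapturingGestures();
    }

    void Update()
    {
        if (!placing) return;

        // Cast the user's gaze ray against real-world surfaces.
        Transform head = Camera.main.transform;
        if (Physics.Raycast(head.position, head.forward, out RaycastHit hit, 5f, surfaceLayer))
        {
            // Snap the artwork to the point the curator is looking at.
            // Simplified orientation: face out from the surface that was hit.
            transform.position = hit.point;
            transform.rotation = Quaternion.LookRotation(hit.normal);
        }
    }

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```

In a sketch like this, gaze does the coarse positioning and a single gesture confirms it, which mirrors how little physical effort the curator should need.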

Brainstorming & Storyboarding

My team and I sketched out many concepts before settling on the one we pursued: the "Immersive Web." 

The "Immersive Web."

My team and I sketched out our concept in both 1st and 3rd person, considering the following things:

  • Interactions

  • User Input

  • User Guidance


User input was especially important to consider at this stage. Mixed reality is a wonderful medium because it supports voice input, gaze input, and multiple types of gestures for accomplishing tasks; the HoloLens also offers Cortana, the digital assistant, and spatial sound. My challenge was to harness these features without creating a convoluted design or a confusing user experience.

An "Immersive Web" scene.

Prototyping & Testing

First, we acted out our concept. Mixed reality blends the physical and digital worlds in three dimensions, so we figured the easiest and quickest way to get a "feel" for our app and its early interactions would be to act out the flow ourselves.

 

One teammate acted as the user, another as a sculpture, a third played Cortana, the voice-activated digital assistant, and a fourth moved the background images behind the sculpture. I filmed the scene.

 

This exercise helped us understand what it would be like to move through our app's interactions while wearing a headset. It also kept us mindful of how far digital objects should be placed from the user.

 

More insights about this process can be read in these blog posts I wrote on Medium.

My teammates Lu and Levi bodystorming.

Once we had acted out the most basic interactions in our app, we continued designing in Marvel. Marvel is mostly used for web and mobile prototyping, so it isn't ideal for mixed reality design, but it helped us solidify our interactions and create our user flows. The visuals weren't entirely accurate to the experience of our app, but they let my team and me illustrate where certain features would be placed and show how a user would get from point A to point B.

 

We also kept a spreadsheet of the interactions we designed, which helped us when we began creating our user flows.

Screenshot from Marvel prototype.

While the designers worked in Marvel, the developers built out a prototype in Unity.

 

Once we had a clickable prototype working in Unity, we began testing our app. During most of our testing sessions, we told users what our app was, showed them the basic interactions on the HoloLens, and let them narrate their thoughts as they went through our app. Then we sat down with each user and conducted an interview. 

 

We received a lot of useful feedback that we were able to incorporate in the final version of our app:

 “Make it more 3D. Get rid of extra lines.” -Faculty Member

                                      

“I want to see more." -Computer Science Student

 

“Everything should be on the same level in the HoloLens.” -Design Student 

                                         

One key takeaway from user testing was that users were not aware of 3D objects if they were not placed in their immediate line of vision. Taking note of this, my team and I designed UI elements to indicate where a user should look in order to "see everything" available in the app at a particular time. 
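One way such a cue could be built is sketched below: hide a head-locked arrow while the target hologram sits inside the user's view, and aim it at the hologram otherwise. The OffscreenIndicator class and the arrow mesh (assumed to be parented to the camera and to point along its local up axis) are assumptions for illustration, not our app's actual UI code.

```csharp
using UnityEngine;

// Hypothetical sketch of the "look over here" cue: when a hologram falls
// outside the user's field of view, aim a head-locked arrow toward it.
public class OffscreenIndicator : MonoBehaviour
{
    [SerializeField] private Transform target;  // the hologram the user should find
    [SerializeField] private GameObject arrow;  // arrow parented to the camera, pointing along its local +Y

    void Update()
    {
        Camera cam = Camera.main;
        Vector3 vp = cam.WorldToViewportPoint(target.position);

        // In front of the camera and inside the viewport means the user can already see it.
        bool visible = vp.z > 0f && vp.x > 0f && vp.x < 1f && vp.y > 0f && vp.y < 1f;
        arrow.SetActive(!visible);
        if (visible) return;

        // Rotate the arrow so its +Y axis points toward the off-screen hologram,
        // projected onto the plane of the user's view.
        Vector3 toTarget = target.position - cam.transform.position;
        Vector3 inPlane = Vector3.ProjectOnPlane(toTarget, cam.transform.forward);
        arrow.transform.rotation = Quaternion.LookRotation(cam.transform.forward, inPlane);
    }
}
```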

Our user testing method included the following steps:

1. Give the user instructions on basic HoloLens interactions.

2. Ask the user to narrate their thoughts as they go through the process of selecting a sculpture to view.

3. Conduct a one-on-one interview after the user finishes with the app.

Testing our Unity prototype.

Development

Because the designers and developers often worked separately during this project, I was the main liaison between the two teams. I checked in regularly with the developers to assess their progress and whether they could meet their deadlines, and I was responsible for communicating the design team's decisions. I worked especially closely with the developers as they built out a user journey as it appeared in our user flows.

One challenge we faced together was choosing between using 3D and 2D assets.

Mixed reality design is a new medium, and because there isn't much existing content, its "rules of design" are still being determined. That is liberating: there aren't many established "bad decisions" to make. The one rule our team did settle on is that 2D assets should not be used; mixed reality's strength is that it is immersive and 3D.

 

My team and I knew we wouldn't be able to produce high-quality 3D assets of artwork within our time constraints, so we planned to use 2D assets of artwork instead. But testing with a group of users showed that this diminished the experience of our app.

 

We made the final decision to use stock 3D assets of generic shapes and applied our own textures and materials to them to represent art. Using generic 3D assets provided a much better user experience and dramatically improved the quality of our app.

A still from our Unity prototype.
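A minimal Unity sketch of that compromise is below: a stock primitive scaled to a piece's real dimensions and dressed in our own material. The StandInSculpture class, the cube primitive, and the example dimensions are assumptions for illustration.

```csharp
using UnityEngine;

// Hypothetical sketch: a stock primitive stands in for a sculpture,
// scaled to the piece's real dimensions and given our own material.
public class StandInSculpture : MonoBehaviour
{
    [SerializeField] private Material artworkMaterial;   // e.g. a marble or bronze texture
    [SerializeField] private Vector3 sizeInMeters = new Vector3(0.6f, 1.8f, 0.6f);

    void Start()
    {
        // A default Unity cube is one meter per side, so local scale
        // maps directly to real-world meters.
        GameObject proxy = GameObject.CreatePrimitive(PrimitiveType.Cube);
        proxy.transform.SetParent(transform, false);
        proxy.transform.localScale = sizeInMeters;
        proxy.GetComponent<Renderer>().material = artworkMaterial;
    }
}
```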

Deliverables

User Flows

The user flow below describes the journey a user takes while viewing a painting in the HoloGlass app. This is one of 4 main flows featured in the app. 

High-Fidelity Renderings

Case Study

Comparative Analysis

Challenges

Our university's department had only one HoloLens to share among more than 100 students. We often did not have access to a HoloLens while prototyping ideas, and our developers were rarely able to write code and deploy it to the HoloLens in the same sitting.

 

As a designer, I focused on producing deliverables that would convey our application to anyone, regardless of their familiarity with mixed reality. My team and I created high-fidelity renderings of a scene in our mixed reality experience, user flows that accurately showed multiple user journeys (and all the different input options available), and "snapshots" of a mixed reality scene showcasing every component of the experience.

One particularly effective communication document was the snapshot we created. Drawing from 8ninth's mixed reality Table of Elements document, our snapshot annotates a rendering of a scene from our app with the exact mixed reality details occurring in it.
