Navy F/A-18 fighter pilots need quick access to vital information – traditionally, that information is a literal stack of documents on their kneeboards. We worked with the Navy to conduct industrial design and UX research into the human-factors challenges of digitizing a paper kneeboard, and collaborated with a military contractor, SoarTech, to create a proof-of-concept demo.
As the design lead for Phase 2 of this project, I took the human factors research gathered from Phase 1 and worked closely with stakeholders to create a user-focused prototype on the iPad. I also led mid-project UX workshops to validate our assumptions and understand the unique working environments of fighter pilots.
The Navy is currently experimenting with electronic flight bags and digitized kneeboard applications to lighten both the physical and mental load of dealing with a paper kneeboard.
There are many apps on the market that each address a single aspect of a kneeboard, like navigation, weather, or airport documents. Yet only a few have successfully consolidated those features, and none have tailored them to Navy F/A-18 fighter pilots.
Overall, how do we make it easier for Navy pilots to get the information they need while engaged in the critical task of flying an F/A-18 jet?
Created a narrative for a stakeholder demo to focus the scope and effort for greatest value.
Pilot-focused design, combined with expertise in crafting an optimal iPad tablet experience.
In Phase 2, we partnered with SoarTech, a military contractor that specializes in artificial intelligence. During the discovery portion of the project, we began by identifying and understanding the problem; clearly outlining it gave us a North Star to guide our efforts toward solving it.
The current state of pilot kneeboards has focused primarily on digitizing flight data (formerly on paper) via multiple apps/resources.
Project Patella will address this gap by making kneeboard docs easily accessible and integrated for fighter pilots.
Our initial focus will be making a demo-able proof of concept of flight-driven AI.
Because this application will be built for a highly specific and nuanced environment, it was vital that we had an in-depth understanding of the user's (in this case, an F/A-18 fighter pilot's) experience throughout the entire process. To do so, I sat down with a former Navy fighter pilot in front of a whiteboard and detailed every step of a typical mission, from pre-flight to debrief. I grouped the steps into phases and highlighted key events with storyboard illustrations. The goal was to figure out at which points our AI-driven application could assist the pilot during a mission without interrupting or delaying the pilot's actions.
We then extracted pain points and opportunities from the journey map and converted them into potential features for our kneeboard app. To align on the priority of these features, I led a 2x2 feature prioritization exercise in which we mapped each feature on a matrix based on its potential value and the degree to which it could relieve a pilot's pain points.
The prototypes below were created based on user research from Phase 1. The initial prototypes accounted for user needs and pain points, but the designs were not platform specific. For instance, we prototyped a night mode setting for the kneeboard UI, since many pilots stated that they usually wore night vision goggles for night missions, and bright screens would be difficult to read with the goggles on.
These prototypes were eventually scrapped because many of the components were not iOS-standard, and a requirement for Phase 2 was that the app had to run on iOS. Since Dark Mode for iOS was still a new feature at the time, and given the scope of the project, we did not prioritize a night mode for the final proof of concept. However, some aspects of the UI were kept for the final product, such as the sidebar navigation, a format that is familiar to pilots because it mimics the controls in a fighter jet, where buttons sit next to the screens in the cockpit.
A style guide was created in conjunction with the interactive prototype to ensure that the interface adhered to the iOS Human Interface Guidelines. Each component included specs and details about its use cases.
Here is a video demo of the final prototype showcasing the AI assisting with two types of emergencies: a landing gear error and a weather emergency resulting in a reroute to a divert field.
The final concept also included a Unity flight simulator (built by SoarTech) that mimicked the conditions inside a cockpit and sent mocked environmental and aircraft data to the application.
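To make the simulator-to-app handoff concrete, here is a minimal sketch of what a mocked telemetry frame might look like. All field names, values, and the `mock_telemetry` helper are illustrative assumptions for this write-up, not SoarTech's actual simulator schema or data format.

```python
import json
import random
from dataclasses import dataclass, asdict
from typing import Optional

# Hypothetical telemetry payload; field names are illustrative
# assumptions, not the actual SoarTech simulator schema.
@dataclass
class AircraftTelemetry:
    altitude_ft: float
    airspeed_kts: float
    heading_deg: float
    landing_gear_ok: bool
    weather_alert: Optional[str]  # e.g. "THUNDERSTORM" forcing a divert

def mock_telemetry(emergency: Optional[str] = None) -> str:
    """Generate one mocked telemetry frame as JSON, optionally
    injecting an emergency for the kneeboard AI to react to."""
    frame = AircraftTelemetry(
        altitude_ft=random.uniform(15_000, 35_000),
        airspeed_kts=random.uniform(250, 480),
        heading_deg=random.uniform(0, 360),
        landing_gear_ok=(emergency != "GEAR_ERROR"),
        weather_alert="THUNDERSTORM" if emergency == "WEATHER" else None,
    )
    return json.dumps(asdict(frame))
```

In a setup like this, the simulator would stream frames such as `mock_telemetry("GEAR_ERROR")` to the iPad app, which could then surface the relevant emergency checklist without the pilot having to dig through paper documents.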