TRANSLATION

Translation is a speech translation mobile application designed to mediate communication between travelers and CBP officers.

I have omitted and obfuscated confidential information in this case study. All information in this case study is my own and does not necessarily reflect the views of CBP.

A FAMILIAR TOOL FOR A NEW AUDIENCE

To keep our borders secure and our nation safe, CBP must inspect everyone who arrives at a U.S. port of entry. CBP officers are authorized to ask questions about each traveler’s trip and personal background.

Our high-level goals were to:
• Give officers the freedom to communicate with non-English speakers.
• Eliminate the need for on-site translators for 90% of conversations.

I led the design of the mobile application between April 2019 and January 2020, collaborating with two developers and a product manager on all aspects of the product.

EARLY INSIGHTS FROM THE FIELD

We traveled to Dulles Airport and observed how officers communicated with non-English speakers.

In the majority of instances, officers and travelers communicate across a desk. However, in some cases an officer may need to communicate while on the move, using their CBP-issued smartphone.

Currently, if a traveler does not speak English, a translator is provided, either on-site at the airport or over the phone. In either case, the traveler must stay at the airport significantly longer than a typical traveler.

One officer described a time when an on-site translator recognized a traveler’s regional accent and quickly bonded with the traveler over shared cultural experiences. That camaraderie could have compromised the integrity of the interview.

DISPLAYING CONVERSATIONS

The officers overwhelmingly highlighted the importance of accessible features, minimal clicks and taps, and efficient communication.

Voice Conversations and Keyboard/Text Conversations emerged from this ranking as our most important features.

The officers’ rankings highlighted that we needed to prioritize verbal communication and that the ability to use the keyboard to type was secondary. As a result of these findings, the Voice Conversations screen became our home screen, while the keyboard screen became our second screen.

WIREFRAMES

For each feature phase, I went through cycles of requirements, consensus, approvals, detailed specs, and handoffs.

I experimented with different ways to display translations. Based on feedback from the developers and our product owner, we decided to move forward with two screens (highlighted in blue), which entered the development phase of the application. This surprised me, as I had expected to move forward with only one of these screens, but feedback from the officers indicated that both provided useful functionality.

HIGH-FIDELITY MOCKUPS

The final mockups are the product of my research, user testing, and client feedback. I used the OIT Design Guide to inform the visual design.

