Understanding Financial Data Nonvisually
We worked on making financial data and visualizations accessible for people with visual impairments with the sponsorship of Bloomberg and the support of Carnegie Mellon University.
Skills used: Literature review, contextual inquiry with people with visual impairments and finance experts, empathy exercises, visioning, creative matrices, experience prototyping, book design, survey design, 508 compliance, rapid prototyping, and JS development
People with visual impairments (PWVI) are at a huge disadvantage when it comes to making financial decisions, whether in a professional setting or when managing personal finances.
For our master's capstone project, we were asked to explore the domains of finance and visual accessibility, with the goal of giving PWVI better access to financial visualizations and other information. The first step was to quickly become experts in designing for accessibility.
We created Stockgrok, a web-based tool that provides auditory counterparts to the visual cues in charts used to assess financial securities. With Stockgrok, a user can compare the price of any security of interest to its 50-day simple moving average purely through sound. Stockgrok uses a distinct set of audio outputs that convey the distance between lines, intersection points, and whether the price sits above or below its moving average. It is an inclusive solution designed to empower people with visual impairments to assess financial trends and make buy or sell judgments nonvisually.
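To make the underlying idea concrete, here is a minimal sketch (not Stockgrok's actual implementation; the function names, base frequency, and pitch scaling are illustrative assumptions) of the two computations such a tool needs: a trailing simple moving average, and a mapping from the price/SMA gap to a pitch that a Web Audio oscillator could then play.

```javascript
// Illustrative sketch only — not Stockgrok's real code.

// Trailing simple moving average over the last `window` closing prices.
function sma(prices, window) {
  const out = [];
  for (let i = window - 1; i < prices.length; i++) {
    let sum = 0;
    for (let j = i - window + 1; j <= i; j++) sum += prices[j];
    out.push(sum / window);
  }
  return out;
}

// Map the price's percentage distance from its SMA to a frequency:
// at the SMA we sound a base tone; prices above raise the pitch,
// prices below lower it. Base frequency (440 Hz) and the
// semitones-per-percent scaling are arbitrary illustrative choices.
function gapToFrequency(price, average, base = 440, semitonesPerPercent = 1) {
  const percentGap = ((price - average) / average) * 100;
  return base * Math.pow(2, (semitonesPerPercent * percentGap) / 12);
}

// Example: a close above its moving average sounds a higher tone.
const closes = [10, 11, 12, 13, 14];
const [avg] = sma(closes, 5);         // average of the five closes
const freq = gapToFrequency(14, avg); // above the SMA, so freq > 440
```

In a browser, the resulting frequency could drive an `OscillatorNode` from the Web Audio API, so a rising price reads as a rising tone without any visual chart.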
To begin understanding the complex space of accessibility, data visualizations, and expert finance decisions, our team interviewed ten subject matter experts (SMEs). Our interviewees ranged from accessibility and emerging technology academics to finance and accessibility industry professionals. Along with our literature review, we used these interviews to learn more about the current state and future of assistive technology, the benefits and challenges of using different modalities for communicating data, and current efforts to make financial data used for decision-making more accessible.
Our team conducted a question-storming session with our client to quickly generate a large number of research questions and align our team’s goals with those of Bloomberg. As a group, we represented the problem space as:
“Bloomberg’s products provide too much information, making it difficult for visually impaired people to get to the ‘nub’ of the data.”
“Sighted people are privileged because financial tools are designed for them.”
By generating questions individually and then reviewing them as a group, we identified three major problems and scoped our research around them.
As a team, we felt it necessary to put ourselves in the shoes of a PWVI. However, it was important to develop empathy specifically in the context of our research, rather than simply simulating blindness. After researching how to design an effective empathy exercise, we wrote our own experimental protocol. In this exercise, we first learned the basics of Apple's VoiceOver screen reader, then used it to navigate our bank accounts on both our laptops and iPhones.
Our team split up some 60 research papers on accessibility, cognitive science, data visualization, and emerging technology. Every week, we conveyed our findings to each other and developed a shared understanding by printing discrete points of interest onto individual notes. We then clustered these notes to see what themes emerged in each research space, and digitized the higher-level themes into the clusters shown below.
Since we were focused on nonvisual means of communication, one of our teammates completed a survey of emerging technologies to give us a better idea of what tools we might be able to use, and to understand their limits.
We conveyed our findings to our client through our secondary research report, offering five insights to help guide our contextual inquiry, as well as our future product designs.
1) “Don’t move my stuff.”
We leveraged the networks of our SMEs to deploy a screener survey to the PWVI community. Our survey screened for PWVI who use many of the same tools as finance workers, such as spreadsheets and programming languages. In total, we received 203 responses; 189 of the respondents were legally blind, ranging from moderate to total vision loss with no light perception. Encouragingly, over 40% of respondents had checked financial securities in the last 30 days.
From our survey, we scheduled remote and in-person interviews with 10 PWVI: five with congenital blindness, and five whose vision declined later in life. Our interviews were structured around observing how our interviewees accessed financial data in real time. Some walked us through checking their actual financial information, while we asked others to recount recent interactions with financial data.
We also talked to six finance experts about their use of data and visualizations for decision-making. We focused on people who actively managed investments and frequently made buy/sell decisions. Roles included stockbrokers, the president of a capital management firm, analysts, and MBA students.
Our goals were to learn more about the workflows of high performing users and identify what data is and is not important for guiding their steps and decisions.
After each interview, three team members interpreted notes and recordings, then created visual models of the workflows described. Many of the people we interviewed told us that they have their own idiosyncratic workflows. To capture how they follow the information scent, we created sequence models.
We combined all of our notes from our interpretation sessions and clustered them into an affinity diagram. We color-coded notes based on whether they came from PWVI or finance experts. As insights began to crystallize, we were excited to see overlap between the two domains, suggesting that both groups faced similar challenges in their decision-making and workflows.
We spent several team sessions “walking the wall,” a process during which team members, first silently, then through discussion, annotate the affinity diagram with questions, breakdowns, and design ideas. From here, we began ideating on possible solutions. As part of this process, we also created early prototypes to explore the forms of potential solutions. We tested internally with tactile materials and audio cues. We also considered the feasibility of our top design ideas, the input and output of each idea, and what impact the idea would have on sighted users and PWVI, respectively.
We presented our research to our client in our spring book, “What Am I Missing?” offering six insights, told from the perspective of a PWVI:
1) People who help me are interfaces too.
We presented our research findings for the semester to our clients at Bloomberg, as well as other members of Bloomberg accessibility, UX, and software teams. Our presentation focused on four specific areas: