
Usability Resources Inc

User experience research, team training and coaching - in person and remotely


Articles

Liz Sanders on Generative Design Research

by Kay Corry Aubrey

If I had to rely on just two books to guide me through my career in UX, they would be Dumas and Redish’s “A Practical Guide to Usability Testing” and Sanders and Stappers’ “Convivial Toolbox”. The Dumas and Redish book describes in exquisite detail how to design and carry out a usability study. Its task-oriented approach makes it an indispensable step-by-step guide for anyone who needs to run a complex study on their own. This is the book I used in the 90s to learn the craft on my own, because there were no training programs at that time.

The Liz Sanders and Pieter Jan Stappers book, on the other hand, is a beautiful and comprehensive guide to systematically drawing out creativity from other people in a research setting. Their guiding belief is that people are tired of living in a consumer-driven world that cannot be sustained. They crave the chance to make better choices about how to live rather than just spend and consume, and they want to engage with others in creative activities. Convivial Toolbox offers a framework for understanding human creativity and how to tap it for a variety of design-oriented purposes.

I had the opportunity to talk with Liz about her work and ideas for the Spring 2021 issue of QRCA VIEWS Magazine. Here is a link to the published interview and a link to the podcast recording on the VIEWS Website.

Margaret Heffernan on “Uncharted – How to Navigate the Future”

by Kay Corry Aubrey

Margaret Heffernan was born in Texas but has spent most of her life in the UK. She is a TED speaker and has had many successful careers, including thirteen years as a producer of documentaries for the BBC and stints as a serial entrepreneur and CEO of several tech companies in the United States. She is currently a professor at the University of Bath. Margaret has written four internationally acclaimed books on business and on how to live a meaningful life that meets the challenges of our time.

We talked recently about the ideas in her most recent book, “Uncharted – How to Navigate the Future”. Margaret describes how we have become reliant on invisible systems that we do not understand. In outsourcing our intelligence we have lost human skills, such as the ability to read a map, direct our attention, deal with ambiguity, or engage with people who are different from us. The way we are profiled by technology and algorithms affects our ability to get jobs or loans or to gain access to other opportunities. The scary part is that we are unaware of how these systems directly shape our lives. Apps that tell us how to raise our children diminish our ability to trust our own judgment. The algorithms are invisible; we do not know how they work or what value system is embedded in them. But one thing is certain: they are designed to serve the interests of their creators, not the average person.

Here is a link to our QRCA VIEWS Luminaries Interview, along with a review of Margaret’s book and the audio podcast of our conversation on the QRCA VIEWS magazine site.

Jane Frost of the UK’s Market Research Society on How to Raise the Profile of Our Profession

by Kay Corry Aubrey

I interviewed Jane Frost, CEO of the UK’s Market Research Society (MRS), for the Fall issue of QRCA VIEWS magazine: https://lnkd.in/e2mxuYq. With over 5,000 members, the MRS is the second largest market research professional organization in the world. Jane described the central role qualitative research plays in the UK, how her organization is working with government to help the industry stay strong during the pandemic, and the MRS’s commitment to diversity. The face-to-face market research industry in the UK disappeared almost overnight and may never return to its previous level.

Jane described how qualitative research in the UK is more “pure,” relying on traditional face-to-face methods such as in-depth interviews and focus groups. On this side of the pond, clients expect numbers to justify insights, so our approach tends to be mixed with quantitative methods, though that may be changing. The MRS plays a pivotal role in advancing the stature and skills of its members, offering high-quality professional education to help them stay up to date and competitive. It also sponsors excellent conferences.

Lately, Jane and her organization have focused on advancing diversity and equity within the sector. The MRS has created the “MRS Pledge,” under which market research organizations sign on and commit to actively recruiting and promoting employees from under-represented and non-traditional backgrounds. Jane hopes the “MRS Pledge” will become a model for a worldwide “CEO Pledge” that raises diversity within the profession beyond the UK.

Here is a link to the article.

How Can Voice AI Help Qualitative Researchers?

by Kay Corry Aubrey

This blog post originally appeared in QRCA’s Qual Power Blog.

Within three years, 50% of Web searches will be done via voice. Today, almost one in four US households has access to a smart speaker such as Google Home or Alexa. Consumers are adopting voice technology faster than any other technology, including smartphones. Very soon, voice artificial intelligence (AI) will be embedded in our everyday lives to the point where we may not even notice it anymore. How can qualitative researchers leverage this powerful trend?

For inspiration, I spoke with four experts who are doing cool things with voice technology. They described unique applications of voice AI that offer a preview of how this technology might transform our work as researchers, starting with the way consumers are shifting toward using their voices rather than their fingers to interact with technology and the Internet.

The Rise of the Talking Survey

Greg Hedges has had great success with voice-based surveys delivered through virtual assistants such as Siri, Alexa and Google. According to him, “It’s like launching a focus group of one. People are interacting where they are most comfortable in their own home, using their own words. We’ve found that people are more spontaneous and natural when they talk vs. when they type.” Greg’s company also helps organizations integrate voice branding into their digital marketing ecosystem. Part of their expertise is redesigning a client’s SEO strategy to be phrase- and question-based (vs. keyword-based) to accommodate voice searches.

Ask Your Digital Twin to Narrate Your Next Report

Domhnaill Hernon collaborates with artists to explore the deep connections between technology and human potential. He worked with Reeps One, a beatboxer who fed hours of his audio recordings into Nokia’s AI system. To their astonishment, the system returned new melodies he had not put in but that sounded just like him. Rather than feeling threatened, the artist embraced the capability and now incorporates AI-generated tunes into his work. Soon this technology will be widely available, and you’ll be able to produce reports in your own voice that clients can listen to just like a podcast.

It’s hard to imagine how voice technology – and AI in general – will change our world. Technology is always a double-edged sword. On one hand, AI will be used to cure disease, make societies more efficient, and redistribute wealth so humans everywhere prosper. On the other, it might lead to a hardening of social classes and a surveillance state. In a recent episode of 60 Minutes, AI expert Kai-Fu Lee said that 40% of jobs will be eliminated within 15 years due to artificial intelligence. To empower ourselves we need to understand what AI is, how it works, and its capabilities and limitations.

How Voice AI Works

As with any artificial intelligence, voice technology relies on two things: access to a huge pool of data, and algorithms that look for patterns within that data. For voice, the relevant technology is Natural Language Processing (NLP). The results can only be as good as the data fed into the machine. Today in North America, Voice Assistants (VA) are 95% accurate if the person speaking is a white native-born man, 80% accurate if it’s a woman, and as low as 50% accurate if the person has an accent. This is because of the socially limited group of people who contribute their data by using voice assistants – VA users tend to be early adopters, white, and highly educated.

Jen Heape notes, “Natural Language Processing (NLP) cannot deal reliably with anyone who is not a white male, and this is deeply problematic, which is why Google and Amazon are giving away so much free so they can collect more complete samples.”

The algorithms that make up NLP leverage fixed rules of language around syntax, grammar, and semantics. The algorithm can be taught “if they say this, say that,” and the machine learns the pattern. This capability allows the virtual assistant to process simple prescriptive (but useful) commands such as “turn on the lights,” “play NPR,” or “order more lettuce,” because the technology has learned the vocabulary and structure of English sentences.
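To make the “if they say this, say that” idea concrete, here is a minimal Python sketch of rule-based intent matching. It is purely illustrative: the patterns and action names are invented for this example, and commercial voice assistants rely on trained NLP models rather than hand-written rules like these.

import re

# Hypothetical "if they say this, say that" rules: each maps a spoken-phrase
# pattern to an action the assistant can perform.
RULES = [
    (re.compile(r"turn on the lights?", re.I), "lights_on"),
    (re.compile(r"play (?P<station>.+)", re.I), "play_audio"),
    (re.compile(r"order more (?P<item>.+)", re.I), "add_to_shopping_list"),
]

def match_intent(utterance):
    """Return the first matching (action, slots) pair, or a fallback."""
    for pattern, action in RULES:
        match = pattern.search(utterance)
        if match:
            return action, match.groupdict()
    return "unrecognized", {}

print(match_intent("Turn on the lights"))               # ('lights_on', {})
print(match_intent("Order more lettuce"))                # ('add_to_shopping_list', {'item': 'lettuce'})
print(match_intent("The kids just got out of school"))   # ('unrecognized', {})

Note that the last phrase falls through to the fallback: as Jen Heape points out below, a rule like this has no way to infer the social context (that the kids get out of school around 3:00) that a human listener supplies automatically.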

Can a Machine Be Conversational?

However, voice technology is still very much in its infancy. The machine has no concept of culture or social inferences. As Heape noted, “If I were to say ‘The kids just got out of school’ and the listener is in the same time zone, they’d know it’s 3 or 3:30. However, the voice technology would not be able to infer this because it lacks the data.”

Freddie Feldman leads a voice design team that creates chatbots and conversational interfaces for medical environments. According to Feldman, chatbots and voice technology in general are helpful in medical settings for getting quick answers to predictable questions. “But for anything more crucial, dynamic or that requires understanding the other person’s psychology you’ll need to call someone in the end.” In theory, it’s possible that voice technology will have deeper human characteristics one day. “The technology is there. It’s just a question of someone piecing it together.”

It’s hard to imagine any machine being able to understand and integrate all the rich signals we send and receive in a conversation: the look on a person’s face, the tone of their voice, their diction, their physical posture, our perception of anger and pleasure, or what they are thinking. These elements are as essential to meaning and human connection as the words themselves. As Heape said, “VAs will never replace the human. There will always be a human pulling the lever. We decide what the machine needs to learn. VAs will remove the arduous elements. But we need a human to interpret the results and analyze it. We’re still so much at the beginning of it — we have not fed the machine.”

My feeling is there will be abundant opportunities for qualitative researchers, but – first – we need to understand the beast and what it cannot do.

Learn More about Artificial Intelligence and Voice Technology

Thomas H. Davenport and Rajeev Ronanki, “Artificial Intelligence for the Real World: Don’t Start with Moonshots,” Harvard Business Review, January-February 2018 (free download).

Joanna Penn, “9 Ways That Artificial Intelligence (AI) Will Disrupt Authors And The Publishing Industry”, Creative Penn Podcast #437, July 2019.

Oz Woloshyn and Karah Preiss, Sleepwalkers podcast on iHeartRadio.

Voice 2019 Summit, New Jersey Institute of Technology, July 22 – 25.

 

Acknowledgements

Thank you to the experts I spoke with while researching this post:

  • Freddie Feldman, Voice Design Director at Wolters Kluwer Health
  • Jen Heape, Co-founder of Vixen Labs
  • Greg Hedges, VP of Emerging Experiences at RAIN agency
  • Domhnaill Hernon, Head of Experiments in Art and Technology at Nokia Bell Labs.

 

Tools for Managing Customer Journey Map Research

by Kay Corry Aubrey

This article was originally published in the Fall 2016 edition of VIEWS Magazine. For the past several years I’ve incorporated NVivo into my work when a project is complex. Products such as NVivo are widely used in academia but less so in the commercial world, though there is a big need for this type of technology. The UX world is gradually learning about qualitative data analysis (QDA) software and its value, because a typical UX study weaves together many threads and draws data from varied digital sources such as video, audio, spreadsheets, and transcripts. This article presents a case study of how I leverage QDA in one type of UX research – customer journey mapping.

As qualitative researchers, we often need to mine for insights through mountains of unstructured digital data such as transcripts, videos, photographs, web analytics, and social media posts. Though many of us might enjoy the analysis portion of a project, this phase can be very time-consuming. In this article I am going to introduce you to game-changing tools that can greatly improve your efficiency with data analysis and synthesis once you get the hang of them. The technology is called qualitative data analysis software (QDA), and I am going to demonstrate how it works by showing how you might leverage it in a real project: building a customer journey map.
READ THE ARTICLE

