Cost Considerations in Qualitative/UX/User/Human-Centered Design Research

twig+fish
Jan 22, 2020 · 9 min read

For such a well-established industry as research and insights production, we still face challenges that arise from a lack of knowledge about how our work gets done. Ultimately, these challenges result in budget and time issues that can undermine our credibility. In our experience, we have been asked to carry out ethnographic research in two weeks, or to conduct a focus group for under $5,000. Such requests make us realize that perhaps the effort research requires is not as well understood as we thought.

We always make it our responsibility to mitigate factors that can deflate or devalue our practice. As such, we have broken down the primary considerations we weigh when estimating the time and budget of any research program. Keep in mind that, because we are consultants, this is primarily from an outside-looking-in perspective, but many of these considerations also apply to in-house research teams.

Research as a Service

One of our primary considerations is understanding how our work will be positioned in an organization. In order to be successful, research requires a fair amount of alignment on expectations, as well as an effective insights socialization approach. No matter what kind of methodology we execute, these two factors ultimately drive how much more “incidental” work will be required to design and deliver a meaningful research program.

Alignment

How much agreement already exists on research expectations, and on the study's utility and limitations.

  • Stakeholder Perceptions. We always consider how much the stakeholder team understands, values, and envisions the work of researchers. Some teams are more motivated and able to operationalize the work, while others will require more effort to course-correct into better habits.
  • Objectives Identification. We always consider how well-described study objectives are within the stakeholder team. If the intent and purpose of the learning need are not understood, then we must spend more effort extracting, transmitting, and communicating the spirit of the study.
  • Organizational Standards. We always consider the larger context and potential for research in an organization. Sometimes research is a value-add to other teams’ workflows, while in other organizations it is a standalone benefit meant to promote concepts like connection and empathy. Research as a value-add is where we often see perennial resource issues (such as research-team-of-one syndrome or over-emphasis on validation methods).

Socialization

The ways in which findings and implications will be considered once the study is complete.

  • Form Factor. We always consider the most appropriate approach for communicating findings and implications. While documentation might be a way for us to demonstrate our value, it may not always be of service to our stakeholders. We listen for the hand-off need to help us understand the best final deliverable.
  • Longevity. We always consider the lifecycle of our data points, because it determines when we may need to conduct further studies. Validation output (where data is behavioral and tied to an offering) is more time-bracketed than Discovery output (where data is more emotional/attitudinal and takes longer to shift), for example. If an organization only pursues Validation questions, it is likely misappropriating resources that could be better used to collect more enduring data.
  • Actionability. We always consider how prescriptive the insights must be to our collaborating teams. Research can provide the function of informing an offering, but it also can be a source of continued inspiration for creative teams. Both of these functions have different implications for how the work gets done, and its associated resource allocation.

Study Design

Executing studies is just one part of what we do, but it is a critical one. We are passionate about following a rigorous approach to address our stakeholders’ objectives and to reduce redundancy. We have found that organizations are most comfortable approaching researchers with a conceptual study design already in mind, but this can create extra work to redirect if that concept does not include the most useful methods. A study design should emerge from critically examining the contextual and dynamic needs (where we need to be, and who needs to be involved). Ultimately, the study design is the intellectual part of our work that we really enjoy, and we need to take the time to explore its inputs and outputs.

Context

The critical environments and conditions from which we will learn.

  • Mobilization. We always consider the effort it will take to travel, be in the field (away from our desk responsibilities), and the amount of communication required to be in a particular place at a particular time. Whether it’s in people’s homes, in a facility, or remote, mobilizing the research engine can be either a rough start or a well-oiled machine.
  • Toolkit. We always consider the best modes of data capture and production. Being in the field may necessitate analog data capture and video recording, which can require more extensive post-production and analysis time. When working in a facility, it may be worth leveraging its established tools. In addition, having the right tools for remote data collection (including having vetted several competitor options) is crucial to how we practice.
  • Occupational Health and Safety. We always consider our personal health and wellbeing as part of the study design. When going into private contexts, it is our policy to never go alone (for example). We also do the work to ensure that we have access to food and water (say, if we are driving around rural northern Michigan for a day). Finally, we do the work to have legal protections in place in order to gain the access we need to our participants in a way that is ethical and safe.

Dynamic

The engagement strategy that will allow participants to articulate themselves.

  • Engagement. We always consider the various ways in which participants articulate themselves. Some participants are verbal, others are visual, others require something else to engage them. Some data gathering dynamics can also feel intimidating or make participants nervous, so we must do the work to alleviate these risks to our data credibility and viability.
  • Co-location. We always consider who needs to be physically present during the study, at what times, and for what purpose. Robust remote-collaboration tools have become widespread, and they can sometimes be over-favored as a cost-cutting measure. We weigh the benefits and drawbacks of face-to-face interaction against the needs of the study design.
  • Timing. We always consider the timing in which we should be collecting the data (not to be confused with appropriate time of year, which is context). Being synchronous with participants and stakeholders may not always be the best way to collect our data, for example.

Recruiting Strategy

Finding the people we need to learn from can be difficult enough, but scheduling them within our study design constraints adds another layer of complexity. For nearly every researcher we meet, the biggest challenge is finding people to engage with and worrying about when the well will run dry. Recruiting is the work we need to do to make our study designs worth it, and it’s also likely our biggest source of continued anxiety. While there are a number of direct-to-researcher recruiting services available, they tend to attract the most willing and able of the population. And while we may have resources like customer lists, we sometimes have internal disagreements about how to identify our sample or how involved we want these customers to be in our insights programs. Our job as researchers is to help teams navigate these nuanced realities.

Sample

The variables that must be considered to identify the people we need to learn from, and how many will make our data valid.

  • Populations. We always consider what slice of the entire population will be best to learn from. Sometimes engaging with a representative sample is most useful, while in other cases we may need to seek outliers (or “extremes”). Researchers do the work to figure out exactly why, and how to identify and schedule each.
  • Criteria. We always consider the various ways in which we can identify and select our sample. Typically, participants are selected based on a combination of demographic, psychographic, behavioral, and ability factors. However, we do the work to understand which factors might introduce a bias or skew in our data set.
  • Blindspots. We always consider the target and accessible populations for our studies. We may not always be able to access an entire representation of the kind of population we need to learn from, but in those cases, it’s our responsibility to make it clear why and how to address it in the future.

Management

Administrative requirements that will ensure participants will engage with the study design.

  • Recruiting Resources. We always consider ways to build off existing measures and tactics to streamline the arduous recruiting task. Some organizations have well-organized customer lists, existing screening tools, or internal resources to take on the task. The availability of useful recruiting resources can greatly impact the overall cost of planning a study.
  • Administration. We always consider who should be in charge of the logistics of recruitment. Again, many direct-to-researcher online recruiting tools can be real cost-savers. However, there may be a need for co-managed or “concierge”-like services to efficiently tap into nuanced populations. We always think about the amount of time spent on incidental recruiting tasks to determine when and how to outsource.
  • Incentives. We always consider the best way we can offer a return to our participants for their time and effort (not to be confused with compensation). We think about the form factor, delivery method, and destination of the incentive as part of our study planning.

Managing Costs

These cost considerations may seem daunting or ethereal, but a seasoned researcher will know how to address each with enough flexibility to do their work within resource realities. Of course, there is always a point of diminishing returns in modifying the research service, study design, or recruiting strategy to fit overly restrictive (or arbitrary) constraints. A more sustainable solution for researchers is to educate, coach, and make evident the value of research to their organizations.

While it’s easier said than done to demonstrate the value of research, we have incorporated a few must-haves that consistently help stakeholders understand where and how they should be allocating (or seeking new) resources. Doing so removes arguments about the actual cost and resource allocation necessary to do our work well.

  1. We have a repeatable process. By using, publicizing, and sticking to our research process, we eliminate the non-intellectual work often involved in the nebulous act of insights production. Things like order of operations, information dependencies, and logistics become clear and agreed upon at the study’s onset so that they do not contribute to wasteful debates later on. In our 5-phase process, we clearly indicate milestones, deliverables, costs, timing, and assumptions so that it is clear what will happen when, for what reason, and how it contributes to the overall learning needs of the organization.
  2. We never lead with method. We do not simply accept a request for proposal that only indicates a methodology and sample size. We never “sell” a proprietary method we developed ourselves. We always press back for the objectives, the research questions, and the origins of both. We use this as a way to then design a study that will adhere to the needed context and dynamic requirements that will allow participants to articulate themselves.
  3. We seek the appropriate partners. Unless you are in a resource-rich organization in which participant selection and scheduling is operationalized, it is rarely a good idea to handle all recruiting in-house. We find that we spend more time worrying about securing people, and less time on the actual study objectives. Whether we are working with a full-service recruiter or an automated algorithm, we weigh the costs and benefits of the many ways we can identify and schedule participants. We are not always using Ethnio, for example, or solely relying on a third-party recruiting consultant. We adapt to whatever is best for the study.

The fact is that research involves moving parts that can drive up or streamline cost. Some studies do require a heavier time and money lift, while others can be done swiftly and on lower budgets. Across the domain, there is a trend toward lean, guerrilla, and rapid approaches that consistently and uni-directionally bring costs down, but in the long run these may not always be the most serviceable approaches to organizations’ learning needs. The best research investment an organization can make is in an experienced research leader (or partner vendor) that has the ability and motivation to consider all of the factors listed here.

twig+fish

a human-centered research consultancy that empowers teams to practice empathy