Mitigating Bias in Technology and Automation

Association Forum talks with Beth Kanter and Allison Fine, authors of “The Smart Nonprofit”.

By Kristin Gover, CAE


Staying human-centered and member-focused is critical to all associations. With the pace of change brought on by the pandemic, many association leaders are moving more quickly to adopt new technology that maintains connections with members while easing workloads for their teams.

Beth Kanter and Allison Fine, authors of “The Smart Nonprofit”, describe smart technology as one of the levers leaders need to pull, along with rethinking workplace culture and taking a more relational approach. They emphasize that adopting smart tech requires a slower, more deliberate pace and time spent on readiness, and that association leaders will need to keep their eyes wide open to both the benefits and the risks of these new technologies.

I recently interviewed Kanter and Fine about their book and the considerations association leaders must evaluate during implementation, including how to mitigate bias when automating systems.

What is your definition of smart tech?

In our new book, “The Smart Nonprofit”, we define smart tech as the universe of technologies that includes Artificial Intelligence (AI) and its subsets and cousins, such as machine learning and natural language processing. These technologies use Library of Congress-sized data sets to find patterns and make decisions for and instead of people. Examples of current commercial applications include screening resumes, answering rote questions (e.g., “what time do you open?”), automatically updating budgets, organizing meetings, etc. The key to success is understanding how to marry the best of people with the best of technology.
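
To make “answering rote questions” concrete, here is a minimal sketch of the pattern in Python. It is a deliberately simple keyword matcher with hypothetical questions, answers, and function names, not how the commercial products the authors describe actually work (those learn from large data sets), but it shows the same division of labor: rote queries are automated, and everything else is routed to a person.

    # Illustrative sketch only; all questions, answers, and names are hypothetical.
    FAQ = {
        "what time do you open": "Member services opens at 9 a.m. Central.",
        "how do i renew": "You can renew your membership through the member portal.",
    }

    def answer(question: str) -> str:
        q = question.lower()
        for keyword, reply in FAQ.items():
            if keyword in q:
                return reply
        # Anything beyond the rote cases goes to a human: the co-botting balance.
        return "Let me connect you with a staff member who can help."

    print(answer("What time do you open on Saturdays?"))   # automated reply
    print(answer("Can you help me plan our conference?"))  # handed off to a person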

Can you explain how to evaluate technology and data while focusing on the relationships and experiences members seek?

While some feel that the interests of workers are at odds with smart tech — that humans and machines are in direct competition — we believe that this is a false dichotomy that’s uninformed, unimaginative, and just plain wrong. Smart tech and humans are not competing with one another; they are complementary, but only when the tech is used well.

There will be parts of jobs that are suitable for automation, but few, if any, that can (or should!) be completely replaced by smart tech. What automation can change for the better is the experience of work. Rather than doing the same work faster and with fewer people, smart tech creates an opportunity to redesign jobs and reengineer workflows to enable people to focus on the uniquely human parts of work.

Inexpensive commercial applications using smart tech are increasingly available off-the-shelf for every department from communications to accounting to service delivery. Using this technology well requires a deep understanding of what the technology is and does, and careful, strategic thought on how to incorporate it into an organization in a way that gets the best out of the technology and people. This tech/people marriage is called co-botting.

Co-botting takes time and careful implementation to do well. And when it is done well, the benefits to staff are enormous. A recent MIT research study found that the benefits of AI implementation go beyond efficiency and decision-making: more than 75% of the organizations studied also saw improvements in team morale, collaboration, and collective learning. These profound cultural changes indicate how much can be gained from the strategic and careful implementation of smart tech in comparison to previous technologies.

The Trevor Project, an organization that provides crisis counseling to young lesbian, gay, bisexual, transgender, queer, and questioning (LGBTQ+) people, has stepped carefully and wisely into automation and co-botting. They created Riley, a chatbot that helps train counselors by providing real-life simulations of conversations with potentially suicidal teens. Riley expands the training capacity of the organization enormously by always being available for a training session with volunteers. But the Trevor Project also knows that staying human-centered and ensuring that teens are always talking directly to another human being is critical to fulfilling its mission. Riley isn’t subtracting from the human experience; it’s adding to it.

What are some of the ethical concerns and considerations that must be evaluated when implementing smart tech to mitigate biases?

Smart tech products are now available for fundraising, hiring, communications, volunteering, finance and more. Even the smallest organizations are beginning to use the technology, although they may not know it because it is often invisible to end users.

These automated tools make it much easier to screen many more people. However, they aren’t neutral or infallible. The programmers who build them can embed their own biases directly into them. For instance, a system determining eligibility for housing may ask a question about credit history, a metric often used to discriminate against Black and brown people, who have a harder time establishing good credit. In addition, smart tech requires Library of Congress-sized data sets to become adept at identifying patterns and making predictions, and those data sets have historical biases embedded within them.
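
One practical way to surface this kind of embedded bias is to compare how an automated screen treats different groups of people. The Python sketch below is a hypothetical audit, not a method from the book: the field names, sample data, and the 0.8 threshold (the “four-fifths” rule of thumb used in employment-screening analysis) are assumptions chosen for illustration.

    # Hypothetical audit sketch: compare selection rates across groups for an
    # automated screening tool and flag large gaps. All names and data are assumed.
    from collections import defaultdict

    def selection_rates(records):
        """records: dicts with a 'group' label and a boolean 'selected' outcome."""
        counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
        for r in records:
            counts[r["group"]][0] += int(r["selected"])
            counts[r["group"]][1] += 1
        return {group: sel / total for group, (sel, total) in counts.items()}

    def flag_disparate_impact(rates, threshold=0.8):
        """Flag groups selected at less than `threshold` times the top group's rate."""
        top = max(rates.values())
        return [group for group, rate in rates.items() if rate < threshold * top]

    sample = [
        {"group": "A", "selected": True}, {"group": "A", "selected": True},
        {"group": "A", "selected": False}, {"group": "B", "selected": True},
        {"group": "B", "selected": False}, {"group": "B", "selected": False},
    ]
    rates = selection_rates(sample)
    print(rates)                          # roughly {'A': 0.67, 'B': 0.33}
    print(flag_disparate_impact(rates))   # ['B']

The same comparison can be run on the historical data a vendor used to train a system, before it is ever put in front of members or applicants.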

What conditions are most important for organizations when implementing smart tech?

Making the transition to smart tech requires leaders to dig into the implications of automation and make smart, ethical choices about using tech that enhances our humanity and makes work better for everyone. This is why we believe so strongly that using smart tech well is primarily a leadership, not a technical, challenge. Success requires that leaders ensure that their organizations are:

  • Human-centered. This means finding the sweet spot between people and smart tech, while ensuring that people are always in charge of the technology.
  • Prepared. Leaders must actively reduce the bias embedded in smart tech code and systems. A thoughtful, participatory process is required to select values-aligned systems, vendors, and consultants.
  • Knowledgeable and Reflective. The impact of using smart tech is too big to leave to the IT department alone. Leaders in the boardroom, the C-suite, and on staff need to lean into what smart tech is and what it does.

Once automated systems are in place, leaders need to be vigilant about whether the technology is performing as intended, whether unintended consequences have arisen, and how clients and end users ultimately feel about the systems. In particular, smart tech systems can have racial and gender bias built right into them. It is incumbent on organizational leaders to investigate how biased these products are and to take the lead in mitigating the harms.
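
What that vigilance can look like day to day is sketched below. This is not a prescription from the authors; the class, field names, and 10% review rate are hypothetical. The idea is simply to log every automated decision, route a random sample to human reviewers, and watch the override rate, broken out by group where possible, as an early warning that the system is not performing as intended.

    # Hypothetical monitoring sketch: log automated decisions, sample some for human
    # review, and track how often reviewers override the system. All names and the
    # 10% review rate are assumptions for illustration.
    import random
    from dataclasses import dataclass, field

    @dataclass
    class DecisionLog:
        review_rate: float = 0.1                    # fraction sampled for human review
        records: list = field(default_factory=list)

        def record(self, case_id: str, decision: str) -> bool:
            """Log a decision; return True if a human should review this case."""
            needs_review = random.random() < self.review_rate
            self.records.append({"case": case_id, "decision": decision,
                                 "reviewed": needs_review, "overridden": False})
            return needs_review

        def mark_override(self, case_id: str) -> None:
            """Record that a human reviewer disagreed with the automated decision."""
            for r in self.records:
                if r["case"] == case_id and r["reviewed"]:
                    r["overridden"] = True

        def override_rate(self) -> float:
            reviewed = [r for r in self.records if r["reviewed"]]
            return sum(r["overridden"] for r in reviewed) / len(reviewed) if reviewed else 0.0

A rising override rate, or one concentrated among certain groups of members, is a signal to pause the automation and bring people back into the loop.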

How to Approach Software Purchases

Kanter and Fine urge leaders not to grab software off the shelf, but to follow a “ready, set, go” approach as follows:

  • Identify key pain points to determine the right use cases. These should focus on areas where smart tech can take over rote tasks, streamlining unmanageable workloads and reducing worker stress. Outline exactly what tasks and decision-making people will retain and what tasks will be automated when the system is implemented. This includes identifying how automation will be supervised by someone with subject matter expertise.
  • Choose the right smart tech for the job. Make sure the product or system you choose will create the right co-botting balance. Ensure that the assumptions built into the smart tech align with your values. And be sure that the tasks that require empathy and intuition will be assigned to people, while tasks such as data entry or analyzing huge swaths of data will be assigned to smart tech — and not the other way around.  
  • Create a virtuous cycle of testing, learning, and improving. Step carefully and slowly, because it can be difficult to undo the harms of automation once smart tech is in place. Pilot the new system and workflow to ensure that your hopes and assumptions are correct.

About the authors of “The Smart Nonprofit”

Beth Kanter is an internationally recognized thought leader and trainer in digital transformation and well-being in the nonprofit workplace. She is the co-author of the award-winning Happy Healthy Nonprofit: Impact without Burnout and co-author with Allison Fine of The Smart Nonprofit. As a sought-after keynote speaker and workshop leader, she has presented at nonprofit conferences around the world to thousands of nonprofits.

 

Allison Fine is among the nation’s pre-eminent writers and strategists on the use of technology for social good. She is the author of the award-winning Momentum: Igniting Social Change in the Connected Age and Matterness: Fearless Leadership for a Social World and co-author with Beth Kanter of the best-selling The Networked Nonprofit. She is a member of the national board of Women of Reform Judaism and was chair of the national board of NARAL Pro-Choice America Foundation and a founding board member of Civic Hall.

About the Author

Kristin is the director of communications at the Academy of General Dentistry. Working in marketing, public relations and communications has allowed her to combine her passion for public policy and journalism to create networks that connect people to organizations, causes, or communities with shared values. She is also a 2022-23 member of Association Forum's Content Working Group.
