4 Questions Your Association Should Be Asking About AI

Read about the AI insights coming from a “Digital Den” held by technology consultancy DelCor.

By Tobin Conley, CAE & Atabak Akhlaghi


As associations continue to learn about artificial intelligence (AI) and calibrate their policies and procedures to prepare for the changes that AI will introduce to the association space, DelCor wanted to provide an opportunity for association executives to discuss some of the challenges they are facing and address questions they may have. Recently, DelCor hosted a virtual forum with some of our client technology leaders to discuss how associations should be approaching AI from the perspectives of operations, strategy, and governance. 

The conversation shed some light on how association technology leaders should be positioning themselves for success at this critical juncture. It also helped clarify how some associations are approaching the challenges presented by AI. We wanted to share the key themes and ideas discussed during the meeting in the hope that they will be valuable for all association leaders to think about in the coming months.

Who Should Own Decisions Concerning AI?

As with all matters of technology, decisions concerning the use of AI should be made collaboratively between an association’s business leaders and IT staff. The IT department should come to the table with an understanding of how AI technology can be used to support the organization’s business and operational needs along with the risks the organization will incur and need to strategize against. While IT is the expert on technology, business leaders are responsible for strategically orienting the association to operate optimally.

What Kinds of Policies and Governance Should Associations Implement?

Generative AI technology (such as ChatGPT) can be complicated, and many of its potential effects—both positive and negative—have not been thoroughly explored yet. As a result, it’s difficult to identify exactly how you should handle organization-wide policies. However, it’s important to start defining your organization’s policies and governance with respect to AI technology now, as your staff may already be incorporating AI into their workflows.

We suggest you start by identifying ways your staff are already using these tools, setting guidelines for acceptable uses of generative AI, and educating your staff about the benefits and risks associated with leveraging this new technology. For example, staff should be advised that AI technology is not completely secure, and the content generated isn’t always correct. Staff need to be careful because the organization is still responsible for any content generated through AI.

To foster productive use of AI tools while limiting organizational risk, associations should provide staff with guidelines about the appropriate use of such tools. While these guidelines will need to be flexible (and regularly reviewed to ensure they are up to date with changes in AI technology), they will give employees clear guardrails within which to work. Association staff can (and will) use AI to increase their productivity—having a policy in place will help ensure that they do so safely.

How Can Associations Approach Concerns About Security and Privacy?

While there is a push to incorporate AI into the day-to-day business of associations, it’s important that all aspects of how AI might be integrated into an association’s operations are properly vetted by the organization’s IT team first. Associations also need to consider how their vendors may be integrating AI into their systems and ensure that the association is not exposed to any unacknowledged risks.

Association leaders need to identify ways to use AI tools that the organization can implement without compromising security. For example, some associations have been working on ways to implement AI to assist staff in searching through digital libraries and archived documents.

Additionally, it’s important to understand that people are often the weakest link in an organization’s cybersecurity. Untrained staff may unknowingly make decisions that compromise the safety and privacy of member data. Associations need to train their staff to keep member data out of AI tools, in addition to establishing an organization-wide policy on how member data may be used with any AI tools in the future.

How Can Associations Approach Educating Their Staff?

While it is imperative that both technical and nontechnical staff receive training on AI technology, it is still difficult to identify the most reputable sources for training. However, associations should be actively seeking guidance from those they trust in the industry about resources to educate their staff. Technical staff will need to understand the functionality that AI tools can bring as well as how to integrate those tools securely and properly. Nontechnical staff will need to understand best practices for using AI as well as how to make optimal use of any AI tools their associations may integrate into their systems.

Association leaders need to understand how best to approach generative AI technology from the perspective of their organization’s needs and operations. Generative AI technology can provide tremendous value to associations, but it’s important that organizations do their due diligence. Doing that work now will help set your association up for success.

About the Author

Tobin is the director of research and learning as well as a senior strategic consultant at DelCor. Atabak is a proposal and marketing writer for DelCor.
