Managing the Risks of Generative AI Policy

October 27, 2023

There has been much discussion about the potential threats posed by generative AI platforms such as ChatGPT. But when it comes to generative AI, the biggest threat to purchasing co-ops and buying groups is failing to embrace it at all.

AI is here to stay, and purchasing groups will see a widening gap between adopters and non-adopters.

Generative AI is a type of artificial intelligence that creates brand-new content, including text, images, and audio.

So how do group leaders get started with AI?

The typical advice is to begin to play with tools such as ChatGPT – experiment with having it create document summaries, write emails, and perform simple tasks inside and outside your work life.

While this can be good advice for an individual, it is not necessarily good advice for an organization unless rules are put in place.

Do You Know Where Your Data Is?

The second biggest threat to purchasing groups when it comes to AI is not knowing how and when your team is using it. A recent survey found that over 40% of professionals have used ChatGPT or other generative AI tools at work. Alarmingly, 68% of them admit to doing so without their boss's knowledge.

I’ve said it before: when it comes to data security, buying groups and purchasing cooperatives must be extra vigilant. It’s one thing for a company to put its own data at risk. It’s an entirely different story to put your members’ data at risk. Simply put, it’s not your data to lose, and a breach’s effects on their businesses can be devastating.

My point is not that “playing” with ChatGPT is bad. My point is that employees using these tools however they want, possibly without your knowledge, is not in the best interest of your group or your members.

Playing By the Rules

The real first step to getting started with AI is to establish clear rules and policies for office use. I don’t want to debate or predict the specific risks that generative AI poses to data privacy. Use the tools in the way that best suits your group and your strategic direction. Just don’t assume that everyone in your office shares your understanding of what is or is not appropriate.

Start by clearly defining what information can and cannot be uploaded to ChatGPT. Is it okay to upload member lists? Supplier contracts? Purchasing data?

A good rule of thumb is to classify information as “in front of the firewall” and “behind the firewall”. Information publicly available on your website or on the internet is fair game, but information not publicly available is not.

When writing your policy, spell out exactly what is and is not acceptable use.

The goal of these policies is not to stifle your group’s use of these tools. The goal is to create the best possible outcome and smoothest adoption of AI.

Generative AI isn’t a distant dream – it’s here to stay, and it’s taking business by storm. I agree that the best way to learn about AI tools is to use them. But before you do, establish clear guidelines and policies for using AI to protect yourself, your group, and your members from potential mishaps.

Written by Steve Seguin

——

LBMX offers a business-to-business marketplace platform, helping independent businesses, their buying groups, and suppliers buy better and sell more. Its Private Group Marketplace for Groups has transformed billing and ordering, rebate management, real-time analytics, e-commerce and product information management across the building materials, HVAC, plumbing, sporting goods, industrial supply, manufacturing, and agricultural industries. Its LBMX Supply Cloud platform allows suppliers to look at their industrial distribution customers through one lens, offering full EDI, PIM, Analytics and Payments.

Stay informed about the latest updates, industry knowledge, and exciting product releases by being a part of the LBMX community on LinkedIn and YouTube.