What market research legal teams need to know about generative AI

 


Not all generative AI is created equal (legally speaking). Here’s how Suzy is taking steps to build our new AI tools with you in mind.

Generative AI has upended many industries since it came roaring onto the scene earlier this year. At first, everyone was simply excited about what the tools could do. But with data privacy issues, a lack of regulation, and many other concerns, it swiftly became clear that there could be legal ramifications for brands using generative AI tools.

Not only were users uncertain about where the information was coming from, but also about whether data entered into a tool like ChatGPT could ever be removed. Several brands, including iHeartMedia, Samsung, Apple, and Verizon, even took steps to prevent their teams from using generative AI tools.

As enterprise brands strive to remain competitive in today’s constantly changing business landscape, AI has the potential to change things for the better. In fact, we at Suzy believe AI will positively transform our industry and allow market researchers to uncover powerful insights faster and with more granularity than ever before. 

But how do you know if you can trust an AI tool? What should brands and their legal, privacy, and security teams look for? 

At Suzy, we’ve been asking those big questions since the beginning of the year, and have involved both our clients and legal team in discussions to find the answers. Let’s explore how to find a market research AI tool that you can trust and what you should look for.

Make sure you have control

When working with providers that offer AI products, check how consent is handled. Are you providing blanket consent to use the tool, or are there exceptions? A tool that requires blanket consent often presents a binary choice: consent to your data being used, or lose access to certain features.

At Suzy, we’ve designed our product to give our clients control from day one. Suzy’s AI features are turned off by default; our users choose whether to opt in to our AI tools.

And if they choose to use Suzy’s AI, our clients can still exclude their data at the contract level, preventing it from being used at all. You can also stop your data from being used in qualitative research, or even in one particular project for a particular brand.

In short, we’ve built a great deal of control into our platform, so our clients can feel confident about their data living on it and can trust our tools. Our clients keep control over their data, and if they don’t want to use the AI, they aren’t excluded from any of our other features.

Make sure your data is protected

When ChatGPT first hit the masses and all the questions first started coming up, the main questions our team received were, "How will you keep my information separate from public models? How do I know that what the AI is spitting out isn't mixed together with some other client’s confidential data?" 

Those are critical questions. After all, you wouldn’t want competitors to have access to your proprietary data, including any new innovations you’re testing. 

At Suzy, we’ve built our platform on a private instance. That means our client data isn’t disappearing into a never-ending black hole; it’s not being pumped into ChatGPT or other public models. We take confidentiality seriously. You can trust that your data isn’t being shared with your competitors and that your competitive insights will remain confidential.

Get respondent consent and check to see that personal information (PI) is protected

It’s tricky to ensure permissions are ironclad when research spans multiple platforms. Say a company wants to use AI for qualitative analysis. If the platform it chooses doesn’t own an audience, it must source respondents from panel providers. Because those platforms don’t control the panel, they may or may not be getting all the consent they need from respondents along the way. And when multiple platforms are involved, it’s even harder to guarantee that personal information stays protected as it’s shuffled from tool to tool.

At Suzy, we own our panel and we've educated our members about how we're using their information. We can tell our clients what kind of consent and permission we've received from those members and can demonstrate it because they're our first-party audience. You don’t have to wonder or hope that disclosures are being made along the way—you can trust that we take privacy seriously. 

We also know our enterprise clients care about respondent choice. So when we built our AI tools, we kept that consideration in mind. We’ve built controls to strip or protect personal information before it goes in and after it comes out.

Let's say one of our Crowdtap members wants to participate in a focus group and answer some questions for a brand. Initially, that member is okay with her personal information being used. But if she decides six months later that she doesn’t love the idea of her information being out there, she can contact Suzy and request that her PI be deleted. We have the tools to honor that request.

That’s true when it comes to both qual and quant content. Wherever personal information is, whether in response to an open-ended question or in a transcript from a qualitative video, we have tools to strip the content of identifying information.
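To make the idea of stripping identifying information concrete, here is a minimal sketch of what pattern-based PI redaction can look like. The patterns and function names below are illustrative assumptions for demonstration, not Suzy’s actual implementation; production systems typically combine pattern matching with more sophisticated techniques like named-entity recognition.

```python
import re

# Hypothetical illustration only: these patterns are simplified assumptions,
# not a real product's redaction rules.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b")

def redact_pi(text: str) -> str:
    """Replace common personal identifiers with placeholder tokens."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

transcript = "Sure, reach me at jane.doe@example.com or 212-555-0148."
print(redact_pi(transcript))  # → "Sure, reach me at [EMAIL] or [PHONE]."
```

The same redaction step can run on open-ended survey responses or on video transcripts before the text is passed to any downstream model.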

The AI must be informed by experts

You wouldn’t necessarily rely on Wikipedia, Reddit, or the rest of the internet to teach your team how to write a survey or run qualitative research. Instead, you’d want to make sure that your team had the expertise and knowledge to run research well, likely with degrees and certifications to back it up. AI is no different and tools must be informed by expert market researchers—not ChatGPT’s best guess—for you to trust the outputs.

At Suzy, we believe the market researcher is the expert, and we don’t see generative AI replacing them anytime soon. But we do believe that AI can help insights professionals do their jobs more efficiently and effectively. 

When we train our AI model, we leverage our in-house Center of Excellence and other experts to teach our AI what a good survey looks like or what kinds of analysis to output from a qualitative transcript. Everything that goes into our suite of AI tools is informed by the wealth of knowledge we have on our internal team so you can trust the outputs and the insights that come out of our AI.

The tool must welcome feedback

AI isn’t perfect yet, and though it has grown by leaps and bounds there is still room for improvement. The AI tool your team chooses for market research needs to give you the room to express whether something is working or not. That’s the only way the AI can get better. With plenty of feedback, a tool can be curated to produce insights and data you can trust and take to your stakeholders.

Since the launch of our AI-powered suite of tools, 30 enterprise brands have enrolled in an early access program, with new brands added daily. And we want feedback: we’ve built a feedback prompt into the bottom of every analysis recommendation, so our users can rate how useful or accurate the information was. With their input, we can continually train our tool so it gets better and better.

Final words: Suzy’s AI is being built with you in mind

The world of AI is uncertain right now. But one thing is clear: AI has the potential to save market research teams a lot of money by increasing efficiency and helping insights get across the finish line faster. With insights at the speed of AI, you can truly drive consumer-led growth and allow the consumer to inform all your business-critical decisions. 

From the moment we began conceptualizing AI tools for our consumer research platform, we involved our legal and compliance team and started conversing with our clients. With their critical input, we’ve been able to build an AI product that we are proud of and that our clients (and their legal teams) can trust. 

Want to see how it works? Request access to our early access program.

 