What strategies can organizations use to address the added compliance burden that comes with AI-enabled tools? How do you keep up with changes to both vendors' Terms of Service and to the various data privacy laws?

Data Protection Officer and AI Legal Lead in Software, 8 days ago

We use a two-fold approach. From a governance perspective, we have a committee that reviews AI-enabled tools at a high level. We also maintain close collaboration with compliance, legal, and other relevant teams. I agree with the approach of considering the specific regulatory environment of each country, as some regulations like GDPR are personal-data focused, while others are more concerned with data sets. We have specialized teams for different regulations, which helps us understand the regulatory posture in each country when deploying or using AI tools. Ultimately, collaboration between departments and having the right internal expertise are key to protecting the company’s interests.

Director of IT in Banking, 8 days ago

We addressed this by working closely with our data analytics team to develop templates for business stakeholders who own the data. These templates include metadata tags, such as specifying whether data can be used internally or adding a “no AI” tag. This approach draws on requirements from GDPR and also considers the need to protect intellectual property. Rather than trying to keep up with every AI tool accessing our data pools, we focus on keeping all data in silos within our Google environment and controlling access through additional metadata.
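As a rough illustration of the kind of tagging described above, the sketch below applies usage labels (including a "no AI" flag) to a BigQuery table with the google-cloud-bigquery client. The label keys, values, and table name are hypothetical stand-ins, not the actual template; the real fields would be agreed with the business stakeholders who own the data.

from google.cloud import bigquery

# Hypothetical label keys -- the actual template fields are defined with the
# business stakeholders who own the data set.
USAGE_LABELS = {
    "data_owner": "marketing",
    "internal_use_only": "true",
    "no_ai": "true",            # flag that excludes this data set from AI tooling
    "retention_basis": "gdpr",
}

client = bigquery.Client()

# Hypothetical table reference inside the siloed Google environment.
table = client.get_table("my-project.customer_data.transactions")
table.labels = {**(table.labels or {}), **USAGE_LABELS}

# Send only the "labels" field in the update request.
client.update_table(table, ["labels"])

Access policies and AI tooling can then filter on these labels rather than trying to track every tool that touches the data pools.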

CISO/CPO & Adjunct Law Professor in Finance (non-banking)8 days ago

My approach is to make data the central focus. Once you know what data you have, you can drill down into the relevant jurisdictions and identify any overarching regulations, such as GDPR, which is person-based rather than just database-based. Ideally, you should find out what the business intends to do before rolling anything out. If you can gather this information in advance, you can identify potential legal issues and involve legal teams early in the process. This allows legal and compliance to work on the different aspects in parallel, rather than risking a situation where something is rolled out only for legal to later block it due to jurisdictional restrictions. Getting legal and compliance involved early, and requiring the business to specify exactly what data they want to use, helps streamline the compliance process.

Information and Security Office & Enterprise Data Governance/AI in Finance (non-banking), 8 days ago

This is a significant challenge for us as a global organization, since we must comply with a wide range of regulations, including Canadian regulations, GDPR from Europe, the UK’s distinct regulations, as well as Australian, Singaporean, and various US regulations. One particular challenge is dealing with vendors who use click-through agreements, which can change after a point-in-time assessment. Sometimes, vendors will enable AI features without notifying us, and we only discover these changes after the fact, which forces us to chase due diligence and evaluate the impact retroactively. It often feels like we are chasing a moving train, and managing this compliance landscape is an ongoing challenge.

Director of Information Security, 19 days ago

As part of our TPRM process, we added a series of questions specifically focused on AI. We approach vendors differently based on their responses, with particular attention to issues like the use of public LLMs or training on customer data. While this may not directly address every compliance issue, it is one way we have adapted to the increased burden.
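Purely as an illustration of how answers like these can drive differential treatment, the sketch below scores a few hypothetical AI questions (public LLM use, training on customer data) into a review tier. The question set, field names, and thresholds are invented for the example and do not reflect any actual TPRM questionnaire.

# Hypothetical AI-focused TPRM answers; a real questionnaire will differ.
vendor_answers = {
    "uses_public_llm": True,
    "trains_on_customer_data": False,
    "allows_opt_out_of_training": True,
    "subprocessors_disclosed": True,
}

def review_tier(answers: dict) -> str:
    """Map questionnaire answers to a vendor review tier (illustrative only)."""
    if answers.get("trains_on_customer_data") and not answers.get("allows_opt_out_of_training"):
        return "enhanced review + contractual controls"
    if answers.get("uses_public_llm"):
        return "standard review + data-handling addendum"
    return "standard review"

print(review_tier(vendor_answers))  # -> "standard review + data-handling addendum"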

We have also explored using AI to help manage the compliance workload itself, such as updating policies, reviewing documentation, and assisting with authoring. A human always reviews and finalizes these materials, of course, but AI helps us keep up with the added burden.
