1 minute read

Your SaaS Contract May Be Training Someone Else's AI on Your IP

The most consequential risks in SaaS agreements rarely appear in headline provisions. Instead, they are embedded in data usage or service improvement clauses, where vendors are often granted broad rights to use customer data. While such language may appear routine, its scope has expanded significantly with the rise of artificial intelligence. “Service improvement” increasingly encompasses the use of customer data to train, fine-tune, or otherwise enhance AI models.

This shift is not merely theoretical. Numerous SaaS providers have revised their terms of service to expand data usage rights related to AI, frequently through unilateral updates rather than negotiated contract amendments. This “quiet update” approach allows vendors to introduce materially broader data rights without drawing focused attention. These updates often include expansive license grants permitting the use of customer inputs, prompts, and outputs for model development and optimization.

Several recurring phrases warrant careful scrutiny:

  • “To improve our services.” This framing is broad enough to encompass AI training and related activities, even where such uses are not expressly identified.

  • “Aggregated or anonymized data.” Although anonymization suggests reduced risk, it does not prevent a vendor from extracting valuable patterns, relationships, or structural insights from the data.

  • “Non-personal usage data.” This category frequently extends beyond basic metadata to include behavioral, interactional, and content-derived signals.

In negotiating SaaS agreements, organizations should consider implementing targeted contractual safeguards:

  1. Explicit AI training carve-outs. Provisions should clearly prohibit the use of customer data for training, fine-tuning, or benchmarking AI models absent express consent.

  2. Defined data ownership and limited license grants. Agreements should affirm that all customer data remains the property of the customer, with vendor rights narrowly confined to what is necessary to provide the contracted services.

  3. Audit rights and transparency obligations. Customers should retain the ability to monitor how their data is used, processed, and shared, including visibility into downstream data practices.

Regulatory scrutiny in this area is increasing. The Federal Trade Commission has cautioned that retroactive modifications to terms of service that materially expand data usage, particularly where prior representations emphasized privacy protections, may constitute unfair or deceptive practices.

Accordingly, organizations should adopt a proactive approach to reviewing and negotiating SaaS agreements, with particular attention to evolving data usage provisions and their implications for AI-related activities.

Tags

saascontracts, datagovernance, aicompliance, artificialintelligence, dataprivacy, techtransactions, contractnegotiation, dataownership, inhousecounsel, legaltech, termsofservice, aigovernance, corporate, intellectualproperty