A recent conversation that went viral among Salesforce professionals has put a spotlight on something most admins didn't know existed: a default-enabled setting buried in Salesforce Setup that grants Salesforce access to your organization's customer data — not for your org's benefit, but for Salesforce's own AI training, research, and development purposes.
The reaction was swift. Thousands of Salesforce admins, architects, and security professionals shared the discussion, many expressing shock that the setting existed at all — and that it was turned on by default.
But before you panic, let's be clear: this is not a data breach, and this is not a scandal. This is a configuration choice that every Salesforce customer has the right to make — once they know it exists. The problem isn't that the option is available. The problem is that most organizations have never been asked to make a conscious decision about it.
In this guide, we'll walk you through exactly what this setting does, how to find it, what the Einstein Trust Layer actually protects (and what it doesn't), and a decision framework for whether your organization should opt out.
The Opt Out of Customer Data Access setting is a Salesforce configuration option that controls whether Salesforce can use your organization's customer data for purposes beyond running your org's own Einstein AI features.
When you have not opted out (the default state), Salesforce may use your customer data on an aggregate basis to:

- Train and improve its global predictive AI models
- Improve and develop Salesforce services and features
- Conduct research and development
When you opt out, Salesforce employees can no longer view your customer data outside of a support case, a pilot, or as otherwise described in your legal agreements. Your data stops contributing to Salesforce's broader AI training pipeline.
This setting is not the Einstein Trust Layer. It is not the toggle that enables or disables Einstein AI features in your org. It is a separate, independent control that governs Salesforce's downstream use of your data — distinct from how Einstein AI operates within your instance.
Follow these exact steps to locate the Opt Out of Customer Data Access setting in your Salesforce org:
1. Click the gear icon in the upper-right corner of your Salesforce instance and select Setup.
2. In the Quick Find box on the left sidebar, type "Einstein" or "Opt Out".
3. Click Opt Out of Customer Data Access under the Einstein section. The exact path is:

   Setup → Einstein → Opt Out of Customer Data Access

4. You'll see a toggle labeled "Allow Salesforce Access to Customer Data" (or similar phrasing). If the toggle is ON, Salesforce currently has permission to use your customer data for AI training and research.
5. If you choose to opt out, toggle the setting off to disable Salesforce's access. If you choose to keep it enabled, document that decision with your stakeholders.
Regardless of your choice, document the following:

- The decision you made (opted out or left enabled)
- The date the decision took effect
- Who reviewed and approved it (legal, security, leadership)
- The rationale behind the decision
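If you want that record in a machine-readable form, here is a minimal sketch of what it might look like. The field names and file path are illustrative, not a Salesforce or compliance standard; adapt them to your own governance templates:

```python
import json
from datetime import date

# Illustrative decision record for the opt-out setting. Every field name
# here is an example, not a standard; map these to whatever your
# organization's compliance templates require.
decision_record = {
    "setting": "Opt Out of Customer Data Access",
    "path": "Setup > Einstein > Opt Out of Customer Data Access",
    "decision": "opted_out",  # or "left_enabled"
    "effective_date": date.today().isoformat(),
    "reviewed_by": ["Legal", "Security", "Salesforce Admin team"],
    "rationale": (
        "Customer data includes regulated PII; secondary use for "
        "vendor AI training is outside our contracted purposes."
    ),
    "next_review": "quarterly",
}

# Store alongside your other compliance artifacts; keeping the file in
# version control gives auditors a timestamped history for free.
with open("sf_data_access_decision.json", "w") as f:
    json.dump(decision_record, f, indent=2)
```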
This is where confusion runs rampant in the Salesforce community. These are two completely separate controls that serve different purposes. Here's a clear comparison:
| Feature | Einstein Trust Layer | Opt Out of Customer Data Access |
|---|---|---|
| What it does | Protects your data during real-time AI operations (inference, predictions, generative AI prompts) | Controls whether Salesforce uses your data for global AI model training and R&D |
| When it applies | Every time Einstein AI processes a request in your org | Any time Salesforce draws on your customer data for its own model training and R&D |
| PII masking | Yes — automatically masks sensitive data before sending to LLMs | Not applicable — this controls data usage policy, not real-time processing |
| Zero-data retention | Yes — LLM partners don't store your data | Not applicable — this is about Salesforce's own internal data usage |
| Can you disable it? | No — it's mandatory and always on | Yes — you can toggle it in Setup |
| Impact on Einstein features | Essential — Einstein won't work without it | None — opting out does NOT break any features |
| Who benefits | Your org (data stays protected during AI use) | Salesforce (they use your data to improve their global models) |
| Certifications | ISO 27001/27017/27018, ISO 42001, FedRAMP | Governed by Master Subscription Agreement |
The Einstein Trust Layer protects your data when YOU use AI. It ensures PII is masked, data isn't retained by third-party LLMs, and all interactions are logged.
The Opt Out setting controls whether SALESFORCE uses your data. It determines whether your customer data contributes to Salesforce's global AI model training, research, and development activities.
These two controls are completely independent. The Trust Layer works regardless of your opt-out status. Opting out does not weaken any Trust Layer protections. And keeping the setting enabled does not give you additional Trust Layer benefits.
No. This is the single most important fact in this article.
Opting out of customer data access does NOT disable, degrade, or break any of the following Einstein features in your org:

- Einstein GPT
- Agentforce
- Einstein Copilot
- Einstein Activity Capture
- Every other Einstein AI feature your org uses
All of these features rely on the Einstein Trust Layer for data protection during use, not on your opt-out status. The opt-out setting only controls whether your data is used for Salesforce's own global model improvements.
The one potential trade-off: opting out may mean your data doesn't contribute to global model personalization improvements that could benefit the broader Salesforce ecosystem. However, most organizations — especially those with sensitive customer data — will find this an acceptable trade-off.
Salesforce is not the first major SaaS platform to face scrutiny over default opt-in AI data training. This is part of a well-documented industry pattern:
Zoom updated its Terms of Service to grant itself broad rights to use customer content — including audio, video, and chat — for AI training. The backlash was immediate and severe. Zoom reversed course within days, explicitly promising not to use customer content for AI training without consent.
Adobe updated its Terms of Use with language that was perceived as granting the company rights to use creative professionals' work for AI training. The creative community revolted. Adobe employees publicly criticized the decision. Adobe was forced to clarify and update its terms.
LinkedIn quietly enabled AI training on user content — including posts, comments, and profile data — by default. The change triggered regulatory scrutiny in the EU and widespread user backlash. Users had to manually navigate to a buried privacy setting to opt out.
Slack faced intense criticism after users discovered the platform was using customer messages, files, and data to train its machine learning models by default. Users were shocked to learn that opting out required sending an email to Slack — there was no in-app toggle.
Every one of these companies:

- Enabled AI data training on customer content by default
- Relied on users discovering the change rather than announcing it clearly
- Faced swift public backlash once the practice came to light
- Was forced to clarify, update, or reverse its terms
Salesforce's "Opt Out of Customer Data Access" setting fits squarely within this pattern. The key difference: Salesforce does provide an in-app setting to control this — you just need to know it exists.
While every organization should review this setting, businesses in regulated industries face additional considerations: even with the Trust Layer's protections in place, the secondary use of customer data for AI training may raise questions under HIPAA, SOX, FINRA guidelines, and state privacy laws.
Not every organization needs to make the same decision. Here's a framework to guide your evaluation:

- If you operate in a regulated industry (financial services, healthcare, insurance, government), opt out as a precautionary measure.
- If you handle sensitive customer data, opting out is the prudent default; the only trade-off is that your data won't contribute to global model improvements.
- If your Salesforce contract includes negotiated data-usage terms, confirm whether they already address AI training before relying on the toggle alone.
- If none of these apply, keeping the setting enabled may be acceptable, but only after a documented review with legal and security stakeholders.
The Opt Out of Customer Data Access setting is just one piece of a comprehensive AI governance strategy. Here are broader best practices every Salesforce admin should follow:
Review all Einstein-related settings in your org, not just this one. Understand what data is being used, how, and by whom.
Classify your Salesforce data by sensitivity level (public, internal, confidential, restricted). This makes governance decisions faster and more consistent.
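As a sketch of what that classification might look like in practice, the snippet below maps Salesforce fields to sensitivity tiers. Every object and field name here is a hypothetical example, not a recommendation; substitute the objects that actually hold data in your org:

```python
# Hypothetical sensitivity tiers mapped to fully qualified Salesforce
# field names. These objects and fields are examples only.
SENSITIVITY_TIERS = {
    "public":       ["Account.Website", "Account.Industry"],
    "internal":     ["Opportunity.StageName", "Case.Status"],
    "confidential": ["Contact.Email", "Contact.Phone"],
    "restricted":   ["Contact.SSN__c", "Account.Credit_Limit__c"],
}

def tier_of(field: str) -> str:
    """Return the sensitivity tier for a fully qualified field name."""
    for tier, fields in SENSITIVITY_TIERS.items():
        if field in fields:
            return tier
    # Default to flagging unknown fields for review rather than
    # silently assuming they are safe.
    return "unclassified"

print(tier_of("Contact.Email"))     # confidential
print(tier_of("Lead.LinkedIn__c"))  # unclassified -> needs review
```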
Understand exactly what your contract with Salesforce allows regarding data usage. If the AI training clause concerns you, negotiate specific terms at renewal.
Use Salesforce's built-in audit trail capabilities to track changes to security settings, data access patterns, and AI feature usage.
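One way to do this outside the UI is to query the standard SetupAuditTrail object over the API, which logs Setup changes. The sketch below assumes the third-party simple-salesforce Python library and placeholder credentials; the keyword filter is illustrative, since the exact Section and Action values your org logs may differ:

```python
# Requires the third-party simple-salesforce package
# (pip install simple-salesforce) and API access to your org.
from simple_salesforce import Salesforce

# Placeholder credentials; use your own auth method in practice.
sf = Salesforce(
    username="admin@example.com",
    password="your-password",
    security_token="your-token",
)

# SetupAuditTrail is a standard, queryable object; SOQL typically
# returns roughly the last 180 days of entries.
soql = (
    "SELECT Action, Section, Display, CreatedDate, CreatedBy.Name "
    "FROM SetupAuditTrail "
    "ORDER BY CreatedDate DESC LIMIT 200"
)

for rec in sf.query(soql)["records"]:
    # "einstein" is an illustrative filter; inspect your own records to
    # see which Section values your org's Einstein changes produce.
    if "einstein" in (rec["Section"] or "").lower():
        who = rec["CreatedBy"]["Name"] if rec["CreatedBy"] else "system"
        print(rec["CreatedDate"], who, "-", rec["Display"])
```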
Document your organization's policies around AI data usage, including which features are approved, how data flows through AI systems, and who has authority to change settings.
Ensure all Salesforce admins understand the implications of data access settings and know how to review them. This shouldn't be a one-person responsibility.
Don't wait for an audit or a regulatory inquiry. Proactively involve legal counsel in AI governance decisions.
Salesforce updates its privacy and AI features regularly. Review release notes for any changes to data handling, and subscribe to Salesforce's Trust and Security communications.
By default, yes. Unless you have specifically opted out via Setup → Einstein → Opt Out of Customer Data Access, Salesforce may use your organization's customer data on an aggregate basis to train global predictive AI models, improve services, and conduct research and development. Opting out stops this usage entirely.
No. Opting out does not break, disable, or degrade any Einstein features in your org. Einstein GPT, Agentforce, Einstein Activity Capture, Einstein Copilot, and all other AI features continue to work normally. The opt-out only controls whether Salesforce uses your data for their global model improvements.
The Einstein Trust Layer is a mandatory security architecture that protects your data during real-time AI operations — it masks PII, enforces zero-data retention with LLM partners, and logs all interactions. The Opt Out of Customer Data Access is a separate setting that controls whether Salesforce uses your data for its own AI training and R&D. They are completely independent.
Navigate to Setup → Einstein → Opt Out of Customer Data Access. Toggle the setting to disable Salesforce's access to your customer data. The change takes effect immediately. Document your decision and the date for your compliance records.
No. The Einstein Trust Layer operates independently and remains fully active regardless of your opt-out status. PII masking, zero-data retention, toxicity detection, and audit logging all continue to function as designed.
Opting out aligns with GDPR and CCPA/CPRA data minimization principles by preventing your data from being used for purposes beyond your contracted services. For organizations subject to these regulations, opting out eliminates potential concerns about "sale/sharing" interpretations and helps satisfy data purpose limitation requirements.
When the setting is enabled (not opted out), Salesforce may use your customer data on an aggregate basis for AI model training and service improvements. This includes insights, reports, and scoring from Einstein services. Salesforce excludes certain categories by default, including credit card numbers, government IDs, racial/ethnic origin data, and financial information. PII is automatically masked by the Einstein Trust Layer before reaching any LLM partners.
Most compliance professionals and legal advisors recommend that organizations in regulated industries — including financial services, healthcare, insurance, and government — opt out of customer data access as a precautionary measure. Even though Salesforce's Trust Layer provides robust protections, the secondary use of data for AI training may raise questions under industry-specific regulations like HIPAA, SOX, FINRA guidelines, and state privacy laws.
We recommend reviewing this setting quarterly or whenever there is a major Salesforce release. Include it in your regular security audit checklist, and re-evaluate whenever your organization's data governance policies, regulatory landscape, or Salesforce contract terms change.
Salesforce may update the functionality or scope of this setting through platform releases. It's important to subscribe to Salesforce's release notes and security bulletins, and to review your Einstein settings after each major release to confirm your opt-out status is still in effect.
The Opt Out of Customer Data Access setting isn't a threat — it's a governance opportunity. Every Salesforce org should know this setting exists, understand what it controls, and make a conscious, documented decision about whether to opt out.
For many organizations — especially those handling sensitive data or operating in regulated environments — opting out is the prudent choice. For others, keeping the setting enabled may be acceptable after a thorough review with legal and security stakeholders.
What's not acceptable is leaving it in its default state simply because nobody knew it was there.
Take 5 minutes today. Go to Setup → Einstein → Opt Out of Customer Data Access. Make your decision. Document it. Move on.
At Vantage Point, we help organizations navigate the intersection of Salesforce, AI, and data governance, and our AI Governance Assessment service is built for exactly these decisions.
Whether you're implementing Agentforce, deploying Einstein Copilot, or simply trying to understand what your Salesforce org is doing with your data, we bring the expertise to help you move forward with confidence.
Ready to audit your Salesforce AI governance settings? Contact Vantage Point today →
Vantage Point is a Salesforce, HubSpot, and AI implementation partner that helps businesses unlock the full potential of their CRM investments. From Sales Cloud and Service Cloud to Data Cloud, MuleSoft, and Agentforce, we bring deep technical expertise paired with strategic business insight. Our partnerships with Salesforce, HubSpot, Anthropic (Claude AI), Aircall, and Workato enable us to deliver integrated, future-ready solutions across any industry.
Learn more at vantagepoint.io