This guide shows how to integrate OpenAI’s ChatGPT with Salesforce step-by-step. We’ll start with the quick legacy route using Remote Site Settings + Apex, then move to the recommended, secure setup with Named Credentials + External Credentials so your API keys stay hidden, headers are injected automatically, and all calls run server-side (no LWC key exposure).
Why integrate OpenAI with Salesforce?
Because your reps, agents, and ops users want summaries, draft emails, next-best suggestions, classification, data enrichment, and natural-language insights right inside Salesforce. Done right, it saves time, boosts quality, and lets admins automate magic without switching tools.
What API endpoint are we calling?
We’ll keep it simple and call the Chat Completions endpoint:
POST https://api.openai.com/v1/chat/completions
Headers:
Authorization: Bearer <YOUR_API_KEY>
Content-Type: application/json
Body:
{
"model": "gpt-4o-mini",
"messages": [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Write a 2-line summary for this Salesforce Case: <text>"}
],
"temperature": 0.2
}
Swap the model/message as you like. The Auth header stays the same.
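Before touching Salesforce, it’s worth sanity-checking the payload shape and your key outside the org. A minimal Python sketch (function names and the sk-YOUR-KEY placeholder are illustrative; the actual network call is left commented so nothing is sent by accident):

```python
import json

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(api_key, user_prompt, model="gpt-4o-mini", temperature=0.2):
    """Build the headers and JSON body in the same shape the Apex clients use."""
    headers = {
        "Authorization": "Bearer " + api_key,
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_prompt},
        ],
        "temperature": temperature,
    }
    return headers, json.dumps(body)

def extract_answer(response_json):
    """Pull choices[0].message.content out of a Chat Completions response."""
    choices = response_json.get("choices") or []
    if not choices:
        return ""
    return (choices[0].get("message") or {}).get("content", "")

headers, payload = build_chat_request("sk-YOUR-KEY", "Write a 2-line summary for this Case.")
# To actually send it, e.g. with the requests library:
# resp = requests.post(OPENAI_URL, headers=headers, data=payload, timeout=20)
# print(extract_answer(resp.json()))
```

If this round-trips outside Salesforce, any failure inside the org points to Salesforce-side configuration (Remote Site, Named Credential, permissions) rather than the key or payload.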
Prereqs
- Salesforce org with permission to create Apex, Flows, and Setup changes.
- An OpenAI API key (keep this private).
- Basic familiarity with Setup → Security items in Salesforce.
Path 1 (Legacy): Remote Site Settings + Apex (Full Code)
This works today and is easy to understand. The downside is key management/security. You must store/rotate the key yourself (we’ll use Protected Custom Metadata to keep it safer).
If you’re starting fresh, skip to Path 2 (Named Credentials). But if your org still relies on Remote Site Settings, here’s the cleanest version.
Step 1: Remote Site Settings
Setup → Security → Remote Site Settings → New Remote Site
- Remote Site Name: OpenAI_API
- Remote Site URL: https://api.openai.com
- Save.
This allows callouts to that domain.
Step 2: Store API Key safely (Protected Custom Metadata)
- Setup → Custom Metadata Types → New Custom Metadata Type
- Label: OpenAI Config
- Object Name: OpenAI_Config
- Visibility: Protected (very important)
- Add a Text(255) field: API_Key__c
- Create a record (e.g., Default) and set API_Key__c to your key.
- Restrict access via Permission Sets (only admins/integrations).
If you can’t use protected CMDT, choose Protected Hierarchy Custom Settings or Encrypted fields and tighten permissions.
Step 3: Apex callout class (Chat Completion)
public with sharing class OpenAI_RemoteSite_Client {
public static final String OPENAI_URL = 'https://api.openai.com/v1/chat/completions';
public static final String MODEL = 'gpt-4o-mini';
public static final Decimal TEMP = 0.2;
public class ChatResponse {
public String id;
public String model;
public List<Choice> choices;
}
// Apex allows only one level of inner classes, so Choice and Message
// must sit directly on the outer class rather than nested inside ChatResponse.
public class Choice {
public Integer index;
public Message message;
}
public class Message { public String role; public String content; }
public class OpenAIException extends Exception {}
// Read key from Protected Custom Metadata
private static String getApiKey() {
// Query into a list: a single-row assignment throws an unhandled
// QueryException when no record exists, before our check can run.
List<OpenAI_Config__mdt> cfgs = [SELECT API_Key__c FROM OpenAI_Config__mdt LIMIT 1];
if (cfgs.isEmpty() || String.isBlank(cfgs[0].API_Key__c)) {
throw new OpenAIException('OpenAI API key not configured.');
}
return cfgs[0].API_Key__c;
}
@AuraEnabled(cacheable=false)
public static String chat(String userPrompt) {
if (String.isBlank(userPrompt)) {
throw new OpenAIException('Prompt cannot be empty.');
}
HttpRequest req = new HttpRequest();
req.setEndpoint(OPENAI_URL);
req.setMethod('POST');
req.setTimeout(20000);
req.setHeader('Content-Type','application/json');
req.setHeader('Authorization','Bearer ' + getApiKey());
Map<String,Object> body = new Map<String,Object>{
'model' => MODEL,
'messages' => new List<Object>{
new Map<String,Object>{ 'role' => 'system', 'content' => 'You are a helpful assistant.' },
new Map<String,Object>{ 'role' => 'user', 'content' => userPrompt }
},
'temperature' => TEMP
};
req.setBody(JSON.serialize(body));
Http http = new Http();
HttpResponse res = http.send(req);
if (res.getStatusCode() >= 200 && res.getStatusCode() < 300) {
ChatResponse parsed = (ChatResponse) JSON.deserialize(res.getBody(), ChatResponse.class);
if (parsed != null && parsed.choices != null && !parsed.choices.isEmpty()) {
return parsed.choices[0].message != null ? parsed.choices[0].message.content : '';
}
return '';
}
throw new OpenAIException('OpenAI error: ' + res.getStatus() + ' — ' + res.getBody());
}
}
Test in Anonymous Apex:
System.debug(OpenAI_RemoteSite_Client.chat('Write a two-line summary for this Case: delayed shipment, customer angry.'));
Step 4: Optional – Invocable method for Flow
public with sharing class OpenAI_FlowAction {
public class Input { @InvocableVariable(required=true) public String prompt; }
public class Output { @InvocableVariable public String answer; }
@InvocableMethod(label='OpenAI Chat' description='Send prompt to OpenAI and return answer')
public static List<Output> run(List<Input> inputs) {
List<Output> out = new List<Output>();
for (Input i : inputs) {
Output o = new Output();
o.answer = OpenAI_RemoteSite_Client.chat(i.prompt);
out.add(o);
}
return out;
}
}
Then in Flow:
- Add Action → OpenAI Chat (this invocable).
- Input your dynamic prompt (record fields/text template).
- Output: store to a variable or a field.
Step 5: Optional – LWC that calls Apex
// openaiChat.js
import { LightningElement, track } from 'lwc';
import chat from '@salesforce/apex/OpenAI_RemoteSite_Client.chat';
export default class OpenAiChat extends LightningElement {
@track prompt = '';
@track answer = '';
loading = false;
handleChange(e) { this.prompt = e.target.value; }
async handleAsk() {
this.loading = true;
try {
this.answer = await chat({ userPrompt: this.prompt });
} catch (e) {
this.answer = 'Error: ' + (e?.body?.message || e.message);
} finally {
this.loading = false;
}
}
}
<!-- openaiChat.html -->
<template>
<lightning-card title="Ask ChatGPT">
<div class="slds-p-around_medium">
<lightning-textarea label="Prompt" value={prompt} onchange={handleChange}></lightning-textarea>
<lightning-button variant="brand" class="slds-m-top_small" label="Ask" onclick={handleAsk} disabled={loading}></lightning-button>
<template if:true={answer}>
<div class="slds-m-top_medium">
<lightning-formatted-text value={answer}></lightning-formatted-text>
</div>
</template>
</div>
</lightning-card>
</template>
The browser never sees the API key; it calls Apex, which performs the server-side callout.
Path 2 (Recommended): Named Credentials + External Credentials
This is the secure, modern way. Your API key sits in Salesforce’s External Credential (a secret vault), permissions control who can use it, and code/flows reference a Named Credential. No hardcoded keys; rotation is easy.
Step A: Create External Credential + Permission Set mapping
- Setup → Named Credentials → External Credentials → New
- Label: OpenAI_Key
- Authentication Protocol: Custom
- Add a Custom Header with Name: Authorization and Value: Bearer {API_KEY} (the placeholder resolves from a secret Authentication Parameter you define on the principal, not a literal key).
- Under Principals, create one (e.g., OpenAI_Principal) and set the secret parameter value to your API key.
- In a Permission Set, grant External Credential Principal Access mapping users to that principal, so only allowed users/processes can call OpenAI.
Step B: Create Named Credential (base URL + header)
- Setup → Named Credentials → New
- Label/Name: OpenAI
- URL: https://api.openai.com
- Identity Type: Named Principal
- Authentication: Use External Credential → choose OpenAI_Key
- Save.

Now any Apex/Flow can safely reference callout:OpenAI.
Step C: Apex code using the Named Credential
public with sharing class OpenAI_NC_Client {
public static final String MODEL = 'gpt-4o-mini';
public class ChatRequestMessage {
public String role; public String content;
public ChatRequestMessage(String r, String c) { role = r; content = c; }
}
public class ChatRequest {
public String model; public List<ChatRequestMessage> messages; public Decimal temperature;
}
public class ChatResponse {
public List<Choice> choices;
}
// Apex allows only one level of inner classes, so Choice and Message
// must sit directly on the outer class rather than nested inside ChatResponse.
public class Choice {
public Integer index;
public Message message;
}
public class Message { public String role; public String content; }
@AuraEnabled(cacheable=false)
public static String chat(String prompt) {
HttpRequest req = new HttpRequest();
req.setEndpoint('callout:OpenAI/v1/chat/completions');
req.setMethod('POST');
req.setHeader('Content-Type', 'application/json');
ChatRequest cr = new ChatRequest();
cr.model = MODEL; cr.temperature = 0.2;
cr.messages = new List<ChatRequestMessage>{
new ChatRequestMessage('system','You are a helpful assistant.'),
new ChatRequestMessage('user', prompt)
};
req.setBody(JSON.serialize(cr));
Http http = new Http();
HttpResponse res = http.send(req);
if (res.getStatusCode() >= 200 && res.getStatusCode() < 300) {
ChatResponse parsed = (ChatResponse) JSON.deserialize(res.getBody(), ChatResponse.class);
if (parsed != null && parsed.choices != null && !parsed.choices.isEmpty()) {
return parsed.choices[0].message != null ? parsed.choices[0].message.content : '';
}
return '';
}
throw new AuraHandledException('OpenAI error: ' + res.getStatus() + ' — ' + res.getBody());
}
}
Step D: Zero-code Flow HTTP Callout (bonus)
- Create/Select a Flow (Screen Flow or Record-Triggered).
- Add Element → Action → HTTP Callout.
- Choose Named Credential = OpenAI.
- Method: POST, Path: /v1/chat/completions
- Headers: Content-Type: application/json
- Body: Build JSON with a text template (model, messages).
- Map the response body to a text variable and use it in your screen/field update.
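For reference, the text-template body mirrors the payload used earlier; {!PromptText} below stands in for whatever Flow resource holds your prompt:

```json
{
  "model": "gpt-4o-mini",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "{!PromptText}"}
  ],
  "temperature": 0.2
}
```

One caveat: record text merged into the template must be JSON-escaped (quotes, newlines), or the request body will be invalid and you’ll get a 400 back.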
Security notes you must follow
- Never call OpenAI from LWC (client-side) — you’ll leak the key. Always go LWC → Apex → OpenAI.
- Prefer Named Credentials over Remote Site Settings for key storage, rotation, and least privilege.
- Use Permission Set mapping on the External Credential so only approved users/processes can call OpenAI.
- Add governors: timeouts, retries, and Queueable Apex for long prompts to keep UI snappy.
- Log responsibly: don’t log raw prompts containing PII; consider redaction where needed.
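To make the governors bullet concrete: the usual policy is a bounded number of retries with exponential backoff on transient statuses (429, 5xx). In Apex you would run this inside a Queueable; the policy itself is sketched here in Python with an injected send function so it stays self-contained (names are illustrative):

```python
import time

# Statuses worth retrying: rate limits and transient server errors.
RETRYABLE = {429, 500, 502, 503, 504}

def call_with_backoff(send, max_attempts=3, base_delay=1.0, sleep=time.sleep):
    """Retry send() (which returns (status, body)) on transient errors,
    doubling the delay between attempts: base, 2*base, 4*base, ..."""
    for attempt in range(max_attempts):
        status, body = send()
        if status not in RETRYABLE:
            return status, body
        if attempt < max_attempts - 1:
            sleep(base_delay * (2 ** attempt))
    return status, body  # still failing after the final attempt
```

In Apex, each retry still counts against callout limits, so keep the attempt count small and push long-running prompts into async jobs.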
Common errors & quick fixes
- 401/403 Unauthorized: External Credential secret missing/wrong; Named Credential not linked; header not Authorization: Bearer <key>.
- 404 Not Found: wrong endpoint path (it should be /v1/chat/completions).
- 415 Unsupported Media Type: forgot Content-Type: application/json.
- LIMIT_EXCEEDED / Timeout: use Queueable or Continuation; reduce prompt size; consider async processing.
- Mixed content/CORS: you tried calling OpenAI directly from LWC; move the callout to Apex/Flow.
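When any of these come back, the response body usually names the real cause: OpenAI error payloads generally follow the shape {"error": {"message": ..., "type": ..., "code": ...}}, so surface that message rather than dumping the raw body. A small illustrative helper (the function name is an assumption, not part of any SDK):

```python
def describe_openai_error(status_code, body):
    """Extract the human-readable message from an OpenAI error payload.

    Error bodies typically look like:
      {"error": {"message": "...", "type": "...", "code": "..."}}
    Falls back to just the status code when no message is present.
    """
    err = (body or {}).get("error") or {}
    message = err.get("message")
    return f"HTTP {status_code}: {message}" if message else f"HTTP {status_code}"
```

The same idea applies in Apex: deserialize the error body and log error.message, keeping any prompt text out of the log line.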
Conclusion
You now have two working recipes to integrate ChatGPT with Salesforce:
- Remote Site Settings + Apex — fast and clear, but you manage secrets.
- Named Credentials + External Credentials — the recommended secure approach that scales, rotates keys, and keeps secrets out of code.
Start with the method that fits your org today, and plan a quick hop to Named Credentials when you can. The payoff: faster case summaries, smarter email drafts, cleaner data, and happier users — inside Salesforce.
Next steps: Learn with Namaste Salesforce (AI + OpenAI courses)
Want to go from “it works” to “it sells”?
Check out my hands-on training:
- AI for Admins & Builders – Prompt patterns, Flow + HTTP Callout, ethical guardrails, and real case studies.
- OpenAI + Salesforce Integration (Apex + Flows) – Secure Named Credentials, Invocable Apex, async patterns, and LWC UX.
- Conga CPQ + AI Use Cases – Price explanation, quote summaries, proposal drafting, and approval notes with ChatGPT.
👉 Explore the courses on Namaste Salesforce — practical labs, simple language, and real-world projects. Let’s build smart, secure AI experiences that your users will actually love.