How to Use OpenAI’s ChatGPT API with Plain JavaScript and Laravel

How to Talk to OpenAI’s ChatGPT API Using JavaScript: A Quick Guide

If you’re curious about how to send a prompt to OpenAI’s ChatGPT API and get a response back — all using JavaScript — you’re in the right place. Whether you want to build a chatbot, automate some writing tasks, or just experiment, this quick guide will get you started.

What You Will Build

    • A plain HTML/JS frontend that sends a user’s prompt.
    • A Laravel backend that handles the call to OpenAI securely.
    • Clean separation between frontend and backend.

Step 1: Set Up the Laravel Backend

First, create a Laravel project if you don’t have one yet:

composer create-project laravel/laravel chatgpt-api-backend

Then add your OpenAI key to your .env file:
OPENAI_API_KEY=your_openai_api_key_here
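
The controller below reads the key with env() to keep things short. One optional refinement: if you ever run php artisan config:cache, env() calls outside the config files stop seeing values from .env, so you can instead expose the key in config/services.php and read it with config('services.openai.key'). A sketch of the extra entry:

// config/services.php — add alongside the existing entries
'openai' => [
    'key' => env('OPENAI_API_KEY'),
],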

Create a Controller

php artisan make:controller ChatController

Inside app/Http/Controllers/ChatController.php:

<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Http;

class ChatController extends Controller
{
    public function chat(Request $request)
    {
        $prompt = $request->input('prompt');

        // Forward the prompt to OpenAI's completions endpoint.
        // text-davinci-003 is a legacy model; gpt-3.5-turbo-instruct is a drop-in
        // alternative on the same endpoint if the older model is unavailable to you.
        $response = Http::withToken(env('OPENAI_API_KEY'))
            ->post('https://api.openai.com/v1/completions', [
                'model' => 'text-davinci-003',
                'prompt' => $prompt,
                'max_tokens' => 100,
                'temperature' => 0.7,
                'n' => 1,
            ]);

        // Pass OpenAI's JSON response straight through to the frontend.
        return response()->json($response->json());
    }
}
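
This is the bare minimum. If you want the endpoint to be a bit more defensive, here is one possible variant of the same method — a sketch only; the validation rules (such as the max:2000 length cap) and the 502 status are my choices, not anything the API requires:

public function chat(Request $request)
{
    // Reject empty or oversized prompts before spending tokens on them.
    $validated = $request->validate([
        'prompt' => 'required|string|max:2000',
    ]);

    $response = Http::withToken(env('OPENAI_API_KEY'))
        ->post('https://api.openai.com/v1/completions', [
            'model' => 'text-davinci-003',
            'prompt' => $validated['prompt'],
            'max_tokens' => 100,
            'temperature' => 0.7,
            'n' => 1,
        ]);

    // Surface upstream failures instead of passing an error body through as a 200.
    if ($response->failed()) {
        return response()->json(['error' => 'OpenAI request failed'], 502);
    }

    return response()->json($response->json());
}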

The parameters above are defined as follows:

  • model: Specifies the model to use for generating the completion.
  • prompt: The text string that serves as the starting point or instruction for the model.
  • max_tokens: Specifies the maximum length of the generated completion. In this case, it is set to 100 tokens, which determines the length of the reply.
  • temperature: Controls the randomness of the output. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic. Here, the temperature is set to 0.7.
  • n: Determines the number of completions to generate. In this case, we request only one completion.
  • stop: An optional string or array of strings that indicates where the model should stop generating text. If you leave it out (as we do here) or set it to null, there is no predefined stopping condition, and the model generates text until it reaches the max_tokens limit.
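
If you do want the model to cut itself off early, you can pass stop explicitly. A sketch of the same request with a couple of hypothetical stop sequences:

$response = Http::withToken(env('OPENAI_API_KEY'))
    ->post('https://api.openai.com/v1/completions', [
        'model' => 'text-davinci-003',
        'prompt' => $prompt,
        'max_tokens' => 100,
        'temperature' => 0.7,
        'n' => 1,
        // Hypothetical stop sequences: stop at a blank line or when the model
        // starts writing a new "User:" turn.
        'stop' => ["\n\n", "User:"],
    ]);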

Add a Route

In routes/api.php (if your Laravel version doesn’t ship this file by default, php artisan install:api creates it):

use App\Http\Controllers\ChatController;
Route::post('/chat', [ChatController::class, 'chat']);

Now our backend is ready at http://your-domain/api/chat.
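
If you want to smoke-test it before writing any frontend code, one option is php artisan tinker (with php artisan serve running in another terminal; localhost:8000 below is just the default serve address):

// Paste into `php artisan tinker`: call our own /api/chat route and dump the decoded JSON reply.
Http::post('http://localhost:8000/api/chat', ['prompt' => 'Say hello'])->json();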

Step 2: Create the Frontend (Plain JavaScript)

Let’s create a simple index.html that we can save and open in the browser:

<!DOCTYPE html>
<html>
<head>
  <title>ChatGPT with Laravel Backend</title>
</head>
<body>
  <h1>Ask ChatGPT</h1>
  <textarea id="prompt" rows="4" cols="50" placeholder="Type your prompt…"></textarea><br>
  <button onclick="sendPrompt()">Send</button>
  <pre id="response"></pre>

  <script>
    async function sendPrompt() {
      const prompt = document.getElementById('prompt').value;

      // Send the prompt to our Laravel backend rather than to OpenAI directly,
      // so the API key never reaches the browser.
      const response = await fetch('http://localhost:8000/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt })
      });

      const data = await response.json();
      document.getElementById('response').textContent = data.choices?.[0]?.text?.trim() || 'No response';
    }
  </script>
</body>
</html>

Don’t forget to run your Laravel backend using php artisan serve.

Optional: Allow CORS (Frontend on Different Domain)

If your frontend is hosted separately (like on a different port or domain), you’ll need to allow cross-origin requests.

Recent Laravel versions ship CORS handling as part of the framework (the HandleCors middleware), and Laravel 7–8 bundled it via the fruitcake/laravel-cors package, so there is usually nothing extra to install. Only if your project is missing it do you need:

composer require fruitcake/laravel-cors

Either way, the place to allow your frontend origin (or * for everything during development) is config/cors.php:

'paths' => ['api/*'],
'allowed_methods' => ['*'],
'allowed_origins' => ['http://localhost:5500'], // or your frontend origin
'allowed_headers' => ['*'],
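
If your frontend origin keeps changing during local development, you can temporarily open CORS up entirely. A sketch of a wide-open config/cors.php (development only; tighten allowed_origins before deploying):

// config/cors.php — permissive settings for local development only
return [
    'paths' => ['api/*'],
    'allowed_methods' => ['*'],
    'allowed_origins' => ['*'],          // any origin may call the API; restrict in production
    'allowed_origins_patterns' => [],
    'allowed_headers' => ['*'],
    'exposed_headers' => [],
    'max_age' => 0,
    'supports_credentials' => false,
];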

Summary

  • Laravel securely handles your OpenAI API key.
  • Your frontend stays clean and simple with plain HTML + JS.
  • This approach keeps your key safe while keeping development smooth.

Hope this was fun 🙂
