
Overview

This example demonstrates how to create a simple chat completion using the OpenAI Ruby SDK. Chat completions are the foundation of interacting with GPT models for conversational AI applications.

Prerequisites

Before running this example, complete the following steps:

1. Install the SDK

gem install openai

2. Set your API key

Set the OPENAI_API_KEY environment variable:
export OPENAI_API_KEY='your-api-key-here'

Basic Example

Here’s a complete example of creating a basic chat completion:
#!/usr/bin/env ruby
# frozen_string_literal: true
# typed: strong

require "openai" # installed via `gem install openai`

# gets API Key from environment variable `OPENAI_API_KEY`
client = OpenAI::Client.new

begin
  # Non-streaming:
  pp("----- standard request -----")

  completion = client.chat.completions.create(
    model: "gpt-4",
    messages: [
      {
        role: "user",
        content: "Say this is a test"
      }
    ]
  )

  pp(completion.choices.first&.message&.content)
rescue StandardError => e
  warn("Request failed: #{e.message}")
end

Key Concepts

Client Initialization

The SDK automatically reads your API key from the OPENAI_API_KEY environment variable:
client = OpenAI::Client.new
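If the variable is unset, the failure surfaces only when the first request is made. A minimal sketch of failing fast instead; `fetch_api_key` is a hypothetical helper, not part of the SDK:

```ruby
# Hypothetical helper: raise a clear error up front when the key is
# missing, rather than letting the first API call fail deep in the SDK.
def fetch_api_key(env = ENV)
  env.fetch("OPENAI_API_KEY") do
    raise KeyError, "Set the OPENAI_API_KEY environment variable first"
  end
end

# The SDK also accepts the key explicitly:
# client = OpenAI::Client.new(api_key: fetch_api_key)
```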

Creating a Completion

Use the chat.completions.create method to send a request:
completion = client.chat.completions.create(
  model: "gpt-4",
  messages: [
    {
      role: "user",
      content: "Say this is a test"
    }
  ]
)
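Beyond `model` and `messages`, the Chat Completions API accepts tuning parameters such as `temperature` and `max_tokens`. A hedged sketch using a params hash (the values shown are illustrative):

```ruby
params = {
  model: "gpt-4",
  messages: [{ role: "user", content: "Say this is a test" }],
  temperature: 0.2, # lower values make output more deterministic
  max_tokens: 50    # cap on the number of generated tokens
}

# Splat the hash into the same create call shown above:
# completion = client.chat.completions.create(**params)
```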

Accessing the Response

The response contains an array of choices. Each choice has a message with the generated content:
pp(completion.choices.first&.message&.content)
The &. operator is Ruby’s safe navigation operator, which prevents errors if the value is nil.
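The behavior of `&.` is easy to see in isolation. The structs below are stand-ins that mirror the response shape, not SDK classes:

```ruby
# Illustrative stand-ins for the response objects:
Message = Struct.new(:content)
Choice  = Struct.new(:message)

present = Choice.new(Message.new("This is a test."))
missing = Choice.new(nil)

present.message&.content # => "This is a test."
missing.message&.content # => nil (a plain `.` here would raise NoMethodError)
```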

Message Roles

Chat completions support three primary message roles:
  • user - Messages from the end user
  • assistant - Messages from the AI assistant
  • system - Instructions that guide the assistant’s behavior
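The roles above compose into a conversation history passed as the `messages` array. A sketch of a multi-turn exchange (the content strings are illustrative):

```ruby
messages = [
  { role: "system",    content: "You are a terse math tutor." },
  { role: "user",      content: "What is 2 + 2?" },
  { role: "assistant", content: "4." },
  { role: "user",      content: "Now double it." }
]

# The API is stateless: send the full history on each request so the
# model can resolve references like "it" in the last user message.
# completion = client.chat.completions.create(model: "gpt-4", messages: messages)
```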

Next Steps

Streaming Responses

Learn how to stream chat completions for real-time responses

Function Calling

Integrate tools and function calling into your chat completions
