```ruby
require "bundler/setup"
require "openai"

client = OpenAI::Client.new(
  api_key: ENV["OPENAI_API_KEY"] # This is the default and can be omitted
)
```
If you don't pass an `api_key`, the client will automatically use the `OPENAI_API_KEY` environment variable.
## Make your first request
Create a chat completion:
```ruby
completion = client.chat.completions.create(
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4"
)
puts completion.choices[0].message.content
# => "This is a test"
```
You can also stream responses as they’re generated:
```ruby
stream = client.chat.completions.stream_raw(
  model: "gpt-4",
  messages: [
    { role: "user", content: "How do I output all files in a directory using Python?" }
  ]
)

stream.each do |chunk|
  # Skip keep-alive chunks that carry no choices
  next if chunk.choices.to_a.empty?
  print chunk.choices.first&.delta&.content
end
```
For streaming with the Responses API, use `client.responses.stream`, which provides a higher-level interface with event handlers.
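As a rough sketch of what that higher-level interface looks like — the event class name and parameter names below are assumptions, not verified against the gem, so check the SDK's own documentation for the exact interface:

```ruby
# Hypothetical sketch of Responses API streaming; the event class name
# used in the case branch is an assumption -- consult the openai gem
# docs for the actual event types it emits.
stream = client.responses.stream(
  model: "gpt-4",
  input: "Say this is a test"
)

# Unlike stream_raw, which yields raw chunks, the higher-level stream
# yields typed events you can dispatch on.
stream.each do |event|
  case event
  when OpenAI::Streaming::ResponseTextDeltaEvent # assumed class name
    print event.delta
  end
end
```

The typed-event design means you can handle text deltas, tool calls, and completion events in separate branches instead of inspecting raw chunk payloads yourself.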