The GemAI chat interface provides a clean, intuitive messaging experience powered by Google’s Gemini AI models. Built with Jetpack Compose, the interface delivers real-time streaming responses with visual feedback for message states.

Message structure

Each message in GemAI follows a well-defined data model that tracks essential information:
Message.kt
data class Message(
    val id: Long = 0,
    val conversationId: Long,
    val timestamp: Long,
    val content: String,
    val participant: Participant,
    val status: MessageStatus,
)

Message participants

Messages are categorized by participant type, which determines how they’re displayed in the UI:
User messages represent input from you. They’re created using the Message.send() factory method:
Message.send(conversationId: Long, content: String): Message
User messages start with MessageStatus.LOADING status while being processed.
The Participant enum has two values: USER and MODEL. Each participant has a corresponding role property that maps to Gemini API requirements ("user" or "model").
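Based on that description, the enum can be sketched roughly like this (the exact declaration in GemAI may differ):

```kotlin
// Sketch of the Participant enum described above; the role property
// carries the string the Gemini API expects for each side of the chat.
enum class Participant(val role: String) {
    USER("user"),   // messages typed by the person using the app
    MODEL("model"), // responses produced by the Gemini model
}

fun main() {
    // The role strings map directly onto the Gemini API's content roles.
    println(Participant.USER.role)  // user
    println(Participant.MODEL.role) // model
}
```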

Message status indicators

The interface provides visual feedback through four distinct message states:
  • LOADING - Displayed while your message is being sent to the AI
  • SENT - Indicates your message was successfully delivered
  • FAILED - Shows when message delivery encountered an error
  • RECEIVED - Marks AI responses that have been successfully received
You can update message content and status dynamically using the update() method:
message.update(content = "Updated text", status = MessageStatus.SENT)
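The source doesn't show how update() is implemented; one plausible sketch, assuming Message is an immutable data class, is a copy-based update where unspecified fields keep their current values:

```kotlin
enum class MessageStatus { LOADING, SENT, FAILED, RECEIVED }

// Minimal stand-in with just the fields update() touches.
data class Message(
    val content: String,
    val status: MessageStatus,
) {
    // Sketch: update() as a copy-based mutation, so the original Message
    // value stays untouched. Passing null leaves a field unchanged.
    fun update(
        content: String? = null,
        status: MessageStatus? = null,
    ): Message = copy(
        content = content ?: this.content,
        status = status ?: this.status,
    )
}

fun main() {
    val original = Message(content = "Hello", status = MessageStatus.LOADING)
    val sent = original.update(status = MessageStatus.SENT)
    println(sent.status)      // SENT
    println(original.status)  // LOADING (the original is unchanged)
}
```

Returning a new copy rather than mutating in place fits well with Compose, which recomposes when a new state value is observed.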

Message composition

When you compose a new message, the application creates it with automatic timestamp generation:
Example: Creating a user message
val userMessage = Message.send(
    conversationId = 123,
    content = "What is Clean Architecture?"
)
// Result: Message with participant=USER, status=LOADING
All timestamps are generated with System.currentTimeMillis() so messages can be ordered chronologically in the chat history.
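Putting the pieces together, the send() and receive() factories might look roughly like this. This is a sketch inferred from the behavior described above, not the actual GemAI source:

```kotlin
enum class Participant { USER, MODEL }
enum class MessageStatus { LOADING, SENT, FAILED, RECEIVED }

data class Message(
    val id: Long = 0,
    val conversationId: Long,
    val timestamp: Long,
    val content: String,
    val participant: Participant,
    val status: MessageStatus,
) {
    companion object {
        // User messages start in LOADING while the send is in flight;
        // the timestamp is captured at creation time.
        fun send(conversationId: Long, content: String) = Message(
            conversationId = conversationId,
            timestamp = System.currentTimeMillis(),
            content = content,
            participant = Participant.USER,
            status = MessageStatus.LOADING,
        )

        // AI messages carry an id assigned earlier (e.g. by the database);
        // status handling during streaming may differ in the real app.
        fun receive(id: Long, conversationId: Long, content: String) = Message(
            id = id,
            conversationId = conversationId,
            timestamp = System.currentTimeMillis(),
            content = content,
            participant = Participant.MODEL,
            status = MessageStatus.RECEIVED,
        )
    }
}
```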

Message display patterns

The chat interface handles different message scenarios with specific patterns:

Receiving streamed responses

When the AI responds, messages can be created incrementally as content streams in:
Example: Receiving AI message
val aiMessage = Message.receive(
    id = messageId,
    conversationId = 123,
    content = streamedContent
)
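One way to picture the incremental flow is below. This is a simplified sketch: the real app would collect a kotlinx.coroutines Flow of chunks from the Gemini SDK, and a plain list of chunks stands in for that stream here:

```kotlin
// Simplified streaming sketch: each chunk from the model appends to the
// message content, and a fresh snapshot is emitted to the UI after every
// chunk, so the response appears to "type itself out".
data class AiMessage(val id: Long, val conversationId: Long, val content: String)

fun accumulate(id: Long, conversationId: Long, chunks: List<String>): List<AiMessage> {
    val snapshots = mutableListOf<AiMessage>()
    var content = ""
    for (chunk in chunks) {
        content += chunk
        // Each snapshot is what Message.receive(...) would produce mid-stream.
        snapshots += AiMessage(id, conversationId, content)
    }
    return snapshots
}

fun main() {
    val snapshots = accumulate(7, 123, listOf("Clean ", "Architecture ", "separates concerns."))
    snapshots.forEach { println(it.content) }
    // The final snapshot holds the complete response text.
}
```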

Foreign key relationships

Messages maintain referential integrity through foreign key constraints in the Room database:
MessageEntity structure
@Entity(
    tableName = "messages",
    foreignKeys = [
        ForeignKey(
            entity = ConversationEntity::class,
            parentColumns = ["id"],
            childColumns = ["conversationId"],
            onDelete = ForeignKey.CASCADE
        )
    ]
)
data class MessageEntity(
    @PrimaryKey(autoGenerate = true) val id: Long = 0,
    // Remaining columns mirror the Message data model shown above.
    val conversationId: Long,
    val timestamp: Long,
    val content: String,
    val participant: Participant,
    val status: MessageStatus,
)
When you delete a conversation, all associated messages are automatically removed through cascade deletion.

User interaction flow

The typical interaction pattern in the chat interface follows these steps:
  1. Compose - You type a message in the input field
  2. Send - The message is created with LOADING status
  3. Process - Your message is sent to the Gemini API via the repository layer
  4. Stream - AI response is received incrementally and displayed in real-time
  5. Complete - Both messages are marked with final status (SENT and RECEIVED)
For example, step 2 creates the outgoing message:
val message = Message.send(
    conversationId = conversationId,
    content = userInput
)
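The five steps above can be sketched end to end. This is a deliberately Android-free simplification: fakeModelReply stands in for the repository layer's call to the Gemini API:

```kotlin
enum class Participant { USER, MODEL }
enum class MessageStatus { LOADING, SENT, FAILED, RECEIVED }

data class Message(
    val participant: Participant,
    val content: String,
    val status: MessageStatus,
)

// Sketch of steps 2-5: create the message (LOADING), mark it delivered
// (SENT), assemble the streamed reply, and finish with both messages in
// their terminal states (SENT and RECEIVED).
fun sendAndReceive(userInput: String, fakeModelReply: List<String>): Pair<Message, Message> {
    var user = Message(Participant.USER, userInput, MessageStatus.LOADING) // step 2
    user = user.copy(status = MessageStatus.SENT)                          // step 3
    val replyText = fakeModelReply.joinToString("")                        // step 4
    val ai = Message(Participant.MODEL, replyText, MessageStatus.RECEIVED) // step 5
    return user to ai
}
```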

Error handling in the UI

When message delivery fails, the status is updated to FAILED, providing visual feedback:
message.update(status = MessageStatus.FAILED)
You can retry failed messages by resubmitting them through the SendMessageUseCase, which handles the complete message lifecycle.
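A retry could then reset the message to LOADING before resubmitting it. The helper below is hypothetical (the real path goes through SendMessageUseCase, whose signature the source doesn't show):

```kotlin
enum class MessageStatus { LOADING, SENT, FAILED, RECEIVED }
data class Message(val content: String, val status: MessageStatus)

// Hypothetical retry helper: only FAILED messages are eligible, and a
// retry puts the message back into LOADING before resubmission.
fun retry(message: Message, resubmit: (Message) -> Unit): Message? {
    if (message.status != MessageStatus.FAILED) return null
    val retried = message.copy(status = MessageStatus.LOADING)
    resubmit(retried) // in GemAI, this would invoke SendMessageUseCase
    return retried
}
```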

Startup prompts

GemAI provides smart startup prompts to help you begin conversations quickly. These pre-built suggestions appear when you first launch the app or start a new conversation.

Prompt structure

Each startup prompt includes text, an icon, and creation timestamp:
StartUpPrompt.kt
data class StartUpPrompt(
    val id: Long = 0,
    val text: String,
    val icon: PromptIcon = PromptIcon.QUESTION_MARK,
    val createdAt: Long = System.currentTimeMillis(),
)
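Because of the defaults, a prompt only needs its text; the icon and creation timestamp fall back automatically. A self-contained sketch (PromptIcon abridged to two of its values here):

```kotlin
enum class PromptIcon { QUESTION_MARK, CODE } // abridged; GemAI defines more icons

data class StartUpPrompt(
    val id: Long = 0,
    val text: String,
    val icon: PromptIcon = PromptIcon.QUESTION_MARK,
    val createdAt: Long = System.currentTimeMillis(),
)

fun main() {
    // Only text is required; the icon defaults to QUESTION_MARK.
    val prompt = StartUpPrompt(text = "Explain coroutines")
    println(prompt.icon) // QUESTION_MARK
}
```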

Prompt categories

Prompts are organized by icon type to indicate their category. GemAI supports 21 different prompt categories, including:
  • CODE - Coding-related prompts and programming questions
  • LITERATURE - Writing, poetry, and creative content
  • TRANSLATION - Language translation and linguistic prompts
  • SCIENCE - Scientific and technical questions
  • ART - Design and artistic prompts
  • EDUCATION - Learning and teaching
  • HEALTH - Wellness and medical
  • ENTERTAINMENT - Media and fun
  • FINANCE - Business and money
  • TRAVEL - Exploration and trips
  • FOOD - Culinary and recipes
  • FITNESS - Physical activity
  • ENVIRONMENT - Sustainability
  • HISTORY - Historical topics
  • TECHNOLOGY - Gadgets and tech
  • And more…

Default prompts

The app includes four default startup prompts:
PromptEntity.kt
val DefaultPrompts = listOf(
    PromptEntity(text = "Translate text to Spanish", icon = PromptIcon.TRANSLATION),
    PromptEntity(text = "Write a short poem about nature", icon = PromptIcon.LITERATURE),
    PromptEntity(text = "Create a Python function for sorting a list", icon = PromptIcon.CODE),
    PromptEntity(text = "Generate a script for a short play", icon = PromptIcon.LITERATURE),
)

Retrieving prompts

Use the GetPromptsUseCase to fetch available prompts:
GetPromptsUseCase.kt
class GetPromptsUseCase @Inject constructor(
    private val chatRepository: ChatRepository
) : BaseUseCase<Unit, List<StartUpPrompt>> {
    override fun performStreaming(): Flow<List<StartUpPrompt>> {
        return chatRepository.getPrompts()
    }
}
Prompts are stored in the Room database and can be customized by adding new entries to the prompts table.

Using prompts in the UI

When you tap a startup prompt, its text is automatically inserted into the message input field, ready to send to the AI model. This provides a quick way to start meaningful conversations without typing.
You can extend the prompt system by adding custom prompts to the database with your preferred categories and icons.
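The tap-to-fill behavior boils down to copying the prompt text into the input state. A UI-free sketch follows; names like ChatInputState are illustrative, not from the GemAI source:

```kotlin
// Illustrative input-state holder; in the real app this would be Compose
// state held by the chat screen's ViewModel.
class ChatInputState {
    var text: String = ""
        private set

    // Tapping a startup prompt pre-fills the input field, ready to send.
    fun onPromptClick(promptText: String) {
        text = promptText
    }
}

fun main() {
    val input = ChatInputState()
    input.onPromptClick("Write a short poem about nature")
    println(input.text) // Write a short poem about nature
}
```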
