Overview

The prediction market agent supports multiple search engines for gathering information. Search tools are provided by the prediction_market_agent_tooling library and integrated into various agents.

tavily_search

Primary search function using Tavily's AI-powered search API, optimized for research tasks and agent workflows. Location: prediction_market_agent_tooling.tools.tavily.tavily_search
query
str
required
The search query string
search_depth
str
default:"basic"
Search depth level: "basic" or "advanced"
  • basic: Faster, fewer results
  • advanced: More thorough, includes additional sources
max_results
int
default:5
Maximum number of search results to return
return
TavilyResponse
Response object containing search results. Structure:
  • results (list): List of search result objects
    • title (str): Page title
    • url (str): Page URL
    • content (str): Relevant content snippet
    • score (float): Relevance score
from prediction_market_agent_tooling.tools.tavily.tavily_search import tavily_search

response = tavily_search(
    query="Will Bitcoin reach $100k by 2025?",
    search_depth="basic",
    max_results=5
)

for result in response.results:
    print(f"{result.title}: {result.url}")
Tavily search requires the TAVILY_API_KEY environment variable to be set.

TavilyResponse

Response model for Tavily search results. Location: prediction_market_agent_tooling.tools.tavily.tavily_models
results
list[TavilyResult]
List of search result objects
query
str
The original search query

TavilyResult

Individual search result object.
title
str
Page title
url
str
Page URL
content
str
Relevant content excerpt from the page
score
float
Relevance score (0.0 to 1.0)
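The response shape can be illustrated with a small stand-in model. This is a sketch for illustration only; the library's actual classes live in tavily_models and may differ in detail, and top_results is not a library function:

```python
from dataclasses import dataclass, field

@dataclass
class TavilyResult:
    title: str
    url: str
    content: str
    score: float  # relevance, 0.0 to 1.0

@dataclass
class TavilyResponse:
    query: str
    results: list[TavilyResult] = field(default_factory=list)

def top_results(response: TavilyResponse, n: int = 3) -> list[TavilyResult]:
    """Return the n most relevant results, highest score first."""
    return sorted(response.results, key=lambda r: r.score, reverse=True)[:n]
```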

search_google

Google search integration using the tooling library. Location: prediction_market_agent_tooling.tools.google
query
str
required
The search query
return
list[str]
List of URLs from search results
from prediction_market_agent_tooling.tools.google import search_google

results = search_google("prediction markets 2025")
for url in results:
    print(url)

GoogleSearchTool

Function calling wrapper for Google search. Location: prediction_market_agent.tools.web_search.google

Schema

search_google_schema = {
    "type": "function",
    "function": {
        "name": "search_google",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "The google search query.",
                }
            },
            "required": ["query"],
        },
        "description": "Google search to return search results from a query.",
    },
}

Usage

from prediction_market_agent.tools.web_search.google import GoogleSearchTool

tool = GoogleSearchTool()
results = tool.fn(query="prediction markets")
print(tool.schema)

Search in Agents

Think Thoroughly Agent

Integrates Tavily search as a LangChain tool:
from prediction_market_agent_tooling.tools.tavily.tavily_search import tavily_search
from langchain.tools import tool
from langgraph.prebuilt import create_react_agent

@tool("tavily_search_tool")
def tavily_search_tool(query: str) -> list[dict[str, str]]:
    """Search the web for information on a topic."""
    output = tavily_search(query=query)
    return [
        {
            "title": result.title,
            "url": result.url,
            "content": result.content,
        }
        for result in output.results
    ]

# Use in agent
agent = create_react_agent(
    model=llm,
    tools=[tavily_search_tool],
    state_modifier=system_message,
)

Prophet Research Integration

Search is integrated into the research workflow:
from prediction_market_agent.tools.prediction_prophet.research import prophet_research
from pydantic_ai import Agent

research = prophet_research(
    goal="Will Bitcoin reach $100k?",
    agent=Agent(...),
    openai_api_key=api_keys.openai_api_key,
    tavily_api_key=api_keys.tavily_api_key,
    subqueries_limit=4,
    max_results_per_search=5,
    min_scraped_sites=10,
)

print(research.report)

Configuration

Environment Variables

TAVILY_API_KEY
str
required
API key for the Tavily search service. Get your key at tavily.com
GOOGLE_API_KEY
str
Google Custom Search API key (if using Google search)
GOOGLE_SEARCH_ENGINE_ID
str
Google Custom Search Engine ID

API Keys Class

from prediction_market_agent.utils import APIKeys

keys = APIKeys()
tavily_key = keys.tavily_api_key  # Returns SecretStr
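SecretStr wraps the key so it is masked in logs and reprs; call get_secret_value() to unwrap it where the raw string is actually needed. The class below is a minimal stand-in written for illustration (pydantic's real SecretStr exposes the same get_secret_value() method):

```python
class SecretStr:
    """Minimal stand-in for pydantic's SecretStr (illustration only)."""

    def __init__(self, value: str):
        self._value = value

    def __repr__(self) -> str:
        return "SecretStr('**********')"  # never reveal the wrapped value

    def get_secret_value(self) -> str:
        return self._value

key = SecretStr("tvly-abc123")
print(key)                     # masked representation
print(key.get_secret_value())  # raw value, only where actually needed
```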

Search Strategies

Basic Search Strategy

1

Generate Query

Use an LLM to generate a targeted search query from the market question
2

Execute Search

Call tavily_search with appropriate parameters
3

Filter Results

Remove duplicate URLs and irrelevant sources
4

Scrape Content

Use web scraping tools to extract content from result URLs
5

Analyze

Feed the scraped content to an LLM for analysis
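The five steps above can be sketched as a single pipeline. Every callable here is injected by the caller; the names are placeholders, not functions from the tooling library:

```python
def basic_search_pipeline(question, generate_query, search, scrape, analyze, max_results=5):
    """Sketch of the basic strategy: query -> search -> filter -> scrape -> analyze."""
    query = generate_query(question)                  # 1. LLM-generated query
    results = search(query, max_results=max_results)  # 2. execute search
    seen: set[str] = set()
    filtered = []
    for r in results:                                 # 3. drop duplicate URLs
        if r["url"] not in seen:
            seen.add(r["url"])
            filtered.append(r)
    contents = [scrape(r["url"]) for r in filtered]   # 4. scrape each page
    return analyze(question, contents)                # 5. LLM analysis
```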

Advanced Research Strategy

from prediction_market_agent.tools.prediction_prophet.research import prophet_research
from prediction_market_agent_tooling.tools.openai_utils import get_openai_provider
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.settings import ModelSettings

# Configure research agent
research_agent = Agent(
    OpenAIModel(
        "gpt-4o-2024-08-06",
        provider=get_openai_provider(api_key=api_keys.openai_api_key),
    ),
    model_settings=ModelSettings(temperature=0.7),
)

# Perform thorough research
research = prophet_research(
    goal=market.question,
    agent=research_agent,
    openai_api_key=api_keys.openai_api_key,
    tavily_api_key=api_keys.tavily_api_key,
    initial_subqueries_limit=20,
    subqueries_limit=4,
    max_results_per_search=5,
    min_scraped_sites=10,
)

Search Depth Comparison

Basic Search

Use for:
  • Quick lookups
  • Simple queries
  • Cost optimization
  • Real-time agent responses
Characteristics:
  • Faster execution
  • Lower cost
  • Fewer sources
  • Good for straightforward questions

Advanced Search

Use for:
  • Complex research
  • Important predictions
  • Multi-source verification
  • Deep analysis
Characteristics:
  • Slower execution
  • Higher cost
  • More comprehensive
  • Better for nuanced questions
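One way to act on this trade-off is a small heuristic that picks the depth per query. The thresholds below (word count, compound questions) are illustrative assumptions, not library behavior:

```python
def choose_search_depth(query: str, *, important: bool = False) -> str:
    """Pick a Tavily search_depth: "advanced" for important or complex queries."""
    complex_query = len(query.split()) > 8 or " and " in query.lower()
    return "advanced" if important or complex_query else "basic"
```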

Error Handling

Search functions can fail due to:
  • Invalid API keys
  • Rate limiting
  • Network errors
  • No results found
from prediction_market_agent_tooling.tools.tavily.tavily_search import tavily_search

def safe_search(query: str, max_results: int = 5):
    try:
        response = tavily_search(query=query, max_results=max_results)
    except Exception as e:
        print(f"Search failed: {e}")
        return None
    if not response.results:
        print("No results found")
        return None
    return response

Best Practices

Cost Management

  • Use basic search for most queries
  • Reserve advanced search for critical predictions
  • Cache search results when possible
  • Implement rate limiting
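Caching can be as simple as an in-memory dict keyed by the search parameters. SearchCache is an illustrative sketch, not part of the library; in production you would likely add expiry, since search results go stale:

```python
class SearchCache:
    """In-memory cache keyed by search parameters (illustrative sketch)."""

    def __init__(self, search_fn):
        self._search_fn = search_fn  # e.g. tavily_search
        self._cache = {}
        self.hits = 0

    def search(self, query: str, search_depth: str = "basic", max_results: int = 5):
        key = (query, search_depth, max_results)
        if key in self._cache:
            self.hits += 1  # served from cache, no API call
        else:
            self._cache[key] = self._search_fn(
                query=query, search_depth=search_depth, max_results=max_results
            )
        return self._cache[key]
```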

Query Optimization

  • Use LLM to generate targeted queries
  • Include date ranges for time-sensitive questions
  • Filter out prediction market URLs to avoid circular references
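Filtering out prediction-market domains might look like this. The blocklist entries are examples chosen for illustration, not an official or exhaustive list:

```python
from urllib.parse import urlparse

# Example domains only; extend with whichever markets you want to exclude.
BLOCKED_DOMAINS = {"polymarket.com", "manifold.markets", "metaculus.com"}

def filter_market_urls(urls: list[str]) -> list[str]:
    """Drop URLs hosted on prediction-market sites to avoid circular evidence."""
    kept = []
    for url in urls:
        host = urlparse(url).netloc.lower().removeprefix("www.")
        if host not in BLOCKED_DOMAINS:
            kept.append(url)
    return kept
```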

Result Processing

  • Deduplicate URLs across searches
  • Track previously scraped URLs
  • Validate URLs before scraping
  • Handle failed scrapes gracefully
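Deduplicating across searches and tracking previously scraped URLs can be done with a small set-backed helper (ScrapeTracker is illustrative, not a library class):

```python
class ScrapeTracker:
    """Remember which URLs were already scraped across multiple searches."""

    def __init__(self):
        self._seen: set[str] = set()

    def new_urls(self, urls: list[str]) -> list[str]:
        """Return only URLs not seen before, and mark them as seen."""
        fresh = [u for u in urls if u not in self._seen]
        self._seen.update(fresh)
        return fresh
```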

Performance

  • Limit max_results based on needs
  • Use parallel scraping when possible
  • Implement timeouts for slow sources
  • Cache frequently accessed results
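Parallel scraping with per-URL timeouts can be sketched with the standard library's ThreadPoolExecutor; scrape_fn is a placeholder for whatever scraper is in use:

```python
from concurrent.futures import ThreadPoolExecutor

def scrape_all(urls, scrape_fn, timeout_per_url=10.0, max_workers=8):
    """Scrape URLs in parallel; slow or failing scrapes yield None."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {url: pool.submit(scrape_fn, url) for url in urls}
        for url, fut in futures.items():
            try:
                results[url] = fut.result(timeout=timeout_per_url)
            except Exception:
                results[url] = None  # handle failed scrapes gracefully
    return results
```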

Dependencies

pip install tavily-python httpx
