The three AI tools that power the Argument Analysis Tool
The Argument Analysis Tool exposes three AI tools that flows use to gather external data. Each tool is defined with `ai.defineTool()` and can be invoked autonomously by the AI during flow execution.
```typescript
// Assumed imports: `getJson` is from the `serpapi` npm package;
// `ai` is the project's configured Genkit instance.
import { getJson } from 'serpapi';

export const webSearch = ai.defineTool(
  {
    name: 'webSearch',
    description:
      'Searches the web for a given query and returns a list of search results, including organic results, news, and academic papers to get a comprehensive overview.',
    inputSchema: WebSearchInputSchema,
    outputSchema: WebSearchOutputSchema,
  },
  async (input) => {
    if (
      !process.env.SERPAPI_API_KEY ||
      process.env.SERPAPI_API_KEY === 'YOUR_API_KEY_HERE'
    ) {
      throw new Error(
        'SERPAPI_API_KEY is not configured. Please add it to your .env file.'
      );
    }
    console.log(`Performing real web search for: ${input.query}`);
    try {
      const response = await getJson({
        api_key: process.env.SERPAPI_API_KEY,
        q: input.query,
        engine: 'google',
        location: 'United States',
      });
      const organicResults = (response.organic_results || []).map((result) => ({
        title: result.title,
        link: result.link,
        snippet: result.snippet,
      }));
      if (organicResults.length === 0) {
        throw new Error('No web search results found for the query.');
      }
      // Limit to top 5-7 results to keep context focused
      return organicResults.slice(0, 7);
    } catch (error: any) {
      console.error('Error performing web search with SerpApi:', error);
      throw new Error(`SerpApi search failed: ${error.message}`);
    }
  }
);
```
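The handler's post-processing (mapping SerpApi organic results to the output shape, rejecting empty result sets, and capping the list) can be isolated as a pure helper. This is a sketch for illustration; the names `topOrganicResults`, `RawOrganicResult`, and `SearchResult` are hypothetical and do not appear in the source.

```typescript
// Hypothetical helper isolating the webSearch handler's post-processing.
interface RawOrganicResult {
  title?: string;
  link?: string;
  snippet?: string;
}

interface SearchResult {
  title?: string;
  link?: string;
  snippet?: string;
}

function topOrganicResults(
  raw: RawOrganicResult[] | undefined,
  limit = 7
): SearchResult[] {
  // Map raw results to the fields the output schema expects.
  const results = (raw ?? []).map((r) => ({
    title: r.title,
    link: r.link,
    snippet: r.snippet,
  }));
  if (results.length === 0) {
    throw new Error('No web search results found for the query.');
  }
  // Cap the list so the AI context stays focused.
  return results.slice(0, limit);
}
```

Throwing on an empty result set (rather than returning `[]`) surfaces a tool error to the model, which can then retry with a different query.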
From src/ai/flows/generate-argument-blueprint.ts:72:
```typescript
const mainAnalysisPrompt = ai.definePrompt({
  name: 'mainAnalysisPrompt',
  tools: [webSearch], // ← Tool available to AI
  system: `...
**Execution Process:**
1. Analyze Input
2. Comprehensive Web Search using the \`webSearch\` tool with the provided \`searchQuery\` to find credible opposing viewpoints
3. Synthesize information from MULTIPLE diverse, high-authority sources
...
`,
});
```
The AI autonomously calls `webSearch({ query: searchQuery })` during execution.
```typescript
const WebScraperInputSchema = z.object({
  url: z.string().url().describe('The URL of the webpage to scrape.'),
});

const WebScraperOutputSchema = z
  .string()
  .describe('The extracted textual content of the webpage.');
```
```typescript
// Assumed imports: `getHtml` is from the `serpapi` npm package,
// `JSDOM` from `jsdom`; `ai` is the project's configured Genkit instance.
import { getHtml } from 'serpapi';
import { JSDOM } from 'jsdom';

export const webScraper = ai.defineTool(
  {
    name: 'webScraper',
    description:
      'Uses a headless browser via SerpApi to fetch the full HTML content of a given URL, then returns its main textual content. Use this to read the content of an article or webpage provided by the user, especially for modern, JS-heavy sites.',
    inputSchema: WebScraperInputSchema,
    outputSchema: WebScraperOutputSchema,
  },
  async (input) => {
    if (
      !process.env.SERPAPI_API_KEY ||
      process.env.SERPAPI_API_KEY === 'YOUR_API_KEY_HERE'
    ) {
      throw new Error('SERPAPI_API_KEY is not configured for the web scraper.');
    }
    console.log(`Scraping URL with SerpApi: ${input.url}`);
    try {
      const html = await getHtml({
        api_key: process.env.SERPAPI_API_KEY,
        url: input.url,
      });
      if (!html) {
        throw new Error('SerpApi returned no HTML content.');
      }
      const dom = new JSDOM(html);
      const doc = dom.window.document;
      // Remove non-content elements for cleaner text
      doc
        .querySelectorAll(
          'script, style, nav, footer, header, aside, form, button, ' +
            '[role="navigation"], [role="banner"], [role="contentinfo"]'
        )
        .forEach((el) => el.remove());
      // Try to find the main content area
      const mainContent = doc.querySelector(
        'main, article, #content, #main, .post, .entry-content, .article-body'
      );
      const textSource = mainContent || doc.body;
      const text = textSource.textContent || '';
      // Clean up whitespace
      const cleanedText = text.replace(/\s\s+/g, ' ').trim();
      if (!cleanedText) {
        throw new Error('Could not extract meaningful content from the page.');
      }
      console.log(
        `SerpApi scraping successful, content length: ${cleanedText.length}`
      );
      // Return a reasonable amount of content to avoid oversized AI context
      return cleanedText.substring(0, 15000);
    } catch (error: any) {
      console.error('Error in webScraper tool with SerpApi:', error);
      throw new Error(`Web scraper failed: ${error.message}`);
    }
  }
);
```
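The final text-cleanup step (collapse whitespace runs, reject empty output, truncate to 15,000 characters) is pure string logic and can be sketched standalone. The helper name `cleanExtractedText` is illustrative, not from the source.

```typescript
// Illustrative helper mirroring the scraper's cleanup: collapse runs of
// two or more whitespace characters, fail if nothing remains, and
// truncate so the scraped page cannot blow up the AI context window.
function cleanExtractedText(text: string, maxLength = 15000): string {
  const cleaned = text.replace(/\s\s+/g, ' ').trim();
  if (!cleaned) {
    throw new Error('Could not extract meaningful content from the page.');
  }
  return cleaned.substring(0, maxLength);
}
```

Note that single whitespace characters (including lone newlines) are preserved by `/\s\s+/g`; only runs of two or more are collapsed.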
Although this tool is not wired into any active flow, the AI can invoke it whenever it appears in a prompt's tools array:
```typescript
// In a prompt that has webScraper in its tools array:
system: `If the user provides a URL, use the webScraper tool to fetch its content before analysis.`
```
```typescript
const TwitterSearchInputSchema = z.object({
  query: z
    .string()
    .describe(
      'The search query for X/Twitter. Exclude hashtags or "from:" filters, just provide the keywords.'
    ),
});

const TweetAuthorSchema = z.object({
  name: z.string().describe("The author's display name."),
  username: z.string().describe("The author's unique username/handle."),
  profile_image_url: z
    .string()
    .url()
    .describe("URL to the author's profile picture."),
});

const PublicMetricsSchema = z.object({
  retweet_count: z.number(),
  reply_count: z.number(),
  like_count: z.number(),
  impression_count: z.number(),
});

const TweetResultSchema = z.object({
  id: z.string().describe('The unique ID of the tweet.'),
  text: z.string().describe('The full text content of the tweet.'),
  author: TweetAuthorSchema,
  created_at: z.string().describe('The date and time the tweet was created.'),
  public_metrics: PublicMetricsSchema.describe(
    'Engagement metrics for the tweet.'
  ),
});

const TwitterSearchOutputSchema = z.array(TweetResultSchema);
```
As with the other tools, the API key is validated before any request is made:

```typescript
if (!process.env.API_KEY || process.env.API_KEY === 'YOUR_API_KEY_HERE') {
  throw new Error('API_KEY is not configured. Please add it to your .env file.');
}
```
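All three tools repeat the same guard against a missing key or an unedited placeholder value. It could be factored into a shared helper; `assertApiKeyConfigured` is a hypothetical name, not part of the source.

```typescript
// Hypothetical shared guard mirroring the per-tool checks: reject a
// missing key or the unedited 'YOUR_API_KEY_HERE' placeholder, and
// return the validated value so callers can use it directly.
function assertApiKeyConfigured(
  value: string | undefined,
  name: string
): string {
  if (!value || value === 'YOUR_API_KEY_HERE') {
    throw new Error(`${name} is not configured. Please add it to your .env file.`);
  }
  return value;
}
```

A tool handler would then start with something like `const key = assertApiKeyConfigured(process.env.SERPAPI_API_KEY, 'SERPAPI_API_KEY');`.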
Write detailed tool descriptions that help the AI understand when to use each tool:
```typescript
description:
  'Searches the web for a given query and returns a list of search results, including organic results, news, and academic papers to get a comprehensive overview.',
```
The AI uses these descriptions to autonomously decide when to invoke tools.