Slack messages are injected into the AI review prompt:
```
SLACK CONTEXT:
#eng-infra (2 days ago) - @alice: "We should add rate limiting to the auth endpoints. 100 req/min per IP."
#eng-infra (2 days ago) - @bob: "Agreed. Use Redis for tracking, it's already deployed."
```
5. AI review validates against team decisions
Claude can now verify:
- Does the PR implement what the team discussed?
- Are there gaps between the Slack discussion and the implementation?
- Should the PR address additional points raised in Slack?
```
# Slack MCP server base URL
SLACK_MCP_URL=https://your-slack-mcp-server.railway.app

# No auth token needed for MCP calls - the Slack bot token is stored in the MCP server
```
Unlike Linear and Sentry integrations, the Slack bot token is stored in the MCP server itself, not passed from Nectr. This simplifies configuration since the bot token has workspace-level access.
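As a rough sketch of the difference, here are the call shapes side by side (the Linear token value is a placeholder, and both dicts are illustrative, not real Nectr config):

```python
# Linear/Sentry style: Nectr passes a per-service token with each MCP call (illustrative)
linear_call = {
    "server_url": "https://linear-mcp.example",
    "auth_token": "example-linear-token",
}

# Slack style: no token crosses the wire; the MCP server already holds the bot token
slack_call = {
    "server_url": "https://your-slack-mcp-server.railway.app",
    "auth_token": None,
}
```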
The `MCPClientManager` doesn't have a dedicated `get_slack_messages()` method yet; use the generic `query_mcp_server()` method instead.

Source: `app/mcp/client.py:118`

Usage:
```python
from app.mcp.client import mcp_client
from app.core.config import settings

if settings.SLACK_MCP_URL:
    slack_messages = await mcp_client.query_mcp_server(
        server_url=settings.SLACK_MCP_URL,
        tool_name="search_messages",
        args={
            "channel": "eng-infra",
            "query": "rate limiting",
            "days": 7,
        },
        auth_token=None,  # Not needed - bot token in MCP server
    )
else:
    slack_messages = []
```
Returns:
```python
[
    {
        "text": "We should add rate limiting to the auth endpoints. 100 req/min per IP.",
        "user": "Alice Johnson",
        "timestamp": "1710086400.123456",
        "channel": "eng-infra",
        "permalink": "https://workspace.slack.com/archives/C01234/p1710086400123456",
    }
]
```
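Note that Slack `timestamp` values are strings holding epoch seconds with a microsecond suffix, so rendering a human-readable date takes one conversion. A minimal sketch using the timestamp from the example above:

```python
from datetime import datetime, timezone

# Slack "ts" values are strings: epoch seconds plus a microsecond suffix
ts = "1710086400.123456"
dt = datetime.fromtimestamp(float(ts), tz=timezone.utc)
label = dt.strftime("%b %d")  # short date label, e.g. for the "(2 days ago)"-style header
```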
For consistency with other integrations, add a dedicated method:
```python
# Add to app/mcp/client.py: MCPClientManager

async def get_slack_messages(
    self,
    channel: str,
    query: str,
    days: int = 7,
) -> list[dict]:
    """Search recent messages in a Slack channel.

    Args:
        channel: Channel name without # (e.g., "eng-infra")
        query: Search query (keywords, phrases)
        days: How many days back to search (default 7)

    Returns:
        List of message dicts: {text, user, timestamp, channel, permalink}.
        Empty list if Slack MCP is not configured or the call fails.
    """
    if not settings.SLACK_MCP_URL:
        logger.info("SLACK_MCP_URL not configured - skipping Slack message fetch")
        return []
    return await self.query_mcp_server(
        server_url=settings.SLACK_MCP_URL,
        tool_name="search_messages",
        args={"channel": channel, "query": query, "days": days},
        auth_token=None,  # Bot token stored in MCP server
    )
```
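With the dedicated method in place, call sites shrink to a single line. A runnable sketch with a stubbed client (the stub class and its canned message are illustrative, not the real `MCPClientManager`):

```python
import asyncio


class StubMCPClient:
    """Illustrative stand-in for MCPClientManager."""

    async def get_slack_messages(self, channel: str, query: str, days: int = 7) -> list[dict]:
        # The real method delegates to query_mcp_server(); return a canned hit here
        return [{
            "text": "We should add rate limiting.",
            "user": "Alice Johnson",
            "timestamp": "1710086400.123456",
            "channel": channel,
        }]


async def main() -> list[dict]:
    client = StubMCPClient()
    return await client.get_slack_messages("eng-infra", "rate limiting", days=14)


messages = asyncio.run(main())
```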
Here’s how Slack context can be pulled during a review:
```python
import re
from datetime import datetime

# 1. Extract channel references from the PR
SLACK_CHANNEL_PATTERN = re.compile(r"#([a-z0-9-]+)")

pr_text = f"{pr_data['title']} {pr_data['body']}"
channels = SLACK_CHANNEL_PATTERN.findall(pr_text)
# Example: "Discussed in #eng-infra" -> ["eng-infra"]

# 2. Build a search query from PR metadata
query_keywords = [
    pr_data["title"],
    pr_data.get("body", "")[:100],  # First 100 chars
]
query = " ".join(q for q in query_keywords if q).strip()

# 3. Pull Slack messages from referenced channels
slack_messages = []
for channel in channels[:2]:  # Limit to 2 channels
    if settings.SLACK_MCP_URL:
        messages = await mcp_client.query_mcp_server(
            server_url=settings.SLACK_MCP_URL,
            tool_name="search_messages",
            args={"channel": channel, "query": query, "days": 14},
        )
        slack_messages.extend(messages)

# 4. Format Slack context for the AI review prompt
slack_context = ""
if slack_messages:
    slack_context = "\n\nSLACK CONTEXT:\n"
    for msg in slack_messages[:5]:  # Limit to the 5 most relevant
        timestamp = datetime.fromtimestamp(float(msg["timestamp"]))
        slack_context += (
            f"#{msg['channel']} ({timestamp.strftime('%b %d')}) - @{msg['user']}: "
            f"\"{msg['text'][:200]}...\"\n"
            f"Permalink: {msg['permalink']}\n"
        )

# 5. Include in the AI review
review_prompt = f"""Review this pull request:

PR Title: {pr_data['title']}
PR Description: {pr_data['body']}
{slack_context}

Diff:
{pr_diff}

If Slack context is present:
1. Verify the PR implements what was discussed in Slack
2. Check for gaps between the Slack discussion and the implementation
3. Flag if the PR should address additional points raised by the team
"""
```
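The channel-extraction step is easy to exercise in isolation; a quick check of the pattern on a sample PR description:

```python
import re

SLACK_CHANNEL_PATTERN = re.compile(r"#([a-z0-9-]+)")

pr_text = "Add rate limiting to auth endpoints. Discussed in #eng-infra and #eng-backend."
channels = SLACK_CHANNEL_PATTERN.findall(pr_text)
# channels == ["eng-infra", "eng-backend"]
```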
Here’s how Slack context appears in an AI-generated review:
```markdown
# PR Review: Add rate limiting to auth endpoints

## Team Discussion Context

💬 **Slack conversation in #eng-infra (2 days ago)**:

@alice: "We should add rate limiting to the auth endpoints. 100 req/min per IP."
[View in Slack](https://workspace.slack.com/archives/C01234/p1710086400123456)

@bob: "Agreed. Use Redis for tracking, it's already deployed. Check out the example in api-gateway."
[View in Slack](https://workspace.slack.com/archives/C01234/p1710086500789012)

## Implementation Analysis

✅ **Matches team discussion**:
- Rate limiting implemented: 100 requests/min per IP ✓
- Using Redis for tracking ✓

⚠️ **Potential gap**:
- @bob mentioned "check out the example in api-gateway"
- Current implementation uses a custom rate limiter
- Consider: Is this consistent with the api-gateway pattern?

## Suggestions

1. **Verify consistency**: Review the api-gateway rate limiter implementation to ensure this follows the same pattern (easier to maintain)
2. **Test with real traffic patterns**: 100 req/min might be too low for authenticated users. Consider tiered limits:
   - Anonymous: 100 req/min
   - Authenticated: 1000 req/min
3. **Document the decision**: Add a comment explaining why Redis was chosen over alternative approaches

## Verdict

APPROVE_WITH_SUGGESTIONS - Implementation aligns with team discussion,
but verify consistency with existing patterns.
```
Automatically determine which channels to search based on PR metadata:
```python
import re


def detect_relevant_channels(
    pr_title: str,
    pr_body: str,
    changed_files: list[str],
) -> list[str]:
    """Detect which Slack channels are relevant for this PR.

    Returns:
        List of channel names (without #)
    """
    channels: list[str] = []

    def add(channel: str) -> None:
        # Preserve insertion order (a set would make the [:3] cap nondeterministic)
        if channel not in channels:
            channels.append(channel)

    # 1. Explicit mentions in the PR
    text = f"{pr_title} {pr_body}"
    for explicit in re.findall(r"#([a-z0-9-]+)", text):
        add(explicit)

    # 2. Infer from changed files
    if any(f.startswith("frontend/") for f in changed_files):
        add("eng-frontend")
    if any(f.startswith("backend/") for f in changed_files):
        add("eng-backend")
    if any(f.endswith(".sql") or "migrations/" in f for f in changed_files):
        add("eng-infra")

    # 3. Infer from PR labels/keywords
    if "security" in text.lower():
        add("eng-security")
    if "performance" in text.lower():
        add("eng-performance")

    return channels[:3]  # Limit to 3 channels
```
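The file-path heuristics above are hard-coded; a team could make them configurable instead. A minimal sketch of a rules-table variant (the mapping and function name below are hypothetical, not part of Nectr):

```python
# Hypothetical prefix -> channel mapping; each team would define its own
CHANNEL_RULES: dict[str, str] = {
    "frontend/": "eng-frontend",
    "backend/": "eng-backend",
}


def infer_channels(changed_files: list[str], rules: dict[str, str] = CHANNEL_RULES) -> list[str]:
    """Return channels whose path prefix matches any changed file, sorted for stability."""
    return sorted({
        channel
        for prefix, channel in rules.items()
        if any(f.startswith(prefix) for f in changed_files)
    })
```

Adding a new team channel then becomes a config change rather than a code change.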