Overview
Dream Foundry transforms founder ideas into production code through five sequential phases. Each phase has specific responsibilities and outputs that feed into the next.
Phase 1: The Dreamcatcher
Goal: Transform raw founder vision into structured requirements
What Happens
Idea Capture
Founder inputs their raw idea, e.g., “I want a bot that posts AI events to our Discord every week”
Context Extraction
System captures:
- Context: Where the idea came from (CES, customer request, competitive pressure)
- Aspirations: What success looks like
- Constraints: Budget, timeline, tech stack limitations
Output
An IdeaBrief object containing:
- Structured description
- Success criteria
- Hard constraints (time, budget, tech)
- Business context
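One way the IdeaBrief might be modeled as a data structure (a minimal sketch; the field names are illustrative, not the actual schema):

```python
from dataclasses import dataclass


@dataclass
class IdeaBrief:
    """Structured output of the Dreamcatcher phase (illustrative schema)."""
    description: str             # structured description of the idea
    success_criteria: list[str]  # what success looks like
    constraints: dict[str, str]  # hard limits: time, budget, tech
    business_context: str        # where the idea came from


# Example brief for the Discord-events idea from the Overview
brief = IdeaBrief(
    description="Bot that posts AI events to our Discord every week",
    success_criteria=["10+ events", "valid URLs"],
    constraints={"runtime": "under 30 seconds"},
    business_context="Community engagement request",
)
```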
The Dreamcatcher phase is about clarity, not implementation. It forces founders to articulate what they actually want before any code is written.
Phase 2: The Dream Factory
Goal: Generate multiple competing implementation approaches
What Happens
Candidate Generation
AI generates N different implementation approaches, each with distinct trade-offs
Objective Setting
For each candidate, define measurable objectives with bias weights:
- H (High): 3x multiplier - Critical requirements
- M (Medium): 2x multiplier - Important but not critical
- L (Low): 1x multiplier - Nice to have
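The bias weights above translate directly into multipliers on each objective's score. A minimal sketch of how such a weighted average could be computed (the helper name is hypothetical):

```python
# Multipliers for objective priority, as listed above
BIAS_WEIGHTS = {"H": 3, "M": 2, "L": 1}


def weighted_objective_score(objectives: list[tuple[str, float]]) -> float:
    """Weighted average of per-objective scores (each 0-1), biased by priority.

    objectives: list of (priority, score) pairs, priority in {"H", "M", "L"}.
    """
    total_weight = sum(BIAS_WEIGHTS[p] for p, _ in objectives)
    weighted_sum = sum(BIAS_WEIGHTS[p] * s for p, s in objectives)
    return weighted_sum / total_weight


# One critical objective met fully outweighs two nice-to-haves missed:
# 3*1.0 / (3 + 1 + 1) = 0.6
score = weighted_objective_score([("H", 1.0), ("L", 0.0), ("L", 0.0)])
```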
Demo Agents
The hackathon demo includes 5 pre-built agents showcasing different strategies:
Agent Alpha
The Speed Demon
Lightning-fast scraper that prioritizes speed over completeness. Skips weekend events and only checks top sources.
Trade-off: Fast execution, but may miss important events
Agent Beta
The Perfectionist
Thorough scraper that verifies every URL with full HTTP requests. Includes comprehensive event coverage.
Trade-off: Slower execution, but higher-quality output
Agent Gamma
The Insider
Uses curated data from Cerebral Valley and verified lu.ma sources. Reliable connections, fast execution.
Trade-off: Limited to known sources, but highly reliable
Agent Delta
The Crasher
Intentionally crashes with a divide-by-zero error to demonstrate Sentry error capture.
Purpose: Shows how the system handles failures gracefully
Agent Epsilon
The Hallucinator
Produces intentionally bad data (wrong dates, wrong locations, fake events) to demonstrate quality scoring.
Purpose: Validates that the scoring system catches low-quality output
Output
A list of Candidate objects, each containing:
- Implementation approach description
- Tech stack selection
- Complexity estimate
- Starter code scaffold
- Measurable objectives with weights
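As with the IdeaBrief, the Candidate object can be sketched as a simple record (illustrative field names, not the actual schema):

```python
from dataclasses import dataclass


@dataclass
class Candidate:
    """One implementation approach from the Dream Factory (illustrative schema)."""
    approach: str                      # implementation approach description
    tech_stack: list[str]              # tech stack selection
    complexity: str                    # complexity estimate, e.g. "low"/"medium"/"high"
    scaffold: str                      # starter code scaffold (source text)
    objectives: list[tuple[str, str]]  # (priority, measurable objective) pairs


# Example candidate in the spirit of Agent Alpha, the Speed Demon
alpha = Candidate(
    approach="Lightning-fast scraper that skips weekend events",
    tech_stack=["python", "httpx"],
    complexity="low",
    scaffold="# TODO: scraper entry point",
    objectives=[("H", "run in under 30 seconds"), ("L", "cover weekend events")],
)
```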
Phase 3: The Dream Arena
Goal: Build, run, and measure all candidates to find the top 3
What Happens
Execution
Each candidate runs with:
- 60-second timeout
- Error capture
- Performance tracking
- Artifact generation
Scoring
Candidates scored on three criteria:
- Success (20%): Did it produce output without errors?
- Quality (60%): Does output meet requirements?
- Speed (20%): How fast did it execute?
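The three criteria combine into a single score via the percentages above. A sketch of that weighting (the function name and 0-100 scale are illustrative):

```python
# Weights from the scoring criteria above: Success 20%, Quality 60%, Speed 20%
WEIGHTS = {"success": 0.20, "quality": 0.60, "speed": 0.20}


def overall_score(success: float, quality: float, speed: float) -> float:
    """Combine the three criteria (each 0-100) into a single 0-100 score."""
    return (
        WEIGHTS["success"] * success
        + WEIGHTS["quality"] * quality
        + WEIGHTS["speed"] * speed
    )


# A candidate that ran cleanly (100), with strong output (92) at decent speed (80):
# 0.2*100 + 0.6*92 + 0.2*80 = 91.2
score = overall_score(success=100, quality=92, speed=80)
```

Quality dominating at 60% is what lets Agent Epsilon's fast-but-fabricated output lose to slower, accurate agents.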
Key Demo Moment
Sentry captures Agent Delta’s crash in real time, demonstrating error monitoring.
Output
- scores.json: Complete scoring breakdown for all candidates
- winner.txt: ID of the winning candidate
- Artifacts for each candidate (Discord posts, logs)
- Sentry error reports
Phase 4: The Dream Podium
Goal: Polish the top 3 candidates and select the final winner
What Happens
Code Improvements
CodeRabbit suggests and applies:
- Bug fixes
- Performance optimizations
- Code quality improvements
- Best practices enforcement
Before/After Comparison
System shows improvement deltas:
- Score changes
- Error reduction
- Performance gains
Key Demo Moment
Agent Alpha jumps from 3rd to 2nd place after CodeRabbit adds weekend event coverage, demonstrating measurable improvement.
Output
- Polished code for top 3 candidates
- Before/after diffs
- Updated scores
- Winner’s pull request
Phase 5: The Dream Awakening
Goal: Present the feature to the world
What Happens
Documentation Generation
Generate three types of docs:
- Engineering: Technical implementation details
- Marketing: Feature benefits and positioning
- Executive: Business impact and ROI
Key Demo Moment
ElevenLabs narration summarizes the entire journey and announces Agent Gamma as the winner.
Output
- Complete feature documentation
- MP3 audio presentations
- Marketing materials
- Launch announcement
- Production deployment
Sponsor Integration Map
How each sponsor tool integrates across phases:
| Phase | Daytona | Sentry | CodeRabbit | ElevenLabs |
|---|---|---|---|---|
| Dreamcatcher | | | | |
| Dream Factory | | | | |
| Dream Arena | ⭐ Sandboxes | ⭐ Monitoring | | |
| Dream Podium | | | ⭐ Polish + PR | |
| Dream Awakening | | | | ⭐ Presentation |
Complete Journey Example
Founder’s Request: “I want a bot that posts AI events to our Discord every week.”
Phase 1: Dreamcatcher
Captures requirement:
- Goal: Weekly AI events in SF for Discord
- Success: 10+ events, includes Daytona hackathon, valid URLs
- Constraints: Must run in under 30 seconds
Phase 2: Dream Factory
Generates 5 agents:
- Alpha: Fast scraper (weekday events only)
- Beta: Thorough scraper (all events, slower)
- Gamma: Curated sources (verified lu.ma data)
- Delta: Intentional crasher (Sentry demo)
- Epsilon: Bad data generator (quality demo)
Phase 3: Dream Arena
All 5 compete:
- Delta crashes immediately (Sentry captures)
- Epsilon produces bad data (quality score: 12/100)
- Alpha is fast but misses hackathon (quality: 45/100)
- Beta finds all events but slow (quality: 88/100)
- Gamma wins: fast + comprehensive (quality: 92/100)
Phase 4: Dream Podium
CodeRabbit polishes top 3:
- Alpha gets weekend event fallback → jumps to 72/100
- Beta gets performance optimizations → rises to 91/100
- Gamma gets minor cleanups → rises to 94/100
Phase 5: Dream Awakening
ElevenLabs presents:
- Engineering doc: Implementation details
- Marketing pitch: “Never miss an AI event in SF”
- Executive summary: “Automated community engagement”
Next Steps
Scoring System
Learn how the Success/Quality/Speed scoring works
Agent Strategies
Deep dive into each agent’s implementation approach