JARVIS uses Convex as its real-time database for live updates to the frontend corkboard. All tables support real-time subscriptions, enabling the UI to update instantly as agents discover new intelligence.
Schema Overview
The schema is defined in frontend/convex/schema.ts and includes five core tables:
captures — Raw face captures from glasses, Telegram, or manual upload
persons — Identified individuals with dossiers and board positions
intelFragments — Research data fragments from various sources
connections — Relationships between persons
activityLog — Live activity feed for the sidebar
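The page references frontend/convex/schema.ts but does not reproduce it. A hedched sketch of what that definition might look like, reconstructed from the fields and indexes described in the sections below (the `relationship` and `description` field names on connections, and the use of `v.any()` for the dossier, are assumptions, not confirmed by the source):

```typescript
// frontend/convex/schema.ts (sketch, assuming the fields described below)
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  captures: defineTable({
    imageUrl: v.string(),
    timestamp: v.number(),
    source: v.string(),
    status: v.string(),
    personId: v.optional(v.id("persons")),
  }),
  persons: defineTable({
    name: v.string(),
    photoUrl: v.string(),
    confidence: v.number(),
    status: v.string(),
    boardPosition: v.object({ x: v.number(), y: v.number() }),
    dossier: v.optional(v.any()), // assumed; see updateDossier for the full shape
    createdAt: v.number(),
    updatedAt: v.number(),
  }),
  intelFragments: defineTable({
    personId: v.id("persons"),
    source: v.string(),
    dataType: v.string(),
    content: v.string(),
    verified: v.boolean(),
    timestamp: v.number(),
  }).index("by_person", ["personId"]),
  connections: defineTable({
    personAId: v.id("persons"),
    personBId: v.id("persons"),
    relationship: v.string(), // assumed field name
    description: v.string(), // assumed field name
  })
    .index("by_person_a", ["personAId"])
    .index("by_person_b", ["personBId"]),
  activityLog: defineTable({
    type: v.string(),
    message: v.string(),
    personId: v.optional(v.id("persons")),
    agentName: v.optional(v.string()),
    timestamp: v.number(),
  }),
});
```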
Captures Table
Stores raw face captures before identification.
imageUrl — URL or storage path to the captured image
timestamp — Unix timestamp in milliseconds
source — Capture source: "glasses", "telegram", or "upload"
status — Processing status:
  "pending" — Just captured, awaiting processing
  "identifying" — Face detection in progress
  "identified" — Successfully matched to a person
  "failed" — Identification failed
personId — Reference to the identified person (set when status = "identified")
Example: Create Capture
frontend/convex/captures.ts
import { mutation } from "./_generated/server";
import { v } from "convex/values";

export const create = mutation({
  args: {
    imageUrl: v.string(),
    source: v.string(),
  },
  handler: async (ctx, { imageUrl, source }) => {
    const captureId = await ctx.db.insert("captures", {
      imageUrl,
      timestamp: Date.now(),
      source,
      status: "pending",
    });
    // Log activity for live feed
    await ctx.db.insert("activityLog", {
      type: "capture",
      message: `New face captured via ${source}`,
      timestamp: Date.now(),
    });
    return captureId;
  },
});
Persons Table
Core table storing identified individuals and their intelligence dossiers.
name — Person's name (from identification or manual input)
confidence — Identification confidence score (0-1)
status — Pipeline status:
  "identified" — Person identified, no research yet
  "researching" — Agents actively gathering intelligence
  "synthesizing" — Compiling final dossier
  "complete" — Dossier ready
boardPosition — Position on the corkboard UI
dossier — Complete intelligence dossier (populated when status = "complete"):
  summary — 2-3 sentence executive summary
  company — Current company/organization
  workHistory — Array of work experience entries; period is the time span (e.g., "2020-2023")
  education — Array of education entries
  socialProfiles — Social media and web presence
  notableActivity — Array of notable achievements or activities
  conversationHooks — Suggested conversation starters
  riskFlags — Potential security or reputation concerns
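The four statuses form a linear pipeline, so the UI can derive a progress indicator directly from a person's status. A minimal sketch; the `pipelineProgress` helper is illustrative, not part of the codebase:

```typescript
// The pipeline stages in order, matching the status values above
const PIPELINE = ["identified", "researching", "synthesizing", "complete"] as const;
type PersonStatus = (typeof PIPELINE)[number];

// Each completed stage contributes an equal share of progress
function pipelineProgress(status: PersonStatus): number {
  return (PIPELINE.indexOf(status) + 1) / PIPELINE.length;
}
```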
Example: Create Person
frontend/convex/persons.ts
import { mutation } from "./_generated/server";
import { v } from "convex/values";

export const create = mutation({
  args: {
    name: v.string(),
    photoUrl: v.string(),
    confidence: v.number(),
    boardPosition: v.optional(v.object({ x: v.number(), y: v.number() })),
  },
  handler: async (ctx, { name, photoUrl, confidence, boardPosition }) => {
    const now = Date.now();
    // Place new cards at a random board position unless one is provided
    const pos = boardPosition ?? {
      x: 100 + Math.random() * 800,
      y: 100 + Math.random() * 500,
    };
    const personId = await ctx.db.insert("persons", {
      name,
      photoUrl,
      confidence,
      status: "identified",
      boardPosition: pos,
      createdAt: now,
      updatedAt: now,
    });
    await ctx.db.insert("activityLog", {
      type: "identify",
      message: `Identified: ${name} (${Math.round(confidence * 100)}% confidence)`,
      personId,
      timestamp: now,
    });
    return personId;
  },
});
Example: Update Dossier
frontend/convex/persons.ts
import { mutation } from "./_generated/server";
import { v } from "convex/values";

export const updateDossier = mutation({
  args: {
    id: v.id("persons"),
    dossier: v.object({
      summary: v.string(),
      title: v.optional(v.string()),
      company: v.optional(v.string()),
      workHistory: v.array(
        v.object({
          role: v.string(),
          company: v.string(),
          period: v.optional(v.string()),
        })
      ),
      education: v.array(
        v.object({
          school: v.string(),
          degree: v.optional(v.string()),
        })
      ),
      socialProfiles: v.object({
        linkedin: v.optional(v.string()),
        twitter: v.optional(v.string()),
        instagram: v.optional(v.string()),
        github: v.optional(v.string()),
        website: v.optional(v.string()),
      }),
      notableActivity: v.array(v.string()),
      conversationHooks: v.array(v.string()),
      riskFlags: v.array(v.string()),
    }),
  },
  handler: async (ctx, { id, dossier }) => {
    await ctx.db.patch(id, {
      dossier,
      status: "complete",
      updatedAt: Date.now(),
    });
  },
});
Intel Fragments Table
Stores individual research findings from various intelligence sources.
personId — Person this fragment belongs to
source — Intelligence source: "exa", "linkedin", "twitter", "google", "pimeyes"
dataType — Type of data: "profile", "post", "article", "connection"
content — JSON string containing extracted data
verified — Whether this fragment has been verified
timestamp — When this fragment was collected
Index: by_person on personId for fast lookups
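Because content is a JSON string rather than a structured object, consumers parse it before use. A sketch under assumptions; the row shape follows the fields above, but the payload inside content (here, a headline) is an invented example:

```typescript
// Shape of an intelFragments row, per the field descriptions above
interface IntelFragment {
  personId: string;
  source: string;
  dataType: string;
  content: string; // JSON-encoded extracted data
  verified: boolean;
  timestamp: number;
}

// Decode the JSON payload carried in `content`
function parseContent(fragment: IntelFragment): Record<string, unknown> {
  return JSON.parse(fragment.content);
}

// Example fragment with an assumed payload
const fragment: IntelFragment = {
  personId: "abc123",
  source: "linkedin",
  dataType: "profile",
  content: JSON.stringify({ headline: "CTO at Example Corp" }),
  verified: false,
  timestamp: Date.now(),
};
```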
Example: Query Intel Fragments
import { query } from "./_generated/server";
import { v } from "convex/values";

export const getByPerson = query({
  args: { personId: v.id("persons") },
  handler: async (ctx, { personId }) => {
    return await ctx.db
      .query("intelFragments")
      .withIndex("by_person", (q) => q.eq("personId", personId))
      .collect();
  },
});
Example: Create Intel Fragment
import { mutation } from "./_generated/server";
import { v } from "convex/values";

export const create = mutation({
  args: {
    personId: v.id("persons"),
    source: v.string(),
    dataType: v.string(),
    content: v.string(),
    verified: v.optional(v.boolean()),
  },
  handler: async (ctx, { personId, source, dataType, content, verified }) => {
    const now = Date.now();
    const fragmentId = await ctx.db.insert("intelFragments", {
      personId,
      source,
      dataType,
      content,
      verified: verified ?? false,
      timestamp: now,
    });
    // Look up the person's name for a readable feed message
    const person = await ctx.db.get(personId);
    await ctx.db.insert("activityLog", {
      type: "research",
      message: `[${source.toUpperCase()}] New ${dataType} intel for ${person?.name ?? "unknown"}`,
      personId,
      agentName: source,
      timestamp: now,
    });
    return fragmentId;
  },
});
Connections Table
Stores relationships between persons for network visualization.
personAId — First person in the relationship
personBId — Second person in the relationship
Type of relationship: "colleague", "classmate", "mutual_follow", etc.
Human-readable description of the connection
Indexes:
by_person_a on personAId
by_person_b on personBId
Example: Query Connections
frontend/convex/connections.ts
import { query } from "./_generated/server";
import { v } from "convex/values";

export const getForPerson = query({
  args: { personId: v.id("persons") },
  handler: async (ctx, { personId }) => {
    // A person can appear on either side of a connection, so query both indexes
    const asA = await ctx.db
      .query("connections")
      .withIndex("by_person_a", (q) => q.eq("personAId", personId))
      .collect();
    const asB = await ctx.db
      .query("connections")
      .withIndex("by_person_b", (q) => q.eq("personBId", personId))
      .collect();
    return [...asA, ...asB];
  },
});
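Because getForPerson queries both directional indexes, graph code can see the same pair twice when a relationship is stored in both directions. One way to normalize rows to unique undirected edges for the network view; the `toUndirectedEdges` helper is illustrative, not part of the codebase:

```typescript
// Minimal row shape for this sketch, using the confirmed ID fields
interface ConnectionRow {
  personAId: string;
  personBId: string;
}

function toUndirectedEdges(rows: ConnectionRow[]): [string, string][] {
  const seen = new Set<string>();
  const edges: [string, string][] = [];
  for (const { personAId, personBId } of rows) {
    // Order the pair so (a, b) and (b, a) hash to the same key
    const [a, b] =
      personAId < personBId ? [personAId, personBId] : [personBId, personAId];
    const key = `${a}|${b}`;
    if (!seen.has(key)) {
      seen.add(key);
      edges.push([a, b]);
    }
  }
  return edges;
}
```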
Activity Log Table
Powers the live activity feed in the sidebar.
type — Event type: "capture", "identify", "research", "complete"
personId — Optional reference to person
agentName — Name of the agent that performed the action
Example: Query Recent Activity
import { query } from "./_generated/server";

export const recentActivity = query({
  handler: async (ctx) => {
    return await ctx.db
      .query("activityLog")
      .order("desc")
      .take(50);
  },
});
Real-Time Subscriptions
Convex queries automatically subscribe to updates. The frontend uses useQuery hooks to receive live data:
frontend/app/components/Board.tsx
import { useQuery } from "convex/react";
import { api } from "@/convex/_generated/api";

function Board() {
  // Auto-updates when persons table changes
  const persons = useQuery(api.persons.listAll);
  // Auto-updates when new activity is logged
  const activity = useQuery(api.intel.recentActivity);

  return (
    <div>
      {persons?.map((person) => (
        <PersonCard key={person._id} person={person} />
      ))}
      <ActivityFeed items={activity} />
    </div>
  );
}
Backend Integration
The Python backend calls Convex via HTTP API:
backend/db/convex_client.py
import httpx


class ConvexGateway:
    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip("/")
        self._client = httpx.AsyncClient()

    async def store_person(self, person_id: str, data: dict) -> str:
        # Convex HTTP API: POST /api/mutation with the function path and args
        payload = {
            "path": "persons:store",
            "args": {"data": data},
            "format": "json",
        }
        resp = await self._client.post(
            f"{self.base_url}/api/mutation",
            json=payload,
        )
        resp.raise_for_status()
        return resp.json().get("value")

    async def _mutation(self, path: str, args: dict):
        # Shared helper wrapping the same /api/mutation endpoint
        resp = await self._client.post(
            f"{self.base_url}/api/mutation",
            json={"path": path, "args": args, "format": "json"},
        )
        resp.raise_for_status()
        return resp.json().get("value")

    async def update_person(self, person_id: str, data: dict) -> None:
        await self._mutation("persons:update", {
            "person_id": person_id,
            "data": data,
        })
See Backend Integration for full implementation details.
Next: MongoDB Storage — persistent storage for raw captures and long-term archival.