Prerequisites
Python 3.10+: Hridaya uses modern async syntax and type hints
Git: required for cloning the repository
pip: the Python package manager
SQLite 3.35+: usually included with Python
Optional: PostgreSQL 13+ with the TimescaleDB extension for high-volume historical data storage
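The version floors above can be checked programmatically before installing anything else. A small stdlib-only sketch (the helper name is ours, not part of Hridaya):

```python
import sqlite3
import sys

def check_prereqs(min_py=(3, 10), min_sqlite=(3, 35)):
    """Return a list of human-readable problems; an empty list means all good."""
    problems = []
    if sys.version_info[:2] < min_py:
        problems.append(
            f"Python {min_py[0]}.{min_py[1]}+ required, "
            f"found {sys.version_info.major}.{sys.version_info.minor}"
        )
    # sqlite3.sqlite_version is the version of the bundled SQLite library
    sqlite_ver = tuple(int(p) for p in sqlite3.sqlite_version.split("."))
    if sqlite_ver[:2] < min_sqlite:
        problems.append(
            f"SQLite {min_sqlite[0]}.{min_sqlite[1]}+ required, "
            f"found {sqlite3.sqlite_version}"
        )
    return problems

if __name__ == "__main__":
    issues = check_prereqs()
    print("All prerequisites met" if not issues else "\n".join(issues))
```

Run it with the interpreter you intend to use for Hridaya, since the check reports on that interpreter's own Python and bundled SQLite.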
Installation steps
Clone the repository
git clone https://github.com/yourusername/hridaya.git
cd hridaya
Create a virtual environment (recommended)
# Create virtual environment
python -m venv venv
# Activate it
# On Linux/macOS:
source venv/bin/activate
# On Windows:
venv\Scripts\activate
Virtual environments isolate Hridaya’s dependencies from your system Python, preventing version conflicts.
Install dependencies
pip install -r requirements.txt
Core dependencies explained
# Async HTTP client for Steam API calls
aiohttp==3.13.2
aiohappyeyeballs==2.6.1 # IPv6/IPv4 fallback for aiohttp
aiosignal==1.4.0
# Async database drivers
aiosqlite==0.22.0 # SQLite async driver
asyncpg==0.31.0 # PostgreSQL async driver (for TimescaleDB)
# Data validation and parsing
pydantic==2.12.5 # Type-safe data models
pydantic_core==2.41.5
annotated-types==0.7.0
# Configuration and environment
PyYAML==6.0.3 # YAML config parsing
python-dotenv==1.2.1 # .env file support
# Utilities
attrs==25.4.0
frozenlist==1.8.0
idna==3.11
multidict==6.7.0
propcache==0.4.1
yarl==1.22.0
typing-inspection==0.4.2
typing_extensions==4.15.0
Verify installation
python -c "import aiohttp, aiosqlite, pydantic, yaml; print('All dependencies installed successfully!')"
You should see: All dependencies installed successfully!
Configuration
Basic configuration (config.yaml)
Hridaya uses a YAML configuration file to define rate limits and tracking items.
config.yaml - Complete example
Minimal configuration
# RATE LIMITS
# Steam enforces 15 requests per 60 seconds globally per IP
LIMITS:
  REQUESTS: 15        # Max requests allowed
  WINDOW_SECONDS: 60  # Time window in seconds

# TRACKING ITEMS
TRACKING_ITEMS:
  # Example 1: Real-time trade activity
  - market_hash_name: "Revolution Case"
    appid: 730                       # CS2 app ID
    currency: 1                      # 1=USD, 2=GBP, 3=EUR, 6=RUB, etc.
    country: 'US'
    language: 'english'
    polling-interval-in-seconds: 8   # Poll every 8 seconds
    apiid: 'itemordersactivity'      # Endpoint type

  # Example 2: Historical price data
  - market_hash_name: "AWP | Neo-Noir (Factory New)"
    appid: 730
    currency: 3                      # EUR
    country: 'IN'
    language: 'english'
    polling-interval-in-seconds: 60  # Poll every 60 seconds
    apiid: 'pricehistory'            # Requires Steam cookies

  # Example 3: Order book snapshots
  - market_hash_name: "Dreams & Nightmares Case"
    appid: 730
    currency: 2                      # GBP
    country: 'IN'
    language: 'english'
    polling-interval-in-seconds: 30  # Poll every 30 seconds
    apiid: 'itemordershistogram'     # Requires item_nameid

  # Example 4: Current market prices
  - market_hash_name: "MP9 | Starlight Protector (Field-Tested)"
    appid: 730
    currency: 2
    country: 'IN'
    language: 'english'
    polling-interval-in-seconds: 60
    apiid: 'priceoverview'           # Simplest endpoint
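A quick sanity check that this example fits the rate limit: an item polled every N seconds costs at most `window // N` requests per window, summed across items. A stdlib-only sketch of the arithmetic (Hridaya runs the same feasibility check at startup):

```python
# Polling intervals of the four example items above
intervals = [8, 60, 30, 60]
WINDOW_SECONDS = 60
REQUESTS_LIMIT = 15

def requests_per_window(intervals, window):
    """Worst-case number of requests issued per rate-limit window."""
    return sum(window // interval for interval in intervals)

total = requests_per_window(intervals, WINDOW_SECONDS)
# 60//8 + 60//60 + 60//30 + 60//60 = 7 + 1 + 2 + 1 = 11, within the limit of 15
print(f"{total}/{REQUESTS_LIMIT} requests per {WINDOW_SECONDS}s window")
```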
Configuration field reference
LIMITS.REQUESTS
Maximum number of API requests allowed per time window. Steam enforces 15 requests per 60 seconds globally.
LIMITS.WINDOW_SECONDS
Time window in seconds for rate limiting. Keep at 60 to match Steam’s limits.
TRACKING_ITEMS[].market_hash_name
Exact market name of the item as it appears on Steam Community Market. Examples:
"Revolution Case"
"AWP | Neo-Noir (Factory New)"
"Sticker | Ninjas in Pyjamas | Katowice 2014"
TRACKING_ITEMS[].appid
Steam application ID. Common values:
730 - Counter-Strike 2 (CS2)
570 - Dota 2
440 - Team Fortress 2
252490 - Rust
753 - Steam (trading cards, backgrounds, emoticons)
TRACKING_ITEMS[].currency
Currency code for price display. Common values:
1 - USD (US Dollar)
2 - GBP (British Pound)
3 - EUR (Euro)
6 - RUB (Russian Ruble)
7 - BRL (Brazilian Real)
9 - NOK (Norwegian Krone)
TRACKING_ITEMS[].country
Two-letter country code (ISO 3166-1 alpha-2). Examples: 'US', 'GB', 'DE', 'IN', 'BR'
TRACKING_ITEMS[].language
Language for text responses. Examples: 'english', 'german', 'french', 'spanish'
TRACKING_ITEMS[].polling-interval-in-seconds
How often to poll this item (in seconds). Must be feasible within rate limits. Hridaya validates this on startup.
Recommended ranges:
High-frequency: 8-15 seconds (order activity)
Medium-frequency: 30-60 seconds (order books, prices)
Low-frequency: 300-3600 seconds (historical data)
TRACKING_ITEMS[].apiid
Which Steam Market API endpoint to use. Valid values:
'priceoverview' - Current lowest/median price and volume
'itemordershistogram' - Full order book (requires item_nameid)
'itemordersactivity' - Recent trade history (requires item_nameid)
'pricehistory' - Historical price chart (requires Steam cookies)
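For a quick smoke test outside Hridaya, priceoverview is the easiest endpoint to hit, since it needs no item_nameid or cookies. A hedged sketch using the project's aiohttp dependency (the URL and parameter names follow the public endpoint; the helper names are ours, not Hridaya's code):

```python
import asyncio
import urllib.parse

PRICEOVERVIEW_URL = "https://steamcommunity.com/market/priceoverview/"

def build_url(market_hash_name, appid=730, currency=1):
    """Assemble the priceoverview query URL for one item."""
    query = urllib.parse.urlencode({
        "appid": appid,
        "currency": currency,
        "market_hash_name": market_hash_name,
    })
    return f"{PRICEOVERVIEW_URL}?{query}"

async def fetch_priceoverview(market_hash_name, appid=730, currency=1):
    import aiohttp  # imported lazily so build_url works without aiohttp installed
    async with aiohttp.ClientSession() as session:
        async with session.get(build_url(market_hash_name, appid, currency)) as resp:
            resp.raise_for_status()
            return await resp.json()

# asyncio.run(fetch_priceoverview("Revolution Case"))
```

Remember that even a manual test like this counts against the same 15-requests-per-60-seconds budget as Hridaya itself.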
TRACKING_ITEMS[].item_nameid
Steam’s internal numeric item ID. Required for itemordershistogram and itemordersactivity. For CS2 items, Hridaya auto-populates this from steam-item-name-ids. For other games, you must provide it manually. Example:
- market_hash_name: "Revolution Case"
  item_nameid: 176042118  # Manually specified
  apiid: 'itemordershistogram'
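If you need an item_nameid for a non-CS2 item, it can be scraped from the item's public listing page, where it appears inside a `Market_LoadOrderSpread(<id>)` script call. This is an undocumented page detail Steam could change at any time, and the sketch below is our own helper, not Hridaya's implementation:

```python
import re
import urllib.parse

# item_nameid appears in the listing page's inline JavaScript, e.g.
#   Market_LoadOrderSpread( 176042118 );
NAMEID_RE = re.compile(r"Market_LoadOrderSpread\(\s*(\d+)\s*\)")

def extract_item_nameid(page_html: str):
    """Pull the numeric item_nameid out of a listing page, or None if absent."""
    match = NAMEID_RE.search(page_html)
    return int(match.group(1)) if match else None

async def fetch_item_nameid(appid: int, market_hash_name: str):
    import aiohttp  # lazy import: extract_item_nameid is usable without it
    name = urllib.parse.quote(market_hash_name)
    url = f"https://steamcommunity.com/market/listings/{appid}/{name}"
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            resp.raise_for_status()
            return extract_item_nameid(await resp.text())
```

Once found, paste the number into the item's config entry as shown above; the value is stable for a given item, so a one-time lookup is enough.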
Environment variables (.env)
Create a .env file in the project root for sensitive credentials:
# Required for pricehistory endpoint
sessionid=YOUR_STEAM_SESSION_ID
steamLoginSecure=YOUR_STEAM_LOGIN_SECURE_COOKIE
# Optional Steam cookies
browserid=YOUR_BROWSER_ID
steamCountry=US
To obtain these values:
Navigate to the Steam Community Market in your browser
Open Developer Tools
Press F12 or right-click → Inspect Element
Find the cookies
Go to the Application tab (Chrome) or Storage tab (Firefox)
Navigate to Cookies → https://steamcommunity.com
Copy the values for:
sessionid
steamLoginSecure
Paste them into .env
sessionid=PASTE_YOUR_VALUE_HERE
steamLoginSecure=PASTE_YOUR_VALUE_HERE
Security reminder:
Never commit .env to version control
Cookies expire - re-fetch them if pricehistory stops working
These cookies grant access to your Steam account - treat them like passwords
Advanced: TimescaleDB setup (optional)
For high-volume historical data, Hridaya supports TimescaleDB (PostgreSQL extension) as an alternative to SQLite for the price_history table.
Install PostgreSQL and TimescaleDB
Ubuntu/Debian
macOS
Docker
# Add PostgreSQL repository
sudo sh -c 'echo "deb https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
# Add TimescaleDB repository
sudo sh -c "echo 'deb https://packagecloud.io/timescale/timescaledb/ubuntu/ $( lsb_release -c -s ) main' > /etc/apt/sources.list.d/timescaledb.list"
wget --quiet -O - https://packagecloud.io/timescale/timescaledb/gpgkey | sudo apt-key add -
# Install
sudo apt update
sudo apt install postgresql-14 timescaledb-2-postgresql-14
# Configure TimescaleDB
sudo timescaledb-tune
sudo systemctl restart postgresql
# Install via Homebrew
brew install postgresql timescaledb
# Start PostgreSQL
brew services start postgresql
docker run -d \
--name timescaledb \
-p 5432:5432 \
-e POSTGRES_PASSWORD=password \
timescale/timescaledb:latest-pg14
Create database and enable extension
# Connect to PostgreSQL
psql -U postgres
-- Create database
CREATE DATABASE steam_market;
-- Connect to it
\c steam_market
-- Enable TimescaleDB extension
CREATE EXTENSION IF NOT EXISTS timescaledb;
-- Create user for Hridaya
CREATE USER hridaya WITH PASSWORD 'your_secure_password';
GRANT ALL PRIVILEGES ON DATABASE steam_market TO hridaya;
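Before pointing Hridaya at the new database, it is worth confirming it is reachable and the extension is active. A hedged sketch using the project's asyncpg dependency (the helper names are ours; the credentials match the SQL above):

```python
import asyncio

def build_dsn(user, password, database, host="localhost", port=5432):
    """Assemble a PostgreSQL DSN of the shape Hridaya's TimescaleDB path expects."""
    return f"postgresql://{user}:{password}@{host}:{port}/{database}"

async def check_timescale(dsn):
    import asyncpg  # lazy import: build_dsn works without asyncpg installed
    conn = await asyncpg.connect(dsn)
    try:
        # Returns the extension version string, or None if it is not enabled
        return await conn.fetchval(
            "SELECT extversion FROM pg_extension WHERE extname = 'timescaledb'"
        )
    finally:
        await conn.close()

# asyncio.run(check_timescale(build_dsn("hridaya", "your_secure_password", "steam_market")))
```

A `None` result means the database is reachable but `CREATE EXTENSION` has not been run in it yet.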
Configure Hridaya to use TimescaleDB
Modify the SQLinserts initialization in your code:
# In your scheduler or main script
from src.SQLinserts import SQLinserts

wizard = SQLinserts(
    sqlite_path="data/market_data.db",  # Still used for live data
    timescale_dsn="postgresql://hridaya:your_secure_password@localhost:5432/steam_market",
    timescale_pool_min=10,   # Min connections
    timescale_pool_max=100,  # Max connections
)
When TimescaleDB is configured:
price_overview, orders_histogram, orders_activity → SQLite
price_history → TimescaleDB (hypertable with compression)
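That routing decision amounts to a simple lookup. An illustrative sketch of the rule, not Hridaya's internals (the function name is ours):

```python
# Which storage backend each endpoint's rows land in once TimescaleDB is configured
ROUTES = {
    "priceoverview": "sqlite",
    "itemordershistogram": "sqlite",
    "itemordersactivity": "sqlite",
    "pricehistory": "timescaledb",
}

def backend_for(apiid: str, timescale_enabled: bool) -> str:
    """Everything stays in SQLite unless TimescaleDB is configured."""
    if not timescale_enabled:
        return "sqlite"
    return ROUTES.get(apiid, "sqlite")
```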
Verify hypertable creation
-- Check if price_history is a hypertable
SELECT * FROM timescaledb_information.hypertables;
-- View compression policy
SELECT * FROM timescaledb_information.jobs WHERE proc_name = 'policy_compression';
-- View retention policy
SELECT * FROM timescaledb_information.jobs WHERE proc_name = 'policy_retention';
Hridaya automatically configures:
Compression: data older than 7 days is compressed
Retention: data older than 90 days is deleted
Partitioning: by market_hash_name for optimal query performance
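For reference, these policies correspond roughly to the TimescaleDB calls below. This is a hedged reconstruction, not Hridaya's actual startup code; the table and column names (price_history, time) are assumptions:

```python
# TimescaleDB statements roughly equivalent to the automatic configuration above
SETUP_SQL = [
    # Turn price_history into a hypertable keyed on its time column
    "SELECT create_hypertable('price_history', 'time', if_not_exists => TRUE)",
    # Enable compression, segmenting compressed chunks by item name
    "ALTER TABLE price_history SET (timescaledb.compress, "
    "timescaledb.compress_segmentby = 'market_hash_name')",
    # Compress chunks older than 7 days
    "SELECT add_compression_policy('price_history', INTERVAL '7 days', if_not_exists => TRUE)",
    # Drop chunks older than 90 days
    "SELECT add_retention_policy('price_history', INTERVAL '90 days', if_not_exists => TRUE)",
]

async def apply_policies(dsn):
    import asyncpg  # lazy import so the SQL list is importable on its own
    conn = await asyncpg.connect(dsn)
    try:
        for statement in SETUP_SQL:
            await conn.execute(statement)
    finally:
        await conn.close()
```

Since Hridaya applies its own equivalents automatically, this is useful mainly for understanding what the verification queries above should report.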
Why use TimescaleDB over SQLite?
SQLite is great for:
Development and testing
Low-to-medium data volumes (< 1M rows)
Single-instance deployments
Simplicity (no separate database server)
TimescaleDB excels at:
High-volume historical data (millions of price points)
Automatic compression (10x+ space savings)
Time-based queries (“last 30 days of data”)
Horizontal scaling potential
Automatic data retention policies
Example: tracking 100 items with hourly pricehistory data points yields roughly 876K rows per year (100 items × 24 × 365). With TimescaleDB's native compression, the on-disk footprint is typically 10x+ smaller than the equivalent SQLite database.
Running Hridaya
Basic execution
Run the scheduler from the project root:
python cerebro.py
Running in the background
Linux/macOS (screen)
Linux (systemd)
Docker
# Start a detached screen session
screen -S hridaya -dm python cerebro.py
# Reattach to view logs
screen -r hridaya
# Detach: Press Ctrl+A, then D
# Kill the session
screen -S hridaya -X quit
Create /etc/systemd/system/hridaya.service:
[Unit]
Description=Hridaya Steam Market Tracker
After=network.target

[Service]
Type=simple
User=youruser
WorkingDirectory=/path/to/hridaya
Environment="PATH=/path/to/hridaya/venv/bin"
ExecStart=/path/to/hridaya/venv/bin/python cerebro.py
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
# Enable and start service
sudo systemctl daemon-reload
sudo systemctl enable hridaya
sudo systemctl start hridaya
# View logs
sudo journalctl -u hridaya -f
# Stop service
sudo systemctl stop hridaya
Create Dockerfile:
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "cerebro.py"]
# Build image
docker build -t hridaya .
# Run container
docker run -d \
--name hridaya \
--restart unless-stopped \
-v $(pwd)/data:/app/data \
-v $(pwd)/config.yaml:/app/config.yaml \
-v $(pwd)/.env:/app/.env \
hridaya
# View logs
docker logs -f hridaya
# Stop container
docker stop hridaya
Validation and troubleshooting
Configuration validation
Hridaya performs comprehensive validation on startup:
def validate_required_fields(self, items: list):
    """Validate that each item has all required fields."""
    valid_apiids = {'priceoverview', 'itemordershistogram', 'itemordersactivity', 'pricehistory'}
    for index, item in enumerate(items):
        # Check universal required fields
        required = ['market_hash_name', 'apiid', 'polling-interval-in-seconds', 'appid']
        for field in required:
            if field not in item:
                print(f"\n❌ CONFIG ERROR: Item {index + 1} missing required field '{field}'")
                exit(1)
        # Validate apiid is recognized
        if item['apiid'] not in valid_apiids:
            print(f"\n❌ CONFIG ERROR: Item {index + 1} has invalid apiid '{item['apiid']}'")
            exit(1)
        # Check endpoint-specific required fields
        if item['apiid'] in ('itemordershistogram', 'itemordersactivity'):
            if 'item_nameid' not in item:
                print(f"\n❌ CONFIG ERROR: Item {index + 1} missing 'item_nameid'")
                exit(1)
Feasibility validation
def validate_config_feasibility(self, rate_limit: int, window_seconds: int, items: list):
    """Validate that config is feasible given rate limits."""
    total_reqs = 0
    for item in items:
        reqs_per_window = window_seconds // item['polling-interval-in-seconds']
        total_reqs += reqs_per_window
    if total_reqs > rate_limit:
        print("\n❌ CONFIG ERROR: Infeasible configuration")
        print(f"   Calculated: {total_reqs} requests per {window_seconds}s")
        print(f"   Limit: {rate_limit} requests per {window_seconds}s")
        exit(1)
    # Success - config is feasible
    utilization = (total_reqs / rate_limit) * 100
    print(f"✓ Config feasible: {total_reqs} req/{window_seconds}s ({utilization:.1f}% capacity)")
Common issues
ModuleNotFoundError: No module named 'aiohttp'
Cause: dependencies not installed, or the wrong Python environment is active.
Solution:
# Ensure the virtual environment is activated
source venv/bin/activate  # or venv\Scripts\activate on Windows
# Reinstall dependencies
pip install -r requirements.txt
FileNotFoundError: config.yaml not found
Cause: running cerebro.py from the wrong directory.
Solution:
# Ensure you're in the project root
cd /path/to/hridaya
python cerebro.py
sqlite3.OperationalError: unable to open database file
Cause: the data/ directory doesn’t exist.
Solution:
mkdir -p data
python cerebro.py
asyncpg.exceptions.InvalidCatalogNameError
Cause: the TimescaleDB database doesn’t exist.
Solution:
psql -U postgres
CREATE DATABASE steam_market;
\c steam_market
CREATE EXTENSION IF NOT EXISTS timescaledb;
Next steps
Quickstart guide Get up and running in 5 minutes
API reference Explore the Steam Market API endpoints