Recommended patterns and strategies for using BinaryDB effectively
This guide covers best practices for using BinaryDB in production applications, including performance optimization, error handling, and data organization strategies.
Use exists() to verify keys before performing operations that expect existing records.
```python
db = Database("./data/users")
db.load()

user_key = "user:123"

# GOOD: Check before updating
if db.exists(user_key):
    try:
        db.update(user_key, {"last_login": "2026-03-04"})
        db.commit()
    except RecordTypeError:
        print("User record is not a dictionary")
else:
    print(f"User {user_key} not found")

db.close()
```
The update() method only works on dictionary records:
```python
db = Database("./data/config")
db.load()

# This will raise RecordTypeError if the value is not a dict
try:
    db.update("config:app", {"debug": True})
    db.commit()
except RecordTypeError:
    print("Cannot update non-dictionary record")
    # Handle appropriately - maybe use set() instead
    db.set("config:app", {"debug": True})
    db.commit()

db.close()
```
Transactions create a memory snapshot of all data. Keep them short to minimize memory usage.
```python
db = Database("./data/inventory")
db.load()

# GOOD: Short transaction
db.begin()
db.set("product:1", {"stock": 50})
db.set("product:2", {"stock": 30})
db.end()

# BAD: Long-running transaction with complex logic
db.begin()
# ... many operations ...
# ... complex calculations ...
# ... API calls ...
db.end()  # Memory snapshot held for too long

db.close()
```
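When you have many writes, one common way to keep transactions short is to flush them in fixed-size batches. A minimal sketch, assuming the begin()/set()/end() transaction API shown above; write_in_batches is a hypothetical helper, not part of BinaryDB itself:

```python
# Sketch of batching writes into short transactions so the memory
# snapshot is only held briefly. Assumes a db object with the
# begin()/set()/end() methods shown above; write_in_batches is
# a hypothetical helper for illustration.
def write_in_batches(db, items, batch_size=100):
    """Write (key, value) pairs using one short transaction per batch."""
    batch = []

    def flush():
        # Each flush opens and closes a transaction quickly
        db.begin()
        for key, value in batch:
            db.set(key, value)
        db.end()
        batch.clear()

    for item in items:
        batch.append(item)
        if len(batch) >= batch_size:
            flush()
    if batch:  # write any leftover partial batch
        flush()
```

The tradeoff is atomicity: each batch commits independently, so a failure mid-run leaves earlier batches applied.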
BinaryDB does not currently implement the context manager protocol (__enter__/__exit__), but you can wrap it if needed.
Since BinaryDB doesn’t support with statements natively, use try/finally to guarantee the database is closed:
```python
from binarydb.database import Database

def process_data():
    db = Database("./data/mydb")
    try:
        db.load()
        db.set("processed", True)
        db.commit()
    finally:
        db.close()

process_data()
```
Or create a simple wrapper:
```python
from contextlib import contextmanager

from binarydb.database import Database

@contextmanager
def database_context(path):
    db = Database(path)
    try:
        db.load()
        yield db
    finally:
        db.close()

# Now you can use with statements
with database_context("./data/mydb") as db:
    db.set("key", "value")
    db.commit()
```
If you need any of the following, consider a different tool:

- Large datasets (> 100 MB) → use SQLite, PostgreSQL
- Multi-process access → use Redis, PostgreSQL
- Complex queries → use SQL databases
- Production-grade performance → use specialized databases
- Untrusted data sources → use JSON, SQLite (NOT pickle)
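The warning about untrusted data is worth a concrete illustration: unpickling can execute arbitrary code via __reduce__, while JSON parsing only ever yields plain data types. A minimal demonstration (the Malicious class is purely illustrative, not from BinaryDB):

```python
# Why pickle is unsafe for untrusted input: pickle.loads() can run
# arbitrary code, while json.loads() cannot.
import json
import pickle

class Malicious:
    def __reduce__(self):
        # The callable returned here runs at pickle.loads() time;
        # an attacker could use os.system instead of print
        return (print, ("arbitrary code executed during unpickling!",))

payload = pickle.dumps(Malicious())
pickle.loads(payload)  # prints the message above

# JSON, by contrast, only produces dicts, lists, strings, numbers, etc.
data = json.loads('{"debug": true}')
print(data)  # {'debug': True}
```

This is why formats like JSON or a real database are the right choice whenever input can come from outside your application.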