MemoryAnalyzer

Advanced TensorFlow memory analysis for detecting leaks, fragmentation, and optimization opportunities.

Constructor

MemoryAnalyzer(sensitivity: float = 0.05)

Parameters:
    sensitivity (float, default 0.05): Analysis sensitivity threshold (0.0-1.0). Lower values detect smaller anomalies.

Methods

detect_memory_leaks

Detect potential memory leaks using statistical analysis.
def detect_memory_leaks(self, tracking_results: TrackingResult) -> List[Dict[str, Any]]

Parameters:
    tracking_results (TrackingResult): Results from MemoryTracker.

Returns:
    List[Dict[str, Any]]: Detected leaks, each a dict with:
        type (str): Leak type: 'monotonic_increase', 'memory_spikes', or 'insufficient_cleanup'.
        severity (str): Severity level: 'high', 'medium', or 'low'.
        description (str): Human-readable description of the leak.
Example:
analyzer = MemoryAnalyzer(sensitivity=0.05)
leaks = analyzer.detect_memory_leaks(tracking_results)

for leak in leaks:
    print(f"{leak['severity'].upper()}: {leak['description']}")
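The statistics behind `detect_memory_leaks` are internal to tfmemprof, but a 'monotonic_increase' check of the kind listed above can be sketched as a least-squares slope test over the sampled memory series. The function below is illustrative only (the name, the MB units, and the slope/mean ratio are assumptions, not the library's actual algorithm):

```python
from typing import List

def detect_monotonic_increase(samples_mb: List[float], sensitivity: float = 0.05) -> bool:
    """Illustrative leak check: fit a least-squares line to the samples and
    flag a leak when per-sample growth exceeds `sensitivity` as a fraction
    of mean usage. Not tfmemprof's actual implementation."""
    n = len(samples_mb)
    if n < 2:
        return False
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples_mb) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples_mb))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var  # MB of growth per sample
    return slope / max(mean_y, 1e-9) > sensitivity

print(detect_monotonic_increase([100, 110, 121, 133, 146]))  # True: steady ~10% growth
print(detect_monotonic_increase([100, 101, 100, 99, 100]))   # False: flat usage
```

Lowering `sensitivity` makes the slope test trip on smaller growth rates, which matches the constructor's documented behavior.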

analyze_fragmentation

Analyze memory fragmentation patterns.
def analyze_fragmentation(self, profile_result: ProfileResult) -> Dict[str, float]

Parameters:
    profile_result (ProfileResult): Results from TFMemoryProfiler.

Returns:
    Dict[str, float] with keys:
        fragmentation_score (float): Overall fragmentation score (0.0-1.0).
        trend (float): Fragmentation trend (positive = increasing).
        max_fragmentation (float): Maximum fragmentation observed.
        min_fragmentation (float): Minimum fragmentation observed.
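For intuition, one plausible way to derive these four fields from paired used/reserved memory snapshots is the unused fraction of reserved memory per snapshot. This is a sketch under that assumption; tfmemprof's actual formula is not documented here:

```python
from typing import Dict, List

def fragmentation_stats(used_mb: List[float], reserved_mb: List[float]) -> Dict[str, float]:
    """Illustrative fragmentation metrics: fragmentation per snapshot is the
    fraction of reserved memory not in use (1 - used/reserved). The key
    names mirror the documented return dict; the math is an assumption."""
    frags = [1.0 - u / r for u, r in zip(used_mb, reserved_mb)]
    return {
        "fragmentation_score": sum(frags) / len(frags),
        "trend": frags[-1] - frags[0],  # positive = increasing
        "max_fragmentation": max(frags),
        "min_fragmentation": min(frags),
    }

stats = fragmentation_stats(used_mb=[80, 70, 60], reserved_mb=[100, 100, 100])
# score ~0.3 and trend ~+0.2: reserved memory is increasingly under-used
```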

detect_patterns

Detect memory usage patterns.
def detect_patterns(self, tracking_results: TrackingResult) -> List[Dict[str, Any]]

Parameters:
    tracking_results (TrackingResult): Results from MemoryTracker.

Returns:
    List[Dict[str, Any]]: Detected patterns, including 'periodic_pattern' and 'step_pattern' types.
Example:
patterns = analyzer.detect_patterns(tracking_results)

for pattern in patterns:
    print(f"Pattern: {pattern['type']} - {pattern['description']}")
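A classic way to surface a 'periodic_pattern' in a memory trace is autocorrelation: a trace that repeats every k samples correlates strongly with itself at lag k. The sketch below is illustrative and not necessarily what detect_patterns does internally; the 0.5 correlation floor is an arbitrary choice:

```python
from typing import List, Optional

def dominant_period(samples: List[float], max_lag: Optional[int] = None) -> Optional[int]:
    """Illustrative periodicity check: return the lag with the highest
    autocorrelation, or None if no lag correlates strongly (r <= 0.5)."""
    n = len(samples)
    max_lag = max_lag or n // 2
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) or 1e-12
    best_lag, best_r = None, 0.5  # require a clearly periodic signal
    for lag in range(1, max_lag + 1):
        r = sum((samples[i] - mean) * (samples[i + lag] - mean)
                for i in range(n - lag)) / var
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag

# A memory trace that repeats every 4 samples (e.g. once per training step):
trace = [100, 150, 120, 90] * 8
print(dominant_period(trace))  # 4
```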

analyze_efficiency

Analyze memory usage efficiency.
def analyze_efficiency(self, profile_result: ProfileResult) -> float

Parameters:
    profile_result (ProfileResult): Results from TFMemoryProfiler.

Returns:
    float: Efficiency score from 0-10 (higher is better).
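The scoring formula is not documented; one simple proxy consistent with a 0-10 scale is the ratio of average to peak memory usage, on the reasoning that a workload whose average sits close to its peak is not over-allocating. The sketch below is an assumption, not the library's computation:

```python
from typing import List

def efficiency_score(samples_mb: List[float]) -> float:
    """Illustrative 0-10 efficiency score (the documented scale; the
    average-to-peak formula is an assumption). High values mean memory
    usage stays close to its peak rather than spiking briefly."""
    peak = max(samples_mb)
    avg = sum(samples_mb) / len(samples_mb)
    return round(10.0 * avg / peak, 1) if peak else 10.0

print(efficiency_score([400, 420, 410, 430]))  # 9.7: tight, steady usage
print(efficiency_score([100, 100, 100, 900]))  # 3.3: one large spike drags it down
```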

correlate_with_performance

Correlate memory usage with performance metrics.
def correlate_with_performance(self, profile_result: ProfileResult) -> Dict[str, Any]

Parameters:
    profile_result (ProfileResult): Results from TFMemoryProfiler.

Returns:
    Dict[str, Any] with keys:
        memory_duration_correlation (float): Correlation between memory usage and execution duration.
        function_efficiency (Dict[str, Dict]): Per-function efficiency metrics.
        recommendations (List[str]): Performance optimization recommendations.
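A memory_duration_correlation field is plausibly a Pearson coefficient over per-call (memory, duration) samples. The self-contained sketch below shows that computation; treat it as an assumption rather than tfmemprof's source:

```python
from typing import List

def pearson(xs: List[float], ys: List[float]) -> float:
    """Pearson correlation coefficient: one plausible way to compute the
    documented memory_duration_correlation field (illustrative)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Per-call peak memory (MB) against call duration (ms):
mem = [120, 250, 310, 400]
dur = [10, 22, 30, 41]
print(round(pearson(mem, dur), 3))  # close to 1: memory tracks runtime
```

A coefficient near +1 suggests the slowest functions are also the most memory-hungry, which is where optimization effort pays off twice.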

analyze_memory_gaps

Classify allocator-vs-device hidden memory gaps over time.
def analyze_memory_gaps(self, events: List[TelemetryEventV2]) -> List[GapFinding]

Parameters:
    events (List[TelemetryEventV2]): Chronologically ordered telemetry samples.

Returns:
    List[GapFinding]: Prioritized list of gap findings, sorted by severity and confidence.
Example:
events = tracker.get_tracking_results().events
gaps = analyzer.analyze_memory_gaps(events)

for gap in gaps:
    print(f"Severity: {gap.severity}")
    print(f"Classification: {gap.classification}")
    print(f"Description: {gap.description}")

score_optimization

Score optimization opportunities across multiple dimensions.
def score_optimization(
    self,
    profile_result: ProfileResult,
    events: Optional[List[TelemetryEventV2]] = None
) -> Dict[str, Any]

Parameters:
    profile_result (ProfileResult): TensorFlow profiling result object.
    events (Optional[List[TelemetryEventV2]]): Optional telemetry event series for gap analysis. When provided, the result includes a gap_analysis section.

Returns:
    Dict[str, Any] with keys:
        overall_score (float): Overall optimization score (0-10).
        categories (Dict[str, float]): Category-specific scores (memory_efficiency, fragmentation, performance).
        top_recommendations (List[str]): Top optimization suggestions.
        priority_actions (List[str]): Priority actions to take.
        gap_analysis (List[Dict]): Hidden memory gap findings (only when events is provided).
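The relationship between the category scores and overall_score is not documented; a weighted average is the obvious candidate. The sketch below assumes equal weights, which may not match tfmemprof's actual weighting:

```python
from typing import Dict, Optional

def overall_score(categories: Dict[str, float],
                  weights: Optional[Dict[str, float]] = None) -> float:
    """Illustrative weighted average over the documented category scores.
    Equal weights are an assumption; tfmemprof's weighting is unspecified."""
    weights = weights or {k: 1.0 for k in categories}
    total = sum(weights.values())
    return sum(categories[k] * weights[k] for k in categories) / total

cats = {"memory_efficiency": 7.5, "fragmentation": 9.0, "performance": 6.0}
print(round(overall_score(cats), 1))  # 7.5
```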

Configuration

The analyzer uses configurable thresholds for gap analysis:
analyzer.thresholds = {
    'gap_ratio_threshold': 0.05,      # 5% of device total
    'gap_spike_zscore': 2.0,          # z-score for spike detection
    'gap_drift_r_squared': 0.6,       # R-squared for drift
    'gap_fragmentation_ratio': 0.3,   # fragmentation threshold
}
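The gap_spike_zscore threshold implies spike detection by standard score: a sample is a spike when its gap deviates from the series mean by more than that many standard deviations. An illustrative standalone version (not the library's code):

```python
from typing import List

def gap_spikes(gaps_mb: List[float], zscore_threshold: float = 2.0) -> List[int]:
    """Illustrative spike detector matching the gap_spike_zscore idea:
    return indices of samples whose gap deviates from the mean by more
    than `zscore_threshold` standard deviations."""
    n = len(gaps_mb)
    mean = sum(gaps_mb) / n
    std = (sum((g - mean) ** 2 for g in gaps_mb) / n) ** 0.5
    if std == 0:
        return []
    return [i for i, g in enumerate(gaps_mb)
            if abs(g - mean) / std > zscore_threshold]

print(gap_spikes([100, 102, 98, 101, 400, 99]))  # [4]: the outlier sample
```

Raising `gap_spike_zscore` in `analyzer.thresholds` would, under this interpretation, require larger deviations before a spike is reported.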

Example

from tfmemprof.analyzer import MemoryAnalyzer
from tfmemprof.profiler import TFMemoryProfiler
from tfmemprof.tracker import MemoryTracker

# Run profiling and tracking
profiler = TFMemoryProfiler()
tracker = MemoryTracker(sampling_interval=0.5)

tracker.start_tracking()

with profiler.profile_context("training"):
    model.fit(x_train, y_train, epochs=5)  # model, x_train, y_train defined elsewhere

tracking_results = tracker.stop_tracking()
profile_results = profiler.get_results()

# Analyze results
analyzer = MemoryAnalyzer(sensitivity=0.05)

# Detect memory leaks
leaks = analyzer.detect_memory_leaks(tracking_results)
if leaks:
    print("\nMemory Leaks Detected:")
    for leak in leaks:
        print(f"  [{leak['severity']}] {leak['description']}")

# Analyze fragmentation
frag = analyzer.analyze_fragmentation(profile_results)
print(f"\nFragmentation Score: {frag['fragmentation_score']:.3f}")
print(f"Trend: {'increasing' if frag['trend'] > 0 else 'stable or decreasing'}")

# Detect patterns
patterns = analyzer.detect_patterns(tracking_results)
for pattern in patterns:
    print(f"\nPattern: {pattern['type']}")
    print(f"  {pattern['description']}")

# Get efficiency score
efficiency = analyzer.analyze_efficiency(profile_results)
print(f"\nMemory Efficiency: {efficiency:.1f}/10")

# Performance correlation
perf = analyzer.correlate_with_performance(profile_results)
print("\nPerformance Recommendations:")
for rec in perf['recommendations']:
    print(f"  - {rec}")

# Comprehensive optimization scoring
events = tracking_results.events
opt_score = analyzer.score_optimization(profile_results, events=events)

print(f"\nOptimization Score: {opt_score['overall_score']:.1f}/10")
print("\nCategory Scores:")
for cat, score in opt_score['categories'].items():
    print(f"  {cat}: {score:.1f}/10")

print("\nPriority Actions:")
for action in opt_score['priority_actions']:
    print(f"  - {action}")

if 'gap_analysis' in opt_score:
    print("\nHidden Memory Gaps:")
    for gap in opt_score['gap_analysis']:
        print(f"  [{gap['severity']}] {gap['description']}")
