run_cpu_inference

Runs CPU-based inference with a trained model and reports timing and output-distribution metrics.

Parameters
- model: Trained model with a predict_proba method; should support CPU-based inference
- features: Input features DataFrame for inference

Returns
Dictionary containing inference metrics:
- inference_latency_ms: Time taken for inference in milliseconds
- output_mean_probability: Mean of predicted probabilities (for class 1)
- output_std_probability: Standard deviation of predicted probabilities
Example
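A minimal sketch of how the function could be implemented and called, based on the parameter and return descriptions above. The `DummyModel` class is a hypothetical stand-in for any trained classifier exposing `predict_proba`; the function body is an assumption consistent with the documented metrics, not the project's actual implementation.

```python
import time

import numpy as np
import pandas as pd


def run_cpu_inference(model, features: pd.DataFrame) -> dict:
    """Time a CPU predict_proba call and summarise the class-1 outputs."""
    start = time.perf_counter()
    proba = model.predict_proba(features)  # shape (n_samples, n_classes)
    latency_ms = (time.perf_counter() - start) * 1000.0

    class1 = np.asarray(proba)[:, 1]  # probabilities for class 1
    return {
        "inference_latency_ms": latency_ms,
        "output_mean_probability": float(class1.mean()),
        "output_std_probability": float(class1.std()),
    }


class DummyModel:
    """Hypothetical stand-in for a trained binary classifier."""

    def predict_proba(self, X: pd.DataFrame) -> np.ndarray:
        # Logistic function of the first feature column, as a toy example.
        p = 1.0 / (1.0 + np.exp(-X.iloc[:, 0].to_numpy()))
        return np.column_stack([1.0 - p, p])


features = pd.DataFrame({"x0": [0.0, 1.0, -1.0], "x1": [1.0, 2.0, 3.0]})
metrics = run_cpu_inference(DummyModel(), features)
```

The returned dictionary can be logged per request to track latency for performance benchmarking and to watch the prediction distribution drift over time.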
Use Cases
- Performance Benchmarking: Measure inference latency for deployment planning
- Model Monitoring: Track prediction distribution through mean and standard deviation
- CPU Deployment: Optimize and validate CPU-based inference performance