MinConfidence

data class MinConfidence(val threshold: Float) : KarlInstruction()

Sets the minimum confidence threshold for prediction generation and presentation.

This instruction controls the quality bar for predictions by specifying the minimum confidence level required before a prediction is generated or presented to the user. It helps balance between providing helpful suggestions and avoiding noise from uncertain or unreliable predictions.

Confidence threshold effects and implications:

Quality vs. quantity trade-offs:

  • Higher thresholds: Fewer but more reliable predictions

  • Lower thresholds: More predictions but potentially less accurate

  • Optimal thresholds: Balanced based on user tolerance for incorrect suggestions

Dynamic threshold considerations:

  • Learning maturity: Lower thresholds acceptable for new systems building confidence

  • Domain criticality: Higher thresholds for high-stakes decisions or actions

  • User expertise: Experienced users may prefer lower thresholds for more options

  • Context sensitivity: Different thresholds for different prediction types or scenarios

Threshold range interpretation:

  • 0.0 - 0.3: Very permissive, includes experimental and exploratory predictions

  • 0.3 - 0.5: Moderate filtering, balances exploration with reliability

  • 0.5 - 0.7: Conservative approach, focuses on well-supported predictions

  • 0.7 - 1.0: Very conservative, only high-confidence predictions are shown
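The ranges above can be summarized in code. The helper below is a hypothetical illustration of the interpretation table, not part of the KARL API:

```kotlin
// Hypothetical helper mapping a threshold to the range descriptions above.
// Not part of the KARL API; shown only to make the ranges concrete.
fun describeThreshold(threshold: Float): String = when {
    threshold < 0.0f || threshold > 1.0f ->
        throw IllegalArgumentException("threshold must be in [0.0, 1.0], was $threshold")
    threshold < 0.3f -> "very permissive"   // experimental and exploratory predictions
    threshold < 0.5f -> "moderate filtering" // balances exploration with reliability
    threshold < 0.7f -> "conservative"       // well-supported predictions only
    else -> "very conservative"              // only high-confidence predictions shown
}
```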

Implementation behavior:

  • Predictions below the threshold are suppressed and not returned to applications

  • The instruction affects all prediction types unless overridden by type-specific instructions

  • Confidence calibration may adjust actual thresholds based on historical accuracy

  • Alternative suggestions may still be provided even if primary prediction is suppressed
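The suppression behavior amounts to a confidence filter over candidate predictions. The sketch below assumes a minimal `Prediction` type with a confidence score; the engine's actual types and filtering logic may differ:

```kotlin
// Minimal sketch of threshold-based suppression. The Prediction type here
// is an assumption for illustration, not the library's actual model.
data class Prediction(val label: String, val confidence: Float)

// Predictions below the threshold are dropped and never reach the application.
fun filterByConfidence(predictions: List<Prediction>, threshold: Float): List<Prediction> =
    predictions.filter { it.confidence >= threshold }
```

With a threshold of 0.6f, a candidate at 0.4 confidence is suppressed while one at 0.8 is returned.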

Adaptive threshold strategies:

  • Learning-based: Automatically adjust thresholds based on prediction accuracy feedback

  • Context-aware: Use different thresholds for different application scenarios

  • User-adaptive: Learn individual user tolerance for prediction accuracy

  • Time-varying: Adjust thresholds based on system maturity and data availability
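A learning-based strategy could be sketched as a threshold that is nudged up after incorrect accepted predictions and down after correct ones. The class name, step size, and feedback signal below are assumptions for illustration, not a KARL feature:

```kotlin
// Illustrative learning-based threshold adjustment: raise the bar after
// incorrect predictions, lower it after correct ones, clamped to [0.0, 1.0].
// All names and the step size are assumptions, not part of the KARL API.
class AdaptiveThreshold(initial: Float, private val step: Float = 0.02f) {
    var value: Float = initial
        private set

    fun recordFeedback(predictionWasCorrect: Boolean) {
        value = (if (predictionWasCorrect) value - step else value + step)
            .coerceIn(0.0f, 1.0f)
    }
}
```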

Example usage:

val instructions = listOf(
    KarlInstruction.MinConfidence(0.6f), // Only show predictions with 60%+ confidence
    // Other customization instructions...
)

Throws

IllegalArgumentException

if threshold is not between 0.0 and 1.0 inclusive
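The documented exception suggests a range check at construction time. The class body below is a plausible sketch of that validation, not the library's actual source:

```kotlin
// Sketch of the documented range validation; the real class body may differ.
data class MinConfidence(val threshold: Float) {
    init {
        // Reject any threshold outside the documented [0.0, 1.0] range.
        require(threshold in 0.0f..1.0f) {
            "threshold must be between 0.0 and 1.0 inclusive, was $threshold"
        }
    }
}
```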

Constructors

constructor(threshold: Float)

Properties

val threshold: Float

A floating-point value between 0.0 and 1.0 representing the minimum confidence level required for prediction generation. Values closer to 1.0 result in fewer but more reliable predictions, while values closer to 0.0 result in more predictions with potentially lower accuracy.