trainStep
Executes an asynchronous neural network training step from user interaction data.
This method processes a single user interaction, converts it to neural network training data, and performs one iteration of supervised learning via backpropagation.
Training Pipeline (sketched below):
Validation: Verify engine initialization state
Preprocessing: Convert InteractionData to feature vector
Target Generation: Create expected output values for supervision
Forward Pass: Compute current model prediction
Backward Pass: Calculate gradients and update weights
Bookkeeping: Update training statistics and history
Cleanup: Maintain bounded memory usage
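A minimal sketch of how these stages might compose, assuming kotlinx.coroutines; InteractionData's fields and the helper names (extractFeatures, makeTarget, forward, backward, recordStep, trimHistory) are hypothetical stand-ins, not the engine's real internals:

import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.Job
import kotlinx.coroutines.SupervisorJob
import kotlinx.coroutines.launch
import kotlinx.coroutines.sync.Mutex
import kotlinx.coroutines.sync.withLock

// Hypothetical input type; field names are assumptions, not the real API.
data class InteractionData(
    val actionType: String,
    val timestampMillis: Long,
    val userId: String,
    val context: String?,
)

class TrainingEngine {
    private val engineScope = CoroutineScope(SupervisorJob() + Dispatchers.Default)
    private val weightsMutex = Mutex()
    private var initialized = true

    fun trainStep(interaction: InteractionData): Job {
        // 1. Validation: hand back a completed no-op Job when uninitialized.
        if (!initialized) return Job().also { it.complete() }

        return engineScope.launch {
            try {
                val input = extractFeatures(interaction)   // 2. Preprocessing
                val target = makeTarget(interaction)       // 3. Target generation
                weightsMutex.withLock {
                    val prediction = forward(input)        // 4. Forward pass
                    backward(prediction, target)           // 5. Backward pass
                }
                recordStep(interaction)                    // 6. Bookkeeping
                trimHistory()                              // 7. Cleanup: bounded history
            } catch (e: Exception) {
                println("trainStep failed: ${e.message}")  // graceful degradation
            }
        }
    }

    // Stubs standing in for the real implementations.
    private fun extractFeatures(d: InteractionData) = FloatArray(4)
    private fun makeTarget(d: InteractionData) = FloatArray(1)
    private fun forward(x: FloatArray) = FloatArray(1)
    private fun backward(prediction: FloatArray, target: FloatArray) {}
    private fun recordStep(d: InteractionData) {}
    private fun trimHistory() {}
}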
Feature Engineering:
input[0] = action_type_hash // Normalized interaction type [-1, 1]
input[1] = timestamp_normalized // Time-of-day feature [0, 1]
input[2] = user_hash // User identification [-1, 1]
input[3] = context_indicator // Context presence [0, 1]
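One plausible implementation of this layout, reusing the hypothetical InteractionData from the pipeline sketch above; the hashing and normalization details are assumptions:

// Maps the documented four-feature layout onto a FloatArray.
fun extractFeatures(d: InteractionData): FloatArray {
    val dayMillis = 86_400_000L
    return floatArrayOf(
        hashToUnitRange(d.actionType),                          // input[0], [-1, 1]
        (d.timestampMillis % dayMillis).toFloat() / dayMillis,  // input[1], [0, 1]
        hashToUnitRange(d.userId),                              // input[2], [-1, 1]
        if (d.context != null) 1f else 0f,                      // input[3], {0, 1}
    )
}

// Folds an arbitrary string hash into [-1, 1].
private fun hashToUnitRange(s: String): Float = (s.hashCode() % 1000) / 1000f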
Asynchronous Execution:
Returns immediately with a Job for non-blocking operation
Training executes in background using engineScope
Multiple training steps can execute concurrently
Mutex ensures thread-safe weight updates
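A sketch of the mutex-guarded update that makes concurrent steps safe; the names (WeightStore, applyGradients, learningRate) are illustrative only:

import kotlinx.coroutines.sync.Mutex
import kotlinx.coroutines.sync.withLock

class WeightStore(size: Int, private val learningRate: Float = 0.01f) {
    private val weights = FloatArray(size)
    private val mutex = Mutex()

    // Concurrent training steps queue here instead of racing on the arrays.
    suspend fun applyGradients(grads: FloatArray) = mutex.withLock {
        for (i in weights.indices) weights[i] -= learningRate * grads[i]
    }
}

Because the lock covers only the weight mutation, feature extraction and bookkeeping for different steps can still run concurrently.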
Error Handling:
Initialization check prevents invalid operations
Exception handling with comprehensive logging
Graceful degradation on training failures
No-op job returned for uninitialized engine
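The no-op path can be an already-completed Job, so callers may join() unconditionally; a sketch:

import kotlinx.coroutines.CompletableJob
import kotlinx.coroutines.Job

// Already complete: join() returns immediately and isCompleted is true.
fun noopJob(): CompletableJob = Job().also { it.complete() }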
Performance Optimizations:
Bounded training history (1000 examples max; sketched below)
Periodic logging to reduce I/O overhead
Memory-efficient data structures
Lazy evaluation of expensive operations
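A sketch of the bounded-history idea with the documented 1000-example cap; TrainingExample and its fields are hypothetical:

data class TrainingExample(
    val input: FloatArray,
    val target: FloatArray,
    val timestampMillis: Long,
)

const val MAX_HISTORY = 1000
val history = ArrayDeque<TrainingExample>()

fun remember(example: TrainingExample) {
    history.addLast(example)
    while (history.size > MAX_HISTORY) history.removeFirst()  // evict oldest first
}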
Training Metrics:
Incremental training step counter
Loss calculation and logging every 10 steps (see the sketch below)
Interaction count tracking for analytics
Training example timestamp recording
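This bookkeeping might look like the following; the 10-step logging cadence comes from the text, everything else is illustrative:

object TrainingMetrics {
    var steps = 0L
        private set

    // Count every step; log loss only on every 10th to limit I/O.
    fun record(loss: Float) {
        steps++
        if (steps % 10 == 0L) println("trainStep #$steps loss=$loss")
    }
}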
Return
Job representing the asynchronous training operation. Callers can join this job to wait for training completion, or treat it as fire-and-forget (usage sketched below).
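Caller-side usage, reusing the hypothetical TrainingEngine and InteractionData from the pipeline sketch above:

import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    val engine = TrainingEngine()
    val sample = InteractionData("click", System.currentTimeMillis(), "user-1", context = null)

    engine.trainStep(sample)          // fire-and-forget: returns immediately
    engine.trainStep(sample).join()   // or join to wait for this step to finish
}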
Parameters
InteractionData containing user interaction context, timing, and behavioral information for supervised learning
See also
Feature extraction from interaction data
Target value generation for supervision
Neural network forward propagation
Gradient computation and weight updates