User Value Metrics
UX experiment success metrics must reflect actual user value and behavior — task completion, error rate, retention, or outcome measures — not what is easiest to instrument (clicks, page views, session duration).
$ prime install @community/rule-user-value-metrics

Projection
Always in _index.xml · the agent never has to ask for this.
UserValueMetrics [rule] v1.0.0
UX experiment success metrics must reflect actual user value and behavior — task completion, error rate, retention, or outcome measures — not what is easiest to instrument (clicks, page views, session duration).
Loaded when retrieval picks the atom as adjacent / supporting.
UserValueMetrics [rule] v1.0.0
UX experiment success metrics must reflect actual user value and behavior — task completion, error rate, retention, or outcome measures — not what is easiest to instrument (clicks, page views, session duration).
Severity
warning
Applies When
selecting success metrics for any UX experiment or product change evaluation
Verify By
For each metric, ask: 'Does a positive movement here mean a user actually got value?' Reject metrics that could improve while user outcomes worsen.
Loaded when retrieval picks the atom as a focal / direct hit.
UserValueMetrics [rule] v1.0.0
UX experiment success metrics must reflect actual user value and behavior — task completion, error rate, retention, or outcome measures — not what is easiest to instrument (clicks, page views, session duration).
Severity
warning
Applies When
selecting success metrics for any UX experiment or product change evaluation
Verify By
For each metric, ask: 'Does a positive movement here mean a user actually got value?' Reject metrics that could improve while user outcomes worsen.
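The Verify By question can be run as a mechanical gate over a proposed metric list. This is a minimal sketch, not part of the rule itself: the metric names and the `user_value_linked` flag (the team's honest answer to "does a positive movement here mean a user actually got value?") are illustrative assumptions.

```python
# Convenience metrics the rule flags as easy to instrument but weakly
# tied to user value. Names here are illustrative.
CONVENIENCE_METRICS = {"clicks", "page_views", "session_duration"}

def review_metric(name: str, user_value_linked: bool) -> str:
    """Apply the Verify By check to one candidate success metric.

    `user_value_linked` is True only when the metric cannot improve
    while user outcomes worsen.
    """
    if name in CONVENIENCE_METRICS and not user_value_linked:
        return "reject"
    return "accept" if user_value_linked else "needs-justification"

print(review_metric("clicks", False))          # reject
print(review_metric("task_completion", True))  # accept
```

Any metric that is neither a known convenience metric nor clearly value-linked lands in "needs-justification" rather than passing silently, which keeps the burden of proof on the metric's proposer.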
Examples
- Good: task completion rate, error rate on key flows, 30-day retention, activation metric (first meaningful action).
- Good: time-to-complete for a core task (shorter = more efficient, but validate it's not just skipping steps).
- Bad: 'clicks on this button increased 30%' — users may be clicking because it's confusing, not because it's working.
- Bad: 'page views increased' after adding a modal — the modal creates extra views without delivering value.
- Bad: 'session duration increased' — could mean users are lost, not engaged.
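The good metrics above can be computed from ordinary event logs. A minimal sketch, assuming a hypothetical `(user_id, event)` log format and invented event names (`task_started`, `task_completed`, `error`):

```python
from collections import Counter

# Hypothetical event log; entries and event names are illustrative.
events = [
    ("u1", "task_started"), ("u1", "task_completed"),
    ("u2", "task_started"), ("u2", "error"), ("u2", "task_completed"),
    ("u3", "task_started"),  # abandoned: started but never completed
]

def task_completion_rate(events):
    """Share of users who completed the task among those who started it."""
    started = {u for u, e in events if e == "task_started"}
    completed = {u for u, e in events if e == "task_completed"}
    return len(completed & started) / len(started) if started else 0.0

def error_rate(events):
    """Errors per task start: a proxy for friction on the key flow."""
    counts = Counter(e for _, e in events)
    starts = counts["task_started"]
    return counts["error"] / starts if starts else 0.0

print(task_completion_rate(events))  # 2 of 3 starters completed
print(error_rate(events))            # 1 error per 3 starts
```

Both measures move only when a user's actual outcome changes, which is what distinguishes them from click or page-view counts.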
Rationale
Convenience metrics (clicks, page views) can improve while user success decreases, producing false-positive experiment results. Teams that optimize for convenience metrics systematically mislead themselves.
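The false-positive failure mode is easy to see in numbers. A toy illustration with invented figures: a variant that "wins" on clicks by 30% while fewer users actually complete the task.

```python
# Invented A/B results for illustration only.
control = {"users": 1000, "clicks": 1200, "completions": 640}
variant = {"users": 1000, "clicks": 1560, "completions": 540}

def per_user(arm):
    """Clicks per user and completion rate for one experiment arm."""
    return arm["clicks"] / arm["users"], arm["completions"] / arm["users"]

ctrl_clicks, ctrl_done = per_user(control)
var_clicks, var_done = per_user(variant)

print(var_clicks > ctrl_clicks)  # True: clicks are up 30%
print(var_done < ctrl_done)      # True: task completion is down
```

Judged on clicks alone, the variant ships; judged on the outcome metric, it is a regression.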
Source
prime-system/examples/frontend-design/primes/compiled/@community/rule-user-value-metrics/atom.yaml