Reviewer UI patterns

This page is for teams building a review experience on top of AetherFS.

The core idea is simple:

  • the session is the review unit
  • the manifest is the tree view
  • annotations are the review notes
  • approvals are the workflow gate
  • events and health provide supporting context

What a reviewer UI usually needs

A strong reviewer UI usually answers these questions:

  • What session am I reviewing?
  • What changed or what should I inspect?
  • What does the filesystem tree look like?
  • What comments or review notes already exist?
  • Is the session healthy enough to approve?
  • What happened recently in this session?
  • Can I approve, deny, or ask for more work?

Core route families for a reviewer UI

Session detail

Use:

  • GET /v1/sessions
  • GET /v1/sessions/{sessionId}

These routes drive:

  • review queues
  • reviewer landing pages
  • status chips such as pending approval or conflict

File tree

Use:

  • GET /v1/sessions/{sessionId}/manifest

This route drives the review tree and path navigation.
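As a sketch, a flat manifest can be folded into a nested tree for the left pane. This assumes each manifest entry exposes a slash-separated path string such as `src/app/main.ts`; the real manifest response shape is not specified on this page.

```typescript
interface TreeNode {
  name: string;
  children: Map<string, TreeNode>;
}

// Fold flat slash-separated paths into a nested tree for the review UI.
function buildTree(paths: string[]): TreeNode {
  const root: TreeNode = { name: "", children: new Map() };
  for (const path of paths) {
    let node = root;
    for (const part of path.split("/")) {
      let child = node.children.get(part);
      if (!child) {
        child = { name: part, children: new Map() };
        node.children.set(part, child);
      }
      node = child;
    }
  }
  return root;
}
```

Keeping children in a `Map` preserves manifest order while still allowing fast lookup when the user navigates by path.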

Review notes

Use:

  • GET /v1/sessions/{sessionId}/annotations
  • POST /v1/sessions/{sessionId}/annotations
  • PATCH /v1/sessions/{sessionId}/annotations/{annotationId}

These routes drive:

  • comment threads
  • unresolved comment lists
  • per-file review panes
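The file-scoped and unresolved-only views above reduce to simple client-side filters. The `filePath` and `resolved` field names here are assumptions about the annotation shape, not a documented contract:

```typescript
// Hypothetical annotation shape; real field names may differ.
interface Annotation {
  id: string;
  filePath: string;
  resolved: boolean;
  body: string;
}

// Comments for the per-file review pane.
function forFile(all: Annotation[], filePath: string): Annotation[] {
  return all.filter((a) => a.filePath === filePath);
}

// The unresolved-only view.
function unresolvedOnly(all: Annotation[]): Annotation[] {
  return all.filter((a) => !a.resolved);
}
```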

Approval controls

Use:

  • POST /v1/sessions/{sessionId}/request-approval
  • POST /v1/sessions/{sessionId}/approvals/{approvalId}/grant
  • POST /v1/sessions/{sessionId}/approvals/{approvalId}/deny

These routes drive the actual workflow transition.
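One way to keep these workflow paths in a single place is a small request-descriptor helper. A minimal sketch; the grant/deny request body shape (a `feedback` field) is an assumption, not documented here:

```typescript
type ApprovalDecision = "grant" | "deny";

// Descriptor for POST /v1/sessions/{sessionId}/request-approval.
function requestApproval(sessionId: string): { method: "POST"; path: string } {
  return { method: "POST", path: `/v1/sessions/${sessionId}/request-approval` };
}

// Descriptor for the grant/deny routes; `feedback` is an assumed body field.
function approvalRequest(
  sessionId: string,
  approvalId: string,
  decision: ApprovalDecision,
  feedback?: string,
): { method: "POST"; path: string; body?: { feedback: string } } {
  return {
    method: "POST",
    path: `/v1/sessions/${sessionId}/approvals/${approvalId}/${decision}`,
    body: feedback ? { feedback } : undefined,
  };
}
```

Centralizing the paths this way keeps grant and deny buttons from drifting out of sync with the route family.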

Supporting context

Use:

  • GET /v1/sessions/{sessionId}/health
  • GET /v1/sessions/{sessionId}/analytics
  • GET /v1/sessions/{sessionId}/events
  • GET /v1/sessions/{sessionId}/bus

These routes give the reviewer context without overloading the core tree-and-comment surface.

Recommended reviewer screens

1. Review queue

Show:

  • session ID or task label
  • current session status
  • whether approval is pending
  • summary tags
  • last activity time
  • last known health

Primary route:

  • GET /v1/sessions
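A common queue ordering puts sessions with a pending approval first, then sorts by most recent activity. The field names below are assumptions about the session list shape:

```typescript
// Hypothetical session fields for the queue; real names may differ.
interface QueueSession {
  id: string;
  approvalPending: boolean;
  lastActivityAt: number; // epoch milliseconds
}

// Pending approvals float to the top; otherwise most recent activity first.
function orderQueue(sessions: QueueSession[]): QueueSession[] {
  return [...sessions].sort((a, b) => {
    if (a.approvalPending !== b.approvalPending) {
      return a.approvalPending ? -1 : 1;
    }
    return b.lastActivityAt - a.lastActivityAt;
  });
}
```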

2. Session review page

Recommended layout:

  • header: session metadata, status, health, approval state
  • left pane: manifest tree
  • main pane: selected file review surface
  • side pane: annotations, event history, approval controls

This layout maps cleanly onto the public model instead of flattening everything into a single feed.

3. Annotation-focused review panel

Good reviewer UIs usually support:

  • all comments for the session
  • comments only for the selected file
  • unresolved-only view
  • resolve/unresolve action

Suggested initial load order

  1. Fetch session detail.
  2. Fetch manifest for the initial tree.
  3. Fetch annotations for the session or selected file.
  4. Fetch health if you display quality state.
  5. Fetch recent events or recent bus activity if you display timeline context.

That sequence gets the core page usable quickly without forcing all secondary context to block first paint.
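The sequence above can be sketched with an injected fetcher, so the logic is testable without a server: steps 1–2 block first paint, while annotations and the secondary context load in parallel. The paths come from this page; response shapes are left as `unknown` because they are not specified here.

```typescript
type Fetcher = (path: string) => Promise<unknown>;

async function loadReviewPage(fetchJson: Fetcher, sessionId: string) {
  // Steps 1-2: session detail and manifest gate first paint.
  const session = await fetchJson(`/v1/sessions/${sessionId}`);
  const manifest = await fetchJson(`/v1/sessions/${sessionId}/manifest`);
  // Steps 3-5: annotations and secondary context load in parallel.
  const [annotations, health, events] = await Promise.all([
    fetchJson(`/v1/sessions/${sessionId}/annotations`),
    fetchJson(`/v1/sessions/${sessionId}/health`),
    fetchJson(`/v1/sessions/${sessionId}/events`),
  ]);
  return { session, manifest, annotations, health, events };
}
```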

Handling stale review state

The biggest review bug is stale commentary on newer content.

The public annotation model already helps with this by requiring expectedContentHash at annotation creation time.

Reviewer UI guidance:

  • track the file content hash or version context internally
  • show when the selected file has changed since a note was created
  • avoid pretending an old annotation definitely applies to the current content
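The guidance above reduces to a hash comparison: a note whose recorded hash no longer matches the current content may no longer apply. `expectedContentHash` is named on this page; the note shape and the source of the current hash are assumptions.

```typescript
// Hypothetical note shape; `expectedContentHash` comes from the public model.
interface Note {
  id: string;
  expectedContentHash: string;
}

// Split notes into ones that still match the current content and ones
// that should be visibly flagged as possibly stale.
function splitByStaleness(notes: Note[], currentContentHash: string) {
  const fresh: Note[] = [];
  const possiblyStale: Note[] = [];
  for (const note of notes) {
    (note.expectedContentHash === currentContentHash ? fresh : possiblyStale).push(note);
  }
  return { fresh, possiblyStale };
}
```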

Approval UX guidance

When approval is required, the UI should show:

  • why approval was requested
  • what the reviewer is expected to decide
  • enough file/tree/annotation context to make that decision

When the reviewer grants:

  • show the durable next step clearly

When the reviewer denies:

  • capture concrete feedback
  • preserve it in the workflow so the user or agent can act on it

A strong deny flow includes:

  • required or strongly encouraged feedback text
  • links back to unresolved annotations
  • a visible path to request approval again later
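A sketch of enforcing the "required feedback" part of that flow before the deny request is sent. The draft shape is hypothetical:

```typescript
// Hypothetical draft shape for a deny action in the UI.
interface DenyDraft {
  feedback: string;
  unresolvedAnnotationIds: string[];
}

// Return a list of problems; an empty list means the deny can be sent.
function validateDeny(draft: DenyDraft): string[] {
  const problems: string[] = [];
  if (draft.feedback.trim().length === 0) {
    problems.push("Feedback is required when denying.");
  }
  return problems;
}
```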

Use health carefully

Health is useful, but it should not automatically replace human review.

Use health as:

  • a signal
  • a filter
  • a confidence indicator

Do not treat STATUS_PASSING as equivalent to “approve automatically” unless your product truly wants that policy.
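Rendering health as a badge rather than a decision keeps it a signal, not a gate. `STATUS_PASSING` appears on this page; the other status names below are assumptions and may not match the real enum.

```typescript
// Map a health status string to a display badge; never auto-approve from it.
function healthBadge(status: string): { label: string; tone: "ok" | "warn" | "unknown" } {
  switch (status) {
    case "STATUS_PASSING":
      return { label: "Healthy", tone: "ok" };
    case "STATUS_FAILING": // assumed name, not confirmed by this page
      return { label: "Failing", tone: "warn" };
    default:
      return { label: "Unknown", tone: "unknown" };
  }
}
```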

Use analytics carefully

Analytics can help reviewers understand:

  • how much churn happened
  • whether restores occurred
  • whether the work was unusually noisy or unstable

But analytics should remain supporting context, not the main review surface.

Event log versus bus in reviewer UX

Use the event log for:

  • durable history
  • “what happened” timelines
  • support and audit views

Use the bus for:

  • live progress while a session is still active
  • watch mode experiences
  • supervisor or operator views
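If a view does render both sources together, each entry can carry an explicit `source` tag so the timeline stays differentiated rather than becoming one undifferentiated feed. The entry shapes here are assumptions:

```typescript
// Hypothetical entry shape shared by both sources.
interface RawEntry {
  at: number; // epoch milliseconds
  summary: string;
}

type TimelineEntry = RawEntry & { source: "event" | "bus" };

// Merge durable events and live bus messages, keeping the source visible.
function mergeTimelines(events: RawEntry[], bus: RawEntry[]): TimelineEntry[] {
  const tagged: TimelineEntry[] = [
    ...events.map((e) => ({ ...e, source: "event" as const })),
    ...bus.map((b) => ({ ...b, source: "bus" as const })),
  ];
  return tagged.sort((a, b) => a.at - b.at);
}
```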

Reviewer UI anti-patterns

Avoid:

  • treating annotations as generic chat
  • treating session status as the only review signal
  • using tags instead of explicit approval state
  • mixing live bus messages and durable event history into one undifferentiated timeline

If you want the smallest solid review experience, build:

  1. review queue from sessions
  2. tree from manifest
  3. comments from annotations
  4. approval controls
  5. health badge

That gets you a useful review product without inventing private server concepts.

See also: