The Timeline page is the cross-signal change log for a project. It is designed to answer the question: what changed, when, and what else was happening at the same time? Instead of looking at AI Search, community activity, owned-content changes, and action outcomes in isolation, the Timeline page places them on one shared chronology.

What the Timeline Page Shows

The page combines filters, summary cards, a shared timeline chart, and event lists.

Summary cards

At the top of the page, DevTune summarizes the selected window with counts such as:
  • Annotations
  • Content Changes
  • Interventions
  • Page Outcome Citations

Downstream Context card

The Downstream Context card appears below the timeline and connects your GTM activities to measurable business outcomes, bridging the gap between upstream work (content changes, community engagement) and downstream results (adoption, AI citations). The card displays two metrics:

Adoption Tracking shows the number of tracked domains (GitHub repos, package registries) and the most significant adoption trend in your competitive set. This validates whether your GTM efforts translate into actual package downloads or repository growth.

Action Outcomes includes:
  • Correlated Interventions: Actions you completed where the system detected a measurable citation increase afterward
  • Page Outcome Citations: Total AI citations received by URLs you specifically targeted with Actions

Understanding Interventions

An intervention is created automatically when you mark an Action as “Done.” It tracks:
  • The target URL you were working on (e.g., a docs page)
  • Baseline citation metrics before your change
  • Content hash at completion time
When your domain is crawled after the change, the system compares new citation counts to the baseline. If citations increased, the intervention is marked as correlated — validating that your Action had measurable impact.
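The correlation check described above can be sketched as a simple comparison against the stored baseline. This is an illustrative model only; field names like `baseline_citations` and `content_hash` are assumptions, not DevTune's actual schema.

```python
from dataclasses import dataclass

@dataclass
class Intervention:
    """Snapshot taken when an Action is marked Done (illustrative fields)."""
    target_url: str
    baseline_citations: int   # citation count before your change
    content_hash: str         # hash of the page content at completion time

def is_correlated(intervention: Intervention, current_citations: int) -> bool:
    """After a post-change crawl, the intervention is marked correlated
    if the new citation count exceeds the stored baseline."""
    return current_citations > intervention.baseline_citations

# Example: a docs page had 4 citations at completion time and 9 after the next crawl
iv = Intervention("https://example.com/docs/auth", baseline_citations=4, content_hash="abc123")
print(is_correlated(iv, current_citations=9))  # True: citations increased
```

The point of the baseline snapshot is that "increase" is always measured relative to the moment the Action was completed, not an arbitrary earlier date.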

Understanding Page Outcome Citations

A page outcome citation is an AI citation received by a specific page URL that you’ve taken action on. Here’s how it works:
  1. You complete an Action targeting a specific URL (e.g., “Update authentication docs”)
  2. The system creates an intervention and monitors that URL
  3. When AI platforms cite that URL in responses, it’s counted as a page outcome citation
  4. The system tracks daily citation counts for all your intervention URLs
Example: You improve your “Getting Started” page on March 1st. Between March 5 and March 15, that specific page is cited 12 times across ChatGPT, Claude, and Perplexity. Your page outcome citation count for that window is 12.

This metric answers: “Are the specific pages I worked on actually getting cited more in AI search?”
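Counting page outcome citations for a window amounts to filtering citation events down to intervention URLs and dates. The sketch below is a hypothetical model, assuming citation events arrive as (URL, date) pairs; DevTune's internal representation may differ.

```python
from datetime import date

def page_outcome_citations(events, intervention_urls, start, end):
    """Count AI citations that hit a tracked intervention URL within the window.

    `events` is a list of (cited_url, citation_date) pairs.
    """
    return sum(
        1 for url, day in events
        if url in intervention_urls and start <= day <= end
    )

events = [
    ("https://example.com/getting-started", date(2025, 3, 6)),
    ("https://example.com/getting-started", date(2025, 3, 12)),
    ("https://example.com/other-page", date(2025, 3, 8)),  # not targeted by an Action
]
count = page_outcome_citations(
    events,
    intervention_urls={"https://example.com/getting-started"},
    start=date(2025, 3, 5),
    end=date(2025, 3, 15),
)
print(count)  # 2: only citations of the targeted URL inside the window count
```

Citations of pages you never targeted with an Action are excluded, which is what makes the metric page-specific rather than a domain-wide total.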

Timeline chart

The main chart overlays multiple signal types across the same date range so you can see whether movement in one part of the system lines up with another. Outcome markers appear as visual indicators showing when Actions were completed and when content changes were detected on specific URLs.

Event lists

Below the chart, DevTune breaks the period into event streams such as:
  • Manual annotations
  • Owned-content changes
  • Community-linked movement
  • Outcome / intervention context

Understanding Outcome Markers vs Completed Actions

Completed Actions are work items your team finished (e.g., “Update authentication docs page”). When you mark an Action as Done, the system creates both an intervention (which tracks citation metrics for correlation) and an outcome marker (a visual indicator on the timeline). Outcome markers are visual timeline indicators showing two distinct event types:
  • Action completed markers (blue dots) — When Actions targeting specific URLs were marked Done
  • Change detected markers (amber/green dots) — When domain crawls detected content changes on those URLs
A single Completed Action creates one action-completed marker. However, a URL can accumulate multiple markers over time: your initial action completion, followed by subsequent content changes detected during crawls. This helps you visualize the sequence: “We completed work on March 1st, the page changed on March 5th, then changed again on March 10th.”

The key distinction: Completed Actions are work items in your workflow; outcome markers are timeline visualizations showing when events occurred on specific URLs.
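The marker accumulation described above can be modeled as one action-completed event followed by zero or more change-detected events per URL. The types and field names below are illustrative assumptions, not DevTune's data model.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Marker:
    """One timeline marker on a specific URL (illustrative model)."""
    url: str
    kind: str   # "action_completed" (blue) or "change_detected" (amber/green)
    day: date

def markers_for_url(markers, url):
    """Return a URL's markers in chronological order, recovering the
    'completed March 1, changed March 5, changed March 10' sequence."""
    return sorted((m for m in markers if m.url == url), key=lambda m: m.day)

timeline = [
    Marker("https://example.com/docs/auth", "action_completed", date(2025, 3, 1)),
    Marker("https://example.com/docs/auth", "change_detected", date(2025, 3, 5)),
    Marker("https://example.com/docs/auth", "change_detected", date(2025, 3, 10)),
]
sequence = [m.kind for m in markers_for_url(timeline, "https://example.com/docs/auth")]
print(sequence)  # ['action_completed', 'change_detected', 'change_detected']
```

Sorting by date per URL is what lets a single completed Action be followed by several change-detected markers as later crawls pick up edits to the same page.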

Filters

The Timeline page follows the same shared controls as the AI Search and Community Discourse surfaces:
  • Source
  • Date range
These filters narrow both the chart and the underlying event lists.

Annotations

The Timeline page supports manual annotations so your team can mark launches, docs updates, campaigns, migrations, or other events that may explain downstream changes. This is especially useful when you want to separate deliberate work from ambient market movement.

How to Use It

The best use of the Timeline page is causal investigation:
  1. Choose a date range where you saw a notable gain or drop
  2. Review the timeline chart for movement across signals
  3. Inspect the event lists around the same period
  4. Add annotations for launches or changes that are missing context
  5. Use the evidence to decide whether the movement was caused by your work, competitor movement, or broader discussion/activity shifts

Best Practices

  • Use annotations consistently for launches, migrations, major docs updates, and campaign starts
  • Review Timeline when a metric changes sharply and the cause is not obvious from a single page
  • Pair it with Actions and Owned Content when you want to connect shipped work to downstream signal movement

Next Steps