
How to Track User Engagement Metrics Directly From Your Database

Marcus Chen · Solutions Engineer · March 25, 2026 · 8 min read

Product analytics tools are convenient until they're not. Mixpanel, Amplitude, and Heap do a great job of tracking front-end events: clicks, page views, funnel steps. But your actual product data lives in your database: subscription records, feature usage tied to server-side logic, support tickets, billing events, session durations computed server-side.

When you need to answer questions like "which users are most active but haven't upgraded" or "how does engagement correlate with retention at 30 days," your product analytics tool often can't answer them without a complex custom event setup that someone built months ago. Or maybe never built.

The cleanest version of user engagement tracking isn't a separate analytics service; it's your database, queried well. This guide walks through the key engagement metrics worth tracking, how to calculate them directly from your database, and how to keep those metrics flowing without a standing SQL request queue.

Why Your Database Is the Best Source of Engagement Data

Most product analytics tools work by collecting events that you instrument on the front end. This creates two common problems:

Coverage gaps. If your team didn't instrument a specific action (say, a user completing a background import), it's invisible to your analytics tool. Your database, on the other hand, records the output of that action whether you thought to track it or not.

Metric drift. When your product changes, event names need to be updated in the instrumentation, the analytics platform, and any dashboards that reference them. It's common for these to fall out of sync. A database query always reflects reality because it reads the actual records.

For server-side products (APIs, SaaS platforms, data tools), your database often has more reliable engagement signal than a client-side event stream. Querying it directly gives you:

  • Sessions and logins from your auth tables
  • Feature usage from action logs or job records
  • Collaboration signals from shared resources, comments, invitations
  • Subscription and upgrade events from your billing tables
  • Support touchpoints from ticket tables

Let's go through the metrics that matter most and how to build them.

    The 5 Engagement Metrics Worth Tracking

    1. Daily and Monthly Active Users (DAU / MAU)

    The most fundamental engagement metric. An "active user" is any user who performed a meaningful action in your product within a given window.

    What counts as "meaningful" depends on your product. For a data tool, it might be running a query. For a project management tool, it might be any action other than login. Define it once and be consistent.

    -- Daily active users for the past 30 days
    SELECT
      DATE(created_at) AS day,
      COUNT(DISTINCT user_id) AS dau
    FROM user_events
    WHERE event_type = 'query_run'
      AND created_at >= NOW() - INTERVAL '30 days'
    GROUP BY DATE(created_at)
    ORDER BY day DESC;

    Your DAU:MAU ratio is a useful stickiness indicator. A ratio above 0.20 means users are coming back most days; that's strong for B2B. Under 0.10 means engagement is sporadic and worth investigating.

    -- DAU:MAU ratio (stickiness)
    WITH dau AS (
      SELECT COUNT(DISTINCT user_id) AS daily_users
      FROM user_events
      WHERE created_at >= CURRENT_DATE  -- today so far; use a full prior day for stable numbers
    ),
    mau AS (
      SELECT COUNT(DISTINCT user_id) AS monthly_users
      FROM user_events
      WHERE created_at >= NOW() - INTERVAL '30 days'
    )
    SELECT
      dau.daily_users,
      mau.monthly_users,
      ROUND(dau.daily_users::numeric / NULLIF(mau.monthly_users, 0), 3) AS dau_mau_ratio
    FROM dau, mau;

    2. Feature Adoption Rate

    Which features are actually being used? This matters more than total logins because shallow adoption (users logging in but not using core features) is a warning sign for churn.

    -- Feature adoption: % of active users who used each feature last 30 days
    WITH active_users AS (
      SELECT DISTINCT user_id
      FROM user_events
      WHERE created_at >= NOW() - INTERVAL '30 days'
    ),
    feature_users AS (
      SELECT feature_name, COUNT(DISTINCT user_id) AS users_who_used
      FROM user_events
      WHERE created_at >= NOW() - INTERVAL '30 days'
      GROUP BY feature_name
    )
    SELECT
      f.feature_name,
      f.users_who_used,
      (SELECT COUNT(*) FROM active_users) AS total_active,
      ROUND(100.0 * f.users_who_used
            / NULLIF((SELECT COUNT(*) FROM active_users), 0), 1) AS adoption_pct
    FROM feature_users f
    ORDER BY adoption_pct DESC;

    3. User Engagement Score

    An engagement score rolls multiple signals into a single number per user. This is useful for:

  • Identifying power users to interview or convert to advocates
  • Spotting at-risk accounts before they churn
  • Prioritizing outreach for expansion opportunities

    A simple version assigns points to each action type:

    -- Engagement score per user, last 30 days
    SELECT
      u.id AS user_id,
      u.email,
      u.plan_type,
      SUM(
        CASE event_type
          WHEN 'login'          THEN 1
          WHEN 'query_run'      THEN 3
          WHEN 'dashboard_view' THEN 2
          WHEN 'alert_created'  THEN 5
          WHEN 'invite_sent'    THEN 4
          ELSE 1
        END
      ) AS engagement_score,
      COUNT(*) AS total_events,
      MAX(e.created_at) AS last_active
    FROM users u
    JOIN user_events e ON e.user_id = u.id
    WHERE e.created_at >= NOW() - INTERVAL '30 days'
    GROUP BY u.id, u.email, u.plan_type
    ORDER BY engagement_score DESC;

    Tune the weights based on what correlates with retention in your product. If dashboard creation is strongly predictive of month-2 retention, weight it heavily.
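
    One way to sanity-check the weights is to bucket users by their first-30-day score and compare month-2 retention across buckets. A sketch against the same hypothetical users and user_events tables, with illustrative weights:

    -- Month-2 retention by first-month engagement-score quartile (sketch)
    WITH scored AS (
      SELECT
        u.id AS user_id,
        u.created_at AS signup_at,
        COALESCE(SUM(
          CASE e.event_type
            WHEN 'query_run'      THEN 3
            WHEN 'dashboard_view' THEN 2
            ELSE 1
          END
        ), 0) AS score
      FROM users u
      LEFT JOIN user_events e ON e.user_id = u.id
        AND e.created_at < u.created_at + INTERVAL '30 days'
      WHERE u.created_at < NOW() - INTERVAL '60 days'  -- only fully observable cohorts
      GROUP BY u.id, u.created_at
    ),
    flagged AS (
      SELECT
        s.user_id,
        s.score,
        NTILE(4) OVER (ORDER BY s.score) AS score_quartile,
        EXISTS (
          SELECT 1 FROM user_events e
          WHERE e.user_id = s.user_id
            AND e.created_at >= s.signup_at + INTERVAL '30 days'
            AND e.created_at <  s.signup_at + INTERVAL '60 days'
        ) AS retained_month2
      FROM scored s
    )
    SELECT
      score_quartile,
      COUNT(*) AS users,
      ROUND(100.0 * COUNT(*) FILTER (WHERE retained_month2) / COUNT(*), 1) AS month2_retention_pct
    FROM flagged
    GROUP BY score_quartile
    ORDER BY score_quartile;

    If retention climbs sharply with the score, the weights are capturing something real; if the curve is flat, revisit which actions you're counting.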

    4. Time to First Key Action (Activation Metric)

    Activation is the moment a new user first experiences the core value of your product. For AI for Database, that's probably "ran their first query" or "created their first dashboard." For a project tool, it's "created first project with a teammate."

    Time to activation is a leading indicator of long-term retention:

    -- Average time from signup to first key action (query run), last 90 days
    SELECT
      AVG(EXTRACT(EPOCH FROM (first_query.created_at - u.created_at)) / 3600) AS avg_hours_to_activation,
      PERCENTILE_CONT(0.5) WITHIN GROUP (
        ORDER BY EXTRACT(EPOCH FROM (first_query.created_at - u.created_at)) / 3600
      ) AS median_hours_to_activation,
      COUNT(*) AS users_who_activated
    FROM users u
    JOIN LATERAL (
      SELECT MIN(created_at) AS created_at
      FROM user_events
      WHERE user_id = u.id AND event_type = 'query_run'
    ) first_query ON true
    WHERE u.created_at >= NOW() - INTERVAL '90 days'
      AND first_query.created_at IS NOT NULL;

    If your median time to activation is 3 days, there's probably friction in onboarding. If it's under an hour, your activation flow is working well.

    5. Engagement by Cohort (Retention Curve)

    Cohort retention tells you whether early engagement translates into long-term use. Group users by the week they signed up and measure what percentage are still active in weeks 2, 4, 8, etc.

    -- Weekly cohort retention (simplified)
    WITH cohorts AS (
      SELECT
        user_id,
        DATE_TRUNC('week', created_at) AS cohort_week
      FROM users
      WHERE created_at >= NOW() - INTERVAL '12 weeks'
    ),
    activity AS (
      SELECT
        e.user_id,
        DATE_TRUNC('week', e.created_at) AS activity_week
      FROM user_events e
      WHERE e.created_at >= NOW() - INTERVAL '12 weeks'
      GROUP BY e.user_id, DATE_TRUNC('week', e.created_at)
    )
    SELECT
      c.cohort_week,
      a.activity_week,
      DATE_PART('day', a.activity_week - c.cohort_week) / 7 AS weeks_since_signup,
      COUNT(DISTINCT a.user_id) AS retained_users
    FROM cohorts c
    JOIN activity a ON a.user_id = c.user_id
      AND a.activity_week >= c.cohort_week
    GROUP BY c.cohort_week, a.activity_week
    ORDER BY c.cohort_week, a.activity_week;

    Making This Repeatable Without Writing SQL Every Time

    The queries above are useful one-offs. But engagement tracking only works if it's consistent and automatic. There are three ways to keep these metrics flowing without manually running queries:

    Option 1: Pin them as dashboard cards

    With AI for Database, you can ask each of these questions in plain English and pin the result to a dashboard that auto-refreshes. "Show me daily active users for the past 30 days" becomes a chart that updates every morning without anyone touching it.

    Option 2: Set up automated alerts

    Instead of checking dashboards, define conditions that matter:

  • "Alert me when DAU drops more than 15% below last week's average"
  • "Notify me when a free-tier user's engagement score exceeds 50 for three consecutive days" (expansion signal)
  • "Send a Slack message when a user who activated in the past week hasn't logged in for 5 days"

    These run against your database on a schedule and fire only when conditions are met. No stored procedures, no cron jobs, no engineering involvement.
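
    As a concrete sketch of the first condition (same tables as above), here is a query that returns a row only when yesterday's DAU falls more than 15% below last week's daily average, which is the shape a scheduler can poll:

    -- Alert condition: yesterday's DAU vs. last week's daily average
    WITH yesterday AS (
      SELECT COUNT(DISTINCT user_id) AS dau
      FROM user_events
      WHERE created_at >= CURRENT_DATE - INTERVAL '1 day'
        AND created_at <  CURRENT_DATE
    ),
    last_week AS (
      SELECT AVG(dau) AS avg_dau
      FROM (
        SELECT DATE(created_at) AS day, COUNT(DISTINCT user_id) AS dau
        FROM user_events
        WHERE created_at >= CURRENT_DATE - INTERVAL '8 days'
          AND created_at <  CURRENT_DATE - INTERVAL '1 day'
        GROUP BY DATE(created_at)
      ) daily
    )
    SELECT y.dau AS yesterday_dau, ROUND(lw.avg_dau, 1) AS last_week_avg
    FROM yesterday y, last_week lw
    WHERE y.dau < 0.85 * lw.avg_dau;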

    Option 3: Build a weekly engagement snapshot report

    Combine your key metrics into a single weekly report: new signups, activated users, DAU/MAU this week vs. last, top and bottom engaged accounts. Set it to email you and your team every Monday morning.
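
    A minimal version of that snapshot over the same tables, using scalar subqueries (DATE_TRUNC('week', ...) starts weeks on Monday in Postgres):

    -- Weekly snapshot: signups and active users, this week vs. last
    SELECT
      (SELECT COUNT(*) FROM users
        WHERE created_at >= DATE_TRUNC('week', NOW())) AS signups_this_week,
      (SELECT COUNT(*) FROM users
        WHERE created_at >= DATE_TRUNC('week', NOW()) - INTERVAL '7 days'
          AND created_at <  DATE_TRUNC('week', NOW())) AS signups_last_week,
      (SELECT COUNT(DISTINCT user_id) FROM user_events
        WHERE created_at >= DATE_TRUNC('week', NOW())) AS active_this_week,
      (SELECT COUNT(DISTINCT user_id) FROM user_events
        WHERE created_at >= DATE_TRUNC('week', NOW()) - INTERVAL '7 days'
          AND created_at <  DATE_TRUNC('week', NOW())) AS active_last_week;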

    Segmenting Engagement by Plan, Channel, and Cohort

    Raw engagement numbers tell you what's happening. Segmented numbers tell you why.

    Three segmentation cuts that usually reveal something useful:

    By plan type: Are free users engaging at all? Is there a clear usage threshold where users upgrade? This tells you where to put your trial-to-paid nudge.

    -- Engagement by plan type
    -- (score computed per user first, so the average isn't weighted by event count)
    WITH per_user AS (
      SELECT
        u.id,
        u.plan_type,
        COUNT(e.user_id) AS events_30d,
        COUNT(e.user_id) * 2 AS score
      FROM users u
      LEFT JOIN user_events e ON e.user_id = u.id
        AND e.created_at >= NOW() - INTERVAL '30 days'
      GROUP BY u.id, u.plan_type
    )
    SELECT
      plan_type,
      COUNT(*) AS total_users,
      COUNT(*) FILTER (WHERE events_30d > 0) AS active_users,
      ROUND(100.0 * COUNT(*) FILTER (WHERE events_30d > 0) / COUNT(*), 1) AS active_pct,
      ROUND(AVG(score), 1) AS avg_engagement_score
    FROM per_user
    GROUP BY plan_type
    ORDER BY avg_engagement_score DESC;

    By signup channel: Users who found you via organic search often behave differently than those from a Product Hunt launch or a paid ad. Tracking engagement by acquisition source tells you which channels bring sticky users.
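
    Assuming users carries a signup_channel column (the column name is hypothetical; adjust to your schema), the cut mirrors the plan-type query above:

    -- 30-day active rate by acquisition channel (sketch)
    SELECT
      u.signup_channel,
      COUNT(DISTINCT u.id) AS total_users,
      COUNT(DISTINCT e.user_id) AS active_users,
      ROUND(100.0 * COUNT(DISTINCT e.user_id) / COUNT(DISTINCT u.id), 1) AS active_pct
    FROM users u
    LEFT JOIN user_events e ON e.user_id = u.id
      AND e.created_at >= NOW() - INTERVAL '30 days'
    GROUP BY u.signup_channel
    ORDER BY active_pct DESC;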

    By company size: If you have a company_size field, segment your engagement by it. Enterprise teams often show different activation patterns than individual users; knowing this changes how you build onboarding flows.

    From Metrics to Action

    Numbers without follow-through are noise. Here's how to close the loop:

    High engagement on free tier → upgrade prompt. If a free user has run 50+ queries in a week, they're activated and getting value. That's the moment for a targeted upgrade prompt or a personal outreach from the sales team.
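
    A sketch of that trigger, using the plan and event values from the earlier examples:

    -- Free-tier users with 50+ queries in the past 7 days (upgrade candidates)
    SELECT u.id, u.email, COUNT(*) AS queries_last_7d
    FROM users u
    JOIN user_events e ON e.user_id = u.id
    WHERE u.plan_type = 'free'
      AND e.event_type = 'query_run'
      AND e.created_at >= NOW() - INTERVAL '7 days'
    GROUP BY u.id, u.email
    HAVING COUNT(*) >= 50
    ORDER BY queries_last_7d DESC;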

    Declining engagement on paid plan → proactive support. A paying customer whose engagement score drops 50% week-over-week is a churn signal. Flag them automatically and reach out before they decide to cancel.
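
    A simple version of that flag uses raw event counts as a stand-in for the weighted score (same hypothetical tables):

    -- Paid users whose event volume fell 50%+ week-over-week
    WITH by_week AS (
      SELECT
        user_id,
        COUNT(*) FILTER (WHERE created_at >= NOW() - INTERVAL '7 days') AS this_week,
        COUNT(*) FILTER (WHERE created_at >= NOW() - INTERVAL '14 days'
                           AND created_at <  NOW() - INTERVAL '7 days') AS last_week
      FROM user_events
      WHERE created_at >= NOW() - INTERVAL '14 days'
      GROUP BY user_id
    )
    SELECT u.id, u.email, b.last_week, b.this_week
    FROM by_week b
    JOIN users u ON u.id = b.user_id
    WHERE u.plan_type <> 'free'
      AND b.last_week > 0
      AND b.this_week < 0.5 * b.last_week
    ORDER BY b.last_week DESC;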

    Low feature adoption for a key feature → improve onboarding. If only 20% of users have used your best feature, the onboarding flow isn't surfacing it effectively. Build the metric first, then test changes to your product and measure the lift.

    Ready to try AI for Database?

    Query your database in plain English. No SQL required. Start free today.