The TruGuard Analytics page provides deep visibility into how input and output guardrails are performing in production. It helps teams monitor request volume, validation behavior, latency impact, and failure patterns across validators and categories. This page is designed for runtime oversight, performance tuning, and security/compliance auditing.

Overview Metrics

At the top of the page, high-level metrics summarize overall guardrail activity.

Total Requests: The total number of requests processed by TruGuard during the selected time range. This includes both input and output guardrail evaluations.

Requests Trend: A time-series view showing how guardrail-protected requests change over time.

Requests Breakdown:
  • Successful Requests – Requests that passed all enabled validations
  • Request Errors – Requests that triggered validation failures or exceptions

Guardrail Latency

This section helps you understand the runtime overhead introduced by guardrails.

Aggregate Guardrail Latency: Displays latency percentiles across all guardrails:
  • Average
  • p60
  • p90
  • p99
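As a rough illustration, the aggregate view's summary statistics could be derived from raw per-request latency samples like this. This is a hedged sketch, not TruGuard's implementation; the function name and the nearest-rank percentile method are assumptions for illustration.

```python
def percentile(sorted_samples, p):
    """Nearest-rank percentile over a pre-sorted list of latency samples."""
    if not sorted_samples:
        return 0.0
    k = max(0, min(len(sorted_samples) - 1,
                   round(p / 100 * len(sorted_samples)) - 1))
    return sorted_samples[k]

def latency_summary(samples_ms):
    """Summarize latency samples (ms) the way the aggregate view
    reports them: average plus p60, p90, and p99."""
    s = sorted(samples_ms)
    return {
        "avg": sum(s) / len(s),
        "p60": percentile(s, 60),
        "p90": percentile(s, 90),
        "p99": percentile(s, 99),
    }
```

For example, over 100 evenly spread samples, p90 lands on the 90th-smallest value, which is why tail percentiles surface slow outliers that the average hides.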
Input Guardrail Latency Distribution: A histogram showing how long input guardrails take to execute.
  • X-axis: Latency buckets (ms)
  • Y-axis: Frequency of requests
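The bucketing behind such a histogram can be sketched in a few lines. This is an assumption about how the chart is built, not a documented TruGuard internal; the function name and the 10 ms default bucket width are hypothetical.

```python
from collections import Counter

def bucket_latencies(samples_ms, bucket_width_ms=10):
    """Group latency samples into fixed-width buckets:
    keys are bucket start times (ms, the X-axis),
    values are request counts (the Y-axis)."""
    hist = Counter((int(s) // bucket_width_ms) * bucket_width_ms
                   for s in samples_ms)
    return dict(sorted(hist.items()))
```

A long right tail in this histogram is the usual signal that a specific validator, rather than guardrails in general, is adding overhead.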
Output Guardrail Latency Distribution: Similar to input latency, but focused on output guardrails. This is especially important for:
  • PII detection
  • Content moderation
  • Post-generation filtering
Guardrail Latency by Validator: Breaks down latency by individual validators (input and output). For each validator, you can see:
  • Average latency
  • p60, p90, and p99 values
  • Trends across days
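The per-validator breakdown amounts to a group-by over latency records. A minimal sketch, assuming a hypothetical record shape of (validator_id, latency_ms) pairs:

```python
from collections import defaultdict

def latency_by_validator(records):
    """Group (validator_id, latency_ms) pairs by validator and
    summarize each group; percentiles would be computed the same way
    as in the aggregate view."""
    buckets = defaultdict(list)
    for validator_id, latency_ms in records:
        buckets[validator_id].append(latency_ms)
    return {
        vid: {"avg": sum(vals) / len(vals),
              "max": max(vals),
              "count": len(vals)}
        for vid, vals in buckets.items()
    }
```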

Guardrail Analysis

This section focuses on what validations are happening and where failures occur.

Filters: You can switch views between:
  • All Guardrails
  • Input Guardrails
  • Output Guardrails
Validation Volume: Key metrics include:
  • Total Validation Checks – Number of validator executions
  • Exceptions – Requests where at least one validator failed
  • Exception Rate (%) – Percentage of validations resulting in failures
A time-series view shows how validation volume and exceptions evolve over time.
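The Exception Rate metric follows directly from the other two. A small sketch of the arithmetic (the function name is ours, not part of TruGuard):

```python
def exception_rate(total_checks, exceptions):
    """Exception Rate (%): failed validations as a percentage of all
    validator executions, guarding against a zero denominator."""
    if total_checks == 0:
        return 0.0
    return 100.0 * exceptions / total_checks
```

For instance, 5 exceptions over 200 validation checks yields a 2.5% exception rate.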

Category Failure Analysis

This table breaks down failures by category, stage, and volume.
Category | Stage | Validation Volume | Failed Volume | Failure %
Examples include:
  • PII Leak (Input / Output)
  • Intent Violations
  • Regex Matches
  • Prompt Injection
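Each row of this table can be thought of as a group-by over individual check results. A hedged sketch, assuming a hypothetical record shape of (category, stage, passed) tuples:

```python
from collections import defaultdict

def category_failure_table(checks):
    """Aggregate per-check records into rows of the failure table:
    one row per (category, stage), with volume, failed volume,
    and failure percentage."""
    counts = defaultdict(lambda: [0, 0])  # (category, stage) -> [total, failed]
    for category, stage, passed in checks:
        row = counts[(category, stage)]
        row[0] += 1
        if not passed:
            row[1] += 1
    return [
        {"category": c, "stage": s,
         "validation_volume": total,
         "failed_volume": failed,
         "failure_pct": round(100.0 * failed / total, 2)}
        for (c, s), (total, failed) in counts.items()
    ]
```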

Validation Failures

The Validation Failures tab provides a detailed, request-level view of guardrail violations. Each row represents a single validation failure and includes:
  • Validator ID
  • Date
  • Category
  • Stage (Input or Output)
  • Prompt / Output (truncated)
  • Action Taken
  • Latency
This table is searchable and filterable for faster investigation.
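If you export failure rows and want the same filtering behavior locally, the table's controls reduce to simple predicate filters. This is an illustrative sketch; the field names (stage, category, prompt) mirror the columns above but are assumptions about any export format.

```python
def filter_failures(rows, stage=None, category=None, search=None):
    """Filter failure rows (dicts) by stage, by category, and by a
    case-insensitive free-text search over the prompt column."""
    out = rows
    if stage is not None:
        out = [r for r in out if r["stage"] == stage]
    if category is not None:
        out = [r for r in out if r["category"] == category]
    if search:
        out = [r for r in out if search.lower() in r["prompt"].lower()]
    return out
```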

Validation Failure Details

Clicking on any failure opens a detailed view showing exactly what happened.

Original Prompt / Response: Displays the full input or output content that triggered the validation.

Exception Details: Includes:
  • Validator Type (e.g., Regex, PII, Content Safety)
  • Failure Reason – Human-readable explanation of why the validation failed
  • Condition – The rule or pattern that triggered the failure (e.g., [A-z])
On-Fail Action: Shows the action configured for this validator, such as:
  • Filter
  • Fix
  • Encrypt
  • Allow but log
  • Stop execution
  • Re-ask
Updated Prompt / Response: If the configured action modified the content, the updated version is shown here.
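Conceptually, applying an on-fail action is a dispatch on the configured action name. The action names below come from the list above, but the handling logic, naming convention, and `fixer` callback are illustrative assumptions only, not TruGuard's implementation:

```python
def apply_on_fail_action(action, content, fixer=None):
    """Illustrative dispatch over a subset of on-fail actions."""
    if action == "filter":
        return None                      # drop the offending content
    if action == "fix" and fixer:
        return fixer(content)            # e.g. redact the flagged span
    if action == "allow_but_log":
        return content                   # pass through; log the violation
    if action == "stop_execution":
        raise RuntimeError("validation failed: execution stopped")
    return content
```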