08.05 The Compliance Dashboard
Reading and customizing the Compliance Dashboard — the three default widgets (controls-by-framework bar chart, pass-rate trend line chart, pass/fail pie chart), the framework-filter dropdown that scopes the view, and the relationship to the rest of the compliance module.
Why this matters
The Compliance Dashboard is the page most working compliance practitioners open first when they want a sense of where the program is. It's the answer to "how is our compliance posture doing?" without requiring the reader to compose a query. A well-tuned compliance dashboard surfaces the three or four questions the program looks at most often (control coverage, recent test pass rates, framework-by-framework distribution) and puts the answers on one screen.
The other reason this matters: the dashboard is where the gap between intention and reality shows up. The compliance module's underlying data (frameworks installed, controls defined, tests scheduled, evidence attached) is the truth of the program; the dashboard is what makes that truth legible at a glance. Programs whose dashboards show "100% passing" are either operating exceptionally well or aren't testing rigorously; programs whose dashboards show systematic failures are either operating poorly or have honest test cycles. The dashboard is the diagnostic, not the verdict.
The third thing worth knowing: the Compliance Dashboard is scoped by framework. The page surfaces a multi-select dropdown at the top that filters all the dashboard's widgets to the selected framework(s). Programs running multiple frameworks (most programs of any size) use the filter to view per-framework posture; programs running one framework can ignore the filter (it defaults to All Frameworks).
The fourth thing: the dashboard is customizable via the same UILayout widget framework as the Risk Management Dashboard (see The Risk Dashboard for the customization mechanics, which apply identically here). The default three-widget layout is a starting point, not a fixed view.
Before you start
Have these in hand before opening the dashboard:
- The Compliance permission on your account (compliancepermission key, displayed as Allow Access to "Compliance" Menu or equivalent). Without it, the Compliance Dashboard tile doesn't appear in the Reports Hub and direct navigation returns a permission error. (See Permission Reference.)
- At least one framework installed. A fresh install with no frameworks shows an empty dashboard — the widgets need framework data to render anything meaningful. See Installing a Framework for the install paths.
- Some completed control tests. The pass-rate trend and pass/fail widgets need test history to render usefully. A program that's installed frameworks but hasn't yet started the test cycle will see empty widgets; that's expected and changes as the test cycle accumulates data. See Control Tests and Evidence Collection.
- Optional: a clear audience for the dashboard view. The Compliance Dashboard is good for the compliance team's daily check-in and for executive readouts of overall posture. It's less good for the audit conversation itself (which usually wants the Compliance reports and the per-control detail).
Step-by-step
1. Open the Compliance Dashboard
Two paths into the dashboard:
- Reporting → Reports Hub opens the Reports Hub at /reports/dashboards.php. Filter to the Compliance category and click the Compliance Dashboard tile.
- Direct URL: /reports/compliance_dashboard.php. Same destination; useful when bookmarking or linking from documentation.
The dashboard loads with the default widget layout the first time you open it. Subsequent visits load any layout customizations you've made (saved per user).
2. Use the framework filter to scope the view
At the top of the dashboard, a multi-select dropdown labeled Framework (or All Frameworks) lists the active frameworks installed in the system. The first load selects all frameworks by default, meaning the widgets aggregate across every framework.
To scope to specific frameworks:
- Click the dropdown to open the framework list.
- Deselect frameworks you want to exclude, or select only the framework(s) you want to focus on.
- The dashboard widgets refresh automatically to reflect the filter.
The filter writes to the URL as ?frameworks=id1,id2,... — useful for bookmarking specific framework views (e.g., a "SOC 2 only" bookmark, an "ISO 27001 only" bookmark for different audiences).
The dropdown lists only active frameworks (frameworks whose status is set to active in admin configuration). Frameworks marked Inactive are hidden; this is what makes the filter manageable for installations with many frameworks (typically those running the ComplianceForge SCF Extra or the UCF Extra).
The filter is not persistent across sessions — each fresh load starts at All Frameworks. For per-user persistent scoping, bookmark the URL with the desired filter parameters.
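The bookmarkable-filter behavior described above can be sketched in Python. The `frameworks` query parameter and the `/reports/compliance_dashboard.php` path come from this chapter; the helper function names and the framework IDs in the example are illustrative, not part of SimpleRisk.

```python
from urllib.parse import urlencode, parse_qs, urlparse

BASE = "/reports/compliance_dashboard.php"

def framework_view_url(framework_ids):
    """Build a bookmarkable dashboard URL scoped to specific frameworks.

    An empty selection returns the bare URL, i.e. the All Frameworks view.
    """
    if not framework_ids:
        return BASE
    # safe="," keeps the comma-separated ID list readable in the bookmark
    query = urlencode({"frameworks": ",".join(str(i) for i in framework_ids)}, safe=",")
    return BASE + "?" + query

def selected_frameworks(url):
    """Recover the framework IDs from a bookmarked dashboard URL."""
    query = parse_qs(urlparse(url).query)
    raw = query.get("frameworks", [""])[0]
    return [int(i) for i in raw.split(",") if i]

# A hypothetical "SOC 2 only" bookmark (framework ID 3 is illustrative):
print(framework_view_url([3]))
# -> /reports/compliance_dashboard.php?frameworks=3
```

Since the filter resets to All Frameworks on each fresh load, a small set of bookmarks built this way (one per audience) is the practical substitute for per-user persistence.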
3. Read the default widgets
The default Compliance Dashboard layout has three widgets, configured in $ui_layout_config['compliance_dashboard'] in simplerisk/includes/functions.php:
- Controls by Framework (horizontal bar chart, full-width) — distribution of controls across the frameworks selected by the filter. Each bar shows a framework's control count. Useful for "where is our compliance work concentrated?" — programs installing many frameworks see how the population breaks down.
- Pass Rate Trend (line chart, full-width) — pass rate over time. Each point on the line is the percentage of completed test audits that resulted in a Pass for the period. Useful for "is our control posture improving or degrading?" — a flat line at 100% is suspicious (probably under-testing); a declining line is a signal that controls are failing more often.
- Pass/Fail Distribution (pie chart, smaller) — split between Pass, Fail, and Inconclusive results across the dashboard's filtered scope. Useful for "what's our current pass mix?" at a glance. The Inconclusive wedge is the one to watch — see Control Tests and Evidence Collection for why Inconclusive isn't a soft Pass.
The widgets refresh on page load and on framework-filter change; there's no live-updating channel, so opening the dashboard in the morning gives you the morning's state.
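The arithmetic behind the Pass Rate Trend and Pass/Fail Distribution widgets can be sketched as follows. The result labels (Pass, Fail, Inconclusive) come from the widget descriptions above; the record shape and the sample data are illustrative, not SimpleRisk's actual schema.

```python
from collections import Counter, defaultdict

# Each record is a completed test audit: (period "YYYY-MM", result).
# Sample data is invented for illustration.
audits = [
    ("2024-01", "Pass"), ("2024-01", "Pass"), ("2024-01", "Fail"),
    ("2024-02", "Pass"), ("2024-02", "Inconclusive"),
    ("2024-03", "Pass"), ("2024-03", "Pass"),
]

def pass_rate_trend(records):
    """Percentage of completed audits that passed, per period (the line chart)."""
    by_period = defaultdict(Counter)
    for period, result in records:
        by_period[period][result] += 1
    return {
        period: round(100 * counts["Pass"] / sum(counts.values()), 1)
        for period, counts in sorted(by_period.items())
    }

def pass_fail_distribution(records):
    """Counts behind the Pass/Fail/Inconclusive pie."""
    return Counter(result for _, result in records)

print(pass_rate_trend(audits))
# -> {'2024-01': 66.7, '2024-02': 50.0, '2024-03': 100.0}
```

Note that an Inconclusive result lowers the pass rate just as a Fail does in this sketch: it is a completed audit that did not produce a Pass, which is why the Inconclusive wedge deserves attention rather than being read as a soft pass.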
4. Customize the layout
The Compliance Dashboard uses the same UILayout widget framework as the Risk Management Dashboard. Customization works the same way — drag widgets to rearrange, resize via corner handles, add/remove widgets via the toolbar at the top, restore the default layout if a customization went badly.
The default Compliance widget set is small (three widgets); the available custom-widget options out of the box are limited (typically just the WYSIWYG widget for free-text annotations). Programs needing more compliance-specific widgets either build them via the Customization Extra or rely on the dedicated compliance reports for the questions the dashboard doesn't answer.
The customization is per-user, so two compliance team members on the same instance can run different layouts on the same dashboard.
5. Drill from a widget to the underlying detail
Like the Risk Management Dashboard, the Compliance Dashboard widgets are summary visualizations; the underlying detail lives in the compliance reports and the per-control views. Click into a chart segment (where supported) to filter the underlying data; for analyses the dashboard doesn't surface, the Compliance reports cover the gaps:
- Audit Timeline (/reports/audit_timeline.php) — chronological view of audit activities and findings.
- Audit Remediation Cycle Time (/reports/audit_remediation_cycle_time.php) — average time from finding to remediation, by framework.
- Dynamic Audit Report (/reports/dynamic_audit_report.php) — configurable audit-progress report builder.
- Control Gap Analysis (/reports/control_gap_analysis.php) (under Governance) — controls missing tests, owners, or that are out of date.
See The Built-In Reports for the full report library.
For per-framework drill-down, the compliance module's framework view at /governance/index.php (Frameworks tab) is where you read which controls a framework has, who owns each, and which tests are scheduled. The dashboard summarizes; the framework view shows the work.
Common pitfalls
A handful of patterns recur with the Compliance Dashboard.
- Treating 100% pass rate as good news. A flat line at 100% pass rate often means the test cycle isn't rigorous enough — tests with vague Expected Results that pass by default, or test definitions that aren't really exercising the controls. Investigate the test definitions if the pass rate stays at 100%; the rigor question is more important than the number.
- Not filtering by framework when running multiple. The default All Frameworks view aggregates across everything, which is useful for the executive summary but unhelpful for the working-program view ("how is our SOC 2 doing specifically?"). Use the filter to scope to the framework that matches the question you're asking.
- Reading the dashboard without the report library. The three default widgets answer three questions. Compliance programs have more than three questions. The Audit Timeline, Audit Remediation Cycle Time, and Dynamic Audit Report cover questions the default dashboard doesn't; they live one click away in the Reports Hub.
- Customizing the layout without understanding the widget set. The default layout exists because three widgets are about what fits usefully on one screen. Adding widgets when the available widget set is small often produces redundancy (the WYSIWYG content widget can embed the same chart the dedicated reports already show). Stick close to the default unless you have a specific question it doesn't answer.
- Confusing dashboard scope with audit scope. The dashboard filter scopes the visible widgets, not the underlying data. An audit conversation about SOC 2 needs SOC 2-scoped data through the audit reports, not the dashboard view. Don't substitute the dashboard for the per-control evidence the audit conversation requires.
- Reading once, never returning. The dashboard's value comes from regular reading — daily for active compliance teams, weekly for steady-state programs. A program that opens the dashboard once a quarter for the executive readout misses the trends that the in-between visits would have surfaced.
- Ignoring the Inconclusive segment of the pass/fail pie. Inconclusive results often get treated as "soft passes" — the test couldn't determine, so the program assumes the control is operating. The Inconclusive segment is the signal that something needs attention: either the test is poorly designed (can't produce a determination) or the operational reality is genuinely uncertain. Either way, growing Inconclusive over time is a problem; treat it as a follow-up signal.
- Sharing the dashboard externally without the Pass Rate Trend context. A snapshot of the Pass/Fail pie shared with leadership shows a moment in time. The Pass Rate Trend line shows whether things are getting better or worse. For external distribution, share both — the snapshot answers "where are we now?", the trend answers "where are we going?"
- Forgetting that the customization is per-user. A team member who customizes their dashboard doesn't change anyone else's view. Programs that want a standardized dashboard view across the team have to either accept that each user runs their own layout, or document the recommended layout and have each user reproduce it.