A lightweight, browser-based dashboard for examining aggregate patterns and trends in AI capability over time, grounded in the CloudPedagogy AI Capability Framework.
The AI Capability Signals Dashboard supports sense-making, reflection, and governance-aligned conversation about how AI capability is developing across an organisation — without surveillance, benchmarking, or individual assessment.
It is designed to help teams notice patterns, surface tensions, and ask better questions, not to measure performance or enforce compliance.
This tool is part of the CloudPedagogy AI Capability Tools suite.
The AI Capability Signals Dashboard helps individuals, teams, and organisations:
- observe aggregate patterns in AI capability across the six framework domains
- explore trends over time, rather than one-off snapshots
- surface imbalances or tensions (e.g. innovation accelerating faster than governance)
- support leadership, governance, and strategic discussions without performance anxiety
- turn AI-related data into reflection and dialogue, not action-by-default
- document shared understanding for committees, workshops, and reviews
The tool is capability-led, reflective, and interpretive.
It is explicitly designed to support professional judgement, not replace it.
This tool is not:
- a monitoring or surveillance system
- a performance dashboard or KPI tracker
- a benchmarking or maturity-scoring tool
- a compliance or audit instrument
- a risk register or legal assessment
- an automated decision-making or recommendation system
All outputs are signals, patterns, and prompts for discussion — not verdicts or decisions.
The dashboard is grounded in the six interdependent domains of the CloudPedagogy AI Capability Framework:
- Shared understanding, boundaries, risks, and realistic expectations of AI in context
- Role clarity, partnership practices, human judgement in the loop, and responsible prompting
- Practical use of AI in workflows, experimentation, iteration, and improvement of practice
- Fairness, inclusion, harm reduction, transparency, and downstream impact awareness
- Oversight, accountability, policy alignment, approvals, and decision hygiene
- Review cycles, learning from experience, capability renewal, and institutional memory
These domains act as lenses, not checklists.
A typical workflow:

- Enter basic context information (e.g. team, programme, or organisational unit; optional notes)
- Record aggregate capability signals across the six domains (a minimal data-model sketch follows this list)
  - Signals may be derived from prior assessments, workshops, surveys, or agreed reflections
  - No individual-level data is required or supported
- Optionally add timepoints to compare capability signals over time
- View the dashboard to explore:
  - domain-level patterns and trends
  - emerging imbalances or tensions
  - areas of acceleration, stagnation, or lag
- Use the built-in reflective prompts to support discussion, sense-making, and governance conversations
- Export or print summaries for committee papers, workshops, or documentation
The tool is designed to be used collectively and deliberatively, not mechanically.
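To make "aggregate capability signals" and "timepoints" concrete, here is a minimal TypeScript sketch of how such data might be represented. The type and field names are illustrative assumptions, not the tool's actual data model:

```typescript
// Hypothetical sketch: aggregate, non-individual capability data.

/** One of the six framework domains, referenced by an agreed label. */
type DomainId = string;

/** An agreed, aggregate signal for one domain (e.g. a 1-5 scale). */
interface DomainSignal {
  domain: DomainId;
  level: 1 | 2 | 3 | 4 | 5; // agreed collectively, never per person
  notes?: string;           // provenance: workshop, survey, reflection
}

/** A snapshot of all six domains at one point in time. */
interface Timepoint {
  label: string;            // e.g. "2025 Q1 review"
  recordedAt: string;       // ISO date string
  signals: DomainSignal[];  // one entry per domain
}

/** A full session: context plus an ordered series of timepoints. */
interface SignalsSession {
  context: string;          // team, programme, or organisational unit
  timepoints: Timepoint[];  // ordering enables trends over time
}
```

Because signals are recorded per domain and per timepoint, trends and imbalances emerge from simple comparisons across this structure rather than from any individual-level measurement.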
The AI Capability Signals Dashboard provides:
- aggregate domain profiles (no individual or unit drill-down)
- trend visualisations over time
- imbalance and tension indicators (one possible calculation is sketched after this list)
- explanatory “why this matters” context
- structured discussion prompts for groups and committees
- printable, shareable summaries for governance use
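As one illustration of how an "imbalance indicator" could work, the sketch below flags when domains drift apart (for example, innovation accelerating faster than governance). The calculation and threshold are assumptions chosen for demonstration, not the dashboard's actual method:

```typescript
// Hypothetical imbalance check: the spread between the strongest and
// weakest domain levels at one timepoint. A wide spread is a prompt
// for discussion, not a verdict.
function imbalanceSpread(levels: number[]): number {
  return Math.max(...levels) - Math.min(...levels);
}

const levels = [4, 4, 4, 4, 4, 2]; // five strong domains, one lagging
const THRESHOLD = 2;               // illustrative assumption

if (imbalanceSpread(levels) >= THRESHOLD) {
  console.log("Possible imbalance: which domains lag, and does that matter here?");
}
```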
Typical contexts of use include:

- AI steering groups and working groups
- Curriculum review and programme-level discussions
- Research governance and ethics boards
- Leadership workshops and away-days
- Capability retrospectives and annual reviews
- Cross-functional sense-making conversations
The dashboard is especially effective in contexts where trust, ethics, and accountability matter.
Privacy and data handling:

- The application runs entirely client-side
- No accounts, analytics, or tracking
- No data is uploaded or transmitted
- All inputs exist only within the user’s browser session (see the sketch below)
- Clearing the browser resets the session
- Suitable for static hosting (e.g. AWS S3)
The tool is explicitly designed to avoid surveillance and performance monitoring.
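As a sketch of what session-only data handling can look like in practice, the snippet below keeps all inputs in the browser's sessionStorage, which is discarded when the session ends and is never transmitted anywhere. The storage key and shape are illustrative assumptions, not a description of the actual implementation:

```typescript
// Hypothetical sketch: state lives only in the browser session.
// Nothing is POSTed, logged, or persisted beyond the session's lifetime.
const KEY = "capability-signals"; // illustrative storage key

function saveSession(data: unknown): void {
  // sessionStorage is cleared when the browser session ends, matching
  // the "clearing the browser resets the session" behaviour above.
  sessionStorage.setItem(KEY, JSON.stringify(data));
}

function loadSession<T>(): T | null {
  const raw = sessionStorage.getItem(KEY);
  return raw ? (JSON.parse(raw) as T) : null;
}
```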
Requirements:

- Node.js (v18+ recommended)
- npm
From the project root:

```bash
npm install
npm run dev
```
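For static hosting (mentioned above), a conventional setup would also expose a production build script. The script name and output directory below are assumptions to verify against this project's package.json:

```bash
# Assumed build script and output directory: check package.json first.
npm run build

# Example static deployment to S3 (requires the AWS CLI and an existing bucket):
aws s3 sync dist/ s3://your-bucket-name/
```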
This repository contains exploratory, framework-aligned tools developed to support reflection, discussion, and sense-making around AI capability in education, research, and public-service contexts.
The tool is provided as-is for learning and experimentation.
It is not production software, not a governance system, and not a compliance or benchmarking instrument.
All outputs are indicative only and must be interpreted in context, alongside professional judgement and local institutional requirements.
These tools are designed to:
- Explore ideas related to the CloudPedagogy AI Capability Framework
- Support reflective, governance-aligned discussion
- Enable capability-led organisational learning
- Demonstrate concepts through lightweight, browser-based tools
These tools are not:
- ❌ Audits or formal evaluations
- ❌ Rankings, league tables, or maturity scores
- ❌ Monitoring or surveillance systems
- ❌ Automated decision-making or risk engines
- ❌ Substitutes for institutional governance or accountability
Responsibility for interpretation and any subsequent use remains with the user or adopting institution.
This repository is licensed under the Creative Commons Attribution–NonCommercial–ShareAlike 4.0 International (CC BY-NC-SA 4.0) licence.
You may:
- Use, share, and adapt the tool for educational, research, and public-interest purposes
- Do so with appropriate attribution
- Share adaptations under the same licence
Commercial use, resale, or incorporation into paid products or services is not permitted without explicit permission.