Adult Social Care · Operational BI

When the Data Says "On Time" But Clients Say Otherwise

A punctuality paradox hiding a systemic flaw — and how interrogating a single anomaly changed how an entire care operation was scheduled.

Sector: Adult Social Care
Tools: Power BI · SQL
Audience: Service Managers · Care Coordinators
Outcome: Workload redistribution + travel time policy

"The system logged visits as completed. Clients reported carers were constantly late. Both were telling the truth — the data just hadn't caught the lag in between."

Operations restructured carer scheduling
01 / 05

The Punctuality Paradox

A care provider was fielding a growing volume of client complaints. The recurring theme: carers arriving late. Some clients reported this was happening regularly, affecting their routines and trust in the service.

The operational data told a different story. Punctuality metrics were green. Visits were logging as completed on time. There was no obvious signal in the dashboard that anything was wrong.

The question wasn't "why are carers late?" It was "why doesn't our data know they're late?"

Rather than starting with the visit logs, I started where the signal was clearest: the complaints data. Lateness dominated. It clustered in specific geographic areas and certain time slots. That told me where to look next.

02 / 05

Finding the 12-Hour Anomaly

Pulling the visit log data, I began calculating visit durations by comparing check-in timestamps against check-out timestamps. Almost immediately, something didn't add up.

12hrs+ · Visits logging at impossible durations
A care visit doesn't last 12 hours. These were unclosed sessions: the system was logging a carer's next check-in as the close of the previous visit.

These records were inflating the business-wide average visit duration far beyond the expected ~45 minutes. This wasn't data noise; it was a systemic logging lag. When a carer checked in to a new visit, the previous visit wasn't always closing properly, creating overlapping records and inflating completion rates.

The system registered visits as punctual. In reality, those visits hadn't properly ended.
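The duration check behind this finding is simple to sketch. A minimal illustration in Python (the actual analysis ran in SQL against the visit log, so the field layout, timestamps, and the 4-hour plausibility cap here are all assumptions):

```python
from datetime import datetime, timedelta

# Hypothetical visit-log rows: (carer_id, check_in, check_out).
# The real data lived in SQL; these values are illustrative only.
visits = [
    ("A", "2024-04-03 08:00", "2024-04-03 08:45"),  # normal ~45-minute visit
    ("B", "2024-04-03 09:00", "2024-04-03 21:30"),  # "12-hour visit": an unclosed session
]

# Assumed plausibility cap: no genuine care visit should run this long.
MAX_PLAUSIBLE = timedelta(hours=4)

def parse(ts: str) -> datetime:
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Duration = check-out minus check-in; anything over the cap is a
# logging fault to investigate, not a visit that actually happened.
flagged = [
    (carer, parse(out) - parse(checkin))
    for carer, checkin, out in visits
    if parse(out) - parse(checkin) > MAX_PLAUSIBLE
]
print(flagged)
```

Ordering the flagged records by duration is what surfaced the 12-hour-plus cluster immediately.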

[Chart: Visit Duration Distribution · illustrative, based on the actual pattern. Buckets: <30m, 30–60m, 60–90m, 90–120m, 2–4h, 4–6h, 6h+ ⚠]

The next question: who was this affecting most? Cross-referencing overlapping visits against carer workload revealed a consistent pattern.

[Chart: Overlap Count vs. Carer Workload · each dot = one carer, one month. Standard-workload carers cluster at low overlap; high-workload carers show high overlap (the pattern).]

The carers with the most overlapping visits weren't the least experienced. They were the busiest — the ones the operation relied on most.
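The cross-reference itself is a per-carer interval check: sort each carer's visits by check-in and count the ones that start before the previous visit has checked out. A sketch with hypothetical carers and timestamps (the real join ran in SQL):

```python
from datetime import datetime

def parse(ts: str) -> datetime:
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Hypothetical month of (check_in, check_out) pairs per carer.
visits_by_carer = {
    "standard_workload": [
        ("2024-04-03 08:00", "2024-04-03 08:45"),
        ("2024-04-03 10:00", "2024-04-03 10:45"),
    ],
    "high_workload": [
        ("2024-04-03 08:00", "2024-04-03 09:45"),
        ("2024-04-03 09:30", "2024-04-03 10:15"),  # starts before previous closed
        ("2024-04-03 10:10", "2024-04-03 11:00"),  # starts before previous closed
    ],
}

def overlap_count(visits):
    """Count visits whose check-in precedes the previous visit's check-out."""
    ordered = sorted((parse(a), parse(b)) for a, b in visits)
    return sum(
        1
        for (_, prev_out), (next_in, _) in zip(ordered, ordered[1:])
        if next_in < prev_out
    )

# (visit volume, overlap count) per carer: the scatter plot's two axes.
summary = {carer: (len(v), overlap_count(v)) for carer, v in visits_by_carer.items()}
print(summary)
```

Plotting overlap count against visit volume is what showed the overlaps concentrating in the busiest carers.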

03 / 05

The Dashboard Design

The dashboard needed to work for two different audiences — service managers who needed the headline picture fast, and care coordinators who managed individuals day to day. Two pages, each with a distinct purpose.

Total Visits: 1,847 (↑ 3.2% vs last month)
Complete Check-in & Out: 74% (↓ 1.4% vs last month)
Overlapping Visits: 312 (↑ 18% vs last month) · concentrated in high-workload carers
Missed Check-ins: 89 (↑ 7% vs last month)
Incomplete Check-ins: 143 (↓ 2% vs last month)
Avg Visit Duration: 3.4h (expected: ~45 min) · ⚠ inflated by unclosed sessions

Colour language was deliberate throughout: red and amber signal system failures — overlaps, missed check-ins — never individual performance. Blue and teal indicate volume and completion. Red never points at a person. Only at a process.

The carer-level view included a Gantt-style timeline for any selected day. Overlapping bars — two visits occupying the same time slot — were immediately visible and undeniable at the individual level.

[Chart: Carer Visit Timeline, Selected Day · Gantt view, Page 2. Four carers' visits plotted as bars across the day. Carer A shows a clean day; Carers B and D show repeated overlapping visits; Carer C shows a missed check-in. Legend: Completed · Overlapping ⚠ · Missed check-in]

📸 Dashboard screenshots · Insert Power BI exports here

04 / 05

The Ethical Dimension

This case sits at an uncomfortable intersection. Data that could easily have been used to performance-manage individual carers was actually evidence of a systemic failure in scheduling design.

Getting the framing right — and presenting the findings in a way that protected carers while giving operations something actionable — was as important as the analysis itself. In adult social care, the people in the data are vulnerable. The people delivering the care are often undervalued. Both deserve to have the data used carefully.

I

Protect, don't surveil

High-overlap carers were cross-referenced with workload first. The data was never used to discipline — only to understand.

II

Design language matters

Colour coding was set up so red pointed to process failures, never to individuals. A design choice with a real ethical consequence.

III

Question the system, not the person

The most important analytical move was asking why the data was wrong before asking who was performing badly.

05 / 05

What Changed

Operations used the findings to open a conversation about travel time — something that had never been formally factored into scheduling. Back-to-back visits in different locations were the norm. The data now showed exactly where that was breaking down.

0 · Carers performance-managed as a result of this data
  • Travel time formally factored into carer scheduling for the first time
  • Workloads redistributed based on geographic clustering and visit volume
  • Logging system flagged for review — lag behaviour escalated to supplier
  • Ongoing dashboard monitoring adopted by care coordinators as standard practice
  • Client complaint volume used as a leading indicator alongside operational data going forward
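The travel-time change amounts to a feasibility check on consecutive visits: does the gap between one check-out and the next check-in actually cover the journey? A minimal sketch with entirely hypothetical locations and travel estimates (the provider's real rostering logic is not reproduced here):

```python
from datetime import datetime, timedelta

def parse(ts: str) -> datetime:
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Hypothetical travel-time estimates between client locations, in minutes.
travel_minutes = {("Client 1", "Client 2"): 25}

def infeasible_handovers(schedule):
    """Flag back-to-back visits whose gap is shorter than the estimated travel time.

    schedule: list of (location, check_in, check_out), ordered by time of day.
    """
    flags = []
    for (loc_a, _, out_a), (loc_b, in_b, _) in zip(schedule, schedule[1:]):
        gap = parse(in_b) - parse(out_a)
        needed = timedelta(minutes=travel_minutes.get((loc_a, loc_b), 0))
        if gap < needed:
            flags.append((loc_a, loc_b, gap))
    return flags

# A 15-minute gap against a 25-minute estimated drive: infeasible by design.
day = [
    ("Client 1", "2024-04-03 08:00", "2024-04-03 08:45"),
    ("Client 2", "2024-04-03 09:00", "2024-04-03 09:45"),
]
print(infeasible_handovers(day))
```

Run over every carer's day, a check like this shows exactly which back-to-back slots the schedule was silently asking carers to teleport between.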

The complaints about lateness didn't reflect carers failing. They reflected a scheduling system that hadn't accounted for the reality of the job. The data made that undeniable.