Akara thinks the OR needs sensors before robots, and that’s probably right
Akara’s pitch is straightforward: hospitals lose hours every day in operating rooms because nobody has a dependable real-time view of what’s happening between cases. On TechCrunch’s Equity podcast this week, the company argued for ambient sensing over another robot in the hallway.
That may sound incremental. In hospital operations, the boring infrastructure usually matters more.
Akara, which landed on Time’s Best Inventions of 2025 list, has moved from cleaning robots to ceiling-mounted thermal sensors that track room activity, case progress, and turnover milestones. The aim is to treat the OR like a live operational system instead of a schedule that starts slipping before lunch.
That focus tracks. OR downtime is expensive, often absurdly expensive. Large medical centers commonly estimate OR time at $60 to $100 per minute. If a system can recover even 60 to 90 minutes a day across a handful of rooms, people pay attention. Unlike a lot of hospital AI pitches, this one maps cleanly to a problem administrators, perioperative leaders, and staff already know well.
The problem is in the handoffs
Hospitals don’t usually lose OR time because surgeons forgot how to operate. They lose it in the gaps between steps.
One patient wheels out. Cleaning starts. Instruments need processing or delivery. The next patient has to be staged. Anesthesia, nursing, transport, and support staff all need the same timing picture, and they usually don’t have it. A lot of this still runs on status calls, whiteboards, manual timestamps, and somebody walking the floor asking what’s going on.
That holds together until volume rises, staffing gets thin, or a case runs long. Then the schedule stops meaning much.
Akara’s bet is that ambient thermal sensing can give hospitals a shared stream of operational truth without creating a privacy fight. The system watches for events like patient arrival, procedure start, closure, wheels-out, room empty, and turnover completion. It does that with thermal signatures, not RGB video.
That part matters.
Why thermal makes sense here
Computer vision in hospitals has a trust problem. Even when the use case is operational, people hear "camera" and think surveillance. In an operating room, where staff already deal with compliance, privacy, and institutional risk aversion, that can sink a deployment early.
Thermal sensors avoid a lot of that.
They capture heat patterns, not faces, badge IDs, or readable screens. The data still needs careful handling because room presence and workflow context can be sensitive. But thermal is much easier to defend than standard video from a privacy and governance standpoint.
It also fits the environment better than a lot of people assume:
- lighting changes matter less
- masks, gowns, and PPE don’t break detection the way they can with RGB systems
- low-resolution thermal data is often enough for occupancy and workflow state changes
- edge processing can keep raw frames local, with only event metadata leaving the room
That last point can be the difference between a stalled review and an actual deployment.
If Akara is doing this well, the raw thermal feed stays on-device, inference runs on local edge hardware, and the system emits structured events into hospital software. Think wheels_out, room_empty, turnover_start, turnover_end, each timestamped and pushed into downstream systems over HL7 v2, FHIR, or vendor-specific integrations.
That’s a practical architecture. Hospitals respond to practical.
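As a rough illustration of what that event layer might look like, here is a minimal sketch in Python. The field names, event types, and room identifiers are assumptions for illustration, not Akara’s actual schema; the point is only that each emitted record is small, timestamped, and structured enough for a broker or an HL7/FHIR adapter to consume.

```python
# Hypothetical OR workflow event payload; names and fields are illustrative.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ORWorkflowEvent:
    event_type: str      # e.g. "wheels_out", "room_empty", "turnover_start"
    room_id: str         # OR identifier the scheduling system already knows
    occurred_at: str     # ISO 8601 timestamp, UTC
    confidence: float    # model confidence when the event fired
    source: str          # edge node or sensor that produced the event

def make_event(event_type: str, room_id: str, confidence: float, source: str) -> dict:
    """Build the JSON-serializable payload the edge node would emit."""
    event = ORWorkflowEvent(
        event_type=event_type,
        room_id=room_id,
        occurred_at=datetime.now(timezone.utc).isoformat(),
        confidence=confidence,
        source=source,
    )
    return asdict(event)

if __name__ == "__main__":
    payload = make_event("wheels_out", "OR-07", 0.94, "ceiling-thermal-3")
    print(json.dumps(payload))  # what a broker or integration adapter would receive
```

Nothing in that payload identifies a person, which is exactly why the event-only posture is easier to get through governance review.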
For engineers, this is event detection in a messy physical environment
The AI here has less to do with foundation models and more to do with pulling reliable signals from a difficult room.
A plausible stack looks familiar to anyone who has built edge vision systems:
- ceiling-mounted thermal sensors with wide field-of-view coverage
- local inference on something like NVIDIA Jetson or an Edge TPU
- CNN-based detection on thermal frames, likely mixed with temporal modeling
- sequence smoothing using a state model, maybe an HMM, maybe an RNN, maybe a simpler rule-driven layer if the data is good enough
- event streaming through MQTT or Kafka
- orchestration logic that updates Epic, Cerner, or the surrounding notification fabric
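Stripped of the hardware and the trained model, the loop at the center of that stack is small. In the sketch below the sensor read and the inference step are stubbed out, and the thresholds, frame rate, and smoothing factor are placeholders; it only shows the shape of the pipeline: frame in, score out, hysteresis on the state, event emitted only when the state flips.

```python
# Minimal edge loop sketch; sensor and model are stand-ins, thresholds assumed.
import time
import random

FRAME_PERIOD_S = 1.0      # assumed frame rate; real hardware drives this
OCCUPIED_THRESHOLD = 0.6  # assumed decision threshold
EMPTY_THRESHOLD = 0.3     # hysteresis so the state doesn't flap near the boundary

def read_thermal_frame():
    """Stand-in for grabbing a low-resolution thermal frame from the sensor."""
    return [[random.random() for _ in range(32)] for _ in range(24)]

def occupancy_score(frame) -> float:
    """Stand-in for inference; a real system would run a CNN here."""
    return sum(sum(row) for row in frame) / (len(frame) * len(frame[0]))

def run_edge_loop(emit, max_frames: int = 60):
    """Score frames, smooth the signal, and emit events only on state changes."""
    occupied = False
    smoothed = 0.0
    for _ in range(max_frames):
        smoothed = 0.8 * smoothed + 0.2 * occupancy_score(read_thermal_frame())
        if not occupied and smoothed > OCCUPIED_THRESHOLD:
            occupied = True
            emit("room_occupied", smoothed)
        elif occupied and smoothed < EMPTY_THRESHOLD:
            occupied = False
            emit("room_empty", smoothed)  # only the event leaves the room
        time.sleep(FRAME_PERIOD_S)

# run_edge_loop(publish_event)  # wire "emit" to whatever publishes the payload above
```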
The hard part isn’t spotting warm bodies. It’s inferring useful state changes from messy motion.
An OR is full of occlusions, reflective equipment, HVAC quirks, carts, curtains, lights, booms, and layout differences from room to room. A model trained in Hospital A can behave oddly in Hospital B. Thermal data has its own failure modes too. Air vents shift local signatures. Equipment heats up. Staff cluster together. A patient bed entering the room can look like other movement patterns unless the temporal logic is solid.
So the system has to do three things well.
1. Detect events fast enough to matter
If the value is operational, alerts can’t show up 10 minutes late because the model waited for perfect confidence. A good system aims for confidence high enough to act on.
2. Turn noisy detections into workflow state
Single-frame predictions don’t help much. You need sequencing and context. Closure tends to follow incision. Wheels-out tends to follow closure. Turnover starts when the room state actually changes. This is standard temporal inference, just in a high-friction environment.
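One simple way to picture that sequencing layer, assuming detections arrive as labeled events with confidences, is a small state tracker that only accepts transitions the workflow actually allows. The event names, transition table, and confidence threshold below are illustrative; a production system might use an HMM or a learned temporal model instead of a hand-written table.

```python
# Toy workflow state tracker; transitions and threshold are assumptions.
ALLOWED_TRANSITIONS = {
    "room_empty": {"patient_in"},
    "patient_in": {"procedure_start"},
    "procedure_start": {"closure"},
    "closure": {"wheels_out"},
    "wheels_out": {"turnover_start"},
    "turnover_start": {"turnover_end"},
    "turnover_end": {"room_empty"},
}

class CaseStateTracker:
    def __init__(self, initial_state: str = "room_empty"):
        self.state = initial_state

    def observe(self, detected: str, confidence: float, threshold: float = 0.8) -> bool:
        """Accept a detection only if it is confident and consistent with the
        current workflow state; return True when the state actually advances."""
        if confidence < threshold:
            return False  # too uncertain to act on
        if detected not in ALLOWED_TRANSITIONS[self.state]:
            return False  # out-of-order detection, likely noise
        self.state = detected
        return True

tracker = CaseStateTracker()
print(tracker.observe("patient_in", 0.91))   # True: valid next step
print(tracker.observe("wheels_out", 0.95))   # False: can't follow patient_in directly
```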
3. Hold up across sites
Per-room calibration is unavoidable. Per-hospital tuning probably is too. Anyone selling zero-config deployment in this setting is selling fantasy.
That doesn’t weaken the product case. It just puts it in the category of real industrial ML instead of demo-grade ML.
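In practice, that per-room tuning tends to end up as plain configuration. A hypothetical record might look like the following, with every field an assumption standing in for whatever site-specific parameters the detectors actually need.

```python
# Hypothetical per-room calibration; all fields are illustrative.
ROOM_CALIBRATION = {
    "OR-01": {
        "occupied_threshold": 0.62,        # tuned against labeled turnovers in this room
        "empty_threshold": 0.28,
        "masked_regions": [(0, 5, 4, 9)],  # e.g. an HVAC vent that skews the signal
    },
    "OR-02": {
        "occupied_threshold": 0.55,
        "empty_threshold": 0.25,
        "masked_regions": [],
    },
}

def thresholds_for(room_id: str) -> dict:
    """Fall back to conservative defaults for rooms not yet calibrated."""
    return ROOM_CALIBRATION.get(
        room_id,
        {"occupied_threshold": 0.7, "empty_threshold": 0.2, "masked_regions": []},
    )
```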
The least glamorous part of the strategy may be the strongest one
Akara reportedly got strong validation through the NHS. That could matter more in the U.S. than another polished pilot deck.
Hospitals are tired of AI startups that can demo but not deploy. They want systems that have survived procurement, governance review, and day-to-day clinical operations somewhere else. External validation lowers the "we are your beta site" risk, and health systems remember who burned them on that front.
It also gives Akara a cleaner entry point into U.S. hospitals because this category is easier to justify than clinical AI. This is operational software with sensing attached, not a model advising on diagnosis or treatment. The regulatory burden is lower than clinical decision support, though HIPAA, BAAs, audit logs, RBAC, and secure transport still matter.
For technical buyers, that distinction is important. You’re integrating workflow telemetry, not asking clinicians to trust a model on patient care.
Hospitals are finally starting to act like event-driven systems
A lot of healthcare enterprise software still behaves like a record system first and an operational system second. Akara’s approach fits a different model. The OR becomes a stream of state changes, and downstream teams subscribe to them.
That fits the problem better.
If wheels_out fires at 10:32 AM and the system predicts room readiness at 10:52 AM based on turnover history, staffing, and case type, housekeeping can be dispatched immediately. Sterile processing gets advance notice. Transport gets a pickup target. Perioperative staff see the same forecast. If the turnover slips, everyone gets the same updated estimate.
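A toy version of that fan-out, assuming the event payload sketched earlier and a per-room history of turnover durations, can be very small. The team names, the median forecast, and the 30-minute default are stand-ins for whatever a real deployment would use.

```python
# Toy wheels_out subscriber; names, history, and forecast method are assumptions.
from datetime import datetime, timedelta
from statistics import median

def predict_ready_at(wheels_out_at: datetime, recent_turnover_minutes: list[float]) -> datetime:
    """Forecast readiness from the median of this room's recent turnovers."""
    estimate = median(recent_turnover_minutes) if recent_turnover_minutes else 30.0
    return wheels_out_at + timedelta(minutes=estimate)

def on_wheels_out(event: dict, turnover_history: dict, notify) -> None:
    """Fan the same forecast out to every downstream team."""
    room = event["room_id"]
    wheels_out_at = datetime.fromisoformat(event["occurred_at"])
    ready_at = predict_ready_at(wheels_out_at, turnover_history.get(room, []))
    for team in ("housekeeping", "sterile_processing", "transport", "periop_board"):
        notify(team, f"{room} predicted ready at {ready_at:%H:%M}")

on_wheels_out(
    {"room_id": "OR-07", "occurred_at": "2025-11-20T10:32:00"},
    {"OR-07": [18.0, 22.0, 20.0]},
    lambda team, msg: print(team, "->", msg),
)
```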
That’s standard event orchestration. In hospitals, it’s still less common than it should be.
It also helps explain why Akara moved away from robotics. Robots need a legible environment. Most hospitals still don’t provide one. If a cleaning robot doesn’t know when a room is actually ready, whether the hallway is clear, or whether a case delay just changed the sequence, then it’s not fixing the main problem. It’s waiting around at great expense.
Sensors first. Then automation has something stable to connect to.
The limits are real
Hospitals shouldn’t buy the fantasy version of this category.
Thermal sensing won’t solve staffing shortages. It won’t remove all manual coordination. It won’t generalize perfectly across every OR footprint. And it can generate false positives if installation and calibration are sloppy.
A few things technical teams should press on hard:
- Event accuracy by room and by event type. room_empty may be easier than closure_complete. Don't average them together and call it success (a quick slicing sketch follows this list).
- Drift over time. Layout changes, HVAC changes, seasonal patterns, and staffing changes can degrade performance.
- Integration quality. If alerts land in a dashboard nobody watches, the product fails no matter how good the models are.
- Fallback behavior. When confidence drops, does the system abstain cleanly, or does it flood teams with bad updates?
- Security posture. Event-only pipelines help. They still need proper TLS 1.3, mTLS, auditable access controls, and sensible retention policies.
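On the first of those points, the evaluation itself is not complicated; what matters is slicing it. A small helper like the one below, run over a labeled validation log (all field names assumed for illustration), makes it obvious when room_empty is carrying the average for weaker event types.

```python
# Per-room, per-event-type accuracy; record fields are illustrative.
from collections import defaultdict

def per_slice_accuracy(records):
    totals = defaultdict(lambda: [0, 0])  # (room, event_type) -> [correct, total]
    for r in records:
        key = (r["room_id"], r["event_type"])
        totals[key][0] += int(r["correct"])
        totals[key][1] += 1
    return {key: correct / total for key, (correct, total) in totals.items()}

validation_log = [
    {"room_id": "OR-01", "event_type": "room_empty", "correct": True},
    {"room_id": "OR-01", "event_type": "closure_complete", "correct": False},
    {"room_id": "OR-01", "event_type": "closure_complete", "correct": True},
]
for (room, event_type), acc in per_slice_accuracy(validation_log).items():
    print(f"{room} {event_type}: {acc:.0%}")
```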
And yes, privacy is still a live issue even with thermal data. No faces doesn’t mean no risk. In a hospital, occupancy data plus timestamps plus procedure context can still be sensitive.
Why technical teams should pay attention
If you build systems for hospitals, this is a useful pattern to watch because it shows where applied AI is getting real traction in healthcare.
Not in giant language-model wrappers for everything. Not in humanoid robotics demos. In tightly scoped sensing systems that produce structured operational signals and plug into old, annoying enterprise plumbing.
That’s a real business. It’s also a solid engineering problem.
The stack is concrete: edge inference, temporal event detection, integration with HL7 and FHIR, privacy-conscious data handling, and workflow orchestration. None of it is glamorous. All of it matters.
If Akara is right, the operating room may end up being one of the better places for ambient AI because the ask is so grounded. Hospitals don’t need a machine that sounds smart about surgery. They need one that can tell housekeeping, transport, nursing, and scheduling what’s happening now, with enough accuracy to keep the day moving.
That’s the kind of claim that survives contact with reality.