Health data is some of the most sensitive information about a person. Blood pressure readings, heart rate logs, oxygen saturation over time: these things tell a story about your body that most people would share only with a doctor or a close family member.
And yet the default model for health apps treats this data as a commercial asset. It gets synced to a proprietary cloud, sometimes shared with third parties for research or advertising purposes, and governed by privacy policies that few people read and fewer still understand.
How most health apps actually work
To use the majority of health tracking apps, you create an account. That account ties your readings to your identity. Your data is uploaded to a server owned by the company. From that point, what happens to it is governed by a terms of service document written by their lawyers, not yours.
Some apps are upfront about this. Many are not. The practice of using health data to build advertising profiles, or selling anonymised datasets to insurers and pharmaceutical companies, is more common than most people realise. Even when data is "anonymised," research has repeatedly shown that it can often be re-identified from a surprisingly small number of data points.
A different approach
With Vitals, the model is straightforward. Your data stays on your device. Full stop.
Vitals does not request internet permission. It cannot send your data anywhere, even if it wanted to. There are no analytics SDKs, no crash reporters that phone home, no advertising frameworks. There is no account, so there is no way to tie your readings to your identity in any system we control.
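For readers who would rather verify this kind of claim than take it on trust: on Android, where network access is gated by the INTERNET permission declared in an app's manifest, any installed app's requested permissions can be inspected. The sketch below shows one way to do that in Kotlin, assuming an Android build; the package name is a placeholder and the function is purely illustrative, not part of Vitals.

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Illustrative check: does an installed app request the INTERNET permission?
// "com.example.vitals" is a placeholder package name, not a real identifier.
fun requestsInternetPermission(
    context: Context,
    packageName: String = "com.example.vitals"
): Boolean {
    val info = context.packageManager.getPackageInfo(
        packageName,
        PackageManager.GET_PERMISSIONS
    )
    // requestedPermissions is null when the app declares no permissions at all.
    val requested = info.requestedPermissions ?: return false
    return android.Manifest.permission.INTERNET in requested
}
```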
When you export your data, you control exactly what gets exported, in what format, and where it goes. You might email a PDF to your GP. You might save a CSV to a USB drive. That decision is entirely yours.
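To make that concrete, here is a minimal sketch of what an on-device CSV export can look like. The Reading type, its fields, and the exportToCsv function are hypothetical names chosen for illustration rather than Vitals' actual code; the point is that the readings are written only to a file the user picks, and nothing else handles them.

```kotlin
import java.io.File
import java.time.LocalDateTime

// Hypothetical reading type for illustration; the real data model may differ.
data class Reading(
    val takenAt: LocalDateTime,
    val systolic: Int,
    val diastolic: Int,
    val pulse: Int
)

// Write readings to a CSV file chosen by the user. Everything happens
// locally: the only output is the file at `destination`.
fun exportToCsv(readings: List<Reading>, destination: File) {
    destination.bufferedWriter().use { writer ->
        writer.appendLine("taken_at,systolic,diastolic,pulse")
        for (r in readings) {
            writer.appendLine("${r.takenAt},${r.systolic},${r.diastolic},${r.pulse}")
        }
    }
}
```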
Why this matters to us
We're building tools for people to use in their daily lives. Health is one of those areas where the cost of getting trust wrong is very high. If we built something that misused sensitive data, however indirectly, we wouldn't want to use it ourselves. That's the simplest test we know.
We think software companies should be straightforward about what they do with data, and that the best default is to collect as little as possible. In the case of Vitals, we collect nothing, because we built it so that we never could.