Matter and Privacy
As of October 1, 2024, I've stepped away from my role as VP of Technology at Matter. This means that I can no longer personally vouch for the privacy aspects of the app.
When I was still working at Faculty, we took on a new client that was not yet named Matter. We eventually transitioned from an agency-client relationship to a startup relationship, where I became the VP of Technology. This is what I've been doing for the past two-ish years.
Chris wrote some good background on founding Matter, so I won't repeat all of those details, but I've been wanting to write a bit about the origin story from my perspective.
When we were trying to figure out how to turn some of the neuroscience, existing material, and lots of our CEO Axel's ideas into a product, we started discussing the idea of building an app around the concept of allowing users to log memories that they could later recall to improve their happiness. As a natural skeptic, I took a little while to come around to believing that this was even possible and wasn't just hand-wavy wellness stuff. I've since been convinced that we have technology that, when employed correctly, can actually improve happiness by having our users recall positive experiences. And we have actual science (which I will link in the future) that proves their brains will create/release neurotransmitters ("happiness chemicals" in the context of what we're working on) in line with their original positive experience, making them feel happier. For real.
So, as a very bare concept, we landed on the idea of allowing users to store photos of these positive experiences, as well as logging ratings of the emotions they experienced so they could recall them later to stimulate these neurotransmitters.
At one point Axel asked me, "What do you think of that?" I said, "To be honest, I have to ask: why would I ever send you a private photo and a positive rating for a sexual desire emotion? I would never send something like that to another party like Facebook, so why would I do it for us?"
This led us down an interesting (and mostly unique) path in how we handle private user content, and how we model threats against this private data. We adopted user privacy as a core value, and how we think about it informs many other decisions we make with the app and the whole ecosystem handling our users' data. This became so important to us that we knew it needed to be one of the foundational aspects of how we work; this decision informed the product, not the inverse. We knew it was not something we could bolt on later: trying to add this once we'd already exposed user data (even to ourselves) would be error-prone at best, and impossible at worst.
Early on, we set out some core principles:
- we need to build trust with our users so they can believe what we say when it comes to how we handle their data (advanced users can audit traffic over the network to help build this trust, if they desire)
- we need to protect our users from mistakes we might make (we shouldn't be able to suffer a data leak of our users' data if we have a weak password or our code has a bug)
- even if we are competent enough to prevent a leak from ever happening, and even if our users trust us to do what we say, we must be resilient to being strong-armed by a future controlling power (e.g. if someone we don't trust buys us)
We also had some extremely unusual conversations related to parameters around how far we can go with this:
- "should we build our own datacentre for this?" "no, probably not. We can use an existing host if we're careful about what data we collect and how we collect it." "but would our things be safer if we did?" "honestly, someone much larger than us will do a much better job with the physical layer… I don't think we want to spend our funding on hollowing out a mountain and hiring a militia."
- "we can have the best intentions, but we can't always rely on those intentions. If one of our users' data became valuable to an evil nation state and they kidnapped my family, I'll be honest, I'd probably have to hand over the data."
Given these criteria and extremes, we decided that our best course of action is to just never have our users' private data.
This means that when you give a memory a high "pride" rating in Matter, we can't tell that you've done it. We've intentionally set up our metrics system to refuse to collect this kind of personal data, and we (the people and systems at Matter) simply never get it; only the app running on your device does. We don't store the data on our servers (outside of analytics, and even then never the data we consider private, like emotion ratings); it always stays on your device and within your device's datastore. (Matter is an iPhone app, so we store data on your phone with Core Data, and in a private database that syncs within your iCloud account but is set up in a way that even we can't access it. The app code that runs under our developer credentials, on your device, can read and write this data, but it is never sent to us and we have no way of accessing it through Apple's tooling. It's truly private to you.)
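For the technically curious, here's a minimal sketch of what that kind of setup can look like on iOS, using NSPersistentCloudKitContainer to sync Core Data through the user's private CloudKit database. The model name and container identifier below are placeholders, not Matter's actual configuration:

```swift
import CoreData

// A rough sketch of a Core Data stack that syncs through the user's
// *private* CloudKit database. "MemoryModel" and the container identifier
// are placeholders, not Matter's actual names.
final class PersistenceController {
    let container: NSPersistentCloudKitContainer

    init() {
        container = NSPersistentCloudKitContainer(name: "MemoryModel")

        guard let description = container.persistentStoreDescriptions.first else {
            fatalError("Missing persistent store description")
        }

        // Data synced this way lives in the user's own iCloud account;
        // the developer has no server-side way to read it back out.
        description.cloudKitContainerOptions = NSPersistentCloudKitContainerOptions(
            containerIdentifier: "iCloud.com.example.memories"
        )

        container.loadPersistentStores { _, error in
            if let error = error {
                fatalError("Failed to load persistent store: \(error)")
            }
        }
        container.viewContext.automaticallyMergesChangesFromParent = true
    }
}
```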
We (again, the people and systems at Matter) do get the product of some of those inputs, but never in a way that we can reverse it. A very simple version of this: if we were to collect the result of a multiplication operation, "600", we wouldn't know whether the inputs were "1 × 600", "100 × 6", "30 × 20", "12 × 50", etc. We don't know what went into a Matter Score for a memory, but we do know the score. We know the "600" but not the "8" or the "75". We don't even know how you described a memory or what's in a photo you attached. All we know is that there is a memory, it has a score of 600, and it belongs to a numbered account.
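To make that concrete, here's a toy sketch (not Matter's actual scoring formula) of computing a derived score on-device and reporting only that score plus a numbered account, never the inputs:

```swift
// A toy illustration of "we see the product, not the inputs". The real
// Matter Score calculation isn't shown here; the point is that only the
// derived value ever leaves the device, and many different inputs can
// produce the same output.
struct MemoryRatings {
    let intensity: Int   // e.g. 8 (never leaves the device)
    let weight: Int      // e.g. 75 (never leaves the device)
}

func score(for ratings: MemoryRatings) -> Int {
    // Hypothetical combination: 8 × 75 = 600, but so does 30 × 20, etc.,
    // so the score alone can't be reversed into the original ratings.
    return ratings.intensity * ratings.weight
}

func reportMemoryCreated(score: Int, accountNumber: String) {
    // Only the opaque score and a numbered account go into the analytics
    // payload; the ratings, description, and photo never do.
    let payload: [String: Any] = [
        "event": "memory_created",
        "score": score,
        "account": accountNumber
    ]
    _ = payload // transport omitted in this sketch
}
```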
Numbered account? Well, we also don't know who—specifically—our users are, and this is maybe the most controversial of our decisions. We don't have accounts; we don't even currently have email addresses, outside of our mailing list. There is no forced association between our mailing list users and our app users. In the future, we may allow users to opt in to self-identifying, but even then we'll continue to be careful about never collecting private data.
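As a rough illustration (not our exact implementation), a numbered account can be as simple as a random identifier generated on the device the first time the app runs, with nothing about your identity attached:

```swift
import Foundation

// A minimal sketch of a "numbered account": a random identifier generated
// on-device, with no email, name, or other identity attached. The storage
// choice and key name here are illustrative only.
func anonymousAccountNumber() -> String {
    let key = "account.number"
    let defaults = UserDefaults.standard
    if let existing = defaults.string(forKey: key) {
        return existing
    }
    let generated = UUID().uuidString
    defaults.set(generated, forKey: key)
    return generated
}
```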
When users add memories to the app, they'll usually add content such as images. We don't want to (and we don't) hold these, either, at least not in a way we can see them. We primarily store these images on your device, but because the size of this storage is limited, we do have a system for storing assets such as images that have been encrypted on-device; neither the actual photo contents nor the decryption keys are ever sent to us. We do store data for users here, but to us it looks like random noise (binary ciphertext), never like a photo of whatever it is you're storing a photo of. I intend to write more about this in the future, since we expect to publish some open source tooling related to this.
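Here's a minimal sketch of what encrypt-before-upload can look like, assuming an AES-GCM scheme via Apple's CryptoKit. The actual tooling (including what we plan to open source) may differ, but the shape is the same: the key stays on your device, and we only ever store the sealed ciphertext.

```swift
import CryptoKit
import Foundation

// A minimal sketch of on-device asset encryption. The key is generated and
// kept on-device (e.g. in the Keychain); the server only ever stores the
// sealed ciphertext, which looks like random noise.
enum AssetCryptoError: Error {
    case sealingFailed
}

func encryptAsset(_ photoData: Data, using key: SymmetricKey) throws -> Data {
    let sealed = try AES.GCM.seal(photoData, using: key)
    guard let ciphertext = sealed.combined else {
        throw AssetCryptoError.sealingFailed
    }
    return ciphertext // safe to upload; unreadable without the on-device key
}

func decryptAsset(_ ciphertext: Data, using key: SymmetricKey) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: ciphertext)
    return try AES.GCM.open(box, using: key)
}

// Usage sketch:
// let key = SymmetricKey(size: .bits256)              // generated and stored locally
// let blob = try encryptAsset(photoData, using: key)  // upload `blob` only
// let photo = try decryptAsset(blob, using: key)      // only possible on-device
```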
So, we don't have your data in a database that we can access in any way (again, beyond collecting metrics on user-driven events that we don't consider private, so that we can know the number of active users, performance of the system, etc.).
This poses a couple of serious problems. The main one: if I lose my data, how can I recover it?
Well, the short answer here is: we can't. We can't identify you by email to reset your password. We don't have your email address (associated with your data, at least), and you don't have a password. Even if we did have those things, we don't have your data so we can't restore it.
Right now the app has backup/restore functionality, and we expect users to use that to protect themselves from data loss. We've put a lot of thought into storing these backups for a user, but having that user identify themselves is a difficult problem. Storing that data on behalf of the user, in a way that we can't get to it, is also a problem, but a very interesting one. I think we have good solutions to these problems that we expect to build into the app before we're out of beta, and I also hope to post about this in the future.
There's a bit more info in our Privacy Policy, which we're bound by.
I've been working on lots of technology things at Matter, but overseeing our privacy implementation has been one of the most rewarding.
One time, almost a year into working on this stuff, Axel said to me "I love that you're thinking about this stuff and coming up with these kinds of solutions" to which I barely held back a tear and replied "I've been thinking about this stuff very seriously for over a decade, and I love that you're the first person who's really let me implement it."