We started out by following the journey of a single data point as a user enters one into Clue, the period tracking app. We saw it travelling to third-party service providers that help us, the creators of the app, understand how the app is being used and where it needs improvement.
In the second part, we went into the hidden world of Facebook and the data and money flows of digital advertising. In my mind, this is a narrow path we need to keep carefully navigating to avoid falling into ethically deep waters.
In this third blog post, I’d like to describe where else data can go, and what happens when data itself becomes a product. In other words, when information about you, your health, and your behavior becomes the thing being sold. Our priority at Clue is to assure our users that their data is protected and used only in ways that they understand and agree to. As we continue to dive deeper into this issue of your data and privacy online and work to find better ways to be transparent, we welcome your opinions on how to do so.
Users must be able to understand what is happening to their data. There needs to be an agreement and an understanding between the user and the company about what is given and what is taken. Here at Clue, we get your data (and money, if you subscribe to Clue Plus or something else you consciously chose) and we give back knowledge and insights about your menstrual cycle, hormones, and body, in order to empower you to live your fullest life. It’s that simple. This is the transaction that the Terms of Service document describes. Every time you download and start using an app, you are asked to tick a box saying that you accept and understand what you give and what you get. Of course, we all know that most people don’t read them, and of the ones who try, most will get lost in the small print that was written specifically not to be understood.
In contrast, here are Clue’s Terms of Service, written to be both educational and readable. If you have any questions or concerns, we welcome these, too, and we’ll respond promptly and take your points on board.
Again, as a founder, I care deeply about our users’ privacy, and about honoring their trust. For other companies, it's a bit more complicated. What they get from users is data, and this is their real product, which they then sell to other companies. The user gets a free service in return (the app), which is also the data collection vehicle. There are a few problems with this model, in my opinion. The biggest is the lack of transparency. Do you, the user, understand what is really going on? That the data you create is the product, that you are the product?
And what is being enabled with this data, particularly your sensitive health data? There is an uneven power balance: the user has no way to know how companies might take advantage of knowing more than the individual does.
But it isn’t so black or white either. For example, with more data a pharmaceutical company might be able to develop better medicine, medicine that takes account of variations and patterns which only a big data set allows us to see. Women and other people with cycles experience a lot of variation that, famously, the pharmaceutical industry has failed to address, providing us with “one size fits all” medicine that in fact only fits people without cycles. But even when it comes to medicines developed specifically for women, such as the contraceptive pill, far too little attention is paid to the great natural and healthy variation between our individual hormonal profiles. The approach still seems to be to fit the woman to the pill, rather than the pill to the woman. More and better data might change that, and pharmaceutical companies have an interest in developing products that people are more satisfied with.
That loops me back to point one: The user or patient must be in a position to understand and choose who they want to share data with.
At Clue we also ask users, via our Terms of Service, to agree to our sharing their data (with all personal identifiers removed, or only a part of it) with carefully vetted researchers. We currently don’t charge the researchers when we let them do their scientific work on the data. In fact, we give out small research grants and employ people, shouldering costs to enable this data to work for a greater societal good. We do this because we believe that, as guardians of this unique data set, we have a responsibility to make it benefit users and society in the best way we can. And with more research, people will eventually get better products and better care.
We also make an effort to communicate the findings of the research back to the people who shared their data, and in general to be transparent about how this data is being used.
Personally, I’d rather have my de-identified data be entrusted to a scientist with a health and scientific objective than give my personal data to a marketing executive with a revenue goal. Of course, people are free to share or sell their data in any way they please, but should not be duped, blindfolded, or lured into a deal they don’t even know exists.
The key is that users understand what’s happening with their data, and have given their clear-eyed consent to it, whether that involves sharing data with private companies, universities or governmental bodies.
In conclusion: data is an incredible opportunity for us to understand ourselves, our bodies, and our lives in ways we couldn’t before. We should not shy away from generating, collecting, analyzing, and learning from the data. But users must demand that they are put first, and as tech companies we have a huge responsibility to govern and handle data, this powerful resource, with rigorous ethical scrutiny and appropriate care. Policy makers have a big task at hand: to regulate, and to think hard about how we want technology to shape our world, and our lives.
Technology must never be leading; money even less so. Humans must come first.