We think Pixoneye is the most private, secure way to personalise on the planet.
Pixoneye’s computer vision analysis is performed by our SDK entirely on your phone. This is possible because we have developed a uniquely tiny and efficient AI, and it means that we don’t need to upload any of your photos to the cloud.
The only information we ever send away from the phone is the completed analysis: a series of simple probabilities indicating the likelihood, as calculated by our algorithm, that you belong in each of our categories. We never need to collect data that could be used by hackers to steal your identity, such as your name, address or date of birth.
We believe this makes Pixoneye inherently the most secure personal data company in the world. At no point do we even have the data used for our analysis. Your photos remain where they belong: on your phone. We know who you are, without taking any of your data.
AI is ‘dumb’
When people hear that there is a company using AI to analyse photo galleries, they often imagine the Hollywood version of AI – HAL, Blade Runner, or The Matrix. They imagine a computer with human-like intelligence flicking through their images, responding emotionally to them, and judging their life and character.
The truth is, AI is very far from being at that level of sophistication. Despite being cutting edge and extremely complex, our AI is narrow AI, rather than general AI. This means it is capable only of the very limited tasks for which it has been built.
When our AI scans a photo gallery, it looks at the photos very differently from the way a human would. It searches for a particular pattern of pixels, light, and colour values which indicates, for example, the presence of a dog in an image. It then weighs the strength of this ‘dog signal’ against the other signals in the gallery as a whole. From this, it calculates the likelihood that you own a dog, and this single likelihood figure is the only thing that is shared about the user.
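To make the idea concrete, here is a minimal sketch of that kind of on-device scoring. It is purely illustrative – the function name, threshold, and aggregation rule are our own assumptions, not Pixoneye’s actual algorithm – but it shows how per-photo signal strengths can be reduced to one gallery-level probability, which is all that ever leaves the device.

```python
# Illustrative only: names and logic are hypothetical, not Pixoneye's real SDK.
# Assume a detector has already assigned each photo a 'dog signal' strength in [0, 1].

def category_probability(signal_strengths, threshold=0.5):
    """Aggregate per-photo signal strengths into one gallery-level likelihood."""
    if not signal_strengths:
        return 0.0
    # Simple estimate: the fraction of photos carrying a strong signal.
    strong = sum(1 for s in signal_strengths if s >= threshold)
    return strong / len(signal_strengths)

# Only this single number would be sent off the phone -- never the photos.
likelihood_dog_owner = category_probability([0.91, 0.12, 0.78, 0.05])
print(likelihood_dog_owner)  # -> 0.5
```

The photos themselves are only ever inputs to the function; the output is a single, anonymous number.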
This is all the analysis that takes place. If it hasn’t been trained to look for a signal, the AI is completely blind to that element in the photos. It has no way of seeing the photo of you at the pub when you should have been off work sick, or the racy pics you sent to your partner later that night.
There are also some characteristics that we will never train the AI to see, and you can read about what these are and why we made that decision here.
Our AI is never 100% certain about any decision it makes about an individual because the analysis is probabilistic. What is passed to the brand is a single number indicating the probability that an individual falls into our various categories (dog owner, parent, etc).
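The shape of what the brand receives might look something like the following. The category names and values here are invented for illustration; the point is that the payload is a flat set of probabilities and nothing else.

```python
# Illustrative shape of the only data that leaves the device
# (field names and values are hypothetical).
profile = {
    "dog_owner": 0.83,
    "parent": 0.12,
    "frequent_traveller": 0.47,
}

# No names, addresses, photos, or behavioural logs -- just probabilities,
# each strictly between impossible (0.0) and certain (1.0).
assert all(0.0 <= p <= 1.0 for p in profile.values())
```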
This is very different from, and far preferable from a privacy perspective to, the way you are currently tracked and profiled by other data companies. Facebook shares the pages you have liked, your education level, and even your list of friends with third parties. Retargeting companies allow any third party to follow you around the web. DMPs and your ISP sell the collated details of your browsing and purchase history. A huge number of other companies track and share your actual location, in real time, with their clients. All of this information records your actual behaviour.
With Pixoneye, your actual behaviour is never tracked or stored in any way. Only the list of probabilities is calculated and shared with the brands to which you have granted access. The information can therefore never be used to incriminate you or to demonstrate that you have engaged in any specific behaviour.
Brand ownership of data
Because Pixoneye sells its technology rather than data, the information we generate for our clients belongs to them. We cannot sell it or share it with any third party without the informed consent of both the brand and the end user. Furthermore, although the analysis is stored on our servers, it is ring-fenced for each of our clients.
This means that when you download an app from a brand you trust which uses Pixoneye, and grant that app gallery permissions, the relationship you have is with that company only, and not with any third party, including us. This gives you the power to choose which brands you trust, and the transparency to know easily who you have given your data to, and how to take away access to that data. We like to call this ‘data monogamy’.
In contrast to the way we do things, many other data services, SDKs, and cookies own the data they create and often share it with others. This means you have no control over, or visibility of, where your data goes and who can see it. We view this as an abuse of your privacy, and we are thankful that, since the 25th of May, GDPR has placed this type of practice under much heavier regulation.
Strict licensing agreement
Just because the brands using Pixoneye own the data we generate for them doesn’t mean they can do whatever they want with it. Quite the opposite: our partners must abide by our strict licensing and service agreement. This agreement includes stipulations that:
- The partner, whether or not they are based within the EU, must comply with the data protection legislation in force. Since the 25th May 2018, this is the GDPR.
- They cannot install the Pixoneye SDK on a device without first obtaining permission from the user to access their photo gallery.
- They must make it very easy to withdraw the permission to access the gallery. Once this is done, Pixoneye will delete that user’s profile from its servers.
- Data created using Pixoneye’s technology may not be used to target political messages or discriminate against the user based on gender, sexual orientation, race or religion.