“Consumer data should be owned by the consumer. If we want to collect and use it for any marketing purpose, we must explain how we will do so – and obtain consent and permissions. (GDPR explains this quite nicely.)

But to get that agreement, the consumer must understand the trade-off. They need to understand what’s in it for them and see real value in the arrangement. On the whole, I’d argue we’re not yet holding up our end of the bargain,” says Robin ‘Bob’ Caller, CEO and founder of Overmore Group.

“That data is not, and never will be, ‘your’ first-party data. The customer is the first party in this transaction – you, as the marketer, are the second party.”

“As second-party data holders (i.e., you, the marketer), brands must obtain permission from the first party.”

“The path of least resistance is accepting that ‘privacy-first’ means ‘user-first.’ That needs to be buttressed by express and informed consent, an unbundling of permissions and empowering consumers to retain sovereign control over the what, as well as the why, when and how their personal data is used.”

Details in this AdExchanger article.

About face.

“Obscuring your face does not hide you from facial recognition systems, researchers have found.”

“A group from the Max Planck Institute found that blurred images were still individually identifiable with just a few non-obscured images to train from. With the proliferation of images on social networks, it is possible that almost anyone’s blurred face could still be identified.”

“The researchers said only 10 fully visible examples of a person’s face were needed to identify a blurred image with 91.5% accuracy. With an average of just 1.25 tagged images, the system could still correctly identify an individual 56.8% of the time, which is 73 times higher than chance would allow.”
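The “73 times higher than chance” figure lets us back out the implied chance baseline. A quick sanity check (the candidate-pool size below is my inference from the two reported numbers, not a figure from the study):

```python
# Back out the chance baseline implied by the reported numbers, assuming
# "chance" means uniform random guessing (1/N) over N candidate identities.
accuracy = 0.568   # reported identification rate with ~1.25 tagged images
lift = 73          # reported improvement over chance

chance_baseline = accuracy / lift        # implied probability of a random guess
implied_pool_size = 1 / chance_baseline  # implied number of candidate identities

print(f"chance baseline ≈ {chance_baseline:.4f}")            # ≈ 0.0078
print(f"implied pool ≈ {implied_pool_size:.0f} candidates")  # ≈ 129
```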

“The best method for staying anonymous is to post all your pictures … with a black box over your face and shoulders. The next safest would be blocking it out with a white box, then a Gaussian blur.”

Details in Wired magazine.

“[T]he customer is king. And data sharing increases comfort and convenience for customers, improves products and services, and contributes to achieving societal goals such as improving road safety, reducing fuel consumption and facilitating traffic management,” says Eric-Mark Huitema, Director General of the European Automobile Manufacturers’ Association (ACEA).

“[The] principle that should guide the future framework for access to in-vehicle data is customer choice.”

“In-vehicle data sharing should be based on clear terms and conditions ensuring that consumers know what data they share and with whom, in full compliance with privacy and data protection rules. [C]ustomers need to give permission to allow third-party access to data and that they should remain in control of data sharing at all times.”

“Any future EU framework for access to in-vehicle data should not constrain innovation and competitiveness. Instead, it should lay down basic principles in key areas to safeguard fair and non-discriminatory access, technology neutrality, customer choice and – above all – people’s safety and security.”

Details from the European Automobile Manufacturers Association.

Consent seems to be the hardest word.

“Ad tech companies are investing in better consent and preference experiences for end users simply because they have no other choice but to try and emerge as brands that end users can safely trust,” says Romain Gauthier, CEO and co-founder of French consent management platform Didomi.

Ad tech companies are in a good position to help address that challenge because they have integrations with both supply and demand, said Todd Parsons, Criteo’s chief product officer.

“Making consent interoperate across sites isn’t something CMPs can do,” he said. “Consumers benefit from ad experiences that are respectful of privacy and personalization preferences not just for one site but across domains – and getting consent is just one part of that.”

From a consumer perspective, if a login is required, people will use whatever mechanism they’re presented with as long as they’re convinced there’s value in doing so, Parsons said.

Details in this AdExchanger article.

“No consent – no photos” is the new “No shirt, no shoes (no mask), no business.”

You need consent under GDPR to take photos or videos in a pub and upload them to the pub’s social media networks. An informative sign is not enough, says Spain’s Agencia Española de Protección de Datos – AEPD.

The fact that the pub had provided an explanatory poster does not guarantee that the consent was unequivocal, since it can’t prove that each and every one of the attendees at the event had read the poster and agreed with its content.

Details from GDPR Hub.

Canada’s Privacy Commissioner Daniel Therrien said key parts of the proposed Consumer Privacy Protection Act (CPPA, also known as Bill C-11) won’t increase consumers’ control over their data. He suggested quick and effective remedies for violations of the law and encouraged innovation.

The CPPA “leaves out an important facet of our current legislation, the idea that meaningful consent requires that the person giving it understands the consequences of what they are consenting to.”

“Moreover, the privacy notices that serve as the basis for consent would still be allowed to use vague, if not obscure, language to describe the purposes for which companies intend to use a person’s data.”

Bill C-11 lists only a few violations of the CPPA that justify administrative penalties. The list, Therrien noted, does not include obligations related to the form or validity of consumer consent for handling data, nor the numerous exceptions to consent, “which are at the core of protecting personal information.” It also does not include violations of the principle of accountability.

Details from IT World Canada.

“Complying with GDPR and ethical considerations when developing a digital service is actually a ‘win-win situation,’” says Forbrukerrådet’s eloquent Finn Lützow-Holm Myrstad in a conversation with the IAPP – International Association of Privacy Professionals’ Jedidiah Bracy.

Some key points:
  • If you don’t collect the data, it can’t be leaked or misused. If there is too much data, it’s bound to be misused.
  • Data Transparency: This is important, but it won’t fix everything. You will never have the perfect consumer who will make the perfect choices, the disclosures take too long to read and manipulative choices are hard to overcome. This is why we need some baseline protections in place.
  • Default settings play a significant role as the vast majority of people will stick with those choices.
Dark patterns include:
  • Illusion of control: choices which are too granular or difficult to find.
  • A big difference in the effort required to turn a feature on versus the effort required to turn it off.
  • Confirm shaming: when you are made to feel bad for making a certain choice.
  • Triggering fear of losing something if you make the choice.

Details in this IAPP Article.

“Cookie replacement solutions connecting first-party data to individual ads through universal IDs are coming, but rather than chasing a retooled version of a historically clunky solution, marketers should build new data frameworks that employ statistical modeling and AI to illustrate a probabilistic media journey,” says Mark Sturino, VP of data and analytics at Good Apple.

“The specifics of how this can be done will vary by industry and primary KPIs; building plans that consider media mixes at the DMA level with purposeful changes in budgets, channel mix, and messaging in previously identified markets will provide data scientists the necessary variance in data across similar markets to run better analysis.”

“An additional advantage of a probabilistic approach looking at data from population segments rather than individuals is that it can be more directly tied to key ROI-driving tactics… A better connection of media data to sales will allow for better ROI conversations with brand clients, and connecting marketing efforts to ROI is clearly fundamental to success.”
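The DMA-level test design Sturino describes can be sketched as a toy regression: vary channel budgets deliberately across comparable markets, then estimate per-channel sales response from that variance. Everything below (markets, budgets, coefficients, the plain least-squares model) is an illustrative assumption, not Good Apple’s actual methodology:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical test design: 12 comparable DMAs with purposeful variation
# in weekly TV and digital budgets (in $k).
tv = rng.uniform(10, 100, size=12)
digital = rng.uniform(10, 100, size=12)

# Simulated sales with known "true" responses (3.0 for TV, 5.0 for digital),
# a baseline of 200 and noise; in practice this column comes from sales data.
sales = 200 + 3.0 * tv + 5.0 * digital + rng.normal(0, 10, size=12)

# Ordinary least squares: sales ≈ baseline + b_tv * tv + b_dig * digital
X = np.column_stack([np.ones_like(tv), tv, digital])
(baseline, b_tv, b_dig), *_ = np.linalg.lstsq(X, sales, rcond=None)

print(f"estimated TV response:      {b_tv:.2f} sales per $k")
print(f"estimated digital response: {b_dig:.2f} sales per $k")
```

The purposeful budget variation across markets is the “necessary variance” the quote refers to: if every DMA ran the same mix, the spend columns would be (nearly) collinear and no model could separate the channels’ effects.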

Details in AdExchanger.

“A vast body of research has shown [mobility] data is highly reidentifiable. Previously, researchers showed that knowing four random points of someone’s trajectory, such as when and where you take your morning coffee, was enough to uniquely identify that person 95% of the time in a dataset of 1.5 million people. Other studies have replicated similarly high unicity numbers with location data obtained from vehicles, smart cards in public transport, credit card transactions and mobile phone metadata in a number of countries.”

“But what happens when the dataset is much bigger? Do trajectories get ‘lost in the crowd’ and become effectively anonymous?”

Per a study conducted by Ali Farzanehfar and Florimond Houssiau, the answer is “No.”

“People remain unique in population-scale datasets, and thus dataset size is not sufficient protection for individual privacy. The reidentification risk is still very high even for a population of 20 million people (93% of people unique with three points).”
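The “unicity” metric behind these results can be illustrated with a toy simulation: give each synthetic person a set of (place, hour) points, sample a few of their points at random, and count how many people are the only match for their own sample. The population, grid and point counts below are arbitrary illustrations, far smaller than the study’s data:

```python
import random

random.seed(0)

# Toy world: each person's trajectory is a set of (place, hour) cells.
NUM_PEOPLE = 1_000
POINTS_PER_PERSON = 50
PLACES, HOURS = 200, 24

people = [
    frozenset((random.randrange(PLACES), random.randrange(HOURS))
              for _ in range(POINTS_PER_PERSON))
    for _ in range(NUM_PEOPLE)
]

def unicity(k: int) -> float:
    """Fraction of people uniquely identified by k random points of their trajectory."""
    unique = 0
    for trajectory in people:
        sample = frozenset(random.sample(sorted(trajectory), k))
        matches = sum(1 for other in people if sample <= other)
        if matches == 1:  # only the person themself matches the k points
            unique += 1
    return unique / NUM_PEOPLE

print(f"unicity with 4 points: {unicity(4):.1%}")
```

Even in this small toy population, a handful of spatio-temporal points is almost always unique to one person; the study’s finding is that this uniqueness persists at population scale.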

Read details of their research via the International Association of Privacy Professionals.

U.S. lawmakers have reintroduced legislation to protect connected devices.

“IoT” should also stand for “Internet of Threats” until we put in place appropriate cybersecurity safeguards, said U.S. Sen. Ed Markey.

The Hill reports that the Cyber Shield Act introduced by Sen. Markey and Rep. Ted Lieu would:

  • Create a voluntary cybersecurity certification program for internet-connected devices.
  • Establish an advisory committee made up of cybersecurity experts in government, the private sector and academia to create security benchmarks for internet-connected devices. The benchmarks would enable the devices’ manufacturers to voluntarily label their products to show they have met these standards.

Details in The Hill.