In our 3rd privacy expert interview, we had the pleasure of speaking with Peter Kosmala, a renowned data privacy leader, speaker and public affairs professional. At the time of this interview, Peter was the VP of Platform at dataxu — a Boston-based global software company that leverages data to improve consumer marketing.
Working in a field where data is the name of the game, dataxu had to ensure GDPR compliance was locked down. In this interview, Peter shares insights and best practices on AI, data and the ad space; how to manage and think about data, privacy and compliance at your own organization; and the process dataxu went through to become GDPR compliant.
About the Interviewee
Peter Kosmala, CIPP is a data privacy leader, international speaker and public affairs professional who has worked on behalf of many companies and organizations that include dataxu, the 4A’s (American Association of Advertising Agencies), the DAA (Digital Advertising Alliance) and the IAPP (International Association of Privacy Professionals). At the time of this interview, Peter was Vice President of Platform at dataxu — a Boston-based software company that leverages data science to improve consumer marketing.
Prior to dataxu, Peter served as SVP of Government Relations for the 4A's: The American Association of Advertising Agencies. He also served as the first managing director of the Digital Advertising Alliance (DAA) where he launched “AdChoices,” a cross-industry effort that offers online consumers choice in the use of their data in interest-based digital advertising.
Peter also helped grow the International Association of Privacy Professionals (IAPP) from a small professional community to the world’s largest organization in data privacy. He led the creation of the first privacy credential, the Certified Information Privacy Professional (CIPP), and grew this to a full portfolio of training and certification programs.
For the full interview experience with Peter, listen to the recording on SoundCloud here:
*Editor's Note: Below is a condensed version of the recorded interview.
FEROOT: Peter, tell us a little bit more about dataxu and your background?
PETER: dataxu is a programmatic media company founded in 2009. It's one of the original players in the programmatic media space and an AI-powered (artificial intelligence) platform. It's very, very data intensive, and very real time. We serve a number of clients that are both agencies and brands, and I'm the Vice President of Platform.
I am also coming out of six years of representing the largest trade association for the advertising agency business here in the States, which is the 4A’s: the American Association of Advertising Agencies. I was the SVP of government relations working in Washington. That's where the crux of my policy experience comes from.
Before that, I was vice president of the International Association of Privacy Professionals (IAPP), which is the largest association representing professionals in the privacy field across any number of industries, including government and non-profit organizations. They presently have around 40,000 members globally.
For the IAPP, I put together what has since become the standard certification for data privacy: the CIPP, the Certified Information Privacy Professional. Back when I joined the IAPP in 2004, that didn't exist. I led the team, a group of advisors, that put it together, and we assembled a body of knowledge, a testing protocol, all the essentials of a professional certification. It was designed to elevate the recognition of the profession and its importance in an increasingly data-driven economy, but also, importantly, to establish learning protocols and standards for the field so that everyone knew what was considered essential knowledge for privacy.
FEROOT: How was dataxu affected by GDPR? Do you have a lot of business in Europe?
PETER: We do. The vast majority of our business is in North America. But I think roughly a third of it is in Europe and yes, we were affected but in the sense that we got prepared.
We spent many, many months leading up to it preparing our system and our vast affiliate network for compliance. Now, it's thoroughly GDPR compliant. It was a very difficult undertaking, however, because we have a proprietary platform and we are essentially integrating a vast assortment of third-party data sources into our environment. In fact, we can do that with clients' and advertisers' data too. So, we really look at ourselves as more of an integration platform, where we're taking a wide variety of data sources, first party and third party, for the purpose of targeted marketing for our clients.
As such, we need to make sure that all of that data is locked down to the best of our ability from a GDPR standpoint. And we did that in time for the compliance deadline.
FEROOT: Can you explain a little bit more about your processes? For instance, how do you make your data practices transparent?
PETER: Well, there are policies and notices available to consumers so that they thoroughly understand what programmatic media is and how it's used. But we're not really a consumer-facing company in that regard. So, the best thing we can do is make sure that everything we're doing is locked down from a compliance standpoint, and not just GDPR but all applicable US laws as well, in every jurisdiction where we do business, which is, by the way, three regions. In North America, we're here in the US with East Coast, West Coast and Midwest offices. We have one EMEA (Europe, Middle East and Africa) office based in London that serves the region generally. And then we have offices in Sydney and Singapore serving the Asia Pacific region. We're in business in all three regions. I would say North America is our largest market, followed closely by Europe, and then Asia Pacific is third. But we're thoroughly compliant with applicable data protection laws in each of those jurisdictions.
But, as I say, our value is to the advertisers and their agencies that are serving customers or consumers with advertising messaging. We're enabling them to do all sorts of things in terms of understanding who their consumers are, what their behaviours are, and how they can refine and essentially advance and elevate what they're doing with their digital marketing, in whatever channel they're playing in.
So, we're sort of behind the scenes in that respect. Anyone that inquires directly with us obviously gets smart and accurate answers. But most of the time, the interaction or the dialogue is between the consumer and the brand. We're part of that digital advertising delivery chain, if you will, where it's between the brand, the agency, the vendor, the platform and then the consumer.
FEROOT: I am curious to hear your thoughts on privacy operations. We heard that you've heavily invested into preparing for the GDPR and other regulatory obligations from the governance, compliance and policies point of view. What did you have to change from the day-to-day privacy operations point of view leading up to and after the GDPR?
PETER: Well, we didn't really change much, because we already had a general counsel and privacy officer, Sarah Weatherford, who came on board with a very strong technology background here in the New England area.
She heads up a team of associate counsels that, from a regulatory aspect, had everything locked down from the beginning. The rigorous exercise we went through was really on the technical and engineering side of the business. We went through a very rigorous analysis of everything we were doing and made sure that it was compliant. It's not so much that we found any glaring holes or gaps. It's just that we were bringing everything into line, in a consistent way, with what we knew we had to do in terms of notification obligations, processing obligations, and certain durations of data storage and retention. All these different aspects we needed to make sure were locked down from a European perspective.
It was a good exercise to go through. I'm sure it was not unlike what others have gone through in any number of technology settings, particularly advertising technology, but it was significant work because it meant you had to look at every aspect of the product that touches data. And we're built on data. Data is our first name. So, it's something that we had to undertake quite thoroughly.
We also built a lot of privacy and security protection into the product from the beginning. It wasn't necessarily in direct response to Dr. Ann Cavoukian's announcement about Privacy by Design, as much as it just was a logical way to proceed to build those protections in: you don't want data leakage, you don't want to be using data beyond the scope it's been designed for, and you don't want to retain it longer than you need to. That's especially true for a platform such as ours, which is integrating a multiplicity of sources, from the clients' own data, to data that we've revealed, to data that our partners bring to the table, like LiveRamp or Oracle or others that we work with daily.
FEROOT: What is one of the best practices you'd recommend to get comfortable with, or to follow, in terms of embedding privacy controls right into the product, the data, or maybe the databases or the UI/UX that consumers will be using?
PETER: It's a great question, and a valid need and challenge presented by any technology company that wants to do this right and do it responsibly.
I think Dr. Ann Cavoukian's work is a natural starting point. There are resources published through Ryerson University presently, and through the Ontario Office of the Information and Privacy Commissioner (IPC), which first launched the Privacy by Design program. So, there's definitely literature there and guidance that can help to visualize what needs to be done.
There's also other excellent work in the field that emulates and follows what Dr. Cavoukian set forth, such as that of Michelle Dennedy, a very talented Chief Privacy Officer for Cisco Systems. She has published a book called "The Privacy Engineer's Manifesto", which is all about the perspective from within, from someone who knows technology and engineering well. She's a software developer herself, but she knows how to visualize privacy and security from a development standpoint, and she's written it from that perspective: if you're a privacy engineer, which is really a role that doesn't formally exist, but just notionally this idea of how do I bake in privacy early in the product development lifecycle, no matter what I'm developing, and carry that forward? That's a great book to look at for that.
I'm also a big fan of the principles-based approach. I don't think we have to overthink this. I don't think that developers and engineers need to necessarily memorize the GDPR. It's a legal document, it's very dense, and it's usually handed over to a chief counsel or a legal expert to translate. But what you can do is be very mindful, very sensible, very literal, and aim to understand fair information practices. These are the standard frameworks that have essentially informed all of the major data protection frameworks in the world. In the US, there's something called the Fair Information Practices, or FIPs; those are just very sensible principles around data collection, data storage and data retention.
The CSA model code, from the Canadian Standards Association, became the influence and inspiration in many ways for PIPEDA, Canada's federal privacy law. It's also based in part on the APEC privacy principles, from the Asia-Pacific Economic Cooperation, a global organization headquartered in Asia that includes countries in North America, Latin America and elsewhere. These frameworks articulate very specific principles around purpose specification: what is the purpose of the data that you're collecting, do not exceed that purpose, and so on.
Say you're going to gather postal code data so that you can provide weather information in return. Now that makes sense. That's purpose specification: I'm purposely asking for this data for this result. But if I suddenly exceed that and say, oh, well, now I want to market to you, you'd better do that with permission. You can't just arbitrarily exceed the purpose or gather more information than you need for the purpose at hand. It's just sensible: you specify the purpose, and that's the data you gather for. And if you want to exceed that, you need additional permission.
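Peter's postal-code example can be sketched as a simple consent check. This is an illustrative Python sketch only, with hypothetical names; it is not dataxu's implementation, just one way to operationalize the purpose-specification principle he describes:

```python
# Illustrative sketch of purpose specification: data collected for one
# declared purpose may not be reused for another purpose unless the user
# has granted additional permission. All names here are hypothetical.

class ConsentRegistry:
    """Tracks which purposes each user has consented to."""

    def __init__(self):
        self._consents = {}  # user_id -> set of permitted purposes

    def grant(self, user_id, purpose):
        self._consents.setdefault(user_id, set()).add(purpose)

    def is_permitted(self, user_id, purpose):
        return purpose in self._consents.get(user_id, set())


def use_postal_code(registry, user_id, postal_code, purpose):
    """Use the postal code only for a purpose the user agreed to."""
    if not registry.is_permitted(user_id, purpose):
        raise PermissionError(
            f"purpose '{purpose}' exceeds what user {user_id} consented to"
        )
    return f"using {postal_code} for {purpose}"


registry = ConsentRegistry()
registry.grant("u1", "weather")  # user shared a postal code for weather only

print(use_postal_code(registry, "u1", "02110", "weather"))  # permitted
try:
    use_postal_code(registry, "u1", "02110", "marketing")   # not consented
except PermissionError as err:
    print("blocked:", err)
```

The point of the sketch is that the "additional permission" step becomes an explicit gate in code, rather than a policy that lives only in a legal document.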
It just seems very logical. These are the choice moments you have when you're building a product or devising a data-gathering scheme, where you need to ask yourself, as a developer: what would the consumer think in this situation, and what do they need to know? They don't need to know everything, because you can over-inform them. They can get notice fatigue, and this has happened. So, you have to strike the right balance of knowing the context: what principle can I apply, and how can I operationalize it?
It's a very long-winded answer to say: just keep it simple. Don't get overwhelmed by legal requirements. Take the principles-based approach, and understand what literature is out there and available to you, as a developer, engineer, product lead, or even senior exec, that can help you understand what needs to be done. And in many cases it's been done by peers, by people who are just like you, facing similar challenges, and who have figured it out.
FEROOT: In your opinion, is privacy affecting purchasing decisions? Have you noticed this trend? And because dataxu was ahead of the game on privacy compared to most companies, did that make your company a more competitive choice as a vendor?
PETER: That's a great question. I mean, on the one hand, you want it to be asked, because you want it to be important, because you believe in privacy as I do. I know you folks do; it's what you do. So you want it to be important, and you want people to be aware of it and asking about it. The flip side, however, is that it's not asked about all the time, and that may not be because people don't care or don't have a concern for it; it just has a relative importance next to other things. What we found, and what I can attest to from my own travels with dataxu, is that it has come up. It's not a deal breaker. It's not the leading question, like, before we get started, what do you guys do about privacy? It's not necessarily that, but GDPR has elevated the consciousness in many ways. It's been written about so extensively, even in mainstream media (TV, newspapers, etc.), that even general folks who didn't really think they were involved in data in any way (and yet, we all are) have come to understand: wow, there's a new regulation, I wonder if I should be thinking about this. And businesses are doing the same.
GDPR drove up visibility and awareness in a way that, prior to May or even prior to last year, really wasn't coming about all that much. So, we do get asked about it. It's actually, I would say, one of the checklist issues that agencies and brands ask about and want to know. They want to know: are you GDPR compliant? First, because they don't want to get in trouble; they don't want any liability issues. So that's just a basic assumption, like "you guys have got this locked down, right?" Because if you don't, that reflects very poorly on your general awareness of what needs to be done.
So, that's almost an assumption, but they do ask generally about privacy: what sort of notices do you provide? What sort of information is available? What policies have you set forth? They also ask about things like ad fraud, because they want to know how you're dealing with that and be assured that everything they're getting is actually human traffic, not bot-driven, false or fraudulent traffic. They want to know about brand safety. They want to make sure you have a system in place to ensure that the sites, channels and environments their brand is going into, whether that's an ad message in text, social media, or a graphic or display ad, are legit, that each is a brand-safe environment with nothing to do with human trafficking or drug smuggling or porn or any number of these salacious categories. There's a huge hierarchy created there. So what this really all comes down to is trust. The brand or the agency wants to engage with you as a platform and know that you've got every dimension of trust locked down. And that's really, ultimately, how to look at privacy.
Because if you just look at data privacy by itself, rather than from the big picture, you're actually siloing it down a little too far. People don't necessarily make decisions on that alone; they want to know whether you are trustworthy as a partner and as a brand. Privacy is a big part of that, but it's not the only part. So I would say we have conversations with people where they don't actually say "can we trust you?", but you can tell that's the subtext of the line of questioning. They want to know about privacy and security, ad fraud prevention, brand safety, all these dimensions that impact their relationship with the consumer and that would adversely impact their brand image if any of them went wrong.
FEROOT: That was a fantastic and very thorough answer. Thank you!
PETER: Good. You're welcome!
- Read our past expert interview with Dr. Ann Cavoukian
- Check out our interview with Eric Sutherland on Building a Framework of Trust
- Sign up to our blog to get notified of new content, webinars and interviews!