Many of the largest companies on the Web are using ad dollars to pay their power bills, so much so that advertising — particularly when based around targeted ads — has become the engine of the modern Internet economy. Google’s empire was built on ad dollars. Microsoft’s effort to purchase Yahoo was about bringing in more ad revenue. Internet pioneer AOL is pegging its entire future on plans to transform itself into an ad-supported media and content company.
Through a string of acquisitions, AOL has built the largest ad network on the Web. According to comScore, ads placed by Platform A reach more than 90 percent of the online population.
But that also means that AOL business units have access to information about the activities of nine out of 10 Internet users. What happens to all that information? How is it used? Just how much do Internet companies know about what we do online?
Watchdog groups aren’t the only ones asking these questions. Increasingly, many lawmakers are wondering if it might be time to set some limits on how Internet companies collect and use data.
In December, the Federal Trade Commission issued a set of self-regulatory guidelines on behavioral targeting for the concerned parties to consider and comment on. Predictably, many Web companies complained that the principles were too restrictive, while consumer advocacy groups griped that they did not go far enough.
These groups charge that Web companies, left to their own devices, cannot be trusted to adequately protect consumer privacy.
As a result, groups like the Center for Digital Democracy have lobbied — unsuccessfully — for legislation to protect consumer privacy online. In New York State, Assemblyman Richard Brodsky has introduced a bill that would significantly curb current online data collection practices.
Against strenuous objection from industry groups, the bill, loosely based on principles drafted by the Network Advertising Initiative (NAI) in 2002, is now pending in the New York legislature.
Recently, InternetNews.com had a chance to sit down with Jules Polonetsky, AOL’s chief privacy officer, while he was in town for a debate about online privacy at the National Press Club in Washington, D.C.
Polonetsky’s career prior to joining AOL included a stint as chief privacy officer at DoubleClick, the online ad giant recently acquired by Google, and terms as a New York State legislator and New York City Consumer Affairs Commissioner.
Q: A lot of the discussion around behavioral targeting concerns whether we need government regulation to protect consumers. The FTC has opted for a policy of self-regulation. Privacy advocates on one end say that is not enough. On the other side, we have the [Interactive Advertising Bureau, or IAB], of which AOL is a member, saying that no one has been able to prove that there is a legitimate harm to consumers. Is there a harm to consumers?
I think the key thing for both consumers and businesses is that there are clear rules in place, so it’s obvious to consumers what’s happening and obvious to businesses what’s appropriate and what’s not. Don’t use sensitive profiles. Don’t keep data forever. Give people the ability to opt out.
I think it’s hard to envision a law that solves all of the issues in one neat box. Who knew years ago that people would want to be broadcasting information about themselves to their 800 friends? Despite my snafu of inadvertently spamming all my friends with an “I love you” message intended for my wife, I generally enjoy the ability to share my pictures with people and to connect with my friends.
So I think the range of consumer perceptions of what they actually want and the kind of controls that are needed are still evolving.
Years ago, when I was the chief privacy officer at DoubleClick, people said, “Look, we’ll have a law and regulate this.” Who was even thinking about Web 2.0? And who was thinking about mashable widgets and programs that plugged into each other and social networking?
We would have completely missed the boat, so by the time we finished nailing the right definitions, we would be five years out of date.
So I think what businesses and consumers need in the privacy arena now is clear standards. Businesses will compete — and do quite well — as long as there’s a level playing field. Consumers, as long as they understand and feel in control of their experience, will be satisfied with their interactions.
Q: Can you ever imagine a time when the technology settles down enough that national legislation would make sense?
I think businesses have a chance to prove now that, just like Web 2.0 has given users more control over their experience, they’re ready to give consumers that same level of control over their data. If the technology solves the problem, there won’t be a problem. If the business practices and technology don’t solve the problem, the inevitable result will be legislation aimed at stopping the outlier practices.
Q: Any concern that it would take some kind of catastrophe to bring that about?
Bad laws are usually made in response to knee-jerk reactions to specific instances. The bigger issue here is as businesses increasingly use more and more data online, are they recognizing that they are the custodians of user data and have to prove that they are trustworthy in how they use it? We have the chance to ensure that technology goes in that direction.
If businesses don’t succeed, then much of this use of data will be rejected by users, and business models or technologies that limit personalization will be increasingly popular. So it behooves industry to step up to the plate.
I applauded the FTC when they proposed their principles. I said it would be a provocation to industry to move on some of these issues. They are not brand-new issues. Some of them have been around for years and years. But there is a need to move. So I don’t look at it in terms of harm. I look at it as what is it that businesses should be doing to please consumers and serve the advertisers who help provide that experience that works for both.
If there is a business and market failure, then all kinds of crazy things happen at that point — legislation [or] use of ad-blocking, cookie-blocking and other technologies that would address the issue far quicker than a bill. So if you don’t want consumers running away from your site because someone else is providing a better experience, you had better super-serve them. Today, businesses are only beginning to understand that super-serving consumers means proving to them that they can trust you with their data.
Q: AOL and other companies have pursued a policy of privacy education. What are some of the biggest misunderstandings among consumers about what is going on with their data?
I don’t think education solves the problem. The solution is a combination of education; more transparency and visibility for the people who don’t want to spend time being educated but want to understand what’s going on when they interact with a program; and clear rules that don’t allow practices that people are likely to find overly intrusive. If you succeed in those areas, you solve the Sturm und Drang that exists in the market.
Q: AOL collects many different kinds of data. In your FTC comments, we read that personal account information for the portal AOL.com is kept apart from cookie-based ad units like Advertising.com and Tacoda that are designed to avoid personally identifying users. But the fear is that all of these data are under the umbrella of one company — AOL.
Then you really better appoint a very senior, respected, strong and well-paid chief privacy officer if you’re going to make sure those sorts of internal rules are being followed. [Smiles] Only partly being facetious.
Q: How are the data sets siloed?
Some of the businesses are actually different businesses that are based in different places that have different infrastructures and technologies. Others are integrated, and I need to make sure that there are technical barriers in place, and in other cases I need to make sure that there are policy barriers in place so people can’t access data — even if it’s something that’s appropriate under our policy — without ensuring that that use is something that users would be comfortable with.
So in some cases it’s policy rules, some places it’s technology barriers, and in some places it’s the reality that the business units aren’t integrated.
Q: So in the case of Tacoda, is there a technical barrier between the data it accesses and the personal information of the AOL portal?
I think there’s a good structure to ensure their business model does what it does when they serve ads on AOL, which they do — we are a member of their network just like other Web sites. My people need to understand that any of the data we may have, [just] because we recognize you personally, may not be linked to the Tacoda cookie.
So I just sent out cookies [real ones] to the entire company with a little “recipe” for proper [Web browser] cookie use — “Recipe No. 1 for Proper Cookie Use.” In it, I gave a number of cookie practices.
For instance, I say, “Why would you be setting a cookie that expires in 30 years? Do you expect to have a cookie that lives for 30 years? Has a consumer ever had a computer for 30 years? Has any cookie ever lived for 30 years? They get deleted, they get opted out, you change your preferences … Why would you set a 30-year cookie? Stop doing it. Make sure it expires in two years, because there’s no way you’re going to need one for anywhere near that long, so do that.”
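That two-year guidance translates directly into how a cookie’s lifetime is set in code. As a rough sketch — not AOL’s actual code; the cookie name and value here are invented for illustration — here is what a two-year expiry might look like using Python’s standard-library `http.cookies`:

```python
from http.cookies import SimpleCookie

# Two years expressed in seconds, per the "expire in two years,
# not 30" guidance above.
TWO_YEARS = 2 * 365 * 24 * 60 * 60  # 63,072,000 seconds

cookie = SimpleCookie()
cookie["ad_pref"] = "optout"            # hypothetical preference cookie
cookie["ad_pref"]["max-age"] = TWO_YEARS  # browser discards it after this
cookie["ad_pref"]["path"] = "/"

# Render the value that would go into a Set-Cookie response header,
# e.g. something like: ad_pref=optout; Path=/; Max-Age=63072000
header = cookie["ad_pref"].OutputString()
```

A server emitting this header gets the same practical behavior as a 30-year cookie for any realistic user, while guaranteeing stale identifiers age out.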
One of the other pieces in this recipe that went along with this nice, black-and-white cookie that they all got is: “Remember, our ad-serving divisions — Adtech, Ad.com, Tacoda — have made commitments that their ad serving is not personal, that they do not connect personal information to any of their ad activity. Remember that you must keep your personal AOL data separate from that ad data.”
Part of it is expecting our people to read the privacy policies, but we also promote it and educate people about it. There is a privacy lead in each of those divisions. They’re charged with making sure that that barrier is kept. I have technical people on my team who actually test and self-audit and run cookie scans. I have a cookie registry where I have developers across the company registering their cookies with me, telling me what they do so I can check them and say, “Sure.”
Q: These are all internal controls.
They’re internal, but these are things that are visible. There’s often discussion that a lot of this is invisible. I would submit that most of the data collection that happens in the rest of the world is somewhat invisible, too. I use my discount card at the supermarket — I don’t see the data flowing from my finger into the computer into wherever it goes.
Technically, I agree that the average consumer isn’t sophisticated [enough] to go ahead and learn [how their data are being collected and used], but lots of reporters are, lots of advocates are, lots of techno-experts are, and if you do something that’s out of line and you’re high-profile, you’re going to get caught. You can’t get away with it. The openness of most of the technology does create a policing [mechanism].
One of the things I’m trying to do on Privacy Gourmet [a corporate blog] is give some of those technical users an explanation of some of the nuts and bolts so they can do their work — their exposing and their discussing, because that will help police the system as well.
So Jeff [Chester, executive director of the Center for Digital Democracy] plays a key role and Marc [Rotenberg, executive director of the Electronic Privacy Information Center] plays a key role in helping internal privacy officers say, “You can’t do this.”
There are people out there who will bring the wrath of God down on you.
Q: Getting back to legislation, how has the conversation gone in New York, with Assemblyman Brodsky, one of your old colleagues?
I haven’t had a chance to talk to him yet about the NAI principles — which I was one of the lead drafters of, and have just helped update substantially. The eight-year-old principles had lots of holes, so I don’t think it’s a great idea to enshrine eight-year-old rules that most advocates haven’t liked because of those holes. So the NAI has just put out its proposed new rules, there’s a 45-day comment period, and there’ll be some changes.
Not everybody is a part of the NAI. And I’d argue that companies that do behavioral targeting and aren’t submitting to the self-regulation — who knows what they’re doing? It’s a shame if they’re not in it, because we’re all committing to these decent practices.
When there’s a baseline law that enshrines this lower level of previously self-regulated activity as what’s OK, it makes it very hard for the good players to say, “No — the minimum is this higher standard. No one ought to be doing less than this higher standard.” It would legitimize an older set of standards that haven’t been updated to include some of the more current issues that the NAI document now covers.
I have not had a chance to talk to Richard [Brodsky]. He’s an incredibly smart guy and a former colleague. Hopefully, at some point, we’ll have a chance to meet and discuss the [NAI] rules and how they actually play out in the industry, and see where that goes.
Q: So the principles on which the bill was based were drafted in 2002. Would his bill be more palatable if it were based on the updated version just released in 2008?
I hope it won’t be seven years before we update and include new stuff. Things are changing quickly.
I think there’s a very good chance that, for anything passed today that captures the updated version of the NAI rules, by the time it’s signed by the governor I’m going to have a list of things where I’ll be saying, “Wait, I thought we solved that problem. Now there’s an outlier doing something different, and we’ve got to include that.”
So I think that this is an area that self-regulation — being able to tweak as you see what companies are actually doing — is paramount. Legislation would freeze in place a solution that would be outdated sooner than that.
Q: So if legislation can’t keep up with the nuances of technological innovation, is there any room for some kind of a baseline privacy law on data collection?
I think this is an area where we’re still trying to figure out exactly what people’s expectations are.
I would often have debates with the guy who invented the Buddy List, and he always had different ideas about promotional things he wanted to do, and I’d say, “No! You can’t do this! You can’t do this! It’s got to be opt-in …” And he’d say to me, “You know, if you were around when I did the Buddy List, you would have said ‘opt-in.’ People downloading instant messaging software shouldn’t be able to see whether their friends are online.”
And I probably would have. Why should you — just because you know my screen name — be able to track whether I’m online or not? That’s outrageous! Opt people in.
And he’s like, “That would have been the end of it, because the whole point of instant messaging is that I can ping you because I see that you’re online. There, you broke my product.”
You can easily see a law that says you shouldn’t, by default, broadcast information about what you’re doing to all your social networking friends. Well, I kind of like that; I like being able to do it.
If I wasn’t even aware of that business model when I was drafting legislation, who knows what I would have broken that people do indeed want and are now using to promote political candidates or using to fund-raise or using to do all sorts of things?
I think it’s early yet because we don’t know what the users’ baseline expectations are. I think you have companies committing themselves by putting up privacy policies and making themselves liable for doing or not doing things.
As the FTC becomes increasingly technologically sophisticated, where they understand what’s going on and are able to bring enforcement cases that really target particular nuances of representations, companies are increasingly going to understand that there is a pretty good law in place, and that’s the law against deceptive trade practices.
If what you’re doing is deceptive to the users, you’re on the hook for some pain.
Q: Any types of data that should be categorically off limits?
Both AOL and our ad networks have long had rules against using the kinds of clickstream profiles that, even if anonymous, people might find discomforting — profiles based on sensitive health conditions or sexual preferences.
We don’t want there to be “the Viagra guy,” “the HIV person,” the person labeled with a sexual preference because of their Web browsing.
One of the holes in the NAI rules is that if it was personal, it was ruled out — but it wasn’t addressed if it was nonpersonal. I had started getting requests from advertisers to do some things that were discomforting, and I said, “You know what, it’s against our rules, but someone else is going to do it somewhere, and it’s going to embarrass the entire industry. We ought to have these rules across the industry.”
And the NAI indeed adopted them, and just about everybody who does anything behavioral in a significant way is now bound by those. Google is in the midst of joining. Microsoft, I think, just did join. You don’t have everybody, but you have just about anybody who’s anybody.
Q: AOL is not.
Ad.com is, and Tacoda is. That’s where we do the kind of activity that’s covered. I actually follow these rules even on AOL, but we don’t serve ads on other sites — our ad servers do that as part of their activity.
Q: In Europe … they have taken a little tougher stance on privacy, with the Article 29 Working Party recommendations that six months is the maximum amount of time that you can keep search data on your server logs, considering IP addresses as [personally identifiable information] and some other things.
Any comment on what the effect on AOL’s business would be were those recommendations to become the law of the land in Europe?
We will, of course, comply with the guidance of local regulators, and are taking a look at what needs to be done to tweak any systems so that the rules are well-respected.
Q: And the business impact?
To be determined, I think. To be determined.