Kamala Harris sent shock waves through the mobile app developer community with her authoritarian ultimatum. The clock is ticking on her recent 30-day warning of steep fines for developers selling mobile applications in the State of California that do not display a privacy policy before the app is downloaded.
In keeping with InfoQ’s tradition of presenting useful content from around the web for its readers, the following is a debate about the brewing effects of AG Harris’ investigation into the majority of mobile apps sold in that state lacking proper privacy notices, as defined by California state law. Here’s a comment from a community member over at networkworld.com:
those that advocate turning privacy OFF by default present the most embarrassingly specious arguments to support their actions. Every argument I've seen can be used for the other side as well. Bottom line: most people want privacy ON by default. Data miners don't.
The exchanges at Slashdot.org positively sizzled.
by Nerdfest: ‘Why treat mobile apps as a special case? All software applications, client-side or web-based, should be treated the same way.’
by demonbug (309515):
They aren't treated as special cases. The rules apply to any online applications, which include pretty much all mobile apps. It's just that mobile app makers have been very poor at following the rules, likely because so many of them are small fly-by-night companies that don't have a legal department telling them what they are supposed to be doing. So 100 companies get notices that they need to have privacy policies posted, it gets splashed all over the news, and hopefully this will wake the others up to the fact that they need to be doing this just like the big boys.
by Bogtha (906264)
She's supposedly been consulting with app developers, although not ones representative of the larger industry. This is what could happen if the policy had to be within the app:

- Receive a letter requiring a policy in your app within 30 days.
- Shit, we outsourced this (common, because mobile developers are few and far between). Pay for changing the design to include a button to show the policy.
- Pay for a developer to make the necessary changes. Shit, the developer we used has a full schedule; we have to find somebody else (again, common).
- Find a new developer. Get them up to speed on the project and get them to make the changes.
- Submit the update to Apple. Wait an unknown amount of time for it to be reviewed.
- Apple doesn't like something in your app. Maybe their policies changed, maybe a previous reviewer didn't catch something, maybe you've just got a bad reviewer. Go back to the designer and developer and pay for them to do more work, if feasible.
- Resubmit to Apple. Wait an unknown amount of time for it to be reviewed.

And you've got to fit that into 30 days. And that assumes the changes Apple requires you to make aren't fundamental to your business model or operation of the app. And that assumes only one round of alterations is required. And that assumes it's feasible for you to pay for expensive mobile developers. Meanwhile, here's what it would be like if the policy only needs to appear in the App Store:

- Receive a letter requiring a policy for your app within 30 days.
- Stick a policy online. It can be anywhere; even if you don't have a website, you can just sign up on Wordpress.com or something and post it there.
- Log into iTunes Connect and put the link into the privacy policy field.
by concealment (2447304)
Instead of attaching a sample compliance letter, why didn't the AG attach a sample privacy policy and open source it so that developers can use it? Pasting in a generic document is much more likely to happen than all those app developers running out and hiring lawyers, so she will either get lower compliance or shoddier privacy policies. Is it too much to ask that government take the lead in this case? I can't imagine it costs the AG anything, since that office hires a staff of lawyers.
by Sarten-X (1102295)
Why didn't the AG attach a sample? Because it's a silly idea. This is a legal document, probably differing for every case, and the point in requiring it is to make developers take a hard look at what information they access and how they use it. Rubber-stamping a boilerplate lets developers say they have a privacy policy, but it doesn't actually encourage any increase in privacy until somebody's sued over it. Once that happens, there will be a few developers who think about privacy, but most won't even know the case happened.

Like most legal documents, you usually don't actually need a lawyer to write it. You may need a lawyer to make it bulletproof against other lawyers, but any statement is enough. You could drop in a note saying "This app doesn't intentionally collect any personally-identifiable information, and doesn't contact external services" and probably satisfy the needs of the law, assuming it's accurate.

In the event of a lawsuit, though, that statement would cause a little trouble (and open up room for opposing lawyers to argue), because it doesn't define "personally-identifiable" or "external" adequately. Does a game ask for a name for a high-score list? Does it send usage reports or download updates from a developer's server? A lawyer could enumerate all the things the app does and doesn't do, in absolutely clear language, so there's no question where users' data goes, but for many apps (especially those made without the intent of profit) that's unnecessary. Developers should already know how their program works, so they should be able to define one aspect of it.

Disclaimer: IANAL, but I've had my share of dealings with them.
by concealment (2447304)
I disagree that it's going to be that different. If they need to list different data fields that will be retained, or change a length of time, they can edit the open-source document for their specific needs. But this gives them a template to work from which has all of the lawyerese perfected. I can't agree that the document will differ in every case. In my experience, the differences will be slight, and thus having an open source document would encourage programmers to adopt a general standard (like a community rule) for how they're going to approach privacy issues. The result would be a raising of the overall standard to that of the proposed document, which is why it's a good idea to have professionals write it and "promulgate" it.
by Sarten-X (1102295)
A privacy policy shouldn't just be a checkbox on a compliance procedure. Like any policy, it should only be the result of careful consideration. Yes, eventually many developers will come to broadly the same conclusions, but the process of writing (and verifying) the policy conveys the importance it should have. The privacy policy is effectively a promise of what your app will or won't do, and if that promise is made just to save time, it likely won't mean anything to the person making it.

Sure, there could be a Creative Commons-like system, where developers pick and choose what options they include. My concern is that by having an easy-to-make policy, the policy is also easy to forget. When a later version adds a new feature or advertisements, how likely is it that the long-forgotten privacy policy will be updated to match? If a legally-bulletproof blanket-permission policy can be made cheaply and easily, why not just apply that to all apps, regardless of the actual capabilities of the program?

Clear language? Legalese is about as far from clear as one can get. Not to mention, how exactly do you enumerate all the things your app doesn't do? "No other personal information is collected" or other similar wordings will do nicely. If there's something that you know your app will never try to do, it can be listed as a reassuring gesture to the user.

That works for apps off the various stores... but doesn't work for pre-installed bloatware the provider is running while you use your phone. No app needs access to your cellular phone except to keep it from sleeping. Asking for contacts is bad programming and presentation... I'm seeing a lot of this disappear from the Google apps. But Apple doesn't present rights to the end user...
by Bogtha (906264)
This is a legal document, probably differing for every case, and the point in requiring it is to make developers take a hard look at what information they access and how they use it. Rubber-stamping a boilerplate lets developers say they have a privacy policy, but it doesn't actually encourage any increase in privacy until somebody's sued over it. This happens anyway. I have to fight this battle every time I build an app that collects personal information. Every single time in four years of developing apps, I have been provided with the privacy policy for the client's website, one that specifically describes things only applicable to their website and doesn't account for their mobile app at all. I've got a current project hanging at the moment where we've chased the client for a real privacy policy about half a dozen times. The rest of the app is finished; we're still waiting for the privacy policy, weeks later. If it wasn't for us insisting, the app would be live with a meaningless privacy policy they don't follow, and I'm certain other app developers aren't as insistent as us.
by Eraesr (1629799)
Actually, that isn't the biggest problem. Yeah sure, an in-app privacy policy is a problem for a developer, but I'm sure that if you've submitted your app to the App Store within the 30-day limit and it's denied by Apple for an unrelated reason, a judge will probably take that into account when deciding on the issue. No, a much bigger issue is the difference between an in-app and an in-store privacy policy for the consumer. If the privacy policy is in the store, you can read it and assess it before downloading and installing the app. If you don't like the privacy policy, then don't download and install the app. If it's an in-app document or link, then you have to download, install, run, possibly even create an account and log in, all before you get to see the privacy policy. By that time, the app has probably already completely sucked all personal information out of your phone and submitted it to the app owner. Same with a EULA (End User License Agreement) that's presented to you when you install a piece of software on your PC. That EULA is presented to you after you've bought the software, so if you don't agree with the EULA, then I'm pretty sure the seller is forced to completely refund the software to you. It's basically the same thing as buying a loaf of bread from the baker, and after paying, the baker says that you are only allowed to eat the bread at home, and only if you don't put any meat on it.
While at computerworld.com, denizens there had this to say:
This really points to a larger problem regarding those who want to leave everything up to the states: you wind up with all these different individual state laws that you must contend with if you plan on doing business (among other things) nationally.
Edgar Lafsatfanbois: ‘You'll soon start seeing "Not available in CA" on apps like other products.’
by fustakrakich (1673220): ‘...why didn't the AG attach a sample privacy policy and open source it so that developers can use it?’
by bmo (77928)
…the real intention here is to put small independent developers, with their 'disruptive' technology, who can't afford a gaggle of lawyers, out of business. It's not a conspiracy. This is an issue everyone running single-line BBSes in the 80s had to deal with. The ECPA became law 24 years ago; the California AG's message should not surprise you. Copy someone else's privacy policy. It's what lawyers do anyway. You think they actually work at this stuff? It's all boilerplate. You can say "we do not collect any user data" and make sure your program doesn't phone home, or you can disclaim all privacy whatsoever and hope nobody actually reads your privacy policy. Copy Facebook's privacy policy if you want to be evil. They bury the "we own everything you post" in language that you and I can understand but 90 percent of users can't. And at the end of it, say "we reserve the right to change this policy in the future" to further cover your ass. It's not hard if you're honest and up front. It's only hard if you want to deceive users. That's where the tricky language comes in.