loosely from macintouch:
looking at what is currently happening with Apple’s iCloud, I suppose you could always posit,
“there is something disturbing about my calendar, GPS location, address book, and my information in the address books of others being stored on an apple server.”
the upload of contacts and related information to Apple’s iCloud may not have anything to do with social data mining. people need to know that Android’s own “Siri-like” voice recognition works via a Google server on the internet. the audio of what you say is sent to the server, and a list of possible meanings is sent back for processing by the handset.
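that round trip can be sketched, purely illustratively. this is not Google’s actual API – the server logic below is simulated locally, and every name and value in it is hypothetical:

```python
# illustrative sketch only: the remote recognizer is faked with a local
# function, and the candidate list is invented. no real service is called.

def recognize_on_server(audio_bytes: bytes) -> list[tuple[str, float]]:
    """Stand-in for the remote recognizer: takes raw audio and returns
    candidate transcriptions with (hypothetical) confidence scores."""
    # a real server would run acoustic and language models here
    return [("call mom", 0.92), ("call tom", 0.06), ("calm om", 0.02)]

def handset_flow(audio_bytes: bytes) -> str:
    # 1. the handset uploads the audio of what you said
    candidates = recognize_on_server(audio_bytes)
    # 2. the handset processes the returned list of possible meanings,
    #    here by simply taking the highest-confidence candidate
    best_text, _confidence = max(candidates, key=lambda c: c[1])
    return best_text

print(handset_flow(b"\x00\x01"))  # -> call mom
```

the point, for privacy purposes, is step 1: the audio itself leaves the handset.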
and, I’m sure we can all rest easy believing that Google is not interested in personal data mining and private data retention…
…of course, Google getting busted multiple times hacking private networks tends to show intent /1.
yet, any attempt at theorizing otherwise, in this age of privacy abdication and private data retention, is amazingly naive.
well look… Apple has been storing user calendars, address books, email, bookmarks, photos and more on their servers since the days of iTools, later known as .Mac, then MobileMe and, now, iCloud. that’s more than a decade of effort with no known issues that compromised anyone.
NOTE: I don’t recall hearing about any major privacy breaches, yet. we can only wish the same could be said for banks, credit card companies, ISPs, mobile phone companies like RIM, and even gaming companies like Sony.
maybe it’s funny how that track record has been a non-story.
on the other hand, any time Google stock is over $305 or $475 I’m a fan; after all, their insidious search capability gives them all manner of advantage when it comes to research and accumulated data.
go look up the word “scienter”. I wonder how serious Google is about doing no evil? but then, who interprets what is good, and for whom; what is evil, and for whom; and how might someone define intent? the truth of the day changes with circumstances.
peace be to my Brothers and Sisters.
brian patrick cork
1/ 3rd Party Qualification of Statement. Please do your own research before any judgment is passed.
Check out the investigations by the German regional data protection authorities (whose queries about the WiFi-trawling hard disks triggered the revelation of the program), the Canadian privacy commissioner (whose rulings prompted Google to introduce an internal privacy-by-design program), and regulators and advocates around the world.
For six years, Google ran unannounced WiFi scanning tools on its Streetview camera cars. They did everything short of breaking encryption to take advantage of the generally poor technical security knowledge of home WiFi access point operators (most of whom would have been unaware of the risk from an apparently responsible global corporation, and so cannot be assumed to have authorised this surveillance, as they were never asked).
The result was to hoover up access point info (even where SSIDs were hidden, as I understand it, since they used techniques closer to sigint than ‘casual stumbling’ to identify the APs), along with data stream fragments, apparently including snippets of private email messages, the odd login or password, and various online transactions.
That this was inadvertent beggars belief. If Google can ‘inadvertently’ install such a system on a global basis and operate it for 6 years, collating data for a key commercial purpose (Android etc. geolocation), especially when it’s hidden in an already controversial photographic tool pushing the boundaries of surveillance of ‘public’ spaces, then its management is either deceptive, deliberately blind, or seriously incompetent in the governance department.
None of these options is a basis for future trust.
It was ultimately confirmed as a deliberate, sustained global program, routinely capturing wifi traffic and location data, for routine use in collation of a cross reference database for GPS and phone tower geolocation augmentation (and whatever else they chose).
It represented an abuse of the ‘ask forgiveness, not permission’ idea behind software prototyping, because that cute philosophy does not apply to breaches of personal information security and privacy, where data loss is often irrevocable. Hacking into real personal data is not consistent with the rapid prototyping method that works so well for disposable code versions. This was more along the lines of ‘see if we can get away with it’, which is the motto of the irresponsible teenager or the junior criminal.
It breached the trust of the global population (some of whom were legitimately concerned about the transfer of risk to unwitting people by the Streetview camera program) by not also being disclosed when the details of Streetview were reluctantly revealed, after being introduced without privacy impact analysis or any sort of permission. It breached privacy expectations, and in many cases privacy laws, around the world, though only some regulators were willing to address it robustly. It also potentially breached cybercrime laws relating to unauthorised access to, or possession of, data, computers or communications networks.
Since Google had not disclosed what they were doing, and it had not been done or known before, no one could be implied to have given permission or authorisation. Excuses, or attempts to ‘blame the victim’, don’t change this.