Google's ToS is 16 pages containing roughly 50+ hyperlinks, including several to "additional service-specific terms", which itself links to ~50 other sets of terms, each multiple pages long.
Perhaps instead of pinning all of the blame on users, we could have the companies producing labyrinthine ToS contracts written by top-grade lawyers and full of legalese (that no layperson should be expected to understand) shoulder at least some of the blame?
This doesn't even touch on the fact that many topics (as related to data aggregation and privacy) are highly technical and require at least a few years of post-secondary education to even begin wrapping your head around (e.g. de-anonymization via large sparse datasets is not something I can reasonably teach my 85-year-old parent, nor my child, both of whom use Google services in some capacity).
But, yes... Let's blame it on Average Joe, who just wants to watch their dog for a few minutes while at work and saw an ad on TV about a convenient way to do so. Shame on them for not being both a lawyer and a CS graduate.
I don’t understand why there aren’t any standard terms of service which are generally applicable, and which companies can make minor adjustments to if they can justify them.
More like "If you're not part of the solution, there's money to be made in prolonging the problem." (I don't know who said it, but I'm paraphrasing from something I've seen on a demotivational poster re: consulting)
A solution to this is for courts to limit what is applicable in a ToS to a certain number of words, and have overly broad statements always favor the entity who has to agree.
This, in effect, nullifies all but the most important components of a ToS.
Due diligence is expected among a mature population. But you're right, it's not entirely on individuals. There should be ways to disseminate information about the threats these products pose to personal liberty, especially in a nation that uses the word "liberty" so freely in its foundational documents.
>Due diligence is expected among a mature population.
I wholly agree.
But we're quickly approaching (and in some cases, past) the point where proper due diligence requires a 4-year post-secondary education in a related CS field, if not more.
We're talking about products that take multiple domain experts several years of collaboration to create. How is it reasonable to expect my mechanic, accountant, etc. to do their due diligence on how that product processes their data, especially when it's processed in a black-box created by several other domain experts, and their only source of information is purposefully opaque terms written by lawyers?
> proper due diligence requires a 4-year post-secondary education
I don't think that's the case here or indeed very commonly. You don't necessarily have to understand implementation details if some core tenet of popular ethics is being violated. One key feature of the domain -- namely that you don't own "your data" and so you don't get to decide what happens with it -- is pretty clearly in violation of principles that the vast majority of Westerners would at least profess to hold. Beyond the motivating principle that third parties should be required to receive explicit whitelist access to use privately-owned data, "implementation details" refers mostly to policy and enforcement, not really technologies.
Eh, it's exactly what you expect from America though. Ie the embodiment of short term thinking. Economy, environment, politics, etc - not that America is entirely unique here, just that the population seems to embrace this as a foundation in my experience.
The threat tech like this poses to privacy is very hypothetical until it happens, and it will rarely happen in plain view. If it's not in our faces we won't vote against it.
>Eh, it's exactly what you expect from America though. Ie the embodiment of short term thinking.
I think this is the entirely wrong framing. My other comment covers some of it, but specifically in regards to your comment: it's a lack of education, not the embodiment of short-term thinking.
And really, we can't expect every person that uses Google (or whatever other large tech company) to thoroughly understand all of the bits and pieces of technology that could be used to fuck them. Or how things that we've been told are anonymous/private become non-anonymous/non-private when combined with other sparse data. These are complex topics that even many technologists don't understand (or are outside of their field of expertise).
These companies hire top lawyers to write complex ToS, use as many dark-patterns as legally possible, do illegal things until they get caught doing so, evolve their terms frequently, etc. Yet somehow they've convinced everyone to blame the layperson. It's remarkable, really.
What would be really swell is if we could, you know, not have companies spend millions of dollars on how-to-fuck-your-user initiatives.
But we can't live in a world where the responsibility isn't on the individual, can we?
Ie if we expect corporations to not fuck you over, who is there to enforce that? Who has the power to keep them in check? Okay, maybe Government should hold that role - but who then keeps the government in check? Who ensures that the spying or privacy from the Government is kept in check? etc
Ultimately the buck always stops at the individual. And we have to be hyper aware of long term implications, because money, greed and power has deep, deep pockets (as you also mentioned) and the fight will be never ending.
We, as a community, have de-prioritized education, health care, public safety, privacy, etc. Sure, powerful forces have been pushing for that exact thing, but we can't expect them to "just be nice" or w/e.
I'm very pro "Big Government". However my ideas behind big government will not work without individual responsibility. Until then citizens are purposefully and willfully giving their power away with every tiny step. The blame is on us, and our current state is inevitable. My 2c.
My last sentence was more wishful thinking than a proposed solution. I am obviously aware the world isn't as utopian as the sentence would require.
The main point I wanted to get across is that it's baffling that companies aren't blamed in these conversations. It's always the user who is blamed ("well you read the ToS didn't you!"). And that's dumb, because the vast majority of users aren't lawyers and don't have CS degrees -- both of which are becoming increasingly required to provide informed consent to a ToS. (edit: in every other contract I sign, a lack of informed consent is grounds to void the contract, the exception being tech-company ToS contracts)
If you still want to blame my 85-year-old parent for not understanding what Google is doing with his data, go for it, I guess. Just seems stupid to do so, because he can barely open up a web browser but is somehow expected to understand the complexities of data aggregation and what impact it will have on him. And as time marches on, it's equally ridiculous to suggest that he just never use a computer to avoid the issue.
>And we have to be hyper aware of long term implications,
Without post-secondary education in niche fields, this is becoming impossible. Especially across multiple services with changing terms, in countries with changing laws, in a world where technology evolution outpaces curriculum changes.
> Without post-secondary education in niche fields, this is becoming impossible. Especially across multiple services with changing terms, in countries with changing laws, in a world where technology evolution outpaces curriculum changes.
I agree, but again i go back to, "but how else can it work"?
Of course i don't expect everyone to be knowledgeable on all low level systems. However, to the point of your 85-year-old grandma, she is a tiny demographic in a much larger, much more reasonably informed demographic who also completely ignore the implications.
Name a demographic that isn't wildly ignorant of things that are reasonable to know?
But again, i repeatedly fall back to "But who else can do this?". This is why i'm pro Government, but not until people start pushing for responsibility on this front. It may not be reasonable for your grandma to be responsible for Google Data stuff, but she _(and the rest of us)_ have sat around for dozens of years watching authority figures have little to no accountability or oversight.
The issue isn't about Google. The issue is about us, and our inability to build a government and authority system that is in-line with our views. We hand our power over with no thought or oversight and then we're shocked when it all comes back against us. This has nothing to do with Google or CS, imo.
My argument is that the "reasonably informed demographic" is incredibly small. I can only say the same thing so many times, though, so I'm not sure how to explain it in a different way.
To restate my example, even very smart CS graduates may not realize that anonymized data joined with other anonymized data can result in de-anonymized data, because the linking and de-anonymization of sparse datasets is a niche subfield that has only recently begun being explored.
Many people may think they are reasonably informed (they look into the ToS, see that data is anonymized, and decide that they are okay with that) without knowing that the data may later be de-anonymized through advanced statistical analysis they've never been exposed to in all their schooling. So while they thought they were informed, they weren't. This repeats across several domains.
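To make the linkage idea concrete, here's a toy Python sketch (all names, records, and thresholds are invented for illustration): two datasets that each look "anonymized" on their own can be joined on shared quasi-identifiers plus a small overlap in sparse attributes, re-identifying a user.

```python
# Dataset A: "anonymized" ratings -- no names, just a random ID,
# plus quasi-identifiers (zip code, birth year) and sparse attributes.
ratings = [
    {"id": "u1", "zip": "02139", "birth_year": 1961,
     "rated": {"MovieX", "MovieY", "MovieZ"}},
    {"id": "u2", "zip": "02139", "birth_year": 1984,
     "rated": {"MovieX"}},
]

# Dataset B: a public profile that leaks the same quasi-identifiers
# and a few publicly posted reviews.
profiles = [
    {"name": "Alice", "zip": "02139", "birth_year": 1961,
     "public_reviews": {"MovieX", "MovieY"}},
]

def link(ratings, profiles, min_overlap=2):
    """Re-identify 'anonymous' rating records by matching
    quasi-identifiers and an overlap in sparse attributes."""
    matches = {}
    for p in profiles:
        for r in ratings:
            if (r["zip"] == p["zip"]
                    and r["birth_year"] == p["birth_year"]
                    and len(r["rated"] & p["public_reviews"]) >= min_overlap):
                matches[r["id"]] = p["name"]
    return matches

print(link(ratings, profiles))  # {'u1': 'Alice'}
```

Neither dataset contains a name next to the ratings, yet the join recovers one; real attacks of this shape work at scale precisely because sparse attribute sets (movies watched, locations visited) are nearly unique per person.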
>But again, i repeatedly fall back to "But who else can do this?".
Why is that when a problem is identified, people demand a solution be provided at the same time? I don't have a solution, sorry. But that shouldn't preclude me from identifying a problem.
I honestly did not expect saying basically "Let's put some of the blame on Google, because they're the ones with the dark patterns and lawyers and experts, rather than solely blaming the layperson" would be met with much pushback.
> My argument is that the "reasonably informed demographic" is incredibly small. I can only say the same thing so many times, though, so I'm not sure how to explain it in a different way.
I think we're in agreement here. To be clear, i'm mostly talking about intent, an attempt to stay informed, and a willingness to act - to push for centralized leadership that is informed.
Ie as i said before, your grandma is not expected to know this. She is expected to fight for a government that will be, and that will also be able to be held accountable.
We currently have neither oversight of government nor the willingness to act. Your grandma built the same world we are building today. One of inaction and obfuscation.
If society cannot be informed and active on what is essential to build that world (whatever that may be), then we are doomed. Currently, the population at large is not. At least, not from what i can see in action.
> I agree, but again i go back to, "but how else can it work"?...

> Name a demographic that isn't wildly ignorant of things that are reasonable to know?
Who defines "reasonable"?
When you get delayed on a flight due to a maintenance issue, are you equipped to determine if that delay was reasonable? Most likely not, although many mechanically inclined people may be in a position to make that call. Those same people may not be in a position to judge the reasonableness of Google's ToS (side-stepping the whole obfuscation of details that was previously covered).
When society gets reasonably complex, we out-source those decisions. In the example of the aircraft, we have a regulatory body that makes the rules about what is reasonable. It wasn't always like that, of course, but the need grew out of the growing complexity and risk profile. So to your question and an earlier point, there may be room for regulatory bodies as an answer to "how else can it work?".
"Reasonable" is defined by what it takes to outsource successfully.
If you cannot determine the factors by which outsourcing succeeds or fails, and by which it is held accountable, democracy fails, and you can no longer outsource it.
Agreed, and we do nothing to fight that. We're all complacent with it. Hell, not only did we not fight it, ie we didn't push for government control and oversight, but we signed up. We let them in and laid out welcome platters.
This isn't about being informed on obscure topics. As i said this has nothing to do with Google. It's about our willingness to fight for a government that can handle this, and fight to control said government.
This is absolutely by design and part of a larger pattern of propaganda that keeps Americans scared of the government and in love with the idea of becoming billionaire CEOs themselves because it's "moral". That holy "free market" has rewarded those rich people for being some damn smart and efficient--they deserve it, not the damn communist free loader leftists who hate America.
That's an odd take, I honestly don't find anything about this article, or the broader topic of privacy and overreach by companies and law enforcement, amusing in any way.