
Most technical adults I know are in some stage of trying to revert their internet usage back to the offerings this bill would allow through. The carve-out for services with fewer than 1 million users is HUGE: it preserves the power of small forums, federated socials, etc. Seems like a great bill unless you're trying to exploit young people.


It seems you're implying everyone who disagrees with you is trying to exploit young people. It's the same rhetoric as the politicians who drafted the bill. We need to have open minds in civil society, some humility to anticipate dissenting opinions, and careful fingers to type out comments with.


Yes.

I have some concerns about this bill and tactic. For example, even if it worked as intended, would we see an uptick in problems with 18-year-olds' use of algorithmic social media as they're suddenly away from home, out from under their parents' supervision, and given access to a bunch of exploitative content? It's similar to how the first two years of driving are more dangerous regardless of when those two years take place.

And as somebody who was 12 when COPPA went into effect, there are unintended consequences of banning minors from platforms:

1.) They lie, so either that's going to be commonly known/accepted OR a giant millstone around the neck of any existing company. This makes it a lot harder to pick out accounts that belong to minors, which makes it harder to both research and protect kids.

2.) Related to that, if you're breaking the rules by being in a space, you are way less likely to speak up. If you're a 15 year old who lies and says you're 18 to be on whatever social media platform, then if somebody harasses you, you're less likely to report it because it would get your account banned.

That's not even mentioning what something like this would do for the edge case of kids who genuinely are artists or content creators.


> 1.) They lie, so either that's going to be commonly known/accepted OR a giant millstone around the neck of any existing company.

You hint at an under-mentioned point here: we want laws that encourage companies to be aware of their users and to protect them, and an unintended consequence of laws that say, "you can continue operating as normal as long as you don't know any of your users are kids" is that companies hear, "don't make any moderation or safety features that might open you up to that kind of accusation."

Of course demanding that companies know all of their users perfectly is an obvious privacy violation with obviously even worse consequences. But even though it's better than hooking up ID verification to social networks, "pretend teenagers don't use the Internet" isn't harmless policy, it's not just that there are popups people click through.

----

I personally feel like this kind of "don't knowingly target" stuff is often counterproductive to keeping kids safe online. It means that when they hang out, almost every space they enter is going to be specifically designed for adults, and will systematically ignore the fact that they exist or might have unique needs -- because ignoring that kids exist and removing safeguards is now the safest thing for the website to do.

On a really small scale, think back to when YouTube got targeted for programming "aimed at kids". One short-term result I saw from that was animators/streamers deliberately making their streams less child-appropriate so they wouldn't be swept up. It's anecdotal and I'd like to see more research on it, but I vaguely wonder if the result of these crackdowns isn't often to make social sites more dangerous for kids.


> But even though it's better than hooking up ID verification to social networks, "pretend teenagers don't use the Internet" isn't harmless policy, it's not just that there are popups people click through.

Related to that, we can't just pretend teenagers have no ability or agency. Honestly, it's a toss-up who would 'win' a cat-and-mouse game between the MN legislature and a group of teens with programming capability. Adolescents are in a developmental stage where they're establishing themselves as individuals apart from their parents and other adult authorities; it's natural that they're going to seek out spaces their parents either don't know about or don't want them going to. Our job as adults is to make sure that process is safe for them while still allowing them the autonomy to learn to make good decisions.

> I vaguely wonder if the result of these crackdowns isn't often to make social sites more dangerous for kids.

I wonder this too. I was a very digital kid back in the day before there were regulations against it, and there were opportunity costs to kicking out the under-13s that would be very magnified for 13-to-18s.

It prevents kids from having their own social structures and spaces. For example, 7-to-10-year-old me ran a curatorial site for the GeoCities kids' neighborhood, and late-elementary-school me also had an IRC channel. It seems bonkers, but there were advantages. Since I and some of the other kids could run things ourselves (with some help from trusted adults), it kept creepy adults from ingratiating themselves into the group by providing resources. (Think of the stereotypical college kid buying high schoolers booze; if kids can't sign up or learn how things work, then they have to play in adult playgrounds instead of making their own.) It meant I could kick people and that the conversation was age-appropriate (because that was where I talked about kid stuff, and how dare you be off-topic in my channel [kids make great dictators]).

Related, having an admin/building group of kids is really helpful as a buffer, especially in the teen years. Lots of teens aren't going to tell their parents much, but they will tell other teens, so having some teens around who know how stuff works and can give advice is helpful.

You can't legislate for teens without remembering that they have agency and will act independently.


The thing is that there's no benefit to algorithmic social media, so erring on the side of caution is responsible. And we see a clear and credible threat. The hoped-for benefit is that teens won't form a social media addiction.

Algorithmic social media uses the exact same principles as gambling addiction. Teens are much more susceptible to this kind of exploitation; we don't let them sit in casinos all day, and loot boxes have seen some regulation as well.

Social media may be even worse, because the gamble is far less tangible: its cost on the surface is only time, and the reward is purely an emotional one.


I would support efforts at restraining exploitative use of algorithmic content for all ages. The main issue is that age-restricting teens in particular is logistically difficult and can create unintended consequences. Teens are going to act on their own, and since many of today's adults (particularly of the generation that tends to legislate in the US) were not raised in a digital world, there will be groups of teens who are better with technology than the adults making the laws. The laws therefore need to account for that, which requires controls so draconian that companies are likely to just ban the kids or stop operating entirely (which opens up the 'what about lying' issue again).

You mention casinos. One difference between casinos/alcohol/cigarettes and online services is that the former have easy places to intervene and place responsibility: the point of service is a physical location subject to local law. Digital regulation is a lot murkier and easier to exploit. (And not just by creepy adults: a lot of us kids in the 90s/early 00s picked up on the argument that if we weren't legally able to agree to a site's terms in the first place, then we couldn't be legally held accountable for things like piracy either.)

The only way to ensure something like this could be enforced would be to require ID for signing up for or logging into any algorithmic service, and HELL no. Not only should we not throw the baby out with the bathwater, we also shouldn't set the cradle on fire.


The carve-out for education and govt is also HUGE.

The power of small forums exists today, but inside larger platforms that, through aggregation, have been able to corral the resources to persist.

Smaller forums falter and fade fairly easily (source: having admin'd them).



