The GP gave examples of regulatory mechanisms that work in other situations, but I'm also unsure what specific regulatory mechanism would work well for this, and those examples don't really bring anything to mind, even though I'm in favor of a regulatory solution. I think coloring the question as poisoning the well might be a bit premature.
That is, what are the traffic lights and safety lanes for social networks? How do we enforce those for private products when they are generally things that apply to our public spaces? A social network is much more of a mall than a public park, so what restraints are we willing to place on private property, and what will actually work? I think those are very valid questions, and while the argument from blueterminal may have gotten there in a roundabout way (and in a way that some consider not in good faith), I think they are well worth considering in detail.
To me it's blatantly obvious something needs to be done, I'm just not sure what that is, and I'm slightly afraid we'll implement a fix that, if not as bad as the problem itself, is still much worse than it needs to be unless we consider it carefully.
It's the pre-emptive foreclosure of discussion I find objectionable, negating the attempt rather than exploring the problem space.
I would personally reduce the private property privileges substantially since a social network by definition derives its value from the number and variety of people that use it. I'd like if FB were at least as searchable to its users as to advertisers, for example; arguably FB knows more about many of its users than they know about themselves.
> I would personally reduce the private property privileges substantially since a social network by definition derives its value from the number and variety of people that use it.
That does make sense, and it also fits with how private property can't be viewed entirely in isolation when it has negative externalities. I.e. you shouldn't be able to pollute the air immediately above your land as much as you want, because that pollution doesn't just stay on your land.
> I'd like if FB were at least as searchable to its users as to advertisers, for example; arguably FB knows more about many of its users than they know about themselves.
That might help in some small amount (and as long as people actually reviewed the info and requested removal, and that request had to be honored, it would help), but I'm not sure the companies in question wouldn't just turn their neuroscience divisions to the task of making people want to allow the data collection for some reason.
To me this feels more like a drunk driving type situation, or age of consent, or the ability to sign away your rights. We believe some things should be disallowed because the combined cost to society of allowing them is much greater than the sum of the individual costs of disallowing them. But even if that can be sold as a good idea here, I'm not sure what specific things we could do to block it that aren't nebulous and can't be gamed. Maybe disallowing advertising, but that seems to be targeting one industry, and I suspect something else with similar negative effects and the same incentives would just take its place.