As of 2022, when this presentation was delivered, when people talk about social media, they generally mean corporate-controlled, siloed, and exclusionary sites that want you, your friends, your family, and all of your other contacts to interact only with them. Zuckerberg's Folly, Jack's Bird Site, and other such companies offer connection and messaging as a way of getting people to sign up, but their primary purpose at this point is collecting data on you: your browsing habits, your interests, your politics, and other aspects of your life on-line and off-line that they can package up and sell to advertisers, in the hope that the advertisers will be able to show you something you're so interested in that you buy it.
The "social" part of the site is almost always secondary to and in service of gathering the data to sell to advertisers. Because of that, these sites are usually structured in specific ways.
- They track you, both in what you do on their site and in what you do elsewhere on the Internet, often through beacons embedded on other sites. These widgets invite you to "connect with" people or corporations on the social media site, but they also check whether you're logged in and, if so, report back to the company that you visited this new site.
- Most "social" sites use algorithms tailored toward promoting "engagement" with the site, because the longer you stay and the more things you interact with, the more ads you are shown and the more data you give the site owners to refine their ads and algorithms. "Engagement," however, does not necessarily mean you will be shown things you like to see, only things you are more likely to interact with. That often means the algorithm is tuned to show you people being Wrong on the Internet and other things that make it an "outrage machine" rather than a social site. Similarly, your posts are shown to other people to encourage them to interact, which can sometimes mean you become the target of malicious actors.
- The site is designed to make it as difficult as possible for you to pack up yourself and your contacts and move to some other site. A user who leaves no longer provides data, and that makes the site less valuable to the real customers, the advertisers.
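The tracking described in the first point above can be sketched in miniature. This is an illustrative sketch, not any site's actual code: the cookie name, function, and record format are all hypothetical. The core trick is simply that a request for an embedded "share" widget carries the social site's cookies, so a logged-in visitor is identifiable even if they never touch the widget:

```python
# Minimal sketch of a cross-site tracking beacon (all names hypothetical).
# A "share" widget embedded on a third-party page triggers a request to the
# social site; the browser sends the site's cookies along with that request,
# so a logged-in user can be tied to the page they were reading.

def record_beacon(cookies: dict, referer: str, log: list) -> None:
    """Append a tracking record if the visitor carries a session cookie."""
    session = cookies.get("session_id")  # hypothetical cookie name
    if session is not None:
        # The site now knows that account `session` visited `referer`,
        # even though the user never clicked anything.
        log.append({"account": session, "visited": referer})

log = []
record_beacon({"session_id": "user-42"}, "https://example.com/article", log)
record_beacon({}, "https://example.com/other", log)  # logged-out visitor: nothing recorded
```

A logged-out visitor leaves no record here, which is why these widgets also pair with other identifiers in practice; the sketch only shows the logged-in case that the text describes.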
Because the social aspects of a social media site are often incidental once the site gets big enough to be a data warehouse instead, and because curation tools often interfere with data collection, many social sites have inadequate or nonexistent methods for users to make their experience on the site better.
- Content moderation tends to be nonexistent on these sites, with only the most egregious violations of the Terms of Service policed and given consequences. This is often because the sites have grown so large that the small team tasked with content moderation can't effectively do its job. Moderation therefore falls to machines, which have no flexibility or understanding of nuance, while the humans serve as the appeal team for the machines' decisions, which means it takes time to work through the backlog.
- The only time content moderation happens quickly is when the advertisers (the real customers) threaten to pull their money from the site because they fear backlash over certain content being associated with their ads, or because they are unhappy that their own content is being used in ways they don't approve of (even if those uses are covered under applicable laws). If an advertiser demands that anything that looks like "porn" leave the site, you get bans on "female-presenting nipples" or wholesale, indiscriminate content destruction because a computer decided something looked like a forbidden thing and deleted it without checking whether it actually was.
- Because these sites have to rely so heavily on computers to make moderation (and monetization) decisions, many of them make it easy for corporations to claim that content infringes their intellectual property (usually with a policy of "an allegation of copyright infringement is an automatic takedown and strike against the 'infringer'") and extremely difficult for someone to appeal that decision and have their content restored. And, usually, restored content lasts only as long as it takes for a corporation's people or robots to lodge another copyright complaint, at which point the appeal process has to start all over again.
This preference for automatic tools makes it easy for malicious actors, like police, to avoid accountability or having their actions broadcast to the world. By playing copyrighted music during interactions with the public, they can trust that the computers searching for those pieces of music will lodge copyright complaints and get the videos taken down, even though the music was incidental to the actual point of the video. The person trying to hold someone to account is then subjected to the byzantine appeal process to get their material restored, by which point the opportunity may have long since passed.
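The automated takedown logic at the heart of this abuse can be sketched as a few lines. This is a deliberately simplified, hypothetical model (threshold value, function, and video names are all invented), but it captures the structure the text describes: a fingerprint-match score crosses a threshold, the content is removed, and nothing in the loop asks whether the match was incidental:

```python
# Sketch of threshold-based automated takedown (all names hypothetical).
# A fingerprint match score above the threshold triggers removal, with no
# human review and no check for whether the matched music was incidental.

MATCH_THRESHOLD = 0.8  # illustrative value, not any real system's setting

def auto_takedown(match_score: float, video_id: str, takedowns: list) -> bool:
    """Remove the video if the fingerprint match exceeds the threshold."""
    if match_score >= MATCH_THRESHOLD:
        takedowns.append(video_id)  # removed first; the appeal comes later
        return True
    return False

takedowns = []
# A song playing in the background of a recorded police encounter is enough:
auto_takedown(0.91, "police-encounter-clip", takedowns)
```

Because removal is the default and review happens only on appeal, the cost of a false positive falls entirely on the person whose video disappears, which is exactly the asymmetry that makes the tactic attractive.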
Other malicious actions, like brigading, dogpiling, and doxxing, often go unpunished and unremarked upon. When there is a sufficient volume of complaints, the response is often "well, the thing they're doing isn't nice, but it doesn't violate our Terms of Service" (even when the thing often does violate the ToS, but the person who would be banned or restricted is very good at generating "engagement" for the site), a response that resembles the schoolyard decision of "well, weird kid, stop being weird and things will be fine." Nor do these sites give someone the tools to block at will and with enough ease to curate their own space. Block chains and block lists are almost always third-party tools that require someone to at least temporarily give their account password to someone else, which means trusting that the person maintaining the tool won't do something nefarious with the account while they have it.