Okay, Libraries, Let's Destroy Facebook!

Alex Byrne
Youth Services Librarian
Pierce County Library System
Twitter: @HeofHIShirts

for

Washington Library Association Conference
5 May 2022
#wla2022

Land Acknowledgment

The Washington Library Association Conference is situated on the traditional lands of the first people of Seattle and the surrounding area. The lands I live and work on are part of the traditional lands of the Coast Salish people and the Puyallup Tribe, and the conference is located on the traditional lands of the Coast Salish people as well. Their descendants deserve our thanks for their immense contributions, past and present, to our state and local history, culture, and identity as Washingtonians.

Content Warnings

To the best of my knowledge, there shouldn't be any content warnings about this presentation. If it turns out that I'm wrong about that, please let me know.

Social Media As It Is Now

Data, Data, Data

As of 2022, when this presentation was delivered, when people think about social media, they're generally thinking about corporate-controlled, siloed, and exclusionary sites that want you to interact only with them, and want all of your friends, family, and other contacts interacting only with them. Zuckerberg's Folly, Jack's Bird Site, and other such companies offer a way of connecting with others and sending messages as a way of getting people to sign up, but their primary purpose at this point is collecting data on you: your browsing habits, your interests, your politics, and other aspects of your life on-line and off-line that they can package up and sell to advertisers, in the hope that the advertisers will be able to show you something you're so interested in that you buy it.

Track and Trap

The "social" part of the site is almost always secondary to and in service of gathering the data to sell to advertisers. Because of that, these sites are usually structured in specific ways.

  • They track you, both in what you do on their site and in what you do on other sites across the Internet, often through beacons on other sites that invite you to "connect with" people or corporations on that social media site, but that also check whether you're logged in and report back to the company that you visited this new site.
  • Most "social" sites use algorithms tailored toward promoting "engagement" with the site, because the longer you are on the site, and the more things you interact with, the more ads you get shown and the more data you give to the site owners to refine their ads and algorithms. "Engagement," however, does not necessarily mean you'll be shown things you like to see, only things you are more likely to interact with, which often means the algorithm is tuned to show you people being Wrong on the Internet and other things that make it an "outrage machine" rather than a social site. Similarly, your posts are shown to other people to encourage them to interact, which can sometimes mean you become the target of malicious actors.
  • The site is designed to make it as difficult as possible for you to pick up yourself and your contacts and go to some other site. A user who leaves the site no longer provides data, and that makes the site less valuable to the real customers, the advertisers.

No Real Moderation

Because the social aspects of a social media site are often incidental once the site gets big enough to be a data warehouse instead, and because curation tools often interfere with data collection, many social sites have inadequate or nonexistent methods for users to make their experience on the site better.

  • Content moderation tends to be nonexistent on those sites, with only the most egregious violations of the Terms of Service policed and given consequences. This is often because the sites have grown so large that the small team tasked with content moderation can't effectively do its job. Moderation is instead done by machines, which have no flexibility or understanding of nuance, while the humans serve as the appeal team for the decisions the machines make, which also means it takes time to work through the backlog.
  • The only time that content moderation happens quickly is if the advertisers (the real customers) threaten to pull their money away from the site because they fear backlash about certain content being associated with their ads, or if they are unhappy that their own content is being used in ways they don't approve of (even if those uses are covered under applicable laws). If an advertiser demands that anything that looks like "porn" leave the site, then you get bans on "female-presenting nipples" or wholesale indiscriminate content destruction, because a computer determines something looked like a forbidden thing and deletes it without bothering to check whether it actually was.
  • Many of these social sites, because they have to rely so heavily on computers to make moderation (and monetization) decisions, often set things up to make it easy for corporations to claim that content is infringing on their intellectual property (usually with a policy of "an allegation of copyright infringement is an automatic takedown and strike against the 'infringer'") and extremely difficult for someone to appeal that decision and have their content put back up. And, usually, the content that's been restored will last only as long as it takes for a corporation's people or robots to lodge another copyright complaint against it, at which point the appeal process has to start all over again.

This preference for and reliance on automatic tools makes it easy for malicious actors, like police, to avoid accountability or having their actions broadcast to the world. By playing copyrighted music during interactions with the public, they can trust that the computers searching for those pieces of music will lodge copyright complaints and get the videos taken down, even though the music was incidental to the actual point of the video, and that the person trying to hold someone to account will have to go through the byzantine appeal process to get their material restored, by which point the opportunity may have long since passed.

Other malicious actions, like brigading, dogpiling, and doxxing, often go unpunished and unremarked upon. When there is a sufficient volume of complaints, the response is often "well, the thing they're doing isn't nice, but it doesn't violate our Terms of Service" (when the thing itself often does violate the ToS, but the person who would be banned or restricted is very good at generating "engagement" with the site), a response that resembles the schoolyard decision of "well, weird kid, stop being weird and things will be fine." Nor do these sites give someone the tools to block at will, and with sufficient ease, to curate their own space. Block chains and block lists are almost always third-party tools that require someone to at least temporarily give their account password to someone else to achieve results, which requires trusting that the person maintaining the tool isn't going to do something nefarious with the account while they have it.

The Alternative

Decentralized Social Media

There exist alternatives to the way current social media works. In addition to smaller sites that aren't focused on algorithms, engagement, and advertisers, the biggest challenger to current corporate social media is a set of decentralized social media sites and programs. Rather than one site that everyone comes to, decentralization allows anyone who is interested and has the necessary infrastructure to stand up a social media site of their own. Decentralization then goes hand in hand with federation, the use of a common protocol (language) to communicate with other social media sites. So long as two sites are speaking the same language (using the same protocol), they can communicate with each other, so their users can follow each other and get each other's updates across sites. The aggregation of decentralized, federated sites across the Internet makes up the Fediverse (the Federated Universe), often shortened to "fed" or "fedi" by those who use it.
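To make "speaking the same language" concrete: most Fediverse services use the ActivityPub protocol, and account discovery between servers typically happens over WebFinger, a simple HTTPS lookup at a well-known path. Here's a minimal sketch in Python of how a handle like @user@example.org maps to that lookup URL (the handle and domain are hypothetical examples):

```python
from urllib.parse import quote

def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL for a Fediverse handle like
    '@user@example.org'. The responding server returns a JSON document
    pointing at the account's ActivityPub actor, which is how two
    independent servers find and follow each other's users."""
    user, _, domain = handle.lstrip("@").partition("@")
    resource = f"acct:{user}@{domain}"
    return f"https://{domain}/.well-known/webfinger?resource={quote(resource)}"

# Any two servers speaking the same protocol can resolve each other's users:
print(webfinger_url("@librarian@example.org"))
# https://example.org/.well-known/webfinger?resource=acct%3Alibrarian%40example.org
```

Because the lookup path and resource format are standardized, a small library-run server and a huge flagship instance can discover each other's accounts the same way, with no central directory involved.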

Leaning into the idea of the public library as a democratizing force that has historically aligned itself in favor of privacy and against corporate control over information (and that generally promotes itself as in favor of "freedom," even in situations where that means allowing and promoting materials that are offensive and anti-social), I have a "small, simple" proposal for libraries.

Your library (or library system) should host at least one instance / node of a Fediverse server/service.

The Benefits

Thinking even a little about such a proposal makes it obvious that while it can be summed up in a single sentence, there's a lot more complexity to unpack than appears at first blush. Some of the benefits I can think of immediately include:

  • Decentralized social media puts control back in the hands of people to make decisions about who they want to follow, what content they want to see, and how much they want to engage with their social media, subject to the decisions that get made by the administrators of their social media node.
  • Decentralized social media encourages the use of methods other than ads and data collection to pay the bills, so people using those services won't see "promoted posts" or other ads cluttering up their timeline, for the most part. A library-run instance, funded by library dollars, wouldn't need to run ads to handle server costs, so all of the interactions on the server would be people-to-people interactions.
  • Decentralized social media, at least for text-based instances like Mastodon, Pleroma, Misskey, and BookWyrm, has relatively modest requirements for hardware and relatively modest costs for bandwidth and that hardware. (Popularity, of course, could explode those costs, but they're likely to be smaller than setting up something built around images and video, like PixelFed or PeerTube.) Libraries with limited budgets can still potentially stand up a small server for their community with small initial startup costs, and then advertise their service as a local alternative to the big players.

Technical Barriers

That said, there are a lot of potential challenges that come with those benefits, both in the technical administration of the service and in the demands for staff time that running a social interaction space will produce. Because people.

  • Libraries and library systems that are very under-resourced may not have an IT person, much less an IT department, to handle technical problems when they come up, or even to handle the routine maintenance of updating their instances and ensuring they're patched against security vulnerabilities. Having an enthusiastic volunteer might help, but that volunteer isn't necessarily going to want to maintain the system for years and decades on end (unless they're really committed to the bit, and people who are that committed are often running their own social, instead of yours).
  • Even if there is an IT department or person in a library system, they may be far too busy working on putting out fires in library-critical systems or keeping the entire operation running on duct tape and string to even consider adding something new to their responsibilities. And even if a staff person promises to handle the aspects of running updates and moderating, the response to a suggestion might still be no, because the library system itself doesn't have the capacity to take on new work.
  • One of the big barriers to doing things on-line is that most tools, including decentralized social media, are meant to be run in a Linux/POSIX-compatible environment. For people who aren't IT folks and who aren't experienced with running webservers or other Internet-connected servers, Linux means learning an entirely new operating system, new tools, new commands, and getting to know an entirely different community of people, especially for a user who's spent their entire computing experience in a Windows or macOS (or Android/iOS) environment. There aren't a lot of user-friendly options for someone learning how to deploy a public-facing server and keep it secured so that it only does the things it's supposed to do and resists attacks and attackers. (And to keep domain names registered and Secure Sockets Layer / Transport Layer Security certificates updated and correctly installed and configured, and, and, and…) YunoHost is trying to make it easier to set up and deploy a personal hosting solution that can include decentralized social media and similar applications for small groups, but it's meant for people who are trustworthy rather than as a general distribution.
  • Many of the services and technical aspects discussed above can be outsourced. There are hosting services for decentralized social media that will handle the updating, the technical backend, and the hosting, requiring only that you point your domain name at them, but of course that costs money, and in places that are already under-resourced, money is usually one of those very rare resources that gets highly scrutinized. If you can't make an immediate case that the program will be popular and well-used, funding from the library isn't likely. (Grant funding may be possible, but it's not usually there for the ongoing maintenance and operations of any given project.)
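Some of the routine maintenance chores mentioned above can at least be partially automated. As one hedged illustration, here's a small Python sketch, using only the standard library, that checks how many days remain before a server's TLS certificate expires, so staff can be warned before renewal lapses. The domain shown is a placeholder, not a real instance:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_remaining(not_after: str, now: datetime) -> int:
    """Parse the 'notAfter' field of a certificate (fixed format,
    e.g. 'Jun  1 12:00:00 2030 GMT') and count whole days left."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - now).days

def check_certificate(host: str, port: int = 443) -> int:
    """Fetch a host's TLS certificate over the network and report
    the number of days until it expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            not_after = tls.getpeercert()["notAfter"]
    return days_remaining(not_after, datetime.now(timezone.utc))

# Example usage (requires network access; replace with your instance's domain):
# print(check_certificate("example.org"))
```

A script like this, run daily from a scheduler, is no substitute for an IT department, but it shows that some of the "and, and, and…" of server upkeep can be reduced to small, checkable routines.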

Content Barriers and Questions of Moderation

In addition to the potential technical barriers, running a social site means that there will need to be people time devoted to making sure that the people play nice with each other and that there aren't things that are patently illegal being shared or posted to the social media space. There's a strong likelihood that this part of running a social media site will take more time and effort than the technical aspects.

  • The biggest decision that will have to be made, possibly up front, is what kind of moderation will be enforced on the social site. The tools that provide federated, decentralized social media instances don't usually make content decisions on their own, meaning that it's entirely possible for people of very different political persuasions to use the same tools to connect. Instance administrators can decide how much of any other instance they want to connect with, up to and including cutting off another instance entirely: prohibiting any content from that instance from appearing on their own, and preventing anyone on their instance from following someone on the interdicted instance. With these powerful tools at their disposal to shape the experience of their users, libraries will have to make philosophical decisions about how much and what kind of moderation they wish to engage in. Some libraries will interpret their policies to mean that if they offer the service, there will be no moderation or content control (past, presumably, that which is illegal in their jurisdictions) in the name of intellectual freedom and anti-censorship. Other organizations may apply their acceptable use policies and/or Rules of Conduct to their online services and expect their social users to abide by the same rules of behavior online as in the library. If minors are allowed to sign up for the service, additional rules might be put in place for them. All of these decisions will need to be thought through, and policies and documents created, before embarking on offering social media services to the public.
  • Beyond what content gets moderated, how moderation happens and what steps are provided for moderation will need to be conceived, outlined, and posted so that all of the people who choose to use the service know what will be expected of them and how they can expect bad behavior to be moderated by the library staff (or their designated people.)
  • And finally, the question of who is moderating becomes extremely important. Spreading the job of looking at potentially objectionable content among the staff is a good idea, so as to not lock specific people into looking at the worst of what people come up with on a daily basis. Rotating teams so that nobody stares into the abyss for too long is important to the mental health of everyone involved. Similarly, while moderation teams should be composed of people from as many different experiences as possible (so as to catch more bad behavior that's most visible to people who are used to having that kind of behavior used against them), people should be allowed to refuse to participate if it would be harmful for them to have to moderate. And, as with so many other things, if the kind of work being asked of someone is beyond what they would normally do in their classification, they should be paid appropriately to the classification of work they are doing. It's bad form to expect more of someone than what you're paying them for.
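The federation-level controls described above (limiting a whole instance versus cutting it off entirely) can be thought of as a per-domain policy table. This Python sketch loosely models the severity levels Mastodon uses for domain blocks ("silence" and "suspend"); the class, method names, and domains are illustrative assumptions, not any real tool's API:

```python
from dataclasses import dataclass, field

# Severity levels loosely modeled on Mastodon's domain blocks:
#   "none"    -> federate normally
#   "silence" -> hide the domain's posts from public timelines; follows still work
#   "suspend" -> cut the domain off entirely
@dataclass
class FederationPolicy:
    blocks: dict = field(default_factory=dict)  # domain -> {severity, reason}

    def set_policy(self, domain: str, severity: str, reason: str = "") -> None:
        assert severity in {"none", "silence", "suspend"}
        self.blocks[domain] = {"severity": severity, "reason": reason}

    def can_follow(self, domain: str) -> bool:
        """Local users may follow accounts anywhere except suspended domains."""
        return self.blocks.get(domain, {}).get("severity") != "suspend"

    def show_in_public_timeline(self, domain: str) -> bool:
        """Only unrestricted domains appear in shared public timelines."""
        return self.blocks.get(domain, {}).get("severity", "none") == "none"

policy = FederationPolicy()
policy.set_policy("spam.example", "suspend", reason="spam and harassment")
policy.set_policy("edgy.example", "silence", reason="frequent rule-skirting")

print(policy.can_follow("friendly.example"))            # True
print(policy.can_follow("spam.example"))                # False
print(policy.show_in_public_timeline("edgy.example"))   # False
```

The point of recording a reason alongside each decision is transparency: publishing the table (as many instances do) lets users see exactly what has been limited and why, which is itself a moderation-policy choice a library would need to make deliberately.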

These are examples of the possible problem and benefit space. Using and providing social media services to people may surface new and different benefits and problems than the ones outlined here. Even with all of these possible drawbacks, I think it's worthwhile for public libraries to offer an instance, especially if those instances can then link up to each other, so that library patrons all over the country and the world can connect with each other through their libraries, in an ad-free environment without the pressures or the algorithms trying to drive more and more "engagement" with content they don't actually like. For people who want a social media space that's actually social, or for people who aren't going to leave their current corporate centralized social media space until they can be sure there are enough people out there, a network of library social media spaces might help boost numbers and give people the nudge to try something different than what they've always known.

Further Resources

This presentation was inspired by a lightning talk by Aniwha Ferrari, Decentralizing Social Media: How Libraries Can Destroy Facebook, which is an introduction to the idea of the Fediverse and some of the potential benefits of not using centralized, corporatized social media. I'm hoping that some other enterprising person will take these ideas and put them into practice, and then come back with a report about how well things worked and didn't, so that we have a better idea of how to try again in ways that are less stressful, less taxing, and more doable.

An excellent overview of some of the more popular Fediverse services (and what they're alternatives to) is provided by Fediverse.party, collectively maintained by several administrators. It doesn't cover some of the newer frontiers of decentralized, federated social media services, but it's a good overview of many of the programs that can communicate with each other using shared protocols (some of which can have plugins installed so that they can communicate across a greater swath of the Fediverse). The overview there really helped me conceptualize what might be possible, and which services to recommend to librarians as potentially low-cost and low-bandwidth, rather than suggesting they set up a PeerTube instance and start trafficking in video content created by their people.

Many of the benefits and drawbacks to standing up your own instance follow the ideas of Darius Kazemi, who maintains both a social network (Friend Camp) and the customized version of Mastodon that Friend Camp runs on (Hometown). Darius believes that people who want to run their own social media sites should run small (100 people or fewer) networks for themselves and their friends, and then let those friends make bigger connections to other social sites through federation. By keeping instances small, many of the moderation questions that happen for larger instances don't come up (because you're not trying to make a decision that some portion of 10,000 accounts are going to agree with), and many of the costs and problems that come from trying to host an instance of 10,000 don't come up, either. Plus, all the benefits of having a small community that generally gets along with each other. Libraries that successfully run their own socials might use this concept to start breaking out their socials into smaller groups and delegating moderation power to the people who want to step up and run them, or that would allow for staff to select which instances they're interested in taking a moderation hand with, and who they want to let in to their specific local instances. It would be an interesting transition to go from host and admins of platforms to providers of spaces for people to set up their own small social networks and make decisions about who they want to invite in, lowering the technical and content costs by using the library's already-in-place infrastructure. (It's where I'd love for this idea of library social media services to go, honestly.)

On the matter of content moderation and the way that tools often do not have an inherent morality of their own, Aymeric Mansoux and Roel Roscam Abbing published "Seven Theses on The Fediverse and the Becoming of FLOSS" (FLOSS meaning "Free/Libre Open Source Software": programs offered with their source code available for others to use and modify, often runnable without monetary charge, and often with licensing requirements that say any modified versions must also make their source code available). It's a set of ideas about how the Fediverse and the tools that undergird it are working through questions about whether they are allowed to impose morality or conscience on the Fediverse and its tools, because there are other parts of the Fediverse that espouse ideas and philosophies anathema to their own. As libraries continue to struggle through making decisions about "neutrality" and "intellectual freedom" that have historically privileged white supremacy and its underlying ideals, so are at least some parts of the FLOSS space wrestling with whether it's a betrayal of FLOSS ideas to put restrictions and prohibitions on who can use tools and what they can be used for. Because, after all, the same tools that create social justice-oriented spaces and safe spaces can also be used to create spaces whose most socially-acceptable content is "dank memes" and overt promotion of fascism and fascist candidates. And for most of these tools, there are no built-in lists of who to exclude or places to avoid for those who are just getting set up. (At least one of the stories is about the Discourse over an app creator hard-coding a list of services and places the app refused to connect to.)
Some tools themselves are beginning to get a reputation for being used by people with specific political views, such that a person looking to set up their own social might find themselves having to work harder to make social connections because they chose a particular tool based on resource requirements without knowing choosing that tool gave them a reputation, warranted or otherwise.

And, for discussion purposes, Alyssa Rosenzweig offers "The Federation Fallacy", which suggests that federation is not the way to break the control of corporations over social media, because the barriers to entry and maintenance of your own social media site are too large for the average person to want to deal with. Instead, Rosenzweig argues that our best hope is to find the nascent centralizing spots (like the Mastodon instance at mastodon.social) and convince them that they should be governed by democratic processes to produce an information democracy, charting a middle path between the dictatorships (benevolent or otherwise) that result from centralization and the anarchy that results when everyone is expected to run their own units. (Cited successes of information democracy include Mastodon, even though it has centralized, and Wikipedia, at least when Wikipedia is running on its ideals.)

If Rosenzweig is right, and centralization is inevitable, then having an information democracy, as provided by a library, is still far preferable to the oligarchy of corporate interests and their desire to lock everyone else out for their own purposes of profit. So, even if most library social media spaces were to fail or exist in relative obscurity with few members, the ones that succeeded, if they connected and worked with other democratic interfaces, could still present a useful alternative to the corporate oligarchies. Thus, even though it may not be small or simple once there's some thought put into it, I still think it's a good idea for libraries to explore running their own socials, or providing space for library users to run their own.