8

Digital Well-Being Is a Team Sport

When we think about the physical well-being of our families, many responsibilities fall on the shoulders of parents. And while our responsibilities to keep our kids healthy are significant, we know we don’t have to go it alone. We’re part of a broader team of people and organizations that are working together toward that same goal. We teach our kids to evacuate a burning building safely, but we also expect that buildings are designed with working fire exits and that trained firefighters can control the flames. We make sure our kids wear seatbelts, but we also expect cars to have airbags and pass federal collision safety requirements. Doctors step in to help when our kids get sick; public works crews make sure the water we drink is sanitary. Police work to keep our communities safe; schools help prepare our children for future studies and careers.

As we think about our children’s digital well-being, some key responsibilities are borne by parents—we’ve spent most of this book talking about them. But it would be disingenuous to assume that all of the responsibility for creating healthy digital citizens should rest on families alone. Digital well-being is a community effort. We need a team of players, all bringing their talents to bear—a support system of experts that parents can partner with and trust. Understanding the roles (or potential roles) of the other team players is important for parents’ awareness and also helps hold all team members’ proverbial feet to the fire. As parents, our expectations can become a self-fulfilling prophecy: what we demand of the other team members shapes whether they fulfill their roles.

With that in mind, let’s meet some of the other members of the team. We will also look at opportunities to advocate and ask those team members to do more to ensure the digital well-being of our children. There are hundreds of different roles that could form part of our extended digital citizenship team, but for practicality, we will look at the three key players: tech providers, governments, and schools.

Digital Platform Providers

We’ll start with the most obvious member of the team, the tech providers themselves. Despite providing the tools that enable many of our digital dysfunctions, most digital platform providers have a vested interest in creating healthy digital citizens—for self-preservation, if nothing else. Responsible digital platforms have processes to report concerning activities. Facebook has been working to improve this process by expanding its “flag as inappropriate” option to identify a variety of potentially harmful behaviors, such as content suggesting self-harm, misinformation, or hate speech.1 Facebook and other digital platforms depend on us to use these tools when we observe activities that we feel are problematic.

But digital platform providers need to do more to support our digital citizenship team effort. Author and entrepreneur Eli Pariser says we should expect more from our digital platform providers in exchange for the power we give them over our discourse. He believes we should ask not just how we make digital tools user-friendly, but also how we make digital tools public-friendly. In other words, it’s our responsibility to make sure our digital platforms never serve individuals at the expense of the social fabric on which we all depend. With that in mind, let’s look at three key responsibilities we should expect of our digital platform providers.

Establish Meaningful Norms

We should expect our virtual platforms, as members of our digital citizenship team, to establish and clearly communicate standards for participation in our virtual spaces. Some already do a good job of this, including Flickr, Lonely Planet, and The Verge. Flickr’s community norms are simple, readable guidelines clearly designed for community members, not just lawyers, to understand.1 They include some clear “dos,” like:

“Play nice. We’re a global community of many types of people, who all have the right to feel comfortable and who may not think what you think, believe what you believe, or see what you see. So, be polite and respectful in your interactions with other members.”

And they also include some clear “don’ts,” like:

“Don’t be creepy. You know the guy. Don’t be that guy. If you are that guy, your account will be deleted.”

We should expect all of our digital platforms to establish a clear code of conduct. And we should expect it to be actively embedded throughout the virtual space. Even the examples I mentioned have their norms pretty deeply buried in the back corner of their sites. In chapter 2, I talked about the idea of sign-posting—creating messages and reminders of the norms of behavior for our physical shared spaces. A similar approach should be an expectation for our virtual platforms as well. Imagine if, instead of one more ad for new socks on Pinterest, a reminder appeared to “post something kind about someone else today.” Or imagine if, instead of poking our eyes out watching yet another Geico ad before a YouTube video plays, we were presented with tips for how to respectfully disagree with the content of someone else’s video. Sure, this would mean platform providers giving up a fraction of a percentage of advertising revenue, but that’s a very reasonable expectation for them to remain a trusted member of the team.

Verify Human Users

A second expectation of our platform providers is that they take more seriously the responsibility of identifying the users of their platforms that are not human. Most people would be shocked to learn how many of the “people” we engage with in virtual spaces are actually robots (“bots”) designed to create highly reactive content. Some of the most divisive posts that flood the virtual world each day are generated by these bots, which are capable of arguing their digital positions with unsuspecting humans for hours on end.2 One study found that during the height of the Covid-19 pandemic, nearly half of the accounts tweeting about the virus were bots.3 YouTube and Facebook both have about as many robot users as human users.4 Last year, Facebook removed over 2 billion fake accounts, but until additional verification is added, new accounts will be created, also by bots, almost as quickly as the old ones are removed.5

In addition to clearly labeling bots as bots, platform providers should do more to verify the identity of human users as well, particularly those that are widely followed. Many of the dark and creepy parts of our virtual world exist because online platforms have been irresponsibly lax in verifying that users are who they say they are. This doesn’t mean platforms couldn’t still allow anonymous users, but such accounts should be clearly labeled as unverified so that when your “neighbor” asks your daughter for information about her school online, she can quickly recognize if she should be suspicious. The technology to do this sort of verification exists and is fairly straightforward (banks and airlines use it all the time). Twitter piloted this approach—you may have seen the little blue checkmarks next to some people’s accounts—but then stopped, claiming it didn’t have the bandwidth to continue. The lack of expectation for verified identities enables fraud, cyberbullying, and misinformation. If digital platforms want us to trust them to be the host of our virtual communities, we should expect them to identify and call out users who are not who they say they are.

Improve Content Curation

The third responsibility of our digital platform team members is to be more proactive in curating the content on their platforms. This starts with quickly addressing posts that incite racism, violence, or terrorist activity, and features that facilitate buying illegal drugs, identity theft, or human trafficking. Twitter recently began adding warning labels to bullying or misleading tweets from political leaders.6 A notable example is when a tweet from Donald Trump was flagged for claiming that mail-in ballots lead to widespread voter fraud (for the record, there is absolutely no evidence that mail-in ballots are any more susceptible to voter fraud than in-person ballots). Apple has also taken this responsibility seriously with a rigorous review process for apps added to its mobile devices. Unlike the open web, Apple does not permit apps on its devices that distribute porn, encourage consumption of illegal drugs, or encourage minors to consume alcohol or smoke. Apple and Google have both begun requiring apps on their respective stores to have content-moderation plans in place in order to remain listed.7

Effective content moderation also means doing more to empower human moderators. Reddit and Wikipedia are the largest examples of platforms that rely on human moderators to make sure their community experiences are in line with their established norms. In both cases, humans are not just playing a policing role, but taking an active part in developing the content on the platform. Both rely on volunteer curators, but we could reasonably expect human moderators to be compensated for their time and energy in making virtual community spaces more effective. This can be done in a variety of ways. YouTube currently incentivizes its content creators to upload videos to its platform by offering them a percentage of advertising revenue. A similar incentive could be offered to users who help curate the content on these platforms. This is very different from YouTube’s current approach, which uses bots to moderate and curate. As author and technologist James Bridle points out, content on YouTube that is created by bots is also policed by bots, meaning robots are convincing other robots that their content should be viewed, and the human users of the platform are left paying the price.

Another simple way to empower users as moderators is to provide more nuanced options for reacting to each other’s content. Right now, “liking” or “disliking” are about all the options we have to respond to content on shared platforms. Some platforms have added a happy face, a heart, and most recently a hug, but that is still an incredibly limited set of response options for the variety of content flowing around our digital world. In the physical world, soft-negative feedback is a critical tool for helping people learn the norms of community space. Most of the feedback we give in the physical world is much more subtle than what we can do online. If we were in a conversation with someone who said their kids were not going to get a vaccine because it contains a secret tracking microchip, we might respond with an “I don’t know about that” or a “hmmm, I think you might want to check your facts.” But in the virtual world, our only option might be to click the “thumbs down” button. In a world where very subtle reactions carry great significance, giving a big “thumbs down” to a friend is the social equivalent of a full frontal assault. On the other hand, if you choose to sidestep the awkward moment by unfollowing your friend, you have just made sure they never hear your feedback again, likely reducing their sounding-board pool to people with similar views, which is even less helpful for establishing shared societal norms. What if, instead of just “liking” or “disliking,” we could tag things as “I question the source of this post,” “I love you, but I still disagree with you,” or “that’s a pretty extreme viewpoint”?

Digital platform providers care what parents think; their continued existence depends on our continued trust. We should expect digital platforms to establish and clearly infuse their environments with media that teach appropriate norms of behavior on their digital spaces. We should call for them to do a better job of clearly labeling nonhuman users of their platforms and to empower their users to be more involved in content curation. Sending this message is as simple as sharing our needs with the platforms themselves (they all have forms to share feedback) or reaching out to the elected leaders who oversee them.

Local and Federal Government

Government, the next member of our digital citizenship team, plays a critical role by protecting and providing. Protection means keeping citizens safe from each other and from external threats. Driving laws reduce accidents; diplomacy (and military strategy if necessary) keeps us safe from foreign attacks. And government provides the means to make sure citizens have access to the services they cannot reasonably provide for themselves, including access to transportation, health care, quality schools, and commercial goods and services. The nuances of where and how the government accomplishes its role should be a topic of ongoing debate as it continually adapts and adjusts to the evolving needs of citizens.

As we become increasingly dependent on the digital world for many critical functions of our democracy and livelihood, we should expect the government to carry out its role to protect and provide in virtual spaces as well. Governments worldwide are beginning to step up to this responsibility. The European Union, Brazil, Japan, and Australia have recently made sweeping changes to protect their citizens from having their private information stolen online. EU leaders are currently pursuing new laws that keep companies like Amazon and Apple from giving their own products preferential treatment over their competitors’ in their online stores in order to keep healthy competition in the digital marketplace. In Britain, officials are drawing up laws to ensure Facebook and Google play better in the sandbox with smaller competitors.8 But government efforts to date have largely prioritized improving online business practices over creating healthy online environments for digital citizens. We need to call on our government leaders to exercise their responsibility in creating safe digital spaces for our families. The following are three specific opportunities around which we can call on government leaders to take action.

Provide Oversight for Digital Spaces

Governments have created stringent expectations that traditional media companies must comply with. This means we don’t have to worry about our children seeing graphic sexual content or hearing profanity while watching TV. In response to an infamous wardrobe malfunction involving Janet Jackson and Justin Timberlake during a Super Bowl halftime show, the US Broadcast Decency Enforcement Act established fines for media companies that violate the rules. The FCC also enforces guidelines—known as “KidVid rules”—that require broadcasters to include at least 150 hours of child-appropriate programming as part of their core schedule. And media companies that enable the sharing of illegal or defamatory information can be held liable. If a guest on CNN engages in hate speech or the New York Times publishes illegal content on its site, even if it’s written by a guest author, the company is legally responsible. It seems reasonable for laws to require these organizations to remove hate speech and posts that deliberately incite violence or encourage terrorism.

But this oversight does not apply in the digital world. Internet-based media platforms like YouTube, Netflix, and Facebook can ignore these requirements, despite the fact that they have become our primary source of media. The loophole applies to online information-sharing sites as well. Section 230 of Title 47 of the US Code exempts websites from responsibility for content shared by users on their platforms, even if it is defamatory or incites violence. But steps can be taken to close these loopholes. For example, in 2018, Section 230 was amended to make virtual spaces liable for enabling human sex trafficking—shuttering sites like backpage.com and removing the sex forums from Craigslist. Why not also make online platforms responsible for complying with the same standards as traditional media companies? This might include ensuring that a balance of media is offered in suggested playlists, hateful or violent media is removed, clear warning labels appear on graphic content, and controls are available for parents to limit the types of media shown to their children. It doesn’t make sense to hold traditional media companies to a set of expectations while letting new media platforms completely sidestep them just because we haven’t been diligent about keeping our laws current with advances in technology.

Ironically, the only legal requirement aimed at protecting children from harmful online media in the United States is the Children’s Internet Protection Act (CIPA). And in a textbook example of twisted logic, CIPA puts almost all of the responsibility on schools and libraries to figure out how to filter obscene images, child pornography, and other blatantly sexual content and no responsibility on the platforms providing that content in the first place. Not only do schools already have enough on their plate, but most individual schools don’t have the technical chops or scale to shift the behaviors of major tech providers.

Other countries are beginning to take this role more seriously. Sweden, Norway, Denmark, and Belgium, for example, have all created guidelines that restrict digital advertising targeting children younger than twelve.9 In the same way the government has taken its role as protector seriously in physical spaces, we should expect it to also take this role seriously in the digital world by establishing and enforcing consistent standards across all types of media platforms.

Ensure Equity of Access

A second key area where government needs to support parents in ensuring digital well-being is in equity of access. You can’t become an effective digital citizen if you are excluded from the digital world. For almost a century, we have considered access to water, electricity, and phone service essential to a healthy life, not optional luxuries. As such, we have made it a priority to ensure all citizens have access to these basic utilities. If left to natural market forces alone, the availability of basic utilities would be highly inequitable.

Take electricity, for example. The cost to have electricity delivered to my house in a neighborhood in suburban Virginia is relatively low. But for someone living on a farm in Iowa or in a Native community in Alaska, the cost per home of providing electricity is prohibitively expensive. Without government involvement, there would be many parts of the country where it would just never be commercially viable to provide electricity. As part of its recovery plan during the Great Depression, the United States set an ambitious goal to make electricity a basic utility available to all citizens, regardless of whether they lived in highly dense or remote communities. Governments can do this in a number of ways, including:

· Offering access to government land (under streets, space on utility poles, etc.) to a utility provider to build its infrastructure in exchange for providing service to remote areas

· Providing grants to cover the capital expenditure for bringing utilities to remote locations

· Becoming a utility provider itself in areas where having a commercial provider just doesn’t make economic sense

In addition, governments can create a variety of support services to help struggling families afford monthly utility costs. Programs like the Low Income Home Energy Assistance Program (known as LIHEAP) assist families with energy costs to ensure heat is available in the winter and cooling in the summer. Many states have their own energy assistance programs to supplement LIHEAP.

It’s long past time to consider access to the internet a basic utility. As education, banking, health care, voting, and many other critical services migrate online, it is nearly impossible to be a functioning member of society without having access to the digital world. Fifty-five million people in the United States do not have a computer at home, and tens of millions of households do not have enough computers to allow for concurrent use by multiple family members. Individuals in these nondeviced or device-deficient households are often unable to access education, telehealth, and employment. Possessing a functioning, connected computer and the skills to use it productively is a basic, fundamental need in today’s society. Even applying for an entry-level job at Walmart or Target is not possible without access to the internet. During the Covid-19 pandemic, lack of internet access also meant lack of access to public education, as schools moved to learning online.

In 2016, the United Nations passed a resolution declaring internet access a basic human right.10 Universal internet service is already provided in Switzerland, Finland, Spain, and the UK. Uruguay provides a computer and connectivity to every student as they enter public school. The United States lags behind the rest of the developed world in terms of treating access to the virtual world like a utility. Fortunately, the same strategies used to make sure everyone has access to electricity and water also apply to accessing the internet, so it’s a well-paved path if we are willing to make the case for elected leaders to make it a priority.

Protect Our Digital Privacy

Finally, government has a role to play in protecting our digital privacy. In the United States, the Health Insurance Portability and Accountability Act (HIPAA) sets boundaries on the use and release of digital health records and requires access controls and audit logs from providers to protect that information. It holds violators accountable, imposing serious penalties on those who breach a patient’s privacy rights. Beyond medical records, however, there is virtually no US guidance to uphold the promise of privacy in the rest of the digital world.

Other countries have begun taking action though. In 2018, the European Union implemented the General Data Protection Regulation (known as GDPR) with the primary goal of giving citizens control of their personal data. This regulation applies to any company holding data about any EU resident regardless of the company’s location. And, it turns out, the EU isn’t messing around. Penalties for noncompliance are up to 4 percent of a company’s annual global turnover—a big stick. Brazil, Australia, and Japan have followed suit to create their own similar data privacy regulations.

Creating laws that ensure our basic privacies online should be table stakes in a world where digital participation is no longer optional for remaining a functioning member of society. We need a regulatory context that protects our data from being used and sold without our consent, and a pathway to appropriate compensation when we do consent. In “Blueprint for a Better Digital Society,” Jaron Lanier and E. Glen Weyl propose the idea of data dignity, where the government would protect our right to own the data that exists about us now and forever.11 This means that when our data is sold to digital advertisers, we would get a royalty payment over time as the data is used. Lanier and Weyl estimate that a family of four could earn up to $20,000 a year from the royalties paid on the use of their data. This approach would not only bring additional transparency to how our data is being used but also allow all parties to benefit financially from its use.

To be an effective team player when creating a healthy digital environment for our children, government leaders need to move from reactive to proactive mode. Citizen voice matters to elected officials. If we want digital well-being to be a priority to them, we must make it clear that it is important to us. We should actively call on our elected leaders to fulfill their responsibilities to protect and provide in the digital world.

Schools and Universities

Our most important partners in teaching digital citizenship are our children’s schools. Throughout their time in school, our kids benefit from teachers and coaches who provide guidance and have deep expertise that helps reinforce the concepts we’re teaching at home. As members of our digital citizenship team, schools and universities have the important role of preparing the next generation of effective digital citizens, and also of preparing the future designers and builders of our digital platforms themselves.

Teaching Digital Citizenship in Schools

We should expect schools to closely collaborate with parents to support digital well-being. Many of the ideas presented in this book so far apply equally to teachers and other youth leaders, but I will add a few notes specific to schools here. First, teaching digital citizenship in school is not about creating a new class. It’s about creating a new culture. The five attributes of effective digital citizens need to be modeled by teachers and school leaders and embedded into activities across all subjects. The Los Angeles Unified School District has used the five attributes presented in this book as a common framework and language to help its 35,000 teachers start a new conversation with their students. The elements of creating informed and engaged digital learners align especially closely with the responsibilities of schools. Every school should have a plan for how it will use technology to enable and support its core learning vision. Schools should also establish ways to highlight and recognize students who demonstrate the attributes of digital citizens.

Parents can play a critical role in helping schools fulfill their responsibility in this area. A simple and helpful action that parents can take is to ask school leaders how they are teaching digital citizenship. If there is no plan or if the current approach is overly focused on online safety but missing other elements of digital citizenship, work with school leaders or the PTA to suggest some changes. Ask to see a copy of the school’s acceptable-use policy for devices. If the policy is not written in language that is appropriate for youth or only includes things not to do, ask to organize a group of parents to make some recommended changes.

Preparing effective digital citizens is a responsibility of higher-education institutions as well. Two years ago, University of Washington professors Carl Bergstrom and Jevin West were concerned that their institution wasn’t doing enough to prepare students to recognize or respond to digital misinformation. They began offering a course called Calling Bullshit in the Age of Big Data, focused on detecting and addressing false information.12 The class explores common misinformation traps, including misleading advertising, deceptive data visualizations, and statistical tricks, and provides strategies for respectfully responding to people who spread misinformation, whether intentionally or not. Bergstrom and West believe that, for the safety of our future society, recognizing and addressing BS is one of the most important skills any student can learn at college.

Unfortunately, the University of Washington is an outlier when it comes to explicitly teaching principles of digital citizenship. When you consider that you can take a course on tree climbing (Cornell University), whether or not Harry Potter is real (Appalachian State University), walking as a form of art (Centre College), or embracing distraction (Belmont University), it seems reasonable to expect more colleges to also prepare students to be effective members of our digital world. Oh, and if you worry that students aren’t interested in the topic: the first semester the University of Washington course was taught, the 160-seat limit was reached within one minute of registration opening.13

Preparing Future Designers

A second, less obvious but equally important responsibility of our education partners is to train the future designers and engineers who will build the platforms of our digital world. If we really want to have virtual spaces designed around supporting humanity and democracy, we need a generation of designers and developers who think differently about what they’re creating.

For several years, I’ve served as a design resident for the global design firm IDEO. You may have never heard of IDEO, but the products it designs likely fill your house under the names of a thousand brands. IDEO designed the first widely manufactured Apple mouse. If you have a toothbrush with that nice rubberized grip, you can thank IDEO for that, too. IDEO has become successful because of a design philosophy that obsessively keeps the needs of people first and foremost when designing products and services. Its core design principle is empathy for the user. In the case of the toothbrush, designers observed that when people used smooth, flat toothbrush handles with wet hands, the brush often slipped or was dropped (and everyone hates using a toothbrush that has just fallen on the floor or in the sink). So IDEO designed around that problem. A toothbrush with a rubber grip is much easier for human hands to hold. Now, almost every toothbrush has one. This process of designing for the needs of people has become known as human-centered design.

Today’s digital platforms are often designed to prioritize advertising opportunities or download counts over the needs of the people and communities who use them. We could take a page from IDEO’s book when training future designers. We need schools and universities to prepare developers and engineers who know how to keep humans (and humanity) at the center. That means teaching strategies for understanding the societal impacts of the features we build. For our schools and universities, it means rethinking how we teach coding, business, and engineering. Traditionally, an unseen wall divides business and tech programs from humanities programs: a typical humanities graduate will likely never take a coding class, and vice versa. I was once speaking to a group of humanities majors about the importance of learning the language of computer code in order to be part of these critical conversations, and I realized the irony of struggling to convince humanities majors, of all people, of the value of learning a new language.

We are starting to see this approach in some universities. The d.school at Stanford seeks to be a space where students can learn human-centered approaches to designing and building practical solutions. The New School, located in New York City, recently launched a minor called “Code as a Liberal Art,” which teaches computer science not as a tool for career preparation but as a medium for creative expression, cultural criticism, and “civic awareness for understanding an increasingly computational society.”14 And Brigham Young University is pioneering an approach it calls Humanities+, in which all humanities majors are encouraged to gain experience in tech or business fields before they graduate, and vice versa. If we want to create a generation of tech leaders who know how to build virtual spaces with humans at the center, it should become increasingly difficult to distinguish between humanities programs and business or tech programs. As James Bridle wisely notes, making our online world more human-friendly is not a matter of building the technology better but of changing the society that produces these technologies. Universities must realize that they are not just creating software engineers, but the designers of our virtual governments, digital community spaces, and online education institutions.

As parents, we should reach out to schools, universities, and educators to emphasize the urgent need to teach our children to be healthy digital citizens. We should be active collaborators with education partners to keep these issues at the forefront. We should encourage our children to seek out postsecondary education opportunities at institutions that understand the importance of blurring the lines between tech and the humanities, or at least help our children create such programs for themselves by taking a nontraditional blend of courses. We need a generation of designers that understands the priority of building tools that run our virtual world in a way that supports and empowers humanity.

The Missing Team Members

In addition to the members of our digital citizenship team that we’ve considered, we need to recognize a couple of players that are not yet part of our team in the way they should be. These are groups that can make a unique contribution but have not yet been deeply involved because we have not made digital citizenship a priority for them.

Researchers

More research is necessary to help us understand the best approaches to, and the impact of, our work to prepare young people to thrive in a digital world. Sandra Cortesi, a fellow at the Berkman Klein Center for Internet & Society at Harvard University and director of its Youth and Media project, focuses on researching the elements of digital citizenship: which models matter, what skills are being taught, and where additional research is needed to understand how young people can become successful in the digital world. But Cortesi’s focus is something of an outlier with respect to our national research agenda. The US Department of Education provides around $200 million in funding for education research each year, yet none of it is dedicated to studying or supporting the creation of healthy digital citizens. Likewise, none of the $60 billion federal defense R&D budget is focused on digital citizenship, even though our greatest national security risk is our own inability to maintain a civil society in a virtual space.15 If we are serious about the digital well-being of our children, we must also be willing to prioritize an associated national research agenda.

Librarians

We have coaches for music, art, math, sports, just about everything a kid may need. Everything, that is, except navigating the digital world. Lisa Guernsey, an author and expert on how literacy skills have morphed in the digital world, says all kids need media mentors: nonjudgmental, tech-savvy individuals who take the time to understand our kids’ interests and point them to appropriate media and online experiences. Librarians are uniquely suited to play this role. In many ways, it’s the same skill set librarians have used for years to match kids with the right books and provide meaningful context around the ideas they are exploring.

As books themselves increasingly become digital, having large rooms of paper books in every community library is less necessary. Even for readers who prefer paper books, virtual apps offer far more efficient ways to store and distribute them than lining them up on shelves in a physical building. We need the libraries of the future to play a more critical societal function than book storage or internet access. Imagine if libraries redesigned themselves as the centers for digital learning in our communities. The staff at Providence Public Library in Rhode Island have already begun working digital-citizenship coaching into their daily practice. The library offers a Data Navigators 2.0 program that teaches high school students and community members basic principles of data analytics and data visualization so they can be informed digital citizens. A Teen Squad program builds on these skills by coaching young people through the use of technology to help solve important problems in collaboration with a variety of local community organizations. The principles of digital citizenship discussed in this book are woven through the library’s programs. In a great example of team members working together, the library’s programs are provided in partnership with the Rhode Island Department of Education, so participating students can earn academic credit for what they are learning and, in the summer programs, even get paid for participating as part of a strategic workforce development initiative. As students develop these critical skills, they are awarded microcredentials (digital badges) that can help them qualify for future employment and postsecondary education opportunities.16

Librarians could become the coaches that our communities need to reinforce digital citizenship skills, particularly when it comes to finding and evaluating the best digital content. Instead of looking to librarians to distribute books or provide a space for internet access, we may be better served by seeing them as expert community guides who help us navigate and make meaning out of a complex digital world.

Public Media Platforms

Some final missing team members that we should consider are public media platforms. To understand the importance of their role, let’s take a minute to look at the history of television. In the 1950s and ’60s, there were limited options for high-quality television content for children, along with a growing realization that, as a society, we should use TV for more aspirational goals. To address the failure of commercial television to provide quality media for children in the United States, Congress passed the Public Broadcasting Act, which gave rise to PBS, NPR, Mister Rogers’ Neighborhood, and Sesame Street—all of which were founded on public interest values about what we want kids to learn. Literally every single minute of a PBS Kids show is informed by educational research, reviewed for age appropriateness, and aligned to learning outcomes. Additionally, the fact that these were nonprofit entities allowed them to address important sensitive issues that for-profit companies tended to avoid, like poverty, racism, and stereotyping.

For decades, public media organizations played a central role in our society, not only making sure quality children’s programming was available but also producing news and educational content for adults. But in today’s virtual world, we have shifted away from the model in which a limited set of organizations produces the media we consume. Now the vast majority of news and other media is produced by individuals and shared on social media platforms. But to date, there has been no significant public platform to host these digital media communities. Our main third spaces for sharing digital media—Facebook, YouTube, TikTok, Twitter, Instagram, Reddit, Amazon—are all for-profit companies.

We need public media to evolve to play the same role in our virtual world that it played so effectively for many years in our analog media world. We need virtual community spaces that are free to the public, with the sole purpose of supporting our virtual media-sharing community. By removing the dependence on ad revenue, the design and experience of these community public media spaces could be different. The ways our personal data would be used could be different. And most importantly, a publicly owned version of YouTube or Facebook—with features and functionality that would provide us a noncommercial social gathering place—could actually be a forcing function to improve the experience provided by commercial community platforms. We could easily fund the whole effort through a small tax on targeted online advertising. Without public virtual community spaces, our future society will depend entirely on commercial entities to permit us to gather and engage as a virtual community.

Being digital for good is a team sport. Families remain at the center of preparing their kids to be effective digital citizens, but they should not be expected to shoulder the burden alone. We should continually identify potentially missing members of the team and work in partnership with social platform providers, governments, and educational institutions to create an effective environment for our kids. This means helping these institutions understand what we’re expecting, where they are falling short, and what they could be doing differently. If every person who reads this book reached out to at least one digital platform provider, school, or elected leader to push them to be a more engaged member of the digital citizenship team, we would build momentum toward meaningful change and deeper collaboration around our shared goal of preparing kids to be safe and effective digital citizens.

Next Steps

Action Items

· Reach out to your elected leaders and let them know that they need to take action to ensure adequate protections are in place for our online data.

· Use the feedback tools when problematic posts are made on shared media sites so the platform providers can take action.

· Use feedback channels to ask platform providers to verify user identity and to be transparent about which users are verified.

· Use feedback channels to ask platform providers to more clearly establish and enforce norms of appropriate behavior on their sites.

· Encourage older kids to get experience being moderators in online spaces, including Wikipedia, Reddit, and other online forums.

· Ask school leaders and librarians how they are teaching digital citizenship beyond online safety.

Conversation Starters (for other team members)

· Platform providers:

o What more could you be doing to verify the identity of the users on your platform?

o How are you establishing and enforcing norms?

· Government leaders:

o What policies should be in place to protect our data from being used and shared without our permission?

o How will you prioritize funding for research and implementation around teaching digital citizenship?

o How can we ensure universal access to the internet?

· School leaders:

o How are you teaching students digital citizenship?

o What is the plan for teaching coding and other digital skills?
