What’s Dumbing Down America: Media Zombies or Educational Disparities?

Can you name all of Brad and Angelina’s kids? President John F. Kennedy’s siblings? The sisters in Louisa May Alcott’s Little Women? Jacob’s sons from the Old Testament? My guess is the first question is easiest for most readers coming of age in the twenty-first century, whether we are actually interested in knowing the Jolie-Pitt children’s names or not. After all, you don’t have to try very hard to hear them mentioned in celebrity gossip or fan magazines that feature their pictures. Television, magazines, and the Internet help us much more with the first question than the others. The other questions require us to draw on knowledge of history, literature, and the Bible, information that is not circulating as freely and rapidly as information about contemporary popular culture. I admit that my ability to name any of Jacob’s sons is solely based on memories of the play Joseph and the Amazing Technicolor Dreamcoat. Is popular culture turning us into a nation of shallow idiots? Many critics of popular culture are certain that the answer is yes. Although there are numerous examples of ways popular culture can help us waste time with content that is not exactly intellectually stimulating, the cultural explanation helps us overlook very important structural factors that shape educational disparities. Popular culture does not help us understand the educational experiences of young people who live in communities with overcrowded, dilapidated schools, whose families may have attained little education themselves. But focusing on popular culture may get more attention than addressing these complicated structural factors. 
Consider these recent news stories suggesting technology and culture are to blame: “Is Google Making Us Stupid?” (Atlantic), “Does the Internet Make You Dumber?” (Wall Street Journal), “Are Smartphones Making Us Stupid?” (Huffington Post), “Generation Hopeless: Are Computers Making Kids Dumb?” (Associated Press), and, finally, “Is It Just Us, or Are Kids Getting Really Stupid?” (Philadelphia), which argues that the Internet is “rewiring” young people’s minds, and not for the best.1 A Washington Times story called “The Pull of Pop Culture” argues that young people must choose between “the pull of the popular or the push of schooling,” and that kids consistently choose the former, or 50 Cent over Shakespeare. A Chicago Sun-Times story, “Successful Kids Reject Pop Culture’s Message,” notes that being able to graduate from high school is based on kids’ “ability to reject the nonsense they are exposed to in our pop culture.”2 A 2008 book by Emory University English professor Mark Bauerlein, The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future, reflects this same concern. Within these stories, popular culture is cast as antithetical to education and knowledge, something that prevents learning. None address the massive budget cuts that many public schools have had to endure, or the dramatic racial and ethnic disparities in high school and college graduation rates. That one’s ZIP code is a central predictor of the quality of education one has access to also gets left out of these attention-grabbing headlines. Concerns that popular culture makes us dumber predate the Internet age. 
Communications scholar Neil Postman argues in his 1985 book, Amusing Ourselves to Death, that as the United States shifted from “the magic of writing to the magic of electronics,” public discourse changed from “coherent, serious and rational” to “shriveled and absurd,” thanks largely to television.3 Drawing from Aldous Huxley’s Brave New World, Postman decries what he sees as the rejection of books in favor of a show-business mentality that has pervaded every aspect of public life, from politics and religion to education. He believed that these amusements undermine our capacity to think, encouraging us to move away from the written word—rationality, in his view—toward television and visual media. Postman got it partly right. This new media world does act as a never-ending shiny object that grabs our attention. It distracts us from knowing too much about the way American society is structured, and from being too aware of social problems that might seem boring in the face of so much other interesting stuff out there to pay attention to. This keeps us focused on cultural explanations for social issues, rather than the less immediate—and arguably less interesting—structural conditions that shape our education system. But instead of impeding knowledge and discourse across the board, new media like the Internet have increased public discourse, along with the number of amusements available to distract us. Television news programs now use interactive media to further engage citizens, through live blogs and using sites like YouTube in presidential debates, rather than just enabling people to be passively entertained. In fairness to Postman, who wrote before the Internet age, these developments are still unfolding. But rather than replacing traditional means of informing the public and furthering the flow of knowledge, new media and even popular culture are sometimes used to create new ways to educate. 
This chapter considers the complaints that popular culture interferes with education and has created an intellectually lazy population. As we will see, changes in visual media and the increased ability to communicate electronically have altered how people interact and exchange information. Television, texting, and a culture awash in seemingly frivolous gossip may appear to be the causes of educational failure, but the reality is far less entertaining. Problems within education stem from structural factors bigger than popular culture: lack of resources, inconsistent family and community support, and inequality. While some school districts have significant dropout and failure problems, Americans are not as dumb as we are often told … at least no more so than we have been in the past. The vast divides of educational attainment and intellectual achievement can be explained not by popular culture, but by the continuing reality of inequality in American society.

A Nation of Television Zombies?

Does television put viewers into a hypnotic trance, injecting ideas into otherwise disengaged minds? During the 1970s, several books suggested that this was in fact the case. Marie Winn’s 1977 book, The Plug-in Drug, described television as a dangerous addiction. The following year, Jerry Mander’s provocatively titled Four Arguments for the Elimination of Television concurred. 
According to Mander, television viewers are spaced out, “little more than … vessel[s] of reception” implanted with “images in the unconscious realms of the mind.” Put simply, Mander argues that television viewing produces “no cognition.”4 Television viewing increases with age (television viewing is highest for adults seventy-five and over), yet nearly all of the concerns about television dulling the intellect focused on children and teens.5 According to Nielsen Media Research, children and teens watch much less television than their elders: adults sixty-five and over watched an average of more than forty-seven hours per week in 2009, almost double that of children two to eleven, who averaged just over twenty-five hours. Teens twelve to seventeen watched the least television of any age group, averaging just over twenty-three hours.6 Television viewing has been declining in recent years, particularly among young people and teens, who more often use newer forms of media during their leisure time.7 Both Winn’s and Mander’s books rely upon anecdotal observations yet make important charges about the negative effects television supposedly has on thinking. Some of these claims seem like common sense: television shortens one’s attention span, reduces interest in reading, promotes hyperactivity, impedes language development, and reduces overall school performance. Yet research into these claims reveals that television is not exactly the idiot box its critics suggest. It might surprise you to learn that one of the programs most heavily criticized in the 1970s was Sesame Street, the educational program many of us grew up watching. Cognitive psychologist Daniel R. Anderson studied claims that preschoolers get transfixed in zombielike fashion while viewing Sesame Street, as well as the contradictory complaint that it contributes to hyperactivity. 
Studies where researchers observed three- to five-year-olds watch television found that their attention is anything but fixed: they look away 40 to 60 percent of the time, draw letters with their fingers in the air along with characters, and pay more attention to segments compatible with their current cognitive aptitude level. There was no evidence of hyperactivity after watching, and Sesame Street viewers had larger vocabularies and showed greater readiness for school than other children. Anderson and several colleagues conducted a long-term study, following 570 children from preschool into adolescence, to see if a relationship between preschool television viewing and academic performance exists. Their findings cast serious doubt on the speculation that television impedes learning later in life. In contrast to the claims that the nature of television itself dulls intellectual ability, their data repeatedly reveal that content matters: children—especially boys—who watched what they call “informative” programming as preschoolers had higher grade point averages and were likely to read more as teens. These findings counter a well-worn idea that television primes children to expect to be entertained at all times, leading to intellectual laziness and the idea that learning is boring.9 Their study also challenges the idea that television has a “displacement” effect: people spend more time watching television, and thus less time engaged in more rigorous intellectual activities like reading. Anderson and colleagues found that this effect was small, complicated, and observed only in middle- and high-income kids. Children who watched fewer than ten hours a week actually had poorer academic achievement than those who averaged about ten hours of viewing per week, and those who watched a lot more than ten had slightly lower academic achievement than those in the middle. 
The authors conclude that there is no evidence that television viewing displaces educational activities; instead, it is likely that television viewing replaces other leisure activities, like listening to music, playing video games, and so forth. The authors also found that more television viewing did not necessarily translate into doing less homework. The authors list other studies to support their claims, finding that television does not ruin reading skills, lower intelligence quotient (IQ), or otherwise interfere with education. This does not mean that parents should let kids watch as much television as they want and let them do their homework when they feel like it. We should certainly not presume from this study that television is children’s best teacher, but it does not necessarily have the damaging effect critics have suggested. In fact, the best predictor of student achievement is parents’ level of education. It is likely that this effect is so strong—for better and for worse in some cases—that television cannot compete with the academic environment created by parents. Parents who encourage reading, read themselves, and emphasize the importance of education are a far more powerful influence than television. Not surprisingly, reading more is a good predictor of school success, but watching television does not interfere with literacy skills, as many critics charged.11 This connection means that educational achievement—a good predictor of one’s economic success—is inherited more than we might care to acknowledge. The critiques of educational television have had political underpinnings in some cases. 
Anderson describes how much of the concern about Sesame Street was driven by those who sought to cut funding for the Children’s Television Workshop, and public television more generally, during the early 1990s.12 If opponents could find that educational programming had no impact, or even deleterious effects, they could justify eliminating public funding as yet another form of budgetary pork. But such was not the case. Television has never really left the hot seat. More recently, TV has been blamed for causing attention deficit/hyperactivity disorder (ADHD) and even autism. Although it may seem like television’s electronic images can wreak havoc on the young brain’s wiring process, research does not support this conclusion. It is likely that people who have grown up with electronic media think differently from those who did not, but different is not always pathological. Let’s look more closely at some of the research on ADHD and television. It is mostly based on correlations, and therefore causality cannot be assessed. But if you Google “television and ADHD” you will be told otherwise. One online article concludes in its headline, “It’s Official: TV Linked to Attention Deficit.”13 But the authors of the study cited by this article would not go that far. The study in question, published in a 2004 issue of the journal Pediatrics, assessed the “overstimulating” effect television may have on children who watch TV as toddlers. To do so, they asked parents about their children’s television viewing at ages one and three and asked them questions regarding their children’s attentional behavior at age seven. 
Although they did find a relationship between lower attentional behavior and more television viewing, the authors themselves acknowledge that “we have not in fact studied or found an association between television viewing and clinically diagnosed ADHD,” because none of the children in the study had been diagnosed.14 They also conclude that it is equally likely that a more lax or stressful environment might make television viewing more prevalent in early childhood and that television viewing is associated with, but not the cause of, children’s inattention. Likewise, a 2006 study published in the Archives of Pediatrics and Adolescent Medicine found significant differences between children diagnosed with ADHD and their peers. The authors found “no effect of subsequent story comprehension in either group,” and that for the non-ADHD children, “children who have difficulty paying attention may favor television and other electronic media to a greater extent than the media environment of children without attention problems.”15 Most interestingly, their study found that any effect that television watching had on attention was with the non-ADHD kids only; those diagnosed with ADHD showed no declines in attention after watching television. This study challenges the conventional wisdom that television has particularly adverse effects for children with ADHD; instead, the authors conclude that “the cognitive processing deficits associated with ADHD are so strongly rooted in biological predisposition that, among children with this diagnosis, environmental characteristics such as television viewing have a negligible effect on these cognitive processing areas.” A similar study was published in 2007, claiming an association between television viewing and “attention problems,” but did not assess ADHD. 
Another study did use the protocol for diagnosing ADHD, but again it was unclear whether any participants had actually been diagnosed with the disorder.16 In 2011 a study of four-year-olds watching a fast-paced clip of SpongeBob SquarePants made national news, claiming that children who watched the clip did not perform as well on cognitive tests as the children in control groups who did not see the cartoon segment. To read the news coverage, it seemed as though the undersea cartoon character was uniformly making kids dumb. ABC News headlined its story “Watching SpongeBob SquarePants Makes Kids Slower Thinkers, Study Finds.”17 YouTube videos and blogs boldly stated that “SpongeBob makes kids stupid.” The study itself, published in the journal Pediatrics, did not go that far. Based on a nonrandom study of sixty four-year-olds from mostly white, affluent families, the experiment involved showing a subsample a fast-paced clip from the cartoon, followed by cognitive tests and a test to measure the ability to delay gratification. The SpongeBob viewers performed worse on all of these tests, but the authors cannot—and did not—claim that this result enabled them to draw any conclusions about the children’s long-term intellectual prospects. The authors’ conclusion included an interesting hypothesis: that the fantasy nature of the program actually required more of the children cognitively, making it harder for them to perform well on the tests immediately after. They state, “Encoding new events is likely to be particularly depleting of cognitive resources, as orienting responses are repeatedly engaged in response to novel events.”18 So we could just as easily conclude that a fast-paced cartoon requires more mentally and is more of a cognitive workout than slower tasks. 
Some critics have even asserted that television is linked with autism, which garnered coverage in a 2006 issue of Time and in the online magazine Slate.19 A study by economists found a correlation between autism rates and cable-television subscription rates in California and Pennsylvania. They did not measure what children watched (or if children were watching at all). Studies like this, although profoundly flawed, help maintain the doomsday specter of television. Easy answers for complex neurological processes are digestible to the public and thus make for interesting speculation, but probably will yield little in the way of getting to the root cause of autism, just as study after study on television and video games will likely do little for those attending struggling schools. The cumulative effect of questionable studies helps create an environment where television seems to be the answer for educational failure. The American Academy of Pediatrics insists that parents should not allow children under two to watch any television, for fear that it interferes with development, a claim that has yet to be scientifically supported. The AAP statement does not reference any research on infants, but instead focuses on research on older children and teens. Still, the AAP concludes that “babies and toddlers have a critical need for direct interactions with parents and other significant care givers (e.g., child care providers) for healthy brain growth and the development of appropriate social, emotional, and cognitive skills.”20 Although television does not provide the direct one-on-one interaction babies need and can never replace human interaction, there is no evidence of direct harm from television. 
A 2003 Kaiser Family Foundation (KFF) report found that the majority of children under two—74 percent—have watched television (or at least their parents admit that they have), and 43 percent watch every day.21 I am not suggesting that propping infants up in front of the TV set is a good idea, especially if children are left unattended (in the KFF report, 88 percent of parents said they were with their children all or most of the time). But there is no evidence that television has a negative impact on infants either, only that it does not necessarily contribute to their development. If parents decide they would like to keep their children away from television, they have the right to make that choice. But many parents are made to feel guilty for choosing to allow some television viewing when there is no concrete evidence of harm. The TV blackout is especially difficult for parents with older children who might watch or those who enjoy watching TV themselves. In contrast to the widespread belief that television interferes with intelligence, writer Steven Johnson suggests that the opposite might be true. In his book Everything Bad Is Good for You: How Today’s Popular Culture Is Actually Making Us Smarter, Johnson argues that television has actually become more complex and cross-referential and that the best dramas and comedies of today require significantly more of viewers than in the past. He cites programs like 24, which expect that viewers think along with the show and draw from plot twists and information from previous shows, in contrast to older television, which provided more exposition, if any was needed at all. 
He says that these kinds of shows are “cognitive workouts” and that even reality shows sometimes encourage us to develop greater social intelligence.22 Although I’m not sure that television makes most people smarter—I would hypothesize that those who are already intelligent can use television to improve upon an already strong intellect—the research does not support blaming educational failure on television. It is another attempt to use a cultural explanation while once again ignoring social structure. Certainly, being able to concentrate and focus is important to educational success. But focusing on popular culture helps us ignore issues such as hunger and family and neighborhood violence that may interfere with learning. These issues are also more likely to be major concerns in low-income areas with high dropout rates.

Minding Newer Media

Although concerns about television will probably never completely fade away, they are sometimes overshadowed now by newer forms of media, particularly time spent online. Adults are more likely to spend time online than children or teens: adults aged thirty-five to forty-four spent an average of nearly thirty-nine hours online in 2008, compared with just over twelve hours for teens twelve to seventeen, according to Nielsen Media Research.23 And video games also cut into television time, especially for boys. A 2007 study, published in the Archives of Pediatrics and Adolescent Medicine, found that 36 percent of their respondents in a nationally representative sample played video games, averaging an hour a day (and an hour and a half on weekends). Gamers reported spending less time reading and doing homework than nongamers.24 While this may indicate that video gamers’ schoolwork will suffer, other studies—including two that I discussed above—have found no evidence that video games were associated with lower academic performance. In one of these studies, published in Pediatrics in 2006, the authors seem to contradict themselves. 
In their analysis they state that “video game use [was] not associated with school performance,” yet conclude that “television, movies and video game use during middle school years is uniformly associated with a detrimental impact on school performance.” They also neglect to add that television use itself has no negative impact, just heavy viewing during the school week, according to their own findings.25 Another researcher responded to this contradiction by writing to the journal that the “conclusions are not warranted,” yet the authors refused to accept their own study’s findings, responding that “from this ‘displacement’ perspective, we have little reason to believe that four hours of video game time would be any different from four hours of television time.”26 The reality is that very few people actually play video games for four hours a day, as the 2007 study found; in the Pediatrics study, 95 percent of kids played fewer than four hours a day. Their unfounded conclusion that video-game playing must negatively affect academic achievement reflects the persistent belief that video games are problematic; it’s equally likely that children who must spend the same amount of time in other activities, such as caring for siblings or doing extensive household chores, would also find their academic achievement lower. Focusing on video games does not address the broader structural factors that impact school success or failure. For people who have played video games, the question about gaming and academic achievement might seem backward. Wouldn’t games that require you to learn often complex rules at increasingly difficult levels actually provide intellectual benefits? Steven Johnson, author of Everything Bad Is Good for You, makes this argument, using The Sims as an example, where users need to master a host of rules as they play the game about urban planning. 
Yes, common sense dictates that people (of all ages) should not neglect their other responsibilities in favor of playing, but the games themselves tend to offer a kind of mental workout, especially improving spatial skills.27 I suspect the disdain for video games and other new media comes from a lack of familiarity. The games are so much more complex now than when they first came out in the 1970s that they compel users to play a lot more than Pong, Merlin, or Atari did when I was growing up. Back then the games were much like other children’s toys that kids played occasionally and mostly grew tired of. By contrast, games today are likely to be serious endeavors that kids don’t give up after a few weeks, but instead are likely to continue to play into adulthood. Video games bear little resemblance to their predecessors from decades ago, and thus seem like a strange new development for many older adults. But at least some people over forty have a frame of reference for video games, unlike texting, a relatively new development. Recently, texting has come under fire for presumably ruining young people’s ability to spell and write coherently. Many complaints come from people I can relate to: college professors who read students’ papers and e-mails. A Howard University professor told the Washington Times that electronic communication has “destroyed literacy and how students communicate.” A University of Illinois professor wrote to the New York Times that she is concerned about the informality in written communication, with no regard for spelling and grammar. A tutor wrote an op-ed in the Los Angeles Times of the “linguistic horrors” she frequently reads in students’ essays. “The sentence is dead and buried,” the author concludes.28 I can relate to these concerns, especially when I get rambling e-mails in all lowercase letters from students. But to tell the truth I have not seen a decline in students’ ability to write since e-mail and texting became so widespread. 
And according to a Pew Internet and American Life study, teens don’t confuse texting with actual writing. A surprising 93 percent of those surveyed indicated that they did some form of writing for pleasure (journaling, blogging, writing music lyrics, and so on). Most teens—82 percent—also thought that they would benefit from more writing instruction at school. Others are also optimistic. Michael Gerson of the Washington Post writes, “A command of texting seems to indicate a broader facility for language. And these students seem to switch easily between text messaging and standard English.”29 Texting reminds me of another form of language use that is all but obsolete: shorthand. This used to be considered a skill, taught in school often to prepare students for secretarial work. Court reporters also master a language within a language in their daily work. But because texting is associated with young people, critics presume it is a detriment rather than a new skill. And like television, video games, and the Internet, texting is not just a young person’s activity (although the younger a person is, the more texts they are likely to send per day).30 According to industry research, the median age of a texter is thirty-eight.31 Perhaps at the heart of these concerns are uncertainties about these new media. Will they distract people from being productive citizens? Enable too many shortcuts? Much has been written recently about teens and multitasking, mostly with an undercurrent of anxiety. “Some fear that the penchant for flitting from task to task could have serious consequences on young people’s ability to focus and develop analytical skills,” warns a 2007 Washington Post article. Time published an article in 2006 called “The Multitasking Generation,” stating that “the mental habit of dividing one’s attention into many small slices has significant implications for the way young people learn, reason, socialize, do creative work and understand the world. 
Although such habits may prepare kids for today’s frenzied workplace, many cognitive scientists are positively alarmed by the trend.” The article goes on to quote a neuroscientist who fears that multitaskers “aren’t going to do well in the long run.”32 It is interesting that rather than celebrating the possible positive outcomes of multitasking—which most mothers will tell you they have no choice but to learn—commentators offer a grim prognosis where young people are concerned. As Time observes, multitasking is a valuable professional skill, as any brief observation of the frenzied Wall Street trader or busy executive reveals. The Kaiser Family Foundation released a report on youth multitasking in 2006 and found that while doing homework, the most likely other activity teens engage in is listening to music. Most of the multitasking comes while doing other leisure activities, like instant messaging and Web surfing at once. The KFF study seems to imply that using a computer to do homework invites distraction. “When doing homework on the computer is their primary activity, they’re usually doing something else at the same time (65% of the time),” the report concludes.33 It’s also the case that people think they are better at multitasking than they actually are. As many other professors have likely also observed, students who spend time online during class lectures and discussions can miss crucial information, though they might think they can do both at once. Yet computer use is a vital part of being educated in the twenty-first century. In creating access to a tremendous amount of information, the Internet also changes the nature of education. Items that had to be researched from a physical library can be recalled by computer or smartphone, basically eliminating the need for memorization of many facts. These shifts remind me of Albert Einstein’s alleged ignorance of his own phone number, which he supposedly said he could look up if he needed to know. 
How many phone numbers do you know now that phones remember them for us? Yes, the Internet and other technologies can be major distractions and have created new ways to take intellectual shortcuts and to cheat. Education needs to evolve along with the technology, shifting the nature of learning away from memorization and onto teaching how to think. The Internet can and has been used to thwart cheating, too, and rather than new media being the enemy, educators need to make peace with them and embrace them as much as possible. Just as the written word moved societies away from oral culture, visual media require a new intelligence that needs to be fully integrated into education today. Our continued reliance on standardized testing impedes this shift in many ways. But a new way of sharing information has arrived and will likely continue to mutate in the coming years.

How Dumb Are We Really?

For those who glorify the past, the present or future can never compare. What’s interesting is that complaining about how little the next generation knows never abates. People have found young people’s knowledge lacking for centuries, and commentators have grimly assessed Americans’ intellectual abilities, whether it be math, reading skills, or geography, for more than a century.34 The complaint that we are superficial and interested only in amusements has been around for a long time. But are we really less knowledgeable than our predecessors? One source of support critics look to is SAT (formerly known as the Scholastic Aptitude Test) scores. Between 1967 and 1980, average verbal scores fell 41 points, from 543 to 502, a fall of about 8 percent, and math scores fell 24 points, from 516 to 492. As you can see in Figure 4.1, this appears to suggest that high school aptitude nosedived during the 1970s. Since that time, average math scores rose to an all-time high in 2005 before falling back to previous levels in the years after. 
Verbal scores continue to fluctuate but have yet to match levels of the late 1960s and early 1970s.

Figure 4.1: Average Critical Reading and Math SAT Scores, 1967–2011. Source: College Board.

Critic Marie Winn, author of The Plug-In Drug, argues that television is the "primary cause" of this decline, claiming that as kids grew up watching more television in the late 1960s, their ability to read declined. But as the above-noted studies detail, television had little to do with high school grade point average, which is highly related to SAT scores.35 Ironically, the decline in SAT scores from four decades ago reflects a positive trend: more high school students are taking the test and planning to attend college than in the past. According to the US Department of Education, in 1972, 59 percent of high school seniors planned on attending college, compared with 79 percent in 2004.36 The number of students enrolled in college more than doubled between 1970 and 2009 as well.37 Not only are more people attending college, but many more African American and Latino students are attending college than in 1970, groups that have been historically underrepresented and tend to have slightly lower scores on average than whites or Asian Americans.38 In 2011 more students took the SAT than ever before; 44 percent of the test takers were minority students, the largest proportion in history.39 These students are also more likely to attend underfunded and overcrowded urban schools with less qualified teachers, and in some cases English is their second language.40

Donald P. Hayes, Loreen T. Wolfer, and Michael F. Wolfe of Cornell University suggest that a decline in the quality of textbooks also helps explain declining achievement. They examined eight hundred textbooks published between 1919 and 1991 and found that the newer texts are less comprehensive and, in their estimation, less likely to prepare students to master reading comprehension.41
Sternheimer, Karen. Connecting Social Problems and Popular Culture (p. 88). Taylor and Francis. Kindle Edition.