Blogging About Blogging

Like the many other forms of digital history we have considered thus far, blogs are criticized for various reasons. Comparative brevity, a supposedly informal tone, and willy-nilly content can make the most serious of internet users quite uncomfortable.

In defense of blogs, and of the history blog, we should note that arguably the most important advantage of blogging is the alternate community it creates for writers and readers, scholars and enthusiasts.[1] With functions such as sharing and commenting, users may receive and provide feedback on their work. These capabilities may soften criticisms of blogs as digital history, and other advantages should be noted.

Unexpected influences and ideas for work, alternative forms and content, and a wider public conversation are particularly noted by Ralph Luker.[2] In fact, the range of blog topics, lengths, and methodologies alone is of great formal and informal use to scholars. Posts like Webre’s “Thin is In: Rethinking 40 Years of Intellectual History in ‘Age of Fracture,’” and Frank’s “A Christmas Abortion” illustrate how good writing, on a blog or in a monograph, is valuable to the expanding discourse in historical scholarship.

But how do we get scholars resistant to technological change to consider these advantages to blogging? Is it by blogging about blogging, preaching to the choir of already-convinced internet users? Is it by publishing about blogging, immortalizing words and ideas into a monograph that is soon replaced by more recent scholarship both on and offline? After fifteen years of blogging, why do we still need to validate digital history practice? I certainly do not have the answers, but these questions point to the considerable obstacles digital historians face quite regularly.

Lauren Ericson

[1] Alex Cummings and Jonathan Jarrett, “Only Typing? Informal Writing, Blogs, and the Academy,” in Writing History in the Digital Age (2013)

[2] Ralph Luker, “Were There Blog Enough and Time,” Perspectives on History (2005)



Despite any paranoia regarding the rapid and pervasive shift in technology and digital media, I see benefits to embracing anything that promotes cross-discipline connections, and the digital world certainly makes interconnectivity an increasingly foreseeable frontier. I divide my conceptualization of interconnectivity into three major parts: connections between ideas in the same field, connections between disciplines, and connections between learning and experience.

As an embarrassing anecdote to introduce this concept, I will certainly admit that I am not always the brightest of people, and I did not realize how interconnected certain aspects of history were. To me, Europe and Asia existed on an entirely different timeline from American history. The Harlem Renaissance and Prohibition eras seemed entirely disconnected to me due to the way they were taught. In my own educational experience, we paid more attention to Prohibition in history classes, while the Harlem Renaissance was a focus in my American literature classes—they weren’t the same history. The Antebellum and Victorian eras were separate periods of time, because we learned about Europe and the Americas as separate histories. I learned about many different Native American cultures in isolated contexts. I never made connections between them, because no one impressed those connections upon me.

This interconnectivity also applies to connections between differing disciplines, which is the most obvious issue many wish to combat. In a quick, but non-representative, survey, my friends who hold bachelor’s degrees in the beloved STEM fields had taken few (if any) classes that connected history or philosophy to their fields. Sure, they took the standard core courses that taught the basics of the liberal arts, but nothing that really connected the things they wanted to learn with these seemingly superfluous courses. In a small way this is a bit surprising, because many foundational scientific principles coincide with major cultural or historical patterns. Admittedly, however, you can complete a liberal arts program with a similar lack of experience in the sciences, and this gap partially introduces us to the concept of studying digital history. Liberal arts scholars often avoid learning skills related to the sciences, including computer science and technological skills. Burton discusses this as a “chicken and the egg conundrum”: “(h)istorians will not develop digital technology skills because there is no field of digital history to make those skills valuable,” and vice versa. As Cohen notes, the field of history currently focuses on the value of the monograph. The primary incentive, as both authors point out, is to keep producing history in the “traditional” way–through dissertations, theses, journals, and books. Many students are discouraged from participating in less traditional methods of production, and are even pressed to believe that the traditions are the most beneficial way to build a resume and gain access to employment.

Another psychological location of disconnect is between what you study and what you experience. Interest decreases dramatically when you cannot answer the question, “Why does this matter to me?” This is almost the core question of public history, yet the field of history (along with other liberal arts disciplines) experiences a near-crippling aversion to digital media, which almost places us at odds with ourselves in today’s thickly mediated world. Cohen describes this aversion as a product of “widespread subtle biases that hinder the academy’s adoption of new media.” Cohen also notes that economists, physicists, and law scholars have participated in digital media production since almost the dawn of the possibility; they saw the benefits of this emerging world before it had fully matured. He also points out the history field’s reliance on the pinnacle “monograph” as the summation of one’s career. As technology moves forward, traditionalists fall into the background. Students ask “why should I be learning this?” more and more; history is seen as a field you study out of love, with no practical value. With the massively available and “free” internet, it seems foolish for any discipline to completely snub the digital world. We live our whole lives through the digital world these days, and anything outside of it seems increasingly irrelevant as newer generations become entrenched in digital experiences.

On a different note, Nate Silver’s site presents a brilliant marriage of disciplines, styles, and techniques. The FiveThirtyEight site is still going strong today, and it is not sitting on a shelf in the catacombs—I mean stacks—of a university library, like a traditional economics dissertation or book might. Even a comparable site, styled in a different fashion, has faded into the internet archives, while FiveThirtyEight is still widely discussed for its revolutionary approach. FiveThirtyEight is a widespread tool that, whether 100% accurate or not, still benefits a broad audience.

These are links to the two comparisons. You may also find a page from the Princeton site, competing with Silver’s tool and pointing out weaknesses of his method:

As commenter Tom Walters points out on Cohen’s article, perhaps Silver’s advantage lies in the fact that he had a stronger mastery of communicative skills. The Princeton page is much more technical and statistical in its writing style, while Silver’s pages have more of a narrative structure. In my observation, another noticeable difference is that Silver’s design is smoother and more visually appealing, exposing a propensity and keen eye for the visual arts as well as writing. Walters’ comment reveals another capability of coming into the digital era, one pointed out by both of these authors: we can actually have a dialogue regarding our information and make a platform for ourselves, defending why our chosen field is important and relevant–with these skills we can demonstrate aptitude and relevance. We also keep up with the “new world.” In Cohen’s case, he could use the digital platform as a sounding board for producing his physical book.

And once again I ramble too much.
-Kayla R. Wirtz

Narratives or data: What are the goals of Digital History?

When discussing the development of digital history as a medium for delivering academic information to a wider audience, it is necessary to reflect on the details of how history has traditionally been presented. Between questioning whether or not the use of narrative is methodologically and theoretically sound, and whether history is considered a subfield of the humanities or the social sciences, we need to find time to reflect on how academics define history in the first place. The utilization of digital media as an avenue for the dissemination of history has clearly experienced change over the past couple of decades. It may be evident to historians that digital media is a useful tool for making academic information more accessible, but the inward-focused, self-reflexive nature of these readings is a necessary precursor to interacting with a public audience.

Questioning whether or not the use of narrative in the historical record is a viable method reminded me of one of the most influential theorists in the field of anthropology. White’s question of whether historical narrative is appropriate was remarkably similar to Clifford Geertz’s question of what was more appropriate for ethnographic analysis and anthropology – description or narrative. Published roughly a decade apart, both of these works examine whether or not storytelling is an appropriate method of documenting information. It seems worth noting that this parallel exists when another question being asked is whether history fits into either the humanities or social sciences. Rosenzweig claims that historians are used to working by themselves in isolation. Comparatively, social scientists are in a position to question every decision they make within academia as their work inherently affects the public. Before we can answer whether or not narrative is an appropriate methodological approach, shouldn’t we be questioning how history can be informed by both academic divisions?

Regardless of whether we think that information should be made digitally accessible, we must consider ethical ramifications. As Rosenzweig states, one potential problem is that it widens the gap between populations that have access to digital media and those who do not. Also, if we rely on story-telling or narrative as a method of transmission, whose voice is used to tell the story? Are we tiptoeing into territory where information may belong to a particular social class even though we have taken it out of academia? Are we appropriating a population’s history because we think it should be made available? How much input and feedback should we receive prior to making knowledge available? Can historic knowledge be used as propaganda to further an agenda? Are we leaving out pertinent information based on what we, as academics, think is important?

One of the critiques of FiveThirtyEight was that, because information travels so much more quickly, instant gratification compels people to want an immediate answer to their questions. However we decide to make knowledge available, whether through unbiased raw data or compelling narrative, accessible information should not only serve to answer an existing question. Rather, public access to knowledge should incite more questions among its audience, similar to the questions being asked within academia.

Digital History: The Non-Boring Past

As a public historian, I find access important for the narrative. The examples Burton gives in “American Digital History,” such as History Matters, are small examples of this access, but along with those I am thinking of sites like the Oral History Association’s information on vital issues in oral history (another controversial field in history, so perhaps not a great example), along with “how-tos.” Access like this is important for students who may not have programs like GSU’s or Columbia’s oral history programs. In fact, in undergrad I used this access to interview Janice Blumberg for a paper.

History has most often been seen in the world of academia, a place that does not feel welcoming to many people and at times, as mentioned in the articles, is written off as boring and closed off to most. It is known for intense, long, and often complicated prose. For me, though, history lies in how historians use what is written off as boring, be it photos, the census, art, plays, or even academic prose. The creation of the internet and the use of digital history merge all of these sources in new and fun ways to discuss and analyze the past. New tools such as blogs and podcasts allow for new and different stories to be told, such as the underworlds of cultures that fit into a small niche of lives. Blogs allow history to spread beyond the typical history world.

Digital history is an important form of history for these reasons. Not only does it open up history to the general public, it allows people to interact with history-making, such as the work the USHMM opens up to people searching for children in concentration camps. The essay on new media mentions the combination of multimedia and scholarly research in Ethington’s publication on Los Angeles, an important use of digital media. In this new version of history, historians are able to combine past historical research and new digital research to make new and perhaps more meaningful contributions to the history world. Given that digital history is becoming so important, historians must be rewarded for digital work the same as for traditional work. Digital history as a form of scholarly work also allows historians to use information in new and innovative ways beyond the common narrative form. Narrative form, as White says, is part of history and always will be, because it provides information in episodic form; it is easy to understand and is a well-known and common form of communication seen everywhere.
Digital history is innovative, just as oral history once was. As mentioned in the articles, however, digital history must have a way to be analyzed by peers, similar to print work. Peer review seems like an easy condition, but until it is achieved, and likely even after, digital history will be seen as problematic.

History as Storytelling, and the Reluctance to Share Authority

History, especially the human kind, means different things to different people, and no two people experience a moment in the same way. Human history, and by extension the narrative mode of communication, cannot be scientific because we cannot successfully subject it to the scientific method: there is no hypothesis to test, little is ever really a “fact,” and true objectivity, while a noble goal, is not actually attainable. But why must history be a science? Is it necessary to measure humanity, and, if so, what are we hoping to deduce?

While I don’t subscribe to the idea that “those who forget history are doomed to repeat it,” I do feel that the study of history is important. For me personally, the past is a treasure trove of stories of humanity; tales of cruelty and oppression often run parallel with those of incredible will, triumph, and kindness. Knowing what humanity has been—and where we’ve been—brings me a sense of belonging in the greater scheme of life; if ever I feel sorry for myself, it helps me to look for my place in history and thus be reminded of how small I really am. But also, my historical storytelling makes me somewhat popular at parties.

Even if children rank history as their least favorite school subject,[1] people do love stories, sometimes even more so if you tell them that it’s a “true story.” Perhaps, beyond the restrictive and detached high school teaching method of memorization that Burton discusses,[2] people just need to feel a connection to something bigger than themselves—they need to feel a personal attachment. While scholars find their Zen in ever dissecting the canon and reshaping historiography with their new interpretations and theories, some people just want to feel something in the history of humanity; they want to learn. I am hard-pressed to see how that is wrong or any less worthy a pursuit.

Dan Cohen says that “good is good, no matter the venue of publication or what the crowd thinks,”[3] and I wholeheartedly agree. Technology is not just a new way to crunch data, and enhancing the study of history through technology doesn’t have to involve destroying the system altogether. Technology is merely one tool of education and dissemination but, as Burton says, academic “resistance has less to do with the tools of the web and more to do with the web’s culture.”[4] Why are scholars so afraid to let the public in to interpret and narrate their own stories—stories that we all have a right to? Are they afraid that the masses will get it wrong? Certainly, the Ivory Tower is not beyond reproach in that regard.

Members of academia can continue to scoff at digital history, but they cannot stop the public from embracing technology and they cannot stop irresponsible history from being disseminated. Far from showing a lack of interest in history, the public is consuming it now more than ever. We absorb historical portrayals through film, TV, video games, and even social media. And the public is questioning the practices of historiography when they ask, for instance, why so many textbooks only tell the tales of dead white men. If the public wants to know, and academia doesn’t have the tools to answer in the public forum, then where will the information come from? When history goes “wrong” on the Internet, are we to argue with armchair historians on the Web, or are we to instead help guide the public in critical historical methodology? Rather than dismiss blogs, podcasts, or other civilian history-based websites as irrelevant or unvetted, why not get involved with the conversation in a helpful and meaningful way?

The only danger technology poses to the field of history is that it may force new thinking, approaches, and practices as they relate to sharing authority. Technology, as it advances and becomes more commonplace, makes itself more affordable and available. Educators need not be highly skilled developers to lead the way in adopting new forms of storytelling, but they do need to be engaged in the process. If video game developers research history in addition to programming—for instance in the work of Ubisoft’s Assassin’s Creed series—what excuse do historians have for not learning some basic programming skills to share history themselves? While academia makes excuses as to why digital media and technology are unfeasible, history marches on.

– Laurel Wilson

[1] Orville Vernon Burton, “American Digital History,” Social Science Computer Review 23 (2005), reprinted online Center for History and New Media

[2] Ibid.

[3] Dan Cohen, “The Ivory Tower and the Open Web: Introduction: Burritos, Browsers, and Books,” July 26, 2011 Dan Cohen online

[4] Burton, “American Digital History”

Digital History’s Long Road


“You have died of dysentery.” Does this sound familiar? It does if you’ve played Oregon Trail (and hopefully, that is the only reason). Though it may not be the very first, it is certainly one of the earliest examples of Orville Vernon Burton’s definition of digital history: “the process by which historians are able to use computers to do history in ways impossible without the computer.” It is also very much an example of historians’ reluctance to embrace non-traditional, digital approaches to their field.

Introduced in 1971, this simple game, which mixes text and graphics, was created by Don Rawitsch, a student teacher in Minnesota who wanted to develop new ways to teach American history. Anyone who is curious as to how it works can play it for free online.

Oregon Trail was a pioneer of educational technology (no pun intended) and by many measures an enormous success. Not long after the Minnesota Educational Computing Consortium hired Rawitsch in 1974, the game became a staple of classrooms. As of 2011, Oregon Trail has sold more than 65 million copies, and it is available on a wide variety of platforms (cell phones, Facebook, Wii).

Despite its widespread use as a tool to teach history, Oregon Trail did not start much, if any, dialogue among historians as to what new technologies could add to the field, nor did it radically change the way history was taught. So, where was the revolution?

Part of the issue may lie in the game’s own representation of American history. Mostly, Oregon Trail inculcates the values of self-sufficiency and rugged individualism. What is more, the indigenous peoples of the continent do not always play a prominent role in the story. In this way, it does little as a means of inquiry, and thus does not serve the purpose of real scholarship. In all fairness, Rawitsch’s goal was to spark middle school students’ interest in American history, and he succeeded in this goal. His modest goal could be the very reason that this groundbreaking approach to teaching history did not cause much of a stir in the ivory tower.

But does it have to be this way? Dan Cohen pointed out that other fields— the sciences, law, economics— have actively absorbed digital technology into their evolving practices. Why, then, are historians so slow to embrace new forms of technology, and when they do embrace it, why do they tend to ignore its potential for new modes of inquiry and representation? Much of it, I believe, comes from the democratic, or more accurately, non-historian-centric nature of digital production. Anyone can write a blog. Few historians know how to write code. This situation stirs the terrible fear that historical scholarship will degenerate into what Dan Cohen’s mentor, Frank Turner, described as “playing tennis with the net down.”

Historians’ fear of a “netless tennis game” does not stem from pure snobbery. There are a lot of lousy interpretations of history about which many amateur, and even professional, historians feel deeply adamant. Holocaust deniers are an obvious example, as are the white-washers of Southern history.

I am, therefore, very interested in Dan Cohen’s assertion that “good is good.” It is not a point he elaborates in this class’s reading, but he does dedicate an entire chapter to the idea in the book for which this reading is the draft introduction. So I cannot comment on it fully at the moment, but I do think that digital history can move forward, even with the current biases against it.

It is unfortunate that digital resources are largely viewed as aggregators, disseminators of “information” rather than “knowledge.” But how is aggregation of information not helpful to history? Primary sources are often aggregates of information. The thesis of Devil in the Shape of a Woman did not tumble out of the court records that formed its core research; Carol Karlsen put a lot of time and thought into doing something original with them. It is thus not surprising that primary source collections, such as Ayers and Thomas’ “Valley of the Shadow”, comprise some of the most popular forms of digital history.

What’s more, the divide between formally trained producers of history and producers of digital content grows smaller every day. It is much easier to start a blog or build a website today than it has ever been. Also, the difference between “knowledge” and “information” should not be viewed uncritically.

So I am going to advance my own idea for a potential project. How about an aggregator of scholarly texts in the style of Rotten Tomatoes? Imagine, for example, how a Ph.D. candidate studying for comprehensive exams could benefit from this. Or imagine how an experienced scholar could use such a tool to assess and question the assumptions of his or her chosen field. 

Digital history has a long road ahead, but it is moving forward. Hopefully, no one else will get dysentery.

–Will Greer

The Old meets the New

The two pieces that spoke rather well to one another were “American Digital History” and “The Ivory Tower and the Open Web.” What was especially interesting and important was that the projects mentioned in “The Ivory Tower” all started small, as the ideas of an individual, and from there quickly developed a following. This might be a good moment to remind oneself that for the two or three successful websites or blogs we talk about, there are hundreds that failed. Nevertheless, there is also the message that one has to dare and try. The tools are all available to develop a successful online presence. At the same time, what was rather reminiscent of last week’s readings, especially the division between books and e-books, was the reluctance of scholars to embrace these new technologies. The statement about how liberal professors are with regard to politics and economics, but how much they lack ingenuity in technical affairs, rang true. My dissertation advisor still handwrote his first drafts of book manuscripts, and I had a series of professors and colleagues who still use flip phones and have no internet at home. I honestly lean toward this group, since technology has let me down too often. At the same time, those who are willing to embrace the difficult world of digital humanities should get their reward from tenure committees just like those who publish excellent monographs.

Orville Burton shows rather nicely how much the humanities have embraced technological opportunities. Considering Burton’s age, one can clearly see some of his own growth as a historian through the article. The use of new computer technology was a hallmark of the new social history’s quantitative analysis. Here is a good example of how useful technology has become: only about a decade ago, historians tended to find job postings once a week on the website of the American Historical Association or in its monthly publications. Today all jobs are posted on H-Net and available immediately. Nevertheless, some groups, such as H-Diplo, have done much to utilize the opportunities provided by H-Net to their fullest potential. Others, like H-South, barely engage in posting news updates and do not even review important books.

The list Burton provides of digital history projects was great, since it offered a glance at the various possibilities. Impressive, and potentially also depressing, is that there were “only” 800 websites related to U.S. history. Many of these websites provide great tools for the classroom. I have not yet done much in this regard, but especially in my research seminar this semester, I have already pointed students to a number of websites to locate primary sources. For undergraduates who cannot travel to locate their sources, or who work with a small campus library, these resources are essential.

The statement that caught my attention in Burton’s article was that history has a low standing among students in high school, and this probably still holds true in college. I think that is where Hayden White comes in. As a historian primarily interested in historiography, White traces the uses of narrative since the early nineteenth century. He concludes that it is important to tell the human past, which is a set of events, in narrative fashion. He deems it the best way to “imagine” the past and communicate it to people. This is where I feel Burton’s statement and White intersect. Teachers, and I purposefully use that word in contrast to historians, have increasingly abandoned the good lecture-based narrative in favor of half-witted PowerPoint presentations that have neither narrative nor story.

Narrative: The White Flag in the Digital History Debate

The uneasy relationship between historians and “technology” is one many of us are quite familiar with. Academic practice, which has been more or less consistent for decades now, is complicated significantly by the ability of historians or, as some would have it, self-proclaimed historians, to publish online. However, if the past fifteen years have taught us anything, it is that technology will continue to be a major part of our personal and professional lives. Let us use it for good, no?

We certainly do not need to abandon the strongholds of academic historical discourse. In fact, our use of narrative as a primary, valuable methodology is well-suited to the technological medium of online book publication, blogging, and (live) Tweeting. Indeed, our use of narrative is extremely important: “There is a certain necessity in the relationship between the narrative conceived as a symbolic or symbolizing discursive structure, and the representation of specifically historic events.”[1] Given the significance of narrative to historical practice, perhaps narrative, as a discourse and technique, may aid our uneasy transition to accepting digital historical scholarship.

In part, I mean this as an alternative to the truly daunting task of learning the bells and whistles more suited to the technological, rather than historical, side of things. As Orville Vernon Burton wrote, most “historians lack the digital creation and programming skills necessary to make their historical scholarship truly digital historical scholarship.”[2] Although I agree that there is a skills-related disconnect between technology and history professionals, I do not believe this discredits the historian’s work on digital platforms. A complicated digital format is not necessary for most internet users, let alone individuals who hope to use the internet for a (traditionally) separate purpose. Even without programming skills, historians can use digital formats to create scholarship valuable to the public through the use of narrative.

Why narrative? Well, all of the plug-ins in the world do not change the content of a digital site. As such, digital historians should embrace narrative formats. According to Hayden White, narrative provides information and meaning,[3] as it offers both episodic and configurational dimensions.[4] More importantly, for our purposes, it is already an accepted historiographic discourse!

So, while digital media bring up issues of information versus knowledge and entertainment versus education,[5] we can reflect on merging traditional and forward-thinking historiographical practice in a non-controversial way. Marrying technology and narrative is “perfectly in line with the fundamental academic goals of research, sharing knowledge and meritocracy.”[6] Making our research-backed stories available to a new public is certainly a move in the right direction for historians. In a sense, narrative is an important tool both on and offline, and could certainly be the white flag waving between the reluctant historians on one side and the digital age historians on the other.

However, transferring narrative from one medium to the other does not by itself strengthen our arguments, nor does it mean that every argument is appropriate for digital forums. There are many, many examples of successful digital history projects that use not narrative but other methods. Taken to the next level, technology can offer historians an immersive experience, or a hypermediated one. This would be doing “history in ways impossible without the computer,” or true digital history, one might say.[7]

– Lauren Ericson

[1] Hayden White, “The Question of Narrative in Contemporary Historical Theory,” History and Theory 23, no. 1 (1984), 30.

[2] Orville Vernon Burton, “American Digital History,” Social Science Computer Review 23, no. 2 (2005).

[3] White, “The Question of Narrative in Contemporary Historical Theory,” 19.

[4] White, “The Question of Narrative in Contemporary Historical Theory,” 27.

[5] Dan Cohen, “The Ivory Tower and the Open Web: Introduction: Burritos, Browsers, and Books [Draft],” Dan Cohen (blog), July 26, 2011.

[6] Cohen, “The Ivory Tower and the Open Web.”

[7] Burton, “American Digital History.”

Week Two: History, Narrative, and New Media



This week, we begin to consider how history has evolved as a discipline, both in terms of earlier debates about the very nature of history and its methodology and more recent challenges from digital technologies and new platforms for disseminating knowledge about the past. Were the proponents of narrative history right about the fundamental necessity for a diachronic, semi-literary approach to interpreting the past – or were more radical critics correct to suggest that narrative is just one of many approaches (and perhaps far from the best)? As we consider new platforms such as blogs, podcasts, and digital archives, how do we think about storytelling? Can history move beyond narrative and still inform audiences and publics about the past?

Are there other ways of telling about the past? What are some landmarks in terms of the embrace of digital media by historians – what has worked and what hasn’t?

Where do we fit talented upstarts like Nate Silver in our understanding of scholarship and new media? Can we think of parallels that are more specifically historical?

Allowing for Many Readings and Many Writers

The growth of media from paper to the varied sources of our current media world changes the way people interact with media. That change happened gradually, and then, all of a sudden, it accelerated. The shift from paper books to e-books is an influential example. It has allowed more people to self-publish, but it has also lowered the amount of money that some authors and publishers can make. Because e-books let a larger number of people publish, more people can express their creativity, which is a positive change; yet it also floods the market with similar books, some of which claim facts that are not factual.

Although the article argues that the shift from books to e-books is different from the shift in music, the two could be seen as similar. When music started to thrive on the internet, more musicians became well known. A key example is Justin Bieber: he is like the E.L. James of music, found on YouTube and made famous after that. Still, the shift to e-books will not devastate the world of paper books, because so many people prefer hand-held books, as the article on paper books mentions. That article calls the book the perfect tool, easy to read and never failing. Until electronics such as Kindles can achieve all the qualities of books, many will still choose books over e-books. Personally, even though I work in social media and web design, I still prefer paper books.

I come back to music when I think of my friend Doria, an amazing musician with a lot to say. She fought to be released from her contract, knowing she could do more as a self-produced musician. Now her catalog, once all love songs, is intermixed with protest music. The new uses of media not only allow more people to be part of the writing world; they also allow writers (or musicians) who have more to say to say it.

The change in media goes beyond the sources and technology people use to view it; it also changes the discussions that surround media and the way media is used. While media has changed, one question has remained the same in media studies: does the meaning of media come from the media itself, or does the audience create it? One argument in media studies, suggested by Saussure, is that the relationship between the signifier, or “sound image,” and the signified, or “concept,” is arbitrary. Semiology, popularized in the 1960s, concerns how meaning is generated in “texts,” as Saussure explained, allowing us to view cultures, and therefore pieces of cultures, as texts. In semiology, media is polysemic, meaning that it holds multiple interpretations and is open to various readings.

The arguments of semiology answer the question of meaning and media: through semiology, it becomes clear that there are many dimensions to the viewing of media. While reading this, I was watching WALL-E, a children’s film that not only shows off new technology but is also signified as a warning about issues currently happening in America. This is a prime example of polysemic meaning in media. Media at times seems to forget the intelligence of its audience; for example, how can two different news outlets tell two different stories using the same “facts”? That is a question I will never be able to answer, but one likely addressed through polysemic readings.