Digital Humanities
New tools for scholarship, new modes of communication, new forms of organization, same old humanity.
Gleick on Information
- This topic has 9 replies, 10 voices, and was last updated 6 years, 7 months ago by Michael Griffin.
September 14, 2017 at 11:49 am · #1069 · Paul Schacht (Participant)
Now that you’ve read Chapter 7 of Gleick’s The Information, write something in response to each of the following questions:
- How did the chapter change your understanding of what a “computer” is? How had you thought about this word before? How would you define it now? Why the change?
- How did the chapter change your understanding of what “information” is? How had you thought about this word before? How would you define it now? Why the change? (Consider this: a laptop with 128 GB of internal storage can store 1,024,000,000,000 bits of information — irrespective of whether the information “means” anything.)
If, by the time you get around to posting, you don’t have anything to add to other people’s answers, tell us about something else you see differently as a result of reading this chapter. (You might see quite a few things differently if you take a moment to learn more about who Charles Babbage and Ada Lovelace were.)
- This topic was modified 6 years, 7 months ago by Paul Schacht.
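The storage figure in the prompt can be checked with a line of arithmetic, assuming the decimal definition of a gigabyte (1 GB = 10^9 bytes) and 8 bits per byte:

```python
# Verify the prompt's claim: 128 GB of storage = 1,024,000,000,000 bits.
# Uses the decimal gigabyte (10^9 bytes); a binary gibibyte would give a
# slightly larger number.
gigabytes = 128
bits = gigabytes * 10**9 * 8
print(bits)  # 1024000000000
```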
September 14, 2017 at 5:02 pm · #1073 · Phoebe Siegel (Participant)
- Before reading the chapter, I would have thought of a computer as a set of formulas and long scripts that create a page for a reader. This is still partly true, but after reading the chapter I was surprised by how mathematically based the creation of the computer was. Seeing how Turing went about developing the computer, it seems as if it started almost as a math problem, and that he arrived at the idea of the computer almost accidentally through the formulas he created. I found it interesting that he did not have the invention of a computer in mind when he was working on the project.
- I found that my idea of information changed slightly after reading the chapter because of my new understanding of codebreaking and the algorithms and studies that surround it. I found it very interesting to read that a hundred letters chosen at random could contain more information than a thousand carefully chosen letters. This definitely complicated my view of information: I would have thought that the more carefully chosen the letters are, the more you would get out of them, but that is not necessarily true. I also found it interesting to learn how often an algorithm must be changed in order to keep information private, and I began to realize how many algorithms and formulas go into producing the information we can get from computers today.
September 14, 2017 at 6:48 pm · #1074 · Zachary Veith (Participant)
- Unlike, say, a typewriter, I never really thought of a computer as a physical, mechanical machine. For me, computers were always just boxes containing a virtual realm outside the physical world. But reading Gleick's descriptions of Turing's and Shannon's early "computers," built to perform mechanical tasks, changed that. I can view computers now as machines being fed information to perform numerous functions. Typing this sentence involves pushing buttons in a certain order, which sends a signal to a receiver, again and again, until a letter appears in my word document. I can now see the mechanical side of computers that I did not see before.
- As confusing as much of it was, Shannon's theories on the predictability of information were interesting. I never thought about how predictable information, especially language, can be. I found Shannon's experiment with his wife and the novel (detailed on page 12) really insightful about language. Before, the spelling of words was, to me, simply "just the way it was." But now I'm noticing all the patterns of the English language, even in the words I'm writing now. I can see how, in a way, this repetition makes the transmission of written information easier and places less importance on each individual letter.
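The guessing game Zachary mentions can be sketched in a few lines. This is a toy version, not Shannon's actual procedure: it learns which letter most often follows each letter in a sample sentence (an arbitrary stand-in, not Gleick's text) and uses that to "guess" what comes next:

```python
from collections import Counter, defaultdict

# Toy version of Shannon's prediction experiment: tally which letter
# most often follows each letter in a sample, then guess the successor.
sample = "the quick brown fox jumps over the lazy dog and the cat"

follows = defaultdict(Counter)
for a, b in zip(sample, sample[1:]):
    follows[a][b] += 1

def guess_next(letter):
    """Return the most common successor of `letter` in the sample."""
    return follows[letter].most_common(1)[0][0]

print(guess_next("t"))  # 'h' -- "th" dominates, just as in real English
```

The better such guesses get, the more redundant (and the less informative, in Shannon's sense) each individual letter is.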
September 14, 2017 at 8:31 pm · #1075 · Holly Gilbert (Participant)
- Like Phoebe and Zach, I never really considered the mathematical basis for the creation of a computer. Much of the time, I don't even think of the term "computer" as "something that computes," despite the fact that human workers and simpler machines have always been computers; I just associate the word with the mysterious machine I use every day. The chapter made the development of today's technology feel much less magical: clearly, years of thought, logical development, and groundbreaking theories all built this possibility. Turing and Shannon may not have conceived the idea of a modern computer, but they certainly laid the tracks.
- The association of information with uncertainty, as the author warns, seemed counterintuitive at first. The idea that information isn't increased, according to Shannon, simply by adding more and more pieces (I almost wanted to write "information" and had to find a substitute, which furthers the point) is almost the opposite of how I'm accustomed to thinking of it. Yet the idea that we cannot glean as much from redundancies helps shift this: information is what better helps us understand, or, in Shannon's case, better helps with cryptography. This does not mean that extra material necessarily becomes information once we remove the everyday meaning.
September 15, 2017 at 8:23 am · #1076 · Justin Anderson (Participant)
- Until recently, I had always thought of computers as mailboxes that send away requests for data and then receive the results. Gleick's chapter and my own personal studies of computers have shown me the sheer amount of processing that goes on within my devices. Additionally, I don't always think of my phone as a computer. My current description of a computer would be a processor with a visual interface that can read and carry out commands and display a visual representation, or map, of the requested data.
- Before reading this chapter, I had never thought of information in terms of entropy. "Information" to me has always been a rigid set of facts and knowns, usually with a high level of organization. The perceived disorganization of data is deceiving to the average viewer of information. I still can't fully understand how ten lines of seemingly random text could somehow contain more information than one hundred lines of ordinary text.
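The random-text puzzle Justin raises can be made concrete with a small sketch. This is an illustrative toy, not anything from the chapter: it estimates Shannon entropy in bits per character from letter frequencies, and shows that a repetitive string carries far less information per character than a varied one:

```python
import math
from collections import Counter

def entropy_per_char(text):
    """Shannon entropy in bits per character, from letter frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

repetitive = "aaaaabaaaaab" * 15   # highly predictable: mostly 'a'
varied = "qxzvjkwpmbtfghdnrlu" * 10  # 19 distinct letters, equally likely

print(entropy_per_char(repetitive))  # low: ~0.65 bits/char
print(entropy_per_char(varied))      # high: ~4.25 bits/char
```

In Shannon's sense, the unpredictable string needs more bits per character to describe, which is why random-looking text can hold more information than a much longer but highly patterned text.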
September 15, 2017 at 10:09 am · #1077 · Allison Maier (Participant)
- Like Zach, I had always thought of a computer as something that existed outside the physical world. The "language" a computer used was so foreign that trying to understand how any of it worked was overwhelming. For example, all of the commands we have been using in class, like "cd" or "chmod," were confusing because they did not make sense to me as words. However, after reading this chapter it makes a bit (haha) more sense to me, because I can see how a computer uses a specific code to operate, with patterns and a key. Shannon's model for communication, in particular, made the process seem more straightforward.
- This chapter definitely presented me with a new way to look at information, and in particular at how a language can communicate that information. Before reading this chapter, information had never seemed so mathematical. The probability of certain patterns in a language, for example, is something I had obviously been internalizing but never really consciously thought about. It struck me how prevalent this concept is, even if we don't realize it. For example, the test Shannon ran on his wife has pretty much the same premise as Wheel of Fortune. While Shannon's concepts of information, uncertainty, and probability are heavy, it becomes clearer as the chapter goes on that we already deal with these concepts frequently in our everyday lives; we usually just don't recognize them.
September 15, 2017 at 11:00 am · #1078 · Tyler Waldriff (Participant)
- Prior to reading the chapter, I always saw my computer as a machine of many different parts that functions as a base for other programs to interact with, creating a near-infinite number of ways in which the base computer can be modified to provide a vast amount of content to the user, whether for information or entertainment. I now see a computer as a sort of giant, complex calculator that can read a nigh-infinite number of symbols of the kind Turing designed. This is because of the complex language that, I've learned, went into creating the modern computer interface. An input is entered into the computer in a language the computer understands, then interpreted and translated back into a format that can be easily understood by its users. Pretty fascinating.
- This chapter changed my understanding of information in that, rather than seeing it as just the finished product, I now recognize the entire process that goes into gathering and relaying a single piece of information. For example, a calculator gives you the information "4" when you enter "2+2," but the information is not just the "4" at the end; it's the entire process that occurs in the inner workings of the device, which runs through many codes and algorithms to produce that result.
September 15, 2017 at 11:20 am · #1079 · Isaac Sanabria (Participant)
- Before reading this chapter, I thought of a computer as a machine that did things, and I had no idea how it got to that point. I never really tried to think about how computers worked behind the screen because it just seemed confusing. I was right, I guess, because this chapter did confuse me a bit. Still, it changed my understanding of computers. I always thought of the word "computer" as meaning a machine, because, like most machines, there is so much going on behind the scenes in order for everything to work well. I now understand how much has to happen inside the machine to produce what appears on the screen. The main change in my thinking about computers is that I now know how much math is behind them. I couldn't have imagined that, but it does make sense after reading about it and thinking a little.
- After reading this chapter, I see information differently as well. I see information for what it is behind its appearance. There is much more going on than what we see, just like with any machine ever created. A car, for example, needs gas to run, but a bunch of other little components also go into making it run smoothly. My change of view is simply because the chapter was well written and I was able to understand how computers work and what goes on under the keyboard.
September 15, 2017 at 1:34 pm · #1080 · Jack Snyder (Participant)
- Before this reading, I'd thought of computers in a more physical sense: parts and software that read, write, or execute code on a hard drive using those other parts. I don't know exactly how I would change my definition, but I can definitely say that Alan Turing breaking computers down into such a simple machine in a thought experiment was very interesting. That is in essence what a mathematical proof is, so it's an interesting origin for computer logic, and it adds to my understanding of how computers work at their core.
- Before this reading, I had thought of information as a fact or statement that is certain, like data. Shannon's definition of information in the reading was interesting because it described information as the potential for learning new things, associating it with uncertainty in messages and with entropy. This makes sense to me in the context of Turing's computer model, aligning with how inputs write onto a blank slate: the blank spots contain the potential for information based on the input of the original message, so the unbounded empty space represents the potential for computations and the potential for information. This perspective on data can be applied to the 128 GB of space on a computer, whose 1,024,000,000,000 bits can hold an astronomical number of possible pieces of code or information. I think my definition of information is fine for practical and tangible data, but Shannon's definition is more useful for learning new things or considering intangible possibilities for data. Each definition has its own use case.
September 15, 2017 at 2:09 pm · #1081 · Michael Griffin (Participant)
- I learned that a computer is much more complicated than it seems. When I was younger, I built my own computer with a motherboard, CPU, and all that jazz. That alone was difficult, but in hindsight, the physical assembly of the system was much easier to understand than the internal workings. Before, I had understood a computer as a means of transferring information from one person to another, and while that's true, I now realize it's much more than that: all of its functions relate to math and numbers as a means of transferring information.
- I used to see information as just another set of data that informs people about facts and history from the past, but now I see it as an effective way of conveying to someone the sequence of how things work and the role information plays in our everyday lives.