Thursday, September 27, 2012

New Form of Warfare

So the article as a whole isn't what I'm concerned about, because I'm not too into politics, but just read the first few lines.


BEIRUT (AP) — Syrian authorities on Thursday sent text messages over cell phones nationwide with a message for rebels fighting President Bashar Assad's regime: "Game over."
The messages signed by the Syrian Arab Army also urged the rebels to surrender their weapons and warned the countdown to evict foreign fighters has begun. The texts appear to be part of the regime's psychological battle against the rebels, but are highly unlikely to have any effect on fighters intent on toppling Assad.
Syrians say they began receiving the messages a day after rebels bombed a military command center in Damascus — a major security breach of the heavily guarded capital that highlighted the regime's growing vulnerability in the face of a rebellion growing in confidence and capabilities.
People with cellular subscriptions received the messages while those with prepaid phones did not, residents in the Syrian capital said. (The whole article can be found here.)

The fact that we are using cell phones as a form of psychological, guerrilla warfare is so fascinating. Pretty soon, war will no longer entail guns and bombs; it will just be passive-aggressive text messages from one side to the other.

Wednesday, September 26, 2012

Response to "Born Digital"

The description of the generation gap between digital natives, digital immigrants, and digital settlers couldn't have been more true. I spent the whole time reading this article laughing my butt off. I would like to say that I have a fairly decent understanding of how digital media works. My mom, on the other hand, couldn't be further from it. When she got her first smartphone, she affectionately called it her "genius phone" because of its capabilities. And one year, her New Year's resolution was to create a Facebook page. In May she finally figured out how to get onto the website that is so difficultly named "www.facebook.com." For the next six months, I had to teach and reteach her her password, where the status bar was, and that someone's wall was different from a personal message.

My grandpa, on the other hand, learned how to navigate Facebook too quickly. He has over 2,000 Facebook friends, only knowing about 5% of them personally. For example, I have a cousin (on the other side of my family, mind you) who lives in California. My cousin has a girlfriend whom my family has never met, yet Grandpa added her on Facebook the day they became "Facebook Official." ... Yup, he's that guy. He has no sense of boundaries when it comes to Facebook, either. Now that he's retired, he spends all day on Facebook and takes up 98% of most people's newsfeeds by sharing literally every picture he sees, commenting on them, and posting the exact same status every night (which reads, "Great day with My facebook friends, good Night and God Bless."). And yes, that horrible capitalization is a direct quote. When introducing myself to people around town, I constantly have to answer the question, "Are you related to Ken LeGreve... you know, the Facebook one?" And my response is always, "I wish I could say no." A vast majority of the people my grandpa adds end up blocking him (and I will admit, I've blocked my own bloodline as well), because when I tried to simply delete him, he re-added me within 10 minutes. I don't know how he noticed that one friend out of 2,000 had disappeared so fast, but it's just too much.

It is fair to say that different generations interact with digital media differently. And while some people say that the internet needs to have a minimum age of use, it's probably equally fair to say that there should be a maximum age of use as well ;)

Digital Media Analysis: Is Online Sharing Legal?


     Pinterest is a form of digital media (which I will defend throughout this paper) and a new, emerging, interactive social media outlet. The purpose of Pinterest is to give its users a format in which to find and share information. The intended use of Pinterest is for items, or “pins,” as they are called, to be shared or “repinned.” In fact, that’s how I came to know about the website. I was sharing interesting links on my Facebook profile when one of my friends introduced me to Pinterest, so I wouldn’t need Facebook as a middleman to relay the information I found on other sites. Pinterest functions by having users import information from an outside source (a newspaper, a cooking recipe website, etc.); once it’s on Pinterest, the link spreads virally throughout the website internally, which I’ll explain in more detail later. The main flaw with Pinterest is that while it is a sharing website, it is currently illegal to share content without consent from the owner of the link, which arguably makes the whole website illegal. Pinterest shows how current copyright laws are outdated with respect to new forms of social interaction.
This is what Pinterest looks like:


Pinterest is a lot like a virtual magazine and a lot like an online tack board. Imagine flipping through a magazine, seeing an article that interests you, cutting it out, and “pinning” it to your tack board to someday return to and find again. Pinterest is a website that hosts a variety of interesting objects, be it pictures, recipes, blog articles, clothing sold by online retailers, or anything else imaginable. When users find something interesting, they click on the item and then the “repin” button, and that “pins” the item onto a virtual “board.” A board can be categorized as vaguely or as descriptively as the pinner chooses, much like a magazine cutter could leave her articles scattered on her bedroom floor or organize them into a filing system.
Manovich’s definition of modularity is “[t]he fractal structure of new media… elements are assembled into larger-scale objects but continue to maintain their separate identities” (Manovich 30). That relates to Pinterest because a user can have a screen of multiple boards, and each board consists of numerous individual pins, much like several tiny squares fitting into one bigger box, and several boxes fitting into a larger box.
Now, to find items to pin, the user can scroll through a newsfeed of posts that have recently been “repinned” by people the user chooses to follow, much like the newsfeed of information on Facebook. Or, if the user is looking for something specific, she can click on an organizational button to limit her search. For example, let’s say a pinner is looking for wedding inspiration (a commonly pinned genre). She would click the “Weddings and Events” button, which would bring her to a newsfeed containing every pin on the website that has been pinned to a board labeled as a wedding board.
Pinterest can digitally “read” what genre a board falls under (because of the numeric coding associated with how a pinner “tags” a post) and then sorts all boards of the same kind together. The automation of this process also defines Pinterest as a digital media artifact, because the numeric coding of tags and the modular sorting of boards lead to a computerized, completely automatic sorting process (Manovich 32).
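The automated sorting described above can be sketched conceptually in a few lines of Python. This is my own toy illustration, not Pinterest’s actual code or data model: each board carries a genre tag, and a program groups all boards of the same kind together with no human sorting involved.

```python
from collections import defaultdict

# Hypothetical boards, each labeled with a genre tag
# (an illustration of automated sorting, not Pinterest's real data).
boards = [
    {"name": "Dream Wedding", "tag": "weddings-events"},
    {"name": "Pie Recipes", "tag": "food-drink"},
    {"name": "Rustic Receptions", "tag": "weddings-events"},
]

# Group every board under its tag -- fully automatic, in the spirit of
# Manovich's automation principle: no person sorts anything by hand.
by_genre = defaultdict(list)
for board in boards:
    by_genre[board["tag"]].append(board["name"])

print(by_genre["weddings-events"])  # ['Dream Wedding', 'Rustic Receptions']
```

Clicking the “Weddings and Events” button then amounts to asking for one key of this grouping, which is why every wedding-tagged board shows up in a single newsfeed.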
So, the items we pin can come from someone we know personally, if found in our newsfeed, or we can repin an item that originated from a complete stranger on the other side of the world. There are numerous ways to find the same link, too, because once a link has been repinned, it exists both on the user’s board and on the board from which the user found it. In essence, the link multiplies with every share. And with every share, the user has the ability to change the caption of the link, so the link now exists on the website under a different tag as well. Consequently, a link could now be found on several hundred thousand people’s newsfeeds, potentially under a different tag on each. Because of this, there is an effectively infinite number of ways to find a pin. This expresses Manovich’s idea of digital media variability: “A new media object is not something fixed once and for all, but something that can exist in different, potentially infinite ways” (Manovich 36). A pin multiplies every time it is repinned; it does not exist on Pinterest just once, but several hundred thousand times, in several hundred thousand individual locations, in potentially several hundred thousand formats.
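To make the “multiplying” idea concrete, here is a tiny Python sketch of repinning as I understand it (a hypothetical model of my own, not Pinterest’s real API): each repin copies a pin onto a new board, possibly with a new caption, while the underlying link stays the same.

```python
import copy

# A toy model of repinning: copy the pin onto a new board, optionally
# with a new caption; the underlying link never changes.
def repin(pin, new_board, new_caption=None):
    new_pin = copy.copy(pin)  # shallow copy is enough for flat fields
    new_pin["board"] = new_board
    if new_caption is not None:
        new_pin["caption"] = new_caption
    return new_pin

original = {"link": "http://example.com/pie-recipe",
            "board": "Desserts", "caption": "Best pie ever"}
shared = repin(original, "To Bake Someday", "Grandma would love this")

# The pin has multiplied: two pins, two captions, one underlying URL.
print(shared["link"] == original["link"])   # True
print(shared["caption"])                    # Grandma would love this
```

Every repin yields another copy with its own board and caption, which is why a single link can end up existing in hundreds of thousands of locations and formats at once.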
Pinning a pin would be pointless if the links were merely pictures that led nowhere. Pinterest is not an online cloud of saved .jpgs. The point of Pinterest is to link us to other sites. A picture of a piece of pie, when clicked, links you to the website that hosts the recipe for that pie. A picture of a blog headline brings you to the actual blog where the article was posted, for the user to read. A picture might link to a Tumblr or Flickr site. Or a clothing store could post a picture of a shirt, so that when it is clicked, the user is directed back to the store’s website, where she could find the shirt to buy. Anything from anywhere on the web can be added to Pinterest, categorized for someone to find and be interested in, and then linked back to the original host site. It’s almost like a Google search, except you don’t have to know what you’re looking for.
In fact, it’s almost like the memex that Bush describes in his article “As We May Think.” It’s a way to store tons of digital information in one place, categorized and sorted together, to be referenced when needed.
It’s as though Pinterest is a hyperreality (Baudrillard) of the internet, within the internet. The whole internet (theoretically) is stored within Pinterest, or at least could be, and the posts that people pin give an indication of how people want to live their lives. Pinterest’s demographic is upper-middle-class white females between the ages of 15 and 35. So, according to Pinterest, the important things in life are hair, makeup and clothing styles, foods to try, popular wedding and home décor trends, and pop culture. Commonly pinned things go viral, and unpopular topics fade off into the hidden corners of the internet, rarely to be seen or pinned again.
Now that we know how Pinterest functions, we can talk about the flaws of its system. The biggest issue with Pinterest has to do with copyrights.
Having this interconnectivity with every person who shares the same interests as us, and with every website on the internet, is both a blessing and a curse. While the point of Pinterest is to share information, and people post things onto Pinterest so they can be shared, copyright law says that Pinterest users technically aren’t allowed to share anything they haven’t been specifically granted permission to share. … So, unless you are able to track down the link’s origin (which you may or may not be able to do, because of the tagging system), and unless you are granted specific permission from the creator, pinning a post on Pinterest is illegal.
Sometime last year, a stipulation in Pinterest’s legal section read: “You acknowledge and agree that you are solely responsible for all Member Content that you make available through the Site, Application and Services. Accordingly, you represent and warrant that: (i) you either are the sole and exclusive owner of all Member Content that you make available through the Site, Application and Services or you have all rights, licenses, consents and releases that are necessary to grant to Cold Brew Labs the rights in such Member Content, as contemplated under these Terms; and (ii) neither the Member Content nor your posting, uploading, publication, submission or transmittal of the Member Content or Cold Brew Labs’ use of the Member Content (or any portion thereof) on, through or by means of the Site, Application and the Services will infringe, misappropriate or violate a third party’s patent, copyright, trademark, trade secret, moral rights or other proprietary or intellectual property rights, or rights of publicity or privacy, or result in the violation of any applicable law or regulation.”
A Pinterest-loving lawyer was one of the first people to find this clause within the website and after that, the questionable legality of Pinterest went viral.
The point of Pinterest is to find and share information, and for that information then to be found again. Information is meant to go viral. Companies and blog writers want their work to be shared as a form of free promotion. But granting each person (potentially millions of them) individual permission to share a site is simply impractical.
According to The Daily Dot, Pinterest workers have since told the Wall Street Journal that this is merely a case of the law lagging behind technology, since all major websites need a section on copyrights just to cover their asses in case someone tries to pass someone else’s website off as his own. Sharing, spreading, and repinning a link to a website, with credit to the rightful owner, is in fact perfectly fine, even if the owner doesn’t expressly grant permission for the link to be shared.
There are millions of sharing sites on the internet besides Pinterest. Sites like 9GAG, The Berry, and StumbleUpon are sharing sites. Anything with a URL can be linked and shared on virtually any website. The new wave of social interaction is no longer “Hey, come to my house and look at this link I found!” but rather, “Hey, look at this link I sent you!”
It all boils down to a question: should media on the internet be allowed to be shared and linked? After all, it wouldn’t be published on the internet if the maker didn’t want people to find it, right? And how could a company ever turn down free promotion for its website? Most importantly, if a person isn’t illegally calling someone else’s work his own, what is the issue with giving the website’s link to someone else?
The ideas of copyright, sharing, and ownership within digital media are all very interesting (albeit confusing), and Pinterest is a fine example of this. Pinterest, a new and digital medium, speaks of the new and ever-changing world we live in. And I bet copyright laws will soon change to match the viral, shareable virtual world we live in.


Works Cited
Bush, Vannevar. "As We May Think." The Atlantic, July 1945. Web. 26 Sep. 2012. <http://www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/>.
Baudrillard, Jean. "Simulacra and Simulations." Selected Writings. Stanford University Press, 1998: 166-184.
Kowalski, Kirsten. "Why I Tearfully Deleted My Pinterest Inspiration Boards." DDK Portraits, 24 Feb. 2012. Web. 25 Sep. 2012. <http://ddkportraits.com/2012/02/why-i-tearfully-deleted-my-pinterest-inspiration-boards/>.
Manovich, Lev. "What Is New Media?" The Language of New Media. 2002: 19-63.
Orsini, Lauren Rae. "Pinterest Addresses Copyright Concerns." The Daily Dot, 2012. Web. 25 Sep. 2012. <http://www.dailydot.com/business/pinterest-copyright-infringementlegality-statement/>.
"Copyright and Trademark." Pinterest, 6 Apr. 2012. Web. 25 Sep. 2012. <http://pinterest.com/about/copyright/>.

Wednesday, September 19, 2012

Response to The Medium is the Massage

I love typography. I love designing it, I love looking at it, and with this book, I learned that I love to read it. I like the way it clicks with my artsy brain. The inverted, repeated, upside-down, italicized words and the seemingly unrelated photos: my brain understands that. Sometimes I can read a picture better than I can read words.

Which is (not so) ironic, because one of McLuhan's points in The Medium is the Massage is how different mediums affect us differently. A book reads differently than a picture book, which reads differently than an audiobook, which reads differently than a PDF of a book. I've found that people become rushed and revert to "skimming" an article when reading online, which is why I always print my PDFs out and read them as if they were a book. I associate that with "scholarly learning time" and don't rush myself the way I would rush down my Facebook newsfeed.

I do think it's funny how he writes his ideas on modern technology (which is fast, expansive, and all-inclusive) in the form of a book, which is private, isolated, and individualized. (Yes, I realize the book was written in 1967.) I'm curious what he'd say about technology and the internet in today's world. What would he say about Facebook profiles? Talk about public! Or about how governments, schools, and newspapers have gone digital? The way information is relayed through those sources is completely different from 40, let alone 4, years ago. The internet changes every day, and the way we interact with it is changing every day as well. Companies just can't keep up. As McLuhan said, they attempt to do "today's job with yesterday's tools."

Monday, September 17, 2012

Response to "Simulacra and Simulation"

Okay, I'll be honest. My mind feels like mushy Jell-O right now, and I'm not sure of much of what Baudrillard said in his article. The Disneyland example, however, turned my previously liquid Jell-O into halfway-solid Jell-O: partly refrigerated, but not quite fully set yet.

Here's what I think I understand. (The only reason I'm typing it all out, rather than fully interpreting it yet, is that I'm hoping that by writing it out, it will make more sense in my mind.)
1. A hyperreality isn't a representation, mimic, or opposite of reality; it's more of a simulation of reality... but different from it, because it's not the same. So, like a fake real. (I'm imagining the Matrix movies with this.)
2. In a hyperreality, reality has been replaced with symbols and signs. Like if all of our stop signs were magically replaced with plain red octagons, we'd probably still stop at most busy intersections. (I also feel smart because I'm learning about symbols and signs in Culture and Communication right now, so I have a very remedial understanding of semiotics!)
3. There are phases that the symbols go through, which Baudrillard calls the "precession of simulacra," where within each phase the symbols become more and more "simulated." The order goes: faithful copy; perverted copy (we think it's a fake copy); a copy that pretends to be faithful but really has no original (the first hidden simulation); and finally, everything is completely simulated.

So, right now, I'm trying to compare The Matrix (which I read was heavily influenced by Baudrillard's thinking) to the example of Disneyland. I can see the Matrix example very clearly: the computer-program world is the simulated world that people have lived in their whole lives and believe is real, while the world that is actually real has become a cumbersome, complete wasteland. The real and the hyperreal are so different that they aren't even contrasts of each other; they are two completely separate worlds. People, however, don't realize that they aren't living in the "real" world. The hyperreal conceals that the real world is no longer the "real" world.

When I think of Disney, I think of how I know everything there is fake. Most of the buildings are merely cardboard cutouts, and those cutouts aren't even to scale. They're cut and lined up to make us think the world goes on forever, when in reality it's only a few acres. What makes this a hyperreality, however, is that Disney survives on carrying on as if those lies are real. They want us to think we're in a dreamland; life there is scripted and "roboticized," but of course, that script and those robots would never be admitted to. (I think of the difference between Disney jungles, where the animals are friendly and out in the open, and a real jungle, where if you were as close to an animal as Disney suggests, you would have been dead for three minutes already.) Which leads us to question: are the animals real but tamed to the point that they're no longer a true representation of themselves, or are they completely robotic, which isn't a proper representation either? The point isn't to have us question the differences, though; it's to mindlessly believe that the fake lion is real. We're not supposed to know the difference.


The examples are now pretty solid in my mind, and the info I think I know, I think I know fairly well. But I am looking forward to class tomorrow to solidify everything else Baudrillard said. The examples: solid. Baudrillard as a whole: still a little mushy.

Saturday, September 15, 2012

Interest vs. Internet... in Youtube Form

So, similar to the way a program "scans and doesn't read" a digitized file, YouTube generates the captions for its videos: it merely guesses at what the video is saying.

For a quick laugh, watch this Rhett and Link video where they mess with YouTube's failed caption-rendering program.


Wednesday, September 12, 2012

Response to "Always Already New"

Less than two pages into this article, I just had to laugh. Gitelman says that "the internet is wrong about its own history," referring to a mishap in which a program misread a few scanned documents, translating the word "interest" into "internet." That reminds me of this "quote" that's been floating around the internet for a few years now...


And speaking of Abraham Lincoln... (insert awkward transition here). Gitelman contrasts the documentation of history before and after the invention and popularization of the internet. Before the web, history was a record of the past. When newspapers were delivered, the "breaking news" featured within them was already days, if not weeks, old, depending on whether we're talking about pre- or post-Pony Express ;) Now, "breaking news" is just that. I can look up an article on an event mere seconds after it occurs. There really is no "history" to document; after all, the event could still be happening when the article hits the newsstands (and by "hits the newsstands," I mean is published onto a website). History, in modern terms, takes place in the now. History is no longer the periodic documentation of the past; it is the constant tracking of the present. And as far as the digitization of history goes, it's hard to trace timelines. Digital articles can be altered, moved to different websites, and altogether deleted. After all, it's said that the average lifespan of a webpage is somewhere between 40 and 70 days.

Even in the internet's infancy, critics talked about the hard-to-draw line between "the true and the false, the important and the trivial, the enduring and the ephemeral." (Well, I guess my addition of the picture of Lincoln wasn't too far-fetched as far as where this post is going... a happy coincidence!) The article even goes on to talk about how it is possible to cite an article off the internet and fill your brain with complete BS. It is possible to find a website about literally anything on the internet. If you wanted to find a website on how to properly elect an octopus into office, I'm sure it wouldn't take too long a search before an article was found.

The web offers plenty of uncertainties when it comes to citations. Citing a book induces no fear, because even if a book is 50 years old, unless every copy was burned in a horrible fire, it would be possible to track one down. Now, people run the risk of citing a website that will someday expire. In a digital world that is meant to keep everything accessible at all times, things sure are tricky to find.

Case in point: The internet is sketchy as hell.

Monday, September 10, 2012

Response to "As We May Think"

Ever since the beginning of civilization, people have worked to make their own lives easier. In essence, they worked hard so they never had to work hard again. To solve problems that in their own time were difficult (but in hindsight are now considered relatively easy), things like the abacus were created. The abacus made a mathematician's job incredibly easier, but the mathematician still had work to do. (Also, isn't it funny to think that at one point a "mathematician" was anyone who could add 2+2 with the help of beads?) And over time, the abacus turned into the calculator, which turned into the basic computer, which turned into a machine that can essentially think for us. Now, a mathematician is no longer someone who computes incredibly hard equations by hand, with a little assistance from an invention to keep his math straight; a "mathematician" is a computer that can solve, within milliseconds, equations monumentally harder than any human could ever compute, all with a little help from a human controller, whose sole function is to enter the equation and press a big red button that says "solve."

As Vannevar Bush said in the essay "As We May Think,"

"A mathematician is not a man who can readily manipulate figures; often he cannot. He is not even a man who can readily perform the transformations of equations by the use of calculus. He is primarily an individual who is skilled in the use of symbolic logic on a high plane, and especially he is a man of intuitive judgment in the choice of the manipulative processes he employs."

Technology has made our lives too easy. No longer do we ask ourselves, "How do I solve this?" but rather, "Which program is best at solving this for me?" Soon, our whole lives will revolve around technology (more than they already do; we still use our brains for some things), and that concept scares the crap out of me. But then again, maybe that's because I recently watched The Matrix movies and have a deep fear of a computer system trying to beat me up. Or maybe it's because I learned they are no longer teaching cursive in schools, because it is now outdated and won't be a necessary tool for budding adults. Or maybe it's because of the claim that over 60% of children who are now in kindergarten will, by the time they complete college, hold jobs that currently do not even exist. (I paraphrased that statistic and am hoping the percentage is correct.)

The idea that computers are taking over our society reminds me of both the giant brains that try to kill the world with knowledge in that episode of Futurama, and the lethargic, lazy, can't-do-anything-for-themselves society from the movie WALL-E. When will the line between humans controlling computers and computers controlling humans be crossed? Perhaps it already has been...

Wednesday, September 5, 2012

Response to "Why I Blog"

In the essay "Why I Blog," author (or blogger, so to speak) Andrew Sullivan discusses the contrasts between writing published journalism and writing a blog entry. Blogs are "spontaneous expressions of instant thought," where journal articles are rehearsed, reviewed, and re-edited (because you can never edit just once). Blogs are "colloquial" and "unfinished," where articles are professional and polished. Blogs are "personal" to the author, who often fills his posts with emotional opinions about topics. Readers come to know the author quite deeply, and often relate to him on a friendly, equal level, whereas columnists are removed from their work, hiding any personal bias and sometimes not even signing their names.

The ironic thing about the contrast between those two styles of writing is that while I sit here typing my response, I find myself pausing, thinking, contemplating, deleting, rewording, deleting again, and carefully picking the words I keep. Which, according to Mr. Sullivan, or Andrew (depending on whether I'm referring to him as the columnist or the blogger, respectively), is the exact opposite of what a blog should be. If I were to write this response in "proper blog manner," it would be filled with intense emotional appeals, relatable humorous anecdotes, a few spelling errors (from typing so fervently), and, ultimately, it would be left without a conclusion. And for two reasons: first, a blog is a "draft" of sorts. The author hasn't had time to revise his or her work and perhaps never will. After all, a blog entry is a snapshot in time, showing exactly how the author feels about a subject at that moment; perhaps the author hasn't quite finished his ponderings yet, and the conclusion is to be posted in a different entry. Second, a post might be conclusion-less because, as Andrew points out, a blog post is merely the start of a conversation, or the budding of an opinion; it is not the end-all of all knowledge. Perhaps the blogger is waiting to finish his thoughts and is opening his post up for discussion among his followers. Maybe one of his followers holds the answer to the blogger's question, one the author would never have thought of by himself.

To keep this post in non-blog fashion, I end with a conclusion: I hope I can learn to blog properly. Emotionally. Opinionated. I hope I can learn to read an article and know instantly how I feel about it, with little time wasted to discover my thoughts, hidden away. My high school English teacher always pleaded to my class, "Tell me you love this book or tell me you hate this book, but never tell me you don't know how you feel." I hope I can learn to know how I feel. I also hope to learn to let others influence how I feel, much like a reader responding to a blog post, I hope to let my peers challenge my views and in that, push me to see what my views actually are.

So, here's to a new year, a new way of learning. And here's to blogging.

Welcome!

As I sat at my desk trying to think of a halfway-decent name for this blog that has been inherently thrust upon me, I hit a mental block. Yes, a block before I even started. Every name I came up with was either very, very lame or very, very cheesy. (Or already taken, but that's beside the point.) As I complained about my naming woes to my hallmates, one of them suggested, "Why don't you just call it 'Lame and Cheesy'?" ... And thus my blog was born.

So, here begins the digital documenting of small insights into the brain of a first year Communication, Culture, and Media major. Considering I'm a female art student surrounded by a sea of male engineers, hopefully my writings will be far more insightful and far less cheesy than the title of my blog.