Wednesday, December 15, 2010

Safe Cyber Space: Youth and Social Media

It is no secret the Internet can be a dangerous place, particularly when the user is unaware of its true potential. While most Internet users have acquired online best practices through their own trial and error, the many forms of social media make that trial-and-error process far more personal and public. Stories of teen suicide sparked by cyber bullying and other forms of virtual exploitation filled the news in 2010. While social media provides a platform for building networks of friends, it can also create a false sense of belonging for those yearning to fit in. When one or more members decide to be hurtful to another member, the damage can sometimes be irreparable.

Another form of social media that has found its way into mainstream news is “sexting.” The term refers to sending nude, partially nude, or sexually suggestive messages using a texting device. In 2008, TRU, a global leader in research on teens and 20-somethings, conducted “the first public study of its kind to quantify the proportion of teens and young adults that are sending or posting sexually suggestive texts and images.” The study revealed that 20% of teenagers between the ages of 13 and 19 have posted “nude or semi-nude pictures or videos of themselves,” and an overwhelming 39% have posted “sexually suggestive messages.”1 These activities may seem frivolous to these young media users, but in the eyes of the law, they are felonies.

In a 2009 CBS News report, Harry Smith discusses a Pennsylvania case with CBS News legal analyst Lisa Bloom in which six teens faced child pornography charges for having nude images of each other on their mobile phones. If convicted, these children could spend years in prison and would be required to register as sex offenders, a label that would follow them for the rest of their lives. Although large numbers of teens are engaging in similar activities, Smith comments, “few realize they are breaking the law.”2

How can these children be held accountable if they are unaware these activities are illegal? One might argue that, in the traditional sense of the law, one cannot claim ignorance as a defense. However, these teenagers were given access to a global playground without proper instruction. Should this be cause for concern? Is it time to consider some sort of regulation or educational reform to ensure a safe environment for our kids’ digital activities? A fair comparison might be putting a loaded pistol in the hands of an unknowing six-year-old and expecting no mechanical exploration whatsoever. The underlying issue here is not that these activities are statistically overwhelming, or even that they are illegal. The real issue is the inherent dangers that accompany such activities. When that loaded pistol goes off and kills someone because of the child’s ignorance of the mechanism, the parent becomes the responsible party, simply because as an adult he or she certainly knows the dangers of putting a loaded weapon in the hands of a child. Should the same hold true for the parent whose child cyber bullies a classmate to a suicidal end?

Another alarming statistic the TRU study revealed was that as many as 15% of the teenagers surveyed admitted to having posted nude or semi-nude images of themselves “to someone they only knew online.” Chat rooms are common places to meet people while surfing the Internet. Sparking a conversation with an online persona can be exciting, especially when that persona fills a void. In most cases, these types of relationships blossom into something completely innocent. However, if a child is asked to share nude personal photos in an online conversation, the virtual pen pal may have something more sinister in mind. Sexual predators lurk in chat rooms waiting to pounce on their prey, while their child victims are none the wiser. They believe what is revealed in their instant messenger window, simply because they have no reason for doubt, if doubt is even possible in their young minds. Chat rooms are like any other form of social media for these kids. MySpace, Facebook, and texting are all commonplace in their daily lives. Much like prior generations wrote a letter or phoned a friend, the youth of today rely on digital media as a necessary component of their social relationships.

Fortunately, law enforcement has recognized the dangers of online child predation and put systems in place to fight these types of crimes. In 2006, NBC Dateline commissioned The Intelligence Group to conduct a nationwide survey on “What are kids really up to on the computer?”3 Of 500 responses from 14-18 year-olds, 58% said a person they had met online wanted to meet them in person, and 29% said they had had a “scary” online experience. The survey also revealed that around 50% of the teens “did things online they would not want their parents to know about.” These, again, are alarming statistics. According to the U.S. Census Bureau, in 2009 there were approximately 22 million teenagers living in the U.S.4 If 23% of those teenagers chat with strangers online on a regular basis, and 58% of those same teens have received requests to meet a stranger in person, the number of teenagers at risk at any given moment is approximately 2,934,800. That is a scary scenario, one that clearly illustrates how a simple click of the mouse could effectively sign a child’s death warrant.
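As a back-of-the-envelope check, the estimate above can be sketched in a few lines of code. The specific inputs (the 22 million population figure, the 23% chat figure, and the 58% meeting-request figure) are taken from the sources cited in this post; the calculation itself is just the straightforward multiplication of those proportions.

```python
# Back-of-envelope estimate of teens potentially at risk,
# using the figures cited above (survey proportions, not exact counts):
teen_population = 22_000_000   # approximate U.S. teens, 2009 Census estimate
chat_with_strangers = 0.23     # share who chat with strangers online regularly
asked_to_meet = 0.58           # share of those asked to meet a stranger in person

at_risk = teen_population * chat_with_strangers * asked_to_meet
print(round(at_risk))  # 2934800
```

Multiplying the two survey proportions assumes they are independent, which is itself a simplification; the point of the exercise is the order of magnitude, not the exact figure.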

Social media is a wonderful tool for communication. It allows people to stay in touch, reconnect, make new connections, form online interest groups, quickly communicate and collaborate from afar, and share ideas, images, and videos, all with minimal barriers or gatekeeping. The free flow of information runs rampant in this new age of digital technology, and since the inception of Web 2.0, the torrential outpouring of creativity and expression has been both liberating and empowering. But as Jonathan Zittrain discusses in his book, The Future of the Internet and How to Stop It, the very elements that make a free-flowing system strong are also its biggest dangers.5 For example, entering a dedicated chat room looking for an innocent conversation could instead lead to disclosing personal information to an intruder with unsavory intentions. The freedom exists at both ends of the conversation. However, the opacity of the digital screen allows for deceit at either end as well. The unsuspecting chat room visitor has no idea who he or she is really talking to, while the individual on the other end is free to be exactly who he or she wishes to be.

The Internet, one of the greatest technological advancements in the history of mankind, has literally changed the way the world operates. Gen M, the first generation raised on digital pacifiers, is hard-pressed to imagine a world without iPods and DVRs. Anything less than high-definition television with high-definition video games is considered “old school,” and having a mobile phone, a personal computer, and Internet access is the norm rather than the exception. Just as parents teach their children how to traverse the real world, they are now faced with a second, much more elusive landscape, a digital terrain that is perhaps even more complex.

Where should instruction begin? A good number of children know more about the digital landscape than their parents. Elizabeth Englander and Kristin Schank speak on this very topic at eSchoolNews.com. They write, “Although kids are comfortable with technology, they are not necessarily knowledgeable about it — don’t confuse the two. We all need to talk with kids about technology. Don’t worry about how much you know or don’t know. Ask kids what’s happening online with them. Ask them to tell you (or show you) what they’re up to online. And keep in mind that even if you might not know how to do a particular thing, you do know that even online they should watch what they say and be civil to others. Don’t hesitate to make that message loud and clear.”6

If teaching begins at home and enforces the same message regardless of the medium, the result should be universal. However, if one or both parents are digitally disengaged, an entirely different scenario could materialize. This is very important: the child must be prepared for both worlds, offensively and defensively. If the parents cannot provide this kind of comprehensive guidance, the next line of defense lies within the educational system, where indeed it should. In a utopia, the two would work in concert. In reality, that is not always the case. This is one of those cases.

The technological shuffling of the last 10 years alone is mind-boggling. Now that technology is a part of everyday life, it would be irresponsible for our educational systems not to proactively address these issues as part of their everyday curriculum. The tragedies and missteps of our youth over the last two years must serve as an immediate call to action, and our school administrators must arm themselves for this imperative. Instructors must be prepared to teach the tools needed for personal and professional survival while simultaneously stressing the importance of safety and civility.

A Safe Cyber Space is absolutely necessary for the well-being of today’s youth. A digital educational curriculum needs to begin in kindergarten and be applied incrementally through the senior year of high school. It should be integrated within the learning environment where applicable to the exercise. For example, a module on penmanship or spelling need not require the aid of a computer. However, writing a short story using the spelling words from the week would benefit from the use of the computer keyboard in a program such as Microsoft Word, with the spell checker disabled, of course. The child would then print out the assignment and turn it in to the teacher.

The primary objective for grades K-3 is learning how to use a computer as a tool for particular problem-solving tasks. Additionally, the development of a closed, wiki-style class community serves as an introduction to collaborative environments. Individual blogs are set up for each child to use for self-expression and for other students to comment on. As the gatekeeper, the teacher supervises the network, using specific examples to discuss acceptable versus unacceptable uses of shared communities. Offline worksheets could be developed as homework assignments asking various questions about network etiquette. These types of exercises could also benefit parents who may be unversed in the digital world, perhaps enabling their growth along with their child’s. Introducing other digital devices like mobile phones provides a way of familiarizing students with their uses while also helping children remember their parents’ phone numbers.

By the end of third grade, having mastered the various uses of the computer while also acquiring some sense of appropriate community conduct, students should receive limited Internet access integrated into the fourth-grade learning environment. Again, under the watchful eye of the instructor, fourth graders will learn about the World Wide Web and its various uses as an educational tool. The class wiki will continue to be used for collaboration, creativity, and conversation, still controlled but perhaps not as closely monitored. Less gatekeeping allows for the perception of increased freedom, testing previous lessons on community etiquette.

These lessons should be reinforced at every grade level, relaxed incrementally throughout the maturation process but always monitored so that misconduct can be addressed uniformly as necessary. The point is not to punish the child. The objective is to teach respect and restraint, both online and off. A good offline exercise might be to sit in a circle and say one good thing about everyone in the class. One might argue these types of activities are not part of the standard educational curriculum. However, in order for a Safe Cyber Space to be realized, these types of exercises are appropriate and necessary. It starts with an understanding of what should and should not be said or done while communicating with others. By building a foundation of mutual respect, the children in these programs will be intellectually and emotionally prepared to act and react responsibly after the community controls have been lifted.

Fifth grade is usually greeted with more in-depth reading and writing assignments. Armed with the knowledge that the Internet is a useful educational and research tool, fifth graders receive open access to explore the areas to which their assigned topics lead them. Restrictions still exist in the form of barriers to inappropriate sites, and any type of social media is still maintained at the classroom level in the form of the closed wiki model.

Grades six through ten continue in the same fashion with open Internet access for research purposes only. The class wiki model continues, with more emphasis on writing and sharing ideas through the students’ individual blogs. Commenting is also now required in order for students to start learning about critical thinking and discussion.

At this point, perhaps even before, a good number of students will have personal mobile phones. This must be anticipated and therefore addressed early. Appropriate cell phone use should be integrated within the same curriculum as respectful online community behavior, emphasized in a way that will register once kids start using phones freely. Unfortunately, educational systems have no control over when a parent decides to give a child a mobile phone. What they do have control over, however, is teaching best practices regarding its use. Again, homework assignments focusing on appropriate versus inappropriate cell phone use could be beneficial to both the child and the parent. For example, writing a short paper exploring the possible ramifications of cyber bullying or sexting would certainly get the student thinking about these things in a more pragmatic manner.

By their junior year, students will try to buck the system in any number of ways, declaring their independence while simultaneously searching for their own identity. The best line of defense for this age group is awareness. For this reason, classes focusing solely on social media should be required starting in the junior year and continuing until graduation. Acting as a mentor, the class instructor will facilitate a discussion-based environment in which the “rules” of online etiquette are openly and critically addressed. It will continue a conversation that has been under construction since kindergarten, in preparation for this very moment. Students will learn how to build a personal online brand through the use of social media tools. They will learn what types of images and videos are appropriate for online disclosure, as well as keys to keeping the conversation civil and respectful.

In the senior year, this conversation continues but with increased professional emphasis. These lessons are imperative as students prepare to enter college or the work force. More and more companies are using Facebook profiles to screen potential employees. Our youth must be prepared to show they have the skills to act responsibly both online and off.

The above scenario is not too different from what is currently emphasized in our educational environments, with a few exceptions. New tools of technology must replace the traditional tools of education. Teaching the appropriate uses of these tools incrementally throughout the maturation process will prepare our youth for the more elusive world they will come to depend on and interact with for the rest of their lives. The emphasis on social media is extraordinary simply because it is an uncontrolled environment with potential unseen dangers. Requiring discussion in this area allows for all variables to be explored in an open and honest forum. These types of exercises build awareness while creating better-informed and, hopefully, more responsible members of society.



Notes
1 The National Campaign to Prevent Teen and Unplanned Pregnancy and Cosmogirl.com,
“Sex and Tech,” (www.thenationalcampaign.org/sextech/pdf/sextech_summary.pdf, 2008)
2 CBS News, Smith, Harry, “Sexting Shockingly Common Among Teens,”
(http://www.cbsnews.com/stories/2009/01/15/national/main4723161.shtml, June 15, 2009)
3 Dateline NBC, “Most teens say they’ve met strangers online,”
(http://www.msnbc.msn.com/id/12502825/ns/dateline_nbc/, April 27, 2006)
4 U.S. Census Bureau, “American Community Survey,” (http://bit.ly/dU9iBx, 2009)
5 Zittrain, Jonathan, The Future of the Internet and How to Stop It, (Harrisonburg, Virginia:
R.R. Donnelley, 2008)
6 Englander, Elizabeth and Schank, Kristin, “Reducing bullying and cyber bullying,”
(http://www.eschoolnews.com/2010/10/06/reducing-bullying-and-cyberbullying/, October 6,
2010)

Wednesday, December 8, 2010

The Future of the Internet and How to Stop It - Case Study Presentation

My case study focuses on generative systems, how they evolve, expand, populate the masses, and then become weakened from the very elements that make them successful.

I use the personal computer as my media object, as it is a generative system adaptable to a great number of functions, all at the user's whim. Compared to an information appliance such as the typewriter or the old IBM mainframe, designed for specific functions, not easily modified, and serviced exclusively by the vendor, the personal computer is designed as an open, general-purpose platform, making it easy to configure according to what the owner needs or wants it to do.

The Internet is also a generative system in its current interaction with personal computers. Collaboration, innovation, and the ability to adapt to changing conditions are among the many advantages of the generative Web. In fact, the Internet was created around the concept of letting it become what it will. The framers decisively held firm to the "procrastination principle" and didn't worry about what problems might occur, instead leaving such issues to the "end points" to solve as they arose. This is what created the Internet as we know it today: a flourishing network of innovation, collaboration, communication, ideas, participation, and so on...

Unfortunately, Zittrain also points out (in Derrida style) that the very things that make a generative system strong are also its greatest dangers. The dangers come from the exploitation of the network by amateurs and abusers. For example, amateurs unknowingly release harmful code; abusers, by contrast, very knowingly release harmful code. Either way, harmful code circulates the network, infecting anything and everything it comes into contact with, and in an exponential fashion. So the freedoms the Internet allows honest people also pave the way for the bad guys.

In reaction to, and fear of, this type of activity, the minds of regulatory personnel start spinning and debating solutions. Instead of applying the "procrastination principle," they will indeed be figuring out ways to stop online hackery and the threat of technological terrorism. So, in a worst-case scenario, the logical solution would be to disallow interaction with the Internet, relieving the threat that lies within.

I use the scenario of a "Read-Only Internet" as an extreme. There are many, many areas to explore in this discussion. One of them being freedom. Even that in itself opens many avenues of discussion.

However, if regulation deemed the Internet "Read-Only," how would that affect us? For starters, our personal computers would become mere information appliances. We could still create and innovate, but it would be more on a personal level. We could only use the Internet for research purposes. We would have no way of collaborating on projects or ideas with people around the world, or for that matter people in our own back yard.

I use extremes to make a point and, hopefully, to create a jumping-off point for discussion. I truly, truly hope the Internet will never be "Read-Only," and honestly, I don't think it ever will be, at least in the U.S. My hope is that my presentation gets people thinking about how we as a participatory culture can prevent this from happening on a large scale.

Tuesday, November 30, 2010

Is privacy really that important?

My response to privacy issues was practically recited word-for-word by EFF in their article, "On Locational Privacy, and How to Avoid Losing it Forever." My stance has always been, "I'm not doing anything wrong so why worry about it". Reading those words as a "common response" definitely caught my attention. So if I'm not doing anything wrong, why should I worry about digital privacy?

As I type these words I realize I do think about it. I am cautious about what I do and say online. I'm careful about the online company I keep as well as the things I search for. I have learned that once something is online, it stays online. It is now common practice for hiring professionals to conduct online inquiries from which they form "personal" judgments. Whatever happened to "What happens in Vegas stays in Vegas"? Recently, I came to the realization that my email signature may even be offensive to some. So, I removed my liberal stamp, along with my "thought-provoking" peace statement. Now, I'm nothing more than a pissed-off Democrat trapped in a red state with no one to talk to about it but my mom! Such is life.

What I found refreshing about this week's reading was the blog by Aspen (what a great name) Baker. She did not reveal her secret, but I would love to know what it takes to keep an online community private. Whatever she has done, I think she hit the nail on the head. We should be able to trust our online communications to the people we choose to share them with.

To that, one will most definitely argue, "What about terrorists?" I am certainly one of those who asks. This is where I am conflicted. I am also conflicted about the many other online activities that serve as evidence of criminal wrongdoing. Where do we draw the line? Can we draw a line? I am very much on the fence on this issue. EFF made very logical assessments. Do I want someone to know my every move? The answer to that is probably "no," but like I said, safety issues aside, I'm not doing anything wrong, so why should I care? My only response to these issues is, "If you don't want to be found out, don't do it digitally."

A similar case in point: the TSA security measures have recently been a hot topic of debate. If you don't want to be groped, don't fly. I would walk through an airport naked if it meant I would get to my destination safely. Yes, it's a sad day when we have to come to these measures, but it is now a fact of life, and I'm pretty sure most folks would prefer preventative measures before another 9/11. Privilege and freedom come with a price. I say, if you're not doing anything wrong, quit bitching about small sacrifices and think about the ones who have given the ultimate sacrifice protecting your personal freedoms.

Wednesday, November 17, 2010

Redemption of the fourth estate, or just another venture in capitalism?

While Ted Koppel's op-ed article in Sunday's Washington Post wasn't on the list, I found it rounded out this week's reading quite nicely. In his article, Koppel remembers a time when television news was objective and honest, when journalists reported the true state of affairs, and when the fourth estate, as intended, kept an unbiased, watchful eye over government activities. Of course that was also before networks figured out news could generate revenue. I guess they decided churning a profit was more important than churning the truth. I'm not going to lump all of today's journalists into one group as Koppel did, but his opinions are well-founded for a good number of them.

The issue at hand, however, is not what happened 20, 30, or even 40 years ago. The issue at hand is how we, as a participatory culture, combat what journalism has become. The current state of dominant news media consists of charismatic personalities with open wallets. While the righties head for the bank, the lefties head for MSNBC. I love Keith Olbermann and Rachel Maddow, but I also know which side they're pulling for, and I shouldn't.

Ted Koppel also states in his article, "The transition of news from a public service to a profitable commodity is irreversible." If that is true, we are seriously doomed, and on many, many levels. However, Boler gives me hope. The fact that we are aware gives me hope, and the fact that she and others have put theories into place gives me hope. The question now is how to put these theories into action. Collectively how can we establish a near equivalent to the objective honesty that once existed? How can we as a participatory culture realize and report the truth?

Boler suggests it must come from a grassroots movement where, "All are interested in challenging and intervening in dominant media structures, and in cutting across modes of distribution with aims of resisting the messages and form of dominant media".

So my question again, how do we do this? Boler provides a list of possibilities, but how do we enlist the trust of the masses? More importantly, how do we enlist the trust of each other? It takes support and it takes money. We have already established that honest news doesn't generate revenue. How long before we see ourselves in the mirror of our opposition?

Perhaps my once youthful optimism has been tainted by the practices of everyday life. My hopes of "changing the world" have been reduced to hopes of making changes in my own life. But even as I succumb to the realities of my culture, I have not given in to its ignorance.

On some level I do still have hope that the American people will wake up to the fact that the only objective of dominant news media is partisan perception, in some cases outright dishonesty. The youth of our nation are already clued in to this fact. Thankfully they are the future of this great country. At the same time, it saddens me that parody and satire are the preferred media portals for these same individuals. In retrospect, I guess the whole situation is funny in a sad sort of way.

If the internet is the answer to redeeming the fourth estate, I say hoorah for the good guys. But as Boler suggests, the term itself may be apt for a new definition. Perhaps one that includes the watchful eye of the people. If the government knows we as a people are watching them and reporting the real facts, perhaps then we can get back to the fabric that makes America a great nation.

Wednesday, November 10, 2010

"When the power of love overcomes the love of power, the world will know peace..."

...I borrowed that line from an email a friend sent me, who borrowed it from Jimi Hendrix. Now I'm using it online, to share with you. I admit, a little tame compared to Girl Talk, but hopefully you get the point.

The reason I chose this quote as my blog title was to demonstrate my interpretation(s) of the dynamics that seem to be going on regarding the whole issue of copyright, participatory culture, convergence, and collective intelligence. I can sum up the entire scenario like this: The left is pushing against the right, and in the name of power, control, and money the right is pushing back...unless of course concession means more power, control, and/or money. But even then, the concession is only for their gain. I use the terms "left" and "right" (somewhat) loosely here. (I would have to do more research before I can talk more absolutely.)

The point is, one side is pushing against the other for political reasons. The side that wishes to open up the lines of communication, share ideas for a greater good, and participate in the world around them are being stifled by the side that wishes to maintain control. Jenkins uses the CNN/YouTube debate as an example. The simple fact that Mitt Romney refused to debate with a snowman, even though the snowman had a legitimate question, illustrates the higher ground on which these power players wish to be regarded. These "elite" types are much more comfortable controlling the conversation. They don't like the idea of participatory culture because incoming questions might make them uncomfortable or worse, make them look incompetent.

Unfortunately for these power people, the internet is here to stay, and it's only going to push the right further to the right and eventually off the playing field altogether. Well, maybe not altogether, but I do think the masses will be heard sooner than most may think. The world can only gain from sharing ideas and knowledge. Culture can only make itself relevant through innovation and creativity. Progress can only progress if it is allowed. A very small number of people hold the power, the control, and the money in this country. These are the same people who stand in protest against a culture of participation. They want to keep their power. They want to keep their control, and they most definitely want to keep their money.

It may seem I'm speaking only about politics proper. What I mean to convey is that politics are at play in every area that concerns power, control, and money. The owners of copyrights want to maintain control of those copyrights to make as much money as possible. In this sense they hold the power. Participatory culture and collective intelligence would certainly dethrone patent holders if laws regarding certain types of patents were reversed, and convergence would keep a lot of corporate giants on their toes trying to keep up with the status quo. True, some companies have been successful in listening to their "fandom," but it was all in the name of making more money. Their attitude is, "If it'll make us more money, we'll listen; if not, we don't care what you have to say."

This tug-o-war won't end anytime soon. Meanwhile, the internet, with all its channels of global communication, will continue to grow. File sharing, legal and illegal, will continue to spread, and I don't think Girl Talk will go out without a fight. People have been borrowing, stealing, and manipulating ideas from the past since the beginning of time. Artists do it just as musicians do it just as politicians do it. The only difference now is that the involved parties can share their creations online, and therein lies the problem.

Bottom line, the internet has and will continue to reshape culture and society. Those who keep up will succeed. Those who can't will fade away. Companies already see the necessity of listening to their customers. Politicians will acknowledge this truism soon enough. The remaining entity of Hollywood could be a different story. I think they will fight tooth and nail to the very end only to be slowly put out of business by the "amateur" film makers who collectively will prevail. At that point we will begin to see true democracy in action...at least for a little while.

Sunday, October 31, 2010

Multitasking mediocrity...

Yes, the many, many media streams do indeed keep us in a constant state of imagined crisis. Fortunately it hasn't really affected me too much, probably because I was raised in the "deep attention" generation (I don't feel the need to answer every phone call, text message, or email as soon as it arrives). Twitter seems to be calling my name a little more often though, so perhaps there is hope.

A couple of things I found interesting in both Hayles' reading and Stone's presentation were the cognitive shifts that are taking place. The human mind is a very complex machine, and its ability to adapt to different environmental conditions is quite remarkable. Now I better understand why some people think differently than I do.

Growing up I remember my occupations were watching television (one of only three channels), listening to music, practicing piano, talking on the phone or playing with friends, and doing homework. All of these activities were centered on the activity itself. The word "multitasking" had yet to be coined (at least to my knowledge) and distractions were rare. I sat in solitude while I did my homework or practiced my piano, deeply attending to the completion of the activity at hand. It wasn't until years later that the concept of "multitasking" became the buzzword that it is today.

My definition of "multitasking" is being able to simultaneously work on multiple tasks while giving each task the concentration it requires as if it were the only task being attended to. It's not merely a matter of doing two or more things at one time. Multitasking requires a total and complete shift in mindset in order to be successful.

In other words, if someone claims they are multitasking, the result of each completed task must be of the same quality as if each had been attended to in succession rather than simultaneously. For example, a fifteen year old is hired to keep the score of a basketball game. This activity in itself takes a good amount of undivided attention. However, while s/he is supposed to be keeping score, s/he is also texting with friends. When confronted, the teen says, "I can multitask", even though the purpose of the confrontation was the frequent inaccuracy of the score. In this example, multitasking clearly does not work because the quality of one of the tasks has been compromised.

This is where I become cynical when discussing this trend toward a culture of multitasking. My cynicism lies in the question of what details could become lost due to a lack of deep attention. Will "Generation M" understand that certain things require more attention than others, while still other matters require complete attention?

The thing I find most interesting in Hayles' article is when the topic of AD/HD enters into the conversation, drawing links between "hyper" attention and the aforementioned cognitive disorder. I understand the commonalities; however, one is a biological disorder while the other appears to be a cultural malfunction. We do not have control over the amount of media available to us. We do, however, have control over how much of that media we allow our children to consume. If an otherwise non-AD/HD child has to take Ritalin in order to perform well in school, there is something very wrong.

Good parenting requires regulating a child's consumption of anything that could be detrimental to that child's welfare, including over-consumption of media. Is it necessary to give children mobile phones with texting or internet capabilities? The current issues of "sexting" and "cyberbullying" could be limited if these options were not made available to kids under the age of 18. Similarly, a regulation such as this would put the purpose of the mobile phone into perspective for this age group.

Taking this idea a step further, why are social networking sites made available to kids under the age of 18? Furthermore, why do parents allow their children to participate on these sites at home? I believe the responsibility begins and ends with the parents. There are reasons for age restrictions, all having to do with the ability to handle the responsibilities that go along with the privilege.

A cultural shift is obviously taking place and I do find it intriguing. To imply, as Hayles does in her article, that "Children growing up in media-rich environments literally have brains wired differently than humans who did not come to maturity in such conditions" is simply fascinating. Additionally, as Hayles mentions, it is a condition that educators will have to come to terms with in order to be effective in their roles. A compromise between the two polarities of "hyper" attention and "deep" attention would indeed be an ideal scenario, and if educators could establish and maintain such a condition I think real progress could be made.

Therefore, I think what lies beneath these issues is perhaps thinking in terms of teaching our children how to manage a media-rich environment. A system might look like this: Kindergarten through 12th grade - limited forms of media are used for educational purposes only. Undergraduate settings - embrace previously limited forms of media in order to teach students how to use them both responsibly and professionally. A system like this would in no way discount the relevance of technology; it would simply impress the importance of different media tools in the right place and at the right time.

In reality, technology is nothing more than a set of tools. A powerful set of tools which, through continued progress, will provide great possibilities not only for our generation, but for all generations to come. Technology is also changing our culture, creating opportunities and efficiencies our forefathers could not have imagined. But like every other set of tools, the proper uses of technology must also be taught. If we don't start these teachings at a young age by incrementally focusing on the situational pros and cons of each type of attention, progress could potentially give way to an over-stimulated culture of self-indulged mediocrity. A mediocrity hidden behind the more politically correct guise of multitasking.

Tuesday, October 26, 2010

...but is it probable?


Collective Intelligence left me feeling like a nebulous being floating within a larger cultural specimen being dissected and analyzed by those who fancy such studies. In this particular instance, the bunny men are microscopically drawing diagrams, relationships, interactions (or lack thereof), and speculating on outcomes of the fourth cultural space as it relates to its predecessors. Levy speaks of this fourth, or knowledge space, as perhaps a terminating space from which those who follow its final formulation could enjoy the solutions to the thousands of years of strife that have come before. For the first time in my life, I feel like a part of history instead of a part of making history. A hard pill to swallow, especially in light of my recent birthday.

I do however try to imagine how this whole idea of collective intelligence might play out. In Levy's Utopian scenario it plays out perfectly (of course): democracy by the people, for the people. After all, that's what our founding fathers had in mind, right? It was only when populations exceeded a realistic voice that we had to switch to government by representation. Now that we once again have the possibility of true democracy in the sights of our imagination, will we be able to return to a democracy truly by the people? A better question, will the powers that be allow us to return to such a system?

If collective intelligence is what our founding fathers intended, how will we achieve such collaboration in favor of good for the people? Has the current system divided us beyond repair, or will a system of demodynamics help us to overcome our differences? Given the opportunity, I think the latter would prevail. The majority of people care about equal rights for all. The majority of people are tired of corporations effecting policy. The majority of people want the same opportunities regardless of race, religion, gender, handicap, and so on, and the majority of people want to see these rights acknowledged and enforced.

Will collective intelligence in action affect the status quo? Indeed it will. Potentially greater than anything we have seen before. It will certainly disengage our state representatives as conduits to "what's best for the people". Their services will no longer be needed or desired. That is not to say that a core assembly of administrative officials should be displaced. Quite the opposite is true. Even a democracy based on collective intelligence needs a system of checks and balances.

My question is how this Utopian scenario might be put into practice. Is it possible? Is it feasible? It is certainly imaginable. What would the infrastructure look like and who would build it? How would the input and output be managed or filtered? How would we come to a consensus and how would our consensus effect change? Case in point: the war we are currently engaged in. If we as a collective "vote" against the war, how would that affect the overall state of affairs? Could we simply disengage? As Levy mentioned, these types of decisions are slow moving. Would this cause frustration?

On a larger scale, how would our relationships with other countries be handled? Foreign affairs often (if not always) affect us directly. How would we come to a collective consensus on issues like trade relations, immigration, terrorism? How would we go about correcting the mistakes our predecessors made, both here and abroad, if that's even possible?

I can imagine Levy's land of Utopia. But, it is just that, Utopia. If anything even remotely resembling a transition of this magnitude were to take place, it would (obviously) take years to transpire. Years that most of us probably don't have.

Friday, October 22, 2010

The real issues at hand....

In reading the blogs from a few of my classmates regarding this week's discussion topic, I found the polarity of opinions very interesting. One blogger felt that it is OK to download music illegally, and justified it by placing it under the umbrella of "file sharing". Another blogger, while fully sensitive to the implications of illegal downloading, was somewhat confused by the two terms, "illegal downloading" vs. "file sharing".

In my mind, the two terms are different: illegal downloading is acquiring an object for free that otherwise should have been paid for, while file sharing is simply sharing information online. I don't think there is any harm in sharing files if those files were acquired legally. I think the question that does fit under the same umbrella is whether the files you are sharing were intended, by the author, to be shared at no cost.

Something else I found interesting in a couple of the blogs I read was an attitude of "...this is what my generation does, and if you don't like it then you need to change your system". An artist makes h/er living using h/er skills and expertise, just as we make our livings using our skills and expertise. Would you consider it fair if your boss decided he didn't want to pay you? The fact of the matter is that it's illegal, and furthermore you would not be able to make a living. You would pursue the appropriate legal channels to ensure payment for your services.

A third, and much larger issue, that perhaps confused the discussion even further, is the question of what should be protected and to what extent. This is where the waters start to get a little murky on some levels. For example, if you use a quote from a book that is protected under copyright law, it is perfectly OK as long as you put it in quotation marks and give attribution to the author. Yet the debate rages on whether it's OK to sample beats from various artists' songs to create a new song that you call your own. In the former you are paying homage to the author by using h/er words in your own literary piece. However, in the latter it is considered copyright infringement or stealing. If attribution were given, would that make a difference? Or is it simply a matter of how the two institutions approach copyright? Or is it something different altogether? Perhaps this is where the greed of corporate America comes into play, which is the larger debate needing to be explored.

I wanted to identify these issues separately in an attempt to simplify the various areas of debate. The first two issues I mentioned are pretty straightforward, at least in my mind. The third issue, I believe, is where the conversation needs to be focused. A couple of cases in point include Hollywood's global monopoly on entertainment. A second case is that of pharmaceutical companies who would rather gain financially than collaborate with the larger scientific community to perhaps discover cures for diseases for the public good. As Kim mentioned in class, no research is currently being done to find a cure for AIDS because there is more money to be made by not finding a cure. These are the types of issues I feel need to be addressed. These are the corporate giants that are hiding behind patents and copyright laws for their own financial gains, rather than caring about what's best for the people. The same dynamics are at work in the insurance industry. It is common practice to find legal ways to deny coverage or to drop you altogether.

These are the areas we need to focus on. These are the areas that need change. How to go about making these types of changes is difficult because of all the powerful players involved. But it is certainly a stepping off point for discussion.

Wednesday, October 20, 2010

Who are they protecting anyway...?

After reading Information Feudalism, the current political climate came into perfect focus. What is it about humankind that leads people to take laws that are clearly intended for the welfare of the public and skew them in favor of their own private interests? Furthermore, what is it about American politics and corporations that makes them feel they have the right to dictate to the world the laws of copyright, intellectual property, and patents? Is this an extreme case of ethnocentrism or just greed? Or perhaps both?

From the very start of this debate American interests were arrogantly put ahead of the rest of the world. For example, America took issue when foreign countries translated and published American authors' work. They called it 'piracy'. However, the same America would not allow foreign authors to be protected in the US unless their work was published here at the same time it was published in the "country of origin". "The London Times saw this as an attempt to make New York the centre of world publishing."

In the same vein, Hollywood began its global monopoly of the entertainment industry as early as the 1920s. The political leaders of the time, and since, have been in full support of this complete domination, all in the name of money. As the author(s) recall, "trade follows the film". Hollywood has been given so much protection based on intellectual property that it dictates the level of competition allowed in foreign countries completely independent of the US government.

Drahos:
 "The end game for Hollywood is no restriction on its capacity to reach any type of screen in the world at any time and place".

"...the output of the US film and TV industry serves to dilute national cultures".

In view of this, I ask again. How can one country claim this kind of global control? Hollywood is an independent, international cartel endorsed 100% by the same people who create the laws concerning intellectual property, copyright, and patents.

Drahos suggests that his use of the word "feudalism" in the title of this book is too harsh. However, I'm not sure I agree. It is very clear who the controlling power players are in this scenario: the conglomerates who own the rights. These conglomerates, or monopolies, are way beyond acting in the interest of the welfare of the public. Using Hollywood as the most extreme case, they have no competition; therefore, they can charge what they want, which goes against everything we know about laws concerning monopolies. And like the pharmaceutical companies, they are allowed to copyright most everything they do.

The question now is when will these corporate giants, monopolies, conglomerates, etc. sink their daggers into the free flow of information that travels across the internet? Drahos has already made the point about the importance of knowledge. With that, I digress for a moment. Again, not a new concept, question, or speculation, but if all of the research scientists were allowed to share their research and information with each other instead of being forced to tuck it away behind patents, imagine how much further along the world would be with respect to cures for any number of diseases.

So, how long will the free flow of information remain free? The US has already forced imposition of its intellectual property laws on Sweden, a country that doesn't have any such laws, and doesn't, by international agreement, have any obligation to adopt them. However, Sweden obliged. Will the regulation come only in the form of Hollywood protection, or will it come in other forms? This story has yet to be written, but somehow I fear the outcome. The US powers that be are very good at disguising their greed by wearing masks proclaiming concern for the good of the people. My only question now is...which people?

Tuesday, October 12, 2010

The "Unhuman" Network...

Perhaps it is best to begin at the end of this week's reading, for, like Biology, an object is sometimes better understood if the parts that make up that object are understood. Understanding networks is indeed one of those instances, as Galloway and Thacker succinctly demonstrate in The Exploit. However, the authors also leave the reader with a question, perhaps better denoted as a proposition. One, nonetheless, that fits neatly into a more in-depth discourse on network theory, if further dissection is needed or desired.

Do "unhuman" objects exist within networks? This is the question at hand. Drawing no conclusions on this topic, Galloway and Thacker begin, and end, their discussion with bits and atoms. However, earlier in the text they discuss the "physical layer" of network protocols. Is this not "unhuman"? If material conduits are required to allow communication between nodes on a computer network, how can this "unhuman" element be questioned? Is a definition of "unhuman" needed at this point to continue? Even by its most basic definition, the "unhuman", or, that which is not human, provides the architectural or physical structure for networks. Even biological networks operate within and around elements that are "unhuman". As noted in the reading, emerging infectious diseases travel through hotels and airports, as well as from "unhuman" host to human host.

In this line of thought, perhaps a different question should be considered:  Can networks survive without the "unhuman" element? Can Osama bin Laden "swarm" at will, causing terror on a global level without the objects of airplanes, explosive devices, and other objects of mass murder? Can online social networks operate without the structural underpinnings of a physical computing device and connectivity? If the objects above are defined as "unhuman" and deemed necessary in order for each respective network to operate, the question that Galloway and Thacker leave the reader with quickly transposes itself into more than a mere proposition. Indeed it must be considered within the same context.

That is not to say that the human element is in any way less important. Quite to the contrary. Networks require the interaction of both human and "unhuman" elements, both symmetrically and asymmetrically, in order to be "flexible" and "robust". The "unhuman" element represents and provides the infrastructure for the network while the human element actively consummates the motivation, aliveness, and human interactivity of the network.

Even after considering any and all networks, and from any point in history, this theory seems to hold true. Take for example the Pony Express. This cross-country mail service (or network) could not have operated without the "unhuman" object of the horse working in tandem with the human object of the rider, not to mention the object of his journey, the "unhuman" element of the letter and the piece of paper it was written on. Consider the modern postal service. They do not use horses to deliver mail any longer, however the vehicles they do use are definitely "that which is not human".

The networks of today that Galloway and Thacker discuss are, as mentioned above, no different; a network is a network is a network. Of course that is an over-simplification meant only for this posting. Networks vary in many ways - in size, in content, even in structure (i.e. centralized, decentralized, and distributed). But as long as they meet the four conditions for being a network, they are all recognized and operate as networks; human and "unhuman" elements working in unison to form actively robust webs of ever-changing nodes and edges.

Wednesday, October 6, 2010

The materiality of media....

Materiality...a new concept in our discussions thus far. Hayles' reading on "Material Metaphors, Technotexts, and Media-Specific Analysis" reminded me a lot of The Medium is the Message. However, Hayles brings to the forefront the idea of materiality and the effect its presence has on the meaning of its content.

While focusing on the physical environment in which the content is embodied, Hayles suggests that the meaning of the content is derived specifically from within that environment. She continues to say that when that same content is recreated in a different media environment, the meaning of that content changes as well.

The key differentiating factor Hayles implies is the physical interaction the viewer has with the medium in which the content is embodied. For example, historically the act of reading a printed book has particular physical, psychological, and emotional associations that have become natural to us throughout the centuries. We use the activity of reading a book as a private time when we cuddle up with a blanket and a cup of hot tea and escape into a world far away from our own. Traversing the traditional book as well as interacting with the texture of its pages, cover, and size also holds sentiment. Sometimes we even hug the book in a gesture of intimacy or hold it up in delight once we have completed the reading. In all instances, it serves as a friend with which we interact.

Now if we take the content of that same book and recreate it in a new form of media, how will the viewer interaction change? Obviously it changes everything. New bonds must be forged, physically, psychologically, and even emotionally. (Assuming of course that the viewer yearns for these bonds.) Even the difference between reading a hardback book and a soft cover book affects the meaning of the activity. The hardback book is treated with more respect and is held differently, if only by virtue of its physicality. It is more common to see written notes or highlighted sections in soft cover books than in their hardcover counterparts simply because we interact with the two differently.

The advent of various new technologies has also changed the physical activity of interacting with content. The same text is consumed differently in each of the variety of media forms it is offered in, bringing new understandings through each experience. The book above would be physically interacted with completely differently if its material form were digitized to be read on a computer screen. If its form were embodied in a book on tape or CD, the interaction would be different still.

This concept of materiality exists subconsciously I think for most. The conversation of reading the book versus seeing the movie always brings with it comments on preference towards one or the other. The matter of interpretation and execution of book-to-film are always in the forefront of these conversations but Hayles' point is still made. Recreating the same text in the form of cybertext brings about similar conversations. The presence of animation, hypertext, imagery, etc., alters the meaning of the text in its own unique way, and is also often followed by the discourse of debate.

Whether the new material forms in which literary texts are remediated affect the original text's integrity is always debatable. The point, however, is that these remediations offer new interpretations on every level of human interaction. Past the physical activity of consuming a literary text, from holding a printed book to adjusting a computer monitor to achieve a more convenient viewing position, our psychological and emotional reactions are also modified. Will we ever form the same type of bonds that have evolved through the historical materiality of holding and reading a printed book? Will we ever hug a computer screen the same way we hug a printed book? Or will our emotional attachments fade away in reaction to the cold impersonal touch of computers? I don't think so. Soon the same intimacy we've held so dear in our human interactions with printed books will evolve into a new type of intimacy for the interaction with our computers. One thing holds true though. We will come to experience our dear old friend the book in new and different ways as we comfortably adjust to our intimate new cyber environment.

Wednesday, September 29, 2010

Art Remediated...

According to Bolter and Grusin, digital art is defined as graphic, static images made of pixels rather than oils or watercolors. That is, computer programs and the algorithms which support them are the tools that the digital artist uses, and the computer screen is their canvas. Content varies from immersive illustration to highly mediated multimedia imagery.

The goal of the digital artist, with respect to immediacy, tends to vary depending on the composition and the elements used within the imagery. For example, Bolter and Grusin note that fantasy illustration is a popular theme among digital artists. In this genre, the digital artist attempts to achieve immediacy by creating a space that they imagine to be real. On the other hand, digital artists who use a variety of elements that are clearly derived from different forms of media seek to enhance the experience in a way that makes the viewer well aware of the use of multimedia objects.

What differentiates the digital artist from the analog artists of the past? Some might claim the computer aids in the creation of a piece of digital art, thus dismissing the artist's claim that what they have created is indeed art. The critic asserts that the digital artist has many menu options at hand in order to determine the most appealing look; a "happy accident" it is sometimes called. In comparison, the artists of the past spent hours painting and repainting over areas they were not happy with and stopped only when they were. A true artist is not satisfied until what they envisioned in their mind is executed properly, be it paint on canvas proper or pixels on a computer screen.

The masters of the past were experts at manipulating their tools, their canvases, and their palettes to achieve their vision. So too are digital artists. They achieve their visions with different tools, different palettes, and a different canvas. And like the artists of the past, they too react and contribute to the culture in which they live.

The difference between the artists of the past, particularly pre-Impressionism, and the artists since, is their focus on immediacy. Prior to Impressionism the objective for the artist was to remove his presence from his work. Techniques these artists worked slavishly to master were proper perspective and proportion, realistic execution, and erasure of their brushstroke. Once photography claimed a more immediate reflection of nature, artists abandoned their attempts at immediacy for a much more hypermediated experience. Impressionistic techniques created only implications of form. Picasso, inspired by the invention of film, attempted to create multiple views of the person in a single form. Pollock, among others, had no problem leaving his paint on the surface of the canvas, which created a three dimensional texture, wholly contradicting the erasure processes of the past. And Dadaism removed any effort at immediacy that might have remained. All of these styles were indeed meant to be absorbed, but by no means intended to be realistic.

More than a century after the evolution away from immediacy in art, the digital artist revels in hypermediacy in its purest definition. The more visible the mediation, the better. As mentioned above, some digital artists do still attempt immediacy by attending to proper perspective and proportion; however, it is more common to witness the hypermediated alternative.

As Bolter and Grusin note, everything has been remediated since the beginning of writing. On a larger scale, and with centuries of impressions burned in our minds, as artists, it is difficult not to remediate. We often create with styles and images from the past in our minds. A type of Creative Commons if you will. We borrow from the past to create the cultures of today. Our tools and canvases have changed, as too have our intentions. The culture of today is one deeply rooted in technology, specifically digital apparatuses that we must master in order to stay employed. But even as we use our remediated tools and canvases, we are only responding to the generations of remediation that have come before.

Tuesday, September 21, 2010

Is progress such a bad thing........?

...and so the debate rages on. Something I find interesting in all the readings I do in EMAC is a consistent reaction to the introduction of new forms of technology. Throughout history cultures have gotten so stuck in their ways that when someone suggests something new, they tend to resist it.

For example, Benjamin spoke about the "aura" that once existed in art prior to mechanical reproduction. (I actually had another reading this semester on this very subject, but understand it better after reading Benjamin's article.) And, I must admit, that "aura" of process, place, time, and experience, or, "authenticity" intrigues me. I have an appreciation for the authentic so it also saddens me that this element in art no longer exists, at least in the form Benjamin spoke of.

However, if there were never any change, there would never be any progress. Yes, I agree with both Benjamin and Nichols that changes in technology create social and political shifts in society. But I don't necessarily think that is a bad thing. In fact, to the contrary. Most technological advancements throughout history have been for the good of society. Just because something changes the way people interact, accomplish tasks, or even approach art doesn't mean society is in a state of decay. Attitudes cause societal decay...but that's a topic for an entirely different discussion.

I think the underlying issue driving these continued debates regarding advancements of any kind is the loss of control people feel when they are confronted with something they are unfamiliar with. Nichols talked about control in his article, but in a different way. Perhaps he skipped over the initial lack of control that I mentioned above, and got straight to the heart of the matter. Historically control has driven change. Nichols mentions the Great Exhibition of 1851 in which two permanent exhibitions, the zoo and the botanical gardens, were unveiled and celebrated. In his perception, this event is an example of the human desire to control all things. In this instance, nature was taken out of its natural condition so that the powers-that-be could make sense of it, or control it.

As such, the dawn of mechanical reproduction, photography, film, and now cybernetic systems (just to mention the ones discussed in the text) are all means to tighten control. In mechanical reproduction, the business owner could control and thus capitalize off the sweat of his workers. In photography, the camera could capture a more realistic representation of nature. In film, the director could create and control an imaginary world. Cybernetics operates in a fashion that can potentially control everything we do; most notably in his article, what type of baby you would like to have...if you don't like it, kill it. (I wonder what the Republicans have to say about that EXTREMELY DISTURBING fact?! Oh, I forgot, it's the rich people who are doing it).

So, progress changes our perceptions, our social interactions, and our overall culture. But isn't that what it's supposed to do? If advancements make the world a better place to live, isn't it all worth the initial uncertainty? The implications of cybernetic systems are far greater than anything that has come before. By sharing ideas, culture, music, even video, with people from all around the world, we can start to build global alliances and perhaps take a small step toward world peace. Will we take advantage of all the opportunities that are literally at our fingertips? Or will regulation take hold before these greater potentials are realized? But if we all got along, the folks who like to "control" things wouldn't have anything to do...perhaps I just answered my own question.

Tuesday, September 14, 2010

So far.....

OK...I'm going a little off topic on this blog because I want to reflect on the semester in EMAC so far. Hopefully you will find it relevant. As I go through my readings and lectures, I am struck by how relevant they are. The thing I find most amazing is the foresight that those who came before us had about emergent media. I sit here without a clue as to what will come next.

Was Shakespeare really referring to TV in Romeo and Juliet, or talking about new media in Othello? With this kind of genius or foresight, why haven't we yet found a cure for cancer? Or, for that matter, world peace? These are serious questions that are a lot more important than new media. But if that's what he was referring to, I guess that's where we are.

So back to the topic at hand. How did we end up here? What we study in EMAC makes perfect sense of how we ended up here. More and more I feel like a robot, although I know I'm not. The articles I read and the lectures I listen to make me feel that this time is not worthy of the past; however, I know it is. Perhaps we do not speak the same language, but our expressions are a direct reflection of the culture we live in.

As artists, our thoughts and actions are direct responses to how we feel and the attitudes we hold close. We express ourselves in ways no generation before could imagine. Perhaps it's imitation, but it's an imitation that brings people together. People can see eye-to-eye on a global level previously unheard of. Is this the answer to world peace? I doubt it, but I think it could be a start....perhaps, maybe.

I think what amazes me most is that we do have all this global new media/technology, and through it all we still don't see eye to eye. I can dance with someone whose color or religion is not congruent with mine, and we both walk away knowing we will never be friends. And that makes me sad.

New media can bring us together in many, many forms: music, video, ideas, sports, interests, etc. Are the powers that be too wrapped up in the past to allow for a new type of global unity? I know what drives the resistance, but I do not understand it. Why not let the camaraderie of the internet be the beginning of world peace? Perhaps it is and I just haven't seen it yet.

So, with all the past foresight and the present knowledge, why are we not using technology for more good? I foresee many governmental regulations coming soon on what we see, post, and send online, but, again, I don't understand. Is it so bad to get along with our global neighbors? Is world peace really that scary? Or is it something outside the control of new media that keeps us from getting along?

Tuesday, September 7, 2010

I found the reading on "From Memory to Written Record" very interesting. It's one of those things we really don't think about nowadays. When I hear someone say "there is no written record" of this or that, I just think no one bothered recording it. I don't even think about the fact that it may have been before people kept written records, or perhaps even before people knew how to read or write!

What I found even more interesting in this reading was the fact that the division of classes was just as prominent in 1066 as it is today. The same "privileged" classes were feeding their own, trying to widen the chasm between themselves and the so-called "illiterate".

What is it about groups or classes of people that gives them the desire to prove themselves superior to other groups or classes of people? And who has bestowed upon themselves the authority to define what the words "literate" and "illiterate" mean? Language has always defined literacy, and by that I mean the common language and principles of a society.

Taking that question a step further in reflection of Flusser's article: if we all were to depend solely on electronic memory to perform mechanical operations such as arithmetic, writing, grammar, spelling, etc., how will the word "illiterate" be defined at that point? And will only those who are skilled with computers be considered literate?

Seemingly, it would no longer be about knowing how to write or spell properly because the computer will do all that for you. But the computers of today have already proven they cannot always calculate everything properly. What will happen to language if spelling errors are not caught by this electronic memory? Will it become a muddled mess of illiteracy?

And how will this electronic memory affect education? Will schools stop teaching students how to write and spell words correctly? Will they stop teaching math? Will the necessary foundations of knowledge that Socrates explained to Phaedrus become obsolete and unimportant?

If this were to happen, the chasm between literacy and illiteracy would surely widen even further and favor only those who can afford these devices with electronic memory. As for those who cannot, I guess they will remain illiterate, at least by someone's definition. A tragic regression in this great time of progress.

(And yes, I did run my electronic spell check on this blog entry).

Tuesday, August 31, 2010

As We May Think...

I wasn't able to fully absorb this article during class last week, but after reading it more thoroughly....WOW! Bush's foresight is just plain spooky. I was not familiar with Bush until reading this article, so pardon my ignorance of his contributions.

With that said, I wonder if, and how much, this article and his foresight in general drove the evolution of computers. I mean, he not only described personal computers, but also envisioned hypertext linking, paths, and methods of information retrieval, exactly as they exist today!

Was he the first to discuss these issues? For example, his concern with storing the world's information so that it can be easily indexed, sorted through, consumed, and shared...was he the first to consider this as important to advancements in science and other relevant areas? Was he the first to realize that combined knowledge could help in solving all kinds of issues? He mentioned the case of Mendel's concept of the laws of genetics being lost for an entire generation because his publication did not make its way into the right hands. I wonder how much other knowledge has been lost throughout the ages.

People like Bush who can think so far ahead of their time are just amazing to me (his article "As We May Think" was written in 1945). How was he able to conceive notions like hypertext linking before computers even existed? And what can we learn from people like Bush? How can we expand our minds in the ways that he did? I know this is a key objective of the EMAC program, but it is also a difficult skill to master. Perhaps it feels so elusive to me because it seems like everything has already been done...but new ideas and ways of doing things are being introduced all the time.

I would love to get a discussion going on this topic. I know I'm not the only one who is a bit stymied here....(anyone who has experienced Dave Parry's "knowledge institution of the future" project knows what I'm talking about). I can't wait to hear from you all!