Posts Tagged ‘ productivity ’

Skyrim: it’s fun because it’s shaggy

I know, I know. I’ve been completely remiss in updating this blog. It’s not that I haven’t had content to add, but that I’ve been too busy to take the time to write it. I hope to catch up on that after the Christmas break, once the burden and expectations of coursework have abated, and my only priorities will be thesis research and whatever projects tickle my fancy (believe me, there are more than a few just waiting in the wings).

Besides thesis prep and coursework, video games have dominated most of my at-home free time. For the directed study I’d planned this term, I played through both Dragon Ages; then some content analysis coding related to Mass Effect came up in my research methods course, and I just couldn’t help myself. Most recently, Skyrim, the latest installment in the Elder Scrolls franchise, was released, and I find myself well and truly addicted.

There are several things about Skyrim that set it apart from other games, but the most important is the high level of freedom it gives the player to explore the gameworld. Not all the rules are obvious; it often feels more like a simulation than an RPG. And in a world as big as Skyrim’s, this can lead to unexpected results. The AV Club’s review of the game puts it best:

The soul of Skyrim isn’t in these meticulous improvements, but in its shaggier side. Not every aspect of this world lines up perfectly. You might be anointed by an ancient priesthood as the greatest warrior in all the land, only to walk 10 yards down the road and get slaughtered by a stray bear. Incongruities like this arise all the time—characters behave weirdly, and quests veer off-script. It isn’t just about bugs, although there are some of those. These eccentricities are the result of an extremely detailed organic world acting out in unexpected ways.

Skyrim lets these rough edges show, because the element of chaos lets players feel like the game is happening to them, and they are alive in it—not just cogs in a pre-fab Game Experience. That’s what sets Skyrim apart from some of its contemporaries. Where many games with lavish production values seek to direct players’ imaginations, Skyrim seeks to ignite them.


Mission Statements: Workshopping the Proposal

While my study on mission statement dissemination is on hold, that doesn’t mean I’ve stopped thinking about it. I’m currently workshopping a research proposal for the study in two separate courses this term, and by the end of April I hope to have a fully fleshed-out plan for how to proceed. Here are some of the documents I’ve written that are helping me shape this project.

PDFs:

Fall 2009, SSHRC Application: Program of Study

Winter 2010, HUCO530, Thesis Question

Winter 2010, LIS505, Research Proposal pt1 – Problem and Definitions

Designing Users/Interfaces

Huco 500 – Weekly questions

Tognazzini uses the term “user” quite a bit in his article without qualifying it. He indicates that the most important part of building an interface that “anticipates” a user’s needs is knowing your user, but this goes without saying. The user as a concept depends entirely on the service on offer; the user of a medical reference database aimed at medical professionals will not be the same as the user of a social media application (e.g. Twitter). It might be more valuable to start by thinking about the service an interface offers, and to consider the most effective way of showcasing that service. Determining user expectations and behaviours becomes much easier once that task is complete, and it avoids generalizations about what users want.

Effective applications and services perform a maximum of work, while requiring a minimum of information from users. (Tognazzini)

How do you determine what the “minimum” is? The interface still relies on the user having some idea of what result they need. A developer should start with the service(s) the interface supports, and build the interface around what is required to fulfill that service. Note that the less information a user provides, the less accurate the result will be.
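To make that trade-off concrete, here’s a minimal sketch of the “maximum work, minimum information” idea: the caller supplies only a query, and the interface fills in everything else from context with sensible, overridable defaults. All the names here (search, Context, the default values) are hypothetical, invented for illustration rather than taken from Tognazzini.

```python
from dataclasses import dataclass, field


@dataclass
class Context:
    # Hypothetical context the interface can infer without asking the user.
    locale: str = "en-CA"          # e.g. inferred from browser/OS settings
    recent_category: str = "all"   # e.g. inferred from the last session


def search(query: str, context: Context = None, category: str = None) -> dict:
    """Require one piece of information (the query); infer the rest."""
    context = context or Context()
    # An explicit choice overrides the inferred default.
    category = category or context.recent_category
    # The fewer details the user supplies, the broader (and potentially
    # less accurate) the result -- the trade-off noted above.
    return {"query": query, "locale": context.locale, "category": category}


print(search("angina treatment"))                         # minimal input
print(search("angina treatment", category="cardiology"))  # refined input
```

The point of the sketch is that the burden of specificity sits with the system’s defaults, not the user, and accuracy improves only when the user volunteers more.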

How do you conduct an interface user study?  What tasks do you need test users to perform?  What questions should you ask?  How do you measure effectiveness and efficiency?
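For the last question, one common way to operationalize the measures (along the lines of the ISO 9241-11 usability framing, as I understand it) is to log per-task completion and timing, then score effectiveness as the task completion rate and efficiency as mean time on successful attempts. The data below is invented purely for illustration.

```python
# Each trial: (user, task, completed, seconds) -- hypothetical study log.
trials = [
    ("u1", "find-article", True, 42.0),
    ("u1", "save-citation", False, 95.0),
    ("u2", "find-article", True, 31.5),
    ("u2", "save-citation", True, 58.0),
]

completed = [t for t in trials if t[2]]

# Effectiveness: what fraction of attempted tasks were completed.
effectiveness = len(completed) / len(trials)

# Efficiency: mean time on task, counting only successful attempts.
efficiency = sum(t[3] for t in completed) / len(completed)

print(f"Effectiveness: {effectiveness:.0%}")
print(f"Efficiency: {efficiency:.1f} s per completed task")
```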

Readings:

Tognazzini, Bruce. “First Principles of Interaction Design.” Last accessed 7 October 2009.

This week: Google, Bing, and ‘bawdy houses’

I’m going to look at a few articles from the last week that I’ve been meaning to write about.

Firstly: 

I really have nothing intelligent to say about this, besides how wacky it is that 21st-century legislation in Canada still refers to locations in which prostitution takes place as “common bawdy houses”. 

***

This week Microsoft launched Bing, its official attempt at going head-to-head with the omnipresent (potentially omniscient?) Google. Bing includes a search engine, mapping software to compete with Google Maps, and even 411 business/business-category searches. I got the following in another email at work (this tends to happen a lot):

Lost in all the excitement around today’s public preview launch of Bing, Microsoft’s new search engine, was the subsequent launch of Bing 411. This is a direct swipe at another Google product, GOOG-411.

Both are free and both use speech-to-text technology and voice recognition to completely automate directory assistance calls. GOOG-411 (1-800-466-4411) has been going for a while, and is surprisingly intuitive. It keeps adding features like nearby intersections.

My take on this? Bing is a Google clone. Others have tried to compete with Google in the past, and all have failed miserably. But if anyone has the money and the influence to face off against Google, it’s Microsoft.

***

Speaking of Google and searching, here’s an interesting article by the nearly-as-omnipresent (at least in matters webby-and-rights-wise) Cory Doctorow in this week’s Guardian.

Cable and Broadband v. Copper

Received this article in an email at work. 

Congress has asked the Federal Communications Commission to develop a national policy for broadband deployment. But it may be more important to think through how the country will handle the aging and increasingly less relevant copper phone network.

While I have to agree with most of what Saul Hansell (the reporter) writes, I’m glad that the US is at least developing an official policy for broadband.  Since there are no regulations in place to legislate broadband voice service, the cable and IP companies like Vonage and Cablevision don’t suffer under the same restrictions imposed on the telcos.  It gives them an unfair advantage in what’s already a cutthroat industry (and, as the article effectively portrays, one whose once-solid business is now on decidedly shaky ground).  Even if the official policy is designed to promote broadband to consumers, setting ground rules for the cablecos to follow certainly can’t hurt.

Telephone companies need to invest in new infrastructure if they want to compete with cable and IP, and that means replacing old copper lines with fiber optics. The problem is finding money to invest in a diminishing business (inevitably diminishing, as wireless strengthens its hold on the North American market); for investors it seems counter-intuitive to pour millions of dollars into a new, more reliable landline network for a legacy service that doesn’t promise a return on investment. Up here, for instance, TELUS has no problem finding funds to develop its 3G mobile network, but whenever the notion of revamping its existing landline infrastructure is quietly brought up, it gets firmly quashed.

If the telcos can’t find a way to compete, I’m afraid Hansell’s prediction may come true:

What good will it do for the F.C.C. to come up with a spiffy new plan to get faster, cheaper broadband to more people if the phone companies fail and millions of people won’t be able to dial 911 in an emergency?

The Internet: Fundamentally transforming our brains

((As an aside: I’m watching the Oscars, and Wall-E just won best animated feature.))

The Guardian | The digital revolution risks changing the way we think

Jackie Ashley

…is the new way of social interaction actually changing the brains, and indeed the minds of a generation, and if so, what might that mean?

… We know from neuroscience that the brain constantly changes, physically, depending on what it perceives and how the body acts. So Greenfield suggests that the world of Facebook is changing millions of people, most of them young.

Greenfield argues that a shorter attention span, a disregard for consequences in a virtual everything-is-reversible world ((I bet sirdavid wishes it really was an everything-is-reversible world… sadly, not always the case)), and most importantly, a lack of empathy may be among the negative changes. She quotes a teacher who told her of a change in the ability of her pupils to understand others. She points out that irony, metaphor and sympathy can all be lost in cyberspace. ((I would disagree with this point; while maybe some of the nuances of the verbal medium are lost, everything that can be communicated textually can be communicated in cyberspace. So are we saying that metaphor and irony are harder to express in text? Oh, well sure, naturally that makes sense. What?))

There is also the question of identity. An intern working for Greenfield told her: “In a world where private thoughts and feelings are posted on the internet for all to see, it’s hard to see where ourselves finish and the outside world begins.” Where is the long-term narrative in a life reduced to a never-ending stream of bite-sized thoughts ((each roughly 140 characters in length))? Even clever writers end up “twittering” a burble of banalities.

… Digital culture has brought us a wider conversational democracy (good), which suffers from short attention span and is too self-referential (less good). There is no answer to this. The new world is here to stay. It is part of who we are becoming and how our minds are adapting. If you opt out of it, you cut yourself off…

On the broader points (that digital communication is transforming the way we think, and that the culture of the internet, in the main, promotes a shorter attention span and a penchant for self-reference), I agree with Ashley. The article is rather circular, though: it argues that the most effective communication is physical and face-to-face, and that current global events demand more honest “IRL” debate, without coming right out and saying that the Internet is seriously compromising the chances of this (I leave it to you whether that’s actually the case; I don’t think so). And while Ashley compliments Susan Greenfield for raising the long-term effects of digital social media, she never gives those effects any serious consideration: what will happen when a world that can only communicate in 140-character blurbs needs to solve problems as important as global warming, national security, and energy use?

But then, I’m not totally convinced that the digital revolution has the downside Ashley and Greenfield are suggesting. Short attention span, sure, but what about the thousands of bloggers out there who share their opinions daily, often carefully thought-out and meticulous, in cyberspace? What about video-bloggers on YouTube, and podcasters on iTunes? …I think the big issue at stake, the most important part of us affected by the use of social media, is that of identity. It’s too early to tell whether we’re looking at something greatly beneficial or mostly detrimental, but it’s important to recognize (and I think Ashley agrees) that whatever the changes are, we are changing the way we think by using this technology.

Anyway. To end as I began:

http://www.youtube.com/watch?v=UblUO0LjPUg

Agnotology – You mean ignorance isn’t bliss?

WIRED | Clive Thompson on How More Info Leads to Less Knowledge

Normally, we expect society to progress, amassing deeper scientific understanding and basic facts every year. Knowledge only increases, right?

Robert Proctor doesn’t think so. A historian of science at Stanford, Proctor points out that when it comes to many contentious subjects, our usual relationship to information is reversed: Ignorance increases.

He has developed a word inspired by this trend: agnotology. Derived from the Greek root agnosis, it is “the study of culturally constructed ignorance.”

I had flashbacks to Neil Postman as I read this article. A different yet related concept is information glut (or, perhaps more commonly, “information overload”): the notion that past a certain point, the more information accumulates, the more chaos, uncertainty, and ignorance there is, rather than order, clarity, and knowledge (Technopoly). As I recall, Postman posited this as something inevitable rather than driven by an actual desire to sow disinformation. He uses a story from Plato’s Phaedrus about the discovery of writing to illustrate the point. The god Theuth presents writing to the Egyptian King Thamus and describes how, by teaching it to his people, it would be “a sure receipt for memory and wisdom”. Thamus is less than enthused. He says to the god:

The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality. (371-372)
