Posts Tagged ‘technology’

New term, new posts

So it’s that time of year again. The start of the new term means new courses, new projects, and new posts on the blog.

A description of what to expect:

After wrapping up my study of social media use at the reference desk at Grant MacEwan University, I’ll be conducting a similar study with librarians at the University of Alberta. This project represents roughly a third of the work I’ll be completing over the next few months, as well as a significant chunk of the research I intend to use for my thesis. This study, ostensibly, is framed within an LIS course entitled “Advanced Research Methods”, where (mainly) thesis students in the program form a support group to get through the early phases of their thesis research. I’m actually pretty excited about this project, particularly since I’m going into it with findings from my summer study. I expect to post one or two updates over the course of the term, at the least.

I’ll also be taking a course on Reference Services. Not sure if that’ll actually make it on the blog in any form, but it’s worth mentioning insofar as it’s something I’ll be preoccupied with.

The course I’m most anticipating, and the one that will definitely be featured in most of this Fall’s blog posts, is a directed reading called “Video Game Criticism”. For this, in addition to a ton of self-assigned readings, I’ll be playing Dragon Age 2 and subsequently writing a critical analysis of the game. The basis for this course – unlike most video game courses, which tend to focus on design and production – is summarized by Ian Bogost in his introduction to Unit Operations:

…similar principles underlie both contemporary literary analysis and computation. I will use this commonality to analyze a field of discursive production that has yet to find an authoritative place in either world – videogames. […] A practical marriage of literary theory and computation would not only give each field proper respect and attention from its counterpart, but also create a useful framework for the interrogation of cultural artifacts that straddle these fields.

In other words, I’m interested in developing a model or framework for studying video games that is analogous to how we perform literary criticism. As both an English student and a video game enthusiast (not to mention a digital humanist), the most urgent question is why I haven’t thought of doing a directed reading like this before.

The chief component of the directed reading – like last Winter’s directed study in social media and Knowledge Management – is to maintain journal entries (read: blog posts) about my progress in and thoughts on the game, and my synthesis of related readings about game design and theory.

In addition to the more formal journal entries (or “response papers”), I would like to start using this as a personal blog once more. I plan on at least making the attempt; in the past I’ve never been able to consistently keep that sort of thing up.

On a final note, a former professor of mine emailed me a link to this blog post about defining the digital humanities. Imagine my surprise in discovering that my own definition of DH, supplied for 2011’s Day of Digital Humanities, was prominently cited. As a mere graduate student, I feel sheepish about “eschewing disciplinary rigor”, adroitly or not (who am I to fight convention, after all?), but proud all the same that I apparently managed to “capture the spirit” of the DH community.

I must tip my cap to Eric Forcier, whose reply adroitly eschews disciplinary rigor in favor of admirably capturing the spirit of the DH community—especially in painting DH as an ephemeral, seemingly idiosyncratic curiosity that either attracts or repels people, and often changes them fundamentally:

When I first applied to this grad program, my understanding of what DH was all about was crystalline in its purity. Not so today. My idea of DH is that it’s sort of like a highway oil slick on a sunny day. When you look at the slick, depending on the angle, you might get a psychedelic kaleidoscope of reflected colours; if you’re lucky you might spot your reflection in it; then again, all you might see is darkness. And if you feel compelled to step in it, don’t be surprised if you slip. Those stains will not come out. -Eric Forcier, University of Alberta, Canada

I’ll try not to let it go to my head.

Assessing Social Media – Methods

I have written about various social media and web technologies as they relate to knowledge management (KM), and as they are discussed in the literature.  But I haven’t really touched on how the literature approaches measuring the application and success of such technologies in an organizational context.  Prusak notes that one of the priorities of KM is to identify the unit of analysis and how to measure it (2001, 1004).  In this review paper I will examine some of the readings that have applied this question to social media. For the sake of consistency, the readings I have chosen deal with the assessment of blogs for the management of organizational knowledge, but all of the methods discussed could be generalized to other emerging social technologies.

Grudin indicates that most past attempts at developing systems to preserve and retrieve knowledge failed because digital systems required information to be represented explicitly, while most knowledge is tacit: “Tacit knowledge is often transmitted through a combination of demonstration, illustration, annotation, and discussion.” (2006, 1) But the situation, as Grudin explains, has changed—“old assumptions do not hold…new opportunities are emerging.” (ibid.) Digital storage is no longer sold at a premium, allowing the informal and interactive activities used to spread tacit knowledge to be captured and preserved; emerging trends such as blogs, wikis, and ever more efficient search engines, along with the social networks such as Twitter and Facebook that have come to dominate the Internet landscape, open up a multitude of ways in which tacit knowledge can be digitized.

In his analysis of blogs, Grudin identifies five categories (2006, 5):

– Diary-like, or personal, blogs, which develop the skill of engaging readers through personal revelation;

– A-list blogs by journalists and high-profile individuals, which serve as a source of information on events, products, and trends;

– Watchlists, which track references across a wide selection of sources and reveal how a particular product, organization, name, brand, or topic is being discussed;

– Externally visible employee blogs, which provide a human face for an organization or product, offsetting the potential legal and PR risks for a corporation;

– Project blogs, internal blogs that focus on work and serve as a convenient means of collecting, organizing, and retrieving documents and communication.

Lee et al. make a similar move in categorizing the types of public blogs used by Fortune 500 companies (2006, 319):

Employee blogs (maintained by rank-and-file employees; vary in content and format)

Group blogs (operated by a group of rank-and-file employees; focus on a specific topic)

Executive blogs (feature the writings of high-ranking executives)

Promotional blogs (promoting products and events)

Newsletter-type blogs (covering company news)

Grudin does not conduct any formal assessment of blogs, except to provide examples of project blogs and, drawing on his personal experience, to identify the technical and behavioral characteristics that allowed that particular sub-type to succeed (2006, 5-7). Lee et al.’s approach to assessing blogs involves content analysis of 50 corporate blogs launched by the 2005 Fortune 500 companies (2006, 322-23). In addition to the categories above, Lee et al. identified five distinct blogging strategies based on their findings, which broadly fall under two approaches (321):

Bottom-up, in which all company members are permitted to blog, and each blog serves a distinct purpose (not necessarily assigned by a higher authority)[1];

Top-down, in which only select individuals or groups are permitted to blog, and the blogs serve an assigned purpose that rarely deviates between blogs.

As the names suggest, greater control of information is exercised in the top-down approach, while employee bloggers in companies adopting the bottom-up approach are given greater autonomy.
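As a rough illustration of how this kind of categorizing and counting can be operationalized, here is a minimal sketch in Python. The company names, labels, and counts are entirely hypothetical (not drawn from Lee et al.’s sample); the point is simply the coding-and-tallying step that underlies a content analysis like theirs:

    from collections import Counter

    # Hypothetical coded sample: each blog has been manually assigned one of
    # Lee et al.'s five types and one of the two strategy approaches.
    coded_blogs = [
        {"company": "Acme",     "type": "employee",    "approach": "bottom-up"},
        {"company": "Globex",   "type": "executive",   "approach": "top-down"},
        {"company": "Initech",  "type": "promotional", "approach": "top-down"},
        {"company": "Umbrella", "type": "group",       "approach": "bottom-up"},
        {"company": "Hooli",    "type": "newsletter",  "approach": "top-down"},
    ]

    # Tally how often each blog type and each strategy approach occurs.
    type_counts = Counter(blog["type"] for blog in coded_blogs)
    approach_counts = Counter(blog["approach"] for blog in coded_blogs)

    print("Blog types:", dict(type_counts))
    print("Strategies:", dict(approach_counts))

A real content analysis also involves developing the coding scheme and checking how reliably different coders apply it; the snippet shows only the counting step.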

Huh et al. developed a unique approach in their study of BlogCentral, IBM’s internal blogging system (2007). The study combined interviews with individual bloggers about their blogging practices and content analysis of their blogs. Based on this data, they were able to measure two characteristics of blogs: the content (personal stories/questions provoking discussion/sharing information or expertise) and the intended audience (no specific audience/specific audience/broad audience). (A toy cross-tabulation of these two dimensions is sketched after the list below.) These findings revealed four key observations:

– Blogs provide a medium for employees to collaborate and give feedback;

– Blogs are a place to share expertise and acquire tacit knowledge;

– Blogs are used to share personal stories and opinions that may increase the chances of social interaction and collaboration;

– Blogs are used to share aggregated information from external sources by writers who are experts in the area.
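To make those two measured dimensions concrete, here is a toy cross-tabulation in Python; the example posts and labels are invented rather than taken from the BlogCentral data:

    from collections import Counter

    # Invented coded posts: each post gets a content type and an intended
    # audience, mirroring the two characteristics Huh et al. measured.
    posts = [
        {"content": "personal story",          "audience": "no specific audience"},
        {"content": "question for discussion", "audience": "specific audience"},
        {"content": "information/expertise",   "audience": "broad audience"},
        {"content": "information/expertise",   "audience": "broad audience"},
    ]

    # Cross-tabulate content type against intended audience.
    crosstab = Counter((p["content"], p["audience"]) for p in posts)

    for (content, audience), count in sorted(crosstab.items()):
        print(f"{content:25} x {audience:22}: {count}")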

Rodriguez examines the use of WordPress blogs in two academic libraries for internal communication and knowledge management at the reference desk (2010). Her analysis measures the success of these implementations using diffusion of innovation and organizational lag theories. Rogers’ Innovation Diffusion Theory establishes five attributes of an innovation that influence its acceptance in an organizational environment: relative advantage, compatibility, complexity, trialability, and observability (2010, 109). Meanwhile, organizational lag identifies the discrepancy between the adoption of technical innovation—i.e. the technology itself—and administrative innovation—i.e. the underlying administrative purpose(s) for implementing the technology, usually representing a change in workflow to increase productivity. In analyzing the two implementations of the blogging software, Rodriguez discovers that both libraries succeeded in terms of employee adoption of the technical innovation, but failed with the administrative innovation. This was due specifically to the innovation having poor observability: “the degree to which the results of the innovation are easily recognized by the users and others” (2010, 109, 120). The initiators of the innovation in both cases did not “clearly articulate the broader administrative objectives” and “demonstrate the value of implementing both the tool and the new workflow process.” (2010, 120) If they had done so, Rodriguez suggests, the blogs might have been more successful.

While all of these studies approached blogging in different ways—project blogs, external corporate blogs, internal corporate blogs, and internal group blogs—and measured different aspects of the technology—what it is, how it is used, whether it is successful—they reveal a number of valuable approaches to studying social media in the KM context. Categorization, content and discourse analysis, interviews, and the application of relevant theoretical models are all compelling methods for assessing social media and web technologies.



[1] One of the valuable contributions of Lee et al.’s study is that it also identifies the essential purposes for which corporate blogs are employed, including product development, customer service, promotion, and thought leadership. The notion of ‘thought leadership’ in particular, as a finding of their content analysis, is worth exploring; ‘thought leadership’ suggests that the ability to communicate innovative ideas is closely tied to natural leadership skills, and that blogs and other social media (by extension) can help express these ideas. Lee et al.’s findings also suggest that ‘thought leadership’ in blogs builds the brand, or ‘human’ face, of the organization, while acting as a control over employee blogs, as evidenced by the fact that it is found primarily in blogs that employ a top-down strategy.


Bibliography

Grudin, J. (2006).  Enterprise Knowledge Management and Emerging Technologies. Proceedings of the 39th Hawaii International Conference on System Sciences. 1-10.

Huh, J., Jones, L., Erickson, T., Kellogg, W.A., Bellamy, R., and Thomas, J.C. (2007). BlogCentral: The Role of Internal Blogs at Work. Proceedings of CHI EA 2007, April 28-May 3, San Jose, CA. 2447-2452. doi:10.1145/1240866.1241022

Lee, S., Hwang, T., and Lee, H. (2006). Corporate blogging strategies of the Fortune 500 companies. Management Decision, 44(3), 316-334.

Prusak, L. (2001). Where did knowledge management come from? IBM Systems Journal, 40(4), 1002-1007.

Rodriguez, J. (2010). Social Software in Academic Libraries for Internal Communication and Knowledge Management: A Comparison of Two Reference Blog Implementations. Internet Reference Services Quarterly, 15(2), 107-124.

A Nostalgic Look Back: Cloning

Whatever happened to cloning?

No, no, this is a legitimate question.  I remember about ten years ago, maybe a little bit more than that, there was a buzz around ‘cloning’ as the next big scientific development.  I was in high school at the time, and I recall devouring every news story about Dolly, the first mammal cloned from an adult cell, that I could get my hands on.  I imagined a future in which the tiniest bit of our genetic material could be used to replicate life, and pondered the murky ethics that arose from this.  And then time passed, and the whole craze just sort of faded away.

I was reminded of this while reading Robert Pepperell’s 2003 edition of The Posthuman Condition: Consciousness Beyond the Brain in preparation for my term paper.  In the preface, Pepperell mentions, with much urgency, developments in the field of genetics, and cloning specifically, and what these might mean for the re-definition of the ‘human’.  He references in particular a 2002 article in the Sunday Times about the imminence of the first successful human cloning (I’m fuzzy on this point, but I suspect my lack of memory suggests it wasn’t as successful or as imminent as Pepperell claims).

So my question is this:  What happened to all the hype about cloning?  Would it have featured prominently in my Posthumanism course had it been offered eight years ago?  Is it strange that cloning hasn’t even gotten the merest mention in class?

Record-Keeping Processes for Child Care Program Inspectors

The Alberta Auditor General’s October report revealed that Alberta Children’s Services’ record-keeping was inconsistent and made it impossible to determine whether child care programs were meeting provincial standards (Kleiss, 2010).  The report finds that the problem lies in the documentation (or lack thereof) of “low-risk” breaches of existing standards, which are often handled with a “verbal warning” or recorded inconsistently by inspectors.  The Auditor General quite correctly notes that this poses a safety risk to the children in these programs (2010, 33-34).

My first question is why there is no consistency already in how inspectors report breaches.  One would hope that the people filling such an important role in our society, those responsible for holding accountable the services that provide our child care, would function like a well-oiled machine.  Is it a lack of training, i.e. human error?  Is it a poorly designed reporting system, i.e. system error?  How does this happen?

There are clearly existing procedural regulations in place for inspectors.  The Auditor’s report acknowledges the following about the inspection activity:

Authorities’ licensing officers inspect these programs at least twice a year and inspect in response to complaints and program reported critical incidences such as child injury. If a program is not complying with regulatory requirements, through delegation from the Statutory Director for Child Care, a licensing officer may:

  • issue a verbal warning to correct non-compliance
  • issue an order to remedy non-compliance
  • impose conditions on a license
  • vary a provision of a license
  • suspend a license and issue a probationary license
  • cancel a license

Enforcement action will vary depending on the severity of the non-compliance. Low risk non-compliance may warrant more serious enforcement action if frequently repeated or identified as part of a pattern of ignoring requirements. (35)

It goes on to describe how the lack of records tracking verbal warnings, as well as inconsistency in acquiring and providing evidence of non-compliance when issuing an order to remedy, complicated the task of following up in cases where further action would be required.  It is not clear if this type of documentation has always been inconsistent, but the Auditor’s report made three recommendations:

1. Review and improve documentation and training to ensure all program requirements are being met.

2. Improve the consistency of monitoring by correcting systems that ensure compliance with processes.

3. Improve follow-up processes by ensuring that all verbal warnings are adequately documented and resolved.

While the report seems to me the perfect example of why good records management is critical, it made me wonder how the office of the Auditor General conducted its audit, and how it came to draw the conclusions it did.  According to the report:

Our audit procedures included reviewing relevant legislation, standards, policies and procedures, interviewing senior staff at the Department and five Authorities, shadowing licensing officers as they inspected programs, reviewing inspection reports, and examining the Department’s Child Care Information System (CCIS). (33)

The mention of CCIS, an electronic records management system (ERMS), made me wonder precisely what sort of information was recorded in it.  After all, CCIS must be counted among the “systems” and “processes” identified in the recommendations.  The report (which, if you have not realized by now, is remarkably thorough) describes CCIS as follows, in the context of recommendation 1:

Authorities record enforcement actions in CCIS, link it to the corresponding regulation, and do some analysis of that data. However, more detailed trend analysis of this data may reveal the location, timing, and types of non-compliance, as well as help in planning future monitoring or training actions. For example, in our sample, we identified a pattern across Alberta of non-compliance with a requirement for maintaining portable emergency records. This could indicate a need for training or stricter enforcement action in this area. (36)
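It is easy to imagine what such a trend analysis might look like in practice. The following Python sketch uses entirely hypothetical enforcement records and field names (not the real CCIS schema) to show how non-compliance could be tallied by requirement and by region:

    from collections import Counter

    # Hypothetical enforcement records; the field names are assumptions,
    # not the actual CCIS data model.
    records = [
        {"region": "Edmonton", "requirement": "portable emergency records", "action": "verbal warning"},
        {"region": "Calgary",  "requirement": "portable emergency records", "action": "order to remedy"},
        {"region": "Edmonton", "requirement": "staff certification",        "action": "verbal warning"},
        {"region": "Calgary",  "requirement": "portable emergency records", "action": "verbal warning"},
    ]

    # Tally non-compliance by requirement, and by requirement within region,
    # to surface patterns like a province-wide emergency-records issue.
    by_requirement = Counter(r["requirement"] for r in records)
    by_region = Counter((r["region"], r["requirement"]) for r in records)

    print(by_requirement.most_common())
    print(by_region.most_common())

A tally like this only matters, of course, if the underlying actions are actually entered into the system in the first place, which is exactly the gap the report identifies with verbal warnings.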

The report also identifies, in the context of recommendation 3, that while “Orders to Remedy” were consistently entered in CCIS, “verbal warnings” were not, and there was no way to tell if any remedial action was taken in cases where verbal warnings were given.

Other “records” in this story worth noting:

  • The legislation, provincial standards, and statutory requirements that govern child care in Alberta
  • The Auditor General’s Report itself (and what it says about the governmental review process in our province)

(apologies for the unoriginal title)

________________________

References

Auditor General Alberta. (2010, October). Report of the Auditor General of Alberta—October 2010.  Retrieved on November 15, 2010 from http://www.oag.ab.ca/files/oag/OAGOct2010report.pdf

Kleiss, K.  (2010, October 28).  Better Paperwork Expected of Daycare Inspectors.  EdmontonJournal.com. Retrieved on November 15, 2010 from http://www.edmontonjournal.com/life/Better+paperwork+expected+daycare+inspectors/3737004/story.html

A Matter of Security

The big story over the weekend was about John Tyner, a software engineer who refused the TSA body scan and pat-down at the San Diego airport, and was subsequently removed from the airport and threatened with a fine of as much as $10,000 for being uncooperative.  What makes this a big story is the fact that Tyner recorded the entire incident on his cell phone and then posted it on YouTube; he also wrote a full account on a blog using the moniker “johnnyedge”[1].  The video and blog have gone viral in the 48 hours since the incident took place, with the YouTube video receiving over 200,000 hits.

There is quite a lot going on in this story that is worth examining.  First off, the relatively new practice of using backscatter x-ray scanners, and the TSA’s policy of administering a full pat-down to any passenger who opts out of the scan, have been under fire since they were first introduced.  Several stories have surfaced in the last year regarding the new technology, though none quite so markedly as Tyner’s.  One of the concerns raised was whether or not the body scan images were saved and stored [2]; the TSA confirmed in August that this was not the case, although it continues to be an issue raised in the argument against the body scans.  The issue does raise the question of what precisely happens to the images.  How do the scanners work?  Is there no memory that stores images, even in the short term?  What if the scan does reveal someone in possession of something nefarious?  Doesn’t the scan represent evidence?  Surely there must be some system in place to preserve the image when this happens—if not, the technology does not seem particularly effective.  And if there is, the question is whether or not such a system violates the human rights of passengers.

I bet the TSA is rather unhappy right now, given the rising tidal wave of public discontent it is facing.  I’ve written a lot about web content as records in this journal, so I won’t over-emphasize it now, but clearly the video/audio record Tyner preserved and uploaded to the Internet will impact the TSA’s operations—the extra time and labour spent dealing with uncooperative passengers, navigating the negative press, and correcting its policies and procedures will translate directly into dollar amounts.  As one article on Gizmodo suggests, there is a lot of money for manufacturers and lobbyists in the implementation and use of the new body scanners [3]; there’s a lot of money at stake if their adoption is stymied by bad press and public outrage.  And why?  Because one person recorded this activity and made the record public.

A movement in the US has grown around the rejection of the body scan technology and the TSA’s policies.  The website “I Made the TSA Feel my Resistance” has gone up, and is calling for “National Opt-Out Day” on November 24—the busiest day of the year for air travel.  It encourages passengers to refuse the body scan when they go through security. [4]

While I’ve always been sympathetic to the challenging (let’s face it—impossible) task of providing airport security, I think Tyner’s use of records and the web is useful in one very important way.  It forces us to ask: In what way does the body scan technology protect passengers?

____________________________________

[1] The original blog post and videos are available here: http://johnnyedge.blogspot.com/2010/11/these-events-took-place-roughly-between.html

An article by the Associated Press about the story’s popularity can be viewed here: http://www.mercurynews.com/breaking-news/ci_16617995?nclick_check=1

A blog post on the CNN Newsroom website by one of the network’s correspondents can also be viewed here: http://newsroom.blogs.cnn.com/2010/11/15/dont-touch-my-junk/?iref=allsearch

[2] The issue of whether the images are stored or not was first raised last January, as represented in this article on CNN.com: http://articles.cnn.com/2010-01-11/travel/body.scanners_1_body-scanners-privacy-protections-machines?_s=PM:TRAVEL

The TSA refuted these claims at the time on their blog: http://blog.tsa.gov/2010/01/advance-imaging-technology-storing.html

The issue again made headlines in August with the following article on cnet: http://news.cnet.com/8301-31921_3-20012583-281.html

Which the TSA again refuted: http://blog.tsa.gov/2010/08/tsa-response-to-feds-admit-storing.html

[3] Loftus, J.  (2010, November 14).  TSA Full-Body Scanners: Protecting Passengers or Padding Pockets?  Gizmodo. Retrieved on November 15, 2010 from http://gizmodo.com/5689759/tsa-full+body-scanners-protecting-passengers-or-padding-pockets

This article also effectively summarizes the current controversy surrounding Advanced Imaging Technology (AIT).

[4] http://www.imadethetsafeelmyresistance.com/

Tying reminder strings on my digital digits

This is a housekeeping post, mainly for my own benefit, but also as a teaser for my loyal reader(s) (hi Mom).

The anticipated records management post about the Edmonton City Centre Airport plebiscite is temporarily on hold, since I’m considering turning it into a larger project (that is, the term paper assignment for the same records management course).  I am eager to get started on the journal entry assignments, however, so I’ve rustled up a few stories that may serve as interesting fodder for my consideration:

Bank of Canada Unclaimed Balances and the Edmonton Journal/Alberta database

Brent Wittmeier’s blog: http://brentwittmeier.com/2010/09/19/unclaimed-bank-accounts-interview-on-rob-breakenridge-show/

Bank of Canada unclaimed balances service: http://ucbswww.bank-banque-canada.ca/scripts/search_english.cfm

Edmonton Journal database for Alberta bank records:  http://www.edmontonjournal.com/news/unclaimed-bank-accounts/index.html

Facebook breaches of privacy laws, and their actions to remedy the situation

CBC News story, Facebook privacy changes approved by watchdog: http://www.cbc.ca/canada/story/2010/09/22/facebook-privacy-commissioner.html

The recent discovery of lost British teleplays at the Library of Congress

LOC blog: http://blogs.loc.gov/loc/2010/09/by-jove-its-a-video-treasure-trove/

Are Digital Humanists Relevant?

On October 7, Distinguished Visitor Dr. Howard White presented “Defining Information Science” as part of the SLIS colloquia.

He began his presentation by offering several traditional definitions of information science (Rubin, 2004; Hawkins, 2001; Borko, 1968), as well as Wikipedia’s definition as an illustration of how difficult the term is to pin down, before offering his own, much simpler definition:

[Information Science is] The study of literature-based answering.

Given that he was speaking to a room full of future librarians, White elaborated on what that meant in the context of reference librarianship.  The reference librarian should be able to provide relevant answers to “relevance-seekers” (library users) by giving truthful, novel, on-topic, specific, understandable, and timely answers (in that order).  Librarians should be better equipped than Google to filter relevance for a given question; their “equipment” is “literatures” – that is, the library collection.  It’s possible to shorten White’s answer even further: information science is the study of relevant answers, or simply relevance, given that relevance implies (a) a system (“literatures”) and (b) requirements for answers (truthfulness, novelty, topicality, specificity, understandability, and timeliness).
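As a thought experiment (mine, not White’s), those ordered requirements can even be expressed as a tiny ranking function over candidate answers drawn from a collection, with earlier criteria taking priority over later ones. The candidates and scores below are invented for illustration:

    # Toy sketch of "literature-based answering": candidate answers are ranked
    # by White's criteria in his stated order of priority.
    CRITERIA = ["truthful", "novel", "on_topic", "specific", "understandable", "timely"]

    candidates = [
        {"source": "Article A", "truthful": 1, "novel": 0, "on_topic": 1,
         "specific": 1, "understandable": 1, "timely": 1},
        {"source": "Article B", "truthful": 1, "novel": 1, "on_topic": 1,
         "specific": 0, "understandable": 1, "timely": 0},
    ]

    # Sorting on the tuple of scores makes earlier criteria dominate later ones.
    ranked = sorted(candidates, key=lambda c: tuple(c[k] for k in CRITERIA), reverse=True)

    for c in ranked:
        print(c["source"], [c[k] for k in CRITERIA])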

What struck me as most interesting, however, were the parallels between White’s librarian/information scientist and the digital humanist.  A digital humanist is, after all, essentially interested in seeking and supplying relevant answers by searching ‘literatures’ with the use of computational methods (Hockey, 2004). Does that make the digital humanist an information scientist?  And does that make the information scientist a digital humanist?

Works cited

Borko, H. (1968). “Information science: what is it?” American Documentation, 19(1).

Hawkins, D.T. (2001). “Information science abstracts: tracking the literature of information science.  Part 1: definition and map.” Journal of the American Society for Information Science and Technology, 52.

Hockey, S. (2004).  “History of Humanities Computing.”  A Companion to Digital Humanities, ed. Susan Schreibman, Ray Siemens, John Unsworth.  Oxford: Blackwell, 2004.

Rubin, R. E. (2004).  Foundations of Library and Information Science. 2nd ed.  New York: Neal-Schuman Publishers Inc.