Archive for the ‘Media and News’ Category

Around the World Symposium on Digital Culture

Wednesday, May 29th, 2013

Tomorrow we are holding an Around the World Symposium on Digital Culture. The symposium brings together scholars from different countries to talk about digital culture over roughly 17-20 hours, streaming their talks and discussions as it moves from place to place. It is being organized by the Kule Institute for Advanced Study here at the University of Alberta. Visit the site to see the speakers and to tune in.

Please join in using the Twitter hashtag #UofAworld

U. of Virginia Teams Up With ‘Crowdfunding’ Site

Tuesday, May 21st, 2013

Mike linked me to a Chronicle Bottom Line blog story about how U. of Virginia Teams Up With ‘Crowdfunding’ Site to Finance Research. UVa is teaming up with USEED, a company that has built a “fundraising platform [that] taps the power of social networks and the voice of your students to engage alumni and win new donors…” USEED is unlike Kickstarter in that it creates a unique site for each university rather than forcing them to compete on the same site. It is closer to the FutureFunder.ca site for Carleton.

USEED is an example of a company experimenting with “social entrepreneurship”, a gray area between for-profit and not-for-profit work. The Chronicle also has a story on the ambiguities of social entrepreneurship. At times it seems like there are a lot of startups circling universities trying to figure out how to feed on our antiquated corpse.

CIFAR: Renewing their vision

Monday, April 22nd, 2013

Today I went to a meeting about the Canadian Institute for Advanced Research (CIFAR) in the hopes that they might have programs in the humanities. They do and they don’t.

One new initiative they have that is open to humanists is their global call for ideas. The call is open to anyone:

Do you have a question with the potential to change the world?

A number of their programs, like Successful Societies, Social Interactions, Identity & Well-Being, and Institutions, Organizations & Growth, seem to have humanists and social scientists involved, even if the issues they address aren’t central to the humanities.

In recognition of the absence of humanities programs they started a Humanities Initiative in 2009. Alas, it hasn’t yet developed any programs we could participate in. Here is some history:

In their 2009-2010 Annual Performance Report they state:

CIFAR organized a discussion with senior humanities researchers drawn from institutions across North America in May 2009 about the role CIFAR could play in supporting advanced research in the humanities. The meeting participants recommended the creation of an ad hoc Steering Committee that would undertake the process of identifying in detail how CIFAR should approach and support advanced humanities research. This Steering Committee met in December 2009, and following a telephone conference in April 2010 recommended that the Institute proceed with several pilot projects in the next year. Work on refining these projects and identifying task force members was underway by June 2010.

In the 2010 Final Report, CIFAR Performance Audit and Evaluation, the evaluators note:

CIFAR’s Strategic Plan notes that the growth of its programs in the social sciences and humanities has not kept pace with growth in the natural sciences. CIFAR is, consequently, examining how its research model might be adapted to research in these disciplines with a specific focus in this five-year period on the humanities.

It is now 2013 and it seems the steering group recommended two pilot projects, neither of which seems to have done more than meet.

Pekka Sinervo, who presented here, suggested that it is hard to find examples of sustained conversations around a single question in the humanities of the sort that CIFAR supports. He challenged me to find examples they could use as models. Perhaps there isn’t a tradition of think tanks in the humanities? Perhaps senior humanists, of the sort CIFAR has recruited, are more solitary scholars who just can’t get excited about getting together to talk about ideas? Perhaps the humanities have lapsed into Cartesian solipsism – we think, we are, but alone.

I personally think CIFAR should restart and rethink their Humanities Initiative. If they are finding it hard to get humanists engaged in the ways other fields are, then try something different. I would encourage them to look at some examples from the digital humanities that have demonstrated the capacity to initiate and sustain conversations in innovative ways:

  • The Humanities and Technology Camp (THATCamp) is an extremely successful example of an open and inclusive form of conversation. The Mellon Foundation funds this initiative, which supports inexpensive “unconferences” around the world.
  • Networked Infrastructure for Nineteenth-century Electronic Scholarship Online (NINES) is a reinvented scholarly association that was formed to support old and new media research. This is not an elite exclusive community, but a reimagined association that is capable of recognizing enquiry through digital scholarship.
  • The Day of Digital Humanities is a sustained look at the question, “Just what do digital humanists really do?” Started at U of Alberta in 2009, the latest version was run by Michigan State University’s MATRIX: The Center for Digital Humanities & Social Sciences. Other organizations have used this “Day of …” paradigm to get discussion going around issues like digital archaeology.
  • 4Humanities is a loose group that looks at how to advocate for the humanities in the face of funding challenges. With minimal funding we support local chapters, international correspondents, and various activities.

In short, there are lots of examples of sustained conversations, especially if you don’t limit yourself to a particular model. Dialogue has been central to the humanities since Plato’s Academy; perhaps the humanities should be asked by CIFAR to imagine new forms of dialogue. Could CIFAR make a virtue of the problem they face around humanities conversations?

Can you start a dialogue with the potential to change the world?

The Never-Before-Told Story of the World’s First Computer Art

Saturday, January 26th, 2013

The Atlantic has a story about The Never-Before-Told Story of the World’s First Computer Art (It’s a Sexy Dame). The image (see above) was apparently created by an IBM programmer for the SAGE system and was used as a diagnostic.

According to Tipton, the program that displayed the pin-up image was a diagnostic that tested data flow between the two SAGE computers on site (referred to as the A and B computers). At the end of every shift, as one computer was about to go offline and switch over to the other, the active machine would begin transferring flight and intercept data to the standby machine so there could be a seamless switch over.

Two switching consoles on site were used to handle this process. After running the diagnostic, Tipton describes, if the pin-up displayed correctly on the screen, then data was being transferred between the A and B computers correctly. If the image displayed improperly, then the technicians immediately knew there was a problem.

This reminds me of the story of Lena and the use of her image. Why were so many early test images drawn from porn? Does this say something about the male culture of computing in those years, that it was cool and acceptable to use pin-up pictures when you needed a graphic image?

Thanks to @manovich for this.

Big Buzz about Big Data: Does it really have to be analyzed?

Thursday, January 24th, 2013

The Guardian has a story by John Burn-Murdoch titled Study: less than 1% of the world’s data is analysed, over 80% is unprotected.

The Guardian article draws on a Digital Universe Study which reports that the “global data supply reached 2.8 zettabytes (ZB) in 2012” and that “just 0.5% of this is used for analysis”. The industry study emphasizes that the promise of “Big Data” is in its analysis,

First, while the portion of the digital universe holding potential analytic value is growing, only a tiny fraction of territory has been explored. IDC estimates that by 2020, as much as 33% of the digital universe will contain information that might be valuable if analyzed, compared with 25% today. This untapped value could be found in patterns in social media usage, correlations in scientific data from discrete studies, medical information intersected with sociological data, faces in security footage, and so on. However, even with a generous estimate, the amount of information in the digital universe that is “tagged” accounts for only about 3% of the digital universe in 2012, and that which is analyzed is half a percent of the digital universe. Herein is the promise of “Big Data” technology — the extraction of value from the large untapped pools of data in the digital universe. (p. 3)
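To put the study’s own figures in perspective, here is a quick back-of-envelope calculation (mine, not the study’s):

\[
0.5\% \times 2.8\ \text{ZB} = 0.005 \times 2.8\ \text{ZB} = 0.014\ \text{ZB} \approx 14\ \text{exabytes analyzed in 2012}.
\]

In other words, the 2012 digital universe is about 200 times larger than the slice actually being analyzed, and even the 25% the study deems potentially valuable (roughly 0.7 ZB) is about 50 times larger.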

I can’t help wondering if industry studies aren’t trying to stampede us into thinking that there is lots of money to be made in analytics. These studies often seem to come from the entities that benefit from investment in analytics. What if the value of Big Data turns out to be in getting people to buy into analytical tools and services (or be left behind)? Has there been any critical analysis (as opposed to anecdotal evidence) of whether analytics really do warrant the effort? A good article I came across on the need for analytical criticism is Trevor Butterworth’s Goodbye Anecdotes! The Age of Big Data Demands Real Criticism. He starts with,

Every day, we produce 2.5 exabytes of information, the analysis of which will, supposedly, make us healthier, wiser, and above all, wealthier—although it’s all a bit fuzzy as to what, exactly, we’re supposed to do with 2.5 exabytes of data—or how we’re supposed to do whatever it is that we’re supposed to do with it, given that Big Data requires a lot more than a shiny MacBook Pro to run any kind of analysis.

Of course the Digital Universe Study is not only about the opportunities for analytics. It also points out:

  • That data security is going to become more and more of a problem
  • That more and more data is coming from emerging markets
  • That we could get a lot more useful analysis done if there were more metadata (tagging), especially at the source. The study calls for more intelligence in the gathering devices – surveillance cameras, for example. These devices could add metadata at the point of capture, such as time, place, and even whether faces are present (a minimal sketch of what such source-tagging might look like follows this list).
  • That the promising types of data that could generate value start with surveillance and medical data.
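To make “tagging at the source” more concrete, here is a minimal, purely hypothetical sketch in Python of the kind of record a camera might emit at capture time; the CaptureRecord type and capture_frame function are my own illustrations, not anything specified in the study.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Tuple

@dataclass
class CaptureRecord:
    """One frame of footage plus the metadata added at the point of capture."""
    frame_id: str
    captured_at: str                # ISO timestamp stamped by the device itself
    location: Tuple[float, float]   # (latitude, longitude) from the device's GPS
    faces_detected: int = 0         # filled in by hypothetical on-device face detection
    tags: List[str] = field(default_factory=list)

def capture_frame(frame_id: str, lat: float, lon: float, faces: int = 0) -> CaptureRecord:
    """Attach metadata at the moment of capture so later analysis can query
    the tags instead of re-processing the raw video."""
    return CaptureRecord(
        frame_id=frame_id,
        captured_at=datetime.now(timezone.utc).isoformat(),
        location=(lat, lon),
        faces_detected=faces,
        tags=["surveillance", "faces"] if faces else ["surveillance"],
    )

if __name__ == "__main__":
    print(capture_frame("cam12-000451", 53.52, -113.52, faces=2))
```

The point of the sketch is simply that once records carry tags like these, the “analysis” the study imagines becomes a query over metadata rather than a computation over raw footage.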

Reading about Big Data I also begin to wonder what it is. Fortunately IDC (who are behind the Digital Universe Study) have a definition,

Last year, Big Data became a big topic across nearly every area of IT. IDC defines Big Data technologies as a new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture, discovery, and/or analysis. There are three main characteristics of Big Data: the data itself, the analytics of the data, and the presentation of the results of the analytics. Then there are the products and services that can be wrapped around one or all of these Big Data elements. (p. 9)

Big Data is not really about data at all. It is about technologies and services. It is about the opportunity that comes with “a big topic across nearly every area of IT.” Big Data is more like Big Buzz. Now we know what follows Web 2.0 (and it was never going to be Web 3.0).

For a more academic and interesting perspective on Big Data I recommend (following Butterworth) Martin Hilbert’s “How much information is there in the ‘information society’?” (Significance, 9:4, 8-12, 2012). One of the more interesting points he makes is the growing importance of text,

Despite the general perception that the digital age is synonymous with the proliferation of media-rich audio and videos, we find that text and still images capture a larger share of the world’s technological memories than they did before. In the early 1990s, video represented more than 80% of the world’s information stock (mainly stored in analogue VHS cassettes) and audio almost 15% (on audio cassettes and vinyl records). By 2007, the share of video in the world’s storage devices had decreased to 60% and the share of audio to merely 5%, while text increased from less than 1% to a staggering 20% (boosted by the vast amounts of alphanumerical content on internet servers, hard disks and databases). The multimedia age actually turns out to be an alphanumeric text age, which is good news if you want to make life easy for search engines. (p. 9)

One of the points that Hilbert makes that would support the importance of analytics is that our capacity to store data is catching up with the amount of data broadcast and communicated. In other words, we are getting closer to being able to store most of what is broadcast and communicated. Even more dramatic is the growth in computation. In short, available computation is growing faster than storage, and storage faster than transmission. With excess comes experimentation, and with excess computation and storage, why not experiment with what is communicated? We are, after all, all humanists who are interested primarily in ourselves. The opportunity to study ourselves in real time is too tempting to give up. There may be little commercial value in the Big Reflection, but that doesn’t mean it isn’t the Big Temptation. The Delphic oracle told us to Know Thyself and now we can in a new way. Perhaps it would be more accurate to say that the value in Big Data is in our narcissism. The services that will do well are those that feed our Big Desire to know more and more about our (recent) selves, both individually and collectively. Privacy will be trumped by the desire for analytic celebrity where you become your own spectacle.

This could be good news for the humanities. I’m tempted to announce that this will be the century of the BIG BIG HUMAN. With Big Reflection we will turn on ourselves and consume more and more about ourselves. The humanities could claim that we are the disciplines that reflect on the human, and that analytics are just another practice for doing so, but to make that claim we might have to look at what is written in us or start writing in DNA.

In 2007, the DNA in the 60 trillion cells of one single human body would have stored more information than all of our technological devices together. (Hilbert, p. 11)
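As a rough sanity check of that claim (my own, not Hilbert’s), take the diploid human genome at about 6.4 billion base pairs, two bits per base, and the 60 trillion cells from the quote, ignoring the enormous redundancy between cells:

\[
6.4 \times 10^{9}\ \text{bases} \times 2\ \text{bits} \approx 1.6\ \text{GB per cell}, \qquad
1.6\ \text{GB} \times 6 \times 10^{13}\ \text{cells} \approx 9.6 \times 10^{22}\ \text{bytes} \approx 96\ \text{ZB},
\]

which comfortably exceeds even the 2.8 ZB the Digital Universe Study reports as the entire global data supply for 2012, never mind what our devices held in 2007. It is a generous, uncompressed estimate, but the orders of magnitude make Hilbert’s point.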

Digital Humanities Pedagogy: Practices, Principles and Politics

Thursday, January 10th, 2013

Open Book Publishers has just published Digital Humanities Pedagogy: Practices, Principles and Politics online. Stéfan Sinclair and I have two chapters in the collection, one on “Acculturation and the Digital Humanities Community” and one on “Teaching Computer-Assisted Text Analysis.”

The Acculturation chapter sets out the ways in which we try to train students by involving them in project teams rather than only through courses. I learned this approach watching Jerome McGann and Johanna Drucker at the University of Virginia. My goal has always been to create the sort of project culture they did (and that the Scholar’s Lab now continues).

The editor Brett D. Hirsch deserves a lot of credit for gently seeing this through.

GAME THEORY in the NYTimes

Monday, December 24th, 2012

Just in time for Christmas, the New York Times has started an interesting ArtsBeat blog called GAME THEORY. It is interesting that this multi-authored blog is in the “Arts Beat” area rather than under the Technology tab, where most of the game stories are. Game Theory seems to want to take a broader view of games and culture, as the second post, on Caring About Make-Believe Body Counts, illustrates. This post opens by addressing the other blog columnists (as if this were a dialogue) and then turns to Wayne LaPierre’s speech about how to deal with the Connecticut school killings, which blames, among other things, violent games. The column then looks at the discourse around violence in games, including voices within the gaming industry that were critical of ultraviolence.

Those familiar with games who debate the medium’s violence now commonly assume that games may have become too violent. But they don’t assume that games should be free of violence. That is because of fake violence’s relationship with interactivity, which is a defining element of video games.

Stephen Totilo ends the column with his list of the best games of 2012 which includes Super Hexagon, Letterpress, Journey, Dys4ia, and Professor Layton and the Miracle Mask.

As I mentioned above, the blog column has a dialogical side, with authors addressing each other. It also brings culture and game culture together, which reminds me of McLuhan, who argued that games reflect society, providing a form of catharsis. This column promises to theorize culture through the lens of games rather than just theorize games.

Short Guide To Evaluation Of Digital Work

Thursday, December 20th, 2012

The Journal of Digital Humanities has republished my Short Guide to Evaluation of Digital Work as part of an issue on Closing the Evaluation Gap (Vol. 1, No. 4). I first wrote the piece for my wiki and you can find the old version here. It is far more useful bundled with the other articles in this issue of the JDH.

The JDH is a welcome experiment in peer-reviewed republication. One thing they do is select content that has been published in other forms (blogs, online essays and so on) and then edit it for recombination in a thematic issue. The JDH builds on Digital Humanities Now, which showcases interesting work on the web. Both are projects of the Roy Rosenzweig Center for History and New Media. The CHNM deserves credit for thinking through what we can do with the openness of the web.

20 Years Of Texting

Thursday, December 6th, 2012

It has apparently been 20 years since the first text message was sent, according to stories like this one from Business Insider: 20 Years Of Texting: The Rise And Fall Of LOL.

 The first text message was sent on 3 December 1992, when the 22-year-old British engineer Neil Papworth used his computer to wish a “Merry Christmas” to Richard Jarvis, of Vodafone, on his Orbitel 901 mobile phone. Papworth didn’t get a reply because there was no way to send a text from a phone in those days. That had to wait for Nokia’s first mobile phone in 1993.

What is interesting is that texting is declining. The FT reports a “steep drop in festive Christmas and New Year text messaging this year…”. With smartphones that can do email, messaging apps, and plans that make it affordable to call, we have more and more choices. Soon l33t will become an endangered language.

Save Library and Archives Canada

Saturday, June 2nd, 2012

The Canadian Association of University Teachers has a campaign to Save Library and Archives Canada from the “Badly conceived restructuring, a redefinition of its mandate, and financial cutbacks (that) are undermining LAC’s ability to acquire, preserve and make publicly available Canada’s full documentary heritage.” The issue is not just cuts, but how LAC is dealing with the cuts.

Daniel Caron, Librarian and Archivist of Canada, has announced that “the new environment is totally decentralized and our monopoly as stewards of the national documentary heritage is over.”

LAC will be decentralizing a large portion of its collections to both public and private institutions. LAC documents refer to this voluntary group of “memory institutions” as a “coalition of the willing.”

Go to the site now, read up on the issues, and consider taking action!
