Nökkvi Jarl Bjarnason gave a talk on the emergence of national and regional game studies. What does it mean to study game culture in a country or region? How is locality invoked in game media, in games themselves, and in other aspects of game culture?
Felania Liu presented on game preservation in China and the challenges her team faces, including questions around the legitimacy of game studies.
Hirokazu Hamamura gave the final keynote on the evolution of game media starting with magazines and then shifting to the web.
I presented a paper co-written with Miki Okabe and Keiji Amano. We started with the demographic challenges faced by Japan as its population shrinks. We then looked at what Japanese game companies are doing to attract and support women and families. There is a work ethic that puts both men and women in a bind: they are expected to work such long hours that there really isn’t any time left for “work-life balance.”
The conference was held in person at Nagoya Zokei University and was brilliantly organized by Keiji Amano and Jean-Marc Pelletier. Online interventions were limited to short lightning talks, so in-person attendance was good.
Yesterday I was part of a signing ceremony for a Memorandum of Agreement between Ritsumeikan University and the University of Alberta. The President of the University of Alberta (Bill Flanagan) and I signed on behalf of the U of A. The memorandum described our desire to build on our collaborations around Replaying Japan. We hope to build collaborations around artificial intelligence, games, learning, and the digital humanities. KIAS and the AI4Society signature area have been supporting this research collaboration.
Today (March 2nd, 2023) we are holding a short conference at Ritsumeikan that includes a panel about our collaboration, at which I spoke, and a showcase of research in game studies at Ritsumeikan.
Sinykin talks about this as an “act as groundbreaking as the research itself,” which seems a bit of an exaggeration. It is important that data is being reviewed and published, but this has been happening for a while in other fields. Nonetheless, it is a welcome initiative, especially if it gets attention like the LARB article. In 2013 the Tri-Council (of research agencies in Canada) called for a culture of research data stewardship. In 2015 I worked with Sonja Sapach and Catherine Middleton on a report, a Data Management Plan Recommendation for Social Science and Humanities Funding Agencies. That report looked more at the front end, requiring plans from applicants seeking funding for data-driven projects, but the goal was likewise to make data available for future research.
Sinykin’s essay looks at the poetry publishing culture in the US and how white it is. He shows how data can be used to study inequalities. We also need to ask about the privilege of English-language poetry and of culture from the Global North, not to mention of research and research infrastructure.
In 1987, William H. Dickey, a San Francisco poet who had won the prestigious Yale Younger Poets Award to launch his career and published nearly a dozen well-received books and chapbooks since, was …
Matthew Kirschenbaum has written a great essay on recovering early digital poetry, “The Lost Digital Poems (and Erotica) of William H. Dickey,” in Literary Hub. Dickey wrote a number of HyperPoems in HyperCard, which makes them hard to access now. Kirschenbaum rescued them and worked with others to add them to the Internet Archive, which has a HyperCard emulator. Here is what Kirschenbaum says:
Dickey’s HyperPoems are artifacts of another time—made new and fresh again with current technology. Anyone with a web browser can read and explore them in their original format with no special software or setup. (They are organized into Volume 1 and Volume 2 at the Internet Archive, in keeping with their original organizational scheme; Volume 2 contains the erotica—NSFW!) But they are also a reminder that writers have treasures tucked away in digital shoeboxes and drawers. Floppy disks, or for that matter USB sticks and Google Docs, now keep the secrets of the creative process.
This essay comes from his work for his new book Bitstreams, which documents this and other recovery projects. I’ve just ordered a copy.
Documenting the Now develops tools and builds community practices that support the ethical collection, use, and preservation of social media content.
I’ve been talking with the folks at MassMine (I’m on their Advisory Board) about tools that can gather information off the web, and I was pointed to the Documenting the Now project, which is based at the University of Maryland and the University of Virginia with support from Mellon. DocNow has developed tools and services around documenting the “now” using social media. DocNow itself is an “appraisal” tool for Twitter archiving. They also have a great catalog of Twitter archives that they and others have gathered, which looks like it would be great for teaching.
MassMine is at present a command-line tool that can gather different types of social media data. They are building a web interface that will make it easier to use, and they plan to connect it to Voyant so that results can be analyzed there. I’m looking forward to something easier to use than Python libraries.
Speaking of which, I found TAGS (Twitter Archiving Google Sheet), a plug-in for Google Sheets that can scrape smaller amounts of Twitter data. Another accessible tool is Octoparse, which is designed to scrape database-driven web sites. It is commercial, but it has a 14-day trial.
One of the impressive features of the Documenting the Now project is that they are thinking about the ethics of scraping. They have a set of Social Labels that people can use to indicate how their data should be handled.
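Part of scraping ethically is also technical: checking a site’s robots.txt before fetching anything. Here is a minimal Python sketch, using only the standard library, of what that check looks like. The robots.txt text and the agent name are invented for illustration; none of the tools mentioned above necessarily work this way.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, inlined so the example is self-contained.
# In practice you would fetch this from https://example.com/robots.txt.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
"""

def allowed(agent: str, url: str, robots_txt: str = ROBOTS_TXT) -> bool:
    """Return True if the robots.txt policy permits this agent to fetch the URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

def crawl_delay(agent: str, robots_txt: str = ROBOTS_TXT):
    """Return the Crawl-delay (in seconds) the policy requests, if any."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.crawl_delay(agent)

print(allowed("MyScraper", "https://example.com/public/page"))   # True
print(allowed("MyScraper", "https://example.com/private/page"))  # False
print(crawl_delay("MyScraper"))                                  # 10
```

A polite scraper would skip disallowed paths and sleep for the requested delay between requests; robots.txt is a convention rather than an enforcement mechanism, which is exactly why the ethics work DocNow is doing matters.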
Part of what is interesting in the essay is how Salvador documents the different views about what such simulations were good for. SimRefinery was not an accurate simulation that could capture the complexity of the chemical engineering of a refinery, so what was it good for? Chevron apparently wanted something to help staff who weren’t engineers understand some of the connectedness of a refinery, how problems in one area could impact others. Will Wright, the genius behind Maxis, didn’t think serious simulations were possible, or something Maxis wanted to do. He saw SimCity as a caricature that was fun; at best it might give people a “mental model” of the issues around city management. It was for that reason that MBS was a spin-off designed to contract with businesses that felt serious simulations were feasible and useful.
The Computer Literacy Project, on the other hand, is what a bunch of producers and civil servants at the BBC thought would be the best way to educate the nation about computing. I admit that it is a bit elitist to suggest we should laud this group of people for teaching the masses what they were incapable of seeking out on their own. But I can’t help but think they got it right. Lots of people first learned about computing using a BBC Micro, and many of these people went on to become successful software developers or game designers.
I’ve just discovered Two-Bit History (0b10), a series of long and thorough blog essays on the history of computing by Sinclair Target. One essay is on Codecademy vs. The BBC Micro. The essay gives the background of the BBC Computer Literacy Project, which led the BBC to commission a suitable microcomputer, the BBC Micro. Target uses this history to compare the way the BBC literacy project taught a nation (the UK) computing to the way Codecademy does now. The BBC project comes out better, as it doesn’t drop immediately into programming without explanation, something Codecademy does.
I should add that the early 1980s was a period when many constituencies developed their own computer systems, not just the BBC. In Ontario the Ministry of Education launched a process that led to the ICON which was used in Ontario schools in the mid to late 1980s.
Like many, I learned to program multimedia in HyperCard. I even ended up teaching it to faculty and teachers at the University of Toronto. It was a great starting development environment with its mix of graphical tools, hypertext tools, and a verbose programming language. Its only (and major) flaw was that it wasn’t designed to create networked information: HyperCard stacks had to be passed around on disks. The web made possible a networked hypertext environment that solved the distribution problems of the 1980s. One wonders why Apple (or someone else) doesn’t bring HyperCard back in an updated and networked form. I guess that is what the Internet Archive is doing.
The goal of LINCS is to create a shared linked data store that humanities projects can draw on and contribute to. This would let us link our digital resources in ways that create new intellectual connections and that allow us to reason over linked data.
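The payoff of linking is that a query can span datasets that no single project holds. Here is a toy Python sketch of that idea, using plain tuples rather than a real triple store like the one LINCS is building; all the identifiers (the `ex:` names) are invented for illustration, borrowing the Dickey example from earlier in these notes.

```python
# Two hypothetical project datasets, each a set of (subject, predicate, object)
# triples. They share the identifier "ex:HyperPoems", which is what lets
# them be linked.
project_a = {
    ("ex:Dickey", "ex:wrote", "ex:HyperPoems"),
    ("ex:Dickey", "ex:name", "William H. Dickey"),
}
project_b = {
    ("ex:HyperPoems", "ex:archivedAt", "ex:InternetArchive"),
}

# "Linking" the data is just taking the union: shared identifiers
# connect the two graphs into one.
graph = project_a | project_b

def objects(subject, predicate, triples):
    """All objects o such that (subject, predicate, o) is in the graph."""
    return {o for s, p, o in triples if s == subject and p == predicate}

# A cross-dataset question neither project could answer alone:
# where are the works Dickey wrote archived?
works = objects("ex:Dickey", "ex:wrote", graph)
archives = {a for w in works for a in objects(w, "ex:archivedAt", graph)}
print(archives)  # {'ex:InternetArchive'}
```

A real linked data store would use full URIs, RDF, and SPARQL for the querying, but the principle is the same: agreeing on shared identifiers is what makes reasoning across projects possible.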