Codecademy vs. The BBC Micro

The Computer Literacy Project, on the other hand, is what a bunch of producers and civil servants at the BBC thought would be the best way to educate the nation about computing. I admit that it is a bit elitist to suggest we should laud this group of people for teaching the masses what they were incapable of seeking out on their own. But I can’t help but think they got it right. Lots of people first learned about computing using a BBC Micro, and many of these people went on to become successful software developers or game designers.

I’ve just discovered Two-Bit History (0b10), a series of long and thorough blog essays on the history of computing by Sinclair Target. One essay is on Codecademy vs. The BBC Micro. The essay gives the background of the BBC Computer Literacy Project that led the BBC to commission a suitable microcomputer, the BBC Micro. Target uses this history to compare the way the BBC literacy project taught a nation (the UK) computing to the way Codecademy does now. The BBC project comes out better, as it doesn’t drop immediately into programming without explanation, something Codecademy does.

I should add that the early 1980s was a period when many constituencies developed their own computer systems, not just the BBC. In Ontario the Ministry of Education launched a process that led to the ICON, which was used in Ontario schools in the mid-to-late 1980s.

In 2020, let’s stop AI ethics-washing and actually do something – MIT Technology Review

But talk is just that—it’s not enough. For all the lip service paid to these issues, many organizations’ AI ethics guidelines remain vague and hard to implement.

Thanks to Oliver I came across this call for an end to ethics-washing by artificial intelligence reporter Karen Hao in the MIT Technology Review: In 2020, let’s stop AI ethics-washing and actually do something. The call echoes something I’ve been talking about – that we need to move beyond guidelines, lists of principles, and checklists. She nicely describes some of the initiatives to hold AI accountable that are taking place and what should happen next. Read on if you want to see what I think we need.

The 100 Worst Ed-Tech Debacles of the Decade

With the end of the year there are some great articles showing up reflecting on debacles of the decade. One of my favorites is The 100 Worst Ed-Tech Debacles of the Decade by Audrey Watters. Ed-Tech is one of those fields where, over and over, techies think they know better. Some of the debacles Watters discusses:

  • 3D Printing
  • The “Flipped Classroom” (Full disclosure: I sat on a committee that funded these.)
  • Op-Eds to ban laptops
  • Clickers
  • Stories about the end of the library
  • Interactive whiteboards
  • The K-12 Cyber Incident Map (Check it out here)
  • IBM Watson
  • The Year of the MOOC

This collection of 100 terrible ideas in instructional technology should be mandatory reading for all of us who have been keen on ed-tech. (And I am one who has developed ed-tech and oversold it.) Each item is a mini essay with links worth following.

Applying an Ethics of Care to Internet Research: Gamergate and Digital Humanities

Article: Applying an Ethics of Care to Internet Research: Gamergate and Digital Humanities

Thanks to Todd Suomela’s lead, we just published an article on Applying an Ethics of Care to Internet Research: Gamergate and Digital Humanities in Digital Studies. This article is a companion to an article I wrote with Bettina Berendt on Information Wants to Be Free, Or Does It? We and others are exploring the Ethics of Care as a different way of thinking about the ethics of digital humanities research.

GoQueer Locative Game

Queer places are, by definition, sites of accretion, where stories, memories, and experiences are gathered. Queer place, in particular, is reliant on ephemeral histories, personal moments and memories. GoQueer intends to integrate these personal archives with places for you to discover.

I recently downloaded and started playing the iOS version of GoQueer from the App Store. It is a locative game from my colleague Dr. Maureen Engel.

Engel reflected on this project in a talk on YouTube titled Go Queer: A Ludic, Locative Media Experiment. Engel nicely theorizes her game not once but twice, in a doubled set of reflections that show how theorizing isn’t a step in project design but a continuous thinking-through.

You can also read an article reflecting on this game titled Perverting Play: Theorizing a Queer Game Mechanic.

Big Tech’s Half-Hearted Response To Fake News And Election Hacking

Despite big hand waves, Facebook, Google, and Twitter aren’t doing enough to stop misinformation.

From Slashdot I found a story about Big Tech’s Half-Hearted Response To Fake News And Election Hacking. This Fast Company story talks about the ways social media companies are trying to prevent the misuse of their platforms as we head into the US midterms.

For Facebook, Google, and Twitter the fight against fake news seems to be two-pronged: De-incentivize the targeted content and provide avenues to correct factual inaccuracies. These are both surface fixes, however, akin to putting caulk on the Grand Canyon.

And, despite grand hand waves, both approaches are reactive. They don’t aim at understanding how this problem became prevalent, or creating a method that attacks the systemic issue. Instead these advertising giants implement new mechanisms by which people can report one-off issues—and by which the platforms will be left playing cat-and-mouse games against fake news—all the while giving no real clear glimpse into their opaque ad platforms.

The problem is that these companies make too much money from ads and elections are a chance to get lots of ads, manipulative or not. For that matter, what political ad doesn’t try to manipulate viewers?

The Slashdot story was actually about Mozilla’s Responsible Computer Science Challenge, which will support initiatives to embed ethics in computer science courses. Alas, the efficacy of ethics courses is questionable. Aristotle would say that if you don’t have the disposition to be ethical, no amount of training will do any good. It just helps the unethical pretend to be ethical.

Re-Imagining Education In An Automating World conference at George Brown

On May 25th I had a chance to attend a gem of a conference organized by the Philosophy of Education (POE) committee at George Brown. They organized a conference with different modalities, from conversations to formal talks to group work. The topic was Re-Imagining Education in An Automating World (see my conference notes here) and this conference is a seed for a larger one next year.

I gave a talk on Digital Citizenship at the end of the day where I tried to convince people that:

  • Data analytics are now a matter of citizenship (we all need to understand how we are being manipulated).
  • We therefore need to teach data literacy in the arts and humanities, so that
  • Students are prepared to contribute to and critique the ways analytics are deployed.
  • This can be done by integrating data and analytical components in any course using field-appropriate data.


Sustainable Research: Around the World Conference

This week I am participating in the 6th Around the World Conference organized by the Kule Institute for Advanced Study. This e-conference (electronic conference) is on Sustainable Research and we have a panel on a different topic every day of the week. (If you miss a panel, check out our YouTube channel.) Today we had a fabulous panel on Art and/in the Anthropocene that was led by Natalie Loveless and Jesse Beier. You can see some thoughts on the e-conference under the Twitter hashtag #ATW2018, which we share with the American Trombone Workshop.

Manifest Attention

One of the problems with e-conferences is that they are local for everyone, which means attendees tune in and out around whatever else they have scheduled rather than devoting the time. When you fly to a conference you can’t be expected to leave it for a meeting, but when a conference is local or online we tend not to pay attention the way we would from afar.

This has to change if we are to wean ourselves off flying every time we want to pay attention to a conference. We have to learn to be deliberate about allocating time to an e-conference. We have to manifest attention.

DPLA Primary Source Sets

Commodore Perry’s Expedition to Japan

The Digital Public Library of America (DPLA) has a fascinating collection of Primary Source Sets that bring together materials around a subject for teaching and historical thinking. For example they have a set on Commodore Perry’s Expedition to Japan that allows you to see both American and Japanese representations of Perry and the important visit. These sets show how a digital archive can be repurposed in different ways.

Composite image by Picasso, from the Pablo Picasso’s Guernica and Modern War set

Python Programming for the Humanities by Folgert Karsdorp

Having just finished teaching a course on Big Data and Text Analysis where I taught students Python, I can appreciate a well-written tutorial on Python. Python Programming for the Humanities by Folgert Karsdorp is a great tutorial for humanists new to programming that takes the form of a series of Jupyter notebooks that students can download. As the tutorials are notebooks, students who have set up Python on their computers can work through them interactively. Karsdorp has done a nice job of weaving in cells where the student has to code, along with quizzes that reinforce the material, which strikes me as an excellent use of the IPython notebook model.
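To give a flavour of the kind of beginner exercise such notebooks build toward, here is a minimal word-frequency example of my own (a sketch in the spirit of these tutorials, not code taken from Karsdorp’s notebooks):

```python
from collections import Counter

def word_frequencies(text, n=3):
    """Return the n most common words in a text, lowercased with punctuation stripped."""
    words = [w.strip('.,;:!?"\'').lower() for w in text.split()]
    words = [w for w in words if w]  # drop tokens that were pure punctuation
    return Counter(words).most_common(n)

sample = "The quick brown fox jumps over the lazy dog. The dog sleeps."
print(word_frequencies(sample))
```

An exercise cell in a notebook might then ask the student to modify the function to ignore stop words or to plot the counts.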

I learned about this while reading a more advanced set of tutorials by Allen Riddell for Dariah-DE, Text Analysis with Topic Models for the Humanities and Social Sciences. The title doesn’t do this collection of tutorials justice because they include a lot more than just topic models. There are advanced tutorials on all sorts of topics like machine learning and classification. See the index for the range of tutorials.

Text Analysis with Topic Models for the Humanities and Social Sciences (TAToM) consists of a series of tutorials covering basic procedures in quantitative text analysis. The tutorials cover the preparation of a text corpus for analysis and the exploration of a collection of texts using topic models and machine learning.
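A first step in this kind of quantitative text analysis is turning a corpus into a document-term matrix, where each row is a document and each column counts a vocabulary word. Here is a rough, standard-library-only illustration of that preparation step (my own sketch, not code from TAToM, which uses fuller tooling):

```python
# A toy corpus of two "documents".
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
]

# Build a shared, sorted vocabulary across the whole corpus.
vocab = sorted({w for doc in docs for w in doc.split()})

# Each row counts how often each vocabulary word occurs in one document.
dtm = [[doc.split().count(w) for w in vocab] for doc in docs]

print(vocab)
for row in dtm:
    print(row)
```

Topic models and classifiers then operate on matrices like this one, usually after stop-word removal and weighting.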

Stéfan Sinclair and I (mostly Stéfan) have also produced a textbook for teaching programming to humanists called The Art of Literary Text Analysis. These tutorials are also written as Jupyter notebooks so you can download them and play with them.

We are now reimplementing them with our own Voyant-based notebook environment called Spyral. See The Art of Literary Text Analysis with Spyral Notebooks. More on this in another blog entry.