Computers in classroom have ‘mixed’ impact on learning: OECD report

The Globe and Mail and other sources are reporting that Computers in classroom have ‘mixed’ impact on learning. This is based on an OECD report titled Students, Computers and Learning: Making the Connection. The overall conclusion is that teaching is about the individual student and can’t be automated. Computers aren’t necessarily good for learning – they should be used for specific projects and to teach real-world digital skills.

Students who use computers moderately at school tend to have somewhat better learning outcomes than students who use computers rarely. But students who use computers very frequently at school do a lot worse in most learning outcomes, even after accounting for social background and student demographics. (p. 3 of Report)

The Globe quotes Prof. Slotta of OISE to the effect that:

Technology is most effective in the classroom when it is used to develop skills similar to those that adults are using in everyday life, such as finding resources, critiquing arguments, communicating with peers, solving problems and working with data…

Skimming the report and the slide deck shows a complex picture: countries like Japan often have fewer computers in classrooms and do better on learning outcomes. Massive investment in computers, like that of school boards that buy laptops for every child, doesn’t seem to lead to improvements in learning.

Put simply, ensuring that every child attains a baseline level of proficiency in reading and mathematics seems to do more to create equal opportunities in a digital world than can be achieved by expanding or subsidising access to high-tech devices and services. (p. 3 of Report)

The report also looked at loneliness and confirmed what parents have suspected:

Last but not least, most parents and teachers will not be surprised by the finding that students who spend more than six hours on line per weekday outside of school are particularly at risk of reporting that they feel lonely at school, and that they arrived late for school or skipped days of school in the two weeks prior to the PISA test.

The slide show prepared by Andreas Schleicher of the OECD suggests that there are larger questions about what sorts of skills we should be teaching in the coming age of automation. The second slide says “The kind of things that are easy to teach are now easy to automate, digitize or outsource.” A slide titled The Race between Technology and Education (a title taken from the work of Goldin and Katz) suggests that there is social pain when technology isn’t matched with education. The conclusion is that we need education for a world where many jobs can be automated. Just as the industrial revolution caused social pain in the form of dislocation and unemployment, so too could AI.

Journal of the Japanese Association for Digital Humanities

Announcing the first issue of the Journal of the Japanese Association for Digital Humanities. I am on the Editorial Board of the journal, but the real credit goes to Charles Muller, Christian Wittern and Kiyonori Nagasaki, who are the working editors. This journal represents the maturing of the Japanese digital humanities scene. The Japanese Association for Digital Humanities (JADH) was founded in 2011 and became a constituent organization of ADHO in 2013. Now they have a journal. As Charles Muller, Editor-in-Chief, puts it in his “Dear Readers”,

While Digital Humanities has been practiced in Japan for more than two decades, up to now, little is known outside of Japan regarding the content of Japanese advancements in this field. We therefore aim to rectify this situation by initiating a first-tier peer-reviewed international journal published in English. Although we hope to be able to shed light on projects and developments in Japan, we will be accepting article submissions from DH practitioners around the world on a broad range of topics.

Digital Pedagogy Institute

[Photo: Robert Jay Glickman and Geoffrey Rockwell]

Last week I participated in the Digital Pedagogy Institute that was organized by the University of Toronto Scarborough, Brock University and Ryerson University. I kept my Conference Report here.

This Institute focused not only on technology in learning but also on important issues around the ethics of different learning models that involve technology. Ways of using technology to encourage active participation, rather than just broadcasting video, came up. So did ways of thinking about students in collaborative projects – we need to get beyond the apprentice model and think of them as “citizen scholars.”


Metropolis II by Chris Burden (the movie) – YouTube

From the Panopticonopolis Tumblr I’ve discovered Metropolis II by Chris Burden. What an interesting take on the city.

Panopticonopolis (try saying it) by Misha Lepetic mostly has entries on cities, some of which appear in 3 Quarks Daily. Another article, on The Forgotten Archipelago, asks what happened to the Soviet ZATO cities – the special-purpose, closed and hidden cities set up for secret research. What happened when the Soviet Union collapsed and the federal government could no longer fund these single-purpose cities?

I was led to Panopticonopolis by an article on Blob Justice, Part 1, which looks at the herd shaming that is taking place on the Internet, starting with Cecil the lion. I can’t help wondering if this sort of Internet stampede is related to Gamergate and Anonymous.

Spanish Cops Use New Law To Fine Facebook Commenter For Calling Them ‘Slackers’

Heather tweeted me a link to a story from Techdirt on how Spanish Cops Use New Law To Fine Facebook Commenter For Calling Them ‘Slackers’. The police in Spain can now fine people for disrespecting them. This outrageous law was also reported on by The Telegraph in the story First victim of Spain’s 'gag law' fined for criticising 'lazy' police. Despite Snowden’s revelations, governments seem to be passing more and more laws restricting speech and travel, often in the name of fighting terrorism. As Techdirt reports, the law is being defended with Orwellian arguments:

Defending the new law, the PP government has said that “demonstrations will become freer because they will be protected from violent elements”. (Quote from Telegraph article)

Medical Privacy Under Threat in the Age of Big Data

The Intercept has a good introductory story about Medical Privacy Under Threat in the Age of Big Data. I was surprised by how valuable medical information is. Here is a quote:

[h]e found a bundle of 10 Medicare numbers selling for 22 bitcoin, or $4,700 at the time. General medical records sell for several times the amount that a stolen credit card number or a social security number alone does. The detailed level of information in medical records is valuable because it can stand up to even heightened security challenges used to verify identity; in some cases, the information is used to file false claims with insurers or even order drugs or medical equipment. Many of the biggest data breaches of late, from Anthem to the federal Office of Personnel Management, have seized health care records as the prize.

The story mentions Latanya Sweeney, who is the Director of the Data Privacy Lab at Harvard. She did important research on Discrimination in Online Ad Delivery and has a number of papers on health records, including the recent Matching Known Patients to Health Records in Washington State Data, which showed how one could de-anonymize the Washington State health data that is for sale by searching news databases. We are far more unique than we think we are.

I should add that I came across an interesting blog post by Dr. Sweeney on Tech@FTC arguing for an interdisciplinary field of Technology Science. (Sweeney was the Chief Technologist at the FTC.)
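To give a concrete sense of how this kind of re-identification works, here is a minimal sketch of quasi-identifier linkage in Python. The column names, records and news-story details are invented for illustration and are not from Sweeney’s study; the point is simply that a handful of seemingly harmless attributes can single out one record in a dataset with no names in it.

    # Toy example of quasi-identifier linkage (illustrative only, not Sweeney's code or data).
    # "Anonymized" health records keep no names, but quasi-identifiers remain.
    health_records = [
        {"zip": "98101", "age": 34, "sex": "F", "admit_month": "2011-04"},
        {"zip": "98101", "age": 34, "sex": "F", "admit_month": "2011-07"},
        {"zip": "98052", "age": 61, "sex": "M", "admit_month": "2011-04"},
    ]

    # Details a local news story might mention about a named person.
    news_story = {"name": "John Doe", "zip": "98052", "age": 61, "sex": "M", "admit_month": "2011-04"}

    def link(story, records):
        """Return the records whose quasi-identifiers match the news story."""
        keys = ("zip", "age", "sex", "admit_month")
        return [r for r in records if all(r[k] == story[k] for k in keys)]

    matches = link(news_story, health_records)
    if len(matches) == 1:
        # A unique match re-identifies the supposedly anonymous record.
        print(news_story["name"], "is probably record", matches[0])

In a real dataset the available attributes are richer (hospital, diagnosis, discharge date and so on), which is why a single news item can be enough to pin a record down.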

Towards monocultural (digital) Humanities?

[Graph: languages used in DH journals]
Domenico Fiormonte has written a nice essay on how the humanities (and digital humanities) run the risk of becoming monolingual, Towards monocultural (digital) Humanities?. The essay is a response to Greg Crane’s The Big Humanities, National Identity and the Digital Humanities in Germany, and Greg then responds to Domenico here. The numbers are depressing (see the graphs from Domenico above). As he puts it (drawing on research with a colleague into DH journals):

These data show that the real problem is not that English is the dominant language of academic publications (and of DH), but that both Anglophone and a high percentage of non-Anglophone colleagues barely use/quote non-Anglophone sources in their research.

I can’t help thinking that the internet has allowed the big to get even bigger. The dominance of English in academic circles is exacerbated by the instant availability of English research. National languages don’t even have location as an advantage on the internet.

What can we do about it? Miran had a nice reply on Humanist (to the original posting by Greg Crane, which was also on Humanist). Domenico suggests that we all have to take some responsibility, especially those of us who have the “free ride” of being native English writers.

It is the responsibility of dominant languages and cultures to translate from marginal or less influential languages.

DH 2015 in Sydney, Australia

Digital Humanities 2015 (DH2015) is now finishing up. I have been keeping my conference notes here.

The conference was held on the lovely campus of the University of Western Sydney. I was part of a couple of events and papers at this conference, including:

  • New Scholars Symposium: With Rachel Hendery, I helped organize a pre-conference event for new scholars. This was supported by CHCI, centerNet, the Kule Institute for Advanced Study and the University of Western Sydney.
  • I participated in a public panel on Building Communities and Networks in the Humanities where I talked about some of the forms of public engagement that we are trying at the Kule Institute including the Around the World Conference.
  • I helped Stéfan Sinclair with a workshop on Voyant 2.0 (the link goes to the current version, which will soon be 2.0).
  • I gave a paper with Stéfan Sinclair on “Talking about Programming the Digital Humanities” that traced a history of the discussion about programming and tools in the digital humanities.
  • Finally, John Montague gave a paper on “Exploring Large Datasets with Topic Model Visualizations” that I was involved in. The paper discussed a visualization for exploring the results of topic modelling, which you can try as a prototype here.

It is hard to summarize a whole conference, but some of the questions that the new scholars posed in the unconference are worth thinking about:

  • How does one learn about the field of digital humanities?
  • How does one learn skills in the digital humanities?
  • How does one teach the digital humanities?
  • What are the ethical issues in digital work in the humanities?