Naylor Report in Voyant

Correspondence Analysis (ScatterPlot) View

The Naylor Report (PDF) about research funding in Canada is out and we put it in Voyant. Here are some different views of the report.

Continue reading Naylor Report in Voyant
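The Correspondence Analysis (ScatterPlot) view projects documents and terms into a shared low-dimensional space. As a rough illustration of the underlying technique (this is a generic textbook sketch, not Voyant's actual implementation, and the toy counts are invented):

```python
import numpy as np

def correspondence_analysis(N, n_dims=2):
    """Classic correspondence analysis of a contingency table N
    (rows = documents, columns = terms). Returns row and column
    coordinates on the first n_dims principal axes."""
    N = np.asarray(N, dtype=float)
    P = N / N.sum()                      # correspondence matrix
    r = P.sum(axis=1)                    # row masses
    c = P.sum(axis=0)                    # column masses
    # Standardized residuals: deviation from independence, scaled by masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    # Principal coordinates: singular vectors scaled by singular values and masses
    rows = (U[:, :n_dims] * sv[:n_dims]) / np.sqrt(r)[:, None]
    cols = (Vt.T[:, :n_dims] * sv[:n_dims]) / np.sqrt(c)[:, None]
    return rows, cols

# Hypothetical document-term counts for three documents and three terms
counts = [[10, 2, 3],
          [1, 12, 4],
          [2, 3, 11]]
doc_xy, term_xy = correspondence_analysis(counts)
print(doc_xy.shape, term_xy.shape)  # (3, 2) (3, 2)
```

Plotting `doc_xy` and `term_xy` on the same axes gives the kind of scatter plot Voyant displays, where documents sit near the terms that distinguish them.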

FBI Game: What is Violent Extremism?

sheep

From Slashdot a story about an FBI game/interactive that is online and which aims at Countering Violent Extremism | What is Violent Extremism?. The subtitle is “Don’t Be A Puppet” and the game is part of a collection of interactive materials that try to teach about extremism in general and encourage some critical distance from it. The game has you as a sheep avoiding pitfalls.

Continue reading FBI Game: What is Violent Extremism?

Geofeedia ‘allowed police to track protesters’

geofeedia
From the BBC a story about US start-up Geofeedia ‘allowed police to track protesters’. Geofeedia was apparently using social media data from Twitter, Facebook and Instagram to monitor activists and protesters for law enforcement. Access to these social media feeds was cut off once the ACLU reported on the surveillance product. The ACLU discovered the agreements with Geofeedia when it requested public records of California law enforcement agencies. Geofeedia was boasting to law enforcement about its access. The ACLU has released some of the documents of interest including a PDF of a Geofeedia Product Update email discussing “sentiment” analytics (May 18, 2016).

From the Geofeedia web site I was surprised to see that they are offering solutions for education too.

Professor Emeritus Seymour Papert, pioneer of constructionist learning, dies at 88

From Humanist and then MIT News, Professor Emeritus Seymour Papert, pioneer of constructionist learning, dies at 88. Papert was Piaget’s student and thought about how computers could provide children a way to construct knowledge. Among other things he developed the Logo language that I learned at one point. He also collaborated with the LEGO folk on Mindstorms, named after his book by that title.

When Women Stopped Coding

The NPR show Planet Money aired a show in 2014 on When Women Stopped Coding that looks at why the participation of women in computer science changed in 1984 after rising for a decade. Unlike in other professional programs like medical school and law school, the percentage of women in computer science went from about 37% in 1984 down to under 20% today. The NPR story suggests that the problem was the promotion of the personal computer at the moment when it became affordable. In the 1980s personal computers were heavily marketed to boys, which meant that far more men came to computer science in college with significant experience with computing, something that wasn’t true in the 70s when there weren’t that many computers in the home and math is what mattered. The story builds on research by Jane Margolis and in particular her book Unlocking the Clubhouse.

This fits with my memories of the time. I remember being jealous of the one or two kids who had Apple IIs in college (in the late 70s) and bought an Apple II clone (a Lemon?) as soon as I had a job, just to start playing with programming. At college I ended up getting 24/7 access to the computing lab so that I could use the word processing available (a Pascal editor and a Diablo daisy-wheel printer for final copy). I hated typing and retyping my papers and fell in love with the backspace key and the editing of word processing. I also remember the sense of camaraderie among those who spent all night in the lab typing papers in the face of our teachers’ mistrust of processed text. Was it coincidence that the two of us who shared the best senior thesis prize in philosophy wrote our theses in the lab on computers? What the story doesn’t deal with, and Margolis does, is the homosocial club-like atmosphere around computing. This still persists. I’m embarrassed to think of how much I’ve felt a sense of belonging to these informal clubs without asking who was excluded.

Computers in classroom have ‘mixed’ impact on learning: OECD report

The Globe and Mail and other sources are reporting that Computers in classroom have ‘mixed’ impact on learning. This is based on an OECD report titled Students, Computers and Learning: Making the Connection. The overall conclusion is that teaching is about the individual student and can’t be automated. Computers aren’t necessarily good for learning – they should be used for specific projects and used to teach real-world digital skills.

Students who use computers moderately at school tend to have somewhat better learning outcomes than students who use computers rarely. But students who use computers very frequently at school do a lot worse in most learning outcomes, even after accounting for social background and student demographics. (p. 3 of Report)

The Globe quotes Prof. Slotta of OISE to the effect that:

Technology is most effective in the classroom when it is used to develop skills similar to those that adults are using in everyday life, such as finding resources, critiquing arguments, communicating with peers, solving problems and working with data…

Skimming the report and the slide deck shows a complex picture where countries like Japan often have fewer computers in classrooms and do better on learning outcomes. Massive investment in computers, like that of school boards that buy laptops for every child, doesn’t seem to lead to improvements in learning.

Put simply, ensuring that every child attains a baseline level of proficiency in reading and mathematics seems to do more to create equal opportunities in a digital world than can be achieved by expanding or subsidising access to high-tech devices and services. (p. 3 of Report)

The report also looked at loneliness and confirmed what parents have suspected,

Last but not least, most parents and teachers will not be surprised by the finding that students who spend more than six hours on line per weekday outside of school are particularly at risk of reporting that they feel lonely at school, and that they arrived late for school or skipped days of school in the two weeks prior to the PISA test.

The slide show prepared by Andreas Schleicher of the OECD suggests that there are larger questions about what sorts of skills we should be teaching in the coming age of automation. The second slide says “The kind of things that are easy to teach are now easy to automate, digitize or outsource.” A slide titled The Race between Technology and Education (title from work by Goldin and Katz) suggests that there is social pain when technology isn’t matched with education. The conclusion is that we need education for a world where many jobs can be automated. Just as the industrial revolution caused social pain in the form of dislocation and unemployment, so too could AI.

Science 2.0 and Citizen Research

This week I attended the second Science 2.0 conference held in Hamburg, Germany. (You can see my research notes here.) The conference dealt with issues around open access, open data, citizen science, and network enabled science. I was one of two Canadian digital humanists presenting. Matthew Hiebert from the University of Victoria talked about the social edition and work from the Electronic Textual Cultures Lab and Iter. It should be noted that in Europe the word “science” is more inclusive and can include the humanities. This conference wasn’t just about how open data and crowdsourcing could help the natural sciences – it was about how research across the disciplines could be supported with virtual labs and infrastructure.

I gave a paper on “New Publics for the Humanities” that started by noting that the humanities no longer engage the public. The social contract with the public that supports us has been neglected. I worry that if the university is disaggregated and the humanities unbundled from the other faculties (the way newspapers have been hit by the internet and the unbundling of services) then people will stop paying for the humanities and much of the research we do. We will end up with cheaper, research-poor colleges that provide lots of higher education without the research, or climbing walls. Only in the elite private universities will the humanities survive, and there they will survive as a marker of class status. You will be able to study ancient languages at elite schools because any degree from an elite school is a good degree.

Of course, the humanities will survive outside the university, and may become healthier with the downsizing of the professional (or professorial) humanities, but we run the danger of unthinkingly losing a long tradition of thinking critically and ethically. An irony to be sure – losing thinking traditions through the lack of public reflection on the consequences of disruptive change.

Drawing on Greg Crane, I then argued that citizen research (forms of crowdsourcing) can re-engage the publics we need to support us and reflect with us. Citizen research can provide an alternative way of structuring research in anticipation of defunding of the humanities research function. I illustrated my point by showing a number of examples of humanities crowdsourcing projects from the OED (pre-computer volunteer research) to the Dictionary of Words in the Wild. If I can find the time I will write up the argument to see where it goes.

My talk was followed by a thorough one on citizen science in environmental studies by Professor Aletta Bonn of the Citizens create knowledge project – a German platform for citizen science. We need to learn from people like Dr. Bonn who are studying and experimenting with the deployment of citizen research. One point she made was the importance of citizen co-design. Most projects enlist citizens in repetitive micro-tasks designed by researchers. What if the research project were designed from the beginning with citizens? What would that mean? How would that work?

Building Research Capacity Across the Humanities

On Monday I gave a talk at the German Institute for International Educational Research (DIPF) on:

Building Research Capacity Across the Humanities and Social Sciences: Social Innovation, Community Engagement and Citizen Science

The talk began with the sorry state of public support for the humanities. We frequently read how students shouldn’t major in the humanities because there are no jobs, and we worry about dropping enrolments. The social contract between our publics (whose taxes pay for public universities) and the humanities seems broken or forgotten. We need to imagine how to re-engage the local and international communities interested in what we do. To that end I proposed the following:

  • We need to know ourselves better so we can better present our work to the community. It is difficult in a university like the University of Alberta to know what research and teaching is happening in the social sciences and humanities. We are spread out over 10 different faculties and don’t maintain any sort of shared research presence.
  • We need to learn to listen to the research needs of the local community and to collaborate with the community researchers who are working on these problems. How many people in the university know what the mayor’s priorities are? Who bothers to connect the research needs of the local community to the incredible capacity of our university? How do we collaborate with and support the applied researchers who typically do the work identified by major stakeholders like the city? Institutes like the Kule Institute can help document the research agendas of major community stakeholders and then connect university and community researchers to address them.
  • We need to learn to connect through the internet to communities of interest. Everything we study is of interest to amateurs if we bother to involve them. Crowdsourcing or “citizen science” techniques can bring amateurs into research in a way that engages them and enriches our projects.

In all three of these areas I described projects that are trying to better connect humanities research with our publics. In particular I showed various crowdsourcing projects in the humanities ending with the work we are now doing through the Text Mining the Novel project to imagine ways to crowdsource the tagging of social networks in literature.
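Crowd-tagged interactions can be aggregated into a weighted social network. As a minimal sketch (the character names and data format are hypothetical, not the Text Mining the Novel project's actual schema):

```python
from collections import Counter
from itertools import combinations

# Hypothetical crowd-tagged data: each record is the set of characters
# a volunteer tagged as interacting in one passage of a novel.
tagged_passages = [
    {"Elizabeth", "Darcy"},
    {"Elizabeth", "Jane"},
    {"Elizabeth", "Darcy", "Jane"},
    {"Darcy", "Bingley"},
]

# Aggregate tags into a weighted edge list: one edge per character pair,
# weighted by the number of passages in which they were tagged together.
edges = Counter()
for characters in tagged_passages:
    for pair in combinations(sorted(characters), 2):
        edges[pair] += 1

print(edges[("Darcy", "Elizabeth")])  # 2
```

The resulting edge weights can be fed to a graph library for visualization, and disagreement between taggers on the same passage is one way to estimate tag reliability.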

One point that resonated with the audience at DIPF was around the types of relationships we need to develop with our publics. I argued that we have to learn to co-create research projects rather than “trickle down” results. We need to develop questions, methods and answers together with community researchers rather than think that we do the “real” research and then trickle results down to the community. This means learning new and humble ways of doing research.

CRediT: Open Standard for Roles in Research

The CRediT Project now has a Proposed Taxonomy for assigning credit. They have identified a short list of roles:

  • Conceptualization
  • Methodology
  • Software
  • Validation
  • Formal Analysis
  • Investigation
  • Resources
  • Data Curation
  • Writing – Original Draft
  • Writing – Review & Editing
  • Visualization
  • Supervision
  • Project Administration
  • Funding Acquisition

They are looking for feedback.