The Digital Public Library of America (DPLA) has a fascinating collection of Primary Source Sets that bring together materials around a subject for teaching and historical thinking. For example, they have a set on Commodore Perry’s Expedition to Japan that lets you see both American and Japanese representations of Perry and this important visit. These sets show how a digital archive can be repurposed in different ways.
Having just finished teaching a course on Big Data and Text Analysis where I taught students Python, I can appreciate a well-written Python tutorial. Python Programming for the Humanities by Folgert Karsdorp is a great tutorial for humanists new to programming that takes the form of a series of Jupyter notebooks that students can download. As the tutorials are notebooks, students who have set up Python on their computers can work through them interactively. Karsdorp has done a nice job of weaving in cells where the student has to code, along with quizzes that reinforce the material, which strikes me as an excellent use of the IPython notebook model.
I learned about this while reading a more advanced set of tutorials by Allen Riddell for DARIAH-DE, Text Analysis with Topic Models for the Humanities and Social Sciences. The title doesn’t do this collection justice, because it includes a lot more than just topic models. There are advanced tutorials on all sorts of subjects, like machine learning and classification. See the index for the range of tutorials.
Text Analysis with Topic Models for the Humanities and Social Sciences (TAToM) consists of a series of tutorials covering basic procedures in quantitative text analysis. The tutorials cover the preparation of a text corpus for analysis and the exploration of a collection of texts using topic models and machine learning.
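The first step those tutorials describe, preparing a corpus for quantitative analysis, usually means tokenizing each text and building a document-term matrix of word counts. The sketch below is not taken from TAToM itself; it is a minimal stdlib-only illustration of that step, with a toy two-document corpus I made up for the example:

```python
from collections import Counter
import re

def tokenize(text):
    """Lowercase a text and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

# A toy corpus standing in for the texts a tutorial would prepare.
corpus = {
    "doc1": "The whale surfaced. The whale dove.",
    "doc2": "The ship sailed on; the crew watched the whale.",
}

# The vocabulary is the sorted set of all tokens across the corpus.
vocabulary = sorted({w for text in corpus.values() for w in tokenize(text)})

# The document-term matrix: one row of per-word counts per document.
dtm = {
    name: [Counter(tokenize(text))[w] for w in vocabulary]
    for name, text in corpus.items()
}

print(vocabulary)
print(dtm["doc1"])
```

A matrix like this is the input that topic-modelling and classification tools typically expect; real tutorials would go on to filter stopwords and apply a proper tokenizer.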
Stéfan Sinclair and I (mostly Stéfan) have also produced a textbook for teaching programming to humanists called The Art of Literary Text Analysis. These tutorials are also written as Jupyter notebooks so you can download them and play with them.
We are now reimplementing them with our own Voyant-based notebook environment called Spyral. See The Art of Literary Text Analysis with Spyral Notebooks. More on this in another blog entry.
I just got an email announcing the soft launch of the Canadian Social Knowledge Institute (C-SKI). This institute grew out of the Electronic Textual Cultures Lab and the INKE project. Part of C-SKI is an Open Scholarship Policy Observatory, which has a number of partners through INKE.
The Canadian Social Knowledge Institute (C-SKI) actively engages issues related to networked open social scholarship: creating and disseminating research and research technologies in ways that are accessible and significant to a broad audience that includes specialists and active non-specialists. Representing, coordinating, and supporting the work of the Implementing New Knowledge Environments (INKE) Partnership, C-SKI activities include awareness raising, knowledge mobilization, training, public engagement, scholarly communication, and pertinent research and development on local, national, and international levels. Originated in 2015, C-SKI is located in the Electronic Textual Cultures Lab in the Digital Scholarship Centre at UVic.
I’ve been playing with DataCamp’s Python lessons and they are quite good. Python is taught in the context of data analysis rather than the turtle drawing of How to Think Like a Computer Scientist. They have a nice mix of video tutorials and then exercises where you get a tripartite screen (see above). You have an explanation and instructions on the left, a short script to fill in on the upper right, and an interactive Python shell where you can try stuff below.
The Naylor Report (PDF) about research funding in Canada is out and we put it in Voyant. Here are some different views:
- Here is the default Corpus View
- Here it is in the Topics (Topic Modelling) View
- Here is the Scatter Plot (Correspondence Analysis) View (see image above)
From Slashdot, a story about an FBI game/interactive that is online and aims at Countering Violent Extremism | What is Violent Extremism?. The subtitle is “Don’t Be A Puppet” and the game is part of a collection of interactive materials that try to teach about extremism in general and encourage some critical distance from it. The game has you as a sheep avoiding pitfalls.
From the BBC, a story about how US start-up Geofeedia ‘allowed police to track protesters’. Geofeedia was apparently using social media data from Twitter, Facebook and Instagram to monitor activists and protesters for law enforcement. Access to this social media data was restricted once the ACLU reported on the surveillance product. The ACLU discovered the agreements with Geofeedia when it requested public records from California law enforcement agencies. Geofeedia had been boasting to law enforcement about its access. The ACLU has released some of the documents of interest, including a PDF of a Geofeedia Product Update email discussing “sentiment” analytics (May 18, 2016).
From the Geofeedia web site I was surprised to see that they are offering solutions for education too.
From Humanist and then MIT News: Professor Emeritus Seymour Papert, pioneer of constructionist learning, dies at 88. Papert was Piaget’s student and thought about how computers could give children a way to construct knowledge. Among other things he developed the Logo language, which I learned at one point. He also collaborated with the LEGO folk on Mindstorms, named after his book of that title.
The School for Poetic Computation is where I would study if I had the time (and money). Courses include:
- Generative Text
- Radical Computer Science
- Physical Computing
- Concepts and Theory
- Recreating the Past
The NPR show Planet Money aired a show in 2014, When Women Stopped Coding, that looks at why the participation of women in computer science changed in 1984 after rising for a decade. Unlike in other professional programs like medical school and law school, the percentage of women went from about 37% in 1984 down to under 20% today. The NPR story suggests that the problem was the promotion of the personal computer at the moment it became affordable. In the 1980s personal computers were heavily marketed to boys, which meant that far more men came to computer science in college with significant computing experience, something that wasn’t true in the 70s, when there weren’t many computers in the home and math was what mattered. The story builds on research by Jane Margolis, in particular her book Unlocking the Clubhouse.
This fits with my memories of the time. I remember being jealous of the one or two kids who had Apple IIs in college (in the late 70s) and bought an Apple II clone (a Lemon?) as soon as I had a job, just to start playing with programming. At college I ended up getting 24/7 access to the computing lab in order to be able to use the word processing available (a Pascal editor and a Diablo daisy wheel printer for final copy). I hated typing and retyping my papers and fell in love with the backspace key and the editing of word processing. I also remember the sense of camaraderie among those who spent all night in the lab typing papers in the face of our teachers’ mistrust of processed text. Was it a coincidence that the two of us who shared the best senior thesis prize in philosophy in 1982 wrote our theses in the lab on computers? What the story doesn’t deal with, and Margolis does, is the homosocial club-like atmosphere around computing. This still persists. I’m embarrassed to think of how much I’ve felt a sense of belonging to these informal clubs without asking who was excluded.