Digital humanities – How data analysis can enrich the liberal arts

But despite data science’s exciting possibilities, plenty of other academics object to it

The Economist has a nice Christmas Special on the Digital humanities – How data analysis can enrich the liberal arts. The article tells a bit of our history (starting with Busa, of course) and gives examples of new work like that of Ted Underwood. It notes criticism that DH may be sucking up all the money or corrupting the humanities, but also points out how little DH gets from the NEH pot (some $60m out of $16bn), which is hardly evidence of a takeover. The truth is, as they note, that the humanities are under attack again and the digital humanities don’t make much of a difference either way. The neighboring fields that I see students moving to are media arts, communication studies, and specializations like criminology. Those are the threats, but also sanctuaries, for the humanities.

SHAPE

SHAPE is a new collective name for those subjects that help us understand ourselves, others and the human world around us. They provide us with the methods and forms of expression we need to build better, deeper, more colourful and more valuable lives for all.

From an Australian speaker at the INKE conference I learned about SHAPE, or Social Sciences, Humanities & The Arts For People & The Economy. This is an initiative of the London School of Economics, the British Academy and the Arts Council of England, and it tries to complement the attention given to the STEM fields. I like how they play on the word “shape” in their various assets, as in:

The shape of then.

The shape of now.

The shape of if.

The shape of when.

You can read more in this Guardian story on University and Arts Council in drive to re-brand ‘soft’ academic subjects.

Gather

Gather is a video-calling space that lets multiple people hold separate conversations in parallel, walking in and out of those conversations just as easily as they would in real life.

Kisha introduced me to Gather, a cross between Second Life and Zoom. With a Gather account you can create a space – your own little classroom with different gathering spots. People then move around these 8-bit animated spaces, and when they come within hearing distance of each other they can video conference. Users can also read posters that have been put up, browse documents left around, or watch videos created for a space. It looks like a nice sort of space for a class to use as an alternative to Zoom.
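The proximity mechanic that makes Gather different from Zoom can be illustrated with a small sketch. This is hypothetical code of my own, not Gather’s actual implementation: avatars on a grid are joined into a call only when they come within a (made-up) hearing radius, so separate conversations can run in parallel in the same room.

```python
from dataclasses import dataclass
from itertools import combinations

HEARING_RADIUS = 3  # grid tiles; a hypothetical value for illustration

@dataclass
class Avatar:
    name: str
    x: int
    y: int

def in_earshot(a: Avatar, b: Avatar) -> bool:
    """Two avatars can talk when their grid (Manhattan) distance is small enough."""
    return abs(a.x - b.x) + abs(a.y - b.y) <= HEARING_RADIUS

def conversations(avatars: list[Avatar]) -> list[set[str]]:
    """Group avatars into parallel conversations (connected components of earshot)."""
    groups: list[set[str]] = [{a.name} for a in avatars]
    for a, b in combinations(avatars, 2):
        if in_earshot(a, b):
            ga = next(g for g in groups if a.name in g)
            gb = next(g for g in groups if b.name in g)
            if ga is not gb:       # merge the two conversations
                ga |= gb
                groups.remove(gb)
    return groups

# Ann and Ben stand close together; Cam is across the room, in a separate call
room = [Avatar("Ann", 0, 0), Avatar("Ben", 1, 1), Avatar("Cam", 10, 10)]
print(conversations(room))
```

Walking an avatar across the room simply changes its coordinates, which on the next grouping pass moves it in or out of a conversation – the “walking in and out” behaviour the blurb describes.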

A Digital Project Handbook

A peer-reviewed, open resource filling the gap between platform-specific tutorials and disciplinary discourse in digital humanities.

From a list I am on I learned about Visualizing Objects, Places, and Spaces: A Digital Project Handbook. This is a highly modular textbook that covers a lot of the basics of project management in the digital humanities. They have a call now for “case studies (research projects) and assignments that showcase archival, spatial, narrative, dimensional, and/or temporal approaches to digital pedagogy and scholarship.” The handbook is edited by Beth Fischer (Postdoctoral Fellow in Digital Humanities at the Williams College Museum of Art) and Hannah Jacobs (Digital Humanities Specialist, Wired! Lab, Duke University), but parts are authored by all sorts of people.

What I like about it is the way they have split up the modules and organized things by the type of project. They also have deadlines which seem to be for new iterations of materials and for completion of different parts. This could prove to be a great resource for teaching project management.

The Whiteness of AI

This paper focuses on the fact that AI is predominantly portrayed as white—in colour, ethnicity, or both. We first illustrate the prevalent Whiteness …

“The Whiteness of AI” was mentioned in an online panel following The State of AI Ethics report (October 2020) from the Montreal AI Ethics Institute. The article starts from the observation that if you search Google Images for “robot” or “AI” you get predominantly images of white (or blue) entities. (Go ahead and try it.) From there it moves to “White people; and the persistent tendency of members of that group, who dominate the academy in the US and Europe, to refuse to see themselves as racialised or race as a matter of concern at all.” (p. 686)

The paper then proposes three theories about the whiteness of AI to make it strange and to challenge the myth of colour-blindness that many of us in technology-related fields live in. Important reading!

Blogging your research: Tips for getting started

Curious about research blogging, but not sure where to start?

Alice Fleerackers and Lupin Battersby of the ScholCommLab have put together a good post on Blogging your research: Tips for getting started. Despite being committed to blogging (this blog has been going since 2003), I must admit that I’m not sure blogging has the impact it once had. Twitter seems to have replaced blogging as a way to quickly share and follow research, while blog platforms like WordPress have become project news and promotion systems.

What few talk about is how blogging can be a way of journaling for oneself. My blog certainly serves as a form of memory by and for myself. Even if I am the only one who searches it (which I often do when I’m looking for something I once knew but have forgotten), it is still useful. Does everything in academia have to be about promotion and public impact?

In this age of fake news we seem to be back in the situation that Socrates and Gorgias sparred over in Plato’s Gorgias. Gorgias makes the point that the orator or, in today’s terms, the communications specialist, can be more convincing than the scholar because they know how to “communicate”.

Socrates: Then the case is the same in all the other arts for the orator and his rhetoric: there is no need to know [459c] the truth of the actual matters, but one merely needs to have discovered some device of persuasion which will make one appear to those who do not know to know better than those who know.

Gorgias: Well, and is it not a great convenience, Socrates, to make oneself a match for the professionals by learning just this single art and omitting all the others? (Gorgias 459a)

It certainly feels like today there is a positive distrust of expertise such that the blatant lie, if repeated often enough, can convince those who want to hear the lie. Does communicating about our research have the beneficial effect we hope it does? Or, does it inflate our bubble without touching that of others?

Freedom Online Coalition joint statement on artificial intelligence

The Freedom Online Coalition (FOC) has issued a joint statement on artificial intelligence (AI) and human rights.  While the FOC acknowledges that AI systems offer unprecedented opportunities for human development and innovation, the Coalition expresses concern over the documented and ongoing use of AI systems towards repressive and authoritarian purposes, including through facial recognition technology […]

The Freedom Online Coalition is a coalition of countries including Canada that “work closely together to coordinate their diplomatic efforts and engage with civil society and the private sector to support Internet freedom – free expression, association, assembly, and privacy online – worldwide.” It was founded in 2011 at the initiative of the Dutch.

The FOC has just released a Joint Statement on Artificial Intelligence and Human Rights that calls for “transparency, traceability and accountability” in the design and deployment of AI systems. They also reaffirm that “states must abide by their obligations under international human rights law to ensure that human rights are fully respected and protected.” The statement ends with a series of recommendations or “Calls to action”.

What is important about this statement is the role it recommends for the state. This is not a set of vapid principles for developers to voluntarily adhere to; it calls for appropriate legislation.

States should consider how domestic legislation, regulation and policies can identify, prevent, and mitigate risks to human rights posed by the design, development and use of AI systems, and take action where appropriate. These may include national AI and data strategies, human rights codes, privacy laws, data protection measures, responsible business practices, and other measures that may protect the interests of persons or groups facing multiple and intersecting forms of discrimination.

I note that yesterday the Liberals introduced a Digital Charter Implementation Act that could significantly change the regulations around data privacy. More on that as I read about it.

Thanks to Florence for pointing this FOC statement out to me.

Why basing universities on digital platforms will lead to their demise – Infolet

I’m republishing here a blog essay, originally in Italian, that Domenico Fiormonte posted on Infolet and that is worth reading.

Why basing universities on digital platforms will lead to their demise

By Domenico Fiormonte

(All links removed. They can be found in the original post – English Translation by Desmond Schmidt)

A group of professors from Italian universities have written an open letter on the consequences of using proprietary digital platforms in distance learning. They hope that a discussion on the future of education will begin as soon as possible and that the investments discussed in recent weeks will be used to create a public digital infrastructure for schools and universities.


Dear colleagues and students,

as you already know, since the COVID-19 emergency began, Italian schools and universities have relied on proprietary platforms and tools for distance learning (including exams), which are mostly produced by the “GAFAM” group of companies (Google, Apple, Facebook, Microsoft and Amazon). There are a few exceptions, such as the Politecnico di Torino, which has adopted instead its own custom-built solutions. However, on July 16, 2020 the European Court of Justice issued a very important ruling, which essentially says that US companies do not guarantee user privacy in accordance with the European General Data Protection Regulation (GDPR). As a result, all data transfers from the EU to the United States must be regarded as non-compliant with this regulation, and are therefore illegal.

A debate on this issue is currently underway in the EU, and the European Authority has explicitly invited “institutions, offices, agencies and organizations of the European Union to avoid transfers of personal data to the United States for new procedures or when securing new contracts with service providers.” In fact the Irish Authority has explicitly banned the transfer of Facebook user data to the United States. Finally, some studies underline how the majority of commercial platforms used during the “educational emergency” (primarily G-Suite) pose serious legal problems and represent a “systematic violation of the principles of transparency.”

In this difficult situation, various organizations, including (as stated below) some university professors, are trying to help Italian schools and universities comply with the ruling. They do so in the interests not only of the institutions themselves, but also of teachers and students, who have the right to study, teach and discuss without being surveilled, profiled and catalogued. The inherent risks in outsourcing teaching to multinational companies, who can do as they please with our data, are not only cultural or economic, but also legal: anyone, in this situation, could complain to the privacy authority to the detriment of the institution for which they are working.

However, the question goes beyond our own right, or that of our students, to privacy. In the renewed COVID emergency we know that there are enormous economic interests at stake, and the digital platforms, which in recent months have increased their turnover (see the study published in October by Mediobanca), now have the power to shape the future of education around the world. An example is what is happening in Italian schools with the national “Smart Class” project, financed with EU funds by the Ministry of Education. This is a package of “integrated teaching” where Pearson contributes the content for all the subjects, Google provides the software, and the hardware is the Acer Chromebook. (Incidentally, Pearson is the second largest publisher in the world, with a turnover of more than 4.5 billion euros in 2018.) And for the schools that join, it is not possible to buy other products.

Finally, although it may seem like science fiction, in addition to stabilizing proprietary distance learning as an “offer”, there is already talk of using artificial intelligence to “support” teachers in their work.

For all these reasons, a group of professors from various Italian universities decided to take action. Our initiative is not currently aimed at presenting an immediate complaint to the data protection officer, but at avoiding it, by allowing teachers and students to create spaces for discussion and encouraging them to make choices that combine their freedom of teaching with their right to study. Only if the institutional response is insufficient or absent will we register, as a last resort, a complaint to the national privacy authority. In this case the first step will be to exploit the “flaw” opened by the EU court ruling to push the Italian privacy authority to intervene (indeed, the former President, Antonello Soro, had already done so, but received no response). The purpose of these actions is certainly not to “block” the platforms that provide distance learning and those who use them, but to push the government to finally invest in the creation of a public infrastructure based on free software for scientific communication and teaching (on the model of what is proposed here and which is already a reality, for example, in France, Spain and other European countries).

As we said above, before appealing to the national authority, a preliminary stage is necessary. Everyone must write to the data protection officer (DPO) requesting some information (attached here is the facsimile of the form for teachers we have prepared). If no response is received within thirty days, or if the response is considered unsatisfactory, we can proceed with the complaint to the national authority. At that point, the conversation will change, because the complaint to the national authority can be made not only by individuals, but also by groups or associations. It is important to emphasize that, even in this avoidable scenario, the question to the data controller is not necessarily a “protest” against the institution, but an attempt to turn it into a better working and study environment for everyone, conforming to European standards.

How’s the Alberta PSE Re-Think Going?

Anyways, in sum: the emerging Alberta 2030 recommendations are for the most part banalities.  Not necessarily bad banalities – there are a lot of worthy ideas in there, just none which suggest any evidence of innovative thinking or actual learning from other jurisdictions.  But there are two obvious flashpoints, neither of which seems very promising ground for the government to launch fights.

Alex Usher has just posted How’s the Alberta PSE Re-Think Going? (Part 2) which, surprise, follows How’s the Alberta PSE Re-Think Going? (Part 1). Part 1 deals with whether the McKinsey review of Post-Secondary Education is worth the $3.7 million the province is paying for it. (It is not!) Part 2 looks at the recommendations.

What Usher doesn’t talk much about is the “Building Skill for Jobs” aspect of the whole exercise. The assumption is that PSE is all about giving students skills so they can get jobs. I also suspect that the skills imagined by the government are mostly those needed by the energy industry, even though the jobs might not be there in the future. As Usher puts it, “most UCP policy is a nostalgia play for the resource boom of 2004-2014”.

The two flashpoints Usher mentions are 1) a recommendation to deregulate tuition, balanced by needs-based financial aid, and 2) a recommendation to have fewer boards: instead of a board for each institution, there could be just one board for the whole research university sector.

We shall see.

An Anecdoted Topography of Chance

Following a rambling conversation with his friend Robert Filliou, Daniel Spoerri one day mapped the objects lying at random on the table in his room, adding a rigorously scientific description of each. These objects subsequently evoked associations, memories and anecdotes from both the original author and his friends …

I recently bought a copy of Spoerri and friends’ artist’s book, An Anecdoted Topography of Chance. The first edition dates from 1966, but that was based on a version that passed as the catalogue for an exhibition by Spoerri in 1962. This 2016 version has a footnote to the title (in the lower right of the cover) that reads,

* Probably definitive re-anecdoted version

The work is essentially a collection of annotations to a map of the dishes and other things that were on Spoerri’s sideboard in his apartment. You start with the map, which looks like an archaeological diagram, and follow anecdotes about the items that are, in turn, commented on by the other authors. Hypertext before hypertext.

While the work seems to have been driven by the chance items on the small table, there is also an autobiographical element where these items give the authors excuses to tell about their intersecting lives.

I wonder if this would be an example of a work of art of information.