Racism, misogyny, death threats: Why can’t the booming video-game industry curb toxicity? – Silicon Valley

Silicon Valley is reprinting a story from the Washington Post, Racism, misogyny, death threats: Why can’t the booming video-game industry curb toxicity? The story is one more account of how nasty online gaming can be. The usual companies try to reduce the toxicity of game culture and don’t really succeed. So are we left to just ignore it?

With no clear methods to effectively monitor, halt or eliminate toxic behavior, many in the gaming community have simply tried to ignore it and continue playing anyway. Many of the titles cited most for toxic players remain the industry’s most popular.

Peter Robinson, “Textual Communities: A Platform for Collaborative Scholarship on Manuscript Heritages”

Peter Robinson gave a talk on “Textual Communities: A Platform for Collaborative Scholarship on Manuscript Heritages” as part of the Singhmar Guest Speaker Program of the Faculty of Arts.

He started by talking about whether textual traditions had any relationship to the material world. How do texts relate to each other?

Today stemmata as visualizations are models that go beyond the manuscripts themselves to propose evolutionary hypotheses in visual form.

He then showed what he is doing with the Canterbury Tales Project and then talked about the challenges adapting the time-consuming transcription process to other manuscripts. There are lots of different transcription systems, but few that handle collation. There is also the problem of costs and involving a distributed network of people.

He then defined text:

A text is an act of (human) communication that is inscribed in a document.

I wondered how he would deal with Allen Renear’s argument that there are Real Abstract Objects which, like Platonic Forms, are real but have no material instance. When we talk, for example, of “Hamlet” we aren’t talking about a particular instance, but an abstract object. Likewise with things like “justice,” “history,” and “love.” Peter responded that the work doesn’t exist except as its instances.

He also mentioned that this is why stand-off markup doesn’t work: texts aren’t sets of linear objects. It is better to represent a text as a tree of leaves.
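One way to see the contrast (my illustration, not Robinson’s example, and with invented sample data): embedded markup carries the hierarchy with the text as a tree, while stand-off markup pins annotations to character offsets in a presumed-linear base text, so any revision of the text silently invalidates the annotations.

```python
import xml.etree.ElementTree as ET

# Embedded markup: the structure travels with the text as a tree of leaves.
doc = ET.fromstring("<line><w>Whan</w> <w>that</w> <w>Aprill</w></line>")
words = [w.text for w in doc.iter("w")]

# Stand-off markup: annotations point at offsets in a separate linear base text.
base = "Whan that Aprill"
standoff = [(0, 4, "w"), (5, 9, "w"), (10, 16, "w")]

# Both recover the same words -- until the base text is edited:
assert words == [base[s:e] for s, e, _ in standoff]

base = "Whan that Aprille"  # one emendation...
print([base[s:e] for s, e, _ in standoff])  # ...and the last offset now clips the word
```

In the embedded version, editing a word leaves the tree intact; in the stand-off version, every offset after the edit point would have to be recomputed.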

So, he launched Textual Communities – https://textualcommunities.org/

This is a distributed editing system that also has collation.
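Collation here means aligning multiple witness transcriptions word by word to expose their variant readings. As a rough sketch of the idea (not Textual Communities’ actual algorithm, and using invented readings), Python’s standard-library difflib can align two witnesses:

```python
from difflib import SequenceMatcher

def collate(witness_a, witness_b):
    """Align two witness transcriptions word by word and report variant readings."""
    a, b = witness_a.split(), witness_b.split()
    variants = []
    for tag, i1, i2, j1, j2 in SequenceMatcher(None, a, b).get_opcodes():
        if tag != "equal":  # any insertion, deletion, or substitution is a variant
            variants.append((" ".join(a[i1:i2]), " ".join(b[j1:j2])))
    return variants

# Two (invented) readings of the same line:
print(collate("Whan that Aprill with his shoures soote",
              "Whan that Aprille with hise shoures sote"))
# → [('Aprill', 'Aprille'), ('his', 'hise'), ('soote', 'sote')]
```

A real collation engine would handle many witnesses at once, transpositions, and orthographic normalization, which is where the hard work lies.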

Cybersecurity

The New York Times has a nice short video on cybersecurity, which is increasingly an issue. One of the things they mention is how it was the USA and Israel that may have opened the Pandora’s box of cyberweapons when they used Stuxnet to damage Iran’s nuclear programme. By using a sophisticated worm first, we both legitimized the use of cyberwar against countries with which one is not at war and showed what could be done. This, at least, is the argument of a good book on Stuxnet, Countdown to Zero Day.

Now the problem is that the USA, while having good offensive capability, is also one of the most vulnerable countries because of the heavy use of information technology in all walks of life. How can we defend against the weapons we have let loose?

What is particularly worrisome is that cyberweapons are being designed to be hard to trace and subtly disruptive in ways that fall short of all-out war. We are seeing a new form of hot/cold war in which countries harass each other electronically without actually declaring war or getting civilian input. After 2016, all democratic countries need to protect against electoral disruption, which puts democracies at a disadvantage relative to closed societies.

The oral history of the Hampsterdance: The twisted true story of one of the world’s first memes | CBC Arts

The CBC has a nice (and long) oral history about Hampsterdance: The twisted true story of one of the world’s first memes. Deidre LaCarte created the original site on Geocities in 1998 as a challenge, and it took off. As the CBC puts it, it was the original meme. You can see the original here.

It becomes clear as one reads on that none of the assets of the site were original; they were all clipart or music taken from elsewhere. Nonetheless LaCarte and others were able to make some money on the success of the site.

I personally think the first viral internet meme was the Mrs. Fields (or Neiman Marcus) cookie recipe story that circulated by email. It was an urban legend about being billed $250 for a recipe by a Mrs. Fields store and then sharing that recipe. According to Snopes this legend has quite a history going back to a 1948 cookbook.

The Truth About ‘Video Game Addiction’


Recently the World Health Organization included “gaming disorder” in the International Classification of Diseases (ICD) 11.

Gaming disorder is defined in the draft 11th Revision of the International Classification of Diseases (ICD-11) as a pattern of gaming behavior (“digital-gaming” or “video-gaming”) characterized by impaired control over gaming, increasing priority given to gaming over other activities to the extent that gaming takes precedence over other interests and daily activities, and continuation or escalation of gaming despite the occurrence of negative consequences.

For gaming disorder to be diagnosed, the behaviour pattern must be of sufficient severity to result in significant impairment in personal, family, social, educational, occupational or other important areas of functioning and would normally have been evident for at least 12 months.

Needless to say, this has raised hackles in the gaming world. One balanced article is The Truth About ‘Video Game Addiction’ in Kotaku.

The Aggregate IQ Files, Part One: How a Political Engineering Firm Exposed Their Code Base

The Research Director for UpGuard, Chris Vickery (@VickerySec) has uncovered code repositories from AggregateIQ, the Canadian company that was building tools for/with SCL and Cambridge Analytica. See The Aggregate IQ Files, Part One: How a Political Engineering Firm Exposed Their Code Base and AggregateIQ Created Cambridge Analytica’s Election Software, and Here’s the Proof from Gizmodo.

The screenshots from the repository show one project called Ephemeral with the description “Because there is no such thing as THE TRUTH”. The “Primary Data Storage” of Ephemeral is called “Mamba Jamba”, presumably a joke on “mumbo jumbo”, which isn’t a good sign. What is most interesting is the description (see image above) of the data storage system as “The Database of Truth”. Here is a selection of that description:

The Database of Truth is a database system that integrates, obtains, and normalizes data from disparate sources including starting with the RNC data trust.  … This system will be created to make decisions based upon the data source and quality as to which data constitutes the accepted truth and connect via integrations or API to the source systems to acquire and update this data on a regular basis.

A robust front-end system will be built that allows an authorized user to query the Database of Truth to find data for a particular upcoming project, to see how current the data is, and to take a segment of that data and move it to the Escrow Database System. …

The Database of Truth is the Core source of data for the entire system. …

One wonders if there is a philosophical theory, of sorts, in Ephemeral. A theory where no truth is built on the mumbo jumbo of a database of truth(s).

Ephemeral would seem to be part of Project Ripon, the system that Cambridge Analytica never really delivered to the Cruz campaign. Perhaps the system was so ephemeral that it never worked and therefore the Database of Truth never held THE TRUTH. Ripon might be better called Ripoff.

After the Facebook scandal it’s time to base the digital economy on public v private ownership of data

In a nutshell, instead of letting Facebook get away with charging us for its services or continuing to exploit our data for advertising, we must find a way to get companies like Facebook to pay for accessing our data – conceptualised, for the most part, as something we own in common, not as something we own as individuals.

Evgeny Morozov has a great essay in The Guardian on how After the Facebook scandal it’s time to base the digital economy on public v private ownership of data. He argues that better data protection is not enough. We need to “to articulate a truly decentralised, emancipatory politics, whereby the institutions of the state (from the national to the municipal level) will be deployed to recognise, create, and foster the creation of social rights to data.” In Alberta that may start with a centralized clinical information system called Connect Care managed by the Province. The Province will presumably control access to our data to those researchers and health-care practitioners that commit to using access appropriately. Can we imagine a model where Connect Care is expanded to include social data that we can then control and give others (businesses) access to?

An Evening with Edward Snowden on Security, Public Life and Research

This evening we are hosting a video conferencing talk by Edward Snowden at the University of Alberta. These are some live notes taken during the talk for which I was one of the moderators. Like all live notes they will be full of misunderstandings.

Joseph Wiebe of Augustana College gave the introduction. Wiebe asked what is the place of cybersecurity in public life?

“What an incredible time?” is how Snowden started, talking about the Cambridge Analytica and Facebook story. Technology is changing and connecting across borders. We are in the midst of the greatest redistribution of power in the history of humankind without anyone being asked for their vote or opinion. Large platforms take advantage of our need for human connection and turn our desires into a weakness. They have perfected the most effective system of control.

The revelations of 2013 were never about just surveillance, they were about democracy. We feel something has been neglected in the news and in politics. It is the death of influence. It is a system of manipulation that robs us of power by a cadre of the unaccountable. It works because it is largely invisible and is all connected to the use and abuse of our data. We are talking about power that comes from information.

He told us to learn from the mistake of 5 years ago and not focus too much on surveillance, but to look beyond the lever to those putting their weight on it.

Back to the problem of illiberal technologies. Information and control are meant to be distributed among the people. The pace of change in surveillance technology has outstripped democratic institutions. Powerful institutions are trying to get as much control of these technologies as they can before there is a backlash. It will be very hard to take control back once everyone gets used to it.

Snowden talked about how Facebook was gathering all sorts of information from our phones. They (Facebook and Google) operate on our ignorance because there is no way we can keep up with changes in privacy policies. Governments are even worse with laws that allow mass surveillance.

There is an interesting interaction between governments with China modelling its surveillance laws on those of the US. Governments seem to experiment with clearly illegal technologies and the courts don’t do anything. Everything is secret so we can’t even know and make a decision.

What can we do when ordinary oversight breaks down and our checks and balances are bypassed? The public is left to rely on public resources like journalism and academia. We depend then on public facts. Yet governments can manipulate those facts.

This is the tragedy of our times. We are being forced to rely on the press. This press is being captured and controlled and attacked. And how does the press know what is happening? They depend on whistleblowers who have no protection. Governments see the press as a threat.  Journalists rank in the hierarchy of danger between hackers and terrorists.

What sort of world will we face when governments figure out how to manage the press? What will we not know without the press?

One can argue that extraordinary times call for extraordinary measures, but who gets to decide? We don’t seem to have a voice even through our elected officials.

National security is a euphemism. We are witnessing the construction of a world where the most common political value is fear. Everyone argues we are living in danger and using that to control us. What is really happening is that morality has been replaced with legalisms. Rights have become a vulnerability.

Snowden disagrees. If we all disagree then things can change. Even in the face of real danger, there are limits to what should be allowed. Following Thoreau we need to resist. We don’t need a respect for the law, but for the right. The law is no substitute for justice or conscience.

Snowden would not be surprised if Facebook’s final defense is that “it’s legal.” But we need to ask if it is right. A wrong should not be turned into a right. We should be skeptical of those in power and of the powers that shape our future. There are times in history and in our lives when the only possible decision is to break the law.

How Trump Consultants Exploited the Facebook Data of Millions

Cambridge Analytica harvested personal information from a huge swath of the electorate to develop techniques that were later used in the Trump campaign.

The New York Times has just published a story about How Trump Consultants Exploited the Facebook Data of Millions. The story is about how Cambridge Analytica, the US arm of SCL, a UK company, gathered a massive dataset from Facebook with which to do “psychometric modelling” in order to benefit Trump.

The Guardian has been reporting on Cambridge Analytica for some time – see their Cambridge Analytica Files. The service they are supposed to have provided with this massive dataset was to model types of people and their needs/desires/politics and then help political campaigns, like Trump’s, through microtargeting to influence voters. Using the models a campaign can create content tailored to these psychometrically modelled micro-groups to shift their opinions. (See articles by Paul-Olivier Dehaye about what Cambridge Analytica does and has.)

What is new is that there is a (Canadian) whistleblower from Cambridge Analytica, Christopher Wylie, who was willing to talk to the Guardian and others. He is “the data nerd who came in from the cold” and he has a trove of documents that contradict what others said.

The Intercept has an earlier and related story about how Facebook Failed to Protect 30 Million Users From Having Their Data Harvested By Trump Campaign Affiliate. It tells how people were convinced to download a Facebook app that then took their data and that of their friends.

It is difficult to tell how effective psychometric profiling with such data is, and whether it can really be used to sway voters. What is clear, however, is that Facebook is not really protecting its users’ data. To some extent Facebook is set up to monetize such psychometric data by convincing those who buy access that it can be used to sway people. The problem is not that it can be done, but that Facebook didn’t get paid for this and is now getting bad press.

Opinion | America’s Real Digital Divide

The problem isn’t that poor children don’t have access to computers. It’s that they spend too much time in front of them.

The New York Times has an important opinion piece about America’s Real Digital Divide by Naomi S. Riley from Feb. 11, 2018. She argues that TV and video game screen time is bad for children and that there is no evidence that computer screen time is helpful. The digital divide is not one of access to screens but one of attitude and education about screen time.

But no one is telling poorer parents about the dangers of screen time. For instance, according to a 2012 Pew survey, just 39 percent of parents with incomes of less than $30,000 a year say they are “very concerned” about this issue, compared with about six in 10 parents in higher-earning households.