Congratulations to Victoria Svaikovsky ARIA Intern for 2017

McGill hosted its annual event to showcase undergraduate summer research projects. Among the many amazing projects was that of .txtLAB’s Victoria Svaikovsky, who led a research project with two other students, Anne Meisner and Eve Kraicer-Melamed, studying the intersections of race and Hollywood film using computational analysis.

Their project aimed to better understand the racial inequality that has long been identified in academic and popular criticism of Hollywood. Focusing on questions of visible and audible marginalization, as well as linguistic tokenization with respect to visible minorities, Svaikovsky and her team have produced an impressive study of over 800 screenplays.

Their work will be coming out as a lab white paper in the near future. Keep an eye out!

Gender and Equity in Publishing

The Just Review team held an inspiring event last night: a roundtable of six women discussing their experiences with academic and literary publishing. It was an amazing conversation covering many different perspectives; the panel included two academics, an editor, a publisher, a novelist, and a poet. Here are some of the themes they touched on.

Cultivating Confidence

Putting oneself forward was a theme that kept recurring. Whether it was the confidence to send off a manuscript, speak up at a literary festival, or reach out to a mentor, many of the panelists discussed how they consistently had to work against their own inner inhibitions. Given their individual success, you would never guess that this is something they wrestled with. But they strongly emphasized cultivating the confidence, at as early an age as possible, to take risks, speak out, and put oneself forward.

Prioritizing Carework and Generosity

Another key theme was avoiding the myth of scarcity, by which they meant seeing opportunities for women as a zero-sum competition. Instead, they encouraged all of us to think about how to cultivate the work of others and how, in the words of one participant, “to take up less space.” This might seem to contradict the first point about putting oneself out there, but it offers another way to think of literary work: not only find your place, but do the work to make it possible for others, especially those who may have less privilege than you, to find theirs. Generosity and empathy were two states of mind that were strongly emphasized.

Creating Parastructures

Finally, a core theme that kept emerging was the importance of creating peer networks and “collectives.” Inevitably, as a woman, you will be subject to some kind of bias or discrimination in your career. These extra-institutional structures can be an important way of finding more rewarding spaces in which to work and create, and more open feedback loops to help improve your work. Creating these networks takes time, but the participants emphasized just how valuable such spaces have been in their lives and careers, whether in the form of independent presses, writing groups, or women-led gaming communities.

Much more was discussed over the hour-and-a-half event than I can cover here. But I think it was a crucial conversation to have, and one that I hope inspired the many students who were present. I know I learned an incredible amount.

AI across the Generations

I gave a talk today with Paul Yachnin to the McGill Community for Lifelong Learning on “Conscientious AI.” The idea for the event was to give the audience some understanding of how machine learning works and what you might do with it. We then asked the tables to brainstorm about what kinds of AI they would like to see: what would help them with day-to-day tasks as they age?

It was an amazing event, not only to see how engaged they were with the subject but also to see the topics they cared about. Many of the ideas related to meeting up with people, either new people or those from different generations. Some related to facilitating learning in class, especially around hearing, which is one of the biggest impediments to learning: older people have a really hard time hearing each other, and that makes for a less than satisfying educational experience. Finally, people suggested the need for a system that could create course material better suited to their interests and needs.

Besides hearing some fascinating ideas, what the event really showed me is how important it is that we engineer with people in mind. Most algorithms are designed to serve powerful interests (corporations, school boards), but we have not yet taken the plunge into user-driven AI. What do different communities need, and how can we help them? We should stop thinking in terms of hockey-stick curves of consumer growth and start thinking more about people.

More important was an issue that came up during the Q&A. Most people are very afraid of AI. They see how it seems to drive things like polarization and unemployment. Why, and how, could it be a force for good? The main point I tried to bring home (the point I always try to bring home) is that AI is a political good that can be used to serve our interests if we treat it as something open and communal. If seniors participate in the design of algorithms meant to serve seniors, and if teachers and students participate in the design of algorithms meant to serve education, then we will have AI that is responsive to human needs rather than humans constantly responding to AI.

Mainly it was just fun to be there with so many curious, conscientious learners.

On Prestige Bias in the Chronicle of Higher Ed

The Chronicle of Higher Education ran a version of our essay on the concentration of institutional prestige as its cover story this week. In it we expand on our reflections about how to change the current system. The essay is based on our original piece, which appeared in Critical Inquiry. Here is an excerpt from the new essay:

The current system of double-blind peer review that underlies most academic publications is essentially an invention of the second half of the 20th century. Its failings have been well documented and numerous projects in the sciences as well as the humanities are now underway to change it. Almost all of these fixes, however, continue to rely on two basic principles: First, that communities of scholars still make intuitive judgments about quality (judgments which are rarely, if ever, made explicit); and second, that they largely rely on established publishing practices that essentially transfer content from one place (the lab or the desk) to another (the library).

What we are imagining, by contrast, is a new form of algorithmic openness, in which computation is used not as an afterthought or means of searching for things that have already been selected and sorted, but instead as a form of forethought, as a means of generating more diverse ecosystems of knowledge. What values do we care about in terms of human knowledge and how can we use the tools of data science to capture and more adequately represent those values in our system of scholarly communication? Instead of subject indexes and citation rankings, imagine filtering by institutional diversity, citational novelty, matters of public concern, or any number of other priorities. How might we encode these values to create smarter, more adaptable, and more open platforms and practices?
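To make the idea concrete, here is a speculative sketch of one such filter: a re-ranking that rewards institutional diversity. Everything in it is an illustrative assumption rather than a description of any existing platform; the `Paper` fields, the `penalty` weight, and the greedy selection are stand-ins for whatever metadata and weighting a real system would choose.

```python
# A speculative sketch of "filtering by institutional diversity":
# re-rank search results so no single institution dominates the top.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Paper:
    title: str
    institution: str
    relevance: float  # e.g., an ordinary search or citation score

def diversity_rerank(papers, penalty=0.3):
    """Greedily pick papers, discounting each candidate's relevance by
    how many papers from its institution have already been chosen."""
    chosen, seen = [], Counter()
    pool = list(papers)
    while pool:
        best = max(pool, key=lambda p: p.relevance - penalty * seen[p.institution])
        chosen.append(best)
        seen[best.institution] += 1
        pool.remove(best)
    return chosen
```

The same scaffolding could encode the other values mentioned above: swap the institution counter for a measure of citational novelty or public concern, and the ranking shifts accordingly.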

It is clear from our study and others like it that elite institutions continue to be the locus of the practices, techniques, virtues, and values that have come to define modern academic knowledge. They diffuse it, whether in the form of academic labor (personnel) or ideas (publication), from a concentrated center to a broader periphery. Using digital technologies to guide the circulation of knowledge does not inherently make one complicit in the “neoliberalization and corporatization” of higher education or a practitioner of “weapons of math destruction,” to use the data scientist Cathy O’Neil’s well-turned phrase. Wisely and openly used, such technologies can help us not only reveal, but potentially undo, longstanding disparities of institutional concentration. It is time we built a scholarly infrastructure that is more inclusive and more responsive to a broader range of voices, including those outside of the academy.

Over the course of the 19th century, universities adopted many of the norms of print culture and in so doing transformed themselves into modern research universities. We need a similar reinvention for our own universities as they enter a new age.

Addressing epistemic inequality, and not simply publication inequities, will require us to rethink what universities do and what they are for in a digital age. “Digitization” means more than just transferring print practices to digital formats. We need to integrate data science, knowledge of our past practices, and contemporary understandings of institutional norms to reinvigorate the intellectual openness of the university. We need to use all of our analytical and interpretive capabilities to rethink who and what counts. The university is a technology. Let’s treat it like one.

Cultural Advocacy Internship – “Gender Bias in Book Reviews”

We are excited to announce the 2017-2018 Internship in Cultural Advocacy, focusing on gender bias in book reviews. The internship will address how women are both misrepresented and underrepresented in the public discourse of book reviewing. Book reviews are a significant cultural outlet that bestows authority, but as our lab’s new website, “Just Review,” shows, women writers are still in a variety of ways being framed according to a Victorian set of values. A team of interns will be responsible for crafting a year-long advocacy plan to address how book reviews represent women, using a combination of computational approaches, social media campaigns, and social advocacy to engage key stakeholders. We are looking for motivated, self-directed students who want to make a positive change in the world. The internship will begin on October 1, 2017, and end on April 30, 2018.

Award: $1,000
Application deadline: Wednesday, September 20, 2017
To apply, send a cover letter and résumé to alayne.moody@mcgill.ca

Congratulations to this year’s students!

We have had an excellent year at .txtLAB. I want to send out a special thanks to all of the students who have been contributing to the lab. You’ve made it a great place to work. Here is a list of projects that we’ve been working on this year:

Just Review, a student-led project on gender bias in book reviewing

For years, women have been aware that their books are less likely to be reviewed in the popular press and that women are also less likely to serve as reviewers. Projects like VIDA and CWILA were started to combat this kind of exclusion, and over time they have managed to make some change happen in the industry. Although nowhere near parity, more women are being reviewed in major outlets than were five or ten years ago.

Just Review was started in response to that belief that things were getting better. Just because more female authors are being reviewed doesn’t mean those authors aren’t being pigeon-holed or stereotyped into writing about traditionally feminine topics. “Representation” is more than just numbers; it is also about the topics, themes, and language that circulate around particular identities. In an initial study run out of our lab, we found a depressing association between certain kinds of words and a book author’s gender (even when controlling for the reviewer’s gender). Women were strongly associated with all the usual tropes of domesticity and sentimentality, while men were associated with public-facing terms related to science, politics, and competition. It seemed we had made little progress beyond a largely Victorian framework.
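Our study’s methods aren’t reproduced here, but for readers curious what this kind of analysis can look like, here is a minimal sketch using a smoothed log-odds ratio, a standard way to surface words over-represented in one corpus relative to another. The function name, the whitespace tokenization, and the smoothing constant are illustrative assumptions, not our actual pipeline.

```python
# A minimal sketch (not the lab's actual pipeline) of surfacing gendered
# word associations with a smoothed log-odds ratio.
import math
from collections import Counter

def log_odds(reviews_of_women, reviews_of_men, alpha=0.5):
    """Return {word: score}; positive scores lean toward reviews of
    female-authored books, negative toward male-authored ones."""
    counts_f = Counter(w for text in reviews_of_women for w in text.lower().split())
    counts_m = Counter(w for text in reviews_of_men for w in text.lower().split())
    n_f, n_m = sum(counts_f.values()), sum(counts_m.values())
    vocab = set(counts_f) | set(counts_m)
    return {
        w: math.log((counts_f[w] + alpha) / (n_f + alpha * len(vocab)))
           - math.log((counts_m[w] + alpha) / (n_m + alpha * len(vocab)))
        for w in vocab
    }
```

A fuller analysis would add an informative prior and significance testing (along the lines of Monroe, Colaresi, and Quinn’s “Fightin’ Words” method) and, as in our study, would control for the reviewer’s gender.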

To address this, we created an internship in “computational cultural advocacy” in my lab focused on “women in the public sphere.” We recruited five amazing students from a variety of disciplines and basically said, “Go.”

The Just Review team set about understanding the problem in greater detail, working together to identify their focus more clearly (including developing the project title, Just Review) and reaching out to stakeholders to learn more about the process. By the end, they had created a website, advocacy tools to help editors self-assess, recommendations for further reading, and a computational tool that identifies a book’s themes based on labels from Goodreads.com. Given an author’s gender and a book’s ISBN (or even its title), you can create a table that lists the themes of the books reviewed by a given outlet. When we did this for over 10,000 book reviews in the New York Times, we found strong thematic biases at work, even in an outlet prized for its gender equality.

[Figure: thematic bias in New York Times book reviews. Topics are identified from Goodreads; topics that show no bias are omitted.]
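For the technically curious, the tallying step at the heart of such a tool can be sketched in a few lines. This is a hypothetical reconstruction, not the team’s code: it assumes the Goodreads shelf labels have already been gathered into a local dictionary keyed by ISBN, and that each review has been matched to the book author’s gender.

```python
# A hypothetical sketch of the tallying step behind a theme-by-gender table.
# Assumes shelf labels were already collected into a local dict.
from collections import Counter

def theme_table(reviews, shelves):
    """reviews: (isbn, author_gender) pairs for books reviewed by an outlet.
    shelves: isbn -> list of Goodreads-style shelf labels ("themes").
    Returns {theme: Counter of review counts by author gender}."""
    table = {}
    for isbn, gender in reviews:
        for theme in shelves.get(isbn, []):
            table.setdefault(theme, Counter())[gender] += 1
    return table

# Toy example with made-up data:
reviews = [("9780000000019", "F"), ("9780000000026", "M"), ("9780000000033", "F")]
shelves = {"9780000000019": ["romance", "family"],
           "9780000000026": ["politics", "war"],
           "9780000000033": ["family"]}
print(theme_table(reviews, shelves))
# {'romance': Counter({'F': 1}), 'family': Counter({'F': 2}),
#  'politics': Counter({'M': 1}), 'war': Counter({'M': 1})}
```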

Beyond the important findings they have uncovered, the really salient point about this project is the way it has been student-led from the beginning. It shows that, with mentoring and commitment, young women can become cultural advocates: they can take their academic interests, apply them to existing problems in the world, and effect change. There has been a meme for a while in the digital humanities that the field is alienating to women and feminists. I hope this project shows just how integral a role data and computation can play in promoting ideals of gender equality.

We will be creating a second iteration in the fall that focuses on getting the word out and tracking review sites more closely with our new tools. Congratulations to this year’s team, who have taken a major step toward putting this issue on the public’s radar.