Thinking About Working Issue #2

Written by Noah Leavitt, Director of the Career and Community Engagement Center

Will AI read your resume?

During Whitman’s Convocation, Professor of Computer Science Janet Davis invited our community to consider how increasingly sophisticated computational power might influence life during college. Provocatively, she asked the students assembled in Cordiner, “Will AI write your thesis?” – drawing attention to the extraordinary potential incursion of technology into some of the quintessential aspects of being a Whittie. A more provocative question is hard to imagine.

Many of us in the Career and Community Engagement Center (CCEC) perked up when we heard Davis, because her topic extended a discussion we have been having over the past couple of years: how these same forces are changing the way work is done, what the evolution of those trends might mean for our students, and, by extension, how we can best prepare them to successfully enter and move through the world of work.

Following Convocation, as students have been coming to our office to apply for campus jobs and off-campus internships, we have been reflecting on what we have learned over the past few years about how AI and massive data sets have changed the processes for applying to internships and jobs.

AI is fundamentally reshaping job and internship searches and review processes. While students may imagine a benevolent mid-level manager reclining, feet up on a desk, sunlight streaming through the window behind them, thoughtfully reviewing each carefully formatted and printed cover letter, in more and more situations the reality is quite different.

In his essential 2018 work, 21 Lessons for the 21st Century, historian Yuval Noah Harari predicted that “In the twenty-first century, the decision whether to hire somebody for a job will increasingly be made by algorithms.” (61).

Harari’s prediction was prescient. Just a year later, Vox reported, “recruiters are increasingly using AI to make the first round of cuts and to determine whether a job posting is even advertised to you. Often trained on data collected about previous or similar applicants, these tools can cut down on the effort recruiters need to expend in order to make a hire. Last year, 67% of hiring managers and recruiters surveyed by LinkedIn said AI was saving them time…” (italics mine)

In the CCEC, we increasingly hear about resumes and cover letters being scanned and read by computers that search not just for keywords but for entire strings of language to assess candidates’ skills and suitability for positions.

Many of our advising sessions with students focus on, as one of my colleagues likes to say, “how to robot-proof their resumes” and other application materials. We draw on ideas like the helpful ones here (The Muse) and here (Medium) to explain the value of including keywords, focusing on skills, and bringing an almost Hemingway-like clarity to the writing. (A great introduction to this topic is this recent Wall Street Journal article instructing the reader How to Make Sure Your Résumé Passes Muster With an AI Reader.)
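To make the screening concrete, here is a minimal sketch, in Python, of the kind of keyword matching an applicant tracking system might perform as a first pass. The keyword list, scoring rule, and function name are illustrative assumptions for this post, not any vendor’s actual method; real systems are far more sophisticated, parsing phrases and context as described above.

```python
# Hypothetical sketch of first-pass ATS keyword screening.
# The scoring rule (fraction of required keywords present) is an
# assumption for illustration, not a real vendor's algorithm.

def keyword_score(resume_text, required_keywords):
    """Return (score, matched): the fraction of required keywords
    found in the resume, and which ones matched."""
    text = resume_text.lower()
    matched = [kw for kw in required_keywords if kw.lower() in text]
    return len(matched) / len(required_keywords), matched

resume = "Data analyst with Python, SQL, and data visualization experience."
keywords = ["Python", "SQL", "Tableau", "data visualization"]

score, matched = keyword_score(resume, keywords)
print(f"Matched {matched}; score = {score:.2f}")
```

Even this toy version shows why advisers urge students to mirror the language of the job posting: a resume that says “built dashboards” scores zero against a posting that screens for “data visualization,” no matter how relevant the experience.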

It’s worth noting that regardless of how students prepare their materials, it isn’t clear whether AI will necessarily bring new talent into an organization. New research reported in the Wall Street Journal this spring indicated that when applicants knew they might be reviewed through an AI screening process, they might be less likely to complete their applications, thus potentially depriving the employer of excellent candidates.

In the CCEC, we do have concerns about what we’re seeing. In the conclusion of her Convocation remarks, Professor Davis concluded that because programmers need to feed data to algorithms, AI won’t be writing students’ theses on its own in the near future. She raised a vital point that we worry about, too: the ways that programmers’ biases can infiltrate the algorithms that screen applications, leading the AI to read materials in a discriminatory manner.

In an excellent 2022 exploration of this dilemma, Dr. Sanghamitra Dutta, Assistant Professor at the University of Maryland, sharply frames the issue: “AI decisions are tailored to the data that is available around us, and there have always been biases in data, with regards to race, gender, nationality, and other protected attributes. When AI makes decisions, it inherently acquires or reinforces those biases. For instance, zip codes have been found to propagate racial bias. Similarly, an automated hiring tool might learn to downgrade women’s resumes if they contain phrases like ‘women’s rugby team.’”

All may not be bleak, however. Harari sees AI, properly trained, as a safeguard against the individual human biases and overt discrimination that can creep into subjective decisions: “Once we discover such mistakes, it would probably be far easier to debug the software than to rid humans of their racist and misogynist biases…”

Are Whitman students (and others) losing important professional opportunities because of a mysterious algorithm? If we are applying for a job, would we even know that a bias-laden AI is looking at our materials? Today at noon, as part of the Sophomore Summit programming, the CCEC is hosting an alumni panel on Data and Technology, and these would be great questions for the panelists. Our alumni hold positions throughout the economy where they can see the options available to the programmers and engineers creating the data sets and algorithms that are shaping modern life, and we invite you to reach out to them with these questions!

In the next Thinking About Working later this month we’ll take up the larger and potentially more worrisome question about whether AI will not just evaluate you as a candidate but also take your job. Stay tuned!
