Work Smarter, Not Harder
The dark side of AI and its place in education.
"Work smarter, not harder" was a phrase I heard like a broken record in college. Over and over. Smarter, not harder. Academic advantages are never going away. Rather, they're evolving. Even in the three or four years I've been out of school, the avenues to 'cheat the system' have become more publicly obvious. AI like ChatGPT intersecting with education is culture's current concern. Specifically, at the collegiate level.
So I’m here to ask the question: Why? In five parts.
The Decade-Long Question Asked Once More
A Playing Field That Reflects Reality
Why OpenAI Won’t Kill The College Essay Or Student
How Much Control Should A University Have?
The Dangers of Avoiding Teaching Students About Ever-Changing Tech
If you were only to read one section, I’d encourage you to read #5.
1: The Decade-Long Question Asked Once More
Select universities are working overtime to ban the use of AI like ChatGPT in the current educational ecosystem. Their reasoning is one-dimensional and obvious: it’s an easy avenue for cheating on take-home work. Maybe this is a concern at the elementary level. But at the college level? Hmm.
The New York City education department said Thursday that it’s restricting access on school networks and devices because it’s worried about negative impacts on student learning, as well as “concerns regarding the accuracy of content.” -The AP
Isn’t this what we said about Wikipedia?
Back in my day (which was less than four years ago), people in my classes were pooling together on Chegg memberships, submitting difficult math questions to ‘experts’ on the other end who’d all but solve the equations for you in under an hour. While I didn’t have an account myself, I’d be lying if I said a Chegg account holder wasn’t my favorite person to hit the library with. Not because I could memorize answers for exams (I couldn’t), but because I could actually see, step by step, how the work was supposed to be done, which was really the only way to pass the exams and the class itself. Sometimes these step-by-step resources were given by the teachers. When they weren’t, they were given by the Internet. This was always called working smarter, not harder.
Language-based AI has entered the chat (punny, I know), and educators are being asked the decade-old question: how are you going to respond to this new tech development?
2: A Playing Field That Reflects Reality
Interestingly enough, one of my distinct memories of college is a debate held my senior year covering the infiltration of AI into the field of Human Resources. (My degree is in Business Management, by the way!) At the time, the idea of a digital assistant that could eliminate the many paperwork-heavy jobs in HR was controversial. So my professor had us debate it.
The prior summer, I’d held an internship at a company that had already implemented an AI-HR product. My job as an intern was to write a manual teaching HR employees how to use it. So when this debate came around, I quickly joined team affirmative. People are die-hard in their rebel causes at that age, so these discussions quickly became heated. I learned much about my fellow students’ concerns: Can AI be trusted to handle sensitive information? What happens when a workforce is reliant on AI? How many people will be out of jobs? Is AI going to light the planet on fire? Honestly, probably.
Flash forward past that flamboyant debate: I’m three years into a tech career where AI-HR product implementation makes up over 50% of my client work, and all of those questions are still relevant. Yet companies are already moving with it. They were already doing it in year one of my tech-work life. And yesterday, I sat on a call discussing how to safely scale publicly available generative AI, like ChatGPT, in a client-facing field, with maximum data protection.
Simply put: adjusting a client’s sails to the rapid waters of emerging technology is what pays my bills every month. And we don’t want to model this for incoming generations who will likely have an even deeper involvement in this sector?
3: Why OpenAI Won’t Kill The College Essay Or Student
The death of the college essay is a dominating concern with the advancement of AI technology. But AI isn’t there yet. And likely won’t ever be. Good writing relies on personal accounts and storytelling, even in nonfiction research. Emotional intelligence is an experience that generative AI cannot yet comprehend. I suspect, as a writer and tech employee, that truly emotional writing will never be replicable. Because AI doesn’t have these emotions. It mimics them.
Meanwhile, the students who are in school to learn a skill will learn the skill. The students who are there to blow through college will do just that. Take coding, for example. Can an engineering student breeze their way through college now that there’s an OpenAI platform at their fingertips that will write boilerplate code for them? I think not. I’m pretty sure Google already did that years ago. True engineering is a dark artistry that can’t be, and never will be, a one-and-done gig. Especially if you’re planning a career around it. Learning how to respond to your environment alongside these tools will always be important, just like it was for me in my transition to an HRIS role. But mastery lies in knowing how these tools work for you, how they work against you, and the limitations thereof. Because the limitations are far greater than what meets the eye.
4: How Much Control Should a University Have?
The difference between a successful college experience and a successful workforce boils down to the element of control. While both are places to learn, and both are by definition ‘a business,’ a successful venture adapts its practice to favor reality rather than trying to bend all of the external factors in its favor. Select colleges have historically attempted the opposite: higher education often looks to control the external rather than embrace education about where the world is today. Sometimes this is imperative, but often it does not propel students forward to be workplace-ready. Rather, it sets them back.
If you haven’t read this article by The Free Press, I’d encourage you to do so. While I’m openly against the continuation of modern fraternity life in college-settings, I still found the article to be quite interesting.
While comparing these two issues is a bit of an apples-and-oranges exercise, I do think it’s important to see how far certain universities will go to maintain a feeling of authority over their own students for the sake of a desired public image: an image of unified control.
5: The Dangers of Avoiding Teaching Students About Ever-Changing Tech
Fortunately, you cannot deep-fake your way through life. However, the Internet thrives on the idea that you just might be able to. In a world of overtly photoshopped images, AI modeling, and dangerous deepfake scams, education about how to be human during this attack on humanity is nothing short of vital if we want to protect our youth and their descendants.
When freedom of speech is legal, but so is video-based calumny, the last thing we should do to protect upcoming generations is remove these topics from modern education on a fear-based notion. When more than one-third (38%) of Gen Z consumers in the United States claim to spend more than four hours a day on social networks, the marketplace for attention becomes a competitive one for modern-day education systems.
Who is teaching our youth more: the schools or the Internet?
With numbers like these, the answer becomes clear. Shying away from educational topics like generative AI, and simply blocking them out of the education system, only reinforces the idea that what students learn about the Internet will come from the Internet itself. A dangerous cocktail so far.
In order to combat the dark side of what lies inside Pandora’s Floppy Disk, we need a unified front that encourages education about these tools in a safe learning space. Just because the nature of web culture is not physically violent does not mean it isn’t inherently harmful, or even lethal.
We wouldn’t send anyone to a battlefield without proper training. Yet we’ll give any kid a smartphone without teaching them a thing about it. On the current trajectory, digital safety must become an educational right. Even if we hate what it entails.
Thanks for reading! Please note that all opinions and statements within this work are my own and do not reflect my employer or publishing team.
Lots of love to you.