I grew up with the world in the palm of my hand. From my earliest Gameboy Color to my school-issued ThinkPad, to my first smartphone, technology was constantly at my beck and call — always ready to take a picture, play a game, send a message or answer a question. Nothing felt off-limits with such potential at my fingertips: the ability to do whatever I pleased, whenever I wanted.
The older I become, the more disillusioned I grow with this narrative — technology is not as democratic as any of us were initially led to believe. In a society that increasingly prioritizes tech-forwardness and speed, rising costs price people out of the industry’s newest and greatest products, putting them at a disadvantage: not everyone can afford the fastest Wi-Fi connection or the latest smartphone.
According to a report published by the Code.org Advocacy Coalition, only 57.5% of high schools offer foundational computer science, preventing many youth from accessing the field entirely. Furthermore, students who are Black, Hispanic, Latino, or Native American are less likely to attend a school with computer science offerings — compounding an already fraught disadvantage in education and the workforce.
There is another, less apparent risk threatening this precariously painted illusion.
When computing began entering the consumer market, it was in the form of the Commodore PET 2001, Apple II and TRS-80, which formed the “1977 Trinity”: three middle-class friendly personal computers that dominated retail and kickstarted the consumer computer industry. These systems, however, had one caveat: you needed to know BASIC, a programming language, to use them.
As the name indicates, BASIC was intentionally made learner-friendly so consumers, students, business owners and hobbyists could learn it. Writing commands in BASIC, users were able to operate their systems and write programs tailored to their needs; they weren’t just working a computer, they were making it work for them. A revolution began, ushering in a new era where anyone could communicate, research, or create and publish media.
Today, these capabilities remain, but our means of access have changed greatly. As consumers have accepted tech as the center of our lives, the opposite is also true: tech has centered consumerism.
To sell the most units, companies often sacrifice utility for simplicity. As early as the mid-1990s, Microsoft’s operating system innovations prioritized what tech journalist David Chernoff in 1999 called the “totally inexperienced user” — neglecting to improve features widely employed by legacy users, such as the right-click context menu — to focus on making the “easiest-to-use” operating system “ever.” Though it may seem like a niche concern, one can observe a snowball effect leading to where we are now, decades later: Corporate entities are only making it harder for users to retain control over their systems.
This includes everything from taking away the ability to move or resize taskbars, as Microsoft did in 2021 with Windows 11, to forbidding users from downloading software not notarized by the operating system’s company, as Apple did in 2019. Companies have also made their products nearly impossible to repair, allowing them to price-gouge for certified parts and technicians and invalidate expensive warranties should you attempt anything on your own. The dilemma is obvious: Should corporations be the authority on what we are and are not allowed to do with our technology?
Though many of us may say no, the trend persists. Using apps (the ones you’re actually permitted to download, at least) requires agreeing to extensive terms and conditions dictating the way you talk and the selling of your data. While buying a laptop and its operating system opens a whole world of possibilities, it also forecloses those available only on the platforms you didn’t choose. And when your laptop breaks, you can’t even teach yourself how to repair it. People who don’t have access to computer science education in schools — oftentimes those who are already socially disadvantaged — may find their troubles further exacerbated by the reduced ability to experiment with the technology that is at their disposal.
The unfortunate reality is that technology is not as democratic as one would hope. Largely controlled by corporate interests, the industry picks and chooses what consumers have access to, crafting a limited usability that prioritizes its own goods and magnifies issues to push products manufactured by the companies themselves — selling both the problem and the solution.
We can find more freedom in using our devices by channeling the spirit of the early technological revolution: challenging ourselves to create things, solve problems and share knowledge.
Consider teaching yourself to code and building a website to use with friends. When you think about replacing your current device, investigate building a custom one that meets your needs. There is power in knowledge — in learning how our devices work, we can reclaim technology and make it work for us again.
Ellie Willhite (she/her) is a freshman pursuing her BFA in cinematic arts with minors in sociology and Korean.