Some pieces of technology are so ubiquitous that it’s hard to imagine what things were like before they came along. Fifteen years ago today, the first modern smartphone – Apple’s first-gen iPhone – went on sale in the United States.
While business users had been answering emails on their BlackBerries (remember those?) for a few years beforehand, this signaled the start of the smartphone revolution. Before long, competitors such as Samsung and Lenovo had jumped into the category, along with Google’s Android platform. Here in 2022, an estimated 3.5 billion people worldwide – well over one in three – own a smartphone.
Want to know 15 ways the smartphone changed life as you know it? Keep on scrolling (probably on said mobile device).
Back in 2005, Microsoft co-founder Bill Gates told the German newspaper Frankfurter Allgemeine Zeitung that the days were numbered for Apple’s biggest hit of the time: the iPod music player. Lots of Apple fans took this as Gates simply being bitter about Apple’s then-recent renaissance. In fact, Gates’ point was totally valid. The iPod, he suggested, wouldn’t be sustainable long-term because, at some point, a cell phone would come along that would offer MP3 playback as one of many features. Just a couple of years later, Gates’ hypothesis was proven correct – by Apple.
It’s not just the iPod that was cannibalized by the smartphone, though. There are plenty of other gadgets – such as flashlights, voice recorders, GPS units, and more – that have been absorbed by the smartphone in the years since it made its debut.
Even tasks that once required the mighty personal computer can oftentimes be carried out on our phones. As with a Swiss Army knife, fitting these myriad tools into a single unit means they’re not always perfect replacements. But they’re frequently “good enough” for the majority of people.
In some senses, the wallet is just another on the long list of items above that the smartphone has absorbed during its reign. But it’s probably the most important one. Services like Apple Pay and Samsung Wallet have rapidly become integral to our lives. When you manage to change established behavior in such a fundamental way, that’s pretty impressive. And incredibly lucrative for the companies who control these ecosystems.
The first touchscreens didn’t arrive with the smartphone; they had been around in various forms for several decades beforehand. And, virtually without exception, they were horrible. With Apple’s impressively sensitive multi-touch technology on the iPhone (which allowed the screen to recognize multiple points of contact simultaneously), the smartphone ushered in an era in which touchscreens actually worked the way sci-fi movies had promised us they would. With multi-touch came a slew of new gestures, such as pinch-to-zoom.
You only have to walk into a computer store today and spot the fingerprints on traditional desktop monitors to realize that, here in 2022, touch controls are considered the default by many people. We have the smartphone to thank for that.
There was a time, not all that long ago, when your devices didn’t know exactly where in the world you were at any given moment – and, even if you told them, they wouldn’t snitch on you. The smartphone changed that. Geolocation opened up a bevy of new applications, from Waze-style crowdsourced traffic measurement on mapping apps to local weather readings to new sources of information that companies can use to target you with ads.
Heck, this geolocation information has even been used in murder trials to reveal where a defendant was at any given moment.
A bit like touchscreens, it wasn’t all that long ago that facial recognition and fingerprint sensors were bits of sci-fi that stood in as shorthand for “this story is set in the future.” No longer. Although it has yet to totally replace passwords, biometric technology has progressed in leaps and bounds over the past decade and a half – and it’s on smartphones where most of us have encountered it.
Today, smartphone security systems like Touch ID and Face ID have become so commonplace as to be almost unremarkable. That’s pretty remarkable.
Smartphones were a game-changing technology for many people. But imagine if you didn’t have a computer prior to the smartphone. Or a landline phone. For lots of people in the developing world, smartphones represented a seismic shift in their ability to access information and communicate. While the price of high-end smartphones has crept up in recent years, even the most expensive phones are still far cheaper than comparably high-end computers.
Meanwhile, there is no shortage of low-cost phones with impressive specs. Since cellular connectivity is a more practical solution than hardwired phones and data lines in developing parts of the world, smartphones have made their influence felt very, very quickly.
Amazon changed the game when it patented a one-click ordering button in 1999. Quickly licensed by Apple, and imitated as closely as non-paying rivals could get away with, this innovation took 99% of the hassle out of e-commerce. In the age of the smartphone, this methodology has been applied to just about every other aspect of our lives.
Want a taxi? Uber lets you order one with a couple of taps. Looking for a date? Swipe left and right on Tinder until you find a match. Want to listen to an album by that artist whose music is playing in the store you’re in right now? Shazam will tell you their name, and Spotify will give you access to their complete back catalog, no album purchases required. And so on and so forth. This expectation has become standard issue.
This is the dark underbelly – or, perhaps, the inevitable flipside – of on-demand culture. As it turns out, smartphone users are also resources to be tapped whenever required. Constant connectivity in this context can be a downside: whether it’s the creeping expectation that all of us are reachable to answer emails or Slack messages at any hour of the day (hey, it only takes a minute or so!) or the mental health toll of “doom scrolling” Twitter and other social media platforms. It’s a thorny topic with pros and cons on both sides, but it’s most certainly a social phenomenon that just didn’t exist in the same way before smartphones took over our lives.
There are very, very few occasions these days when you wish that you had a camera with you, but do not. That’s because almost all of us carry around impressively adept, almost impossibly slimline digital cameras in our pockets. And, boy, do we love to use them!
Some 85% of photos today are reportedly taken on smartphones, and hundreds of millions of them are uploaded to social media each day. While smartphone cameras won’t substitute for a high-end professional DSLR or broadcast video camera, the combination of hardware and software advances means they’re more than capable. (Capable enough, certainly, that phone cameras have been used to shoot some professional movies.)
Famed photographer Chase Jarvis once opined that “the best camera is the one that’s with you.” That, for many of us, makes the smartphone the best camera we own – or, at least, the one we rely on the most.
Social media of some sort has existed for as long as the internet itself. Facebook, the world’s biggest social network, emerged in the pre-smartphone era. So did Twitter. But smartphones were the natural companion piece to social media platforms, giving us the ability to connect on the go, complete with geolocation, the aforementioned smartphone camera, and a second screen (or, sometimes, the first screen) that was always with us.
The smartphone begat social media platforms from Instagram to TikTok to Tinder, all of which felt tailor-made for the smartphone era. Other social media platforms quickly learned they had to pivot to mobile-first in order to thrive. There would still be social media if the smartphone hadn’t come along. However, it most certainly wouldn’t be as ubiquitous.
Windows versus Mac? OK, boomer. Today’s platform war is firmly centered on mobile devices, most notably Google’s Android platform and Apple’s iOS. Android is far more widespread, while iOS hoovers up the overwhelming majority of the money in the mobile ecosystem. Both platforms are polished and capable, each with its own pros and cons, but that hasn’t stopped folks from getting into heated debates about which is “the best.”
There are countless smartphone apps that have used machine learning in various smart ways at this point. But, for many people, the real turning point was the introduction of Siri, Apple’s smart AI assistant, with the iPhone 4S in 2011. While Siri had previously been available in a somewhat altered form as an iPhone app, Apple made several tweaks and packaged it as a native feature.
Just over a decade later, AI assistants like Amazon’s Alexa and Google Assistant have graduated to become their own category of hardware device, found in an impressive number of homes around the world. But gadgets like the Amazon Echo would never have become a thing if the feature hadn’t first made its debut on the smartphone – which, in the process, gave the world a piece of AI technology we would happily carry around and chat with.
Activism in the digital age can get a bad rap, with some critics referring dismissively to the era of “slacktivism” as a form of low-stakes, low-effort activism that asks supporters to do nothing more arduous than signing an online petition or changing their Facebook profile picture. But smartphones have most assuredly changed the way that activism is carried out – in ways far more significant than just Change.org petitions.
In particular, the powerful combination of constant connectivity and ubiquitous smartphone cameras has helped to spread awareness about some of the most notable news events of our time. Think the Arab Spring protests and the Occupy Wall Street movement in the early 2010s, or, more recently, the video recordings of George Floyd’s murder. In all these cases, smartphones played a key role in the way these events were reported on and disseminated. The Vietnam War is often cited as having changed American media through the advent of trends like “hit and run” reporting. Smartphones have triggered a similar change when it comes to citizen journalism, social justice, and the like.
Whether it was the iOS App Store or the Google Play store, the smartphone helped pioneer a new way for developers to find an audience for their products. One- or two-person teams had been creating software alongside the big companies since the earliest days of the personal computer. But what they lacked was a distribution channel that harnessed the reach of major companies like Apple and Google to aid with discoverability. Smartphone app stores changed that.
They also created entirely new business models, as seen in the rise of “freemium” apps: free to download but full of in-app purchases. Recently, challenges like Epic Games’ battle against Apple have questioned the legality of gatekeepers like Apple taking such a large cut of developers’ revenue. Regardless of how this turns out, however, there’s no doubt the app store has changed software distribution as we know it.
Some technologies aren’t just notable in their own right, they also act as bridges that take us from one era of computing to the next. Smartphones fall into both camps. While they certainly represented a game-changer in their own right, they additionally changed our expectations about what a computer was – particularly when it came to portability.
Sure, there were laptops in the 1990s. However, these weren’t used on the move nearly as effortlessly as smartphones. As such, smartphones helped set the stage for a new age of computing built around wearable devices like smartwatches and beyond. Would we have accepted the latter without smartphones preceding them? I’d argue we wouldn’t.