I’m a Lyft Driver. My Passengers Act Like I’m Part of the App

When riders get into my car, it’s as if the human behind the wheel disappears.

In the future, ride-hailing passengers may conjure autonomous vehicles to their doorsteps with a few taps in an app. Currently, the vehicles come equipped with drivers. Some passengers, however, seem to have moved beyond our technological limitations to a conceptual world where human drivers have fallen into desuetude. In these riders’ minds, we are already in an era of autonomous vehicles.

I’m a flesh-and-blood Lyft driver in Portland, Oregon. I drive a 2016 white Kia Optima Hybrid. I pick up passengers in the early morning dark. When I begin to accelerate with a rider in my back seat, I sometimes imagine that, after reaching a certain speed, the real me disappears. I’m replaced by an illusory me—the generic, invisible driver whom many Lyft passengers seem to have in mind.

When I started working as a Lyft driver a few years ago, I talked and acted the way I do among the many strangers I engage with every day. But I soon found that passengers behaved differently than I expected people to behave in a stranger’s car, with the stranger behind the wheel. Later, I realized my behavior had changed too.

My experiences are those of a driver, but I think about how the rides must feel from the passenger’s perspective. Based on what some have told me, they feel uncomfortable during entirely silent rides. So I always start with a couple of generic conversation prompts. It’s pretty easy to tell when someone wants a quiet ride. But other people are looking for someone to talk to, and I’m happy to oblige. Passengers also tell me about experiences with other drivers who’ve made them uncomfortable by hitting on them or going on political rants. I’ve also read about incidents where drivers have been assaulted by passengers, and passengers by drivers. Passengers and drivers are two strangers in a moving vehicle. Some of what seems to be passengers’ indifference to me is probably their way of dealing with what is a potentially frightening experience.

However, many passengers seem to lump the app, my car, and me into one mega-app. It’s as if initiating the ride through an app transforms what is, in reality, a guy driving them somewhere in his car into a different kind of experience altogether.

“The abstract hypothesis isn’t crazy,” says Robin Hanson, associate professor of economics at George Mason University. He drew an analogy to the way servants have been treated throughout history. There are servants you need to talk to if you need to get something done, and there are servants you don’t need to talk to. “I would think the app would be moving people from the first to the second category.”

“When we’re using these apps, our focus on doing the transaction is in carrying out the steps in the app as opposed to connecting with another person,” says Susan Schneider, director of the Center for the Future Mind at Florida Atlantic University. “A person becomes just a step in an app. But it’s important to remember there is a person in that transaction, and not treat the person as just another algorithmic step.”

Seeming invisible isn’t the thing I have a problem with. What disturbs me more is how experiencing life through apps may prompt us to reframe our experiences as something different from what they are, into experiences abstracted from objective reality. A layer is created through which empathy and care don’t necessarily pass. That layer is a product of using services designed to make our lives easier, but it also disconnects us from each other. In my case, I felt as if I was seeing how passengers would act if they were in an autonomous car.

I’m an attentive listener, but sometimes I zone out. Machines, however, are indefatigable listeners. My gig work has made me cognizant of that. Here’s an example.

On December 25, 2020, I drove to pick up a woman and her mother. When passengers request Lyft rides, they can drag the “location pin” in the app to precisely where they want to be picked up. The pickup location for this ride wasn’t somewhere I could drive my car, although I was able to get within 25 feet. As the two got into my car, the daughter lit into me about not being where she had placed the pin. She called me every name in the book and demanded I “call Lyft” for a refund. The mother dramatically began to say her daughter’s name over and over, astonished that her daughter was doing this and pleading with her to apologize. The daughter wouldn’t let it go. I canceled the ride. Afterward, I kept hearing the mother saying her daughter’s name over and over. I retain vivid memories of my worst-behaved passengers. I assume holding on to these faces increases my survival odds; some people have suggested it’s a symptom of PTSD.

Two months later, I looked at my Facebook feed and saw the woman’s face in my “People You May Know” section. This spooked me. How’d she get there? What if she were stalking me, wanting to get in another rant about the location pin? I didn’t want to friend her on Facebook or befriend her in real life. I couldn’t help hearing her in the car, and it seems Facebook was paying attention too. Google may have been tracking the ride as well.

Thirty years ago, Hanson says, “before there was so much surveillance or computer information on people, when the issue came up, everyone was really concerned about it.” Now, people keep “wishfully thinking that even though they’ve given away lots of information, it’s not being used against them or it won’t be used against them very much.” And it’s never been made clear to them how easy it would be for the information to be shared, he says. In my experience as a driver, when you see others through an app like Lyft, it’s easy to view them as abstractions, as less than real, and to treat them accordingly.

I provide an essential service to many of my passengers. For different reasons, they are incapable of driving themselves, and I serve as a sort of souped-up, albeit ad hoc, service animal. Passengers may not have access to public transportation or they may have medical conditions that prevent them from driving. It’s gratifying to know I’m helping people get to work or just to get out of their house.

In order to make the most money in the shortest time, though, you do have to pick up people from bars. So alcohol consumption is a catalyst for many Lyft rides. The friend sitting next to you in the bar who tells you you’ve had enough to drink has been replaced by a ride-hailing app that gives you the ability to drink as much as you want because there’s always a driver to come get you, a few taps away on your phone. Some of these rides are harrowing for me.

My mega-app theory remains the best explanation I have to account for the things that people say and do in my car when they feel that no human worth acknowledging is around. And the issues aren’t as obvious as the stain left by that Instant Pot’s worth of sweet-and-sour meatballs spilled on my back seat. They are things like privacy and medical issues, my own liability for what passengers do, and maintaining our senses of human dignity.

Just for example, I’ve listened to passengers conduct entire Zoom therapy sessions. I’ve feared two passengers had fatally overdosed in my car. A couple started to undress each other while I took them through the 24-hour McDonald’s drive-thru. I’ve been threatened numerous times. I had a passenger who relentlessly demanded I take him to get Viagra pills. “You must know how,” he said. “Why won’t you answer me?” He wouldn’t believe I had no idea how to procure the pills and wouldn’t stop asking until I finally dropped him off at the seedy motel he’d entered in the app as his destination. I drove a woman who, after we started the trip, said I had to hurry because her appendix was about to burst.

However, the odd thing is that passengers don’t seem to be malicious or disrespectful people on the whole. It seems more that the way we’re trained to engage with the apps we use to get around, order food, or just make our lives better also trains us to no longer sense the human—a human like me—on the other end of the experience.

Many of my passengers probably just want a quiet and safe ride, and I provide that service. I do worry that in relying on apps, we’re giving up something that is essentially human. Working as a driver, however, has made me reevaluate my view of social relationships, which are more and more a function of technology. It’s also made me more compassionate toward others and mindful of certain snap judgments I unconsciously make about passengers. I’ve met some incredible passengers from a wide variety of backgrounds.

I know this can all come off like yet another driver complaining about the customers who pay him, but we can learn to operate from a place of social engagement first, and then disengage when we’re concerned about our boundaries or safety. Here are a few tips before you hail that next ride:

  • Don’t be afraid to say hello to your driver.
  • If you want a quiet ride, it’s OK to ask.
  • If you prefer a different route, just let your driver know.
  • If you have any questions, ask away.

There are news stories every week about autonomous cars. They imply a future where ride-hailing as we know it is supplanted by a service powered by self-driving cars. With each story I read, I think, nah, the era of autonomous vehicles is already here. People just haven’t gotten around to realizing it.

