“Alexa, Should We Trust You?”


Whether or not your home contains one of the so-called “smart speakers,” I encourage you to spend 15 minutes with the cover story of The Atlantic’s November 2018 issue. Judith Shulevitz’s “Alexa, Should We Trust You?” serves as a captivating look at the present state of voice-activated virtual assistants and an even-handed consideration of the future of devices like the Amazon Echo, Google Home, and Apple HomePod. (Full disclosure: I do not own any of these technologies, but I have seen several in operation in friends’ households.)

Before I share a few more comments about the article, it is useful to identify the distinctions between the pieces of hardware that may be sitting on your kitchen counter or bedroom dresser and the web-connected digital assistants that these gadgets allow users to interact with. The most popular line of smart speakers is the Amazon Echo, which — as of 2018 — 31 million U.S. users have plugged into a socket. The Echo hosts the upbeat female voice of Alexa. The bulk of Shulevitz’s article examines the relationship between human users and this admittedly robotic yet eerily articulate domestic oracle.

Less popular than the Echo/Alexa pairing — at least as measured by units sold — is the combination offered by Google, the search engine titan that powers 90% of web inquiries. Google’s devices, which number roughly 14 million active units in the United States, are identified as the Home series. The web-connected medium that they channel is the Google Assistant. Lastly, one of the most recent entries into the smart speaker market is the Apple HomePod. This cute, bulbous device lets users interact with Siri, the virtual assistant that many iPhone owners are familiar with.

With definitions out of the way, let’s move on to a few teasers in the hopes that you make time to read this worthy piece. The article’s introduction features some noteworthy statistics, namely that 8 million Americans own three or more smart speakers (i.e. multiple devices are found in their homes) and that, by 2021, estimates suggest there will be nearly as many smart speakers in operation as there are people on earth. That’s a stunning amount of domestic tech with ears eager to hear what we are up to. As Shulevitz notes, “They [the device manufacturers] want to colonize space. Not interplanetary space. Everyday space: home, car, office.”


And the fact that digital assistants like Alexa are constantly waiting for our input is one of the reasons that trust is a central theme that runs through the psychological aspects of human-computer interaction that Shulevitz explores. Users are, after all, not conversing with an entity located next to the kitchen sink; rather, they are speaking to a voice-recognition system whose computing “brain” is housed in gigantic server farms humming inside warehouses far from our homes. In reality, devices like the Amazon Echo serve as portals. Our words pass through them even though it feels like we are speaking with them.

When we employ the “wake word” that cues Alexa or Siri to pay attention, it prompts a system that inhabits a sprawling climate-controlled facility. Our queries about everything from the weather in Miami to the number of Ted Bundy’s victims are then tallied, analyzed, and categorized by the algorithm-infused AI infrastructure. These vast digital networks dispassionately learn everything they can about our likes, dislikes, and curiosities with the ultimate goals of (1) being able to predict what we might next desire, and (2) establishing metrics about how we — and others like us, based on demographics — think and behave. We are entrusting our personal data to Big Tech, and those firms are hungry to harvest it.

Although sharing information online is certainly not a new concept, especially given the billions of people worldwide who use Facebook and other social media platforms, the trust dynamic shifts when the devices we are interacting with can talk intelligently with us. When prompted by a seemingly humane entity, people can be coaxed into posing intimate questions and sharing private confessions. Large technology companies possess the power — and, in many cases, the incentive — to sell the fruits that grow from these sensitive topics to bidders who want to market products, services, and lifestyle aspirations to us. Consider this potentially unnerving fact from the article: if a web-connected Roomba sweeps crumbs and dog hair from your linoleum, then its manufacturer (iRobot) owns a partial floor-plan of your residence and knowledge of where your furniture is located. (!)

If the preceding paragraph suggests that The Atlantic’s cover story serves as an ominous deep-dive into the dark intentions of Silicon Valley, that would be inaccurate. Instead, the article is far more about our human nature and what our willingness (and desire) to direct questions to small devices that possess a voice — but no eyes, face, or conscience — means about our curiosities and vulnerabilities. We seek to be informed. And we don’t want to be lonely. Increasingly, it appears that we are willing to ask Alexa for assistance with both. What does it suggest about the human condition if we are more comfortable telling the black soup-can-like pod on the windowsill that we are feeling down than we are picking up the phone and discussing that same sadness with a friend?


Judith Shulevitz writes that, “We’ve been reacting to human vocalizations for millions of years as if they signaled human proximity.” And until now, they have. But today, in houses around the world, there are children, teens, and adults holding conversations with voices whose convenient accessibility and lack of moral judgement signal not a human proximity, but the nearness of a tantalizingly powerful amalgamation of trillions of data points arranged in the silhouette of a human form.

That form provides correct answers — at least, in many cases — but can it yield the right answers? And what does it mean if we believe that it can? It is the latter question that I think needs to be carefully considered. Are we nearing a cultural tipping point where humanity deems it not only desirable to request objective clarification from virtual assistants (e.g. “Alexa, how many acres can fit in a square mile?”) but also preferable to seek understanding or comfort from their AI brains (e.g. “Alexa, why am I struggling to make friends at my new school?”) instead of from fellow human beings?

In assessing one outcome of the quickening pace of technological advancement — and, in doing so, identifying a potentially alarming reality — Shulevitz remarks: “The line between artificial voices and real ones is on its way to disappearing.” If (or when) this distinction is fully realized, what will happen to the perceived value of face-to-face interactions? If discussing difficult subjects with a faceless digital assistant becomes good enough — or even better than talking to one’s closest confidants — will humanity have taken a step forward in sync with technology, or a step backward from genuine intimacy? Please, do not delay another minute. Follow this link to The Atlantic’s full article. You’ll be glad you did. Trust me.

Note — The images in this post, which were created by acclaimed illustrator Roberto Parada, are featured in The Atlantic’s online article and its November 2018 print edition.

Habits — with author James Clear


This afternoon I finished listening to Rich Roll’s interview with entrepreneur and author James Clear, and I was impressed. Actually, I was surprised and impressed. So much so that I am recommending the podcast — which is available in video form on YouTube here — because I believe it will be worth your time if you are trying to either establish good habits or break bad ones.

A regular follower of Rich Roll’s weekly podcast, I listened to a preview of his interview with James Clear last week. It was there that I learned that Clear was being featured as an expert on the topic of habits. Never having heard of the man, I searched the web for details about his education but could find nothing — no evidence of a Ph.D., university affiliation, or history of peer-reviewed scholarly publications. Having read enough books about psychology to know that it is not a field where one can make casual claims, I was suspicious of Clear’s authority on the subject of behavior change. My suspicions deepened when I learned that his new (and only) book is entitled Atomic Habits (2018).

As an English teacher, I can’t imagine any relationship between the words atomic and habits that seems reasonable. After all, atomic most frequently precedes either bombs or energy. And habits seem to have nothing to do with cataclysmic warheads or slamming tiny particles together to produce usable energy. Why didn’t James Clear choose a more authoritative — or at least serious — adjective for his first book? Like strategic or formidable or purpose-driven. Even Life-changing Habits would suggest content that is substantive rather than sensational.

For me, Atomic Habits sounds like a title that a motivational speaker would sell — not an educated investigator who had spent years delving into the science behind motivation, decision making, and cognitive processing. And in order to take Rich Roll’s interview seriously, I was really hoping for the latter. That’s what I thought before I started listening several days ago. Thankfully, I learned that my doubts were largely (though not entirely*) unfounded.

Yes, the title of James Clear’s book still feels like an odd choice. But the man who wrote it seems legitimate even though he has not earned a Ph.D. in psychology. Clear has a passion for understanding the human condition, and he appears to have done his homework. In fact, just a few minutes into the interview he references author Charles Duhigg, a Pulitzer Prize-winning journalist. Before Duhigg transitioned into long-form non-fiction, he was a respected reporter for the New York Times. In 2012 he wrote The Power of Habit, a brilliant behavioral analysis that features over 60 pages of source notes. As an investigator and a writer, Duhigg is The Real Deal. I highly recommend his book, which I own and have read thoroughly. That’s my stack of hand-written notes next to it.


Anyway, Rich Roll’s interview with James Clear is very engaging; I learned a number of strategies and perspectives (ways of thinking differently, you might say) that I can immediately implement to encourage the formation of better habits and begin the disassembly of poor ones. For instance, Clear advocates rehearsing the first two minutes of any behavior that you’d like to become a routine. How can two minutes possibly make a meaningful difference? Consider the following example:

Let’s say you’d like to improve your health by taking several 30-minute walks each week. In order to engage in this low-impact exercise, you must first put on the proper socks and shoes, grab your keys, put on your jacket, and walk out the door. Clear asserts that if you move methodically through that two-minute routine several times each week — from opening your sock drawer to locking the front door behind you — you will ingrain the habit of setting off with intention. Remember: if you won’t step out onto the porch, you can’t take a long walk. Therefore, the most important part of this fledgling habit is arguably its first two minutes.

Obviously, you can’t return to the warm comforts of your family room after this two-minute scenario and expect to see any health gains. So on several of those evenings you continue beyond the two minutes and complete your 30-minute walk. As a consequence of this sustained effort, positive health results will slowly begin manifesting. Meanwhile, the two-minute rehearsals that occur on the evenings that do not extend to the half-hour walk will add value because they perpetuate the habit of getting you out the door. And Clear believes that what prevents most people from establishing positive habits is that they don’t have the discipline to simply begin the process.

Having not read (or even seen firsthand) Clear’s book, I remain cautious in my endorsement of his scholarship. That is why the asterisk* appears in the fourth paragraph. But I am confident in recommending Rich Roll’s interview with this first-time author. Their conversation is very engaging, and I found myself nodding in agreement with many of the insights that Clear offers as well as most of the well-reasoned answers that he provides to Roll’s questions. Consider giving it a listen, or watching it on YouTube.  Links are featured above. And if you choose to read Atomic Habits, please let me know your thoughts!

Note — The image at the beginning of this post was obtained from Rich Roll’s website.

Brené Brown’s “Dare to Lead” is here!


The latest book by renowned social worker Brené Brown is now available. Dare to Lead, which is the sixth title published by the respected University of Houston researcher and ground-breaking TED speaker, is likely to receive both scholarly and popular praise.

Dr. Brown, who is the author of four #1 New York Times bestsellers, has pioneered a new understanding of the roles of vulnerability and shame in the human condition. Her writing — which draws on decades of research largely conducted via personal interviews — has positively impacted thousands of readers from all walks of life.

Please see Brené Brown’s website for more details about Dare to Lead. It is sure to be an informative and inspiring read, especially for those interested in the culture of leadership that exists at their workplace, church, non-profit, or community organization. Amazon reviews of the book can be found here.

Note — The images featured in this post were obtained from Brené Brown’s website.

“Always On” Work Culture


On Friday, October 5, 2018 the Wall Street Journal published an online article entitled “How to Disconnect from ‘Always On’ Work Culture.” The following day the same piece appeared in the WSJ’s weekend print edition (Saturday/Sunday, October 6-7, 2018) with the title “Far From The Madding Co-Workers.” Its author, Matthew Kitchen, is the newspaper’s “Off-Duty Gear & Gadgets” editor.

Judging by the writer’s curious job title, I should have lowered my hopes for the article that graced section D’s front page beneath a captivating illustration by artist Steve Scott. Unfortunately, I didn’t. I dove into the first column like a social scientist expecting the latest research results from universities in Helsinki, Tokyo, or London mixed with insightful analysis about the status of blue-collar and white-collar workers. Though the article features some telling social commentary, that content feels overshadowed by the tech recommendations that Kitchen shares to better handle the onslaught of electronic requests from bosses and colleagues. Bummer.

In my eyes, a promising opportunity to explore a problematic element of Western work-life — our inability to detach from the office — was not fully realized. Rather, the topic’s critical mass was mostly skirted in favor of a half-dozen suggestions for apps (the ubiquitous moniker that causes techies and non-techies alike to squeal with delight) that could be used to slow the rush of work-related e-mail and texts. Given Kitchen’s “Gear & Gadgets” job title, however, I understand that his approach to this article is entirely reasonable. He is, after all, the WSJ’s tech guy. But I can’t help but ask this question: When are we going to get serious about the fact that the intrusions of our “always on” career focus are created because the cultural mindset around work — and not so much the technology that the office embraces — is the issue that needs unpacking?

Sadly, the idea of getting serious is thrown halfway out the proverbial window when Kitchen opens his piece with this sentence: “I have a masochistic need to please bosses, so I’m never more than a few feet from my iPhone (notifications humming at all hours) and I never leave home without a MacBook in tow” (page D1). In terms of setting the tone, this sentence suggests that what is to follow is not going to be an evenly-balanced assessment of how the desire for career stability (and/or advancement) is complicating the need for work-life balance, familial intimacy, and long-term personal sanity. Rather, the author seems to be letting readers know that he isn’t going to tread very far from his charging cords and touch screens.

Kitchen tosses another wrench in the works when he remarks in the second paragraph that he is a millennial. While I understand his desire to be forthcoming about his age bracket (and thereby imply that he possesses an inherent affinity for digital technology), I find it troubling that he does not follow this statement with a caveat. And that caveat would be that even though millennials have been accused of being self-centered, entitled, and virtually addicted to technology because of their status as “digital natives,” they have also been shown to be remarkably astute social and cultural critics. Respected men and women from the Greatest Generation have given millennials their due credit, praising their cultural consciousness, awareness of civil rights conflicts around the globe, and desire to confront injustice in nearly all areas of commerce and social welfare.

Isn’t the infiltration of work-related communication into time that should be dedicated to child-rearing or bonding with one’s spouse a form of injustice? The middle of Kitchen’s article seems to indicate that it is, as he cites research from several studies and entertains thoughts of altering his own “masochistic need” to hover over his devices at all hours. But in the article’s conclusion he offers — in an unsuccessful stab at humor — that he is about to share “the ultimate key to work-life balance” except “actually wait. Can you hold on a second? I gotta take this” (page D11). I find this ending not only a largely failed attempt at cleverness, but also a sad commentary on how educated adults voluntarily let work-related requests (and/or demands) run rampant over time that should be spent with friends and family — or even with just oneself.

This criticism aside, the article features some valuable research results as well as warnings about the negative impacts of our Pavlovian response to smartphone alerts. Consider these findings, which I am reproducing verbatim:

  • “According to a 2016 study by the Academy of Management, employees tally an average of 8 hours a week answering work-related emails after leaving the office.”
  • “[A] Harris Poll for the American Psychological Association found that 30% of men and 23% of women regularly bring work home.”

Offsetting these depressing (yet certainly not surprising) research results are the following developments that foster hope that we can curtail the invasion of work into our personal lives:

  • “In 2017, France instituted a new labor law that supports a new frontier in human rights, the ‘Right to Disconnect.'”
  • “Similar rights have been extended in Italy and the Philippines, are being explored in Germany and Luxembourg and were proposed in New York City.” (Note — Given NYC’s failed attempt at limiting the sizes of soft-drink containers in 2013-2014, I have doubts as to whether this new and certainly worthy initiative will find traction.)

I will not hide my disappointment with the fact that my hopes were unrealized, but I do credit Kitchen for acknowledging the subject. The Wall Street Journal should also be recognized for deeming this article relevant enough to place it on section D’s first page. Granted, section A would have been ideal. But I admit that it is very difficult to criticize digital technologies in 2018 — a time when they add tremendous value to our lives — without fear of being labeled a Luddite. I simply wish a greater discourse was occurring around when, where, and how we employ these technologies. When left unchecked, use can turn quickly to abuse. We owe it to ourselves — and to our relationships with family, friends, and fellow community members — to talk (and write) more candidly about “always on” work culture.

TRIBE, by Sebastian Junger


“What people miss [about combat] presumably isn’t danger or loss but the unity that these things often engender. There are obvious stresses on a person in a group, but there may be even greater stresses on a person in isolation” (p. 92-93).

It’s been over a week since I finished Sebastian Junger’s TRIBE (2016), a slender work of non-fiction by the bestselling author of The Perfect Storm, War, and Fire. Since completing this quick read, I have found myself repeatedly glancing at its matte black cover and feeling drawn uncomfortably to its title — the word TRIBE. For nearly ten days something has bothered me about it. Not until this afternoon, while copying passages from the book onto a yellow legal pad, did I finally determine why the title provokes me. Its typeface features a camouflage pattern. On the surface, this is fitting. But symbolically, the camouflage is indicative of a troubling fact about our society.

For a book that often refers to military service in order to explore the differences between tribal societies and modern Western culture, the use of camouflage is ideal. After all, the external face of the U.S. military – especially the Army and Marines – mixes olive drab, dark brown, chocolate, and greenish-yellow. Thus the camouflage typeface is not only appropriate in its connection to soldiers’ fatigues, but it appeals to shoppers whose passing gaze may fancy the green-brown-yellow pattern that adorns everything from women’s yoga pants to pre-teens’ school backpacks. Camouflage is cool.

But this afternoon I realized why the pattern has been nagging me. I value Junger’s use of military references to help readers understand the distinctions between tribal societies (both historical and contemporary) and modern Western culture. But it dawned on me today that it is the purpose of camouflage that has been provoking my discomfort; camouflage helps its host disappear. Drawing on influences from the natural world, humans have disguised their appearance for well over a hundred years through the use of specially-crafted garb. The goal of such clothing is to blend in to one’s surroundings, to become invisible against the background.

Understandably, on the battlefield and behind enemy lines, soldiers want to achieve invisibility. If you can’t be seen, it’s much less likely that you will be shot or bombed. But when soldiers return from duty and re-enter civilian life, what occurs if they are still camouflaged? Not from face-paint or jungle fatigues, but from the fact that most civilians in modern Western culture are at least partially – and in many cases largely — invisible inside their communities. Junger’s thesis asserts that disconnection has become widespread in the United States and western Europe, and that servicemen and women are not the only ones feeling lost. Rather, the entirety of modern Western culture is showing signs of alienation from community-centered values.

Junger believes that this disconnection, this sense of feeling invisible, is due to several causes: (1) the lack of a crisis around which people must band together in order to survive, (2) the affluence of modern society and the fact that its luxuries are prized when they are amassed instead of shared, and (3) the belief that success has become a solo effort, not one measured by improvements in group health and welfare. He writes: “Whatever the technological advances of modern society – and they’re nearly miraculous – the individualized lifestyles that those technologies spawn seem to be deeply brutalizing to the human spirit” (p. 93). These are heavy words.


Much of the work’s 136 pages features a fascinating historical analysis of why our human spirit has been eroding over the last several centuries. The first quarter of the book documents stories from the American frontier, a time during which a striking number of European settlers found more appealing living conditions with Native American cultures than they did with the colonies that they were, by nationality, a part of. As a result, both men and women migrated from colonies to tribal encampments. And even when rescue parties were sent out to bring these individuals home, they often resisted; in fact, more than a few hid from their rescuers. For those who left the colonies to join tribal life, the tight bonds of Native American cultures were more reassuring than what they were experiencing in “civilized” settlements on the east coast.

For the remaining chapters, the author turns toward World Wars I and II, conflicts during which the need to band together (as both civilian communities and military units) coincided with dramatic decreases in people’s self-reported depression, anxiety, and stress. Because the members of those populations joined together for a common cause, they formed connections with strangers. They focused on others instead of themselves because their well-being (if not their survival) could be ensured only if the group remained viable. One of the most powerful examples that the book features is the well-documented phenomena that manifested in London during the Blitz. Despite weeks of brutal aerial bombardments by the German Luftwaffe, the citizens of London experienced a pronounced increase in spirit and emotional intimacy at the very time that their lives were most threatened. Imminent danger catalyzed relationship building.

The author writes: “What catastrophes seem to do – sometimes in the span of a few minutes – is turn back the clock on ten thousand years of social evolution. Self-interest gets subsumed into group interest because there is no survival outside group survival, and that creates a social bond that many people sorely miss” (p. 66). With the exception of natural disasters like floods, hurricanes, and wildfires, most Americans are largely insulated from anything resembling a true catastrophe. Although global warming, poverty, and discrimination are very pressing problems, they do not possess the commanding immediacy of an invasion by a foreign army, the outbreak of a communicable disease, or a power outage that puts half of the nation in the dark. Consequently, we rarely depend on others. This results in a situation where Junger writes: “A person living in a modern city or a suburb can, for the first time in history, go through an entire day – or an entire life – mostly encountering complete strangers. They can be surrounded by others and yet feel deeply, dangerously alone” (p. 18).

Supporting Junger’s argument are interviews with scholars and social scientists, who attest to the strengths of tribal cultures. These experts also provide sobering commentary about the ways in which modern society is falling far short of upholding the values that more primitive cultures maintain through their reliance on group dynamics. When Junger shares some of his observations with anthropologist Sharon Abramowitz and asks how suitable they are to disclose to readers, she responds: “You’ll have to be prepared to say that we are not a good society – that we are an antihuman society” (p. 93). She continues: “We are not good to each other. Our tribalism is to an extremely narrow group of people: our children, our spouse, maybe our parents. Our society is alienating, technical, cold, and mystifying. Our fundamental desire, as human beings, is to be close to others, and our society does not allow for that” (p. 94).

My only regret about TRIBE is that it is not twice as long. In my estimation, Junger has just scratched the surface on this provocative subject. Whether you are interested in Colonial American history, the impact of PTSD on servicemen and women, the dynamics of fraud and greed in the financial sector, or the health of your suburban neighborhood, I recommend investing a few hours in Sebastian Junger’s book. It is a quick read, but its content will stick with you. And after considering the author’s observations, you may understand why camouflage is both a blessing and a curse. Invisibility is beneficial on the battlefield, but it harms everyone — soldiers and civilians alike — back home.

Interested in learning more?  

  • Sebastian Junger’s TED talks are worth your time. Here is a link to his most recent one, a presentation recorded in May 2016 entitled Our lonely society makes it hard to come home from war.
  • Sebastian Junger’s film Restrepo, which won the Grand Jury Prize for Documentaries at the 2010 Sundance Film Festival, is riveting.

Note — The images contained in this post were obtained from Sebastian Junger’s website and Unsplash.com.

The BBC’s Loneliness Experiment


When it comes to loneliness, you may not be alone.

On October 1, the British Broadcasting Corporation (BBC) unveiled data compiled during its 2018 Loneliness Experiment, an online survey designed by a trio of academics that was filled out by over 55,000 individuals beginning on February 14. An article posted on BBC.com on Monday summarizes the data; it also features interviews with three of the project’s participants, who come from different walks of life.

Perhaps the most noteworthy insight revealed by what the article deems “the largest study of loneliness yet” is this: “There is a common stereotype that loneliness mainly strikes older, isolated people – and of course it can, and does. But the BBC survey found even higher levels of loneliness among younger people, and this pattern was the same in every country” (emphasis is my own). By “every country,” the authors are referencing the fact that individuals from “237 different countries, islands, and territories took part in the survey.”

The following table provides data from seven different groups based on age. The group with the greatest percentage of respondents reporting frequent loneliness comprised those between ages 16 and 24.

BBC loneliness table

The article’s authors proceed to explain that this increased prevalence of loneliness among younger people is not necessarily a generational difference (i.e. that today’s teens and twenty-somethings feel lonelier than young adults growing up decades ago). The BBC.com staffers cite the fact that older people who completed the survey indicated that the loneliest periods of their lives occurred when they were younger.

Why? The authors suggest that “The years between 16 and 24 are often a time of transition where people move home, build their identities and try to find new friends.” Whether you navigated high school and college in the 1960s or early 2000s, these circumstances generally hold true. Young people immerse themselves in new employment and educational experiences, test new living situations, and venture forth into new relationships with friends, lovers, and employers.

Although these growth initiatives can result in powerful interpersonal bonds and the security of new-found belonging, they can also yield dramatic gulfs of soul-searching, isolation from the familiar, and a demoralizing uncertainty about what comes next. Anecdotally, being young has never been easy. The data from the BBC’s 2018 Loneliness Experiment seems to suggest that this has been true for many generations. Loneliness is a common condition experienced by people of all ages — but those who are younger self-report it at slightly greater rates. Regardless of age, it may be accurate to acknowledge that we are not alone in our loneliness.

“But I didn’t want to ask for help.”


What happens when elite athletes suffer from mental illness as a result of head trauma? One answer to this question can be found in the August 2018 issue of Bicycling magazine, which features the first-person account of professional cyclist Alison Tetrick.

Before Tetrick transitioned from a successful road-racing career to her current role as a gravel-racing champion in 2017, she suffered two concussions. The first, which occurred in 2010 at the Cascade Cycling Classic in Bend, Oregon, was devastating. During the race’s first stage, another rider — who was trying to avoid a crash she could see ahead — accidentally clipped Tetrick’s front wheel. The pair were traveling downhill at an estimated 45 mph when Tetrick was launched from her bike and hit the tarmac.

In her words, “I didn’t slide, didn’t tear shorts, didn’t bleed. I just hit the ground. I landed on my hip and head, and shattered my pelvis” (p. 56). After spending over an hour on the ground waiting for first responders to arrive and stabilize her, the 25-year-old was airlifted to a hospital, where she was quickly diagnosed with a concussion, a form of traumatic brain injury (TBI). The journey that commenced after she was released from the ER included daunting physical rehabilitation and an even more challenging mental climb — a process that is still ongoing.

While her body slowly healed over the months following her crash, Tetrick’s mind seemed perpetually in a “dense, never-ending fog” (p. 56). In her relationships with friends and family alike, she struggled to maintain a stable emotional state; she felt anxious and irritable at times, and emotionally vacant at others. Her marriage ended. And she struggled to determine how she would continue as a professional athlete in a sport where confidence is key. But despite these challenges and concerns, Tetrick fought on. Her body recovered, and she made a comeback in 2011 at the Merco Cycling Classic in Merced, California. Despite winning the second-stage time trial, holding the leader’s jersey for three more days, and then winning the overall event, Tetrick knew something was wrong.

She writes: “Throughout my recovery from my broken pelvis, and after, I felt vulnerable and fragile, insecure and mentally frail…But I didn’t want to ask for help. I wanted to pull myself up by my bootstraps, cowgirl up” (p. 56). In language that testifies to the understandable fear of being an athlete who is perceived as weak or lacking in confidence, Tetrick says: “I didn’t want to admit I wasn’t okay [mentally] because if I admit that, and I’m leading a bike race, I’m going to get stuck in a corner because people know I’m going to have a panic attack…As a professional athlete, you hide your weaknesses….You can constantly find ways to tell yourself, ‘People like me. I’m normal. I’m okay'” (p. 56).

But, when dealing with a host of frightening symptoms that seem to indicate that your personality is morphing in strange ways, she admits that, “Deep down I was like, I don’t know if I’m okay” (p. 56). She continued forward, though, training and racing until disaster struck a second time in October 2011. At the Pan American Games in Guadalajara, Mexico, her front wheel got stuck in a storm drain during a pre-race warm-up. She “flew over the handlebar, and smacked [her] head” (p. 58). Despite this injury — a second head trauma in less than a year and a half — she raced that day. But her life began unraveling shortly thereafter. The once vibrant and outgoing young athlete was not okay.

Looking back on the period following her second concussion, Tetrick describes her situation this way: “I stared at the wall for weeks, couldn’t move, couldn’t stop crying. The depression wouldn’t go away. My parents sent me to psychologists…We were trying everything, because I couldn’t function. I couldn’t sleep — I had to go on sleeping pills” (p. 58). Again, she fought back. With the support of family, friends, and a neuropsychologist whom she still works with today, Tetrick got back in the saddle and started racing again. For two years she did so while using the antidepressant Wellbutrin. She reflects: “During that time I didn’t really have emotional highs or lows, I just felt flat” (p. 60).

Tetrick continued racing until shortly after she and her team finished the 2016 Tour of Flanders in Belgium. The day after that Tour, she attended a smaller race where riders “were taking all of these [unnecessary] risks” (p. 78). There, she saw a rider “hit a light pole” — a collision that seemed entirely avoidable. And that is when Alison Tetrick decided that her professional road-racing career was nearing its end.


In 2017, Tetrick traded her skinny road tires for stouter off-road rubber and entered the 200-mile Dirty Kanza gravel race, which is held in early June. It was her first time competing at that distance. And she ended up winning, sprinting to the finish line just ahead of the defending 2016 champion, Amanda Nauman. Today, Tetrick seems to be on top of her game. Physically, she is performing better than she ever has before. However, she knows that the mental consequences of her concussions still follow her. She says: “Every day you have to make a choice for your mental health, and possibly deal with the physical side effects…I still get emotionally flooded. It’s an injury that you can’t see” (p. 79).

Despite the press coverage that concussions receive in connection with sports, military service, and workplace accidents, a stigma still exists around the psychological effects of traumatic brain injuries. Even though there is nothing shameful about being struck by a fellow cyclist and crashing to the pavement — or being tackled by a monstrous defensive lineman — we still seem to tread delicately around the emotional and mood-related consequences that those massive blows can impart. Physical injury caused by others’ actions can be interpreted as a sign of having performed courageously on the field of play, but mental anguish rarely receives equal respect. Both are tragic, to be sure, but it is the latter that is often darkened by shame.

I applaud Alison Tetrick for writing candidly about the physical and mental challenges that she has faced in the wake of her concussions. Speaking about broken bones can be easy, but talking about a flagging spirit or a troubled mind requires much greater resolve. Tetrick possesses a character made stronger by her willingness to be vulnerable.

Kate Spade: More than a name


The other day I was helping a middle-aged woman check out, and I watched as she placed a slightly worn wallet down on the counter between us. In small raised gold letters, the name kate spade stood out above the black leather. In an instant, I wondered what that name — and that brand — now holds for this woman, and for others. What role does it play in their fashion sense, in their estimation of what displaying this iconic logo on a purse or shoes or belt might now symbolize?

My second thought after recognizing the tiny gold letters was this: the name of a prominent fashion pioneer has taken on new meaning. On June 5, 2018, Kate Spade took her life. She, the woman, no longer exists. But the products she developed are still ubiquitous. The namesake brand that has represented quality, sophistication, and style for many years has not necessarily shifted in identity. Those traits remain. Though the importance of that name — perhaps even its jurisdiction, its sphere of social influence — may have.

As I processed the order, the urge to ask this woman about her perception of the logo on her wallet rose in my mind. But I quickly assessed that such an inquiry coming from a stranger would not be appropriate; in fact, the question would be so charged with threatening energy that she might have been rendered speechless. Obviously, I did not want to create an extremely uncomfortable moment for her — or for me. Yet inside my brief period of curiosity and reticence lies a question. And perhaps an opportunity.

Speaking about what Kate Spade chose to do is, I believe, inherently difficult because it causes most individuals to at least consider having an internal dialogue about an act that elicits tremendous uneasiness. Suicide. More than ever, though — especially in light of Anthony Bourdain’s choice to end his life only days after Spade’s tragic demise — we need to hold these conversations. And we need to open spaces for those dialogues to occur, spaces without shame or criticism or the fear of dismissal. But how do we accomplish this?

For a moment the other day I considered initiating such a conversation with a stranger, but the unspoken guidelines of appropriate social discourse dampened that impulse. However, I wonder if similar constraints largely inhibit — and perhaps prohibit — those conversations among friends, between partners, and with children. Suicide is a scary subject. And the conditions that can lead to suicide — depression, loneliness, and low self-image among others — are often just as scary.

So too often, I fear, we simply don’t consider raising the topic of suicide with anyone — stranger or loved one. And I believe that needs to change. Because we are all so much more than just our names.

Note — The image featured above was obtained from Unsplash.com.

Suicide in the United States


According to the National Institute of Mental Health (NIMH):

  • Suicide is the 10th leading cause of death in the United States.
  • In 2016 it claimed the lives of nearly 45,000 Americans.
  • It is the 2nd leading cause of death of those ages 10-34.
  • It is the 4th leading cause of death of those ages 35-54.
  • 90% of those who died by suicide had an underlying mental illness.

Additional NIMH information about suicide can be found here.

Note — The image in this post was taken by photographer Kristina Tripkovic, and it was obtained from Unsplash.com.

Teen Depression


According to the National Institute of Mental Health (NIMH):

  • 20% of youth ages 13-18 live with a mental health condition.
  • 11% of youth have a mood disorder, a category that includes depression.
  • 50% of all lifetime cases of mental illness begin by age 14.

More information from the NIMH about teen depression can be found here.

Note — The image in this post was taken by Tiago Bandeira, and it was obtained from Unsplash.com.