“Always On” Work Culture


On Friday, October 5, 2018 the Wall Street Journal published an online article entitled “How to Disconnect from ‘Always On’ Work Culture.” The following day the same piece appeared in the WSJ’s weekend print edition (Saturday/Sunday, October 6-7, 2018) with the title “Far From The Madding Co-Workers.” Its author, Matthew Kitchen, is the newspaper’s “Off-Duty Gear & Gadgets” editor.

Judging by the writer’s curious job title, I should have lowered my hopes for the article that graced section D’s front page beneath a captivating illustration by artist Steve Scott. Unfortunately, I didn’t. I dove into the first column like a social scientist expecting the latest research results from universities in Helsinki, Tokyo, or London mixed with insightful analysis about the status of blue-collar and white-collar workers. But though the article features some telling social commentary, that content is overshadowed by the tech recommendations Kitchen shares for handling the onslaught of electronic requests from bosses and colleagues. Bummer.

In my eyes, a promising opportunity to explore a problematic element of Western work-life — our inability to detach from the office — was not fully realized. Rather, the topic’s critical mass was mostly skirted in favor of a half-dozen suggestions for apps (the ubiquitous moniker that causes techies and non-techies alike to squeal with delight) that could be used to slow the rush of work-related e-mail and texts. Given Kitchen’s “Gear & Gadgets” job title, however, I understand that his approach to this article is entirely reasonable. He is, after all, the WSJ’s tech guy. But I can’t help asking this question: When are we going to get serious about the fact that the intrusions of our “always on” career focus arise because the cultural mindset around work — and not so much the technology that the office embraces — is the issue that needs unpacking?

Sadly, the idea of getting serious is thrown half-way out the proverbial window when Kitchen opens his piece with this sentence: “I have a masochistic need to please bosses, so I’m never more than a few feet from my iPhone (notifications humming at all hours) and I never leave home without a MacBook in tow” (page D1). In terms of setting the tone, this sentence suggests that what is to follow is not going to be an evenly-balanced assessment of how the desire for career stability (and/or advancement) is complicating the need for work-life balance, familial intimacy, and long-term personal sanity. Rather, the author seems to be letting readers know that he isn’t going to tread very far from his charging cords and touch screens.

Kitchen tosses another wrench in the works when he remarks in the second paragraph that he is a millennial. While I understand his desire to be forthcoming about his age bracket (and thereby imply that he possesses an inherent affinity for digital technology) I find it troubling that he does not follow this statement with a caveat. And that caveat would be that even though millennials have been accused of being self-centered, entitled, and virtually addicted to technology because of their status as “digital natives,” they have also been shown to be remarkably astute social and cultural critics. Respected men and women from the Greatest Generation have given millennials their due credit, praising their cultural consciousness, awareness of civil rights conflicts around the globe, and desire to confront injustice in nearly all areas of commerce and social welfare.

Isn’t the infiltration of work-related communication into time that should be dedicated to child-rearing or bonding with one’s spouse a form of injustice? The middle of Kitchen’s article seems to indicate that it is, as he cites research from several studies and entertains thoughts of altering his own “masochistic need” to hover over his devices at all hours. But in the article’s conclusion he announces — in a joke that falls flat — that he is about to share “the ultimate key to work-life balance” except “actually wait. Can you hold on a second? I gotta take this” (page D11). I find this conclusion not only a largely failed attempt at cleverness, but also a sad comment about the importance of critically examining how educated adults are voluntarily letting work-related requests (and/or demands) run rampant over time that should be spent with friends and family — or even with just oneself.

This criticism aside, the article features some valuable research results as well as warnings about the negative impacts of our Pavlovian response to smartphone alerts. Consider these findings, which I am reproducing verbatim:

  • “According to a 2016 study by the Academy of Management, employees tally an average of 8 hours a week answering work-related emails after leaving the office.”
  • “[A] Harris Poll for the American Psychological Association found that 30% of men and 23% of women regularly bring work home.”

Offsetting these depressing (yet certainly not surprising) research results are the following developments that foster hope that we can curtail the invasion of work into our personal lives:

  • “In 2017, France instituted a new labor law that supports a new frontier in human rights, the ‘Right to Disconnect.’”
  • “Similar rights have been extended in Italy and the Philippines, are being explored in Germany and Luxembourg and were proposed in New York City.” (Note — Given NYC’s failed attempt at limiting the sizes of soft-drink containers in 2013-2014, I have doubts as to whether this new and certainly worthy initiative will find traction.)

I will not hide my disappointment with the fact that my hopes were unrealized, but I do credit Kitchen for acknowledging the subject. The Wall Street Journal should also be recognized for deeming this article relevant enough to place it on section D’s first page. Granted, section A would have been ideal. But I admit that it is very difficult to criticize digital technologies in 2018 — a time when they add tremendous value to our lives — without fear of being labeled a Luddite. I simply wish a greater discourse were occurring around when, where, and how we employ these technologies. When left unchecked, use can turn quickly to abuse. We owe it to ourselves — and to our relationships with family, friends, and fellow community members — to talk (and write) more candidly about “always on” work culture.

TRIBE, by Sebastian Junger

Sebastian Junger

“What people miss [about combat] presumably isn’t danger or loss but the unity that these things often engender. There are obvious stresses on a person in a group, but there may be even greater stresses on a person in isolation” (pp. 92-93).

It’s been over a week since I finished Sebastian Junger’s TRIBE (2016), a slender work of non-fiction by the bestselling author of The Perfect Storm, War, and Fire. Since completing this quick read, I have found myself repeatedly glancing at its matte black cover and feeling drawn uncomfortably to its title — the word TRIBE. For nearly ten days something has bothered me about it. Not until this afternoon, while copying passages from the book onto a yellow legal pad, did I finally determine why the title provokes me. Its typeface features a camouflage pattern. On the surface, this is fitting. But symbolically, the camouflage is indicative of a troubling fact about our society.

For a book that often refers to military service in order to explore the differences between tribal societies and modern Western culture, the use of camouflage is ideal. After all, the external face of the U.S. military – especially the Army and Marines – mixes olive drab, dark brown, chocolate, and greenish-yellow. Thus the camouflage typeface is not only appropriate in its connection to soldiers’ fatigues, but it appeals to shoppers whose passing gaze may fancy the green-brown-yellow pattern that adorns everything from women’s yoga pants to pre-teens’ school backpacks. Camouflage is cool.

But this afternoon I realized why the pattern has been nagging me. I value Junger’s use of military references to help readers understand the distinctions between tribal societies (both historical and contemporary) and modern Western culture. But it dawned on me today that it is the purpose of camouflage that has been provoking my discomfort; camouflage helps its host disappear. Drawing on influences from the natural world, humans have disguised their appearance for well over a hundred years through the use of specially-crafted garb. The goal of such clothing is to blend into one’s surroundings, to become invisible against the background.

Understandably, on the battlefield and behind enemy lines, soldiers want to achieve invisibility. If you can’t be seen, it’s much less likely that you will be shot or bombed. But when soldiers return from duty and re-enter civilian life, what occurs if they are still camouflaged? Not from face-paint or jungle fatigues, but from the fact that most civilians in modern Western culture are at least partially – and in many cases largely — invisible inside their communities. Junger’s thesis asserts that disconnection has become widespread in the United States and western Europe, and that servicemen and women are not the only ones feeling lost. Rather, the entirety of modern Western culture is showing signs of alienation from community-centered values.

Junger believes that this disconnection, this sense of feeling invisible, is due to several causes: (1) the lack of a crisis around which people must band together in order to survive, (2) the affluence of modern society and the fact that its luxuries are prized when they are amassed instead of shared, and (3) the belief that success has become a solo effort, not one measured by improvements in group health and welfare. He writes: “Whatever the technological advances of modern society – and they’re nearly miraculous – the individualized lifestyles that those technologies spawn seem to be deeply brutalizing to the human spirit” (p. 93). These are heavy words.


Much of the work’s 136 pages is devoted to a fascinating historical analysis of why our human spirit has been eroding over the last several centuries. The first quarter of the book documents stories from the American frontier, a time during which a striking number of European settlers found more appealing living conditions with Native American cultures than they did with the colonies that they were, by nationality, a part of. As a result, both men and women migrated from colonies to tribal encampments. And even when rescue parties were sent out to bring these individuals home, they often resisted; in fact, more than a few hid from their rescuers. For those who left the colonies to join tribal life, the tight bonds of Native American cultures were more reassuring than what they were experiencing in “civilized” settlements on the east coast.

For the remaining chapters, the author turns toward World Wars I and II, conflicts during which the need to band together (as both civilian communities and military units) coincided with dramatic decreases in people’s self-reported depression, anxiety, and stress. Because the members of those populations joined together for a common cause, they formed connections with strangers. They focused on others instead of themselves because their well-being (if not their survival) could be ensured only if the group remained viable. One of the most powerful examples that the book features is the well-documented phenomenon that manifested in London during the Blitz. Despite weeks of brutal aerial bombardments by the German Luftwaffe, the citizens of London experienced a pronounced increase in spirit and emotional intimacy at the very time that their lives were most threatened. Imminent danger catalyzed relationship building.

The author writes: “What catastrophes seem to do – sometimes in the span of a few minutes – is turn back the clock on ten thousand years of social evolution. Self-interest gets subsumed into group interest because there is no survival outside group survival, and that creates a social bond that many people sorely miss” (p. 66). With the exception of natural disasters like floods, hurricanes, and wildfires, most Americans are largely insulated from anything resembling a true catastrophe. Although global warming, poverty, and discrimination are very pressing problems, they do not possess the commanding immediacy of an invasion by a foreign army, the outbreak of a communicable disease, or a power outage that puts half of the nation in the dark. Consequently, we rarely depend on others. This results in a situation that Junger describes this way: “A person living in a modern city or a suburb can, for the first time in history, go through an entire day – or an entire life – mostly encountering complete strangers. They can be surrounded by others and yet feel deeply, dangerously alone” (p. 18).

Supporting Junger’s argument are interviews with scholars and social scientists, who attest to the strengths of tribal cultures. These experts also provide sobering commentary about the ways in which modern society is falling far short of upholding the values that more primitive cultures maintain through their reliance on group dynamics. When Junger shares some of his observations with anthropologist Sharon Abramowitz and asks whether they are suitable to disclose to readers, she responds this way: “You’ll have to be prepared to say that we are not a good society – that we are an antihuman society” (p. 93). She continues by saying that, “We are not good to each other. Our tribalism is to an extremely narrow group of people: our children, our spouse, maybe our parents. Our society is alienating, technical, cold, and mystifying. Our fundamental desire, as human beings, is to be close to others, and our society does not allow for that” (p. 94).

My only regret about TRIBE is that it is not twice as long. In my estimation, Junger has just scratched the surface on this provocative subject. Whether you are interested in Colonial American history, the impact of PTSD on servicemen and women, the dynamics of fraud and greed in the financial sector, or the health of your suburban neighborhood, I recommend investing a few hours in Sebastian Junger’s book. It is a quick read, but its content will stick with you. And after considering the author’s observations, you may understand why camouflage is both a blessing and a curse. Invisibility is beneficial on the battlefield, but it harms everyone — soldiers and civilians alike — back home.

Interested in learning more?  

  • Sebastian Junger’s TED talks are worth your time. Here is a link to his most recent one, a presentation recorded in May 2016 entitled Our lonely society makes it hard to come home from war.
  • Sebastian Junger’s film Restrepo, which won the Grand Jury Prize for Documentaries at the 2010 Sundance Film Festival, is riveting.

Note — The images contained in this post were obtained from Sebastian Junger’s website and Unsplash.com.

The BBC’s Loneliness Experiment


When it comes to loneliness, you may not be alone.

On October 1, the British Broadcasting Corporation (BBC) unveiled data compiled during its 2018 Loneliness Experiment, an online survey designed by a trio of academics that was filled out by over 55,000 individuals beginning on February 14. An article posted on BBC.com on Monday summarizes the data; it also features interviews with three participants of the project who come from different walks of life.

Perhaps the most noteworthy insight revealed by what the article deems “the largest study of loneliness yet” is this: “There is a common stereotype that loneliness mainly strikes older, isolated people – and of course it can, and does. But the BBC survey found even higher levels of loneliness among younger people, and this pattern was the same in every country” (emphasis is my own). By “every country,” the authors are referencing the fact that individuals from “237 different countries, islands, and territories took part in the survey.”

The following table provides data from seven different groups based on age. The group featuring the greatest percentage of respondents who indicated that they experienced frequent loneliness included those between ages 16 and 24.

BBC loneliness table

The article’s authors proceed to explain that this increased prevalence of loneliness among younger people is not necessarily a generational difference (i.e., that today’s teens and twenty-somethings feel lonelier than young adults growing up decades ago). The BBC.com staffers cite the fact that older people who completed the survey indicated that the loneliest periods of their lives occurred when they were younger.

Why? The authors suggest that, “The years between 16 and 24 are often a time of transition where people move home, build their identities and try to find new friends.” Whether you navigated high school and college in the 1960s or early 2000s, these circumstances generally hold true. Young people immerse themselves in new employment and educational experiences, test new living situations, and venture forth into new relationships with friends, lovers, and employers.

Although these growth initiatives can result in powerful interpersonal bonds and the security of new-found belonging, they can also yield dramatic gulfs of soul-searching, isolation from the familiar, and a demoralizing uncertainty about what comes next. Anecdotally, being young has never been easy. The data from the BBC’s 2018 Loneliness Experiment seems to suggest that this has been true for many generations. Loneliness is a common condition experienced by people of all ages — but those who are younger self-report it at slightly greater rates. Regardless of age, it may be accurate to acknowledge that we are not alone in our loneliness.

“But I didn’t want to ask for help.”


What happens when elite athletes suffer from mental illness as a result of head trauma? One answer to this question can be found in the August 2018 issue of Bicycling magazine, which features the first-person account of professional cyclist Alison Tetrick.

Before Tetrick transitioned from a successful road-racing career to her current role as a gravel-racing champion in 2017, she suffered two concussions. The first, which occurred in 2010 at the Cascade Cycling Classic in Bend, Oregon, was devastating. During the race’s first stage, another rider — who was trying to avoid a crash she could see ahead — accidentally clipped Tetrick’s front wheel. The pair were traveling downhill at an estimated 45 mph when Tetrick was launched from her bike and hit the tarmac.

In her words, “I didn’t slide, didn’t tear shorts, didn’t bleed. I just hit the ground. I landed on my hip and head, and shattered my pelvis” (p. 56). After spending over an hour on the ground while first responders arrived and stabilized her, the 25-year-old was airlifted to a hospital where she was quickly diagnosed with a concussion, a form of traumatic brain injury (TBI). The journey that commenced after she was released from the ER included daunting physical rehabilitation, and an even more challenging mental climb — a process that is still on-going.

While her body slowly healed over the months following her crash, Tetrick’s mind seemed perpetually in a “dense, never-ending fog” (p. 56). In her relationships with both friends and family, she struggled to maintain a stable emotional state; she felt anxious and irritable at times, and emotionally vacant at others. Her marriage ended. And she struggled to determine how she would continue as a professional athlete in a sport where confidence is key. But despite these challenges and concerns, Tetrick fought on. Her body recovered and she made a comeback in 2011 at the Merco Cycling Classic in Merced, California. Despite winning the second-stage time trial, holding the leader’s jersey for three more days, and then winning the overall event, Tetrick knew something was wrong.

She writes: “Throughout my recovery from my broken pelvis, and after, I felt vulnerable and fragile, insecure and mentally frail…But I didn’t want to ask for help. I wanted to pull myself up by my bootstraps, cowgirl up” (p. 56). In language that testifies to the understandable fear of being an athlete who is perceived as weak or lacking in confidence, Tetrick says: “I didn’t want to admit I wasn’t okay [mentally] because if I admit that, and I’m leading a bike race, I’m going to get stuck in a corner because people know I’m going to have a panic attack…As a professional athlete, you hide your weaknesses….You can constantly find ways to tell yourself, ‘People like me. I’m normal. I’m okay'” (p. 56).

But, when dealing with a host of frightening symptoms that seem to indicate that your personality is morphing in strange ways, she admits that, “Deep down I was like, I don’t know if I’m okay” (p. 56). She continued forward, though, training and racing until disaster struck a second time in October 2011. At the Pan American Games in Guadalajara, Mexico, her front wheel got stuck in a storm drain during a pre-race warm-up. She “flew over the handlebar, and smacked [her] head” (p. 58). Despite this injury — a second head trauma in less than a year and a half — she raced that day. But her life began unraveling shortly thereafter. The once vibrant and outgoing young athlete was not okay.

Looking back on the period following her second concussion, Tetrick describes her situation this way: “I stared at the wall for weeks, couldn’t move, couldn’t stop crying. The depression wouldn’t go away. My parents sent me to psychologists…We were trying everything, because I couldn’t function. I couldn’t sleep — I had to go on sleeping pills” (p. 58). Again, she fought back. With the support of family, friends, and a neuropsychologist whom she works with today, Tetrick got back on the saddle and started racing again. For two years she did so while using the antidepressant Wellbutrin. She reflects: “During that time I didn’t really have emotional highs or lows, I just felt flat” (p. 60).

Tetrick continued racing until shortly after she and her team finished the 2016 Tour of Flanders in Belgium. The day after that Tour, she attended a smaller race where riders “were taking all of these [unnecessary] risks” (p. 78). There, she saw a rider “hit a light pole” — a collision that seemed entirely avoidable. And that is when Alison Tetrick decided that her professional road-racing career was nearing its end.


In 2017, Tetrick traded her skinny road tires for stouter off-road rubber and entered the 200-mile Dirty Kanza gravel race which is held in early June. It was her first time competing at that distance. And she ended up winning, sprinting to the finish line just ahead of reigning 2016 champion Amanda Nauman. Today, Tetrick seems to be on top of her game. Physically, she is performing better than she ever has before. However, she knows that the mental consequences of her concussions still follow her. She says: “Every day you have to make a choice for your mental health, and possibly deal with the physical side effects…I still get emotionally flooded. It’s an injury that you can’t see” (p. 79).

Despite the coverage that concussions receive in the press related to sports, military service, and workplace accidents, a stigma still exists around the psychological effects of traumatic brain injuries. Even though there is nothing shameful about being struck by a fellow cyclist and crashing to the pavement — or being tackled by a monstrous defensive lineman — we still seem to tread delicately around the emotional and mood-related consequences that those massive blows can impart. Physical injury caused by others’ actions can be interpreted as a sign of having performed courageously on the field of play, but mental anguish rarely receives equal respect. Both are tragic, to be sure, but it is the latter that is often darkened by shame.

I applaud Alison Tetrick for writing candidly about the physical and mental challenges that she has faced in the wake of her concussions. Speaking about broken bones can be easy, but talking about a flagging spirit or a troubled mind requires much greater resolve. Tetrick possesses a character made stronger by her willingness to be vulnerable.

Kate Spade: More than a name


The other day I was helping a middle-aged woman check out with her order, and I watched as she placed a slightly-worn wallet down on the counter between us. In small raised gold letters, the name kate spade stood out above the black leather. In an instant, I wondered what that name — and that brand — now holds for this woman, and for others. What role does it play in their fashion sense, in their estimation of what displaying this iconic logo on a purse or shoes or belt might now symbolize?

My second thought after recognizing the tiny gold letters was this: the name of a prominent fashion pioneer has taken on new meaning. On June 5, 2018, Kate Spade took her life. She, the woman, no longer exists. But the products she developed are still ubiquitous. The namesake brand that has represented quality, sophistication, and style for many years has not necessarily shifted in identity. Those traits remain. Though the importance of that name — perhaps even its jurisdiction, its sphere of social influence — may have.

As I processed the order, the urge to ask this woman about her perception of the logo on her wallet rose in my mind. But I quickly assessed that such an inquiry coming from a stranger would not be proper; in fact, the question would be so charged with threatening energy that she may have been rendered speechless. Obviously, I did not want to create an extremely uncomfortable moment for her — or for me. Yet inside my brief period of curiosity and reticence lies a question. And perhaps an opportunity.

Speaking about what Kate Spade chose to do is, I believe, inherently difficult because it causes most individuals to at least consider having an internal dialogue about an act that elicits tremendous uneasiness. Suicide. More than ever, though — especially in light of Anthony Bourdain’s choice to end his life only days after Spade’s tragic demise — we need to hold these conversations. And we need to open spaces for those dialogues to occur, spaces without shame or criticism or the fear of dismissal. But how do we accomplish this?

For a moment the other day I considered initiating such a conversation with a stranger, but the unspoken guidelines of appropriate social discourse dampened that impulse. However, I wonder if similar constraints largely inhibit — and perhaps prohibit — those conversations among friends, between partners, and with children. Suicide is a scary subject. And the conditions that can lead to suicide — depression, loneliness, and low self-image among others — are often just as scary.

So too often, I fear, we simply don’t consider raising the topic of suicide with anyone — stranger or loved one. And I believe that needs to change. Because we are all so much more than just our names.

Note — The image featured above was obtained from Unsplash.com.

Suicide in the United States


According to the National Institute of Mental Health (NIMH):

  • Suicide is the 10th leading cause of death in the United States.
  • In 2016 it claimed the lives of nearly 45,000 Americans.
  • It is the 2nd leading cause of death of those ages 10-34.
  • It is the 4th leading cause of death of those ages 35-54.
  • 90% of those who died by suicide had an underlying mental illness.

Additional NIMH information about suicide can be found here.

Note — The image in this post was taken by photographer Kristina Tripkovic, and it was obtained from Unsplash.com.

Teen Depression


According to the National Institute of Mental Health (NIMH):

  • 20% of youth ages 13-18 live with a mental health condition.
  • 11% of youth have a mood disorder, a category that includes depression.
  • 50% of all lifetime cases of mental illness begin by age 14.

More information from the NIMH about teen depression can be found here.

Note — The image in this post was taken by Tiago Bandeira, and it was obtained from Unsplash.com.