Tania Lombrozo

When I was pregnant with my first child, friends and family inevitably asked about my diet.

"Are you sticking to vegan food?" they wondered, with variable admiration or anxiety.

For some, the curiosity was about cravings. One mother told me she was irresistibly (and unexpectedly) drawn to hot dogs in her third trimester; another confessed her sudden fascination with red meat. Surely, I'd crave things beyond the vegan sections of Whole Foods? (I didn't. Mostly.)

Many illnesses are contagious. You'd do well to avoid your neighbor's sneeze, for example, and to wash your hands after tending to your sick child.

But what about mental illness?

The idea that anxiety, autism or major depression could be transmitted through contact may sound crazy — and it probably is. There's a lot we don't know about the origins of mental illness, but the mechanisms identified so far point in other directions.

A new paper, just published in the Journal of Alzheimer's Disease, provides insights into the risks and benefits of coffee consumption.

It's the latest scientific study to hit the media. But different headlines paint very different pictures of what the study found.

Some headlines depict good news:

"Here's More Evidence That Coffee Is Good For Your Brain" (Forbes.com)

Last April, I joined more than a dozen cognitive scientists at a workshop called "Breaking New Ground in the Science-Religion Dialogue." The workshop, organized by Cristine Legare at the University of Texas at Austin, aimed to encourage a sophisticated, evidence-based look at the psychology behind science and religion, as well as psychological factors that affect people's perception of believers, atheists and the relationship between science and religion.

In an interview earlier this year, Sen. Harry Reid argued that it's time for a woman to run for president.

"Women have qualities that we've been lacking in America for a long time," he told New York Times reporter Adam Nagourney. For instance, he said, "Women are much more patient."

The philosopher George Berkeley famously argued (contra John Locke) that we can never have truly abstract ideas — ideas stripped of all particulars and details. When I think of a triangle, I imagine a particular triangle, not some abstract idea of "triangle." When I think of a dog, I imagine a golden retriever or a Yorkie or a mutt — not a "general" dog that embodies only the essence of "dogness," devoid of all nonessential features.

About 94 percent of Americans know how to ride a bike. For some, it's a primary form of transport, for others an occasional diversion.

The theory of evolution by natural selection is among the best established in science, yet also among the most controversial for subsets of the American public.

Say "philosopher" and most people imagine a bust of Socrates, obscure texts or intellectual tête-à-têtes in the so-called Ivory Tower, away from the muddle of real-life concerns. But three issues this past week made something clear: We need philosophers engaged in public life — and a public willing to engage them.

I confess: As a Ph.D.-carrying mother of two and student of human behavior, I couldn't resist reading Primates of Park Avenue, the provocative memoir about motherhood on New York City's Upper East Side, released this month.

If you follow the headlines in nutrition science, you may have come across the claim that a daily bar of dark chocolate could help you lose weight faster. Websites touted the sweet news earlier this year:

"Excellent news: Chocolate can help you lose weight!" (3/31, Huffington Post)

To function effectively in the world, you need to acquire a whole lot of information. You need to know exactly which medicine is appropriate for each ailment. You need to know how to fix your car and your router and your irrigation system. You need to know the date of every major holiday and how it is observed.

Right? Of course not. That would be crazy.

In 1998, my colleague Alison Gopnik wrote a provocative paper comparing the drive for explanations to sexual desire. Just as we're motivated to engage in an evolutionarily beneficial activity — reproduction! — by the promise of orgasm, so, too, we're motivated to discover the basic structure of the world around us by the promise of a satisfying explanation. It's the "aha!" moment that makes the learning feel worthwhile.

In anticipation of Mother's Day, I offer you a found poem: the output of Google's autocomplete search function. Start a search for "motherhood is" and you'll learn:

May your Mother's Day this year combine a recognition of the hard with a celebration of the magical.

Why do so many people oppose genetically modified organisms, or GMOs?

According to a new paper forthcoming in the journal Trends in Plant Science, it's because opposition to GMOs taps into deep cognitive biases. These biases conspire to make arguments against GMOs intuitive and compelling, whether or not they're backed by strong evidence.

We associate technology with the shiny and new. But humans have been using technology to change the environment and themselves since at least the Lower Paleolithic period, when our ancestors were making stone tools.

Is the technology of today fundamentally different? In particular, does it change the way we think of ourselves or our relationships to each other and the environment? Does it change the way we think about what exists (metaphysics), about what and how we can know about it (epistemology), or about how we ought to live (ethics)?

Last week, I participated in a workshop on the science-religion dialogue during which I was asked: Are scientific and religious explanations philosophically incompatible?

I've been thinking about the question ever since. The simple answers — "yes" or "no" — have advocates, but they don't seem to do the issues justice.

Last week, I wrote a post calling for Ruth Bader Ginger ice cream. The post was inspired by Amanda McCall's "10 delicious solutions to Ben & Jerry's women problem," which included suggestions for ice cream flavors honoring a variety of women, from S'moria Steinem to Chocolate Chip Cookie Doughprah Winfrey.

Last week, Amanda McCall proposed "10 delicious solutions to Ben & Jerry's women problem": a suite of new flavors calling attention to Ben & Jerry's gross underrepresentation of women in their flavor names.

By McCall's count, only two of Ben & Jerry's more than 20 person-named flavors over the past three decades have featured women: "Liz Lemon's Greek Frozen Yogurt" and "Hannah Teter's Maple Blondie."

According to a news feature from the journal Nature, shortsightedness could be on the rise because children are spending less time outdoors than they used to.

If you want to understand the human mind, you have to reject the idea that we directly perceive and remember the world as it is. Our perceptual experience isn't simply a passive impression of the input received by our senses — and our memory isn't like a photobook or a video, comprehensively recording the details of our experience.

We all know a little knowledge can be a dangerous thing. Research increasingly supports a related proposition — that easy knowledge can be a dangerous thing. More specifically, having knowledge at our fingertips, as smartphones and intelligent search algorithms increasingly allow, might have negative consequences for human cognition.

Here's your task: Based on information about individual applicants to an MBA program, you need to predict each applicant's success in the program and in subsequent employment. Specifically, you'll be given basic information — such as the applicant's undergraduate major, GMAT scores, years of work experience and an interview score — and you'll need to assess the applicant's success (relative to other applicants) in terms of GPA in the MBA program and other metrics of achievement. Will the person be in the top quarter of all applicants? In the bottom quarter?
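One way to make predictions like these mechanically, rather than by gut feeling, is to combine the cues with a fixed formula — what's sometimes called "actuarial" prediction. Here is a minimal sketch in Python of what that could look like; the feature names, weights and quartile rule are hypothetical, invented purely for illustration, not drawn from any actual study.

    # A minimal sketch of a fixed-formula ("actuarial") predictor for the task
    # above. All feature names and weights are hypothetical, for illustration only.

    def predict_score(applicant):
        """Combine an applicant's cues into one predicted-success score
        using fixed, made-up weights."""
        return (0.5 * applicant["gmat_percentile"]
                + 0.2 * applicant["interview_score"]
                + 0.2 * applicant["experience_percentile"]
                + 0.1 * applicant["undergrad_gpa_percentile"])

    def assign_quartiles(applicants):
        """Rank applicants by predicted score and report each one's quartile
        (1 = top quarter, 4 = bottom quarter)."""
        ranked = sorted(applicants, key=predict_score, reverse=True)
        n = len(ranked)
        return {a["name"]: (i * 4) // n + 1 for i, a in enumerate(ranked)}

    applicants = [
        {"name": "A", "gmat_percentile": 90, "interview_score": 70,
         "experience_percentile": 50, "undergrad_gpa_percentile": 80},
        {"name": "B", "gmat_percentile": 60, "interview_score": 95,
         "experience_percentile": 85, "undergrad_gpa_percentile": 70},
    ]
    print(assign_quartiles(applicants))  # e.g. {'A': 1, 'B': 3}

With only two applicants the quartiles are crude, of course; the point is just that the rule is explicit and gets applied the same way to every applicant.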

Valentine's Day isn't just about flowers and chocolates and heart-shaped candies. It's fundamentally about love. And we all know what love is, right?

Well, not so fast. Is love an emotion? An experience? Is it a kind of desire? Is it possible to love a fictional person? To love more than one person? Is romantic love fundamentally different from other kinds of love?

With the recent outbreak of measles originating from Disneyland, there's been no shortage of speculation, accusation and recrimination concerning why some people won't vaccinate their children.

When I was a kid, I liked this poem by Jean Little from her collection, Hey World, Here I Am!:

Our History teacher says, "Be proud you're Canadians."
My father says, "You can be proud you're Jewish."
My mother says, "Stand up straight, Kate.
Be proud you're tall."

So I'm proud.

But what I want to know is,
When did I have the chance to be
Norwegian or Buddhist or short?

Thinking machines are consistently in the news these days, and often a topic of discussion here at 13.7. Last week, Alva Noë came out as a singularity skeptic, and three of us contributed to Edge.org's annual question for 2015: What do you think about machines that think?

Which is a better magic trick: turning a dove into a glass of milk, or a glass of milk into a dove? Turning a rose into a vase, or a vase into a rose?

For most people, the way these transformations go makes a big difference. In each case, they find the transformation from a nonliving object to a living thing more interesting — but why? Is it just more exciting to see a living thing appear than to have it vanish? Or is there something deeper at work?

Sometime in 2014, I read Brigid Schulte's Overwhelmed: Work, Love, and Play When No One Has the Time and was struck by this passage comparing the culture of work in America with that in Denmark:

"Most Danes don't feel obligated to check their smartphones and e-mail after hours. In fact, they say, people who put in long hours and constantly check e-mail after hours are seen not as ideal worker warriors, as in America, but as inefficient."
