Thursday 31 January 2019

Michael Cohen Is Ready to Talk Russia to Congress

President Donald Trump’s former personal lawyer Michael Cohen lied to Congress about issues central to the Russia investigation out of “blind loyalty” to his longtime boss. But now the man who once said he would take a bullet for Trump plans to correct the record before the House and Senate Intelligence Committees—perhaps giving lawmakers more insight than they’ve ever had into the president’s dealings with Russia before and during the election.

Cohen’s much-hyped public testimony before a separate panel, the House Oversight Committee, was expected to be highly restricted to avoid interfering with Special Counsel Robert Mueller’s Russia probe, with which Cohen has been cooperating for several months. (Cohen postponed that hearing following attacks from the president and the president’s lawyer, Rudy Giuliani, on his wife and father-in-law.) But the intelligence-committee hearings will be conducted behind closed doors, giving Cohen the opportunity to have a freer exchange with the members.

[Read: Three remarkable things about Michael Cohen’s plea]

Cohen is willing to answer questions about what he’s told Mueller and other issues related to the ongoing investigation, according to two people familiar with his plans. (They, like other people I spoke with, requested anonymity to discuss the private deliberations.) However, his legal team is also in talks with Mueller’s office to determine whether there are any parameters for his testimony. The House Intelligence Committee is also “in consultation with the special counsel’s office to ascertain any concerns that they might have and to deconflict,” according to a committee aide.

“The reason that he agreed to testify privately for the intelligence committees is, first and foremost, because he owes them,” one of the people familiar with Cohen’s plans said. “He pleaded guilty to lying to them and owes them an apology.” Cohen admitted in court late last year that he lied to Congress when he told them that negotiations to build a Trump Tower Moscow ended in January 2016, and that he hadn’t discussed it much with Trump. In fact, Cohen testified, he agreed to travel to Russia in connection with the Moscow project and took steps to prepare for Trump’s possible trip there after he clinched the Republican nomination.

Cohen is “more open to answering questions” about these and other Russia-related issues than he would have been in a public setting, according to this source, as long as he remains “secure in the knowledge that both committees will protect his testimony and prevent leaks.”

[Read: Michael Cohen takes Mueller inside the Trump Organization]

The Senate panel, which subpoenaed Cohen earlier this month, and its House counterpart declined to preview what questions they intend to ask. But the central purpose of the interviews is for Cohen to correct the record on his previous false statements to the committees about the timing of the Trump Tower Moscow negotiations—and, crucially, whether Trump or anyone in the White House directed him to lie in the first place. BuzzFeed News reported earlier this month that Mueller had evidence that Trump had asked Cohen to lie about the timing of the real-estate deal, prompting the special counsel’s office to release a rare statement contradicting aspects of the story. But House Democrats made it clear that, if it were confirmed that the president had tried to obstruct justice in order to hide his involvement in business negotiations with the Kremlin during the election—while Russia waged a hacking and disinformation campaign to undermine Trump’s opponent—it would be cause for impeachment.

Democratic Representative Adam Schiff, the chairman of the House Intelligence Committee, told MSNBC on Wednesday morning that the panel expects Cohen to address the Moscow real-estate deal, which was also a potential source of leverage for Russia throughout 2016 as Trump—and his family—kept the negotiations a secret from voters. Cohen admitted late last year to discussing the Moscow deal with Trump’s family members “within” the Trump Organization.

Donald Trump Jr., an executive vice president of the Trump Organization, told the Senate Judiciary Committee that he was only “peripherally aware” of the Moscow deal in 2016. It is not clear what he told the House Intelligence Committee, which has not yet released the transcripts of the closed-door interview. But Cohen’s corrected testimony could illuminate whether other witnesses have been honest during congressional testimony about their role in, or knowledge of, the Trump Tower Moscow negotiations in 2016, and Russia’s interference more broadly.

[Read: Michael Cohen pays the price for his ‘blind loyalty’ to Trump]

“I think this common thread of lying to Congress and particularly to congressional committees may ensnare a number of other potential targets in the special counsel’s investigation,” Democratic Senator Richard Blumenthal, a member of the Senate Judiciary Committee, said on Monday. “And it could become a matter of criminal action.” On Friday, the longtime Trump adviser Roger Stone was indicted by Mueller for lying to the House Intelligence Committee about WikiLeaks, prompting Schiff to reiterate that the Intelligence Committee’s “first order of business” once it is constituted—which has been delayed by Republicans—will be a vote to send the official witness transcripts to Mueller. “We will continue to follow the facts wherever they lead,” he said.



from The Atlantic http://bit.ly/2FYG4Ly

Many Families May Need Months to Recover From the Shutdown

Ever since the government shutdown ended last Friday, Yvette Hicks said her cable company, her electric company, the bank that processes her auto-loan payments, and her life-insurance company have been calling her “back to back to back.” They want to know when they’ll be paid.

Hicks, a 40-year-old security guard working as a contractor for the federal government, had been wondering the same thing about her own income, having gone without work or pay during the 35-day shutdown.

During that time, she had to dip into her savings so that the electric company didn’t cut off power to her home in Washington, D.C., and she was forced to ration her children’s asthma medication—they needed it every four hours, but Hicks couldn’t afford to keep up that frequency. With bills piling up over the past month, she estimates that even now that she’s back to work, it’ll take until “the end of March, maybe” for her to get her finances back to where they were before the shutdown.

Hicks is one of the millions of Americans whose livelihoods were upended by the longest government shutdown in U.S. history. Just because that shutdown is over doesn’t mean that it isn’t still shaping the lives of American families, including contractors like Hicks who didn’t work and likely won’t receive back pay, and the roughly 800,000 federal workers who will. The individual stresses—both financial and emotional—may have eased, but they persist even as people return to their working rhythms.

[Read: The shutdown showed how precarious Americans’ finances really are]

The timing for a shutdown is never good, but it was especially bad for Bebe Casey, a 52-year-old businesswoman in New Hampshire. In late December, her mother was in the hospital at the end of her life, and Casey’s husband, a Customs and Border Protection employee, was ordered to continue working, with the expectation of later receiving back pay. The combination of a parent in the hospital and financial uncertainty was taxing. “This has just been a really hard last six weeks,” she told me.

Things turned out all right financially for Casey’s family, thanks in part to the end-of-year bonus she received. “If I hadn’t had that, then we would’ve been late on lots of stuff,” she says. “We were lucky.” Still, she’s a little nervous about upcoming financial obligations, such as making a college-tuition payment for her daughter next week. “We’re gingerly going through day by day right now, still trying to stuff as much money as we can away in case this happens again,” she says.

On the other side of the country, Mel May was experiencing similar uncertainty during the shutdown. May, a 55-year-old employee of the Federal Aviation Administration living in Albuquerque, was also expected to come into work without pay. “It was very stressful,” he said of making sure his bills were paid. He had trouble getting a loan to cover his rent and other bills, and ended up having to borrow money from his girlfriend.

In the end, May didn’t fall behind on any payments. “Everybody worked with me—even Comcast, of all people,” he said. Still, things were tight. He got cleared for unemployment assistance, started looking into getting a new job, and ate frugally—“lots of eggs, lots of oatmeal,” he told me. He’s glad he’ll start getting paychecks again soon.

The people I spoke to for this story told me of the stress they’d experienced during the shutdown and were well aware that they could experience it again soon, if a new agreement to fund the government is not reached by February 15. Yvette Hicks was worried that if she was out of work again, it’d take her even longer to get back on track financially.

Some aftereffects of the shutdown have been subtler, though. One 23-year-old federal contractor—he asked not to be named, for fear of repercussions at work—told me that this early exposure to the potential precarity of government work was discouraging. “For all of the merits of working for the government, it has shown me some of the disadvantages of working on something that’s at the mercy of politics,” he said. The shutdown may have lasted 35 days, but some of its effects will extend well beyond that.


Amal Ahmed contributed reporting to this article.



from The Atlantic http://bit.ly/2TnQVBB

Breslin and Hamill: Deadline Artists Feels Like a Eulogy for Journalism

Breslin and Hamill: Deadline Artists, a new HBO documentary about two of the most celebrated newspapermen of the 20th century, has the passionate, thunderous, and occasionally weepy tone of a good barroom eulogy. Jimmy Breslin and Pete Hamill represent, various interviewees attest, the last of their kind: journalists writing for and about the working man, self-educated voices for New York’s disenfranchised, reporters who also sometimes managed to be poets. Together, they embody the sharp thrill of a time when to cover the news was to be a hard-drinking, iconoclastic, ink-and-grease-stained truth teller. As Spike Lee puts it in one moment, “These guys were like superstars.” Later, Tom Brokaw adds that Hamill was “so authentically male.”

Directed by the journalist Jonathan Alter and the documentarians John Block and Steve McCarthy, Deadline Artists often feels as if it’s eulogizing not just Breslin and Hamill (Hamill is still alive and writing; Breslin died in 2017) but also a golden era of journalism itself. Alter told Page Six that he wanted to capture the heart of a time “when print journalists could be swashbuckling figures”—a moment when Breslin could advertise beer in television commercials and appear on Saturday Night Live, and when Hamill could date Jacqueline Kennedy and Shirley MacLaine at the same time.

With all the hushed reverence, though, comes a sense that something truly valuable has been lost. “These journalists today go to the elite colleges,” the legendary magazine writer Gay Talese says in one interview. Hamill and Breslin, the movie argues, were “anchored in a place and time,” able to tell stories about underserved communities because they themselves were of the people.

A hagiographic documentary certainly has its place—just ask the Academy, which nominated Betsy West and Julie Cohen’s Ruth Bader Ginsburg movie, RBG, for an Oscar earlier this year. It’s just that Deadline Artists often seems enthralled by a version of the narrative that even Breslin and Hamill question in moments, one in which the emptying-out of traditional newsrooms and the checking of anarchic reportorial habits signal a fatal, irreversible decline.

[Read: The media’s post-advertising future is also its past]

“There aren’t any more Breslins and Hamills,” an uncredited voice says in the movie. “This was the last expression of great 20th-century muscular American journalism.” Maybe, but it’s hard not to read “muscular” as a euphemism for something else, some ineffably virile quality that both Breslin and Hamill apparently had in abundance. And the sentiment undermines the astonishing reporting being done every day to expose inequality and injustice in America, in a much more challenging economic climate for news.

When it functions as a dual biography, Deadline Artists is a fascinating film. It’s drawn more to the bombastic, outspoken, abrasive Breslin than the ruminative Hamill, but it makes a case for the ways in which both changed newspaper journalism for the better. They each fell into the profession with a simplicity that would make contemporary J-school students weep—Breslin became a copy boy earning 18 dollars a week, while Hamill, after writing persuasive letters to the editor of the New York Post, was personally invited to join the editorial team, after which his first story ran on page 1.

The writers made a name for themselves by seeking out lesser-told stories, Breslin in the bars and back rooms of Queens, and Hamill across America, Europe, and Asia. Breslin’s coverage of John F. Kennedy’s assassination changed the shape of newspaper journalism by focusing on the men at the edges of history—the gravedigger at Arlington National Cemetery, the emergency-room doctor who tried in vain to save the 35th president’s life. In 1968, Breslin and Hamill were in the immediate vicinity of Robert F. Kennedy when he was murdered; Hamill summed up the scene by writing, “We knew then that America had struck again.”

Both men inevitably became celebrities, lauded for their tenacity, their commitment (Hamill, while sparring with the new owner of the Post, refused to step down and oversaw proofs from the diner by the office), and their fearlessness. Breslin’s ego seems to inflate rather unappealingly at this point in the film, when he’s shooting commercials for Grape-Nuts and corresponding personally with the Son of Sam killer. Called to ask whether he’s covering a house fire in which two people have died, he imperiously replies, “More must die before Breslin goes.” And, in his ugliest moment, he publicly berates a young, female, Korean American reporter who’s criticized one of his columns for being sexist, unleashing a tirade of racial slurs that gets him suspended for two weeks.

Deadline Artists dutifully includes this stain on Breslin’s biography, even if it subsequently drafts family members and friends to explain it. “I think the suspension probably killed him because I don’t think he had a bigoted bone in his body,” his son says. “Jimmy is an impulsive guy from the streets, and I thought he just made a huge mistake,” Gloria Steinem adds. “[Jimmy] doesn’t like when other people criticize him,” Breslin’s second wife, Ronnie Eldridge, explains. “He was terrible in many ways,” recalls The New York Times’s Gail Collins, “but his sense of sympathy was just amazing.” The filmmakers briefly interview Ji-Yeon Yuh, the woman whom Breslin verbally abused, but they seem more interested in propping up Breslin’s bona fides as a champion of the disenfranchised than in considering, even fleetingly, how the ribald newsroom culture the movie glorifies might also have helped keep a generation of talented reporters on the margins.

Alter, Block, and McCarthy are convincing in their argument that Breslin and Hamill shaped the way news stories are told, inspiring a generation to try to emulate their melding of dogged reporting and writerly craft. Breslin’s coverage of the emerging AIDS crisis in New York when few other reporters dared is held up as yet another example of how he foregrounded people whose plight merited national attention. But the implication underlying Deadline Artists’ swooningly nostalgic portrait of a bygone era—that things were better then—is undercut by one of its subjects. Journalism, Hamill says, is “always being enriched by the new people who come.” It’s a much-needed counterpoint in a film that could use more of them.



from The Atlantic http://bit.ly/2TjW7X6

How Do Plants Grow in Space?

Earlier this month, tiny green plants sprouted on the moon.

The plants arrived as cotton seeds, tucked inside of Chang’e 4, a Chinese spacecraft that had landed, in a historic first, on the far side of the moon, the side that never turns toward Earth. The seeds came with the comforts of home: water, air, soil, and a heating system for warmth. Huddled together, the seedlings resembled a miniature, deep-green forest. A hint of life on a barren world.

And then, about a week later, they all died.

Lunar night had set in. Without ample sunlight, surface temperatures near the spacecraft plummeted to –52 degrees Celsius (–62 degrees Fahrenheit). The sprouts’ heating system wasn’t designed to last. The plants froze.

Outer space, as you might expect, is not kind to plants, or people, or most living things, except maybe for tardigrades, those microscopic creatures that look like little bears. If you stuck a daisy out of the International Space Station and exposed it to the vacuum of space, it would perish immediately. The water in its cells would rush out and dissipate as vapor, leaving behind a freeze-dried flower.

[Read: A new clue in the search for forests on distant planets]

China’s experiment marked the first time biological matter has been grown on the moon. (There is biological matter on the moon already, in what NASA politely refers to as “defecation collection devices.”) But plants have blossomed in space for years. They just need a little more care and attention than their terrestrial peers.

The first to flourish in space was Arabidopsis thaliana, a spindly plant with white flowers, in 1982, aboard Salyut, a now-defunct Soviet space station. The inaugural plant species was chosen for practical reasons; scientists call Arabidopsis thaliana the fruit fly of plant science, thanks to a fairly quick life cycle that allows for many analyses in a short time.

Now, plants grow on the International Space Station, humankind’s sole laboratory above Earth. They are cultivated inside special chambers equipped with artificial lights pretending to be the sun. Seeds are planted in a nutrient-rich substance resembling cat litter and strewn with fertilizer pellets. Water, unable to flow on its own, is administered carefully and precisely to roots. In microgravity, gases sometimes coalesce into bubbles, and overhead, fans push the air around to keep the carbon dioxide and oxygen flowing.

The most advanced chamber on the station, about the size of a mini fridge, has precise sensors monitoring the conditions inside, and all astronauts need to do is add water and change filters. Scientists back on the ground can control everything, from the temperature and humidity to levels of oxygen and carbon dioxide.

Plants did not evolve to exist in this unusual setup. But astronauts have grown several varieties of lettuce, radishes, peas, zinnias, and sunflowers, and they do just fine. “Plants are very adaptive, and they have to be—they can’t run away,” says Gioia Massa, a scientist at NASA’s Kennedy Space Center who studies plants in microgravity.

Scientists were surprised to learn that the lack of gravity, the force that has shaped our biological processes, doesn’t derail plants’ development. On Earth, plants produce a filigree-like pattern of roots, as they grow away from their seeds in search of nutrients. Scientists had long assumed the movements were influenced, in part, by the force of gravity. On the International Space Station, roots exhibited the same pattern, without gravity as a guide.

“Plants don’t really care about the gravity so much if you can get the environment right,” Massa says.

For NASA, the growth chambers on the space station are the predecessors of extraterrestrial farms. If human beings ever travel to another planet, they will need enough food for the journey. NASA has spent years perfecting thermo-stabilized or freeze-dried entrées and snacks for astronauts on the International Space Station, from scrambled eggs to chicken teriyaki. The meals are meant to last, but they wouldn’t survive the long journey to Mars, says Julie Robinson, the chief scientist for the International Space Station.

[Read: Supper club on Mars]

“We don’t have a system today that would preserve all the nutrients in food for all that time, even if it was frozen,” Robinson says.

Future Mars astronauts will likely bring with them an assortment of seeds, a Svalbard-like vault to kick-start the first generations of crops. None will be able to grow in Martian soil, which resembles volcanic ash; it’s devoid of the organic matter—formed on Earth by generations of decomposed plants—that supports life. It also contains chemical compounds that are toxic to humans. Astronauts could flush out the toxins with their own chemical solutions and convert the soil into something workable, but it may be easier to replicate the growth chambers on the International Space Station instead.

On Mars, plants will likely grow in climate-controlled greenhouses, from nutrient-rich gels and under bright lighting, with water delivered through liquid solutions at their roots or by a fine mist released from the ceiling. And anyone living on Mars will need many of these alien gardens; you can’t grow a salad from a petri dish.

Astronauts have already made a space salad. In 2015, astronauts on the space station were allowed to try the leaves of a red romaine lettuce that was cultivated in NASA’s first fresh-food growth chamber. They added a little balsamic dressing and took a bite. “That’s awesome,” the NASA astronaut Kjell Lindgren said then. “Tastes good.”

No one on Earth has sampled a space vegetable yet, according to Massa. Some plants grown on the station are sent down to the ground for study in the lab, but they usually come back frozen or preserved in a chemical solution. “Frozen would be better, but I don’t think lettuce popsicles will be very popular anytime soon,” she says.

NASA scientists are thinking about more than nutrition in these experiments. Growing plants just for the sake of growing plants is quite nice. Research has shown that gardening is soothing and can be beneficial for mental health. Future deep-space astronauts, cooped up in a small spaceship for years with the same people, will need all the soothing activities they can find. Plants, especially flowers, grown not for consumption but for decoration may help far-flung astronauts feel connected to the comforts of Earth.

“There’s a great deal of joy in growing and watering the plants and producing a flower,” Robinson, the ISS scientist, says. “There can also be some real sadness if plants you’ve been cultivating are not successful and are dying on you.”

Anyone who has enthusiastically purchased a succulent and witnessed it inexplicably wilt days later might relate. Imagine the magnitude of that disappointment on Mars, where the closest store is all the way across the solar system, and the only option is to grow another one.



from The Atlantic http://bit.ly/2SeuSju

The Generation of Grandparents Who Keep Their Grandchildren Afloat

My husband and I have been pretty good at saving money over the years, which means that we have enough of a cushion to start passing along some of it to our children and grandchildren while we’re alive, rather than leaving it behind as their inheritance. If we live another 20 years, give or take, as the actuarial tables say we might, our offspring in 2040 might have less need for the extra money than they do today, when they’re young. We’re living by the slightly morbid axiom my mother would invoke whenever she gave me a big check for a birthday or an anniversary: “I’d rather give it with a warm hand than a cold one.”

Grandparents across the U.S. are making similar calculations. According to the AARP, almost all American grandparents say that they offer some sort of financial support to their grandchildren, typically in the form of helping pay for their education (53 percent), living expenses (37 percent), or medical bills (23 percent). A recent survey by TD Ameritrade found that the average grandparent couple spends $2,383 a year on their grandkids. This figure is undoubtedly skewed by the wealthy (and disproportionately white) people who give their grandkids the largest amounts—but even grandparents who are just scraping by try to share what they can. And it doesn’t include other ways of helping beyond giving money, such as serving as unpaid daycare providers.

Intergenerational gift-giving has grown substantially in the past decade or so. One analysis of Census Bureau data found that between 1999 and 2009, the amount that Americans over 55 gave to their adult children increased by more than 70 percent. During that period, the amount given specifically for primary- and secondary-school tuition and school supplies—that is, to be spent on the grandkids—nearly tripled.

[Read more: The age of grandparents is made of many tragedies]

Financially, my husband and I are lucky; we’ve been careful with our spending, yes, but more than that, we’ve had relatively well-paying, stable careers. Most Americans our age aren’t so lucky: Just 39 percent of working adults in the United States have managed to save enough to get themselves through five years of retirement, let alone give a leg up to their progeny.

But even without adequate savings, parents and grandparents these days are quick to offer monetary help to younger generations—sometimes raiding the nest egg they might need for themselves. “Financial managers advise the elderly to hold on to the money they’ve saved, to use it to care for themselves in old age, to avoid becoming the responsibility of their children,” Kathleen Gerson, a sociology professor at New York University, told me. But many grandparents have a hard time listening to this advice, she said, because they can see that their children and grandchildren are even more financially insecure than they are. Giving money serves two functions, Gerson said—it’s “a way of expressing love,” and a way to help ensure “that your children’s children will have a decent spot in the world.”

But no matter how well intentioned these transactions are, the fact that many young Americans turn to the Bank of Grandma and Grandpa is evidence of their struggles and the lack of an adequate safety net to keep them afloat. Giving money to grandchildren is also one way that well-off families pass on privilege and wealth not just to the next generation, but to the one after that—a way that Americans stay rooted in the social stratum into which they were born.

Gerson pointed out that robust social programs benefiting grandparents might be what make them feel able to offer support in the first place. In the 20th century, poverty among the elderly was “greatly erased,” she told me, “thanks to policies such as Social Security.” According to the Center on Budget and Policy Priorities, a left-leaning think tank, 9 percent of Americans older than 65 have an income below the poverty line—but if Social Security didn’t exist, that number would balloon to 39 percent.

Yet while poverty among the elderly has plummeted, childhood poverty has held steady—as programs like food stamps and Aid to Families With Dependent Children have been overhauled and the cost of college has skyrocketed. According to the National Center for Children in Poverty, children younger than 18 are more than twice as likely to be poor as adults older than 65. That stark difference in their fortunes makes it hard for grandparents to turn away from the youngest generation, whose needs are so obvious. As Gerson put it, grandparents are “stepping into the void” created by the loss of social safety nets.

[Read more: The crushing logistics of raising a family paycheck to paycheck]

Stepping into that void most enthusiastically are grandparents of color, according to a 2012 study by the AARP. African American and Latino grandparents were more likely than their white counterparts to spend money on schooling: 65 percent of African American grandparents and 58 percent of Latino grandparents helped pay for their grandchildren’s education, compared with 53 percent of all grandparents polled. African American grandparents were also far more likely than other groups to assist with everyday living expenses. They were also more likely to be a little bit indulgent. “I give to my grandkids because they ask for things,” said 52 percent of African Americans and 43 percent of Latinos, compared with about 30 percent of grandparents overall.

While older Americans are dealing with their own financial woes—insufficient savings, for instance, and a mountain of medical bills—they still tend to feel more secure than their children, for whom pension plans and health-care coverage in old age are even more uncertain. Indeed, today’s grandparents might be the last generation to feel as if they have more economic wiggle room than their kids do. Older Baby Boomers, who are now in their late 60s and early 70s, grew up in a time of economic expansion and relative job security, when Americans could still earn a solid middle-class income with just a high-school degree. That places them in a better position to share the wealth with their families than Gen Xers will be once most of that generation become grandparents.

But today’s grandparents are probably not quite as financially secure as they think they are, according to Teresa Ghilarducci, an economist at the New School and the author of How to Retire With Enough Money. Many of them have inadequate insurance for unexpected medical bills, especially long-term care, she told me, and they might not realize that “their costs for house cleaning and personal services are likely to go up as they become more fragile.” A lot of people have trouble understanding what retirement really costs and how much to save, Ghilarducci said. (Her rule of thumb: Savings should equal eight to 10 times your annual salary preretirement.) Even worse, if a recession happens in the next couple of years, she told me, it could reduce elders’ retirement money (from savings and part-time work) by as much as 20 to 25 percent.

While wealthier Americans can blithely give away money to their kids and grandkids without thinking twice, many working- and middle-class Baby Boomers struggle to help out. The TD Ameritrade survey found that one in four grandparents had to dip into savings to give money to grandchildren. And about 8 percent said that the desire to offer financial help to the youngest generation was leading them to put off retirement.

We do it anyway, often against financial analysts’ advice, because it’s hard to resist giving money to the children right in front of us rather than socking it away for a future that no one can foresee. It’s a sign of what Gerson calls “family cohesion”—the urge to help the grandchildren we love when we sense they lack something that we can provide, even if we then ignore our own future needs.



from The Atlantic http://bit.ly/2RrSSel

There’s No Case for War With Venezuela

President Donald Trump has recently turned his attention, and the focus of the U.S. foreign-policy debate, toward the economic and political crisis in Venezuela, where two men are pushing rival claims to be the head of state. The opposition leader, Juan Guaidó, has the support of the United States. But despite mass protests, the Venezuelan dictator Nicolás Maduro refuses to step down.

The Trump administration’s efforts to force his ouster and bolster his opponent’s claims have included oil sanctions, the diplomatic maneuverings described by my colleague Uri Friedman, and harsh rhetoric. I want to focus on a subset of that rhetoric: threats of military force.

[Read: How seriously should the world take Trump’s Venezuela threat?]

In 2017, Trump stated, “We have many options for Venezuela, including a possible military option, if necessary.” The AP has reported on an occasion when he privately underscored that posture:

As a meeting last August to discuss sanctions on Venezuela was concluding, President Donald Trump turned to his top aides and asked an unsettling question: With a fast unraveling Venezuela threatening regional security, why can’t the U.S. just simply invade the troubled country? The suggestion stunned those present … including U.S. Secretary of State Rex Tillerson and national security adviser H.R. McMaster, both of whom have since left the administration.

This account of the previously undisclosed conversation comes from a senior administration official … McMaster and others took turns explaining to Trump how military action could backfire and risk losing hard-won support among Latin American governments to punish President Nicolas Maduro for taking Venezuela down the path of dictatorship, according to the official. The official spoke on the condition of anonymity because of the sensitive nature of the discussions.

As recently as Monday, National Security Adviser John Bolton told reporters, “The president has made it very clear on this matter that all options are on the table.” He appeared to signal that U.S. troops might be sent to the region. And Senator Lindsey Graham told an Axios reporter that Trump had recently mused to him about a military option.

[Read: The White House’s move on Venezuela is the least Trumpian thing it’s done]

Trump has always been more hawkish than most people realize. Still, it is hard to know how much of this is earnest and how much is bluffing. If Team Trump is merely bluffing, the saber-rattling could conceivably pay off by yielding concessions, or undermine U.S. interests by turning Venezuelans against the side we’re backing, or have no real effect. To me, bluffing when one cannot lawfully follow through is a generally high-risk, foolhardy approach to international relations, but it’s still preferable to the alternative explanation: a risky, unilateral war of choice that shouldn’t even be a possibility.

That’s because the Trump administration has no legal basis to intervene militarily in Venezuela without prior authorization from Congress. This Congress seems unlikely to authorize the deployment of U.S. troops to Venezuela. And while I’ve seen no polling on the matter, I strongly suspect that the public would oppose a new war there, and that if a war were begun, neither Congress nor the public would possess the resolve to see it through to a successful conclusion.

A war of that sort might be less likely if the media organizations reporting on Trump administration saber-rattling always pointed out that actually waging war would be flagrantly unlawful, rather than proceeding as if this were a matter properly decided by the White House. But much of the press has accustomed itself to an imperial presidency, so the Constitution’s mandates often go unmentioned.

Still, if the Trump administration unilaterally wages war in Venezuela, violating the separation of powers in the Constitution—a document that the president is sworn to protect and defend—the House should move to impeach the president.



from The Atlantic http://bit.ly/2Wv54io

How a Ballot Initiative to Expand Medicaid in Utah May Be Denied

SALT LAKE CITY, Utah — In November, Utah’s voters defied their state legislature and moved to adopt Obamacare’s Medicaid expansion in the state. With strong majority support, Utahns passed Proposition 3, a ballot initiative that would expand Medicaid coverage to all poor and near-poor adults. Joining Idaho and Nebraska, Republican-led states that passed similar initiatives in November, Utah reflected a divide between political leadership in ruby-red states—which has often opposed anything to do with Barack Obama’s signature policy—and the will of even Republican voters, who often like the plan and the prospect of more affordable coverage for more people.

But Utah, like some other Republican-led states where the Medicaid expansion has received support from the electorate, is now seeing immediate challenges to the law passed by popular democracy. With legislators tasked with crafting the sales-tax and budgetary changes needed to implement Proposition 3’s coverage, the GOP is pushing to add restrictions on benefits and eligibility in the expanded Medicaid program. Citing budgetary concerns, Republican legislators say they are doing what they can to advance the wishes of voters and the mandate to expand insurance while also being fiscally responsible. But supporters of the proposition see in these plans a stealthy maneuver to repeal and replace the expansion before it even exists, and view it as part of a larger movement by Republicans nationwide to stonewall the progress of expanded health insurance, even when it runs counter to what their constituents want.

[Read: Medicaid expansion’s troubled future]

On Monday, hundreds of protesters descended on the state capitol, intent on preserving the exact letter of what voters approved with Proposition 3 in November. That ballot initiative expanded Medicaid coverage to all non-elderly adults with an income under 138 percent of the federal poverty line and would pay for that expansion with an increase in the state sales tax, which state analyses found would raise about $90 million in revenue. If the initiative were implemented as expected, it would go into effect on April 1, extending free coverage to about 150,000 people, including many like the low-income workers who showed up Monday to protest.

Republican lawmakers, who control both houses and the governor’s mansion in the state, were never keen on the plan to begin with. But faced with undeniable public support in polls for Medicaid expansion, in 2018 the state legislature passed a bill that would expand Medicaid to some adults below poverty, an expansion of some 70,000 covered lives, but less than half of what proponents of Proposition 3 pitched. But that bill required a novel waiver from the federal Centers for Medicare and Medicaid Services (CMS).

[Read: How to stop Medicaid expansion]

It was clear then that Republicans viewed their partial expansion as a way to circumvent public sentiment. In a June 2018 confidential memo to the White House and to President Donald Trump, the state warned that “there is significant risk that Utah will vote to expand fully with a November ballot initiative” and that “allowing partial expansion would result in significant savings over the 10-year budget window compared to full Medicaid expansion by all.” But in the end the Trump administration viewed a decision to grant waivers for partial expansions as a capitulation to Obamacare. CMS had not made a decision on that waiver by November 6, when Utah voters passed the far more expansive version of the expansion.

Now, as the start date for Utah’s Medicaid expansion approaches, proposals in the state House and Senate that mirror some parts of the original partial expansion are back on the table. Last week, State Senator Allen Christensen told the Deseret News, “We’re going to make the program work within the money the proposition provided.” The bill Christensen introduced that’s currently under consideration in the state Senate would apply only to people in poverty, establish caps on spending, make eligibility and verification more complicated and restrictive for applicants, establish work requirements for beneficiaries, and introduce lockout periods for people who violate the conditions. A bill under consideration from the state House would roll back the Proposition 3 version of the Medicaid plan so that Christensen’s could take its place. Both bills were under consideration by the Senate Health and Human Services Standing Committee on Tuesday.

[Read: How GOP voters are getting in the way of a Medicaid rollback]

Republican lawmakers have cast these moves as ones that would save the Medicaid expansion that voters wanted, not dismantle it. According to the Associated Press, Senate President Stuart Adams, a Republican, indicated that Utah’s legislature intended to pass a Medicaid expansion, but that it needed to be accomplished in a “fiscally prudent way.” Republicans point to a predicted $10.4 million budget shortfall for the state in 2021.

But proponents of Proposition 3 see this as a rejection of the self-determination of voters. “These politicians are trampling on the most fundamental principles of representative democracy,” said Jonathan Schleifer, executive director of the Fairness Project, a group that advocates for statewide Medicaid expansions and spearheaded several ballot-initiative efforts in GOP-led states in 2018. “Legislators who vote for this bill would be disrespecting the voters of Utah, disrespecting the struggling families who were expecting access to health care, and disrespecting the basic principles of representative democracy.”

Objections to the GOP’s plan also note that it—like the 2018 attempt to circumvent the ballot-initiative process—relies on the grant of a waiver from CMS that may never come. According to the liberal-leaning Center on Budget and Policy Priorities, “CMS denied earlier requests from Arkansas and Massachusetts that they receive the enhanced matching rate to cover only part of the expansion population. Thus, there’s a high chance that CMS wouldn’t approve a new waiver proposal from Utah with the same type of request and, consequently, no coverage expansion would ever occur.” And beyond the real risk that the proposed expansion never materializes, even if the federal government goes against precedent and approves Utah’s waiver, implementation would take months, meaning coverage certainly won’t start in April.

Critics of Utah’s GOP see its actions as reminiscent of other Republican efforts across the country to stonewall the Affordable Care Act (ACA) and turn Medicaid into a more conservative program, even when it means going against direct democracy. In 2017, 59 percent of Maine’s voters approved a Medicaid-expansion ballot initiative. Then-Governor Paul LePage, a Republican, simply ignored that initiative, refusing to implement the reform until state legislators made changes in funding the program. Advocacy groups sued LePage. Janet Mills, a Democrat, was elected governor in 2018, and when she assumed office, she implemented the program. Other Republican governors have—with the blessing of the Trump administration—championed the implementation of work requirements in Medicaid, and have sought increased flexibility through waivers to create programs that cover fewer people, increase cost exposure for covered people, and make eligibility verification more frequent and onerous. And in two other states last November, expensive, major campaigns were waged to defeat Medicaid-expansion ballot initiatives.

As is true in Utah, these actions all highlight the tightrope the GOP is now walking. In national and state polls, Americans across party lines have indicated support for the ACA’s coverage expansion, have especially favored Medicaid expansions, and have favored funding mechanisms for those expansions. All of this was true and is true in Utah, and was made explicitly clear by direct democracy last fall. Yet the expansion could still fail to materialize, a development that would put representatives squarely against the will of the people they represent.



from The Atlantic http://bit.ly/2UsBFn0

The Environmental Issue Republicans Can’t Ignore

When Florida Governor Ron DeSantis declared on his inauguration day that water is “part and parcel of Florida’s DNA,” and vowed to fight the pollution and toxic algae that choked the state’s beaches and fresh waters last summer, his critics rolled their eyes to the Tallahassee heavens above. DeSantis had a poor environmental voting record in Congress. He’d helped found the House Freedom Caucus, which urged President Donald Trump to eliminate the Clean Water Rule and dozens of other environmental safeguards.

But two days later, the critics looked to those same heavens in wonder. Florida’s new governor began his tenure with one of the furthest-reaching environmental orders in state history, calling for a record $2.5 billion for Everglades restoration, a harmful-algae task force, a chief science officer, and an office of resilience and coastal protection to fund and coordinate Florida’s response to rising seas. Under the headline “Florida’s Green Governor,” the state’s largest newspaper, the Tampa Bay Times, declared that DeSantis “has done more to protect the environment and tackle climate change in one week than his predecessor did in eight years.”

DeSantis’s actions reflect a broader effort by some red-state governors to confront the unifying issue of water, even though they remain quiet, if not completely silent, on the larger crisis of a warming world.

Concern about climate change has surged to record levels over the past year. Yet anti-science operatives funded by the fossil-fuel industry still relentlessly spread misinformation; the recent video “Why Climate Change Is Fake News” has drawn 10 million views on Facebook. And as his administration systematically rolls back environmental regulations, Trump seems to like stoking distrust of the scientific establishment. It is no surprise, then, that political affiliation continues to shape belief: 86 percent of Democrats believe the climate is changing, compared with 52 percent of Republicans, according to a University of Chicago/AP poll released last week.

[Read: They’re here to fix climate change! They’re college Republicans.]

What’s happening to water around the nation, however, permits no alternative claims: the piles of stinking algae and rotting fish heaped up on both coasts of Florida last year. The hundreds of thousands of gallons of raw sewage and hog waste that swirled in the North Carolina floodwaters after Hurricane Florence. The chalky walls—now too large to be called bathtub rings—exposed as Arizona’s Lake Mead drops to record-low levels.

Like DeSantis, Arizona’s Republican Governor Doug Ducey focused on water in his first major address of the new year—without using the words climate change. During his reelection campaign, water was “one of the issues I was asked about most by real people,” he said. Noting that Arizona faces a January 31 deadline to figure out how to reallocate its dwindling portion of the Colorado River, Ducey urged the legislature to see beyond politics and partisanship to “do the things that matter and secure Arizona’s future.”

“At the top of that list,” he said, is “securing our water future.”

It was the same in Idaho, where newly elected Republican Governor Brad Little devoted part of his State of the State speech to “Idaho’s lifeblood”—water—spotlighting the once arcane issue of Eastern Snake Plain Aquifer replenishment.

These red-state GOP governors are not taking aim at greenhouse-gas emissions like their blue-state Republican counterparts Governors Larry Hogan of Maryland, Charlie Baker of Massachusetts, and Phil Scott of Vermont. Still, environmentalists should not dismiss their momentum on water. In several states won by Trump, water, held together by literal chemical bonds, is also proving a bond that brings disparate people, groups, and political parties together around shared concerns for the Everglades, the Great Lakes, the Colorado River, and other liquid life systems. “We have this phenomenon where one of the ways to work on climate change without triggering the cultural wars is to work on water,” says John Fleck, the director of the University of New Mexico Water Resources Program, who researches the Colorado River and solutions to water scarcity.

[Read: The GOP just lost its most important climate moderates]

Water progress is climate progress. It takes an immense amount of energy to extract water, treat it, and dispose of it, and to clean water when it’s polluted. Nationally, water consumption peaked in 1980 and has dropped steadily, even as the economy and population have grown. That shift, in waterworks and minds, affirms Americans’ willingness to live differently once we understand how painless the better path is. The same will be true for decoupling carbon growth from economic growth, though it may have to wait out Trump: U.S. carbon-dioxide emissions surged last year, even as a near-record number of coal plants were shuttered.

If DeSantis, Ducey, and Little can make strides on water, they will also make strides for the Republican Party, said Christine Todd Whitman, a former EPA administrator and Republican governor of New Jersey. “It comes down to issues of human life and safety,” she said, “and that’s what we’re supposed to be all about.” America’s Everglades and Chesapeake Bay restorations, Great Lakes compact, Colorado River drought planning, and other hard-won partnerships have taken decades of bipartisan leadership to bear fruit. By preserving and even strengthening those state-federal alliances in the Trump era, the governors help their party salvage a legacy to which the GOP is entitled.

A half century ago, President Richard Nixon pushed the Environmental Protection Agency, the Clean Water Act, and other safeguards in response to a broad public outcry over the industrial and sewage pollution then fouling rivers, bays, and coastlines. This June 22 will mark the 50th anniversary of the 1969 Cuyahoga River fire credited with sparking that outcry. But amid all the events and retrospective articles planned to commemorate that date, the more important one to remember might be 1868. That was the first of at least a dozen times the Cuyahoga burst into flames, sparking a century of pollution disasters, some fatal.

[Read: The Republicans who want Trump to fight climate change]

All that time, Americans accepted industrial pollution as an inevitable consequence of progress. Now we don’t have the hundred years we spent watching the Cuyahoga burn to watch the planet do the same. We must hope that the red-state governors’ attention to water will lead them to act on climate change, because the sorry truth is that even the boldest work on water won’t mean much if we can’t also stop warming.

In the early 2000s, Australia faced a drought so severe that abandoning major cities such as Perth seemed like a real possibility; the continent was feeling the water effects of climate change earlier than the rest of the world. For two decades, the Aussies have pioneered desalination, water markets, sewage recycling—and generally some of the most conscientious water habits in the developed world. And yet all that is not enough. It is summer in Australia. This month, record temperatures have contributed to wildfires, horses found dead in dry watering holes, and unprecedented fish kills in the iconic Darling River.

Hundreds of thousands of fish float belly up there, in striking and sickening similarity to Florida’s summer. DeSantis said it best: Water issues “do not fall on partisan lines.” Nor, ultimately, will climate change.



from The Atlantic http://bit.ly/2DJWAfX

Teaching the Bible in Public Schools Is a Bad Idea—For Christians

Shortly after Fox & Friends aired a segment about proposed legislation to incorporate Bible classes into public schools on Monday morning, President Donald Trump cheered these efforts on Twitter. “Numerous states introducing Bible Literacy classes, giving students the option of studying the Bible. Starting to make a turn back? Great!” Trump wrote.

The segment followed a USA Today report on January 23 that conservative Christian lawmakers in at least six states have proposed legislation that would “require or encourage public schools to offer elective classes on the Bible’s literary and historical significance.” These kinds of proposals are supported by some prominent evangelicals, including Family Research Council president Tony Perkins, the Texas mega-church pastor John Hagee, and even the actor Chuck Norris. They argue that such laws are justified by the Bible’s undeniable influence on U.S. history and Western civilization.  

If conservative Christians don’t trust public schools to teach their children about sex or science, though, why would they want to outsource instruction about sacred scripture to government employees? The type of public-school Bible class that could pass constitutional muster would make heartland evangelicals squirm. Backing “Bible literacy bills” might be an effective way to appeal to some voters, but if they were put into practice, they’re likely to defeat the very objectives they were meant to advance.

[Read: Trump’s Bible fail]

Debates over whether religion has a place in public schools are as old as public education itself. Some of America’s earliest grade schools were private and church-run, and they almost always included religious education. In colonial New England, which had early public schools, religious texts, including the Bible, were generally assigned a central role. In the 19th century, however, as a greater array of local governments started public schools to provide nonsectarian education for all children regardless of religion or social status, that began to change.

As the historian Steven K. Green writes in The Bible, the School, and the Constitution: The Clash That Shaped Modern Church-State Doctrine, many of these common schools still included Bible reading and were influenced to some degree by Protestantism. But they largely avoided providing devotional content and eschewed proselytizing students. After all, as Green says, “a chief hallmark of nonsectarian education was its purported appeal to children of all religious faiths.” The ostensibly secular nature and mission of these common schools miffed many Protestants at the time, much as they do today.

These early schools were controlled by the states, but the federal government began to assert a greater role in the 20th century, as the Supreme Court started to constrict the presence of religion in public schools. This included the landmark 1948 case of McCollum v. Board of Education, in which the Court ruled that “a state cannot consistently with the First [Amendment] utilize its public school system to aid any or all religious faiths or sects in the dissemination of their doctrines,” as well as the highly controversial 1962 case of Engel v. Vitale, in which the justices declared that compulsory prayer in public schools was unconstitutional because it violated the First Amendment’s establishment clause. These decisions helped spark the modern culture wars and the rise of the religious right, a movement that sought to fight the “secularization of public schools,” in part by pushing for the reincorporation of prayer and Bible reading.

[Read: Why schools are banning yoga]

Donald Trump is an unlikely choice to play the role of champion for this cause. The thrice-married real-estate mogul who claims to have never asked God for forgiveness has said that he attended Manhattan’s Marble Collegiate Church, but the congregation responded that he is not “an active member.” Trump claimed on the campaign trail that the Bible was his favorite book, but he couldn’t cite his favorite verse when asked. (He finally named his favorite verse eight months later, but seemingly didn’t understand how to interpret or apply it.)

But while Trump might not care much about the Bible personally, he knows it is politically important to many of the conservative Christians who support him. According to a recent study conducted by the Barna Group, a staggering 97 percent of churchgoing Protestants and 88 percent of churchgoing Catholics said they believed that teaching the Bible’s values in public school was important. Trump’s odd tweet is likely an effort to shore up his base, rather than a passionate plea for a new educational initiative.

Following Trump’s tweet on Monday, some legal scholars argued that teaching the Bible in any form in a public school would violate the First Amendment. But others, such as John Inazu, a law and religion professor at Washington University in St. Louis and the author of Confident Pluralism: Surviving and Thriving Through Deep Difference, believe that this would depend on the nature of the classes.

“The Court’s approach to the establishment clause is convoluted and unclear, so it is difficult to say whether a Bible literacy class is unconstitutional,” Inazu told me. “It’s a historical fact that the Bible has influenced Western civilization and U.S. history, so it’s plausible that you could teach a class like this if it is done in a way that promotes cultural literacy.”

[Read: Cheerleaders for Christ]

Inazu added that the courts would also consider the motive behind instituting such classes. If they determine that the motivation behind Bible-literacy bills is to privilege Christianity, the classes could be ruled unconstitutional. That’s bad news for those pushing these bills because, as USA Today reported, they are the product of “an initiative called Project Blitz coordinated by conservative Christian political groups” who seek to “advocate for preserving the country’s Judeo-Christian heritage.” (It’s telling that they aren’t also advocating teaching the Koran or the Bhagavad Gita as part of world-history education.) So even if these bills pass, the classes they’ll spawn will likely not last long.

But assume for a moment that no ulterior motive is behind these bills, and that the conservative Christian activists and lawmakers are merely deeply concerned that America’s schoolchildren receive a better cultural-historical education. And also assume, as many legal scholars claim, that teaching a Bible class in a public school is not a violation of the establishment clause. As Inazu pointed out, for a Bible class in public school to have any hope of passing constitutional muster, it would need to be academic rather than devotional. Which is to say, it couldn’t actually impart biblical values to students, and it would need to draw from scholarly consensus. And this is where the whole enterprise would backfire.

Start at the beginning. Many conservative Christians believe that the opening of the book of Genesis teaches a literal seven-day creation of the world by God around 6,000 years ago. But most academic Bible scholars believe that this text should be read poetically, and not as history or science. They interpret these passages as making theological points from the perspective of their ancient writers, rather than commenting on the truthfulness of Darwinian evolution or any other modern debate. Additionally, most conservative Christians believe that when Genesis tells the story of Adam and Eve, it is referring to two historical figures who were the first humans ever to live. But geneticists assert that modern humans descend from a population of people, not a single pair of individuals.

[Read: Homeschooling without God]

The potential problems don’t stop with the Bible’s creation narrative. Conservative Christian parents across America teach their children a Bible story about a man named Noah who built a giant boat to house all of Earth’s animals during a cosmic flood. But scientists say that such a flood is impossible; there isn’t enough water in the oceans and atmosphere to submerge the entire Earth. Historians point out that strikingly similar stories are told in sources other than the Bible, some of which predate the biblical tale. This story, many scholars conclude, was not meant to be literal. And what of the Bible’s story of Jonah being swallowed by a great fish and surviving in its belly for three days? A marine biologist will tell you it’s anatomically implausible, and many biblical scholars say the story is intended to be understood allegorically, anyway.

That evangelical understandings of the Bible differ from scholarly consensus doesn’t make them incorrect, but it does mean that the material taught to public-school students would likely diverge from what they are learning in church and at home. Can you imagine young Christian students coming home from school and informing their parents that they’ve just learned that all these cherished Bible stories are in fact not historically accurate? How would evangelical parents react when their fifth grader reports that the teacher said the Apostle Paul made misogynistic statements and advocated for slavery? And what if the teacher decides to assign the Catholic version of the Bible, which has seven books that Protestants reject as apocryphal?

And this only addresses issues of historicity and interpretation. The social teachings of the Bible could also create issues in a public-school setting. A recent thesis project in sociology at Baylor University suggested that increased Bible reading can actually have a liberalizing effect, increasing one’s “interest in social and economic justice, acceptance of the compatibility of religion and science, and support for the humane treatment of criminals.” Every community that reads the Bible stresses certain books or passages more than others. While evangelicals are generally more politically conservative, teachers in public schools might choose to emphasize the Bible’s many teachings on caring for the poor, welcoming the immigrant, and the problems of material wealth.

Bible-literacy bills are unlikely to pass in most states, and even if they do, they might soon be ruled unconstitutional. But conservative Christian advocates would do well to think through the shape these classes will likely take if their efforts are successful. They might end up getting what they want, only to realize that they don’t want what they’ve got.



from The Atlantic http://bit.ly/2UsBBng

City Hall to the White House: Can’t Get There From Here

If the mayor of South Bend, Indiana, can run for president in the already crowded 2020 Democratic field, why shouldn’t the mayors of New York and Los Angeles? After all, each city is bigger and more complicated than plenty of states. But there’s just one thing that Bill de Blasio, who’s not ruling out a race, and Eric Garcetti, who just did, ought to remember about the last time the mayors of the Big Apple and the City of Angels decided they were best suited to topple a controversial Republican president: It didn’t turn out so good.

The year was 1972, and Richard Nixon looked vulnerable. Mayor Sam Yorty of L.A.—a conservative Democrat known as “Travelin’ Sam” for his peripatetic publicly financed travel—had spent nearly half his time away from his city in the last half of 1971 before launching a quixotic campaign in which he sought to out-Nixon Nixon on law and order. Yorty complained that his hometown was “an experimental area for taking over of a city by a combination of bloc voting, black power, left-wing radicals, and if you please, identified communists.”

Yorty received the backing of William Loeb, the extreme right-wing publisher of the Manchester Union-Leader newspaper in New Hampshire, who thought Nixon had gone soft on Vietnam. But Yorty won just 6 percent of the vote in the New Hampshire primary, never got any traction, and dropped out of the race just before the California primary, begging voters to support Hubert Humphrey instead of the “radical” George McGovern, who would become the party’s nominee.

[Read: How to run for president while you’re running a city]

John V. Lindsay’s campaign started out with more promise. The charismatic, patrician Republican who had walked the streets of Harlem to keep the peace when other cities burned in the ’60s switched his party registration in 1971 to mount a campaign that proclaimed, “While Washington’s been talking about our problems, John Lindsay’s been fighting them.” No less a hardened cynic than Hunter S. Thompson professed to be impressed.

“If you listen to the wizards, you will keep a careful eye on John Lindsay’s action in the Florida primary,” Thompson wrote in Fear and Loathing: On the Campaign Trail ’72. “Because if he looks good down here, and then even better in Wisconsin, the wizards say he can start looking for some very heavy company … and that would make things very interesting.” And if nothing else, Thompson hoped, the potential presence of both Lindsay and Ted Kennedy in the race might turn that summer’s Democratic Convention in Miami “into something like a weeklong orgy of sex, violence and treachery in the Bronx Zoo.”

But despite a cadre of loyal campaign aides that included a young Jeffrey Katzenberg and the speechwriter turned journalist Jeff Greenfield—and despite spending half a million dollars in Florida—Lindsay finished fifth, with just 7 percent of the vote.

[Read: Bill de Blasio and Gavin Newsom may give restrictionism new life]

“A disgruntled ex-New Yorker hired a plane to fly over Miami with a sign reading ‘LINDSAY MEANS TSURIS,’” which is Yiddish for trouble, Greenfield recalled in an email this week. The Brooklyn Democratic leader Meade Esposito, still contemptuous of the mayor’s party switch, declared, “Little Sheba better come home,” a reference to the popular Broadway play in which a forlorn housewife pines in vain for her lost dog.

But Lindsay pressed on to Wisconsin, and Sam Roberts, who covered the campaign for the New York Daily News, still recalls “the mixture of hope and desperation.” Lindsay’s poll numbers were in the gutter, so to build momentum, he adopted a new slogan: “The switch is on.”

“I remember naively buying into the optimism,” Roberts remembers. “I wrote a story for the Daily News that Wisconsin was not likely to be Lindsay’s last primary. As the story was transcribed in New York, the word not was dropped. When it was published, I must have seemed prescient. The switch was on all right, but to other candidates. Lindsay ran sixth. The next day, he dropped out of the race.”

[Read: New York Mayor John Lindsay. Remember him?]

The political world is different today, of course. So Mayor Pete Buttigieg of South Bend may dream of presidential glory—and his big-city counterparts, including former Mayor Michael Bloomberg of New York, can, too. But the track record is not encouraging. Remember President Rudy Giuliani? Andrew Johnson, Grover Cleveland, and Calvin Coolidge were all mayors, but each held higher office before winning the White House. Maybe, Greenfield suggested, that’s because the job of mayor “is seen in terms of picking up garbage and fixing the streets.” On the other hand, in Donald Trump’s Washington, that might be just what America needs.



from The Atlantic http://bit.ly/2DHcE1T

A Coast Guard Community Struggles to Put Food on the Table

Dixie Lambert has lived in the small fishing village of Cordova, Alaska, for 36 years. She knows almost everyone in the community—most of them U.S. Coast Guard employees and their families. During the 35-day shutdown, Lambert watched many of these furloughed families struggle to make ends meet, so she began soliciting public donations at the local grocery store.

The filmmaker Derek Knowles, who was in the area filming another documentary project, met Lambert and was immediately struck by her personality and spirit. “She knew everyone who came into the store and transformed the grim backdrop of the shutdown into an occasion for good-humored action,” Knowles told The Atlantic. He decided to film Lambert for the better part of a day as she provided Cordovans with assistance buying groceries. “I felt like I got a window into Cordova itself and the power that can come from a genuine community, where everyone knows one another and cares for his or her neighbor,” said Knowles.

Alaska has one of the highest per capita rates of federal employees in the nation. As a result, it was hit especially hard by the economic effects of the shutdown. Even though the government has now temporarily reopened, Knowles said that many residents of Cordova are anxious that the deal won’t last long.

“We’re not even trying to guess what will happen next,” Lambert recently told Knowles.



from The Atlantic http://bit.ly/2ScJ6By

When Water Dooms Life

The Atacama Desert in northern Chile is the driest place on Earth, a parched rockscape whose inner core supports no animal or plant life. Only a few hardy species of lichen, algae, fungi, and bacteria can survive there—mostly by clinging to mineral and salt deposits that concentrate moisture for them. Still, it’s a precarious life, and these microbes often enter states of suspended animation during dry spells, waking up only when they have enough water to get by.

So when a few rainstorms swept through the Atacama recently, drenching some places for the first time in recorded history, it looked like a great opportunity for the microbes. Deserts often bloom at such times, and the periphery of the Atacama (which can support a little plant life) was no exception: It exploded with wildflowers. A similar blossoming seemed likely for the microbes in the core: They could drink their fill at last and multiply like mad.

Things didn’t quite work out that way. What should have been a blessing turned into a massacre, as the excess water overwhelmed the microbes and burst their membranes open—an unexpected twist that could have deep implications for life on Mars and other planets.

The Atacama has been arid for 150 million years, making it the oldest desert on Earth. Its utter lack of rain can be traced to a perfect storm of geographic factors. A cold current in the nearby Pacific Ocean creates a permanent temperature inversion offshore, which discourages rainclouds from forming. The desert also lies in a valley that’s wedged between the Andes Mountains on the east and the Chilean Coastal Range on the west. These mountains form a double “rain shadow” and block moisture from reaching the Atacama from either side. The desert’s driest point, the Yungay region, receives less than 0.04 inches (or 1 millimeter) of rain a year. Death Valley in California gets 50 times more rain annually, and even the driest stretch ever recorded there still averaged 0.2 inches a year.

[Read: Climate change is hurting desert life]

That’s why the recent rainstorms in the Atacama—two in 2015 and one in 2017—were so startling. They left behind standing lagoons, some of which glowed a lurid yellow-green from their high concentration of dissolved minerals. Nothing like this had happened in Yungay since at least the days of Columbus, and possibly much earlier. No one quite knows what caused the freak storms, but climate change is a likely culprit, as the cold sea currents have been disrupted recently, allowing a bank of rainclouds to form over the Pacific Ocean. The clouds then plowed over the Chilean Coastal Range and dumped water onto Yungay and surrounding areas.

Five months after the June 2017 storm, a group of scientists led by Armando Azua-Bustos, a microbiologist at the Universidad Autónoma de Chile, and Alberto Fairén, a planetary scientist at Cornell University, visited the Atacama to sample three lagoons. They wanted to study the microbes that had gotten swept into them and document how well they were handling this precious influx of water.

Not very well, it turned out. As detailed in a recent paper, the scientists found that the majority of microbes normally present in the soil had been wiped out—14 of 16 species in one lagoon (88 percent), and 12 of 16 in the others (75 percent)—leaving behind just a handful of survivors. On a local scale, the rains were every bit as devastating as the asteroid that wiped out the dinosaurs 66 million years ago, which killed off 70 to 80 percent of species globally.

The scientists traced this massacre back to the very thing that allows the microbes to survive in the Atacama: their ability to hoard water. Under normal conditions, this miserliness pays off. But when faced with a glut of water, they can’t turn off their molecular machinery and say when. They keep guzzling and guzzling, until they burst from internal pressure. Azua-Bustos and Fairén’s team found evidence of this in the lagoons, which had enzymes and other organic bits floating around in them—the exploded guts of dead microbes.

Water in the Atacama, then, plays a paradoxical role: It’s both the limiting factor for life as well as the cause of local extinctions. And while the death of some bacteria and algae might not seem like a big deal, these microbes are actually famous in some circles as analogues for life on Mars.

We don’t know whether Mars ever had life, but it seemed like a promising habitat for its first billion years, with vast liquid oceans and plenty of mineral nutrients—not much different than Earth. One billion years probably wasn’t enough time for multicellular life to arise, but Martian microbes were a real possibility.

Starting around 3.5 billion years ago, however, our planetary cousin went through a severe drying-out and began to lose its water. Some was sucked deep underground, and most of the rest was split into hydrogen and oxygen through various chemical reactions. Eventually these processes turned most of Mars’s surface into one giant Atacama Desert, forbiddingly dry and dotted with mineral deposits. NASA, in fact, uses the Atacama landscape to test rovers and other equipment for Mars missions.

But there’s an important wrinkle here. The great drying-out didn’t happen instantly; it took eons. And during the transition, when Mars was fairly parched but still had some liquid water, it experienced floods that would have made Noah blanch. We can see evidence of them on the surface of Mars today: The dry riverbed channels and alluvial fans that those floods left behind are the largest in the solar system.

This tumultuous state—a hyper-dry climate, punctuated by massive washouts—would have been catastrophic for life on Mars. The slow drying-out would have choked off the vast majority of microbes, grinding them into dust. Any that managed to pull through, scientists have argued, probably would have resembled those in the Atacama today: water-hoarders clinging to oases of mineral deposits in a vast red desert.

But if Martian microbes did resemble their Atacama counterparts, then the washouts probably finished them off, swelling them with water and bursting them like balloons. After a certain point, in other words, Mars might have been too wet to sustain the life that evolved there.

[Read: The search for alien life begins in Earth’s oldest desert]

It’s possible, of course, that a few lucky pockets on Mars escaped flooding entirely, allowing microbes there to survive until today. But if so, Azua-Bustos and Fairén point out, our current approach to finding these holdouts could be doomed to fail. NASA sent the famed Viking lander to Mars in 1976, for example, largely to search for life there. To this end, the lander scooped up several soil samples for analysis—and immediately doused them with water. Viking might have come up empty anyway, but given the Atacama results, it also might have killed off the very thing it was looking for.

What applies to Mars applies to other worlds as well. Over the next decade, several new space telescopes will expand the hunt for life beyond our solar system, to planets orbiting distant stars. Scientists are especially keen to find planets that have liquid water, since as far as we know, liquid water is essential to life.

But that statement might need qualification. Water can give life, certainly. As planets change, however, and life evolves in tandem, it can also snatch life away.



from The Atlantic http://bit.ly/2Ut3iwu

What Billionaires’ Fasting Diets Mean for the Rest of Us

Twitter’s CEO, Jack Dorsey, goes without eating for 22 hours of the day, and sometimes doesn’t eat at all. Over the weekend he tweeted that he’d been “playing with fasting for some time,” regularly eating all of his daily calories at dinner and occasionally going water-only for days on end. In many cases, severe and arbitrary food restriction might be called an eating disorder. And while researchers are hopeful that some types of fasts may be beneficial to people’s health, plenty of tech plutocrats have embraced extreme forms of the practice as a productivity hack.

Dorsey’s diet was widely criticized on the website he runs, but Silicon Valley has an obsession with food that goes far beyond the endorsement of questionable personal-health choices. Intermittent fasting is a type of biohacking, a term that includes productivity-honing behaviors popular among Silicon Valley power players for their supposed ability to focus a person’s energy to work longer and more efficiently. To enhance themselves personally, tech leaders have adopted everything from specially engineered nutrition shakes to gut-bacteria fecal tests.

[Read: The harder, better, faster, stronger language of dieting]

What and when most people choose to eat is no one’s business but their own, but someone like Dorsey isn’t most people: He leads a platform with hundreds of millions of active users, built for the quick, contextless dissemination of ideas. As biohacking’s most powerful disciples become more committed and more evangelical, what does that portend for the vast workforces they employ, and for the far larger populations whose lives are affected by their products and policies?

Intermittent fasting, like most health-and-wellness behaviors, can exist anywhere on a spectrum that runs from very dangerous to potentially beneficial, depending on who’s doing it and how it’s implemented. Fasting in one form or another has been a part of human eating behavior for millennia, and although scientific research on it is still preliminary, early studies suggest it might help reduce the risk of heart disease, cancer, and diabetes. For people with eating issues, though, fasting can be a very risky trigger for anorexia or bulimia. For most people, exploring Dorsey’s lengthy, everyday fasts without oversight from a doctor or nutritionist is probably unwise. (Dorsey and Twitter did not immediately respond to requests for comment.)

On his Twitter account, Dorsey doesn’t mention anything about long-term disease risk or even weight loss, which is a purported benefit of fasting that’s gained the practice a lot of attention over the past several years, including from celebrities such as Kourtney Kardashian and Chris Pratt. Instead, Dorsey focuses on how time seems to slow down when he hasn’t eaten anything. Considering the demands of his job, it’s not surprising that a longer day would be important to him: Silicon Valley is, by and large, always looking for a way to do a little bit more work. The tech industry also employs a younger-than-average workforce, full of burned-out Millennials who are expected to performatively hustle in order to curry professional favor and advance their careers, creating what’s potentially an ideal environment for unhealthy “health” practices to proliferate.

Whether any Silicon Valley tech companies have made biohacking behaviors an expectation for their employees is hard to know, but the industry itself produces a lot of diet programs and products, and it has a history of coercive eating policies for its workforce. Many big tech companies have on-site employee cafeterias that provide food for free or at reduced cost. “By helping your employees make healthier decisions, your business benefits with reduced absenteeism and more productive energy,” wrote Andrea Loubier, the CEO of Mailbird, in a 2017 op-ed that encouraged other tech execs to follow Google’s lead and provide employees with certain types of food in-house, as well as with calorie-counting information. These policies are usually framed as a win-win for employers and their workforce—who doesn’t want a free lunch?—but in the end, they still tend to keep employees close to their desks and working as much as possible.

[Read: The myth of the cool office]

Companies that can’t build on-site food service for their employees can still take a strong-arm approach to moderating their workforce’s diet and physical activity in other ways. Office wellness programs are popular and widespread even outside of the tech sector, and many of them feature things such as office weight-loss challenges that encourage employees to restrict their eating for fun and prizes. As the journalist Angela Lashbrook argues, these programs can act as employer surveillance masquerading as health. “It’s perfectly legal to increase health-care premiums based on the failure of a customer or their partner to achieve certain benchmarks in an insurance-affiliated wellness program,” she writes.

In collecting detailed data about weight or physical activity, workplace wellness takes another step toward punishing failures that don’t necessarily show up in the quality of a person’s work, helping make an ever larger portion of a person’s existence fodder for performance reviews.

Certainly, it’s possible for executives to keep their personal-health practices separate from what they expect of others. But tech as a business sector has long been notorious for its bad boundaries between the personal and the professional. Silicon Valley can give the impression that all personal choices should be made for the end goal of doing ever more work and generating ever more money for founders or venture capitalists, which is part of why so many people find it unnerving to watch a man with so many employees decide that not eating is a valuable practice. If everyone above you on the organizational chart refuses to eat in order to squeeze a little more work out of an already long day, consuming your sad desk salad might be a little higher-stakes than you thought. The person who signs your paychecks might be watching.



from The Atlantic http://bit.ly/2DIpPQh