James Fallows

James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.

James Fallows is based in Washington as a national correspondent for The Atlantic. He has worked for the magazine for nearly 30 years and in that time has also lived in Seattle, Berkeley, Austin, Tokyo, Kuala Lumpur, Shanghai, and Beijing. He was raised in Redlands, California, received his undergraduate degree in American history and literature from Harvard, and received a graduate degree in economics from Oxford as a Rhodes scholar. In addition to working for The Atlantic, he spent two years as chief White House speechwriter for Jimmy Carter, two years as the editor of US News & World Report, and six months as a program designer at Microsoft. He is an instrument-rated private pilot. He now also holds the chair in U.S. media at the U.S. Studies Centre at the University of Sydney, in Australia.

Fallows has been a finalist for the National Magazine Award five times and has won once; he has also won the American Book Award for nonfiction and an N.Y. Emmy Award for the documentary series Doing Business in China. He was the founding chairman of the New America Foundation. His recent books Blind Into Baghdad (2006) and Postcards From Tomorrow Square (2009) are based on his writings for The Atlantic. His latest book is China Airborne. He is married to Deborah Fallows, author of the recent book Dreaming in Chinese. They have two married sons.

Fallows welcomes and frequently quotes from reader mail sent via the "Email" button below. Unless you specify otherwise, we consider any incoming mail available for possible quotation -- but not with the sender's real name unless you explicitly state that it may be used. If you are wondering why Fallows does not use a "Comments" field below his posts, please see previous explanations here and here.

James Fallows: The Atlantic Monthly

  • Michael Janeway and The Atlantic

    A man who played a large role in shaping the magazine you see today.

    I was sorrier than I can say to have learned today that Michael Janeway, a friend who influenced many journalistic institutions but probably most of all The Atlantic, had died of cancer.

    The Boston Globe, where Mike held a sequence of editing roles including that of editor in chief, ran a wonderful appreciation by Joseph Kahn, which is true to the range of achievements and complexities in his life and career. I mention it both in hopes that you'll read Kahn's article and as an excuse to offer a screen shot of the Globe photo by Stan Grossfeld that accompanied the obit, which is of Michael Janeway in his mid-40s and captures him at his sunniest.

    [Screen shot: the Globe photo of Michael Janeway by Stan Grossfeld]

    Mike Janeway is known in journalism for a series of influential roles: at the Globe, where he fostered talent and invented or revitalized sections before a stormy period as head editor; then as a book editor at Houghton Mifflin; then as dean of the Medill School of Journalism at Northwestern, where he foresaw and discussed many of the trends all of journalism is now coping with; then as a professor at Columbia Journalism School; and as an author, notably of a history-memoir of the New Deal-and-afterwards policy intelligentsia.

    This last (The Fall of the House of Roosevelt) was cast as memoir because Mike's parents, the economist Eliot Janeway and the novelist Elizabeth Janeway, were in the middle of this group, and Mike grew up hearing stories about them and meeting luminaries around the house. It also turned on Mike's difficult relationship with his then-quite famous father, as you can see from the book itself or via Michael Beschloss's fine review in the NYTBR. Among the Janeways' New Deal contacts had been the rising Congressman Lyndon Johnson. As a teenager Mike had a summer job working on Capitol Hill for Senator Johnson, and he remained deeply interested in the grandeur and the tragedy of the ultimate Johnson saga: the ambitions for a Great Society and a "we shall overcome!" civil-rights movement, the disaster of Vietnam. The circle of people grappling with the contradictions of Johnson -- of whom Robert Caro is now the best known, but which also included Billy Lee Brammer, Bill Moyers, Doris Kearns Goodwin, James C. Thomson, and Harry McPherson -- very much included Mike Janeway.

    That's the journalism world in general. His influence on the Atlantic specifically was profound. In 1967, five years out of college, he joined the magazine as a staff editor. Robert Manning had just become editor and was shifting the magazine toward coverage of the great upheavals of that time -- race relations, wealth and poverty, Vietnam, all the rest. On their watch the magazine published probably the best real-time assessment of what was going wrong in Vietnam, James C. Thomson's "How Could Vietnam Happen? An Autopsy" -- and a long sequence of other journalism that stands up nearly two generations later. In 1968, when Mike was 28, he and Manning co-edited an influential book of writing about Vietnam, called Who We Are. The magazine you see today is an extension of what Manning, Janeway, Richard Todd, Louise Desaulniers, Michael Curtis, and others (including Elizabeth Drew, the Washington correspondent) created in those years.

    I was in college then, and I would rush out to the newsstand -- yes, that's how it worked in those days -- to get the new issues of The Atlantic, and Harper's, and the nascent Rolling Stone. I barely dared imagine then that I could eventually write for one of these publications, and that I managed to write for the Atlantic is due to Mike Janeway.

    After I heard the news about Mike today, I stopped to reflect that a small group of people (outside my family) gave me crucial opportunities, support, and direction at important early moments in my path. One, perhaps improbably, was Ralph Nader, under whose auspices and at whose prodding I ended up writing two books very soon after leaving college. Another was, of course, Charles Peters, whose role in training generation after generation of reporters and writers at The Washington Monthly is well-known at least within the business.

    Another was Michael Janeway. After I had finished my Washington Monthly stint and was trying to get a start as a free-lancer based in Texas, he entertained one proposal for an Atlantic story. After he nursed me through that one, there was another, and another. Years later, when he was at Houghton Mifflin and I had run into trouble with a different publisher over my idea for a book, he took it over and guided it to what I found to be a very satisfying conclusion. (This was More Like Us.) When he was dean at Medill-Northwestern, he invited me to give a speech that became the outline and impetus for my book Breaking the News. All of this doesn't matter to anyone else, but it mattered a lot to me, and his example (plus others') is in my mind as I think about dealing with people now trying to get a start.

    As Joseph Kahn's excellent Globe story conveys, Michael Janeway was not always an easy-going man -- toward others or toward himself. I am glad for the reports that he became more contented in his later years. I am sorry that I did not think to tell him directly how much he had meant to his profession, and to this magazine, and to me, so I want to say it to his family now.


  • Why You Still Have to Fill Out All Those Paper Forms at a Doctor's Office ...

    ... and whether that might ever change.

    [Photo: David Blumenthal, M.D.]

    In the new issue of the magazine (subscribe!) I have an interview with Dr. David Blumenthal. He is now head of the Commonwealth Fund, but during the first few years of the Obama administration he was in charge of moving America's medical-records system away from tedious paper-based filing and into the digital age.

    I am biased, in that David Blumenthal and I have been friends since we were teenagers, but I think he does a very good job of explaining why it has taken so agonizingly long for medical records to catch up with the rest of our digitized life -- and what the payoff will be when the impending switch takes place.

    Please check it out. I've already received some retorts from other doctors (and insurance-company people), and will post some of them in due course.

  • Our New Issue Is Out

    For your seasonal reading enjoyment.

    Through the bounty of Fox TV, a statue on the waterfront of Eastport, Maine. Details below.

    The new issue of the magazine is just out. It's best read and enjoyed in print (the perfect gift!), but you can also get the idea online. Through the years I've made a point of not seeing what's in the magazine, apart from articles I'm directly involved in, until the whole thing arrives in the mail. A few highlights from this one:

    • Scott Stossel's cover-story account of his lifelong adventures in "Surviving Anxiety" is worth the year's subscription on its own. 
    • Christopher Orr answers a question many fans of Elmore Leonard may have had: why novels that seemed so cinematic on the page had such trouble making the transition to the big screen.
    • Liza Mundy on the virtues of paternity leave and "The Daddy Track."
    • The redoubtable E. Fuller Torrey with a "very short book excerpt" about a phenomenon I have also noticed while traveling around the country.
    • And lots more including poetry, dark secrets of the Internet, extreme-craft beer, ways to fix televised sports, raciness in the air, and other compelling topics.

    The two parts of the issue I had seen before publication time were my Q & A with the also-redoubtable Eric S. Lander, on "When Will Genomics Cure Cancer?" and related big-picture questions; and the first print-magazine installment from our "American Futures" series (with the Marketplace radio program and the Esri mapping firm). This one is about Eastport, Maine, "The Little Town That Might." Previously you heard about Eastport online (e.g., here and here) and in this Marketplace report. Our new article, among other things, gives the backstory on the tasteful super-life-sized statue shown at top.

    Read, enjoy, give gift subscriptions, and have a Merry Christmas season.

  • Events: Pritzker; Simon, Doctorow, Gessen, & Nafisi; Kummer & Levenson

    Three interesting events; hope you can catch some or all.

    In the world of modern journalism, sometimes you are traveling around and learning things as a reporter; sometimes you are sitting in a room and going crazy as a writer; and sometimes you get to ask people questions as a moderator and live-event appearance-maker.

    The next two days hold a rich offering of live-event possibilities for me. Please join in if you can:

    Atlantic Small Business Forum, which runs through the morning of tomorrow, December 4, in Washington. You can see a live stream here. I get to interview the new Secretary of Commerce, Penny Pritzker, about her "Open for Business" agenda and other initiatives at 9:30 a.m.

    PEN - Google - Atlantic Forum, "Who's Afraid of Free Speech?" tomorrow afternoon, December 4, in Washington starting at 4 p.m. You can find information about attending or seeing a livestream here. I get to interview an astonishing panel including E.L. Doctorow (any book you can name); David Simon (The Wire etc); Azar Nafisi (Reading Lolita in Tehran etc); and Masha Gessen (The Man Without a Face etc), at the Newseum in Washington.

    MIT Writing Program, "Long-Form Journalism: Inside the Atlantic," on Thursday afternoon in Cambridge, 5pm-7pm, details here. Tom Levenson, Professor of Writing at MIT and author of the Inverse Square blog, will interview the Atlantic's Corby Kummer, who has edited my articles for 30+ years, and me on specifics of how articles go from inchoate concept to (we hope less inchoate) published reality. 

    Then it will be back to the reporting and writing mode. I was going to take a stab at adventure/reporting by flying myself up to Boston for the MIT event, but ominous weather forecasts for the return on Friday morning leave me in the hands of US Air. 

  • '50 Greatest Breakthroughs,' the Illustrated and Expanded Edition

    From horse collars to the atlatl (yes) to beer, the things we might have added to the list.

    Thomas Perry, an artist and designer living in Osaka, sends this timeline companion to our current cover story on the 50 most significant breakthroughs since the discovery of the wheel. He explains his approach:

    I spent a couple of hours creating a timeline in a class I teach on basic design to interior, product and graphic design students in Japan. I agree with a point you made on your blog that the changes at that time had a much more jarring impact on day-to-day life compared to the present, although dispersed into general society at a slower pace compared to recent inventions.

    A few explanatory points:

    * I start the 20th century at the sinking of the Titanic - all of the promise of recent technology with the dark hint of what's to follow. A case could be made that electrification starts the 20th century, but my guess is that it did not truly penetrate (western) society until after 1912, more likely not until the '20s. I close the 20th century at the airing of the Apple "1984" Mac commercial and the beginning of the digital age on a commercial level. An explanation of Orwell's book is also necessary.

    * On occasion I use two dates to mark an invention, for example, photography: 1827 for the introduction of the first daguerreotype and then the early 1900s as the vague starting time for general use. When I ask the students to fill in the timeline from the list of inventions that we draw up, I emphasize that they should pick dates when the item truly becomes relevant to general society; otherwise electricity and automobiles first make their mark in the 1600s.

    * Interesting how certain events can be dated to the day (airplanes, WW 1), but most have a vague time span (automobiles, the starting point of WW 2 in Asia, the cell phone, email, etc.).

    * I noticed in your article you put refrigeration as starting in the 1850s; I place it at 1927 and the introduction of the electric refrigerator for home use. The freezer does not make it into the home until the '40s, according to my brief research on the internet.

    Now, other reactions from readers. First, what about the threshing machine?

    While I could quibble with some of the rankings, etc., the biggest flaw is the combine harvester [#50 on our list]. While important, this was merely an improvement over the real leap forward, the invention of the McCormick threshing machine. Before that, grain had to be cut by hand with a sickle (which was itself probably a pretty important invention). More so than the moldboard plow, this is what really made large-scale farming feasible and started the movement of people off the farms into the cities.

    What about the semaphore?

    The quote about the telegraph ("Before it, Joel Mokyr says, 'information could move no faster than a man on horseback'") is incorrect. As early as 1795, messages from London to Deal (about 75 miles) were regularly being sent in about one minute. Other European cities probably were doing it even earlier.

    The Semaphore Line, invented and developed by various Europeans in the late 18th century, was so quickly and thoroughly eclipsed by the electric telegraph that it is largely forgotten today. But for several decades it could send messages quite quickly across great distances, assuming the cooperation of weather and finicky machinery.

    What about the horse collar? From a reader in Zurich:

    Clearly there would have been great diversity in the lists presented by each panelist, but I would be curious to know if anyone had mentioned the horse collar (possibly as early as the 3rd century AD.)  [JF note: I don't recall anyone suggesting this, but I'll check again.]

    Years ago in an undergraduate history course, we were made aware that this invention enabled man to use a more powerful animal to plow, to expand production beyond subsistence level, and, ultimately, to support the formation of cities. It still sounds pretty important to me.

    What about baskets?

    But before I read down to the details of the piece I stopped myself and wrote my own choices for the list.  Such lists, though, are troubling because they strive to be unconditional:  the Gutenberg press is fine alone, except what about cheap paper?  Ooops, and the alphabet? So, like your correspondents, I varied the list into clusters of like objects, each of which made something else possible.  That became my criterion:  the power of what is made possible by the invention or innovation.  The great mind may be a combinatory mind.

    So:  wheel is fine.  Wheel + axle is better because it seems to show more possibility in its genius. Hammer-stone, cutting-stone, stone scraper, stone spear, atlatl, are all the same, because they permitted human beings to plan for things like hunting, prepare things like food and clothes, and make shelters.  Higher intellect begins for me in planning, preparing and making -- and then replicating these activities in an adaptive way.  We could add controlled fire + cookery to the list, probably, and the adaptation of materials for tools, weapons, shelters, and food as well.  (Also, some kind of primitive balanced meal must have allowed robust survivors to emerge with expanded brains.)

    Maybe a shorter list of processes is needed as well:  not just invention but also adaptation, combination, cultivation, imitation, experimentation, evaluation, differentiation, judgement.  Welcome, people, to complexity and ambiguity.

    An insight I had at the American Museum of Natural History in 1983 or 1984 made me add baskets to my list ("a world with a basket is better than a world without a basket," that was my thought when contemplating the native people of the American Plains).  But that thought has to be broadened to include all kinds of containers and vessels made of glass, pottery, metal.  Imagine a world without portable storage, cooking containers, drinking gourds.

    What about justice for Eli Whitney? A reader whose last name is Whitney writes:

    For my own selfish reasons, I was pleased to see that Eli Whitney made the list by virtue of the cotton gin.  However, I feel he was slighted by not being given credit for the introduction of the assembly line, described as starting in 1913 and having "Turned a craft-based economy into a mass-market one." 

    As I recall from a long-ago history course, Eli Whitney pioneered the idea of the assembly line when, around 1790, he got a contract from the US government to produce the Whitneyville musket.  This weapon was revolutionary in that it was made from interchangeable parts.  Previously firearms were hand-crafted; the individual gunsmith had to machine each part to fit into the weapon he was making.  By introducing interchangeable parts, an unskilled workman could assemble the musket rather than having to rely on a skilled craftsman.  Without interchangeable parts, the whole idea of the assembly line would never have been workable.

    After the jump, one more message on the hits and misses of our survey. 

    What about beer? From a reader in the Midwest:

    A breakthrough for early America which I think had global impact was the move away from agriculture and into early urban settings, which, among other things, meant groups/communities were no longer subsistence-based: one group had to create things that supported and met the needs of others. America started having 'specialists' rather than self-contained and self-sustaining communities. Then, most dramatically, came the Constitutional Convention of 1787, which replaced the Articles of Confederation and generated a wholly new kind of inter-dependency.

    And that struggling inter-dependency which led to America's economic growth and strength had its own dramatic breakthrough in the development and applications of Alexander Hamilton's notion of banking, credit and early capitalism/free markets to counter the European mercantile mode (in this, Jefferson's Barbary Wars might also be considered a global economic breakthrough). The centralization of debt by the federal government and the federal government's subsequent right to tax for common services and management also opened huge opportunities for entrepreneurs and the need by communities to integrate and further develop services and products.

    One thing I was disappointed to see overlooked: several world-changing military breakthroughs which then allowed cultures to forcibly intermingle, re-prioritize what needed to be done, and mix intellectual capital. Two examples could include the English longbow (victory at a distance and with far fewer losses to the winning side) and mounted cavalry (allowing much wider range for military and social conquest). There was also the 'breakthrough' of so long ago when the first nomadic modern humans built structures and started to save and use seed -- linked, per some theories, to the brewing of beer, which required crop growth.

    From the original article itself:

    Any collection of 50 breakthroughs must exclude 50,000 more. What about GPS systems, on which so many forms of movement now depend, and which two panelists recommended? What about the concept of the number zero...

    The more questions and discussions our ranking provokes, the more successful the endeavor will have been.

    Thus I declare success. 


  • Readers Comment, on Reader Comments

    What Bartleby the Scrivener can tell us about the modern Internet.

    Earlier today I re-explained why I like getting and quoting reader mail but have chosen not to have an open-comments section. Now some sample response:

     'What are you so afraid of?' From a representative unhappy reader:

    I have largely stopped reading your columns because of the no comment rule you insist on.  [JF Note: As good luck would have it, the reader happened to see this item and respond within a few minutes of its going up.] To me, it seems quite intellectually cowardly, not a brave stand as you would have it.  

    To say that open (unmoderated) forums become a place "ruined by bullies, hotheads, and trolls" does not really represent a coherent argument to be against them. Should we shut down Speakers' Corner in London because of "hot heads and bullies"? Or stop town halls stuffed with Tea Partiers because arguments may be too narrow-minded? You obviously can choose to read through them or not, but not all the comments are thoughtless and I don't think the intellectual world needs you policing the unseemly things that some people say.

    All it seems like is that you would find it too unseemly to have one of your articles sullied by some stupid comment, that it would ruin the whole thing. Do all other authors at the Atlantic, or other writers like Ezra Klein or Jonathan Chait or whomever, painstakingly winnow out bad comments? Most forums allow other commenters to flag truly offensive comments, which you could keep track of without following the entire thread. Really, what are you so afraid of? Why do you have such a low opinion of your readers?

    To answer the question directly, what I am afraid of is committing the time to moderate a comments section. Other people are content with unmoderated comments, and that's fine for them. But in my own case, in the celebrated words of Bartleby, "I would prefer not to." 

    Wonderful in theory. From another reader:

    The thing is, a good commentariat turns a blog into a community, and a topic into a conversation. Comment sections are wonderful in theory. In practice, they inevitably are ruined by the mean, the stupid, the angry, and the troll. The poster child is Matt Yglesias. He is hated by many on the left and most on the right, and his comment threads deteriorate into a morass of spittle, bile, and ugliness. I find it neither entertaining nor enlightening to read that kind of 'input'.

    I feel that there are many people whose writing can make for an interesting conversation, but in most cases - you, Yglesias, Drum, Marshall - I end up having that conversation via email.  Let's face it - an interest in politics, public policy and international relations results in one's having to confront a great deal of ugliness and hatred.  Reading the very personal embodiment of that kind of unhinged ideological anger is just not something I'm willing to do.
     

    In which I feel even more like a museum piece:

    I am glad comments exist in many of the things I read regularly (Charlie Pierce and Wonkette come immediately to mind).

    But reading your notice today made me realize that when I click on the link to your blog, I have a sense of walking into a library or museum, where I know I will find evidence of what and how people think, guided by, and including, your thoughts and comments. I expect and enjoy screaming at a rock concert, but I also enjoy some quiet time. 

    "With a side of maggots":

    I usually avoid reading "comments," although the ones in T-NC's column often are worthwhile. Your post got me thinking about one of my favorite Gene Weingarten quotes: "I basically like 'comments,' though they can seem a little jarring: spit-flecked rants that are appended to a product that at least tries for a measure of objectivity and dignity. It's as though when you order a sirloin steak, it comes with a side of maggots." 

    I am all about a big-tent range of approaches to this issue. People who want to have comment sections, great. People who don't want to, also fine. Thanks for writing in, and reading. [Bartleby shirt image from here.]

  • Back Online

    What we can learn from the first Gilded Age, and other news from Aspen

    [Image: GlamorousLife.jpg]

    This has been an unusual period: five days of 24/7 travel and reporting in Sioux Falls, South Dakota, for reasons to be described shortly, which left little margin beyond the hours of interviewing and transcribing; followed by five days of 26/7 events, interviewing, and emceeing at the Aspen Ideas Festival, as my colleagues have so skillfully explained. During some previous years' Aspen sessions, I have piled on with the real-time blogging. This time it would have been hard to do that, and survive.

    This is a back-to-the-online-world note, touching on a few issues.

    1) Gilded Age. The scene above is how our room looked after I did an NPR session yesterday (with Jacki Lyden, on Weekend All Things Considered). Two computers of mine; one of my wife's; digital recorder plus headset and microphone for "tape-sync" recording with NPR; beer [Odell IPA]; water; pills for the high-altitude headache; and so on. I suppose I could add this to the Glamorous Life chronicles.

    In this segment I mentioned my first involvement in this year's Ideas Festival: a two-minute proposal of a "Big Idea" for the opening session. The conference usually begins with eight or ten people giving their Twitterized, speed-reading versions of a "big idea" while a big clock in the background counts down the seconds from 120 to 0.

    The real version of what I said is immediately below; the version that would have taken four and a half minutes to read is below that.

    [Video: the two-minute version as delivered]

    Here is the longer version, which I was cutting desperately as I went to the stage:
    My name is James Fallows. I am a long-time writer for the Atlantic, and my big idea is that we must do as well as the Gilded Age. Let me explain.

    In high school, students are told that they must study history's lessons. In college they learn, or should, to be very wary of this exercise. In theory, historical parallels light our way forward. In reality, they're usually picked and tailored to fit the position we've already chosen in the here and now. For a whole book on this theme, you can do no better than Thinking in Time by two professors I most liked and admired, Richard Neustadt and Ernest May.

    But objectively, some eras share more traits than others. And I submit that the best match for our current American prospect is the last 20 years of the 19th century and the first 20 years of the 20th - a span that includes the Gilded Age and, later, the Populist and Progressive era: a time of labor strife and demographic changes and economic and technological revolution and countless other parallels to what we have been through. Consider just a few:
    a newly globalized market made some people much richer, and many others less secure, and for both better and worse tied everyone's fate to shocks and surprises in far-off parts of the world;
    a nonstop flow of inventions - first the telegraph, then the electric grid, then the telephone and the radio and the internal combustion engine and the assembly line and the oil refinery and the combine, and the airplane, and refrigerated train cars and mass publishing - could make children's lives unrecognizably different from their parents'. Our past 40 years, Google and all, have been nothing by comparison;
    from kindergarten through professional school, every part of the educational establishment faced new economic pressures and cultural expectations;
    immigration transformed the nature of the American population more rapidly than it has done in our time;
    many of those immigrants worked in stockyards or factories where they organized and fought for their rights;
    in the aftermath of Reconstruction, a Jim Crow system emerged;
    the Senate was corrupt; the Supreme Court was partisan; and in the end of this era, during World War I, an intellectual president constrained press freedom in the name of national security.

    I won't go on, because time is short and you're already thinking, as you should, of the ways in which our second Gilded Age differs from the first one. But here is why I use my "Big Idea" slot to make this parallel. The first Gilded Age led to something better.

    From the extremities of farm and factory life, the Populists arose. From the excesses of unregulated new global capitalism came the Progressives. After centuries of flat-out pillage of the American landscape, the conservation movement got its start, as did the national parks. After a post-Lincoln era of disdain for and exhaustion with the art of politics, we had an extraordinary range of people devoted to the public process. People as different, and flawed, as Eugene Debs, Tom Watson, William Jennings Bryan, Susan B. Anthony, W.E.B. Du Bois, Norman Thomas, John Muir and Gifford Pinchot, Lincoln Steffens and Upton Sinclair, La Follette, various Roosevelts, the young Brandeis, the Carnegies and Rockefellers in the charitable phases of their lifespans, and many more.

    That is what the first Gilded Age led to. It would be a big idea, and a big achievement, to match those names, commitments, and deeds. 

    2) Jerry Brown, Good and Bad. Following my long profile of California's once-and-future governor last month, three updates:
    • Andrew Cohen last week on Brown's ongoing problems with California's (overcrowded, overblown, and very expensive) prison system;
    • Several reports (e.g., this and this) on Brown's reversing what seemed a benighted position (essentially: trying to cut spending by undoing the state's open-records law); and
    • A positive report on Brown's new school-spending agenda.

    3) We are doomed by our stupidity. The Congress cuts spending for the GPS program. Read and weep.

    4) Maybe we are not doomed. Mark Kelly and Gabrielle Giffords appeared yesterday at Aspen. Watch and weep, in a different way -- and, be inspired.


    I have no idea whether Mark Kelly was an effective public speaker before his wife was shot. In the aftermath of that tragedy, he has become a formidably eloquent speaker, with great, calmly understated power.


    5) Yesterday I got to interview Henry Paulson, former Treasury secretary under GW Bush and long-time China buff, on prospects in China and for China-US relations -- and the world's environment. (He is a big advocate of US leadership in climate-change legislation.) Full session here; snapshot below.

    [Image: Paulson-680x350-2.jpg]


    6) Plus tomorrow, a speech by Jeff Smisek of United! Much to chronicle.

  • Today's Press-Related Links

    A loss that provokes discussion of the purpose and future of journalism

    [Image: BabyDevil1.png]

    I am still mainly off the grid but wanted to note these items:


    1) How things should not work, part 1. I knew Michael Hastings slightly and liked him a lot. As with most people who either knew or knew-of him, I was shocked and saddened to learn of his sudden death at age 33. He was still growing as a writer. The loss to his family and friends is obvious; the loss to the public is the stories, revelations, and sensibility we will not have from him, as his growth went on. Condolences to his family and colleagues.

    If you would like to read one thing that puts Michael Hastings's death (and life) in a larger perspective, I suggest "Enough with the news-reader apps - it's time to support media that really matters," by Hamish McKenzie, in Pandodaily. He contrasts two news items that crossed his screen at about the same time: one about the Hastings crash, and another announcing $40 million funding for a news-aggregator app. You'll see the powerful and important conclusion he draws from the contrast.*

    2) How things do work, part 2. I highly recommend Isaac Chotiner's excellent interview, in the New Republic, with the editors of Politico, John Harris and Jim VandeHei. Plus this follow-up by David Karol at The Monkey Cage. The best interviewers encourage or lure their subjects to reveal and express themselves in ways they might not have intended; Chotiner has done that. A lot of the story of modern Washington journalism can be wrung from these two items.

    3) Home notes. (a) The latest issue of the Atlantic is out! My contribution is a brief but heartfelt item on what I was doing a year ago at about this time, on the other side of the world (where the photo at top was taken). And if you were to subscribe, you would see in the actual print issue a photo not included on line. It is of the moment I describe in the beginning of the piece, when I faced a classic journalistic dilemma: whether to let my wife know that a wallaby was sneaking up behind her to steal her food -- or whether instead I should just keep the camera going and let the drama unfold.

    3 (b) The issue also contains a short article by this same Deborah Fallows, who fortunately survived the wallaby attack. It concerns what linguists know, or suspect, about how the process of language-acquisition may change, when so many of the people spending time with babies and toddlers are talking not to the child in front of them but to someone else on a smart phone.

    3 (c) While I'm at it, Deb will also be doing an online chat this afternoon with Sandra Tsing Loh, well known to Atlantic readers and many others, on various aspects of Chinese language, based on Deb's book Dreaming in Chinese. It will be 5pm-6pm EDT today, details here.
    ___
    * OK, this will give you an idea of the case McKenzie is making, as he considers the latest well-funded aggregator startup:
    Finding content on the Web is not a serious problem. It's a leisure problem - as in, it's only really applicable to someone who has too much leisure time. If someone ever comes to me to say, "Oh, I can't find anything decent to read on the Internet while I'm killing time waiting for my Uber," I'm just going to slap them.

    And this is where the contrast to Hastings is so painfully evident. Hastings was doing work that, in part because of digital media, is becoming less financially viable by the day (even though he was employed by BuzzFeed, a digital media startup). His brand of hard-hitting, deeply researched investigative journalism is proving increasingly difficult to sustain for media companies that are now more used to cutting budgets than they are to investing in quality reporting. But that's a problem that tech is not doing much to solve.

    Instead, because software people think in terms of efficiencies and scalability, we get this surfeit of applications that deal in repackaging other people's content in a highly personalized and streamlined fashion. The concerns that are given most attention are distribution and discovery, not the promotion of civic-minded independent journalism, and certainly not any way to make it a more profitable enterprise. ... While these news aggregation companies often claim to democratize media and improve access to information, they simultaneously eschew the real problem inherent in today's media business: monetization.

    I am not suggesting that the dwindling fortunes of the media business is the tech industry's issue to solve. But if the likes of Rockmelt and its well-funded ilk are serious about solving difficult "change the world"-type problems, they ought to look at reporters like Michael Hastings and ask themselves, "How can we support work like that?"

  • China, The Atlantic, and the Foibles of Big Data, All in 1 Post

    "Mr. Serving Dishes" comes to San Francisco and offers American manufacturers new hope.

    [Image: LiamChina.png]

    Act One: Late last year I revisit my friend Liam Casey, the Irish entrepreneur deeply involved in the global outsourcing-industrial complex, at the headquarters of his PCH International company in Shenzhen, China. I do an update on his views of the shifting trends in world manufacturing, in an Atlantic story called "Mr. China Comes to America" -- source of the photo above, showing him and one of his factory lines.

         Act One-and-a-Half: Liam tells me to watch for word of his opening a new design center in San Francisco, emblematic of the Bay Area's taking on an expanded role in the ever-faster branding-design-manufacturing cycle.

    Act Two: TechCrunch runs a nice story last week on the opening of the new SF design center. The title of the story is "Mr. China Goes to San Francisco," with gracious references to the ongoing Atlantic chronicles of the activities of Mr. China. It also explains Casey's current ambitions for the center, and in general:
    A teetotaling Irishman, the inexhaustible Casey ostensibly lives in a hotel [JF: the Four Points Sheraton] in downtown Shenzhen but is nearly always in the air. He and his cross-cultural team make nearly all the accessories you can imagine for multiple vendors. You couldn't point a finger in a Best Buy without hitting a product PCH builds.
    He envisions his new building as a gateway to China and a way to help clients - and the public - understand the vagaries of mass manufacturing.
    Those are the China-related and Atlantic-related parts of this item. Now we come to the Big Data part:

    Act Three: A number of auto-translate bots convert the TechCrunch story to Chinese -- and then evidently back out again. Here is the way it looks when it has made the round trip from English to Chinese and then to English. The headlines, from a site tracking pickup of our articles, will give you the idea:

    [Image: MrPorcelain.png]
    Liam Casey has both enjoyed and been mildly embarrassed by the jokey moniker "Mr. China." Let's see how he likes becoming "Mr. Serving dishes." All this is in the ongoing category of "big data making us smarter, sort of."
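
    (For the technically curious: this is the familiar failure mode of context-blind, word-by-word machine translation. "China" the country and "china" the tableware are different words in Chinese, and once a bot picks the porcelain sense on the way in, nothing on the way back can recover the country. The little Python sketch below is purely illustrative -- the word tables are hypothetical stand-ins, not any real translation service or the actual bots involved -- but it reproduces the round trip.)

        # A toy sketch, not a real translation API: how a context-blind, word-by-word
        # bot can turn "Mr. China" into "Mr. Serving Dishes" on an English -> Chinese
        # -> English round trip. The word tables below are hypothetical stand-ins.
        EN_TO_ZH = {
            "mr.": "先生",
            "china": "瓷器",    # the bot picks the common-noun sense: porcelain / dishes
            "comes": "来到",
            "to": "",           # dropped in this crude rendering
            "san": "旧金山",
            "francisco": "",    # folded into the previous token
        }
        ZH_TO_EN = {
            "先生": "Mr.",
            "瓷器": "Serving Dishes",   # no way to recover "China" on the way back
            "来到": "comes to",
            "旧金山": "San Francisco",
        }

        def translate(text, table):
            # Word-by-word, context-free lookup -- which is exactly the problem.
            words = [table.get(w.lower(), w) for w in text.split()]
            return " ".join(w for w in words if w)

        headline = "Mr. China Comes to San Francisco"
        chinese = translate(headline, EN_TO_ZH)     # roughly: 先生 瓷器 来到 旧金山
        round_trip = translate(chinese, ZH_TO_EN)
        print(round_trip)                           # Mr. Serving Dishes comes to San Francisco

    Real translation systems are of course far more sophisticated than this toy, but the same word-sense ambiguity is a plausible route from "Mr. China" to "Mr. Serving Dishes."
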
  • California's New 'Problem': Jerry Brown on the Sudden Surplus, and the Filibuster

    The Senate's abuse of the filibuster "could end America's ability to govern itself." And other interview outtakes.

    [Image: joseph_interpreting_pharaohs_hi.jpg]

    Lots of attention on a holiday weekend to the NYT's lead front-page story, by Adam Nagourney, about California's odd "problem" of having a rapidly burgeoning state budget surplus. Less than three years after Arnold Schwarzenegger departed with a budget deficit in the tens of billions, a combination of tax increases and spending cuts is giving the state a big surplus. As the story puts it:

    At first glance, the situation should be welcome news in a state overwhelmingly controlled by Democrats, who have spent much of their time slashing programs they support... Instead, the surplus has set off a debate about the durability of new revenues, and whether the money should be used to reverse some of the spending cuts or set aside to guard against the inevitable next economic downturn.

    The new surplus figures are bigger than was known when I last spoke with Jerry Brown, in California in early April, for my story in the new issue. But he was already on top of this issue and the upcoming "what do we do with this money?" debate. Here are relevant parts from the story:

    The third and most publicized part of the California budget [after economic recovery, and spending cuts] turnaround was Brown's success last fall in winning passage of Proposition 30, which (among other things) raised high-end tax rates for several years, with a commitment to use the money to avoid cuts in school funding and to pay down the state debt. ... The higher rates will last for seven years, and Brown in his speeches told the biblical story of Joseph, Pharaoh, and the seven fat years and seven lean years. "The people have given us seven years of extra taxes," he said in his State of the State speech. "Let us follow the wisdom of Joseph, pay down our debts, and store up reserves against the leaner times that will surely come."

    And, about the shift in power between himself and the legislature about what to do in these new circumstances:

    "For me to get the budget cuts these past two years, I had to go to the legislature and say 'Please, please, please!' " he told me. "The Democrats"--who control the legislature--"didn't like it, but they agreed as part of getting the tax increase." In California, the governor has line-item-veto authority--one more indication of the legislature's feebleness--and Brown says he will use his veto power to resist spending increases. "The budget is more or less balanced," he told me. "To un­balance things now, they have to come through me. That is a real shift in power." Meanwhile, Brown's reduced and balanced budget includes more spending for what he considers the big challenges of the future: clean-energy initiatives, an expensive (and controversial) north-to-south high-speed-rail project, new canals and aqueducts, even California-based medical-research projects beyond those sponsored by the National Institutes of Health....
    Brown has tried to cut spending so much that the main complaints about him are from the left, and budget-related--­especially about his resistance to federal court orders to spend more on California's enormous and overcrowded prison system. "Fiscal discipline is not the enemy of our good intentions but the basis for realizing them," he said in this year's State of the State speech, justifying a hard line against letting spending increases sop up new revenues. "It is cruel to lead people on by expanding good programs, only to cut them back when the funding disappears."

    Now, here is a little more from that early-April on-the-record interview, beyond what we could fit in the magazine. My article was brim-full of quotes from Jerry Brown, but they amounted to about 5 percent of what he said in our talks. Here's the fuller-context version of how he set up the coming budget fights:

    We are governable. We balanced our budget. Arnold just borrowed money, but we're paying down our debts. Our job creation -- we're 50% faster than the national average. We lost 1.3 million jobs. But we are coming back. Our tax revenues are very volatile, but this increase will be over in seven years. We've got to learn to pay down our debts. We are paying them off at $1.5 billion every year. Then that will be $1.5 billion we don't have to spend.

    The [proposed new spending] bills are stacking up! It's like water on a causeway, it's going to come rolling down. But I'm here, and I'm going to make sure we're going to live within our means. They [meaning other politicians] haven't heard that yet. But they will hear it, as I continue to repeat it.

    I think the real test is whether we get through this year in a balanced way. For me to get the budget cuts these past two years, I had to go to the legislature and say 'Please, please, please!' The Democrats didn't like it, but they agreed as part of getting the tax increase. The budget is more or less balanced. To un­balance things now, they have to come through me. That is a real shift in power.

    All I have to do is hold that line. All I have got to do is play defense.

    I don't know enough about the details of the coming budget battles to judge the full merits of Brown's hold-the-line pledge versus the state's unaddressed needs. My point is that he was anticipating stories like today's.

    While I'm at it, here was another Jerry Brown riff that couldn't fit in the article. We were talking about the oddities of California's governing structure, especially the unique (among U.S. states) weakness of its legislature and unique power of the public through direct-democracy initiatives. I asked him what he thought about a related structural problem at the national level: the modern abuse of the filibuster in the U.S. Senate. For those joining us late, I am talking about the radical increase in filibuster threats in the past 6 years, which in effect means that it takes 60 votes (rather than the normal simple majority of 51) to get anything done. Brown was not a fan:

    We can't have a country based on the 60-vote standard. This is serious.

    We've never had to have 60 votes for appointments or day-to day-decisions. Really, you can't govern that way. That's a radical change.

    How can you govern? Does England have 60? [JF note: Obviously a rhetorical question. His point is that the U.S. has the drawbacks of parliamentary democracy, including political polarization -- without the benefits, namely the ability to get things done.] I think that 60 votes could end America's ability to govern itself. We have to get rid of it.

    That 60 votes is bad.

    Image of Joseph and Pharaoh from here.

  • Why 'Turd Blossom' Is Metaphor but Not Metonym

    The same is true of "blood-sucking leeches."

    [Image: wall-street-sign.jpg]

    Let's have fun with metonymy! I got into this thicket with an early scene in my new profile of Jerry Brown. Here I was trying to convey the interesting/odd experience of talking with the man:

    "Do you know what 'metonym' means?" [Brown] asked out of the blue one time. Unfortunately, I didn't. (To spare you my embarrassment: it's a name used as a reference for something else, like "K Street" for Washington's lobbying culture, or "Silicon Valley" for the tech industry.) The surprise, coming from a politician, was that he was actually asking for information rather than testing me or pretending he already knew. "Me neither," he said after my admission, "but I know it's very big with the deconstructionists." I did better when he asked whether I knew where the phrase "no country for old men" had come from. Yes! It's the first line of Yeats's "Sailing to Byzantium," which became the title of a novel by Cormac McCarthy, which was in turn the basis for a 2007 movie by the Coen brothers. Brown said that he was wondering because he'd just talked with a Washington media grandee* who used the phrase without knowing that it had any history. "Jerry didn't know there was a movie," his wife [Anne Gust Brown] said.
    Now the readers weigh in. First, from Graham Culbertson of the Department of English and Comparative Literature at UNC-Chapel Hill. He said he liked the piece, but:
    I thought I'd take a moment to explain metonym a little more, in case you were interested. Your definition is right but might be a little misleading, while your examples are perfect.

    It really only makes sense to talk about metonymy in reference to metaphor. You say a metonym is "a name used as a reference for something else."  That's true, but only part of the story. A metaphor is also a name used as a reference for something else. The difference is that a metonym has a real connection to the thing being referenced, whereas a metaphor has only an imagined connection.

    If I call the lobbying industry "K Street," that's metonym, because K Street has a literal connection to lobbying. But if I call the lobbying industry "the blood-sucking leeches of American democracy," that's metaphor. They are symbolically connected to blood-sucking leeches, but there is no literal connection. A Karl Rove example: Calling Rove "Turd Blossom" is metaphor - he's not actually a flower. Calling him "the Brain" or "Bush's Brain" is metonymy - he is famous for his use of his brain. That last example is the most common type of metonym, synecdoche, when something is referred to by one of its parts. When you say "we need ten head of cattle," or "they need more arms in the bullpen," or "that movie got asses into seats," you are taking a full thing (a cow, a pitcher, an audience member) and using a part of it as its name (head, arm, ass). (Hopefully "ass" is ok to use if you quote this in your blog, as long as no children are forced to read it in-flight).

    Finally, the reason why deconstructionists were obsessed with metonymy is that they were obsessed with how language tried to but failed to capture reality. Metonym, which seems to come closer to capturing the "real" thing than metaphor, was thus particularly interesting.
    Noted! And now, from Dean Rowan of UC Berkeley Law School (Dean is his name, not his title):
    I suspect mine will not be the only comment you receive about the metonym passage in your Brown profile. There are at least a couple problems with your account.

    First, your definition is essentially correct, yet meaninglessly so. A metonym does indeed involve substitution of one word or phrase for another, but its significance is in how the two terms are related. From OED's entry for "metonymy": "the action of substituting for a word or phrase denoting an object, action, institution, etc., a word or phrase denoting a property or something associated with it; an instance of this" (my emphasis). K Street is a metonym for DC lobbyists, because many lobbying firms reside there and, consequently, the street is commonly associated with the practice. Similarly, Silicon Valley and the tech industry. Neither of these involves mere substitution of one phrase for another.

    Second, Gov. Brown's reference to "deconstructionists" is misleading. Indeed, some scholars associated, for better or worse, with deconstruction as an approach to literary theory did enjoy parsing tropes in texts, and metonymy is a widely deployed trope. Paul de Man was perhaps the most famous example of such a scholar. But having more than a passing interest in rhetorical analysis does not make one a "deconstructionist," and, conversely, many "deconstructionists" don't especially care about it at all.
    I wrote back saying thanks for the clarification -- and offering a clarification in my own defense. I hadn't said that metonym was a "substitute" for a real name. Rather, I'd said it was a "reference." In reply Rowan writes:
    Well, yes, "reference" affords a degree of wiggle room. But my point is that "metonymy" specifies a particular referential relationship of association or adjacency not precisely indicated by other tropes, such as synecdoche, which specifies part-for-whole or vice-versa. We refer to judges as "the bench" (an adjacent object) and sometimes to the President as "the White House" (also related by adjacency), but (because I'm at a loss just now for an example, I choose one from Wikipedia) a "wood" as a particular golf club (referring by synecdoche to the wooden part of the club). Each is indeed a reference, but if you're going to ask, "What is a 'metonym'?," you're not looking merely for the aspect of reference. You want to know how that reference is effected. (A shoelace is a thing, but being told as much doesn't really help one define the object. Similarly, metonymy involves reference, but being told as much doesn't tell one how metonymy refers.) 

    This is longstanding technical jargon, not by any means exclusively deployed by those literary theory folks from the '60s through the '80s or '90s who went a little nuts pitting metonymy against metaphor against synecdoche, and so on. (Don't even get me started on chiasmus.) Systematic rhetorical analysis harkens back at least to medieval thinkers, who were determined to classify these modes of reference, and far more ambitiously than "deconstructionists."
    At least I know about chiasmus! I've even written about it right here.

    * I am feeling particularly big-spirited in not naming the "Washington media grandee" in the No Country episode, whose identity I learned in off-the-record circumstances. Or maybe I am just feeling canny. (Top picture from here; other one from here.)

  • Linda Stone on Maintaining Focus in a Maddeningly Distractive World

    "At one point, I interviewed a handful of Nobel laureates about their childhood play patterns..."

    [Image: JuneCover.jpg]

    As I mentioned a few minutes ago, our new issue (subscribe!) includes a Q-and-A I did with Linda Stone, coiner of the term "continuous partial attention," on how to maintain sanity and focus in an insane and unfocused world.

    Here is the promised extended-play bonus version, beyond what we could work into two pages of the magazine:
    ___
    JAMES FALLOWS: You're well known for the idea of continuous partial attention. Why is this a bad thing?

    LINDA STONE: Continuous partial attention is neither good nor bad. We need different attention strategies in different contexts. The way you use your attention when you're writing a story may vary from the way you use your attention when you're driving a car, serving a meal to dinner guests, making love, or riding a bicycle. The important thing for us as humans is to have the capacity to tap the attention strategy that will best serve us in any given moment.

    JF: What do you mean by "attention strategy"?

    LS: From the time we're born, we're learning and modeling a variety of attention and communication strategies. For example, one parent might put one toy after another in front of the baby until the baby stops crying. Another parent might work with the baby to demonstrate a new way to play with the same toy. These are very different strategies, and they set up a very different way of relating to the world for those children. Adults model attention and communication strategies, and children imitate. In some cases, through sports or crafts or performing arts, children are taught attention strategies. Some of the training might involve managing the breath and emotions -- bringing one's body and mind to the same place at the same time.

    Self-directed play allows both children and adults to develop a powerful attention strategy, a strategy that I call "relaxed presence." How did you play as a child?

    JF: I have two younger siblings very close in age, so I spent time with them. I also just did things on my own, reading and building things and throwing balls and so on.

    LS: Let's talk about reading or building things. When you did those things, nobody was giving you an assignment, nobody was telling you what to do--there wasn't any stress around it. You did these things for your own pleasure and joy. As you played, you developed a capacity for attention and for a type of curiosity and experimentation that can happen when you play. You were in the moment, and the moment was unfolding in a natural way.

    You were in a state of relaxed presence as you explored your world. At one point, I interviewed a handful of Nobel laureates about their childhood play patterns. They talked about how they expressed their curiosity through experimentation. They enthusiastically described things they built, and how one play experience naturally led into another. In most cases, by the end of the interview, the scientist would say, "This is exactly what I do in my lab today! I'm still playing!"

    An unintended and tragic consequence of our metrics for schools is that what we measure causes us to remove self-directed play from the school day. Children's lives are completely programmed, filled with homework, lessons, and other activities. There is less and less space for the kind of self-directed play that can be a fantastically fertile way for us to develop resilience and a broad set of attention strategies, not to mention a sense of who we are, and what questions captivate us. We have narrowed ourselves in service to the gods of productivity, a type of productivity that is about output and not about results.

    JF: When people talk about attention problems in modern society, they usually mean the distractive potential of smartphones and so on. Is that connected to what you're talking about in early-childhood development?

    LS: We learn by imitation, from the very start. That's how we're wired. Andrew Meltzoff and Patricia Kuhl, professors at the University of Washington I-LABS, show videos of babies at 42 minutes old, imitating adults. The adult sticks his tongue out. The baby sticks his tongue out, mirroring the adult's behavior. Children are also cued by where a parent focuses attention. The child's gaze follows the mother's gaze. Not long ago, I had brunch with friends who are doctors, and both of them were on call. They were constantly pulling out their smartphones. The focus of their 1-year-old turned to the smartphone: Mommy's got it, Daddy's got it. I want it.

    We may think that kids have a natural fascination with phones. Really, children have a fascination with whatever Mom and Dad find fascinating. If they are fascinated by the flowers coming up in the yard, that's what the children are going to find fascinating. And if Mom and Dad can't put down the device with the screen, the child is going to think, That's where it's all at, that's where I need to be! I interviewed kids between the ages of 7 and 12 about this. They said things like "My mom should make eye contact with me when she talks to me" and "I used to watch TV with my dad, but now he has his iPad, and I watch by myself."

    Kids learn empathy in part through eye contact and gaze. If kids are learning empathy through eye contact, and our eye contact is with devices, they will miss out on empathy.

    JF: What you're describing sounds like a society-wide autism.

    LS: In my opinion, it's more serious than autism. Many autistic kids are profoundly sensitive, and look away [from people] because full stimulation overwhelms them. What we're doing now is modeling a primary relationship with screens, and a lack of eye contact with people. It ultimately can feed the development of a kind of sociopathy and psychopathy.

    JF: I'm afraid to ask, but is this just going to get worse?

    LS: I don't think so. You and I, as we grew up, experienced our parents operating in certain ways, and may have created a mental checklist: Okay, my mom and dad do that, and that's cool. I'll do that with my kids, too. Or: My mom and dad do this, and it's less cool, so I'm not going to do that when I'm a grown-up.

    The generation that has been tethered to devices serves as a cautionary example to the next generation, which may decide this is not a satisfying way to live. A couple of years ago, after a fire in my house, I had a couple of students helping me. One of them was Gen X and one was a Millennial. If the Gen Xer's phone rang or if she got a text, she would say "I'm going to take this, I'll be back in a minute." The Millennial would just text back "L8r." When I talked to the Millennial about it, she said, "When I'm with someone, I want to be with that person." I am reminded of this new thing they're doing in Silicon Valley where everyone sticks their phone in the middle of the table, and whoever grabs their phone first has to treat everyone to the meal.

    JF: So people may yet find ways to "disconnect"?

    LS: There is an increasingly heated conversation around "disconnecting." I'm not sure this is a helpful conversation. When we discuss disconnecting, it puts the machines at the center of everything. What if, instead, we put humans at the center of the conversation, and talk about what or whom we want to connect with?

    Talking about what we want to connect with gives us a direction and something positive to do. Talking about disconnecting leaves us feeling shamed and stressed. Instead of going toward something, the language is all about going away from something that we feel we don't adequately control. It's like a dieter constantly saying to himself or herself, "I can't eat the cookie. I can't eat the cookie," instead of saying, "That apple looks delicious."

    JF: You say that people can create a sense of relaxed presence for themselves. How?

    LS: When we learn how to play a sport or an instrument; how to dance or sing; or even how to fly a plane, we learn how to breathe and how to sit or stand in a way that supports a state of relaxed presence. My hunch is that when you're flying, you're aware of everything around you, and yet you're also relaxed. When you're water-skiing, you're paying attention, and if you're too tense, you'll fall. All of these activities help us cultivate our capacity for relaxed presence. Mind and body in the same place at the same time.

    People have become increasingly drawn to meditation and yoga as a way to cultivate relaxed presence. Any of these activities, from self-directed play to sports and performing arts to meditation and yoga, can contribute to cultivating relaxed presence.

    In this state of relaxed presence, our minds and bodies are in the same place at the same time and we have a more open relationship with the world around us.

    Another bonus comes with this state of relaxed presence. It's where we rendezvous with luck. A U.K. psychologist ran experiments in which he divided self-described lucky and unlucky people into different groups and had each group execute the same task. In one experiment, subjects were told to go to a café, order coffee, return, and report on their experience.

    The self-described lucky person found money on the ground on the way into the café, had a pleasant conversation with the person they sat next to at the counter, and left with a connection and a potential business deal. The self-described unlucky person missed the money (it was left in the same place for every subject to find), ordered coffee, didn't speak to a soul, and left the café. One of these subjects was focused in a more stressed way on the task at hand. The other was in a state of relaxed presence, executing the assignment.

    We all have a capacity for relaxed presence, empathy, and luck. We stress about being distracted, needing to focus, and needing to disconnect. What if, instead, we cultivated our capacity for relaxed presence and actually, really connected to each moment and to each other?

  • 'Continuous Partial Attention,' 'Metonym,' 'FOP,' 'Charm'—Items From Our New Issue

    What we're serving up this month


    The June issue of The Atlantic has arrived. Say it with me: Subscribe! I read the "actual" (printed) magazine cover-to-cover last night, on the DC-NY train and then after arrival; it's full of good stuff. In my hypothesized "spare time" some day I intend to do a story-by-story gloss. For the moment I'll just touch on a few in-house features:

    This is by way of segue to two extra in-house aspects of the issue that involve me. One is a long story by me about the past-present-and-future governor of my original home state, Jerry Brown. I'll do a follow-up item here soon, but the two points I tried to convey in the story are what is unusual (and impressive) about Brown as a person, and what is unusual (and instructive) about the predicament of California as a state. For now I'll say that I really enjoyed doing this story, except for the always-tedious "writing" part; and it helped me come to terms with the changes, for good and bad, in California between my time growing up there and my sons' time there now.

    Oh, yes, the story also involves my (and the governor's) discovery of what a certain literary term means.

    The other is a Q-and-A with Linda Stone, known inside the tech world for her work at Apple and Microsoft and known to the world at large for coining the term "continuous partial attention" to define our modern mental state. The print version of the interview, with tips on maintaining your own focus despite the blur, is here; the next post in this space will be a "director's cut" extended version of the interview, with more tips. I will try to maintain focus long enough to get it posted.

  • Rauch, Runciman, Rowe: Three Rs for Today's Reading

    Don't try to read these between tweets and chat messages -- but do try to read them soon.

    Here are three pieces of writing very much worth reading -- not necessarily right at the moment, between emails and hassles, but when you have time to digest each of them.

    1. Jonathan Rauch, "How Not to Die," in the hot-off-the-press issue (subscribe!) of our magazine. Quite a few articles in this issue illustrate the kind of journalism that has long been The Atlantic's distinctive strength. This is what we sometimes refer to as "breaking ideas," as opposed to just "breaking news," and by that we mean an article whose author does a lot of traveling, reporting, and interviewing; takes care to present the material in a narrative structure rather than as a straight-out essay; and does all this toward the end of presenting a new concept or way of seeing the world. The cover story, by Charles Mann, obviously is a full-length demonstration of the "breaking ideas" approach, and I will say more about that later. But Jonathan Rauch's piece also deserves careful attention.

    Its essential point is that if people could see and fully imagine what the end of life is like, when it occurs under today's hyper-medicalized circumstances, they would make very different choices about their loved ones and themselves than they do when just confronted with over-familiar facts like "most medical spending comes in the last few months of life," etc. As he explains, Jonathan Rauch came to grips with this reality in watching his father's demise. The same experience with my own father had a similar effect on me. (In our family's case, my father was spared the worst extremes only because one of my sisters had the strength and wisdom to make a last-minute, split-second call against the momentum of high-tech-but-dehumanizing medical-industrial intervention.) Please don't miss this article.

    2. David Runciman, writing about Ira Katznelson's history of the New Deal, Fear Itself, in the London Review of Books (subscribe! -- and in any case you will need to do a free registration to read the article). Runciman, who is a political scientist and writer based at Cambridge University, uses the review to lay out the long background of regional and racial politics in the United States that affects the news even to this day. For instance: Today's legislative paralysis is largely due to the willingness of smaller-state senators to band together as a blocking minority. The party lineup was different in the 1930s (the "Solid South" was Democratic then) but the phenomenon was very similar (emphasis added):
    The second weapon Southern senators had at their disposal was their longevity. Control of Senate committees went by seniority and because the South was a one-party state, Southerners were invariably the ones who had been there longest. In the 1920s, when the Democratic Party was being battered by Republicans in national elections, the South was immune. During this period, 67 per cent of all Democrats in the Senate and 72 per cent in the House came from the South. When a new raft of Northern and Western Democrats were returned on FDR's coat tails in the 1930s, the same Southerners were still around. So it didn't matter whether the Democrats were down or up, the South still ended up on top. When the party was down, Southern representatives were the only ones standing; when the party was up, Southern representatives were the ones with all the experience. There was no way for a Democratic president to legislate without letting the South get its fingerprints all over his bills.
    And, about the results of that era -- and especially of FDR's decision that he could not/would not challenge the racial order in the South:
    Katznelson's argument is that the distinctive character of the postwar American state was determined by the compromises that riddled the New Deal from its outset until its demise under Eisenhower. The result was a 'Janus-faced' politics: outwardly assertive, interventionist, crusading, moralising, always looking to take the fight to the enemy; inwardly constrained, laissez-faire, decentralised, protective of private interests, reluctant to uphold the public good. Katznelson sees this dual state - mixing nearly unconstrained public capacity with nearly unconstrained private power - as both enduring and pathological.

    3. Jonathan Rowe, in his posthumous book Our Common Wealth (buy!). As I mentioned two years ago at the time of his sudden and unexpected death, Jon Rowe was a wonderful and original-minded writer who found a way to express concerns and ideas that made instant sense -- once he had pointed them out. His main contribution to The Atlantic was a 1995 cover story, with Ted Halstead and Clifford Cobb, on why GDP growth was a crude-at-best, destructive-at-worst way for a society to measure its overall progress and well-being.

    At the time of his death Jonathan Rowe was working on a set of ideas that have now taken form in a book edited (from his papers) by his friend Peter Barnes. Its power is, again, to give voice and form to a concept many people sense but that doesn't clearly make its way into political, journalistic, or academic discussion: the value of all the things to which we can't attach an immediate profit-and-loss figure but that clearly matter to individuals, families, and entire societies in distinguishing satisfaction and happiness from malaise. Which is also a point Jonathan Rauch and David Runciman are addressing.

    Please find the time to read these three works.

  • The Glamorous Life of a Journalist, Cont.

    The life we have chosen


    LAX, 8:30 a.m., locating the only working electric socket along this corridor, knowing that the six-hour (United) flight coming up has no power ports or connectivity. Reviewing final-final changes on an article that will "ship" while I am en route.

    I tell myself that this hunched-gnome posture is because I am sitting on the floor. In any case, return to "normal" online presence impends. (Full "glamorous life" archives here.)
