Anurag Mairal, director of technology solutions at PATH Health Technologies, says that it's time to start looking at low-cost innovations in healthcare differently. What should the road map be for an innovation? Should the product debut in a developing country first and then evolve for the developed market, or vice versa?
"We're at a point now where companies cannot just keep adding bells and whistles to the same product to garner sales," Mairal told me.
So, given the increasing costs of healthcare in the U.S. and in Europe, companies are looking increasingly at simpler solutions that can cross borders with a few tweaks.
PATH is a Seattle-based nonprofit, but it's working with the commercial sector to scale, distribute, and market its innovations. That's Mairal's task. A new addition to the PATH team after a noted career with Johnson & Johnson companies, he is symbolic of this merger between social impact and commercial viability.
His approach includes disruptive innovation (disrupting the global health system by changing the cost equation, moving away from a grants-based approach to a commercial one) and developing a market for those innovations. It's not enough to innovate: PATH must build a system to introduce these innovations to the market, advertise them, create distribution chains, and get them to the end user.
PATH was started in the 1970s by three researchers: Gordon Duncan, Rich Mahoney, and Gordon Perkin. Their aim was much the same: bring together public health and the private sector. Their focus, though, was slightly different: population control. So their first innovations addressed overpopulation -- interventions to curb birth rates -- and were focused on Asia.
Since then, they've expanded their focus, looking at nutrition, water, sanitation, vaccinations, and reproductive health. Here are some of the innovations that PATH has piloted:
River blindness tests
It looks like a pregnancy test, but it's designed to identify river blindness (a tropical disease, onchocerciasis), a preventable condition that has affected 37 million people globally, many in poor, rural communities situated near a water source. PATH created this device with a $1.8 million grant from the Gates Foundation. Traditionally, a health worker would have to draw a vial of blood, take it to a clinic where it could be processed, and then report the results several days later. The new strip, however, requires just one drop of blood from a finger prick, and results are available in 20 minutes -- ideal for rural health workers.
Fortified rice
Rice is a popular grain, eaten by half the world's population. Fortified rice includes micronutrients such as iron, thiamin, zinc, vitamin A, and folic acid. PATH partnered with food purveyors in India, Brazil, and Colombia to produce the fortified grains to combat iron deficiency, malnourishment, and anemia. Now the grains are being coupled with school meal programs, such as in Burundi, to ensure that they reach schoolchildren in low-income communities.
Mobile-phone milk pasteurization
Still in the works, FoneAstra is a system that uses mobile phones to monitor flash-heat pasteurization of donor breast milk. When a mother's milk is not safe to consume or is simply not available, human milk banks (HMB) fill the need; WHO supports the use of HMBs to address malnourished infants. However, the pasteurization process is tricky and healthcare facilities are hesitant to use this donor milk, unsure of its safety. By having a cell phone attached to the pasteurization device, FoneAstra enables these health clinics to monitor data on pasteurization, assuring them that the milk is safe to use. A pilot is under way in South Africa with the Human Milk Banking Association.
Single-size diaphragm
PATH created this design after consulting with women globally (in the U.S., South Africa, Thailand, and the Dominican Republic) to ensure a single-size solution. It's more "discreet" than the condom, is easier to use than hormonal contraception, and enables women to protect themselves from unwanted pregnancy and some sexually transmitted diseases. Now PATH is commercializing it with the Germany-based health company Kessel. But it's also trying to figure out how the diaphragm could be integrated into family planning programs, given that it's a reusable product and would eliminate trips to a local health clinic. Those projects are under way in Uganda, India, and South Africa.
Vaccine vial monitors
Vaccine temperature is critical: if vaccines get too hot, they lose their potency. One of PATH's earliest innovations (1996) was the vaccine vial monitor -- a square indicator on the label that lets health workers know whether the vaccine is still safe to use. Modeled after a technology used in the food industry, it prevented WHO from dumping massive quantities of vaccines whose potency would be "unknown" after a day in the sun or in the hands of a health worker. UNICEF and WHO claim that this innovation saves the global health community $5 million every year.
Freeze prevention for vaccines
Vaccines face the opposite problem, too: they can freeze in their carriers. Packed alongside ice packs, vaccines are at risk of freezing, which diminishes their potency as well. The solution? PATH discovered a new way to use nontoxic, biodegradable phase-change material with ice packs to prevent freezing.
A better female condom
Here's a product that debuted in the developed world but is being refined and considered for the developing world as well. While you can find a female condom in drugstores, Mairal explains that they're not popular. Why? They're not always easy to use and can be uncomfortable. A more refined version, developed by PATH, uses higher-quality materials (a 0.03 mm-thin polyurethane film that allows for heat transfer), is claimed to be easier to use, and feels more natural.
Community health videos
Rather than showcasing pre-made videos on maternal and neonatal health, PATH's Digital Public Health Platform -- basically, video and projector equipment -- is enabling rural women in Rajasthan, India, to create videos, showcase their films, and answer questions. The community-driven approach includes teams of health workers for local solutions and storytelling.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn’t point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
University leaders and observers discuss the intersection of student protests, free speech and academic freedom.
In a Thursday debate titled “Academic Freedom, Safe Spaces, Dissent, and Dignity,” faculty or administrators from Yale, Wesleyan, Mizzou, and the University of Chicago discussed last semester’s student protests and their intersection with free speech. They shared the stage at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic, with Jonathan Greenblatt of the Anti-Defamation League; Kirsten Powers, author of The Silencing: How the Left Is Killing Free Speech; and Greg Lukianoff, who leads the Foundation for Individual Rights in Education.
My colleague Jeffrey Goldberg was the moderator.
The most interesting exchange involved Stephen Carter, a law professor at Yale, and Michael S. Roth, the president of Wesleyan University.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners Benioff and Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
In an era fixated on science, technology, and data, the humanities are in decline. They’re more vital than ever.
Earlier this month, the Washington Post journalist Jeff Guo wrote a detailed account of how he’d managed to maximize the efficiency of his cultural consumption. “I have a habit that horrifies most people,” he wrote. “I watch television and films in fast forward … the time savings are enormous. Four episodes of Unbreakable Kimmy Schmidt fit into an hour. An entire season of Game of Thrones goes down on the bus ride from D.C. to New York.”
Guo’s method, which he admits has ruined his ability to watch TV and movies in real time, encapsulates how technology has allowed many people to accelerate the pace of their daily routines. But is faster always better when it comes to art? In a conversation at the Aspen Ideas Festival, co-sponsored by the Aspen Institute and The Atlantic, Drew Gilpin Faust, the president of Harvard University, and the cultural critic Leon Wieseltier agreed that true study and appreciation of the humanities are rooted in slowness—in the kind of deliberate education that can be accrued over a lifetime. While this can seem almost antithetical at times to the pace of modern life, and as subjects like art, philosophy, and literature face steep declines in enrollment at academic institutions in the U.S., both argued that studying the humanities is vital for the ways in which it teaches us how to be human.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.
The release of the Benghazi report should have made this a great week for Hillary Clinton, but questions about her email—and a questionable social call by Bill Clinton—have cast clouds over it.
This should have been a great week for Hillary Clinton. On Tuesday, the special House committee appointed to investigate the deaths of four Americans in Benghazi on September 11, 2012, released its long-awaited report. (Well, the majority report. Democrats on the committee released their own polemic the day before.)
The report turned out about as well for Clinton as she could have hoped. While the panel found fault with the way the Obama administration, and Clinton’s State Department, ran security for embassies and other U.S. stations around the world, it did not find any new bombshells about the night of the attacks, no details that would suggest Clinton could have acted differently that evening and saved the lives of Ambassador Chris Stevens and three others. That outcome largely vindicates Clinton’s claims all along, and it mostly puts the question of the attacks to rest after a long and extremely detailed investigation (though not, as Democrats like to claim, the longest). Clinton’s political opponents are unlikely to quit talking about it, but the threat of a new, damaging disclosure now seems remote.