Strange as Croatia's "Cravat Day" on October 18 may sound to an American, this celebration actually commemorates an element of national heritage. Croatia, after all, claims to have pioneered that most ubiquitous of modern accessories: the necktie.
"Croat" and "cravat," in fact, are etymologically linked, according to the Oxford English Dictionary. They were originally variants of the same word: "The troops are filled with Cravates and Tartars, Hussars and Cossacs," reads a sentence by David Hume in 1752. Or take the even more comical statement from Daniel Defoe in 1720 (highlighting the alternate spelling, with a "b"): "We fell foul with 200 Crabats."
The cravat apparently came to Western Europe in the 17th century, courtesy of Croatian mercenaries. Perhaps appropriately, the modern Cravat Day has an origin of similarly mixed cultural-commercial flavor. In 1990, Croats Marijan Bušić and Zlatko Penavić founded the Zagreb-based company Potomac D.O.O. According to the company's website, it was intended to fulfill Bušić's high school dream of "creating an authentic medium which would act as a mediator in the presentation of Croatia to the world." The medium Bušić settled on, after partnering with trade-savvy Penavić, was the necktie. Accordingly, in 1997 the firm founded the non-profit Academia Cravatica. "By spreading the truth about the cravat," the Academia Cravatica proclaims, "we improve Croatia's image in international public. The fact that Croats invented the Cravat makes us proud to be Croats."
On the other hand, by acknowledging the role of the French people in [the] cravat's history (who had recognised this ornament on Croatian soldiers), the role of the English people (who had spread it across the world) and the role of other nations which had embraced the cravat, we want to promote partnership between the Croats and other nations as well.
Accordingly, Cravat Day debuted on October 18, 2003, when Bušić and the Academia Cravatica undertook to wrap a giant red necktie around the Roman arena in Pula. The installation proved popular enough that the team has promoted Cravat Day on each October 18 since, with some spontaneous commemorations across Croatia lending momentum to the project. In 2008, the Croatian Parliament unanimously declared October 18 the "Day of the Cravat."
Opponents of so-called Hallmark holidays, make of this what you will. At the very least, there's an element of genuine national heritage here. Also, there are some rather pretty horses. Below, a few images of Cravat Day in 2003 and of a traditional guard-exchange ceremony staged just prior to Cravat Day in 2010.
An image of the installation "A Cravat Around the Arena" on October 18, 2003. (Courtesy of Academia Cravatica)
Soldiers in traditional uniforms participate in a changing of the guard ceremony in St. Mark's Square in Zagreb in 2010. The traditional dress includes a bright red cravat. (Nicola Solic/Reuters)
Further traditional military uniforms for the guard exchange. Aren't you glad it's the necktie, rather than the cap on the horseman, that's become traditional business apparel? (Nicola Solic/Reuters)
Soldiers in traditional uniform march past spectators in 2010. Cravat Day celebrations include a ceremony of military units wearing cravats. (Nicola Solic/Reuters)
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
ASPEN, Colo.—At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid “You did a great job” and “You are smart” isn’t subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of “the S word.”
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I’m not smart after all. And that’s the worst. That’s a risk to avoid, they learn. “Smart” kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
As he prepares for a presidential run, the governor’s labor legacy deserves inspection. Are his state’s “hardworking taxpayers” any better off?
This past February, at the Conservative Political Action Conference (CPAC) outside Washington, D.C., Wisconsin Governor Scott Walker rolled up his sleeves, clipped on a lavalier microphone, and without the aid of a teleprompter gave the speech of his life. He emerged from that early GOP cattle call as a front-runner for his party’s nomination for president. Numerous polls this spring placed him several points ahead of former Florida Governor Jeb Bush, the preferred candidate of the Republican establishment, in Iowa and New Hampshire. Those same polls showed him with an even more substantial lead over movement conservative favorites such as Ted Cruz, Rand Paul, and Mike Huckabee. In late April, the Koch brothers hinted that Walker would be the likely recipient of the nearly $900 million they plan to spend on the 2016 election cycle.
The untold story of the improbable campaign that finally tipped the U.S. Supreme Court.
On May 18, 1970, Jack Baker and Michael McConnell walked into a courthouse in Minneapolis, paid $10, and applied for a marriage license. The county clerk, Gerald Nelson, refused to give it to them. Obviously, he told them, marriage was for people of the opposite sex; it was silly to think otherwise.
Baker, a law student, didn’t agree. He and McConnell, a librarian, had met at a Halloween party in Oklahoma in 1966, shortly after Baker was pushed out of the Air Force for his sexuality. From the beginning, the men were committed to one another. In 1967, Baker proposed that they move in together. McConnell replied that he wanted to get married—really, legally married. The idea struck even Baker as odd at first, but he promised to find a way and decided to go to law school to figure it out.
Many authors have been tempted into writing revisionist histories of the 37th U.S. president, but these counterintuitive takes often do not hold up under closer scrutiny.
Every once in a while someone writes a book arguing that Richard Nixon has been misunderstood. These authors tend to focus on some particular aspect of his presidency that, the argument goes, is more important than that Watergate business. They’ve focused on his domestic policy or his foreign policy as achievements that override his flaws and his presidency’s denouement. Nixon’s highly complex persona also has led to books that probe his psyche—a hazardous and widely debunked practice, though that hasn’t discouraged further attempts.
And, as with other major figures, but all the more so given the drama of his time on the national stage, Nixon’s complexity and essentially low repute tempt some authors to offer revisionist approaches to his place in history. Such approaches have to be assessed on their own merits, not accepted merely because they’re counterintuitive or receive a lot of attention, as new assessments of the controversial and fascinating Nixon tend to do. Two major revisionist books about Nixon argued that his domestic policy was so expansive, humane, and innovative that it overrides his unfortunate behavior; their accounts relegate Watergate to a far less important role. The problem with these books is that they don’t stand up to close scrutiny.
Mike Huckabee and Ted Cruz are suggesting there might be ways for states and cities to nullify the justices’ ruling. They’re wrong.
The Supreme Court’s decision last week did make gay marriage legal around the nation. Unfortunately for social conservatives, it did not make nullification legal around the nation.
Nullification is the historical idea that states can ignore federal laws, or pass laws that supersede them. This concept has a long but not especially honorable pedigree in U.S. history. Its origins date back to antebellum America, when Southern states tried to nullify tariffs and Northern states tried to nullify fugitive-slave laws. In the 1950s, after Brown v. Board of Education, some Southern states tried to pass laws to avoid integrating schools. It didn’t work, because nullification is not constitutional.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
The social network learns more about its users than they might realize.
Facebook, you may have noticed, turned into a rainbow-drenched spectacle following the Supreme Court’s decision Friday that same-sex marriage is a constitutional right.
By overlaying their profile photos with a rainbow filter, Facebook users began celebrating in a way we haven't seen since March 2013, when 3 million people changed their profile images to a red equals sign—the logo of the Human Rights Campaign—as a way to support marriage equality. This time, Facebook provided a simple way to turn profile photos rainbow-colored. More than 1 million people changed their profile in the first few hours, according to the Facebook spokesperson William Nevius, and the number continues to grow.
“This is probably a Facebook experiment!” joked the MIT network scientist Cesar Hidalgo on Facebook yesterday. “This is one Facebook study I want to be included in!” wrote Stacy Blasiola, a communications Ph.D. candidate at the University of Illinois, when she changed her profile.
Was the Concorde a triumph of modern engineering, a metaphor for misplaced 20th-century values, or both?
The box sat untouched in his bottom desk drawer. For weeks we discussed opening it, and one January morning he was ready. I set the box on his white bedsheets and removed the stack of passports, which could have belonged to a family with dual citizenship. But all nine—from 1956 to a valid update issued in 2014—belong to my 89-year-old grandfather.
Lying in bed, he unfolded a stamp-covered page like an accordion and held it open above his chest. “Oh my,” he kept repeating. He paused, and pointed.
London. March 22, 1976. My then-50-year-old grandfather, Raymond Pearlson, the inventor of Syncrolift, was traveling the world selling his shiplift system. Concorde had launched commercially that January. He knew exactly what this stamp represented: Washington Dulles to London Heathrow in 3.5 hours—the first of at least 150 supersonic flights he took on the legendary aircraft.
Engineers at IBM and Google claim they're closer than ever to making computers that could process data in days that would take millions of years to flow through today's machines.
One of the first electronic, programmable computers in the world is remembered today mostly by its nickname: Colossus. The fact that this moniker evokes one of the seven wonders of the ancient world is fitting both physically and conceptually. Colossus, which filled an entire room and included dinner-plate-sized pulleys that had to be loaded with tape, was built in World War II to help crack Nazi codes. Ten versions of the mammoth computer would decrypt tens of millions of characters of German messages before the war ended.
Colossus was a marvel at a time when “computers” still referred to people—women, usually—rather than machines. And it is practically unrecognizable by today's computing standards, made up of thousands of vacuum tubes that contained glowing hot filaments. The machine was programmable, but not based on stored memory. Operators used switches and plugs to modify wires when they wanted to run different programs. Colossus was a beast and a capricious one at that.
I spent a year in Tromsø, Norway, where the “Polar Night” lasts all winter—and where rates of seasonal depression are remarkably low. Here’s what I learned about happiness and the wintertime blues.
Located over 200 miles north of the Arctic Circle, Tromsø, Norway, is home to extreme light variation between seasons. During the Polar Night, which lasts from November to January, the sun doesn’t rise at all. Then the days get progressively longer until the Midnight Sun period, from May to July, when it never sets. After the Midnight Sun, the days get shorter and shorter again until the Polar Night, and the yearly cycle repeats.
So, perhaps understandably, many people had a hard time relating when I told them I was moving there.
“I could never live there,” was the most common response I heard. “That winter would make me so depressed,” many added, or “I just get so tired when it’s dark out.”
But the Polar Night was what drew me to Tromsø in the first place.