Recently, my husband and I wanted to buy some new sheets. But how to choose? Would they lose their shape over the years? Begin to pill?
A friend pointed us to Sweethome's recent sheet review. This was no joke. These testers had examined the cotton fibers under a microscope, washed the sheets multiple times, and even given them a literal smell test to make sure they didn't have any noxious post-factory odors. We were suitably impressed, and bought their recommended sheets without a second thought.
But the experience left me curious about this magical little site. Who were these people, and why were they so serious about sheets? I decided to ask Jacqui Cheng, editor in chief of Sweethome and its partner tech site, Wirecutter, about the work they do and why it matters. A lightly edited transcript of our conversation follows.
Tell us about Wirecutter and Sweethome. What's the idea behind them? What sets them apart from other review sites?
You know how when you—or your parents, or your friends—decide you're going to buy something new, like a new portable hard drive or a new humidifier for your bedroom, you often do some research to find out which products have the best features for the best price? And then you have to find out which features aren't worth paying for, and why is there an $800 humidifier anyway!? Well, that's the kind of thing we're trying to do for you at Wirecutter and Sweethome. As consumers, we're all basically doing the same research on the same products all the time, and it's not only redundant, it's a waste of our collective time.
Wirecutter and Sweethome aim to do all that research and just tell you which one or two items are the "best" for the majority of the people, the majority of the time, for any particular category. We're trying to keep the world from wasting its time doing all that research, but also trying to keep everyone from wasting their time on features that are full of hype or don't matter when it comes to real-world usage. For example, at the moment we don't recommend any 4K monitors, and that's because it's just plain not worth it for most customers right now. You aren't going to find tech sites that tell you that so clearly—their job is to talk about why 4K is awesome and why it's the future, not why the home consumer has no use for it right now.
Sometimes, we even tell readers why they shouldn't rely on something that we're reviewing. We did a piece on portable breathalyzers and presented all our research, testing data, and analysis as usual. But during that process, we also discovered that nearly all portable breathalyzers that a regular person can buy on the market today are inaccurate, not to mention that people's blood alcohol content is very hard to predict over time (your BAC continues to go up for 90 minutes after you stop drinking).
So, not only is it wildly unsafe to try to use your own breathalyzer to determine whether or not to drive, even if you did use it, your reading isn't likely to match whatever the police might read if you were to be pulled over and given a police breathalyzer.
In the end, is it really worth spending $150-$300 on a device that is only slightly better than using a crystal ball? We think it's not, and we said so pretty clearly in our piece. However, we still named a "best" portable breathalyzer based on our tests and research (it was the most accurate of the bunch) for those people who don't want to listen to our advice, or just absolutely feel like they need to have one (they might want to see if their teenage kids have been drinking at all, or something along those lines).
In terms of how other tech or review sites operate, we love them—don't get me wrong!—but they also tend to review things on an individual basis instead of looking broadly at a category and determining the best. That's not to say their opinions aren't useful—we rely on other review sites heavily in order to help inform our own judgment on a category. In a way, we all work together virtually, but Wirecutter/Sweethome's goal is to do all that research and condense all that information into a simple, easy-to-understand format for the regular Joe or Jane who doesn't necessarily want to read 10,000 words on routers—they just want to know what's the best one to get at home to use with their iPads, you know?
The hope is that we'll all get back some hours in our lives that could be used on more important things, like spending time with people we like, or getting work done.
How does the testing process work? Which products took the most work to test? Any outlandish tales?
By the time we're testing a handful of items for a particular product category, we've already put in dozens of hours of research and interviews in order to even identify what to test. We really go through each product category with a fine-toothed comb and identify what an ideal product might look like, if such a thing were to exist, before we even begin to narrow down what's actually available on the market to test.
Each product gets a different testing procedure, but we try to put together a plan that does not replicate what other reviewers have already done. So, for example, if it turns out that a site like Anandtech or CNET already ran all the benchmarks you can possibly run on a new tablet, we aren't going to run those same benchmarks again just for the sake of doing so. We're going to name and cite them, and talk about their results, and possibly use that data to help formulate different tests.
We also try to make our tests as "real world" as possible. We recently published a piece on surge protectors where we had an electrical engineer use a noise generator to send voltage spikes through each of our testing candidates to see how each surge protector clamped down on the spike. No one else is doing real world tests like that—you just read the box and you know that this surge protector claims to be able to clamp down 600 volts over a period of 10 years, but does it really? With our tests, we were able to actually see—through the data—how much capability each surge protector loses every time it has to clamp down on a giant spike like that. (Did you know they can only do that so many times before failing, and the only way to know your surge protector is on its last legs is when it finally blows out?)
I would say one of the products that took the most work to test was bike locks. We spent a huge number of hours—seriously, it's terrifying—interviewing and working with a professional bike thief to learn about all the ways in which most bike locks fail, and how thieves usually get around even the best bike locks. We followed this guy around and he showed us exactly how things work and what to look for, and he performed all the testing for us when we narrowed things down to the final candidates. Sometimes, testing isn't so much about the technical details (like with surge protectors) and more about how real humans are going to interact with the item. In the case of bike locks, the things you have to think about the most are actually human factors, with the technical details coming second.
Have there been results that were particularly surprising? Any top-of-the-line products that failed miserably? Unknowns that triumphed?
To be honest, we find surprising things pretty often during testing. For example, we're working on a piece for Sweethome right now that evaluates pepper mills/grinders, and not one, but two of our final testing candidates ended up burping out metal shards along with the ground pepper. And this is after eliminating hundreds of others based on our research and other reviews, so these are not bottom-of-the-barrel pepper grinders. I mean, can you believe that? Two of the "best" pepper grinders that you can buy from reputable brands basically cover your food with metal shards? Wouldn't you like to know that as a consumer? We also found one popsicle mold that put metal shards into the popsicles themselves, so this isn't just limited to things that grind.
As far as unknowns go, it happens a lot, but that's part of why we do this kind of testing. One example that I can think of is that Sweethome did a piece on soda makers last winter, and the best choice was not a Sodastream as one might expect. It ended up being the Purefizz by Mastrad, which is a little-known foreign company that typically makes tools for chefs so they can make things like meat-flavored foams and whatnot. They made a carbonation device that creates a better, more fizzy product than the Sodastream, and without proprietary cartridges for the carbonation.
How do you choose which products to test? Specifically I am curious how nail clippers got on the list.
It's a magical combination of what the readers have been requesting from us (and how often), what the Google search terms are telling us, and what we just plain feel like reviewing. Obviously we try to put more weight on things that we're getting a lot of requests for, or that people are searching for on Wirecutter/Sweethome where we don't have anything published yet. One example of that is the aforementioned router piece—we had an old piece that was in desperate need of updating, and we kept seeing more and more requests for it. So now we have someone working on a totally fresh piece that recommends which router to buy. Same thing goes with Sweethome stuff—we're working on this great (but complicated) piece about lotions and moisturizers because we've been getting so many requests to touch on beauty products.
Of course, every so often, we feel tempted to just tackle a subject even if it isn't totally practical, like personal drones (the kind you fly around). We've got a piece on that coming up because it's a cool topic and the timing was right (in terms of writer availability, and experts willing to talk to us about the topic). Sometimes we just have these amazing pie-in-the-sky ideas that we shelve until the right writer or the right expert comes along. When that happens: magic.
How is this information important for consumers? What sorts of feedback do you get from readers?
Generally speaking, the readers are just absolutely thrilled that we're saving their time by doing all this research for them and presenting it in a transparent, easy-to-understand way. Our audience on both sites is really diverse: we have a following of techies and foodies (Wirecutter and Sweethome, respectively), but we also have a pretty big following of just regular folks—parents, grandparents, friends, neighbors, etc.—who are just tickled to death that they can get information about stuff they're not experts on so they can just buy it and be done with it. No weeks of research and confusing numbers to compare for them.
In fact, although the readers tell us regularly that they love having all the research and testing data available to them, we find that some readers just trust our process enough not to bother reading the reviews in full. We get tweets all the time from people who say they just look for whatever we recommend at the top and buy it without reading the rest of the guide, because they have so much confidence in our tenacious research. We're really flattered that people do that, but we do like to keep all that information there on the page so those who want to find out how we came to that conclusion can do so.
We want to hear what you think. Submit a letter to the editor or write to email@example.com.