A LONGER THAN NORMAL REVIEW: This review, being my first player review here, is longer than future ones are likely to be. I wanted to explain some of the methodology and background as well as present the test results. So it's a good read if you're curious about what sorts of measurements differentiate player performance, what might be audible, and what's probably not audible. Future reviews should be more concise.
BETTER THAN AN iPOD: Despite being cheaper, the Clip+ has many cool features even the more trendy iPods lack. One is a microSD card slot so you can add cheap flash memory for vastly more storage--the Apple Shuffle is especially weak here. For a total of $45 you can have a 10 GB player (2 GB Clip and 8 GB card). You also get a real equalizer, FM radio, voice recorder, and a few other cool things you won't find at the Apple store--all for a fraction of the price.
FLAC READ HERE: And the Clip+ will play uber high quality FLAC files (FLAC is the most popular form of "lossless compression" and literally sounds identical to a CD). As storage prices have dropped, more people are using FLAC for their home music collection. Most portable players force you to transcode your FLAC files into MP3 (or AAC, etc.) before you can take them with you. This slow, sound-degrading extra step isn't needed with the Clip+.
NO iTUNES REQUIRED! (warning: subjective bias ahead...) With the Clip+ there's no risk of corrupting your PC and music collection with what many consider the invasive, bloated, slow, buggy, in-your-face Apple Profit Center piece of Malware otherwise known as iTunes for Windows. You can just plug the Clip+ into your PC and drag files to it or use any number of popular music managers that most consider nicer and better behaved than iTunes--such as the widely praised Media Monkey. The iPod Touch, by comparison, is literally useless out of the box without first installing iTunes and giving Apple your personal information when you open the mandatory Apple account. You see, Apple wants that so they can try to sell you all sorts of things and monitor most everything you do with your iPod. If you're a geek, you can also risk giving up your warranty, DRM rights, media subscriptions, e-reader support, future Apple updates, and possibly even bricking (ruining) your Touch by hacking it (known as "jail breaking"). You don't have to worry about any of that with a Sansa. It's yours to do what you want with, instead of only as Steve Jobs wants. No bloatware or registration is required.
ROCKBOX COMPATIBLE: One of those cool things you can do with a Clip+, that Steve Jobs doesn't approve of for iPods, is run Rockbox's popular free open-source firmware. Rockbox is an impressive project and adds many great features including additional music formats, very useful EQ options, lots of fonts/skins, games, and it's highly configurable. It can be plug-and-play--fully up-and-running with a few clicks of the mouse. Or you can geek out and really customize your player. A Rockbox Clip+ also remains "dual boot" so you can power it on with either the factory Sandisk firmware or Rockbox, and it can be easily uninstalled. See: rockbox.org
STANDARD MINI USB: Unlike Apple players, the Cowon i9, and even its bigger brother the Sansa Fuze, the Clip+ uses a standard mini-USB connector for charging and sync to your computer. This is a plus if you have other devices with this same connector as it's one less cable, home charger, car charger, etc. to have to worry about. And if you lose the Sansa cable, you can easily get an inexpensive replacement just about anywhere.
OTHER DETAILS: I could go on with even more details, but I'm late to the party here. There's lots already written about the Clip+ so just Google "Clip+ Review" for more than you probably want to know.
SUBJECTIVE SOUND QUALITY: There are plenty of reports of the Clip's sound quality on the web--most very favorable. Before I measured it, I spent a few weeks just using and listening to it. Most of the listening was done with Ultimate Ears SuperFi 5 Pro and Sony EX-71 (my usual choice at the gym) in-ear headphones. I immediately noticed familiar tracks with serious deep bass took on new life. They seemed to have way more bass authority and almost subwoofer-like "impact". And the rest of the frequency range seemed a bit cleaner and more open sounding. The Clip is plenty loud without maxing out the volume setting with the average efficiency Sony headphones and even half volume is quite loud with the SuperFi's.
AUDIBLE HISS (added 2/23/11): Playing back a very low level signal with my most efficient headphones (the UE SuperFi's) the Clip+ has some very slightly audible hiss. Interestingly it seems (subjectively) slightly worse with the Rockbox firmware but I need to investigate that more. With more typical headphones there's zero audible hiss and even with the SuperFi's the hiss in the recording itself and/or background noise leaking past the headphones usually masks the slight hiss. So, in my opinion, it's not a problem unless you have uber-efficient headphones, listen to pristine recordings, and hate even a tiny bit of hiss.
AUDIO PERFORMANCE: Here's what makes this review different from most. I measured the Clip's capabilities on a high-end professional audio analysis system (the Prism Sound dScope III) that's capable of more and better measurements than RMAA (which I also used for comparison purposes). The short answer is the Clip+ lives up to its reputation very nicely. There's little to fault and much to praise, and it's especially impressive it pulls it off at such a tiny size and low price.
THE MEASUREMENTS (brief version):
- Frequency Response: Ruler flat from 10 Hz to 20 kHz and very accurate
- Distortion: Below 0.05%, which most agree is inaudible
- Maximum Output: About 15 mW--higher than average
- Output Impedance: An extremely impressive 1 ohm (dedicated headphone amp territory)
- DAC Performance: Impressive and better than many players costing several times more
NO HEADPHONE AMP REQUIRED: A headphone amp isn't likely to help much, and may do more harm (see: Headphone Amps) considering how good the Clip+ already is by itself. It has plenty of clean output power to drive nearly any portable headphone likely to be used with a player like this. The unusually low output impedance means its performance is relatively unaffected by the headphones used. Unless you plan to use some oddball seriously inefficient headphones with an impedance higher than 64 ohms, or like uber-loud levels and long term hearing damage, you'll likely not get any significant benefit from using an amp and may end up less happy. An amp also defeats having such a small and light portable player.
CLIP+ VS iPOD: My portable reference is the previous generation (3G) of Apple's flagship iPod--the Touch. But it's worth noting it's 5+ times the price of the Clip+ and nearly 5 times its size and weight. The Nano, at more than 3 times the price, or Shuffle at nearly double the Clip+ 2GB price, would be a more fair match.
FASHION FAILURE: Speaking of iPods, while the Clip+ does come in several colors, it's not going to wow your friends like the latest Nano might. There's no color touch screen, ultra slim brushed aluminium enclosure, album art, etc. You can get more of that with a Sansa Fuze, but if fashion and status are your main priorities, you might want to spend lots more and sell your soul to Apple instead. But, being realistic, most don't use their players as fashion accessories and they often keep them stashed away. Heck, with the money you save you can even buy some cool shoes or clothes.
THE DISPOSABLE FACTOR: Let's face it, portable players used during commutes, at the gym, at work, etc. can have a hard life. The headphone cord snags on things and the player gets flung hard to the floor. They get jammed in dirty pockets with car keys or stored with sweaty gym clothes. They sometimes get lost or stolen. And eventually, even with the best of care, their batteries no longer hold a decent charge. You can buy several Clip+ players for the price of one iPod or other high-end model that likely doesn't perform as well.
DILBERT (and the buyer) WINS: The Clip+ team at Sandisk should get an award. Their engineers clearly know what they're doing and apparently found a way to somehow pull off a genuinely well engineered product at a bargain price. How they managed without senior management insisting it be rushed out the door half baked with all the usual corners cut is a mystery.
BOTTOM LINE: I'm not aware of any other player, besides perhaps the larger Sansa Fuze, that can touch the Clip+ in terms of audio performance for anywhere near this low a price or offer similar features and performance in anywhere near as small a package. I'm so impressed I plan to buy a few spares when they near the end of the product cycle because I'm worried Sandisk might somehow screw up the replacement (like some have said they did with the latest Fuze+). The Clip+ is a rare find, and in some key areas, outperforms many more expensive players.
TEST RESULTS (the section in blue below is for those who are curious about audio test measurements, or just like to geek out on the numbers):
TEST INTRO: You can read about how I (and some others) test in this link: Testing Methods. And near the bottom of this review you'll find more details right down to firmware revision, etc.
FREQUENCY RESPONSE: This is one of the more audible differences between players. Most portable players roll off the low frequencies to varying degrees (some quite audibly) and many roll off the highs in audible ways as well. But not the Clip--its frequency response is very flat and accurate. Here's a swept sine wave test starting at a lower than normal subsonic 10 Hz up to 20 kHz. The yellow flat line is the frequency response. It's about +0.07 dB/-0.0 dB across the entire range (as with most of the graphs, click for the full size version):
I'm guessing the Clip+ uses a Class-D, or similar, headphone amplifier as it's apparently DC coupled. There's evidence that cheap/small coupling capacitors (often needed to block DC from reaching the headphones or various parts of the circuitry) degrade sound in audible ways. So it's a good thing the Clip+ apparently doesn't have any.
The highs are flat right to 0 dB even at 20 kHz. This means the Clip+ is likely using sophisticated digital filtering techniques and a relatively high quality DAC. Most players--especially ones under $100--have at least some visible HF roll off. And, for anyone wondering, the tiny rise starting around 5 kHz amounts to about 0.07 dB and is completely insignificant (anything less than +/- 0.25 dB is generally considered impossible to hear). Flatter HF response also generally means better HF phase accuracy in the more audible frequency range than players that start rolling off sooner.
The light blue line above is the distortion plotted across the same range of frequencies. To keep the scales all in dB (and consistent with RMAA), it's plotted in dB referenced to 0 dB. So, for example, at 1 kHz, the total distortion is about -65 dB. And if you look at the 1 kHz distortion plot a few graphs down, that's about the sum of the applicable distortion products. The distortion falls off at the highest frequencies because the analyzer limits the distortion measurement to the audible band--at say 15 kHz the 2nd harmonic is at 30 kHz, which is outside the audible range so it's not calculated into the THD + N measurement. The distortion is more constant with frequency than many players, and this is generally considered a good characteristic that leads to better sound. Here's the iPod Touch 3G for comparison:
The iPod has very similar impressive frequency response and notably lower distortion although one can argue neither player's distortion is audible. And, no, I can't explain the iPod's glitch right before 20 Khz but I couldn't make it go away running variations on the test. WINNER: FREQUENCY RESPONSE-Tie, DISTORTION-iPod
HOW MUCH DISTORTION IS AUDIBLE? There's considerable debate over how much distortion is actually audible, and here's the short answer: "It's complicated". For example, odd order harmonic distortion is more audible than even order harmonics. And certain music reveals distortion more easily while other music masks it better. The same is true of headphones and speakers--some mask distortion and some better reveal it. And there are kinds of distortion that are not well measured by just THD or THD+N but it's still a very good indicator of the relative accuracy, linearity and overall design of audio circuitry. Devices with fairly high THD are generally more likely to be compromised in audible ways. And this is especially true of low cost, small, battery powered products like portable players.
While I hope to cover this more in a future article, the estimates of the audible threshold mostly range from 1% down to as low as 0.01%. But nearly everyone seems to agree 0.05% or less is inaudible when listening to real music. Various blind studies have been done to help find the threshold.
REMEMBER TAPE HISS? To perhaps provide another--perhaps more intuitive--perspective, a THD value of 0.05% means the distortion is 66 dB below the peak signal. For anyone who's ever listened to tapes, 66 dB was roughly the amount of signal to noise ratio you get with Dolby B noise reduction and cassettes. The tape hiss is easily audible during quiet passages but generally masked by the music at more normal levels. Unlike tape hiss, which is relatively constant, distortion products are mostly proportional to the level of the music. So when the music is quiet, so is the distortion. But if you're uncomfortable with something like tape hiss being buried under your music, you might want to seek out products with low levels of THD--i.e. below 0.05% at all frequencies, and to be extra safe, below 0.01%.
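For anyone who wants to check that conversion, the percent-to-dB math is a one-liner (the function name is mine, purely illustrative):

```python
import math

def thd_percent_to_db(thd_percent):
    """Express a THD percentage as dB relative to the fundamental."""
    return 20 * math.log10(thd_percent / 100)

thd_percent_to_db(0.05)  # ~ -66 dB, the tape-hiss comparison above
thd_percent_to_db(0.01)  # ~ -80 dB, the extra-safe threshold
```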
THE SANSA CLIP's DISTORTION: Here's the distortion referenced to an output level of approximately 1 mW into 32 ohms (about 179 mV), an industry standard level. This is roughly the level many people listen at with typical headphones. The most common portable headphone impedance is 16 ohms and some of those dip to 15 ohms. So using a 15 ohm resistive load is ideal to simulate near worst-case loading and allow others to easily replicate these tests. A 15 ohm load is more challenging to the player than a 32 ohm load:
As you can see the Clip's THD alone measured a very respectable 0.048% and the THD + Noise was only slightly higher at 0.055%. Most would argue this is comfortably below what's audible playing any sort of music. However, for comparison purposes, here's the iPod Touch 3G in light blue over the Clip in red (click for full size):
As impressive as the Clip+ is, the iPod has one tenth the distortion at 1 kHz into the same 15 ohm load at a similar output level. It also has a lower noise floor but somewhat higher ultrasonic leakage of the 44.1 kHz sampling rate. Of course, the iPod Touch is also 5+ times the price and almost 5 times bigger and heavier. And a strong case can be made the distortion is inaudible for either player. The iPod turns out to have another weakness that's way more likely to cause audible problems than the Clip's low distortion. Keep reading to find out what that is... WINNER: iPod
OUTPUT IMPEDANCE: Output impedance is a critical factor in evaluating any device intended to drive a headphone. And it's not measured by RMAA. A high output impedance causes audible frequency response variations with many headphones (especially balanced armature types). The lower the output impedance, the less interaction there will be between the headphones and their source. That's one of the big reasons headphone amps can improve the sound--they generally have lower output impedance.
Here the Clip+ also does amazingly well. Output impedance is measured by noting the voltage drop when a known load is applied. From this you can calculate the internal output impedance. See Headphone Amp Impedance for more information. Under the exact same conditions as the THD test above, here's what happens with no load:
Unloaded, the output voltage went from 186 mV up to 199 mV. If you do the math (see the link referenced above) the Clip+ has an output impedance at 1 kHz of 1.02 ohms--this is very impressive!
It's also worth noting the distortion drops to about 0.02% with no load as the Clip+ is no longer having to deliver any current which makes the internal amp's life much easier. Here's the iPod, same conditions as above, but with no load:
The iPod is 211 mV with 15 ohms and 310 mV with no load. Doing the math, the iPod's output impedance of 7 ohms is 7 times higher than the tiny Clip+'s! 1 ohm is really impressive, 2 - 4 ohms is OK. But 7 ohms is getting high enough to be a concern with some headphones (see: Amp Impedance). And, interestingly, the iPod's distortion goes up slightly with no load, likely because it's a THD + Noise measurement and the iPod's noise increases without a load. WINNER: Clip+ (by a huge margin)
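The voltage-divider math behind both numbers is simple enough to sketch (the helper name is mine, purely illustrative; the voltages are the measured values quoted above):

```python
def output_impedance(v_loaded, v_unloaded, r_load):
    """Source impedance calculated from the voltage sag under a known load."""
    return r_load * (v_unloaded - v_loaded) / v_loaded

clip = output_impedance(0.186, 0.199, 15)   # ~1.05 ohms
ipod = output_impedance(0.211, 0.310, 15)   # ~7.04 ohms
```

The tiny difference between the ~1.05 ohms above and the 1.02 ohms quoted is just rounding of the measured voltages.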
WHY OUTPUT IMPEDANCE MATTERS: Here's the Clip+ frequency response into the same 15 ohm purely resistive load and also the Ultimate Ears SuperFi 5 Pro headphones which have an impedance characteristic that's typical of many armature type headphones. The lowest trace, in yellow, is the iPod driving the SuperFi's (all 3 level matched at 1 Khz):
Note how the reactive impedance of the SuperFi's causes less than a 1 dB (+/- 0.5 dB) deviation with the Clip+ (blue trace) but causes a decidedly audible variation of nearly 3.5 dB with the iPod (yellow trace). This sort of wide audible variation is what you can expect when using many balanced armature-type headphones with the iPod.
MAXIMUM OUTPUT: This is another very key difference between players--and another critical specification RMAA cannot measure. The generally accepted method is to increase the output level driving a realistic load until the distortion reaches 1%. With portable players, however, the volume steps are usually coarse enough that one step (the smallest change you can make) will often go from well under 1% to well over 1% THD. But, in the case of the Clip+, even playing the loudest digital signal possible (0 dBFS), with the volume all the way up, into a difficult 15 ohm load, it won't clip! The distortion is still an extremely respectable 0.06% at maximum output!
The maximum output of 489 mV corresponds to about 15 mW into 16 ohms (or 7.5 mW into 32 ohms). This is a very healthy max output, especially for a player with such a tiny battery. Many players can only manage about 5 mW. Here's how the iPod compares:
The iPod was somewhat disappointing here. Not only did it manage less maximum output into the same load, and not only did the harmonic distortion rise dramatically, but the overall noise floor is some 15-20 dB worse. So don't push the iPod even close to clipping if you want clean sound. With standard 16 ohm headphones, the Clip+ should play slightly louder and cleaner. WINNER: Clip+
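The output power figures quoted above follow directly from P = V^2/R (the helper name is mine, purely illustrative):

```python
def power_mw(v_rms, r_ohms):
    """Power delivered into a load, P = V^2 / R, expressed in milliwatts."""
    return (v_rms ** 2) / r_ohms * 1000

power_mw(0.489, 16)  # ~14.9 mW into 16 ohms
power_mw(0.489, 32)  # ~7.5 mW into 32 ohms
```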
IMD DISTORTION: By combining two tones at once, an IMD test makes life harder on the player and does a better job of simulating the complexities of real music. The audio geeks at SMPTE decided to standardize on 60 Hz and 7 kHz in a 4:1 ratio of levels. So here's the SMPTE IMD for the Clip+ referenced to the same ~180 mV reference level as the THD measurements into the same 15 ohm load:
Again, 0.03% IMD is below the audible threshold. Note, unlike most RMAA IMD results, you can actually see what happens at higher frequencies here. There's a different sort of ultrasonic hash here than with the THD measurement. You can clearly see the interaction of the two tones creating "pairs" of harmonics. You can also see the 44.1 kHz sampling carrier leaking through. Note the IMD reading is lower than the THD reading due to the way IMD distortion is calculated from the predicted side bands. I'm not sure how RMAA calculates IMD, but it's often inconsistent with the results from real audio analyzers. Here's the iPod's IMD into 15 ohms (same conditions):
As with THD, the iPod clearly has lower distortion. This is impressive performance for a portable player at any price driving 15 ohms. WINNER: iPod
DAC LINEARITY: Even cheap DACs these days manage to show pretty good performance at large signal levels but at really low levels--close to the noise floor--is where they often show their compromises. This is especially true of DACs that have to use very little power and run from very low voltage power supplies.
DAC linearity is important because most portable devices have digital volume controls that reduce the signal before the DAC. So if you listen to music with a wide (say 60 dB) dynamic range, and you turn the volume down say 30 dB below the max level, the quietest parts of your music are now at -90 dB from the DAC's perspective. If the DAC has poor linearity you get dynamic distortions--i.e. non-linear compression/expansion--that can be audible.
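That worst-case arithmetic is just stacking attenuations in dB; here it is as a sketch (the function name is mine, purely illustrative):

```python
def quietest_level_dbfs(dynamic_range_db, volume_cut_db):
    """Level the DAC actually sees for the quietest passages of a track,
    starting from a 0 dBFS peak and subtracting each attenuation."""
    return 0 - dynamic_range_db - volume_cut_db

quietest_level_dbfs(60, 30)  # -90 dBFS, right where DAC linearity is tested below
```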
The generally accepted test is to use a 0 dBFS signal to set the measurement reference and then play a test track recorded at -90 dB. A perfect DAC will reproduce the signal at very close to -90 dB. Many, however, will be either well above or below that. Errors of 1 or 2 dB are considered normal and inaudible. Greater than that, however, is a sign of a poor DAC and/or other design problems.
There's some debate if this test should be run with the digital volume control all the way up so you're not reducing the input to the DAC making the test even more challenging. But many players clip trying to establish the 0 dBFS reference at max output (like the iPod). So I chose to run this test using the same ~180 mV reference output level. That way the test can be reproduced in a consistent way on most any player. Here's the Clip+ reproducing 1 Khz at -90 dB:
If you look at the two areas circled in yellow, you'll see it's at -91.6 dB which is a fairly impressive result considering it's only just above the noise floor and the volume control is further reducing the actual level to the DAC by several dB. Note the frequency axis has been zoomed to more easily see the top of the 1 kHz tone. Here's the same test on the iPod:
The iPod is almost perfect at -90 dB and likely has at least a slightly better DAC (as it should for 5+ times the price and being much larger). But the Clip's error is still small enough to be inaudible. Winner: iPod
NOISE (revised 2/23/11): A lot of players mute their output when they detect no digital signal or nothing is playing. So to really evaluate noise, you have to have something playing. And, considering noise is most audible during quiet passages, it's most valid to evaluate the noise playing a very low level signal. The above -90 dBFS tests are perfect for this. Here what matters most is the relative difference between players referenced to approximately the same output level. If you look at the above 2 graphs you can see the noise floor of the iPod is about 2 - 3 dB lower than the Clip+. This also shows up in the RMAA results later. Both are very quiet players. As noted in the first section of this review, this noise is just barely audible with really efficient headphones and is likely to be unnoticed most of the time.
JITTER SPECTRUM: Somewhat like THD there's lots of debate about jitter, how to best measure it, how much it's audible, etc. While the dScope can measure actual levels of various kinds of jitter on a digital signal, and create controlled jitter on its digital outputs, trying to measure jitter on an analog signal from a device playing back a file is considerably more challenging and controversial.
The most widely recognized method I'm aware of is to use an 11.025 kHz test signal (i.e. 1/4 the sampling rate) at -6 dBFS and then inspect an averaged high resolution FFT with both the frequency and level zoomed in for greater detail. Any "spread" at the base of the signal is likely from low frequency jitter, and any jitter that's a byproduct of various clock frequencies, etc., will show up as "spikes" or "side bands" to the main signal. There are lots of resources on jitter, but this is one of the better ones I've found that doesn't get too technical and shows examples of what I'm talking about: Measuring Jitter
The above method does indeed seem to work and I have a specific test routine saved on the dScope to allow re-creating the identical test with any device. Some indeed measure much more poorly than others. The Clip+ and iPod, however, show essentially no jitter problems. Here's the combined result of both (click to enlarge):
The Clip+ is in red and the iPod in light blue. The iPod shows slightly less "spread" and slightly lower "noise" but it has lower noise in general. The 0.06% distortion figure shown is for the Clip+ but really shouldn't be in the screenshot as it isn't a good measure of jitter. Jitter performance is mainly what the spectrum looks like. In this case it's tempting to give the iPod the slight nod, but it's really too close to call. Winner: Tie
SQUARE WAVE TEST: Like the jitter test, inspecting square wave performance is done visually. It's a very revealing test in many ways. Here are some things you can learn from a simple 1 Khz square wave:
- Low Frequency Performance: If the device has poor bass performance, the tops of square waves will slope down to the right.
- High Frequency Performance: If the device has poor treble performance the leading edge of the square wave will be either rounded down or there will be lots of overshoot.
- (In)Stability: Nearly all amplifiers use feedback to reduce distortion. Feedback design involves compromises and some can lead to instability. This instability is usually invisible in regular sine wave distortion measurements but shows up clearly on a square wave as an oscillation.
- Digital Filter Response: There are lots of different ways to design DACs and nearly all require some sort of low pass filter to properly reconstruct the audio and filter out the unwanted byproducts of the conversion process (such as the 44.1 kHz carrier frequency). Increasingly, this is partly or entirely built into the same chip as the DAC. Digital filter design is a compromise between the phase and amplitude characteristics. Some designs produce minimal ringing, some pre-ringing, and some post-ringing. Any ringing will typically show up in an impulse response and also the square wave response shown here. There is considerable debate about the "lesser of evils" in the design of these filters and which sounds best. Two of the more popular options are "Minimum Phase" and "Linear Phase". See Digital To Analog Converters for more about the conversion process.
And for some heavy theory and a taste of the trade offs see Linear Phase Filters For Audio.
Here's a "near ideal" square wave feeding the dScope from a high-end professional signal generator:
Note the very flat and smooth "tops" of the waveform. This is as good as a square wave can look on the dScope. Here's the Clip+ reproducing the same 1 kHz square wave into a 15 ohm load and the more wild impedance of the UE SuperFi 5 Pro's:
The level isn't exactly 500 mV peak-to-peak because of the discrete volume steps on the player. But the flatness (lack of slope) and the absence of ringing or other problems are very impressive. Note how little the very reactive non-linear load of the SuperFi 5's changes the output. Credit the very low 1 ohm output impedance and also excellent stability in the Clip's headphone amp. This is among the best performance I've seen from any portable player. For comparison, here's the iPod:
With both the 15 Ohm resistive load, and the SuperFi's, the iPod has quite a bit more visible ringing. This likely means Apple opted for different compromises with the DAC filtering and/or headphone amp design and might not be a sign of any problem. But it could also be partly related to the small high frequency rise seen in the iPod's frequency response. Also note the bigger shift in level with the two loads--this is due to the iPod's much higher output impedance.
PITCH TEST (revised 5/25): It's somewhat unusual to find significant pitch errors in today's digital audio players. And, if the error is consistent sample-to-sample, it's likely a design error with the hardware or firmware. There's already some info on the web about the Clip+ "running fast" and mine indeed does but only by a very small amount: 0.25%. So the musical note A above middle C comes out at 441 Hz instead of 440 Hz. I wonder how many would notice this, but I did confirm the problem using my Agilent bench meter (which has an extremely accurate internal timebase) and playing back a "perfect" 1 kHz sine wave:
ROCKBOX PITCH FIX: The good folks who maintain Rockbox apparently knew about the above error, and interestingly, fixed it in their firmware. Here's the same Clip+ playing the same file but running Rockbox and the error is now near zero (0.04%):
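For perspective, pitch errors are usually expressed in cents (100 cents = one equal-tempered semitone); the conversion is a one-liner (the function name is mine, purely illustrative):

```python
import math

def pitch_error_cents(f_actual, f_nominal):
    """Pitch error in cents; 100 cents equals one equal-tempered semitone."""
    return 1200 * math.log2(f_actual / f_nominal)

pitch_error_cents(440 * 1.0025, 440)  # ~4.3 cents fast with the stock firmware
pitch_error_cents(440 * 1.0004, 440)  # ~0.7 cents with the Rockbox fix
```

At roughly 4 cents, the stock firmware's error is around the threshold of what trained listeners can detect, which may explain why it went largely unnoticed.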
OTHER ASSORTED RESULTS: The channel balance was within 0.20 dB (inaudible) on the Clip+ and iPod. The channel separation was about 50 dB on the Clip+ and 60 dB on the iPod (both loaded with 15 ohms). I'm not sure the extra 10 dB advantage to the iPod is audible but it might be. The Clip is handicapped by its extremely small size. Physical separation of the left and right audio circuitry is a key design technique to increase channel separation. Sandisk clearly didn't have as much space to work with as Apple did. I suspect the tiny Shuffle performs similarly or even worse.
RMAA RESULTS: For reference, and comparison purposes, here are some RMAA results for the Clip+ versus the iPod Touch 3G. Unlike most RMAA measurements you find on the web, they were taken at known reference levels (the same as the tests above: ~180 mV), with a known matched load (the same 15 ohm load used with the dScope results), and with known PC high quality sound hardware (a Benchmark ADC1). See Testing Methods for more details on how I test, as well as performance graphs of the Benchmark and dScope hardware. Here's the basic RMAA results table for both players and several RMAA graphs for just the Clip+:
In general the RMAA results, under these controlled conditions, compare fairly closely to those I measured with the dScope. Some things are significantly different--especially the IMD. But, unfortunately, RMAA results published on the web often don't include essential details such as the PC sound hardware, loads, reference levels, etc. And without knowing these details, it makes comparing results very difficult. And it's entirely possible the better performing product will appear to be worse when you're comparing apples to oranges in terms of how the measurements were made. See my article on RMAA and its many problems: RMAA Limitations
Equally as important, the dScope can measure lots of things RMAA cannot. Some of these, like maximum output level, output impedance, DAC linearity, square wave performance, etc, make for very real audible differences between players.
FINAL SCORE: I count 4 wins for the iPod, 3 for the Clip+, and 2 ties. That makes it a fairly even battle--impressive considering the huge price difference. But given many of the iPod's wins are likely an inaudible advantage, and the opposite is true for the Clip+ (the lower output impedance, higher output at less distortion, and better square wave performance), one could argue the Clip+ is more likely to sound better in real world use. This will especially be true with balanced armature type headphones.
SUMMARY: As stated near the beginning of this review the Clip+ turned in excellent performance for just about any portable player, let alone a really tiny one under $40. While the iPod has the performance advantage in several areas I'm not sure any of them will result in audibly better sound quality. I am, however, fairly confident the lower output impedance and higher low distortion output power of the Clip+ are an audible advantage over the iPod with many headphones--especially balanced armature types such as most of the Etymotic IEMs, Shure IEMs, Ultimate Ear IEM's etc. These all tend to have wider impedance swings which, combined with the iPod's roughly 7 ohm output impedance, will create audible frequency response variations compared to the Clip+ (see the earlier graph). The flip side of this is some might actually prefer the "less flat" frequency response they get with the iPod. And some argue certain headphone makers "voice" their affordable portable products for use with iPods.
THE BORING TEST DETAILS: The Clip+ was the 4GB model in black, purchased at a local big box store in January 2011. The firmware is the factory version that came with it (version 01.02.15a) unless otherwise noted. The battery was fully charged at the start and remained above 90% for all the tests. All the test files were in uncompressed WAVE 44.1 kHz/16-bit format (CD quality) although basic performance was verified with FLAC and MP3 and no significant changes were noted. All EQ and other "processing features" were turned off for all the tests.
The iPod Touch 3G is an 8 GB model running iOS 3.1.2 and was fully charged at the start of the test. All EQ and processing was set to off. The identical WAV files used on the Clip+ were used for all measurements. Most of the measurements were made back-to-back with the identical test setup--adjusting for 0dBFS reference levels when needed. The nearest volume setting on the iPod was a fraction of a dB higher than the level used on the Clip+.
Most of the tests are referenced to 0 dBFS being roughly 1 mW at 32 ohms (about 180 mV RMS) into whatever load is specified (typically a resistive 15 ohm load). I say "roughly" because, like nearly all players, the volume changes in discrete steps. So when playing back a test signal you're forced to choose the volume step that's closest to the desired value. The 1 mW/32 ohm value is used as it's something of an industry standard, a level most players can achieve without clipping, and represents a typical listening level with typical headphones.
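As a sanity check on that reference level, the voltage needed for a given power into a given load is just sqrt(P x R) (the helper name is mine, purely illustrative):

```python
import math

def vrms_for_power(p_watts, r_ohms):
    """RMS voltage required to deliver a given power into a given load."""
    return math.sqrt(p_watts * r_ohms)

vrms_for_power(0.001, 32)  # ~0.179 V, i.e. roughly the 180 mV used throughout
```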
Unless otherwise noted, the distortion analysis is restricted to the audio range of approximately 20 Hz - 20 kHz, or in the case of IMD, follows SMPTE guidelines. No weighting was added to any of the dScope results (I'm not sure what RMAA does internally). No animals were harmed in the testing.