Imagine one of these in a Pono player! Wow "dat clarity"
....
#Ahem#
If a news site posted an article stating 'Sony resurrecting the dead and bringing about the apocalypse with Satan as a sidekick', I'd check my calendar to see if it was April 1st.
I'd scratch my head and think to myself 'Sounds legit'.
Nothing, and I mean absolutely nothing, that company does surprises me.
As I said, this isn't about noise disrupting the digital datastream - that's why techniques like line coding and forward error correction are used on the wire. Rather, it's (supposedly) about reducing electronic emissions from the component. And it's not just about shielding; there are multiple ways things like transistor switching noise can find their way into adjacent blocks, for example back onto power rails or the ground plane.
The same kind of thing applies here: as I said, you can't take the analogue side in isolation when looking at a complete playback device - e.g. having a noisy buck converter without enough filtering can cause very audible problems with the output.
Still, the price charged for this sort of device is another discussion, but I think it's fair to say the point of diminishing returns has been left a long way behind. It's not something I'm going to argue one way or the other though, as I have no idea how much real difference exists, or how much people value that difference.
I wouldn't consider it snake oil if there's an improvement, provided it can be detected in a blind test. I know external DACs are quite popular for PCs too, and doing the analogue processing outside of the electrically noisy PC environment helps to cut back on that 'digital noise' I mentioned earlier. Some motherboards are worse than others, but you can often hear it by turning the speaker volume up and the Windows volume down, as then you're effectively reducing the SNR - you'll often hear buzzing when you move your mouse, scroll web pages etc. as the CPU flicks between power states.
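To put rough numbers on that volume trick (a toy sketch with made-up illustrative figures, not measurements of any real motherboard): if the analogue noise floor is fixed, attenuating the signal digitally before a fixed analogue gain stage closes the gap between signal and noise.

```python
import math

def snr_db(signal_rms, noise_rms):
    """Signal-to-noise ratio in decibels."""
    return 20 * math.log10(signal_rms / noise_rms)

# Hypothetical figures: a full-scale 1.0 V RMS signal over a fixed
# 1 mV RMS analogue noise floor picked up inside the PC.
noise_floor = 0.001

full = snr_db(1.0, noise_floor)    # Windows volume at 100%
quiet = snr_db(0.25, noise_floor)  # Windows volume at 25%, speakers turned up to compensate

print(round(full, 1))   # 60.0
print(round(quiet, 1))  # 48.0 - the buzzing sits 12 dB closer to the music
```

Every 6 dB of digital attenuation you make up with the analogue speaker knob brings the noise floor another bit's worth closer to the signal.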
Just to be clear, I'm not mindlessly defending this card, and a lot of my assumptions come with the same huge 'IF' that Saracen posted. My point is, the theory isn't necessarily as stupid as a lot of people are assuming, but I'd need quite some convincing that there is an appreciable difference, and more still that it justifies the price.
I'd be very surprised if the difference could be detected over the general background noise (I think it's safe to say most of us are probably thinking the same thing there) - maybe in very carefully controlled experiments and comparing scope shots you could tell them apart, but that's not the same thing as an audible difference of course.
While I do appreciate strides in the audio world, I feel a bit suspicious about cards that carry a very noticeable premium over similarly specced cards, especially when (as others have pointed out) a DIY mod, which can even be made permanent, would cost only cents more than a regular card. But yes, blind tests are of quite some use (even if they're scorned in some audiophile circles).
Regarding external DACs vs soundcards (which also contain DACs), it should be said that what the IT world currently offers in terms of internal DACs is rather impressive, and one needs to pay quite a lot more for an external unit to actually match the best the internal cards have to offer. Additionally, soundcards have the objective (not subjective) advantage of extra output methods and processing features which, on higher-end hardware, can be bypassed at will when people just want the cleanest output while retaining the hardware's component quality in the chain.
To quote Mr T, pity the fool... who buys such twaddle.
Actually, this is incorrect. Although in general most people won't notice the difference, there is a certain amount of error correction allowed for in the HDMI standards. If two cables are both made to these standards then sure, it doesn't matter which you choose; however, the instruments required to test cables against these standards are expensive, so quite often in the consumer world, when you buy the £5 cables, these standards are ignored, resulting in a greater level of error correction in the display.
It's unlikely anyone would be able to casually tell the difference on a £250 TV with a 1m cable, but (and I'm not saying there isn't a lot of 'snake oil' in the market) there is technically a benefit to high-quality cables. Anyone in broadcast/AV/digital signage can tell you the importance of signal integrity over any significant distance.
Sony really seem to like overpricing memory, don't they? They did the same for both the PSP and Vita, after all, except in those cases the proprietary cards are actually forced on you. You can argue the point for the PSP/Vita since, like their less-portable brethren, handhelds are presumably also sold at a loss initially. Currently though, both PSP Memory Stick Pro Duo and Vita memory cards are £50 for 32GB on Amazon, 5x the price of the equivalent microSD - seems they like 5.
As a musician/producer/sound engineer (not professional), stuff like this interests me when I first hear about it. But I can't see this being very popular. What's the idea, that this memory card causes less interference with the sound device? That's probably only going to make a difference in a mobile setting, and I don't know how noticeable that difference would be. A big problem I see straight away is that headphone listening (which is a sub-par experience anyway, at least due to the lack of stereo field perception) is often done in environments that are not ideal for listening to music - noisy places out and about. So who the hell is going to notice with all that ambient noise... And even if you listen in a quiet room with top-notch headphones, your audio interface probably has plenty of interference protection, or at least enough to stop a little memory card causing trouble.
Again, I'm not an expert on audio, but it's my passion and something I'm always improving my skills in. I'll be looking out for what some more experienced engineers say on this. For now, though, I'm not very intrigued by it. Good luck, Sony.
What do you mean by a 'greater level of error correction' exactly? Because for shorter lengths, unless there's something badly wrong with the cable, the decoder will output the same bitstream as the encoder. The stuff about 'clearer' images, 'fuller' sound, and so on is complete nonsense. If an unrecoverable error occurs there's no graceful degradation; it fails outright. As the data are transmitted serially, this results in obvious sparkles in the image, or sound dropouts. This is more common with longer lengths, which is why active cables exist.
I don't see what part of that would result in most people not noticing; either there's a fairly obvious problem or no problem. It's not analogue transmission with increasing noise, shifting colours, etc. And I've not yet encountered a cheap HDMI cable which has caused any problems over short lengths, either they work or are outright broken. There really isn't a justification for spending silly money on HDMI cables.
The bit about signal integrity is irrelevant - of course you need enough SNR to decode useful information at the receiving end, I don't think anyone is disputing that. But if you're getting that with a cheap cable, what's the problem?
Erm, no.
Stage 1 of error correction is, if you like, detecting the error. Stage 2 is dealing with it. Up to a point, but only up to a point, audio system error correction can recreate the missing data. Up to that point, you are correct - errors are handled, and the datastream out of the decoder is the same as came out of the encoder.
For data files, like computer code, 100% data accuracy might be necessary, but it isn't for audio data. Once error correction for audio gets beyond replacement capabilities, it resorts to what is effectively interpolation .... guessing at a particular sample value(s) based on data either side of it. A significant proportion of the time, our data receivers, known as ears, aren't discerning enough to detect it, and so it goes unnoticed. But it's there. Only when the error rate gets beyond the ability of the decoding algorithm to get away with interpolation does what amounts to a brief (or not so brief) audio muting occur.
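To illustrate what I mean by interpolation (a toy sketch, not any specific decoder's algorithm): when a sample is flagged as unrecoverable, the decoder guesses it from its neighbours.

```python
def conceal(samples, bad_index):
    """Replace an unrecoverable sample with the average of its
    neighbours - linear interpolation, the simplest concealment."""
    left = samples[bad_index - 1]
    right = samples[bad_index + 1]
    samples[bad_index] = (left + right) / 2
    return samples

# The original waveform had 0.4 at index 2; the decoder lost that sample.
received = [0.0, 0.3, None, 0.5, 0.2]
print(conceal(received, 2))  # [0.0, 0.3, 0.4, 0.5, 0.2]
```

In this contrived case the guess happens to land exactly on the original value; with real music, as I say, it more likely won't, and that's the change I'm talking about.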
That interpolation process is a LONG way from uncommon in digital audio, and the result is to .... change .... the audio, because while interpolation might, by chance, hit exactly the right value as the original sampled value, it more likely won't.
It's certainly not implausible to expect trained ears, like a concert violinist's, to detect things in an interpolated violin solo that most people wouldn't - firstly because their ears are what trained muscles are to an Olympic athlete, and secondly because they probably know the musical piece vastly better than most.
Guessing at bit values with interpolation isn't much use for transferring computer code in a digital data stream, but it's commonplace in digital audio data.
I'm quite aware of the difference between error detection and error correction. Interpolation is not error correction.
Read the HDMI specification, and look at actual tests like this: http://www.expertreviews.co.uk/tvs-e...proof/page/0/2
For video, HDMI has no explicit forward error correction or retransmission capability, hence snowing. However audio is transmitted on the same data/aux channel as things like DRM and Ethernet data, and uses BCH coding. It would have to be a heck of a bad signal to start affecting audio, and the video would likely be destroyed by that point.
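To illustrate what FEC actually buys you - this isn't HDMI's BCH code, just a Hamming(7,4) toy example, the simplest code that can correct (not merely detect) a single flipped bit:

```python
def hamming74_encode(d):
    """Encode 4 data bits as 7 bits using 3 parity bits (Hamming(7,4))."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Standard bit ordering: positions 1..7 = p1, p2, d1, p3, d2, d3, d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit; 0 = clean
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1         # flip it back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[3] ^= 1                      # one bit corrupted in transit
print(hamming74_decode(code))     # [1, 0, 1, 1] - recovered exactly, no guessing
```

The point being: within the code's correction capability the output is bit-perfect, with no interpolation involved; concealment only enters the picture once errors exceed that capability.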
Yeah, that's what I'm thinking. At home/indoors you probably won't be driving the headphones with a mobile device, so you won't be using one of these memory cards. And even if you were, your audio interface won't be bothered by it; other electronics, like your computer, will be far more significant in terms of interference. But let's say you were listening indoors via a mobile device playing from one of these memory cards - I guess this is where the difference potentially becomes noticeable. Then again, if you're an 'audiophile', I can't see you likely being in that scenario.
I dunno... I have no doubt that the audio guys at Sony know their stuff far better than I do! Is this just a marketing stunt to try and sell people something that has no significant benefit, or is it legit... I am really struggling to understand who is going to want these things. It seems like a very, very niche market. Though, I will keep an open mind and see what people say
I did read the HDMI spec, or to be more precise, v1.3a spec, because I couldn't find my copies of later specs.
I refer you to the audio section, and in particular to the Error Correction section - section 7.7 in 1.3a - which recommends interpolation. Specifically, the criterion is to prevent loud clicks or pops coming from the speakers; interpolation is a well-accepted 'concealment' method, and is recommended but implementation-dependent .... unlike both video and control signals.
That's the entire section, by the way, not a selective quote: 'Sink after detecting an error is implementation-dependent. However, Sinks should be designed to prevent loud spurious noises from being generated due to errors. Sample repetition and interpolation are well known concealment techniques and are recommended.'
I return to my original point, therefore, which was that interpolation IS capable of muddying audio signals, and IS part of digital audio-stream error correction. It's inherent in both CD and DAT (variations on Reed-Solomon) and in some other implementations.
As for HDMI, well, the above quote is direct from the spec - recommended, but implementation-dependent. I do, on the wider point, absolutely agree about the level of kiddology going on in fancy-priced cables, which is why I use cheap ones - either supplied in the box or, for a couple of them, actually from my local pound shop. On the latter, I have some reservations about whether they're actually manufactured to HDMI standards, as they seem a bit .... ummm .... delicate, subject to (mechanical) failure. Functionally, though, they either work perfectly (as far as I can tell) or not at all.
Fancy-priced cables MIGHT be better-made, but certainly for short runs, IMHO make no difference. Unlike analog cables, where there can be a difference, but even then, it's pretty modest, in my experience.
In practice interpolation is not used routinely, as you say it's to avoid loud noises making it to the output. You'd need a seriously degraded signal to start overwhelming FEC and since the data channel is interleaved with video on the cable, not on a separate wire pair or anything, it's pretty much a given that the video stream would be noticeably impacted by that point. And there's really nothing subtle about video failure on HDMI, no interpolation is used there. Interpolation isn't error correction though, it's error concealment/compensation.
WRT CD audio, there's a ton of redundancy in the bitstream and the spec really allows for a lot of read errors before the audio suffers i.e. isn't bit-perfect post-error-correction. Again, interpolation may be used post-EC to prevent jarring noises but it's not part of the error correction process and should only ever be used if EC fails. It can happen on badly damaged CDs (or broken players), but they're surprisingly tolerant, especially to radial damage. You can get software to scan discs to check for RS errors.
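As an aside, the tolerance to radial damage comes largely from interleaving: a contiguous burst of errors on the disc gets spread out into isolated single errors after deinterleaving, each within the RS decoder's reach. A toy block interleaver (nothing like real CIRC's parameters, just the principle) shows the idea:

```python
def interleave(data, depth):
    """Write the stream into `depth` rows, then read it out down the columns."""
    cols = len(data) // depth
    rows = [data[r * cols:(r + 1) * cols] for r in range(depth)]
    return [rows[r][c] for c in range(cols) for r in range(depth)]

def deinterleave(data, depth):
    """Invert interleave(): put each symbol back in its original slot."""
    cols = len(data) // depth
    out = [None] * len(data)
    for c in range(cols):
        for r in range(depth):
            out[r * cols + c] = data[c * depth + r]
    return out

stream = list(range(16))
tx = interleave(stream, 4)
for i in range(4, 8):        # a 4-symbol burst of damage on the 'disc'
    tx[i] = None
rx = deinterleave(tx, 4)
print([i for i, v in enumerate(rx) if v is None])  # [1, 5, 9, 13]
```

The burst lands as one error per 4-symbol block, so a code that can only fix one error per block still recovers everything losslessly.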
I agree there can be differences in build quality etc among sensibly-priced cables, but I'm referring more to the absurdly priced nonsense. Ironically, in this test it seems the silly-priced ones fail more than the cheap ones, so even the standard-adherence doesn't seem to improve consistently with price. http://www.audioholics.com/audio-vid...esting-results
With longer cables, IIRC active cables can use additional FEC which is transparent to the devices.
Interpolation isn't error correction. It is, however, part of error correction, in digital audio.
In an ideal world, the bitstream never gets degraded enough to need it. Sadly, we don't live in an ideal world. And, of course, it's pretty much a last-resort sort of measure. All sorts of measures are taken to attempt to ensure lossless error correction, from the design of the encoding in the first place, with redundant data, to parity and CRC type stuff, to interleaving, all of which hopefully make it unnecessary to resort to interpolation. But .... it happens, and when it does, it'll degrade, or muddy, the signal. Whether a human can tell the difference depends on all sorts of factors, including the ears of the listener, the expertise of the listener, and of course, how well they know what the music should sound like. Obviously, an expert (like a concert-grade musician) is likely to have much more finely tuned ears than the average person, and perhaps a true audiophile does too.
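By way of illustration, the CRC-type stuff I mention only ever detects - it tells you a frame is bad but not how to repair it. A minimal sketch using Python's stdlib CRC-32, purely as an example rather than any audio format's actual check:

```python
import zlib

frame = b"audio sample block"
checksum = zlib.crc32(frame)          # sender computes and attaches this

# Receiver recomputes and compares; corruption is caught reliably,
# but nothing here says what the correct bytes were.
corrupted = b"audio sample blocc"
print(zlib.crc32(frame) == checksum)      # True  - clean frame passes
print(zlib.crc32(corrupted) == checksum)  # False - error detected; now correct, conceal or mute
```

Once detection fires, it's the redundant data and interleaving that do the lossless repair, and only if those are exhausted does concealment come into play.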
Then, of course, so many other aspects of a system designed for sound reproduction introduce .... erm .... colouration, though that's a very polite term for what many audio systems do to mangle a piece of music. I guess a true audiophile does everything he/she can, at least within their budget, to ensure that they hear more of the music and less of their sound system. In terms of poncy analogue cables, IF all other factors are good enough (from the listener's ears, to amps, speakers, etc.) then PERHAPS cable quality becomes an issue. But personally, my suspicion, as evidenced by my own ears in years past, is that better cables are worth it only up to a point - and that point, for me at least, comes fairly early. Speaker cable at a couple of quid a metre sounds better (in MY blind listening tests) than bell wire. Cable at £100/m sounds no different to me than cable at £10/m. In blind tests, I could reliably choose between the bell wire and £2-£5/m, but not between £10 and £100/m.
There is a LOT, in my opinion, of kiddology in "premium" cables, at least in analogue. It's not entirely bunkum, and the only limitation I'm 100% sure of is my own ears, but seeing as I only even listen via those ears, the rest is irrelevant to me.
As for HDMI, not all applications for HDMI cables use video. I have some audio components interlinked with HDMI. Why? It's what the components use. For instance, switching on a good AV amp. So, I agree with the assertion that if audio is getting "muddied", video is probably useless. That isn't always a concern, though.
I don't think we disagree at all on "silly priced" cables. I caveat that, however, with the point that my ears, at least, are age-degenerated. They are not, by a long way, what they were. I used to be able to hear frequencies at the extreme edge of human capability .... i.e. up to about 22.6kHz. That was the result of tests in the audio engineering department at my old uni. It was, however, 30 years ago, and while I haven't done similar tests since, these days I'd guess more like 16kHz. Maybe less. Which is why I certainly wouldn't class myself as an audiophile. Cable prices don't have to be far along the silly spectrum these days before it's immaterial to me. And even when I might have classed myself that way, I was aware that cables I "liked" weren't necessarily objectively better. It's kinda like whether this dark chocolate or that dark chocolate is "better" - it's not just about the percentage or even quality of cocoa solids, but about my taste buds today.
Same, IMHO, with audio.
Sit me down with a blind system comparison with £100 cables and £10 cables and the ONLY things I care about are :-
- which do I LIKE best, and
- if the dearer ones, do I like it enough more to pay the cost.
Besides, I guess like all things, if an individual feels better about having "premium" cables, who am I to complain? I spent a lot of money on a car, and some of my friends, who could also afford it, thought I was nuts to do so. But it was what I wanted - and you could at least detect an objective difference, say in performance. It wasn't just about the paint colour or wheel design ... as it probably is with many poncy cables.