Hey prof, The increase in speed in your article isn't due to better glass. It's due to smarter kit at each end.
Quote from article:
The system transmits information at 40 gigabits per second, three milliseconds faster than any other system in the region ... The cable was also laid as straight as possible to speed communications.
Smarter kit requires straight cables? And if it was all down to smarter kit, why the need to lay new cable in the first place - surely any old cable would do?
I can't comment on your SLA figures from Verizon (I'm sure they're right), but I do know a thing or two about TCP/IP (the protocols actually used to transmit data between computers), and I know that (a) it's relatively inefficient due to the amount of error control built into it, and (b) it was picked precisely because, when you're sending data over the vast distances foreseen at the birth of the original internet, that error control was absolutely going to be required, since even one error could not be tolerated.
If this weren't necessary, we'd use a far more efficient protocol with no error correction, which would vastly speed up communications. Some programs do just that, using UDP (normally for streaming video across local area networks, for instance, since a bit of break-up in the picture due to faulty data generally isn't much of an issue, whereas keeping the picture up to date is far more critical).
The fact is, when you send data digitally, you will get errors. The idea of error-free transmission simply because something is digital is (at the moment) a fantasy: a link can only be made error-free through error control, and retransmission-based schemes like TCP's require non-real-time transmission of the data.
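To illustrate the trade-off being described (error control buys correctness at the cost of extra transmissions and waiting), here's a toy sketch. Everything in it is invented for illustration: the `lossy_channel` and `send_reliably` names, the loss rate, and the simple stop-and-wait scheme standing in for TCP's rather more sophisticated machinery.

```python
import random

def lossy_channel(packet: bytes, loss_rate: float, rng: random.Random):
    """Drop the packet with probability loss_rate, else deliver it intact."""
    return None if rng.random() < loss_rate else packet

def send_reliably(data: bytes, loss_rate: float, seed: int = 0):
    """Toy stop-and-wait: keep resending until the packet gets through.
    Returns (delivered data, number of transmissions needed)."""
    rng = random.Random(seed)
    attempts = 0
    while True:
        attempts += 1
        received = lossy_channel(data, loss_rate, rng)
        if received is not None:   # receiver acknowledges; sender can stop
            return received, attempts

data, attempts = send_reliably(b"one error cannot be tolerated", loss_rate=0.5)
print(data, attempts)  # always the exact bytes sent; attempts depends on loss
```

The data always arrives bit-perfect, but only because the sender waits and retries: exactly the overhead (and the non-real-time behaviour) being described above.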
The amount of BS on this thread is incredible.
I think I must be speaking Mandarin.
Straighter cables mean less distance, as the crow flies; less distance means faster transmission. Even light takes a while to get there.
If any enhancements around bandwidth have been made, it will be to the material's refractive index, to allow it to reliably transmit extremes of the light spectrum in a multimode fibre transmission system. It is still entirely dependent on the kit to send data at these frequencies. The medium will only transmit what has been passed to it, and certain media will not be able to transmit these frequencies, but that is beside the point: the kit at either end has to be built to send and receive these extreme frequencies.
TCP/IP is a group of different protocols often lumped together; it's not one protocol.
TCP is the layer-4 Transmission Control Protocol, used to provide retransmits when necessary. It's for data that needs to arrive fully reliably and isn't speed dependent: pure data transfer. I.e. if I request a page of text, it needs to be exactly the text that was sent.
IP is the layer-3 Internet Protocol, which is used to route packets across a network, i.e. to tell them how to get to their destination.
UDP also uses IP to get there, but it has no transmission control: UDP will send, send, send, and does not need confirmation to keep sending stuff. Voice and video tend to use it, as they do not have time to wait for retransmits like TCP. Unlike TCP, exact reproduction of the data sent is not required by the receiver; voice and video are the prime example, since they do not need 100% reproduction of the data the way a document does.
My point is that the cable involved plays no part in any of the above; it is solely down to the source and destination devices. A short length of copper or optical cable has no effect on any of this. Digital cables are irrelevant. That's my point.
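A concrete way to see the difference in overhead: both TCP and UDP travel inside IP packets, and it's the layer-4 header where they differ. The field layout below is from the UDP specification; the `udp_header` helper itself is just an illustrative sketch of mine.

```python
import struct

# A UDP header is just 8 bytes (source port, destination port, length,
# checksum) with no sequence numbers or acknowledgements, whereas a TCP
# header is at least 20 bytes, most of which exists to support ordering
# and retransmission.

def udp_header(src_port: int, dst_port: int, payload: bytes) -> bytes:
    length = 8 + len(payload)   # header plus payload, in bytes
    checksum = 0                # optional over IPv4; left zero in this sketch
    return struct.pack("!HHHH", src_port, dst_port, length, checksum)

hdr = udp_header(5004, 5004, b"video frame")
print(len(hdr))  # 8 bytes of transport overhead, versus >= 20 for TCP
```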
If you take 6,000 miles of cable and go through 100+ interconnects and 50+ devices, you will get errors, at a tiny rate. You will not get errors going between a telly and a Blu-ray player on anything of basic quality or better.
To suggest that these digital cables carry certain "ones" (blacks, whites, high frequencies or low frequencies) differently, when they are all represented by the same number "one", is farcical.
Hope you're not referring to me!
Nope, you're not; we're basically saying the same thing, apart from your very first argument. The claim that because data transmitted thousands of miles under the sea gets to its destination perfectly, audio data will also do the same, doesn't add up. We've both agreed that the protocols used are different: one guarantees error-free transmission (hence why your bank transactions work), the other doesn't. I'm not commenting on the effect errors may have on audio data; I just think it's unhelpful to perpetuate this myth that digital data transmission is infallible. It's not.
If the old cable was unable to transmit these frequencies, then the new cable has made a difference. It's not simply "smarter kit". If it were, they wouldn't have spent god knows how much money laying a new cable. This really has nothing to do with the rest of the discussion though, just a small aside I thought I'd raise to show that an optical cable isn't just "an optical cable" - there are differences between them which affect the transfer of data, as you've indicated above.
Yup, again, you've basically just said what I said, but in more detail (I didn't see the need to go into that much detail on a hi-fi forum!). Hopefully you see my point though? There's an awful lot of overhead going into this to ensure error free transmission. If there were no errors when using digital transmission, then this overhead wouldn't be necessary.
Have I suggested this? Tbh I'm pretty nonchalant on the whole digital audio cable debate now (make that any cable debate). Whether you can see differences or can't, it's pretty irrelevant to my world, so I'm not bothered; some people find it incredibly important to prove it one way or the other, and good luck to them. However, again, all I was doing was trying to stop the prevalence of this "I don't get a better quality Word document when I copy it to another computer over my fancy USB cable" argument, which is completely irrelevant to this discussion - and I think the average punter would have interpreted your post as just that.
Anyway, I've just finished work after a 16 hour day, so am calling it a day there.
Smarter kit requires straight cables? And if it was all down to smarter kit, why the need to lay new cable in the first place - surely any old cable would do
Oh, and at light speed one millisecond equals around 300 km in distance (light in optical fibre is slower, covering roughly 200 km per millisecond), so a route 900 km shorter could account for this sort of difference without any enhancements in technology. I have no knowledge of the old route or the new, straighter one.
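For anyone who wants to check the arithmetic above: the constants are the usual textbook figures (about 300,000 km/s for light in a vacuum, roughly two thirds of that in optical fibre), and the function name is my own.

```python
# Back-of-envelope latency check, ignoring equipment delays.
C_VACUUM_KM_PER_MS = 300.0  # km covered per millisecond in a vacuum
C_FIBRE_KM_PER_MS = 200.0   # roughly 2/3 of c in optical fibre

def one_way_latency_ms(distance_km: float, speed_km_per_ms: float) -> float:
    """Time for one trip down the cable."""
    return distance_km / speed_km_per_ms

# Shortening a route by 900 km saves:
print(one_way_latency_ms(900, C_VACUUM_KM_PER_MS))  # 3.0 ms (vacuum figure)
print(one_way_latency_ms(900, C_FIBRE_KM_PER_MS))   # 4.5 ms (fibre figure)
```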
The point I am making is that the medium involved, glass or metal, can have no effect on the timing or quality of the data whatsoever unless it's made of seriously flawed materials. If the cables were of lower quality, they would not show lesser blacks or a poorer colour spectrum in the case of TV; in the case of audio, they would not show poorer bass or a harsh top end. Digital cables are not capable of defining how a one or a zero relates to a high note or a low note, a black or a white. It's a pulse of light or electricity. This is pure snake oil. How can one HDMI cable carry blacks better than another? How can an optical cable carry high frequencies better than low ones? How would a piece of glass or metal transmit "ones" representing blacks better than "ones" representing red or blue? Or "ones" transmitting high notes as opposed to lower ones? It's ridiculous that these comparisons come into digital cable reviews.
I am not decrying good quality kit. It’s how well the source data is transmitted or how well the receiving device accepts that data that’s the key point. Half decent digital mediums will all carry ones and zeros at a rate where errors are insignificant, even to an audiophile.
+1, but I guess we won't see any mags write anything along these lines as it appears the majority of readers don't want to believe this.
Prof and Tonestar,
Just to clear up a few misunderstandings:
1. Packets can and do arrive in a different order to the one they were sent in. They don't overtake each other on the same wire/fibre, but they can follow different routes through the network. TCP/IP does not assume a point-to-point connection.
2. Don't assume SPDIF is a purely digital link. It carries digital sample data and what is effectively an analogue clock. I have never seen an SPDIF link so poor that it introduces bit errors in the digital data, but there is plenty of scope for screwing up the clock.
3. The frequency response of the SPDIF link (transmitter / cable / receiver) defines the pulse shape of the clock at the DAC. Any clock recovery process can be subject to jitter, and a poor pulse shape makes jitter worse.
In the case of SPDIF, a coding technique, biphase-mark encoding (a variant of Manchester coding), is used which can introduce jitter correlated to the programme material into the recovered clock if the frequency response of the SPDIF link is poor. Correlated jitter appears to be more detectable than random jitter.
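For the curious, biphase-mark coding is simple to sketch: every bit cell starts with a level transition (that's what the receiver recovers the clock from), and a '1' adds a second transition mid-cell. A link with poor frequency response rounds off those edges, which shifts the detected transition times - that's the jitter being discussed. The encoder below is an illustrative sketch of mine, not the full SPDIF framing (which adds preambles and subframe structure).

```python
def biphase_mark_encode(bits, initial_level=0):
    """Encode a bit sequence with biphase-mark code.
    Output is two half-cell levels per input bit: every cell begins with a
    transition (clock), and a '1' adds an extra mid-cell transition (data)."""
    level = initial_level
    out = []
    for bit in bits:
        level ^= 1            # transition at the start of every cell
        out.append(level)
        if bit:
            level ^= 1        # extra mid-cell transition encodes a '1'
        out.append(level)
    return out

print(biphase_mark_encode([1, 0, 1]))  # [1, 0, 1, 1, 0, 1]
```

Note that the output never sits at the same level for more than two half-cells, regardless of the data, which is what lets the receiver extract a clock from the signal itself.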
Anyway, in summary: data cables used for async protocols have no effect on the digital audio data sent along them (absent a cable so shoddy that errors are introduced). Synchronous protocols, and in particular SPDIF, are subject to the quality of the link if the receiver is sensitive to jitter.
So, a CAT 5 cable carrying data to your streamer is going to have no effect, the SPDIF cable from your streamer to your DAC just might make a difference if your DAC is jitter sensitive.
Just to go back the original point of the thread, having designed audio equipment for a living, the only reliable test is blind.
But that was soon remedied when your good self went on a little WHF shopping trip for a new BDP, when, if I recall correctly (and my apologies in advance if I'm wrong), you demoed three players in three different shops on three different TVs before declaring that the PQ of the Marantz was much better. I know you're a very clever chap, but surely your memory isn't that good.
At least he demoed them and made up his own mind, rather than just buying something somebody else told him was good.
I'm also fairly sure the prof's trip was before the BQ in question, but hey...
Damn you and your argument-countering facts...
Cambridge Audio StreamMagic 6 | 751BD | 651A | Diamond 9.1 | Minx Xi | Sonos Play:3
Moderator. mail: john.duncan.whf at the mail of g dot com
Firstly, there is no connection between the amount of time spent doing something and the ability to do it. For example, I could spend all year training at a running track and at the end of it it would still take me ages to do 100m, if I didn't have a heart attack first. If your "experts" are so great, they would be willing to undertake double-blind testing of their opinions, and we know they aren't, don't we!
Secondly, of course they are interested in preserving the hype! If the true situation is that there is no material difference between most of the kit, then nobody would buy the magazine, would they? They have an obvious and self-evident reason for making us all believe we should spend lots of money on hi-fi kit.
Thirdly, they are just journalists! A good parallel is the Top Gear team. Nobody with a brain cell would buy a car on the basis of Top Gear's recommendations; they reject cars on the basis of whether the boot is big enough to hold golf clubs. The only difference is that the Top Gear guys are amusing.
Given your continued attacks on the magazine, based on what you clearly feel you know (but which is in fact completely erroneous), there's little point in continuing this discussion.
And please stop dismissing everything with which you don't agree as 'BS': the repeated use of the term is dull, unimaginative and rather sad.
Audio Editor, Gramophone
You haven't read What Car? for a while, have you?
Funnily enough, I read it every six or so years when it's new-car time and use it to help draw up my shortlist. Their reviews are fine; I know which bits to discount when making my choice. I cannot imagine why anyone would read it for pleasure, though. Like all car magazines, the journalists tend to be a bit Clarkson in their politics, which turns me off (although I'm sure it's what the readers want).
And just to clear one last thing up, I don't believe anyone has said this. When I said data (or packets) can arrive out of order, I was referring to the fact that missing or corrupt packets could be resent when using TCP/IP - this doesn't then require every packet originally sent after it to also be resent - it can completely cope with the fact this packet has arrived after them. Clearly this wouldn't work when playing audio in real-time, so a different protocol is used.
I wouldn't normally care to post such a correction, but since this is my profession, I don't want people thinking I believe packets race each other down a wire trying to overtake each other Wacky Races style (though clearly this would be fantastic and a much more interesting subject if true!).
Curses, he was right. I guess the confusion on my part comes from the fact that we actually did that Savvy Shopper article before the BQ article was released (clearly we had to have done, in order to get it in the magazine for the following issue). The reason I remember this is I remember being surprised at the result (given my own experiences just a few weeks before!).
when if I recall correctly (and my apologies in advance if I'm wrong), you demoed three players in three different shops on three different TVs before declaring that the PQ of the Marantz was much better. I know you're a very clever chap, but surely your memory isn't that good
Yes, but I wasn't attempting to compare each player in each shop with each other - that would be ridiculous clearly (since as you say, each one was using a different TV for a start). I always had my existing Blu-ray player as a reference in each shop. So each new player was compared with my existing player side by side. In this way, I could make comparisons on the new player with my old player and then decide if the improvement was worth it from this.
The other part of this is, it was my decision to go looking at Blu-ray players, not WHFs. They didn't ring me up, ask me to look at a few players and in the heat of the moment I got carried away and purchased the Marantz! I'd wanted to upgrade for a while - applied for the Savvy Shopper and I was chosen mostly because I was local so neither of us would have far to travel. It was only after the initial phone call to see if I was interested in doing it that I made it clear I wanted to look at Blu-ray players (and this was all before that BQ article came out). WHF set up the auditions and documented the day, but it was my decision and mine alone (and yes, I did pay for the player!).
But I'm not sure why I'm writing this, since it's not going to change Max's mind. And also the fact that idc has really said everything that needs to be said on this subject from my point of view:
To ask WHF to do a blind test is a bit like asking Toaster Magazine to do one. You would get a result that would confuse and perplex, and put people off buying the magazine. That is because we buy all our hi-fi kit and toasters sighted, and listen to music and make toast sighted. So is it not the case that sighted reviews are the most accurate for the real-world audiophile and toast aficionado?
I think I went to school with him. I mean we called him Bi Manc Mark but it must be the same bloke...
No signature worth mentioning...
Which is what I was trying to say, just not very well.
I guess the key issue then is how sensitive the receiving device is to jitter... and whether we can truly hear those differences.
What I do know is there is a noticeable difference between the optical connection and the USB connection with my system, and it ain't no fancy-pants DAC neither. Explain that (without resorting to the "it's all in my head" argument).
JRiver MC17 -> Cambridge Audio DACmagic+ -> Roksan Caspian M2 -> ProAc D18
© 2014 Haymarket Publishing