Why would one digital output have different quality than another digital output?
I'm still very confused by the claim that the digital output from one device is different than the digital output from another device, which in theory doesn't make sense. I mean, the binary output should be the same regardless of the device, so why isn't it? Or is it really different?
I mean, the receiver side needs to receive the full signal, because I'm sure that, just like in any network device, there are checks that make sure that is the case. If something is missing, the sender is asked to resend the data. Also, on the receiver side there is usually a buffer, so there shouldn't be a situation where you don't have any data at some point.
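To be clear, the mental model I'm assuming is something like the sketch below: checksums detect corruption, bad packets get retransmitted, and good data accumulates in a buffer, so the receiver ends up with a bit-perfect copy. (All names here are hypothetical; this is a toy simulation of that model, not any real audio transport protocol.)

```python
import zlib
import random

def make_packet(seq, payload):
    # Attach a CRC32 checksum so the receiver can detect corruption.
    return {"seq": seq, "payload": payload, "crc": zlib.crc32(payload)}

def checksum_ok(packet):
    # Receiver recomputes the checksum; a mismatch means the data
    # was damaged somewhere in transit.
    return zlib.crc32(packet["payload"]) == packet["crc"]

def transfer(chunks, corrupt_rate=0.3, rng=None):
    # Simulate a sender/receiver pair with retransmission and a buffer.
    rng = rng or random.Random(0)
    buffer = []
    for seq, chunk in enumerate(chunks):
        while True:
            pkt = make_packet(seq, chunk)
            # Randomly corrupt some packets "on the wire".
            if rng.random() < corrupt_rate:
                pkt["payload"] = b"\x00" * len(chunk)
            if checksum_ok(pkt):
                buffer.append(pkt["payload"])  # good packet: store it
                break
            # Bad checksum: ask the sender to retransmit (loop again).
    return b"".join(buffer)

data = [b"chunk-%d" % i for i in range(5)]
# Despite random corruption, retransmission yields a bit-perfect copy.
assert transfer(data) == b"".join(data)
```

That's why I'd expect the bits arriving at the other end to be identical no matter which device sent them.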
So please, if anyone knows why one digital output is not the same as another digital output, let me know. Thanks.