Well yes, except that outside of software all 1's and 0's are represented by voltages, currents, electrical charges, magnetic fields, pits or lands (actually those only signal transitions), darker or brighter areas and what not.
Some are susceptible to EMI/RFI, others are not; some are read with inferior magnetic or optical readers with motors and rely on (electro)mechanical components with their limitations.
More often than not these operate near the edge of where they still work reliably enough.
For 1's and 0's the physical representation doesn't really matter that much until the signal is distorted so much that a 1 or 0 gets misinterpreted...
The timing part of data transfer lies in the clock recovery/synchronisation, which is determined by the (averaged) timing of rising and falling edges.
The clock is embedded in the data, so it is a combination of both signals.
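Just to make that "averaged timing of edges" idea concrete, here's a toy sketch (my own illustration, nothing like real USB silicon): estimate the bit period from the average spacing between level transitions, then sample each bit in the middle of its recovered bit cell.

```python
# Toy illustration of edge-averaged clock recovery (NOT a real USB CDR).
# A clean, oversampled NRZ waveform goes in; the bit period is recovered
# from the average spacing between level transitions, then each bit is
# sampled in the middle of its recovered bit cell.

def recover_bits(samples):
    # Positions where the level changes (rising or falling edges).
    edges = [i for i in range(1, len(samples)) if samples[i] != samples[i - 1]]
    if len(edges) < 2:
        raise ValueError("need at least two edges to recover a clock")
    # Consecutive edges are an integer number of bit periods apart,
    # so take the smallest gap as a first estimate and refine it by
    # averaging each gap divided by its nearest multiple.
    gaps = [b - a for a, b in zip(edges, edges[1:])]
    period = min(gaps)
    refined = [g / round(g / period) for g in gaps]
    period = sum(refined) / len(refined)
    # Sample mid-cell from the first edge onward. Note: any bits before
    # the first transition are lost -- a CDR needs an edge to lock onto.
    bits = []
    t = edges[0] + period / 2
    while t < len(samples):
        bits.append(samples[int(t)])
        t += period
    return bits

# 8x oversampled NRZ waveform for the bits 1,0,1,1,0,0,1
wave = []
for bit in [1, 0, 1, 1, 0, 0, 1]:
    wave += [bit] * 8
print(recover_bits(wave))  # -> [0, 1, 1, 0, 0, 1] (leading bit precedes the first edge)
```

Real receivers track the edges continuously with a PLL instead of averaging once, which is exactly where jitter on the edges enters the picture.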
It's the circuitry behind it that ultimately decides, at a certain point in time dictated by a recovered clock, which electrical levels (with a wide margin) represent a '1' and a '0', and how these are clocked through many components that convert bits to bytes and do lots of 'math' on the 1's and 0's (bytes usually) before actually arriving at the DA conversion stage.
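The "wide margin" part can be sketched in a few lines. The threshold voltages below are illustrative (roughly TTL-like), not USB spec values:

```python
# Toy level slicer: illustrates the wide margin between what reads as
# a '1' and what reads as a '0'. Thresholds are made up for the example.
V_HIGH = 2.0   # anything at or above this reads as '1'
V_LOW = 0.8    # anything at or below this reads as '0'

def slice_level(volts):
    if volts >= V_HIGH:
        return 1
    if volts <= V_LOW:
        return 0
    return None  # indeterminate region; a real receiver uses hysteresis

noisy = [3.1, 0.2, 2.4, 0.6, 3.3]   # distorted, but well within the margins
print([slice_level(v) for v in noisy])  # -> [1, 0, 1, 0, 1]
```

The point being: the waveform can look pretty ugly on a scope and still slice to exactly the intended bits.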
It takes a lot of clock pulses (in time) before the incoming USB signal has been decoded and processed and finally changes the voltage coming out of the DAC chip.
Things like impedance matching, signal levels, the bandwidth of a cable, induced 'garbage' from outside or inside (power supply) etc. do of course have an influence on the signal amplitude, but only when it is so bad that bits are not interpreted as the intended bits, and cannot be corrected, does something need to be done.
The video merely showed a USB 3.0 signal on an expensive scope and that the scope could analyse the signals as well which is neat.
It should, for that price!
It said NOTHING about USB 2.0, nor about its jitter, nor how that jitter influences the recovered clock, nor how that recovered clock is used further on.
It also said nothing about the bandwidth differences of USB 2.0 cables (which differ substantially from 3.0 cables).
Nope... indeed just a guy faffing around with an expensive toy.
Here are some plots of the influence of common-mode filters and ferrite beads on a USB 2.0 signal, which is of more relevance in the case of the jitterbug.
(image shamelessly copied from:
http://www.tdk.co.jp/indiv/nec/usb20/p5/usb20p5.htm )
I would like to see plots like this taken at USB signal level before and after miracle boxes, and also similar shots of the (recovered) clocks going into the actual DAC chips.
That is the only point where the (recovered) clock signal integrity matters.
Also before and after a miracle box has been applied.
The problem I see here is that to measure this with a revealing accuracy you need very expensive equipment.
Chances are the ones owning this type of equipment couldn't care less about these signals.
The ones that do have these plots are likely not to publish them.
Not for 'proprietary' reasons though ....