
USB protocol, couple of questions on low level stuff...

Discussion in 'Electronic Design' started by Brane2, May 9, 2013.

  1. Brane2

    Brane2 Guest

    I am plowing my way through the available USB 1/2 literature, but I still have a few holes regarding the actual implementation.

    1. If everything on USB has to be carefully timed, how does a USB hub cope with the consequences of bit stuffing as part of NRZI coding?

    I mean, when doing an IN transaction, the USB hub cannot know in advance how many bits a packet will actually occupy on the wire, since that depends on how many bits the endpoint had to stuff in.

    And this means the hub cannot know how long the data transfer will take. Does the hub have to assume the worst case when planning transfers for a frame/microframe, or is some other mechanism employed?
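    For reference, the stuffing rule itself is simple: after six consecutive ones, the transmitter inserts a zero (to guarantee a transition for the NRZI receiver's clock recovery), which is exactly why the on-wire length depends on the data pattern. A minimal sketch in Python (function name and representation are my own, not from any spec code):

    ```python
    def bit_stuff(bits):
        """USB bit stuffing: insert a 0 after every run of six consecutive 1s.

        `bits` is a list of 0/1 ints as handed to the NRZI encoder; the
        returned list is what actually goes on the wire (pre-NRZI).
        """
        out = []
        ones = 0
        for b in bits:
            out.append(b)
            if b == 1:
                ones += 1
                if ones == 6:
                    out.append(0)  # stuffed bit forces a transition in NRZI
                    ones = 0
            else:
                ones = 0
        return out

    # All-ones data grows by one bit per six: 12 bits in, 14 bits out.
    print(len(bit_stuff([1] * 12)))  # 14
    ```

    Data with no long runs of ones passes through unchanged, so two packets of the same payload size can take different amounts of bus time.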

    2. Is there any way to make a transfer of unknown length (up to a maximal size), or does each transfer length have to be preset in the endpoint's config before the transfer?
     
  2. Brane2

    Brane2 Guest

    Errm, forget it. I found my answers.

    The hub and the driver have to take care of it by catering for the worst case, which means there can be significant dead time during communications.
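    The worst case is easy to bound: all-ones data picks up one stuffed bit for every six data bits. A rough sketch of the on-wire budget a scheduler would have to assume (my own illustration, ignoring sync, PID and CRC overhead):

    ```python
    def worst_case_wire_bits(data_bits):
        """Upper bound on on-wire bits after USB bit stuffing:
        all-ones data adds one stuffed bit per six data bits."""
        return data_bits + data_bits // 6

    # A 64-byte full-speed payload: 512 data bits, up to 597 on the wire.
    print(worst_case_wire_bits(512))  # 597
    ```

    So roughly 16–17 % of the reserved slot can go unused whenever the actual data needs little or no stuffing.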

    Why it was designed this way I'll never know. It was supposed to be a relatively simple interface, but it ended up with more complications than PCIe. WTF?

    Had I designed it, I would never have wasted one wire pair just on power. I'd use both pairs for power and couple the signal on top of that as AC. And I'd use at least 12 V for power, if not much higher, say 24 V or 48 V.

    And I'd drive the pairs bidirectionally, so that each receiver would have to subtract its own transmitter's signal to recover Rx. And I'd drop NRZI in favour of 8b/10b or something like that.

    This, as it stands now (as of v2.0), looks like a piece of crap that caught cancer.
     
  3. josephkk

    josephkk Guest

    Well, for starters, USB 1.0 foolishly incorporated many bad properties from
    HP-IL and ADB, from which it was partially derived. For backwards
    compatibility, both USB 2.0 and 2.1 retained them. USB 3.0 got rid of only
    the worst, creating a legacy compatibility issue where some USB 1.0
    equipment is not interoperable.
    Gosh, you seem a bit new to evolving standards.

    ?-)
     