Freeze, lastholes: USB-C and Thunderbolt are the ultimate physical ports

USB-C and Thunderbolt are the last mainstream connections devices will need to the outside world, according to analyst firm ABI Research. The company's new Device Connectivity Report predicts that by the year 2020 “Almost half of the smartphones and 93% of laptops will include USB Type-C connectivity.” In coming years we'll …

  1. Paul 129
    Coat

    Gotta love the gadget market :-)

    Our system is so good you'll never need anything else.

    Until the use cases to the contrary appear,

    and some bleepard shows off a new system that works that little bit better,

    and we all have to show that we're building it into our new products,

    and it finally gets ratified as a standard,

    and then we will be saying it again:

    Our system is so good you'll never need anything else.

  2. CheesyTheClown

    SCSI anyone?

    Wireless is nifty, but wired is and always will be better. As for connector types, there will be plenty of upgrades to come. I just hope they're not smaller.

    1. Known Hero

      Re: SCSI anyone?

      "but wired is and always will be better."

      That is true, but then there is using wired just for wired's sake; there are plenty of scenarios where there just isn't any point in using wires over wireless. Personally I can't stand wireless internet, but then again it's just not fast enough for me.

      1. jaywin

        Re: SCSI anyone?

        > there is using wired just for wired's sake; there are plenty of scenarios where there just isn't any point in using wires over wireless

        Go on...

        We can only get a finite amount of data over wireless; wired can carry effectively unlimited data, because every cable is its own medium. And as more and more people try to push larger & faster amounts of data over wireless, it's going to stop being a viable option sooner-or-later. Maybe we should concentrate on making the things that benefit from being wireless (mobile internet / mice / headphones) work better, and keep the things that don't (desktop monitors / printers / desk phones) cabled, so we can take advantage of the better wireless?

        1. TRT Silver badge

          Re: SCSI anyone?

          And... and and and... aren't these just copper standards? I seem to recall an optical fibre version of Thunderbolt... or am I just imagining it?

          1. Dave 126 Silver badge

            Re: SCSI anyone?

            You're not imagining it. Intel developed the optical version of Thunderbolt - then called Light Peak - first, before switching to copper on cost grounds. Apple contributed the Thunderbolt name, and it was mainly Apple who used it - though Sony, bless 'em, made a VAIO laptop with a USB-A Thunderbolt port for driving an external GPU.

            This was some years ago now (indeed, VAIOs were still Sony), but it is only now that the idea of external GPUs is gaining traction amongst the gaming crowd. And Apple's next cinema display is rumoured to have its own GPU, because existing MacBook Pros don't have a connection capable of shunting 5K video.

            As always, I'll let the gamers and Apple users pay the first-adopter's premium and iron out the bugs, and look forward to it being cheap and reliable in a year or two.

            1. Anonymous Coward
              Anonymous Coward

              Re: SCSI anyone?

              "Intel developed the optical version of Thunderbolt - then called Light Peak - first, before switching to copper on cost grounds."

              Actually, the switch to copper was made mainly so the cable could provide power to peripherals. Optical Thunderbolt cables are still available; they just require self-powered devices. Fibre Thunderbolt is generally only used for longer distances or installations with interference problems, though the price per metre tends to be lower than for the copper cables.

        2. Boothy

          Re: SCSI anyone?

          @jaywin

          Quote: "it's going to stop being a viable option sooner-or-later."

          Doubtful, imho.

          Bear in mind that WiGig uses the 60 GHz band for the high-speed part (video to TVs/monitors etc), which won't penetrate walls and is very short range, effectively limiting you to gadgets/devices in the same immediate area, or at least the same room. So your finite bandwidth is only being shared by a few devices in one room (or one part of a room if it's a large area).
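
          For a sense of scale, here's a rough free-space path-loss sketch (Python; textbook FSPL formula only - real 60 GHz links are further hurt by oxygen absorption and walls, so these numbers are illustrative, not measurements):

              # Free-space path loss: FSPL(dB) = 20*log10(d_m) + 20*log10(f_hz) - 147.55
              import math

              def fspl_db(distance_m, freq_hz):
                  return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

              for f_ghz in (2.4, 5.0, 60.0):
                  print(f"{f_ghz:>4.1f} GHz at 10 m: {fspl_db(10, f_ghz * 1e9):5.1f} dB")
              # ~60 dB at 2.4 GHz, ~66 dB at 5 GHz, ~88 dB at 60 GHz: the 60 GHz band
              # loses ~28 dB more over the same distance, before any walls get involved.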

          You still have your normal 2.4 and 5 GHz for house/building coverage itself, which works alongside the 60 GHz.

          I would suspect for most consumer gear (TVs, Blu-ray players, set-top boxes etc), and for normal business use, even in an open-plan office, this bandwidth issue is likely to be a non-issue.

          Personally, I doubt my desktop at home will ever stop using hardwired network and monitors (latency etc). But for my TV, satellite box etc. I welcome the day I can get rid of the huge rat's nest of network, power, HDMI, DisplayPort and optical audio cables I have crammed behind the TV cabinet!

          1. jaywin

            Re: SCSI anyone?

            > Personally, I doubt my desktop at home will ever stop using hardwired network and monitors (latency etc). But for my TV, satellite box etc. I welcome the day I can get rid of the huge rat's nest of network, power, HDMI, DisplayPort and optical audio cables I have crammed behind the TV cabinet!

            Personally, I'd rather see something like the USB-C being used to connect everything together. One power lead to your TV, then USB-C to the satellite box, games console, thingyme-widget and so on.

            You can already get PoE-powered large screens (using HDBaseT at the moment), so the efficiencies are getting there; you could just run a single Cat5 from your router to the TV, then USB-C to everything else, and it's all connected, powered, and talking to the internet. Seems much more user-friendly and less complicated than trying to configure / pair / restrain a random bunch of devices over wireless.

        3. martinusher Silver badge

          Re: SCSI anyone?

          >And as more and more people try to push larger & faster amounts of data over wireless, it's going to stop being a viable option sooner-or-later.

          We've already arrived at this point. I fly model sailplanes, and as with other model systems the radio links have migrated to 2.4GHz over the last decade because it's convenient and cheap. Unfortunately it's becoming increasingly unreliable. The result has been the loss of some quite expensive planes due to radio lockup -- we fly at a school site with housing nearby, and as people install high-power / high-throughput routers it's gradually blotting out the entire band in the vicinity, something that is merely inconvenient for data users but potentially fatal to model aircraft. (It's no good getting the link back after a crash!)

          I've been a bit of a holdout, staying with the older 72MHz equipment, not because I don't trust new-fangled gadgets but because, through my job, I'm acutely aware of what's going on in the ISM (WiFi) band and how it's likely to affect data links. Most people still don't have a very good idea of how radio works -- we've all been brought up with the notion of 'frequency' and 'tuning' and so subconsciously think that things still work that way.

    2. Anonymous Coward
      Anonymous Coward

      Re: SCSI anyone?

      The good thing about wired connections like USB is that they also carry power. Wireless devices either are already mains-powered (e.g. a TV) or need batteries. Batteries make them heavier/bigger (especially for more power-hungry devices), need to be recharged, and shorten the device's lifespan and usability if they can't be replaced (good for vendors... not for the customers).

      For example, I have yoke/throttle/pedal devices for flight simulation. They are all USB-connected. If they were all wireless, they would be three more devices to recharge, or to connect to the mains instead of USB - where's the advantage?

      And there is also the issue of shared bandwidth over the radio spectrum: as more and more devices start to compete for it, time-critical data transfers may have issues.

  3. P. Lee

    Phew!

    I'm glad it will be fine for the PCIe x16 Gen4 external graphics cards coming out next year...

  4. xperroni
    Holmes

    Even a broken clock etc. etc.

    I think they have a point, actually. USB is already fast and user-friendly enough for virtually all inter-device communication scenarios – portable storage, cameras, peripherals etc. It is also likely to remain so for the foreseeable future, file sizes and storage density having all but stagnated in the past decade. Personally I think USB-C is already overkill, addressing as it does a marginal usability issue. Intra-device connections (e.g. video card slots) may still change some, but I can totally see user-facing ports remaining the same for quite a long time.

    1. Ian K
      Stop

      Re: Even a broken clock etc. etc.

      "It is also likely to remain so for the foreseeable future, file sizes and storage density having all but stagnated in the past decade."

      File sizes I couldn't comment on, but HDD storage density has increased by roughly an order of magnitude over the last 10 years; that's hardly stagnation!

      (Source: http://www.hindawi.com/journals/at/2013/521086/fig1/)
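
      (An order of magnitude in a decade works out to roughly 26% compound growth a year - a quick check, taking the 10x figure claimed above:

          # 10x areal-density growth over 10 years as a compound annual rate
          annual_rate = 10 ** (1 / 10) - 1
          print(f"{annual_rate:.1%} per year")   # ~25.9%

      Slower than the heyday, certainly, but hardly a plateau.)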

      1. Dave 126 Silver badge

        Re: Even a broken clock etc. etc.

        > file sizes and storage density having all but stagnated in the past decade.

        4K televisions are becoming very inexpensive, and they like to be fed with a lot of data (their resolution is higher, but they also use more bits per pixel). I mention this because video files have driven consumer HDD sizes and interconnects in the past.

        1. Daniel 18

          Re: Even a broken clock etc. etc.

          I've noticed mention of 8K video on the way, and it seems likely that even larger resolutions will follow. It takes a lot to drive the ideal wall-sized display. Display demands will render current USB-C too slow soon enough.

          1. DRendar

            Re: Even a broken clock etc. etc.

            Not likely.

            Remember that USB-C is just the CONNECTOR. USB 3.1 / Thunderbolt 3 are the *current* best protocols, which manage 8K no problem.

            USB 4 / Thunderbolt 4, or even something newer, can still run over the same connector.
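
            For scale, the raw numbers (a back-of-envelope sketch that ignores blanking intervals and assumes uncompressed 24 bits per pixel):

                # Uncompressed video bandwidth, in Gbit/s
                def gbps(w, h, fps, bits_per_pixel):
                    return w * h * fps * bits_per_pixel / 1e9

                print(f"4K60 at 24 bpp: {gbps(3840, 2160, 60, 24):.1f} Gbit/s")  # ~11.9
                print(f"8K60 at 24 bpp: {gbps(7680, 4320, 60, 24):.1f} Gbit/s")  # ~47.8
                # Thunderbolt 3's 40 Gbit/s needs chroma subsampling or Display
                # Stream Compression to squeeze raw 8K60 through one port.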

            1. Charles 9

              Re: Even a broken clock etc. etc.

              Not necessarily. That's why USB 3 had to add extra wires for its SuperSpeed mode. There are still limits that can defeat even "future-proofing".

  5. Pavlov's obedient mutt
    Paris Hilton

    oh my...

    I do love a fully reversible fully functional in any direction hole.

    1. Anonymous Coward
      Anonymous Coward

      Re: oh my...

      there's only one direction i like my hole to function. anything arriving in the opposite direction is usually quite unwelcome.

      usually.

  6. Anonymous Coward
    Anonymous Coward

    WiGig will not replace HDMI

    It only reaches 7 Gbps (under ideal conditions that won't always be realized), which isn't enough for a 4K display. Try again.

    1. jaywin

      Re: WiGig will not replace HDMI

      It might work OK for the first guy in the office to get it. However, once you've got a cluster of 4 desks each with a couple of monitors, a bit less so!
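
      To put numbers on that (a rough sketch; the 7 Gbit/s WiGig ceiling and the desk/monitor counts are taken from the comments above, uncompressed 4K60 assumed):

          # Raw 4K60 demand from one office cluster vs one shared WiGig channel
          per_monitor = 3840 * 2160 * 60 * 24 / 1e9   # ~11.9 Gbit/s uncompressed
          desks, monitors_per_desk = 4, 2
          demand = desks * monitors_per_desk * per_monitor
          print(f"demand ~{demand:.0f} Gbit/s vs ~7 Gbit/s of shared airtime")
          # Even with aggressive compression, eight displays contending for one
          # 60 GHz channel get well under 1 Gbit/s of airtime each.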

  7. Lee D Silver badge

    Sigh.

    And all wireless technology shares a common medium. Remember the days of 10Base2 or - shudder - Token Ring?

    Thus the max speed of any wireless technology is DIVIDED by the number of clients in the same area simultaneously trying to do the same thing.

    Whereas, cables? Cables can each carry the full bandwidth all the time, and the bottleneck is the device they're connected to, not the transmission medium. Two wired USB-C devices have four times the bandwidth they would have sharing a cable / the airwaves (full duplex gives each of them 2, versus a shared 1/2).
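
    The arithmetic is brutal (a trivial sketch; the 10 Gbit/s link rate is an assumed USB 3.1 Gen 2 figure, and the shared medium is treated as perfectly fair):

        # Per-client throughput: dedicated cables vs one shared medium
        link_gbps = 10.0
        for n in (1, 2, 8, 32):
            print(f"{n:>2} clients: wired {link_gbps:.1f} Gbit/s each, "
                  f"shared {link_gbps / n:.2f} Gbit/s each")
        # Every cable carries the full rate simultaneously; the airwaves are
        # one cable that everyone in range has to queue up for.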

    The number of times I have to explain this to people about wireless is scary. There's a reason that servers aren't wireless, that infrastructure isn't (generally) wireless, and that the wireless things are low-bandwidth applications (IP phones, CCTV, etc.). Wireless is great for easy connectivity to a shared medium that you can browse the net on. It's bog useless for transferring a 1GB roaming profile across the network to your device, and will always pale into insignificance next to cabled infrastructure - especially when you have more than a handful of clients trying to do the same.

    Wireless video? That'll work great. For the first device to have it. And then there'll be a catastrophic collapse as people all buy it and flood the allocated frequencies with THEIR video too. You can compensate by ramping up the bandwidths and giving it more and more frequencies but - after a point - it just jams up with traffic or interference.

    Whereas an isolated copper cable, in the 10-50p/metre range, will connect you from 100m away and give you full duplex and the full bandwidth of whatever was supplying that wireless access point anyway - one such full connection for each cable you have.

    Honestly, wireless is for guest wifi, convenience connections, etc. You don't use it for bridging unless you've carefully considered the full usage pattern of both ends, and you don't use it for primary infrastructure. Using it for video sounds like a disaster (but then, I work in a school where there are 32 huge HD displays, each within metres of the next screen, so maybe my use case is unusual? I can't see why offices wouldn't be doing the same, though, especially shared offices on multiple floors).

    And the same applies whether it's 802.11-whatever, bluetooth, or some fancy proprietary protocol.

    1. jzl

      MIMO does change the equation a little bit, of course.

      1. Mage Silver badge

        MIMO

        Grossly overhyped, and doing it properly needs expensive aerial arrays, multiple transmitters and multiple receivers.

        Shannon's law is based on thermodynamics. There is no free lunch.
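
        For anyone who wants to see where the ceiling comes from, Shannon-Hartley in one line (the channel width and SNR here are illustrative, not claims about any particular radio):

            import math

            def shannon_gbps(bandwidth_hz, snr_db):
                """Channel capacity: C = B * log2(1 + SNR)."""
                return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10)) / 1e9

            # An 80 MHz WiFi channel at a generous 30 dB SNR:
            print(f"{shannon_gbps(80e6, 30):.2f} Gbit/s")   # ~0.80, and it's shared
            # MIMO multiplies capacity by the number of usable spatial streams,
            # but every stream obeys the same bound - no free lunch.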

        1. jzl

          Re: MIMO

          Agreed.

      2. Lee D Silver badge

        MIMO is not a panacea by any measure.

        In fact, let me use my favourite analogy.

        In wireless communications, you are trying to shout to attract your friend's attention from across a crowded room. The people in the room are also doing the same, and holding conversations across the room, at the same time. Now try to have a meaningful conversation.

        MIMO just lets you triangulate the signal. So you know your friend is over THERE, so you cup your hands and direct your voice that way. Yes, it helps. Does it solve the problem when you have 50 people in a room all yelling at the top of their voices (and, on unmanaged systems, getting louder every time they can't hear a reply) trying to talk over each other? No.

        Infinitely more effective is wireless management, where you do what humans would do: you tell everyone to shut up (because they are managed by you) and then you point at who can speak when. Interference from non-managed entities still destroys the system, but you get much closer to theoretical maximum throughput. (Bear in mind, the numbers stated for WiFi are basically theoretical maximums - in a vacuum, inside a Faraday cage, for two such devices talking to each other - and you'll understand why an advertised WiFi speed means NOTHING, even one in the same order of magnitude as a cabled speed.)

        I speak as someone with site-wide Cisco Meraki wireless - have you seen the price of that kit? It's horrendous. One of our point-to-point wireless links costs in the thousands and has multiple antennae the size of dinner plates. Hundreds of devices, across acres of site. You aren't going to find better kit with greater range, throughput, management or coverage. And still WiFi is bog useless past casual browsing, collecting email and the odd remote device that you don't care about (weather stations, etc.).

    2. Anonymous Coward
      Anonymous Coward

      Whilst I agree with you, don't drag Token Ring into the argument - it was actually a pretty good technology! It lost out to Ethernet a bit like Betamax did to VHS. But wires are always better than no wires.

    3. Rol

      RE: Sigh.

      And then there's the security to consider.

      A poorly encrypted wifi signal would effectively be broadcasting out to anyone interested, whereas a heavily encrypted signal comes with overheads.

      Obviously a cable doesn't suffer those limitations, nor does it contribute to your power footprint.

  8. Anonymous Coward
    Trollface

    I'll just leave this here :

    https://xkcd.com/927/

    1. inmypjs Silver badge

      Re: I'll just leave this here :

      https://xkcd.com/927/

      I remember that being said about the Kansas City and CUTS standards for recording data on audio cassettes in the mid-70s. Some things never change.

  9. Mage Silver badge

    wireless charging pads

    They universally need mains connections.

    A simple dock - like "cordless" kettles, drills, DECT phones and two-way radios have (and phones used to have) - is sufficient, but doesn't have the same hipster marketing value.

    1. Charles 9

      Re: wireless charging pads

      The thing with Qi is that you don't need a matching cradle for each and every device you own (Lose the cordless phone? The base is now useless, and vice versa). At least you're not a slave to the manufacturer. Pick whatever suits you, put it down, and it goes to work. Breaks? Easy enough to get another one. Not only that, it saves wear and tear on your USB socket.

  10. paulc
    Black Helicopters

    WiGig dead in water on classified networks...

    everything has to be transmitted physically over screened wires (or fibre optics)... not even wireless keyboards/mice permitted...

  11. Doctor Syntax Silver badge

    I'm sure much the same thing was said about the 15-pin connectors for Ethernet.

    1. Dominic Thomas

      > 15-pin connectors for Ethernet

      I had completely forgotten those! Thank you for prompting a very pleasant five minutes of reminiscing over their Wikipedia entry!

  12. John Robson Silver badge

    I'm sure thunderbolt is great...

    but the cost is prohibitive at the moment - the cables alone cost a fortune.

    HDMI/DisplayPort/DVI will be around for a while. Ethernet will certainly remain.

    Space-division multiplexing is far easier when you have waveguides available...

    1. JeffyPoooh
      Pint

      Re: I'm sure thunderbolt is great...

      "...the cables alone cost a fortune."

      I'm pretty sure that Apple has a Mandatory Requirement to be unique so that they can make a billion dollars from replacement cable sales.

      1. Dave 126 Silver badge

        Re: I'm sure thunderbolt is great...

        You don't have to buy your Thunderbolt cables from Apple.

        It's a bit like FireWire, some versions of which were faster than USB 2 - most people ('consumers') didn't really need the extra speed and features (not being packet-based, FW is a more natural fit for audio recording gear). Of course, the people who did need it, initially for high-res scanners and then digital video cameras, made good use of it (or whatever the hell it was Sony called it) for many years.

        Niche kit always looks a bit pricey, regardless of who makes it.

        1. Mike 16

          Not being packet based?

          The major difference between FireWire and USB (previous, anyway) is that it is peer-to-peer, like a typical network. USB is (was?) master/slave, and could only do "streaming" things if your chipsets (on both ends), OS, and application were flawless, and you were massively over-provisioned. (YMMV, of course).

          Pretty much everything is "packet based" these days to some extent, from the SPI BIOS ROM, through the cache coherency protocols on and off chip, to the DRAM interface, PCIe, etc.

          A big advantage FireWire also had (originally) was (working) DMA, but of course most computer and OS vendors didn't bother with an IOMMU, so that was a gaping security hole. Easily fixed once they got a clue and started using the spec'd features.

          A big dis-advantage of USB has always been the flood of dodgy kit, reminding me of the days of "RS-232 compatible", which essentially meant "We cut a crap-ton of corners, but if what you connect us to meets the spec to such an extent that the most tenacious spec-lawyer can't find a flaw, then we have a chance of working. YMMV, no refunds"

          1. Charles 9

            Re: Not being packet based?

            "A big dis-advantage of USB has always been the flood of dodgy kit"

            Thing is, dodgy kit is not unique to USB. It basically happens anywhere there's something ubiquitous to exploit. Dodgy diskettes, dodgy CD-Rs, dodgy USB drives, the list goes on.

  13. Dave 126 Silver badge

    Fibre optic

    I can see scenarios where a fibre optic connection would be useful.

  14. short

    Isn't fibre about to get cheap enough to hit the desktop? I keep reading about glomming fibre interfaces onto silicon directly.

    The cost and complexity of Thunderbolt cables have to help this direction a bit, if FibreBolt cables are 'just' passive glass and a customer-proof clip. The extra distance would be nice if I could just pull a single fibre pair from a rack-mounted (hot, loud) machine to my desktop monitor, keyboard and headphones.

    Edit: I know that optical Thunderbolt cables exist, but they've got the transceivers built into the connectors. That's not what I want, long term, although it does solve the problem of letting consumers near fibres by hiding them completely.

    1. Dave 126 Silver badge

      Thunderbolt started off as an Intel concept that used optical fibres, called Light Peak, but it proved too costly, so they went with copper, and Apple contributed the Thunderbolt name. Light Peak seemed an attractive idea to me at the time, because a noisy computer / server / GPU farm could be kept in the next room - or indeed the garden shed - without much compromise. These days, though, computers good enough for my purposes are generally cooler and quieter.

      I keep hearing about photonic circuitry too, but it seems to be a few years off at the very least.

      As regards consumer and desktop devices, copper-cable-based solutions offer a usability advantage over optical connections* - they can carry power, so a single cable can do 'everything' (power, video, storage, peripherals etc).

      *Yeah, some people are working on power-over-fibre, but the use cases remain specialised (underwater robots, MRI machines, physics labs etc). My instinctive reaction to 'consumer fibre with 5W laser beams' is "Arghh, my eyes, my beautiful eyes!!"

  15. JimmyPage Silver badge
    Boffin

    RS-232C (or is that V22/V24)

    (can't remember the DIN)

  16. Big Ed

    Find Me An Editor

    Simon,

    The picture on the front page over the article title doesn't look like a USB Type-C connector - it looks more like a micro to me.

    Does production need some help?

    1. Seajay#

      Re: Find Me An Editor

      Maybe that's deliberate as a reminder of the previous standard which we thought was going to be the One True Hole.

  17. phuzz Silver badge

    So, do these analyst firms ever actually do anything useful, or is their sole business coming up with ridiculous and controversial reports in order to gain press coverage?

  18. BenR

    Yes.

    We've all standardised on micro-USB. Except for Apple, who still use Lightning. And are unlikely to be changing at any point in the future. So that's already three standards we have running. Add USB-C into it and that's four...

    Any advance on four?

    1. Lee D Silver badge

      What happened with the EU mandating a charging standard for phones?

      Or are they going to leave it fifteen years and then look into a monopoly action against Apple costing more than it ever recovers, à la Microsoft vs the EU?

      1. Dave 126 Silver badge

        >Or are they going to leave it fifteen years and then look into a monopoly action against Apple costing more than it ever recovers, à la Microsoft vs the EU?

        What the hell are you thinking? Apple don't have a monopoly! How - or why - would you prosecute a company for the abuse of a monopoly it doesn't have? Shit, I'd be surprised if they enjoyed 25% market share, let alone 50%.

        The EU mandated micro-USB for charging because Samsung used several different connectors, Sony used several different connectors, Nokia used several different connectors... and these proprietary connectors were hard-wired to their wall plugs. The only company that used the same charging connector for its gadgets over several years was Apple, and their wall plugs just had a USB-A socket.

    2. Darryl

      "we're on track for a single standard hole for most portable devices any year now."

      And then Apple will come out with something more proprietary and expensive

  19. Dale 3

    Just remember...

    "the last mainstream connections devices will need"

    ...there is a difference between technical need and company/shareholder need.

  20. Chris Evans

    Smaller network plug than RJ45 please!

    "Wireless is nifty, but wired is and always will be better. As for connector types, there will be plenty of upgrades to come. I just hope they're not smaller"

    Well, I wish there was a smaller wired network connector than RJ45, particularly in height! Much of the connector is only mechanical.

    1. John Robson Silver badge

      Re: Smaller network plug than RJ45 please!

      Don't know if you ever had an X-Jack (I think) PCMCIA network card.

      It had a sort of tray that popped out to accept a vertically oriented RJ45 plug.

      Bit flimsy though.

      RJ45 has to support the weight of a Cat5/6 cable, so some mechanical rigidity is required in most places.
