Freeze, lastholes: USB-C and Thunderbolt are the ultimate physical ports

USB-C and Thunderbolt are the last mainstream connections devices will need to the outside world, according to analyst firm ABI Research. The company's new Device Connectivity Report predicts that by the year 2020 “Almost half of the smartphones and 93% of laptops will include USB Type-C connectivity.” In coming years we'll …


Gotta love the gadget market :-)

Our system is so good you'll never need anything else.

Until the use cases to the contrary appear,

and some bleepard shows off a new system that works that little bit better,

and we all have to show that we're building it into our new products,

and it finally gets ratified as a standard,

and then we'll be saying it again:

Our system is so good you'll never need anything else.


SCSI anyone?

Wireless is nifty, but wired is and always will be better. As for connector types, there will be plenty of upgrades to come. I just hope they're not smaller.


Re: SCSI anyone?

> but wired is and always will be better.

That is true, but then there is using wired just for wired's sake; there are plenty of scenarios where there just isn't any point in using wires over wireless. Personally I can't stand wireless internet, but then again it's just not fast enough for me.


Re: SCSI anyone?

> there is using wired just for wired sake, there are plenty of scenarios where there just isnt any point in using wires over wireless

Go on...

We can only get a finite amount of data over wireless, whereas wired can effectively carry infinite data. And as more and more people try to push larger and faster amounts of data over wireless, it's going to stop being a viable option sooner or later. Maybe we should concentrate on making the things that genuinely benefit from being wireless (mobile internet / mice / headphones) better, and keep the things that don't (desktop monitors / printers / desk phones) cabled down, so we can all take advantage of better wireless?

TRT

Re: SCSI anyone?

And... and and and... aren't these just copper standards? I seem to recall an optical fibre version of Thunderbolt... or am I just imagining it?


Re: SCSI anyone?

You're not imagining. Intel developed the optical version of Thunderbolt - then called LightPeak - first, before reverting to copper on cost grounds. Apple contributed the Thunderbolt name, and it was mainly Apple who used it - though Sony, bless 'em, made a VAIO laptop with a USB-A Thunderbolt port for driving an external GPU.

This was some years ago now (indeed, VAIOs were still Sony), but it is only now that the idea of external GPUs is gaining traction amongst the gaming crowd. And Apple's next cinema display is rumoured to have its own GPU, because its existing MacBook Pros don't have a connection capable of shunting 5K video.

As always, I'll let the gamers and Apple users pay the first-adopter's premium and iron out the bugs, and look forward to it being cheap and reliable in a year or two.

LDS

Re: SCSI anyone?

The good thing about wired connections like USB is that they also carry power. Wireless devices either are already mains-powered (e.g. a TV) or need batteries. Batteries make them heavier/bigger (especially for more power-hungry devices), need to be recharged, and shorten the device's lifespan and usability if they can't be replaced (good for vendors... not for the customers).

For example, I have yoke/throttle/pedal devices for flight simulation, all USB-connected. If they were all wireless, they would be three more devices to recharge, or to connect to the mains instead of USB - where's the advantage?

And there is also the issue of shared bandwidth over the radio spectrum: as more and more devices start to compete for it, time-critical data transfers may have issues.


Re: SCSI anyone?

@jaywin

Quote: "it's going to stop being a viable option sooner-or-later."

Doubtful, imho.

Bear in mind that WiGig uses the 60 GHz band for the high-speed part (video to TVs/monitors etc.), which won't penetrate walls and is very short range, effectively limiting you to gadgets/devices in the same immediate area, or at least the same room. So your finite bandwidth is only being shared by a few devices in one room (or one part of a room if it's a large area).
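The short range isn't only down to walls; free-space path loss by itself climbs with frequency. A quick sketch using the standard FSPL formula (the 5 m distance is just an arbitrary same-room assumption):

```python
import math

# Free-space path loss in dB for distance d (metres) at frequency f (Hz):
#   FSPL = 20 * log10(4 * pi * d * f / c)
# This ignores wall attenuation, which is far more severe at 60 GHz
# than at 2.4 / 5 GHz.
def fspl_db(d_m: float, f_hz: float) -> float:
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * d_m * f_hz / c)

d = 5.0  # metres: same-room distance (an assumption for illustration)
print(f"2.4 GHz: {fspl_db(d, 2.4e9):.1f} dB")
print(f"60 GHz:  {fspl_db(d, 60e9):.1f} dB")
print(f"penalty: {fspl_db(d, 60e9) - fspl_db(d, 2.4e9):.1f} dB")  # ~28 dB
```

The 60 GHz penalty is the same ~28 dB at any distance, since it depends only on the frequency ratio - and that's before you count the walls.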

You still have your normal 2.4 and 5 GHz for house/building coverage itself, which works alongside the 60 GHz.

I would suspect that for most consumer gear (TVs, Blu-ray players, set-top boxes etc.) and for normal business use, even in an open-plan office, this bandwidth issue is likely to be a non-issue.

Personally, I doubt my desktop at home will ever not use hardwired network and monitors (latency etc.). But for my TV, satellite box etc., I welcome the day I can get rid of the huge rat's nest of network, power, HDMI, DisplayPort and optical audio cables I have crammed behind the TV cabinet!


Re: SCSI anyone?

>And as more and more people try to push larger & faster amounts of data over wireless, it's going to stop being a viable option sooner-or-later.

We've already arrived at this point. I fly model sailplanes, and as with other model systems the radio links have migrated to 2.4GHz over the last decade because it's convenient and cheap. Unfortunately it's becoming increasingly unreliable. The result has been the loss of some quite expensive planes due to radio lockup -- we fly at a school site that also has housing nearby, and gradually, as people install high power / high throughput routers, it's blotting out the entire band in the vicinity, something which is merely inconvenient for data users but potentially fatal to model aircraft. (It's no good getting the link back after a crash!)

I've been a bit of a holdout, staying with the older 72MHz equipment, not because I don't trust new-fangled gadgets but because, through my job, I'm acutely aware of what's going on in the ISM (WiFi) band and how it's likely to affect data links. Most people still don't have a very good idea of how radio works -- we've all been brought up with the notion of 'frequency' and 'tuning' and so subconsciously think that things still work that way.

Anonymous Coward

Re: SCSI anyone?

> Intel developed the optical version of Thunderbolt - then called LightPeak - first, before reverting to copper on cost grounds.

Actually, the switch to copper was done only to provide power to peripherals. Optical Thunderbolt cables are in fact still available; they just require self-powered devices. Generally fibre Thunderbolt is only used for longer distances or installations with interference problems, but the price tends to be cheaper per metre than the copper cables.


Re: SCSI anyone?

> Personally, I doubt my desktop at home will ever not use hardwired network and monitors (latency etc.). But for my TV, satellite box etc., I welcome the day I can get rid of the huge rat's nest of network, power, HDMI, DisplayPort and optical audio cables I have crammed behind the TV cabinet!

Personally, I'd rather see something like the USB-C being used to connect everything together. One power lead to your TV, then USB-C to the satellite box, games console, thingyme-widget and so on.

You can already get PoE-powered large screens (using HDBaseT at the moment), so the efficiencies are getting there, such that you might only need to run a single Cat5 from your router to the TV, then USB-C to everything else, and it's all connected, powered, and talking to the internet. Seems much more user-friendly and less complicated than trying to configure / pair / restrain a random bunch of devices over wireless.


Phew!

I'm glad it will be fine for the PCIe x16 Gen4 external graphics cards coming out next year...


Even a broken clock etc. etc.

I think they have a point, actually. USB is already fast and user-friendly enough for virtually all inter-device communication scenarios – portable storage, cameras, peripherals etc. It is also likely to remain so for the foreseeable future, file sizes and storage density having all but stagnated in the past decade. Personally I think USB-C is already overkill, addressing as it does a marginal usability issue. Intra-device connections (e.g. video card slots) may still change somewhat, but I can totally see user-facing ports remaining the same for quite a long time.


Re: Even a broken clock etc. etc.

"It is also likely to remain so for the foreseeable future, file sizes and storage density having all but stagnated in the past decade."

File sizes I couldn't comment on, but HDD storage density has increased by roughly an order of magnitude over the last 10 years; that's hardly stagnation!

(Source: http://www.hindawi.com/journals/at/2013/521086/fig1/)
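An order of magnitude in a decade works out to roughly 26% compound annual growth, which is anything but stagnation. A quick check:

```python
# Express a 10x increase over 10 years as a compound annual growth rate.
growth_factor = 10
years = 10
cagr = growth_factor ** (1 / years) - 1
print(f"~{cagr * 100:.0f}% per year")  # ~26% per year
```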


Re: Even a broken clock etc. etc.

> file sizes and storage density having all but stagnated in the past decade.

4K televisions are becoming very inexpensive, and they like to be fed with a lot of data (their resolution is higher, but they also use more bits per pixel). I mention this because video files have driven consumer HDD sizes and interconnects in the past.
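To put rough numbers on it, here is what a single two-hour 4K film occupies at an assumed average bitrate of 50 Mbit/s (an assumption in UHD Blu-ray territory; streaming services use considerably less):

```python
# Disk space for a two-hour film at an assumed average bitrate.
bitrate_bps = 50e6          # 50 Mbit/s - an assumption, not a spec figure
duration_s = 2 * 60 * 60    # two hours

size_gb = bitrate_bps * duration_s / 8 / 1e9  # bits -> bytes -> GB
print(f"~{size_gb:.0f} GB per film")  # ~45 GB
```

A couple of dozen films at that rate fills a terabyte, which is the sort of thing that keeps pushing consumer HDD sizes.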


Re: Even a broken clock etc. etc.

I've noticed mention of 8K video on the way, and it seems likely that even larger resolutions will follow. It takes a lot to drive the ideal wall-sized display. Display demands will render current USB-C too slow soon enough.


Re: Even a broken clock etc. etc.

Not likely.

Remember that USB-C is just the CONNECTOR. USB 3.1 / Thunderbolt 3 are the *current* best protocols, which manage 8K no problem.

USB 4 / Thunderbolt 4 or even something new can still run over the same connectors.


Re: Even a broken clock etc. etc.

Not necessarily. That's why USB3 had to go with added wires for its SuperSpeed mode. There are still limits that can beat even "future-proofing".


oh my...

I do love a fully reversible fully functional in any direction hole.

Anonymous Coward

Re: oh my...

there's only one direction i like my hole to function. anything arriving in the opposite direction is usually quite unwelcome.

usually.


WiGig will not replace HDMI

Only reaches 7 Gbps (under ideal conditions that won't always be realized), which isn't enough for a 4K display. Try again.
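The arithmetic bears this out; a rough sketch assuming uncompressed 24-bit RGB at 60 Hz (ignoring blanking and encoding overhead, so real HDMI/DisplayPort link rates come out higher still):

```python
# Uncompressed bandwidth for a 4K60 display stream.
width, height = 3840, 2160
bits_per_pixel = 24      # assumption: plain 8-bit RGB, no HDR
frames_per_second = 60

gbps = width * height * bits_per_pixel * frames_per_second / 1e9
print(f"4K60 needs roughly {gbps:.1f} Gbit/s uncompressed")  # ~11.9 Gbit/s
print("Exceeds WiGig's 7 Gbit/s peak:", gbps > 7)  # True
```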


Re: WiGig will not replace HDMI

It might work OK for the first guy in the office to get it. However, once you've got a cluster of 4 desks each with a couple of monitors, a bit less so!


Sigh.

And all wireless technology shares a common medium. Remember the days of 10Base2 or - shudder - Token Ring?

Thus the max speed of any wireless technology is DIVIDED by the number of clients in the same area simultaneously trying to do the same thing.

Whereas cables? Cables can each carry their full bandwidth all the time, and the bottleneck is in the device they are connecting to, not the transmission medium. Two wired USB-C devices each get the full link rate; the same two sharing a cable / the airwaves get half each.
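That division can be sketched in a few lines (a toy model with an assumed 10 Gbit/s link rate; it ignores contention overhead, which only makes the shared case worse):

```python
# Per-client throughput: a shared medium splits its peak rate between
# the N clients active at once; a dedicated cable does not.
def per_client_gbps(peak_gbps: float, clients: int, shared: bool) -> float:
    return peak_gbps / clients if shared else peak_gbps

peak = 10.0  # assumed link rate, Gbit/s
for n in (1, 2, 8):
    wired = per_client_gbps(peak, n, shared=False)
    wireless = per_client_gbps(peak, n, shared=True)
    print(f"{n} clients: wired {wired} Gbit/s each, wireless {wireless} Gbit/s each")
```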

The number of times I have to explain this to people about wireless is scary. There's a reason that servers aren't wireless, that infrastructure isn't (generally) wireless, and that the wireless things are low-bandwidth applications (IP phones, CCTV, etc.). Wireless is great for easy connectivity to a shared medium that you can browse the net on. It's bog useless for transferring a 1GB roaming profile across the network to your device, and will always pale into insignificance next to cabled infrastructure - especially when you have more than a handful of clients trying to do the same.

Wireless video? That'll work great. For the first device to have it. And then there'll be a catastrophic collapse as people all buy it and flood the allocated frequencies with THEIR video too. You can compensate by ramping up the bandwidths and giving it more and more frequencies but - after a point - it just jams up with traffic or interference.

Whereas an isolated copper cable in the 10-50p / metre range will connect you from 100m away and give you full-duplex, full bandwidth of whatever was supplying that wireless access point anyway - one such full connection for each cable you have.

Honestly, wireless is for guest wifi, convenience connections, etc. You don't use it for bridging unless you've carefully considered the full usage pattern of both ends, and you don't use it for primary infrastructure. Using it for video sounds like a disaster (but then, I'm working in a school where there are 32 huge HD displays, each within metres of the next screen, so maybe my use-case is unusual? I can't see why offices wouldn't be doing the same, though, especially shared offices on multiple floors).

And the same applies whether it's 802.11-whatever, bluetooth, or some fancy proprietary protocol.

jzl

MIMO changes the equation a little bit, of course.


MIMO

Grossly overhyped, and to do it properly needs expensive aerial arrays, multiple transmitters and multiple receivers.

Shannon's law is rooted in thermodynamics. There is no free lunch.

Anonymous Coward

Whilst I agree with you, don't drag Token Ring into the argument; it was actually a pretty good technology! It lost out to Ethernet a bit like Betamax did to VHS. But wires are always better than no wires.


MIMO is not a panacea by any measure.

In fact, let me use my favourite analogy.

In wireless communications, you are trying to shout to attract your friend's attention from across a crowded room. The people in the room are also doing the same, and holding conversations across the room, at the same time. Now try to have a meaningful conversation.

MIMO just lets you triangulate the signal. So you know your friend is over THERE, so you cup your hands and direct your voice that way. Yes, it helps. Does it solve the problem when you have 50 people in a room all yelling at the top of their voices (and, on unmanaged systems, getting louder every time they can't hear a reply) trying to talk over each other? No.

Infinitely more effective is wireless management, where you do what humans would do. You tell everyone to shut up (because they are managed by you) and then you point at who can speak at what point. Interference from non-managed entities still destroys the system, but you get much closer to theoretical maximum throughput. (Bear in mind, the numbers stated for Wifi are basically theoretical maximums - in a vacuum, inside a Faraday cage, for just two such devices talking to each other - and you'll understand why it means NOTHING to have an advertised Wifi speed even in the same order of magnitude as a cabled speed.)

I speak as someone with site-wide Cisco Meraki wireless - have you seen the price of that kit? It's horrendous. One of our point-to-point wireless links costs in the thousands and has multiple antennae the size of dinner plates. Hundreds of devices, inside acres of site. You aren't going to find better kit with greater range, throughput, management or coverage. And still Wifi is bog useless past casual browsing, collecting email and the odd remote device that you don't care about (weather stations, etc.).

Rol

RE: Sigh.

And then there's the security to consider.

A poorly encrypted wifi would effectively be broadcasting out to anyone interested, whereas a heavily encrypted signal would come with overheads.

Obviously a cable doesn't suffer those limitations, nor does it contribute to your power footprint.

jzl

Re: MIMO

Agreed.


I'll just leave this here :

https://xkcd.com/927/


Re: I'll just leave this here :

https://xkcd.com/927/

I remember that being said about the Kansas City and CUTS standards for recording data on audio cassette players in the mid-70s. Some things never change.


wireless charging pads

They universally need mains connections.

A simple dock, like "cordless" kettles, drills, DECT phones and two-way radios have (and phones used to have), is sufficient, but doesn't have the same hipster marketing value.


Re: wireless charging pads

The thing with Qi is that you don't need a matching cradle for each and every device you own (Lose the cordless phone? The base is now useless, and vice versa). At least you're not a slave to the manufacturer. Pick whatever suits you, put it down, and it goes to work. Breaks? Easy enough to get another one. Not only that, it saves wear and tear on your USB socket.


WiGig dead in the water on classified networks...

Everything has to be transmitted physically over screened wires (or fibre optics)... not even wireless keyboards/mice are permitted...


I'm sure much the same thing was said about the 15-pin connectors for Ethernet.


> 15-pin connectors for Ethernet

I had completely forgotten those! Thank you for prompting a very pleasant five minutes of reminiscing over their Wikipedia entry!


I'm sure thunderbolt is great...

but the cost is prohibitive at the moment - the cables alone cost a fortune.

HDMI/DisplayPort/DVI will be around for a while. Ethernet will certainly remain.

Space-division multiplexing is far easier when you have waveguides available...


Re: I'm sure thunderbolt is great...

"...the cables alone cost a fortune."

I'm pretty sure that Apple has a Mandatory Requirement to be unique so that they can make a billion dollars from replacement cable sales.


Re: I'm sure thunderbolt is great...

You don't have to buy your Thunderbolt cables from Apple.

It's a bit like FireWire, some versions of which were faster than USB 2 - most people ('consumers') didn't really need the extra speed and features (not being packet-based, FW is a more natural fit for audio recording gear). Of course, the people who did need it, initially for high-res scanners and then digital video cameras, made good use of it (or whatever the hell it was Sony called it) for many years.

Niche kit always looks a bit pricey, regardless of who makes it.


Not being packet based?

The major difference between FireWire and USB (previous versions, anyway) is that it is peer-to-peer, like a typical network. USB is (was?) master/slave, and could only do "streaming" things if your chipsets (on both ends), OS, and application were flawless, and you were massively over-provisioned. (YMMV, of course.)

Pretty much everything is "packet based" these days to some extent, from the SPI BIOS ROM, through the cache coherency protocols on and off chip, to the DRAM interface, PCIE, etc.

A big advantage FireWire also had (originally) was (working) DMA, but of course most computer and OS vendors didn't bother with an IOMMU, so that was a gaping security hole. Easily fixed once they got a clue and started using the spec'd features.

A big dis-advantage of USB has always been the flood of dodgy kit, reminding me of the days of "RS-232 compatible", which essentially meant "We cut a crap-ton of corners, but if what you connect us to meets the spec to such an extent that the most tenacious spec-lawyer can't find a flaw, then we have a chance of working. YMMV, no refunds"


Re: Not being packet based?

"A big dis-advantage of USB has always been the flood of dodgy kit"

Thing is, dodgy kit is not unique to USB. It basically happens anywhere there's something ubiquitous to exploit. Dodgy diskettes, dodgy CD-Rs, dodgy USB drives, the list goes on.


Fibre optic

I can see scenarios where a fibre optic connection would be useful.


Isn't fibre about to get cheap enough to hit the desktop? I keep reading about glomming fibre interfaces onto silicon directly.

The cost and complexity of Thunderbolt cables has to help this direction a bit, if FibreBolt cables are 'just' passive glass and a customer-proof clip. The extra distance would be nice if I could just pull a single fibre pair from a rack-mounted (hot, loud) machine to my desktop monitor, keyboard and headphones.

Edit: I know that optical Thunderbolt cables exist, but they've got the transceivers built into the connectors. That's not what I want, long term, although it does solve the problem of letting consumers near fibres by hiding them completely.


Thunderbolt started off as an Intel concept called LightPeak that used optical fibres, but it proved too costly, so they reverted to copper, and Apple contributed the Thunderbolt name. LightPeak seemed an attractive idea to me at the time, because a noisy computer / server / GPU farm could be kept in the next room - or indeed the garden shed - without much compromise. These days, though, computers good enough for my purposes are generally cooler and quieter.

I keep hearing about photonic circuitry too, but it seems to be a few years off at the very least.

As regards consumer and desktop devices, copper-cable-based solutions offer a usability advantage over optical connections* - they can carry power, so a single cable can do 'everything' (power, video, storage, peripherals etc.).

*Yeah, some people are working on power-over-fibre, but the use cases remain specialised (underwater robots, MRI machines, physics labs etc.). My instinctual reaction to 'consumer fibre with 5 W laser beams' is "Arghh, my eyes, my beautiful eyes!!"


RS-232C (or is that V22/V24)

(can't remember the DIN)


Find Me An Editor

Simon,

The picture on the front page over the article title doesn't look like a USB Type-C connector - it looks more like a micro-USB to me.

Does production need some help?


Re: Find Me An Editor

Maybe that's deliberate as a reminder of the previous standard which we thought was going to be the One True Hole.


So, do these analyst firms ever actually do anything useful, or is their sole business coming up with ridiculous and controversial reports in order to gain press coverage?


Yes.

We've all standardised on micro-USB. Except for Apple, who still use Lightning. And are unlikely to be changing at any point in the future. So that's already three standards we have running. Add USB-C into it and that's four...

Any advance on four?


What happened with the EU mandating a charging standard for phones?

Or are they going to leave it fifteen years and then look into a monopoly action against Apple costing more than it ever recovers, à la Microsoft vs the EU?




Biting the hand that feeds IT © 1998–2018