Did you buy the new MacBook or MacBook Pro? Maybe the Google Pixel? You’re about to enter a world of confusion thanks to those new USB-C ports. See, that simple-looking port hides a world of complexity, and the (thankfully) backward-compatible standard uses different kinds of cables for different tasks. Shoppers have to be very careful to buy exactly the right cable for their devices!
Welcome to the dongle.
https://xkcd.com/927/
The simple strategy of “If it fits, it works” that has been a part of PCs for as long as I can remember (with a scant few exceptions – I’m looking at you, IDE Master/Slave) no longer applies.
That is very bad design.
We should still be using parallel and SCSI ports.
Well, at least with PCs, the host side of the parallel port was different from the Centronics® 50 connectors generally used for SCSI ports.
Well, SCSI is one of the best technologies ever created, going back to it would be a pleasure!!
You left out your sarcasm tags. I hope.
I’ve worked with SCSI an awful lot, and I don’t miss it.
This page should provide a trip down memory lane:
https://www.ramelectronics.net/scsi-connector-types-and-pictures.asp…
As far as I’m concerned, the only advantage it had was the size of the disk chains you could create – and I’d rather have a good backplane with SAS (which uses the SCSI protocol, but not the bus).
Thankfully, the only ones of those I had to deal with when I was growing up were the HPDB50 on the non-proprietary end of our IBM ThinkPad 755C’s SCSI-2 PCMCIA Type-II card and the CN50 on the external housing when we upgraded from a grey NEC MultiSpin 3Xp to a faster drive in a generic external housing.
(I think the MultiSpin might have broken because I know that we were still using it as an unpowered coupler to connect the PCMCIA-to-HPDB50 cable to the HPDB50-to-CN50 cable, but we weren’t using it as a second CD-ROM drive on the machine.)
Not even complete – no hot-plug SCA connector?
https://en.wikipedia.org/wiki/Single_Connector_Attachment
That’s not really true.
First, let’s eliminate the cables that aren’t made to spec – this is the work Benson Leung did over on Amazon, and he managed to raise awareness early on that just slapping some pins together in a cable isn’t sufficient any more. Cables were reporting themselves as 5A capable when they could barely handle 2A, and they were melting themselves (and attached hardware) under load.
If you use a properly built USB-C cable, it will work as a basic data/video/charging cable. It may not be high speed– it may not do Thunderbolt, or HDMI, or DisplayPort– but it will connect two USB-C ports together, and those two ports will be able to use the highest functionality available in that cable, according to the specification.
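To put some rough pseudocode to that: the source is only supposed to supply the lowest current that the port, the cable’s marking, and the device all advertise, which is exactly why a cable that lies about its rating is dangerous. A minimal Python sketch, purely illustrative and not taken from any real USB stack:

```python
# Illustrative only: current negotiation is supposed to cap out at the weakest
# link. The numbers below are made up for the example.

def negotiate_current_ma(port_max_ma, cable_rated_ma, device_request_ma):
    """A compliant source never supplies more than the lowest advertised limit."""
    return min(port_max_ma, cable_rated_ma, device_request_ma)

# A cable that honestly marks itself as 3 A keeps everyone safe:
print(negotiate_current_ma(5000, 3000, 5000))   # -> 3000 (3 A)

# A cheap cable that falsely claims 5 A gets the full load it can't handle:
print(negotiate_current_ma(5000, 5000, 5000))   # -> 5000 (5 A), and it melts
```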
If you need Thunderbolt 3, get a TB3 capable cable (which might just be a good quality USB-C cable). Similarly, if you need DP, HDMI, audio, etc., get a cable / adapter that supports that functionality.
Above all, get a well made cable that isn’t being churned out for a buck fifty a pop on the ghost shift in Whozawhatsistan.
Cabling is only part of it, though.
Not every USB-C port will support Thunderbolt, or HDMI, or DisplayPort, or 100W power, etc etc etc.
There will be computers where not every USB-C port supports a display, or where only ports 1 and 2 support Thunderbolt, ports 3 and 4 only support DisplayPort (but only if there isn’t a display plugged into 1 and 2), and ports 5 and 6 support neither.
Or all the ports only support USB 3.1 Gen 1 (those already exist, btw), leaving a person wondering why his USB disk drive doesn’t work, not realizing that it is actually Thunderbolt. I mean, it uses the same port. Why would it be different?
I mean, it isn’t that big of a deal, I guess, but it is still annoying, and a step backwards.
If Thunderbolt becomes ubiquitous, well…
That sort of confusion has existed for a long time across consumer electronics as well as computer tech. Look at TVs that came with multiple SCART inputs – usually only one would support RGB, and it was a matter of trying to peer behind the TV to see the symbols on the socket to figure out which one it was. Then you get some that support S-Video, and I’ve even seen some that support component, all with the same form factor.
My current PC has a mixture of USB 2 and 3 on the back, which means I need to get under my desk with a torch to see which is which – blindly poking cables in won’t do the trick (not that it matters for most of my devices anyway, but still…)
And HDMI connectors have varying capabilities – does it support 4K? 3D? HDCP? Ethernet? They all look the same, so you need to dig out the manual to figure out why some functions seem to work some places and not the next.
It’s not an ideal situation and there’s a lot to be said for avoiding situations like that, but at the end of the day, consumers are mostly used to that sort of situation so they’ll handle it just fine with a bit of a moan.
And that’s my point, in a nutshell. Buy the cable for YOUR port / needs, and most of the confusion vanishes. It will, however, require people to learn what their USB-C enabled device is capable of.
And those who can’t be bothered, will use a regular USB-C cable, and wonder what everyone’s raving about.
Fortunately, USB alt mode requires negotiation with the host, and when the negotiation fails, the host can simply display a popup telling the user that it does not support the Thunderbolt protocol required by the device.
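Conceptually, the flow is something like this hand-wavy Python sketch (the mode names and the notify callback are stand-ins, not any real alt-mode API):

```python
# Hand-wavy sketch of alt-mode negotiation with a graceful fallback.
# "host_modes", "device_required_mode" and "notify_user" are placeholders.

def connect(host_modes, device_required_mode, notify_user):
    if device_required_mode in host_modes:
        return device_required_mode            # e.g. enter Thunderbolt alt mode
    notify_user(f"This port does not support {device_required_mode}, "
                "which the attached device requires.")
    return "usb"                               # fall back to plain USB data

mode = connect({"usb", "displayport"}, "thunderbolt", notify_user=print)
```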
And then the consumer gets together with other consumers and a class action is formed based on misleading advertising. You know it’s coming.
The Chinese sellers on eBay will simply sell cheap cables with fake certifications. They already sell potentially deadly laptop and phone chargers with fake certifications.
If you buy the cheapest cable you can find on eBay, you get what you pay for.
I realize that we’re not too far removed from the Monster HDMI cable ($100 USD) vs. the Amazon HDMI cable ($5) stupidity, but if you’re going to cheap out on a cable for your hundreds, if not thousands, of dollars worth of smartphone / laptop / desktop, I have no pity.
grat,
I get what you are saying, but with tech it is often the case that price has very little to do with quality. Lesser-known companies have to charge less than the popular brands simply because they’re lesser known, not because their products are strictly lower quality. This poses a dilemma for consumers, who don’t know whether they are paying for quality or just for branding. Some vendors scam you with poor-quality parts that don’t even meet the spec; some manufacturers scam you with exorbitant prices for the exact same parts from the exact same fabs. Sometimes a manufacturer like Intel/AMD can use the exact same physical stock and just sell it as different cores/clock speeds, etc., because it’s actually cheaper to fabricate a single part and charge different prices for it.
Ultimately, if you’ve got lots of money and can afford to pay for branding, then more power to you, you save yourself most of the hassles of differentiating the crap from the quality at the bottom end of the market.
By pure coincidence, yesterday NPR was talking about the psychology of consumers being manipulated to convince them to pay more than they otherwise would:
Given two similar products, one more expensive than the other, about half the consumers will buy the cheap one and the other half will buy the more expensive one. However, by introducing a third product – one that’s ridiculously overpriced and not expected to sell much – consumers will suddenly be willing to pay more at the lower end of the scale, because now they feel they’re getting a bargain.
Oh, manipulating consumers is nothing new. Why do you think everything costs 1 cent or 5 cents below the next round price point? $14.99 instead of $15? It’s precisely for this sort of manipulation.
darknexus,
Of course, although in my head I always round up… this one doesn’t affect me. One trick that really pisses me off, though, is masking price increases through smaller quantities over time, or boxes that are significantly empty at grocery stores. I’ve taken to ignoring the package price entirely and just focusing on the unit price (i.e. per weight). Although now many stores only display the “sale price for 2 boxes” or “price when you buy 5 cans”, etc., to the point where you don’t really know the true unit price you’ll be charged at the register for a single item… Someone should go beat them up with an “ethics stick”, haha. Anyways, I’m going O/T again.
Actually, it was to force cashiers to open the cash register to give change, leaving a record of the sale in the register, making it harder for unscrupulous employees to just pocket the cash.
That explanation doesn’t make sense in countries with no 1c coins. The other explanation is the correct one.
In countries that used primarily nickels, prices ended in 45 and 95 instead.
Buying based on branding is just as bad as buying based on price.
The intelligent consumer reads other people’s opinions of the product – for example, buying a USB-C cable off of Amazon without Benson Leung’s blessing was a serious mistake at one time.
Now, however, thanks to his efforts, Amazon has cracked down on the out-of-spec USB-C cables, and you’re a bit safer. But it’s still worth reading other consumers’ experiences (keeping in mind that some people shouldn’t be on the internet, let alone allowed to post product reviews, and that there are also people who are *paid* to post reviews, so you’re going to have overwhelmingly positive and negative reviews to sift through – never said it would be easy).
For Europeans, there are two similar-looking symbols on gadgets: CE meaning CE Mark (Conformité Européenne) and CE meaning China Export. Chinese manufacturers put the second one on their products without technically faking anything.
Compare eg here: http://www.cumbriacomputerrepairs.co.uk/2014/08/22/ce-mark-c-e-mark…
90s nostalgia is in bloom.
I remember back when laptops first started going mainstream, and a lot of people had PCMCIA cards to add functionality to the laptop. Most of the time it was NICs and modems.
Then most things became integrated, and anything that wasn’t was converted to a USB dongle.
Anyway, this isn’t news. Every time we transition to something new it’s a shit show for the first couple of years, and then everything works itself out. Anyone remember the first decade of wifi? There is still stuff out there that is finicky, but it’s pretty well sorted out at this point.
In the coming years, there will be winners, losers, and forum posters who think it’s all a bunch of garbage because all we’ve done is reinvent a 20 year old technology, which was clearly superior at the time, when we could have been inventing a holodeck with all of those hours.
It will get better, but it looks like it will get a bit worse first.
We’ll need to educate users, and also manufacturers. Fortunately, the really terrible off-spec cable issue was fixed with relatively little effort, and many people now try to make sure a cable actually works before buying it. (One advantage of Lightning was that Apple certified the cables, so there were no broken ones, barring manufacturing errors.)
Then it is possible to carry power on the cable in both directions. This is an extension of USB OTG, where the phone could be the host device. Now it can be a host for power too, and can actually charge your portable power bank instead of the other way around. (You’ll need to check, as the user, which device is charging the other.)
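Conceptually, something has to decide which side sources power; here is a toy Python sketch of the idea (not the actual USB PD role-swap state machine, and the fields and defaults are made up), with a user override for the case described above:

```python
# Toy model of dual-role power: a mains-powered side sources power, otherwise
# the side with more battery does, unless the user overrides it. Not real PD code.

def pick_power_source(a, b, user_override=None):
    if user_override in (a["name"], b["name"]):
        return user_override
    for dev in (a, b):
        if dev.get("mains_powered"):
            return dev["name"]
    return max(a, b, key=lambda d: d["battery_pct"])["name"]

phone = {"name": "phone", "battery_pct": 80}
powerbank = {"name": "powerbank", "battery_pct": 15}
print(pick_power_source(phone, powerbank))                             # phone charges the power bank
print(pick_power_source(phone, powerbank, user_override="powerbank"))  # user flips it
```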
And now we’ll have HDMI, DP, and audio (two different specs) over the same port, and they require dedicated hardware. USB 3 ports were labelled blue, and some motherboards colored high-sample-rate mouse ports red; we might end up with a similar notation for A/V ports. (Personally I would rather have a full-size HDMI port and a real 3.5mm audio jack, but that’s another discussion.)
Overall we’ll probably adapt and learn to live with the new technology. We have done this several times (Mini USB -> Micro USB -> special MHL/HDMI/AV ports, miniDP, etc.). We can do it once again.
Some people have.
I adopted USB 1.1 as soon as I could and things like Mini USB and Micro USB didn’t really bother me, but I’ve basically been sitting out the entire smartphone boom while I wait for the ecosystem to stabilize on certain details I consider non-negotiable:
1. A physical keyboard or something else with suitably equivalent finger-alignment and keypress feedback.
2. Removable storage (A MicroSD slot or equivalent)
3. A non-proprietary way to simultaneously hook up power input, USB devices, wired audio output, and wired video output when at a desk or table.
4. Dongle-less, non-proprietary, non-wireless headphone connectivity when out and about.
5. An unlocked bootloader and fully open-source OS with a software ecosystem that isn’t trying to nickel-and-dime me or sell me as a product.
(Currently, since I can’t justify the exorbitant Canadian mobile data rates anyway, I just carry a pocketable Linux laptop and/or my trusty old Sony PRS-505 eReader.)
I think you can keep on waiting for that until the earth is engulfed by the red giant that was once our sun, in about five billion years.
In the meantime try to invent immortality; you are going to need it to survive all the waiting.
You misunderstand. I’m not saying “I want a smartphone, but I’m waiting for them to have features X, Y, and Z.” That implies that I care about eventually arriving where everyone else is.
I’m saying that I know what I want and I’m pretty comfortable where I am.
When the world started chasing things I had no interest in, like Twitter and touchscreen-centric interfaces, I just kept walking as the rest of the herd decided to turn off in another direction.
(Sort of like how professionals see less and less appeal in the MacBook Pro line as Apple’s focus and their focus diverge.)
If they decide to wander back in my direction, the price-reducing competition would be nice, but I’m not going to hold my breath.
Ah, yes, it seems that I misunderstood your post.
It’s a nice change to see someone as sober as you compared to the sheep most other people are. However, since I also yearn for products that aren’t being produced right now, I sort of feel your pain, as what you want doesn’t receive the attention (and thus the mass-production financial benefits) it may well deserve.
Oh well, just bide your time with whatever fits your requirements best until something better arrives, I suppose? The plight of the power user!
Aaaaand we’ve failed before we started. Users expect everything with the same plug to work the same; this is going to be yet another tech support burden on the family member/friend who “knows computers”.
I’m not saying existing USB isn’t a shit-show, I’m just saying Thunderbolt/Lightning/USB-C/etc./etc./etc. hasn’t made things better.
To be fair, I’ve yet to encounter any confusion with Lightning cables or accessories. If it’s a Lightning connector, it works as it should unless it’s defective. It’s one reason I didn’t mind the transition away from the old 30-pin dock connectors, where this kind of confusion between cables and accessories (particularly the charging pins) was getting crazy. Now it appears we’re back to the shit show with USB C. Damn!
darknexus,
If it fits, or if it’s close to fitting, then it’s meant to be compatible – keep trying.
http://superuser.com/questions/386359/will-a-vga-monitor-work-if-co…
https://answers.yahoo.com/question/index?qid=20090505015425AAXXxO5
https://www.reddit.com/r/techsupport/comments/gjakq/possible_to_use_…
Yeah, it’s funny how so many people think 9-pin serial and VGA would be compatible. The odd thing, at least to me, is that it would never occur to me that they could be, because one has a male connector on the cable and the other has a female one. I mean, as today’s teens would say it… duh!
darknexus,
Cisco, a year later: “Here’s your CCNA certification”.
Employee 2: “Boy, standards really have come down.”
Employee 1: “What are you talking about?”
I needed that laugh today! Thanks! But you forgot the part where employee 1 agrees with employee 2 at the end, oblivious to the self-reflexive insult.
I’m curious if you’ve ever actually looked at a phone that can do this. Anything running Android 6.0 or newer (unless the manufacturer changes it) defaults to drawing power over the cable (and only drawing power – no data connection other than possibly negotiating the power draw) until you manually tell it to do anything else (you get a nice notification that you tap to pop up a menu to switch things). In other words, at least for Android, this is a non-issue.
At least for me, though, even if it were an issue, it would still be worth having a proper USB host in the phone (it’s wonderful as an IT professional to be able to just carry around a USB Ethernet adapter for my phone to do network diagnostics and check cabling).
Somebody should call Weird Al Yankovic and ask him to create a “Welcome to the Dongle” song based on Guns N’ Roses’ “Welcome to the Jungle”.
One could then “incidentally” play that whenever one of those die-hard Apple fans visits 😉
This could replace the start-up chime that Apple’s apparently removed from the latest MacBook Pro.
You can make one yourself. It is not that hard to write, though getting the band back together is the entire second act.
Antonio Gramsci sort of captured the moment with this quote:
“The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of morbid symptoms appear.”
I have to say the USB naming schema is just plain confusing. I assume in a few years we will be through this troubling transition, the old standards will be dying out and everything, more or less, will plug into everything else. One can but dream.
Welcome to the dongle, we got fun and games
we got every port you want, honey just say the name
and that’s some very sexy tech, guaranteed to please
connect any thing you want, but it won’t be hassle free!
If we have a USB-C connector at the computer end, why can’t we move to having a USB-C connector at the peripheral end?
I.e. replacing USB-A 3.1, HDMI, DisplayPort, and Thunderbolt 2 (and of course 3) connectors.
And have the supporting electronics fall back to the top protocol (and/or current) that they can handle for that cable grade.
I personally would then choose high current capable, TB3 capable cables for any and all uses unless really exorbitantly priced.
Am I missing some crazy truth about why it *couldn’t* be this way??
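Something like this is what I’m imagining (just a Python sketch of the fallback idea; the protocol names and the ordering are made up, not any real spec):

```python
# Sketch of the "fall back to the best protocol all three parties support" idea.
# The protocol labels and preference order are illustrative.

PREFERENCE = ["thunderbolt3", "displayport", "usb3.1", "usb2.0"]

def best_common_protocol(host, cable, device):
    for proto in PREFERENCE:              # try the fastest option first
        if proto in host and proto in cable and proto in device:
            return proto
    return None                           # no common ground at all

print(best_common_protocol(
    host={"thunderbolt3", "usb3.1", "usb2.0"},
    cable={"usb3.1", "usb2.0"},           # a mid-grade cable
    device={"thunderbolt3", "usb3.1", "usb2.0"},
))  # -> "usb3.1"
```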
Well, passive TB3 40Gbps capable cables have to be so short that they’re almost unusable. I don’t think I’d want to be stuck with 0.5m cables for everything…
I didn’t realize TB3’s top speed can be enabled with short passive cables as well.
Hi,
Thunderbolt (including Thunderbolt 3) is a massive security disaster slapped together by an incompetent moron. It (literally) extends the computer’s PCI bus out to malicious third-party devices (which can include “trojan devices” that look and act like USB devices while simultaneously using Thunderbolt), and there is no sane way for an OS to detect this or defend against it.
There are only 2 things anyone should ever plug into a “thunderbolt capable” socket. The first is hard setting putty. The second is a ball of super-glue.
Fortunately, this also solves part of the cable confusion.
– Brendan
Not quite. If motherboard manufacturers can get their heads out of their asses and prevent pre-OS, boot-time attacks via vectors like PCIe option ROMs, this is exactly the sort of thing the IOMMU is intended to protect against.
(Though I believe the originally envisioned attack was a guest OS exploiting some “let the guest manage this hardware exclusively” piece of hardware to attack the host OS by proxy.)
If you rule out bits like the option ROMs, then it’s no more vulnerable than FireWire or the technologies Thunderbolt is replacing in laptops, like ExpressCard and CardBus (PCMCIA 5.0), which have been a vector for evil-maid DMA attacks since 1997 – long before x86 systems had IOMMUs (which only arrived in the mid-2000s).
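For what it’s worth, you can at least check whether an IOMMU is actually active on a Linux box before trusting it to contain DMA from an external port. A quick sketch, assuming the usual /sys/kernel/iommu_groups layout:

```python
# Rough check for an active IOMMU on Linux by counting IOMMU groups in sysfs.
# Assumes the conventional /sys/kernel/iommu_groups layout.
import os

def iommu_groups(path="/sys/kernel/iommu_groups"):
    try:
        return os.listdir(path)
    except FileNotFoundError:
        return []

groups = iommu_groups()
if groups:
    print(f"IOMMU appears active: {len(groups)} group(s)")
else:
    print("No IOMMU groups found; DMA from external ports is not contained")
```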
ssokolow,
I have to agree with Brendan that it’s not a secure design. It blows my mind that they didn’t address the flaws of FireWire when designing Thunderbolt. You don’t even need a hack; by using the ports as designed, one can just poke around the host.
http://www.breaknenter.org/2012/02/adventures-with-daisy-in-thunder…
Your computer could be hacked right in front of your eyes and you’d be none the wiser because it’s done through a seemingly innocent connection through one of these vulnerable ports.
http://blog.erratasec.com/2011/02/thunderbolt-introducing-new-way-t…
Re-purposing the IOMMU to compensate for protocols we know to be insecure isn’t the best approach. The IOMMU was really designed for servers, to let VMs be assigned PCI devices virtually while preventing privilege-escalation attacks from the VM. If you have a specific need to virtualize hardware, then the IOMMU is great, but otherwise you incur an IOMMU performance penalty to virtualize the bus for no good reason.
https://www.kernel.org/doc/ols/2007/ols2007v1-pages-9-20.pdf
It just seems unnecessarily insecure, like having a gun shop right next to a bank. In theory it may not matter whether the gun shop is right beside the bank or a mile away, but it just doesn’t feel like the best of ideas if it can be helped.
No argument there.
I was approaching it more from the “Now that we have it, it’s not the end of the world” side of things.