Linked by Thom Holwerda on Fri 22nd Jun 2012 18:16 UTC
Apple "Why did Apple just release new MacBook Airs, MacBook Pros, and a Retina MacBook Pro, but no new iMacs or Mac Pros? And why are the iMacs probably being updated this year while the Mac Pro update won't happen for 12-18 months? As usual, I have some guesses." Good points.
Other possibilities...
by bhtooefr on Fri 22nd Jun 2012 18:56 UTC
bhtooefr
Member since:
2009-02-19

Thunderbolt may only be 2 lanes per direction per port, at 10 Gbps per, but DisplayPort 1.2 can be 4 unidirectional lanes per port, at 5.4 Gbps per lane.

So, two DisplayPort 1.2 ports together can just do it, not using Thunderbolt at all.

And, what does the MBPR have? Two MiniDP/Thunderbolt ports. Wouldn't be surprised if they support DisplayPort 1.2.

The other thing is, there are display compression protocols. Embedded DisplayPort (eDP) 1.3 supports panel self-refresh, the beginning of such a protocol, and IBM was doing research on a standard for sending only updates over the video link (to allow cheaper video link hardware to drive their T221), which ended up being used as a low-power video link. (Edit: it was DPVL, which actually inspired DisplayPort: http://en.wikipedia.org/wiki/Digital_Packet_Video_Link )
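A rough back-of-the-envelope check of the bandwidth claim above. The assumptions here are mine, not the poster's: 24 bits per pixel, 60 Hz, active pixels only (real links also carry blanking overhead, so margins are thinner in practice), and a hypothetical 27" "2x" display at 5120x2880:

```python
def raw_gbps(width, height, hz=60, bpp=24):
    """Raw pixel bandwidth in Gbit/s, ignoring blanking overhead."""
    return width * height * hz * bpp / 1e9

# One DisplayPort 1.2 link: 4 lanes x 5.4 Gbit/s, 8b/10b coding -> 80% payload.
dp12_link = 4 * 5.4 * 0.8            # 17.28 Gbit/s effective

# One Thunderbolt (gen 1) port: 2 channels x 10 Gbit/s per direction.
tb_port = 2 * 10.0                   # 20 Gbit/s

retina_mbp = raw_gbps(2880, 1800)    # ~7.5 Gbit/s; one link is plenty
retina_27 = raw_gbps(5120, 2880)     # ~21.2 Gbit/s; more than one DP 1.2 link

print(retina_27 > dp12_link)         # True: one DP 1.2 link can't do it
print(retina_27 < 2 * dp12_link)     # True: two links together can
```

Which is exactly the "two DisplayPort 1.2 ports together can just do it" argument.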

Edited 2012-06-22 19:07 UTC

Reply Score: 5

Comment by robojerk
by robojerk on Fri 22nd Jun 2012 19:00 UTC
robojerk
Member since:
2006-01-10

delete

Edited 2012-06-22 19:02 UTC

Reply Score: 1

RE: Comment by robojerk
by Drumhellar on Fri 22nd Jun 2012 19:06 UTC in reply to "Comment by robojerk"
Drumhellar Member since:
2005-07-12

It's probably not so much that Intel is having problems with Thunderbolt on Xeon, but more that Thunderbolt is a non-option without on-CPU graphics, or at least excessively complex, considering the other options common on systems in Xeon's market versus the consumer market, i.e. Fibre Channel, 10Gbit networking, etc.

Reply Score: 2

Drumhellar
Member since:
2005-07-12

Apple is probably waiting for Xeons with on-CPU graphics. That would eliminate the difficulty of routing both the PCIe lanes available from the CPU or mainboard and the graphics signal from an add-in board to the same port.

With on-CPU graphics, it becomes much easier to add a Thunderbolt port, and add-in graphics boards can still output via the integrated chip.

Reply Score: 2

tylerdurden Member since:
2009-03-17

Thunderbolt is basically a transceiver which allows PCI Express lanes to operate over longer distances.

I don't see why you think it would depend on the onboard graphics.

Reply Score: 2

Drumhellar Member since:
2005-07-12

Thunderbolt isn't just PCIe; the controller multiplexes PCIe and a DisplayPort video signal, then sends both over the cable. It is this video signal that drives displays, not the PCIe data.

Reply Score: 2

tylerdurden Member since:
2009-03-17

If anything, they would implement Thunderbolt as a connector coming off the graphics board, or have the PCH generate the DisplayPort data from a PCIe board. No one in their right mind would buy a Mac Pro-class machine to run graphics off a crappy Intel IGP.

Reply Score: 2

Drumhellar Member since:
2005-07-12

Well, using an add-in board for Thunderbolt would require either sacrificing some PCIe lanes for Thunderbolt, or using a slot design that is out of spec. Considering Apple's ongoing commitment to open specs in their desktops (standard memory, ports, slots, etc.), the latter isn't likely, and the performance hit makes the former unlikely as well.

Alternatively, an internal ribbon cable could carry DisplayPort data to a pinout on the mainboard. However, that requires some messy, messy routing (Thunderbolt is already a complex beast to route on a mainboard on its own), and it is aesthetically ugly. Apple likes their Mac Pro internals to be aesthetically pleasing, too.

No, the likely scenario is something akin to nVidia's Optimus, an enhanced version of Apple's own switchable-graphics tech (which is already part of OS X), or maybe something along the lines of Bumblebee (a Linux implementation of Optimus that isn't necessarily tied to nVidia hardware). Basically, the on-CPU graphics handles output, as well as rendering of OS graphics, while the graphics card does the heavy lifting for more demanding tasks; instead of sending its rendered frames to the display, the card hands them to the integrated graphics, which actually displays them.

Again, OS X already has this capability. It also has other benefits: when the user isn't doing anything that requires the graphics card, it can sit in a low-power state and only draw power when actually used.

Reply Score: 2

zima Member since:
2005-07-06

using an add-in board for Thunderbolt would require either sacrificing some PCIe lanes for Thunderbolt, or using a slot design that is out of spec. [...] The performance hit also makes the former unlikely as well.

Every time the effect of limiting GFX bus bandwidth has been tested over the years (within limits; like "just" halving it), the differences have turned out to be marginal at most...

It seems the push for higher speeds is more a "just in case" kind of thing, and/or a lingering legacy of AGP marketing claims (about outright use of main memory as the primary place to store textures), which never really materialised.

Reply Score: 2

Drumhellar Member since:
2005-07-12

This is true. For graphics, cutting the bus down from 16 lanes to 8 doesn't have much of an effect (are devices capable of operating on a number of PCIe lanes that isn't a power of two, say, 14 lanes?), but I'm not sure the same can be said about GPU computing. At least, I haven't seen any benchmarks testing the effect of reduced bandwidth on GPU compute loads.
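For reference on the parenthetical question: the PCIe base spec defines link widths of x1, x2, x4, x8, x12, x16 and x32, so x14 isn't an option (x12 is, though rarely seen in practice). A quick sketch of the per-direction bandwidth at stake, using PCIe 2.0 numbers contemporary with this thread:

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b coding, i.e. 500 MB/s per lane
# per direction.
PCIE2_MB_PER_LANE = 500

# Link widths the PCIe base spec actually defines; x14 is not among them.
VALID_WIDTHS = (1, 2, 4, 8, 12, 16, 32)

def link_gb_per_s(lanes):
    """Per-direction bandwidth in GB/s for a spec-defined link width."""
    if lanes not in VALID_WIDTHS:
        raise ValueError(f"x{lanes} is not a spec-defined link width")
    return lanes * PCIE2_MB_PER_LANE / 1000

print(link_gb_per_s(16))  # 8.0 GB/s
print(link_gb_per_s(8))   # 4.0 GB/s: the "halved" case discussed above
```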

Reply Score: 2

zima Member since:
2005-07-06

I doubt it would be much different; after all, GPGPU computations are similar in nature to graphics processing... that's why we try to do them on GPUs in the first place.

Are devices capable of operating on a number of PCIe lanes that isn't a power of two? Say, 14 lanes?

Well, if you're really curious, that's nothing a small slice of duct tape can't answer ;p (but seriously, OTOH it could just as well be a driver limitation at most)

Reply Score: 2

Comment by smashIt
by smashIt on Fri 22nd Jun 2012 19:19 UTC
smashIt
Member since:
2005-07-06

Nice read, but he is wrong on the Retina part.

Such displays have been around for more than 10 years now; the big question is not availability but price (and Apple wants it cheap).

Reply Score: 4

RE: Comment by smashIt
by tony on Sun 24th Jun 2012 04:25 UTC in reply to "Comment by smashIt"
tony Member since:
2005-07-06

Nice read, but he is wrong on the Retina part.

Such displays have been around for more than 10 years now; the big question is not availability but price (and Apple wants it cheap).


A 2880x1800 display in a 15-inch form factor? There's never been a laptop display like that before.

1920x1200 (and more recently 1920x1080) has been the highest resolution you could get on most laptops for 10 years or so, and it has actually become rarer over the past few years.

To get more than 1920x1200, you'd usually have to go to a 27-inch display (and most 27-inch displays are actually 1920x1200/1080).

Reply Score: 2

RE[2]: Comment by smashIt
by smashIt on Sun 24th Jun 2012 16:00 UTC in reply to "RE: Comment by smashIt"
smashIt Member since:
2005-07-06

A 2880x1800 display in a 15-inch form factor? There's never been a laptop display like that before.


The panel doesn't care whether it's put into a stand-alone display or a laptop.

As for the panel itself, take a look at the IBM T220 from 2001:
http://en.wikipedia.org/wiki/IBM_T220/T221

Reply Score: 2

RE[2]: Comment by smashIt
by zima on Mon 25th Jun 2012 15:31 UTC in reply to "RE: Comment by smashIt"
zima Member since:
2005-07-06

1920x1200 (and more recently 1920x1080) has been the highest resolution you could get on most laptops for 10 years or so, and it has actually become rarer over the past few years.

How exactly do you figure it became rarer?

Reply Score: 2

RE[2]: Comment by smashIt
by Kivada on Tue 26th Jun 2012 06:55 UTC in reply to "RE: Comment by smashIt"
Kivada Member since:
2010-07-07

You could get the 15" ThinkPad R50p, an NEC Versa Pro NX VA20S/AE, or an NEC LaVie G Type C at 2048x1536 a decade ago:
http://arstechnica.com/civis/viewtopic.php?f=9&t=555206
http://forum.thinkpads.com/viewtopic.php?t=43774

So no, we are just now catching back up to where we were years ago with pixel-density tech.

As for "most" laptops on the market today, they are running the pitiful resolution of 1366x768, not by choice of the buyer, but because the companies got very cheap and only offer that resolution unless you pay for a gaming-class notebook, even though the crappiest of today's IGPs can handle a much higher resolution for non-3D tasks. You'd think they'd go for 1280x800 instead, since at least then 720p video wouldn't look distorted by being stretched across pixels.
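To put rough numbers on the "catching back up" claim (nominal diagonal sizes, so the results are approximate): the decade-old QXGA panels and IBM's T221 were already close to what Apple now calls Retina density.

```python
from math import hypot

def ppi(width, height, diag_inches):
    """Pixels per inch from resolution and nominal diagonal size."""
    return hypot(width, height) / diag_inches

thinkpad_r50p = ppi(2048, 1536, 15.0)  # ~171 ppi (2003-era QXGA panel)
ibm_t221 = ppi(3840, 2400, 22.2)       # ~204 ppi (2001)
retina_mbp = ppi(2880, 1800, 15.4)     # ~221 ppi (2012)

print(round(thinkpad_r50p), round(ibm_t221), round(retina_mbp))
```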

Reply Score: 2

Too inexpensive?!?
by karunko on Sat 23rd Jun 2012 00:03 UTC
karunko
Member since:
2008-10-28

Is that even proper English?!? What about "cheap enough" or, if you're swimming in money, "very cheap"?

But, nitpicking aside, credit where credit's due: not only can Marco Arment predict the future (at least when it comes to the next Macs), he must also know a definition of "inexpensive" I wasn't previously aware of. I mean:

MacBook Air 11" starts at 1.049 Eur
MacBook Air 13" starts at 1.249 Eur
MacBook Pro 13" starts at 1.249 Eur

I won't argue whether they're worth every cent or not, and I agree that they probably can't be equipped with a Retina display at that price point, but... "too inexpensive"?!? C'mon! ;-)


RT.

Reply Score: 3

waiting for...
by l3v1 on Sat 23rd Jun 2012 06:22 UTC
l3v1
Member since:
2005-07-06

Well, if they are waiting for tech that enables high-fps (100+) rendering on 24"+ Retina displays with more than 1200 lines of resolution (needing to effectively render at least 2x that size, IIRC), that could be a long wait.
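For scale (my arithmetic, not the poster's): a "2x" HiDPI version of a 24" 1920x1200 panel driven at 100 fps pushes roughly three times the raw pixel throughput of the Retina MacBook Pro panel that actually shipped.

```python
def mpix_per_s(width, height, fps):
    """Raw pixel throughput in megapixels per second."""
    return width * height * fps / 1e6

shipped = mpix_per_s(2880, 1800, 60)      # Retina MBP at 60 Hz
wished_for = mpix_per_s(3840, 2400, 100)  # doubled 1920x1200 at 100 fps

print(round(shipped), round(wished_for))  # 311 vs 922 Mpix/s
```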

Reply Score: 3

Lame guesses are news now?
by kragil on Sat 23rd Jun 2012 12:47 UTC
kragil
Member since:
2006-01-04

Zero information provided...

Reply Score: 2