Despite the beating Apple often receives for its tight control of the App Store and its Mac OS X EULA, the company really does know how to market itself and its products. At WWDC in San Francisco, currently under way, the company found a very illustrative way of showing the sheer size and success of the App Store.
Apple built a video wall out of 20 edge-to-edge 30″ Cinema Displays, and on this array it renders 20000 application icons from the App Store. Each time someone, anywhere in the world, buys an application from the App Store, the corresponding icon lights up and sends a ripple across the array. It's totally awesome.
You can find more images and videos at AppleInsider. Each display in the array is powered by a Mac Pro running Mac OS X 10.6 Snow Leopard, and the icons and effects are rendered using Quartz Composer.
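Apple hasn't said how the composition is put together, but just to give an idea of the kind of math such an effect involves, here is a minimal sketch of a damped radial ripple over an icon grid; the function name and all constants are made up purely for illustration.

#include <math.h>

/* Hypothetical brightness boost for an icon sitting `dist` icon-widths away
 * from the icon that was just purchased, `t` seconds after the purchase.
 * All constants are guesses, chosen only to illustrate the idea. */
static double ripple_boost(double dist, double t)
{
    const double speed      = 6.0;  /* wavefront speed, icons per second */
    const double wavelength = 3.0;  /* icons between crests              */
    const double decay      = 1.5;  /* how quickly the ripple dies out   */

    double phase = (dist - speed * t) * (2.0 * M_PI / wavelength);
    return exp(-decay * t) * exp(-0.1 * dist) * cos(phase);
}

Evaluate that for every icon each frame and you get an expanding, fading ring of brightened icons around the purchase, which is roughly what the wall appears to do.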
I personally don't believe Apple's products, hardware or software, are of some magical quality that puts them above everyone else; for me, the difference lies in the sheer attention Apple pays to the presentation of its products, and this video wall is just one example of that. It not only illustrates the popularity and success of the App Store to developers attending WWDC, it also shows off the power of Mac OS X, Quartz Composer, and the Cinema Display and Mac Pro.
The video array performs several functions in one go, in a way only Apple seems to be able to do. On top of that, it’s just cool.
Which pretty much summarises the reason for Apple's recent successes.
It really is a pity more corporations aren't taking a leaf out of Apple's book as, personally, I don't consider their software or hardware to be any better than rival products in the same price bracket (in fact, they often underperform products in cheaper price brackets). But Apple knows how to package and sell its goods, so the appearance of value is inflated.
Take Google Android, for example. On the G1 it has more features than the iPhone, and Apple really is playing catch-up. However, the G1 is ugly as sin, Android's defaults (though easily changed) aren't super "sexy" like the iPhone's are, and the handset wasn't unveiled in a glamorous, almost religious product launch the way the iPhone was.
Thus, regardless of how good the G1 is, it will always play second fiddle to the iPhone.
These days, geek tech isn't just bought by geeks. So companies will either have to pander to consumers' lust for superficial "bling", or risk losing market share to those who will.
Personally I think it's a pity technology has moved this way, but then I guess it's the natural evolution when industries reach mainstream consumer markets.
On a side note, I'd be more interested in iPhone news if Apple announced a relaxing of lock-ins rather than listing features which many non-Apple handsets already have.
It is definitely cool.
I’m a little disappointed that they’re running this off 20 computers each with a monitor. Wouldn’t it be better to run this off a single computer, or is that only possible with Windows and Linux?
Good luck finding a computer that can drive 20 displays.
JAL
Any Windows computer with enough PCIe and PCI slots can do it. You just need to use Matrox video cards to do so. Their standard (lower-end) PCIe video cards can drive 4 displays. Their half-height video cards can drive 2 displays. And their higher-end PCIe video cards can drive 8 displays. And you can put more than 1 into a box, and link them together, without having to use SLI or CrossFire. (They also come in AGP formats.)
If I remember correctly, you can run 4 video cards at a time using the Matrox drivers, giving you from 8 to 32 displays.
You won’t be playing any high-end or even mid-range 3D games on these … but you do get full 2D acceleration with them.
We gave up trying to get nVidia and ATi/AMD consumer video cards to work with even two-monitor setups in Windows, and have started buying Matrox. These are truly plug-and-play multi-monitor video cards.
Thanks for this interesting information, I didn’t even know Matrox was still around.
In that case they wouldn't do for this specific video wall, as the ripple effect looks like heavy 3D work.
JAL
I don't really see what practical difference it makes whether it's one machine or twenty machines driving the installation.
And anyway, as for the feasibility of the former, it probably is not possible with Windows or Linux either: not because of software limitations, but because at two dual-link displays per card it would require ten 16-lane PCI Express slots, which probably no production PC or motherboard has. FYI, the Mac Pro has four such slots and can therefore drive up to eight 30″ displays. But since Apple doesn't have to do this on a budget, they were able to afford to run it off of 20 machines.
Storage. Watts. Karma.
It shows the strength, or lack of strength, of OS X in this case, and explains why it's not used much as a server OS or supercomputer OS (unless someone else pays for it).
It's not rare to see a day trader with more than 30 monitors running off one computer using GNU/Linux…
Also, the limitation is by Apple's design and due to its own hardware restrictions, which don't exist for GNU/Linux or Microsoft Windows.
As already stated, space, electricity and budget are also factors for the majority.
Maybe it has something to do with OpenCL? Does OpenCL work with multiple cards/heads in a single computer?
No idea, could be that too…
There's one big question that you don't answer with that statement about a Linux box running 30 monitors:
Are these full hi-res GUI monitors (not likely, very expensive) or are they text-based terminals (most likely)?
Running a bunch of text-based terminals isn't hard to do, even for an Apple IIc: it doesn't require much bandwidth, and the terminals have enough intelligence not to need constant updates, so they can run over standard old serial cables. I spent some dark months back in 2004 delivering pizza for a chain that used a single Linux box hooked up to a bunch of text terminals, as well as the various printers and registers: specialized POS POS stuff. I'm not saying Linux is a POS; rather, the POS software was horribly unreliable. The machine (and thus the entire store) often had to be rebooted to get it to talk to the printers, and it consistently messed up or lost orders, because the software was written so poorly by (most likely) the lowest bidder. In no way, however, did that change how well-suited Linux was for such a thing: if only the POS software had been 100% reliable, it honestly was the best tool for the job. Fancy graphic terminals wouldn't have enhanced usability, mice would only have been one more thing to fail in such a messy environment (keyboards had plastic keyboard condoms on them), and that environment tends to eat terminals anyway, so it's a good thing they were cheap boxes.
Chances are, in a stock market environment, they're still mostly using pure text terminals for displays: the old software that's likely in use hasn't changed for a very long time (why should it: it works, it's been tested, these things need to be carefully certified, and that's all expensive!), and it was likely written for Unix boxes way back when, so a simple rebuild was probably all that was needed.
Without that further information, you can't be sure exactly what the hardware setup is, and anyway, if they're all displaying distinct and separate information on each monitor, that'd be one heck of a video card solution for any single computer to drive, instead of using signal splitters.
Usually they are full high-res monitors (touch-screen oriented these days)… They are also multiple-desktop setups, one per monitor. You also have mixes of both (some people work and process faster with text lists).
They're not trying to run a multiplayer world with a thousand players and another thousand NPCs on those machines; you have one guy interacting with many monitors showing a set of predetermined data…
In the Apple case no one is interacting with it; it just blinks an app icon when the bean counter says it made a sale or download…
The question is why it needs 20 Mac Pro computers to run it.
http://cultofmac.com/app-store-hyperwall-wows-wwdc/11630
GNU/Linux takes away hardware restrictions? Cool!
So if I install Ubuntu my single-video-card, dual-monitor-output PC will be able to drive 20 displays? AWESOME
Sorry Tom K, but in 2009 I'll pass on replying to your lies that have no basis in reality.
Your comment is a complete non sequitur. The fact that the Mac Pro has only four suitable slots and the video cards available for the Mac can drive at most two monitors each says absolutely nothing about Mac OS X, let alone about its suitability for server or supercomputer applications.
At most one may claim that, since none of the multi-display video card vendors (like Matrox) provide support for the Mac, it says something about the Mac's versatility as a platform, but even that does not imply technical inferiority of Mac OS X.
In fact, my admittedly superficial knowledge of the OpenGL infrastructure at the core of Quartz, which is very modular, flexible and scalable, makes me believe that Mac OS X can handle an arbitrary number of displays (OK, there has to be a limit of some sort, memory or whatever, but it is surely greater than eight). I don't claim to be an authority on the subject, so if someone is more familiar with it, feel free to correct me.
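For what it's worth, the public Quartz Display Services API doesn't appear to bake in a small limit either. Here is a minimal sketch of my own, purely for illustration (the array size of 32 is an arbitrary choice), that just enumerates and prints whatever displays are attached:

#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

int main(void)
{
    /* 32 is an arbitrary upper bound chosen for this sketch. */
    CGDirectDisplayID displays[32];
    uint32_t count = 0;

    /* Ask the window server for all active displays. */
    if (CGGetActiveDisplayList(32, displays, &count) == kCGErrorSuccess) {
        for (uint32_t i = 0; i < count; i++)
            printf("Display %u: %zu x %zu pixels\n", i,
                   CGDisplayPixelsWide(displays[i]),
                   CGDisplayPixelsHigh(displays[i]));
    }
    return 0;
}

Whether the window server and the drivers actually cope with that many heads in practice is, of course, exactly the open question.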
Maybe they are, but not in this case – Moscone is a big place, it can take another 20 Macs; considering the power consumption of the hundreds of Macs in the labs and in the laps of developers, and also the power consumption of lighting, air conditioning, etc. these 20 machines are a drop in the ocean; this is Apple and Apple’s own computers – it probably is the least expensive part of WWDC.
To reiterate my original point, the number of machines driving this installation does not affect its coolness factor.
I guess I was just remembering a Windows XP box running Doom on eight screens. So anyway, at least two computers would be required to run this display. But, if I recall correctly, neither Vista nor 7 is capable of driving so many screens due to architectural limits.
This is probably not any kind of OS limitation for OS X. I'm just surprised to see 20 computers driving 20 screens. It seems like each computer should be able to drive at least two to four screens.
Actually, it's your original point that is the non sequitur: "the number of machines driving this installation does not affect its coolness factor."
I disagree. Since the same hardware, using another OS, could run the same display with the exact same effect from one computer, it must be a requirement of the OS in this case. Also, coolness without comparison is pretty much meaningless; search for "hyperwall" to see what I mean.
http://www.apple.com/macpro/specs.html
That would be a wrong fact on your part; I see five USB 2.0 ports, but that must just be my usual ignorance talking here.
OK… let's get back to reality: when Apple is able to ship the Moscone Center to any city, you might have a point. In other news, space and electricity are not cheap anywhere else either.
Besides, this is more informative for developers, even if outdated:
http://www.148apps.com/10000/
…than watching blinking lights and waves form around a map of apps that Apple says were downloaded in real time… Now, if that display were to appear in every Apple Store and be interactive, meaning you press on an app and get details about it, that would be a completely different thing.
Sorry, blinking lights and shapes don't impress me anymore.
“Non sequitur” doesn’t mean what you probably think it means. My point is not a conjecture or a conclusion, so it almost by definition cannot be a “non sequitur”. And more importantly, it is a direct response to richmassena’s statement, so in fact it very much follows the argument.
What exactly do you mean? When I search for “hyperwall” one of the results is this ( http://www.nas.nasa.gov/News/Releases/2008/06-25-08.html ) press release by NASA, which clearly states that the “hyperwall” is made up of 128 displays and is powered by 128 GPUs and 1024 CPU cores, or in other words 1 GPU and 8 CPU cores per display, which is exactly the same as Apple’s setup.
Using sarcasm effectively is an art form. And you demonstrate, like a tone-deaf person attempting karaoke, that if you are not good at it you can easily make a fool of yourself. In fact, I find your use of “my usual ignorance” oh so deliciously ironic.
And because I'm sure you still won't be able to get it, let me clarify: the four slots, not ports, I was referring to are the four PCI Express slots, you know, the thingamajigs video cards go into. Especially given my previous comment, I think that was quite obvious, and I'm mystified as to what made you go down the USB tangent.
I’m not sure what part of the expression “in this case” confuses you.
What mystifies me is your own hubris and self-importance, but that's OK, I am the monkey who knows what USB can do…
I’d say communication in general confuses him.
There are specialty multiplexing cards available that can drive far more than two displays.
You're right, and that makes my comment technically incorrect, but in fairness very, very few of those, if any at all, have the processing power or memory to drive 2,560 x 1,600 pixel displays like the 30″ Apple Cinema ones, and even if they do, they won't have the interface (usually DVI) bandwidth to actually do it. And that doesn't even take into account that Apple's installation is a GPU-intensive app.
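Rough numbers to back that up (my own arithmetic, not anything from Apple): a 2,560 x 1,600 panel at 60 Hz needs roughly 246 megapixels per second even before blanking overhead, which is already past the 165 MHz pixel clock of single-link DVI, hence the dual-link requirement of the 30″ Cinema Display.

#include <stdio.h>

int main(void)
{
    /* Active pixels per second for one 30" Cinema Display at 60 Hz. */
    double mpix_per_sec = 2560.0 * 1600.0 * 60.0 / 1e6;   /* ~245.8 */

    printf("%.1f Mpixels/s needed vs ~165 Mpixels/s for single-link DVI\n",
           mpix_per_sec);
    return 0;
}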
No, only five: there are several 4-port graphics cards on the market, e.g. the Nvidia Quadro NVS range.
Yes, that's a cost-effective means: Apple takes 20 systems and 20 monitors to WWDC.
Instead, they should take five systems and buy those cards, which I guarantee you aren't just lying around.
+1 (This was my thought too)
If it were about the "appearance of value", the bling and glamour alone, I wouldn't be a full-on Mac user today. That was reserved for good old Mac OS 9 on the bubble iMacs, and the argument would have been perfectly valid back then.
The thing is that most people have no clue where Mac OS X and many of its underlying technologies originally came from, what kind of company built them, and what the technology was designed to do: to provide a very serious computing platform for students, engineers and scientists.
People only see the bling, buy a MacBook, and along the way discover that things seem to work pretty darn well, and then hey, they haven't needed any support for it yet, and hey, maybe it was pretty good value after all, because they're not busy cleaning the MacBook of viruses.
This is what separates Apple from those who just make flashy gadgets with a million features that half-work, because making solid, working gadgets is apparently not a priority; selling as much junk as possible is. And so:
Ask yourself whether the product launch of the G1 has something to do with Google's track record of releasing an OS for gadgets: they have none.
And so far, developers haven't embraced Android as much as they should. They are going to have to brute-force their way to fame by getting Android onto a ton of devices, both handhelds and netbooks, and using phrases like "Google Wave on your phone". Well, maybe even that isn't enough.
Apple already had enormous momentum and attention at the iPhone unveiling because of the iPod. Remember the original unveiling of the iPod? Not very religious: a small group of journalists, and a lot of criticism afterwards for the iPod being underspecced and having fewer features than previous players.
The Palm Pre was hyped far more, because Palm has historically made very fine handhelds. People remember a good experience, and the release hype is usually connected to the history of the company. Palm's own recent track record is going to be its next challenge.
What I dislike is that you have to wade through the market for tech that works, because there is so much junk. The worst part is that you can't trust brand names anymore. HP making solid professional products? Forget it. Philips making great TV sets like they did in the 80s? Nope.
The day Apple begins to flounder in their ability to deliver solid products, I’m looking for something else.
While that may be true, Apple isn't the only company releasing high-quality goods, which is the point I was making earlier.
However I do see your point that once you have confidence in a brand, there is an understandable reluctance to stray elsewhere.
Honestly, how many of the 50000 apps totally suck?
_I_ think the added gain from 5000 to 50000 apps is quite small.
What do those perceived sucky apps have to do with Apple?
While the twenty 30″ monitors are massively impressive, a similar pixel workload could be handled by forty 17-19″ monitors of 1280×1600 standing on their sides: the same number of pixels, but twice as fragmented and a lot uglier for sure. Of course, only true cheapskates would do this.
thought experiment
So how many PCs would it take to drive 40 of those displays? Maybe only one, if you can use USB-VGA adapters and the displays are mostly static. If each download were restricted to just flashing an icon, it would be very doable, but it wouldn't look as cool. I'm not actually sure whether any OS can handle 40-odd virtual monitors, but a single graphics buffer of 10240 by 8000 pixels isn't too hard to draw on, and it's only about 320MB of pixel data, of which only a tiny fraction is painted every second.
Since there are 3000 downloads per minute, or 50 per second, that is not actually a lot of pixels to change, so a limited spreading wave may also be doable. I believe the USB-VGA adapters only have to transfer pixel changes, so the drawing part is doable, but getting the changes into the 40 USB channels would be the challenge. It might require a few PCI USB expander cards and hubs.
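A quick back-of-the-envelope check of those figures (my own arithmetic; the 128x128 icon size is just a guess for illustration):

#include <stdio.h>

int main(void)
{
    /* Combined framebuffer: 40 portrait 1280x1600 panels tiled into one
     * 10240 x 8000 surface, 4 bytes per pixel. */
    long long buffer_bytes = 10240LL * 8000LL * 4LL;        /* ~328 MB */

    /* Updates: 3000 downloads/minute, each flashing one icon.
     * The 128x128 icon size is an assumption, not a published figure. */
    double flashes_per_sec = 3000.0 / 60.0;                 /* 50/s    */
    double bytes_per_sec   = flashes_per_sec * 128 * 128 * 4;

    printf("framebuffer: %.0f MB\n", buffer_bytes / 1e6);
    printf("icon updates: %.0f/s, roughly %.1f MB/s of pixel changes\n",
           flashes_per_sec, bytes_per_sec / 1e6);
    return 0;
}

That works out to roughly 330MB for the whole buffer and only a few MB/s of changed pixels, which supports the point that the simple icon-flashing version isn't a heavy pixel-pushing workload; it's the full-screen ripple rendering that is.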
I still remember the similarly scaled CRT walls that SuperMac used to put up at the older MacWorlds. It probably won't be long before we see an ultra-thin OLED wall. I recently read that LCD TV panels will likely get very thin by using LED lighting through the edges.
Jeez… it doesn't take much to impress them, then.
Oh look, something shiny…
It’s great to see that Apple is focusing their iPhone development resources on giant, live-updated, video-wall icon grids. That’s much more sensible than wasting time adding features that were standard everywhere else 10 years ago.
I really don’t find this impressive or cool.
There are 20 computers, each with its own screen, each displaying a selection of icons. When an app gets downloaded, its icon flashes.
If you think that’s cool, please don’t play tic-tac-toe, you’ll get hypothermia.
http://www.apple.com/macmini/specs.html
The current Mac mini should be able to drive a 2560 x 1600 display. Did we really need 20 Mac Pros for a glorified “live” billboard?
I don't really understand why a single machine can't drive several displays. It's getting pretty common for people to use three displays on a single machine nowadays, and I've seen a few Linux boxes with up to 12 displays on a single machine. Is it a limitation of OS X?
At the very least, a Mac Pro should be able to handle two 30″ displays at full resolution with the right video card(s). If they had to go with OOTB configs, a Mac mini per display makes for a smaller carbon footprint.