"When you think about Silicon Valley you think about modern tech giants like Google, Yahoo and others, but did you know that this high-tech center of the Universe owes its existence to secret government cooperation dating back decades? It certainly seems outlandish, but in a seminar given back on December 18th on the Google campus, entrepreneur and lecturer Steve Blank explained how the valley was born from billions of dollars' worth of signals intelligence contracts from World War II and into the 1960s."
Geek stuff Archive
Geeks of the world rejoice: Futurama is back. After a long wait, today is the official release date of "Bender's Big Score", a 90-minute direct-to-DVD film that will later be chopped up into the first four episodes of the new season. Filled with geek and nerd references, Futurama is a popular American cartoon series by Simpsons creator Matt Groening that sadly got cancelled way before its time by the Fox Network. Now it's back, and the opening sequence, a big slap in the face of the Fox Network executives, is already a classic. I have already seen the new film, and as a big, big Futurama fan, I can tell you it will not let you down. Three more direct-to-DVD movies are already planned, so go out, buy the DVD, and let's get it back on a regular schedule, just like Family Guy! My personal favourites: Zapp (s03e02): "That young man fills me with hope and some other emotions that are weird and deeply confusing to me." And of course the classic sign over Amy Wong's parents' house (s04e06): "You came to the Wong place."
Technologists and investors gather at the two-day Singularity Summit in San Francisco to discuss the benefits and risks of advancing artificial intelligence--and what to do in the event that machines one day out-think humans.
A team from the Massachusetts Institute of Technology lit a 60-watt light bulb from a power source two meters away, with no physical connection between the source and the appliance. The 'WiTricity' device - the term coined by the MIT team to describe the wireless power phenomenon - uses magnetic fields to deliver power to gadgets remotely.
"Ever since the remote control's co-inventor Robert Adler passed away, I've wanted to own one of the first remote controls. After trolling through eBay every now and then, I finally have in my hands a piece of the history of the button. But there's a mystery: which piece?" Entertainingly written piece on the world's first remote controls. I totally enjoy the simplicity of the ultrasonic remote: "All these early remotes are purely mechanical. No batteries at all. When you push the button, a small hammer strikes an aluminum rod, triggering a sound above our hearing range that's picked up by the TV. Each rod is a different length, thus a different frequency, thus distinguishable by the TV." Brilliant.
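The receiver side of that rod-per-frequency scheme amounts to nearest-frequency matching. Here is a minimal sketch of such a decoder; the frequencies and command names are made up for illustration, not the actual Zenith values:

```python
# Toy decoder for an ultrasonic remote: each aluminum rod rings at its own
# frequency, and the TV maps the detected tone to a command.
# NOTE: these frequencies and commands are illustrative assumptions.
ROD_COMMANDS = {
    38_000: "channel_up",
    39_000: "channel_down",
    40_000: "mute",
    41_000: "power",
}

def decode(detected_hz, tolerance_hz=300):
    """Return the command whose rod frequency is closest to the detected
    tone, or None if nothing rings within the tolerance."""
    freq, command = min(ROD_COMMANDS.items(),
                        key=lambda item: abs(item[0] - detected_hz))
    return command if abs(freq - detected_hz) <= tolerance_hz else None

print(decode(39_120))  # a slightly off-pitch strike still resolves: channel_down
print(decode(20_000))  # audible-range noise is rejected: None
```

The tolerance window is what makes the purely mechanical scheme robust: the rods only need to be far enough apart in pitch that a detuned strike never lands closer to a neighboring rod's frequency.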
Vernor Vinge, 62, is a pioneer in artificial intelligence who, in a recent interview, warned about the risks and opportunities that an electronic super-intelligence would offer mankind. Vinge, a retired San Diego State University professor of mathematics, is a computer scientist and science fiction author. He is well known for his 1993 manifesto, "The Coming Technological Singularity", in which he argues that exponential growth in technology will reach a point beyond which the consequences are unknown. Vinge still believes in this future, which he thinks could arrive anytime after 2020.
Remember the 'Minority Report' scenes in which Tom Cruise and others used their hands to manipulate data on giant computer screens? One man is on a mission to make that gestural interface technology commonplace on every desktop.
Augmented Reality is the overlaying of digital information onto the physical environment. Sci-fi has often portrayed A.R. as interactive floating transparent computer screens projected into the air, or, in its most extreme form, standing inside an entirely computer-generated world.
New computer software applications--in the labs and on the market--are using emotion as data input and responding to it. This CIO.com feature looks at current applications that focus on human emotion, and looks ahead to work being done on that subject in computer labs around the world.
Half of our readers are away for the holiday season, so traffic and news items are considerably down compared to normal weekdays. Why don't we have some Holiday Fun (TM) with a poll? Our friends at Slashdot put up a poll asking about your favorite sci-fi TV series, but as their readers quickly pointed out afterwards, they forgot two very important entries: the most famous TV series of the '90s, "The X-Files", and the already cult classic "Firefly". So we thought we'd recreate the same poll, but with these options in play, just to see what our (mostly geek) readership likes the most. Even if we only have ~1/10 of Slashdot's traffic, we can still have some fun with it!
"In this article, see how HAL 9000, the computer in the 2001: A Space Odyssey movie, the smartest believable artificial intelligence so far in fiction, could predict equipment failure, answer personal questions, learn to sing 'Bicycle Built for Two', and go insane, based on IBM Build to Manage Toolkit components. By the end of this article, you'll see how autonomic computing can be implemented today; determine if there is such a thing as a Hofstadter-Moebius loop in programming; and discover if HAL stands for Heuristic ALgorithmic computer, Heuristic Autonomic Learner, or is simply the first three letters of a prankster holiday that occurs about this time of the year."
In the latest study conducted by the Pew Internet and American Life Project, over 700 technology experts were asked to evaluate an assortment of scenarios in an attempt to determine potential trends for the year 2020. With responses from representatives of the World Wide Web Consortium, ICANN, the Association of Internet Researchers, and major corporations like Google and IBM, the report reflects the perceptions of "Internet pioneers," more than half of whom "were online before 1993."
Physicists at the U.S. Department of Energy's Argonne National Laboratory have devised a potentially groundbreaking theory demonstrating how to control the spin of particles without using superconducting magnets - a development that could advance the field of spintronics and bring scientists a step closer to quantum computing.
"Imagine, if you will, a world where your ideas and perhaps even your own creative works became part of the OS of tomorrow. Consider the obvious advantages of an operating system that actually adapted to the needs of its users instead of the other way around. Not only is there no such OS, the very idea goes against much of what we are currently seeing in the OS options on the market."
Talking to your computer has been a staple of science fiction since at least the 1960s, but it looks as if it's finally coming within reach. This week saw the release of the first speech recognition software capable of handling continuous speech without the user having to train it in advance, namely Nuance's Dragon NaturallySpeaking (DNS) version 9. For anyone who tried IBM ViaVoice or Dragon Dictate a few years ago, found it awkward to get the system used to your voice, and even more awkward to speak in a staccato word-by-word fashion, this is a huge leap forward.
"Artificial intelligence is 50 years old this summer, and while computers can beat the world's best chess players, we still can't get them to think like a 4-year-old. This week in Boston, some of the field's leading practitioners are gathering to examine this most ambitious of computer research fields, which at once has managed to exceed, and fall short of, our grandest expectations."
OpenCyc is the open source version of the Cyc technology, the world's largest and most complete general knowledge base and commonsense reasoning engine. OpenCyc can be used as the basis for a wide variety of intelligent applications. This is release 1.0 of OpenCyc, featuring the complete Cyc ontology of over 260,000 terms and over 1.8 million definitional assertions. OpenCyc requires about 500 MB of disk space and performs best with more than 512 MB of RAM; 1 GB of RAM is recommended when Cyc is accessed by Java applications.
Computers of the future could be controlled by eye movements, rather than a mouse or keyboard. Scientists at Imperial College, London, are working on eye-tracking technology that analyses the way we look at things. The team is trying to gain an insight into visual knowledge - the way we see objects and translate that information into actions.
Since all you boys and girls watch Star Trek: "Physicists Nicolae Nicorovici from the University of Sydney, Australia, and Graeme Milton, from the University of Utah, have proposed that devices called superlenses could be used to create a type of cloaking device. Using a principle called 'anomalous localized resonance', superlenses placed very close to a small object could mask its reflected light waves by resonating at the same frequency, much like how noise-canceling headphones mask sound waves by creating a sound that is at the same frequency but inverted in phase."
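The noise-canceling analogy in that quote is just destructive interference: a wave summed with a same-frequency, phase-inverted copy of itself cancels everywhere. A minimal sketch of that cancellation (using an arbitrary 440 Hz test tone; the superlens itself of course operates on reflected light, not audio samples):

```python
import math

# Destructive interference: wave + phase-inverted wave ~= 0 at every sample.
FREQ = 440.0         # Hz, an arbitrary test tone
SAMPLE_RATE = 8000   # samples per second

t = [n / SAMPLE_RATE for n in range(SAMPLE_RATE)]               # one second
wave = [math.sin(2 * math.pi * FREQ * x) for x in t]            # original
anti = [math.sin(2 * math.pi * FREQ * x + math.pi) for x in t]  # inverted phase

# Largest residual amplitude after summing the two waves.
residual = max(abs(a + b) for a, b in zip(wave, anti))
print(residual)  # ~0 (floating-point noise only): the waves cancel
```

The key constraint, for headphones and superlens alike, is that cancellation only works when the canceling wave matches the original in frequency and is held exactly half a cycle out of phase.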
OK, this one is just plain scary. "By combining quantum computation and quantum interrogation, scientists at the University of Illinois at Urbana-Champaign have found an exotic way of determining an answer to an algorithm - without ever running the algorithm. Using an optical-based quantum computer, a research team led by physicist Paul Kwiat has presented the first demonstration of 'counterfactual computation', inferring information about an answer, even though the computer did not run." The research team published their results in Nature.