“Over the last few months we have been hard at work getting Native Client ready to support the new Pepper plug-in interface. Native Client is an open source technology that allows you to build web applications that seamlessly and safely execute native compiled code inside the browser. Today, we’ve reached an important milestone in our efforts to make Native Client modules as portable and secure as JavaScript, by making available a first release of the revamped Native Client. […] In the coming months we will be adding APIs for 3D graphics, local file storage, WebSockets, peer-to-peer networking, and more. We’ll also be working on Dynamic Shared Objects (DSOs), a feature that will eventually allow us to provide Application Binary Interface (ABI) stability.”
I may be stating the obvious, but I think they’re trying to take Java’s place while making it truly open source, because it can be ANY language; then anyone can write a compiler for it and the byte-code interpreter.
Like .NET ?
Is .NET entirely open source? So I guess it’s just like Java. The fact that Java (the platform, not the language) runs more languages than that platform seems to be forgotten by most devs.
Like Jython?
What makes you think NaCl uses any sort of byte code interpreter?
There is a separate PNaCl project for creating an LLVM-bytecode interpreted version of NaCl, which might fit that description.
Let’s see, we have native code execution, 3D graphics, local file storage, peer-to-peer networking, and an ABI. This really is starting to look like a nearly full featured OS inside a web browser container.
So will we soon see a computer bootstrap into the barebones Chrome OS, which then launches the browser, and inside of it, the “real” OS?
Fascinating!
Until the internet connection starts to fail, that is…
However, that’s already a problem with vanilla ChromeOS
Yeah, that’s my biggest issue with this “online OS” trend too. I think it’s all a bit premature until we achieve a state of ubiquitous wireless network connectivity that is as robust and failsafe as the current physical interconnects on the motherboards and processors of today. Not to mention the bandwidth required for full network only computing.
I have a feeling that is a long way off.
I keep hearing this, but you don’t need an internet connection for HTML apps after the initial download.
You have localStorage, sessionStorage, IndexedDB and, more importantly, the HTML5 app cache and offline API:
http://diveintohtml5.org/offline.html
http://blog.bitrzr.com/2010/10/html5-offline-webapps-practical-exam…
And there are things like the Canvas API, which allows you to manipulate images in the browser without uploading them to the server. You can choose an image from disk and, instead of uploading it to the server, manipulate it in the browser first. Maybe even just save it locally:
https://developer.mozilla.org/En/HTML/Canvas/Pixel_manipulation_with…
Like the red-eye removal demo:
http://disruptive-innovations.com/zoo/demos/eyes.html
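The pixel-manipulation API boils down to a flat RGBA array (four bytes per pixel) that you get from `ctx.getImageData(...).data` and write back with `ctx.putImageData(...)`. Here is a rough sketch of the kind of per-pixel loop a demo like that runs; the helper name and the grayscale operation are my own illustration, not the demo’s actual code:

```javascript
// Desaturate a flat RGBA pixel buffer in place.
// In a browser, `rgba` would come from ctx.getImageData(0, 0, w, h).data;
// here it is just a Uint8ClampedArray, so the logic runs anywhere.
function toGrayscale(rgba) {
  for (let i = 0; i < rgba.length; i += 4) {
    // Luminance-weighted average (Rec. 601 coefficients).
    const y = Math.round(
      0.299 * rgba[i] + 0.587 * rgba[i + 1] + 0.114 * rgba[i + 2]
    );
    rgba[i] = rgba[i + 1] = rgba[i + 2] = y; // leave alpha (i + 3) alone
  }
  return rgba;
}

// A 2-pixel buffer: one pure red pixel, one white pixel.
const pixels = new Uint8ClampedArray([255, 0, 0, 255, 255, 255, 255, 255]);
toGrayscale(pixels);
```

Saving the result locally is then just `canvas.toDataURL()` handed to a download link, with no server round-trip involved.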
Or how about a webmail site that caches all your emails locally, so you can read them when there is no connection, and even saves emails to be sent later.
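That webmail idea is easy to sketch. Below, `Outbox` queues messages in anything with the localStorage `getItem`/`setItem` interface and flushes them once a sender function succeeds; the class, the storage stub, and the key name are all made up for illustration (in a browser you would pass `window.localStorage` instead of the stub):

```javascript
// Queue outgoing messages while offline, flush them when back online.
// `storage` is anything exposing getItem/setItem, like localStorage.
class Outbox {
  constructor(storage, key = "outbox") {
    this.storage = storage;
    this.key = key;
  }
  load() {
    return JSON.parse(this.storage.getItem(this.key) || "[]");
  }
  queue(message) {
    const pending = this.load();
    pending.push(message);
    this.storage.setItem(this.key, JSON.stringify(pending));
  }
  // `send` is an async function that delivers one message; anything that
  // throws stays queued for the next attempt. Returns the number sent.
  async flush(send) {
    const pending = this.load();
    const failed = [];
    for (const msg of pending) {
      try { await send(msg); } catch (e) { failed.push(msg); }
    }
    this.storage.setItem(this.key, JSON.stringify(failed));
    return pending.length - failed.length;
  }
}

// Minimal in-memory stand-in for localStorage, for environments without it.
const memoryStorage = (() => {
  const data = new Map();
  return {
    getItem: (k) => (data.has(k) ? data.get(k) : null),
    setItem: (k, v) => data.set(k, String(v)),
  };
})();
```

A real client would call `flush` from an `online` event handler and keep whatever fails queued for the next attempt, which the loop above already does.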
On the other hand, having everything synced live is also nice, in case the “local” hardware fails…
There is an interesting article today in the French newspaper “Le Monde” about how the video website Bambuser was the favorite way to share videos among Egyptians during the revolution: with this website, you stream live video (and the website keeps it for future replays), which means that even if the police catch you, take away your cellphone and destroy it, the video is already safe on a server…
This is an interesting debate. Sure, you’re right that in this case, having the live video stream synced online in real time was the best way to keep the video safe.
On the other hand, while it works perfectly well for videos with low popularity, I think “breaking” news should not be distributed this way if the guy cares at all about his life. Considering how easy it is to locate a cellphone geographically nowadays, including when the guy has gone back home and forgotten to turn it off because he feels safe, things like sharing the video via Freenet later, from a safe place and taking all the time it needs, would be a much better idea.
Technically, Native Client doesn’t require a browser – it’s basically just a runtime environment.
I’m looking forward to a NaCl version of Flash, Java, Adobe Reader, etc. – because not only will they be effectively sandboxed, and prevented from executing malicious code, but they will be compiled for a “cross platform” environment effectively allowing them to run on any OS with a NaCl port.
On the other hand, it is my understanding that they must be recompiled for each processor target they will run on. But there are generally fewer of those than there are OS platforms these days.
That does sound promising.
Yeah, with the near-death of PPC I would say x86/x64 and ARM are pretty much the only major hardware platforms left. Speaking of ARM, this may be something the hardware hacking crowd will want to see come to fruition, as many of the microprocessor-based development boards out there use the ARM platform.
I don’t know a lot about processors, but would a Google TV with a MIPS processor be possible? Or a Loongson laptop/netbook?
That would mean another target.
ARM isn’t one target. It’s many: different versions of the instruction set, with and without different extensions. That’s why open source is so great; this doesn’t matter. x86 needs the backwards compatibility because it’s a closed platform, where things can’t be recompiled. ARM has never needed to be static like this, not since it left Acorn behind. Each device can have the ARM variant it requires, with code compiled for that. In the open source world this doesn’t matter either: a repository is just compiled against the target for that device/family.
I still don’t get what the big deal is about online apps vs. a repository. Locally installed apps, updated from the “mainframe”: isn’t this what locally installable cloud apps are? Non-locally-installed cloud apps are useless because you won’t always have a connection, let alone a good one. So actually, I’m quite happy as is…
Well, x86, x86-64, MMX, 3DNow!, SSE (1, 2, 3, 4, 5), AVX: x86, while having fewer variants than ARM, isn’t really one target either.
The reason ARM doesn’t really need the backward compatibility isn’t open source; it’s that embedded software is seldom updated.
There is no “real” OS unless we make it so by favoring the “remote” OS over the “local” one. Otherwise, you have the best of both worlds: a local OS and a remote, thin-client, sandboxed OS (what should I call it?).
Going further along that line of thought, the next step would be blurring the line between the two. The only OS I know of moving toward that kind of future is DragonflyBSD (which is why I keep an eye on it).
But maybe it won’t be the case: maybe the thin client will be so powerful, versatile, secure and ubiquitous that nobody will care anymore about the local OS (ChromeOS-style, your “new paradigm”). Or maybe, the barrier between local and remote will be kept for security and privacy reasons.
“the next step would be blurring the line between the two. The only OS I know walking along that kind of future is DragonflyBSD”
Plan9 also comes to mind.
To me it sounds a bit like AIR applications running inside a browser. I’m using a number of AIR-based applications, and they’re a joy to use compared to the horror that is Java (which will hopefully be removed in Lion, thus making it an optional extra).
What I do hope, however, is that NaCl doesn’t become a browser-specific technology, but something one can add to any browser that supports the NPAPI Pepper extensions. In the case of WebKit, the NPAPI implementation has Pepper extensions that link back to Core Audio/Core Animation/etc. (only available in Safari running on 10.6).
I don’t think they’re ‘aiming’ for a complete OS, but rather for something that fills in the gap between native desktop and web-based applications. As for concerns regarding internet connectivity, the application will most likely be saved somewhere in the browser directory, in much the same way that AIR applications are saved, and can be launched without an internet connection.
I’ve said it before, but NaCl sounds like a reinvention of Java applets, *without* the hardware platform independence. What exactly is the benefit again, aside from being free of Oracle’s patents?
OK, after looking at their website I did come up with a reason why Native Client is useful compared to Java. Simply, Java support is missing from the current/next generation of mobile devices. Flash is present on most of them, but Flash isn’t too amazing for a lot of use cases (anything multi-threaded or involving advanced 3D graphics or sound APIs). Most complex games written today that are meant to be cross-platform for mobile devices are written in C/C++ using OpenGL ES, so Native Client makes it easier to port said games/apps to new platforms such as ChromeOS.
The question is, exactly how much porting work needs to be done in order to develop, say, a cross-platform MMORPG that runs on both Android tablets and ChromeOS netbooks? Obviously it will be possible somehow, but I can’t shake the feeling that this great movement towards “native everything” we’re experiencing right now is just a huge mess of fragmentation for the developer compared to the nice, unified model that is the Java platform (Standard Edition)….
Btw, if you don’t think excellent-looking 3D games can be developed in Java, take a look at some of the following:
http://wurmonline.com/forum/index.php?PHPSESSID=bd3eb0eeb3a1603eeae…
http://jmonkeyengine.com/showcase/screenshots/
http://bytonic.de/html/screenshots.html
http://www.minecraft.net/
(OK admittedly Minecraft isn’t great-looking, but I threw it in there anyway since it’s something everyone knows.)
Really did yourself a disservice there
Minecraft runs like absolute crap on most of my machines. I guess this is because I don’t have a high-end graphics card. Considering the level of 3d-ness Minecraft uses, it seems like Java sure does make a crappy 3d platform here.
Also, it’s windows-only.
I don’t know about running like crap, but it’s definitely available for Mac and Linux.
http://www.minecraft.net/download.jsp
Damn… you’re right, and I completely missed that when I went to download my copy a while back.
*falls on sword*
As the inventor of Javascript has said:
“Javascript is not done getting faster”
“There is still more headroom to speed up Javascript”
With JavaScript JITs getting better, I wonder when the speed-ups will slow down. The last big jump was Crankshaft in December, from Google’s V8, which was a jump of 50% in some benchmarks. That is still a huge improvement.
Currently we have 3 of the 5 browser vendors with open source code, and 5 competing browser vendors all trying to be faster than each other. And learning from each other.
I don’t know if the other browser vendors will adopt Native Client though, otherwise I don’t know how useful it will be.
Still, JS (especially JITted JS) is a terrible memory hog.
For swapless mobiles it’s a no-go.
Have to say, as long as you’re not really doing engine stuff, any language is fine. For many years, many games have had the actual game part done in a scripting language. What many people don’t take into account is that a great engine can look like crap with crap artwork, and a crap engine with great artwork can look great. So concentrate on making life as easy for the artists as possible; the engine just has to be good enough. For the last decade and a half, most of the real work has been done by the graphics hardware anyway. So actually, if WebGL provides a bit more than just OpenGL (add game-engine functionality), games could be done quite happily there. Then it just becomes about good artwork and gameplay.
I understand your goodwill in making people believe that Java is a fine platform for 3D games, but if you’re going to provide us with evidence, please show us games that have graphics on par with games from 2007, like, say, Crysis, not games that look like Quake 2 (which even had a PSOne port…). Unfortunately, the list you provided is hardly impressive in 2011, and especially not “excellent-looking”.
You cannot be serious. Crysis was an absolute pinnacle of graphics when it came out and nothing was close to it. You know that even with current games on current hardware, it isn’t common to find the graphical quality that Crysis pioneered in 2007 and these games aren’t in any interpreted or “managed” code either. Remember the meme “but can it run Crysis?”… it has been around for years.
Asking for Crysis-like quality from Java games is… well, weird if I can put it that way. The Java proponents can say whatever they want, but in terms of performance, I will never believe that interpreted code can run as fast as native code. Would you buy a Java game that would pretend to be equivalent to Crysis? I wouldn’t.
OK, first of all, Java is *not interpreted*. It hasn’t been since Java 1.1, which was replaced with Java 2 *over 12 years ago*!!!! How many times does this have to be rehashed before it finally sinks in???
whew!
OK, second of all, even though it’s true that “non-native” code runs slightly more slowly than “native” code, generally it runs fast enough that, assuming the program is well-written, on modern hardware you should not notice a difference. More importantly for games, it is all hardware-accelerated via DirectX or OpenGL, so you *really* should not notice much difference in performance given a decent graphics card. The only weakness Java has in the area of 3D is that it uses its own API, which of course lacks a lot of the new whiz-bang features of the newest DirectX or OpenGL APIs. But for gaming on mobile devices or within the browser, it is still absolutely competitive, because in that case it is going up against the similarly feature-limited OpenGL ES.
So you’re saying that Java is non-native code and it’s not interpreted? Wow! What kind of logic is that? Having Java byte code JIT’ed doesn’t make it non-interpreted; otherwise it would be compiled, which it isn’t.
Unless a third option other than “compiled” or “interpreted” has been found just for Java, it is not different from any scripting, VM’ed, or compiled-to-intermediate-code language.
Of course, you’re not saying that JavaScript, which is now just as JIT’ed as Java and needs a host program just like Java, is not interpreted… or are you?
Yes. Please get your facts straight before going off on a rant. Java is *compiled* to Java byte code. It is a form of machine code that is run on a virtual machine. It is *not* interpreted.
http://en.wikipedia.org/wiki/Just-in-time_compilation
As for JavaScript, yes it is interpreted despite the JIT engine that it’s running on, because I define interpreted to mean “there is no binary”. With Java there is a binary that needs to be created prior to execution, which at least historically (and in part due to the static typing of the language) has meant greater performance compared to interpreted languages. If JavaScript one day truly is able to best Java in performance, then I agree that the distinction will have become more or less meaningless.
I wrote:
which means I know the “facts”. I know about Java, that’s what I use in my professional life. But in a previous life, I was a researcher and academic, so I also happen to know a few things about theory and my judgment is not clouded by a common abuse of language.
If code, whether text or binary, is not machine code, it has to be converted to native machine code before being executed by the CPU. “Interpretation” is the name of that conversion operation when it happens at run-time, which is exactly what the Java VM does and is for. The input format for that operation, whether plain text or binary, is irrelevant: it still is interpretation. Java is interpreted and the “fact” that javac is “the Java compiler” is also irrelevant.
True, strict, compilation is converting anything, text or binary, into native machine code prior to runtime. Any other operation is an interpretation. In other words, compilation is a specific kind of interpretation where the target code is machine code. I can’t even believe someone is arguing against that when obviously, Sun and now Oracle agree with me: http://download.oracle.com/javase/tutorial/getStarted/intro/definit…
On the other side, if you so wished, you could “compile” Java to C++ or C to ASM. That’s a text to text “compilation” that doesn’t mean much except prove a concept. Which is also why nobody bothers to do it. However, to have it run on a CPU, you would need another compiler for the output C++ or ASM code that would then compile to native code. That’s two “compilers” chained with only one doing a compilation.
I give up and leave you with your certitudes.
Hi.
First off, let me apologize for my accusation that you don’t know what you’re talking about.
We are just arguing about terminology here. And according to common usage on just about every page that talks about Java, including Wikipedia and Oracle’s own literature, Java is not “interpreted”. The word “interpreted” is usually reserved to mean “translated from a high-level language into machine code at runtime”. That’s just common usage. You can argue that *technically* it is a kind of interpretation when you are converting bytecode to machine code, and you would be right, but IMHO by actually using the term “interpreted” in a casual context, you are just muddying up the definitions of these terms *as they are commonly used*.
It’s just like when people talk about tomatoes being vegetables, and then some pedant comes and reminds them “technically, tomatoes are a fruit”. Common usage does not always reflect technical truth. Except I’d say that even the *technical* definition of the word “interpreted” seems to be much less clear-cut than that of the word “vegetable”, as the disparity between your usage and that of the Java literature illustrates.
Welcome back ActiveX!
Hardly.
NaCl removes the ability for native code to execute outside of the sandbox. It can only interact with the NaCl interfaces provided, and other NaCl libraries/apps that are available to it.
Furthermore, NaCl is cross-platform capable so that it can essentially be ported to any operating system.
Perhaps you should read up on the methods used in NaCl before making such a broad lame statement.
Ah, but don’t jump to conclusions either; he still has a point.
A large part of the problem with ActiveX was because it is alien to the browser. Flash is an ActiveX plugin too.
NaCl might take away some of the pains associated with the Windows-only nature of ActiveX, but it doesn’t remove the issue of technology like this usurping the natural web.
Mozilla have come out against NaCl and will not be including it: http://www.theregister.co.uk/2010/06/24/jay_sullivan_on_firefox/
Google are just looking to move more stuff to the web in order to line their own pockets; they are not interested in the long-term openness and freedom of the web. NaCl makes some things possible on the one hand (high-end games), but at the same time provides new ways for companies to create closed experiences that do not complement the web. Flash all over again, basically.
The Web is about View->Source; that’s what keeps it open.
Agreed. Mozilla advocates browser-based technologies like WebGL etc. instead of a plugin-style approach.
I would argue that the flaw with Flash has less to do with the fact that you can’t go ‘view source’ and more to do with the fact that it is a closed source implementation where there is no ability for a consortium of companies working on a single open source code base to fix up long standing issues with said technology.
Compare NaCl to Flash: NaCl is completely open source, and the NPAPI Pepper extensions are fully open source as well. What does that mean for an end user? It means that if there are issues, I’m not dependent on one single company to actually fix the problem, as in the case of Flash. If the Flash plugin were open source, do you really think the long-standing issues would have persisted, given that RedHat/Novell/Apple/etc. would have been able to fix their platform-specific issues?
You may talk about openness, but the majority of people really don’t care: what they care about is the end result and whether it is desirable. The issues with Flash had less to do with “openness” in the sense of being able to peek into raw source code, and everything to do with the plugin being closed source and everyone being dependent on one single company to fix issues.
Thanks Kroc,
that is what I meant.
NaCl is not different (with minor details) from:
Java
Flash (with or without Alchemy)
.Net
ActiveX
Just because it is being proposed by Google we have to welcome it with open arms?
The Web is to be made of open standards.
As for things like NaCl, there is already a solution. It is called desktop applications.
Maybe I am too old, but I still don’t get the browser as an OS.
>I don’t get browser as the os
Google as MS, now you get it?
View source might help you learn JS and all, but once an application reaches a certain complexity it doesn’t help you anymore.
Do you read the complete code of gmail, foursquare, wave? I think any JS that is generated and/or compressed is not consumable by humans.
The same will be true for PNaCl.
There is only a slight difference between Eclipse compiling your code down to JS or to LLVM bitcode.
But it can still be queried by machines. It can be probed, debugged, logged, analyzed. It cannot permanently hide its secrets.
All of these things are true of flash, with the right tool. If you give me a SWF file, and some compressed, obfuscated JavaScript code, I could find out a whole lot more about the SWF. In fact, Adobe provide a free tool to convert a SWF file into a SWFX file, which is basically an XML representation of the SWF. I don’t see why people always point to Flash as an example of big-bad-evil-closed-technology. It’s simply not true. The Flash player itself might be a piece of crap, but that’s a different argument.
We already know JavaScript is not the reason we don’t have high-end games in the browser. It is mostly the graphics side of it. This is “just a webpage”:
http://www.youtube.com/watch?v=rSSf_umjOgU
It is fully dynamic, with mixed content from Twitter, Flickr and embedded video. It also analyzes the audio in JavaScript to show the ‘bars’ on the buildings.
It was built in a few weeks by people who didn’t know HTML or OpenGL.
Even though a lot of improvements have been made with hardware acceleration there is still a lot that can be improved. Because not everyone will have anywhere near the same performance yet.
I guess it is now at the point of what can be done with DirectX through ActiveX in the browser (something which not many people utilize for in-browser games, for obvious reasons).
There’s no reason why a browser vendor couldn’t implement View->Source functionality for Flash. Adobe have released the information that would allow one to examine the contents of a SWF:
http://www.adobe.com/content/dam/Adobe/en/devnet/actionscript/artic…
I know NaCl.
It is just another browser plugin framework, regardless of how safe it may be.
The web is to be made of open standards, not browser plugins.
The place for things that NaCl tries to do, is called desktop applications.
I read a comment on OSNews discussing buffer overflow vulnerabilities in Windows. I was intrigued and wondered why this has not been solved… yet.
I dug around and found the following tool.
EMET
http://www.microsoft.com/downloads/en/confirmation.aspx?FamilyID=c6…
It just baffles me why this is still an optional download and not included as standard with Windows.
It looks, though, as if most of the optional Windows-enhancing tools from Microsoft have to do with third-party tools that might cry foul if they were included as standard.
Like MSE, for instance, or Windows Live Mesh etc.
That and lazy developers.
Interesting, thanks for the link.
This is a better post on the topic, and it includes a PDF explaining what it is and does.
http://blogs.technet.com/b/srd/archive/2010/09/02/enhanced-mitigati…
Reading through the PDF, it looks like MS is already moving towards this. Several of the capabilities are already in Windows Vista and 7, and this tool backports several of the features to XP. Along with providing a GUI for existing features.
I’ve heard that stuff has to be up to a certain level to be included in Windows, and that level is kind of a pain. It’s easier for teams to develop tools and release them independently.
A security tool downloader similar to Web Matrix would be nice.
This reminds me of SELinux. It’s a great server tool where everything is set and forget, but SELinux is tough on the desktop where stuff is more in flux.
I found the following remark rather humorous:
Secure as Javascript? Well, there’s a reason why some folks browse the web with the NoScript extension …
http://www.google.com/search?q=javascript+vunerability&ie=utf-8&oe=…
You’re confusing implementation with the technology itself. Are there a number of shoddy JavaScript implementations more focused on speed than security? Sure, but that alone doesn’t prove JavaScript is inherently insecure.
NaCl may address the OS dependence issues for a lot of applications. This has to worry MS: if developers move to NaCl instead of Win32, MS is in trouble.
ActiveX and NaCl are very much related ideas; NaCl is very much ActiveX done right. The biggest issue with ActiveX was that from an ActiveX control you could reach out and do anything to the system.
One of the big things overlooked when people bring up the internet connection issue is that NaCl, like a lot of the new HTML5 stack, is going to be able to use offline storage when the network connection is broken.
Really, Linux distribution makers, sit up and please take notice. Whether you like it or not, universal binaries that are OS-neutral, with all the speed of running on the raw CPU, are coming. NaCl tech also includes auto-updating in its design (yes, largely outside the user’s control).
They also need to wake up and see the project to allow Android applications to run on everything. Yes, the second wave of Android tech.
NaCl will stay away from my computers.
Sorry, but *individuals* don’t matter: if the majority of users are pleased with the result they will install it (like they did with Flash), and then if you want to access some of the content that comes exclusively with this technology, you’ll *have* to install it.
Do you think that I like Flash?
No, but I install it (disabled by default) because sometimes I need it.
I think MS is already worried about Google and the web.
How many applications do you run natively on your desktop instead of using something on the web?
Personally, I only use a programmer’s editor, git, an SSH client and a mail client. I guess a lot of people use games and office software.
I can already replace the mail client and IDE:
http://www.cloud9ide.com/ http://ace.ajax.org/
I think I’ve seen SSH-clients on the web as well.
Google Docs and similar office suites already exist, and casual games can already be created with web technology. Casual games are now close to 50% of the market, and 2D and simple 3D games built with web technology are already possible:
http://developers.facebook.com/blog/post/460
I don’t think you need more than 30 frames per second for most games.
Then the network goes down for whatever reason and you’ve got a brick.
No thanks, I like to be able to use my computer offline when there is no special reason to go online.
It’s not like I have flat-rate access to the Internet everywhere I go with my computer.
See my other comment:
http://www.osnews.com/permalink?463409
If the application has been built right, you only need to download the web application once.
And who said you had to run your server somewhere far away? How about a plug server?
Browsers are for viewing documents online.
If I want applications I use an operating system.
There is zero difference between what is being sold as the Internet nowadays and what some of us were doing with client-server teletypes 20 years ago.
Now the teletypes just look prettier.
I guess I run my own teletypes.
I also set them up and run them for others.
And I’m fine with that. I like how they get updated over the web.
It is kind of like a Debian server. Potentially it should be the same for an Ubuntu desktop, although that still seems to be a bit hit-or-miss for some people. Not so much on the update side, but upgrades still seem to be a bit of a problem.
I just tried http://code.google.com/p/shellinabox/ and it is amazing. It works really well. I could just use mutt instead of webmail if I wanted to.
Want a tricky answer, aside from what you already said?
A bittorrent client
I also run a mail client, a multimedia player, and a backup application. All those services I could have in the cloud nowadays, but with more glitches (like multimedia playback randomly freezing) and less control over my data, so… I know what my choice is.
Just a question: how come an ELF executable (compiled in a Linux dev environment) will work in a COFF client environment (Windows), and the reverse?