A good occasion to watch Gary Bernhardt's talk "The Birth & Death of JavaScript" [0] again, where he talks, in a really humorous way, about WebAssembly's precursor asm.js and the implications it "could" have in the future. A few years old but still relevant.
[0] https://www.destroyallsoftware.com/talks/the-birth-and-death...
You want Gimp for Windows running in Firefox for Linux running in Chrome for Mac? Yeah sure.
And for a somewhat more practical but at the same time more exotic example, the Internet Archive has a ton of old minicomputers and arcade games running in MESS/MAME, each compiled to webasm. One click and you can boot anything and play it in your browser. https://archive.org/details/softwarelibrary
Are you sure that's actually using WASM? It sounds to me like it's currently using ASM.js compiled via Emscripten. (Though in theory there's no reason why it _couldn't_ be WASM, since Emscripten supports WASM as a compiler target.)
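For what it's worth, the way loaders usually distinguish the two is plain feature detection: prefer WebAssembly if the API exists, otherwise pull in the asm.js build. A rough sketch of that pattern (the file names and the empty-ish import object are made up for illustration):

    // Prefer a .wasm build if the browser supports it, else fall back to asm.js.
    function loadEmulator(imports) {
      if (typeof WebAssembly === 'object' &&
          typeof WebAssembly.instantiate === 'function') {
        return fetch('mame.wasm')                      // hypothetical file name
          .then(function (resp) { return resp.arrayBuffer(); })
          .then(function (bytes) { return WebAssembly.instantiate(bytes, imports); })
          .then(function (result) { return result.instance; });
      }
      // asm.js is plain JavaScript, so a <script> tag is enough.
      return new Promise(function (resolve) {
        var script = document.createElement('script');
        script.src = 'mame.asm.js';                    // hypothetical file name
        script.onload = resolve;
        document.head.appendChild(script);
      });
    }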
>You want Gimp for Windows running in Firefox for Linux running in Chrome for Mac? Yeah sure.
I actually do. I want all code ever written and every environment it was ever written for to have a URL that will let me run it in the browser. Everyone else seems to want the web to go back to being whitepapers but I want actual cyberspace already!
There's another reason why I want JavaScript in the browser to die:
We haven't had a new browser engine written from scratch since KHTML. Firefox is a descendant of Netscape; Chrome and Safari are descendants of WebKit, which is itself a descendant of KHTML; Edge is closed source, but I'm almost sure there's some old IE code in there.
Why?
It's simply too expensive to create a fast (and compatible) JS engine.
If WebAssembly takes off, I hope that one day we'll have more than three browser engines around.
The real problem is CSS. Implementing a compliant render engine is nearly impossible, as the spec keeps ballooning and the combinations of inconsistencies and incompatibilities between properties explode.
Check out the size of the latest edition of the book "CSS: The Definitive Guide":
https://twitter.com/meyerweb/status/929097712754098181
What do you base the assumption on that Javascript is the critical piece of complexity here? (it might very well be, but it's not obvious to me)
At least some of the JS engines are used in non-browser projects (V8 and Microsoft's ChakraCore), which at least superficially would suggest you could write a new browser engine and tie it to one of those existing JS interpreters. WebAssembly will gain interfaces to the DOM as well, so the complexity of that interaction will remain.
> Edge is closed source, but I'm almost sure there's some old IE code in there
EdgeHTML is a fork of Trident, so yes. That said, I'm led to believe there's about as much commonality there as there is between KHTML and Blink: they've moved quite a long way away from where they were.
> It's simply too expensive to create a fast (and compatible) JS engine.
I don't think that's so clear cut: Carakan, albeit now years out of date, was ultimately done by a relatively small team (~6 people) in 18 months. Writing a new JS VM from scratch is doable, and I don't think that the bar has gone up that significantly in the past seven years.
It's the rest of the browser that's the hard part. We can point at Servo and say it's possible for a comparatively small team (v. other browser projects) to write most of this stuff (and break a lot of new ground doing so), but they still aren't there with feature-parity to major browsers.
That said, browsers have rewritten major components multiple times: Netscape/Mozilla most obviously with NGLayout; Blink having their layout rewrite underway, (confusingly, accidentally) called LayoutNG; IE having had major layout rewrites in multiple releases (IE8, IE9, the original Edge release, IIRC).
Notably, though, nobody's tried to rewrite their DOM implementation wholesale, partly because the payoff is much smaller and partly because there's a lot of fairly boring, uninteresting code there.
I completely disagree that the issue is JavaScript here.
In my opinion, the issue is the DOM. Its API is massive, there are decades of cruft and backwards compatibility to worry about, and its codebase is significantly larger than the JS engine's in all the major open source browsers out there.
WebAssembly has nothing to do with JavaScript. When people make this association it is clear they are painfully unaware of what either (or both) of these technologies is.
WebAssembly is a replacement for Flash, Silverlight, and Java Applets.
Chrome's V8 engine was actually written from scratch, unlike Webkit's JavaScriptCore (which descended from Konqueror/KJS, as you say). Google made a big deal about marketing this fact at the time. (1)
And while yes, Mozilla's SpiderMonkey comes from the Netscape days, and Chakra in Edge descends from JScript in IE, plus the aforementioned JavaScriptCore, each of those engines still evolved massively: most went from interpreters to multi-tier JITs over the years. I suspect that little more than the interface bindings remain unchanged from their origins, if even that. ;-)
(1) I can't currently find the primary sources from when Chrome released on my phone, but here's a contemporary secondary one: https://www.ft.com/content/03775904-177c-11de-8c9d-0000779fd...)
If the issue is JavaScript, what explains the explosion of JavaScript engines? I agree that JavaScript is a cancer whose growth should be addressed, but implementation complexity isn't a reason.
If these proposed browsers don’t ship with a JS engine [1], do you also hope to have more than one internet around?
[1] Such as V8, Chakra, JavaScriptCore, SpiderMonkey, Rhino, or Nashorn; there is a variety to choose from, plus experimental ones such as Tamarin. Existing JS engines are almost certainly not the critical blocker for developing a browser.
Implement a WASM JIT in kernel space and you don't need a userspace at all, while still (hopefully) getting hot code optimized enough to remove bounds checking. Now all your programs are WASM modules, and we can replace your CPU with some random architecture that doesn't have to care about supporting more than ring 0. Why not implement a nearly-WASM CPU while we're at it? Probably just change branches to GOTOs. Then the only program people care about, their browser, can have a dead simple JIT for this architecture, with WASM in the browser being nearly as fast as any other program.
Unfortunately, SIMD is still not supported in any browser, and with the move away from SIMD.js it looks like this might take a while.
We've been working on porting our fairly large barcode scanner library to WebAssembly. While the performance is close to what we have on other platforms ( http://websdk.scandit.com ), the major bottleneck for now is not being able to use optimized code that relies on SIMD (and we don't have an existing plain-C fallback, as all the other platforms we target support SIMD).
Right now there are SIMD prototypes in 3 engines (SpiderMonkey, ChakraCore, and V8) and the remaining work is standardization between them, tool support, and performance tuning. There will be an official SIMD proposal for WASM in 2018 and it should move through the standardization process pretty quickly.
This is precisely what I'm waiting for as well. I want to run the Nengo neural simulator in the browser, so I can share my research easily, but it looks like I'll have to wait a few years.
Yes. Phones have SIMD. Hell, the cheapo MIPS in my router has SIMD. You use the portable SIMD instructions and they get compiled to native SIMD instructions for your platform. Just like any other part of WebAssembly.
Not really portable. Even x86 has lots of variation between supported operations and vector length. ARM similarly has variations between ARM versions. At least for the abandoned JS SIMD effort, they stuck to the least-common-denominator SIMD (SSE1/NEON-armv7), but it was still good for a noticeable speedup in many applications.
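For reference, the abandoned SIMD.js effort exposed that least common denominator as fixed 128-bit types; it only ever shipped behind flags, so treat this as a historical sketch rather than something you can run today:

    // SIMD.js (abandoned proposal): four float32 lanes processed per operation.
    var a = SIMD.Float32x4(1, 2, 3, 4);
    var b = SIMD.Float32x4(5, 6, 7, 8);
    var sum = SIMD.Float32x4.add(a, b);                  // one operation, four adds
    console.log(SIMD.Float32x4.extractLane(sum, 0));     // 6

The WASM SIMD proposal is expected to take roughly the same 128-bit, lowest-common-denominator shape.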
On older mobile devices it can decrease significantly, but it's still much better than any known JavaScript alternative (e.g. quaggaJS) and works reasonably well even when only passing a handful of frames/sec to the scanner library (which happens on slower devices).
OTOH having SIMD would speed it up significantly and probably get them all up to speed.
Running anything on the GPU introduces a huge amount of latency; it only makes sense when you need high throughput and have large enough workloads to justify the latency. SIMD code can be interleaved with normal native code with zero latency.
And then there's the fact that WebGL is so much behind the state of the art that it's not even funny. Sticking to an old version of GL/GLSL severely limits what you can do with it.
Not sure what you mean (maybe I'm missing your point), but those two things are not very comparable. GLSL is a GPU shading language shipped as source, while SIMD is a class of CPU instructions that exploit parallelism at the data level (as opposed to the core level, like a GPU).
I'm not as excited for this as I used to be. In most user applications JavaScript is good enough or better. If it wasn't, then we wouldn't be using the browser to make desktop applications. Recently I decided to make a desktop app and asked around about the different UI libraries. The answer I keep getting is "just use Electron and JavaScript". Why? Because, love it or hate it, the DOM is fantastic and simple for making UIs that are interactive and reliable. And you can't beat JavaScript for manipulating the DOM.
The only thing I can think of that benefits is specialized software like games or scientific analysis/simulation. But for what most users want, JavaScript is fast enough. The example I keep hearing is "imagine GIMP in the browser", but it's already possible to make a GIMP-like application in the browser using things like canvas and the File API.
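To be fair to that point, the building blocks really are there in plain JS today; a minimal sketch of the load-an-image-and-edit-it loop, with made-up element IDs:

    // Draw a user-selected image onto a canvas -- no WASM involved.
    var canvas = document.getElementById('editor');                  // hypothetical id
    var ctx = canvas.getContext('2d');
    document.getElementById('file-input').onchange = function (e) {  // hypothetical id
      var img = new Image();
      img.onload = function () {
        canvas.width = img.width;
        canvas.height = img.height;
        ctx.drawImage(img, 0, 0);
        // Filters can then operate on ctx.getImageData(0, 0, img.width, img.height).
      };
      img.src = URL.createObjectURL(e.target.files[0]);
    };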
So by the time WebAssembly is ready for prime time and has needed features like memory management and DOM access, will it even be worth it beyond a few specific applications?
I'll agree with you that modern HTML and CSS for presentation is best-of-breed. I'll accept that the DOM API is sufficient.
But neither of those necessitate JavaScript; JS is just a language that happens to run in the browser and has DOM API bindings (and the other browser APIs too). There's no reason those identical bindings couldn't be provided in any other language.
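You can already see what that would look like in how WASM reaches the DOM today: the "bindings" are just JavaScript functions handed in through the import object, so in principle any language compiled to WASM gets the same access. A sketch, assuming a hypothetical module that imports a setTitle function and exports its linear memory:

    var memory;   // filled in once the instance exists
    var imports = {
      env: {
        // The module calls this with a pointer/length into its own memory.
        setTitle: function (ptr, len) {
          var bytes = new Uint8Array(memory.buffer, ptr, len);
          document.title = new TextDecoder('utf-8').decode(bytes);
        }
      }
    };
    WebAssembly.instantiateStreaming(fetch('app.wasm'), imports)   // hypothetical file
      .then(function (result) {
        memory = result.instance.exports.memory;   // assumes the module exports it
        result.instance.exports.main();            // hypothetical entry point
      });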
> will it even be worth it beyond a few specific applications?
I'm sure if you sampled all developers the number that would say Javascript is their favorite programming language would be in a substantial minority. So if it makes it easier for developers to write client side code in their preferred language, it's worth it.
That being said, I'm worried a bit. JavaScript being awful has traditionally kept developers doing as much runtime work as possible server side. In the last 5 years the rapid growth of JS libraries, client side frameworks, and recently ES6 have all reduced the pain points, and that has correlated with a dramatic rise in client side code being loaded onto users' computers, with correspondingly huge page size bloat.
> I'm sure if you sampled all developers the number that would say Javascript is their favorite programming language would be in a substantial minority.
One interesting thing to me is that you don't have to write an entire app using only WebAssembly or JS. You can take the classic approach of benchmarking to find hotspots, and then optimizing those. Migrating these performance-sensitive sections to more performant code can be a win.
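In practice that can be as simple as keeping a JS implementation around and preferring the WASM export for the hot path when it's available. A sketch, where wasmKernels stands in for a hypothetical already-instantiated module:

    // Hot numeric kernel in plain JS (Mandelbrot escape count).
    function iterationsJS(cx, cy, maxIter) {
      var x = 0, y = 0, i = 0;
      while (x * x + y * y <= 4 && i < maxIter) {
        var xt = x * x - y * y + cx;
        y = 2 * x * y + cy;
        x = xt;
        i++;
      }
      return i;
    }

    // Prefer the compiled version if the (hypothetical) module loaded; plain
    // numbers cross the JS/WASM boundary directly, so no glue code is needed.
    var iterations = (window.wasmKernels && window.wasmKernels.exports.iterations)
      || iterationsJS;

Callers don't have to care which one they got; the rest of the app stays in JS.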
JS is enough for most application code. There are libraries that have and always will be in C, C++, or something similar, and it would make the most sense to wrap and expose them for different environments. JS makes no sense for shared libraries.
There are a lot of better image compression formats than JPEG and PNG. With WASM you could do very fast client-side decompression and save a ton of bandwidth.
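The glue for that already exists: decode in WASM, then hand the raw RGBA bytes to a canvas. A sketch, assuming a hypothetical codec module that exports alloc, decode and its linear memory, with decode returning a pointer to width*height*4 RGBA bytes:

    function showDecodedImage(instance, encodedBytes, width, height, canvas) {
      var ex = instance.exports;
      // Copy the compressed bytes into the module's linear memory.
      var inPtr = ex.alloc(encodedBytes.length);               // hypothetical export
      new Uint8Array(ex.memory.buffer).set(encodedBytes, inPtr);
      // Decode, then view the result without copying it again.
      var outPtr = ex.decode(inPtr, encodedBytes.length);      // hypothetical export
      var rgba = new Uint8ClampedArray(ex.memory.buffer, outPtr, width * height * 4);
      canvas.getContext('2d').putImageData(new ImageData(rgba, width, height), 0, 0);
    }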
Can anybody ELI5? The documentation is pretty fluffy.
Does WebAssembly actually open up any new API hooks? I get that it's a clever way of transpiling existing programs to JavaScript, but surely we could do that already?
What new avenues of development is WebAssembly expected to open up? Is the whole point just to enable an easy way to compile games made on other platforms (Unity) to the web?
WASM is used to run native code in the Web browser without going through JavaScript. With it, it should be possible to run code at near-native performance in the browser.
WASM is an intermediate representation which is output by your compiler (of your favorite language) and consumed by the browser's compiler to emit native code.
WASM is a bit similar to LLVM IR, but it's architecture independent.
Compare this to, say, LLVM and Clang. Clang (the C compiler front end) will read C code and emit LLVM IR. LLVM (the compiler backend) will read LLVM IR and emit assembly code for your CPU. With WASM, the developer will run the "front end" and distribute the WASM code over HTTP and the web browser will run the backend and turn WASM into native assembly code.
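On the browser side that back-end step is exposed as a two-stage API: compile the downloaded bytes to a Module (this is where native code generation happens), then instantiate it with whatever imports it declares. Roughly, with a made-up file name and an empty import object:

    // Bytes -> Module (codegen) -> Instance (callable exports).
    fetch('app.wasm')
      .then(function (resp) { return resp.arrayBuffer(); })
      .then(function (bytes) { return WebAssembly.compile(bytes); })
      .then(function (module) { return WebAssembly.instantiate(module, {}); })
      .then(function (instance) { console.log(Object.keys(instance.exports)); });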
> I get that it's a clever way of transpiling existing programs to JavaScript, but surely we could do that already?
No, WASM code is not JavaScript at any point. asm.js, the predecessor to WASM, was a compiler-friendly subset of JavaScript that engines could compile to native code.
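Concretely, an asm.js module is just a JavaScript function written in a rigid, annotated style that engines can recognize and compile ahead of time; a tiny hand-written example:

    function AsmAdder(stdlib, foreign, heap) {
      "use asm";                      // opts this function into the asm.js subset
      function add(a, b) {
        a = a | 0;                    // parameter type annotation: int32
        b = b | 0;
        return (a + b) | 0;           // return type annotation: int32
      }
      return { add: add };
    }

    // It is still plain JavaScript, so it runs (just slower) in any engine:
    var adder = AsmAdder(window, null, new ArrayBuffer(0x10000));
    adder.add(2, 3);   // 5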
Today the browsers' virtual machines can work only with JavaScript code. That means that if you want to use another language you have to convert it to JavaScript, and you are obviously limited by the features of JavaScript. The goal of WASM is to provide a language similar to assembly (or Java bytecode) that can be understood by the browsers' virtual machines, and so to have a new way to compile languages and use them in the browser.
WASM can also be optimized better, having more features than JavaScript (for example, JavaScript has only doubles as its number type, while WASM has integer and float types).
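The difference is easy to see from the JS side, where every number is a 64-bit double:

    console.log(Math.pow(2, 53) === Math.pow(2, 53) + 1);   // true -- integer precision runs out
    console.log(0.1 + 0.2);                                  // 0.30000000000000004

In WASM the same values would live in distinct i32/i64/f32/f64 types, so integer arithmetic doesn't silently go through doubles.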
WebAssembly is a specification of a small assembly-like language. This language can call into functions in its host environment. The "web" part is that all evergreen web browsers have implemented this language, so you can use it from inside your browser and pass JavaScript functions into it.
This means that WebAssembly is actually broader than just the web; for example the Ethereum folks have been discussing using wasm as the language to script their blockchain.
It doesn't currently open up any real new API hooks; it's mostly about being an efficient language for computation. You get near-native performance in the browser. In the future, it may or may not grow more hooks directly into the web platform, rather than needing to call into JS to do so.
WebAssembly is (currently) the same as asm.js, but with smaller file size and faster parse times. Asm.js is just a subset of JavaScript that can be optimized because it doesn't use certain features (like strings or garbage collection).
WebAssembly and asm.js are both intended to be compile targets - you don't write them by hand, but instead write your code in another language (C, Rust, etc.) and then compile to them.
The main benefits are 1) JavaScript is no longer the only language of the web, and 2) it's possible to get better performance than JS ever allowed for.
There is a lot of C code out there that probably isn't worth rewriting from scratch in JS, but may well be worth recompiling to run as a web app.
Also, in the future, WebAssembly should get DOM access, optional garbage collection and other features that will allow it to be a compile target for other languages such as Python and Ruby. So then you can use a single language for all of your development without that language being JavaScript.
I can address at least the first misconception here - this isn’t transpilation to JS. This is native code that has browser APIs exposed to it, so in theory there should be massive performance wins.
This is also a misconception: it is just a bytecode. asm.js already runs at around 50% of native speed, so there are no massive performance wins to find. What you get with WebAssembly is reduced startup time, because it doesn't have to be parsed as JavaScript first.
The really exciting thing people should be talking about is not WebAssembly but the general availability of SharedArrayBuffer [0], which finally makes it possible to run "foreign" multi-threaded code efficiently.
[0]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
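The primitive itself is small: one buffer whose bytes are visible to both the main thread and its workers, plus Atomics for synchronization. A minimal sketch, with a made-up worker file name:

    // main.js
    var sab = new SharedArrayBuffer(1024);
    var shared = new Int32Array(sab);
    Atomics.store(shared, 0, 42);
    var worker = new Worker('worker.js');     // hypothetical file
    worker.postMessage(sab);                  // the memory is shared, not copied

    // worker.js
    onmessage = function (e) {
      var shared = new Int32Array(e.data);
      console.log(Atomics.load(shared, 0));   // 42, read from the same memory
    };

This is the building block that things like Emscripten's pthreads support are built on.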
Before HTML5 deprecated it, it was assumed that browsers would just supply plugins to support various scripting languages or whatever, which is why the <script> tag had a type attribute.
Unfortunately, integrating a new runtime like <script type="text/lua" src="main.lua" module="lua.wasm"/> will probably never happen. A link tag, to me, specifies a static resource rather than executable code (although since CSS supports animations now that's probably a distinction without a difference.) <object> might be a good candidate but I don't know if it's still supported.
But, we can't get rid of JS altogether. Browsers will have to support javascript indefinitely, otherwise most of the web becomes unreadable. That being the case, using JS to load WebAssembly seems like the most reasonable backwards-compatible compromise available.
I can't agree more... but I believe the issue is that WASM sold a lie. Right now, if you read between the lines, it is argued that WASM is there just to help JS do more, not to replace it.
What? The original WebAssembly announcement [1], which can be viewed as the manifesto for how WASM was envisioned, clearly says "once browsers support the WebAssembly syntax natively, JS and wasm can diverge". Eich's goal with WebAssembly is not replacing JS, it's providing a better compilation target for other languages. WebAssembly is a replacement for asm.js, not JavaScript. No one is selling a lie here.
[1]: https://brendaneich.com/2015/06/from-asm-js-to-webassembly/
> WASM is there just to help JS do more not to replace it
That certainly is a shame, but it's not like we have to accept that as the future. HTML is a document markup language, but we've hijacked it to build interactive UIs. It might be that WASM is just a native-ish FFI system for javascript today, but tomorrow it could be something completely different.
I played a bit with WASM and I like its approach, but the one thing that annoys me most with web development is the limited number of client-side languages.
https://archive.org/donate/
Too bad the makers of the original Minesweeper did not think to build in touch input support.
Until CSS is replaced by a sane layout system, there's not going to be another web browser engine created by an independent party.
Netscape Navigator on DOS in Firefox via WebAssembly.
http://8bitworkshop.com/?platform=vcs&file=examples%2Fhello
I think it's a great idea, though many disagree. It's basically ChromeOS taken to the next level.
> the performance is close to what we have on other platforms ( http://websdk.scandit.com )
wow, this is really cool! thx for sharing :]
Is SIMD portable though? Will it run well on mobile devices? And what about other non-Intel architectures in general?
I'm excited for WebAssembly.
HN discussion: https://news.ycombinator.com/item?id=14495893
The awesome-wasm list is also a good start: https://github.com/mbasso/awesome-wasm
I'm not a collaborator on the WebAssembly Org, but I can still see that issue just fine.
I almost wish I was a student again so I could relearn math for the first time with all of these great tools...
> I'm sure if you sampled all developers the number that would say Javascript is their favorite programming language would be in a substantial minority.
I wouldn't be so sure of that.
https://insights.stackoverflow.com/survey/2017#technology-mo...
Exactly. Only JavaScript can access attributes and call functions on the exported document and window objects.
WASM can be great, if they allow it to be.