To echo what Paul said, this will *not* be a browser you will want to use as a replacement for all your day-to-day browsing. If you try to, you won't end up happy.

Web compatibility is a long road, and it's crucial for us to be able to know what missing functionality is most important and the places where we need to focus on performance the most. The purpose of the package is to help us find and prioritize bugs and missing features. We want to know which sites are the most broken (and, even more importantly, which missing features are breaking those sites). From the Web developer side, we also want early feedback on use cases that may be slow today, so that the browser engine can eventually become a great experience for everyone.
Congratulations to both the Servo and Rust teams for making it this far. You set out to slay not one, but two of the biggest dragons in all of software engineering, *at the same time*, and while you may not be done yet, you, uh, err.... have the lance definitely sliding pretty far in and the dragons are definitely noticing and quite upset?

Sorry. The metaphor kinda broke down there. Point is, congratulations. Rust+Servo is one of the most absurdly ambitious projects I've seen in the last twenty years: a new browser engine *and* a new systems-level language. The level of success achieved even to this point is astonishing. I know the road is still long, but I wish you the best in finishing this journey!
To be clear, this will be a very early release (nightly builds) of Servo with an HTML UI (browser.html). You won't be able to replace your current browser with Servo just yet :) … there's still a long way to go. The goal is to make it easier for people to test Servo and file bugs.
A year or so ago, I read that the Servo project is fairly easy to contribute to even if you have no prior Rust knowledge beyond the core basics, as long as you're willing to learn it as you go. The reasoning was that there's a ton of core functionality missing, so there are plenty of low-hanging fruit.

I was wondering: is that still true (or was it ever)?
Is there any chance we'll get a browser with support for discretionary access controls in the render processes? Given that pretty much every OS supports locking down the rights a process has, it would be a big win, security-wise, if the OS could catch anything the browser doesn't.
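A minimal sketch of what that could look like, assuming Linux and the `libc` crate (the function names are invented for illustration; none of this is Servo code): the content process sets up the handles it needs while syscalls are still allowed, then asks the kernel to revoke everything else, so the OS catches any escape the browser itself misses.

    use std::io::Error;

    // Hypothetical helper: enter Linux seccomp "strict" mode, after which the
    // kernel allows only read(2), write(2), _exit(2) and sigreturn(2); any
    // other syscall kills the process with SIGKILL.
    fn enter_strict_seccomp() -> Result<(), Error> {
        let rc = unsafe {
            libc::prctl(
                libc::PR_SET_SECCOMP,
                libc::SECCOMP_MODE_STRICT as libc::c_ulong,
            )
        };
        if rc == 0 { Ok(()) } else { Err(Error::last_os_error()) }
    }

    fn main() {
        // A real renderer would first set up its IPC channels and map the
        // resources it needs, while syscalls are still permitted.
        println!("dropping to strict seccomp before touching untrusted content");
        enter_strict_seccomp().expect("prctl(PR_SET_SECCOMP) failed");

        // From here on, even a fully compromised renderer cannot open files
        // or sockets: the kernel, not the browser, enforces the boundary.
        let msg = b"still running, but confined by the OS\n";
        unsafe {
            libc::write(libc::STDOUT_FILENO, msg.as_ptr().cast(), msg.len());
            libc::_exit(0); // plain exit(2) is allowed; the normal Rust exit path is not
        }
    }

Real browser sandboxes use finer-grained mechanisms (seccomp-bpf filters and namespaces on Linux, job objects on Windows) because strict mode is far too restrictive for an actual renderer, but the principle is the same: the boundary is enforced by the kernel rather than by the browser.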
Could someone in the know clarify for me what the difference in aims is between this new browser ("Servo") and Firefox? The Servo landing page says its aims are:

    Servo project aims to achieve better parallelism,
    security, modularity, and performance.

But aren't those the goals of every browser?
Any plans to make the browser cache aware of cross-domain resources? Basically a hash-based cache: as long as the hash is valid, the resource can be served from a shared hash-keyed pool. This could be integrated with SRI to reduce unnecessary network load without compromising user privacy.
It would not provide complete privacy, but it might reduce exposure to third-party CDNs.

If something like this is implemented, providing frequently used resources via a plugin, a privacy-aware CDN, or even a custom CDN similar to https://addons.mozilla.org/en-US/firefox/addon/decentraleyes/ would be possible.
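A rough sketch of that proposal in Rust (the type and names are invented for illustration; this is not Servo or SRI API): the cache is keyed by the SRI digest declared in the page rather than by origin and URL, so any site declaring the same `integrity` value can reuse one verified copy.

    use std::collections::HashMap;

    // Hypothetical content-addressed cache keyed by the SRI integrity string
    // (e.g. "sha384-...") instead of by origin + URL.
    struct HashCache {
        entries: HashMap<String, Vec<u8>>, // integrity value -> resource bytes
    }

    impl HashCache {
        fn new() -> Self {
            HashCache { entries: HashMap::new() }
        }

        // Look up by the integrity attribute of a <script>/<link> tag. On a
        // hit the network can be skipped entirely: the hash already proves
        // these are exactly the bytes the page asked for, regardless of
        // which origin fetched them first.
        fn lookup(&self, integrity: &str) -> Option<&[u8]> {
            self.entries.get(integrity).map(Vec::as_slice)
        }

        // After a miss: fetch from the network, verify the response against
        // the declared digest (verification omitted here), then store it so
        // later pages declaring the same hash get a local hit.
        fn insert(&mut self, integrity: String, verified_bytes: Vec<u8>) {
            self.entries.insert(integrity, verified_bytes);
        }
    }

    fn main() {
        let mut cache = HashCache::new();
        cache.insert(
            "sha384-EXAMPLEDIGEST".to_string(),
            b"/* shared library bytes */".to_vec(),
        );

        // A different site embedding the same integrity value is served from
        // the shared pool without contacting the third-party CDN at all.
        assert!(cache.lookup("sha384-EXAMPLEDIGEST").is_some());
    }

The "as long as the hash is valid" check maps onto the verification SRI already performs on fetch; the only new behavior is that the verified bytes are then shared across origins, which is what would make a Decentraleyes-style local pool possible without a per-site fetch.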
I wonder if the other browser vendors are working on a similar parallel browser engine? Perhaps using some custom version of Clang that enforces Bjarne Stroustrup's C++ Core Guidelines via errors/warnings.
I am more interested to see how a major Rust project holds up once its attack surface gets larger. So the question is: when does Servo get added to Pwn2Own?