Linked by Thom Holwerda on Mon 8th Feb 2016 22:09 UTC, submitted by Radio
Internet & Networking

Major desktop browsers push automatic security updates directly to users on a regular basis, so most users don't have to worry about security updates. But Linux users are dependent on their distributions to release updates. Apple fixed over 100 vulnerabilities in WebKit last year, so getting updates out to users is critical.

This is the story of how that process has gone wrong for WebKit.

Best wishes to GNOME...
by dionicio on Mon 8th Feb 2016 22:40 UTC
dionicio
Member since:
2006-07-12

Hoping they find decent, proactive answers for everyone at stake.

Reply Score: 3

RE: Best wishes to GNOME...
by nicubunu on Tue 9th Feb 2016 08:17 UTC in reply to "Best wishes to GNOME..."
nicubunu Member since:
2014-01-08

And their implied solution: everyone should upgrade to GTK3 (and, why not, to GNOME 3), since they don't plan to provide security fixes for GTK2.
The article even has a concrete example: GIMP, which uses WebKit for the sole purpose of displaying its in-app help system (an optional package; if you don't have it installed, GIMP falls back to the default web browser, e.g. Firefox, and an online resource). To fix the security of its integrated offline help system, GIMP has to move to GTK3.
There is another case from the article: email clients using WebKit, which are in a really tough position. Unfortunately, in this case the problem is due to a conscious design choice and the lack of a migration plan away from it.

Reply Score: 5

What exactly is he going on about?
by teco.sb on Tue 9th Feb 2016 02:25 UTC
teco.sb
Member since:
2014-05-08

I read the full article, but didn't quite understand where he was trying to go with his prose. WebKitGTK+ doesn't provide a reasonable security-fixes branch, and he doesn't understand why distributions aren't shipping security fixes? Was that it? He's kidding, right?

It's not a distribution problem. Every distro has a security-update path; he even mentions Debian's special policy for web engines. The Shellshock fix, to use an example, was correctly distributed across all major distributions. It sounds like he's saying everyone should make an exception to their security policy just because the WebKitGTK+ project doesn't want to conform to the already established procedures.

Reply Score: 4

Delgarde Member since:
2008-08-19

I read the full article, but didn't quite understand where he was trying to go with his prose. WebKitGTK+ doesn't provide a reasonable security-fixes branch, and he doesn't understand why distributions aren't shipping security fixes? Was that it? He's kidding, right?


From what I gather, the main problem seems to be that WebKitGTK had a big API break in the past, when they switched to the multi-process model. And that break was long enough ago that the WebKitGTK developers aren't supporting the older versions anymore - but there are a number of smaller browsers like Midori that don't have the resources to switch to the new API, and are therefore stuck on older WebKitGTK versions with no security patches being applied.

As to fault - it's complicated, I think, by the fact that the break between the 1.0 and 2.0 APIs was more or less simultaneous with the port from Gtk+ 2 to 3. That left the small projects somewhat stranded: they couldn't update to the 2.0 API without also updating to Gtk+ 3, and didn't have the resources either to do that or to maintain the older WebKitGTK version they were stuck with.

Reply Score: 7

Delgarde Member since:
2008-08-19

Oh, and aside from that aspect, it *is* a distribution problem when distributions aren't actually shipping the latest updates. Look at the examples he gives from Ubuntu - as of writing, the current Ubuntu release was still shipping a WebKitGTK version with 40 known vulnerabilities that had been fixed upstream.

Reply Score: 2

Brendan Member since:
2005-11-16

Hi,

Oh, and aside from that aspect, it *is* a distribution problem when distributions aren't actually shipping the latest updates. Look at the examples he gives from Ubuntu - as of writing, the current Ubuntu release was still shipping a WebKitGTK version with 40 known vulnerabilities that had been fixed upstream.


The fact that people release software with 40+ vulnerabilities is a problem that "treating symptoms and not the root cause" (e.g. more frequent updates) will never fix.

So, what is the root cause? Is the problem that the WebKit project's quality control is woefully inadequate; or is it that the web itself has become a hideously over-complicated and constantly changing nightmare that is almost impossible to support; or is it a combination of multiple factors?

- Brendan

Reply Score: 4

tidux Member since:
2011-08-13

> Is the problem that the WebKit project's quality control is woefully inadequate

Yes. Notice how Blink and Gecko do not have this problem.

Reply Score: 3

teco.sb Member since:
2014-05-08

I didn't understand that to be the problem. Most distributions have a way of shipping different major versions of the same library side by side. Think of the Gtk+2 and Gtk+3 libraries.

Anyway, from the article:

Historically, WebKitGTK+ has not had security updates. Of course, we released updates with security fixes, but not with CVE identifiers, which is how software developers track security issues;


And then:
For instance, Fedora 22 shipped with WebKitGTK+ 2.8, so it would release updates to new 2.8.x versions, but not to WebKitGTK+ 2.10.x versions.

The problem is that most distros do not operate like Fedora, precisely because they are more "stable" in nature...
Debian is correct that we do not provide long term support branches, as it would be very difficult to backport security fixes.

So now he's basically saying "we don't actually provide security fixes for older branches because it is too hard; distros should just ship new releases with the occasional regression."

I just feel like the article is about a developer whining about stable distributions not shipping the latest code because upstream doesn't have the resources to keep up with security fixes.

Edited 2016-02-10 03:19 UTC

Reply Score: 1

dpJudas Member since:
2009-12-10

I didn't understand that to be the problem. Most distributions have a way of shipping different major versions of the same library side by side. Think of the Gtk+2 and Gtk+3 libraries.

The problem is that the developers of the main project stop doing patches for older branches. This then pushes the burden onto the GTK developers, who probably don't have the resources.

For the GTK project it is the same problem - nobody technically prevents anyone from doing new development on the GTK 2 branch, but in practice nobody does, because all the main devs have moved on to GTK 3.

Reply Score: 2

Bill Shooter of Bul Member since:
2006-07-14

I just feel like the article is about a developer whining about stable distributions not shipping the latest code because upstream doesn't have the resources to keep up with security fixes.


WTF? Distributions should curate and ship secure code from upstream. It's not upstream's fault if the distro doesn't package the secure version of the apps.

If you are a stable or "long term release" distro, then you either don't ship the insecure unmaintained code, or you backport fixes yourself. You don't get to whine about upstream not contorting themselves to your release cadence.

Reply Score: 2

acobar Member since:
2005-11-15

WTF? Distributions should curate and ship secure code from upstream. It's not upstream's fault if the distro doesn't package the secure version of the apps.

While I agree with you in general terms, we should not forget that many apps use web engines only to display help. I think we can agree that if the engine is used to display local content from trusted applications, there is no such hurry as to justify all the trouble associated with moving to a new and incompatible engine.

If you are a stable or "long term release" distro, then you either don't ship the insecure unmaintaned code, or you backport fixes your self. You don't get to whine about upstream not contorting themselves to your release cadence.

If we take into account that it is "modern" web engines we are talking about, things rise to a far more troublesome snafu. Modern web engines are very complex, convoluted, bloated hodgepodge cauldrons of code. Sometimes backporting security fixes is not that easy in this case, and the effort is not always worth it if all you do with the engine is show internal help.

Being frank about the status of things: except for GNOME pushing web technology to display things on the desktop (which, in my humble opinion, is better served by other methods) and a minority using small browsers, I bet that the large majority of Linux (and FreeBSD) users use Chromium or Firefox for non-local content access, and both of those are regularly updated. So, in my opinion, the problem is being overstated, perhaps to promote one's own interests.

Reply Score: 3

Incredibly detailed article
by avgalen on Tue 9th Feb 2016 15:17 UTC
avgalen
Member since:
2010-09-23

This really shows how everything is interwoven and how hard it is to keep everything functionally working and secure at the same time.

People complained so much about IE being integrated into Windows, but at least that was an OS component that was supported "forever" by a major party. How can anyone be surprised that integrating an optional, fast-moving library like WebKit into so many programs that move much slower causes security issues?

If your development moves slower than the libraries you use... you link dynamically to those libraries. If that breaks your application... you fix your application. If you cannot fix your application, it is time for your users to see that problem and find another application that moves faster.

Reply Score: 1

RE: Incredibly detailed article
by dpJudas on Wed 10th Feb 2016 00:31 UTC in reply to "Incredibly detailed article"
dpJudas Member since:
2009-12-10

This really shows how everything is interwoven and how hard it is to keep everything functionally working and secure at the same time.

People complained so much about IE being integrated into Windows, but at least that was an OS component that was supported "forever" by a major party. How can anyone be surprised that integrating an optional, fast-moving library like WebKit into so many programs that move much slower causes security issues?

It is a little bit more complicated than that. WebKit (and the Chromium fork) suffers from the problem that the main contributors invest in it for their own browsers - not for 3rd party frameworks. This means that they do not have a stable library API, and they care relatively little whether their changes are compatible with any use cases that were possible earlier on.

For example, the very first thing Google did with Chromium was to drop the canvas abstraction in WebKit and just typedef'ed it to SkCanvas (the skia canvas). While this naturally works perfectly fine for Google's two main targets, Android and Chrome, it also effectively means any 3rd party framework can now take a hike. At this point in time both projects are virtually only supporting their main sponsors.

Now, you make it sound like the situation was better with the MSHTML API on Windows. I don't really agree with you on that, because while that API was more stable, it was also deliberately gimped on earlier versions of Windows for no good reason. For example, with WebKit you can at least always know that your embedded web browser runs exactly the version you deployed. With MSHTML you couldn't be sure whether it was IE 6, 7 or 8. Even if you had verified your product worked with all of those, tomorrow Microsoft might release IE 9 and break your embedded product with some security enhancement or by triggering some bug in the HTML/CSS/JS.

If your development moves slower than the libraries you use... you link dynamically to those libraries. If that breaks your application... you fix your application. If you cannot fix your application, it is time for your users to see that problem and find another application that moves faster.

The issue isn't static vs dynamic linking. The problem is when libraries/frameworks completely change their APIs. In those situations anyone who heavily relied on the old one is royally screwed. Examples: MFC, WinForms, GTK 2, Qt 3, WinRT, Win32 mobile, MSHTML, GDI, DirectDraw, Direct3D 3, Direct3D 9, Direct3D 11, OpenGL 1, OpenGL 4, ActiveThis, ActiveThat, Angular, heck every other web thing out there, etc.

It is probably the biggest argument against using more 3rd party stuff than absolutely needed. Each thing you link with that isn't under your own control is a ticking time bomb as a developer. ;)

Reply Score: 3

RE[2]: Incredibly detailed article
by avgalen on Wed 10th Feb 2016 09:25 UTC in reply to "RE: Incredibly detailed article"
avgalen Member since:
2010-09-23

The issue isn't static vs dynamic linking. The problem is when libraries/frameworks completely change their APIs. In those situations anyone who heavily relied on the old one is royally screwed.

At its basis it is all about static vs dynamic linking.
If you use static linking, you know your app will continue to work with the old API, but you are screwed if the old API doesn't receive security fixes.
If you use dynamic linking, you are screwed when the API changes, but you are sure to receive security fixes.

My point was not about the quality of MSHTML versus WebKit, but about linking to one of the fastest-moving libraries that isn't provided by the OS.

It is probably the biggest argument against using more 3rd party stuff than absolutely needed. Each thing you link with that isn't under your own control is a ticking time bomb as a developer

Absolutely. When you develop software it is your responsibility to make a good product. If you choose to include 3rd party stuff, you become responsible for that part as well, because your application relies on it. This is also why the web is so broken. Everyone links/includes stuff from everywhere and it changes all the time, often in major ways. There are almost no stable APIs on the web, and it is a minor miracle that so many things work.

Reply Score: 2

demetrioussharpe Member since:
2009-01-09

"The issue isn't static vs dynamic linking. The problem is when libraries/frameworks completely change their APIs. In those situations anyone who heavily relied on the old one is royally screwed.

At its basis it is all about static vs dynamic linking.
If you use static linking, you know your app will continue to work with the old API, but you are screwed if the old API doesn't receive security fixes.
If you use dynamic linking, you are screwed when the API changes, but you are sure to receive security fixes.
"

This makes absolutely zero sense. If the library's API changes, it doesn't matter whether your application is statically linked or not - it's still not going to work. So, no matter how many security fixes the new version of the library has, you still won't get the benefit of them, because your application simply won't work.

Reply Score: 2

avgalen Member since:
2010-09-23

Please enlighten me how this makes zero sense:

Statically linking:
/MyTool/MyProg.exe (uses Webkit2-123.lib)
/MyTool/WebKit2-123.lib

Dynamically linking:
/MyTool/MyProg.exe (uses Webkit*.lib)
/lib/Webkit2.lib (currently version 123)

Now if WebKit2-123.lib gets a minor security update, that should become /lib/Webkit2.lib (now version 124). With static linking my program is guaranteed to keep working but isn't secure anymore.
With dynamic linking my program is secure but might have broken (very unlikely, but possible).

Reply Score: 3

dpJudas Member since:
2009-12-10

Now if WebKit2-123.lib gets a minor security update, that should become /lib/Webkit2.lib (now version 124). With static linking my program is guaranteed to keep working but isn't secure anymore.
With dynamic linking my program is secure but might have broken (very unlikely, but possible).

You really need to go back and read the original article. This is Linux we are talking about - virtually everything is always dynamically linked there.

The article is talking about the consequences of API breakage and the costs associated with backporting security fixes. If nobody builds a new DLL for you, it doesn't help one bit whether it's dynamic or not.

Reply Score: 2