Linked by Thom Holwerda on Fri 22nd Apr 2016 22:04 UTC
Internet & Networking

Recall that Doom is a multi-level first person shooter that ships with an advanced 3D rendering engine and multiple levels, each comprised of maps, sprites and sound effects. By comparison, 2016's web struggles to deliver a page of web content in the same size. If that doesn't give you pause you're missing something. So where does this leave us?

It leaves us with a web that is horrible to use.

Mobile vs desktop
by WorknMan on Fri 22nd Apr 2016 22:33 UTC
WorknMan
Member since:
2005-11-13

I understand a lot of people on mobile devices like to set up their browsers as desktop, so they get the full desktop experience. I'd actually like to do the opposite and set up desktop browsers as mobile. Seems to me that mobile sites generally have the same content, without nearly as much of the extra crap ...

Reply Score: 5

RE: Mobile vs desktop
by Carewolf on Fri 22nd Apr 2016 23:29 UTC in reply to "Mobile vs desktop"
Carewolf Member since:
2005-09-08

I would not recommend it. Mobile sites are generally total crap, and often have the same or more bloat.

Reply Score: 6

RE[2]: Mobile vs desktop
by chithanh on Sat 23rd Apr 2016 07:11 UTC in reply to "RE: Mobile vs desktop"
chithanh Member since:
2006-06-18

It is quite mixed in my experience.

Some publications like Slashdot have indeed horrible mobile sites, which are much worse than their non-mobile counterparts even on a phone.

But there are quite a few where the mobile site browsing experience is better even on a desktop.

Reply Score: 3

RE[2]: Mobile vs desktop
by IndigoJo on Sat 23rd Apr 2016 08:16 UTC in reply to "RE: Mobile vs desktop"
IndigoJo Member since:
2005-07-06

And they often don't even give you the content you ask for. Sometimes they just redirect to a mobile front page.

Reply Score: 2

RE[2]: Mobile vs desktop
by Morgan on Sun 24th Apr 2016 17:41 UTC in reply to "RE: Mobile vs desktop"
Morgan Member since:
2005-06-29

Indeed, without an ad blocker the typical mobile website is about 30 percent advertisement at the top and another 30 percent at the bottom, with the remaining 40 percent of content in the middle, which more often than not is also covered by a floating ad. When you try to tap the tiny "X" in the corner to dismiss that one, you inevitably tap the border ad behind it instead, launching three or four pop-up tabs with videos playing.

And that's the websites with content worth reading, once you can get to it.

Reply Score: 4

My condolences, CERN.
by dionicio on Fri 22nd Apr 2016 23:42 UTC
dionicio
Member since:
2006-07-12

On your never realized dreams of global enlightenment.
Money will always work this way.

.....

Clickbait for procrastinating, tabloid- and gossip-addicted consumers.

Reply Score: 3

Comment by Drumhellar
by Drumhellar on Fri 22nd Apr 2016 23:54 UTC
Drumhellar
Member since:
2005-07-12

Bandwidth has also vastly increased.

I'm less concerned with bandwidth usage (I have plenty), and more concerned with utterly shitty, inefficient scripts on pages that make my CPU fan spin up for no visible reason.

If I put my PC in low-power mode (4c/8t at 700MHz is what the CPU runs at in that mode), the web is hardly usable. Maybe I should stop using Firefox, but, dammit, I like tab groups way too much. The next browser that has tab management like Tab Groups has my vote.

Edited 2016-04-22 23:56 UTC

Reply Score: 5

RE: Comment by Drumhellar
by kurkosdr on Sat 23rd Apr 2016 00:58 UTC in reply to "Comment by Drumhellar"
kurkosdr Member since:
2011-04-11

I'm less concerned with bandwidth usage (I have plenty), and more concerned with utterly shitty, inefficient scripts on pages that make my CPU fan spin up for no visible reason.


This. Javascript has ruined the web. Before Javascript, sites used to be beautiful trees that even a 100MHz processor with a tiny amount of RAM could process. But no, web designers are compelled to drop useless script after useless script in, such as making images "pop-out", stupid animated galleries, and the biggest annoyance of them all, scroll-activated JS scripts. For example Ars Technica, on their mobile site, feels compelled to include a script that "compresses" the logo when scrolling in the comments of an article, because reasons.

And none of those "designers" has the courtesy to provide a noscript tag with an image that doesn't pop-out, a gallery that isn't animated, or a logo that is just a frickin' png. I frequently disable Javascript on old phones (to prevent the browser from crashing) and the more "hip" a website is, the more it breaks.

I kinda wish the original vision of the web, aka a beautiful HTML tree with boxes of JavaSE apps that are activated only if the user wants them and that leave the rest of the document alone instead of messing with it, had happened. And it would have happened if Sun had allowed Java to be integrated into browsers instead of remaining a third-party plug-in.

PS: What's the purpose of Google's AMP and Facebook's Instant Articles? Can't they just make light webpages?

Edited 2016-04-23 01:15 UTC

Reply Score: 9

RE[2]: Comment by Drumhellar
by Drumhellar on Sat 23rd Apr 2016 01:07 UTC in reply to "RE: Comment by Drumhellar"
Drumhellar Member since:
2005-07-12

Much of the problem with Javascript is in ads, though. Often times, 70% of what a page loads is from a couple of ads that load giant chunks of Javascript.

But, fark, eternally scrolling pages that just load more content when you get to the bottom of the page are annoying. Worse, some sites hide half of their content and make you click a button before it appears - for no fucking reason. AAARRRGGGHHH!!!!

That makes me mad.

JS that makes it so pages cannot be static pisses me off, too. For example, if you click a link on Facebook, then hit back, your page is different. Too many pages do this.

Reply Score: 7

RE[2]: Comment by Drumhellar
by lucas_maximus on Sat 23rd Apr 2016 04:01 UTC in reply to "RE: Comment by Drumhellar"
lucas_maximus Member since:
2009-08-18

This. Javascript has ruined the web. Before Javascript, sites used to be beautiful trees that even a 100MHz processor with a tiny amount of RAM could process. But no, web designers are compelled to drop useless script after useless script in, such as making images "pop-out", stupid animated galleries, and the biggest annoyance of them all, scroll-activated JS scripts.


Actually, the JS scripts are probably not as much of a problem as you think they are. JS rendering is in some cases faster than rendering server-side and pushing it to the page, especially on phones, where the latency of downloading the markup and assets is a bigger deal than processing the JS.

Also, CSS3 can have a massive performance impact: rounded corners, for example, can slow a device (early BlackBerry phones) to a crawl if enough of them are rendered on screen at the same time, whereas the old "inefficient" way of doing rounded corners (a div with a background image) would actually render faster.

If you use CSS translateZ(), for example, the element gets rendered on the GPU; do that too many times and you can fill up GPU memory and crash the phone. This used to happen quite often on devices such as the iPhone 3GS.

For example Ars Technica, on their mobile site, feels compelled to include a script that "compresses" the logo when scrolling in the comments of an article, because reasons.


The actual logo compression looks like it is done via CSS transitions, so it would be very efficient: offloaded to the GPU and happening outside the main JS event loop.

http://imgur.com/AXNkkJ5
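For what it's worth, the general pattern looks something like this (a minimal sketch; the element id and class name are made up, not Ars' actual markup). The script only flips a class; the visual change itself is declared as a CSS transition on transform, which the browser can composite on the GPU off the main thread:

// Minimal sketch of the "toggle a class, let a CSS transition do the work" pattern.
// Assumed (hypothetical) markup: an element with id "site-logo" plus CSS like
//   #site-logo { transition: transform 0.2s; }
//   #site-logo.compressed { transform: scale(0.6); }
const logo = document.getElementById("site-logo");

window.addEventListener(
  "scroll",
  () => {
    // Toggling a class is cheap; the animation itself is a compositor-driven
    // CSS transition, so it runs outside the main JS event loop.
    logo?.classList.toggle("compressed", window.scrollY > 100);
  },
  { passive: true } // passive listener: never blocks scrolling
);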

Ars Technica's is one of the best mobile websites on the web, and a very good example of how to do it right.

PS: What's the purpose of Google's AMP and Facebook's Instant Articles? Can't they just make light webpages?


No, because some product manager normally wants a few hundred tracking scripts. Most web devs build something nice, and this shit gets tacked on over time.

I had our desktop website rendering in less than a second on a 10 meg connection with nothing cached (and that's with 15,000 lines of JS and 10,000 lines of CSS); over time, management kept packing scripts onto the page until it is now back to a 4-second average.

Edited 2016-04-23 04:06 UTC

Reply Score: 3

RE: Comment by Drumhellar
by bassbeast on Sun 24th Apr 2016 00:26 UTC in reply to "Comment by Drumhellar"
bassbeast Member since:
2007-11-11

Have you tried IceDragon or Waterfox? They are both based on the Gecko engine, so your Tab Groups should work, but they feel less sluggish and bloated IMHO.

I don't have your PC so I can't comment on how they will behave in your particular setup but I can say that running Comodo IceDragon on an underclocked Phenom I 9600 (1600/400MHz) it feels quite snappy.

Reply Score: 2

totally irrelevant comparison
by kristoph on Sat 23rd Apr 2016 00:32 UTC
kristoph
Member since:
2006-01-01

In other news, a potato is not as sweet and tangy as an orange. For years we've been complaining about this significant shortcoming of potatoes and yet nothing, NOTHING, has been done!

I blame the government, major corporations, and that guy down the street whose house is bigger than mine.

That's why I am voting for Donald Trump or that other guy with crazy hair, Sandy or something.

Reply Score: 3

Silly comparison
by dpJudas on Sat 23rd Apr 2016 01:10 UTC
dpJudas
Member since:
2009-12-10

Two simple stats about Doom:

* The typical texture size in Doom is 64x64 pixels using an 8-bit palette. By comparison, the small OSnews logo at the top of this page is 176x69 pixels. That one icon alone covers three Doom textures; if it were a little more advanced and required true color, it would consume more than nine full Doom textures. For a single web image.

* Doom was made for 320x200. Web pages in 2016 are made for 5120x2440 or more. A single screenshot on my computer holds more information than 9000 Doom textures.
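The back-of-the-envelope arithmetic behind those two bullets, for anyone who wants to check it (assuming 1 byte per palettised texel and 3 bytes per true-color pixel):

// Back-of-the-envelope check of the texture comparison above.
const doomTexture = 64 * 64;                 // 4,096 texels at 1 byte each (8-bit palette)
const logoPixels = 176 * 69;                 // 12,144 pixels in the OSnews logo

console.log(logoPixels / doomTexture);       // ~2.97  -> "covers three Doom textures"
console.log((logoPixels * 3) / doomTexture); // ~8.9   -> "more than nine" in true color

const screenshotBytes = 5120 * 2440 * 3;     // one true-color frame at the quoted resolution
console.log(screenshotBytes / doomTexture);  // ~9,150 -> "more than 9000 Doom textures"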

If a web page was meant to be as blocky and low res as Doom (both graphics and fonts), then yeah, maybe it would be fair to compare the two.

Reply Score: 3

RE: Silly comparison
by Alfman on Sat 23rd Apr 2016 03:23 UTC in reply to "Silly comparison"
Alfman Member since:
2011-01-28

dpJudas,

If a web page was meant to be as blocky and low res as Doom (both graphics and fonts), then yeah, maybe it would be fair to compare the two.


Clearly you are right that it's not apples to apples; however, I think this misses the point. Doom was a vast interactive multiplayer multilevel world, which not only included all the resources for the game, but also a self-contained engine to handle its own archive formats, dialogs, 3D rendering, fonts, networking stack, etc.

That a simple, mostly text webpage today with no high-res graphics has such a large footprint highlights how grossly inefficient things have gotten.

Take the OSnews page (using the developer tools' network tab):
js * 42 = 1,934KB
html * 14 = 133KB
css * 4 = 48KB
images * 39 = 48KB
xhr * 16 = 31KB

Total Size = 2196KB
Total Time = 9.5s
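If anyone wants to reproduce a tally like this without adding up the network tab by hand, something along these lines in the console gets close. transferSize is the on-the-wire size where the browser exposes it, so the exact numbers won't match the table above:

// Rough per-type tally of transferred bytes, run from the browser console.
// transferSize is 0 for cache hits and for cross-origin resources that don't
// send Timing-Allow-Origin, so treat the result as a lower bound.
const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
const totals = new Map<string, number>();

for (const entry of entries) {
  const type = entry.initiatorType || "other"; // "script", "img", "css", "xmlhttprequest", ...
  totals.set(type, (totals.get(type) ?? 0) + entry.transferSize);
}

for (const [type, bytes] of totals) {
  console.log(`${type}: ${(bytes / 1024).toFixed(0)} KB`);
}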

I turned off all blockers for this test, and...OMG is this bad or what?

Note that the "hires" multimedia elements are not the culprit, most of the overhead is in 2MB of virtually pointless JS.

Using the adblocker (with some custom rules to block additional 3rd party tracking) a full page load goes down to 139KB total in 4s. This could be better, but the low hanging fruit is clearly 94% overhead from 3rd party ads & trackers.

Caching helps eliminate some network traffic, but the memory & CPU resources still take a toll locally which is why opening a single webpage needs more memory than entire computer had in the 90s.

It seems that every time modern hardware & networks address inefficiency using more horsepower, the less efficient we become. I used to strive to be efficient just because I took pride in it, but I've mostly given up because no one else cares.

Reply Score: 9

RE[2]: Silly comparison
by dpJudas on Sat 23rd Apr 2016 05:01 UTC in reply to "RE: Silly comparison"
dpJudas Member since:
2009-12-10

Total Size = 2196KB

Doom2.WAD (the main resource file of Doom 2) is 14 megabytes. The article Thom links to cheats by using zipped versions of the Doom installer for comparison. Which means that to make a fair comparison to a web page you will have to zip the JS+HTML+CSS. As they are text files, the size will drop significantly (e.g. the index HTML for OSnews drops to 20KB).
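Easy enough to check for any page you save locally; a quick Node sketch (the filename is just a placeholder for whatever you grabbed):

// Compare raw vs. gzipped size of a saved text asset (HTML, JS or CSS).
import { readFileSync } from "fs";
import { gzipSync } from "zlib";

const path = "index.html"; // placeholder: any saved page source, script or stylesheet
const raw = readFileSync(path);
const zipped = gzipSync(raw, { level: 9 });

console.log(`raw:     ${(raw.length / 1024).toFixed(1)} KB`);
console.log(`gzipped: ${(zipped.length / 1024).toFixed(1)} KB`);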

Now, you could of course make the point that the compiled exe of Doom did a lot more than the javascript on a web page. Which is true at some level. However, programs of that age did this by taking countless shortcuts that were carefully balanced to exactly do what they were designed for and nothing more.

As an example, Doom's level and BSP vertices are 16-bit integers (not even fixed point), and this only works because id Software avoided all the situations where that would break. Modern frameworks could never be built under such constraints, so you'd have to custom-code everything. Trading a bit of bandwidth for saved development time and difficulty is well worth it - especially as bandwidth keeps getting cheaper.

That a simple, mostly text webpage today with no high-res graphics has such a large footprint highlights how grossly inefficient things have gotten.
...
It seems that every time modern hardware & networks compensate for inefficiency with more horsepower, we just become less efficient. I used to strive to be efficient just because I took pride in it, but I've mostly given up because no one else cares.

Nobody cares because in the big picture the user doesn't care if it is efficient. The only thing that matters is that it doesn't get too slow. The improved hardware resulted in cheaper software development, not faster programs (they've been roughly fast enough for close to 20 years now).

Reply Score: 1

RE[3]: Silly comparison
by Alfman on Sat 23rd Apr 2016 07:15 UTC in reply to "RE[2]: Silly comparison"
Alfman Member since:
2011-01-28

dpJudas,

Which means that to make a fair comparison to a web page you will have to zip the JS+HTML+CSS. As they are text files, the size will drop significantly (e.g. the index HTML for OSnews drops to 20KB).


There's no point in nitpicking this stuff because we already know it's always going to be apples and oranges. It's just an observation of how spoiled we are that a dinky little webpage needs 2MB of raw data and 250MB of ram. If you want to slice it a different way then I have no problem with that, but I think it's inefficient no matter how we cut it.


However, programs of that age did this by taking countless shortcuts that were carefully balanced to exactly do what they were designed for and nothing more.


Of course, we used to optimize stuff because there wasn't a choice. Efficient algorithms and structures were needed to run in a few MB of ram and had to load and run quickly. It took great skill and effort to do this. I kind of miss these classic CS skills, but they're obsolete since we have far more resources, which negates the need to be efficient. Most clients today don't really care about software efficiency. While we could make things efficient today, we usually can't be bothered.



Nobody cares because in the big picture the user doesn't care if it is efficient. The only thing that matters is that it doesn't get too slow. The improved hardware resulted in cheaper software development, not faster programs (they've been roughly fast enough for close to 20 years now).



Well, some people do complain, at least indirectly. My wife has mentioned that FF on her tablet struggles with bloated websites. But alas, as things keep getting less efficient it becomes the user's responsibility to upgrade when there is a software performance problem.

Another anecdotal example: I bought a laptop at the beginning of 2015 and it's already too slow. In a recent project my job is to upgrade a codebase from VS2003 to VS2015. Man, VS2015 is so frustratingly slow that I seriously want to go back to an old & unsupported yet much faster version. This complaint is echoed by many in the community. Maybe the new version "does more", but it's not readily apparent that any of it will be useful for my work. I regularly have to wait for the IDE to catch up as I'm typing. Sometimes when I click on something I'm not sure if I misclicked or if the IDE is just running dog slow. These kinds of UI sluggishness problems should be extinct by now on all modern hardware. I assume the reason MS doesn't care about Visual Studio performance is that everyone there is running the latest and greatest high-end computers available; MS probably gets a great discount, and just think, they don't have to pay the "MS tax" either.

Edited 2016-04-23 07:19 UTC

Reply Score: 6

RE[4]: Silly comparison
by Kochise on Sat 23rd Apr 2016 11:47 UTC in reply to "RE[3]: Silly comparison"
Kochise Member since:
2006-03-03

Still using VC6 with WndTabs and everything is perfect in this world.

Reply Score: 3

RE[4]: Silly comparison
by dpJudas on Sat 23rd Apr 2016 17:00 UTC in reply to "RE[3]: Silly comparison"
dpJudas Member since:
2009-12-10

It's just an observation of how spoiled we are that a dinky little webpage needs 2MB of raw data and 250MB of ram. If you want to slice it a different way then I have no problem with that, but I think it's inefficient no matter how we cut it.

First of all, if you insist on using the 2MB number, then you also must insist that the size of Doom and its resources is 15MB. And that is only if we use the original executable - zdoom.exe is 3MB, making it 18MB in total. Comparing a *zipped* version of Doom against unzipped web resources makes no sense. Thus, OSnews comes in at only 14% of Doom's size in this pointless comparison.

Secondly, that 250MB of ram usage is certainly *not* the amount of memory the OSnews page uses. My FF total memory usage is 198MB, which includes all kinds of stuff that has nothing to do with displaying this page.

Finally, inefficiency is in the eye of the beholder. There are countless places in the Doom source code where something could have been done a lot faster or better. Doom's size is a product of what was available for its target machine (a 386 with 2MB of memory or something along those lines).

Likewise, the memory usage of FF in 2016 is a product of the fact that my computer has 8 gigabytes of memory. 200MB is only 2.5% of the memory of an average PC today. That places it firmly in the "who cares" category of concerns for a developer on the FF team. Likewise, transferring 2MB of data on my 30 Mbit connection is a one-time operation (it's cached afterwards) of 0.5 seconds. It is in fact so much in the "who cares" category that a simple config file change on the OSnews web servers (enabling deflate) would reduce it to 1/3 of its size - but it is not turned on because things are already fast enough.

Once Carmack reached his target he stopped optimizing. I can see this isn't a popular viewpoint on this site, but that is how the world works. Optimization costs money. I'm pretty sure that if you drew a graph dividing page load size by average connection speed you'd actually see something interesting: it would show whether sites are getting unbearably slow or not. Comparing the current size to an old classic computer game is just silly and shows nothing beyond the "wow, could we really fit all of Windows 95 in a CPU's L2 cache today?" kind of realization. The stuff that is fun to ponder but says very little.

Well, some people do complain, at least indirectly. My wife has mentioned that FF on her tablet struggles with bloated websites. But alas, as things keep getting less efficient it becomes the user's responsibility to upgrade when there is a software performance problem.

It has always been this way.

Man, VS2015 is so frustratingly slow that I seriously want to go back to an old & unsupported yet much faster version.

Did you install Update 2 and did you enable hardware acceleration in the Options dialog? When VS2015 came out I immediately went back to VS2013 because this product was clearly broken. About a month ago I gave it another try (I really want the C++11 features ;) ) and at least on my C++ projects it is now by far the fastest VS I've seen in a while.

Of course, if you compare it to VS98 everything is slow. But then that just once again illustrates that this isn't an issue specifically with the web - it is that developers always only optimize until things run somewhat OK on their own hardware. I'm sure that VS98 was slow compared to, say, VS2.0 if you tried both on a 1996 machine. ;)

Reply Score: 3

RE[5]: Silly comparison
by Alfman on Sat 23rd Apr 2016 20:14 UTC in reply to "RE[4]: Silly comparison"
Alfman Member since:
2011-01-28

dpJudas,

First of all, if you insist on using the 2MB number...


I don't claim there's anything special about 2MB at all. I chose OSnews to highlight modern inefficiency simply because that's where we are, but really it's just an arbitrary page on the internet. Much more egregious examples would be forbes.com, money.com, cnn.com, etc.

For example, on cnn.com even excluding multimedia there's 9MB worth of raw data. This is horrendous. There's only 15k of actual text. Some markup and scripting overhead may be justified, but 9MB should be shocking; the fact that it's not just goes to show how little appreciation we have for the potential of 9MB. (And no, there's nothing special about this number, it's just an arbitrary example.)

I took the entire cnn.com page (consisting of 8 pages or so of text, images, ads, everything) and saved it as a lossless PNG... while this is a horribly inefficient way to store a web page, take a guess how big that was?

The entire thing, as a massive full-color 1270x7205 lossless image, was only 4.9MB.

Secondly, that 250MB of ram usage is certainly *not* the amount of memory the OSnews page uses. My FF total memory usage is 198MB, which includes all kinds of stuff that has nothing to do with displaying this page.


Well, more or less, depending on whatever other plugins are installed.


Did you install Update 2 and did you enable hardware acceleration in the Options dialog?


Yea, it's a fresh update.
"visual studio is currently using hardware-accelerated rendering. The visual experience settings automatically change based on system capabilities".
Thanks for the suggestion though.
I've tried searching for solutions; it appears to be a very common problem. I think it's just a slow product.

Of course, if you compare it to VS98 everything is slow. But then that just once again illustrates that this isn't an issue specifically with the web - it is that developers always only optimize until things run somewhat OK on their own hardware. I'm sure that VS98 was slow compared to, say, VS2.0 if you tried both on a 1996 machine.


VS5/6 ran well on all my computers and I loved that about them. My circa-2003 computer ran VS2003 easily and VS2005 without issue. VS2008 was slower but usable on the same system from 5 years earlier. Now VS2015 is just intolerable on my 2015 rig. Evidently I need to buy a newer beefier system to run VS2015 well, but having bought one just last year it's not in this year's budget. ;)

Ironically this project's code base is from the 90's, so it doesn't benefit much from anything newer. Oh well, I don't really have a point to make with this, it's just a complaint.

Reply Score: 3

RE[6]: Silly comparison
by dpJudas on Sat 23rd Apr 2016 22:53 UTC in reply to "RE[5]: Silly comparison"
dpJudas Member since:
2009-12-10

For example, on cnn.com even excluding multimedia there's 9MB worth of raw data. This is horrendous. There's only 15k of actual text.

From my end-user perspective, when I tried to surf to cnn.com (a place I never go, so no caching), the page showed up right away and seemed fast on my MacBook Pro (2014 edition, I think). I really don't see the problem.

Now, IF you want me to complain about cnn.com, then it isn't the load speed. It is the *shit* they use their scripts for: tear-off headers, popups from the bottom, the scale of the page (apparently the designer needed glasses), how I can't get a quick glance of anything there. Oh god how I hate anything using responsive design frameworks. If I frequented this page my first step would be to add a Stylish script. ;)

Now VS2015 is just intolerable on my 2015 rig. Evidently I need to buy a newer beefier system to run VS2015 well, but having bought one just last year it's not in this year's budget. ;)

I don't really know why there's such a big difference between your experience and mine. My computer is from late 2014 and only has 8 gigs of memory. It is an i7 with a GTX 980 card and an SSD though. Too bad Update 2 didn't fix it for you as it did for me. ;)

Reply Score: 2

RE[7]: Silly comparison
by Drumhellar on Sun 24th Apr 2016 00:01 UTC in reply to "RE[6]: Silly comparison"
Drumhellar Member since:
2005-07-12

VS2015 seems snappy enough for me on my 3-year old i7-based laptop with a GB of ram and a slow spinning disk.

Now, it's much faster, since I have an SSD, but otherwise, the same laptop.

Reply Score: 2

RE[7]: Silly comparison
by Alfman on Sun 24th Apr 2016 04:09 UTC in reply to "RE[6]: Silly comparison"
Alfman Member since:
2011-01-28

dpJudas,

From my end-user perspective, when I tried to surf to cnn.com (a place I never go, so no caching), the page showed up right away and seemed fast on my MacBook Pro (2014 edition, I think). I really don't see the problem.


While the wastefulness irks me, as long as hardware gains come in equal measure with the overhead, you could always come along and say "I really don't see the problem" regardless of how bad it gets.


Currently, it doesn't seem there's any momentum at all for making things efficient, and I've accepted that already. However, just because we can over-provision technology doesn't mean there aren't costs for doing so, costs which are multiplied across millions of users. Because of this, just a little bit of effort from producers to optimize content and software could easily save society the billions of dollars that directly and indirectly pay for all that overhead.

Reply Score: 3

RE[4]: Silly comparison
by malxau on Sat 23rd Apr 2016 18:35 UTC in reply to "RE[3]: Silly comparison"
malxau Member since:
2005-12-04

Of course, we used to optimize stuff because there wasn't a choice. Efficient algorithms and structures were needed to run in a few MB of ram and had to load and run quickly. It took great skill and effort to do this. I kind of miss these classic CS skills, but they're obsolete...


Not completely. My working life is in kernel mode, and when I leave work, I end up taking similar values home (where it is admittedly less rational.) In my world, this hasn't changed one bit. And we always need younger recruits with dinosaur values to keep things moving.

Man, VS2015 is so frustratingly slow that I seriously want to go back to an old & unsupported yet much faster version...I regularly have to wait for the IDE to catch up as I'm typing.


Why use the IDE? In brutal honesty, new IDEs have always been excessively chunky - I remember Visual C++ 1 on a 386 with 2MB RAM, or buying Visual C++ 4 because it said it needed 8MB RAM (which turned out to be comically optimistic), or first seeing Visual Studio, which wanted over a gigabyte of disk space in 1997 (I left it on the shelf). Underneath all the gunk, the compilers are getting better.

In my current work life, I'm using VS 2015 with MSBuild projects that could be used with the IDE if others want to... but I choose not to most of the time.

Reply Score: 2

RE[5]: Silly comparison
by Alfman on Sat 23rd Apr 2016 20:33 UTC in reply to "RE[4]: Silly comparison"
Alfman Member since:
2011-01-28

malxau,

Not completely. My working life is in kernel mode, and when I leave work, I end up taking similar values home (where it is admittedly less rational.) In my world, this hasn't changed one bit. And we always need younger recruits with dinosaur values to keep things moving.


If you have a good-paying job that involves this good old "real" CS work, then I'm all ears ;) Graphics rendering and OS dev used to be my passions. I've tried to land jobs doing that, but it's been in vain. Now I mostly do websites and maintain legacy business software, because that's where the majority of the local opportunities are here on Long Island. If I were to move elsewhere it would have to pay well enough to justify uprooting my family.

Reply Score: 2

In 1995
by unclefester on Sat 23rd Apr 2016 08:46 UTC
unclefester
Member since:
2007-01-13

When I started using the WWW back in 1995, the average web page was ~5KB. It was literally instantaneous on the university's 100Mbit/s LAN and extremely fast on dialup.

Reply Score: 8

RE: In 1995
by ThomasFuhringer on Tue 26th Apr 2016 10:27 UTC in reply to "In 1995"
ThomasFuhringer Member since:
2007-01-25

It served text and pictures, and doing that should not take any more resources today.
Nowadays websites still serve text and pictures - at least that is all I want from a website - and it takes a whole power plant to make that happen.
What kind of people are producing that crap?

Reply Score: 2

You kids with your fat webpages
by BluenoseJake on Sat 23rd Apr 2016 15:35 UTC
BluenoseJake
Member since:
2005-08-11

Get off my lawn!

Reply Score: 5

Comment by Drumhellar
by Drumhellar on Sat 23rd Apr 2016 18:37 UTC
Drumhellar
Member since:
2005-07-12

Doom isn't a 3D engine.

It's a program that draws lists of trapezoids that it pulls from a BSP tree representing a 2-dimensional map.

Reply Score: 1

RE: Comment by Drumhellar
by JLF65 on Sun 24th Apr 2016 19:33 UTC in reply to "Comment by Drumhellar"
JLF65 Member since:
2005-07-06

Most people call it 2.5D, but in reality, it is indeed a 3D engine. The particular form of 3D rendering is called "lines of constant z". This means that all rendered lines are perpendicular to the viewer - flat floors and flat walls. The constant z means that perspective can be calculated once for the entire line, allowing for simple linear rendering. It's still 3D, just a very restricted version, which makes rendering fast on low-end systems.
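A toy sketch of what "constant z" buys you (names and numbers are illustrative only, not Doom's actual code): because the depth is the same along the whole span, the perspective divide happens once per span instead of once per pixel, and the texture walk is a plain linear step.

// Toy constant-z span renderer, illustrative only.
function drawConstantZSpan(
  frame: Uint8Array,   // 8-bit indexed framebuffer, row-major
  frameWidth: number,
  texRow: Uint8Array,  // one 64-texel row of an 8-bit texture
  y: number,           // screen row of the span
  x1: number,
  x2: number,
  z: number,           // view-space depth: constant along the span
  focal: number        // projection constant (screen distance in pixels)
): void {
  const worldPerPixel = z / focal; // one divide per span, not per pixel
  let u = 0;                       // texture coordinate, stepped linearly
  for (let x = x1; x <= x2; x++) {
    frame[y * frameWidth + x] = texRow[Math.floor(u) & 63]; // wrap at 64 texels
    u += worldPerPixel;
  }
}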

Reply Score: 5

RE[2]: Comment by Drumhellar
by dpJudas on Mon 25th Apr 2016 05:41 UTC in reply to "RE: Comment by Drumhellar"
dpJudas Member since:
2009-12-10

Most people call it 2.5D, but in reality, it is indeed a 3D engine.

I believe the 2.5D reference is just as much to the limitations of the levels. The game looks 3D, but the actual map is 2D with different height values for each sector.

Reply Score: 4

RE[3]: Comment by Drumhellar
by JLF65 on Mon 25th Apr 2016 19:36 UTC in reply to "RE[2]: Comment by Drumhellar"
JLF65 Member since:
2005-07-06

Yes, games didn't move to 3D maps until the mid 90's. Quake and Tomb Raider were prominent examples of using 3D levels as opposed to 2D with height info. 2D with height info is STILL popular for certain parts of games, even today. For example, the outdoor landscapes for "open" games. You then mix in a 3D mesh for objects located in that 2D map, like rocks or trees or buildings or whatnot.
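As a rough illustration of that approach (all names here are hypothetical, not from any particular engine): the terrain is just a 2D grid of heights, and anything placed in that world gets its vertical position by sampling the grid, e.g. with bilinear interpolation.

// Toy heightmap lookup: terrain as a size x size grid of heights.
// Objects (rocks, trees, buildings) are placed by sampling the height
// at their (x, z) position; bilinear interpolation smooths between cells.
function sampleHeight(heights: Float32Array, size: number, x: number, z: number): number {
  const x0 = Math.floor(x), z0 = Math.floor(z);
  const fx = x - x0, fz = z - z0;
  const h = (ix: number, iz: number) =>
    heights[Math.min(iz, size - 1) * size + Math.min(ix, size - 1)];
  // Blend the four surrounding grid heights.
  const top = h(x0, z0) * (1 - fx) + h(x0 + 1, z0) * fx;
  const bottom = h(x0, z0 + 1) * (1 - fx) + h(x0 + 1, z0 + 1) * fx;
  return top * (1 - fz) + bottom * fz;
}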

Reply Score: 2

RE[4]: Comment by Drumhellar
by Alfman on Mon 25th Apr 2016 21:21 UTC in reply to "RE[3]: Comment by Drumhellar"
Alfman Member since:
2011-01-28

JLF65,

Yes, games didn't move to 3D maps until the mid 90's.


Even though its graphics were a bit laughable, Flight Simulator goes back to the 80s, and it looked like terrain was stored as real 3D vectors with very few details, not just a 2.5D height map.


I think Test Drive 3 in 1990 had full 3D terrain, although I'm not sure the game really called for it. The computational efficiency of a 2.5D height map could probably have yielded more detail. However, you did actually drive through some "mountains".

Descent, in 1994, was the first game I remember that explicitly built all six degrees of freedom into its gameplay.


2D with height info is STILL popular for certain parts of games, even today. For example, the outdoor landscapes for "open" games. You then mix in a 3D mesh for objects located in that 2D map, like rocks or trees or buildings or whatnot.



Yea, extremely steep slopes are rare in nature, so height maps work well, and they're easy to edit and generate. I remember Magic Carpet used two height maps so you could fly around in caverns. It was awesome that the game incorporated effects that manipulated the terrain on the fly.

Aside from flight sims, it's rare to see 3d games using all 6 degrees of freedom.

Reply Score: 2

RE[5]: Comment by Drumhellar
by Kochise on Tue 26th Apr 2016 11:48 UTC in reply to "RE[4]: Comment by Drumhellar"
Kochise Member since:
2006-03-03

Hmmm, let me think...

Elite (David Braben and Ian C.G. Bell)
Zarch (same duo)
Starglider

The Sentinel
Hunter
Midwinter
Powermonger

Stunt Car Racer
Hard Drivin'
...

Edited 2016-04-26 11:51 UTC

Reply Score: 2

RE[4]: Comment by Drumhellar
by dpJudas on Tue 26th Apr 2016 02:57 UTC in reply to "RE[3]: Comment by Drumhellar"
dpJudas Member since:
2009-12-10

Yes, games didn't move to 3D maps until the mid 90's. Quake and Tomb Raider were prominent examples of using 3D levels as opposed to 2D with height info. 2D with height info is STILL popular for certain parts of games, even today. For example, the outdoor landscapes for "open" games. You then mix in a 3D mesh for objects located in that 2D map, like rocks or trees or buildings or whatnot.

I don't really think games like World of Warcraft qualify as 2.5D, because their height-map terrains have holes in them and contain world map objects. What earned Doom the 2.5D moniker was the fascination with the fact that the game looks 3D while in reality you're walking around in a 2D world. The collision detection in Doom didn't even take height into consideration, if I remember correctly.

Reply Score: 2

RE[4]: Comment by Drumhellar
by Soulbender on Tue 26th Apr 2016 06:07 UTC in reply to "RE[3]: Comment by Drumhellar"
Soulbender Member since:
2005-08-18

Yes, games didn't move to 3D maps until the mid 90's.


There were plenty of games before the 90s that did 3D maps. Not necessarily textured 3D, but 3D nonetheless. Other than the many flight simulators, a shining example is Mercenary.

Edited 2016-04-26 06:12 UTC

Reply Score: 2

A double edged sword...
by DeadPixel on Sat 23rd Apr 2016 23:39 UTC
DeadPixel
Member since:
2016-04-23

As others have noted, the problem seems to be compounded by modern browsers becoming just as bloated as the pages they render.

This has always bugged me; there are a few programs I will accept dragging my system's performance down, and a web browser is certainly not one of them.

A not-so-great workaround for me was to start using elinks for the quick one-off searches that I find myself doing when coding etc., but unfortunately the modern web just isn't built for text-only browsers anymore. I have to say though, OSnews is excellent in this regard; I've always been impressed by how neatly it renders in elinks and co.

Luakit would be perfect but is just a bit too unstable at the moment for me to use day to day. Links, when compiled with graphics support, looks okay(ish), is rock solid and blazingly quick, but suffers from the same problems as elinks.

Recently I've started using xombrero and have to say it ticks all of my boxes: lightweight, quick, and it supports a vi-like command interface. The only caveat is that it's affected by a bug in libjavascriptgtk that seems to be causing random segfaults in all of the alternative WebKit browsers atm (a minor annoyance for me compared to dealing with Firefox and Chrome).

Reply Score: 3

Your not wrong!
by Sauron on Sun 24th Apr 2016 06:11 UTC
Sauron
Member since:
2005-08-02

Try browsing the web these days using an Amiga A1200: a nightmare. It doesn't seem too long ago that you could do this easily; no chance now.
Too much bloat!

Reply Score: 3

RE: Your not wrong!
by leech on Sun 24th Apr 2016 15:35 UTC in reply to "Your not wrong!"
leech Member since:
2006-01-10

Try browsing the web these days using an Amiga A1200: a nightmare. It doesn't seem too long ago that you could do this easily; no chance now.
Too much bloat!


I was going to bring this up but you beat me to it! I have both an Atari Falcon 030 and an Amiga 4000D (nicely upgraded with an 060 and tons of RAM), and when people throw around megabytes for websites and you're dealing with 14MB (Falcon) or 128+16 Fast, 2MB Chip (Amiga), then it is massive. IBrowse at least handles Aminet fairly well. I haven't browsed the net on the Falcon yet (still setting some stuff up), but it really is hard to do on such limited resources.

Reply Score: 2

RE: Your not wrong!
by DeadPixel on Sun 24th Apr 2016 16:03 UTC in reply to "Your not wrong!"
DeadPixel Member since:
2016-04-23

I've yet to get around to setting up the Internet on my A1200; I'm hoping that the ACA 1230 will give it a hand with the worst-offending websites.

Any recommendations for browsers? I remember playing around with AWeb a few years ago and was left pretty underwhelmed.

Reply Score: 2

RE[2]: Your not wrong!
by Sauron on Mon 25th Apr 2016 02:47 UTC in reply to "RE: Your not wrong!"
Sauron Member since:
2005-08-02

I've yet to get around to setting up the Internet on my A1200; I'm hoping that the ACA 1230 will give it a hand with the worst-offending websites.

Any recommendations for browsers? I remember playing around with AWeb a few years ago and was left pretty underwhelmed.


You're pretty limited on browser choice, unfortunately. There is AWeb, which you tried already, and there is iBrowse, which is a little dated now. iBrowse is the better choice, although it will struggle with the latest websites using CSS etc.
There is also Voyager, but that is really outdated and mostly unusable except on basic sites.
There is NetSurf, but you need a 68060 for that, and there is OWB, which is used in AROS and MorphOS and requires emulation, a PPC processor or AROS on x86.
The good news is that iBrowse is getting a new release in the near future, which should alleviate some of the problems with modern websites.
It still won't solve the bloat problems though.

Reply Score: 2

Flash didn't have this problem.
by CaptainN- on Sun 24th Apr 2016 15:44 UTC
CaptainN-
Member since:
2005-07-07

Flash didn't have this problem.

Reply Score: 3