Me, three weeks ago:
Mark my words: this “next generation of Windows” is nothing but a few nips and tucks to the current, existing UI to make it slightly less of an inconsistent mess.
Nothing more.
Fast-forward to today, and we have a leaked build of this “next generation of Windows”, Windows 11, and much to my utter, devastating surprise, it turns out I was 100% right. Windows 11 is exactly what I said it would be: Windows 10, but with a few small nips and tucks (rounded corners, centered taskbar, tweaked Start menu), and that’s it. All the old Windows 95, XP, and 7-era stuff is still there, and since you can actually easily turn off a lot of the changes in Windows 11, there’s now a whole new layer of old design – Windows 10-era stuff.
If this is the “next generation of Windows”, Microsoft is delusional.
The “old Windows 95, XP, and 7-era stuff” is crucial for backwards compatibility and maintaining familiarity, so I don’t consider it a problem, as long as the icons are finally updated to be consistent. Plus, I don’t trust the Microsoft of today to provide replacements for things like the Control Panel or the Volume Manager without shipping broken stuff that’s missing a ton of features.
BTW the whole “Windows 11” thing looks like an excuse for Microsoft to break driver compatibility or sell upgrade licenses (or both). I mean, it’s been 6 years, that’s about as long as Microsoft can wait before trying to sell upgrade licenses.
I wish I were wrong and that we get something meaningful announced on June 24th (for example, updates to the Win32 experience beyond the already-announced Project Reunion), but the fact that a regular twice-yearly Windows 10 update (Sun Valley) magically turned into “Windows 11” in a matter of weeks doesn't leave much room for optimism.
The backward compatibility software and drivers should be a totally separate package. A person buying a new laptop or all-in-one PC (aka most buyers) isn't running 16-bit applications, a floppy drive, or a dot matrix printer. They don't need their machine filled with unnecessary cruft that makes it slower and less stable.
Brisvegas,
I agree it can go in an optional bundle. A small nitpick: Windows already dropped 16-bit application support on all 64-bit versions of Windows (i.e. around Vista). This is because AMD decided to drop VM86 support in 64-bit mode, which means the CPU is no longer capable of doing it directly without emulation. Funnily enough this did affect me, and I was annoyed that some of my DOS games and programs no longer worked. At least DOSEMU was available, but to this day it still doesn't run everything as well as native hardware did, especially with audio sync issues.
@Alfman
Curious if you have tried DOSBox-X? It seems to be pretty great so far for some things I've thrown at it.
leech,
No I haven’t tried it, apparently I had it bookmarked though, haha.
Looking at the website, I can't tell if it has any relation to the regular DOSBox?
None of these things are particularly critical, it’s just for running things of sentimental value 🙂
If you have a computer with fairly fast single-core performance (a 4GHz CPU), let me recommend PCem. It's an emulator that gives you accurate emulation of a computer with a Pentium 75MHz CPU, an SVGA card, a Sound Blaster, and an optional 3dfx Voodoo. You will have to provide a BIOS and it takes some effort to set up a VM in PCem, but it doesn't have the incompatibilities of DOSBox. You can even run Windows 98SE in it and it will run full-screen DirectX applications. DOSBox-X is another thing you can try.
kurkosdr,
Thanks for the suggestion! I’m glad to see it supports linux. And Voodoo emulation too, haha. I’ve bookmarked it and will try to remember to give it a try next time.
BTW 75MHz is the recommended starting setting. You can obviously crank it up if your CPU can take it.
@kurkosdr
I used DOSBox and Glide some years ago now. I cannot remember if the version I used had the patch integrated or I had to apply the patch. I think it was already integrated… This is where I began but I cannot for the life of me remember what I downloaded.
http://www.si-gamer.net/gulikoza/glide.html
https://www.vogons.org/viewtopic.php?t=16462
One graphics card lacking emulation is the Matrox Mystique. In Tomb Raider, the quality of the rendering underwater compared to a 3dfx Voodoo was something else.
@HollyB
Most emulators and VM software either support the calls of a popular 3D API (Glide, DirectX and OpenGL, or any combination thereof) or emulate a graphics card (usually a Voodoo or an S3) and let the drivers handle the API support.
Tomb Raider 1 doesn't support DirectX or OpenGL, so we don't want that.
Tomb Raider does support Glide, but the Matrox Mystique doesn't (Glide was a Voodoo exclusive), so we don't want that either.
So, the only API I can think of that is 1) supported by Tomb Raider and 2) a Matrox Mystique exclusive is an obscure proprietary 3D API they had back in the day:
https://www.youtube.com/watch?v=Bn8Bd61Zj-I&t=300s
which was apparently supported by Tomb Raider with a patch.
There is an emulator for it
https://github.com/kytulendu/MSISDK
but it has like 4 commits.
Just use a Glide patch. There is a mostly-functional GOG version that allows you to run the game under DOSBox + GliDos, and there is a fix over at PC Gaming Wiki for running it natively.
Glide patch = Glide wrapper
Also, it looks like the GitHub Matrox MSI link is for the SDK, not an emulator for modern systems, so, a Glide wrapper is probably your only option.
If your comparison was between Matrox MSI and a software renderer, you are still going to be impressed by Glide.
@kurkosdr
I do know what Tomb Raider is, what the vendor-patched versions are, and what an API is!
I played Tomb Raider the first time around when it was new in software mode and later again with a Voodoo. For nostalgia reasons some years later I used a (patched?) version of DosBox which handled the Glide calls and rendered it via D3D/OpenGL (whichever one of the two it used). I also have the GOG version on disc somewhere which I loaded up really just to see how well they packaged it.
I also played Tomb Raider with the vendor patched version supplied with a Matrox Mystique which I owned for a couple of days before I took it back and got a Voodoo. It really is better than the Voodoo version. The Youtubes don’t show this at all.
I never coded Glide but had the SDK and glanced through it. My graphics framework could dynamically load the miniGL if present, which provided a mostly usable OpenGL API over Glide, and I did some coding with that as well as with a GeForce2 MX I bought later. The ATI Rage II was a bit slow but very capable too. It was fun! Later cards just added more “stuff”.
Sorry, but this:
–Windows already has dropped 16 bit application on all 64bit versions of windows (ie around vista). This is because AMD themselves decided to stop supporting VM86 under 64bit operating systems.–
is not the true story. It sounds good on paper, but it's not 100 percent true. Dropping VM86 support only affected DOS programs. It did not affect Win16 programs.
https://lore.kernel.org/lkml/[email protected]/t/
When this patch landed in 2014, all hell broke loose in the Linux world because Win16 in Wine would no longer work on 64-bit systems.
The Intel developers admitted that the reason Windows dropped Win16 was this exact fault, which was already known about back during the development of Vista. It is kind of annoying that they were so late to tell Linux and other OS vendors about the problem. It was also bad that Intel's solution to the problem was simply to disable 16-bit protected mode on 64-bit operating systems. It took the Intel developers many months to implement a correct software fix in Linux so that using 16-bit protected mode on a 64-bit kernel would not end up as a security hole; that fix is still in place in the Linux kernel today, and it is why Wine still runs Win16 binaries.
Wine works around the missing VM86 by simply using DOSBox.
Basically, when Windows dropped 16-bit support, they dropped two parts, not one.
DOS support was VM86.
Win16 support was gone from Vista because of a known security fault that they were not going to invest time to fix, and the cheap solution was just to disable Win16. The bad part is how few of the parties developing other operating systems got told about this real problem.
This has already happened with DirectPlay, which has been moved into an optional legacy package (which is the first thing you enable if you want to run pre-DirectX 11 games, btw). The problem is that for pretty much everything else, the “legacy stuff” (win32) *is* Windows as we know it. Sure, they could go and disentangle DirectX 7 from DirectX 9, but for what gain? Those have been a single package since forever, and it's not that big of a package, tbh. If Microsoft were to remove all the “legacy” stuff, they would have to rewrite Windows from scratch (at least above the kernel). And I don't trust the Microsoft of today to be able to do that, much less do it correctly.
For drivers there is a point to be made, but most drivers are useful when people upgrade from Windows 7 and want their 9-year old WiFi card to work so they can go online and download more drivers. Again, you could remove some things, but not much.
Support for DOS and win16 does not exist in 64-bit versions of Windows.
https://github.com/leecher1337/ntvdmx64
–Support for DOS and win16 does not exist in 64-bit versions of Windows.–
kurkosdr, not quite right. Official support for DOS and Win16 does not exist in 64-bit versions of Windows, but different parties have managed to get Win16 and DOS to work.
The reason DOS and Win16 don't work officially is that in both cases Microsoft was not willing to work around the problem. ntvdmx64 shows that, for DOS, Microsoft could have done what Wine does and used something like DOSBox instead of VM86. There are other examples showing that Microsoft could have kept supporting Win16 as well; they just had to fix a security fault they knew about.
Of course, the willingness to drop DOS and Win16 did put the writing on the wall that they could cut more features in the future.
It’s really annoying how some in the Arstechnica comments on this story are calling for Windows 11 to cut off old hardware. Like how old? And who is buying new hardware with whose money? I would rather people didn’t volunteer my computer for the scrapyard just because they bought the latest whizzbang thing.
I use a small and stable number of applications. I don’t play games and we are long past the time when everything was new and evolving not to mention being done with the upgrade treadmill.
I can certainly understand the sentiment. BUT, being a developer, I can also understand the alternative vantage point. It can be frustrating to need to support MANY iterations of older OSes and/or hardware. I can’t speak for Windows, but here’s how it ends up working on Android:
– Support Android 5.0, 6.0, 7.0, 7.1, 8.0, 9.0, 10.0, 11.0, and soon 12.0 (if you’re lucky, cut off 5.0 and even 6.0; if you’re not, add 4.4 as well)
– Support various screen densities (mdpi, hdpi, xhdpi, xxhdpi, and xxxhdpi)
– Support landscape and portrait, possibly with tablet vs phone UI differences
– Support Pixel & Samsung devices (two of the more popular ones), possibly LG and others
– Don’t forget light theme, dark theme, and now Material YOU (Android 12)
Some of that can be condensed, depending on the app in question, and sometimes it can't. It becomes quite cumbersome.
I’m not saying devices should be obsolete after 1 or 2 years (I never own the very latest as my personal device b/c I don’t want to pay that much). And I’m all for Project Treble leading to more and more devices being updated to the latest Android from day 1. BUT, it does take a lot of work to support so many different device combinations.
I am not an Android developer but this sounds like an overcomplication of matters.
Supporting legacy Android versions isn't hard at all: existing code doesn't suddenly lose compatibility. The problem is rather supporting newer versions, as that requires you to use new widgets and features (for optimal UX) and therefore break backwards compatibility.
So you need maybe two versions of the app. They can of course share the underlying code if your developers aren’t incompetent.
Pixel densities shouldn’t be an issue, I would expect this to be managed by the Android frameworks automatically. Of course an incompetent developer might do things the wrong way and then there will be problems.
Supporting specific devices sounds like an ultimate edge case for exotic apps.
Supporting varying aspect ratios should be trivial. Essentially it means scaling the app wider as the screen ratio grows wider too. Not at all hard unless the app is exotic or the developer incompetent.
Many mainstream apps don't support landscape even at present; all social media apps come to mind, for example. Landscape mode is rather an optional quirk for those apps that actually benefit from supporting it at all.
“I am not an Android developer…”
Yeah, stfu.
richterlevania3,
-1 vote. This was hostile and much less insightful than the post you responded to.
Almost every platform encounters similar problems. The web is probably the worst because the gamut includes all screen sizes, including those the developer will never see. The way you develop the app makes all the difference in the world as to how it’s going to scale up or down.
@sj87
As a previous developer myself, I abstracted with a portability layer which covered not just platforms but versioning (including compiler versioning). Interfaces, business logic, and data were equally well separated. If you abstract early, support is actually really trivial. The key is to abstract as early as you can. Thereafter you can also reuse your portability layers for other projects.
There's no law saying you must use new or platform-specific features, and there are times when they are best avoided. Even then they can sometimes be abstracted and equivalent code substituted to maintain compatibility across different platforms. I cannot comment on Linux or Android at a low level, but on Windows you can dig down through the Windows SDK and find that some Win32 functions are wrappers for older functions or lower-level portable functions.
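Something like this rough sketch is all I mean; the names here are invented and it's only the shape of the idea, not production code:

```cpp
// platform_ui.h - the only header the business logic ever includes (hypothetical names).
#pragma once
#include <string>

namespace port {

// Abstract interface: application code talks to this, never to Win32/GTK/etc. directly.
class Alert {
public:
    virtual ~Alert() = default;
    virtual void show(const std::string& title, const std::string& text) = 0;
};

// Returns the implementation for whatever platform this build targets.
Alert& native_alert();

} // namespace port

// platform_ui_win32.cpp - only this file knows about windows.h.
#ifdef _WIN32
#include <windows.h>

namespace port {

class Win32Alert final : public Alert {
public:
    void show(const std::string& title, const std::string& text) override {
        ::MessageBoxA(nullptr, text.c_str(), title.c_str(), MB_OK);
    }
};

Alert& native_alert() {
    static Win32Alert instance;
    return instance;
}

} // namespace port
#endif
```

The business logic only ever calls port::native_alert().show(…), so supporting a new platform or a new API version is one extra .cpp file; nothing above the layer changes.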
HollyB,
It’s worth noting that there are many portable abstractions to choose from as well. It really depends on the needs of the project. SDL is an example of a very successful portable toolkit for games and it’s easier to use than most native APIs.
https://www.libsdl.org/
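For instance, a throwaway SDL2 sketch like this (nothing here is project-specific, it just opens a window and pumps events) builds unchanged on Windows, Linux, and macOS without touching a single native API:

```cpp
// Minimal SDL2 example: one window, one event loop. Link against SDL2.
#include <SDL.h>

int main(int argc, char* argv[]) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        SDL_Log("SDL_Init failed: %s", SDL_GetError());
        return 1;
    }

    SDL_Window* window = SDL_CreateWindow("Portable window",
                                          SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                          640, 480, SDL_WINDOW_SHOWN);
    if (!window) {
        SDL_Log("SDL_CreateWindow failed: %s", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    // The same event loop works on every platform; SDL hides the native message pump.
    bool running = true;
    while (running) {
        SDL_Event event;
        while (SDL_PollEvent(&event)) {
            if (event.type == SDL_QUIT) running = false;
        }
        SDL_Delay(16);  // idle at roughly 60 iterations per second
    }

    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```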
I've used FLTK for basic GUIs. I like that it has zero runtime dependencies. We also have toolkits like GTK, wxWidgets, Qt, etc. which can often be ported to different platforms. There are probably hundreds of these toolkits around.
I don’t think there’s anything wrong with rolling your own GUI abstractions on top of native ones, many of us have done it in some form or other, but it does mean you won’t get the immediate benefit of portability. You’ll have to create a new implementation to target a new native API and it can be cumbersome to handle the unique quirks of each platform.
For example, if you're using Win32 components such as a text box, Windows may handle events, keystrokes, threads, callbacks, field tabbing, copy/paste, mouse, context menus, etc. differently than another platform. So it's often easier to build your own controls than to try and use native controls in a portable way. But of course then it can feel out of place, haha. IMHO most developers should target portable frameworks from the beginning.
Respectfully, supporting older Android versions DOES cause issues.
For instance, support of vector drawables is different between Android 4.4, 5.0, and everything else. As in, 4.4 and 5.0 differ from each other AND from anything newer. The app will crash on older Android versions if you don’t use workarounds.
You have to maintain code for pre-6.0 as far as permissions.
Pre-7.1 you can’t have app shortcuts.
Pre-8.0 (or 8.1, I forget offhand), you don’t have notification channels, so if you need separate behaviors for different types of notifications (think different ringtones, etc.), you can’t do that as easily on older versions.
Support for camera APIs is another one, both across Android versions AND hardware. CameraX was created exactly because of that; it is there to hide the complexity of that mess. Chet Haase did a video on it at I/O (last year I think).
Yes, the appcompat / androidx libraries help A LOT. BUT, I have run into numerous issues having to support older Android versions.
cacheline,
I agree with your points, however I think the essence of sj87’s point was that new programs are more likely to break backwards compatibility rather than old programs to have broken forwards compatibility. I’d be curious to see tangible hard data about this, but I think he’s probably right.
Consider that, on the one hand, compatibility can break on older computers because the developer chooses to target a new API (i.e. a dev knowingly switching to the latest version of .NET), but on the other hand compatibility can break if the OS chooses to drop support for an existing API. These are very different scenarios with different outcomes. In the case of Windows, the latest version dropping support for existing APIs (and therefore lots of software) would be a big deal.
Alfman,
Understood. Though, I can see where Microsoft doesn’t want to be tied endlessly to supporting APIs from decades ago.
It may well be worth taking a page out of the automotive parts suppliers' handbook as far as solutions go… I can get most parts for decades-old vehicles even now. Maybe I can't get them for more obscure / less popular vehicles. But, for instance, you can easily get parts for old Ford/Chevy/Dodge vehicles from 40+ years ago. There are also parts suppliers for DeLoreans and Fieros b/c there's a market for it. You may have a vehicle lacking modern features, but if that's what you want, you can maintain them easily.
It seems that Wine and ReactOS are doing some of that. Such an approach gives both sides a way to continue in their chosen path. We might argue it is unwise for Microsoft to drop so many customers, but that’s a different situation to discuss.
Code a portability layer and abstract early. It really does make supporting different platforms and versions trivial.
Who cares if version x, y, or z draws vectors differently? If you have your code structured properly from the beginning and use compiler flags where you have to it’s no bother at all. Also you don’t have to use every new feature just because it’s there.
The key is coding for portability from the ground up and designing for portable code. Your portability layer may include flags for different compilers and versions, 32- and 64-bit variables, different but equivalent functions, et cetera. It flows through your software design approach, so you catch all the platform and version specifics as low down as possible, and it's all hidden behind higher-level structures you switch in or out at compile or run time. Maintenance is trivial as almost all the code rarely changes. It's only a headache if you didn't code for portability from the beginning.
Almost all your business logic won’t be affected by API changes.
You can also ship more than one binary if there is a hard break from one version to the next.
Anyone who has problems coding for portability between platforms or API version changes simply has their business logic too close to the API.
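To make that concrete, the lowest layer can be as dumb as this; the names are invented and it's just the shape of the idea:

```cpp
// port_base.h - the one place that knows about compilers, platforms, and word sizes.
// Everything above this header is plain portable C++.
#pragma once
#include <cstdint>

// Fixed-width types so 32-bit and 64-bit builds behave identically.
using u32 = std::uint32_t;
using u64 = std::uint64_t;

// Different but equivalent functions, selected once, down here.
#if defined(_WIN32)
    #include <windows.h>
    inline void port_sleep_ms(u32 ms) { ::Sleep(ms); }
#else
    #include <unistd.h>
    inline void port_sleep_ms(u32 ms) { ::usleep(ms * 1000); }
#endif

// Compiler-specific bits get the same treatment.
#if defined(_MSC_VER)
    #define PORT_FORCE_INLINE __forceinline
#else
    #define PORT_FORCE_INLINE inline __attribute__((always_inline))
#endif
```

Business logic only ever sees u32, port_sleep_ms(), and PORT_FORCE_INLINE; when a compiler, platform, or version changes, this header is the only thing that moves.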
The first bullet is kind of disingenuous. The entire API doesn’t change every single revision. Yes things do change, but a vast, vast majority of apps won’t need to be entirely (or even partially) rewritten for each API version, or even likely need to handle multiple API versions for many of their features.
Also, supporting landscape/portrait, system themes, multiple popular devices, or multiple screen resolutions is really not related to supporting old devices. That’s a completely separate issue.
@HollyB
Yes, I totally agree. While I'm generally quite positive regarding Win 10, the one thing that will break my confidence is loss of backwards compatibility. I'm not against Win 11, provided my existing critical apps run on it.
In fairness to Microsoft, I found upgrading to Win 10 from Win 8.1 was a performance increase on my old hardware, but it did take a while to get some drivers sorted from the initial offering.
As for the Windows versus iOS debate, at this rate of evolution they’ll merge into a common platform in about 4 or 5 generations. PC becomes tablet, tablet becomes PC, the differences are greatly diminished despite the marketing spin claiming the very opposite.
As much as I love tablet / hybrid interfaces for browsing media, they are dead set useless for most critical applications. Finally, don’t get me started on lefty versus righty.
What I don’t understand is the “need” to create new versions over and over. I mean, what’s the absolute benefit in added functionalities that Windows 11 might have over Windows 10 ? What Windows 10 have so much more that Windows 7 haven’t ?
The problem is the planned obsolescence that creates so many different versions for developers to support. Plus the inability for users to upgrade (Android, I look at you, buddy). Hence we are in a fabricated mess that no one wants to handle correctly.
On my side, whatever only needs Win32 functionality will remain under Win32. Security problems? Fix your Win32 layer, don't ask ME to switch to a newer and shinier API if YOU made the mess in the first place. New stuff tends not to be very mature, and developers will lose time upgrading/adapting, only for it to be supported for a while and then ditched by the manufacturer.
UX should be a solved problem by now; it started at Xerox and had many PhDs spending time on it. I don't understand the shit UX has become, and Windows 11 is a perfect example of that. “Simpler and cleaner UI”? Bollocks, everything is now hidden under several layers of sub-menus. What a great benefit for UX!
So, not wanting to sound old fashioned, but when it works, not change it.
“So, not wanting to sound old fashioned, but when it works, not change it.”
You sound like that, yes.
Also, by your logic, every smartphone should be using Windows CE and Symbian to this day.
What's so different besides higher DPI and a faster CPU/GPU?
You had applications, notifications, you could phone (still have my N95 under Symbian 6).
Used PocketPC 2002 under Windows CE 5, worked well considering the limitations (resistive screen, 240×320, 300 MHz StrongARM, Compact Flash).
@Kochise
There are lots of differences. The ASICs running in modern smartphones offer orders of magnitude more performance and functionality.
You have multicore systems with GPUs with programmable shaders, NPUs, integrated 5G modems, WiFi, GPS, fast local storage, several GBytes of RAM, powerful radios, Bluetooth, spatial audio processing, insane image sensors and post-processing. Etc., etc.
All that requires modern OS support, device drivers, and APIs to expose the functionality.
Some of y'all are long past the time where you could keep up with technology, and are no longer the target audience. But there are newer generations of people for whom a smartphone or tablet is their main computing device. And they expect a lot more functionality than a lot of the old farts on this site.
javiercero1,
Yeah but so much of it is just more of the same. And while there are exceptions to this, most upgrades come in the form of spec bumps without really changing the way we use computers.
That’s totally an ad hominem attack, but it made me laugh all the same 🙂
No. There have been some serious advances in the SoC area. E.g. the world's most powerful core comes from mobile. It's hard for some to see through the lens of cynicism how disruptive mobile SoCs have been, to the point where we're getting Tflop-level performance on battery-powered devices that run in your hand.
The interaction with a smart phone is fundamentally different than with a PC, and much more accessible for most people.
javiercero1,
Am I supposed to take that literally? The last benchmarks I saw showed AMD having the most powerful core. I believe Apple's A14 is the fastest mobile core to date (let me know if it isn't); if so, its performance remains behind its PC counterparts for now.
Not that I think it really matters because IMHO phones have more than most users need.
The technology is certainly impressive, but it’s overkill for what I see users using their phones for on a daily basis.
There's a similar argument being made on the PC side, where unless you are running a server, high-end CPUs are overkill for normal consumer gaming and applications. Even high-end gaming GPUs have gotten so good at faking it that many casual users struggle to tell if raytracing is even on in an A/B testing scenario.
“Is RTX a Total Waste of Money??”
https://www.youtube.com/watch?v=2VGwHoSrIEU
I’m not saying that it’s not innovative because it is. And I’m not saying there are no applications because there are. But upgrading has naturally become less dramatic with time and is nothing like the night and day upgrades that we used to have. I’m not blaming the industry, it’s just a natural consequence of market maturity.
@javiercero1:
I'm not speaking about power in your hand; a smartphone still does less work than a regular desktop PC. Does your smartphone do Blender? Full-scale Office? Doom Eternal?
PCs had webcams long before your smartphone did, and like @Alfman told you, it's only a spec bump with dedicated SoCs to offload the processing/rendering load, not something bringing more functionality per se.
As for adding functionality without always needing to change the whole OS, ever heard of something called… drivers and DLLs? Loadable libraries that add functionality. Changing the visual theme doesn't require changing the whole OS either.
Today's smartphones are just older PocketPCs on steroids. Going from 2G to 5G doesn't need a new OS to handle it, just the right driver. Handling permissions doesn't need a whole new OS either, just the right OS architecture and kernel patch.
Don't put the responsibility on OS upgrades to bring more functionality; otherwise it's like saying applications are worthless unless the OS brings those functionalities natively.
That developers “forgot” common sense and need a constant stream of whole new upgrades to bring in functionality is just a complete misconception of how computers really work and how they should be upgradable in an iterative fashion.
Otherwise it's just useless pieces of junk whose lifespan is already set in advance. It's like a technological fix you have to take over and over to stay on top.
@Alfman
Apple's Firestorm core has the best IPC and performance per watt right now of any high-performance CPU. On a cycle-by-cycle basis they're the most powerful cores.
Technology has improved at the same exponential rate, it’s just that your own personal use cases haven’t. So yeah, if all you do is use a computer for word processing or light web browsing, you certainly are not going to see much of an improvement, as those were problems solved long ago.
But if you’re a content creator, a scientist, a developer, a gamer, etc. You certainly get to see the dramatic improvements in performance.
Also, the vast majority of mobile users are probably in a demographic you're not a part of, and you'd be surprised how much computing performance some of the apps kids use these days require.
@Kochise
Nah, it’s just more reductionist nonsense by people who are out of the loop and can’t keep up with the complexity.
Stuff running on a modern smartphone is orders of magnitude more complex in terms of functionality and computing requirements than that “pocket PC” of 15 years ago. You don’t realize it’s an even faster curve than what happened from the terminal-based PDP of the late 70s to the high end graphics workstation of the 90s.
javiercero1,
I accept that apple and ARM in general have more efficient cores. It’s the statement that “The world’s most powerful core comes from mobile.” that was inaccurate and/or misleading.
You know absolutely nothing of my personal use cases so let’s not even pretend that you have anything meaningful to say about that, haha.
Honestly though today’s upgrades are not nearly as dramatic as they used to be.
I’m not denying that we do see improvement in specs all the time: more cycles, better cameras, more pixels, and so on, but these upgrades are definitely becoming less impactful than they used to be. Take today’s ultra high HD video resolutions going up to and past 4K and all of the engineering it’s taken to get there. Is it nice to have? Sure, why not, but our upgrades today are relatively subtle.
You accuse everyone else of being too old and out of the loop, but perhaps there’s truth in the opposite: you may be too young to remember just how fast things used to change. If you didn’t experience that then I understand your skepticism, but it’s still true. This doesn’t strictly imply that the early years were better, we have objectively better technology today after all, but in terms of the rate of dramatically visible progress there’s no contest whatsoever: things were changing far faster back then.
And I want to dismiss any notion that this is in any way the fault of today's engineers, who IMHO are just as talented as earlier generations. It comes down to diminishing returns. Even assuming that Moore's law has remained constant, the doubling of transistors simply doesn't have the same night-and-day impact that it did a few decades ago.
@Alfman
No, what I said stands. Apple's Firestorm is currently the most powerful and efficient core microarchitecture and implementation. A Firestorm core is about 7% faster than the fastest Zen 3 core and around 20% faster than a 10th-generation i9 core.
If you show even an iPhone 4 to a bunch of kids today, they will marvel at how slow it was and how much things have changed with respect to the latest iPhone 12. Because it's their world now, and they're the ones keeping up with the mobile developments.
It's just the extreme insecurity of the old farts on this site, with their cynicism and humble bragging whenever new tech comes along. Y'all don't really understand how exponential advances work if you think the rate of advancement has slowed down. You have to use some subjective, highly qualitative perception, usually based around when you were “with it.”
Just in the semi industry, things have changed more in the past 5 years, for example, than in the previous 20 in terms of fab technologies alone. Things like EUV and the stuff that needed to happen to get 5nm out of the door are mind-blowing. We can now get the same amount of flops, in the palm of our hands using a battery, as the fastest supercomputer at the turn of the century, which used 1 megawatt.
javiercero1,
Come on man, we've gone over this before. Apple's CPUs do very well single-threaded, but show me any benchmark of your choosing that shows Apple's chip coming even close to Intel & AMD CPUs under a full CPU load.
You may argue that Apple's CPU would perform better if they were to replace all their slow high-efficiency cores with fast cores. Hypothetically that could be true; people, including myself, actually expected Apple to ship more fast cores in the 2021 iMac, but guess what, they didn't do it. And reviews may give a clue as to why: the M1 unified architecture runs very hot, up in the 90C-100C range.
https://www.reddit.com/r/macgaming/comments/khp993/cpu_temperatures_of_the_m1_while_gaming_maybe_too/
So until they physically ship a product that bests the top scores of their competitors, we’re only talking hypothetical. Could apple hypothetically win? Sure why not, but they won’t get the title until they actually have a shipping product with reviewers independently confirming the benchmarks.
What are you talking about? And what’s up with all this ageism crap? Seriously dude give it a break.
And? I already know the technology keeps improving, I just said doubling transistors doesn’t have the same impact that it used to.
@Alfman
We keep going over this because it is clear to me now that you’re on the spectrum.
ON A PER-CORE BASIS Apple's Firestorm is the fastest microarchitecture right now. Apple's high-performance ARM CORE is faster than both AMD's and Intel's current fastest x86 COREs. In both Geekbench, and by significantly more in SPEC (which is the suite we use in the microarchitecture community). This is not a hypothesis; these are real-world validated benchmarks.
So, Apple right now has both the lead in core microarchitecture and process node over AMD and Intel. Which is disruptive because this is a new territory.
Ageism? Nah, I work with plenty of old farts, experts in their fields, who have no issue keeping up. It’s the insecure old farts in sites like these, with their annoying humble bragging whenever something new is announced. Always talking about the same nonsense from 20/30 years ago.
So when you say stuff like “doubling transistors doesn’t have the same effect that they used to do.” Well that’s a completely subjective blanket statement. Because there are problems which benefit clearly from that increase in transistor budget, but since they are out of your radar/scope you don’t notice them.
Whereas if you talk to a younger person, they will most definitely note the increase in performance and transistor density in a mobile device from 2011 vs 2021. Because for them, stuff like filters, augmented reality, gaming performance, and whatnot is what's on their radar of what's important. Whereas you're probably from the era when increasing resolution in PC monitors was the bee's knees, for example. Since increasing resolutions reached a state of diminishing returns, you assume that there must be poor returns elsewhere as well.
javiercero1,
Until these cores are shown to scale up in real world CPUs, you’re just talking about hypothetical performance on paper. It doesn’t do anyone any good to have a hypothetical CPU that beats a real one.
Your ageism is on full display. You speak to people as though you know better than them because they’re too old to get it.
@ Alfman
The M1 has been out already for a while. The benchmarks are out too. The results are quantitative and publicly accessible. You still refuse to concede the point.
This is the nth iteration of this cycle, so yeah, at this point I'll just assume you're a stubborn old fart out of the loop.
javiercero1,
We've already looked at the M1 benchmarks though. The M1 has great ST scores, but thus far Apple does not have a competitive CPU for heavy MT workloads.
I have to keep asking you to supply benchmarks and data to support your rebuttals, yet you keep resorting to ad hominem ageist attacks instead.
Kochise,
If it doesn't change, it's harder to sell. But at the same time, backwards compatibility is one of Windows' strongest assets, especially for businesses. When things break or need to be replaced, it can cost billions of dollars in downtime, support, patches, new development, testing, etc. I think Windows and desktops are now mature products, and it's really hard to convince the market that “new” operating systems are all that much better than the operating systems they're replacing.
As developers, we need to take a very careful look at what their motives are when they break backwards compatibility. Microsoft pushed so hard for the Metro changes in Windows 8 over customer protests because they were betting on new walled-garden apps, and I believe they intentionally made the classic desktop a bit jarring in an attempt to convince users that Metro apps were better. Obviously Metro didn't work out back then, but I think Microsoft may still be looking for ways to increase their control through planned obsolescence, which requires breaking backwards compatibility with more open APIs.
Making us change takes a carrot and a stick approach. But since the carrot becomes less effective under a mature market where most people already have what they need, perhaps we’re going to see more stick.
And telemetry? Yeah, users want telemetry, they requested that “feature” so vocally…
Kochise,
Yeah, the things that we demanded. Haha.
I agree that doing “new and shiny” just to do new and shiny is not the best way to go about it. Android 5.0 was revolutionary at the time (not just because of Material Design, but because of other features like Doze). Similarly for Android 6.0. But not all versions have added much.
But, as a developer, I also know that sometimes the mess *necessitates* totally rewriting certain pieces. Why? A couple of quick reasons come to mind: (1) you couldn’t anticipate all that the software would be asked to do (even if you really did plan it out well the first go around) and (2) bad and/or inexperienced developers created a maintenance nightmare & it’s more work to fix it than to toss it out and “do it right”.
I’ve done enough code reviews to know #2 is a BIG deal in the industry. Companies often want to optimize costs in software development. One way they do it is hiring 2-3 inexperienced devs to do the work of 1 more experienced dev. It “seems” good, but is often not.
“What Windows 10 have so much more that Windows 7 haven’t ?”
Virtual desktops? WSL? Sandbox?
A 2006 Core 2 Duo desktop is 64-bit, SATA, with PCIe and USB 2.0. Yet it is generally far less capable than a $50 Raspberry Pi 4 which uses 1/50th the energy.
There is essentially zero need to still support IDE, PCI, AGP, VGA sockets, PS/2, Centronics printer ports and all the 20+ year old legacy hardware which nobody still uses and hasn't been sold for a decade.
Agreed. I have a Lenovo Thinkpad Yoga that’s only a few years old, and it lost its touchscreen driver with a recent Windows update. No amount of reinstalling the official Lenovo driver, rolling back Windows 10 to an older version, or even hacking around in the registry will enable it again. I can install any Linux distro or BSD, even Haiku, and it’s working just fine, so it’s purely a Microsoft “we don’t give a shit about your perfectly capable device” issue. It’s not so much a problem for me personally since I bought the laptop with the full intention of running Linux and OpenBSD on it, but for the hundreds of thousands of Lenovo customers out there with the same and similar Yoga laptops it’s a ticking time bomb they don’t deserve to have blow up in their faces.
People call out Apple and Google for their forced obsolescence, and they are correct to do so, but Microsoft does it as well and it’s so frustrating when a $700 machine is only good for a couple of years because you never know when Microsoft will decide to deprecate perfectly working hardware for absolutely no reason.
Lenovo have a generic support page for this. I have no idea if it will help. Other than this Microsoft used to be really really good at driver support. If you used the report function (wherever that is buried) the driver team could turn around a new driver in weeks or even days. I think it was my tablet driver which was a bit iffy and it took two days for Microsoft to issue an updated driver as it became available in Windows update a few days later. I understand they closed this down along with their quality control people and technical writing people.
My Lenovo ThinkPad was on update hold for most of last year, up until a couple of months ago, because of a Synaptics audio driver issue. It didn't affect my laptop, but the hold was annoying. Oh, well. Fixed now, but it does remind you how vulnerable you are when you have a laptop as opposed to a desktop where you can swap parts.
The “last version of Windows” bs got binned as soon as Apple released version 11 of their competing OS… What a surprise.
Yep. MS has been one-upping Apple at least since the original Mac came out in ’84. That’s their running strategy. They reinvent Apple’s wheel, making it octagonal instead of round.
Methinks Nadella decided that MS wasn’t making as much revenue from Azure as he thought they should be by now (AWS still eats MS’ cloud business for lunch). So those recurring upgrade fees now sound better and better. I knew the ‘last version of Windows you’ll ever buy’ was bullshit.
It’s perhaps the good momentum for desktop Linux to arise.
If only it wasn’t such a mess (X11/Wayland/MIR/…, Gnome/KDE/Xfce/Unity/…).
Not gonna happen, as much as I would like it.
Just the other day ANOTHER window manager/distro was created promising, AGAIN, the ultimate distro for us users.
Yeah, not gonna happen, ever.
Mate is perfect for my needs. Everything is done simply and logically.
Windows and MacOS are now both in a state of utter bloated crap. That is why I am sticking to Linux with the Mate desktop. At least that is way purer and has a better workflow.
Hiding the list of all software installed, behind that tiny button, is a major design fail. Why? Oh why?
Thankfully for Apple, Microsoft was nice enough to save them from bankruptcy, https://www.cnbc.com/2017/08/29/steve-jobs-and-bill-gates-what-happened-when-microsoft-saved-apple.html
Apparently “minimum” install size of Windows 10 is 32GB: https://www.makeuseof.com/tag/much-space-need-run-windows-10/ . It can go lower, but at a big cost.
I remember booting Windows 3.1 from a floppy (the actual exes were on a network drive, the floppy only contained network drivers and ini files). Windows 95 could run with less than 500MB hard drive. I think XP was the one that broke the GB barrier. Today Windows is larger than ever.
However that is not all wasted space. There is a large “SxS” folder for system components, WOW64 for running 32 bit applications, legacy versions of all common DLLs and their patches, and so on. If you run any random Windows app from the last 30+ years, it is likely that you need one or more of these.
On the other hand, if Windows were reduced to just kernel + UI + modern apps (no Win32), then it would run faster. Yes, we have that as the “embedded” versions of Windows (now called “IoT”), which can be installed on a Raspberry Pi:
https://www.windowslatest.com/2020/02/09/heres-how-windows-10-runs-on-raspberry-pi-4-and-3/
So the choice becomes:
– Keep the current size, and stay backwards compatible
– or make it lean, but drop support for 99% of the software.
I would agree with you in a way, yet even user-friendly Linux distros weigh a lot by today's standards. So…
Oh please, have you ever tried to restore a phone lately? The files are coming in at over 10GB in size, and that is with compression. A LITTLE EXCESSIVE for a gimped/Mickey Mouse embedded operating system that can't even properly multitask. I am looking at you, Android, closing my Firefox and GPS in the background. My 8-year-old Windows PC will still boot faster than any Android or iPhone with UEFI disabled, about 4 seconds cold boot.
We live in the third decade of the 21st century. 32GB is a trivial amount of storage for a modern computer.
Sure, but not for everyone.
Remember the push for $100 laptops, of OLPC fame?
People in education need access to cheaper hardware. Just giving them more money does not work (well, it would, but we are not doing that, for … reasons…). So giving more computers at a cheaper price is always a good alternative.
Windows is out of the scope of OLPC. The licensing costs of Windows alone make it out of the question for a $100 device, which is an insane price point you’re expecting.
They had a confusing scheme, but Windows 10 S was essentially “free”:
https://www.techradar.com/news/windows-10-cloud-release-date-news-and-rumors
Of course they tried to subsidize it by locking out several applications, which was not well received. But still, there were devices as cheap as $189, which is not too far off from the target.
And those devices had the storage requirements for Windows. Again, I don’t see the issue here.
Comparing a modern iteration of Windows with the requirements of Windows 3.11 makes little to no sense. Windows 3 RAM requirements could fit on the cache of a modern CPU. So what?
This morning I installed the latest Windows Server 2022 preview with UI. It takes 7.8Gb of disk space when using CompactOS, and 1.7Gb of that is the system page file.
The reasons that Windows ends up taking a lot of space aren’t the reasons most people think, and they are touched on in that link:
1. Using entire new builds as an upgrade mechanism means the system needs to store two instances of the system during upgrade. It keeps one of these around in the form of Windows.old for a while.
2. Pagefile/hiberfile/swapfile – things that are really a function of RAM size, not Windows code size.
3. Caches. A web browser will typically use 1Gb of disk space for caching, but that’s per browser, per profile. Things like Windows update keep a cache of downloaded state of their own which is typically a couple Gb. This is part of the reason for recommending people run disk cleanup, to trim overzealous caching.
4. Logs. The push towards diagnosability and telemetry means a lot more of these than in the past. Again, disk cleanup.
To me the real story of Windows disk space is what didn’t happen: Windows RT was a 32Gb device in 2012 (which was as small as could be achieved at the time), and that effective minimum hasn’t increased in 9 years. At the same time, the space needed for smartphones has increased dramatically – looking at iPhone models, the current model is selling with 4x the storage compared to the iPhone 5 in 2012.
> On the other hand, if Windows is just reduced to kernel + UI + modern apps (no Win32), then it will run faster.
Remember that Win32 was designed to run on 486s with 8-16Mb RAM and 100Mb of disk space. Win32 isn’t really where the size of Windows is going. If you want it to be smaller, remove the newer parts – part of the reason that Server 2022 install is that small is because it doesn’t have modern apps, but does have Win32.
> If this is the “next generation of Windows”, Microsoft is delusional.
On the contrary, Microsoft is being realistic.
Thanks to their API and tooling mismanagement since Windows 8’s introduction, most Windows shops never embraced .NET Native and WinRT, so they are rebooting Windows as if this path never happened.
Windows 11 is going to be Windows 7 + WinUI/Project Reunion APIs, that is all.
The “problem” with Windows is that it got done around Windows 7. Ever since then, Microsoft has been more or less focused on how to bring a successful “Microsoft App Store” to Windows. So far that hasn't happened. Personally I don't even know how Windows 11 could be made such that people would say, OK, that is something new and still Windows at the same time.
Marketing shoving you “revolutions” down your throat ?
Let’s wait for the official announcement but yes it’s usually just about that.
In this case… the absolute worst design choice is that the program list of everything installed is hidden behind that tiny button. Like, who the f**k thinks like that and chooses that design solution? On the Mate desktop you have what operates like the classic Windows start button, where the first choice is all programs in one list and the second is favorites. Then it is grouped into the usual programs, games, administration and so on. It makes for a better workflow.
Yes, I too don't know what exactly they are doing. Not just with Windows; GNOME, for example, too. Most end users made it perfectly clear GNOME 40 is not usable for them without an extension such as Dash to Panel. And what do they do? They break it, let other developers work on it for a couple of months, and pretend people are just fine with that and that they are still right. We just live in this funky decade where everybody is experimenting with some half-baked desktop/mobile UX metaphor that nobody is really totally satisfied with. Maybe next decade will get better in this regard.
Looks kind of fake to me
Looks like a bad execution of Windows, KDE, GNOME, MacOS and Mate in one big pile of bad integration. Like, why are all programs and apps hidden behind that tiny button in the top corner? I would go insane if I had to deal with that on a daily basis.
I’ve installed the thing earlier this afternoon, and I really struggle to see how they can call it a new version of Windows.
The taskbar in the middle looks and feels weird, and I keep going to the left looking for Start. The OS itself isn't as responsive as Windows 10 was on the same machine.
In terms of dropping old stuff, the leaked build doesn't start on my usual testing laptops (circa 9 years old with Core 2 CPUs), hence my using a 5-year-old laptop instead.
I read that it requires a device with TPM which weirdly is actually deactivated at installation time.
Obviously this is a leak and therefore not even beta quality, but all my applications seem to work. There is a strange “date bug” with some of the software: somehow they show up as having been installed tomorrow, but otherwise everything works.
Interestingly, the OS is described as Windows 10 Gold version 14 (I upgraded from W10 Enterprise 21H1), or Windows 10 Next in the upgrades and updates area. You have to dig a bit in order to find the Windows 11 mention.
In a nutshell, it’s a work in progress and it really doesn’t move the needle from a software viewpoint.
As far as I can see, it looks like they took Windows 10 and plastered ideas and solutions from KDE and MacOS all over it, in a bizarre way. Like, I saw KDE instantly.
6 years ago, Microsoft said that Win10 would be the absolute last Windows version ever, because they wanted to focus on bringing the newest stuff and candy to the user. They said that Win10 would be a service from now on…
And now they are about to launch Windows 11 and will end-of-life Windows 10 in 2025. Basically screwing people over yet again, because they promised that Windows 10 would continue forever and that it would be the last version that people would have to buy.
And I told people that there would be a new version at some point. Now I can't help but think Windows 10 has been in a permanent beta state all the way from launch. It would not surprise me if that info somehow got leaked.
Pretty sure I’m in the minority but my dream is they announce a Windows 7 revival with proper HiDPI support. I never wanted to give it up.
Not in the minority, I'm with you on this one. Windows 8 and the path it took was a predictable failure, as it wasn't motivated on logical and technical grounds but by marketing greed with dubious justifications.
WinRT and .NET Native were a very good way to reinvent Windows for developers. Unfortunately, a mix of the same managers that want VB 6 to still be supported on Windows 10, coupled with the bad execution of how to migrate to the modern stack, means Win32 and classical COM kept being the main APIs chosen by Windows shops.
I only ever coded for Win32. With well abstracted code it’s fairly trivial to add new code to handle major API changes. It just slots in. You can hide entire frameworks behind a DLL if you want to support a language change. It just needs people to abstract properly.
I don't code now, but one reason why I'd stick with Win32 is because Microsoft change their new APIs as often as they change their socks. If Microsoft completely refactored their OS into something sensible, stuck with it, and offloaded “legacy” APIs to a subsystem, perhaps something new would have more take-up.
Otherwise as an affirmed end user I really don’t care as long as my stuff works and I can stay off the hardware/software upgrade treadmill.
Yeah, this. You stick with what you know will work, but I do think .NET is safe; it's basically the new VB6 in terms of what companies reach for when they want a line-of-business desktop application built.