I hate how these months keep going down like vodka-martinis on an Italian beach, but at least we get another progress report for Haiku every time. Aside from the usual small changes and bug fixes, the most important of which is probably allowing the EXT4 driver to read and write again, there’s this little paragraph at the end which definitely stands out.
This month was a bit lighter than usual, it seems most of the developers (myself included) were busy with other things… However, HaikuPorts remained quite active: most months, at this point, there are more commits to HaikuPorts than Haiku, and sometimes by a significant margin, too (for May, it was 52 in Haiku vs. 258 in HaikuPorts!). I think overall this is a sign of Haiku’s growing maturity: the system seems stable enough that the porters can do their work without uncovering too many bugs in Haiku that interrupt or halt their progress.
↫ Haiku activity report for May
I definitely hope that this positive read is correct, as it would be a shame for the project to run into declining activity and contributions just as it seems to be serving as a solid base for quite a wide variety of applications. I’ve definitely been seeing more and more people giving Haiku a try lately and coming away impressed, but of course, that’s just anecdotal and I have no idea if that means Haiku has reached a certain point of maturity.
One thing that does indicate Haiku is a lot more stable and generally usable than most people think is the massive number of solid ports the platform can handle, from Firefox to LibreOffice and everything in between. I think a lot of people would be surprised by just how far they can get with their day-to-day computing needs on Haiku, assuming their hardware can boot Haiku and is properly supported, of course.
My opinion on Haiku has not changed, but I’m a random idiot you shouldn’t be listening to. The cold and harsh truth is that old people like me, who want their BeOS boomerware but in 2025, are a small minority who are impossible to please. The Haiku team’s focus on getting modern software ported to Haiku, instead of trying to convince people to code brand new native Haiku applications, is objectively the correct choice to ensure the viability of the platform going forward.
If Haiku wishes to fully outgrow its hobby status, looking towards the future is a far better approach than clinging to the past, and unsurprisingly, Haiku’s developers are more than smart enough to realise that.
Sure, BeOS makes sense if its native “superpowers” are used.
I remember that back in the BeOS heyday I was able to timeshift analogue TV (a very intensive task) using a cheap TV card on my i486 DX100 Olivetti Envision.
On Windows and Linux I was finally able to timeshift live TV in 2012, when DVB-T digital TV arrived. I think that even in 2025, if analogue TV were still a thing, using a TV card without internal HW acceleration would be hard, no matter the CPU used.
Just an example, but BeOS was capable of things unknown elsewhere.
Now, in 2025, most of the current apps are just ports from other platforms, which usually makes them worse than on the platform they come from, when they could be not only better, but hugely better.
I wish the EU would put a limit on the electrical power a CPU or a whole PC can consume; then maybe SW companies and developers would go back to squeezing every bit of computational power they can, just like they did in the good old days (with positive effects on the planet’s health).
the solutor,
Without compression, capturing a raw NTSC stream at 720×480 with both interlaced fields at 29.97fps and 16bpp is roughly 21MB/s, easily within PCI specs, but we also have to remember that the data has to traverse PCI multiple times on its way from a capture device to a hard disk. And if you’re playing back at the same time, it adds up. Obviously the Achilles heel for spinning disks isn’t bandwidth, but seek time. IIRC many cheap cards supported MJPEG, which didn’t perform interframe compression, but still managed to lighten the load.
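Just to sanity-check that figure, here’s a quick back-of-the-envelope calculation (a minimal sketch; the 720×480 frame size and the NTSC frame rate are taken from the discussion above, and a 16-bit-per-pixel packed YUV format is assumed):

```python
# Rough bandwidth of an uncompressed NTSC-resolution capture stream.
width, height = 720, 480        # full frame, both interlaced fields
fps = 30000 / 1001              # ~29.97 fps NTSC frame rate
bytes_per_pixel = 2             # 16 bpp (e.g. a packed YUV format, an assumption)

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{bytes_per_second / 1_000_000:.1f} MB/s")  # ~20.7 MB/s, i.e. roughly the 21 MB/s quoted
```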
Yeah, I’ve always had a preference for native software that was designed to work nicely with host OS facilities. I also think developers did more optimization to squeeze every ounce of performance out of specific hardware combinations…they had to, out of necessity. Today software forgoes most of this refinement and optimization because Moore’s law has provided so much more headroom, despite more demanding resolutions. Of course things like ray tracing will bog things down, but traditional rasterization is often not worth optimizing when the pipeline can deliver hundreds of frames per second regardless.
I agree with you that it would be nice, but I honestly don’t see how this would work, both in terms of how the law would attempt to quantify efficiency and how it could be enforced. Heck, forget applications and consider the inefficiency of modern websites. Clearly devs can do better, but by and large software companies aren’t interested. The industry takes it for granted that hardware will get upgraded instead, and even as someone who’s good at optimizing, I learned long ago that it is futile to fight this. I can’t see it happening, but I am curious: how would you envision optimization efforts working?
720×480 is more of a PALish resolution, but anyway, the matter doesn’t change.
I never had the pleasure of using deferred live TV on any other machine, no matter the CPU, while on BeOS it worked without breaking a sweat on a 486 machine (albeit an Olivetti-quality machine that already had a PCI bus).
Not only did it work in BeOS, it worked out of the box: I just booted the system for the first time and the TV card was listed in the TV cards section of the control panel.
Well… “not interested” is a euphemism; clearly such a move would be seen (especially in the US) as a wrench in the gears of capitalism, but states and governments are supposed to do exactly that: put a limit in place when the market isn’t capable of writing/following its own rules.
the solutor,
What resolution would you use? I tried looking up my capture card, but the capture specs were not listed. I grabbed the number from here…
https://en.wikipedia.org/wiki/NTSC
Maybe the DVD and DV camcorder NTSC format used a different resolution for some reason?
You could do it on Windows. I was doing it in the early 00s at college, using hardware that was a few years old at the time, a PII if I recall. Although given that the capture card was doing the encoding, I don’t think the CPU was critical.
Yeah, as a Windows user I often felt other platforms were more innovative. Microsoft’s specialty was monopolizing things, haha.
In practice many of these “it just works” scenarios require one to buy the right hardware to begin with. I’m not dissing BeOS in any way; it’s just one of the realities of hardware vendors not committing to co-develop and use driver standards. This is one of the cool features to come out of USB, since many devices implement a standardized interface and therefore just work. I wish everything could be this way.
It’s easy to say everything should be efficient. Government programs like Energy Star make sense because they cover very specific tasks like washing dishes, power supplies, computers being idle/asleep/powered down, refrigeration, etc. How could this be achieved for software development, though? What would it even mean for office software, PDF viewers, games, CAD programs, etc. to run afoul of efficiency guidelines? A simple one-size-fits-all approach doesn’t work because typing emails is drastically different from playing a game or a movie or running Blender. A single title can exhibit drastically different energy curves depending on settings like resolution and ray tracing. I struggle to envision government regulators successfully micromanaging all of this.
Aside: Energy Star is on Trump’s chopping block. It’s what we get for having such stupid leadership.
https://thehill.com/policy/energy-environment/5286201-trump-energy-star-climate-change-epa/
I think in IT 640×480@60Hz is common because of the US TV standard.
525 visible lines + overscan + retrace lines is roughly 640.
Given that in an analogue signal there isn’t a precise concept of horizontal resolution, 480 is just 640/4*3.
In Europe we had 625 lines @50Hz, so applying the same rule we got 720 lines. Horizontal was more complicated: a perfect 4/3 would be 720×540, but often 720×480 (with non-square pixels) was used as well, mostly (I think) because it made conversion to and from US-sourced material easier.
Last but not least, we colloquially say NTSC, PAL, SECAM… but those have nothing to do with resolution and refresh rate.
We should talk about “systems” B, G, M, I, and so on, which define the resolution, the refresh rate and so on, and predate the color standards mentioned above.
There are/were a lot of places that used a European system coupled with the US color standard and vice versa (Latin America, the US military in Europe, and so on).
Well, like I said, indirectly.
Enforce by law the maximum (electrical) power a PC can use, then let the SW devs do whatever they like.
It’s like a car race where you don’t define the power of the engine or the maximum cylinder displacement.
You just say: “use whatever car you want, but you have to finish the race on 200 liters of gasoline”.
It works.
In my country we say “necessity sharpens ingenuity”. Not sure how it sounds in English, but I’m sure you get the idea. 🙂
the solutor,
The 525 is the total number of NTSC scan lines including overscan, which you need to subtract to get the visible lines. The Wikipedia link I provided earlier says “The visible raster is made up of 486 scan lines”.
The correct placement of the CRT beam position at any point in time depends on the source signal operating with the proper resolution and refresh rate. If these aren’t in sync then you’ll probably see an effect similar to an old VCR with poor tracking.
I’m not too familiar with this, but I believe that NTSC’s timing and resolution characteristics haven’t changed since they were standardized in December of 1953. New revisions seem to be for things like nailing down the color gamut and more precise specification for voltage levels. If I am wrong and public NTSC versions did change timings/resolution, it would be really helpful if you could provide a source for this information.
I don’t see how a one-size-fits-all quota system would work, given that some applications inherently need more power on account of what they do rather than being inefficient at what they do.
Yes, that’s a good incentive, but your example goes back to resource consumption per unit work! This is more akin to the dishwasher example than a computer running arbitrary software doing undefined work.
Saying a personal computer isn’t allowed to use more than an average of Wmax watts isn’t the same as your car example. Wmax may be extraordinarily inefficient for some tasks while being impossibly low for others. So I don’t think we can easily quantify software efficiency without considering what that software is actually doing.
I agree with you there’s a problem with software efficiency. But I am still struggling to see how a simple law could work unless 1) it under-specifies the solution leaving it to businesses/developers to comply in good faith 2) it specifies specific energy goals for specific tasks.
#1 is a laugh.
#2 turns into government micromanagement over software features.
NTSC, PAL and SECAM are about how the color info is added to the underlying B/W signal (while maintaining compatibility in both directions, which is a big achievement on its own).
The TV system defines the frequencies. I assume you’re in the US, hence you’re using NTSC-M: the M part decides the resolution/refresh, not the NTSC part.
https://en.wikipedia.org/wiki/Broadcast_television_systems
Then obviously NTSC is sufficient colloquially, just like saying “American” instead of “US citizen” is enough, even if “American” is applicable to anyone from Canada to Chile.
You could do it on Windows.
I could, but not reliably. MS even made a TV card with an internal MPEG engine mandatory for XP MCE, albeit there were hacks to use a SW encoder.
Common sense is applicable in engineering and when writing laws and rules. Not everything needs to be black or white, good or bad.
There are parameters; there are corrections that could be applied.
Just leave the work to people with Catholic roots.
Catholics are analogue, Protestants are binary 🙂
But I think what I suggested will never happen, so no need to squeeze our neurons further. 🙂
the solutor,
It resulted in a less efficient standard, but compatibility was still an achievement. However, we cannot disregard things like resolution and refresh rates; these are integral to compatibility, which is why “NTSC” hasn’t changed them since the 40s.
I appreciate your point that it should be called NTSC-M. However, since the resolution hasn’t changed and cannot change without breaking everything, it became unnecessary to specify and dropped out of common parlance. It sort of reminds me of the “GNU/Linux” debate, haha.
Most laws would be unnecessary if we could assume common sense instead. The reason we have so many laws is because people/lawyers deliberately seek to push boundaries even though it may be absurd. A “common sense” approach might work for those practicing in good faith, but not for those with a motive to bypass the law.
Another reason to have precise laws is to curtail judges legislating from the bench; I’ve learned that even they don’t always act in good faith.
Alright, fair enough.
> 720×480 is more of a PALish resolution
720×480 is actually an NTSC standard and what was used in North American DVDs. The PAL equivalent is 720×576.
These two resolutions, 720×480 (NTSC) and 720×576 (PAL), are known as D1 and come from the world of analog CCTV and video surveillance. That said, in the world of DVD, 720×480 was more commonly referred to as 480p, where it was typically paired with a frame rate of 29.97 fps; PAL usually had a frame rate of 25 fps. Actually, 480p meant 480 lines of “progressive scan” video; there was also 480i for interlaced video. This is where the p comes from in 720p (HD video) and 1080p (Full HD video).
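Incidentally, the raw active-pixel rates of these two D1 formats come out almost identical, which is not a coincidence: both the 525-line and 625-line systems sample active video with the same 13.5 MHz luma clock under ITU-R BT.601. A small sketch to illustrate, using the frame sizes and rates given above:

```python
# Raw active-pixel rates for the two D1 frame sizes mentioned above.
ntsc = 720 * 480 * (30000 / 1001)   # 720×480 at ~29.97 fps
pal  = 720 * 576 * 25               # 720×576 at 25 fps

print(f"NTSC D1: {ntsc / 1e6:.2f} Mpixels/s")   # ~10.36 Mpixels/s
print(f"PAL  D1: {pal  / 1e6:.2f} Mpixels/s")   # ~10.37 Mpixels/s
```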
@lefantome
DVDs have been a practical thing since 99/2000 (albeit the standard was finalized earlier); here the discussion was about TV cards, BeOS, the i486 and the like.
BeOS and the Olivetti Envision were both from 1995, and the i486 was already singing its swan song that year.
Hence, what I wrote is correct.
This is from Wikipedia
I was somewhat disappointed that I didn’t see their projects in GSoC 2025. That probably slowed down the OS development side somewhat.
I love Haiku and how efficient it is. The only showstopper at the moment, keeping me from running it as a daily driver, is the lack (to my knowledge) of user accounts or basic privilege separation.
Having my browser running with full system privileges is a big no-no. Some kind of sandboxing would be good enough; better still with the ability to set arbitrary ownership and access rights.
Haiku’s current goal (and it seems it’s mostly already fulfilled) is to obtain BeOS R5 parity. Once Haiku R1 has been released, showstopping features like proper user support and privilege separation will be prioritised (I imagine).
It’s not that Haiku doesn’t have the ability for user separation; that’s baked in from its UNIX-like filesystem. It’s just that the rest of the OS isn’t really aware of it. I’m sure it won’t be too arduous to add that support in the future.
Yes, you are right. Some commands show user ownership data following the UNIX model, with everything owned by root. =)
Once those are there, there’s a high chance I will daily-drive it. I just can’t risk using it for work as it is, and I can’t justify it otherwise.
It is refreshingly clutter-free.
Back in the day, there was PhOS, a BeOS R5 “distro” which added user support to R5, even though it was easy to get around: one could press Ctrl+Alt+Del and restart the desktop team (or whatever it was called).
Oh, sure, I can see it now: hundreds of hackers around the world, all experts in Haiku’s weaknesses, targeting your non-sandboxed browser.
Come on!
Security by rarity is still a thing, and it still works. In 2025 even Win95 can possibly be considered a very safe OS: nobody cares about it, kids don’t have the skills, and the old farts who do usually have better things to do than look for Win95 installations… 😀
We still do not have sound over HDMI. Or that might just be an “as far as I know” problem, and a problem with my particular external hardware.
That’s one of my only real issues with it, and OpenBSD shares the same audio issue. For both of my machines that run those OSes it’s not really a big deal as I have analog audio out on those and multiple analog inputs on my speakers, but for convenience’s sake it would be nice to have audio over HDMI.
My other big issue with using Haiku daily is the lack of full GPU drivers. The vesafb and EFI framebuffer drivers are fine for getting a display at all, but all Firefox-based browsers have visual artifacts and poor video playback that would be fixed with proper video drivers. I’m not faulting the Haiku team for this; video/GPU stuff is some of the most difficult to do from scratch (ask the Asahi devs about that!), but for me at least it’s an insurmountable obstacle to daily use.
I feel like once proper GPU drivers are written or ported, the HDMI audio issue can be more readily addressed.
The artifacts are Mesa sync problems, not Haiku problems. And YES, the video problems in general come down to the lack of proper acceleration. I love the Haiku people, and I would probably donate an organ to a few of them. So yes, I am a bit biased.
That makes sense, thanks! And yeah I’m biased too; I used the BeOS as my main OS for several years in the early 2000s until it wasn’t possible anymore. Back then I wished I could win the lottery for enough money to buy it from Palm and pay the old devs to come back and continue making it. I firmly believe in another universe that indeed happened and I’m happily chugging along with BeOS 25.0 on my Ryzen workstation. 😉
If my fever dream ever happened, DEC would have bought Apple, and Apple would have bought Be. We’d all be running Apple BeTops and BeBoxes with Alpha processors.
GPU drivers also mean GPU reclocking, the lack of which is a major hit on Haiku battery life on some systems. I’ve got an older laptop I’d single boot Haiku on in a heartbeat if the iGPU, webcam, mic, and power saving features worked.
Came here in 2000 for BeOS. Stayed for OpenBeOS. Currently nostalgic for BeShare. Eager to see whatever the Haiku devs come up with over the next 24 years. Keep going, team!