I hate how these months keep going down like vodka-martinis on an Italian beach, but at least we get another progress report for Haiku every time. Aside from the usual small changes and bug fixes, the most important of which is probably allowing the EXT4 driver to read and write again, there’s this little paragraph at the end which definitely stands out.
This month was a bit lighter than usual, it seems most of the developers (myself included) were busy with other things… However, HaikuPorts remained quite active: most months, at this point, there are more commits to HaikuPorts than Haiku, and sometimes by a significant margin, too (for May, it was 52 in Haiku vs. 258 in HaikuPorts!). I think overall this is a sign of Haiku’s growing maturity: the system seems stable enough that the porters can do their work without uncovering too many bugs in Haiku that interrupt or halt their progress.
↫ Haiku activity report for May
I definitely hope that this positive read is correct, as it would be a shame for the project to run into declining activity and contributions just as it seems to be serving as a solid base for quite a wide variety of applications. I’ve definitely been seeing more and more people giving Haiku a try lately and coming away impressed, but of course, that’s just anecdotal and I have no idea if that means Haiku has reached a certain point of maturity.
One thing that definitely does indicate Haiku is a lot more stable and generally usable than most people think is the massive amount of solid ports the platform can handle, from Firefox to LibreOffice and everything in between. I think a lot of people would be surprised by just how far they can get with their day-to-day computing needs on Haiku, assuming their hardware can boot Haiku and is properly supported, of course.
My opinion on Haiku has not changed, but I'm a random idiot you shouldn't be listening to. The cold and harsh truth is that old people like me, who want their BeOS boomerware but in 2025, are a small minority who are impossible to please. The Haiku team's focus on getting modern software ported to Haiku, instead of trying to convince people to code brand new native Haiku applications, is objectively the correct choice to ensure the viability of the platform going forward.
If Haiku wishes to fully outgrow its hobby status, looking towards the future is a far better approach than clinging to the past, and unsurprisingly, Haiku’s developers are more than smart enough to realise that.
Sure, BeOS makes sense if its native "superpowers" are used.
I remember that back in BeOS's heyday I was able to timeshift analogue TV (a very intensive task) using a cheap TV card on my i486 DX100 Olivetti Envision.
In Windows and Linux I was finally able to timeshift live TV in 2012, when DVB-T digital TV finally arrived. I think that even in 2025, if analogue TV were still a thing, using a TV card without internal HW acceleration would be hard, no matter the CPU used.
Just an example, but BeOS was capable of things unknown elsewhere.
Now, in 2025, most current apps are just ports from other platforms, which usually makes them worse than on the platform they come from, when they could be not just better, but hugely better.
I wish the EU would put a limit on the electrical power a CPU or a whole PC can consume; then maybe SW companies and developers would go back to squeezing every bit of computational power they can, just like they did in the good old days (with positive effects on the planet's health).
the solutor,
Without compression, capturing a raw NTSC stream at 720×480 with both interlaced fields at 29.97fps and 16bpp is about 21MB/s, easily within PCI specs, but we also have to remember that the data has to traverse PCI multiple times on its way from a capture device to a hard disk. And if you're playing back at the same time, it adds up. Obviously the Achilles heel for spinning disks isn't bandwidth, but seek time. IIRC many cheap cards supported MJPEG, which didn't perform interframe compression, but still managed to lighten the load.
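To put numbers on that, here's a quick back-of-the-envelope sketch (just restating the figures above: 720×480, 16bpp, the full ~29.97fps frame rate; the ~133MB/s ceiling is plain 32-bit/33MHz PCI):

```python
# Raw NTSC capture bandwidth, using the figures from the comment above.
width, height = 720, 480       # capture resolution
bytes_per_pixel = 2            # 16bpp, e.g. packed YUV 4:2:2
fps = 30000 / 1001             # NTSC frame rate, ~29.97

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{bytes_per_second / 1e6:.1f} MB/s")   # ~20.7 MB/s

# Plain 32-bit/33 MHz PCI tops out around 133 MB/s in theory, but the
# stream may cross the bus more than once (card -> RAM, RAM -> disk),
# and simultaneous playback adds its own traffic on top.
```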
Yeah, I've always had a preference for native software that was designed to work nicely with host OS facilities. I also think developers did more optimization to squeeze every ounce of performance out of specific hardware combinations… they had to out of necessity. Today software forgoes most of this refinement and optimization because Moore's law has provided so much more headroom, despite more demanding resolutions. Of course things like raytracing will bog things down, but traditional rasterization is often not worth optimizing when the pipeline can deliver hundreds of frames per second regardless.
I agree with you that it would be nice, but I honestly don't see how this would work, both in terms of how the law would attempt to quantify efficiency and how it could be enforced. Heck, forget applications and consider the inefficiency of modern websites. Clearly devs can do better, but by and large software companies aren't interested. The industry takes it for granted that hardware will get upgraded instead, and even as someone who's good at optimizing I learned long ago that it is futile to fight this. I can't see it happening, but I am curious how you would envision optimization efforts working.
720×480 is more of a PAL-ish resolution; but either way the matter doesn't change.
I never had the pleasure of using deferred live TV on any machine, no matter the CPU, while on BeOS it worked without breaking a sweat on a 486 machine (albeit an Olivetti-quality machine that already had the PCI bus).
Not only did it work in BeOS, it worked out of the box: I just booted the system for the first time and the TV card was listed in the TV cards section of the control panel.
Well… "not interested" is a euphemism. Clearly such a move would be seen (especially in the US) as a wrench in the gears of capitalism, but states and governments are supposed to do exactly that: put a limit in place when the market isn't capable of writing/following its own rules.
the solutor,
What resolution would you use? I tried looking up my capture card, but the capture specs were not listed. I grabbed the number from here…
https://en.wikipedia.org/wiki/NTSC
Maybe the DVD and DV camcorder NTSC format used a different resolution for some reason?
You could do it on Windows. I was doing it in the early 00s at college using hardware that was a few years old at the time, a PII if I recall. Although given that the capture card was doing the encoding, I don't think the CPU was critical.
Yeah, as a Windows user I often felt other platforms were more innovative. Microsoft's specialty was monopolizing things, haha.
In practice many of these "it just works" scenarios require one to buy the right hardware to begin with. I'm not dissing BeOS in any way; it's just one of the realities of hardware vendors not committing to co-develop and use driver standards. This is one of the cool features to come out of USB, since many devices implement a standardized interface and therefore just work. I wish everything could be this way.
It's easy to say everything should be efficient. Government programs like Energy Star make sense because they cover very specific tasks: washing dishes, power supplies, computers in idle/sleep/power-down states, refrigeration, etc. How could this be achieved for software development, though? What would it even mean for office software, PDF viewers, games, CAD programs, etc. to run afoul of efficiency guidelines? A simple one-size-fits-all doesn't work because typing emails is drastically different from playing a game or movie or running Blender. A single title can exhibit drastically different energy curves depending on settings like resolution and ray tracing. I struggle to envision government regulators successfully micromanaging all of this.
Aside: Energy Star is on Trump's chopping block. It's what we get for having such stupid leadership.
https://thehill.com/policy/energy-environment/5286201-trump-energy-star-climate-change-epa/
I think in IT 640×480@60Hz is common because of the US TV standard.
525 visible lines + overscan + retrace lines is roughly 640.
Given that in an analogue signal there isn't a precise concept of horizontal resolution, 480 is just 640/4×3.
In Europe we had 625 lines @50Hz, so applying the same rule we had 720 lines. Horizontal was more complicated: a perfect 4:3 would be 720×540, but 720×480 (with non-square pixels) was often used as well, mostly (I think) because it made conversion to and from US-sourced material easier.
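(Just to make the 4:3 arithmetic I'm invoking explicit; this only shows the aspect-ratio math, and which figure is horizontal and which is vertical is exactly what gets disputed below:)

```python
# The 4:3 arithmetic invoked above, shown for both line counts.
# Illustrative only: which figure is horizontal and which is vertical
# is debated in the replies that follow.
for n in (640, 720):
    print(f"{n} x {n * 3 // 4}")   # 640 x 480, 720 x 540
```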
Last but not least, we colloquially say NTSC, PAL, SECAM… but those have nothing to do with resolution and refresh.
We should talk about "systems" B, G, M, I, and so on, which define the resolution, the refresh, etc., and predate the color standards mentioned above.
There are/were a lot of places that used a European system coupled with the US color standard and vice versa (Latin America, the US military in the EU, and so on).
Well, like I said, indirectly.
Enforce by law the maximum (electrical) power a PC can use, then let the SW devs do whatever they like.
It's like a car race where you don't define the power of the engine or the maximum cylinder displacement.
You just say: "use the car you want, but you have to finish the race with 200 liters of gasoline".
It works.
In my country we say "necessity sharpens ingenuity". Not sure how it sounds in English, but I'm sure you get the idea. 🙂
the solutor,
The 525 is the total NTSC scan lines including overscan, which you need to subtract to get visible lines. The Wikipedia link I provided earlier says "The visible raster is made up of 486 scan lines".
The correct placement of the CRT beam position at any point in time depends on the source signal operating with the proper resolution and refresh rate. If these aren’t in sync then you’ll probably see an effect similar to an old VCR with poor tracking.
I’m not too familiar with this, but I believe that NTSC’s timing and resolution characteristics haven’t changed since they were standardized in December of 1953. New revisions seem to be for things like nailing down the color gamut and more precise specification for voltage levels. If I am wrong and public NTSC versions did change timings/resolution, it would be really helpful if you could provide a source for this information.
I don’t see how a one size fits all quota system would work given that some applications inherently need more power on account of what they do rather than being inefficient at what they do.
Yes, that’s a good incentive, but your example goes back to resource consumption per unit work! This is more akin to the dishwasher example than a computer running arbitrary software doing undefined work.
Saying a personal computer isn’t allowed to use more than an average of Wmax watts isn’t the same as your car example. Wmax may be extraordinarily inefficient for some tasks while being impossibly low for others. So I don’t think we can easily quantify software efficiency without considering what that software is actually doing.
I agree with you there’s a problem with software efficiency. But I am still struggling to see how a simple law could work unless 1) it under-specifies the solution leaving it to businesses/developers to comply in good faith 2) it specifies specific energy goals for specific tasks.
#1 is a laugh.
#2 turns into government micromanagement over software features.
NTSC, PAL, and SECAM are about how the color info is added to the underlying B/W signal (while maintaining compatibility in both directions, which is a big achievement on its own).
The TV system defines the frequency. I assume you're in the US, hence you're using NTSC-M; the M part decides the resolution/refresh, not the NTSC part.
https://en.wikipedia.org/wiki/Broadcast_television_systems
Then obviously NTSC is sufficient colloquially, just like saying “American” instead of “US citizen” is enough, even if “American” is applicable to anyone from Canada to Chile.
You could do it on Windows.
I could, but not reliably. MS even made a TV card with an internal MPEG engine mandatory for XP MCE, albeit there were hacks to use a SW encoder.
Common sense is applicable in engineering and when writing laws and rules. Not everything needs to be black or white, good or bad.
There are parameters; there are corrections that could be applied.
Just leave the work to people with Catholic roots.
Catholics are analogue, Protestants are binary 🙂
But I think what I suggested will never happen, so no need to squeeze our neurons further. 🙂
the solutor,
It resulted in a less efficient standard, but compatibility was still an achievement. However, we cannot disregard things like resolution and refresh rates; these are integral to compatibility, which is why "NTSC" hasn't changed them since the 40s.
I appreciate your point that it should be called NTSC-M. However, since resolution hasn't changed and cannot change without breaking everything, it became unnecessary to specify and got dropped from common parlance. It sort of reminds me of the "GNU/Linux" debate, haha.
Most laws would be unnecessary if we could assume common sense instead. The reason we have so many laws is because people/lawyers deliberately seek to push boundaries even though it may be absurd. A “common sense” approach might work for those practicing in good faith, but not for those with a motive to bypass the law.
Another reason to have precise laws is to curtail judges legislating from the bench, I’ve learned that even they don’t always practice in good faith either.
Alright, fair enough.
NTSC-M has had changes since the 40s, albeit none of them affected resolution or refresh rate.
NTSC-J (the Japanese cousin) is more or less NTSC-M without those changes.
NTSC and M are related practically, but aren't an "integral part of each other".
NTSC-N/NTSC 50Hz and NTSC 4.43 are actual things.
NTSC-N was meant for some Latin American countries (but was never adopted, as they chose PAL-N instead).
BUT NTSC 50Hz became a real thing with the arrival of home computers/consoles/VCRs, and IIRC it was used in small broadcast stations meant for the US military in the EU or other countries.
The antimatter counterpart, PAL-M, was way more common and used as a national standard (in Brazil, for example).
Also, SECAM-M existed in Laos, Vietnam, and Cambodia (and that was a perfect picture of the mix of French and US interests in the area at the time).
In short, the NTSC=60Hz and PAL/SECAM=50Hz equations are both wrong, just like the assumption GNU/Linux = Linux is wrong.
Aside from the obvious example of Android, there are plenty of distributions that used the Linux kernel + the BSD userland or vice versa; other "bastardized" attempts were made cross-breeding OpenSolaris with other cousins.
In short, no, it isn't a matter of nitpicking. It's a matter of being precise if and when the discussion requires precision.
P.S. About common sense, I meant that common sense can be used when writing laws,
NOT instead of writing laws (which is still an option, at least in theory).
the solutor,
I couldn’t find any references to NTSC-N and your own link only lists PAL-N. Even if NTSC-N was a real proposition, it didn’t happen and nothing’s changed since the 40s.
I'm not saying you are wrong to explicitly list the variants, but in practice NTSC-M became NTSC. Now, if a new mode had come out, then I'd agree that it would once again be necessary to distinguish it, but that didn't happen, so the M just got dropped. It's clear you don't like this very much, but languages have a funny way of evolving away from strictly pedantic rules, and that happened here. Of course it's becoming history, but if you bought an "NTSC" DVD player, Nintendo, etc., it is understood to be the American standard, and virtually 100% of consumers would expect it to be compatible despite the omission of the "M".
Even articles that explicitly acknowledge your point about NTSC technically being a color standard nevertheless continue to use the term as the public does, rather than the technically correct NTSC-M.
https://itigic.com/ntsc-and-pal-definition-differences-and-what-is-better-for-gaming/
I never said it was equivalent, I said it reminds me of the debate. It’s got the same elements with strong opinions and meta discussions about a silly topic that nobody else gives a fuss over 🙂
Speaking of pedantry though, you keep using 60Hz when it’s actually 59.94 Hz.
I understand what you meant even though you didn’t say it, but that’s kind of the point with “NTSC” as well. We take these kinds of shortcuts all the time and not even you are immune to that. I don’t say this to criticize, but in the hopes that you may recognize how natural a thing it is to do.
https://en.wikipedia.org/wiki/CCIR_System_M
Also
https://en.wikipedia.org/wiki/NTSC
Just for the sake of information.
And I said from the start that colloquially NTSC is enough in most cases.
the solutor,
? Neither the link nor the quote contains a mention of "NTSC-N".
I understand about the “proposals”, but I can’t find any evidence it ever existed as a product. Most other countries deemed PAL to be better and in countries where NTSC reigned it was more important to be compatible with the existing standard so it never changed.
Of course I agree 🙂
For all practical purposes NTSC uses the same resolution and timings globally despite the fact that in theory it could be changed.
Well, I quoted only part of the relevant paragraph, given that I provided the link for readers who want to dig deeper.
The above seem like practical purposes to me.
And the above are practical purposes as well.
Like I said earlier, the arrival of home computers/consoles/VCRs complicated the matter a lot.
P.S.
While it has nothing to do with NTSC, I like to remember the fourth, forgotten colour standard: ISA.
It was developed in Italy by the Indesit company (at the time an "institution": they produced the first TV set powered by a switching power supply).
Colour TV in Italy was troubled, with some parties pushing for the French SECAM and others for the German PAL; the ISA system was then developed.
Obviously, if adopted, it would have ended like the (vastly superior) Video 2000 when the war between VHS and Betamax was raging.
Nobody cares if a system is technically superior if you are late to the party.
The story of ISA ended in the cradle when PAL was adopted, and today even the Italian Wikipedia has almost no references to it.
But ISA managed to further delay the choice of a standard, so Italy got colour TV only in 1977, likely the last Western country to abandon B/W.
We had our revenge with mobile telephony and smartphones, but that's another story 🙂
the solutor,
Your example is a hack to get a monitor to display an incompatible source. Yes, hacks can be practical and I don’t really have a problem with this type of owner modding and experimentation, but it doesn’t really make it an official standard anywhere in the world.
I think we should stick to agreeing that the vast majority of people who say NTSC mean the mainstream standard. The onus would be on those making modifications to explicitly point them out.
No, here I stop agreeing.
A cable TV service that served thousands or even millions of people in Hong Kong can't be considered a hack; call it an unofficial standard, call it a minimal share, but not a hack.
To put it differently: if you say "this costs 10$", everyone assumes American dollars, but it's incomplete info.
If you say "this costs 10 American dollars", the information is more complete but can still be misleading, given that Canadian dollars are American dollars too.
If you say "this costs 10 US dollars", you are providing the whole picture.
Even if dollars were used only in the US and in Vatican City with its 800 people served, saying "dollar" would be more than enough colloquially, but still incomplete.
NTSC/NTSC-M is the same as the above example.
Non-M NTSC is/was a thing, no matter if J has minimal differences, no matter whether N was adopted after being made official, no matter if N was used only in Hong Kong, and so on.
And if you want to broaden the meaning of "hack", the 29.97fps thing is a hack; PAL itself is a hack to avoid the "Never Twice the Same Color" thing (and consequently the Tint button, never seen in Europe).
A matter of semantics.
the solutor,
Please provide a link. I don't see evidence for it; it looks like Hong Kong went with PAL. Regardless, even in that hypothetical, the onus would still be on a tiny cable company to identify that they are not using a compatible spec. I wouldn't be a happy customer if my cable company did that.
As a Canadian, I must disagree. Even we don't call Canadian dollars American dollars. You are not technically wrong that "American" should refer to the whole continent, but even when Canadians speak, "American" refers strictly to the neighbors from the south. I guarantee you that 0 people out of 100 would conflate it this way.
Not in resolution/timing. It's a difference in color/voltage, which is what I said earlier.
I agree that 29.97 FPS is a hack, even though they did it on purpose 🙂 Still, that's the standard, and calling it 30FPS is technically imprecise.
@Alfman
I replied to the wrong post; please scroll down to read.
> 720×480 is more of a PAL-ish resolution
720×480 is actually an NTSC standard and what was used in North American DVDs. The PAL equivalent is 720×576.
These two resolutions, 720×480 (NTSC) and 720×576 (PAL), are known as D1 and come from the world of analog CCTV and video surveillance. That said, in the world of DVD, 720×480 was more commonly referred to as 480p where it was typically paired with a frame rate of 29.97 fps. PAL usually had a frame rate of 25 fps. Actually 480p was 480 vertical lines of “progressive scan” video. There was also 480i which was for interlaced video. This is where the p comes from in 720p (HD video) and 1080p (Full HD video).
@lefantome
DVDs have been a practical thing since '99/2000 (albeit the standard was finalized earlier); here the discussion was about TV cards, BeOS, the i486 and the like.
BeOS and the Olivetti Envision were both from 1995; the i486 was already at its swan song that year.
Hence what I wrote is correct.
This is from Wikipedia
I even quoted that. Just do CTRL+F “Hong” on this very page.
I never had any doubt about that; my example was indeed purposely a bit stretched, just to show how a colloquial sentence can be both commonly used and technically wrong (or at least incomplete).
As an Italian I can assure you that the land discovered by an Italian and named after another Italian is called America, and it includes Canada, Chile, Nicaragua, Mexico and so on, not just the US. 😀
Now, I'm well aware that "American" meaning "US citizen" is so deeply rooted that Anglophones don't even have an alternative term, but we have one: "statunitense".
And while we use the word "Americano" exactly like your "American", when we want to be precise we say "statunitense". In Spanish there is a similar word.
So don’t make the mistake of taking for granted that your habits are universal.
Canadian dollar is unambiguous, Australian dollar is unambiguous; American dollar is not.
No matter your personal or even national habits, you can't be sure how it sounds to a person from Kazakhstan, Namibia, or Poland.
The M system was and still is 30fps, round. The 29.97 is a "hack" that doesn't break any backward compatibility and is part of the NTSC standard when it shouldn't be.
There are similar "hacks" in other parts of the world; say, movies watched on TV in 60Hz countries are a complicated matter given the difference from 24 to 30 fps.
Now digital has made everything easy, but in the analogue era they were not smooth, given the rudimentary temporal conversion.
In Europe, though, they are just transmitted @25fps, with the result that a movie lasts 4% less on TV than in cinemas.
Most people don't notice, but the minority with absolute pitch clearly identify the slightly higher pitch of the soundtrack.
4% is not a lot, but it's way higher than the 30 vs. 29.97 difference, which is 1/1001.
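A quick sketch of both numbers, assuming a straight 24→25fps speedup with no pitch correction (as in the analogue era):

```python
import math

# PAL speedup: 24 fps film played back at 25 fps, no pitch correction.
speedup = 25 / 24
print(f"runtime: {(1 - 24 / 25) * 100:.1f}% shorter")        # 4.0%
print(f"pitch:   +{12 * math.log2(speedup):.2f} semitones")  # ~+0.71

# For comparison, the NTSC adjustment: 30 -> 30000/1001 fps.
ntsc_factor = 30 / (30000 / 1001)                            # = 1.001
print(f"NTSC slowdown: {(ntsc_factor - 1) * 100:.1f}%")      # 0.1%
```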
the solutor,
Your quote literally contains "citation needed". Given that it contradicts other sources saying Hong Kong went with PAL instead, a source is warranted.
Wikipedia is publicly editable. One user made the following change in 2024.
https://en.wikipedia.org/w/index.php?title=NTSC&diff=prev&oldid=1212366198
I think it was a well-intentioned edit (by a 20-year-old in Indonesia, if you read the user profile), but this four-word edit may not have been intended to convey that NTSC 4.43 was actually used by "a cable TV service that served thousands or even millions of people in Hong Kong" the way you've taken it.
I’m not trying to be contradictory here, but consider that the same wikipedia page says this.
It was not the purpose of NTSC 4.43 to provide a broadcast standard, not even in Hong Kong, where they used PAL. I find it more likely that NTSC 4.43 got used for its stated purpose of supporting playback of NTSC tapes on PAL systems. It required a multistandard receiver that could handle NTSC-M's 525 lines/29.97 frames per second with PAL color. It seems plausible that consumers would have enjoyed the benefits of both NTSC movies and domestic PAL broadcasts.
It wasn't actually arbitrary; keep in mind it's not simply a matter of speeding up the signal, because there are radio frequencies behind those signals too. The math does provide different solutions for minimizing the crosstalk between carrier frequencies. Using 30FPS was possible with a different scanline count, but the big caveat (the one they chose to prioritize) was staying compatible with existing TVs. 29.97FPS was chosen because the frequency math worked out AND it was close enough to 30FPS that existing B&W TVs would accept it.
“Why is TV 29.97 frames per second?”
https://www.youtube.com/watch?v=3GJUM6pCpew
Of course today nobody uses B&W TVs and we resent their choice; now that analog TVs are obsolete, the backwards compatibility matters much less.
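For the curious, the arithmetic is compact enough to show. This sketch uses the standard published NTSC figures (the 4.5MHz sound carrier and the 286 and 455/2 ratios are textbook values, not something from this thread):

```python
# How 29.97 fps falls out of colour NTSC: the line rate was re-derived
# from the 4.5 MHz sound inter-carrier so that the colour subcarrier,
# sound carrier and line harmonics interleave with minimal crosstalk.
sound_carrier = 4.5e6                  # Hz, unchanged from B/W NTSC
line_rate = sound_carrier / 286        # ~15734.27 Hz (was 15750 Hz)
frame_rate = line_rate / 525           # 525 lines per frame
subcarrier = line_rate * 455 / 2       # colour subcarrier

print(f"frame rate: {frame_rate:.5f} fps")        # 29.97003 = 30000/1001
print(f"subcarrier: {subcarrier / 1e6:.6f} MHz")  # 3.579545
```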
I understand: you are the kind of person that loves to have the last word, no matter how friendly or unimportant a discussion is.
But if you ask for a link, I provide a link; then once you have it (provided before you even asked), you start questioning the source, and so on.
In short, neither of us has had the pleasure of living in Hong Kong, so all we can do is take the sources as good until explicitly contradicted by a different, more relevant one.
I started this discussion pointing to a *fact*, and the fact is that people use the color coding method to refer to the system. I did it to inform readers, not to start a pissing contest.
So, sorry, but I'll cut it here before we start discussing whether the person who edited Wikipedia was vaccinated, and whether his haircut was done by a respectable barber.
My English is poor, I know, but I think I expressed my point in a more than understandable way. Feel free to disagree; I never post my opinion or my expertise to convince a single person.
Have a nice day.
the solutor,
I don’t know why your problem is with me. You should be scrutinizing the lack of citation as much as I am. I really don’t mean to be rude about it, but I do not think it’s wrong of me to call it out, as annoying as that might be.
It's a very weak source with no citations that contradicts numerous sources saying Hong Kong selected PAL.
Why do you have an issue with my theory? NTSC 4.43 could have been used for playback of NTSC video tapes on PAL systems, because that's why NTSC 4.43 was created. Your link even explicitly says this. It also explicitly says it's not a broadcast standard. Absent more evidence, I judge this to be the most plausible explanation. If you want to disagree, fine, but put up stronger evidence. If you cannot do that, that's fine too, but then don't judge me for doubting your claim.
the solutor,
If possible, I'd rather end the discussion on a pleasant note. I understand my own stubbornness can get in the way of this, and that's on me; I need to work on it. I am happy to agree to disagree though; I think it's ok for discussions to conclude this way. Also, your English is just fine!
I was somewhat disappointed that I didn't see their projects in GSoC 2025. That probably slowed down the OS development side somewhat.
I love Haiku and how efficient it is. The only showstopper, at the moment, for me to run it as a daily driver is the lack (to my knowledge) of user accounts or basic privilege separation.
Having my browser running with full system privileges is a big no-no. Some kind of sandboxing would be good enough; better still with the ability to set arbitrary ownership and access rights.
Haiku's current goal (and it seems it is mostly already fulfilled) is to obtain BeOS R5 parity. Once Haiku R1 has been released, more showstopping features like proper user support and privilege separation will be prioritised (I imagine).
It's not that Haiku doesn't have the ability for user separation; that's baked in via its UNIX-like filesystem. It's just that the rest of the OS isn't really aware of it. I'm sure it won't be too arduous to add that support in the future.
Yes, you are right. Some commands show user ownership data per the UNIX model, and it's all owned by root. =)
Once they are there, there’s a high chance I will daily drive it. I just can’t risk using it for work as it is and I can’t justify it otherwise.
It is refreshingly clutter-free.
Back in the day, there was PhOS, a BeOS R5 "distro" which added user support to R5. Even so, it was easy to get around, as one could press Alt+Ctrl+Del and restart the desktop team (or whatever it was called).
Oh, I see: hundreds of hackers around the world, all experts in Haiku's weaknesses, targeting your non-sandboxed browser.
Come on!
Security by rarity is still a thing, and it still works. In 2025 possibly even Win95 can be considered a very safe OS: nobody cares about it, kids don't have the skills, and the old farts who do usually have better things to do than look for Win95 installations… 😀
We still do not have sound over HDMI. Or that might just be a gap in my knowledge and a problem with my external hardware.
That’s one of my only real issues with it, and OpenBSD shares the same audio issue. For both of my machines that run those OSes it’s not really a big deal as I have analog audio out on those and multiple analog inputs on my speakers, but for convenience’s sake it would be nice to have audio over HDMI.
My other big issue with using Haiku daily is the lack of full GPU drivers. The vesafb and EFI framebuffer drivers are fine for getting a display at all, but all Firefox-based browsers have visual artifacts and poor video playback that would be fixed with proper video drivers. I'm not faulting the Haiku team for this; video/GPU stuff is some of the most difficult to do from scratch (ask the Asahi devs about that!), but for me at least it's an insurmountable obstacle to daily use.
I feel like once proper GPU drivers are written or ported, the HDMI audio issue can be more readily addressed.
The artifacts are Mesa sync problems, not Haiku problems. And YES, the video problems in general come down to the lack of proper acceleration. I love the Haiku people, and I would probably donate an organ to a few of them. So yes, I am a bit biased.
That makes sense, thanks! And yeah I’m biased too; I used the BeOS as my main OS for several years in the early 2000s until it wasn’t possible anymore. Back then I wished I could win the lottery for enough money to buy it from Palm and pay the old devs to come back and continue making it. I firmly believe in another universe that indeed happened and I’m happily chugging along with BeOS 25.0 on my Ryzen workstation. 😉
If my fever dream ever happened, DEC would have bought Apple, and Apple would have bought Be. We’d all be running Apple BeTops and BeBoxes with Alpha processors.
GPU drivers also mean GPU reclocking, the lack of which is a major hit on Haiku battery life on some systems. I’ve got an older laptop I’d single boot Haiku on in a heartbeat if the iGPU, webcam, mic, and power saving features worked.
Came here in 2000 for BeOS. Stayed for OpenBeOS. Currently nostalgic for BeShare. Eager to see whatever the Haiku devs come up with over the next 24 years. Keep going, team!
This is a bit of a hard comment for me to write as I am normally a major Haiku fan.
I have been wanting to run Haiku in some daily capacity for many years now. I have multiple computers, at different locations, so I can "daily drive" Haiku on one of these machines even if it does not do absolutely everything I need. One of them is very old, a 2009 MacBook Pro.
Haiku has always blown me away with how fast and light it is. That is part of the attraction. So, I thought, perhaps it is time to put it on this old MacBook. With a USB WiFi dongle plugged in, I was able to boot straight into a Haiku nightly and have Internet access. Let's go!
I had been using this computer for about an hour before the switch, on stuff like OSnews, email, AWS workflows, and remote LLMs, all of which work well on this machine. Before trying Haiku, it was running Chimera Linux with the Niri compositor and the COSMIC panel. I mention all this to say that I had a pretty good mental benchmark for comparison.
Again, I am a Haiku fan and was looking forward to using it. So, what I found really surprised me.
Haiku at boot used over 300 MB of RAM. This is more than some compositors and window managers under Chimera; Niri with the COSMIC panel only uses about 75 MB more and offers many more features. More surprising, Haiku did not feel any faster than Niri. In fact, it maybe felt a little slower.
Worse, "multimedia" performance on Linux was better. YouTube videos played noticeably better on Chimera. I think that audio with PipeWire has lower latency too.
My takeaway is that Linux, or at least Chimera Linux, is a better multimedia platform (even on low-end hardware) than Haiku. And of course Linux has vastly better hardware support, as well as killer features like containers.
Why use Haiku?
I do not like this conclusion at all. It is just one personal experience. But it shook my faith a little.