So, Cartrivision tapes came in two formats: Black Tapes and Red Tapes. Black Tapes you’d buy at the store like any other product, but for Red Tapes (which were relatively recent movies), you’d instead go to the store and place an order from a catalog. The store would have it delivered by mail, then you’d come back in and pick up the tape. You’d take it home, watch it, and then return it to the store. So… video rental (like Blockbuster!), except they kept no stock on hand and only got the tapes on demand by mail? Seems annoying.
BUT OH NO: it’s far more annoying than that. See… Red Tapes aren’t mechanically like Black Tapes.
I’m a sucker for weird formats, and this one is definitely right up there as one of the weirdest.
Interesting article; I was not familiar with red tapes. I have seen the more modern play-once video discs, however.
This is all true, yet arguably the non-ownership model won in the long run. Obviously they had to improve the convenience and flexibility significantly before video streaming services had broad appeal. But now if you want to rewatch something you watched on streaming, it may not be available, because many programs are only licensed for a limited time. People don’t seem to mind this as much when they pay a generic monthly fee instead of a specific per-movie rental fee, but unlike with DVDs, control over content retention has reverted to the studios. Younger generations today don’t have DVDs and take this for granted.
Unfortunately this is a common problem these days, far more wide-reaching than niche technology from the ’70s.
And, as I understand it, streaming turned out to be a worse business model for the music business as well.
You can’t beat streaming convenience, unfortunately. For whatever it’s worth, content is still available in the usual places as MP4 or MKV files, but most people don’t bother. BTW, you can still own stuff on Blu-ray (there’s even a UHD variant now), but not all content; some of it is streaming-only. What can I say, let’s hope the ability to torrent from the usual places never goes away.
Alfman,
This is also an optimization problem.
We have pantries at home, but we also go to the store. Essentially the stores are our “extended pantries”. More so for local shops where we know the owners and workers personally.
It’s the same way for entertainment. I own the Lord of the Rings UHD set (even though I’ve never once opened it: streaming is easier), and I keep it in case something happens. The latest Top Gun movie, on the other hand? I won’t lose much sleep if my “digital purchase” goes away.
So it is more of a balance between how much you want to “hoard” and how much you are okay with possibly losing.
sukru,
Everything you say makes sense, but I’d argue that technically we don’t really need the commercial services for extremely convenient media distribution. The free P2P services that predated all of them were offering tons of convenience at scales that even YouTube would struggle with for many years. Obviously the elephant in the room is that these P2P networks were violating copyrights, and consequently the people and companies behind them were shut down by the courts; but speaking strictly in terms of technological viability and convenience, we showed that centralized streaming providers aren’t all that important. If not for the legal battles that stopped companies from investing in P2P, I think the streaming landscape would look extremely different today.
Alfman,
Yes, if p2p had solved its fundamental problems, today would be very different.
But we can argue that the companies have since taken “the good parts” out of p2p and applied them to their commercial offerings.
On the front end, there is ease of access. With torrents, you might get a recent movie even on release date from a “screener”. But there are hoops to go through, like finding a “seed box”. And in reality it could be missing VFX, have terrible audio, and carry watermarks.
Or… you can go to Amazon*, pay $20, and get access to the UHD version of the movie day-and-date (again, not always). It is guaranteed to have high-quality video and spatial audio. Similar end results, or maybe even better. Though this time availability is at the whim of the studio rather than a warez group.
On the back end, p2p offers distributed, efficient content delivery. In reality, “free riders”, asymmetric links, and protocol overheads mean it is not as efficient as the theory suggests.
Or… Amazon* can make use of global-scale caching algorithms and ISP co-located “edge” networks to distribute content, making very good use of economies of scale.
We can probably list other benefits that are supposed to be on the p2p side, but were adopted by private companies.
Basically, p2p forced them to clean up their act. And p2p developers could not bring new features to compete with modern services.
Anyway,
Of course there are other problems. Like companies editing content after release, all the way from deleting “Starbucks coffee cups” to removing entire episodes for one reason or another. They also alter the deals, like dropping support for certain hardware players, which “only affects a small portion of their users”. I can go on.
So it is not ideal.
The “solution” could be a “global NAS”, where you contribute storage, bandwidth, and compute power, and in return get credits to offload your media. Unfortunately this, too, was done, with not-so-good results:
https://consensys.net/blog/news/programmable-blockchains-in-context-ethereum-smart-contracts-and-the-world-computer/ (this time the Wall Street “investors” became the free riders).
Alfman,
Did you mean legal problems? If so then I agree, but that’s more of a social problem than a technology problem per se. IMHO, technology-wise, the commercial streaming services would be quite redundant today if the courts hadn’t interfered.
I still think the commercial offerings have more built-in cons because they’re catering to the demands of the studios and advertisers, for better or worse. As long as they’re in control of the technology, I think the restrictions and spamification we’re seeing in media and tech is inevitable because it’s not designed for users first and foremost.
I don’t think there’s a contest at all; even giant corporations would have difficulty competing with genuinely unimpeded P2P. It became asymmetric because people were getting in trouble for sharing, getting cease-and-desist orders, losing internet service, etc. Such prohibitions significantly culled the user base. So I explicitly acknowledge the legal challenges facing P2P, but my point is that technology-wise, centralized streaming services aren’t nearly as irreplaceable as someone might make them out to be at face value.
Haha, I agree. Blockchain started a kind of gold rush, with investors looking to invest in anything remotely related because they were afraid of losing out. At the end of the day, though, I think blockchain will prove to be one of the most overhyped technologies of our time. It certainly has interesting properties, but for most applications a global transaction register just does not scale well, or at best scales at great cost.
Another example might be Freenet, where you contribute storage to provide anonymously hosted web pages within the network. I think this is quite a bit more scalable than global blockchain technology. However, there too they made major performance compromises to protect anonymity. The routing is complex, indirect, and slow. It’s very interesting technology, but when you have a solution that doesn’t fit the natural topology of the network, it’s not going to end up being very optimal.
Alfman,
First, a big thank you to WordPress for logging me out mid-comment and deleting my entire write-up. Thanks for reminding me never to type text into a web form, but to use Notepad instead.
That is true.
But my point was that the Internet is fundamentally asymmetric. It is a limitation rooted in physics, and it cannot be overcome.
I thought about mentioning Freenet, but I have a conscientious objection to the content on there.
That being said, I had a technical objection to them 20 years ago, and that still holds, too.
As you said, it is slow. This is because the network can, in the worst case, route the data through the entire network. Yes, in practice that would almost never happen, since randomized algorithms tend to stay close to their theoretical expectations.
sukru,
I can sympathize with that.
What physical limits? I don’t follow.
I’d like to point out that many commercial streaming services are failing us today, with almost all of them struggling to offer 4K video to everyone who wants it.
https://www.reddit.com/r/linuxquestions/comments/qty4or/will_4k_streaming_ever_come_to_linux/
https://community.stadia.com/t5/Stadia-on-TV/4k-and-HDR-not-supported/m-p/24498
https://www.techradar.com/news/netflix-video-quality-remains-throttled-to-the-annoyance-of-premium-users
https://www.reddit.com/r/HBOMAX/comments/l6zvbl/watching_hbo_max_4k_hdr_on_pc/
YouTube 4K transcoding delayed for days…
https://support.google.com/youtube/thread/67300511/4k-option-not-showing-up
https://www.reddit.com/r/Roku/comments/t0rdw8/disney_plus_not_giving_4k_as_an_option/
…
P2P is not only a great solution for scaling and alleviating network bottlenecks; it would also be able to do so with fewer restrictions on use. Oh well, I have to accept that we live in a world where the RIAA & MPAA killed P2P.
Hmm…
My frustration with WordPress “leaked” here. Sorry about that.
Alfman,
Some are hard limits, some depend on choices, and some are softer.
For example, latency has a hard lower bound set by the speed of light. Hence a global network cannot avoid needing location information without a significant sacrifice (again, Freenet is an example).
Others are based on fundamental choices. Internet topology is essentially a tree* (with some redundant links, which can be simplified back down to single links). Hence uplinks will always be asymmetric. (Giving every switch/router an uplink equal to its total local bandwidth hits a hard bound: Tier-1 connections would require exabit bandwidth, which again is not possible with known technology.)
And there is economics, which is the “softest” part. But unless we have an equitable system (like Ethereum, but fixed), this will also be a blocker.
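To put rough numbers on those two bounds, here is a back-of-envelope sketch in Python (the endpoint count and per-endpoint speed are illustrative assumptions, not measured figures):

# Hard latency bound: light in fiber travels at roughly 2/3 c.
half_circumference_km = 20_000
fiber_speed_km_s = 200_000
print(half_circumference_km / fiber_speed_km_s)  # ~0.1 s one-way, best case

# Uplink bound: a fully symmetric tree needs core links carrying
# the sum of all access bandwidth beneath them.
endpoints = 1_000_000_000      # assumed ~1 billion broadband endpoints
access_bps = 1_000_000_000     # assumed 1 Gbit/s per endpoint
print(endpoints * access_bps)  # 1e18 bit/s, i.e. exabit-scale at the top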
Obviously, though, in the context of video streaming it’s bandwidth and not latency that we need to worry about. My round-trip pings to servers a couple thousand miles away come in at under 100ms. It is not like a gaming server, where every additional 10-20ms adds noticeable lag. You could even stream across the globe without any issue as long as you’ve got the bandwidth.
Long-distance routes are not optimal, but therein lies another advantage for P2P: it has a very good solution for this, because local P2P nodes will mirror the content and eliminate the hops. This solution works very well in practice. Google spends loads of money building out their caching infrastructure, but not even they can match the level of redundancy and locality of P2P.
Yes, but even with that in mind, P2P still holds the advantage. Of course, sharing has to actually be encouraged instead of prohibited. Alas, as sharing became prohibited, that killed off P2P seeding and overall viability. IMHO this is the reason commercial streaming services won, not that their technology was better. The bottlenecks that users are experiencing with streaming services today, as mentioned earlier, would be better handled with P2P networks to improve locality, redundancy, and scalability.
Alfman,
It is throughput we care about, which depends on both bandwidth and latency, and it dictates block size. Given very long ping times in this theoretical network (say 1000 ms average, due to overhead), we need very large blocks for uninterrupted playback.
So, for a 40 Mbit/s stream, you’d need chunks of 5 MB in size. For a baseline, BitTorrent uses 16 KB blocks (https://stackoverflow.com/questions/65250690/is-there-a-provably-optimal-block-piece-size-for-torrents-and-individual-file), which means it would not be suitable for direct streaming. (It is also unsuitable for other reasons, which are important as well.)
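As a quick sanity check of that chunk arithmetic, a sketch (the 1000 ms average round trip is the assumption from above):

stream_bps = 40_000_000  # 40 Mbit/s stream
rtt_s = 1.0              # assumed average round trip in this network
chunk_bytes = stream_bps * rtt_s / 8  # one round trip of video buffered
print(chunk_bytes / 1e6)  # 5.0 MB per serial request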
All the current p2p networks have participants at the leaves of the tree, not at the intermediate nodes. So all connections have a log(N) factor added to them.
Unless ISPs, all the way up to the Tier-1 interconnects, agree to run the p2p software, host the dedicated storage, and donate compute resources, it will always be bottlenecked compared to commercial operations.
(For example, a 6-hop p2p transfer is actually 6 leaf-to-leaf traversals, each of which is on the order of log(N) for an internet of size N.)
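Roughly, under that tree model (the network size and router fan-out here are made-up illustrative values):

import math

n_leaves = 1_000_000_000  # assumed number of endpoints
fan_out = 50              # assumed average branching per router level
depth = math.ceil(math.log(n_leaves, fan_out))  # ~6 levels deep
overlay_hops = 6
# each leaf-to-leaf overlay hop climbs up and back down the tree
print(overlay_hops * 2 * depth)  # ~72 physical link traversals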
Hmm,
Scratch that about the block sizes. They will still matter, but you can request many small blocks in parallel. That would be another inefficiency in terms of overhead, but it requires a different calculation.
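That different calculation would look roughly like this; it is essentially the bandwidth-delay product, with the same assumed 1000 ms round trip:

stream_bps = 40_000_000     # 40 Mbit/s target
rtt_s = 1.0                 # assumed round trip
block_bits = 16 * 1024 * 8  # 16 KB BitTorrent-style blocks
# blocks that must be in flight at once to keep playback fed
print(round(stream_bps * rtt_s / block_bits))  # ~305 parallel requests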
sukru,
My ping to Hong Kong, on the other side of the earth, is 233 ms, and I’m streaming video from there right now with no issues beyond a short startup delay. Playing video from across the world is the worst-case scenario, at least if we’re talking about physics. On average, nodes will be much closer.
Understood. I don’t think bittorrent is necessarily the best approach either. A purpose built P2P protocol could handle streaming better.
The collective capacity at the leaves of the tree is much greater than even large data centers. Incidentally, this bandwidth saving is exactly why Microsoft implemented p2p “cloud delivery optimization” in Windows 10.
https://learn.microsoft.com/en-us/windows/deployment/do/waas-delivery-optimization
Alfman,
Yes, and ironically this is a major reason p2p did not work out.
As a network admin, I would be blocking p2p ports first if I had capacity issues.
Local mirroring between nodes is good for static content like Windows updates. But sharing with the outside means occupying precious uplink resources.
(Again this is without any copyright considerations).
sukru,
Arguably, you would need to block/throttle video streaming services too, given a tightly bandwidth-constrained network. Even today, some ISPs are actively throttling video streaming services for the same reason.
https://www.broadbandtvnews.com/2019/08/19/net-neutrality-isps-are-throttling-netflix-traffic/
My home internet isn’t throttled, but my cell phone carrier’s certainly is. I’ve detected both AT&T and T-Mobile throttling video services. Some of them even call this practice “video optimization”, but it’s literally throttling. They even have the gall to do this on so-called “unlimited plans”.
https://www.boostmobile.com/support/faq/plans-services/video-optimization.html
So I think we can agree that video, especially FHD & UHD, demands tons of bandwidth that network operators may not have. But I feel like pointing the finger at P2P is using it as a scapegoat for problems that have little to do with P2P specifically.
Efficient P2P scalability depends on local seeds being available. Obviously content will need to traverse the backbone initially, but highly demanded content will quickly become mirrored. Once local seeds are available the internet backbone won’t need to be used for bulk traffic in either direction.
Maybe we can try to model this somehow to produce tangible numbers? I get the feeling you are probably getting tired of the discussion, though, and if so… sorry about that, I don’t mean to stretch things to the point of being boring, haha. I dislike the idea that the internet should revolve around all-powerful companies; I feel very strongly that it didn’t have to turn out this way. Still, I have to concede that they won. The “why” we’re arguing over may be kind of irrelevant at this point.
Makes me long for the days of the disc format. This digitization took control right out of people’s hands, and Gen Z will suffer for that.
THAT, that right there, is the part nobody thinks of. I watched a historian the other day who thinks the 21st century shall be known as “the digital dark ages”, because everything is tied to servers that, once shut down, pretty much erase everything connected to them. He said in 300 years it won’t be hard to develop a device to read a VHS, CD, DVD, MMC, etc., because their basic formats are well understood; but with so much of our culture now tied to remote servers, and all our media tied to some online DRM or to rapidly shifting mobile devices, how much of our history will just be wiped out?
Heck, it’s not even a hundred years we have to worry about. I watch a channel that showcases weird and wacky mobile devices, and I don’t know how many times he has shown some device and had to say “I’d love to show ya what this device looks like working, but the app required to make it work was removed from the app store and nobody has a copy”.
bassbeast,
I don’t know what it will be called, but I have no doubt this will happen to future historians looking back at technology of our time.
I think it’s going to happen a lot sooner. Servers will simply shut down without much fanfare. Old software and games that predate the era of online activation (like Quake) will continue to live on in emulators and maybe even on original hardware. But more recent games and platforms are more likely to end up unusable, even after their copyrights expire and they legally enter the public domain. With “cloud everything” it’s no longer just a matter of breaking the copy protection like it used to be; rather, critical functionality is being moved to the data center. Not just software, but hardware too.
Companies should provide a way for users/owners to support the server side themselves so that things like this don’t happen…
https://www.theguardian.com/technology/2016/apr/05/revolv-devices-bricked-google-nest-smart-home
I’m affected by this one just this year (it’s not just the app, but the server side that is going away).
https://www.reddit.com/r/smarthome/comments/zs2obc/radio_thermostat_mobile_app_discontinued/
But since companies rarely do this of their own accord, the next-best scenario may be for employees to leak the code for decommissioned servers so that the community can do it themselves. But outside of the slim chance that happens, a shutdown may be permanent, with no hope of restoration.
Alfman,
Most of the time, the server “code” is not going to be very useful.
Yes, if they are using standard components, like Kubernetes, or standard protocols, maybe we can get alternate dedicated servers.
But many times there are so many “in house” parts that you not only need the code, you also need to replicate the datacenter itself.
Take Microsoft Flight Simulator. It is actually a cloud-hybrid game. Yes, it can be played offline. But if you want the full experience, you need access to Bing geo data, which measures in petabytes and probably uses obscure private APIs.
Once Microsoft ends support, it will forever be limited.
sukru,
I’m using “code” as an all-encompassing term for source code/database/build environments/etc… whatever is needed to run the server. It would make all the difference in the world for developers like us trying to resurrect abandoned software/hardware.
It would be far less limited if Microsoft were generous enough to allow the community to support it themselves. I don’t even think petabytes of data would stop a determined community effort, given enough interest. But for the historical purposes that bassbeast is referring to, even a much smaller subset of the world data would be immensely valuable.
Alfman,
At that point, this becomes a real burden on the company. Most companies will want to write the minimum amount of code that will work on the target platform. And they will possibly use a lot of proprietary code, a significant portion of which might be licensed from somewhere else.
(When DOOM was open sourced, some parts were missing, like the audio library.)
With modern games, even larger portions would be on the backend.
I don’t think consumers would want to pay for it. (Ultimately this extra development would be financed by increasing costs or reducing features.)
sukru,
I would see it as an excuse if a company used this as their justification for not releasing their server code. Sure, they can’t release what’s not theirs, but there are a lot of things the community can do to remedy that situation.
Let’s use your example of DOOM: do you think the community would be better off if DOOM hadn’t been open sourced on account of the proprietary library? … No, of course not.
Legally it’s the developer’s prerogative to withhold the code for any reason at all, but it does not follow that said code wouldn’t provide benefit to the community. When it comes to software becoming disabled and devices becoming door stops, it’s blatantly wrong for the company to declare that a server component has no value for the community. Only the community themselves can determine whether it has value to them.
Alfman,
In the case of DOOM, they were lucky, since most of the components were in-house.
In other cases, just separating out “what can be released freely” can be a large task.
Again, this can be paid for by the company (unlikely), paid for by volunteers using Patreon, etc. (which I haven’t seen before), or paid upfront by customers through increased prices (for not using off-the-shelf commercial tools).
I wish there was a better answer.
sukru,
I feel the attitude of management is a far bigger factor than anything else in whether they’ll contribute anything as open source. The irony in all of this is that many companies rely heavily on FOSS software to build their proprietary back ends. I’m really struggling to think of a single counter-example among the companies I’ve worked for.
Obviously some companies buy and use proprietary software and tools in their stack. In general this is at the binary or object level. Wouldn’t you agree it’s quite unusual to have the source for proprietary software that isn’t your own? Maybe there are odd cases, but I would think those are the exception and not the rule.
Taking note of the very conspicuous quotation marks in the title.
Thom is just posting for a friend 😉
What I find fascinating is just how much porn has become a part of modern culture and civilization. Porn is even immune to movements such as wokeness and modern Western feminism, when nothing else is immune to those. So the obvious question I have is: are people from such movements pornheads too? I find that very likely to be the case. Fascinating indeed.
Perhaps because there is nothing wrong with it? It’s pictures of people doing consensual stuff, folks; there is nothing wrong with it.
Maybe.
If you think modern feminists are all okay with porn, you have been hanging out with a very limited sample of feminists. Even if you go to r/askfeminists or similar, you will find a lot of people who are anti-porn, and most of them will not be bigots.
Saying something is one thing; what you actually do is another.
Who cares what they are okay with? There is still nothing wrong with pictures of people doing consensual stuff; that’s why nobody pays attention to the porn prohibitionists.
@kurkosdr
You might not care what they are OK with, but for sure they will tell you exactly that. And it looks like porn is so deeply embedded in today’s society that it even gets a free pass in this area. I find that fascinating. Pornheads being the only universally agreed-upon and accepted thing in 2023. Who would have thought that. So no need to be aggressive about it. Just do it. Like all the rest.
Geck: speak for yourself, dude. I have no way of knowing what passes through their Ethernet adapters, and neither do you, unless you’re working for the NSA (and badly abusing the privileges of the job). But then, something tells me feminism isn’t your jam anyway.
kurkosdr: it’s complicated, actually, and the “consensual” part does a lot of work in that sentence. Most people who actually care about porn actors and other sex workers do not support prohibition, for the simple reason that if you talk to sex workers, most of them will say that it makes their lives more dangerous. But this does not mean they’re 100% okay with porn, or with the industry as it exists today. In practice it’s often a poverty trap for actors, abuse and trafficking are rampant, and tube sites steal their videos even if they’ve avoided the worst elements. The reality is a lot shittier than abstractions like “well everyone consented” would lead you to think.
@rainbowsocks
Yes, dude. You speak for yourself too. And what goes on in private chats is not what matters; if you are a movement, your public stance on a subject is far more important. And for now, porn gets a free pass. Fascinating indeed. Porn being the only thing getting that, as everything else is under scrutiny. Being a pornhead is hence some sort of escape island these days. The closest thing to being normal and not judged, I guess, from society’s point of view.
Calling prostitution sex work. Just be honest and call it whoring.
@bubi
If you do watch porn, then in my opinion such judgement is rather baseless, as you belong to the same club.
Oh please, porn is consensual, so they can leave any time they want. If someone doesn’t like working in porn but does it anyway because they think they are owed movie-actor salaries despite not being a movie actor, that’s their problem. They can quit and go work at Starbucks for minimum wage any time they want. It’s like how Mia Khalifa tried to spin a sob story about how bad her experience in the porn industry was, but later it became clear she was being paid the equivalent of a senior petroleum engineer’s salary for her work. Also, she could leave any time she wanted, but for some reason she didn’t. Strange…
Also, trafficking? How on Earth can you be “trafficked” in porn more than in any other job? Get real. It’s like saying casino dealers are “trafficked” because some religious folk don’t like the industry. The performers in the porn industry are in it for the money, consensually.
@kurkosdr
There actually is a lot of abuse involved in porn. If you want, you can always look it up and educate yourself. Sometimes rape is indeed involved, too. Don’t be fooled into believing there are no abusers in the porn industry. There is always some percentage of evil people involved, regardless of the industry, including porn. And no, not even in porn do people deserve it. They are regular people and should be treated as such.
And yet, you are singling out the porn industry. Unless you are proposing we avoid dealing with all industries and go live in the mountains alone.
@kurkosdr
Relax. Just do it.
I remember reading about Cartrivision before, and when I read about the red tapes, I immediately thought “aha, that’s why the studios went apeshit over Betamax”. You see, back then, the idea of paying admission once and watching a film more than once sounded unacceptable to Hollywood execs, so un-rewindable rental cartridges were the DRM scheme Hollywood wanted to impose on the electronics industry of the era. Betamax went against that, and Hollywood was willing to litigate all the way to the Supreme Court on highly dubious grounds to stop it. Fortunately, Hollywood lost.
A similar thing happened with MP3 players, which didn’t play by the DRM rules the recording labels had defined for MiniDisc (by then Sony had changed sides). Fortunately, the recording labels lost, and this is the reason you can buy DRM-free MP3s from Amazon.
They got us with Blu-ray and Netflix though… I kind of hoped people would download MP4 and MKV files instead, but nope, HDCP and Protected Video Path won.
If the tape was returned to the store, the store had to have a way of rewinding it without cracking open the case. It’s a shame the case wasn’t opened to show the simple method of disabling the rewind block.
Yep, I’d imagine there was some sort of magnetic trigger that home players lacked but stores could use to wind it back up by moving the mechanism the article author found and running the spindle in reverse.
FOONE! They are awesome; I love their crazy hobbies and projects.
While we’re on the topic of software and media getting disabled…
“Blizzard Entertainment’s servers were shut down this week after two decades.”
https://interestingengineering.com/culture/gamers-in-china-lose-access-warcraft
Alfman,
This is actually on China.
Unlike the rest of the world, they require games to be published by local companies only. So NetEase, or other partners, can dictate terms, which apparently were not acceptable to the game maker (Blizzard).
For some reason, there is this asymmetry between China and its trade partners. Tencent, for example, can publish games in the West, and can buy companies or invest in them freely. But the reverse is not true.
sukru,
I can see why we’d blame China here. But even so, this is the inevitable fate of all tethered software. Once we lose the ability to run things independently, we become highly dependent on powers outside of our control, regardless of whose fault it will be when the servers go offline. Do you want to show your kids the games from your past? That may become impossible with online-activation games and multiplayer games that don’t enable LAN gameplay. On the one hand, it is what it is; we just have to accept it and move on. But on the other hand, isn’t this a shame? I think so.
Alfman,
I agree. Many of my past games can still be run, even the online ones.
But the recent games will be lost forever. Just last week about 10 “live service” games shut down, one after another.
And I have only heard of Marvel’s Avengers trying to hack together an update that will keep it running as-is (but probably still tied to some servers, somewhere): https://avengers.crystald.com/en-us/final-update-on-the-future-of-marvels-avengers/
The rest were not so lucky.