One of my annual goals has been to lose some weight. Walking on a treadmill is boring, so I placed a Roku (streaming video system) with a monitor in front of it. This permits me to walk while watching some streaming shows. I can honestly say that I've seen every good piece of science fiction on Netflix. (As well as some really bad sci-fi.) When I started running low on new content, I began rewatching the good shows. (I must have seen The Old Guard at least a half dozen times.)
Unfortunately, the media providers have divided up the available shows. Some things are only available on Netflix, others are only on Hulu, Amazon, or a dozen other smaller providers. If you want a bigger selection, then you need to sign up for more services. By December, I had decided to subscribe to another streaming service, just to get more variety.
I heard that Disney+ had the Star Wars franchise. I checked their requirements and they explicitly say that they support my Roku model. However, as soon as I tried to enable it, it said that it wasn't supported on my device. A quick search in their forums identified the problem: Disney lied when they said they support it. Specifically, they have two tiers: with ads (cheap) and without ads (expensive). The cheaper Disney+ plan, which I had signed up for, is not supported on Roku. It turns out that Disney+ is having some kind of dispute with Roku and decided not to support their low-end service on Roku's platform.
I immediately canceled the Disney+ service, but they still tried to charge me for the month. A quick chat with their online service person got me my full refund.
This experience just capped off the end of a year with far too much technological divergence.
Video
I'm old enough to remember when the "Apple computer" was new and IBM made these super heavy "PS/2" desktop machines. Back then, if you wanted video then you could either use a TV or one of the newer video monitors. The video monitors had a wide range of protocols: HGC, CGA, EGA, VGA, EVGA, etc. You needed a video card and a monitor that supported the same protocol. There was a little backwards compatibility, but it wasn't consistent.
Every new video card needed new, custom video drivers. Even different video cards from the same vendor were often incompatible. If the driver wasn't supported by the operating system or the monitor didn't support the video card, then you were out of luck.
Fortunately, most of this insanity was resolved with consistent standards. Today, if you have a monitor and a computer, then you can just plug them in and they work. It's been this way for a few decades.
This same divergence-to-convergence scenario happened with printers. In the early days, you had serial vs parallel, then came network and USB interfaces. You also used to need specific drivers for accessing your printer. But today, most printers speak standard protocols. If your specific printer model isn't in the list, then there's probably a similar (or generic) model that will work just as well. Most of the time, you can just plug it in and start using it.
However, over the last year I began to play with some AI software. I've experimented with everything from artificial text generation (GPT-2, GPT-3, etc.) to computer-generated imagery, voice replacement, and even some deep fake software. With all of these technologies, training the AI really requires a GPU. Unfortunately, this is where I found a resurgence of technological divergence.
It turns out that the GPU interfaces are all non-standard. No AI library supports every GPU model. For example, PyTorch (a very common AI framework) is based on Torch. The github repo for Torch hasn't been updated in 5 years, is failing the nightly builds, and isn't actively supported. And yet, a ton of AI-based github projects still use PyTorch.
The AI community seems to mainly be divided between nVidia and non-nVidia hardware. But sticking to one hardware vendor doesn't ensure compatibility. Newer nVidia cards are released often, and the newest ones typically lack community support for at least a year. I've found some projects that just won't work with my (newer) two-year-old nVidia RTX 3060. If you want to work on any AI projects these days, you need to make sure that your specific combination of hardware, drivers, libraries, and software is compatible.
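As a minimal sketch of that kind of compatibility check (assuming PyTorch is already installed), a few lines will report whether the library can actually see the GPU and which CUDA build it was compiled against:

```python
import torch

print("PyTorch version:", torch.__version__)
print("CUDA build:", torch.version.cuda)           # CUDA version PyTorch was compiled against
print("GPU available:", torch.cuda.is_available())

if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    # Compute capability matters too: some projects require a minimum level.
    print("Compute capability:", torch.cuda.get_device_capability(0))
else:
    print("No usable GPU: check the driver, CUDA, and PyTorch build versions.")
```

If that last branch prints, the mismatch is usually between the installed driver, the CUDA toolkit, and the particular PyTorch build, not the card itself.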
Mobile
This technological divergence seems to be everywhere. This year, I spent some time playing with sensors on mobile devices. Most mobile phones and tablets have GPS, motion sensors, etc. (Last July, I was trying to determine if any of Apple's proprietary metadata fields recorded sensor information.)
There are plenty of JavaScript functions for accessing device information from within the web browser. However, to describe them as "inconsistent" would be an understatement. For example:
- My Android 11 tablet has the latest versions of mobile Firefox and mobile Chrome installed. Both can read my GPS sensor. However, that's pretty much the end of any consistency. Firefox can see the 'compass', but not the motion sensor. Chrome can see the motion sensor, but not the compass. In contrast, a standalone application (not JavaScript in a web browser) can see both without a problem. The conclusion? Different software on the same platform will work differently.
- While Firefox and Chrome on the same Android device behave differently, the exact same versions of mobile Firefox and mobile Chrome mostly work the same way on my iPhone. However, Firefox can just access the sensors, while Chrome needs a human to click on a permissions button first. Even when I grant indefinite permission, Chrome still prompts for permission each time. The conclusion here? The same software on different platforms will work differently.
- The current iteration of mobile Chrome is version 108 (November). Version 105 (August) saw different sensors and had different prompting compared to 108. Chrome version 101 saw more sensors on the same device than Chrome 105. Thus, there is not even consistency between versions that are only months apart.
There are very clear differences between iOS and Android, between Firefox and Chrome, between different versions of the same browser, and even in whether the sensors can be used immediately or require a permission prompt first. If you're a software developer hoping for specific functionality, then you have to plan for the inconsistency. The available sensors on any mobile device may differ by hardware, platform, software, and version.
Networking
Even networking, which has been stable for decades, is seeing a technological divergence. For example, most online connections rely on TLS for network security. SSL, TLS 1.0, and TLS 1.1 were deprecated a few years ago due to having too many weak ciphers. (Deprecated is a technical term for "no longer recommended and losing support soon".) As of March 31, 2022, TLS 1.1 is no longer supported by Google, Microsoft, Apple, and Mozilla. Online devices really should be using TLS 1.3, but TLS 1.2 is still supported.
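If you're curious which version a given server will actually negotiate, a few lines of Python with the standard ssl module will show you. This is just an illustrative probe; the hostname is a placeholder:

```python
import socket
import ssl

host = "www.example.com"  # placeholder hostname
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse anything older, like modern browsers

with socket.create_connection((host, 443), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        print("Negotiated:", tls.version())   # e.g., "TLSv1.3"
        print("Cipher:", tls.cipher()[0])
```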
These days, the only time my web sites see anything using SSL or TLS 1.0 is when it comes from attack bots. They're looking for anyone who is still running old software. (It's a good indication that you're vulnerable due to not patching.) I still see a few users with old browsers (usually only on mobile devices) who are using TLS 1.1, but even TLS 1.1 is mostly an indication that the TLS client is a hostile bot or automated script.
For the last few days, DEF CON founder Jeff Moss has been reporting on a problem with his Mastodon server. Jeff had only enabled TLS 1.3 (the current TLS version, released in 2018), but the main Mastodon hub seems to only use TLS 1.2 (circa 2008). This meant that Jeff couldn't register his defcon.social Mastodon service at the hub. (His solution was to enable both TLS 1.2 and 1.3 support.)
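His actual fix was presumably a web server configuration change, but the same idea, sketched with Python's ssl module, looks roughly like this: set the floor at TLS 1.2 while still allowing TLS 1.3 (the certificate paths are hypothetical):

```python
import ssl

# Sketch of a server context that accepts both TLS 1.2 and TLS 1.3 clients.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2  # allow older (1.2-only) clients, like the hub
context.maximum_version = ssl.TLSVersion.TLSv1_3  # still negotiate 1.3 when possible
# context.load_cert_chain("fullchain.pem", "privkey.pem")  # hypothetical certificate paths
```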
On a less severe note, I'm also seeing a migration from HTTP/1.1 to HTTP/2.0, but the migration is far from complete. At the current adoption rate, I expect to see HTTP/1.1 for another decade. (In contrast, any clients claiming to be running HTTP/0.9 or HTTP/1.0 are bots and should be blocked on sight.)
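If you want to see which version a particular site serves, the third-party httpx library (with its optional HTTP/2 support installed) can report it; the URL below is just a placeholder:

```python
import httpx  # third-party; HTTP/2 support requires the optional 'h2' extra

with httpx.Client(http2=True) as client:
    response = client.get("https://www.example.com/")  # placeholder URL
    print(response.http_version)  # "HTTP/1.1" or "HTTP/2"
```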
On the positive side, I'm seeing more services cracking down on abusive network behavior, like scans and attacks. I used to see attacks coming from all over the internet. But these days, they are often limited to specific subnets or service providers. This makes blocking malicious activity much simpler; I have no problem blacklisting network ranges, or entire domains, when the service provider primarily hosts malicious clients. While blacklisting parts of the internet is introducing a technological divergence (you can't access everything from everywhere), it's great for mitigating attacks.
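The actual blocking belongs in a firewall or web server configuration, but the core check is simple enough to sketch with Python's standard ipaddress module. The ranges below are RFC 5737 documentation placeholders, not a real blocklist:

```python
from ipaddress import ip_address, ip_network

# Placeholder ranges (RFC 5737 documentation networks), not a real blocklist.
BLOCKED_RANGES = [
    ip_network("192.0.2.0/24"),
    ip_network("198.51.100.0/24"),
]

def is_blocked(client_ip: str) -> bool:
    """Return True if the client falls inside any blacklisted range."""
    addr = ip_address(client_ip)
    return any(addr in net for net in BLOCKED_RANGES)

print(is_blocked("198.51.100.37"))  # True
print(is_blocked("203.0.113.5"))    # False
```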
With video cards and printers, there used to be a hassle with installing drivers, but eventually the dust settled and compatible protocols were defined. Today, we seem to be re-entering a state of vendor-specific functionality, competing feature interfaces, and incompatible protocols. Unfortunately, this type of thing usually takes years to resolve. Here's hoping that it happens in 2023, even though it will probably be closer to 2026.
And on a personal note, since I still want to watch sci-fi while walking on the treadmill, I recently added Paramount Plus to my streaming options. They have the Star Trek and Terminator franchises, and it works on my Roku. With a new selection of stuff to watch, I can now boldly go into 2023!