โŒ

Reading view

There are new articles available, click to refresh the page.

My New Old Keyboard

It should come as no surprise that I type a lot. And I really mean a lot. Almost every keyboard I own has worn off the letters on the keys. Then again, I type so much that I'm really a touch-typist; I rarely look at the keyboard. (Even before I took a class on typing on a typewriter in elementary school, I was already typing as fast as I could talk. Do schools still teach typing?)

There are lots of different types of keyboards out there: high profile, low profile, feather sensitivity or heavy hitters, curved keys, uniform height, etc. Personally, I like the loud, heavy keyboards with high profiles (keys stand up) and no gap between the keys. This way, I can feel the keys without looking down and tell the difference between brushing a finger over a key and actually typing a letter. (If you can't hear my typing during a video call, then the keyboard isn't loud enough.)

Most of my keyboards have had black or gray keys. A few years ago (2022), I bought a "large print keyboard" for the fun of it. It had big keys, a high profile, and a loud click. The selling points for me were the huge letters and the bright yellow color.



Unfortunately, it didn't last very long. Within a few months, the black paint on the letters began to vanish. 'A', 'S', and 'D' were the first to go, followed by 'X', 'C', and 'V' (for cut, copy, and paste). Fast forward to today (3 years later):



The shift-key on the right doesn't have a black scratch on it. That's where I've literally worn through the yellow plastic. It's not that I don't use the letters Q, P, H, or U; they just seem to have lasted longer. (I joked with my colleagues that the backspace and delete keys are in pristine condition -- because I don't make mistakes.)

The New Problems

When a keyboard gets worn down this much, I typically go out and buy a new cheap keyboard. Given that I wear through keyboards every few years, I have trouble justifying $100 for a fancy replacement. Give me a $10 cheap-plastic keyboard every few years and I'll be happy. (Seriously, I splurged $23 on the yellow keyboard. It lasted 3 years, so that's less than $8 a year. Before the yellow keyboard, I had a cheap $12 one that also lasted 3 years, so it cost $4 per year to use.)

Over the last 40+ years, I've seen the quality degrade as vendors cut costs by using cheaper materials. The old heavy IBM PC keyboards were built like tanks -- they never broke down, even if the letters might fade a little. The PS/2 keyboards (circa 1987-1997) had more plastic and occasionally the key switches would degrade before the print on the keys wore off. (I have one old PS/2 keyboard that types "jn" every time you press "n". Beneath each key is a switch. This problem might be a dirty contact, but I don't think I can open the keyboard up without breaking it.) Today's USB keyboards are extremely lightweight but also cheaply constructed; letters fade fast and the plastic on the keys might wear out. Today's keyboards are not built to last.

Making matters worse, most keyboards are made overseas. Between the (insane) tariffs and shipping delays, I don't want to wait. And with the current economic instability, I'd rather not spend the money, even on a new cheap keyboard, if I absolutely don't have to.

What's Old is New

Fortunately, I have a huge box of old keyboards in the storage area. It includes everything from modern USB to old PS/2 and the super old 5-pin DIN connectors. (I think the oldest keyboard in the box is from the early 1980s.) Some computer manufacturers would bundle a keyboard with every new computer. Other times I'd pick up a keyboard in a box of auction junk. (Often, I'd want something else at the auction, but the box being sold also contained keyboards.) Any keyboard that I don't like, don't need, don't use, or that is broken for some reason gets put in the big box of keyboards.

Today I went digging through the box, looking for something with the right profile and feel.
  • The first good one was a 105-key model with a PS/2 connector. (Most US keyboards have 101 keys.) My computer doesn't have a PS/2 port, but in the "big box of old keyboards" was an old PS2-to-USB adapter! That's the nice thing about keyboards -- they all use the same communication protocol. As long as you have the right adapter to plug it in, the computer will recognize it and it will just work.

    This new old keyboard was manufactured in 1992 by a company that no longer exists. (I looked them up. Today, there's a company with the same name, but they were founded in 2001.) And yet, the keyboard still works fine. Well, sort of. All of the standard "101" keys still work fine, but the custom "power", "sleep", "wake", and "Fn" buttons don't register when I press them. (Maybe I need to tweak the keyboard mapping? Probably not worth the effort.) Since it's not perfect, I went back to the box of keyboards.

  • The next keyboard had a bunch of sticky keys that push down but pop up slowly. (From an auction, someone probably spilled a drink on the keyboard a few decades ago.)

  • The original "Sun" keyboard looks like a PS/2 but doesn't work; it's probably not really communicating with PS/2. (When possible, stay away from proprietary connectors.)

  • I found one of my old keyboards that I used with my OS/2 workstation. After plugging it in, I remembered why I replaced it: the space bar was broken. Many space bars have a metal wire that ensures that the key goes down evenly. The wire fits into some plastic clips underneath. After years of use, those clips had broken off.

I finally settled on an old HP keyboard that was buried in the box. It's a C1405B #ABA, manufactured in 1992, back when HP actually made keyboards. (OMG, sellers on Etsy and eBay call it "Vintage"!) It's a heavy monster and yellowed from age, but there's no wear on the letters. It has a good feel and every key seems to work.

There's just one problem. It predates the appearance of the "Super" key ("Windows" or "Command" key on keyboards, next to the shift buttons). On my desk are two computers that share the same keyboard and mouse: a Linux box and a Mac. I use some software called 'Synergy' to link the desktops. As the mouse goes off the side of one monitor, it appears on the next computer's screen. Linux doesn't use the Windows/Command key, but Macs do. This missing key is going to be a problem... Fortunately, Synergy permits me to remap 'alt' to the Mac 'Command' key. (Problem solved.)
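Synergy handles this kind of remap in its plain-text configuration file. Here's a minimal sketch of the relevant sections; the screen names are hypothetical, and the `alt = super` line is the modifier remap described above (per the synergy.conf format, each screen entry can redefine which modifier a physical key sends):

```
section: screens
    linuxbox:
    macbox:
        # On the Mac, send the physical 'alt' key as 'super' (Command)
        alt = super
end

section: links
    linuxbox:
        right = macbox
    macbox:
        left = linuxbox
end
```

With this in place, the same physical key acts as plain 'alt' on the Linux box and as 'Command' once the mouse crosses over to the Mac.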

Macros

On my desktop computer, I have a few macros mapped to certain keys. For example:
  • I almost never use the function keys. I've mapped "F9" to toggle my mouse size. If I press it, then the cursor becomes larger -- which is great for video chats and sharing my screen -- the big icons help people to see my mouse. If I press F9 again, then the mouse returns to the normal small size.

  • I've remapped the "Pause/Break" button. (In 40+ years, I've never used that button to pause/break anything.) Instead, it turns on/off the audio recorder on my telephone. With the push of a button, I can record any call to an MP3. (I use it to record spam phone calls; I wrote about the script back in 2014.) If the phone rings from an unknown caller, I press the button to record and then answer the phone. (And yes, recording calls is legal in Colorado.)

  • The lower-right corner of most 101-key keyboards has a "Menu" button. I've remapped that to mute/unmute my speakers. (Sometimes I can't immediately find the app that is making sounds and I just want the computer to shut up while I take a call. Tap one key for mute.) However, this HP keyboard predates the "Windows" and "Menu" buttons, so I'll need to remap the mute/unmute to a different key. (Maybe F8; I never use that key!)

Unsurprisingly, the macros work with this new old keyboard. While the manufacturing quality has evolved over time, the keyboard communication protocol and key codes haven't changed.
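On Linux, remaps like these can be wired up with a hotkey daemon. Here's a minimal sketch of an xbindkeys rule, assuming ALSA's amixer controls the speakers; F8 stands in for whichever key ends up holding the mute macro:

```
# ~/.xbindkeysrc: toggle the master audio mute with a single key press
"amixer -q set Master toggle"
    F8
```

The same two-line pattern (a quoted shell command, then the key that triggers it) works for any of the macros above.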

Old Learning Curve

I think the biggest hurdle for this new old keyboard will be my own adjusting to the physical key spacing. Cheap keyboards and older keyboards often use different key sizes. With this keyboard, the spacing is a little wider than the yellow keyboard. It also has a different sensitivity. (Not bad, just different.) Then again, if I decide I don't like it, then I can always go back to digging through my big box of old keyboards.

Technological Divergence

One of my annual goals has been to lose some weight. Walking on a treadmill is boring, so I placed a Roku (streaming video system) with a monitor in front of it. This permits me to walk while watching some streaming shows. I can honestly say that I've seen every good piece of science fiction on Netflix. (As well as some really bad sci-fi.) When I started running low on new content, I began rewatching the good shows. (I must have seen The Old Guard at least a half dozen times.)

Unfortunately, the media providers have divided up the available shows. Some things are only available on Netflix, others only on Hulu, Amazon, or a dozen other smaller providers. If you want a bigger selection, then you need to join more services. By December, I had decided to subscribe to another streaming service, just to get more variety.

I heard that Disney+ had the Star Wars franchise. I checked their requirements and they explicitly say that they support my Roku model. However, as soon as I tried to enable it, it said that it wasn't supported on my device. A quick search in their forums identified the problem: Disney lied when they said they support it. Specifically, they have two tiers: with ads (cheap) and without ads (expensive). The cheaper Disney+ plan, which I had signed up for, is not supported on Roku. It turns out that Disney+ is having some kind of dispute with Roku and decided to not support their low-end service on Roku's platform.

I immediately canceled the Disney+ service, but they still tried to charge me for the month. A quick chat with their online service person got me my full refund.

This experience just capped off the end of a year with far too much technological divergence.

Video

I'm old enough to remember when the "Apple computer" was new and IBM made these super heavy "PS/2" desktop machines. Back then, if you wanted video then you could either use a TV or one of the newer video monitors. The video monitors had a wide range of protocols: HGC, CGA, EGA, VGA, EVGA, etc. You needed a video card and a monitor that supported the same protocol. There was a little backwards compatibility, but it wasn't consistent.

Every new video card needed new, custom video drivers. Even different video cards from the same vendor were often incompatible. If the driver wasn't supported by the operating system or the monitor didn't support the video card, then you were out of luck.

Fortunately, most of this insanity was resolved with consistent standards. Today, if you have a monitor and a computer, then you can just plug them in and they work. It's been this way for a few decades.

This same divergence-to-convergence scenario happened with printers. In the early days, you had serial vs parallel, then came network and USB interfaces. You also used to need specific drivers for accessing your printer. But today, most printers speak standard protocols. If your specific printer model isn't in the list, then there's probably a similar (or generic) model that will work just as well. Most of the time, you can just plug it in and start using it.

However, over the last year I began to play with some AI software. I've experimented with everything from artificial text generation (GPT-2, GPT-3, etc.) to computer-generated imagery, voice replacement, and even some deep fake software. With all of these technologies, training the AI really requires a GPU. Unfortunately, this is where I found a resurgence of technological divergence.

It turns out that the GPU interfaces are all non-standard. No AI library supports every GPU model. For example, PyTorch (a very common AI framework) is based on Torch. The github repo for Torch hasn't been updated in 5 years, is failing the nightly builds, and isn't actively supported. And yet, a ton of AI-based github projects still use PyTorch.

The AI community seems to mainly be divided between nVidia and non-nVidia hardware. But sticking to one hardware vendor doesn't ensure compatibility. Newer nVidia cards are released often and the newest ones typically lack community support for at least a year. I've found some projects that just won't work with my (newer) two-year-old nVidia RTX-3060. If you want to work on any AI projects these days, you need to make sure that your specific combination of hardware, drivers, libraries, and software are compatible.
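Most projects cope with this by probing for a usable GPU at runtime and degrading to the CPU when the check fails. A minimal sketch of that common PyTorch fallback pattern (it also survives the case where PyTorch itself isn't installed):

```python
# Pick a compute device, falling back to CPU when CUDA is unavailable:
# no supported GPU, mismatched driver, or PyTorch not installed at all.
try:
    import torch
    has_cuda = torch.cuda.is_available()
except ImportError:
    torch = None
    has_cuda = False

device = "cuda" if has_cuda else "cpu"
print(f"Training on: {device}")
```

A check like this won't make an incompatible library work, but it at least turns a crash into a (slow) CPU run.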

Mobile

This technological divergence seems to be everywhere. This year, I spent some time playing with sensors on mobile devices. Most mobile phones and tablets have GPS, motion sensors, etc. (Last July, I was trying to determine if any of Apple's proprietary metadata fields recorded sensor information.)

There are plenty of JavaScript functions for accessing device information from within the web browser. However, to describe them as "inconsistent" would be an understatement. For example:
  • My Android 11 tablet has the latest versions of mobile Firefox and mobile Chrome installed. Both can read my GPS sensor. However, that's pretty much the end of any consistency. Firefox can see the 'compass', but not the motion sensor. Chrome can see the motion sensor, but not the compass. In contrast, a standalone application (not JavaScript in a web browser) can see both without a problem. The conclusion? Different software on the same platform will work differently.

  • While Firefox and Chrome on the same Android tablet behave differently, the exact same versions of mobile Firefox and mobile Chrome mostly work the same way on my iPhone. However, Firefox can just access the sensors, while Chrome needs a human to click on a permissions button first. Even if I grant indefinite permission, it still asks for permission each time. The conclusion here? The same software on different platforms will work differently.

  • The current iteration of mobile Chrome is version 108 (November). Version 105 (August) saw different sensors and had different prompting compared to 108. Chrome version 101 saw more sensors on the same device than Chrome 105. Thus, there is not even consistency between versions that are only months apart.

There are very clear differences between iOS and Android, Firefox and Chrome, different versions of the same browser, and even between immediate sensor access and prompting the user for permission first. If you're a software developer and hoping for specific functionality, then you have to plan for the inconsistency. The available sensors on any mobile device may differ by hardware, platform, software, and version.

Networking

Even networking, which has been stable for decades, is seeing a technological divergence. For example, most online connections rely on TLS for network security. SSL, TLS 1.0, and TLS 1.1 were deprecated a few years ago due to having too many weak ciphers. (Deprecated is a technical term for "no longer recommended and losing support soon".) On March 31, 2022, TLS 1.1 was no longer supported by Google, Microsoft, Apple, and Mozilla. Online devices really should be using TLS 1.3, but TLS 1.2 is still supported.
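Enforcing that floor in your own software is straightforward. As one example, Python's standard ssl module (since Python 3.7) lets you refuse the deprecated versions outright; a small sketch:

```python
import ssl

# Build a context with secure defaults, then raise the floor so that
# SSL, TLS 1.0, and TLS 1.1 handshakes are rejected entirely.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

Any client or server built from this context will simply fail the handshake with the old-software stragglers (and the attack bots pretending to be them).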

These days, the only time my web sites see anything using SSL or TLS 1.0 is when it comes from attack bots. They're looking for anyone who is still running old software. (It's a good indication that you're vulnerable due to not patching.) I still see a few users with old browsers (usually only on mobile devices) who are using TLS 1.1, but even TLS 1.1 is mostly an indication that the TLS client is a hostile bot or automated script.

For the last few days, DEF CON founder Jeff Moss has been reporting on a problem with his mastodon server. Jeff had only enabled TLS 1.3 (current TLS version that was revised in 2018) but the main mastodon hub seems to only use TLS 1.2 (circa 2008). This meant that Jeff couldn't register his defcon.social mastodon service at the hub. (His solution was to enable both TLS 1.2 and 1.3 support.)

While less severe, I'm also seeing a migration from HTTP/1.1 to HTTP/2.0, but the migration is far from complete. At the current adoption rate, I'm expecting to see HTTP/1.1 for another decade. (In contrast, any clients claiming to be running HTTP/0.9 or HTTP/1.0 are bots and should be blocked on sight.)

On the positive side, I'm seeing more services cracking down on abusive network behavior, like scans and attacks. I used to see attacks coming from all over the internet. But these days, they are often limited to specific subnets or service providers. This makes blocking malicious activity much simpler; I have no problem blacklisting network ranges, or entire domains, when the service provider primarily hosts malicious clients. While blacklisting parts of the internet is introducing a technological divergence (you can't access everything from everywhere), it's great for mitigating attacks.
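A subnet blocklist like that is simple to implement. Here's a minimal sketch using Python's standard ipaddress module; the subnets shown are reserved documentation ranges, standing in for a real provider's address space:

```python
import ipaddress

# Hypothetical blocklist: reserved documentation subnets stand in for
# a provider whose address space primarily hosts malicious clients.
BLOCKED_NETS = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def is_blocked(addr: str) -> bool:
    """Return True if the address falls inside any blacklisted range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLOCKED_NETS)
```

Checking each incoming connection against a handful of network ranges is far cheaper than maintaining per-address blacklists.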

With video cards and printers, there used to be a hassle with installing drivers, but eventually the dust settled and compatible protocols were defined. Today, we seem to be re-entering a state of vendor-specific functionality, competing feature interfaces, and incompatible protocols. Unfortunately, this type of thing usually takes years to resolve. Here's hoping that it happens in 2023, even though it will probably be closer to 2026.

And on a personal note, since I still want to watch sci-fi while walking on the treadmill, I recently added Paramount Plus to my streaming options. They have the Star Trek and Terminator franchises, and it works on my Roku. With a new selection of stuff to watch, I can now boldly go into 2023!
โŒ