
Clone Wars: IBM Edition

14 January 2026 at 10:00

If you search the Internet for “Clone Wars,” you’ll get a lot of Star Wars-related pages. But the original Clone Wars took place a long time ago in a galaxy much nearer to ours, and it has a lot to do with the computer you are probably using right now to read this. (Well, unless it is a Mac, something ARM-based, or an old retro-rig. I did say probably!)

IBM is a name that, for many years, was synonymous with computers, especially big mainframe computers. However, it didn’t start out that way. IBM originally made mechanical calculators and tabulating machines. That changed in 1952 with the 701, the company’s first machine you’d recognize as a computer.

If you weren’t there, it is hard to understand how IBM dominated the computer market in the 1960s and 1970s. Sure, there were others like Univac, Honeywell, and Burroughs. But especially in the United States, IBM was the biggest fish in the pond. At one point, the computer market’s estimated worth was a bit more than $11 billion, and IBM’s five biggest competitors accounted for about $2 billion, with almost all of the rest going to IBM.

So it was somewhat surprising that IBM didn’t roll out the personal computer first, or at least very early. Even companies that made “small” computers for the day, like Digital Equipment Corporation or Data General, weren’t really expecting the truly personal computer. That push came from companies no one had heard of at the time, like MITS, SWTP, IMSAI, and Commodore.

The IBM PC

The story — and this is another story — goes that IBM spun up a team to make the IBM PC, expecting it to sell very little and use up some old keyboards previously earmarked for a failed word processor project. Instead, when the IBM PC showed up in 1981, it was a surprise hit. By 1983, there was the “XT,” which was a PC with some extras, including a hard drive. In 1984, the “AT” showed up with a (gasp!) 16-bit 80286.

The personal computer market had been healthy but small. Now the PC was selling huge volumes, perhaps thanks to commercials like the one below, and decimating other companies in the market. Naturally, others wanted a piece of the pie.

Send in the Clones

Anyone could make a PC-like computer, because IBM had used off-the-shelf parts for nearly everything. There were two things that really set the PC/XT/AT family apart. First, there was a bus for plugging in cards with video outputs, serial ports, memory, and other peripherals. You could start a fine business just making add-on cards, and IBM gave you all the details. This wasn’t unlike the S-100 bus introduced with the Altair, but the volume of PC-class machines far outstripped the S-100 market very quickly.

In reality, there were two buses. The PC/XT had an 8-bit bus, later named the ISA bus. The AT added an extra connector for the extra bits. You could plug an 8-bit card into part of a 16-bit slot. You probably couldn’t plug a 16-bit card into an 8-bit slot, though, unless it was made to work that way.

The other thing you needed to create a working PC was the BIOS — a ROM chip that handled starting the system with all the I/O devices set up and loading an operating system: MS-DOS, CP/M-86, or, later, OS/2.
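As a loose illustration of that last step, the BIOS’s boot handoff amounts to reading the first 512-byte sector of the boot disk, checking for the 0x55 0xAA signature at the end, and jumping to it. Here’s a minimal sketch of that check in Python, run against a raw disk image (the filename “disk.img” is just an assumption for the example):

```python
# Minimal sketch of the BIOS's final boot step: load the first 512-byte
# sector of the boot disk and check for the 0x55 0xAA boot signature
# before handing control to it. "disk.img" is a hypothetical raw image.
import sys

SECTOR_SIZE = 512

with open("disk.img", "rb") as disk:
    boot_sector = disk.read(SECTOR_SIZE)

if len(boot_sector) == SECTOR_SIZE and boot_sector[510:512] == b"\x55\xaa":
    print("Bootable: the BIOS would load this sector at 0000:7C00 and jump to it.")
else:
    sys.exit("Not bootable: missing 0x55AA signature.")
```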

Protection

An ad for a Columbia PC clone.

IBM didn’t think the PC would amount to much, so they didn’t do anything to hide or protect the bus, in contrast to Apple, which had patents on key parts of its computer. They did, however, have a copyright on the BIOS. In theory, creating an IBM PC clone would require designing an Intel-CPU motherboard with memory and I/O devices at the right addresses, a compatible bus, and a compatible BIOS chip.

But IBM gave the world enough documentation to write software for the machine and to make plug-in cards. So, figuring out the other side of it wasn’t particularly difficult. Probably the first clone maker was Columbia Data Products in 1982, although they were perceived to have compatibility and quality issues. (They are still around as a software company.)

Eagle Computer was another early player that originally made CP/M computers. Their computers were not exact clones, but they were the first to use a true 16-bit CPU and the first to have hard drives. There were some compatibility issues with Eagle versus a “true” PC. You can hear their unusual story in the video below.

The PC Reference manual had schematics and helpfully commented BIOS source code

One of the first companies to find real success cloning the PC was Compaq Computers, formed by some former Texas Instruments employees who were, at first, going to open Mexican restaurants, but decided computers would be better. Unlike some future clone makers, Compaq was dedicated to building better computers, not cheaper ones.

Compaq’s first entry into the market was a “luggable” (think of a laptop with a real CRT in a suitcase that only ran when plugged into the wall; see the video below). They reportedly spent $1,000,000 to duplicate the IBM BIOS without peeking inside (which would have caused legal problems). However, it is possible that some clone makers simply copied the IBM BIOS directly or indirectly. This was particularly easy because IBM included the BIOS source code in an appendix of the PC’s technical reference manual.

Between 1982 and 1983, Compaq, Columbia Data Products, Eagle Computer, Leading Edge, and Kaypro all threw their hats into the ring. Part of what made this sustainable over the long term was Phoenix Technologies.

Rise of the Phoenix

Phoenix was a software producer that realized the value of having a non-IBM BIOS. They put together a team to study the BIOS using only public documentation and produce a specification. That specification was handed to a programmer who had never seen IBM’s code, and that programmer then produced a “clean room” piece of code that did the same things as the BIOS.

An Eagle ad from 1983

This was important because, inevitably, IBM sued Phoenix but lost, as Phoenix was able to provide credible documentation that it didn’t copy IBM’s code. Phoenix was ready to license its BIOS in 1984, and companies like Hewlett-Packard, Tandy, and AT&T were happy to pay the $290,000 license fee. That fee also included insurance from The Hartford to indemnify licensees against any copyright-infringement lawsuits.

Clones were attractive because they were often far cheaper than a “real” PC. They also often featured innovations. For example, almost all clones had a “turbo” mode to increase the clock speed a little. Many had ports or other features as standard that PC buyers had to pay extra for (and that consumed card slots). Compaq, Columbia, and Kaypro made luggable PCs. In addition, supply didn’t always match demand. Dealers could often sell more PCs than they could get in stock, and the clones offered them a way to close more business.

Issues

Not all clone makers got everything right. It wasn’t odd for an off-brand machine to handle interrupts or timers differently from a real IBM machine. Another favorite place to err involved AT/PC compatibility.

In a base-model IBM PC, the address bus only went from A0 to A19. So if you hit address (hex) FFFFF+1, it would wrap around to 00000. Memory being at a premium, some programs apparently depended on that wraparound behavior.

With the AT, there were more address lines. Rather than breaking backward compatibility, those machines have an “A20 gate.” By default, the A20 line is disabled; you must enable it to use it. However, there were several variations in how that worked.
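To see why this mattered, here is a small illustrative sketch (in Python, purely for demonstration) of how a real-mode segment:offset address behaves with the A20 line gated off versus enabled:

```python
# Illustrative sketch: real-mode segment:offset addressing with the
# A20 line gated off (PC/XT behavior) versus enabled (AT and later).

def physical_address(segment: int, offset: int, a20_enabled: bool) -> int:
    """Compute the physical address an x86 generates in real mode."""
    linear = (segment << 4) + offset  # can reach up to 0x10FFEF (21 bits)
    if not a20_enabled:
        # With A20 gated off, bit 20 is forced to zero, so anything past
        # 0xFFFFF wraps around to the bottom of memory.
        linear &= 0xFFFFF
    return linear

# FFFF:0010 is one byte past the 1 MB boundary.
print(hex(physical_address(0xFFFF, 0x0010, a20_enabled=False)))  # 0x0 (wraps)
print(hex(physical_address(0xFFFF, 0x0010, a20_enabled=True)))   # 0x100000
```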

Intel, for example, had the InBoard/386 that let you plug a 386 into a PC or AT to upgrade it. However, the InBoard A20 gating differed from that of a real AT. Most people never noticed. Software that used the BIOS still worked because the InBoard’s BIOS knew the correct procedure. Most software didn’t care either way. But there was always that one program that would need a fix.

The original AT used some spare logic in the keyboard controller to handle the gate. When CPUs started using cache, the A20 gating was moved into the CPU for many generations. However, around 2013, most CPUs finally gave up on gating A20.

The point is that there were many subtle features on a real IBM computer, and the clone makers didn’t always get them all right. If you read ads from those days, they often touted how compatible the machines were.

Total War!

IBM started a series of legal battles against… well… everybody: Compaq, Corona Data Systems, Handwell, Phoenix, AMD, and anyone else who managed to put anything on the market that competed with “Big Blue” (one of IBM’s nicknames).

IBM didn’t win anything significant, although most companies settled out of court. The clone makers then just used the Phoenix BIOS, which was provably “clean.” So IBM decided to take a different approach.

In 1987, IBM decided they should have paid more attention to the PC design, so they redid it as the PS/2. IBM spent a lot of money telling people how much better the PS/2 was. They had really thought about it this time. So scrap those awful PCs and buy a PS/2 instead.

Of course, the PS/2 wasn’t compatible with anything. It was made to run OS/2. It used the MCA bus, which was incompatible with the ISA bus and had few cards available. All of it, of course, was expensive. This time, clone makers had to pay IBM a license fee to use the new bus, so no more cheap cards, either.

You probably don’t need a business degree to predict how that turned out. The market yawned and continued buying PC “clones,” which were now the only game in town if you wanted a PC/XT/AT-style machine, especially since Compaq had beaten IBM to market with an 80386 PC by about a year.

Not all software was compatible with all clones. But most software would run on anything and, as clones got more prevalent, software got smarter about what to expect. At about the same time, people were thinking more about buying applications and less about the computer they ran on, a trend that had started even earlier, but was continuing to grow. Ordinary people didn’t care what was in the computer as long as it ran their spreadsheet, or accounting program, or whatever it was they were using.

Dozens of companies made something that resembled a PC, including big names like Olivetti, Zenith, Hewlett-Packard, Texas Instruments, Digital Equipment Corporation, and Tandy. Then there were the companies you might remember for other reasons, like Sanyo or TeleVideo. There were also many that simply came and went with little name recognition. Michael Dell started PC Limited in 1984 in his college dorm room, and by 1985, he was selling an $800 turbo PC. A few years later, the name changed to Dell, and now it is a giant in the industry.

Looking Back

It is interesting to play “what if” with this time in history. If IBM had not opened their architecture, they might have made more money. Or, they might have sold 1,000 PCs and lost interest. Then we’d all be using something different. Microsoft retaining the right to sell MS-DOS to other people was also a key enabler.

IBM stayed in the laptop business (ThinkPad) until they sold that business to Lenovo in 2005. They would also sell Lenovo their x86 server business in 2014.

Things have changed, of course. There hasn’t been an ISA card slot on a motherboard in ages. Boot processes are more complex, and there are many BIOS options. Don’t even get us started on EMS and XMS. But at the core, your PC-compatible computer still wakes up and follows the same steps as an old-school PC to get started. Like the Ship of Theseus, is it still an “IBM-compatible PC”? If it matters, we think the answer is yes.

If you want to relive those days, we recently saw some new machines sporting 8088s and 80386s. Or, there’s always emulation.

Quantum computing is closer than you think

By: wfedstaff
8 January 2026 at 12:40

Quantum computing: From “someday” to now

Quantum computing has quietly advanced to a level of maturity and capability that many technologists and policymakers still underestimate. Long dismissed as a “future” technology, quantum is already delivering value in select use cases today.

“We’ve got real quantum computers,” said Scott Crowder, Vice President of Quantum Adoption at IBM Research. “We’ve come a long way in a very short period of time.”

Just nine years ago, Crowder noted, there were no quantum developers and no way to write quantum software. “Since then, we’ve gone from five-qubit systems to machines capable of running programs too complex for the world’s largest supercomputers,” he said.

While quantum computing is becoming more commercially available, it remains an emerging technology. The development curve, however, is accelerating thanks to ongoing improvements in qubit quality, system architecture, and software. Experts expect practical uses in the next five years, especially in medicine, energy, and materials science.

Act now: Modernization and quantum-safe security

Quantum computing will bring benefits before it becomes powerful enough to break today’s encryption – but that risk is coming. Future quantum computers could crack the cryptographic systems that protect government data. In fact, malicious actors can already steal encrypted data and simply wait until quantum technology makes it easy to unlock (a “harvest now, decrypt later” attack).

That’s why agencies should include quantum-safe cryptography as part of their modernization efforts.

“There is risk today already for not changing your crypto standards,” Crowder warned. “Incorporating quantum considerations now ensures they become part of your IT strategy, not an afterthought.”

Crowder’s advice:

  • Make quantum-safe cryptography part of your modernization plans.
  • Find out what encryption you use (your “crypto bill of materials”); one way to start is sketched after this list.
  • Start switching to quantum-safe algorithms based on NIST standards.
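By way of illustration only, here is a minimal sketch of how an agency might begin building that crypto bill of materials: scanning a directory of PEM certificates with Python’s cryptography package and flagging the quantum-vulnerable (RSA and elliptic-curve) keys. The directory path and report format are assumptions for the example, not anything prescribed by NIST or IBM.

```python
# Hypothetical starting point for a "crypto bill of materials":
# inventory the public-key algorithms in a directory of PEM certificates
# and flag the ones a future quantum computer could break.
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

CERT_DIR = Path("certs")  # assumed location of your PEM certificates

for pem in sorted(CERT_DIR.glob("*.pem")):
    cert = x509.load_pem_x509_certificate(pem.read_bytes())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        algo = f"RSA-{key.key_size}"
        quantum_safe = False  # RSA falls to Shor's algorithm
    elif isinstance(key, ec.EllipticCurvePublicKey):
        algo = f"ECDSA ({key.curve.name})"
        quantum_safe = False  # so do elliptic-curve schemes
    else:
        algo = type(key).__name__
        quantum_safe = None  # unknown; investigate manually
    print(f"{pem.name}: {algo} -> quantum-safe: {quantum_safe}")
```

A real inventory would also cover TLS configurations, code-signing keys, and data-at-rest encryption, but a certificate scan like this is a cheap first pass.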

The bottom line

Quantum computing is no longer just a theory: it’s real, it’s advancing fast, and it will change how we solve complex problems. For government agencies, the opportunity is huge, but so is the responsibility. Preparing now, by planning for quantum-safe security as part of modernization efforts, will ensure readiness for both the technology’s benefits and its risks.


Global data breach costs hit all-time high

By: slandau
30 July 2024 at 12:38

EXECUTIVE SUMMARY:

Global data breach costs have hit an all-time high, according to IBM’s annual Cost of a Data Breach report. The tech giant collaborated with the Ponemon Institute to study breaches at more than 600 organizations between March of 2023 and February of 2024.

The breaches spanned 17 industries across 16 countries and regions and involved leaks of 2,000 to 113,000 records per breach. Here’s what researchers found…

Essential information

The global average cost of a data breach is $4.88 million, up nearly 10% from last year’s $4.45 million. Key drivers of the year-over-year cost spike included post-breach third-party expenses, along with lost business.

Image courtesy of IBM.

Over 50% of the organizations interviewed said that they are passing breach costs on to customers through higher prices for goods and services.

More key findings

  • For the 14th consecutive year, the U.S. has the highest average data breach costs worldwide, at nearly $9.4 million.
  • In the last year, Canada and Japan both experienced drops in average breach costs.
  • Most breaches could be traced back to one of two sources – stolen credentials or a phishing email.
  • Seventy percent of organizations noted that breaches led to “significant” or “very significant” levels of disruption.

Deep-dive insights: AI

The report also observed that an increasing number of organizations are adopting artificial intelligence and automation to prevent breaches. Nearly two-thirds of organizations were found to have deployed AI and automation technologies across security operations centers.

Organizations that used AI and automation extensively in prevention workflows saw average breach costs that were $2.2 million lower than at organizations that didn’t use them.

Right now, only 20% of organizations report using gen AI security tools. However, those that have implemented them note a net positive effect: gen AI security tools can reduce the average cost of a breach by more than $167,000, according to the report.

Deep-dive insights: Cloud

Multi-environment cloud breaches were found to cost more than $5 million to contend with, on average. Out of all breach types, they also took the longest to identify and contain, reflecting how difficult it is to locate and protect data spread across environments.

In cloud-based breaches, commonly stolen data types included personally identifiable information (PII) and intellectual property (IP).

As generative AI initiatives draw this data into new programs and processes, cyber security professionals are encouraged to reassess corresponding security and access controls.

The role of staffing issues

A number of organizations that contended with cyber attacks were found to have understaffed cyber security teams. Staffing shortages are up 26% compared to last year.

Organizations with cyber security staff shortages averaged an additional $1.76 million in breach costs as compared to organizations with minimal or no staffing issues.

Staffing issues may be contributing to the increased use of AI and automation, which, again, have been shown to reduce breach costs.

Further information

For more AI and cloud insights, click here. Access the Cost of a Data Breach 2024 report here. Lastly, to receive cyber security thought leadership articles, groundbreaking research and emerging threat analyses each week, subscribe to the CyberTalk.org newsletter.

