Ancient Egyptian Flatness

Making a truly flat surface is a modern engineering feat, and not a small one. Even making something straight without reference tools that are already straight is a challenge. However, the ancient Egyptians apparently made very straight, very flat stone work. How did they do it? Probably not alien-supplied CNC machines. [IntoTheMap] explains why it is important and how they may have done it in a recent video you can see below.

The first step is to define flatness, and modern mechanical engineers have taken care of that. If you use 3D printers, you know how hard it is to even get your bed and nozzle “flat” with respect to each other. You’ll almost always have at least a 100 micron variation in the bed distances. The video shows how different levels of flatness require different measurement techniques.
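
To put a number on flatness, one common approach is to fit a best-fit plane to a set of measured heights and report the peak-to-valley deviation of the residuals. Here is a minimal Python sketch with made-up sample data (note that the ISO definition of flatness uses a minimum-zone fit; least squares is a common, simpler approximation):

```python
import numpy as np

# Heights (mm) measured at a few (x, y) positions -- made-up sample data
xy = np.array([[0, 0], [0, 10], [10, 0], [10, 10], [5, 5]], dtype=float)
z = np.array([0.00, 0.12, 0.05, 0.15, 0.11])

# Least-squares best-fit plane: z = a*x + b*y + c
A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(xy))])
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)

# Flatness as the peak-to-valley deviation from that plane
residuals = z - A @ coeffs
print(f"flatness (peak-to-valley): {residuals.max() - residuals.min():.3f} mm")
```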

The Great Pyramid’s casing stones have joints measuring 0.5 mm, which is incredible to achieve on such large stones with no modern tools. A stone box in the Pyramid of Sesostris II is especially well done and extremely flat, although we can make things flatter today.

The main problem with creating a flat surface is that to do a good job, you need some flat things to start with. However, there is a method from the 19th century that uses three plates and multiple lapping steps to create three very flat plates. In modern times, we use a blue material to indicate raised areas, much as a dentist makes you chomp on a piece of paper to place a crown. There are traces of red ochre on Egyptian stonework that probably served the same purpose.
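
Here’s a toy model of that marking step — the surface numbers and the one-micron “ink transfer” threshold are invented for illustration, and the reference plate is assumed perfect — showing how only the highest spots touch and pick up pigment:

```python
import numpy as np

rng = np.random.default_rng(1)
work = rng.normal(0, 5, 50)   # work surface height errors, in microns
reference = np.zeros(50)      # pretend the reference plate is perfect

# Faced together, contact happens where the combined height is greatest;
# only spots within ~1 micron of touching would pick up the pigment.
combined = work + reference
marked = combined > combined.max() - 1.0
print("high spots to work down at indices:", np.flatnonzero(marked))
```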

Lapping large pieces is still a challenge, but moving giant stones at scale appears to have been a solved problem for the Egyptians. Was this the method they used? We don’t know, of course. But it certainly makes sense.

It would be a long time before modern people could make things as flat. While we can do even better now, we also have better measuring tools.

Crazy Old Machines

Al and I were talking about the IBM 9020 FAA Air Traffic Control computer system on the podcast. It’s a strange machine, made up of a bunch of IBM System 360 mainframes connected together to a common memory unit, with all sorts of custom peripherals to support keeping track of airplanes in the sky. Absolutely go read the in-depth article on that machine if it sparks your curiosity.

It got me thinking about how strange computers were in the early days, and how boringly similar they’ve all become. Just looking at the word sizes of old machines is a great example. Over the last, say, 40 years, things that do computing have had 4, 8, 16, 32, or even 64-bit words. You noticed the powers-of-two trend going on here, right? Basically starting with the lowly Intel 4004, it’s been round numbers ever since.

Harvard Mark I, by [Topory]
On the other side of the timeline, though, you get strange beasts. The classic PDP-8 had 12-bit words, while its predecessors the PDP-6 and PDP-1 had 36 bits and 18 bits respectively. (Multiples of six?) There’s a string of military guidance computers that had 27-bit words, while the Apollo Guidance Computer ran 15-bit words. UNIVAC III had 25-bit words, while the venerable Harvard Mark I worked with 23 decimal digits per word.
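
For a feel for what those odd word sizes buy you, here’s a quick back-of-the-envelope in Python. (Caveat: several of these machines actually used one’s complement or sign-magnitude arithmetic — and the Mark I was decimal — so the two’s-complement column is purely illustrative.)

```python
# Value ranges for some of the odd word sizes mentioned above
for bits in (12, 15, 18, 25, 27, 36):
    unsigned_max = 2**bits - 1
    signed_min, signed_max = -(2**(bits - 1)), 2**(bits - 1) - 1
    print(f"{bits:2d}-bit word: unsigned 0..{unsigned_max}, "
          f"two's complement {signed_min}..{signed_max}")
```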

I wasn’t there, but it gives you the feeling that each computer was a unique, almost hand-crafted machine. Some of these odd architectural choices must have been made to suit particular functions, others because some designer had a clever idea. I’m not a computer historian, but I’m sure the word lengths tell a number of interesting stories.

On the whole, though, it gives the impression of a time when each computer was its own unique machine, before everything converged on roughly the same architectural ideas. A much more hackery time, for lack of a better word. We still see echoes of this in the people who make their own “retro” computers these days, either virtually, on a breadboard, or emulated in the fabric of an FPGA. It’s not just nostalgia, though, but a return to a time when there was more creative freedom: a time before 64 bits took over.

All sorts of interesting flags and artifacts will fly to the Moon on Artemis II

NASA's first astronauts to fly to the Moon in more than 50 years will pay tribute to the lunar and space exploration missions that preceded them, as well as aviation and American history, by taking with them artifacts and mementos representing those past accomplishments.

On Wednesday, January 21, NASA revealed the contents of the Artemis II mission's Official Flight Kit (OFK), continuing a tradition dating back to the Apollo program of packing a duffel bag-sized pouch of symbolic and celebratory items to commemorate the flight and recognize the people behind it. The kit contains more than 2,300 items, including a handful of historic relics.

"This mission will bring together pieces of our earliest achievements in aviation, defining moments from human spaceflight and symbols of where we're headed next," Jared Isaacman, NASA's administrator, said in a statement. "Historical artifacts flying aboard Artemis II reflect the long arc of American exploration and the generations of innovators who made this moment possible."

Unix workstations: The unsung heroes of modern computing

If you were a developer, scientist, or engineer — or even a college student — in the 1980s and early 1990s, you probably spent a lot of time in front of a Unix machine. It might have felt like living in the future: workstations pioneered many computing features we now take for granted.

MLK & Marijuana: How the Civil Rights Leader’s Work Informs the Push for Legal Pot

Martin Luther King Jr. might have turned 96 years old this month if he had not been felled by an assassin’s bullet on April 4, 1968. It is, of course, impossible to know what the United States would look like today if he had lived — or what he would think about the political dilemmas of our own time.

Yet there are certain obvious parallels between his time and ours. The country continues to be bitterly divided along political lines. And many activists and scholars argue that the racist power structure that King fought has re-congealed — this time in the guise of the “War on Drugs” and mass incarceration. His legacy, therefore, holds lessons for those now fighting for cannabis legalization.

Cycles of Repression and Revolution  

Foremost among those scholars is Michelle Alexander, author of the 2010 bestseller The New Jim Crow: Mass Incarceration in the Age of Colorblindness. Alexander takes a long view of the struggle for racial justice in the United States and paints a grim picture. She illustrates how many of the gains that King won in his lifetime have been reversed since his death — this time in a new “race-neutral” guise that only serves to mask continued institutionalized racism.

Alexander notes that in 1972, there were under 350,000 people in prisons and jails nationwide. Today there are 2 million. In fact, the US has the most people behind bars of any nation on Earth, in both per capita and absolute terms. This is certainly an irony for the country that touts itself as the “land of the free.” 

Among those 2 million people in prison are 40,000 who remain incarcerated in state or federal prisons on cannabis-related convictions — about half of them for marijuana offenses alone. When those waiting to see a judge in local jails are added in, the figure may approach 100,000 on any given day. And the racial disparity could not be more obvious. A 2013 American Civil Liberties Union report, Marijuana in Black and White: Billions of Dollars Wasted on Racially Biased Arrests, crunched the national data. It found that black people are more than three times as likely as whites to be arrested for cannabis — despite consuming the plant at essentially the same rates.

And this is not the first time the country has seen significant and hard-won racial progress largely reversed, with the same power structure re-establishing itself in a new guise. Slavery was abolished in the aftermath of the Civil War. But, as Alexander quotes historian and early civil rights activist W. E. B. Du Bois, from his 1935 book Black Reconstruction in America, “The slave went free, stood a brief moment in the sun, then moved back again toward slavery.”

In the South under occupation by Union troops after the Civil War, black people for the first time voted, served on juries and held elected office — until the backlash came. In 1877, the federal troops were withdrawn. In subsequent years, without federal interference, Ku Klux Klan terror enforced legal apartheid in the southern states — the system known as Jim Crow. Blacks were often reduced to a state of near-slavery through sharecropping and were barred from the vote by systematic disenfranchisement.

It wasn’t until nearly a century after the Civil War that this system would be challenged. In his book Why We Can’t Wait, an account of the 1963 Birmingham Campaign to desegregate Alabama’s biggest city, King wrote of “America’s third revolution — the Negro Revolution.” 

By King’s reckoning, the country’s first revolution had been the one we actually call “the Revolution” — the War of Independence, although it left the slave-owning aristocracy of the South thoroughly in place. The second was arguably far more revolutionary — the Civil War, in which the slave system was broken. King’s Civil Rights Movement was avowedly nonviolent, but it was still a revolution — the overturning of a power structure by physical as well as moral opposition.

Despite the violent backlash, both from the police and Ku Klux Klan terrorists, the campaign ultimately swayed the nation, resulting in the passage of the Civil Rights Act of 1964 and other landmark legislation that finally ended legal apartheid in America.

But the year of King’s assassination saw the country’s national political establishment embracing the backlash — exactly as in 1877. In the 1968 presidential campaign, Republican candidate Richard Nixon first adopted the rhetoric of a “War on Drugs” (although he would actually coin that phrase three years later, when the Controlled Substances Act was passed). And, in just barely coded terms, Nixon was promoting the rhetoric of racism.

In her book, Alexander quotes Nixon’s special counsel John Ehrlichman explicitly summing up the campaign strategy in his 1982 memoir, Witness To Power: The Nixon Years: “We’ll go after the racists.” Ehrlichman unabashedly wrote how throughout the 1968 race, “subliminal appeal to the anti-black voter was always present in Nixon’s statements and speeches.” 

Alexander did not mention, however, another quote attributed to Ehrlichman in which he just as explicitly made the connection between this subliminal racism and the anti-drug drumbeat. Journalist Dan Baum in the April 2016 edition of Harper’s recalls a quote he says he got from a 1994 interview with Ehrlichman: “The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar left and black people… by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.”

And the backlash was just beginning.

Birth of the New Jim Crow 

The new order would be consolidated over the next decade. In 1973, the same year the federal Drug Enforcement Administration was created, New York state’s Rockefeller Laws imposed the nation’s first mandatory minimum sentences for drug offenses. In 1977, New York decriminalized cannabis, overturning the harsh Rockefeller Laws where personal quantities of marijuana were concerned — but the draconian provisions for cocaine and heroin remained intact.

With the election of Ronald Reagan in 1980, the “drug war” rhetoric was revived with a vengeance, and the Anti-Drug Abuse Act of 1986 imposed mandatory minimum sentences nationwide. Ten years later, an ACLU report would find that the law “devastated African American and low-income communities.” 

The 1986 law also instated the sentencing disparity between crack and powder cocaine — just as crack was flooding black communities and saddling people with far longer sentences. This was also reflected in public perceptions and media portrayals. In the early ’80s, powder cocaine was a status symbol for white yuppies. When crack hit the streets from New York to Los Angeles, it was immediately stigmatized by association with the criminal (read: black) underclass.

This period also saw the rapid militarization of police forces, and the War on Drugs, in Alexander’s words, went “from being a political slogan to an actual war.” The 1981 Military Cooperation with Law Enforcement Act started to erode the firewall that had existed between the armed forces and police since the end of Reconstruction.

The DEA joined with local police forces to instate Operation Pipeline, a program of traffic stops and vehicle searches that was protested by the ACLU as based on systematic “racial profiling.” 

This was enabled by a series of bad Supreme Court decisions — Terry v. Ohio in 1968, Florida v. Bostick in 1991, Ohio v. Robinette in 1996 — that dramatically eroded the Fourth Amendment. Alexander writes that these decisions enabled “consent searches” — in which the motorist (or pedestrian, or home resident) verbally consents to the search, but actually does so under police intimidation.

All-white juries were more likely to convict black people, of course — and prosecutors were still able to strike non-whites from serving as jurors despite the 1986 Supreme Court decision Batson v. Kentucky, which barred discrimination on the basis of race in jury selection. As Alexander writes, “the only thing that has changed is that prosecutors must come up with a race-neutral excuse for the strikes.”

In a vicious cycle, mass incarceration itself served to entrench the system of mass incarceration. Convicted felons are excluded from juries in many states, and only Maine and Vermont allow prison inmates to vote (as most Western European countries do).

Nor did this system turn around when the Democrats returned to the White House. The Bill Clinton years saw a 60% drop in federal spending on public housing and a 170% boost in prison spending, to $19 billion. Prison construction would finally begin leveling off in the 2000s, but the actual prison population broke new records in 2008, “with no end in sight.”

Alexander writes: “Ninety percent of those admitted to prison for drug offenses in many states were black or Latino, yet the mass incarceration of communities of color was explained in race-neutral terms, an adaptation to the needs and demands of the current political climate. The New Jim Crow was born.” 

And this was utterly out of proportion to any real threat posed by illegal drugs. In the 1980s, there were some 22,000 drunk driving deaths per year, among 100,000 alcohol-related deaths. In Alexander’s words: “The number of deaths related to all illegal drugs combined was tiny compared to the number of deaths caused by drunk driving.”

Among the numberless stories of police terror in the name of drug enforcement, one recounted by Alexander is that of Alberta Spruill — a 57-year-old Harlem woman who died of a heart attack in May 2003 after police officers broke down her door and threw a concussion grenade into her apartment. No drugs or any contraband were found in the apartment. The cops were acting on a bad tip from snitches snared on a marijuana rap.

A Fourth Revolution? 

Thanks in large part to growing public consciousness, there certainly appears to have been some progress in the fight against the War on Drugs over the past decade. In 2009, following a hard-fought activist campaign, the Rockefeller Laws were finally overturned in New York. Eleven states have now legalized cannabis, and nearly all have at least some kind of provision for medical use of cannabis — significantly lifting the pressure on one federally controlled substance.

But even amid the progress, there are clear and frustrating signs that a mere change in the law isn’t enough. From New York City (where cannabis arrests have been de-emphasized by policy) to Colorado (where cannabis is now legal), overall arrests for pot are significantly reduced — but the stark racial disparity persists in those arrests that continue under various loopholes.

Michelle Alexander concludes with a litany of necessary legal reforms and then states that, ultimately, they are insufficient: “Mandatory drug sentencing laws must be rescinded. Marijuana ought to be legalized (and perhaps other drugs as well)… The list could go on, of course, but the point has been made. The central question for racial justice advocates is this: are we serious about ending the system of control, or not?” 

She quotes from Martin Luther King’s book of collected speeches, A Testament of Hope: “White America must recognize that justice for black people cannot be achieved without radical changes in the structure of our society. The comfortable, the entrenched, the privileged cannot continue to tremble at the prospect of change in the status quo.”

There are many other quotes from the great civil rights leader that shed equal light on the current impasse, in which the limitations of mere legal progress are becoming clear. In his April 1963 Letter from Birmingham Jail, King justified his civil disobedience in these words: “An unjust law is a code that a numerical or power majority group compels a minority group to obey but does not make binding on itself.”

This recalls both the relative impunity for white coke-snorters in the ’80s as black communities were militarized in the name of drug enforcement — and the white entrepreneurs now disproportionately getting rich off legal cannabis, while black users remain disproportionately criminalized.  

In Why We Can’t Wait, King wrote of how the country needed a “Bill of Rights for the Disadvantaged” — anticipating the current demands for drug war reparations, wedding legal cannabis to addressing the harms caused by prohibition and the related matrix of social injustice.

The notion that cannabis legalization is necessary but not sufficient recalls King’s 1967 report to the staff of the Southern Christian Leadership Conference, the main coordinating body of the civil rights campaign. 

In the “Report to SCLC Staff,” he noted how the 1965 Selma to Montgomery March culminated in passage of the Voting Rights Act later that year — a critical victory. Yet, he wrote: “We have moved from the era of civil rights to the era of human rights, an era where we are called upon to raise certain basic questions about the whole society. We have been in a reform movement… But after Selma and the voting rights bill, we moved into a new era, which must be the era of revolution. We must recognize that we can’t solve our problem now until there is a radical redistribution of economic and political power.”

If cannabis legalization is to truly undo the social harms of prohibition, its advocates may be in for a similar reckoning in the coming period.

How Accurate is a 125 Year Old Resistance Standard?

Internals of the 1900 Evershed & Vignoles Ltd 1 ohm resistance standard. (Credit: Three-phase, YouTube)

Resistance standards are incredibly useful, but like so many precision references, they require regular calibration, maintenance, and certification to ensure that they stay within their datasheet tolerances. This raises the question of how well a resistance standard from the year 1900 performs after 125 years, without the benefits of modern engineering and standards. Cue the [Three-phase] YouTube channel testing a genuine Evershed & Vignoles Ltd one ohm resistance standard from 1900.

With mahogany construction and brass contacts it sure looks stylish, though the unit was missing the shorting pin that goes in between the two sides. This was a common feature of resistance decade boxes of the era, for example, where you inserted pins to connect resistors until you hit the desired total. Inside the one ohm standard is a platinoid resistor, which is an alloy of copper, nickel, tungsten, and zinc. Based on the broad arrow mark on the bottom, this unit was apparently owned by the UK’s Ordnance Board, which was part of what was then called the War Office.

After a quick gander at the internals, the standard was hooked up to a Keithley DMM7510 digital bench meter. The resistance standard’s ‘datasheet’ is listed on the brass plaques on top of the unit, including the effect of temperature on its accuracy. Adjusting for this, the measured ~1.016 Ω was within 1.6% of the nominal value — with the side note that the unit had not been cleaned or otherwise maintained since it was last used in service. Definitely not a bad feat.
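
As a back-of-the-envelope check of that correction — with placeholder numbers, since the actual plaque values aren’t reproduced here — the arithmetic looks something like this:

```python
# Placeholder values -- NOT the figures from the actual brass plaques
R_nominal = 1.000            # ohms at the reference temperature
alpha = 2e-5                 # per deg C, a plausible platinoid coefficient
t_ref, t_bench = 15.5, 21.0  # deg C, assumed temperatures

R_expected = R_nominal * (1 + alpha * (t_bench - t_ref))
R_measured = 1.016

deviation = (R_measured - R_expected) / R_expected * 100
print(f"expected {R_expected:.5f} ohm, measured {R_measured} ohm: "
      f"{deviation:+.2f}% off")
```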

A British redcoat’s lost memoir resurfaces

History buffs are no doubt familiar with the story of Shadrack Byfield, a rank-and-file British redcoat who fought during the War of 1812 and lost his left arm to a musket ball for his trouble. Byfield has been featured in numerous popular histories—including a children's book and a 2011 PBS documentary—as a shining example of a disabled soldier's stoic perseverance. But a newly rediscovered memoir that Byfield published in his later years is complicating that idealized picture of his post-military life, according to a new paper published in the Journal of British Studies.

Historian Eamonn O'Keeffe of Memorial University of Newfoundland in St. John's, Canada, has been a Byfield fan ever since he read the 1985 children's novel, Redcoat, by Gregory Sass. His interest grew when he was working at Fort York, a War of 1812-era fort and museum in Toronto. "There are dozens of memoirs written by British rank-and-file veterans of the Napoleonic Wars, but only a handful from the War of 1812, which was much smaller in scale," O'Keeffe told Ars. "Byfield's autobiography seemed to offer an authentic, ground-level view of the fighting in North America, helping us look beyond the generals and politicians and grapple with the implications of this conflict for ordinary people."

Byfield was born in 1789 on the outskirts of Bradford-on-Avon in Wiltshire, and his parents intended him to follow in his weaver father's footsteps. Instead, he enlisted in the county militia when he turned 18, joining the regular army the following year. When the War of 1812 broke out, Byfield was stationed at Fort George along the Niagara River, and he participated in the successful siege of Fort Detroit. At the Battle of Frenchtown in January 1813, he was shot in the neck, but he recovered sufficiently to join the campaigns against Fort Meigs and Fort Stephenson in Ohio.

Clone Wars: IBM Edition

If you search the Internet for “Clone Wars,” you’ll get a lot of Star Wars-related pages. But the original Clone Wars took place a long time ago in a galaxy much nearer to ours, and it has a lot to do with the computer you are probably using right now to read this. (Well, unless it is a Mac, something ARM-based, or an old retro-rig. I did say probably!)

IBM is a name that, for many years, was synonymous with computers, especially big mainframe computers. However, it didn’t start out that way. IBM originally made mechanical calculators and tabulating machines. That changed in 1952 with the IBM 701, IBM’s first computer that you’d recognize as a computer.

If you weren’t there, it is hard to understand how IBM dominated the computer market in the 1960s and 1970s. Sure, there were others like Univac, Honeywell, and Burroughs. But especially in the United States, IBM was the biggest fish in the pond. At one point, the computer market’s estimated worth was a bit more than $11 billion, and IBM’s five biggest competitors accounted for about $2 billion, with almost all of the rest going to IBM.

So it was somewhat surprising that IBM didn’t roll out the personal computer first, or at least very early. Even companies that made “small” computers for the day, like Digital Equipment Corporation or Data General, weren’t really expecting the truly personal computer. That push came from companies no one had heard of at the time, like MITS, SWTP, IMSAI, and Commodore.

The IBM PC

The story — and this is another story — goes that IBM spun up a team to make the IBM PC, expecting it to sell very little and use up some old keyboards previously earmarked for a failed word processor project. Instead, when the IBM PC showed up in 1981, it was a surprise hit. By 1983, there was the “XT” which was a PC with some extras, including a hard drive. In 1984, the “AT” showed up with a (gasp!) 16-bit 80286.

The personal computer market had been healthy but small. Now the PC was selling huge volumes, perhaps thanks to commercials like the one below, and decimating other companies in the market. Naturally, others wanted a piece of the pie.

Send in the Clones

Anyone could make a PC-like computer, because IBM had used off-the-shelf parts for nearly everything. There were two things that really set the PC/XT/AT family apart. First, there was a bus for plugging in cards with video outputs, serial ports, memory, and other peripherals. You could start a fine business just making add-on cards, and IBM gave you all the details. This wasn’t unlike the S-100 bus created by the Altair, but the volume of PC-class machines far outstripped the S-100 market very quickly.

In reality, there were two buses. The PC/XT had an 8-bit bus, later named the ISA bus. The AT added an extra connector for the extra bits. You could plug an 8-bit card into part of a 16-bit slot. You probably couldn’t plug a 16-bit card into an 8-bit slot, though, unless it was made to work that way.

The other thing you needed to create a working PC was the BIOS — a ROM chip that handled starting the system with all the I/O devices set up and loading an operating system: MS-DOS, CP/M-86, or, later, OS/2.

Protection

An ad for a Columbia PC clone.

IBM didn’t think the PC would amount to much, so they didn’t do anything to hide or protect the bus, in contrast to Apple, which had patents on key parts of its computer. They did, however, have a copyright on the BIOS. In theory, creating a clone IBM PC would require the design of an Intel-CPU motherboard with memory and I/O devices at the right addresses, a compatible bus, and a compatible BIOS chip.

But IBM gave the world enough documentation to write software for the machine and to make plug-in cards. So, figuring out the other side of it wasn’t particularly difficult. Probably the first clone maker was Columbia Data Products in 1982, although they were perceived to have compatibility and quality issues. (They are still around as a software company.)

Eagle Computer was another early player that originally made CP/M computers. Their computers were not exact clones, but they were the first to use a true 16-bit CPU and the first to have hard drives. There were some compatibility issues with Eagle versus a “true” PC. You can hear their unusual story in the video below.

The PC Reference manual had schematics and helpfully commented BIOS source code

One of the first companies to find real success cloning the PC was Compaq Computers, formed by some former Texas Instruments employees who were, at first, going to open Mexican restaurants, but decided computers would be better. Unlike some future clone makers, Compaq was dedicated to building better computers, not cheaper.

Compaq’s first entry into the market was a “luggable” (think of a laptop with a real CRT in a suitcase that only ran when plugged into the wall; see the video below). They reportedly spent $1,000,000 to duplicate the IBM BIOS without peeking inside (which would have caused legal problems). However, it is possible that some clone makers simply copied the IBM BIOS directly or indirectly. This was particularly easy because IBM included the BIOS source code in an appendix of the PC’s technical reference manual.

Between 1982 and 1983, Compaq, Columbia Data Products, Eagle Computer, Leading Edge, and Kaypro all threw their hats into the ring. Part of what made this sustainable over the long term was Phoenix Technologies.

Rise of the Phoenix

Phoenix was a software producer that realized the value of having a non-IBM BIOS. They put together a team to study the BIOS using only public documentation. They produced a specification and handed it to another programmer. That programmer then produced a “clean room” piece of code that did the same things as the BIOS.

An Eagle ad from 1983

This was important because, inevitably, IBM sued Phoenix — but lost, as Phoenix was able to provide credible documentation that it hadn’t copied IBM’s code. Phoenix was ready to license its BIOS in 1984, and companies like Hewlett-Packard, Tandy, and AT&T were happy to pay the $290,000 license fee. That fee also included insurance from The Hartford to indemnify against any copyright-infringement lawsuits.

Clones were attractive because they were often far cheaper than a “real” PC. They would also often feature innovations. For example, almost all clones had a “turbo” mode to increase the clock speed a little. Many included ports or other features as standard that IBM PC buyers had to pay extra for (and that consumed card slots). Compaq, Columbia, and Kaypro made luggable PCs. In addition, supply didn’t always match demand. Dealers often could sell more PCs than they could get in stock, and the clones offered them a way to close more business.

Issues

Not all clone makers got everything right. It wasn’t odd for a clone to handle interrupts differently than an IBM machine, or to use different timers. Another favorite place to err involved AT/PC compatibility.

In a base-model IBM PC, the address bus only went from A0 to A19. So if you hit address (hex) FFFFF+1, it would wrap around to 00000. Memory being at a premium, apparently, some programs depended on that behavior.

With the AT, there were more address lines. Rather than breaking backward compatibility, those machines have an “A20 gate.” By default, the A20 line is disabled; you must enable it to use it. However, there were several variations in how that worked.
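
A quick sketch of the addressing math (just the arithmetic, not any particular gate implementation):

```python
def physical_address(segment: int, offset: int, a20_enabled: bool) -> int:
    """Real-mode 8086 address arithmetic: segment * 16 + offset."""
    addr = (segment << 4) + offset
    if not a20_enabled:
        addr &= 0xFFFFF  # only address lines A0..A19 exist, so wrap at 1 MB
    return addr

# FFFF:0010 is one byte past the top of the 20-bit address space:
print(hex(physical_address(0xFFFF, 0x0010, a20_enabled=False)))  # 0x0 (wraps)
print(hex(physical_address(0xFFFF, 0x0010, a20_enabled=True)))   # 0x100000
```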

Intel, for example, had the InBoard/386 that let you plug a 386 into a PC or AT to upgrade it. However, the InBoard A20 gating differed from that of a real AT. Most people never noticed. Software that used the BIOS still worked because the InBoard’s BIOS knew the correct procedure. Most software didn’t care either way. But there was always that one program that would need a fix.

The original AT used some extra logic in the keyboard controller to handle the gate. When CPUs started using cache, the A20 gating was moved into the CPU for many generations. However, around 2013, most CPUs finally gave up on gating A20.

The point is that there were many subtle features on a real IBM computer, and the clone makers didn’t always get it right. If you read ads from those days, they often tout how compatible they are.

Total War!

IBM started a series of legal battles against… well… everybody: Compaq, Corona Data Systems, Handwell, Phoenix, AMD, and anyone who managed to put anything on the market that competed with “Big Blue” (one of IBM’s nicknames).

IBM didn’t win anything significant, although most companies settled out of court. Then they just used the Phoenix BIOS, which was provably “clean.” So IBM decided to take a different approach.

In 1987, IBM decided they should have paid more attention to the PC design, so they redid it as the PS/2. IBM spent a lot of money telling people how much better the PS/2 was. They had really thought about it this time. So scrap those awful PCs and buy a PS/2 instead.

Of course, the PS/2 wasn’t compatible with anything. It was made to run OS/2. It used the MCA bus, which was incompatible with the ISA bus, and didn’t have many cards available. All of it, of course, was expensive. This time, clone makers had to pay a license fee to IBM to use the new bus, so no more cheap cards, either.

You probably don’t need a business degree to predict how that turned out. The market yawned and continued buying PC “clones” which were now the only game in town if you wanted a PC/XT/AT-style machine, especially since Compaq beat IBM to market with an 80386 PC by about a year.

Not all software was compatible with all clones. But most software would run on anything and, as clones got more prevalent, software got smarter about what to expect. At about the same time, people were thinking more about buying applications and less about the computer they ran on, a trend that had started even earlier, but was continuing to grow. Ordinary people didn’t care what was in the computer as long as it ran their spreadsheet, or accounting program, or whatever it was they were using.

Dozens of companies made something that resembled a PC, including big names like Olivetti, Zenith, Hewlett-Packard, Texas Instruments, Digital Equipment Corporation, and Tandy. Then there were the companies you might remember for other reasons, like Sanyo or TeleVideo. There were also many that simply came and went with little name recognition. Michael Dell started PC Limited in 1984 in his college dorm room, and by 1985, he was selling an $800 turbo PC. A few years later, the name changed to Dell, and now it is a giant in the industry.

Looking Back

It is interesting to play “what if” with this time in history. If IBM had not opened their architecture, they might have made more money. Or, they might have sold 1,000 PCs and lost interest. Then we’d all be using something different. Microsoft retaining the right to sell MS-DOS to other people was also a key enabler.

IBM stayed in the laptop business (ThinkPad) until they sold it to Lenovo in 2005. They would also sell Lenovo their x86 server business in 2014.

Things have changed, of course. There hasn’t been an ISA card slot on a motherboard in ages. Boot processes are more complex, and there are many BIOS options. Don’t even get us started on EMS and XMS. But at the core, your PC-compatible computer still wakes up and follows the same steps as an old-school PC to get started. Like the Ship of Theseus, is it still an “IBM-compatible PC”? If it matters, we think the answer is yes.

If you want to relive those days, we recently saw some new machines sporting 8088s and 80386s. Or, there’s always emulation.

The Wild West of Web3: A New Frontier

The good ol’ times are back, in a different way

If you thought the dot-com boom was chaotic two decades ago, welcome to Web3, where fortunes are minted overnight, rug pulls happen before lunch, and the rule of law is still being written in real-time.

Just like the American frontier of the late 1800s, the decentralized web promises boundless opportunity and radical freedom. It’s still pretty much a land where anyone with an Internet connection can stake their claim, build their empire, or lose everything to digital bandits. There are not enough sheriffs here — but more than enough griefers, anonymous founders, and communities trying to self-govern in an ecosystem that moves faster than regulators can comprehend.

However, amid the scams and speculation, something genuinely transformative is taking shape: decentralized finance is reimagining banking, NFTs are redefining ownership, and DAOs are experimenting with new forms of organization. Web3 is lawless at this point, raising the question of whether this frontier can be brought into order without stripping away the very freedom that makes it transformative.

As we step into 2026, it’s time for a clear-eyed reassessment and a high-level view of what this space has truly become over the past few years. So saddle up as we’re heading into territory where the only certainty is uncertainty, and the only rule is that the rules haven’t been written yet.

The Size of the Frontier

To understand just how wild this territory has become, look at the numbers. The global Web3 market was valued at approximately $4.62 billion in 2025 and is projected to reach almost $100 billion by 2034, representing a compound annual growth rate of over 41%. This is an ecosystem that has grown from virtually nothing to housing over 17,000 companies and 460,000 professionals worldwide.
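
Those two figures are at least self-consistent, as a quick sanity check shows:

```python
# $4.62B in 2025, compounding at 41% per year for the nine years to 2034
start, cagr, years = 4.62e9, 0.41, 2034 - 2025
projected = start * (1 + cagr) ** years
print(f"${projected / 1e9:.0f} billion")  # ~ $102 billion
```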

The infrastructure underlying this digital Wild West has exploded. Total Value Locked (TVL) in DeFi protocols has grown massively: recent data shows Ethereum hosting over $68.6 billion in TVL, while total DeFi across all chains has consolidated around $182 billion, demonstrating the massive influx of capital into these experimental financial systems.

Despite the dangers, or perhaps because of them, decentralized finance has emerged as one of the most compelling experiments in the Web3 realm, since it represents a complete reimagining of financial services — lending, borrowing, trading, and earning interest — without traditional intermediaries like banks.

The growth has been staggering: by mid-2025, over 14.2 million unique wallets had interacted with DeFi protocols, and DeFi lending protocols held over $51 billion in outstanding loans.

The institutional adoption that many predicted is finally materializing. Coinbase captured $2.03 billion in institutional revenue in Q1 2025, while traditional financial giants like JPMorgan have launched blockchain platforms for tokenized settlements. Even governments are getting involved — California’s DMV digitized 42 million car titles on Avalanche, demonstrating real-world utility beyond speculation.

Yet for every success story, there’s a cautionary tale. In the NFT space alone, total sales volume for 2024 reached $8.8 billion, but this represents a steep decline from the $15.7 billion recorded in 2021, a stark reminder that boom times don’t last forever on the frontier.

Bandits, Outlaws, and Rug Pulls

The lawlessness of Web3 isn’t just metaphorical. According to the FBI’s Internet Crime Complaint Center, Americans alone lost approximately $9.3 billion to cryptocurrency fraud in 2024, marking a 66% increase from the previous year. And that doesn’t even account for global losses or unreported incidents.

The numbers paint a sobering picture of the risks. The FBI received more than 140,000 complaints referencing cryptocurrency in 2024, with investment scams leading to $5.8 billion in losses alone. Individuals over the age of 60 were hit hardest, accounting for $2.8 billion in losses across 33,000 complaints.

The broader fraud landscape is even more alarming as the Federal Trade Commission reported that consumers lost $12.5 billion to fraud in 2024, with investment scams accounting for $5.7 billion — a sharp 24% increase over 2023. Cryptocurrency scams specifically resulted in $1.4 billion in reported losses through the FTC.

Rug pulls (where developers abandon a project and vanish with investor funds) have become the Wild West equivalent of train robberies. These scams are becoming faster and more sophisticated, often occurring on decentralized exchanges like Uniswap and PancakeSwap, where oversight is minimal.

Celebrity endorsements have amplified the damage. High-profile cases in 2024 included social media personalities launching tokens that soared to hundreds of millions in market cap before crashing within hours, leaving retail investors with devastating losses.

The Next Chapter of the Frontier

What does the future hold for Web3? The market projections suggest continued explosive growth. For example, in Q3 of 2024, Web3 startups raised $2 billion in over 300 deals, with major venture firms like Andreessen Horowitz continuing to deploy capital. Andreessen Horowitz has invested close to $1.2 billion in 30 Web3 companies, signaling sustained institutional confidence.

The technology is maturing rapidly. Layer-2 scaling solutions have cut gas fees by up to 90%, making blockchain interactions affordable for everyday users. Zero-knowledge proofs are unlocking privacy-preserving applications, while improved user interfaces are lowering barriers to entry.

Real-world integration is accelerating. Major brands are incorporating NFTs into loyalty programs, governments are exploring blockchain for public records, and financial institutions are tokenizing traditional assets. By 2030, Web3 marketing spending may exceed $300 billion, representing a fundamental shift in how digital economies operate.

Taming the Wild West

The Wild West analogy for Web3 is apt, but it’s worth remembering how America’s actual frontier evolved. The lawlessness gave way to functioning societies — not through heavy-handed control from distant authorities, but through a gradual process of community building, norm establishment, and selective regulation.

Web3 appears to be following a similar path these days. The scams and speculation haven’t disappeared, but they’re increasingly met with sophisticated security tools, informed communities, and clearer legal frameworks. At this point, the Web3 security market is growing at 90+% annually, with over 200 companies focused on blockchain security.

The frontier mentality that made Web3 exciting — permissionless innovation, global accessibility, the challenge to entrenched power — doesn’t have to disappear for the space to mature. But maturity requires acknowledging that with great freedom comes great responsibility, and that some rules might be necessary to protect the vulnerable without stifling the bold.

The Wild West of Web3 isn’t becoming civilized in the traditional sense. Instead, it’s developing its own unique form of order, one that blends code and community, incentives and institutions, freedom and accountability. Whether this experiment succeeds will determine not just the future of blockchain technology, but potentially the future of how we organize economic activity in the digital age.

The frontier remains open. The question is whether you’re willing to take the risk.


NASA topples towers used to test Saturn rockets, space shuttle

Two historic NASA test facilities used in the development of the Saturn V and space shuttle launch vehicles have been demolished after towering over the Marshall Space Flight Center in Alabama since the start of the Space Age.

The Propulsion and Structural Test Facility, which was erected in 1957—the same year the first artificial satellite entered Earth orbit—and the Dynamic Test Facility, which has stood since 1964, were brought down by a coordinated series of implosions on Saturday, January 10. Located in Marshall's East Test Area on the US Army's Redstone Arsenal in Huntsville, the two structures were no longer in use and, according to NASA, had a backlog of $25 million in needed repairs.

"This work reflects smart stewardship of taxpayer resources," Jared Isaacman, NASA administrator, said in a statement. "Clearing outdated infrastructure allows NASA to safely modernize, streamline operations and fully leverage the infrastructure investments signed into law by President Trump to keep Marshall positioned at the forefront of aerospace innovation."

The Time Clock Has Stood the Test of Time

No matter the item on my list of childhood occupational dreams, one constant ran throughout: I saw myself using an old-fashioned punch clock with the longish time cards and everything. I now realize that I have some trouble with the daily transitions of life. In my childish wisdom, I somehow knew that doing this one thing would be enough to signify the beginning and end of work for the day, effectively putting me in the mood, and then pulling me back out of it.

But that day never came. Well, it sort of did this year. I realized a slightly newer dream of working at a thrift store, and they use something that I feel like I see everywhere now that I’ve left the place — a system called UKG that uses mag-stripe cards to handle punches. No, it was not the same as a real punch clock, not that I have experience with one. And now I just want to use one even more, to track my Hackaday work and other projects. At the moment, I’m torn between wanting to make one that uses mag-stripe cards or something, and just buying an old punch clock from eBay.

I keep calling it a ‘punch clock’, but it has a proper name, and that is the Bundy clock. I soon began to wonder how these things could both keep exact time mechanically and create a literal inked stamp of said time and date. I pictured a giant date stamper — not giant in all proportions, but generally larger than your average handheld one because of all the mechanisms that surely must be inside the Bundy clock. So, how do these things work? Let’s find out.

Bundy’s Wonder

Since the dawn of train transportation and the resulting surge of organized work during the industrial revolution, employers have had a need to track employees’ time. But it wasn’t until the late 1880s that timekeeping would become so automatic.

An early example of a Bundy clock that used cards, made by National Time Recorder Co. Ltd. Public domain via Wikipedia

Willard Le Grand Bundy was a jeweler in Auburn, New York who invented a timekeeping clock in 1888. A few years later, Willard and his brother Harlow formed a company to mass-produce the clocks.

By the early 20th century, Bundy clocks were in use all over the world to monitor attendance. The Bundy Manufacturing Company grew and grew, and through a series of mergers, became part of what would become IBM. They sold the time-keeping business to Simplex in 1958.

Looking at Willard Le Grand Bundy’s original clock, which appears to be a few feet tall and demonstrates the inner workings quite beautifully through a series of glass panels, it’s no wonder that it is capable of time-stamping magic.

Part of that magic is evident in the video below. Workers file by the (more modern) time clock and operate as if on autopilot, grabbing their card from one set of pockets, inserting it willy-nilly into the machine, and then tucking it in safely on the other side until lunch. This is the part that fascinates me the most — the willy-nilly insertion part. How on Earth does the clock handle this? Let’s take a look.

Okay, first of all, you probably noticed that the video doesn’t mention Willard Le Grand Bundy at all, just some guy named Daniel M. Cooper. So what gives? Well, they both invented time-recording machines, and just a few years apart.

The main difference is that Bundy’s clock wasn’t designed around cards, but around keys. Employees carried around a metal key with a number stamped on it. When it was time to clock in or out, they inserted the key, and the machine stamped the time and the key number on a paper roll. Cooper’s machine was designed around cards, which I’ll discuss next. Although the operation of Bundy’s machine fell out of fashion, the name stuck, and Bundy clocks evolved slightly to use cards.

Plotting Time

You would maybe think of time cards as important to the scheme, but a bit of an afterthought compared with the clock itself. That’s not at all the case with Cooper’s “Bundy”. It was designed around the card, which is a fixed size and has rows and columns corresponding to days of the week, with room for four punches per day.

An image from Bundy’s patent via Google Patents

Essentially, the card is mechanically indexed inside the machine. When the card is inserted in the top slot, it gets pulled straight down by gravity, and goes until it hits a fixed metal stop that defines vertical zero. No matter how haphazardly you insert the card, the Bundy clock takes care of things. Inside the slot are narrow guides that align the card and eliminate drift. Now the card is essentially locked inside a coordinate system.

So, how does it find the correct row on the card? You might think that the card moves vertically, but it’s actually the punching mechanism itself that moves up and down on a rack-and-pinion system. This movement is driven by the timekeeping gears of the clock itself, which plot the times in the correct places as though the card were a piece of graph paper.

In essence, the time of day determined the punch location on the card, which wasn’t a punch in the hole punch sense, but a two-tone ink stamp from a type of bi-color ribbon you can still get online.

There’s a date wheel that selects the row for the given day, and a time cam to select the column. The early time clocks didn’t punch automatically — the worker had to pull a lever. When they did so, the mechanism would lock onto the current time, and the clock would fire a single punch at the card at the given coordinates.
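
In software terms, the clockwork is doing something like the following — a purely illustrative Python model, with the row/column layout and punch slots invented rather than taken from any real card:

```python
from datetime import datetime

# Hypothetical card layout: one row per weekday, one column per punch slot
SLOTS = ("morning in", "noon out", "noon in", "evening out")

def punch_position(now: datetime) -> tuple[int, int]:
    """Map the current date and time to (row, column) on the card."""
    row = now.weekday()                       # 0 = Monday ... 6 = Sunday
    col = min(now.hour // 6, len(SLOTS) - 1)  # crude time-of-day bucketing
    return row, col

row, col = punch_position(datetime(2026, 1, 23, 12, 58))
print(f"stamp lands at row {row}, column {col} ({SLOTS[col]})")
```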

Modern Time

Image via Comp-U-Charge

By mid-century, time clocks had become somewhat simpler. No longer did the machine do the plotting for you. Instead, you inserted the card sideways, in the front, and used an indicator to get the punch in the right spot. It’s not hard to imagine why these gave way to more modern methods like fingerprint readers, or in my case, mag-stripe cards.

This is the type of time clock I intend to buy for myself, though I’m having trouble deciding between the manual model where you get to push a large button like this one, and the automatic version. I’d still like to build a time clock, too, for all the finesse and detail it could have by comparison. So honestly, I’ll probably end up doing both. Perhaps you’ll read about it on these pages one day.

The Rise and Fall of the In-Car Fax Machine

By: Lewin Day

Once upon a time, a car phone was a great way to signal to the world that you were better than everybody else. It was a clear sign that you had money to burn, and implied that other people might actually consider it valuable to talk to you from time to time.

There was, however, a way to look even more important than the boastful car phone user. You just had to rock up to the parking lot with your very own in-car fax machine.

Dial It Up

Today, the fax machine is an arcane thing only popular in backwards doctor’s offices and much of Japan. We rely on email for sending documents from person A to person B, or fill out forms via dedicated online submission systems that put our details directly into the necessary databases automatically. The idea of printing out a document, feeding it into a fax machine, and then having it replicated as a paper version at some remote location? It’s positively anachronistic, and far more work than simply using modern digital methods instead.

In 1990, Mercedes-Benz offered a fully-stocked mobile office in the S-Class. You got a phone, fax, and computer, all ready to be deployed from the back seat. Credit: Mercedes-Benz

Back in the early 90s though, the communications landscape looked very different. If you had a company executive out on the road, the one way you might reach them would be via their cell or car phone. That was all well and good if you wanted to talk, but if you needed some documents looked over or signed, you were out of luck.

Even if your company had jumped on the e-mail bandwagon, they weren’t going to be able to get online from a random truck stop carpark for another 20 years or so. Unless… they had a fax in the car! Then, you could simply send them a document via the regular old cellular phone network, their in-car fax would spit it out, and they could go over it and get it back to you as needed.

Of course, such a communications setup was considered pretty high end, with a price tag to match. You could get car phones on a wide range of models from the 1980s onwards, but faxes came along a little later, and were reserved for the very top-of-the-line machines.

Mercedes-Benz was one of the first automakers to offer a remote fax option in 1990, but you needed to be able to afford an S-Class to get it. With that said, you got quite the setup if you invested in the Büro-Kommunikationssystem package. It worked via Germany’s C-Netz analog cellular system, and combined both a car phone and an AEG Roadfax fax machine. The phone was installed in the backrest of one of the front seats, while the fax sat in the fold-down armrest in the rear. The assumption was that if you were important enough to have a fax in the car, you were also important enough to have someone else driving for you. You also got an AEG Olyport 40/20 laptop integrated into the back of the front seats, and it could even print to the fax machine or send data via the C-Netz connection.

BMW would go on to offer faxes in high-end 7 Series and limousine models. Credit: BMW

Not to be left out, BMW would also offer fax machines on certain premium 7 Series and L7 limousine models, though availability was very market-dependent. Some would stash a fax machine in the glove box, others would integrate it into the back rest of one of the front seats. Toyota was also keen to offer such facilities in its high-end models for the Japanese market. In the mid-90s, you could purchase a Toyota Celsior or Century with a fax machine secreted in the glove box. It even came with Toyota branding!

Ultimately, the in-car fax would be a relatively short-lived option in the luxury vehicle space, for several reasons. For one thing, it only became practical to offer an in-car fax in the mid-80s, when cellular networks started rolling out across major cities around the world.

By the mid-2000s, digital cell networks were taking over, and by the end of that decade, mobile internet access was trivial. It would thus become far more practical to use e-mail rather than a paper-based fax machine jammed into a car. Beyond the march of technology, the in-car fax was never going to be a particularly common selection on the options list. Only a handful of people ever really had a real need to fax documents on the go. Compared to the car phone, which was widely useful to almost anyone, it had a much smaller install base. Fax options were never widely taken up by the market, and had all but disappeared by 2010.

The Toyota Celsior offered a nice healthy-sized fax machine in the 1990s, but it did take up the entire glove box.

These days, you could easily recreate a car-based fax-type experience. All you’d need would be a small printer and scanner, ideally combined into a single device, and a single-board computer with a cellular data connection. This would allow you to send and receive paper documents to just about anyone with an Internet connection. However, we’ve never seen such a build in the wild, because the world simply doesn’t run on paper anymore. The in-car fax was thus a technological curio, destined only to survive for maybe a decade or so in which it had any real utility whatsoever. Such is life!
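
For the curious, here’s roughly what the receive side of such a build might look like — a hedged sketch with a placeholder mail server and credentials, assuming a CUPS-managed printer and leaving out the scanner/outbound half entirely:

```python
import email
import imaplib
import subprocess
import tempfile

# Placeholders -- substitute your own mail account and cellular uplink
HOST, USER, PASSWORD = "imap.example.com", "car-fax@example.com", "secret"

def print_new_faxes() -> None:
    """Pull unseen PDF attachments and send them to the default printer."""
    with imaplib.IMAP4_SSL(HOST) as inbox:
        inbox.login(USER, PASSWORD)
        inbox.select("INBOX")
        _, data = inbox.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = inbox.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            for part in msg.walk():
                if part.get_content_type() == "application/pdf":
                    with tempfile.NamedTemporaryFile(suffix=".pdf",
                                                     delete=False) as f:
                        f.write(part.get_payload(decode=True))
                    subprocess.run(["lp", f.name], check=True)  # CUPS print

if __name__ == "__main__":
    print_new_faxes()
```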

NASA Marshall Prepares for Demolition of Historic Test, Simulation Facilities

Engineers and technicians hoist the first flight version of the Saturn IB rocket’s first stage into the T-tower for static testing at NASA’s Marshall Space Flight Center in Huntsville, Alabama, on March 15, 1965. (Credit: NASA)

NASA is preparing for the demolition of three iconic structures at the agency’s Marshall Space Flight Center in Huntsville, Alabama.

Crews began demolition in mid-December at the Neutral Buoyancy Simulator, a facility built in the late 1960s that once enabled NASA astronauts and researchers to experience near-weightlessness. The facility was also used to conduct underwater testing of space hardware and practice runs for servicing the Hubble Space Telescope. The simulator was closed in 1997.

Two test stands – the Propulsion and Structural Test Facility and Dynamic Test Facility – are also slated for demolition, one after the other, by carefully coordinated implosion no earlier than sunrise on Jan. 10, 2026.

NASA Marshall test-fires the first stage of the Saturn I rocket at its historic Propulsion and Structural Test Facility, better known as the “T-tower.”

The demolition of these historic structures is part of a larger project that began in spring 2022, targeting several inactive structures no longer needed for the agency’s missions. All three towering fixtures played crucial roles in getting humans to the Moon, into low-Earth orbit, and beyond.

These structures have reached the end of their safe operational life, and their removal has long been planned as part of a broader effort to modernize Marshall’s footprint. This demolition is the first phase of an initiative that will ultimately remove 25 outdated structures, reduce maintenance burdens, and position Marshall to take full advantage of a guaranteed NASA center infrastructure investment authorized under the Working Families Tax Credit Act.

“This work reflects smart stewardship of taxpayer resources,” said NASA Administrator Jared Isaacman. “Clearing outdated infrastructure allows NASA to safely modernize, streamline operations, and fully leverage the infrastructure investments signed into law by President Trump to keep Marshall positioned at the forefront of aerospace innovation.”

Built in 1964, the Dynamic Test Stand was initially used to test fully assembled Saturn V rockets. In 1978, engineers there integrated all of the space shuttle elements for the first time, including the orbiter, external fuel tank, and solid rocket boosters. The stand was last used in the early 2000s for microgravity testing.

The space shuttle orbiter Enterprise lifted by crane into the Structural Dynamic Test Facility at NASA’s Marshall Space Flight Center in Huntsville, Alabama, for vibration testing in July 1978.
NASA

The Propulsion and Structural Test Facility – better known at Marshall as the “T-tower” due to its unique shape – was built in 1957 by the U.S. Army Ballistic Missile Agency and transferred to NASA when Marshall was founded in 1960. There, engineers tested components of the Saturn launch vehicles, the Army’s Redstone Rocket, and shuttle solid rocket boosters. It was last used for space shuttle solid rocket motor tests in the 1990s.

“Each one of these structures helped NASA make history,” said Rae Ann Meyer, acting center director at Marshall. “While it is hard to let them go, they’ve earned their retirement. The people who built and managed these facilities and empowered our mission of space exploration are the most important part of their legacy.”

“These structures are not safe,” continued Meyer. “Strategic demolition is a necessary step in shaping the future of NASA’s mission to explore, innovate, and inspire. By removing these structures that we have not used in decades, we are saving money on upkeep of facilities we can’t use. We also are making these areas safe to use for future NASA exploration endeavors and investments.”

A legacy worth remembering

When NASA opened the Neutral Buoyancy Simulator in 1968, it was one of the few places on Earth that could recreate the weightlessness of spaceflight. The facility provided a simulated zero-gravity environment in which engineers and astronauts could find out how their designs might handle in orbit. The tank was central to planning and problem-solving for the Skylab missions, repairs to NASA’s Hubble Space Telescope, and more. The tank is 75 feet in diameter, 40 feet deep, and holds roughly 1.3 million gallons of water. It was replaced in 1997 by a new, larger facility at NASA’s Johnson Space Center in Houston.
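As a quick back-of-the-envelope check (ours, not NASA’s), the stated dimensions do work out to about 1.3 million US gallons:

```python
import math

# Cylinder volume from the stated tank dimensions: 75 ft across, 40 ft deep.
diameter_ft, depth_ft = 75, 40
volume_ft3 = math.pi * (diameter_ft / 2) ** 2 * depth_ft
gallons = volume_ft3 * 7.48052  # US gallons per cubic foot
print(f"{gallons:,.0f} US gallons")  # ~1,322,000
```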

Kathryn Thornton in the Neutral Buoyancy Simulator at Marshall
Astronaut Kathryn Thornton practices maneuvers planned for the STS-61 mission in the Neutral Buoyancy Simulator at NASA’s Marshall Space Flight Center in Huntsville, Alabama, on Aug. 9, 1993.
NASA

The Propulsion and Structural Test Facility is one of the oldest test stands at Marshall. The dual-position stand, sometimes called the T-tower, was built for static testing of large rockets and launch systems – essentially firing a rocket while keeping it restrained and wired to instruments that collect data. Those tests and the data they produced played a role in the development of the Saturn family of rockets, including the F-1 engine and the S-IC stage.

The Dynamic Test Stand, a 360-foot tower topped by a 64-foot derrick, was once the tallest human-made structure in North Alabama. Engineers there conducted full-scale tests of Saturn V rockets – the same powerful vehicles that carried Apollo astronauts to the Moon. Later, the stand served as the first location where all space shuttle elements were integrated.

Preserving history for future generations

The irreplaceable historical value of these landmarks has prompted NASA to undertake extensive efforts to preserve their stories. The three facilities were designated national landmarks in 1985 for their role in human spaceflight. In keeping with Section 106 of the National Historic Preservation Act, master planners and engineers at Marshall completed a rigorous consultation and mitigation process for each landmark, working closely with Alabama’s State Historic Preservation Office.

Detailed architectural documentation, written histories, and large-format photographs are permanently archived in the Library of Congress’ Historic American Engineering Record collection, making this history accessible to researchers and the public for generations.

Additionally, NASA has partnered with Auburn University to create high-resolution digital models of each facility. The project used technologies like LiDAR and 360-degree photography to capture the structures in detail before demolition, with the goal of preserving not just their appearance, but also the sense of scale and engineering achievement they represent. The models are still in progress, but they will eventually be made publicly available.

Select artifacts from the facilities have also been identified and transferred to the U.S. Space & Rocket Center through NASA’s Artifact Program, ensuring tangible pieces of this history remain available for educational purposes.

Honoring the past, building the future

For the employees, retirees, and community members who have known these facilities over the decades, their removal marks the end of an era. But their contributions live on in every NASA mission, from the International Space Station to the upcoming Artemis II lunar mission and beyond.

“NASA’s vision of space exploration remains vibrant, and as we look to an exciting future, we honor the past, especially the dedication of the men and women who built these structures and tested hardware that has launched into space, made unprecedented scientific discoveries, and inspired generations of Americans to reach for the stars,” said Meyer.

The demolitions represent more than removing obsolete infrastructure. They’re part of NASA’s commitment to building a dynamic, interconnected campus ready for the next era of space exploration while honoring the bold spirit that has always driven the agency forward.

Virtual tours and preserved documentation will be made available on Marshall’s digital channels. Marshall will also share video of the test stand demolitions after the event.

For communities near Redstone Arsenal, the demolition on the morning of Jan. 10 may produce a loud noise.

Video: the firing of the Saturn I first stage at NASA’s Marshall Space Flight Center.

Stewart Cheifet, PBS host who chronicled the PC revolution, dies at 87

Stewart Cheifet, the television producer and host who documented the personal computer revolution for nearly two decades on PBS, died on December 28, 2025, at age 87 in Philadelphia. Cheifet created and hosted Computer Chronicles, which ran from 1983 to 2002 and helped demystify personal computing for millions of American viewers.

Computer Chronicles covered everything from the earliest IBM PCs and Apple Macintosh models to the rise of the World Wide Web and the dot-com boom. Cheifet conducted interviews with computing industry figures, including Bill Gates, Steve Jobs, and Jeff Bezos, while demonstrating hardware and software for a general audience.

From 1983 to 1990, he co-hosted the show with Gary Kildall, the Digital Research founder who created the popular CP/M operating system that predated MS-DOS on early personal computer systems.
