The Black Death ravaged medieval Western Europe, ultimately wiping out roughly one-third of the population. Scientists have identified the bacterium responsible and its likely origins, but certain specifics of how and why it spread to Europe are less clear. According to a new paper published in the journal Communications Earth & Environment, either one large volcanic eruption or a cluster of eruptions might have been the triggering factor, setting off a chain of events that brought the plague to the Mediterranean region in the 1340s.
Technically, we’re talking about the second plague pandemic. The first, known as the Justinian Plague, broke out around 541 CE and quickly spread across Asia, North Africa, the Middle East, and Europe. (The Eastern Roman Emperor Justinian I, for whom the pandemic is named, actually survived the disease.) There continued to be outbreaks of the plague over the next 300 years, although the disease gradually became less virulent and died out. Or so it seemed.
In the Middle Ages, the Black Death burst onto the scene, with the first historically documented outbreak occurring in 1346 in the Lower Volga and Black Sea regions. That was just the beginning of the second pandemic. During the 1630s, fresh outbreaks of plague killed half the populations of affected cities. Another bout of the plague significantly culled the population of France during an outbreak between 1647 and 1649, followed by an epidemic in London in the summer of 1665. The latter was so virulent that, by October, one in 10 Londoners had succumbed to the disease—over 60,000 people. Similar numbers perished in an outbreak in Holland in the 1660s. The pandemic had run its course by the early 19th century, but a third plague pandemic hit China and India in the 1890s. There are still occasional outbreaks today.
Thirty years ago today, Netscape Communications and Sun Microsystems issued a joint press release announcing JavaScript, an object scripting language designed for creating interactive web applications. The language emerged from a frantic 10-day sprint at pioneering browser company Netscape, where engineer Brendan Eich hacked together a working internal prototype during May 1995.
While the JavaScript language didn’t ship publicly until that September and didn’t reach a 1.0 release until March 1996, the descendants of Eich’s initial 10-day hack now run on approximately 98.9 percent of all websites with client-side code, making JavaScript the dominant programming language of the web. It’s wildly popular; beyond the browser, JavaScript powers server backends, mobile apps, desktop software, and even some embedded systems. According to several surveys, JavaScript consistently ranks among the most widely used programming languages in the world.
In crafting JavaScript, Netscape wanted a scripting language that could make webpages interactive, something lightweight that would appeal to web designers and non-professional programmers. Eich drew from several influences: The syntax looked like a trendy new programming language called Java to satisfy Netscape management, but its guts borrowed concepts from Scheme, a language Eich admired, and Self, which contributed JavaScript’s prototype-based object model.
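To see what that Self-inspired, prototype-based object model means in practice, here is a minimal sketch in modern JavaScript (the object names are invented for illustration): instead of instantiating a class, an object simply delegates failed property lookups to another object, its prototype.

```javascript
// Prototype-based objects: no classes required. An object delegates
// property lookups it cannot satisfy to its prototype.
const animal = {
  describe() {
    return `${this.name} says ${this.sound}`;
  },
};

// Object.create() makes a new object whose prototype is `animal`.
const dog = Object.create(animal);
dog.name = "Rex";
dog.sound = "woof";

// `describe` isn't on `dog`; the lookup falls through to `animal`.
console.log(dog.describe()); // "Rex says woof"
```

Later versions of the language added `class` syntax, but it remains sugar over this same prototype chain.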
How do you top a highly detailed scale model of NASA’s new moon-bound rocket and its support tower? If you’re Lego, you make it so it can actually lift off.
Lego’s NASA Artemis Space Launch System Rocket, part of its Technic line of advanced building sets, will land on store shelves for $60 on January 1, 2026, and then “blast off” from kitchen tables, office desks and living room floors. The 632-piece set climbs skyward, separating from its expendable stages along the way, until the Orion crew spacecraft and its European Service Module top out the motion on their way to the moon—or wherever your imagination carries it.
“The educational LEGO Technic set shows the moment a rocket launches, in three distinct stages,” reads the product description on Lego’s website. “Turn the crank to see the solid rocket boosters separate from the core stage, which then also detaches. Continue turning to watch the upper stage with its engine module, Orion spacecraft and launch abort system separate.”
My parents owned a station wagon only very briefly when I was a child, but I'll never forget how awesome it was to hang out in the back of that wood-paneled beauty with the seats folded down. It made long trips a breeze, and I was sad to see it go.
YouTube has launched its yearly Recap, turning your 2025 watch history into a shareable breakdown of your habits. The feature analyzes everything you viewed and even assigns a “viewer personality,” giving you a fun, shareable snapshot of how you used YouTube this year.
Driving used to be something more than just a way to get from one place to another. Just taking a drive down to the store could be an exciting experience in the right car. The so-called "driver's car", which emphasizes feedback and responsiveness to your inputs, was a great way to forget about your dull 9-to-5 life. For those who cared about such things, there were plenty of choices at every budget level. These days? I'm starting to doubt that a true driver's car even exists anymore in any form.
Google's product cemetery is littered with both bad and good ideas. Some failed because the execution was poor; others were simply released half-baked or too soon.
On September 19, 1982, Carnegie Mellon University computer science research assistant professor Scott Fahlman posted a message to the university’s bulletin board software that would later come to shape how people communicate online. His proposal: use :-) and :-( as markers to distinguish jokes from serious comments. While Fahlman describes himself as “the inventor… or at least one of the inventors” of what would later be called the smiley face emoticon, the full story reveals something more interesting than a lone genius moment.
The whole episode started three days earlier when computer scientist Neil Swartz posed a physics problem to colleagues on Carnegie Mellon’s “bboard,” which was an early online message board. The discussion thread had been exploring what happens to objects in a free-falling elevator, and Swartz presented a specific scenario involving a lit candle and a drop of mercury.
That evening, computer scientist Howard Gayle responded with a facetious message titled “WARNING!” He claimed that an elevator had been “contaminated with mercury” and suffered “some slight fire damage” due to a physics experiment. Despite clarifying posts noting the warning was a joke, some people took it seriously.
It took the California Science Center more than three years to erect its new Samuel Oschin Air and Space Center, including stacking NASA’s space shuttle Endeavour for its launch pad-like display.
Now the big work begins.
“That’s completing the artifact installation and then installing the exhibits,” said Jeffrey Rudolph, president and CEO of the California Science Center in Los Angeles, in an interview. “Most of the exhibits are in fabrication in shops around the country and audio-visual production is underway. We’re full-on focused on exhibits now.”
On Monday, veteran game developer Rebecca Ann Heineman died in Rockwall, Texas, at age 62 after a battle with adenocarcinoma. Apogee founder Scott Miller first shared the news publicly on social media, and her son William confirmed her death with Ars Technica. Heineman’s GoFundMe page, which displayed a final message she had posted about entering palliative care, will now help her family with funeral costs.
Rebecca “Burger Becky” Heineman was born in October 1963 and grew up in Whittier, California. She first gained national recognition in 1980 when she won the national Atari 2600 Space Invaders championship in New York at age 16, becoming the first formally recognized US video game champion. That victory launched a career spanning more than four decades and 67 credited games, according to MobyGames.
Among many achievements in her life, Heineman was perhaps best known for co-founding Interplay Productions with Brian Fargo, Jay Patel, and Troy Worrell in 1983. The company created franchises like Wasteland, Fallout, and Baldur’s Gate. At Interplay, Heineman designed The Bard’s Tale III: Thief of Fate and Dragon Wars while also programming ports of classics like Wolfenstein 3D and Battle Chess.
I’m sitting in front of an old Sanyo plasma TV as I write this on my media PC. It’s not a productivity machine by any means, but the screen has the resolution for it, so I started this document to prove a point. That point? Plasma TVs are awesome.
Always the Bridesmaid, Never the Bride
An Egyptian god might see pixels on an 8K panel, but we puny mortals won’t. Image “Horus Eye 2” by [Jeff Dahl]

The full-colour plasma screens that were used as TVs in the 2000s are an awkward technological cul-de-sac. Everyone knows and loves CRTs for the obvious benefits they offer: bright colours, low latency, and scanlines to properly blur pixel art. Modern OLEDs have more resolution than the Eye of Horus, never mind your puny human orbs, and barely sip power compared to their forebears. Plasma, though? Not old enough to be retro-cool, not new enough to be high-tech, plasma displays are sadly forgotten.
It’s funny, because I firmly believe that without plasma displays, CRTs would never have gone away. Perhaps I should hate them for that, but it’s for the very reasons that plasma won out over HD-CRTs in the marketplace that I love them.
What You Get When You Get a Plasma TV
I didn’t always love plasma TVs. Until a few years ago, I thought of them the way you probably do: clunky, heavy, power-hungry first-gen flatscreens that were properly consigned to the dustbin of history. Then I bought a house.
The house came with a free TV– a big plasma display in the basement. It was left there for two reasons: it was worthless on the open market and it weighed a tonne. I could take it off the wall by myself, but I could feel the ghost of OSHA past frowning at me when I did. Hauling it up the stairs? Yeah, I’d need a buddy for that… and it was 2020. By the time I was organizing the basement, we’d just gone into lockdown, and buddies were hard to come by. So I put it back on the wall, plugged in my laptop, and turned it on.
I was gobsmacked. It looked exactly like a CRT– a giant, totally flat CRT in glorious 1080p. When I stepped to the side, it struck me again: like a CRT, the viewing angle is “yes”.
How it Works
None of this should have come as a surprise, because I know how a Plasma TV works. I’d just forgotten how good they are. See, a Plasma TV really was an attempt to get all that CRT goodness in a flat screen, and the engineers at Fujitsu, and later elsewhere, really pulled it off.
Like CRTs, you’ve got phosphors excited to produce points of light to create an image– and only when excited, so the blacks are as black as they get. The phosphors are chemically different from those in CRTs but they come in similar colours, so colours on old games and cartoons look right in a way they don’t even on my MacBook’s retina display.
Unlike a CRT, there’s no electron beam scanning the screen, and no shadow mask. Instead, the screen is subdivided into individual pixels inside the flat vacuum panel. The pixels are individually addressed and zapped on and off by an electric current. Unlike a CRT or SED, the voltage here isn’t high enough to generate an electron beam to excite the phosphors; instead the gas discharge inside the display emits enough UV light to do the same job.
Each phosphor-filled pixel glows with its own glorious light thanks to the UV from gas discharge in the cell. Image based on “Plasma-Display-Composition.svg” by [Jari Laamanen]

Still, if it feels like a CRT, that’s because the subpixels are individual blobs of phosphor, excited from behind and generating their own glorious light.
It’s Not the Same, Though
It’s not a CRT, of course. The biggest difference is that it’s a fixed-pixel display, with all that comes with that. This particular TV has all the ports on the back to make it great for retrogaming, but the signal from the NES, or what have you, still has to be digitally upscaled to match the panel’s resolution. Pixel art goes unblurred by scanlines unless I add them in via emulation, so despite the colour and contrast, it’s not quite the authentic experience.
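As a rough illustration of what that upscale-and-add-scanlines step amounts to, here is a minimal sketch in JavaScript. The function name, the 2x factor, and the darkening value are all invented for the example; real scalers and emulator filters are far more sophisticated.

```javascript
// Nearest-neighbour integer upscaling with a crude scanline effect,
// the sort of filter an emulator can apply before the TV's own scaler
// touches the image. Pixels here are plain brightness values (0-255).
function upscaleWithScanlines(frame, factor = 2, darken = 0.5) {
  const out = [];
  for (const row of frame) {
    // Repeat each source pixel `factor` times horizontally.
    const scaled = row.flatMap((px) => Array(factor).fill(px));
    for (let i = 0; i < factor; i++) {
      // Dim the last of each group of output rows to fake the dark
      // gaps between CRT scanlines.
      const isGap = i === factor - 1;
      out.push(scaled.map((px) => (isGap ? Math.round(px * darken) : px)));
    }
  }
  return out;
}

// A 2x2 test pattern becomes a 4x4 image with every second row dimmed.
const tiny = [
  [255, 0],
  [0, 255],
];
const big = upscaleWithScanlines(tiny);
```

Nearest-neighbour scaling keeps pixel art crisp; the darkened rows are what restores a hint of the CRT look on a fixed-pixel panel.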
For some things, like the Atari 2600, the scanline blur really doesn’t matter. Image: “Atari 2600 on my 42 inch plasma TV” by [Jeffisageek]

The built-in upscaling doesn’t introduce enough latency for a filthy casual like me to notice, but I’ll never be able to play Duck Hunt on the big screen unless I fake it with a Wii. Apparently some plasma TVs are awesome for latency on the analog inputs, and others are not much better than an equivalent-era LCD. There’s a reason serious retro gamers pay serious money for big CRTs.
Those big CRTs don’t have to worry about burn-in, either, something I have been very careful to avoid in the five years I’ve owned this second-hand plasma display. I can’t remember thinking much about burn-in with CRTs since we retired the amber-phosphor monitor plugged into the Hercules Graphics card on our family’s 286 PC.
The dreaded specter of burn-in is plasma’s Achilles heel, more so than the weight and thickness, which were getting much better before LG, the last company in the space, pulled the plug, or the Energy Star ratings, which had improved as well but were never going to catch up to LED-backlit LCDs. The fear of burn-in made buyers skip plasma, especially for console gaming.
This screen is haunted by the ghost of CNN’s old logo. Burning in game graphics was less common but more fun. Ironically, it’s an LCD. Image: “logo of CNN burnt on a screen” by [Nate]

Early plasma displays could permanently damage their delicate phosphors in only a handful of hours. That damage burnt the unmoving parts of an image into the phosphors as “ghosting”, and unless you caught it early, it was generally not repairable. The ghosting issue got better over time, but the technology never escaped the stigma, and the problem never entirely went away. If a marathon Call of Duty session meant the rest of the family had to stare at your HUD on every movie night, Dad wasn’t going to buy another plasma display.
By the end, the phosphors had improved and tricks like shifting the image pixel by pixel were found to avoid burn-in, and it seems to have worked: there’s absolutely no ghosting on my model, and you can sometimes find equally un-haunted late-model plasma TVs for the low, low cost of “get this thing off my wall and up the stairs”. I may grab another, even if I have to pay for it. It’s a lot easier to hide a spare flatscreen than an extra CRT, another advantage of plasma TVs, and phosphors don’t last forever in any case.
In the meantime, I’m going to enjoy the contrast ratio, the refresh rate, and the bonus space heater. I’m in Canada, and winter is coming, so it’s hard to get too worked up about waste heat when there’s frost on your windowpanes.
Today, if you can find a pneumatic tube system at all, it is likely at a bank drive-through. A conversation in the Hackaday bunker revealed something a bit surprising. Apparently, in some parts of the United States, these have totally disappeared. In other areas, they are not as prevalent as they once were, but are still hanging in there. If you haven’t seen one, the idea is simple: you put things like money or documents into a capsule, put the capsule in a tube, and push a button. Compressed air shoots the capsule to the other end of the tube, where someone can reverse the process to send you something back.
These used to be a common sight in large offices and department stores that needed to send original documents around, and you still see them in some other odd places, like hospitals or pharmacy drive-throughs, where they may move drugs or lab samples as well as documents. In Munich, for example, a hospital has a system with 200 stations and 1,300 capsules, also known as carriers. Another medical center, in Rotterdam, moves 400 carriers an hour through a 16-kilometer network of tubes. Most systems are much smaller, but they work on the same principle.
That Blows — Or Sucks?
Air pressure can push a carrier through a tube or suck it through. Depending on the pressure, the carrier can accelerate or decelerate. Large systems like the 12-mile and 23-mile systems at Mayo Clinic, shown in the video below, have inbound pipes, an “exchanger,” which is basically a switchboard, and outbound pipes. Computers control the system to move the carriers at about 19 miles per hour. You’ll see in the video that some systems use oval tubes to keep the carriers from spinning inside them, which is apparently a bad thing to do to blood samples.
In general, carriers going up will move via compressed air. Downward motion is usually via suction. If the carrier has to go in a horizontal direction, it could be either. An air diverter works with the blower to provide the correct pressures.
History
Pneumatic tubes seem a bit retro, like something from the 1950s. It turns out the idea is much older than that. The basic system was the idea of William Murdoch in 1799. Crude pipelines carried telegram messages to nearby buildings. It is interesting, too, that Hero of Alexandria understood that air could move things as early as the first century.
In 1810, George Medhurst had plans for a pneumatic tube system. He posited that at 40 PSI, nearly three times normal sea-level air pressure, air would move at about 1,600 km/h. He felt that even propelling a load, it could attain a speed of 160 km/h. He died in 1827, though, with no actual model built.
In 1853, Josiah Latimer Clark installed a 200-meter system between the London Stock Exchange and the telegraph office. The telegraph operator would sell stock price data to subscribers — another thing that you’d think was more modern but isn’t.
Within a few years, the arrangement was common around other stock exchanges. By 1870, improvements enabled faster operation and the simultaneous transit of multiple carriers. London alone had 34 kilometers of tube by 1880. In Aberdeen, a tube system even carried fish from the market to the post office.
There were improvements, of course. Some systems used rings that could dial in a destination address, mechanically selecting a path through the exchange; you can see one in the Mayo Clinic video. But even today, the systems work essentially the way they did in the 1800s.
Famous Systems
Several cities had pneumatic mail service. Paris ran a 467 km system until 1984. Prague’s 60 km network was in operation until 2002. Berlin’s system covered 400 km in 1940. The US had its share, too. NASA’s mission control center used tubes to send printouts from the lower floors up to the mission control room floor. The CIA Headquarters had a system running until 1989.
In 1920s Berlin, you could use the system as the equivalent of text messaging if you saw someone who caught your eye at one local bar. You could even send them a token of your affection, all via tube.
Mail by tube in 1863 (public domain; Illustrated London News)
In 1812, there was some consideration of moving people using this kind of system, and there were short-lived attempts in Ireland, London, and Paris, among other places, in the mid-1800s. In general, this is known as an “atmospheric railroad.”
As a stunt, in 1865, the London Pneumatic Despatch Company sent the Duke of Buckingham and some others on a five-minute trip through a pneumatic tube. The system was made to carry parcels at 60 km/h using a 6.4-meter fan run by a steam engine. The capsules, in this case, looked somewhat like an automobile. There are no reports of how the Duke and his companions enjoyed the trip.
A controller for the Prague mail system that operated until 2002 (public domain).
A 550-meter demonstration pneumatic train, designed by Thomas Webster Rammell, showed up at the Crystal Palace in 1864. It operated for only two months. A 6.7-meter fan blew air one way for the outbound trip and sucked it back for the return.
Don’t think the United States wasn’t in on all this, too. New York may be famous for its subway system, but its early predecessor was, in fact, pneumatic, as you can see in the video below.
Image from 1867 of the atmospheric train at Saint Germain (public domain).
Many of these atmospheric trains didn’t put the passengers in the capsule, but used the capsule to move a railcar. The Paris St. Germain system, which opened in 1837, used this idea.
Modern Times
Of course, where you once would send documents via tube, you’d now send a PDF file. Today, you mainly see tubes where it is important for an actual item to arrive quickly somewhere: an original document, cash, or medical samples. ThyssenKrupp uses a tube system to send toasty 900 °C steel samples from a furnace to a laboratory. Can’t do that over Ethernet.
There have been attempts to send food through tubes and even to take away garbage. Some factories use them to move materials, too. So pneumatic tubes aren’t going away, even if they aren’t as common as they once were. In fact, we hear they are more popular than ever in hospitals, so these aren’t just old systems still in use.
We haven’t seen many DIY pneumatic tube systems that were serious (we won’t count sucking Skittles through a tube with a shop vac). But we do see it in some robot projects. What would you do with a system like this? Even more importantly, are these still common in your area or a rarity? Let us know in the comments.
The first multi-spacecraft science mission to launch to Mars is now on its way, and catching a ride on the twin probes are the first kiwis to fly to the red planet.
NASA’s ESCAPADE (Escape and Plasma Acceleration and Dynamics Explorers) mission lifted off on a 22-month trip to Mars on Thursday aboard a New Glenn rocket. Once there, the identical satellites will enter Martian orbit to study in real time how space weather affects the planet’s hybrid magnetosphere and how the interaction drove Mars to lose its once-dense atmosphere.
Led by the Space Sciences Laboratory at the University of California, Berkeley—the two spacecraft are named “Blue” and “Gold” after the school’s colors—the ESCAPADE probes are the first Mars-bound vehicles to be designed, built, and tested by Rocket Lab, the end-to-end space company headquartered in California but founded in New Zealand.