Z80 End-Of-Life

I guess it had to happen sometime — and close to fifty years is one hell of a good run for an 8-bit microprocessor from 1975. This week, Zilog announced that it would be moving the storied Z80 microprocessor family to EOL status, and called for last-time-buy orders through June 24.

I took the opportunity to place a small symbolic order for a single chip for the Mini Museum — a processor that 1975 would have drooled over. The Z84C0020PEG is a power-efficient, DC-clockable, 20MHz CMOS version of the venerable Z80 microprocessor. In car terms, it’s a Z80 with a quad barrel carb, dual chrome exhaust, and a sweet paint job.

A 20MHz CMOS, DC-clockable Z80.
Even this one is five years old, bought “new” from Mouser this month.

The Z80, itself a software-compatible improvement on Intel's popular 8080 microprocessor, is probably in several billion devices at this point. It has been used to control everything from robots to microwave ovens to the Nintendo Game Boy, in addition to its usual role in 8-bit computers, typically running the CP/M operating system. It anticipated the past few decades of PC upgradability by providing a faster, more capable processor that could still run the same software as the 8080 if needed, just as later generations of PCs can (usually) run software from at least the previous generation or two.

While the eZ80 will continue to be produced for the foreseeable future, it’s not quite the same for hobbyists and educators like me who grew up with the Z80 (it powered the Timex-Sinclair 1000, which was my first computer). I understand the business case for the decision, though — we’ve long since switched to 32-bit processors even for our Microcontrollers class, and last year, we replaced the Z80 Microprocessors course content with a Verilog-based approach, where students design their own processors. (I do still use the Z80 for opcode examples.)

Although I wouldn’t use a Z80 in a new design, it’s still bittersweet to see it replaced by more modern tech. But I guess for nostalgia, we’ll always have the good old 16F84A. (That’s so deeply entrenched in engineering curricula that Microchip will be stuck making it until the heat death of the Universe!)

Posted in Components, Current Events, EET325, Electronics, Lore, Nostalgia

Chicken Breeder Reactor

Minecraft chickens (or maybe they’re ducks?) are strange birds. They’re functionally unisex, for one: all adults lay eggs which can hatch into chicks. Oddly, there doesn’t even need to be another chicken present for this to happen, so presumably they’re all clones.

They drop chicken eggs, which can be used for a few things like making cakes. If your gameplay style allows killing passive mobs (I don’t), they can also be a source of chicken meat and feathers, both of which are also useful.

Oh, yeah. And chickens can more or less bring about the end of a Minecraft world.

I saw the idea online somewhere, years ago: you can make an automatic chicken-breeding machine in vanilla Minecraft. It's even possible to make a version where no chickens are harmed. The idea is, you corral one or more chickens (any chicken will do; they're all identical) in a small area with a floor made of Hoppers (a single hopper will do). These hoppers should all feed into a Dispenser, powered by a Redstone oscillator and pointed at a wall not far away. When a chicken lays an egg, the egg is collected and fed into the Dispenser, which shoots it at the wall, breaking it. Some percentage of these broken eggs hatch into baby chickens.

A Chicken Breeder Reactor.
This is a Really Bad Idea (but it isn’t in my main world, so why not?)

The neat part is, baby chickens pathfind to any nearby adults. Since you have one or more adults in the corral, the baby will try to find its way there. Give it a one-way staircase (with a drop of two or more blocks at the end), and it will end up alongside its parent.

Now, a few minutes later (they grow up so fast!), you’ll have two adults laying eggs. Soon, you’ll have a lot of chickens. And the more you have, the more eggs per hour they lay. The more eggs per hour that they lay, the more egg-laying chickens you have. The process feeds on itself exponentially, and can eventually render a Minecraft world unplayable. And you don’t even kill any chickens (or anything else) to do it.
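The feedback loop above can be sketched with a quick expected-value model. The numbers here are assumptions based on commonly cited vanilla mechanics (an adult lays an egg every 5 to 10 minutes, a thrown egg has roughly a 1-in-8 chance of hatching a chick, and a chick matures in about 20 minutes); the point is the shape of the curve, not the exact figures.

```python
EGG_RATE = 1 / 7.5   # eggs per adult per minute (5-10 min average, assumed)
HATCH_P = 1 / 8      # chance a thrown egg spawns a chick (assumed)
GROW_TIME = 20       # minutes for a chick to mature (assumed)

def population(adults=1.0, minutes=600):
    """Track expected adult count, with chicks in a maturation pipeline."""
    pipeline = [0.0] * GROW_TIME   # chicks, indexed by minutes until grown
    history = [adults]
    for _ in range(minutes):
        hatched = adults * EGG_RATE * HATCH_P  # expected new chicks this minute
        adults += pipeline.pop(0)              # chicks that just matured
        pipeline.append(hatched)
        history.append(adults)
    return history

h = population()
print(f"after 10 in-game hours, starting from one chicken: ~{h[-1]:.0f} adults")
```

Even with the 20-minute maturation delay, the growth is exponential: each new adult adds to the egg rate, which adds more adults, and so on until the entity count brings the world to its knees.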

It should be fun to see it speed up and go wild.

Posted in Games, Minecraft

Generative Adversarial Networks (GANs)

It’s an interesting idea, and kind of amazing that it actually works (and works well). Generative Adversarial Networks (GANs) are, as the name implies, a generative form of Machine Learning that attempts to generate content such as images, based on nothing more than lots of labeled examples of that data.

With GANs, two models evolve in competition with each other. The discriminator network is trained on real images (from a source dataset) and fake images (from a generator network), and is scored on how well it can distinguish real from fake. The generator network, meanwhile, starts out knowing nothing at all about what it should be producing. It learns when its output is fed to the discriminator, which judges how likely each image is to be real (or, in a conditional setup like this one, how likely it is to belong to each of the categories it was trained on).
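The adversarial loop is easier to see on a toy problem than on images. This is a minimal sketch in PyTorch (the framework the post uses), fitting a 1-D Gaussian so the whole thing fits on a screen; the network sizes and learning rates are illustrative choices, not the post's actual settings.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator maps noise to samples; discriminator scores samples as real/fake.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(500):
    real = torch.randn(64, 1) * 0.5 + 3.0   # the "dataset": samples from N(3, 0.5)
    noise = torch.randn(64, 8)

    # Discriminator step: score real samples as 1, generated samples as 0.
    fake = G(noise).detach()                # detach: don't update G here
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make D score its output as real (1).
    g_loss = bce(D(G(noise)), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(f"generated mean after training: {G(torch.randn(1000, 8)).mean().item():.2f}")
```

The untrained generator emits noise-shaped junk, just like the static in the first frame of the movie; after a few hundred steps its output distribution drifts toward the real data, purely from the discriminator's feedback.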

With a PyTorch model set to record the generator’s output for each generation (including the static-like output of the initial, completely untrained generator model), the model can be seen to learn the images, epoch by epoch. Each frame (epoch) after the first image represents the output of the updated generator network when prompted to produce the digits 0 through 9.

To show the learning process clearly, the learning rate has been decreased by a factor of 20 from the original (quite effective) setting of 0.002.

When training, each epoch (each frame of the movie) takes about 15 seconds on a single GeForce RTX 2060 GPU (the host PC is a Core i9-9900 with 128GB of RAM, running Windows 10). If you view the individual movie frames at one image per second, you're essentially watching them at the same rate the network could train, if the learning rate were not artificially slowed.

What does this mean, in practice? Machines can now learn from data rather than being told specifically what is what — sometimes, even if that data isn’t labeled. They can learn categories on their own, too.

And oh, by the way, GPT4 wrote 95% of this code from just a few prompts. It took a little back-and-forth to get the images right, but my role was essentially that of test engineer: copying the code GPT4 provided, running it, and reporting back with the results. That's easy to automate! No doubt we will soon see coding AIs that do just that. (Some of them can already execute Python code.)

So my computer has taught itself (marginally) better handwriting than mine.
That was bound to happen at some time, but 2024 looks like it will be a fun year.

I love living in the future.

Posted in Coding, Machine Learning / Neural Networks

“It’s 2024!”

Back when I was in first or second grade (so, sometime in the 1979-1981 timeframe), the sixth graders put on a musical skit about life in The Future. I don't remember much about it, except everybody was dressed in shiny "futuristic" clothes and mirrored sunglasses, and had flying cars. They sang about what life was supposedly like — I don't remember the details, except they were very excited to live in The Future, and kept enthusiastically singing, "It's Twenty-Twenty-Four!!"

Well, here we are. It’s actually 2024. It’s officially The Future, at least as 1980 or so saw it. What would my 1980 counterpart think about modern life, if he got to spend a day in the real 2024?

Well, first, I’d have to coax him away from staring at the 4k monitor I picked up last Fall. “TV” screens are much larger and brighter — but far lighter — than anything 1980 had imagined. Even the expensive projection TVs from back then don’t come close to a $300 OLED monitor.

And then there’s the content. YouTube is better TV than 99% of TV, easily. I’m going to have a better idea of what I want to watch than some TV network executive. Whatever you want to learn — from playing the ukulele to deep learning — has tutorials ready to be watched, 24/7.

To distract my younger self before he finds out about Reddit, I show him a 3D printer and a laser cutter/engraver. Some similar technology did exist back then, but only in industry — the catalyst for the 3D printer hobby explosion was the expiration of some key patents around 2010. “So you have toys that can make other toys,” he observes, as a Benchy starts to print. It’s a great time for hobby electronics in general, I point out, as I show him all of the local aircraft that I can track with my home ADS-B setup.

So do we have flying cars? Well, not really. Honestly, even 1980 would understand that it's dangerous enough to trust people with driving in 2D; without the rigorous training pilots must complete to be licensed, flying cars would not end well. We do know how to build flying cars (some have been around for decades), but the old wisdom that cars make poor airplanes and airplanes make poor cars still generally holds.

We do have some amazing tech in "normal" cars, though. I pull out my Android smartphone (a 2022 model, but still so far beyond any tech 1980 had that it would look like magic) and reserve an Uber. A few minutes later, a sleek-looking white car pulls up, making very little noise. 1980-me comments that the engine is very quiet; I tell him that's because it doesn't have one. The driver obliges us with a quick demonstration of its acceleration — comparable to the huge, powerful muscle cars of the 1960s, only nearly silent. And we're probably close to having self-driving cars (full autopilot) commonly available.

“It knows where it is,” 1980-me comments, looking at the moving map display. I explain about GPS and satellite navigation, and how “being lost” is generally very easy to fix, now. He asks if privacy is a concern, and is happy to hear that GPS is receive-only — the satellites don’t know and don’t care who’s using the signals. I decide not to mention all the many ways in which we are tracked, if maybe not geographically.

We head back home. "So, how fast are modern computers compared to what we have in 1980?" I explain about multi-core CPUs, cache memory, gigahertz clock speeds, pipelining, and such. He mentions that he wrote a BASIC program to calculate prime numbers, and remembers that it took four minutes to find all the primes up to 600.

I code the same problem up in FreeBasic on my Windows PC. It takes ten milliseconds. After explaining what a millisecond is (and how it’s actually considered fairly long, in computing terms), we calculate that my modern PC is more than 24,000 times faster. To be fair, this is pitting compiled 64-bit code on the Core i9 against interpreted BASIC on the 1982-era Sinclair — but on the other hand, we’re not making use of the other fifteen logical processors, or the GPU. We race a Sinclair emulator (“your computer can just pretend to be another one?”) to compute the primes up through 10,000. The i9 does even better — some 80,000x faster.
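For anyone who wants to rerun the comparison, here's the same benchmark sketched in Python rather than FreeBasic (the timings in the post are the author's; this just shows the shape of the test, using plain trial division roughly like a few lines of 1980s BASIC would).

```python
import time

def primes_up_to(n):
    """Simple trial division against previously found primes."""
    found = []
    for candidate in range(2, n + 1):
        if all(candidate % p for p in found if p * p <= candidate):
            found.append(candidate)
    return found

start = time.perf_counter()
primes = primes_up_to(600)
elapsed = time.perf_counter() - start
print(f"found {len(primes)} primes up to 600 in {elapsed * 1000:.2f} ms")
```

Four minutes versus a few milliseconds for the same naive algorithm is the whole upgrade story of the last forty years in one line of output.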

I do the calculation in my head, but don’t mention the more than eight million times difference in memory size between the Sinclair and the i9’s 128GB. (It would be 64 million times, without the Sinclair’s wonky, unreliable add-on memory pack!) First graders don’t yet really have a good intuition for just how big a million is, anyway.

“So, what are video games like?” I put the Quest 2 headset on him and boot up Skyrim VR. He enjoys it — at least right up until he gets too close to the first dragon and gets munched. He loves Minecraft, of course — “It’s like having an infinite Lego set and your own world to build in!”

Then I show him Flight Simulator 2020 and the PMDG 737. He hasn’t found the power switch yet (it’s not super obvious), but he hasn’t stopped grinning, either. And he has yet to meet GPT4 and friends. Or to listen to my mp3 collection.

I love living in the future.

Posted in Nostalgia