MobiFlight

Flight simulation is a fun hobby, and it has evolved from interesting-but-not-especially-realistic vector-graphic depictions of flying something that vaguely resembled a Cessna 182RG around the Chicagoland area to impressive, raytraced simulations of just about every type of aircraft out there, often with highly accurate physical modeling backing up the aircraft performance.

MS Flight Simulator 1.0, about to overfly KCGX Meigs, in what we are assured is a Cessna 182RG. (Image: Wikipedia)
On final approach to the virtually-restored Meigs, in an SR-22 in FS2020.

Of course, realism takes a hit when you’re interacting with these glorious models through a computer screen, manipulating controls by clicking and dragging them with a mouse. If real aircrews had to fly their planes this way, they could make it work, but they would be declaring an emergency over how difficult (and therefore risky) it makes everything. It would be nice to have the same kind of controls for simulation that the actual aircraft use, but that has generally been a hassle: each control has to be programmed to interact with an API such as SimConnect.
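
For a sense of what that looks like by hand, here is a minimal sketch of nudging the heading bug through SimConnect. It’s only an illustration, not how MobiFlight works internally; it assumes the third-party Python-SimConnect wrapper, and on_encoder_step() is a made-up callback fed by whatever would be reading the physical knob:

```python
# Sketch: driving the heading bug via SimConnect "by hand", using the
# third-party Python-SimConnect wrapper (pip install SimConnect).
# MobiFlight effectively does this plumbing for you, with no code at all.
from SimConnect import SimConnect, AircraftEvents

sm = SimConnect()                      # connect to the running sim
ae = AircraftEvents(sm)

bug_inc = ae.find("HEADING_BUG_INC")   # standard MSFS key events
bug_dec = ae.find("HEADING_BUG_DEC")

def on_encoder_step(direction):
    """Hypothetical callback, called by whatever polls the physical encoder."""
    if direction > 0:
        bug_inc()
    else:
        bug_dec()
```

And that’s only the PC side; you would still need firmware on the board to report encoder steps over USB, which is exactly the part MobiFlight packages up for you.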

MobiFlight (freeware) makes connecting hardware inputs and outputs much simpler. Connect your controls to I/O pins on supported Arduino models, upload the MobiFlight firmware to the board, download and launch the connector software on the PC, and you’re up and running. Make a change on the physical controls, and it will be reflected in the sim.

A quadrature encoder plus SPST switch, used to set the heading bug.
This, plus plugging in the board via USB, is all the hardware setup needed!

As a first test, I implemented a heading bug selector, and verified that it works in the Cessna 172, Cirrus Vision Jet, and the PMDG 737-700. From here, I’m planning to start recreating the 737 controls, panel by panel. The eventual goal is to have at least the main control panel in the correct position, so I can look over in VR and find each control where it should be, without having to see the physical device.

Z80 End-Of-Life

I guess it had to happen sometime — and close to fifty years is one hell of a good run for an 8-bit microprocessor from 1975. This week, Zilog announced that it would be moving the storied Z80 microprocessor family to EOL status, and called for last-time-buy orders through June 24.

I took the opportunity to place a small symbolic order for a single chip for the Mini Museum — a processor that 1975 would have drooled over. The Z84C0020PEG is a power-efficient, DC-clockable, 20MHz CMOS version of the venerable Z80 microprocessor. In car terms, it’s a Z80 with a quad barrel carb, dual chrome exhaust, and a sweet paint job.

A 20MHz CMOS, DC-clockable Z80.
Even this one is five years old, bought “new” from Mouser this month.

The Z80, itself a software-compatible improvement on Intel’s popular 8080 microprocessor, is probably in several billion devices at this point. It has been used to control everything from robots to the Nintendo Game Boy to microwave ovens, in addition to its usual role in 8-bit computers, typically running the CP/M operating system. It anticipated the past few decades of PC upgradability by providing a faster, more capable processor that could still run 8080 software if needed, just as later generations of PCs can (usually) run software from at least the previous generation or two.

While the eZ80 will continue to be produced for the foreseeable future, it’s not quite the same for hobbyists and educators like me who grew up with the Z80 (it powered the Timex-Sinclair 1000, which was my first computer). I understand the business case for the decision, though — we’ve long since switched to 32-bit processors even for our Microcontrollers class, and last year, we replaced the Z80 Microprocessors course content with a Verilog-based approach, where students design their own processors. (I do still use the Z80 for opcode examples.)

Although I wouldn’t use a Z80 in a new design, it’s still bittersweet to see it replaced by more modern tech. But I guess for nostalgia, we’ll always have the good old 16F84A. (That’s so deeply entrenched in engineering curricula that Microchip will be stuck making it until the heat death of the Universe!)

Chicken Breeder Reactor

Minecraft chickens (or maybe they’re ducks?) are strange birds. They’re functionally unisex, for one: all adults lay eggs which can hatch into chicks. Oddly, there doesn’t even need to be another chicken present for this to happen, so presumably they’re all clones.

They drop chicken eggs, which can be used for a few things like making cakes. If your gameplay style allows killing passive mobs (I don’t), they can also be a source of chicken meat and feathers, both of which are also useful.

Oh, yeah. And chickens can more or less bring about the end of a Minecraft world.

I saw the idea online somewhere, years ago, that you can make an automatic chicken breeder machine in vanilla Minecraft. It’s even possible to make a version where no chickens are harmed. The idea is, you corral one or more chickens (any chicken will do; they’re all identical) in a small area with a floor made of Hoppers (a single hopper will do). These hoppers all feed into a Dispenser powered by a Redstone oscillator and pointed at a nearby wall. When a chicken lays an egg, the hoppers collect it and feed it into the Dispenser, which shoots it at the wall, breaking it. Some percentage of these broken eggs hatch into baby chickens.

A Chicken Breeder Reactor.
This is a Really Bad Idea (but it isn’t in my main world, so why not?)

The neat part is that baby chickens pathfind to any adults nearby. Since you have one or more adults in the corral, the baby will try to find its way there. Give it a one-way staircase (with a drop of two or more blocks at the end), and it will end up alongside its parent.

Now, a few minutes later (they grow up so fast!), you’ll have two adults laying eggs. Soon, you’ll have a lot of chickens. The more you have, the more eggs per hour they lay; the more eggs per hour they lay, the more egg-laying chickens you get. The process feeds on itself exponentially, and the sheer number of loaded chicken entities can eventually bog the game down enough to render a Minecraft world unplayable. And you don’t even kill any chickens (or anything else) to do it.
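
Just to illustrate the feedback loop, here is a toy simulation in Python. It is not from the post, and the numbers are the commonly cited vanilla values, so treat them as assumptions: roughly one egg every 5 to 10 minutes per adult, a 1/8 chance that a fired egg hatches a chick (with 1/32 of those hatching four), and about 20 minutes for a chick to grow up.

```python
import random

# Toy model of the chicken feedback loop. Assumed vanilla-ish numbers:
# an egg every 5-10 minutes per adult, a 1/8 hatch chance per thrown egg
# (1/32 of hatches give 4 chicks), and 20 minutes for a chick to grow up.
GROW_UP_MINUTES = 20

def simulate(hours, adults=1):
    chicks = []                                   # minutes until each chick matures
    for _minute in range(hours * 60):
        for _ in range(adults):                   # ~one egg per 7.5 min per adult
            if random.random() < 1 / 7.5:
                if random.random() < 1 / 8:       # the thrown egg hatches
                    n = 4 if random.random() < 1 / 32 else 1
                    chicks.extend([GROW_UP_MINUTES] * n)
        chicks = [t - 1 for t in chicks]
        adults += sum(1 for t in chicks if t <= 0)
        chicks = [t for t in chicks if t > 0]     # matured chicks become adults
    return adults

for h in (2, 4, 8, 12):
    print(f"{h:2d} hours: ~{simulate(h)} chickens")
```

Even with the 20-minute maturity delay, the population roughly doubles every hour or so, which is all an exponential needs.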

It should be fun to see it speed up and go wild.

Generative Adversarial Networks (GANs)

It’s an interesting idea, and kind of amazing that it actually works (and works well). Generative Adversarial Networks (GANs) are, as the name implies, a generative form of Machine Learning: they learn to produce content such as images from nothing more than lots of labeled examples of the kind of content they’re meant to produce.

With GANs, two models evolve in competition with each other. The discriminator network is trained on real images (from a source dataset) and fake images (from a generator network), and is scored on how well it can distinguish real from fake. The generator network, meanwhile, starts out knowing nothing at all about what it should be producing. It learns by feeding its output into the discriminator and being updated so that the discriminator becomes more likely to judge that output real (or, in a conditional setup, more likely to judge it a convincing member of whichever class was requested).
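
The post doesn’t include the code, but the core adversarial loop is compact. Here is a minimal, unconditional sketch in PyTorch for 28x28 digit images; it leaves out the class-conditioning needed to ask for specific digits, and it’s an illustration of the technique rather than the actual model described here:

```python
import torch
import torch.nn as nn

# Minimal (unconditional) GAN skeleton for 28x28 grayscale digits --
# a sketch of the idea, not the trained model described in this post.
latent_dim = 100

G = nn.Sequential(                      # generator: noise -> fake image
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, 28 * 28), nn.Tanh())

D = nn.Sequential(                      # discriminator: image -> P(real)
    nn.Linear(28 * 28, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid())

opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

def train_step(real):                   # real: (batch, 784) tensor scaled to [-1, 1]
    batch = real.size(0)
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

    # 1) Train D to call real images real and G's fakes fake.
    fake = G(torch.randn(batch, latent_dim)).detach()
    loss_D = bce(D(real), ones) + bce(D(fake), zeros)
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()

    # 2) Train G so that D calls its output real.
    fake = G(torch.randn(batch, latent_dim))
    loss_G = bce(D(fake), ones)
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
    return loss_D.item(), loss_G.item()
```

Alternating the two updates is what produces the arms race: the discriminator only stays useful if the generator keeps improving, and vice versa.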

With the PyTorch model set to record the generator’s output after each epoch (including the static-like output of the initial, completely untrained generator), you can watch it learn the images, epoch by epoch. Each frame (epoch) after the first image represents the output of the updated generator network when prompted to produce the digits 0 through 9.
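
Capturing those frames is the easy part: keep one fixed batch of noise and run it through the generator at the end of every epoch. A sketch, continuing from the skeleton above (save_epoch_frame is a made-up helper name):

```python
from torchvision.utils import save_image

# One fixed batch of noise, so every epoch's frame shows the "same" samples evolving.
fixed_noise = torch.randn(64, latent_dim)

def save_epoch_frame(epoch):
    with torch.no_grad():
        samples = G(fixed_noise).view(-1, 1, 28, 28)   # back to image shape
    # Rescale Tanh output from [-1, 1] to [0, 1] before writing the frame.
    save_image((samples + 1) / 2, f"epoch_{epoch:04d}.png", nrow=8)
```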

To show the learning process clearly, the learning rate has been decreased by a factor of 20 from the original (quite effective) setting of 0.002, down to 0.0001.

Each epoch (each frame of the movie) takes about 15 seconds to train on a single GeForce RTX 2060 GPU (the host PC is a Core i9-9900 with 128GB, running Windows 10). If you view the individual movie frames at one image per second, you’re essentially watching them at the same rate that the network could train, if the learning rate were not artificially slowed.

What does this mean, in practice? Machines can now learn from data rather than being told specifically what is what — sometimes, even if that data isn’t labeled. They can learn categories on their own, too.

And oh, by the way, GPT4 wrote 95% of this code, from just a few prompts. It took a little back-and-forth to get the images right, but my role was essentially that of a test engineer: copying the code GPT4 provided, running it, and reporting back with the results. That’s easy to automate! No doubt, we will soon see coding AIs that do just that. (Some of them can already execute Python code.)

So my computer has taught itself (marginally) better handwriting than mine.
That was bound to happen eventually, but 2024 looks like it will be a fun year.

I love living in the future.
