Why you shouldn’t buy a Mac in 2020…or maybe ever again.

This week Apple surprised no one by announcing they were beginning their transition to “Apple silicon” in their Mac computer line-up. If you don’t know what that means, it’s sufficient to understand that they are moving from Intel-based computers to a processor and related architecture of “their own” design.

I put quotes around “their own” because, despite their announcement, everyone knows that “Apple silicon” is derived from the ARM processor — a family of chips most often used in phones and other mobile devices. ARM has been around a long time, and Apple invested in the company back during the Newton era. Intel has obviously been around even longer, but Apple’s use of Intel chips is the stuff of relatively recent history.

This marks the 4th processor migration for Apple, from the MOS Technology 6502-based Apple I and Apple II computers, to the Motorola 68000 family in the early Macintosh line-up, to the Motorola (and IBM) PowerPC of 1990s processor-war infamy. With each generation, Apple struggled to position themselves against the WinTel (Windows + Intel) hegemony. It wasn’t until 2006, when they transitioned to Intel, that Apple finally found their footing.

Since Steve Jobs’ hostile take-over of the Macintosh project in the early 80s, Apple’s philosophy on computing has been fairly “closed.” Jobs envisioned the computer as an appliance for average people, not a tinker toy for nerds. In the original Mac, this meant unusual screws and an absence of hardware expansion slots. On the iPhone, it meant a “walled garden” where only Apple-approved apps from the Apple-hosted App Store could be installed (unless you were willing to do some serious hacking).

It took a long time to prove this philosophy out — it was almost a full generation before non-nerds were doing most of the computer shopping. But in many ways it paid off. Macs have a reputation of being stable, reliable machines, and iPhones are the mobile device most people want to own. iOS really represents the logical outcome of Apple’s trend toward locking things down: it’s an operating system that users aren’t supposed to know anything about, on hardware that customers aren’t supposed to be able to open.

On the Mac, though, there’s always been another layer: under the simple, friendly veneer of the user interface is a powerful Unix shell. And under the sleek case is fairly standard, commodity hardware. The implication of this for the Mac is that, despite Apple’s attempts to end its life prematurely, people with a little know-how can keep their Macs running for years. Unlike phones, where people feel compelled (either by fashion trends or security concerns) to buy a new one every couple of years (don’t do it!), an Intel Mac can last a decade or more as a useful, performant machine. Obviously this is a problem for a company that primarily sells hardware…

Case in point: this is being written on a 12-year-old Mac that Apple tried to stop updating in 2016.

Apple zealots will tell you that the move to ARM will let Apple build smaller, faster machines with better battery life. They’re not wrong — ARM rocks for mobility. What they won’t admit is that the move away from commodity hardware will let Apple control the lifecycle of these new computers the same way they intentionally keep the lifecycle of their phones shorter than necessary:

  • With an Intel-based hardware platform, upgrades made for Windows PCs mostly “just work” in a Mac
  • With an Intel-based hardware platform, many parts can be sourced from other manufacturers to provide for repairs that Apple will no longer supply
  • With an Intel-based hardware platform, users can boot Windows (or Linux) to run software that isn’t compatible with “older” Macs
  • With an Intel-based hardware platform, the developer community can create patches to circumvent artificial end-of-life moves from Apple designed to keep you from upgrading to the newest MacOS

It remains to be seen whether the heroic hackers of the world will be able to bring these benefits to new ARM-based Macs, but if Apple’s plan is to make Macs more like iPhones (which it evidently is), you can bet they won’t help us.

The move from PowerPC to Intel was a painful one for the Mac community. Software we owned stopped working, or had to be run through short-lived and poorly performing compatibility tools. Then there was the swallowing of our pride as we collectively had to admit that Intel really did outperform the G4s and G5s we were so proud of. But ultimately, the benefits for consumers outweighed the costs: it was the right move. Arguably, the move to ARM is significantly less urgent — granted, Intel’s track record over the past few years hasn’t been great, but they’re still putting out decent performance at a reasonable price point. Besides, the average Mac user doesn’t care what kind of silicon they’re running on — and they shouldn’t need to. But they should care if a company is deliberately steering them toward a platform of aggressive planned obsolescence and a treadmill of re-buying things they don’t really need.

I’ve put more than two dozen used Intel iMacs and MacBooks back into service for churches, students, teachers and missionaries — all well past the date Apple would like them to be running, and all stable, reliable and with half of them running Windows 10 at least part of the time. They’re really great machines, and I mourn the end of this era. Maybe Apple’s new products will be better than I think; I’m sure they’ll be sexy pieces of hardware. I just hope they don’t become sexy pieces of garbage in a couple years…

Air Hockey Scoreboard Project

Last year, our church got a donation of an air hockey table, for use by the youth group. It was in nice shape, save for the electronic scoreboard, which wouldn’t keep score. It would light up and make sound when powered, but the score was stuck at “88” to “88.” We resolved to see if we could fix it.

The electronics were wrapped in a plastic banner that crossed the middle of the table, and had only a few wires running to it — one from a puck “catcher” on each side of the table, with a simple switch that was pressed when the puck was present, and one modified ethernet cable running to a small control panel that let you start a timer, or reset the game. Our initial hope was that it would just be a wiring problem, that Ben and I could fix together. It turned out to be much more complicated.

All the wiring was fine, and via various traces, ended up connected to a small logic board — this turned out to be the cause of the problem: it was dead. The board had a part number, but no amount of Internet searching could find a source for a replacement part. We theorized that the actual logic being performed was fairly simplistic, and that we might be able to replicate it with a Raspberry Pi Zero — roughly the same size, and equipped with sufficient GPIO pins to map to the existing wiring. Connecting the puck catchers was easy, and with a little Python code, we could count score and show it on a connected SSH Terminal. The control panel was a little more difficult, but we managed to find a solution for some of its basic functions. The remaining problem was the seven-segment LEDs that actually show the score.
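The score-keeping logic itself was simple enough to sketch in a few lines. What follows is an illustrative reconstruction, not our actual project code: on the Pi, RPi.GPIO event callbacks wired to the two puck-catcher switches would call `goal()`, but the hardware is left out here so the logic stands on its own (all names are invented for the sketch).

```python
class ScoreBoard:
    """Sketch of the air hockey score-keeping logic; GPIO wiring omitted."""

    def __init__(self, winning_score=7):
        self.scores = {"left": 0, "right": 0}
        self.winning_score = winning_score

    def goal(self, catcher_side):
        # A puck landing in the left catcher is a point for the right player.
        scorer = "right" if catcher_side == "left" else "left"
        self.scores[scorer] += 1
        return self.winner()

    def winner(self):
        # Return the winning player's side, or None if the game isn't over.
        for player, points in self.scores.items():
            if points >= self.winning_score:
                return player
        return None

    def __str__(self):
        # The "88 : 88" style readout we echoed to the SSH terminal.
        return f"{self.scores['left']:02d} : {self.scores['right']:02d}"
```

On the real table, something like RPi.GPIO’s `add_event_detect()` would register one callback per puck-catcher switch, each calling `goal()` for its side.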

For those we called in some help from an engineering student we know, and managed to come up with the logic to light the LEDs by reverse engineering an array of bit values that could be toggled via the Pi’s GPIO. In theory it was going to be possible to restore the displays, and 90% of the functionality of the system. In practice, it didn’t work out that way.
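The heart of that reverse engineering was just a table of bit values, one per digit. Something like this sketch — the bit order shown is the common `gfedcba` convention, and the actual order depends entirely on how the segments are wired, so treat the values as assumptions:

```python
# Standard seven-segment encodings, bit 0 = segment a through bit 6 = segment g.
# Our board's wiring may have differed; this table is illustrative.
SEGMENTS = {
    0: 0b0111111, 1: 0b0000110, 2: 0b1011011, 3: 0b1001111,
    4: 0b1100110, 5: 0b1101101, 6: 0b1111101, 7: 0b0000111,
    8: 0b1111111, 9: 0b1101111,
}

def segment_pins(digit):
    """Return which of the 7 segment lines (0=a .. 6=g) to drive high."""
    bits = SEGMENTS[digit]
    return [seg for seg in range(7) if bits & (1 << seg)]
```

On the Pi, each entry in the returned list would map to one GPIO pin feeding one segment wire.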

The seven-segment displays work in pairs-of-pairs: two digits for each player, and double that to mirror the value on the other side of the board. Four displays with 8 wires each makes for 32 tiny wires that needed to be run. Only half of those needed to go to the GPIO, since the other half were just mirrored values, but that’s still a lot of wiring — turns out there’s a reason most electronics use a printed circuit board. I briefly entertained designing such a board, and paying to have it printed, but we were still going to run into power problems: once the displays were running, very little current remained from the little Pi for the blinking lights and buzzers that make the experience fun.

After literally months of soldering, brainstorming, and frequently ignoring the now very-messy project out of frustration, we decided to abandon the banner, and go all in on a Raspberry Pi 3B+ with an add-on display. The logic still worked fine for score-keeping, although I had to come up with a new routine for displaying the score in ASCII characters that filled the screen. The girls each composed a little ditty that gets bleated out by the buzzer when someone wins the game, and Ben designed a mounting shim and 3D printed it. We removed one of the two support poles for the scoreboard, and mounted the Pi to the other — neatly running the wires up the pole. I used plastic cement to attach a reset button and the buzzer to the side of the Pi’s case, providing the key features of the original control panel.
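The full-screen score routine amounted to a tiny bitmap font blown up with `#` characters. Here’s a toy version of the idea (the real routine filled the attached display and looked considerably nicer):

```python
# A 3x5 "font" for score digits, rendered with '#' characters.
FONT = {
    "0": ("###", "# #", "# #", "# #", "###"),
    "1": ("  #", "  #", "  #", "  #", "  #"),
    "2": ("###", "  #", "###", "#  ", "###"),
    "3": ("###", "  #", "###", "  #", "###"),
    "4": ("# #", "# #", "###", "  #", "  #"),
    "5": ("###", "#  ", "###", "  #", "###"),
    "6": ("###", "#  ", "###", "# #", "###"),
    "7": ("###", "  #", "  #", "  #", "  #"),
    "8": ("###", "# #", "###", "# #", "###"),
    "9": ("###", "# #", "###", "  #", "###"),
    ":": (" ", "#", " ", "#", " "),
}

def render(score_text):
    """Render a score string like '3:1' as big ASCII characters."""
    rows = []
    for r in range(5):  # every glyph is 5 rows tall
        rows.append("  ".join(FONT[ch][r] for ch in score_text))
    return "\n".join(rows)
```

Scaling each `#` up by a factor (or swapping in a real font) is all it takes to fill a screen.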

After a successful beta test, we refined the design and improved the crude graphics a little, then installed our new system in the air hockey table using most of the original wiring. It’s not perfect, but it’s quite elegant — and I was pleased by how much we all learned putting it together.

Hobby Horse

For the record, I was 39 years old before I had a hobby.

I mean, I do things outside of work and school, but I’d never stuck with any of them long enough to move from amateur to hobbyist. Then, when I got around to picking out a hobby (or maybe it picked me), it ended up looking a lot like my profession.

It’s not, though. It’s technical, but there’s no way I’ll ever get paid to do it. It’s nerdy, but not in a way that has any commercial value. And it’s geeky, but not the kind of geeky that redeems itself. And it took 39 years and moving to rural Ohio before I actually had the spare time.

If you want to read about it, there’s a new section of the website and a separate RSS Feed: the Restoration Museum. For everyone else, normal posts will remain in this category.

That Time I Talked to Apple’s Co-Founder

In the fall of 2000, I signed up for a fledgling online auction site called eBay. I wanted to find a relatively obscure piece of Apple Computer kit I’d always wanted, called a Newton MessagePad. I didn’t quite understand how eBay worked, so I offered the maximum I’d be willing to pay on 6 different listings… it was probably a full hour before I realised I’d just committed to buying 6 Newtons! Fortunately, I was out-bid on 5 of them, and only had to pay for one.

Nonetheless, I was the proud owner of a Newton MessagePad 120 — proud, that is, until I learned about the MessagePad 2100. The grandfather of portable computing, killed off in its prime by Steve Jobs upon his return to Apple Computer in 1997, the Newton remains an audacious and ambitious piece of computing history.

In 2002, after saving up, I managed to get my hands on an upgraded MessagePad 2000 and began my first experiments with wireless networking and different kinds of after-the-fact hacks and expansions to the long-dead platform. An impressive community of hobbyists had sprung up to keep the Newton alive, adding Bluetooth, Wi-Fi, MP3 playback and web surfing. It may have been my first experience in coaxing new usefulness out of abandoned hardware.

I didn’t do much for the community, but I did talk about it a lot — on this very home grown website, and other early-Internet forums. Enough, I guess, that a writer for Wired Magazine found me and scheduled an interview for an upcoming article in his series about the culture of Apple fans. That article appeared a couple months later, and you can still find it if you search the right keywords.

18 years later, that article got me invited to speak at a Worldwide Online Newton Users Conference. Turns out there’s still interest in the little green machine, and more than 70 nerds gathered online to share their recent hacks, collections and uses for the Newt. Among the participants were some of the original Newton dev team, a well-known tech journalist, and the remaining co-founder of Apple Computer, Steve Wozniak.

Steve was mostly a silent observer — in fact, at first we weren’t quite sure it was really him. At the outset, I challenged the participant bearing his moniker to turn his camera on and prove it. I’m sure we were all delighted when the real deal himself appeared and shared his memories of Newton. He receded back into silence until we had a break. As other participants shut their cameras off to attend to biological needs, I decided to go for broke:

“Is Woz still on?” I asked.

A couple seconds of silence…

“Yup! I’m here! I’ve been here listening the whole time!”

“Would you be willing to take a few questions?”

“Absolutely!” said the fabled millionaire, as his camera sprang back to life.

He held court with us for 20 minutes. I asked a series of off-the-cuff questions to start the impromptu interview, mostly about nerdy things, but we also talked about teaching kids computers, Covid-19, and travel. After a few minutes I yielded the floor so other participants could pile on. It ended too quickly and Woz remained a silent participant for the rest of the event, but it sure was cool! He’s remarkably down to earth — just one of the nerds, who likes experimenting with technology and talking about his passions. In fact, that’s how Apple started.

The slides for my little talk are here.

They’re mostly just memories — as this event itself will be in a couple of years. But don’t ever doubt the power of technology — and community — to have an impact on people’s lives. The Newton community made a documentary on just that, and it’s worth watching.

How to Read the News Online

This post is probably long overdue. I’m guilty myself of scrolling through Google News and letting an algorithm decide what I should see. But now, more than ever, it’s important to get the best information possible. Here’s my attempt to provide some tips to escape the echo chamber, see past ideological spin, and find better sources of information online.

I should start with the caveat that, of course, this isn’t perfect. But it’s preferable to the norm…

App and website developers build for “stickiness” — that’s a primary goal. The longer they can keep you inside their experience, the more you are worth to them. That worth is often in advertising dollars, but it’s always in data: user and behavior information that lets providers create better personas (digital “voodoo dolls”) of their audiences. To restate that more clearly: the main goal of your favorite news app or website is not to inform you — it’s to make money off you.

With this in mind, it’s easy to understand how content is created and prioritized. Content creators want to develop content that is interesting to their audiences. Content selection algorithms want to provide content that resonates with you — even when that’s not good for you. The “news” system is designed to affirm your biases, and reinforce the beliefs that brought you there.

Even information aggregators like Facebook, YouTube and Twitter are running algorithms trying to find what you like and give it to you. They’re everywhere, and they’re cloyingly sycophantic. About once a day Google News offers me a bikini pic of a celebrity alongside other headlines — they know I’m an adult male, and they’re sure I want to see that content. All it takes is one tap to confirm that interest, and tip the algorithm toward more of it.

So if you’re ready to escape the fun house mirror that is Internet news, here’s what to do:

  1. Dump your current news app or go-to website. Google News, Apple News, MSN News, Fox News, CNN… whatever you use, it’s all the same. I’m not even talking about network bias yet, I’m just talking about algorithm-driven content providers. They’ve all got to go.
  2. Identify raw sources. In the US almost all news comes from the Associated Press first. Each network gets those stories, and puts their own ideological spin on that news. Skip the spin, and find the source: AP, and Reuters are both good for North America.
  3. Identify alternative sources. I’m not talking about fringe sites with extreme beliefs, I’m talking about a source of news that is further removed from the reach of your country’s political parties. In the US, the BBC or the CBC are reasonably impartial observers of what’s happening in your country. Find world news sources that aren’t reported from within your country — you’ll still get the big news items, but the context will be improved.
  4. Once you’ve selected better news sources, find their RSS feeds. OK, I know that sounds like techno-babble, so let’s break out of the numbered list and explain…

RSS stands for Really Simple Syndication (or Rich Site Summary), and it’s been a backing technology for the web since 1999. If you listen to podcasts, you use it regularly. An RSS feed is just the content from a site — none of the ads, none of the tracking technology, and none of the algorithms. Just the raw content.
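To make “just the raw content” concrete, here’s what consuming a feed looks like with nothing but Python’s standard library. The sample feed is made up for illustration; a real one would come from the news source’s feed URL:

```python
import xml.etree.ElementTree as ET

# An invented two-item feed, standing in for what a real news source serves.
SAMPLE = """<rss version="2.0"><channel>
  <title>Example Feed</title>
  <item><title>First headline</title><link>https://example.com/1</link></item>
  <item><title>Second headline</title><link>https://example.com/2</link></item>
</channel></rss>"""

def headlines(rss_xml):
    """Return (title, link) pairs for every item in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]
```

That’s the whole transaction: titles and links, with no scripts, trackers, or ad slots along for the ride.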

Increasingly, sites are hiding or obscuring their RSS feeds, because they want you on their site in your browser or on their app, so they can track you. But so far, no one has succeeded in removing it entirely. If you’re technically inclined, you can use tools in your browser to find the feed URL, but if not, there are easier ways to get it.
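For the technically inclined, the browser trick can even be automated: most sites still declare their feed in a `<link rel="alternate">` tag in the page’s head. A small sketch using only Python’s standard library (the sample page is invented):

```python
from html.parser import HTMLParser

# MIME types that mark a link as a syndication feed.
FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

class FeedFinder(HTMLParser):
    """Collect feed URLs declared in <link rel="alternate"> tags."""

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" \
                and a.get("type") in FEED_TYPES:
            self.feeds.append(a.get("href"))

finder = FeedFinder()
finder.feed('<html><head><link rel="alternate" '
            'type="application/rss+xml" href="/feed/"></head></html>')
```

Feed the parser a page’s HTML and `finder.feeds` holds whatever feed URLs the site declared.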

I use a service called InoReader. They have a pro version, but the free one has everything you need to search for RSS feeds from the news sources you trust. Once you create an account in InoReader, you can add your selected news sources directly. The content is sucked out of the site via RSS — in aggregate, anonymously and automatically — then made available at the InoReader website or on the InoReader app on your phone or tablet, in a neatly organized fashion. It’s a curated news stream that breaks the algorithms that taint the information you’re getting.

InoReader’s RSS Based News Feed

Like I said, it’s not perfect. InoReader knows what you’re reading — but because it serves raw feeds, it can’t alter them without detection (you can always look at the RSS directly to see if they’ve been changed; in 4 years of monitoring, I’ve never seen it happen). Another challenge is that sometimes news sites only publish the first sentence or two to their RSS feed, and you have to click through to their website to read the whole article — but when you do, you can visit as a signed-out, anonymous reader (there are other work-arounds for those comfortable with deploying a little open source software). And of course, your critical thinking skills are always needed for any media you consume.

But even with the challenges, and the little bit of extra work it takes to make good selections, the difference is night-and-day. Do this for a while, then compare the real headline with the liberal and conservative spin carried by other sources, and you’ll realize just how bad things are.

The dangers of the filter-bubble are real, and the increasing polarization in the US (and Canada too!) is a very real result. If you’re going to use technology, you should use it responsibly. The onus is on you to consume information that challenges your beliefs, educates you, and makes you more empathetic toward people who are different than you. Popular “news” technology does the opposite.

Getting to Know Your Digital Voodoo Doll

If the Cambridge Analytica scandal told us one thing, it’s how poorly people understand how their data is being used. Although the folks at CA may not have had the most altruistic of intentions, they were really only exploiting what was freely available. That they used some data Facebook didn’t intend them to use doesn’t change the fact that the data was there for the taking. People volunteered it willingly, so it was inevitable that it would be put to use.

What is probably less clear in this tale of targeting was that they weren’t really targeting you or me. Rather, the technology allowed them to identify what kind of people we are like, and target people of that kind. This aggregate group identity makes up a persona — a fictional person that has traits and attributes, gathered from the self-provided data of real people, that are useful for addressing many actual individuals that are similar to that persona.

This is not new. In fact, in programming, type inheritance is a powerful concept that is useful for generalization. What’s new in the last decade or so is the volume of self-identified human data, and a few primary keys that allow that data to be associated with the unique individuals who provided it. Lots of websites have data on you as a mostly anonymous visitor. There’s identifying information, for sure, but nothing you deliberately confirm or set up, so it’s a “weak link.” When a website requires you to create an account, then they truly have uniquely identifying information for tracking you within the properties that account uses. Facebook is mostly unprecedented because of the scope of that account. As an identifier, it’s used far beyond the actual Facebook website — it’s used on other Facebook properties (WhatsApp, Instagram) and on millions of partner sites that use Facebook log-in or Facebook data sharing (when you see “Like on Facebook” on a website that is not Facebook, they are sharing data using your identity as a key).

The effect is that activities spanning the web are opted in to Facebook data collection, whether you’re aware of it or not. Suddenly a single primary key has a rich repository of information about billions of individuals. Realistically, it would take an incredible effort to actually target a single individual, but it does become very easy to group individuals based on activity: individuals who “Like” a Republican candidate, individuals who participate in discussions about vaccinations, individuals who view religious videos, etc.

The field of psychographics is the emerging social science of identifying groups based on these common activities, then determining what methods are most effective at influencing the individuals within those groups. Facebook helps out even more, due to a built-in concept called Graph Relationships. These are the links between individuals that can be used to tie people to groups even if those linked individuals provide no explicit data that identifies them as part of the group. You may not have shown any visible interest in a particular political candidate, but if you’re linked to many people who have, you may find yourself targeted as part of that group.

https://www.businessinsider.com/explainer-what-exactly-is-the-social-graph-2012-3
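You can get a feel for how little explicit data this kind of graph inference needs with a toy example. The names and interests below are invented, and real psychographic systems are vastly more sophisticated — this just shows the shape of the idea: label a person by what their connections declared.

```python
from collections import Counter

def infer_group(person, friends_of, declared):
    """Guess a group label for `person` from their connections' declared labels."""
    votes = Counter(declared[f] for f in friends_of[person] if f in declared)
    return votes.most_common(1)[0][0] if votes else None

# Invented data: "pat" declares nothing, but two of three connections do.
friends_of = {"pat": ["alex", "sam", "jo"]}
declared = {"alex": "candidate-X", "sam": "candidate-X", "jo": "gardening"}
```

Here `infer_group("pat", friends_of, declared)` tags pat as a likely “candidate-X” supporter, even though pat never said a word about politics.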

This self-identification increases with your social network, and with your activity. If you’ve seen ads for something you recently thought about (but could swear you didn’t write down or say out loud) the odds are good that you’ve been targeted based on your activities or affiliations, and advertisers “knew” you would be interested in that product or service, because other people like you are interested in it.

I recently saw this concept described as a digital voodoo doll, and the analogy is apt. Advertisers and other influencers aren’t interacting with you directly; instead they’ve created an avatar that is like you, they’ve experimented to determine how best to impact those like you, and then they’ve launched their digital onslaught against the group. When the voodoo doll gets really precise, it’s called micro-targeting, and you really should be scared of it.

So what can you do about it? Well, knowing the importance of identifier keys, you can participate in the web more strategically. It may be easier to sign up for a new service with your Facebook account (keeping track of multiple passwords is hard!), but know that when you do, Facebook gets all that data. Use different keys (new accounts) for different services, to reduce the chance of your activity being linked. You don’t have to quit Facebook entirely, but be careful what you indulge within their scope of view.

On that topic, there are ways to keep fences around that garden. Firefox has an extension that does just that — blocking Facebook tracking on sites not owned by Facebook. The same cautions should apply to any service whose tendrils extend beyond their own .com front-end. Microsoft, Amazon and Google all offer useful developer tools for web creators — in exchange for data collected from those sites. Diversify your digital activity: use different services for different features, and don’t mix and match. For example, Microsoft hosts our email, but not our voice commands. Amazon gets our voice commands through Alexa, but doesn’t store any of our documents. Opt out of data collection when given the choice.

As tech providers find newer, more clever ways to collect data, and the legal framework struggles to keep up, be aware of how you’re inevitably being targeted. Information is neutral — it doesn’t have a bias. Human beings, on the other hand, are biased. If something is presented as information but appeals to your natural bias, question the source — odds are that you’re being manipulated.

The dream of the Internet was that information could be shared instantly and freely with everyone. Those altruistic nerds who invented it may have forgotten that someone has to pay for technology somehow, and perhaps unknowingly, we backed our technology revolution into an ad-supported model. Paying for content that isn’t ad-sponsored tends to inspire a little less subterfuge in the content provider. If you want to learn something new, or engage with a community on a topic, consider private online services — even those that aren’t free, or require a little more work.

There’s no quick fix for Facebook, or Google or even Apple. To make the Internet a better place, its citizens must be aware, involved and active. You can be online without responding to your baser instincts for affirmation or attention, but if you find the dopamine rush too irresistible, you might be better off closing those accounts after all…

Demarc 3.0

Back in the days of land-line phones, your demarc, or demarcation point, was the part of your house where the public utility phone network entered your home. Each outlet in your home connected here in what was called a POTS (Plain Old Telephone Service) network, and connected to one or more lines going out of the house. Frequently this was located near where power entered your home, and later, cable TV. This makes it an excellent point to retro-fit tech into a house that maybe wasn’t designed with nerds in mind.

I know this looks a little crazy, but version 3.0 of my setup is much, much cleaner than it’s ever been. To quote Morpheus, this is the core where we broadcast our pirate signal and hack into the Matrix! This diagram might be a little easier to read:

There’s some really cool stuff in this architecture that I’m pretty proud of. On one hand, it’s a modern 1 Gbps network, with distributed 802.11n WiFi, that can filter out ads and pornography, and support remote connections via VPN. On the other, it can also connect any device from the early 1980s to other devices, or to the Internet.

For the very oldest machines, a Raspberry Pi Zero, running the DreamPi image, connects to our home’s POTS network (long since disconnected from the public phone network), inducing the correct voltage and playing back a dial tone. A Python script on the Pi listens for an old-school modem trying to dial out, plays back the handshake sounds of an ISP, then continues to pretend to be a modem, bridging the device onto our network (and thus the Internet).

For ’90s and 2000s-era Macs, either physical Ethernet or an old AirPort Classic provides an on-ramp onto our network. The AirPort is configured with a whitelist of allowed machine IDs, so that it can run with only WEP security (since that’s the best it can do!). A Performa provides an EtherTalk-to-LocalTalk bridge, and a PhoneNet ring running around the basement networks the earliest of the Apple and Mac computers.

For newer devices, that have always-on Internet connections, another Raspberry Pi runs PiHole DNS, which filters out ads, with OpenDNS upstream, configured to filter adult content. Dubbed the NetPi, it also runs an OpenVPN server, giving us the same safety when we’re away from home. The NetPi, and a little media PC next to it, also host Plex Media servers that share our content with our devices, no matter where we are.

With more of the Internet abandoning HTTP for HTTPS (whether it’s needed or not), and newer SSL cryptography ruling out connections from machines with older cryptography libraries, the NetPi will probably be pressed into service again running an SSL-stripping proxy. I haven’t quite figured out how to do this yet, but I do have an RSS + site-scraper utility running, which means I can still read a lot of content on older devices.

Although this one wall in the house is a little complex, the tech is effectively invisible throughout the rest of the house. Ben and I are working on a Raspberry Pi project using a PowerBook from 1999 as the programming terminal, but the 2019 home theater can also stream 4k content — all without touching or re-configuring anything. I can literally start a document on a Mac Plus, revise it on a Performa, print it from there, or pick it up off a combined AppleTalk/SMB share on the NetPi and publish it to the web from my 2019 Surface Laptop. In fact, I sort of just did…

Update: Squid SSL Bump Proxy running!

Einstein Newton Emulator on Android Oreo through 10

I recently brought my Newton MessagePad 120 back to life — for a brief window of time. It died again after less than 48 hours, but it was fun to play with while it lasted.

In lieu of finding more old hardware, I started playing with the Einstein Emulator. I’ve had it running on my Mac for a while, but since the Newton was portable, it sure would be nice to have the emulator in my pocket.

Unfortunately, Einstein hasn’t been updated in a while and didn’t work on my Pixel 3a, nor would the source build in Android Studio on my Mac. A little hacking at it identified two issues:

  • The project had an undocumented dependency on a tool called ninja. Reported here; running this from the command line resolved it: brew install ninja
  • Android notifications have changed since the project was created. I found how to update the notification, and implemented it as a work-around. I’m not sure it’s 100% backward compatible, so I’ve built and signed an APK of the original code and one with my updated code.

These updated bits, plus the necessary dependencies are assembled here.

Housekeeping – on HTTPS

Related to my previous rant on Internet security, the latest trend is to force a move to HTTPS — the encrypted version of the web’s primary protocol. In my opinion, this is largely silly: it’s security theater, since most scam sites can easily provide a certificate, and it gives browser makers even more leverage over little content developers.

I find it offensive in a different way, too: it breaks compatibility on the Internet. A whole generation of devices that have older versions of SSL, that can’t easily be upgraded, get cut off from today’s web.

There’s a place for HTTPS — namely, anywhere you submit data to a server. I don’t argue the importance of that. But lots of content is just there to be consumed, and the whole transaction with the server is “give me the content.” For a browser to claim that transaction is unsafe, just because the request and response weren’t encrypted, is dumb. It’s perfectly safe to read this website without encryption — and there are millions of sites where that is true.

That said, it irks me to see my own website marked as insecure, so I did what probably every other “little guy” should do, just to keep up with the times, and added an SSL cert for free through Let’s Encrypt. However, my implementation does not break compatibility with older devices: you can still access this site without HTTPS by sending an uncommon user-agent. This will happen automatically if you’re, say, in Netscape Navigator on an old Performa, or visiting from an HP TouchPad. Only if a modern OS is detected will my main site meta-redirect to the HTTPS version, and you can override that through your browser’s Developer Tools. Otherwise, if you visit via HTTP, you’ll see a brief flash while the content reloads over an encrypted connection.
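The gist of that user-agent gate can be sketched in a few lines of Python. This is a hypothetical illustration, not my actual server code, and the token list is invented for the example — the real implementation keys off whatever OS strings I chose to treat as “modern.”

```python
# Hypothetical sketch of the user-agent gate described above: only
# visitors whose user-agent looks like a modern OS get meta-redirected
# to HTTPS; older clients keep plain HTTP. Token list is illustrative.
MODERN_OS_TOKENS = (
    "Windows NT 10",   # recent Windows
    "Mac OS X 10_1",   # matches 10_10 and later underscore-style versions
    "Android",         # Android phones/tablets
    "X11; Linux",      # desktop Linux
)

def should_redirect_to_https(user_agent: str) -> bool:
    """Return True if this visitor should be bounced to the HTTPS site.

    A Netscape Navigator on a Performa, or a webOS TouchPad, matches
    none of the tokens and stays on unencrypted HTTP.
    """
    return any(token in user_agent for token in MODERN_OS_TOKENS)

print(should_redirect_to_https("Mozilla/4.04 (Macintosh; I; 68K)"))  # False
print(should_redirect_to_https(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))                    # True
```

When the function returns True, the server emits the page with a meta-refresh tag pointing at the https:// URL; when it returns False, the plain-HTTP page is served unchanged.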

Utility and classic sub-domains will remain on HTTP until all these young hippies get off my lawn…

Apple 2 Forever…

Although our first family computer, and my first attempt at programming, was an Atari 800XL (for which I collected every peripheral and game I could find), my first computer was a Macintosh 512k — which I rescued from a garbage can outside our church. Its display had collapsed to a thin vertical line, but that didn’t stop me from turning it on, and pretending to type on its keyboard or explore with its mouse. Eventually my parents found someone who could repair it, and it became a useful, slightly more modern family computer. At some point, long after it was obsolete, we traded it in for an also-obsolete Mac Plus, and added a hard drive. After a few years in service, we got a Compaq Presario 486, and the Mac Plus got relegated to storage.

Software was always my main skill set (most attempts at hardware hacking led to cut fingers — I’ve left my blood stains on many a motherboard) and after 20 years in the industry, I no longer feel like too much of an imposter when I call myself a software professional. On hardware, though, I remain a novice — it’s a hobby, not a profession.

I’ve carried that Mac Plus with me from job to job, keeping it set up on my desk, or a bookshelf, to remind me where I started and, on the rough days, how much I love what I do. I fired it up occasionally, but the display was beginning to degrade, trending toward that same thin vertical line. Recently I decided I was ready to try the same repair my parents had funded so many years before. A PDF copy of the Dead Mac Scrolls revealed the secrets that had eluded my 12-year-old self: common failure points in solder joints and weak or aged capacitors made for an accomplishable project. With a healthy respect for high voltages, a few YouTube tutorials, and more than a little trepidation, I put the old Mac Plus under the knife, and restored it.

Shortly afterward, I got a handful of other dead Macs, and found there was something of a market for lovingly restored vintage machines. I managed to repair, clean, and flip another Mac Plus in beautiful platinum gray, a Mac SE, and an original 128k. I did not turn a profit, but I did manage to almost break even. In trade for one of those, I was given a couple of other retro gems.

The Apple //c was the 10-year-old computer my dad had in his classroom in Germany in the mid-90s, and the Apple ][gs was the last of the Apple 2 line-up, and something of a unicorn that I never really had the chance to play with. The C lacks a power supply and may need some other repairs, but the GS booted up, and I couldn’t resist the challenge of figuring out how to connect it to my home network. Here’s the Mac Plus and the IIgs talking to a range of newer devices — including a very new Raspberry Pi.

Here’s what was needed to pull that off:

  • LocalTalk PhoneNet is an adaptation of Apple’s old serial networking protocol, expanding its range using 4-pin phone cabling — which was cheap and common at the time. I ringed the basement rec room with phone line to connect my Mac Plus, so adding an extension to the IIGS was easy.
  • The LocalTalk Bridge control panel was an unsupported Apple offering that allowed mid-90s Macs with a serial port and an Ethernet port to connect LocalTalk to EtherTalk. Technically both these networks are AppleTalk, with different names for the different connection types. A middling Macintosh Performa serves bridge duty.
  • A Raspberry Pi running a modified Netatalk install, thanks to the A2SERVER installer (and a lot of tinkering), talks AppleTalk over WiFi and is reachable by the bridge, providing a modern file share for very old computers. The topology looks like this:

I’ll do a full write-up and post it on our vintage-computer-friendly companion site: http://classic.jonandnic.com for those who want more details.