Maintainers of Lightning – Part 2

When I was a young computer nerd, back in the early days of a consumer-friendly Internet, formal computer education in high school was limited to whatever rudimentary understanding our middle-aged teachers could piece together for themselves to pass on. But there was one tool that taught me more about web development than any book or class or teacher ever could: View > Source.

View Source menu

You can do it now, although the output is significantly less meaningful or helpful on the modern web. Dig around your browser’s menus and it’s still there somewhere: the opportunity to examine the source code of this and other websites.

It was from these menus that I learned my first simple tricks in HTML, like the infamous blink tag, and later the more complex capabilities offered through a simple programming language called JavaScript that increasingly became part of the Web.

JavaScript and the web’s DOM (Document Object Model) taught me Object Oriented Programming when my “college level” computer science course was still teaching procedural BASIC. Object Oriented Programming, and the web as a development platform, have defined my career for the 25 years since. I owe an unpayable debt to the View > Source menu. It was the equivalent of “popping the hood” on a car, and it allowed a generation to learn how things work. Your iPhone doesn’t do that – in fact, today’s Apple would prefer that people stop trying to open their hardware or software platforms at all.

This, I assert, is why people fell in love with the Apple II, and why it endures after 40 years. It was a solid consumer device that could be used without needing to be built or tinkered with, but its removable top invited you to explore. “Look at the amazing things this computer can do… then peek inside and learn how!”

As soon as our son was old enough to hold a screwdriver, he was taking things apart to look inside. I’m told I was the same way, but as parents Nicole and I had a plan: we’d take our kid to Goodwill and let him pick out junk to disassemble — rather than working electronics from around the home! I consider it a tragedy that some people outgrow such curiosity. Dr. Buggie has not – in fact, in my brief conversation with him, I think curiosity might be one of his defining characteristics…

Have PhD, Will Travel

Sometime in the early ’70s, Stephen Buggie completed his PhD in Psychology at the University of Oregon. Finding a discouraging job market in America, Buggie decided to seek his fortunes elsewhere. The timeline from there is a little confusing: he spent 12 years in Africa, teaching in both Zambia during its civil war (he vividly recalls military helicopters landing beside his house) and Malawi. He was in America for a sabbatical in 1987, where he saw computers being used in a classroom, and at some point in there he taught at the University of the Americas in Mexico, which was an early adopter of the Apple II for educational use. His two sons were born in Africa, and despite their skin color, their dual citizenship makes them African Americans – a point he was proud to make. He taught for five years in South Carolina before joining the University of New Mexico on a tenure track. He maintains an office there as a Professor Emeritus (although he rarely visits — especially in the age of COVID) and still has his marking program and years of class records on 5.25” floppy disks.

Apple II Classroom Computers – Seattle’s Living Computer Museum

Buggie found community amongst Apple II users in the ’80s and ’90s. Before the World Wide Web there were trade magazines, Bulletin Board Systems and mail-in catalogs. Although there was no rating system for buyers and sellers like eBay has today, good people developed a reputation for their knowledge or products. As the average computer user moved to newer platforms, the community got smaller and more tight-knit. Buggie found that a key weakness in maintaining older hardware was the power supply, so he scavenged for parts that were compatible, or could be modified to be compatible, and bought out a huge stock – one he is still working through decades later, as he ships units to individual hobbyists and restorers. His class schedule made it difficult for him to go to events, but he did finally make it to KansasFest in recent years, and still talks excitedly about the fun of those in-person interactions, and the value he got from trading stories, ideas and parts with other users.

To be certain, Buggie’s professional skills are in the psychology classroom. He doesn’t consider himself to be any sort of computer expert — he’s more like a pro user who taught himself the necessary skills to maintain his tools. And maybe this is the second theme that emerged in our conversation: like the virtue of curiosity, there is an inherent value in durability. This isn’t commentary on the disposable nature of consumer products so much as it is on the attitudes of consumers.

We shape our tools, and thereafter our tools shape us.

My dad worked with wood my whole life. Sometimes as a profession, when he taught the shop program at a school in town. Sometimes as a hobby, when he built himself a canoe. Sometimes out of necessity, when my parents restored our old house to help finance their missionary work (and the raising of three kids.) His tools were rarely brand new, but they were always well organized and maintained. He instilled in me a respect for the craft and in the artifacts of its execution.

Computers are tools too – multi-purpose ones that can be dedicated to many creative works. When we treat them like appliances, or use them up like consumable goods, the disrespect is to ourselves, as well as our craft. Our lack of care and understanding, our inability to maintain our tools beyond a year or two, renders us stupid and helpless. We cease to be users and we become the used – by hardware vendors who just want us to buy the latest versions, or credit card companies that want to keep us in debt, or by some imagined social status competition…

In those like Dr. Buggie who stubbornly refuse to replace a computer simply because it’s old, I see both nostalgia and virtue.

The nostalgia isn’t always positive – I’ve seen many Facebook posts from “collectors” who have storage rooms full of stacked, dead Macintosh computers that they compulsively buy despite having no use for them, perhaps desperate to hold on to some prior stage of life, or to claim a part in the computing revolution of yesteryear that really did “change the world” in ways we’re still coming to grips with. There is a certain strangeness to this hobby…

But there is also virtue in preserving those sparks of curiosity, those objects that first invited us to learn more, or to create, or to share. And there is virtue in respecting a good tool, in teaching the next generation to steward well the things we use to propel and shape our passions or careers or the homes we build for our families.

Fading Lightning

I joke with my wife that our rec room is like a palliative care facility for dying electronics: my 1992 Laserdisc player is connected to a 2020-model flatscreen TV and still regularly spins movies for us. I still build apps for a mobile platform that was discontinued in 2011 using a laptop sold in 2008. In fact, these two posts about the Apple II were written on a PowerBook G3 from 2001. Old things aren’t necessarily useless things… and in fact, in many ways they provide more possibility than the increasingly closed computing platforms of the modern era.

In an age where companies sue customers who attempt to repair their own products, there is something to be said for an old computer that still invites you to pop the top off and peer inside. It says that maybe the technology belongs to the user – and not the other way around. It says that maybe those objects that first inspired us to figure out how things work are worthwhile monuments that can remind us to keep learning – even after we retire. It says that maybe the lessons of the past are there to help us teach our kids to shape their world, and not to be shaped by it…

Thanks for visiting! If you missed Part 1, check it out here!

Maintainers of Lightning – Part 1

Of the subset of people who are into computers, there is a smaller subset (although bigger than you might think) that is into vintage computers. Of that subset, there’s a significant proportion with a specific affinity for old Apple hardware. Of that subset, there’s a smaller subset that is into restoring and preserving said hardware. For that group of people, something like a Lisa (the immediate precursor to the Macintosh) or a NeXT Cube (Steve Jobs’ follow-up to the Macintosh, after he was kicked out of Apple) holds special value. Something like an original Apple I board holds practically infinite value.

If you are in this sub-sub-sub-group of people, then you have undoubtedly come across the eBay auctions of one Dr. Stephen Buggie. You’ll know you’ve purchased something from him, because whether it’s as big as a power supply or as small as a single key from a keyboard, it comes packed with memorabilia of either the Apple II, or his other hobby: hunting for radioactive material in the New Mexico desert. Hand-written, photocopied, or on a floppy disk or DVD, he always includes these small gifts of obscura.

Last year, after the fifth or sixth time one of his eBay auctions helped me rescue an Apple II machine I was restoring, I reached out to Dr. Buggie and asked him if I could interview him for my blog. His online presence outside of eBay is relatively small. You can find him on YouTube, presenting at KansasFest, an annual meeting of sub-sub-sub-group nerds entirely focused on the Apple II line-up of computers. Not all of these folks are into restoration. Some are creating brand new inventions that extend the useful life of this 40+ year-old hardware. Some have been using the hardware continuously for 40+ years, and are still living the early PC wars, full of upstarts like Atari, Commodore and Amiga, gearing up for battle against IBM. Many still see the Macintosh as a usurper, and pledge more allegiance to the machine that Wozniak wrought than to the platform that begot the MacBook, iPhone and other iJewelry that all the cool kids line up for today. If you search really hard, you may even find references to Dr. Buggie’s actual, although former, professional life: that of a world-traveling psychology professor. What you won’t find is any sort of social media presence — Facebook and Twitter don’t work on an Apple II, so Dr. Buggie doesn’t have much use for them.

The 5 Whys

What I wanted to know most from Buggie was: why the Apple II? Why, after so many generations of computer technology have come and gone, is a retired professor still fixing up, using, and sourcing parts for a long-dead computer platform? I don’t think I got a straight answer to that: he likes bringing the Apple //c camping, because the portable form-factor and external power supply make it easy to connect and use in his camper. He likes the ][e because he used it in his classroom for decades and has specialized software for teaching and research. But as for why Apple II at all… well, mostly his reasons are inexplicable — at least in the face of the upgrade treadmill that Intel, Apple and the rest of the industry have conspired to set us all on. He uses his wife’s Dell to post things on eBay, and while still connected to the University, he acquiesced to having a more modern iMac installed when the I.T. department told him they would no longer support his Apple II. But if it were up to him, all of Dr. Buggie’s computing would probably happen on an Apple II…

OK, But What is an Apple II?

It’s probably worth pausing for a little history lesson, for the uninitiated. The Apple II was the second computer shipped by Steve Jobs and Steve Wozniak in the late ’70s. Their first computer, the Apple I (or just Apple), was a bare logic board intended for the hobbyist. It had everything you needed to make your own computer… except a case, power supply, keyboard, storage device or display! The Apple II changed everything. When it hit the market as a consumer-friendly device, complete with a well-made plastic case, an integrated power supply and keyboard, and the ability to hook up to any household TV set, it was a revolution. With very little room for debate, most would agree that the Apple II was the first real personal computer.

Apple I, with aftermarket “accessories” (like a power supply!)

As with any technology, there were multiple iterations improving on the original. But each retained a reasonable degree of compatibility with previous hardware and software, establishing the Apple II as a platform with serious longevity – lasting more than a decade of shipping product. As it aged, Apple understood it would eventually need a successor, and embarked on multiple projects to accomplish that goal. The Apple III flopped, and Steve Jobs’ first pet project, the graphical-UI-based Lisa, was far too expensive for the mainstream. It wasn’t until Jobs got hold of a skunkworks project led by Jef Raskin that the Macintosh as we know it was born. By then, the Apple II was an institution: an aging but solid bedrock of personal computing.

The swan song of the Apple II platform was called the IIGS (the letters stood for Graphics and Sound), which supported most of the original Apple II software line-up, but introduced a GUI based on the Macintosh OS — except in color! The GS was packed with typical Wozniak wizardry, relying less on brute computing power (since Jobs insisted its clock speed be limited to avoid competing with the Macintosh) and more on clever tricks to coax almost magical capabilities out of the hardware (just try using a GS without its companion monitor, and you’ll see how much finesse went into making the hardware shine!)

But architecture, implementation, and company politics aside, there was one very big difference between the Apple II and the Macintosh. And although he may not say it outright, I believe it’s this difference that kept the flame alive for Stephen Buggie, and so many other like-minded nerds.

Image stolen from someone named Todd. I bet he and Dr. Buggie would get along famously.

That key difference? You could pop the top off.

I’ll cover why that’s important — both then and now — and more about my chat with Dr. Buggie in Part 2.

Install Microsoft Store Apps on Windows 10 LTSC

Switching my home computers to LTSC was the single best decision I’ve made for the health of my network — and for my sanity. If you don’t need all the latest bells and whistles — and, more importantly, if you’re driven insane by the constant feature updates that are often more painful to install than they’re worth — give serious thought to getting hold of LTSC. In particular, on my Mac Pro, every Windows feature update was a battle to keep it stable. The only feature I care about is that it starts up when I need it.

However, there is one (dubious) downside: LTSC does not include the Microsoft Store for getting access to apps. Lots of apps are available from alternate channels, but occasionally there’s one — like Microsoft’s To Do app — that’s only available in their app store. But don’t panic: you can still get these apps (assuming they’re free). It just takes a little more work…

Update: A reader made a comment below that provides an alternate approach — I haven’t tried it yet, but it’s a fantastic idea, so I’m adding it here. Thanks, Oliver!

Another option is to install the MS Store app itself. I used this scripted command on GitHub: https://github.com/channel168/LTSC-Add-MicrosoftStore#readme

Step 1 – Find the App on the Web

You’ll need a link to the app from a Microsoft website that has a “Get” button. Here’s a direct link to the Productivity section of the web-based version of the Microsoft Store:

https://www.microsoft.com/en-us/store/top-free/apps/pc?category=Productivity

The Web version of the Microsoft Store will provide the info you need to get the app without the store

Step 2 – Copy that Link

Grab the URL from your browser’s address bar. For Microsoft To Do, it looks like this:

https://www.microsoft.com/en-us/p/microsoft-to-do-lists-tasks-reminders/9nblggh5r558

Step 3 – Extract Download Links

Paste the URL into this site: https://store.rg-adguard.net/ and hit the checkmark

AdGuard’s handy alternate store will tell you what packages are associated with the App you want

If all goes well, you’ll get a list of one or more packages. For To Do, there were 5 packages that I needed — but the results had almost 10 times that. Don’t worry, you won’t need them all. Start with the .Appx or .Appxbundle that looks like the app you want. Note: from the possible choices, you’ll want the latest version number and the correct processor architecture. If you’re running 64-bit Windows, you’ll want the x64 version of the app. Download it somewhere memorable.

Step 4 – Open up PowerShell

The command to install is a simple one:

Add-AppxPackage -Path "path-to-appx-you-downloaded"

When you run it, you’re likely to get a scary red error dump. Inside that message are helpful tips telling you that the app needs one of the other files from Step 3. Go get it — again paying attention to version number and processor architecture.

The PowerShell install error will tell you what dependency package to get next

Step 5 – Repeat

Run the Add-AppxPackage command on each download, then re-try Step 4. Each time it will tell you about another file it’s missing. If you’re feeling confident, you can guess ahead — but to be on the safe side, go file-by-file, grabbing exactly the one it complains about after each attempt at Step 4.

Eventually you’ll have all the dependencies installed, and the app you wanted will actually install — assuming it’s compatible with your version of Windows, you can now use it like normal!

I’ve got lots more ideas for keeping old systems alive — read more of them here!

Managing Social Media: Google

The company that started with the motto “Don’t Be Evil” has spent the last decade or so flirting with ideas that are awfully close to evil. That doesn’t mean that the organization is bad — any more than a hangnail means a human being is dying — but Google sure could use a pair of nail clippers.

When Gmail first came out, I was ecstatic to get an invite. They were transparent about the trade-off at the time, and we all accepted it as reasonable: Google has automated systems that read your mail so that they can personalize advertisements to your interests. If you send an email to someone about how you burnt your toast that morning, seeing ads for toasters in the afternoon seemed fairly innocuous — even a little amusing! At the time, though, Google’s coverage of your digital life was just search results. Adding e-mail felt natural and not really that intrusive.

Fast forward to today, and what Google knows about you is downright terrifying. They don’t just know where you go on the Internet, they know where you’ve been, how often, and when you’re likely to go there again in the real world. And their influence doesn’t stop at knowledge: because of the virtual monopoly of Chrome, and its underlying tech, Chromium, which powers most browser alternatives, Google has started making unilateral decisions about how the Internet should work — all in their favor, of course. They don’t like it when you’re not online, because they stop getting data about you. And it doesn’t matter if you’re not using one of their properties directly, because 70% of the 10,000 most popular Internet destinations use Google Analytics. It’s actually a great product; it helps web developers understand their audience and build better offerings — and all Google wants in exchange is to know everything about you.

And let’s talk about YouTube, a virtually unavoidable Google property, full of useful content, and a site that historians might one day determine was a leading cause for the end of our democracy. YouTube is awful — and it’s entirely by accident. Google deflects privacy concerns by pointing out that the analysis of all this data is done by algorithms, not people. There’s probably no person at Google who actually knows how to gather all the information you’ve given them into a profile of you personally. But there doesn’t need to be: their software is sufficiently empowered to manipulate you in ways you aren’t equipped to resist.

YouTube’s recommendation algorithm has been disowned by its own creator as reckless and dangerous. While it has been tweaked since it was launched on the world like Skynet, the evil AI from the Terminator movie franchise, and now has human overseers to guide its machinations towards less destructive content, it is still a pernicious and outsized influencer of human thought. Look no further than 2020’s rampant embrace of conspiracy theories for proof positive that recommendation engines are not our friends.

Google set out to do none of these things. I’ve been to their campus, and interviewed for jobs with their teams. To a fault, everyone I’ve met is full of idealism and optimism for the power of the Internet to empower individuals and improve society. I actually still like Google as a whole. But if the Internet is Pandora’s box, Google is the one that pried it open, and can’t quite figure out how to deal with what was inside. Humanity is not inherently good, and accelerating our lesser qualities isn’t having the positive outcome Google’s founders might have hoped for.

So, how do you throw the bathwater out, but keep the baby? Can you use Google’s awesome tech without contributing to the problems it creates? I don’t know, but here are a few of the ideas we’re trying:

Diversify Your Information Holdings

I’ve said this before, and it bears repeating: don’t put all your eggs in the same basket. If you have a Google Mail account for work, have your personal account with another provider. If you use Google Classroom for school, use OneDrive for your private documents. If you have an Android phone, don’t put a Google Home in your bedroom. This isn’t just good security practice, preventing an attacker from gaining access to everything about you from a single hack; it’s good privacy practice. It limits the picture of you that any one service provider can make. Beware, though, of offerings that appear to be competitive, but are actually the same thing under the hood. The privacy browser Brave may tell a good story about how they’re protecting you, but their browser is based on Google’s Chromium, so it’s effectively the same as just using Google’s own browser.

Castrate the Algorithm

The YouTube recommendation engine is getting better. They’ve taken seriously the impact they’ve had, and they have smart people who care about this problem working on it. Until they get it right, though, you can install browser extensions that just turn it off altogether. You can still search YouTube for information you want, but you can avoid the dark rabbit trail that leads to increasingly extreme viewpoints. Choose carefully, because browser extensions are information collectors too — but here again, at least you’re diversifying.

Tighten the Purse Strings

Use an ad-blocker, and contribute less to their bottom line. Ad-blockers are relatively easy to use (again, reputation matters here), and available in multiple forms, from user-friendly browser extensions that can be toggled on and off, to nerd-friendly solutions you can run on a Raspberry Pi. We’ve eliminated about 80% of the ads we see on our home Internet by using DNS-level filtering — and it’s remarkably easy to do.

Do a Privacy Check Up

I’ve been involved in software development for 20 years — data really does make software better — but did you know Google will willingly relinquish older data they have on you? All you have to do is ask. Whether you’re an active Google user, in the form of an Android device or one of their enterprise offerings (like Google Classroom), or just an occasional searcher with an account, you should take them up on this offer and crank up your privacy settings.

Search Elsewhere

Google is still pretty much the top of the heap as far as search results go, but they’re far from the only game in town — and the deltas shrink daily. Bing is remarkably close in the search race, although it’s backed by an equally gigantic corporation that is probably no more altruistic with its data acquisition, and DuckDuckGo does a decent job most of the time. Why not switch your default search engine to something other than Google, and switch back opportunistically if you can’t find what you need?

Check Who’s Watching

Just like Facebook has its fingers in most of the Internet, Google is everywhere. A service called Blacklight lets you plug in the address of your favorite website, then gives you a report on all the data collection services that website is cooperating with. The scariest ones are probably the ones you trust to give you news and information. Use RSS where possible, anonymizers, or different browsers for different purposes… which brings me to my final suggestion.

Stop Using Chrome

Oh man, I could go on for pages about how scary Google’s control over the Internet has gotten — all because of Chromium. If you’re old enough to remember all the fears about Microsoft in the ’90s, this should all seem familiar. Just like the PC was Microsoft’s playground, and anyone who tried to compete was in danger of being crushed under the grinding wheel of their ambition, the world wide web has become Google’s operating system, and Chrome is the shiny Start Menu gracing every screen. Nearly everything uses Chromium now — even Microsoft’s new Edge (Apple’s Safari, built on WebKit, is one of the few hold-outs). It allows Google to basically dictate how the Internet should work, and while their intentions may be mostly good, the results will not be. I’m practically pleading with you: install Firefox and use it as your default browser; switch to a Chromium-based browser only when you have to.

If I sound like a paranoid old man by now, I’ve earned it. I’ve literally been working on the Internet my entire career — my first experiments in web development date back to 1996. I love this thing called the web, and generally Google has been good for it. But a democracy isn’t democratic if it’s ruled by a dictator, and the Internet isn’t open if it’s entirely controlled by Google. As citizens of cyberspace, you owe it to your community to help it stay healthy, and as individuals, you owe it to yourself to practice safe web surfing.

Multi Mac OS X Rescue Drive

Recently someone posted a great idea in a Facebook group for users of old Macs — but apparently wasn’t interested enough in the community to describe how it was accomplished. The idea is to have an external USB drive with multiple Mac OS installers on it, so you can restore or recover a wide range of Macs. Although all the instructions are online, they’re scattered across different sites and have to be pieced together. Here’s my attempt to collect everything you need to make a multi-OS recovery disk for almost any Mac made in the last 15 years.

Multi-boot Mac OS Recovery USB Drive

Things You’ll Need

  • A working Mac with a relatively modern OS (I used a 2008 MacBook Pro running a patched High Sierra)
  • The install media (disk, disk image or installer app) for each OS revision you’ll want (see links at the bottom)
  • At least a 100 GB external USB drive (an actual drive — not a USB key)
  • Optional: the DosDude patchers for any Mac OS you may want to shove on an unsupported Mac
  • Optional, but recommended: the latest Combo update for each Mac OS you may be installing (I’ll include links to download these at the bottom of this post)
  • Moderate proficiency with Apple’s Disk Utility and just a little bit of Terminal comfort

Note: You’ll notice that these instructions stop at Mojave. So do I. You could argue that Catalina’s murder of 32-bit apps was necessary for the ARM transition — and maybe that was right for Apple — but you can read why I don’t think it’s great for consumers.

Continue reading “Multi Mac OS X Rescue Drive”

Managing Social Media: Facebook

The Delete Facebook movement has been around for a while now, and I have to admit, the idea is tempting. The downside of allowing a single company to have such an outsized view into our lives has become increasingly obvious, while the benefits have dwindled. By design, Facebook is more than just a social network – it’s evolved over the years to become something of an Internet hub. Sure, there are a lot fewer people playing Farmville, but it’s still the closest thing to a ubiquitous messaging platform we have on the Internet, so it’s hard to just turn it off. Short of writing a letter and putting it in the mail, Facebook is the one place where I can get a message to most of my extended family. And there are things to be said too (both good and bad) about Facebook Groups, where strangers with common interests can meet and create connections — most of my hobby projects have been significantly helped by members of one Facebook group or another.

So quitting Facebook might be going a little too far for most of us, but maybe putting some limits on Facebook’s reach can help. Here are some easy steps you can take to control Facebook’s visibility into, and impact on, your digital life.

Delete the App from your Phone… Then Put it Back

Facebook’s mobile app, whether on Android or iOS, has a staggering privacy impact. Except on the latest OS versions, most of these permissions, once granted, are permanent, and accessible in the background. Recent improvements to underlying platforms have revealed numerous “bugs” that have all the appearance of spying on users – even while the app is not in use. For example, Facebook helpfully asks for access to your Address Book to facilitate “finding friends” but can use that information at will to quietly strengthen its social graph (the powerful database that makes Facebook so interesting to advertisers and political parties.) Recently a former engineer reported that Facebook experimented with uploading all your pictures in the background to “improve performance” when you chose to post a picture on their site.

Obviously, it’s nice to have your social network in your pocket – it’s convenient and helps pass the time. But giving away all your personal data seems foolish. Fortunately, there is a work-around, and it’s actually quite nice. By design, your mobile web browser is a “sandbox” – websites can’t get the same permissions as apps can, so they’re intrinsically safer. And to make it more convenient, both Android and iOS allow you to “pin” a website to your home screen so that you can launch it just like an app. The experience is slightly diminished from the full app, but it’s remarkably elegant, and significantly less intrusive.

The process is slightly different for each platform, but it amounts to:

  • Open Facebook in a web browser
  • Find the browser’s menu, and choose the option to Pin to your Home Screen
  • Find the new Facebook “App” icon on your Home Screen and launch from there
  • Use Facebook more-or-less as normal

A nice side effect of this change is that notifications go away. You can always launch the “App” to see what’s new, but you won’t get things pushed to you constantly. Facebook Messenger is a separate app, which seems to have fewer privacy issues, so it can remain installed to allow message notifications.

Put Facebook in a Box

This tip applies to both your phone and your laptop or desktop computer, although the process is a little different. It requires you to get used to having multiple web browsers – and keeping Facebook in a secondary one.

Firefox believes that good fences make good neighbors

My strong recommendation is to use Firefox as your daily driver – it has an extension that can limit Facebook’s reach automatically. Chrome and Edge are both reasonable for privacy, and Brave is better, but in other ways all of these browsers contribute to Google’s unreasonable control over the evolution of the Internet – I’ll get to Google in another post. Suffice it to say, choose your main web browser and make sure you’re signed out of Facebook (and Instagram) completely on it. When you visit facebook.com from that browser, you should get prompted to sign in – otherwise, assume Facebook is tracking you all over the web.

(Update: if you have to have Chrome, check out these extensions to help keep you safe.)

Facebook uses a browser fingerprint it establishes when you sign-in to their site, combined with tracking that same fingerprint detected through their pervasive advertising network, to piece together your browsing history — this is why Facebook ads seem like they’re reading your mind: they really do know everything you do online. Never use “sign in with Facebook” to log into a non-Facebook website or service. This is another way they track your activity. Your main web browser should be anonymous to Facebook at all times.

Once you’re confident that your primary browser is Facebook-free, install and set up a secondary web browser that can be signed in to Facebook. Use this secondary browser for your Facebook community, and limit other web surfing there. On a computer this is really easy – your computer already comes with a built-in web browser that can serve as your secondary browser.

On a phone this is a little harder, because you can’t completely change the default browser – the built-in engine will still handle embeds and links no matter what you do. But you can still follow the same pattern – create the Home Screen shortcut “App” using the built-in browser and install another browser to do most of your surfing.

Prune Your Timeline

Aside from its privacy issues, Facebook also functions as sewage run-off for some of the Internet’s worst information pollution. Political viewpoints turn angry during an election year (or pandemic) and sometimes it gets to be a little much. You may learn things about your social network that you wish weren’t true – or maybe you just need a break from all the memes.

Sometimes you have no choice but to just remove connections (de-friend people) if they won’t listen to reason. But often a genuinely decent person has just listened to a little too much Fox or NBC News and you need to take a break from the partisanship. It’s OK to “snooze” people or unfollow them. This allows you to stay connected, without having to get inundated with their ideology.

I don’t mean to suggest we shouldn’t hear ideas and perspectives that are different from ours – in fact, I believe it’s healthy to hear both sides of a debate… as long as both sides are rational, thoughtful and based, at least in part, on objectively verifiable reality, or reasoned interpretations of events. But not all opinions are created equal, and not all sources of information are valid. I’d advocate first for a loving attempt to reason, out of concern for a friend, but I’d also advocate (especially as my kids are moving into an online world) for a limitation of the pollution you expose yourself to online.

The Facebook timeline algorithm is tweaked for engagement (sucking you in) and for maximizing advertising impressions (keeping you on the site so you see more ads). It’s not a good source of information, any more than if everyone in town went to the same park and all started shouting our opinions at each other. Prudently manage who and what shows up on your timeline, or ignore the timeline entirely, in favor of personal interactions or Facebook groups that are healthy for you.

Set App Timers

If you use the Facebook app, or a dedicated browser, both Android and iOS will allow you to limit your time in those apps. You can use this for any app that you find yourself mindlessly scrolling through more than you want to. In iOS it’s called “Screen Time”, in Android it’s called “Digital Wellbeing”, but in either case you can find it in Settings and easily set a timeout in minutes per day. Of course, you can override it if you need to, but it’s a good reminder to manage what you’re consuming in a given 24-hour period, and make sure you’re including other interactions and sources of information.

Protecting Your Brain

We don’t let our kids use social media yet – their brains are still forming, and they don’t have all the tools they need to discern what they may read online. But adults aren’t immune from the cognitive biases that can trick our brains into unhealthy patterns. Facebook is a relatively new kind of media – one that empowers peer-to-peer sharing and information dissemination much faster than what we had a generation ago. It has many incredible benefits but inherits all the same problems of previous kinds of media, while introducing a slew of others that humanity isn’t really equipped yet to understand. There are efforts underway to understand and improve how this kind of media works, but until those things mature and inform the evolution of the Internet, it’s up to us as users to think about and manage how we interact with technology and other people using it.

Church Streaming 2.0

Even though we knew it was probably going to happen, when the lockdown order came in from the governor, we didn’t really get a lot of time to adjust. The kids were in school one week, and at home the next. Church was meeting in person one Sunday, and exclusively online the next. A series of probably-Providential events had happened before this, none of which were deliberately timed by me, but all of which turned out to be helpful in getting our little country church online in time.

At the start of 2020, we didn’t even have Internet in our church building — we would upload sermon audio using a 4G hot spot. That audio was recorded on a 2008 iMac that I found on Goodwill Auctions for $140, and I had just replaced the 2006 Windows Vista eMachine that was in the pastor’s office with a 2009 iMac that I got for $80. With this “new” hardware installed, we decided it was time to petition the church leadership for a stable Internet connection. No one was opposed philosophically — they’d just never had a need before. At the February board meeting, they agreed to my proposal, and later in the month, I camped out at the church for a day and a half to wait for, then help, the Internet installer figure out how to connect our 160+ year old church building to the digital world.

The lockdown order came only a couple of weeks later. The Internet was unreliable, because the rural infrastructure near our location had issues, the more-than-a-decade-old iMacs were far too under-powered for their new task, and a decent webcam was suddenly very hard to find either online or in near-by brick-and-mortar stores… but in a little over a week, we managed to cobble together a streaming system, do some basic training for the pastor’s family, and hold our church’s first online service. All while hoping this would be a very temporary situation. It was not.

At some point in the summer, it became obvious that the system was too fragile for reliable live streaming, and it became more pragmatic to have a pre-recorded service in the bag. This created a more fault-tolerant, less stressful experience, but it didn’t change the chewing-gum-and-baling-twine nature of the system: it all just barely worked, because none of the pieces were ever intended for the tasks that had been thrust upon them. While it became safe enough for most church members to attend an outdoor service this summer, others, who are at higher risk from the virus, could not attend — and with colder weather looming and no vaccine in sight, it became apparent that online church was going to be a reality for at least a little while longer. The pastor asked for some options for a more permanent system.

I priced out three bundles — good, better and best; cheap, not-as-cheap, and spendy. The elders settled on a combination of pieces that straddled the mid-range. I got to switch from trolling Goodwill to picking out a choice refurb from Backmarket (a great place to get used high-end hardware). The new Mac Pro is commonly called the “trash can” for its cylindrical design — Apple later admitted the look left them “designed into a corner,” then basically abandoned the high-end market for most of seven years. It’s from 2013, but was way over-powered for the time, and still outperforms most of the stuff you’d find at Best Buy today.

We switched from the commercial Windows software, vMix, to an open-source package that runs natively on macOS called OBS — a favorite of video game streamers and YouTube stars. It starts up in a fraction of the time, and handles virtually limitless inputs with ease. Switching from the cheap USB webcam to an HDMI camera with optical zoom allowed us to position the rig at the back of the sanctuary — instead of consuming the front half of the room — which made it significantly easier to connect to the sound board and run an independent audio mix, something that will improve both the in-house and online experience once we tune it.

Ben helped me set everything up, and we even rigged up an iPad based remote control, so if needed, one person can run both the in-house video screen and the online stream.

This week, we’ll do some training on the new system, and probably work out a few kinks, then I’ll report back to the hospital for a follow-up surgery for a blood-clot related issue I’ve been dealing with all summer. Next Sunday, I hope to be worshiping from bed at home while recovering from this thing once-and-for-all!

NextStep/OpenStep NFS File Share with a Raspberry Pi

I recently restored a NeXTstation computer — the grandfather of Mac OS X computers (and therefore the great-grandfather of iOS). It joins a network of historical Mac computers in my basement, but was woefully disconnected from them. A crude file transfer with a G4 Cube running OS X 10.4 could be established relatively easily using FTP, but I wanted the NeXT to really fit into the neighborhood.

Every computer in the house can reach a common file share running on a Raspberry Pi, which serves up a folder over SMB and legacy AFP (AppleTalk), with an ethernet-capable OS 9 machine bridging to the LocalTalk-only Macs from the early days. Unfortunately, NeXTStep and OpenStep support neither SMB nor AFP (technically, one version of NeXTStep had a crappy AppleTalk implementation, but not the version I’m running). What NeXT did support was NFS — Network File System. And fortunately, so does the Raspberry Pi… there’s just a little modification required to make it work with old versions.

Set up your server as a Host on the local system first

After following the steps here to enable NFS and establish a share (using the same folder as AFP and SMB), a fellow nerd on Facebook provided some steps to force support for older clients (NFSv2):

  • Edit the file /etc/exports that you made when you enabled NFS, and decorate your share with some less secure options (at your own risk — obviously don’t expose this to the Internet!) I also had to assign an fsid. Here’s what my export looks like:
    /srv/A2SERVER/A2FILES *(rw,fsid=1,all_squash,insecure,sync,no_subtree_check,anonuid=1000,anongid=1000)
  • Edit the file /etc/default/nfs-kernel-server as sudo
  • At the bottom, add the line RPCNFSDOPTS="--nfs-version 2,3,4"
  • Run exportfs and make sure no errors are reported.

On both OpenStep and NeXTStep, you’ll want to set up a Host for your server. This can be done in NextAdmin/HostManager.app — assuming you’ve already got networking set up. Instructions for setting up the network are here, and Sophie Haskins has a good blog entry about some of the hurdles to NeXTStep networking.

On NeXTStep 3.3, the NFSManager.app GUI was not able to successfully mount the share — I had to do it from the su command line:

  • Launch Terminal.app
  • Type su and hit enter — provide a password if needed.
  • Enter a mount command like:
    mount serverhost:/pathto/yourshare /Net/localmountpoint
  • So, for example, my server is on a Host named NetPi and the shared path is /srv/A2SERVER/A2FILES (since the path is also used by A2Server’s AFP share) and I want to mount it to a local folder on my NeXTstation called NetPi, so my command is:
    mount netpi:/srv/A2SERVER/A2FILES /Net/netpi

If you want this to run at every boot, use Edit.app as root to add your mount command to the end of the file /etc/rc.local.
But be 100% sure your mount command works before you do — a bad entry there can prevent the machine from booting. To be on the safe side, include a time-out and limit retries like:
mount -o rw,bg,mnttimeo=8,retry=1 serverhost:/pathto/share /Net/localmountpoint/

In OpenStep 4.2, the NFSManager.app GUI did work, and Sophie’s blog shows how to use it. And just as a point of interest, following the same HostManager.app steps to tell NeXT about a LaserJet 4-compatible printer on the network lets it print too!

In OpenStep you can use the GUI, in NextStep the command line works best

Why you shouldn’t buy a Mac in 2020…or maybe ever again.

This week Apple surprised no one by announcing they were beginning their transition to “Apple silicon” in their Mac computer line-up. If you don’t know what that means, it’s sufficient to understand that they are moving from Intel-based computers to a processor and related architecture of “their own” design.

I put quotes around “their own” because, despite their announcement, everyone knows that “Apple silicon” is derived from the ARM processor — a family of chips most often used in phones and other mobile devices. ARM has been around a long time, and Apple invested in the company back during the Newton era. Intel has obviously been around even longer, but Apple’s use of Intel chips is the stuff of relatively recent history.

This marks the 4th processor migration for Apple: from the MOS Technology 6502-based Apple I and Apple II computers, to the Motorola 68000 family in the early Macintosh line-up, to the Motorola (and IBM) PowerPC of 1990s processor-war infamy. With each generation, Apple struggled to position themselves against the WinTel (Windows + Intel) hegemony. It wasn’t until 2006, when they transitioned to Intel, that Apple finally found their footing.

Since Steve Jobs’ hostile take-over of the Macintosh project in the early 80s, Apple’s philosophy on computing has been fairly “closed.” Jobs envisioned the computer as an appliance for average people, not a tinker toy for nerds. In the original Mac, this meant unusual screws and an absence of hardware expansion slots. On the iPhone, it meant a “walled garden” where only Apple-approved apps on the Apple-hosted App Store could be installed (unless you were willing to do some serious hacking.)

It took a long time to prove this philosophy out — it was almost a full generation before non-nerds were doing most of the computer shopping. But in many ways it paid off. Macs have a reputation of being stable, reliable machines, and iPhones are the mobile device most people want to own. iOS really represents the logical outcome of Apple’s trend toward locking things down: it’s an operating system that users aren’t supposed to know anything about, on hardware that customers aren’t supposed to be able to open.

On the Mac, though, there’s always been another layer: under the simple, friendly veneer of the user interface is a powerful Unix shell. And under the sleek case is fairly standard, commodity hardware. The implication of this for the Mac is that, despite Apple’s attempts to end its life prematurely, people with a little know-how can keep their Macs running for years. Unlike phones, where people feel compelled (either by fashion trends or security concerns) to buy a new one every couple of years (don’t do it!), an Intel Mac can last a decade or more as a useful, performant machine. Obviously this is a problem for a company that primarily sells hardware…

Case in point: this is being written on a 12 year old Mac that Apple tried to stop updating in 2016.

Apple zealots will tell you that the move to ARM will let Apple build smaller, faster machines with better battery life. They’re not wrong — ARM rocks for mobility. What they won’t admit is that the move away from commodity hardware will let Apple control the lifecycle of these new computers the same way they intentionally keep the lifecycle of their phones shorter than necessary:

  • With an Intel-based hardware platform, upgrades made for Windows PCs mostly “just work” in a Mac
  • With an Intel-based hardware platform, many parts can be sourced from other manufacturers to provide for repairs that Apple will no longer supply
  • With an Intel-based hardware platform, users can boot Windows (or Linux) to run software that isn’t compatible with “older” Macs
  • With an Intel-based hardware platform, the developer community can create patches to circumvent artificial end-of-life moves from Apple designed to keep you from upgrading to the newest MacOS

It remains to be seen whether the heroic hackers of the world will be able to bring these benefits to new ARM-based Macs, but if Apple’s plan is to make Macs more like iPhones (which it evidently is), you can bet they won’t help us.

The move from PowerPC to Intel was a painful one for the Mac community. Software we owned stopped working, or had to be run through short-lived and poorly performing compatibility tools. Then there was the swallowing of our pride as we collectively had to admit that Intel really did outperform the G4s and G5s we were so proud of. But ultimately, the benefits for consumers outweighed the costs: it was the right move. Arguably, the move to ARM is significantly less urgent — granted, Intel’s track record over the past few years hasn’t been great, but they’re still putting out decent performance at a reasonable price point. Besides, the average Mac user doesn’t care what kind of silicon they’re running on — and they shouldn’t need to. But they should care if a company is deliberately steering them toward a platform of aggressive planned obsolescence and a treadmill of re-buying things they don’t really need.

I’ve put more than two dozen used Intel iMacs and MacBooks back into service for churches, students, teachers and missionaries — all well past the date Apple would like them to be running, and all stable, reliable and with half of them running Windows 10 at least part of the time. They’re really great machines, and I mourn the end of this era. Maybe Apple’s new products will be better than I think; I’m sure they’ll be sexy pieces of hardware. I just hope they don’t become sexy pieces of garbage in a couple years…

Air Hockey Scoreboard Project

Last year, our church got a donation of an air hockey table, for use by the youth group. It was in nice shape, save for the electronic scoreboard, which wouldn’t keep score. It would light up and make sound when powered, but the score was stuck at “88” to “88.” We resolved to see if we could fix it.

The electronics were wrapped in a plastic banner that crossed the middle of the table, and had only a few wires running to it — one from a puck “catcher” on each side of the table, with a simple switch that was pressed when the puck was present, and one modified ethernet cable running to a small control panel that let you start a timer or reset the game. Our initial hope was that it would just be a wiring problem that Ben and I could fix together. It turned out to be much more complicated.

All the wiring was fine, and via various traces it ended up connected to a small logic board — this turned out to be the cause of the problem: it was dead. The board had a part number, but no amount of Internet searching could find a source for a replacement part. We theorized that the actual logic being performed was fairly simplistic, and that we might be able to replicate it with a Raspberry Pi Zero — roughly the same size, and equipped with sufficient GPIO pins to map to the existing wiring. Connecting the puck catchers was easy, and with a little Python code, we could count score and show it in a connected SSH terminal. The control panel was a little more difficult, but we managed to find a solution for some of its basic functions. The remaining problem was the seven-segment LEDs that actually show the score.
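For the curious, the score-counting part really is that simple. Here’s a minimal sketch of the idea using the standard RPi.GPIO library; the pin numbers below are placeholders (not our actual wiring), and our real script grew a few more features:

import time
import RPi.GPIO as GPIO

GOAL_PINS = {17: "HOME", 27: "VISITOR"}   # hypothetical BCM pin numbers for the two catchers
score = {"HOME": 0, "VISITOR": 0}

def goal(pin):
    # Called whenever a catcher switch triggers: bump that side's score and print it.
    side = GOAL_PINS[pin]
    score[side] += 1
    print(f"GOAL! {score['HOME']:02d} - {score['VISITOR']:02d}")

GPIO.setmode(GPIO.BCM)
for pin in GOAL_PINS:
    # Assumes the catcher switch closes to ground when a puck drops in,
    # so we watch for a debounced falling edge on each pin.
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)
    GPIO.add_event_detect(pin, GPIO.FALLING, callback=goal, bouncetime=500)

try:
    while True:
        time.sleep(1)   # everything interesting happens in the callbacks
finally:
    GPIO.cleanup()

A debounced edge callback per catcher switch is essentially all the “game logic” the original board was doing.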

For those we called in some help from an engineering student we know, and managed to come up with the logic to light the LEDs by reverse-engineering an array of bit values that could be toggled via the Pi’s GPIO. In theory it was going to be possible to restore the displays, and 90% of the functionality of the system. In practice, it didn’t work out that way.
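The bit-value table is the interesting part. In rough form (again just a sketch; the segment-to-pin mapping here is made up, and it assumes common-cathode wiring where a high output lights a segment), each digit is just seven on/off values, one per segment:

import RPi.GPIO as GPIO

SEGMENT_PINS = [5, 6, 13, 19, 26, 20, 21]   # segments a-g (placeholder pin numbers)

DIGITS = {   # which segments light for each digit, in a-g order
    0: "1111110", 1: "0110000", 2: "1101101", 3: "1111001", 4: "0110011",
    5: "1011011", 6: "1011111", 7: "1110000", 8: "1111111", 9: "1111011",
}

def show_digit(value):
    # Drive one seven-segment display by toggling its segment pins.
    for pin, bit in zip(SEGMENT_PINS, DIGITS[value]):
        GPIO.output(pin, GPIO.HIGH if bit == "1" else GPIO.LOW)

GPIO.setmode(GPIO.BCM)
for pin in SEGMENT_PINS:
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

show_digit(7)   # e.g. light up a "7"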

The seven-segment LEDs work in pairs of pairs: two digits for each player, then double that to mirror the value on the other side of the board. Four displays, with eight wires each, makes for 32 tiny wires that needed to be run. Only half of those needed to go to the GPIO, since the other half were just mirrored values, but that’s still a lot of wiring — turns out there’s a reason most electronics use a printed circuit board. I briefly entertained designing such a board, and paying to have it printed, but we were still going to run into power problems. Once the LEDs were running, very little power remained in the little Pi for the blinking lights and buzzers that make the experience fun.

After literally months of soldering, brainstorming, and frequently ignoring the now very-messy project out of frustration, we decided to abandon the banner, and go all in on a Raspberry Pi 3B+ with an add-on display. The logic still worked fine for score-keeping, although I had to come up with a new routine for displaying the score in ASCII characters that filled the screen. The girls each composed a little ditty that gets bleated out by the buzzer when someone wins the game, and Ben designed a mounting shim and 3D printed it. We removed one of the two support poles for the scoreboard, and mounted the Pi to the other — neatly running the wires up the pole. I used plastic cement to attach a reset button and the buzzer to the side of the Pi’s case, providing the key features of the original control panel.
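The ASCII routine amounts to drawing each digit as a little block of characters and printing the rows side by side. A toy version (not our actual code, which also handled the timer and the win ditty) looks something like this:

DIGIT_ART = {   # 5-row block characters for each digit, plus the separator
    "0": ["#####", "#   #", "#   #", "#   #", "#####"],
    "1": ["    #", "    #", "    #", "    #", "    #"],
    "2": ["#####", "    #", "#####", "#    ", "#####"],
    "3": ["#####", "    #", "#####", "    #", "#####"],
    "4": ["#   #", "#   #", "#####", "    #", "    #"],
    "5": ["#####", "#    ", "#####", "    #", "#####"],
    "6": ["#####", "#    ", "#####", "#   #", "#####"],
    "7": ["#####", "    #", "    #", "    #", "    #"],
    "8": ["#####", "#   #", "#####", "#   #", "#####"],
    "9": ["#####", "#   #", "#####", "    #", "#####"],
    "-": ["     ", "     ", "#####", "     ", "     "],
}

def render_score(home, visitor):
    # Build the big score readout row by row, one column of art per character.
    text = f"{home:02d}-{visitor:02d}"
    rows = []
    for row in range(5):
        rows.append("  ".join(DIGIT_ART.get(ch, ["     "] * 5)[row] for ch in text))
    return "\n".join(rows)

print(render_score(3, 2))

Printing that to the console attached to the add-on display is all it takes, so no graphics library was needed.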

After a successful beta test, we refined the design and improved the crude graphics a little, then installed our new system in the air hockey table using most of the original wiring. It’s not perfect, but it’s quite elegant — and I was pleased by how much we all learned putting it together.