In Defense of Elizabeth Holmes

The tech world’s latest villain is a woman named Elizabeth Holmes. You may have heard about her trial on the news, or maybe seen the documentary about her fall from grace. Maybe you even read the Wall Street Journal exposé that started her downward spiral. We don’t know much about her as a person, save for the carefully curated public image she presented to the world during her glory days — when she was called a female Steve Jobs, and her company, the Apple Computers of the biotech world. But if the media is to be believed, we now know that she is definitely evil.

Her defense at her trial attempted to position her as a victim: she was manipulated by the men in her life, and allegedly abused sexually, which somehow explains why she lied to investors and carried on the shell game that was Theranos for so long. As I write this, the jury is literally still out on whether or not her sob story will garner her any leniency. But the facts of the case are clear: her tech didn’t work, she knew it didn’t work, she lied about it, and — for a time — got rich and famous off of those lies. Eventually her sin, and that of her company, was discovered, and it all unraveled. Her company was destroyed, she was destroyed, and the hope she’d created of a new and easier way to perform medical testing was shown to be a pipe dream. She sold snake oil with the best of them, then got caught. She should be punished.

But I object to the notion that Elizabeth Holmes is some kind of super villain, the first of her kind to come up with this scheme, and pull one over on an innocent and trusting stock market. Heck, Silicon Valley is the home of “fake it til you make it” — an entire culture built on selling B.S. to complicit investors, with the hope of making some unlikely vision a reality. No, I think Elizabeth Holmes’ real crime is that she played the Tech Bro game while being female. She did literally the same thing that a thousand other “entrepreneurs” have done before her, and she did it really, really well for years. She had people believing not just in her idea, but in her. And like so many before her, she failed. But instead of the embarrassment of that failure being limited to herself, her employees and her shareholders, the embarrassment blew back on a woke culture that needed her to be some kind of proof point, some kind of vindicating example of how feminism can destroy the patriarchy. Elizabeth Holmes wasn’t just a tech hero, she was a culture war hero — and she betrayed those culture warriors by being… the same species as the men she was emulating. Not better than, not some kind of feminist ideal; just another flawed human being. Someone who tried to do something really important, and really hard, and who pushed herself and everyone around her to reach for something insanely great… then when it became evident that it wasn’t working, tried to buy time, stretch the truth, move the goalposts — anything to avoid admitting defeat.

Was she wrong? Definitely.

Was her behavior completely human? Totally.

These are her notes to herself. Comments online suggest these are proof of her manipulative nature. But I don’t see the notes of a conniving con artist, set on deceiving and destroying her way to the top. I see the notes of a struggling human person, trying to discipline her own flaws, work hard enough to be taken seriously, and actually accomplish something for herself and those who depend on her — someone trying to do all of that in a place that is disproportionately male-oriented. She didn’t set out to deceive, she set out to succeed. And the fact that she didn’t doesn’t change her worth as a person. The fact that she couldn’t bear to face defeat and lied about it doesn’t mean she has betrayed her gender and should be burned at the stake; it just means she made the same set of mistakes most of us would make in her shoes. The same mistakes that literally hundreds of wannabe Silicon Valley founders have made before her. She did all of that while female, but she should face no greater consequences than her male counterparts — many of whom are lauded for the same kinds of failure.

I’m not opposed to feminism: I have two daughters and I want them to be and do whatever they want with their lives. I am opposed to the backlash she’s receiving from the culture she embodied. I’m saying that gender — and culture — factored into the job she was trying to do — and unfairly so. And I’m saying neither should factor into the punishment she faces for failing at that job. She doesn’t owe society some special penance because she didn’t succeed at “leaning in.” She’s just a human being who tried something really hard, and failed at it ungracefully. Here’s a list of mostly male-founded companies that fared no better. Let’s not judge her any more harshly for playing the same game while female.

The Year of Linux on the Desktop

One of the first times I heard about Linux was during my freshman year at college. The first-year curriculum for Computer Programmer/Analyst was not particularly challenging — they used it as a level-set year, so assumed everyone was new to computers and programming. I’d been trying to teach myself programming since I was 11, and I was convinced I knew everything (a conviction solidified when one of the profs asked me to help him update his curriculum). I don’t even remember what class it was for, but we were each to present some technology, and this kid, who looked to be straight out of the movie Hackers, got up and preached a sermon about a new OS that was going to change the world: Linux. Now there was something I didn’t know…

I’d actually downloaded a distro once, and tried to get it going — I figured it would complete the set, along with OS/2 Warp and Windows NT 4 — but hardware support wasn’t there, so I gave up. But after that class, I went home and downloaded Slackware and tried it again. Everything went swimmingly until I figured out that to get the proper resolution out of my graphics card I would have to recompile the Linux kernel myself! I decided this was another level of nerdy, and that I had better things to do. Besides, I already had an alternative-OS religion I was faithful to: the barely-hanging-on MacOS.

It was around the same time that the “second coming” of Steve Jobs was occurring, and the battered Mac community was excited about their salvation. The iMac wowed everyone, and OS X provided the real possibility that Macintosh might survive to see the 21st century after all. I went irresponsibly into debt buying myself an iMac SE — an offering to the fruit company that would recur with almost every generation of Apple technology. I sold all my toys and stood in line for the first iPhone. I bent over backward to get someone to buy me the first MacBook Pro with a 64-bit Intel processor. My first Mac ever was rescued from a dumpster, and I’ve saved many more since.

My loving bride has since (mostly) cured me of my irresponsible gadget spending, and I’ve made something of a hobby of fixing up high-end Apple computers a few years after their prime, and pressing them back into service. I edit video on a Mac Pro from 2009 — it would have retailed for more than $20k new, but I got it for $800, and have since put maybe $300 of upgrades into it. It’s a fantastic machine that runs 5 different operating systems, and I will keep it useful for as long as I can. Similarly, for the past 4 years I’ve done hobby development on a 2008 MacBook Pro; another fantastic machine, and while I’ve had better work laptops, I’ve loved none of them like I love that one.

But even I’m having trouble keeping it useful. A newer OS than it was ever supposed to run has been hacked to keep it going, but Apple’s treadmill continues, and one-by-one, its useful capabilities have dropped off: first I couldn’t update Office, then I couldn’t sync DropBox, OneDrive came next, and now Firefox is about to drop support. I need special software to do thermal management, since the OS doesn’t support the hardware, but despite my having torn it down to reapply thermal paste multiple times, it runs hot, the fans spin loudly, and it eats through batteries. It could be argued that all you really need is a web browser, a text editor and a Terminal window… but a few more modern accoutrements would be nice. After 14 years of service, I have to admit, it’s time to let the old girl go…

But the Apple of today is not the company I so slavishly followed in my youth. The Intel years were some of their best: upgradable, repairable hardware in beautiful cases. Now Apple exclusively makes disposable computers, increasingly dumbed down for the masses. What’s a nerd to do?

It’s a bit of a joke in the geek world that the year of “Linux On the Desktop” is right around the corner. Since I first played with it in the late 90s, it’s come a long way — never quite getting popular support amongst the aforementioned masses, but finding its way in other places. If your phone doesn’t run iOS, it runs a version of Linux. Chromebooks also run Linux. And the Raspberry Pi, my favorite home-automating/web-serving/hardware-hacking/education-enabling $30 computer, runs Linux. In fact, I’ve been running Linux for the past 10 years — just never on the desktop.

So for this nerd, 2022 will be my year of Linux On the Desktop. My faithful 2008 MacBook Pro has found a worthy successor in the form of a 2018 Thinkpad X1 Carbon. Purchased for less than $400 on eBay, and needing only new M.2 storage (its hard drive, in other words), this thing screams. Its soft-touch carbon fibre case is almost as sexy as a Mac, but the insides are twice as capable — while running silently most of the time. It’s light, but solid, with an excellent keyboard with the perfect amount of travel, and a full array of useful ports. Its inner bits are easily accessed with a couple of captive screws, making it simple to repair, and it was designed with Linux in mind, fully supported by the manufacturer on the latest Ubuntu.

So far, I’ve found no downside to Ubuntu’s flavor of Linux on the Desktop — at least for development use. All my favorite programming environments are supported: Python, C#, Javascript and the LAMP stack. Ubuntu is nicely polished, and the native Bash shell lets me do anything I want to the underlying OS, or any remote server. It connects to my old gadgets and my new ones, and running full out, still gets 5 hours of battery life.
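
For the curious, here’s roughly what setting up those environments looks like. This is a minimal sketch using Ubuntu’s standard packaging tools; the package names are from memory of recent Ubuntu releases, so treat them as assumptions and verify with apt search first:

sudo apt update
sudo apt install python3 python3-pip          # Python
sudo apt install nodejs npm                   # Javascript tooling
sudo apt install apache2 mysql-server php     # the rest of the LAMP stack
sudo snap install dotnet-sdk --classic        # C# via the .NET SDK

One nice design point of Ubuntu is that everything above comes from the distribution’s own repositories (or its snap store), so updates arrive through the same channel as the OS itself.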

I’ll continue experimenting with alternative phone OSes too. webOS won’t survive the 3G apocalypse, but Sailfish OS is promising, and having newer mobile hardware is nice — not to mention, it allows me to opt out of Google’s data collection ecosystem. We’ll see if this machine makes it the full 14 years that my MacBook Pro did — but I’m willing to bet it outlasts any Mac sold in the past year…

The 3G Apocalypse

There are a number of topics that I’d like to blog about, but I’m afraid our current cultural climate doesn’t leave room for the kind of nuance they’d require. Things like our conflicted feelings about getting our youngest vaccinated. We’re pro-vaccine in this house, and 4 out of 5 of us got our shots as soon as responsibly possible once we were eligible. But our 10-year-old, who’s already had and recovered from Covid, is a different discussion. It requires consideration and nuance — but saying such things out loud invites input from crazy people, waiting to seize any apparent endorsement of their anti-vax stupidity. If we were to admit some trepidation about vaccinating our youngest, that might somehow give credence to the horse-paste crowd…

Here’s another such topic: 5G cellular networks. I’m going to write about the cluster that is telecommunication technology at the end of 2021, but I don’t want my technical mumbo jumbo to be interpreted as any sort of nod to the “5G networks are a secret plan by Bill Gates to control the minds of people who got microchipped by the Covid vaccine” nut jobs. There’s stupidity, and then there’s conspiracy nut stupidity. I just want to talk about the first kind.

Here in the States, all of the big cellular networks are in the process of shutting down their 2G (aka “Edge”) and 3G infrastructure, to make room for more 5G coverage. On the surface, maybe this sounds like a good plan. I remember laughing at a distant relative still clinging to their “car phone” as Canada shuttered their analog cellular infrastructure late last decade. I’m sure I thought something snarky, like “get with the times, old man.” But sometimes there’s good reason not to rush into euthanizing old technology…

What most people don’t know is that 5G is not a singular replacement for previous networks — and won’t be for a long time. In fact, there are two kinds of 5G, and there will probably have to be a third. The millimeter-wave technology touted by your cell phone carrier as being “next gen” is blazingly fast, but it doesn’t actually work inside most buildings… basically at all. So all those cell towers out there, poised to be stripped of their 2G and 3G antennas, won’t be able to help you make a call or update your Instagram if you happen to be inside a building.

Instead, a second 5G infrastructure needs to be created, one that runs on much lower frequencies which can penetrate building walls and travel long distances. The unfortunate downside of these low frequencies is that the speed suffers — low-band 5G is barely faster than the 4G it’s meant to replace! So we need two 5Gs, two sets of antennas, two sets of hardware in our phones, and we need to seamlessly move between them. Seems simple enough, but wait!

Turns out there’s a third scenario: places that are too big and too busy to be covered by a handful of short-range antennas, but where the slower long-range flavor can’t keep up with demand. Outdoor shopping malls, stadiums, and even “urban canyons” — downtown areas surrounded by tall buildings. These environments may require a different infrastructure again; a “medium range” 5G, that needs different antennas and different hardware in your cell phones.

Now you start to see why they need to cannibalize the older, but still functional, 3G networks — they need room on the towers to run multiple kinds of 5G networks, they need more towers, and they need more people carefully stacking the house of cards that may eventually fulfill the promise of 5G’s faster speeds. Three of everything, and then it’ll be great!

But hey, time marches on, progress is inexorable, and we should just push through these challenges, so we can be better connected, right? We gotta have our Instagram stories as fast as possible, so “get with the times, old man” and all that…

Except there’s another problem. It turns out that millions of “field devices” that depend on 3G are being killed off. 2G- or 3G-enabled medical monitoring devices that give patients freedom, connected car technologies that help send life-saving aid in accidents, soil sensors that help farmers make decisions to optimize crop yields, and pipeline and oil field sensors that keep our addiction to fossil fuels fed while helping avoid ecological disasters. All of these will have to be replaced with 5G-enabled devices — and maybe replaced multiple times, until 5G finally stabilizes. 5G won’t make things better right away; in fact, it’s going to make them worse for a long time…

But even with all these concerns, it’s going to happen anyway. AT&T has already shut down their 2G network, and those customers are out of luck; their 3G customers lose service in December (including all 3 of our cars, which can never be upgraded to 4G). Verizon and T-Mobile will follow suit later in 2022. When you ask a carrier what’s to be done about your old equipment, their only answer is to sell you a new phone. Consumer smart phones are only a part of the equation, but taking your money for something you probably don’t need is all they can offer.

Smart phone upgrades haven’t been worth lining up for since 2007

And let’s talk about that, because selling you a new, slightly different phone every 12-24 months is big business. Stupid people, who don’t understand that technology evolution inevitably plateaus, are still convinced by Apple that they have to have the latest iPhone. And carriers are happy to help create that need. Our kids each have an iPhone 6s — a phone released in September of 2015. It’s a great phone: it has all the features they need, it can be purchased for about $90 used, and the only real reason to replace it will come when our cell carrier refuses to support it.

Tiny HP Veer
My personal phone is a svelte HP Veer from 2011. It measures all of 4 inches diagonally, doesn’t track my location or report my data to Google, doesn’t wake me up with notifications or bother me with ads. And if it weren’t for the 3G shutdown, I’d be the crazy uncle using that thing until the hardware falls apart.

It’s too late to stop the Frankenstein’s monster that is 5G. It’s not going to control your brain or give you cancer — but it will continue to be a mess of a rollout. There’s not much we can do as consumers to stop this one, but there is something you can do to help break the cycle: stop buying new things you don’t need.

The good news is that if your phone has 4G, it won’t be shut down, or rendered obsolete by 5G. Because of the crappy state of 5G, 4G is likely to be around for another 10 years. So vote with your wallet, and refuse to buy a new smart phone in 2022 — from any company. If your battery is getting old, $80 will fix that. If you drop your phone in a toilet, head over to backmarket.com or eBay and buy the same model used. Don’t be tempted by the dubious promise of “faster networks” or indistinguishable improvements to cameras. Technology isn’t jewelry or a fashion trend: it’s a tool. Insist on tools that work, that last for years, and that don’t trap you on a treadmill of constant replacements.

Maintainers of Lightning – Part 2

When I was a young computer nerd, back in the early days of a consumer-friendly Internet, formal computer education in high school was limited to whatever rudimentary understanding our middle-aged teachers could piece together for themselves to pass on. But there was one tool that taught me more about web development than any book or class or teacher ever could: View > Source.

View Source Menu
You can do it now, although the output is significantly less meaningful or helpful on the modern web. Dig around your browser’s menus and it’s still there somewhere: the opportunity to examine the source code of this, and other, websites.

It was from these menus that I learned my first simple tricks in HTML, like the infamous blink tag, and later the more complex capabilities offered through a simple programming language called Javascript that increasingly became part of the Web.

Javascript and the web’s DOM (Document Object Model) taught me Object Oriented Programming when my “college level” computer science course was still teaching procedural BASIC. Object Oriented Programming, and the web as a development platform, have defined my career in the 25 years since. I owe an unpayable debt to the View > Source menu. It was the equivalent of “popping the hood” on a car, and it allowed a generation to learn how things work. Your iPhone doesn’t do that – in fact, today’s Apple would prefer it if people stopped trying to open their hardware or software platforms at all.

This, I assert, is why people fell in love with the Apple II, and why it endures after 40 years. It was a solid consumer device that could be used without needing to be built or tinkered with, but its removable top invited you to explore. “Look at the amazing things this computer can do… then peek inside and learn about how!”

As soon as our son was old enough to hold a screwdriver, he was taking things apart to look inside. I’m told I was the same way, but as parents Nicole and I had a plan: we’d take our kid to Goodwill and let him pick out junk to disassemble — rather than working electronics from around the home! I consider it a tragedy that some people outgrow such curiosity. Dr. Buggie has not – in fact, in my brief conversation with him, I think curiosity might be one of his defining characteristics…

Have PhD, Will Travel

Some time in the early 70s, Stephen Buggie completed his PhD in psychology at the University of Oregon. Finding a discouraging job market in America, Buggie decided to seek his fortunes elsewhere. The timeline from there is a little confusing: he spent 12 years in Africa, teaching in both Zambia during its civil war (he vividly recalls military helicopters landing beside his house) and Malawi. He was in America for a sabbatical in 1987, where he saw computers being used in a classroom, and at some point in there he taught at the University of the Americas in Mexico, which was an early adopter of the Apple II for educational use. His two sons were born in Africa, and despite their skin color, their dual citizenship makes them African Americans – a point he was proud to make. He taught for 5 years in South Carolina, before joining the University of New Mexico on a tenure track. He maintains an office there as a Professor Emeritus (although he rarely visits — especially in the age of COVID) and still has his marking program and years of class records on 5.25” floppy disks.

Apple II Classroom Computers – Seattle’s Living Computer Museum

Buggie found community amongst Apple II users in the 80s and 90s. Before the World Wide Web there were trade magazines, Bulletin Board Systems and mail-in catalogs. Although there was no rating system for buyers and sellers like eBay has today, good people developed a reputation for their knowledge or products. As the average computer user moved to newer platforms, the community got smaller and more tight-knit. Buggie found that a key weakness in maintaining older hardware was the power supply, so he scavenged for parts that were compatible, or could be modified to be compatible, and bought out a huge stock – one he is still working through, as he ships them to individual hobbyists and restorers decades later. His class schedule made it difficult for him to go to events, but retirement allowed him to start attending — and writing about — KansasFest, and he still talks excitedly about the fun of those in-person interactions, and the value he got from trading stories, ideas and parts with other users.

To be certain, Buggie’s professional skills are in the psychology classroom. He doesn’t consider himself to be any sort of computer expert — he’s more like a pro user who taught himself the necessary skills to maintain his tools. And maybe this is the second theme that emerged in our conversation: like the virtue of curiosity, there is an inherent value in durability. This isn’t commentary on the disposable nature of consumer products so much as it is on the attitudes of consumers.

We shape our tools, and thereafter our tools shape us

My dad worked with wood my whole life. Sometimes as a profession, when he taught the shop program at a school in town. Sometimes as a hobby, when he built himself a canoe. Sometimes out of necessity, when my parents restored our old house to help finance their missionary work (and the raising of three kids.) His tools were rarely brand new, but they were always well organized and maintained. He instilled in me a respect for the craft and in the artifacts of its execution.

Computers are tools too – multi-purpose ones that can be dedicated to many creative works. When we treat them like appliances, or use them up like consumable goods, the disrespect is to ourselves, as well as our craft. Our lack of care and understanding, our inability to maintain our tools beyond a year or two, renders us stupid and helpless. We cease to be users and we become the used – by hardware vendors who just want us to buy the latest versions, or credit card companies that want to keep us in debt, or by some imagined social status competition…

In those like Dr. Buggie who stubbornly refuse to replace a computer simply because it’s old, I see both nostalgia and virtue.

The nostalgia isn’t always positive – I’ve seen many Facebook posts from “collectors” who have storage rooms full of stacked, dead Macintosh computers that they compulsively buy despite having no use for them, perhaps desperate to recapture some prior stage of life, or to be able to claim a part in the computing revolution of yesteryear that really did “change the world” in ways we’re still coming to grips with. There is a certain strangeness to this hobby…

But there is also virtue in preserving those sparks of curiosity, those objects that first invited us to learn more, or to create, or to share. And there is virtue in respecting a good tool, in teaching the next generation to steward well the things we use to propel and shape our passions or careers or the homes we build for our families.

Fading Lightning

I joke with my wife that our rec room is like a palliative care facility for dying electronics: my 1992 Laserdisc player is connected to a 2020-model flatscreen TV and still regularly spins movies for us. I still build apps for a mobile platform that was discontinued in 2011 using a laptop sold in 2008. In fact, these two posts about the Apple II were written on a PowerBook G3 from 2001. Old things aren’t necessarily useless things… and in fact, in many ways they provide more possibility than the increasingly closed computing platforms of the modern era.

In an age where companies sue customers who attempt to repair their own products, there is something to be said for an old computer that still invites you to pop the top off and peer inside. It says that maybe the technology belongs to the user – and not the other way around. It says that maybe those objects that first inspired us to figure out how things work are worthwhile monuments that can remind us to keep learning – even after we retire. It says that maybe the lessons of the past are there to help us teach our kids to shape their world, and not to be shaped by it…

Thanks for visiting! If you missed Part 1, check it out here!

Maintainers of Lightning – Part 1

Of the subset of people who are into computers, there is a smaller subset (although bigger than you might think) that are into vintage computers. Of that subset, there’s a significant proportion with a specific affinity for old Apple hardware. Of that subset, there’s a smaller subset that are into restoring and preserving said hardware. For that group of people, something like a Lisa (the immediate precursor to the Macintosh) or a NeXT Cube (Steve Jobs’ follow-up to the Macintosh, after he was kicked out of Apple) holds special value. Something like an original Apple I board holds practically infinite value.

If you are in this sub-sub-sub-group of people, then you have undoubtedly come across the eBay auctions of one Dr. Stephen Buggie. You’ll know you’ve purchased something from him, because whether it’s as big as a power supply or as small as a single key from a keyboard, it comes packed with memorabilia of either the Apple II, or his other hobby, hunting for radioactive material in the New Mexico desert. Hand-written, photocopied, or on a floppy disk or DVD, he always includes these small gifts of obscura.

Last year, after the fifth or sixth time one of his eBay auctions helped me rescue an Apple II machine I was restoring, I reached out to Dr. Buggie and asked him if I could interview him for my blog. His online presence outside of eBay is relatively small. One of the few places you can find him is on YouTube, presenting at KansasFest, an annual meeting of sub-sub-sub-group nerds entirely focused on the Apple II line-up of computers. Not all of these folks are into restoration. Some are creating brand new inventions that extend the useful life of this 40+ year-old hardware. Some have been using the hardware continuously for 40+ years, and are still living the early PC wars, full of upstarts like Atari, Commodore and Amiga, gearing up for battle against IBM. Many still see the Macintosh as a usurper, and pledge more allegiance to the machine that Wozniak wrought than the platform that begot the MacBook, iPhone and other iJewelry that all the cool kids line up for today. If you search really hard, you may even find references to Dr. Buggie’s actual, although former, professional life: that of a world-traveling psychology professor. What you won’t find is any sort of social media presence — Facebook and Twitter don’t work on an Apple II, so Dr. Buggie doesn’t have much use for them.

The 5 Whys

What I wanted to know most from Buggie was: why Apple II? Why, after so many generations of computer technology have come and gone, is a retired professor still fixing up, using, and sourcing parts for a long-dead computer platform? I don’t think I got a straight answer to that: he likes bringing the Apple //c camping, because the portable form-factor and external power supply make it easy to connect and use in his camper. He likes the ][e because he used it in his classroom for decades and has specialized software for teaching and research. But as for why Apple II at all… well, mostly his reasons are inexplicable — at least in the face of the upgrade treadmill that Intel, Apple and the rest of the industry have conspired to set us all on. He uses his wife’s Dell to post things on eBay, and while still connected to the University, he acquiesced to having a more modern iMac installed when the I.T. department told him they would no longer support his Apple II. But if it were up to him, all of Dr. Buggie’s computing would probably happen on an Apple II…

OK, But What is an Apple II?

It’s probably worth pausing for a little history lesson, for the uninitiated. The Apple II was the second computer shipped by the two Steves, Jobs and Wozniak, in the late 70s. Their first computer, the Apple I (or just Apple), was a bare logic board intended for the hobbyist. It had everything you needed to make your own computer… except a case, power supply, keyboard, storage device or display! The Apple II changed everything. When it hit the market as a consumer-friendly device, complete in a well-made plastic case with an integrated power supply and keyboard, that could be hooked up to any household TV set, it was a revolution. With very little room for debate, most would agree that the Apple II was the first real personal computer.

Apple I, with aftermarket “accessories” (like a power supply!)

As with any technology, there were multiple iterations improving on the original. But each retained a reasonable degree of compatibility with previous hardware and software, establishing the Apple II as a platform with serious longevity – lasting more than a decade of shipping product. As it aged, Apple understood it would eventually need a successor, and embarked on multiple projects to accomplish that goal. The Apple III flopped, and Steve Jobs’ first pet project, the graphical-UI-based Lisa, was far too expensive for the mainstream. It wasn’t until Jobs got hold of a skunkworks project led by Jef Raskin that the Macintosh as we know it was born. By then, the Apple II was an institution; an aging, but solid, bedrock of personal computing.

The swan song of the Apple II platform was called the IIGS (the letters stood for Graphics and Sound), which supported most of the original Apple II software line-up, but introduced a GUI based on the Macintosh OS — except in color! The GS was packed with typical Wozniak wizardry, relying less on brute computing power (since Jobs insisted its clock speed be limited to avoid competing with the Macintosh) and more on clever tricks to coax almost magical capabilities out of the hardware (just try using a GS without its companion monitor, and you’ll see how much finesse went into making the hardware shine!)

But architecture, implementation, and company politics aside, there was one very big difference between the Apple II and the Macintosh. And although he may not say it outright, I believe it’s this difference that kept the flame alive for Stephen Buggie, and so many other like-minded nerds.

Image stolen from someone named Todd. I bet he and Dr. Buggie would get along famously.

That key difference? You could pop the top off.

I’ll cover why that’s important — both then and now — and more about my chat with Dr. Buggie in Part 2.

Install Microsoft Store Apps on Windows 10 LTSC

Switching my home computers to LTSC was the single best decision I’ve made for the health of my network — and for my sanity. If you don’t need all the latest bells and whistles — and, more importantly, if you’re driven insane by the constant feature updates that are often more painful to install than they’re worth — give serious thought to getting hold of LTSC. In particular, on my Mac Pro, every Windows feature update was a battle to keep it stable. The only feature I care about is that it starts up when I need it.

However, there is one (dubious) downside: LTSC does not include the Microsoft Store for getting access to apps. Lots of apps are available from alternate channels, but occasionally there’s one — like Microsoft’s To Do app — that’s only available in their app store. But don’t panic: you can still get these apps (assuming they’re free). It just takes a little more work…

Update: A reader made a comment below that provides an alternate approach — I haven’t tried it yet, but it’s a fantastic idea, so I’m adding it here. Thanks, Oliver!

Another option is to install the MS Store app itself. I used this scripted command on GitHub: https://github.com/channel168/LTSC-Add-MicrosoftStore#readme

Step 1 – Find the App on the Web

You’ll need a link to the app from a Microsoft website that has a “Get” button. Here’s a direct link to the Productivity section of the web-based version of the Microsoft Store:

https://www.microsoft.com/en-us/store/top-free/apps/pc?category=Productivity

The Web version of the Microsoft Store will provide the info you need to get the app without the store

Step 2 – Copy that Link

Grab the URL from your browser’s address bar. For Microsoft To Do, it looks like this:

https://www.microsoft.com/en-us/p/microsoft-to-do-lists-tasks-reminders/9nblggh5r558

Step 3 – Extract Download Links

Paste the URL into this site: https://store.rg-adguard.net/ and hit the checkmark.

AdGuard’s handy alternate store will tell you what packages are associated with the App you want

If all goes well, you’ll get a list of one or more packages. For To Do, there were 5 packages that I needed — but the results had almost 10 times that. Don’t worry, you won’t need them all. Start with the .appx or .appxbundle that looks like the app you want. Note: from the possible choices, you’ll want the latest version number and the correct processor architecture. If you’re running 64-bit Windows, you’ll want the x64 version of the app. Download it somewhere memorable.

Step 4 – Open up PowerShell

The command to install is a simple one:

Add-AppxPackage -Path "path-to-appx-you-downloaded"

When you run it, you’re likely to get a scary red error dump. Inside that message are helpful tips telling you that the app needs one of the other files from Step 3. Go get it — again paying attention to version number and processor architecture.

The PowerShell install error will tell you what dependency package to get next

Step 5 – Repeat

Run the Add-AppxPackage command on the dependency you just downloaded, then re-try Step 4. Each time it will tell you about another file it’s missing. If you’re feeling confident, you can guess ahead — but to be on the safe side, go file-by-file, grabbing exactly the one it complains about after each attempt at Step 4.
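
If you’d rather not play that guessing game by hand, you can script it. Below is a minimal PowerShell sketch that simply keeps re-trying every package you downloaded until they’ve all installed; the folder path is an assumption, so point it wherever you saved the files from Step 3:

# Folder containing the .appx/.appxbundle files downloaded in Step 3 (adjust to taste)
$folder = "$env:USERPROFILE\Downloads\AppxPackages"
$remaining = Get-ChildItem -Path "$folder\*" -Include *.appx, *.appxbundle

do {
    $lastCount = $remaining.Count
    $failed = @()
    foreach ($pkg in $remaining) {
        try {
            # Throws if one of this package's dependencies isn't installed yet
            Add-AppxPackage -Path $pkg.FullName -ErrorAction Stop
        } catch {
            $failed += $pkg  # re-try on the next pass, once its dependencies are in
        }
    }
    $remaining = @($failed)
} while ($remaining.Count -gt 0 -and $remaining.Count -lt $lastCount)

Each pass installs whatever can be installed, so the dependency order sorts itself out. The loop stops when everything is in, or when a pass makes no progress (meaning you’re missing a file and need another trip to Step 3).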

Eventually you’ll have all the dependencies installed, and the app you wanted will actually install — assuming it’s compatible with your version of Windows, you can now use it like normal!

I’ve got lots more ideas for keeping old systems alive — read more of them here!

Managing Social Media: Google

The company that started with the motto “Don’t Be Evil” has spent the last decade or so flirting with ideas that are awfully close to evil. That doesn’t mean that the organization is bad — any more than a hangnail means a human being is dying — but Google sure could use a pair of nail clippers.

When Gmail first came out, I was ecstatic to get an invite. They were transparent about the trade-off at the time, and we all accepted it as reasonable: Google has automated systems that read your mail so that they can personalize advertisements to your interests. If you send an email to someone about how you burnt your toast that morning, seeing ads for toasters in the afternoon seemed fairly innocuous — even a little amusing! At the time, though, Google’s coverage of your digital life was just search results. Adding e-mail felt natural and not really that intrusive.

Fast forward to today, and what Google knows about you is downright terrifying. They don’t just know where you go on the Internet, they know where you’ve been, how often, and when you’re likely to go there again in the real world. And their influence doesn’t stop at knowledge: because of the virtual monopoly of Chrome, and its underlying tech called Chromium which powers most browser alternatives, Google has started making unilateral decisions about how the Internet should work — all in their favor, of course. They don’t like it when you’re not online, because they stop getting data about you. And it doesn’t matter if you’re not using one of their properties directly, because 70% of the 10,000 most popular Internet destinations use Google Analytics. It’s actually a great product; it helps web developers understand their audience and build better offerings — and all Google wants in exchange is to know everything about you.

And let’s talk about YouTube, a virtually unavoidable Google property, full of useful content, and a site that historians might one day determine was a leading cause of the end of our democracy. YouTube is awful — and it’s entirely by accident. Google deflects privacy concerns by pointing out that the analysis of all this data is done by algorithms, not people. There’s probably no person at Google who actually knows how to gather all the information you’ve given them into a profile of you personally. But there doesn’t need to be: their software is sufficiently empowered to manipulate you in ways you aren’t equipped to resist.

YouTube’s recommendation algorithm has been disowned by its own creator as reckless and dangerous, and while it’s been tweaked since it was launched on the world like Skynet, the evil AI from the Terminator movie franchise, and now has human overseers to guide its machinations towards less destructive content, it’s still a pernicious and outsized influencer of human thought. Look no further than 2020’s rampant embrace of conspiracy theories for proof positive that recommendation engines are not our friends.

Google set out to do none of these things. I’ve been to their campus, and interviewed for jobs with their teams. To a fault, everyone I’ve met is full of idealism and optimism for the power of the Internet to empower individuals and improve society. I actually still like Google as a whole. But if the Internet is Pandora’s box, Google is the one that pried it open, and can’t quite figure out how to deal with what was inside. Humanity is not inherently good, and accelerating our lesser qualities isn’t having the positive outcome Google’s founders might have hoped for.

So, how do you throw the bath water out, but keep the baby? Can you use Google’s awesome tech, without contributing to the problems it creates? I don’t know, but here’s a few of the ideas we’re trying:

Diversify Your Information Holdings

I’ve said this before, and it bears repeating: don’t put all your eggs in the same basket. If you have a Google Mail account for work, have your personal account with another provider. If you use Google Classroom for school, use OneDrive for your private documents. If you have an Android phone, don’t put a Google Home in your bedroom. This isn’t just good security practice, preventing an attacker from gaining access to everything about you from a single hack; it’s good privacy practice. It limits the picture of you that any one service provider can make. Beware, though, of offerings that appear to be competitive, but are actually the same thing under the hood. The privacy browser Brave may tell a good story about how they’re protecting you, but their browser is based on Google’s Chromium, so it’s effectively the same as just using Google’s own browser.

Castrate the Algorithm

The YouTube recommendation engine is getting better. They’ve taken seriously the impact they’ve had, and they have smart people who care about this problem working on it. Until they get it right, though, you can install browser extensions that just turn it off altogether. You can still search YouTube for information you want, but you can avoid the dark rabbit trail that leads to increasingly extreme viewpoints. Choose carefully, because browser extensions are information collectors too — but here again, at least you’re diversifying.

Tighten the Purse Strings

Use an ad-blocker, and contribute less to their bottom line. Ad-blockers are relatively easy to use (again, reputation matters here), and available in multiple forms, from user-friendly browser extensions that can be toggled on and off, to nerd-friendly solutions you can run on a Raspberry Pi. We’ve eliminated about 80% of the ads we see on our home Internet by using DNS-level filtering — and it’s remarkably easy to do.
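
One popular way to get that kind of DNS-level filtering (an example, not necessarily the exact setup we run at home) is Pi-hole, which installs on a Raspberry Pi with the one-liner from its official documentation:

curl -sSL https://install.pi-hole.net | bash

Point your router’s DNS setting at the Pi’s IP address afterwards, and every device on your network gets the same filtering: no per-device setup, and no browser extensions required.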

Do a Privacy Check Up

I’ve been involved in software development for 20 years — data really does make software better — but did you know Google will willingly relinquish older data they have on you? All you have to do is ask. Whether you’re an active Google user, in the form of an Android device or one of their enterprise offerings (like Google Classroom), or just an occasional searcher with an account, you should take them up on this offer and crank up your privacy settings.

Search Elsewhere

Google is still pretty much the top of the heap as far as search results go, but they’re far from the only game in town — and the deltas shrink daily. Bing is remarkably close in the search race, although it’s backed by an equally gigantic corporation that is probably no more altruistic with its data acquisition, and DuckDuckGo does a decent job most of the time. Why not switch your default search engine to something other than Google, and switch back opportunistically if you can’t find what you need?

Check Who’s Watching

Just as Facebook has its fingers in most of the Internet, Google is everywhere. A service called Blacklight lets you plug in the address of your favorite website, then gives you a report on all the data collection services that website is cooperating with. The scariest ones are probably the ones you trust to give you news and information. Where possible, use RSS, anonymizers, or different browsers for different purposes… which brings me to my final suggestion.

Stop Using Chrome

Oh man, I could go on for pages about how scary Google’s control over the Internet has gotten — all because of Chromium. If you’re old enough to remember all the fears about Microsoft in the 90s, this should all seem familiar. Just like the PC was Microsoft’s playground, and anyone who tried to compete was in danger of being crushed under the grinding wheel of their ambition, the world wide web has become Google’s operating system, and Chrome is the shiny Start Menu that graces every screen. Almost everything uses Chromium these days, even Microsoft’s new Edge (Apple’s Safari, built on the related WebKit engine, is one of the few holdouts). It allows Google to basically dictate how the Internet should work, and while their intentions may be mostly good, the results will not be. I’m practically pleading with you: install Firefox and use it as your default browser; switch to a Chromium-based browser only when you have to.

If I sound like a paranoid old man by now, I’ve earned it. I’ve literally been working on the Internet my entire career — my first experiments in web development date back to 1996. I love this thing called the web, and generally Google has been good for it. But a democracy isn’t democratic if it’s ruled by a dictator, and the Internet isn’t open if it’s entirely controlled by Google. As citizens of cyberspace, you owe it to your community to help it stay healthy, and as individuals, you owe it to yourself to practice safe web surfing.

Multi Mac OS X Rescue Drive

Recently someone posted a great idea in a Facebook group for users of old Macs — but apparently wasn’t interested enough in the community to describe how it was accomplished. The idea is to have an external USB drive with multiple Mac OS installers on it, so you can restore or recover a wide range of Macs. Although all the instructions are online, they’re scattered across different sites and have to be pieced together. Here’s my attempt to collect everything you need to make a multi-OS recovery disk for almost any Mac made in the last 15 years.

Multi-boot Mac OS Recovery USB Drive

Things You’ll Need

  • A working Mac with a relatively modern OS (I used a 2008 MacBook Pro running a patched High Sierra)
  • The install media (disk, disk image or installer app) for each OS revision you’ll want (see links at the bottom)
  • An external USB drive of at least 100GB (an actual drive — not a USB key)
  • Optional: the DosDude patchers for any Mac OS you may want to shove on an unsupported Mac
  • Optional, but recommended: the latest Combo update for each Mac OS you may be installing (I’ll include links to download these at the bottom of this post)
  • Moderate proficiency with Apple’s Disk Utility and just a little bit of Terminal comfort

Note: You’ll notice that these instructions stop at Mojave. So do I. You could argue that Catalina’s murder of 32-bit apps was necessary for the ARM transition — and maybe that was right for Apple, but you can read why I don’t think it’s great for consumers.
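
To give a taste of the Terminal side before you dive in, here is a minimal sketch of the partitioning step and a single installer write. The disk identifier and volume names are made-up examples; run diskutil list first and substitute your own:

diskutil list
# Split the external drive (disk2 in this example) into one JHFS+ volume per OS
sudo diskutil partitionDisk disk2 4 GPT \
  JHFS+ "ElCapitan" 15g \
  JHFS+ "Sierra" 15g \
  JHFS+ "HighSierra" 30g \
  JHFS+ "Mojave" 40g
# For the newer releases, each installer app writes itself onto its volume:
sudo "/Applications/Install macOS Mojave.app/Contents/Resources/createinstallmedia" --volume /Volumes/Mojave

Older releases that predate createinstallmedia get restored onto their partitions from the install media’s disk image using Disk Utility instead.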

Continue reading “Multi Mac OS X Rescue Drive”