Making My Own Cloud

Nic and I got our first camera together a few months before our wedding — it was also our last film camera. I had owned a few digital cameras, but at the time the quality was low and the prices still high, and we wanted really good pictures that would last. So we got a good scanner to go along with our film, and resolved to begin preserving memories.

A couple of kids at Dunn’s River Falls, 2001

The subsequent economies of scale for digital cameras, and later the rapidly improving quality of phone cameras, resulted in a proliferation of photographs — and many hours spent on a strategy for organizing and storing them. This got even more complicated when our kids got phones, adding three new digital archivists who photograph and record everything.

The result is more than 140GB of photos over 20+ years, organized in folders by year, then by month. The brief metadata I can capture in a folder name is rarely enough to be able to quickly pinpoint a particular memory, but wading through the folders to find something is often a fun walk down memory lane anyway.
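That year-then-month hierarchy is simple enough to automate. As a minimal sketch — the `archive_photos` function and the exact folder naming here are my own illustration, not the actual tooling we use — a few lines of Python can file an inbox of photos by date:

```python
import shutil
from datetime import datetime
from pathlib import Path

def archive_photos(inbox: Path, archive: Path) -> None:
    """Move files from an inbox folder into archive/YYYY/MM-Month/."""
    for photo in inbox.iterdir():
        if not photo.is_file():
            continue
        # Use the file's modification time as a stand-in for the shot date;
        # reading EXIF data would be more accurate, but needs a third-party library.
        taken = datetime.fromtimestamp(photo.stat().st_mtime)
        dest = archive / f"{taken.year}" / f"{taken.month:02d}-{taken.strftime('%B')}"
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(photo), dest / photo.name)
```

Brief notes in the folder name still beat any automatic tagger I've tried; the script just saves the drag-and-drop.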

Apple's Digital Hub strategy of yester-year

Since I first started archiving photos, a number of technology solutions have come along claiming to do it better. Flickr, Google Photos, iPhoto, and Amazon Photos all made bold assertions that they could automatically organize photos for you, in exchange for a small fee for storing them. The automatic organization always sucked, and the fees were usually part of an ecosystem lock-in play. It seems nothing has been able to beat hierarchical directory trees yet.

Still, 140GB is a lot, and 20 years of memories can't be saved on a single hard drive — there's too much risk. Some kind of bulk back-up mechanism is important. For the past 8 years, we've used Microsoft's OneDrive. Their pricing is the best, and their sync clients work well on most platforms. They don't try to force any organization on you; it kinda just works.

Lately, though, they've begun playing into the "ecosystem lock" trap. The macOS client is rapidly dropping support for older (and more functional) versions of the OS. OneDrive is priced most attractively if you also subscribe to Office, which is also moving to the consumer treadmill model of questionable new features requiring newer hardware. It seems that in order to justify the subscription software model, vendors need us to abandon our hardware every 3-4 years. This artificial obsolescence must be the industry's answer to the fact that consumer computing innovation has plateaued, and there's no good reason to replace a computer less than 10 years old any more — and many good reasons not to.

Hidden in an unknown corner of Inner Mongolia is a toxic, nightmarish lake created by our thirst for smartphones, consumer gadgets and green tech. Read the article.

In short, while I’m not unhappy with OneDrive, and consider the current incarnation of Microsoft to be one of the more ethical tech giants, it was high time to begin exploring alternatives. If you’re sensing a theme here lately, it stems from a realization that if us nerds willingly surrender all our data to big companies, then what hope does the average person have of privacy in this connected age?

Fortunately, like with most other tech, there are open source alternatives. Some are only for those who can DIY, but a significant number are available for the less tech savvy willing to vote with their wallets. Both NextCloud and OwnCloud offer sync clients that are highly compatible with a range of operating systems and environments. Both are available as self-host and subscription systems — from a range of providers. At least for now, I’ve decided to self-host OwnCloud in Azure. This is largely because I get some free Azure as a work-perk. If that arrangement changes in the future, I’m very likely to subscribe to NextCloud provided by Hetzner, a privacy-conscious European service that costs less than $5.

My first digital camera, a Kodak DC20, circa 1997. Thanks, Mom and Dad!

Right now, our total synced storage needs for the family are under 300GB. I have another terabyte of historical software, a selection of which will remain in a free OneDrive account. The complete 1.3TB backup is on dual hard drives, one always stored offsite. This relatively small size is due to the fact that we stopped downloading video as soon as streaming services became a viable paid alternative — although that appears to be changing.

I started making money on computers as a teen by going around and fixing people's computers for them. Most made the same simple mistakes, were grateful for help, and were generally eager to learn. In 2022, I'm afraid we've all just surrendered to big tech — we've decided it's too hard to learn to manage our digital lives, so we let someone else do it; in exchange, we've stopped being customers and instead we've become the products. With our digital presence being so important, maybe it's time consumers decided we're not for sale.

Smart Phones Strike Back

Smart phones are getting interesting again — and I'm not talking about next week's Apple event, where they're sure to announce yet another set of incremental changes to the iPhone, or Samsung getting caught throttling apps on their most recent set of samey Galaxy devices. I'm talking about choice re-emerging. Slowly, mind you, and without the participation of the big, data-gobbling companies. Instead, it's brought to you by the same type of nerd that railed against the desktop duopoly in the '90s: the Linux crowd.

While Desktop Linux never really had its day, Linux has succeeded in a myriad of different ways: on embedded devices, web servers, and in tiny computers. It turns out there is a place for an alternate operating system — it just may look a little different than the computers you see at Walmart or Best Buy.

As consumers we’re fairly bereft on hardware choice — except for a few expensive experiments, smart phones all pretty much look the same these days. Software choice, too, is basically non-existent. Do you want to buy into Apple’s ecosystem, or Google’s? If you don’t want either, then you’re basically cut off from the world. No banking, no messaging, and thanks to the 3G shutdown, soon no phone calls either.

Those tackling the issue of hardware choice are a relatively brave few. The FairPhone isn't cheap, or terribly different looking, but it's a good effort at making a repairable handset. The PinePhone is a truly hacker-friendly device, with hardware switches to protect against snooping software, and an open bootloader capable of handing off to multiple operating systems. Both devices still look like rectangular slabs, but have important features that the hegemony is unwilling to consider.

It’s also interesting to see big manufacturers explore different form factors. I’m glad to see folding phones making a comeback — although they remain too expensive for my blood. There’s nothing approaching the 3G-era of creative and exploratory hardware design, though. When my Veer finally gets cut off from T-Mobile’s doomed 3G network this summer, there is unlikely to be a replacement quite so svelte.

But while it's true that hardware remains mostly yawn-worthy, smart phone operating systems are another story. While Windows Phone still has a few holdouts, and there's a handful of us holding on to webOS, most of the world has accepted either iOS or Android. But Linux is not to be counted out — and this is where things get really interesting.

This summer, I ran Sailfish OS for a couple weeks. Based on open-source Linux, but with some proprietary software, Sailfish is built by a Finnish company called Jolla, which offers it as a viable alternative platform and has had success certifying it for use in some European governments. The free base OS is quite stable (although I don't love some of their UI choices), and the paid version provides Microsoft Exchange enterprise email and a Google-free Android compatibility layer. The former is too buggy for serious use, in my opinion, and the latter suffers from some stability issues. But for basic use, on modern hardware, Sailfish is gradually becoming a legitimate contender.

Jolla is building a real alternative to Android

Other Linux-based offerings are at varying levels of maturity. Ubuntu Touch doesn't quite have the backing it did when initially launched, but it was a breeze to install on my Pixel 3A. Manjaro Linux is fast becoming the OS of choice for the PinePhone crowd — but I can't afford the hardware, so I haven't tried it out. And my own favorite, LuneOS, which attempts to restore webOS to its original mobile glory by porting Open webOS (and other) components along with a familiar UI and app support, continues its slow evolution.

Back in the infancy of the smart phone, I considered myself an early adopter. I stood in line for the first iPhone, having sold two of my most precious gadgets (my Newton 2000, and my iPod) to pay for it. While at Microsoft, I had virtually every Windows Phone 7 device available. And I've had numerous Android-based devices, and worked for 3 years on an open Android (AOSP) derived platform at Amazon. But the past 7-10 years have been depressingly boring. Watching people drool over the new iPhone that's basically the same as the last iPhone is infuriating — it's like society collectively agreed that we're not interested in investing in innovation, only in rewarding repetition.

Choices!

In early 2022, I can say unequivocally that most people don’t need, and won’t be satisfied with, a Linux-based phone right now. But for the first time in nearly a decade, my desk has an array of interesting smart phones on it again, and I’m optimistic that a couple of these new OSes might evolve to the point where those of us who aren’t willing to trade our privacy for the right to make and receive phone calls might finally have some choices again.

The Next Web

The tech bros have declared a new era of the Internet; they call it Web 3.0, or just web3. They claim this new era is about decentralization, but their claims are suspiciously linked to non-web-specific ideas like blockchain, cryptocurrency, and NFTs. I object to all of it.

I consider myself a child of Web 1.0 — I cut my teeth on primitive web development just as it was entering the mainstream. But I consider myself a parent of Web 2.0. Not that I was ever in the running to be a dot-com millionaire, but my career has largely been based on the development and employment of technologies related to the read/write web, and many of my own personal projects, such as this website, are an exploration of that era of the Internet. Web 2.0 is the cyberspace I am at home in. So one might accuse me of becoming a stodgy old man, refusing to don my VR headset and embrace a new world of DeFi and Crypto — and time will tell, maybe I am…

But it's my personal opinion that web3 is on the fast track to the Trough of Disillusionment, and that when the crypto bubble bursts, there will be much less actual change remaining in the rubble than the dot-com bubble burst left us with.

Before I dive in, let's clear up some terms. "Crypto" has always been short for cryptography. Only recently has the moniker been re-purposed as a short form for cryptographic currency. Crypto being equated with currency is a bit of a slight to a technology space that has a wide variety of applications. But fine, language evolves, so let's talk about crypto using its current meaning. Crypto includes Bitcoin, Dogecoin, Litecoin, Ethereum, and all the other memetic attempts at re-inventing currency. Most of these are garbage, backed by nothing except hype. People who are excited by them are probably the same people holding them, desperate to drive up their value. There are literally no "fundamentals" to look to in crypto — it's all speculative. But that doesn't mean they don't have value; the market is famously irrational, and things are worth what people think they're worth. Bitcoin, the original cryptocurrency, has fluctuating, but real, value, because virtual though it may be, it has increasing scarcity built-in.
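That built-in scarcity is concrete: the protocol halves the block reward every 210,000 blocks, using integer satoshi arithmetic, until the reward rounds down to zero — which caps total issuance just under 21 million BTC. A quick sketch of the arithmetic (my own illustration, using the protocol's published constants):

```python
def total_bitcoin_supply() -> float:
    """Sum block rewards across all halving epochs, returning whole BTC.

    Rewards are tracked in satoshis (1 BTC = 100,000,000 satoshis) and
    halved with integer division every 210,000 blocks, so the series
    terminates once the reward truncates to zero.
    """
    reward = 50 * 100_000_000   # initial reward: 50 BTC, in satoshis
    blocks_per_epoch = 210_000  # blocks between halvings
    total = 0
    while reward > 0:
        total += blocks_per_epoch * reward
        reward //= 2            # the halving
    return total / 100_000_000

# Comes out just under 21 million BTC — the famous hard cap.
```

Fiat currencies can always be printed; this geometric series cannot, which is the whole (and arguably only) fundamental behind Bitcoin's value.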

I stole this image from somewhere on the web. I assume someone else owns the NFT for it.

I’ve owned Bitcoin twice — and sold it twice. The first time was fairly early on, when I mined it myself on a Mac I rescued. What I mined was eventually worth about $800: enough to buy myself a newer and more powerful computer.

My next Bitcoin experiment was last year, where I bought a dip and sold it when it went high. I made about $300. I made more money buying and selling meme stocks with real companies behind them, so I have no plans to continue my Bitcoin experiments in the near future.

As a nerd who’s profited (lightly) off it, you’d think I’d be more of a proponent of digital currency, but the reality is that none of its purported benefits are actually real — and all of its unique dangers actually are. I won’t cite the whole article, but I need to do more than link to this excellent missive I found on the topic, because its much better written than I could manage, so here’s a relevant excerpt — go read the rest:

Bitcoin is touted as both a secure and non-inflationary asset, which brushes aside the fact that those things have never simultaneously been true. As we’ve seen, Bitcoin’s mining network, and therefore its security, is heavily subsidized by the issuance of new bitcoin, i.e. inflation. It’s true that bitcoin will eventually be non-inflationary (beginning in 2140), but whether it will remain secure in that state is an open question…

The security of bitcoin matters for two reasons. First, because bitcoin’s legitimacy as the crypto store of value stems from it. Out of a sea of coins, the two things that bitcoin indisputably ranks first in are its security and age. In combination, these make bitcoin a natural Schelling point for stored value. If halvings cause bitcoin’s security to fall behind another coin, its claim to the store of value throne becomes a more fragile one of age and inertia alone.

The second reason is the tail risk of an actual attack. I consider this a remote possibility over the timeframe of a decade, but with every halving it gets more imaginable. The more financialized bitcoin becomes, the more people who could benefit from a predictable and sharp price move, and an attack on the chain (and ensuing panic) would be one way for an unscrupulous investor to create one.

Paul Butler – Betting Against Bitcoin

I think a few cryptocurrencies will survive, and that blockchain as a technology may find a few persistent use-cases. But I'm also convinced that the current gold rush will eventually be seen as just that. Crypto isn't that much better than our current banking system: it's not faster, it doesn't offer any real privacy, it isn't truly decentralized, and blockchain itself is grossly inefficient. But that doesn't mean people won't try to find uses for it. Just look at NFTs!

NFTs, or Non-Fungible Tokens, employ blockchain technology to assert ownership of digital assets that are, by their very nature, fungible. You could, right now, take a screenshot of this website and share it with everyone. You could claim it was your website, and I couldn't stop you. The website itself would continue to be mine, and the DNS records for the domain will be under my ownership as long as I pay the annual renewal. But that screenshot you took can belong to anyone and everyone. This fact has caused no end of grief for media companies, who have been trying, almost since the dawn of the web, to control the distribution of digital media. No wonder NFTs are taking the world by storm right now: both the tech and investment community (who desperately need a reason to justify their blockchain investments) and the media conglomerates want them to be the answer. They're not — but that reality won't impact the hype cycle.

An NFT is an immutable record of ownership of a digital asset. The asset may be freely distributed, but the record of ownership remains, preserved in a blockchain that can be added to — but not removed from. It's like a receipt showing that you bought a JPG or GIF… except that this receipt imparts no rights. I can still freely copy that JPG, host it, post it, email it, and share it. The purchase record only serves as evidence that someone stupidly spent money (sometimes incredible amounts of money) for it. Everyone else just gets to benefit from the asset as before. If the stock market indicates the perceived value of holding a share of a company, the NFT market indicates that some people are willing to assign value to holding only the perception itself. It's ludicrous.

You can start to understand why some are calling web3 a giant Ponzi scheme. People who bought into ephemeral perceptions of value need to increase the perceived value of their non-existent holdings, so they hype them up and try to get others to buy in, to raise that perceived value — until at some point, the bubble bursts, and not only does all the perception disappear, but the basis of those perceptions mostly goes away too. Unlike the dot-com bubble, which left behind useful ideas to be gainfully employed in less foolish ways, the web3 bubble has created nothing of real value, and little useful technological innovation will remain.

All that said, I do despair for the state of Web 2.0 — it has been almost completely commercialized, and the flood of individual expression the "eternal September" has wrought clearly indicates that our Civilization of the Mind needed some better guardrails. I also believe in some of the concepts of decentralization that are being espoused for the next web. But I'd argue that hyped-up ideas of virtual ownership, meta realities, and the mistaken assumption of privacy that comes with cryptographic currency, are not the changes we need. The barrier to entry for the freedom of exchange shouldn't be based on how much money you're willing to gamble on dubious claims of hyped-up technologies…

I’m not sure I know what would be a fair and equitable way to approach Web 3.0. Web 1.0’s barriers were around a user’s ability to understand technology — a bar that ensured that most of its participants were relatively intelligent. Web 2.0’s democratization lowered that bar, enabling more diverse conversation and viewpoints (both good and bad), but also enabled a level of exploitation that even the original denizens of the web were unable to stop. The next web can’t be allowed to be more exploitative — and its participants shouldn’t be more easily misled. We need to learn from our mistakes and make something better.

Maybe I’m too old, or too disconnected from the metaverse and tech bro culture to influence much. But I observe that there are still wonderful communities on the web. They’re a little harder to find (sometimes deliberately so), but there are still (relatively small) groups of people who can converse civilly on a topic of shared interest. Since leaving Facebook, I’ve created and joined a few, and it gives me hope that technology can still be enabler for real communication and community. For some people, wearing a funny headset might be a part of that, and that’s OK. For others, it may mean speculating on stocks or new forms of stored value, and if they’re doing that responsibly, maybe that’s OK too. But if Silicon Valley decides that web3 requires crypto currency, NFTs and the metaverse, I think I’m going to skip that update… forever.

Alexa, shut up!

The next tech company we’re redefining our relationship with is Amazon. We’re not breaking up, but we are going to set some more healthy boundaries with the A to Z company.

Amazon has never been as creepy as Facebook, or more recently, Google. As a former employee, I can say with confidence that they are very careful with customer data — the data isn't the fuel for that particular machine, actual purchases provide that. But data is a valuable lubricant that keeps its gears turning. When I was there, all trackable customer data was immediately anonymized. The encrypted customer ID was a carefully guarded secret that didn't appear in any reports or analysis. But analysis was very much a part of the business. Data from different classes of customers was aggregated to create predictability and targetability — but never against an individual, only against kinds of shoppers. It's entirely likely that Facebook and Google behave the same way internally — but the difference is the amount of personally identifiable information they store. Amazon mostly just wants your address so they can ship you things.

No, Amazon’s guilt lies more in their impact on its work force. This isn’t entirely their fault, though. We all bought into the amazing convenience of clicking “Buy Now” and having something show up surprisingly fast — next, or even the same, day in some markets. It really is remarkable. At some point, though, to push that convenience to the next level, they have to start pushing against human limitations. Our laws and technology aren’t quite ready for drones to deliver things to our door, so to drive down costs and timelines, we need the people in the process to do more and do it faster. That means warehouse workers working harder, delivery drivers delivering more, and customer service serving more customers. The results are self-evident… the warehouse workers are ready to break, the drivers can’t get one, and the customer service has declined.

At the rate Ben’s toy drones crash, I’m not sure we’re ready for this tech anyway.

Last year we tried ditching Amazon Prime. Initially we expected it to be very difficult — we like getting free 2-day shipping; who doesn't?! But not having that convenience was a trigger to look elsewhere. We tried to buy local more, or at least spread the purchases to some different big box retailers. Sometimes we paid a little more, usually it took longer to arrive, but at least we weren't the ones making some fulfillment center employee skip a bathroom break, or some Amazon driver pee in a bottle. It wasn't more convenient, but it did feel more human. The experiment was a success, and despite Amazon's attempts to sign us back up at every turn, we won't be joining Prime again (except maybe for a month when the new Jack Ryan season comes out.)

Also getting significantly cut back at our house is Amazon’s robotic offspring, Alexa. My voice was one of those that helped train her — we had early prototypes in our home, listening to our conversations and performing daily training. Initially they wanted only born-and-raised US English speakers, but I convinced the dev team that my background was so diverse that Alexa wouldn’t be confused by any strong Canadian accents or turns of phrase, so they let me bring her home. It was exciting being a part of making a voice interface an actual reality, and we soon had an Alexa device within earshot of almost every part of our home (although she’s never been allowed in a bedroom.)

However, like many other innovative products from that business unit, Alexa is in a tough spot in 2022. It's great when invention starts out for the pure nerdy joy of creating something new, and I'll always treasure the memories I have of working on so many secret projects. But all gadgets eventually need a way to support themselves — a business model that justifies their existence. Alexa was created with the distant possibility in mind that customers might one day shop with it, or that Alexa users might become more loyal Amazon customers, and thus influence revenue indirectly. In reality, most orders created by voice were probably mistakes, and Alexa herself is… well, she's becoming rather annoying.

Shut up, Alexa!

As they’ve added more features, the machine learning algorithm has not improved. This means she gets confused more easily. Our programmed routine to dim the lights for a movie results in instructions on how to make popcorn about 50% of the time.

And if that’s not bad enough, in a desperate attempt at relevance, she’s started notifying us of things… daily. Batteries are on sale! That thing I bought a month ago needs a review! Did we know she could help us with our mental health through daily meditation? Not content to sit and listen quietly, nor to exist just to turn on our lights or spell a word for the kids, Alexa has begun insisting that she needs to be a bigger part of our life. Our reaction to this increasingly pushy robot is to just unplug her. We have three kids, a cat and a bunch of chickens. We haven’t got time for a needy smart speaker too.

So, two Echo devices are being replaced with Apple HomePod Minis. They're significantly less capable, but also exponentially less demanding. The rest of the Echos, save one in the kitchen, will be decommissioned. Occasionally that'll mean turning off a light the old-fashioned way. So be it.

Some tech companies are reaching the point of being irredeemable. I don't think Amazon is there. But I do think it's an awfully large commerce engine, barrelling down the information super highway, and it might not be a bad idea for us as consumers to post some speed limits — or for those behind the wheel to tap the brakes every now and then.

Besides, Jeff Bezos is fast becoming a super villain. He doesn’t need any more of my money…

So I Tied An Onion To My Belt – 2021 Edition

It's hard to remember the optimism with which we greeted 2021. Sure, Trumpian insanity was yet to peak, but the vaccine was here, people were getting it, and it really looked like we were about to turn the corner on this whole Covid-19 thing. The close of 2021, and the shadow of Delta and Omicron, proved that the glass wasn't as close to half full as we hoped, but that doesn't mean it was a bad year. For a while there, things almost felt… normal.

Spring and summer saw us cautiously stretching our legs beyond the confines of quarantine, and evidenced the inexorable impact of adolescence on our teenagers — with all the tumult and change that comes with it.

You can call them the “blunder years” if you’d like, but they’ve handled it well so far. Pushing themselves to try new things, and showing leadership in their peer group. Abi signed up for lacrosse this year; they all talked me into (temporarily) fostering a dog; and both Ben and Abi were among the first kids in our area to get their vaccine shots — a choice we left entirely up to them. Eli wasn’t left out either, hitting double digits, and qualifying for her shots later in the year.

As summer called, and case counts plummeted, we enjoyed the return to some of our normal patterns. We welcomed our first visitor since the pandemic hit, had a blast at Family Camp, and made the challenging-but-worthwhile trip across the border, which allowed us multiple wonderful reunions. Ben hit another milestone, this time in his spiritual growth, as he obeyed Jesus into the waters of baptism.

Our very "scenic route" trip home allowed us even more visits and experiences as we trekked across the eastern seaboard of Canada and the US, before returning home to face some challenges. Nicole and I learned how to remove and hang a door. Abi started her multi-year experience with orthodontic work. And Ben worked his first job — manning a booth at a video game conference (not a bad way to start his professional life!)

Still, the impact of 2020 created more constraint than we would have liked, so in 2021 we made it a point to fulfill some delayed promises, and the older two each got a special adventure — albeit somewhat late. Things were still fairly normal as fall hit, and we got to enjoy some of the beauty around home, but once again, the colder weather brought rising cases and new restrictions. It seems like something of a miracle that we managed to make it home for Christmas again this year.

Just look at how much the kids have grown in the past 12 months. Physically, it's quite obvious in our teenagers (Ben's voice dropped an octave!), but we're most proud of who all of them are becoming. Their temperaments are obviously unique (and sometimes get on each other's nerves!) but most of the time, they genuinely try to put others first, embrace the world and all its challenges with courage and a sense of adventure, and have weathered the pandemic like troopers — accepting limitations when prudent, and enthusiastically saddling up for new experiences whenever we get the opportunity.

At this point, it would seem entirely pointless and foolish to try to predict how 2022 will go. But as much as we can, we hope to enjoy our kids’ energy and excitement to explore new things and new places. We are so grateful to God for giving us each of them, and for blessing us with another year of adventures with them. I’ll let Eli close this year, by telling you about one of hers…

A Bicycle Built for Two

In 2001: A Space Odyssey, the on-board artificial intelligence, named HAL 9000, logic bombs when its mission imperatives appear threatened by the behavior of the humans on-board. As a result, HAL turns homicidal, and the remaining crew member, Dave, is forced to shut it down. As he moves to disable the computer's brain, HAL tries to convince Dave to stop — bargaining, threatening and even begging. Eventually it regresses to its early training, and begins singing the song its creator first taught it (A Bicycle Built for Two), its creepy voice dropping in speed and pitch until it… dies. It's a surprisingly poignant moment in a movie that explores the origin — and future — of life.

The experience of shutting down Facebook was remarkably similar. Deactivating an account still leaves your data with Facebook, so I didn't just want to turn it off — I wanted to delete my data from their servers. Facebook offers a few ways to delete data in bulk, but each of them only works for a few chunks of data before they start mysteriously failing. In the "Manage Activity" section, you can select data by month for bulk actions, up to 50 items at a time, but after doing that a couple times, suddenly the "Delete" option becomes unavailable. In the "My Posts" section, you can delete by year, but after deleting a few items in a given year, it fails and the whole page gets replaced by an error message.

This is what happens if you try to remove too much of your data from Facebook

I even purchased a Chrome extension for $5 that goes through your activity and clicks delete for you automatically. But again, after a few passes, the pages fail to load and the extension gets stuck. Reloading doesn't fix the bug, nor does switching computers, but if you come back a couple days later, things work again — for a while. And each time you log in to try to clean up, Facebook pops up memories to remind you of the good times you posted about on their servers. It's very much like Meta's brain knows what you're trying to do, and is bargaining with you: "This is too hard, why don't you just keep your account?" "Do you really want to delete all these memories of your kids?" "Why don't you think about it for awhile, and come back after you've calmed down…"

Eventually, I had to sit down and go through every post from every day, every month, and every year, for the 14 years I’ve been on Facebook and click “Delete” followed by “Yes” on each and every piece of data they had on me. It took over 8 hours of my holiday, over multiple days — and even then, they’ll helpfully store it in the trash for another 30 days — just in case I change my mind. The AI does not want to give up its precious data.

Are we sure this guy isn’t an evil robot that feeds on user data?

The thing is, even 14 years ago, I knew better than to make Facebook the sole storage location for our precious memories. Our photos are stored in OneDrive, synced to multiple computers, and in cold-storage on a pair of hard drives that rotate into an actual safe in a bank vault every 6 months. It’d take a nuclear war to eliminate our pictures of our kids. Of course, there are a variety of cute little updates from when the kids were little that almost jerked out some tears as I deleted them from Facebook — but I downloaded a copy of my Facebook data before I took that action, so now I have those backed up too:

Abi (6): I brushed my hair and Eli’s hair too!

Eli (3): “Dankoo” — meaning please and thank you

Ben (5): Daddy, what IS Elmo?

And there’s this blog. Some years I’ve blogged a lot, some very little. But this site has been here since Nicole and I were engaged. The data on it was made by us, the servers that store it are managed by me, and while Google is free to crawl it, I still own it, and I get to decide what to do with it. Years ago, as our kids first started getting online, we decided we wanted them to have control over their online presence, so we scrubbed our full names from the site. This site used to be the top hit when you Googled my full name. Now, a dozen other people who share my name are welcome to have top billing. Our friends and family know where to find us, and while maybe it’s not as convenient as Facebook, at least this is ours — and if I ever need to unplug it, there’s just one power cord in the basement that’ll do the trick. It’s just our little corner of the Internet, but it was built for us two…

I’m not becoming a virtual hermit: I’m on Twitter and Discord and LinkedIn, and I’m experimenting with Mastodon and PixelFed. There’s lots of places to find me online — Facebook even still gets my data in the form of Messenger. But they’ve shown themselves to be untrustworthy, and if not intentionally evil, at least incapable of being good.

In Defense of Elizabeth Holmes

The tech world’s latest villain is a woman named Elizabeth Holmes. You may have heard about her trial on the news, or maybe seen the documentary about her fall from grace. Maybe you even read the Wall Street Journal exposé that started her downward spiral. We don’t know much about her as a person, save for the carefully curated public image she presented to the world during her glory days — when she was called a female Steve Jobs, and her company, the Apple Computers of the biotech world. But if the media is to be believed, we now know that she is definitely evil.

Her defense at her trial attempted to position her as a victim: she was manipulated by the men in her life, and allegedly abused sexually, which somehow explains why she lied to investors and carried on the shell game that was Theranos for so long. As I write this, the jury is literally still out on whether or not her sob story will garner her any leniency. But the facts of the case are clear: her tech didn’t work, she knew it didn’t work, she lied about it, and — for a time — got rich and famous off of those lies. Eventually her sin, and that of her company, was discovered, and it all unraveled. Her company was destroyed, she was destroyed, and the hope she’d created of a new and easier way to perform medical testing was shown to be a pipe dream. She sold snake oil with the best of them, then got caught. She should be punished.

But I object to the notion that Elizabeth Holmes is some kind of super villain, the first of her kind to come up with this scheme, and pull one over on an innocent and trusting stock market. Heck, Silicon Valley is the home of “fake it til you make it” — an entire culture built on selling B.S. to complicit investors, with the hope of making some unlikely vision a reality. No, I think Elizabeth Holmes’ real crime is that she played the Tech Bro game while being female. She did literally the same thing that a thousand other “entrepreneurs” have done before her, and she did it really, really well for years. She had people believing in, not just her idea, but in her. And like so many before her, she failed. But instead of the embarrassment of that failure being limited to herself, her employees and her shareholders, the embarrassment blew back on a woke culture that needed her to be some kind of proof point, some kind of vindicating example of how feminism can destroy the patriarchy. Elizabeth Holmes wasn’t just a tech hero, she was a culture war hero — and she betrayed those culture warriors by being… the same species as the men she was emulating. Not better than, not some kind of feminist ideal; just another flawed human being. Someone who tried to do something really important, and really hard, and who pushed herself and everyone around her to reach for something insanely great… then when it became evident that it wasn’t working, tried to buy time, stretch the truth, change the goalposts — anything to avoid admitting defeat.

Was she wrong? Definitely.

Was her behavior completely human? Totally.

These are her notes to herself. Comments online suggest these are proof of her manipulative nature. But I don’t see the notes of a conniving con artist, set on deceiving and destroying her way to the top. I see the notes of a struggling human person, trying to discipline her own flaws, work hard enough to be taken seriously, and actually accomplish something for herself and those who depend on her — someone trying to do all of that in a place that is disproportionately male-oriented. She didn’t set out to deceive; she set out to succeed. And the fact that she didn’t doesn’t change her worth as a person. The fact that she couldn’t bear to face defeat and lied about it doesn’t mean she has betrayed her gender and should be burned at the stake; it just means she made the same set of mistakes most of us would make in her shoes: the same mistakes that literally hundreds of wannabe Silicon Valley founders have made before her. She did all of that while female, but she should face no greater consequences than her male counterparts — many of whom are lauded for the same kinds of failure.

I’m not opposed to feminism: I have two daughters, and I want them to be and do whatever they want with their lives. I am opposed to the backlash she’s receiving from the culture she embodied. I’m saying that gender — and culture — factored into the job she was trying to do, and unfairly so. And I’m saying neither should factor into the punishment she faces for failing at that job. She doesn’t owe society some special penance because she didn’t succeed at “leaning in.” She’s just a human being who tried something really hard, and failed at it ungracefully. Here’s a list of mostly male-founded companies that fared no better. Let’s not judge her any more harshly for playing the same game while female.

The Year of Linux on the Desktop

One of the first times I heard about Linux was during my freshman year at college. The first-year curriculum for Computer Programmer/Analyst was not particularly challenging — they used it as a level-set year, and assumed everyone was new to computers and programming. I’d been trying to teach myself programming since I was 11, and I was convinced I knew everything (a conviction solidified when one of the profs asked me to help him update his curriculum). I don’t even remember what class it was for, but we were each to present some technology, and this kid, who looked to be straight out of the movie Hackers, got up and preached a sermon about a new OS that was going to change the world: Linux. Now there was something I didn’t know…

I’d actually downloaded a distro once, and tried to get it going — I figured it would complete the set, along with OS/2 Warp and Windows NT 4 — but hardware support wasn’t there, so I gave up. But after that class, I went home and downloaded Slackware and tried it again. Everything went swimmingly until I figured out that to get the proper resolution out of my graphics card I would have to recompile the Linux kernel myself! I decided this was another level of nerdy, and that I had better things to do. Besides, I already had an alternative-OS religion I was faithful to: the barely-hanging-on MacOS.

It was around the same time that the “second coming” of Steve Jobs was occurring, and the battered Mac community was excited about their salvation. The iMac wowed everyone, and OS X provided the real possibility that Macintosh might survive to see the 21st century after all. I went irresponsibly into debt buying myself an iMac SE — an offering to the fruit company that would re-occur with almost every generation of Apple technology. I sold all my toys and stood in line for the first iPhone. I bent over backward to get someone to buy me the first MacBook Pro with a 64-bit Intel processor. My first Mac ever was rescued from a dumpster, and I’ve saved many more since.

My loving bride has since (mostly) cured me of my irresponsible gadget spending, and I’ve made something of a hobby of fixing up high-end Apple computers a few years after their prime, and pressing them back into service. I edit video on a Mac Pro from 2009 — it would have retailed for more than $20k new, but I got it for $800, and have since put maybe $300 of upgrades into it. It’s a fantastic machine that runs 5 different operating systems, and I will keep it useful for as long as I can. Similarly, for the past 4 years I’ve done hobby development on a 2008 MacBook Pro: another fantastic machine, and while I’ve had better work laptops, I’ve loved none of them like I love that one.

But even I’m having trouble keeping it useful. A newer OS than it was ever supposed to run has been hacked onto it to keep it going, but Apple’s treadmill continues, and one by one, its useful capabilities have dropped off: first I couldn’t update Office, then I couldn’t sync Dropbox, OneDrive went next, and now Firefox is about to drop support. I need special software to do thermal management, since the OS doesn’t support the hardware, and despite having torn it down to re-apply thermal paste multiple times, it runs hot, the fans spin loudly, and it eats through batteries. It could be argued that all you really need is a web browser, a text editor and a Terminal window… but a few more modern accoutrements would be nice. After 14 years of service, I have to admit, it’s time to let the old girl go…

But the Apple of today is not the company I so slavishly followed in my youth. The Intel years were some of their best: upgradable, repairable hardware in beautiful cases. Now Apple exclusively makes disposable computers, increasingly dumbed down for the masses. What’s a nerd to do?

It’s a bit of a joke in the geek world that the year of “Linux On the Desktop” is right around the corner. Since I first played with it in the late 90s, it’s come a long way — never quite winning popular support amongst the aforementioned masses, but finding its way in other places. If your phone doesn’t run iOS, it runs a version of Linux. Chromebooks also run Linux. And the Raspberry Pi, my favorite home-automating/web-serving/hardware-hacking/education-enabling $30 computer, runs Linux. In fact, I’ve been running Linux for the past 10 years — just never on the desktop.

So for this nerd, 2022 will be my year of Linux On the Desktop. My faithful 2008 MacBook Pro has found a worthy successor in the form of a 2018 ThinkPad X1 Carbon. Purchased for less than $400 on eBay, and needing only a new M.2 drive, this thing screams. Its soft-touch carbon fibre case is almost as sexy as a Mac’s, but the insides are twice as capable — while running silently most of the time. It’s light but solid, with an excellent keyboard with the perfect amount of travel, and a full array of useful ports. Its inner bits are easily accessed via a couple of captive screws, making it simple to repair, and it was designed with Linux in mind, fully supported by the manufacturer on the latest Ubuntu.

So far, I’ve found no downside to Ubuntu’s flavor of Linux on the Desktop — at least for development use. All my favorite programming environments are supported: Python, C#, JavaScript and the LAMP stack. Ubuntu is nicely polished, and the native Bash shell lets me do anything I want to the underlying OS, or to any remote server. It connects to my old gadgets and my new ones, and running full out, it still gets 5 hours of battery life.

I’ll continue experimenting with alternative phone OSes too. webOS won’t survive the 3G apocalypse, but Sailfish OS is promising, and having newer mobile hardware is nice — not to mention it allows me to opt out of Google’s data collection ecosystem. We’ll see if this machine makes it the full 14 years my MacBook Pro did — but I’m willing to bet it outlasts any Mac sold in the past year…