These are not even my pants

My father-in-law is a big fan of A&E’s cop show “Live PD.” In one scene, a man is questioned about an open container (a bottle of Rolling Rock) and he is subsequently patted down. When the pat-down uncovers a small quantity of marijuana, the officer asks about it and the suspect starts to respond, then quickly shifts to denying that the pants he is wearing are his own. “These are not even my pants; these are my friend’s pants.”

It’s a good illustration of a bad lie quickly falling apart. The weed was wrapped in an auto parts store receipt from earlier the same afternoon. Initially the suspect denied having gone to the store. Then he acknowledged putting the pants on that morning. When confronted with the time printed on the receipt, the suspect admitted that yes, he did go to the store, but still claimed that he did not know how the weed got in the pants pocket. Yeah, right.

This week the governor of Virginia was exposed as having a photograph on his 1984 medical school yearbook page depicting two men holding cans of Budweiser: one in blackface and the other in a KKK costume.

Governor Ralph Northam apologized on Friday for his decision to appear in the photograph. On Saturday he did a 180 and denied being either of the men in the photograph. We can imagine him saying “That’s not even my hood. That’s my friend’s hood.”

If he thought it would help him, Northam would probably say that he is BOTH people in the photograph, along with some self-serving nonsense about how the photo proved that he was “woke” thirty years before the word took on its present-day meaning.

People are going to see that 1984 photograph through the lens of their own life experience and come to all sorts of different conclusions. However, I think that most will agree that dressing up in blackface and Klan gear, posing for a picture in costume, and posting that picture on a yearbook page were all stupid, offensive things to do — even in 1984.

As my wife pointed out (and I had completely overlooked), since both men are disguised — one in blackface and one in a hood — we may never know for sure who those people are. Is it too much of a stretch to consider that one of the governor’s advisors came to the same conclusion after Northam had admitted being one of the men? A voice in Northam’s ear, or one in his own head, told him to deny that he was in the picture.

So, what might have been a teachable moment about the stupid, offensive things we sometimes do and say, and how attitudes can change, is now a conversation about how someone’s story changes when the heat is on.

Today a Washington Post editorial called for Northam’s resignation.

I agree.

A call for Northam’s resignation is not about “political correctness,” as many people will assert. It is not mere political correctness to demand that white people stop blackening our faces for laughs. It is also not seeking some kind of “ideological purity” to expect our elected representatives to represent ALL of us. Democrats should have a reasonable expectation that our candidates share core Democratic values. We should also require them to meet high standards of conduct.

People do and say all sorts of stupid, offensive stuff. That’s ALL people — me, you, everybody. There is no one who can run for office plausibly claiming otherwise. What’s more, in 2019 there are any number of ways to find out what people have done and said.

How is it, then, that we keep electing people who pretend that some of their life history simply never happened? We don’t have to look beyond the White House for prime examples. Sure, you thought I meant the current occupant of the White House, and I did, but I was also thinking of another president who declared “I did not have sexual relations with that woman…” and yet another who said “…people have got to know whether or not their president is a crook. Well, I am not a crook.”

Right, and these aren’t even my pants.

There is a weird sort of denial of reality that happens when people are caught in the act.

A lot of people who run for office seem to think that some of the stuff they’ve done and said is just never going to come out. When what you’ve done is publish an offensive photo in a yearbook, did you really think that nobody would notice? Did you not consider that somebody out there doesn’t especially like you or what you stand for?

Avoiding detection is a bad plan when you run for public office and have a history of racist photos, domestic abuse arrests, recorded comments about grabbing pussies, etc. We live in a digital age and our lives are ever more thoroughly documented. Whether that’s a good thing is debatable, but there is no denying that it is happening.

Yet some of our candidates pretend that some part of their documented past is invisible or just doesn’t matter, and we keep electing them.

A lot of people are saying that Democrats are shooting ourselves in the foot by pushing some of our own, such as Al Franken, out the door. Well, in the short term, maybe. In the long term I have to believe that purging bad actors is the right thing to do. We need to get them out.

Republicans, are you listening?

Northam must go. The Democrats around him, especially Virginians, should be urging him publicly to resign — not just because he did something stupid and offensive in 1984 and hoped it would go away, but because he tried to wiggle out of it when it became public.

Northam’s credibility is shot and it’s tough to govern if nobody trusts you.


Rock and Roll has been going downhill since when?

Last summer I stumbled across an opinion piece on nbcnews.com entitled “Passing Michael Jackson, the Eagles now have the best-selling album of all time. And they’re still terrible.”

This thing has been bugging me ever since.

My wife has wondered aloud why I would be bothered by something so trivial, and that’s a fair question. First, it’s not a bias against the author, Jeff Slate (whom I do not know or follow), nor against music critics in general. Yes, it’s true that my first wife had some kind of extraprofessional relationship with a music critic. That critic also said some nasty things about a musician I liked on the occasion of the artist’s death. I don’t dislike all music critics just because I knew one with some character defects.

And my beef with the article isn’t just because I like the Eagles. Yes, I said it: I like the Eagles.

My dislike of the column has a lot more to do with its tone and its sophistry.

Mr. Slate wrote his diatribe after reading an AP News article about the Eagles’ first greatest hits collection being certified by the Recording Industry Association of America as the best-selling album of all time, with over 38 million units sold. This is, it seems, an example of nothing failing like success. Mind you, no one is saying that the album at issue, the Eagles’ “Their Greatest Hits 1971-1975,” is the BEST album ever recorded, just that it has sold the most copies.

Is there something wrong with selling something that a lot of people want to buy? Apparently so. The author compares the record to junk food and characterizes the Eagles’ recordings generally as “bland, soulless music” that “sounded like a cocaine bender coming out of my speakers…”

Slate accuses the band — which he says isn’t a real band, but an act or a group — of a “coldblooded pursuit of stardom,” disbanding in 1980 “amid petty squabbling over (of course) money and control,” and lacking “a keen connection to and appreciation for their audience.”

At one point, Slate flatly states, “Indeed, I believe that through sheer greed and avarice, they have single-handedly brought on the long, slow decline of rock ‘n’ roll as an art form.”

There’s a lot to dislike in that last statement. For starters, is it universally agreed that rock and roll is an art form in a long, slow decline? This assertion immediately reminded me of a scene from 1973’s “American Graffiti.” John Milner, who got stuck with underage Carol as a cruising passenger, turns off the radio while the Beach Boys’ “Surfin’ Safari” is playing. Carol asks why he did that, to which Milner responds: “I don’t like that surfing shit. Rock ‘n Roll’s been going downhill ever since Buddy Holly died.”

Rock ‘n roll has been in a damned long, slow decline indeed if it started when Buddy Holly died, which was sixty years ago this week. For Jeff Slate, somehow, it was brought about by something the Eagles did.

Slate evokes a different movie scene to underscore his dislike of the Eagles: a scene from the Coen brothers’ 1998 cult hit, “The Big Lebowski.” Lebowski asks a cab driver to change the channel because he’s had a rough night and hates the bleepin’ Eagles. Lebowski’s favorites run more along the lines of Creedence Clearwater Revival, who had their heyday in the late 60’s and early 70’s. To each his own.

Slate himself seems partial to the Clash, who were never anything special to me.

Whether the Beach Boys are better or worse than Buddy Holly, or the Eagles are better or worse than Creedence Clearwater Revival, or Joe Strummer and the Clash are “the only band that matters” is largely a matter of personal preference.

What really bugs me the most about Jeff Slate’s hit piece on the Eagles is that it isn’t just a criticism of their greatest hits collection or even their music in general. It’s an attack on the Eagles as people. A music criticism piece has no business attacking musicians, even if the critic doesn’t like the music or the people making it.

And last of all, music criticism should not attack the listener, who may also be a reader. Is the music critic’s taste somehow better than everybody else’s? This seems to be the idea that Mr. Slate is trying to convey. Eagles music is aural junk food, the Eagles themselves are bad people (except for Joe Walsh, who somehow gets a pass), and Eagles fans are just being duped into paying for and listening to stuff that is the very ruination of rock and roll. We’re all a bunch of fools for being taken in.

There is an ongoing debate as to whether the Eagles are great or the Eagles suck. I liked some of their songs and didn’t care for others. Has some of it been played nearly to death? Undoubtedly. And they say that familiarity breeds contempt. But really, does anyone think that they’re going to persuade anybody else one way or the other?

What brings about the perception that some musical genre is in decline, I think, is that we grow up, but the music of our youth will always have a special place in our hearts. A particular song instantly transports us to a time or a place we remember fondly.

Have the Eagles brought about the long, slow decline of rock and roll as an art form? Of course not. I can think of some artists and songs that I especially dislike, and generally, instead of railing against them, I simply change the station.


Wasted Away Again In Unsupportedville

As promised in my last blog entry, I spent a little time test driving Ubuntu, a free, open-source Linux operating system. Ubuntu is published by a UK company called Canonical. It is billed on its website as “The leading operating system for PC’s, IoT devices, servers, and the cloud.” I wanted to evaluate Ubuntu as a possible alternative to Windows for day-to-day PC operations. I’ll cut right to the chase: for me, it’s not a viable alternative.

Ubuntu was relatively quick and easy for me to get up and running. I found a pretty good document on Lifewire which helped me set up a spare laptop (our old Dell Latitude E6520) to dual boot Windows and Ubuntu. Dual booting allows the user to choose between two operating systems at startup.

I downloaded the latest LTS (long-term support) version of Ubuntu, released in the spring of 2018 and patched afterward, then created a bootable flash drive and ran through the setup without problems or long delays. Setup was pretty straightforward and all of the hardware was supported — two obvious strengths of Ubuntu, especially compared to older versions of Windows. Most computer users buy a computer with an operating system already installed and never change it, except for updates. Therefore, if you want to compete with Microsoft and Apple for installed base, you have to make it as painless as possible.
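
A side note on that flash-drive step: before writing a downloaded installer image to USB, it is worth checking the file against the SHA256 checksum Ubuntu publishes alongside each release. Here is a minimal sketch in Python; the ISO filename and the checksum value are placeholders you would swap for your own download and the published value.

```python
import hashlib

# Assumed filename for the 18.04 LTS desktop image; substitute whatever you downloaded.
ISO_PATH = "ubuntu-18.04.1-desktop-amd64.iso"
# Placeholder; paste the SHA256 value published on Ubuntu's release page.
EXPECTED_SHA256 = "paste-the-published-checksum-here"

sha256 = hashlib.sha256()
with open(ISO_PATH, "rb") as iso:
    # Read in 1 MB chunks so a roughly 2 GB image never has to fit in memory at once.
    for chunk in iter(lambda: iso.read(1024 * 1024), b""):
        sha256.update(chunk)

computed = sha256.hexdigest()
if computed == EXPECTED_SHA256.lower():
    print("Checksum matches; the image should be safe to write to the flash drive.")
else:
    print("Checksum MISMATCH; re-download the image before using it.")
    print("Computed:", computed)
```

It takes a minute to run on a file that size, but it beats discovering halfway through an install that the download was corrupted.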

Unfortunately, it didn’t take me long to feel the pain. One of the many applications bundled with Ubuntu is Mozilla Thunderbird, a popular email client that probably works well for a lot of folks. But one weekend last autumn, for reasons that aren’t entirely clear, Thunderbird stopped working correctly with Comcast email accounts and it still hasn’t been fixed. Yes, it’s possible — though not easy — to find information about the problem, along with conflicting suggestions about fixing it. I implemented a fix that required pointing the program to a server that neither Mozilla nor Comcast seemed to know much about. My email then worked.

Problem solved, right? Not exactly. The fix required setting a security exception to ignore a security certificate discrepancy. Doing so was against the guidance provided by Thunderbird itself, but I was just trying to get into my own email. Within minutes, however, I thought “What the hell am I doing?” I realized that I was getting my email via a server I knew nothing about, aside from the fact that it had a sketchy security certificate and I was accessing it with my UNENCRYPTED email password. This is something that I never would have recommended to anyone else.
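
In hindsight, there is a quick sanity check that would have told me the same thing before I typed a password anywhere: ask whether the server even presents a certificate that validates for its hostname. Below is a minimal sketch using Python’s standard library; the hostname and port are placeholders, not the actual server from that workaround.

```python
import socket
import ssl

# Placeholder values; substitute the mail server and port you are being asked to trust.
HOST = "imap.example.com"
PORT = 993  # IMAP over TLS

context = ssl.create_default_context()  # verifies against the system's trusted CAs

try:
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        # The TLS handshake fails right here if the certificate does not validate for HOST.
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            cert = tls.getpeercert()
            print("Certificate validated for", HOST)
            print("Issued by:", dict(pair[0] for pair in cert["issuer"]))
            print("Expires:", cert["notAfter"])
except ssl.SSLCertVerificationError as err:  # Python 3.7+
    print("Certificate did NOT validate:", err)
```

If the name does not match or the chain does not validate, that is the moment to stop, not the moment to add a security exception.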

I promptly deleted the email settings, uninstalled Thunderbird, and changed my email password via another computer. A few days later I got rid of Ubuntu, reclaimed my disk space, and cleaned up the boot menu.

What went wrong? An application bundled with the operating system failed to work with my email provider, which happens to be a very big company, and the problem has been going on for months. If Microsoft Outlook stopped working with a Comcast email account, I’d expect either Microsoft or Comcast to fix it quickly, as in hours or days.  With a free OS and email app, apparently all bets are off. The problem hasn’t been fixed and one workaround poses an unacceptable security risk.

In prior blogs I’ve talked about commercial products approaching or reaching end of support. Just as bad, or even worse, may be free products that have little or no official technical support to begin with. In the case of a commercial product — an older version of Microsoft Windows or Office, for example, or our old Dell laptop, for another — there are usually fairly long product cycles. These things are supported for a long time (though not forever) and there’s usually a legacy of published documentation and updates. But where does one turn for reliable, authoritative information about free software? Who is responsible for making this stuff work?

One of the utilities that ships with Ubuntu has a disclaimer that states flatly “This program comes with absolutely no warranty.” It also appears to come with little or no documentation. I found the following in the product’s frequently asked questions listing:

Q: Where can I find some documentation?
A: (Product Name) was designed to be hopefully work for most people without the aid of any documentation. But if you want to find some technical information, the (Product Name) Wiki page might be useful.

It doesn’t inspire confidence. And while I’m sure there are people who are willing to try to figure out how to use a program completely by guessing, I also know that I’m not one of them. I don’t care if something is “free” if it’s so hard to use or so poorly supported that it wastes my time.

If you want a proprietary system that is easy to use and well supported, and one that doesn’t change radically from one version to the next, a Mac is probably the way to go.

I used to joke that Macs are just like PC’s, except in every conceivable detail. In truth, they are just different enough from PC’s that I have trouble using them and they’re expensive enough that I have difficulty affording them.

Computers SHOULD be easy enough to use that they don’t require so much work. They’re supposed to be tools, right?

Linux operating systems like Ubuntu are not taking over the desktop, contrary to what some of their fans have been telling us for the better part of thirty years. Linux may be running tons of embedded systems, servers and supercomputers, but the overwhelming majority of personal computers are running some version of Windows or Mac OS X.

Why is Linux still failing on the desktop after all these years? I think it’s largely because the OS and bundled apps still don’t feel like consumer products; they feel like something written in a programming commune.

Too many rough edges and too little support equals not a better mousetrap, just a cheap one.


To Windows 10 or Not, That Is The Question

In 2015 Microsoft expected Windows 10 to be running on a billion — yes, that’s right, a thousand million — devices within a couple years. Nearly three and a half years later, it’s supposedly running on 700 million devices. That’s a pretty significant shortfall, and remember, Microsoft really, really wants people to upgrade (see my prior blog entry).

Why has Microsoft had to push so hard to get people to use Windows 10?

First, take a look at the ranking of operating system share for the six most recent Windows versions. According to netmarketshare.com, as of December 2018, it goes like this:

  1. Windows 10 — 40.86%
  2. Windows 7 — 36.37%
  3. Windows 8.1 — 5.07%
  4. Windows XP — 4.08%
  5. Windows 8 — 1.04%
  6. Windows Vista — 0.27%

Now look at that same list, in order of release date/age, newest to oldest:

  1. Windows 10 — July 29, 2015 / 3 years, 5 months — 40.86%
  2. Windows 8.1 — October 17, 2013 / 5 years, 2 months — 5.07%
  3. Windows 8 — October 26, 2012 / 6 years, 2 months — 1.04%
  4. Windows 7 — July 22, 2009 / 9 years, 5 months — 36.37%
  5. Windows Vista — November 30, 2006 / 12 years, 1 month — 0.27%
  6. Windows XP — October 25, 2001 / 17 years, 2 months — 4.08%

What’s going on? Windows 7 and Windows XP, with a combined age of 26 years 7 months, still account for more than 40% of the desktop OS share. Meanwhile, Windows 8.1 and 8 together account for just over 6 percent and Windows Vista has all but disappeared.

It’s beginning to make more sense as to why Microsoft seems so motivated to move users to a new OS, but how did we get here?

A little more history of Windows versions post-9/11, according to the Oracle:

Windows XP was released in October 2001 and it became very popular. It was tricky to install, but it was stable and fast. For a lot of users it replaced any one of a bunch of earlier 16- or 32-bit versions of Windows. It ran a lot of legacy software. And it remained popular for a long time, because…

Windows Vista was a major disappointment. It was slow. It introduced a lot of annoying prompts for permission to do things, which baffled and annoyed people. It made connecting to a wifi access point harder than it had been in XP, which also baffled and annoyed people. It cluttered up the Desktop with a bunch of crap nobody needed, including a big analog clock. Yes, this was in addition to the digital clock in the lower right hand corner of the screen, which had been a feature of Windows as long as anyone could remember. Few companies saw the point and the Great Recession came along. Windows XP became more entrenched. Next…

Windows 7 fixed a lot of what was wrong with Vista and seemed to put Windows back on a better track. Setup got easier. A long development cycle meant that Windows 7 handled more hardware straight out of box than its two predecessors. 64-bit software started to make more sense, especially in the workplace. Some editions of Windows 7 also included a licensed version of XP that could be run in a virtual machine to maintain compatibility with older software. This was a big deal to companies that had business processes running in old versions of apps, such as Access 97. Windows 7 was a hit, and Windows XP lived on. And then Microsoft lost its mind and pushed out…

Windows 8. What were they thinking? Microsoft wanted to compete with iPhones and Androids and iPads and Chromebooks and… nobody cared. Windows 8 was a one-size-fits-nobody solution. What problem did it solve? If businesses needed the capability of capturing signatures, for example, that hardware already existed. Digitizing tablets already existed. Touchscreen Windows-based computers ALREADY EXISTED. But many companies had just replaced their last CRT monitors with flat panels. Were they supposed to throw those out and buy touchscreens? Oh, and end users were FORCED to use an entirely different user interface, because the Start menu was hidden and users couldn’t easily get it back. Watch productivity soar as end users try to figure out how the hell to use what was essentially a completely new computer! Wait, what’s this Microsoft Store thingy? Watch productivity be forgotten as end users try to install games on their workplace PC’s! Or stay on Windows 7 and wait for…

Windows 8.1, with which Microsoft displayed an amazing determination to repeat a big mistake. Version 8.1 made small concessions on the user interface, but the Start menu was still missing and the changes were too little, too late for most businesses to care. Microsoft also became pushier about online services, such as OneDrive. In 2014 support ended for Windows XP, but remember, some Windows 7 editions (including 7 Pro and Enterprise) could run Windows XP Mode, which ran older apps in a virtual PC. Who needed 8? Not too many people, it turns out.

Which brings us to Windows 9, which of course doesn’t exist. Would it be too confusing to talk about Windows 9 when there had already been a Windows 95, Windows 98 and Windows 98 2nd Edition? Would everyone think we were partying like it was 1999? Would we have to go through Y2K all over again? Too scary! Skip 9 and go right to…

Windows 10, “the best Windows ever.” Windows as a service. The last version of Windows.

The question is, should you use it?

Windows 7 enters its last year of support this week, with end of support on January 14, 2020. Some of Microsoft’s big customers with deep pockets may end up paying for extended support, but that won’t help you or me. The clock is ticking.

Windows 8.1 will be supported until January 10, 2023, for all the difference it makes: 94% of desktop OS users are not running Windows 8.x and can’t buy it, even if they wanted to (which they don’t).

So, what to do?

If you are running a version of Windows prior to 7, get off the Internet NOW. Your computer is unsafe, your browser is probably unsafe (or soon will be) and it’s only going to get worse.

If you have a Windows 7 machine and you like it, you’re still fine for now, but the end is in sight. See the paragraph above.

Sadly, when a computer operating system and web browsers become unsupported, the only truly safe move is to take them offline. Yes, you can still run installed programs, and you can print and scan locally. You can play media. For most of us it’s a stretch to remember when we had a standalone PC, or even one that did not have an “always on” connection to the Internet. Think about it.

Windows 10 is probably in our future, but it’s going to be as private as I can make it, and with as little reliance on the cloud as possible. Sorry Microsoft, it’s still my computer, my content, my brain.

I’ll probably be experimenting with Ubuntu on one of these old computers soon. Coming from a longtime Microsoft user and recovering support tech, well, that’s a big change.

Windows 10: The OS Microsoft Really, Really Wants You to Use

In December, Windows 10 apparently displaced Windows 7 as the most popular desktop operating system in the world. This is according to NetMarketShare. I say “apparently” for a couple reasons. First, these things are educated guesses — nobody really knows for sure. Second, the graph shows one thing (Windows 10 on top), while the table shows another (Windows 7 in first place). The computer press, who presumably follow these things a lot more closely than I do, are reporting that the lines have crossed and Windows 10 now has the greatest share of desktop OSes.

It has taken almost three and a half years for Windows 10 to reach this point. What’s more, Windows 7, in second place among desktop operating systems, is now more than nine years old.

Microsoft has carried out its campaign to move users to Windows 10 like siege warfare, slowly wearing down the enemy (AKA the customer) until it eventually, inevitably, gets the desired result: Windows users will have the software Microsoft says they will have, the so-called “last version of Windows.” Windows as a service. Windows 10.

There were two major Windows releases between Windows 7 and 10. Most people who can count would guess that they’d be called Windows 8 and 9. In reality they were 8 and 8.1. The smart, quirky folks at Microsoft definitely can count, so what happened to Windows 9? We don’t actually know. An interesting discussion of the question was published on a site called ExtremeTech the day before Windows 10 was released.

Anyway, Windows 8 was a version that nobody I know ever wanted or asked for. It was Microsoft’s attempt to push a touch-centered user interface on customers. It had a garish collection of buttons or “tiles,” some of which updated constantly, and it completely abandoned the Start menu that had been a major feature of Windows since 1995. That’s right, got rid of it. Gone.

Windows 8 is one of many examples of Microsoft seemingly forgetting — or not caring — that there were a gazillion people who already knew how to use their product.

Windows 8.1 was a free upgrade that probably mattered most to those who had the miserable 8.0 or who needed to replace a computer running another version of Windows that was no longer sold or supported.

If you never had a computer running Windows 8, you didn’t miss much. In 2013 I had the agony of supporting Windows 8 in the rollout of a bunch of overpriced, under-powered tablets to a new line of business, in a new office, under very tight deadlines. It was a perfect cluster, which is now enshrined as a success in the LinkedIn profile of the IT director who oversaw it. He left the company less than a year later, getting walked out of the building for other bad judgment/behavior, and the entire IT operation was outsourced. I digress.

After Windows 8.x, practically anything that brought back the Start menu would be viewed favorably, much as Windows 7 was seen as a partial return to sanity after Windows Vista. But Microsoft would take no chances. Windows 10 was given away as a free upgrade to Windows 7 and 8.1 users for a year. Microsoft got pretty pushy about it, in fact, effectively tricking some computer users into installing a new operating system when many of them were happy with what they had.

Microsoft has also been none too subtle about the reason that people should upgrade from anything earlier than version 10: security. In another year (for most of us, anyway) Windows 7 will become unsupported. That means that it, along with its major components — like the bundled browser, Internet Explorer — will become increasingly vulnerable over time, and NOTHING will be done about it. Other software built to run on the unsupported OS will also, over time, become unsupported.

What does it mean for an operating system to become unsupported? It means that using it in connection with other computers becomes insanely risky.

“That’s nice data you got there. It’d be a real shame if anything should happen to it.”

So using the newer operating system — the one Microsoft wants you to use — is a no-brainer, right? Well, they say that if you’re not paying for it, you’re the product.

I repeat, for a full year, Microsoft gave Windows 10 away free to Windows 7 & 8.x users.

Microsoft badly wants people to use this product.

Windows 10 is not, in itself, a bad product. It is, however, a product that seems to have an overwhelming interest in what people are doing with it. From Cortana to browser news feeds and web searches to updates, Windows 10 wants to be incessantly helpful and involved. So helpful, in fact, that the decision whether and when to install updates is made for you. A lot of decisions are apparently made this way, and Microsoft thinks you should be OK with that.

Microsoft releases “feature updates” twice a year and seemingly won’t take no for an answer. Microsoft will add features and remove features as they see fit. Microsoft will fix bugs and patch security problems as they can. Feature updates are cumulative, and each is supported (for most users) for 18 months. What happens after 18 months if someone manages to avoid updating? Does the OS stop working? Is the user banished to Unsupportedville?

Windows 10 still shares some of the architecture and program code of earlier Windows versions, which means it shares some of the same vulnerabilities that just haven’t been exploited. Yet.

Most of us who have used computers in the workplace are accustomed to various degrees of constraint in how we might use them. We’ve also become accustomed to having no expectation of privacy where the company computer and phone are concerned. What about our personal computers? Who’s calling the shots there?

Microsoft makes its case for Windows 10 privacy settings here. It’s an interesting read. Some of the statements sound like they were written by Bill Clinton: “We don’t use what you say in email, chat, video calls or voice mail, or your documents, photos, or other personal files to choose which ads to show you.” OK, how do you use that information?

There’s also a link to the Microsoft Privacy Statement and a history of revisions, starting in July 2015 (which coincides with the introduction of Windows 10). A little light reading.

It’s perhaps ironic that Windows, which was originally named to describe frames containing different programs and content, now describes a product that gives Microsoft a lot of insight into what its users think, say and do, where they travel and what they like.

Has Windows 10 become too much of a window into our lives? If so, what can we do about it?

Stay tuned…


I want to be governed by partisan hacks, said nobody ever

In 2008, after years on the sidelines, I decided that I wanted to be involved in politics beyond just voting. Barack Obama won the Iowa caucuses and made a speech that turned my head in my kitchen in Colorado and made me want to be for something again.

So I got involved. I ran a precinct caucus and got elected as a delegate to various conventions. I met people. I learned how things worked. Although I didn’t get elected as a delegate to the Democratic National Convention, which Denver hosted in 2008, I used vacation time to volunteer and attended the first and last nights of the convention in person. It was cool. I was witnessing history. I was making history.

On January 20, 2009, I took another day off work, and when Aretha Franklin started singing “My Country, ’Tis of Thee,” I wept.

I stayed involved. In 2009 and following years I got elected to a growing list of committees in increasingly important roles. I served two terms as a congressional district chair and one as vice chair in one of Colorado’s most populous counties. In 2012 I attended the DNC as a member of the rules committee, elected by my state delegation. I paid my way to Charlotte and helped make history again.

Democrats have had some rough times during the past ten years. The 2010 midterms saw the rise of the so-called tea party and Dems lost the House of Representatives. The 2014 midterms saw Republicans retake the Senate. Colorado Democrats passed some gun laws that were not universally popular. There were some recall elections, and losses in the state legislature, statewide offices, county offices, you name it. In 2014, Colorado Democrats lost a US Senate seat. We struggled in vain to flip congressional districts, even in presidential years.

Democrats seemingly have a unique gift for infighting, even when things are going well, but honestly, it’s probably the same on the other side. There are always people on the fringes trying to move a party further left, further right, more this, less that. Identity politics and hot-button issues cause a lot of heartburn for a lot of folks.

And some people, it seems, are just along for the ride.

A lot of people will tell you they’re lifelong Republicans or Democrats, but I can’t make that claim. When I was young, I leaned strongly Democratic. Later, at least for a time, I became more conservative. Somewhere around the time of the 2nd Gulf War, when it became obvious that Iraq’s alleged weapons of mass destruction were a lie, I lost affection for the Republican Party.

I’ve been a partisan, and at times I’ve voted straight tickets. However, there were also times — even when I held party office — when I’ve voted for another party’s candidate.

Being actively involved in party politics means that I’ve met a lot of politicians and gotten to know some of them pretty well. And it turns out that a lot of politicians are actually good people trying to do good work. But sometimes the other party’s candidate is simply the better choice.

In an average year, you win some races and lose some races. In a wave election, like 2018 was, you might win or lose almost all of them.

In 2018, in some places, like Arapahoe County in Colorado, Democrats won almost all of their races. The editors of the Aurora Sentinel think this is a bad thing. I agree with them.

In 2014, three Arapahoe Democrats ran and lost races for the very same positions that they won in 2018. And not to disparage any of the Democrats involved, whom I know to varying degrees, but what was the difference this year? A big part of it is undeniably the fact that Donald Trump is President of the United States, is very unpopular in places like suburban Denver, and is a Republican. As the Sentinel concluded, voters took out “their righteous disdain for politics in Washington and Denver against down-ballot elected officials with the wrong letter at the end of their name on the ballot.”

While there might be some justification for punishing a congressional candidate for supporting a president you don’t like, what’s the justification for throwing out a county sheriff, assessor or clerk over party affiliation? This is partisanship run amok, and I’d argue that it’s not good for anybody.

Political parties exist largely to select and elect candidates who represent common values. Sometimes they get the selection part wrong. Sometimes they give us a Donald Trump. Oftentimes they actually give us great candidates. But not ALWAYS. We can’t assume that someone is necessarily the best choice based solely on party affiliation.

The political parties are not ever going to tell you this. Why? Guess what would happen to a party officer who said “Yeah, I know so-and-so is our candidate for coroner, but the other candidate is a bona fide medical examiner and ours is a hack!” That person would be removed from office, replaced with a hack, and the partisan show would go on.

There are positions that should be decided by nonpartisan elections — you know, like school boards used to be — and some that should not be decided by voters at all. Is there any reason that a board of county commissioners could not hire a sheriff from a group of qualified applicants? Or a clerk and recorder? Do you want the person running elections to be a gung ho partisan? What about the person running the local police force and jail?

Maybe Democrats who rode a blue wave to victory in 2018 will exceed expectations and serve well. I hope they do. But we’ve got to do better than simply hoping when it comes to choosing people to run the government.

Changing One Mind

Yesterday I read an article about a guy in Maine who posts online satire — totally made-up stuff — and then watches it get liked and shared far and wide, mostly by people who think it’s true. The article, ‘Nothing on this page is real’: How lies become truth in online America, also tells the story of a woman in Nevada who reads, likes and shares such web content. This article has been one of the most-read items on The Washington Post website during the past 24 hours.

The Facebook site “America’s Last Line of Defense” was created in 2016 as a prank, but it has since become a full-time endeavor for Christopher Blair, who is assisted by about a hundred liberals in policing the site. Blair and his crew dream up outrageous disinformation — “The more extreme we become, the more people believe it” — post it online, watch it go viral, and eventually post a truthful explanation. After that, people berate some of the folks who have promoted the bogus story. If you’ve been on social media, you know what this looks like.

Keep in mind that the site tells people point blank that what they’re reading is fiction. The “About” section on ALLOD’s Facebook page states: “Nothing on this page is real. It is a collection of the satirical whimsies of liberal trolls masquerading as conservatives. You have been warned.”

The Washington Post article, well written and informative, tells much about the attitudes of Mr. Blair, and quite a bit about one of the hapless, lonesome people who refuses to accept that what Blair publishes is pure fiction.

This, we might guess, is a cause for much concern, and it should be. If Americans in large numbers can’t tell the difference between fact and fiction even when fiction is clearly labeled as such, then we’re in big trouble. Discerning the truth is an essential skill in dealing with the world. If we cannot figure out what’s real, the only problems we’ll solve will be the ones we solve accidentally or by divine intervention. Without getting into the theological weeds, I never got the impression that the divine will was for humanity to sit around online and hope for the best.

We already knew that people tend to believe what aligns with their preconceptions. We also knew that people seek out others who share their interests and views. This is nowhere more true than online, where the biggest, most successful companies are doing everything they can to find out what we like so they can deliver more of it.

No surprise, either, to read that a lot of people are reality challenged. We elected a president whose greatest documented accomplishment prior to his election was arguably a stint on a “reality television” program. And if we’re going to call The Apprentice reality television, we might as well put The Flintstones on the History Channel.

Furthermore, even our language is devolving in a way that makes it more difficult to distinguish the truth from wild exaggeration or outright fiction. The first example of this is the word “literally.” According to the Oxford English Dictionary, “literally” can mean “in a literal manner or sense; exactly” OR it can be a word “used for emphasis while not being literally true.” In other words, literally sometimes means the opposite of literally. “America has literally been torn in half!” It has? “My head literally exploded!” It did?!

Similarly, the word “incredible” has morphed from being “impossible to believe” or “difficult to believe; extraordinary” to “very good; wonderful.” I’ve known a lot of folks in the political realm who use “incredible” to describe people and policies as though the word were an unqualified, unambiguous endorsement. “He’s an incredible candidate!” When party hacks call their own candidates and policies incredible, are they saying they’re unbelievably good, or just unbelievable? Do I have to vote for them to find out?

Think I’m just a pedantic snob? Well, if I were going on about the proper use of apostrophes or homonyms, maybe. Someday perhaps I’ll do that, as there’s plenty of material for an Andy Rooney-styled rant. But here I’m making a point about language, beyond the growing illiteracy of American English speakers: We need a common frame of reference to describe and interpret the world around us. People need to know whether we’re stating facts or telling tales. People need to know whether something is to be believed or not. Words matter.

Some of the words that jumped off the page at me (figuratively) are the following: “What Blair wasn’t sure he had ever done was change a single person’s mind.”

For two years Mr. Blair has been posting absurd fabrications, watching them spread, marveling at the gullible people who are taken in, and holding them up to ridicule. In some cases Blair has used phony profiles to bait people into making inappropriate comments online and getting them in trouble for it.  He’s outed propaganda consumers and distributors alike. Now he wonders, “Where is the edge? Is there ever a point where people realize they’re being fed garbage and decide to return to reality?”  He wonders whether he has changed a single mind.

The question here, for me, is whether the world is better served with more disinformation and incivility online. Are we making the world better or worse? And by lying to people, then calling them names and embarrassing them online, are we winning hearts and minds or creating more entrenched and intractable enemies?

Mitt Romney disparaged “the 47 percent” who allegedly paid no taxes and would naturally vote for his opponent. He wrote off about half of the country and went on to lose the election. Hillary Clinton said “you could put half of Trump’s supporters into what I call the basket of deplorables…” and she, too, lost the election. Call them what you will; they won’t be calling you “Madam President.”

It doesn’t take Dale Carnegie to figure out that this is NOT How to Win Friends and Influence People. Baiting people with lies in order to shame and humiliate them only drives them into their corners.

One more point: liberals and Democrats in particular should not make too much of the blue wave election of 2018. Yes, Democrats did well, especially in places where Donald Trump did poorly two years ago. And Democrats did very well with some demographics. But they also did badly with some others who are seemingly as unreachable as ever. You know who Democrats did very well with in places like Colorado? INDEPENDENTS. The lesson here is don’t mess it up.

Like it or not, the 2020 campaign has begun. We’re going to have to do better than this.

If the past two years of satire and public shaming have left Mr. Blair wondering whether he’s changed anybody’s mind, maybe it’s time for another approach. At the very least he and his supporters have persuaded a bunch of conservatives that liberals are every bit as nasty as they already believed. And if I ever thought that it made sense to fight fire with fire, believe me, one mind has been changed.