Computer Trends Are the Worst
- Created 2024-05-10
The recent explosion of so-called “AI-powered chat-bots” has, once again, got me thinking about trends in the computer world that are driven not by experts or users, but by marketing execubots.
I’m going to sound like a stinky old curmudgeon here, but stick with me; I have a point. User interfaces were better in the 90s. Wait, wait, wait! I promise I’m not just saying “old good, new bad.” You see, in the 90s we had interfaces that were built by academics: doctors and students who spent years upon years thinking about this stuff. How should menus look? How should we show the user that the document they’re looking at extends beyond the screen? How should we handle switching tasks?
These ideas were all given extensive thought. Papers were written about them, arguments were had, and when the dust had settled, we had, more or less, the modern desktop. Sure, the specific details varied between what Microsoft and Sun and Apple were doing, but the basic ideas were the same: scrollbars, clicking, dragging, fly-out menus, windows. 3D buttons that made it obvious what you could and couldn’t click, with a clear pressed state for when you were clicking something. These were all great ideas.
These days, nearly all of those ideas are going away. A lot of it is thanks to smart phones, but those bad ideas are seeping into desktop environments now, too.
Consider the following: In the 70s, we had computers where you needed to type into a prompt to tell them what you wanted them to do. If you typed in the wrong thing, the computer wouldn’t know what to do. And these computers only did one thing at a time.
In the 80s, we started figuring out multitasking, and graphical user interfaces. We could tell a computer to do something, and put it in the background. And we had an easy way for the computer to tell us what it was capable of. And while you still had to figure out the “language” of where to click, you didn’t have to memorize it.
And this was good. There were improvements that could have made life better, but we had solved a lot of problems. Then, in the 2000s, we started moving backwards. It sort of made sense to get rid of windows on the small screen of a smart phone. It’s still a shame nobody has figured out how to open two apps side-by-side on a phone or tablet (oh, wait, Samsung has, several times, but it never sticks for some reason).
Then we got rid of the menus, even on desktop. And we changed buttons with obvious state to links. We changed check boxes to ambiguous sliders. We got rid of the damn scroll bars.
And look, we should keep investigating how the user interface should look. We should keep trying new things, and seeing what sticks and what doesn’t. But if you’re going to do that, it needs to be done by an expert in interfaces. Someone with years to devote to the problem, not an overworked code bootcamp graduate with a week to come up with a new solution.
And now, we’re even getting rid of the links. Everything should be an AI chatbot, where you type in a prompt to tell the computer what to do. And if you don’t use the correct language, it will tell you it doesn’t know what to do.