Opinion
Putting processing in perspective
Thursday, January 25, 2024
I recently had a conversation with a coworker about websites and was reminded of the days when portals like GeoCities, Excite, Lycos, and Yahoo offered the ability to create free web pages. Businesses used them to a limited extent, but they were used most as personal pages, acting as the precursors to today's social media sites.
That brought back memories of the times when the home computer had gained relatively wide acceptance, but before many of us had access to the internet. I have to think for a few moments to recall just what we did without connectivity, but we did quite a bit.
In the early '80s, I took a class in what was then called “business computing,” where we wrote primitive code for a random number generator, learned basic DOS commands, then progressed to Lotus 1-2-3 and WordPerfect. My real leap forward came when I left home to go to school in Denver, where there wasn’t a typewriter in sight. That’s when the word processor and I began a lifelong partnership. I was finally free from the constraints of my disorderly handwriting, but spelling? Well, the crude spell checkers of the day matched text against a limited dictionary and flagged anything that wasn’t recognized as a word, then offered a replacement that wasn’t necessarily the right one. Improper homonyms famously went undetected, and we were still tied to our Strunk and Whites for punctuation.
Once I experienced the joys of word processing, I set my sights on obtaining a PC of my own and spent a princely sum of money on a Compaq Portable. It looked more like a portable sewing machine than anything we would recognize today as a computer. It weighed 28 pounds, had two 5 1/4” floppy drives, and a 9” green screen. I used that through school, and I laugh when I think how cool I felt scooting along airport walkways with my sewing machine PC and my bag phone.
Interestingly, every bit of that happened before we all discovered the internet. I had tech-inclined friends tinkering with bulletin board systems and CompuServe, but when I jumped in, I took baby steps. I entered via Prodigy, a dumbed-down, closed network that included a few online banks, most of the Sears catalog, a weather page, a bit of news, and a multitude of clubs and special interest groups, all delivered through the beeps and hisses of a 1200 baud modem.
I suppose I could ramble on about the succession of crude pages and domains I published with Microsoft’s FrontPage, including its frighteningly ugly frames environment. I could wax eloquent about the subsequent digitization of our daily lives, but I mention this because I recollect those experiences when I look at where Artificial Intelligence is going. I can’t help but think we’re at the beginning of something very big, but we have no idea what it is, and we aren’t sure what to do with it.
We have seen this before. People began experimenting with electricity in the 1740s, but it was only a novelty act performed by would-be scientists for European aristocracy. It didn’t become truly useful until the 19th century with electromagnets, the telegraph, and Edison’s lightbulb.
More recently, we have seen the same progression from car phones to brick phones to real, functioning Dick Tracy watches. We also saw it in the awkward early days of music videos at the dawn of the MTV era. I’m always amused by those cringeworthy old vids from when artists recognized the medium's potential but hadn’t a clue what to do with it.
Artificial Intelligence strikes me as being the same. It’s genuinely fascinating, yet it has the feel of a novelty that hasn’t found its purpose yet. We have already entered a non-technical phase of conventional computing in which voice commands drive user-friendly platforms like Amazon’s Alexa. Considering that we have been working on a computer-to-brain interface since the '90s, the possibilities of AI are staggering.
With all of that in mind, I view ChatGPT as no more than a novelty in search of a function. It’s like Franklin’s key in a lightning storm, or Prodigy, or my bag phone, or my favorite campy Nick Lowe video. We’re at the beginning of something very big. For all the negative nonsense in our lives, it’s hard not to admit that these are some pretty exciting times. I can’t wait to see what’s next.