Personally, I'd guess that you learn more the more issues you have. macOS is a more locked-down ecosystem than Windows, malware is less common, and since the OS usually comes bundled with the hardware, you shouldn't run into as many driver or hardware issues in general.
As a kid I had so much trouble with incompatible software, viruses, adware, drivers, broken hardware, etc. And since I had no one to ask, it taught me a lot about the fundamentals of IT and how to research such issues myself.
Counterpoint: I grew up at a time when Macs still couldn't do much outside of what Apple specifically developed for them, so I learned a ton about emulation, virtual machines, and the like to play games or use Photoshop. I guess that supports your hypothesis; I can rock Unix command-line stuff and containers like a pro, but hate figuring out drivers.
And then there's 90s Linux, because your parents got a used computer from a friend that came with only that and they didn't want to spend money buying Windows 😢 It's like learning to swim by being yeeted into the ocean, with a couple of sharks hanging around.
At least 80s kids got assembly.
Linux has always been good: put together a new computer, move the OS from the old one, put Linux on the old one…
Find that Linux is so much fun, dual-boot the new machine on Linux, only keep Windows for games.
My audio collection from then is all .ogg files
Debian didn't have a stable release until 1996… Even Slackware didn't shape up nicely until around '98, from what I remember. SLS gave it a GUI but wasn't well maintained. Linux wasn't really "good" until the early 2000s at the very least.
I just wanted to play Space Cadet Pinball or Commander Keen as a kid, not compile my programs.
You’re clearly talking about modern Linux.
I don't think I had the budget to buy my own computers until '99, and that's when I first played with Linux.
Yes, I completely see that. This is not a black-or-white question. You can use Windows, macOS, Linux, Android, iOS… and learn close to nothing, or you can geek around hour after hour expanding the boundaries of your device.
I would just assume that you learn less if everything you want to do works out of the box. And 'working out of the box' is a typical selling point of the Apple ecosystem. Which of course doesn't mean that you can't have a steep learning curve. Your use cases obviously weren't delivered out of the box, so you had to get creative as well.
I had a jailbroken iPod Touch with a shell on it and spent hours and days overcoming system boundaries just out of spite. I also remember vividly trying to bring mobile games to a Symbian phone, tweaking around with an HP iPAQ on Windows Mobile, and manually typing MIDI ringtones in a text editor on a Nokia. :D
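(If anyone's curious, those typed-out ringtones were, as far as I remember, RTTTL: a name, a few defaults, then the notes, all plain text. A toy sketch of the format and how it splits apart, with a made-up tune and hypothetical names, nothing Nokia-specific:)

    # Toy sketch of the RTTTL ringtone text format (from memory; the tune
    # and all names here are made up). An RTTTL string is "name:defaults:notes",
    # where the defaults set note duration (d), octave (o), and tempo (b) in bpm.
    ringtone = "Fifth:d=8,o=5,b=125:g,g,g,2d#"  # hypothetical example tune

    name, defaults, notes = ringtone.split(":")
    settings = dict(pair.split("=") for pair in defaults.split(","))
    print(name)              # Fifth
    print(settings)          # {'d': '8', 'o': '5', 'b': '125'}
    print(notes.split(","))  # ['g', 'g', 'g', '2d#']

(Each note is an optional duration, a pitch, and an optional octave, so "2d#" is a half-note D sharp in the default octave.)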