Quote Originally Posted by pitfiend View Post
They see computers as black boxes with an OS where you double-click an icon and start a program.
This was kinda inevitable IMHO; the same thing happened a few decades ago with... cars. Back then, every "hacker" was a mechanic, able to disassemble a moped or a car engine, tweak it, and so on. These days you barely scratch the surface, and most car mechanics only know how to replace spare parts, change the oil, or swap tyres.

Computers are just being commoditized, like cars, planes, electricity, tap water, phones, and a myriad of other everyday things that used to be hacker territory in their prime.

As computers became more complex and standardized, an individual just couldn't fully grasp them or match industrially made products anymore. There used to be a time when you could assemble a decent 8- or 16-bit machine from scratch, write its OS, and have it be technologically competitive with, or better than, off-the-shelf products. Heck, you could even design your own silicon and build a 4- or 8-bit CPU from the transistors up if you were motivated enough.

For modern hardware or software, you can at best scratch the surface or learn specific techniques, and these days people are using tools built upon many layers of other tools. Even ASM isn't the simple environment it once was: for a Z80 CPU, for instance, you could know not just all the opcodes but also their common binary encodings, while merely knowing every instruction that exists in x86-64 (even without AVX and other extensions) is already a challenge.

So IMHO, it was kinda inevitable that computers would become black boxes.

Not knowing algorithms and data structures is more disturbing, but hardware is fast enough that most developers can get away with crappy algorithms throughout most of their programming life, and fall back on experts for the few cases where that's not enough (just like you call the car mechanic or the plumber from time to time).
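To make that last point concrete, here's a minimal Python sketch of my own (not from the original post): two de-duplication routines that produce identical results, where the naive quadratic one looks perfectly fine on small inputs because modern hardware hides the cost.

```python
import timeit

def dedup_quadratic(items):
    """The "crappy" version: for each item, linearly scan the result
    list to check membership -> O(n^2) overall."""
    result = []
    for item in items:
        if item not in result:  # linear scan of a growing list
            result.append(item)
    return result

def dedup_linear(items):
    """The textbook version: a set gives O(1) average membership
    tests, so the whole pass is O(n)."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

data = list(range(5000)) * 2  # 10,000 items, half of them duplicates

# Both produce the same output...
assert dedup_quadratic(data) == dedup_linear(data) == list(range(5000))

# ...and on inputs this size the quadratic one still finishes in well
# under a second, which is exactly why it survives in real codebases.
# Grow n by 100x and it falls off a cliff while the linear one shrugs.
slow = timeit.timeit(lambda: dedup_quadratic(data), number=1)
fast = timeit.timeit(lambda: dedup_linear(data), number=1)
print(f"quadratic: {slow:.4f}s  linear: {fast:.4f}s")
```

The point isn't that the fast version is hard to write; it's that nothing forces you to notice the difference until the day your input is big enough, which is when you call in the expert.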