This is not about better programs; it is about teaching computers by example, which lets them perform tasks that the programmer might not even understand.
For example, you can program a computer to recognise cat videos by describing in code what cats look like. That is the old way: hand-coded rules, such as the jaggedness of a cat's face falling within some range, or a certain aspect ratio you have calculated.
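The old way might look something like this sketch. The feature names and thresholds are invented for illustration; a real system would extract them from the video frames.

```python
# A sketch of the "old way": hand-coded rules over features the
# programmer chose and tuned by hand. The feature names and the
# threshold values here are made up for illustration.

def looks_like_cat(edge_jaggedness: float, face_aspect_ratio: float) -> bool:
    """Classify a frame using hand-picked thresholds the programmer calculated."""
    return 0.4 <= edge_jaggedness <= 0.8 and 0.9 <= face_aspect_ratio <= 1.3

# The programmer has to describe the cat explicitly; anything
# outside these hand-tuned ranges is simply missed.
print(looks_like_cat(0.6, 1.1))  # inside the hand-tuned ranges: True
print(looks_like_cat(0.1, 2.0))  # outside them: False
```

The weakness is visible in the code itself: every rule is a claim by the programmer about what cats look like.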
The modern way is to choose a neural net configuration and feed it videos, telling it which ones contain cats and which ones don't. Get it right, and you can feed it any video and it will identify the cat ones. The killer here is that at no point has the programmer described a cat in any way to the computer. The machine has worked out what a cat looks like for itself. The machine has learnt.
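That training loop can be sketched in a few lines. This is a deliberately tiny stand-in (a single-layer net, i.e. logistic regression, on synthetic feature vectors rather than real video); the point is that the code contains labels and a learning rule, but no description of a cat.

```python
# A minimal sketch of learning from labelled examples. The data is
# synthetic stand-in "video features": "cat" examples cluster around +1,
# "not cat" around -1, and the labels are the only supervision given.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+1.0, 0.5, size=(50, 4)),   # 50 "cat" examples
               rng.normal(-1.0, 0.5, size=(50, 4))])  # 50 "not cat" examples
y = np.array([1] * 50 + [0] * 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain gradient descent on the logistic loss: the machine adjusts
# the weights itself; the programmer never states what a cat is.
w, b, lr = np.zeros(4), 0.0, 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)
    grad = p - y
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

preds = (sigmoid(X @ w + b) > 0.5).astype(int)
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

A real cat detector would use a deep convolutional net on raw frames, but the shape of the process is the same: labelled examples in, a learned decision rule out.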
If you don't find it at least slightly scary, then you haven't understood the technology. We are still miles off general intelligence, so no need to build that panic bunker in the garden just yet, but it is still a big step change in how things work.
Edit: Better example: feed a computer medical images and tell it which ailments the patients have. The computer starts to predict, from new images, which ailments new patients have. Researchers then have to reverse engineer their own system to work out which features of the image it uses to make the prediction. They want to know what it is looking for, because they didn't even know the prediction could be made.
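That reverse-engineering step can be sketched too. Here the "patients" are synthetic: five stand-in image features, where feature 2 is secretly the only informative one. After training, inspecting the model's weights recovers which feature it is actually looking at; in real systems this is done with richer interpretability tools, but the idea is the same.

```python
# A sketch of reverse-engineering a trained model: train it on labelled
# data, then inspect it to find which input it actually relies on.
# Synthetic setup: the "ailment" depends only on feature 2, but the
# training code is never told that.
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 5))       # five stand-in image features per patient
y = (X[:, 2] > 0).astype(int)     # ailment secretly depends only on feature 2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Same simple learner as before: gradient descent on the logistic loss.
w, b = np.zeros(5), 0.0
for _ in range(1000):
    p = sigmoid(X @ w + b)
    grad = p - y
    w -= 0.1 * (X.T @ grad) / n
    b -= 0.1 * grad.mean()

# The "reverse engineering": the largest-magnitude weight reveals
# which feature the trained model is using to make the prediction.
key_feature = int(np.argmax(np.abs(w)))
print(f"model relies mostly on feature {key_feature}")
```

The researchers' position in the anecdote is exactly this one: the model works, and only by probing it afterwards do they learn what it found in the images.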