Autonomy
is a strange thing. You cannot give real autonomy to a machine; you can only make it mimic autonomy. In any event, you are only overlaying your own desires onto the machine.
The Mars rover "Curiosity" acts autonomously: the distance between the rover and Earth is too great for communications fast enough to operate it remotely. What it does have is a set of instructions that operate within a limited set of parameters. Go places; move around. Avoid big obstacles and stay out of shadow. Pick up dirt and test it. And as simple as these tasks are, it takes a pretty substantial program to do them.
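That kind of rule set could be sketched as a simple priority loop. This is a toy illustration of fixed-rule behavior, not Curiosity's actual flight software; every name here is hypothetical.

```python
# Toy sketch of rule-based rover behavior: a fixed priority list, no initiative.
# These names and rules are hypothetical, not Curiosity's real software.

def rover_step(obstacle_ahead: bool, in_shadow: bool, soil_available: bool) -> str:
    """Pick the next action from a fixed, human-written priority list."""
    if obstacle_ahead:
        return "turn"         # avoid big obstacles
    if in_shadow:
        return "seek_sun"     # stay out of shadow
    if soil_available:
        return "sample_soil"  # pick up dirt and test it
    return "drive"            # default: go places, move around
```

The point of the sketch is that every branch was chosen by the programmer in advance; the rover never adds a rule of its own.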
Autonomy requires thinking for one's self. And not just thinking for one's self, but making choices and decisions based not on your programming, but on independent judgments you make about your environment.
Here’s an example. A robot sees a bird. Its programming lets it recognize it as a bird. It looks up “Bird” and decides what to do with it. Should it catch it and make a photographic record of its activities? Should it try to dissect it and see if it is healthy, or deformed, or similar to other birds? Or is its bird study already done, so the robot should ignore it?
All of those decisions are made based on the program of another person, not any autonomy on the part of the robot. The robot's actions only mimic autonomy; if the program is known, the actions are completely predictable.
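That lookup amounts to nothing more than a table a person wrote ahead of time. A minimal sketch, with purely hypothetical state names:

```python
# Toy sketch: the robot's "decision" about the bird is a table lookup
# authored by a person. Same input, same output, every time.
# All state names here are hypothetical.

BIRD_ACTIONS = {
    "study_incomplete": "photograph",  # catch it, make a photographic record
    "health_check_due": "dissect",     # check health, deformity, similarity
    "study_complete":   "ignore",      # bird study already done
}

def decide(mission_state: str) -> str:
    """Return the pre-programmed action for the current mission state."""
    return BIRD_ACTIONS.get(mission_state, "ignore")
```

Anyone holding the table can predict every "choice" the robot will ever make, which is the post's point: the autonomy is borrowed, not real.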
A human, with true autonomy, sees the bird and wishes he could fly. Imagination is the key to autonomy: seeing something, imagining things that have never existed before, and attempting to make them happen. Even if you could write code that “mimics” imagination, it is still only a poor substitute. You could write a program that allowed a robot to hear a song and then decide it wants to learn to play the piano, but you can’t make it start writing songs on its own.

True that. Been coding for a little over 30 years now. (I was a retread.)
Still remember doing a flow chart to navigate a car about 6 blocks on a 4-lane road with stop lights. It could get the car down the road, but it never answered, or even asked, why.
Imagination. Some people have it and some don’t; or maybe we all have it in differing amounts.
Good story arc, Og.
The ability (or, more to the point, the inability) to program autonomy is why I don’t hold out much hope for true AI.
Only the Creator ever figured that out. Or at least He seems to let us think so.
Well, you’ve pretty well described me before coffee on a Monday morning?
Should I be offended? Hang on, let me run that subroutine…
/snrk
Jim
Sunk New Dawn
Galveston, TX
Well, theoretically you could write a program that (1) encoded the rules of music theory and (2) cataloged the entirety of human composition to that point, then instructed the machine to write a composition that complied with the strictures of the first while not repeating any of the second. I imagine this would effectively simulate original music creation (for a given value of “music”).
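That two-part scheme could be sketched as a toy generator: a couple of stand-in "theory" rules plus a tiny stand-in catalog of forbidden phrases. Everything here (the scale, the rules, the catalog) is a hypothetical miniature, not a real music system.

```python
import random

# Toy sketch of the commenter's scheme: (1) "theory" rules (stay in the
# C-major scale, move stepwise) and (2) a catalog of known phrases the
# output must not repeat. The scale, rules, and catalog are all hypothetical.

SCALE = ["C", "D", "E", "F", "G", "A", "B"]
CATALOG = {("C", "D", "E", "F"), ("E", "D", "C", "D")}  # stand-in for "all prior music"

def compose(length: int = 4, seed: int = 0) -> list[str]:
    """Generate a phrase that follows the rules and avoids the catalog."""
    rng = random.Random(seed)
    while True:
        phrase = [rng.choice(SCALE)]
        while len(phrase) < length:
            i = SCALE.index(phrase[-1])
            step = rng.choice([-1, 1])                 # rule 1: move stepwise
            phrase.append(SCALE[(i + step) % len(SCALE)])
        if tuple(phrase) not in CATALOG:               # rule 2: no repeats
            return phrase
```

Note that with a fixed seed the output is fully reproducible, which rather underlines the post's point about predictability.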
I can also imagine Eric S. Raymond or Robb Allen shuddering at the very idea of such a coding project.
The guys who don’t get their hands dirty (to the extent writing real code does that) have long been saying autonomous AI is right around the corner. Kurzweil’s “Age of Spiritual Machines” comes to mind. Self-awareness and the ability to think have been “right around the corner” for about as long as I can remember, at least since the late 1970s. Heinlein put that idea in “The Moon Is A Harsh Mistress,” and I think the idea is just too romantic for people to dismiss.
No, I can’t prove it can’t happen. Can anyone demonstrate proof it can? The reality is more like what you describe, though.
Heinlein at least didn’t imply that autonomous AI was a SMOP.