01-20-2005, 09:04 AM
Nystul, Jan 19 2005, 09:59 PM Wrote:
For one thing, it could make things just about impossible to balance for difficulty.
No, it would make it easier. If you change the success function from "winning the game" to something like "the player keeps playing" or "length of time (or number of moves, etc.) until defeat, with success punished harshly", you can approximate a good AI: one that adapts/evolves until each player finds it challenging (and keeps adapting as the player learns new strategies and tactics), but that never settles on a "winning" strategy too good for a "mere human" to play against.
It is kind of like programming a servant mentality into the success rule. Play well enough to keep me entertained, but play too well and it's off with your head :P
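Roughly what I have in mind, as a toy sketch (the names, the weights and the choice of Python are all just for illustration, not part of any actual requirements):

[code]
# Hypothetical fitness for an evolved game AI: reward keeping the player
# engaged, punish crushing them ("off with its head").
def fitness(game_length_moves, player_won, player_quit_early, target_length=60):
    # A player who quits early was probably bored or frustrated.
    if player_quit_early:
        return 0.0
    # Reward games that last about as long as we want the player to stay engaged.
    length_score = max(0.0, 1.0 - abs(game_length_moves - target_length) / target_length)
    # Punish the AI for winning: we want the player challenged, not beaten.
    win_penalty = 0.0 if player_won else 0.75
    return length_score * (1.0 - win_penalty)
[/code]

Evolve against that instead of "did I win?", and the population should drift towards opponents that keep each player in long, close games rather than towards an unbeatable strategy.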
I have done AI in very simple self-made games. For example, when I made a chess AI, the thing 'learned' (evolved, in this case) to move its pieces out of check and to put the opponent into check, but it never got much smarter than that, because the success criteria didn't handle draws well: the population devolved into individuals that would play into the threefold-repetition draw. This could have been fixed by evaluating a draw as a success for the side with a material (or other) advantage, so that in a losing position, at least, the AI would have had an evolutionary advantage if it did not fall into the repeated-moves trap.
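Something like this is the draw fix I mean (again only a sketch; the board representation and the exact numbers are invented for the example):

[code]
# Score a finished game for the individual playing 'side'. Instead of treating
# every draw as neutral, a repetition draw counts as a success for the side
# holding the material advantage and a near-failure for the side behind, so
# steering into repetition stops being a safe evolutionary niche.
PIECE_VALUES = {'p': 1, 'n': 3, 'b': 3, 'r': 5, 'q': 9}

def material_balance(pieces):
    # pieces: list of (piece letter, colour) pairs, e.g. ('q', 'white')
    total = 0
    for piece, colour in pieces:
        value = PIECE_VALUES.get(piece, 0)
        total += value if colour == 'white' else -value
    return total  # positive: white is ahead, negative: black is ahead

def game_score(result, pieces, side):
    if result == 'win':
        return 1.0
    if result == 'loss':
        return 0.0
    # result == 'draw'
    balance = material_balance(pieces)
    ahead = (balance > 0 and side == 'white') or (balance < 0 and side == 'black')
    return 0.75 if ahead else 0.25
[/code]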
In the generic AI requirements I defined, the developer still ends up 'hardcoding' the success function and the environment interfaces (senses and responses) available to the AI.
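In code, those requirements could look something like this (purely my illustration, not a spec written down anywhere): the success function and the sense/response interface are fixed by the developer, and only the policy in between is free to adapt or evolve.

[code]
from abc import ABC, abstractmethod

class Environment(ABC):
    """The developer hardcodes what the AI can sense and how it can respond."""

    @abstractmethod
    def sense(self):
        """Return the observations the AI is allowed to perceive."""

    @abstractmethod
    def respond(self, action):
        """Apply one of the AI's allowed responses to the game world."""

    @abstractmethod
    def success(self, history):
        """Developer-defined success function, e.g. 'the player kept playing'."""

class Agent(ABC):
    """Only this mapping from senses to responses adapts/evolves."""

    @abstractmethod
    def act(self, observation):
        ...
[/code]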
Of course, purists would want an AI that can play any game and adapt/evolve. Unfortunately they don't want to wait the 3 billion years for that to happen :D
In nature there is no generic intelligence that can learn absolutely anything. I don't want to argue the human example, but ignoring that (for brevity), things are genetically (and therefore physically) limited in what they can learn, adapt to, associate, etc. Think of the ant that will follow a circular pheromone trail until it dies, never realising that it is going in circles.
Adaptability is always more expensive than hardcoded rules (in energy terms), so if you are going to be in an environment that doesn't vary, the hardcoded, non-adaptive outcome will be genetically dominant. (Also insert here a diversion to Shannon's sampling theorem: the things an organism can adapt to within its lifetime must change at a frequency above roughly the reciprocal of its lifespan; lower-frequency events cannot be adapted to by the individual, but can be evolved towards by the population.)
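To make that aside a touch more concrete (my own loose gloss on the analogy, not a formal result):

[code]
f_change  >  ~1 / T_lifespan   ->  the individual can sample the change and adapt to it
f_change  <  ~1 / T_lifespan   ->  only the population can track it, by evolving across generations
(loosely after Nyquist/Shannon: you need f_sample > 2 * f_signal to track a signal at all)
[/code]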
Sorry for the pointless rambling, I've been working late.