Machine Learning 3: 95-99, 1988
© 1988 Kluwer Academic Publishers - Manufactured in The Netherlands

GUEST EDITORIAL

Genetic Algorithms and Machine Learning

Metaphors for learning

There is no a priori reason why machine learning must borrow from nature. A field could exist, complete with well-defined algorithms, data structures, and theories of learning, without once referring to organisms, cognitive or genetic structures, and psychological or evolutionary theories. Yet at the end of the day, with the position papers written, the computers plugged in, and the programs debugged, a learning edifice devoid of natural metaphor would lack something. It would ignore the fact that all these creations have become possible only after three billion years of evolution on this planet. It would miss the point that the very ideas of adaptation and learning are concepts invented by the most recent representatives of the species Homo sapiens from the careful observation of themselves and life around them. It would miss the point that natural examples of learning and adaptation are treasure troves of robust procedures and structures. Fortunately, the field of machine learning does rely upon nature's bounty for both inspiration and mechanism. Many machine learning systems now borrow heavily from current thinking in cognitive science, and rekindled interest in neural networks and connectionism is evidence of serious mechanistic and philosophical currents running through the field. Another area where natural example has been tapped is in work on genetic algorithms (GAs) and genetics-based machine learning. Rooted in the early cybernetics movement (Holland, 1962), progress has been made in both theory (Holland, 1975; Holland, Holyoak, Nisbett, & Thagard, 1986) and application (Grefenstette, 1985, 1987) to the point where genetics-based systems are finding their way into everyday commercial use (Davis & Coombs, 1987; Fourman, 1985).
Genetic algorithms and classifier systems

This special double issue of Machine Learning is devoted to papers concerning genetic algorithms and genetics-based learning systems. Simply stated, genetic algorithms are probabilistic search procedures designed to work on large spaces involving states that can be represented by strings. These methods are inherently parallel, using a distributed set of samples from the space (a population of strings) to generate a new set of samples. They also exhibit a more subtle implicit parallelism. Roughly, in processing a population of m strings, a genetic algorithm implicitly evaluates substantially more than m³ component substrings. It then automatically biases future populations to exploit the above-average components as building blocks from which to construct structures that will exploit regularities in the environment (problem space). Section 3 of the paper by Fitzpatrick and Grefenstette gives a clear discussion of this property. The theorem that establishes this speedup and its precursors - the schema theorems - illustrate the central role of theory in the development of genetic algorithms. Learning programs designed to exploit this building-block property gain a substantial advantage in complex spaces where they must discover both the "rules of the game" and the strategies for playing that "game."

Although there are a number of different types of genetics-based machine learning systems, in this issue we concentrate on classifier systems and their derivatives. Classifier systems are parallel production systems that have been designed to exploit the implicit parallelism of genetic algorithms. All interactions are via standardized messages, so that conditions are simply defined in terms of the messages they accept and actions are defined in terms of the messages they send.
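The standardized-message interface described above can be illustrated with a minimal sketch, assuming the classic ternary condition alphabet {0, 1, #} over fixed-length binary messages (the '#' wildcard accepts either bit). The rule strings and function names here are illustrative, not taken from the editorial:

```python
def matches(condition, message):
    """A condition accepts a message iff every non-# position agrees.

    condition: string over {'0', '1', '#'}; '#' is a wildcard.
    message:   binary string of the same length.
    """
    return len(condition) == len(message) and all(
        c == '#' or c == m for c, m in zip(condition, message)
    )

# A classifier maps a condition to the message it posts when matched.
rules = {
    "1##0": "0001",   # fires on any message starting with 1, ending with 0
    "00##": "0010",
}

def step(message_list, rules):
    """One match cycle: every rule whose condition accepts some current
    message posts its action message (all rules act in parallel)."""
    posted = []
    for cond, action in rules.items():
        if any(matches(cond, msg) for msg in message_list):
            posted.append(action)
    return posted
```

Because every condition and action is just a string over a small alphabet, a genetic algorithm can recombine and mutate rule strings directly, which is the "simple syntax" advantage the text refers to.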
The resulting systems are computationally complete, and the simple syntax makes it easy for a genetic algorithm to discover building blocks appropriate for the construction of new candidate rules. Because classifier systems rely on competition to resolve conflicts, they need no algorithms for determining the global consistency of a set of rules. As a consequence, new rules can be inserted in an existing system, as trials or hypotheses, without disturbing established capacities. This gracefulness makes it possible for the system to operate incrementally, testing new structures and hypotheses while steadily improving its performance.

Arguments for the evolutionary metaphor

These attractive properties of genetics-based systems - explicit parallelism, implicit parallelism, and gracefulness - are explored more fully in the papers that follow. However, before proceeding further we must answer an important question. Of the two natural archetypes of learning available to us - the brain and evolution - why have genetic algorithm researchers knowingly adopted the "wrong" metaphor? One reason is expedience. The processes of natural evolution and natural genetics have been illuminated by a century of en
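The population-based search procedure described earlier - sampling a population of strings, then biasing the next generation toward above-average building blocks via selection and recombination - can be sketched as a toy genetic algorithm. Everything here is illustrative: the one-max fitness function, fitness-proportionate selection, one-point crossover, and all parameter values are standard textbook choices, not details from the editorial:

```python
import random

def one_max(bits):
    """Toy fitness function: count of 1s in the string."""
    return sum(bits)

def genetic_algorithm(pop_size=20, length=16, generations=40,
                      crossover_rate=0.9, mutation_rate=0.01, seed=0):
    rng = random.Random(seed)
    # Population: a distributed set of samples (strings) from the space.
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [one_max(ind) for ind in pop]
        total = sum(fits)

        def select():
            # Fitness-proportionate selection biases reproduction toward
            # above-average strings (and, implicitly, their substrings).
            r = rng.uniform(0, total)
            acc = 0.0
            for ind, f in zip(pop, fits):
                acc += f
                if acc >= r:
                    return ind
            return pop[-1]

        nxt = []
        while len(nxt) < pop_size:
            a, b = select()[:], select()[:]
            if rng.random() < crossover_rate:
                # One-point crossover recombines building blocks
                # from two parent strings.
                cut = rng.randrange(1, length)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                for i in range(length):
                    if rng.random() < mutation_rate:
                        child[i] ^= 1  # rare bit-flip mutation
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=one_max)

best = genetic_algorithm()
```

Note that no individual string's substrings are ever enumerated explicitly; the implicit parallelism the text describes arises because each evaluated string is simultaneously a sample of every schema (substring pattern) it instantiates.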