Soft computing is an emerging approach to computing which parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision.
Soft computing is a field of science which makes use of inexact solutions for problems that have no known method of computing an exact solution.
The idea behind soft computing is to model the cognitive behavior of the human mind; soft computing is the foundation of conceptual intelligence in machines.
There is a main difference between soft computing and possibility. Possibility is used when we do not have enough information to solve a problem, whereas soft computing is used when we do not have enough information about the problem itself. These kinds of problems originate in the human mind with all its doubts, subjectivity and emotions; an example is determining a suitable temperature for a room to make people feel comfortable.
Goals of soft computing:
- The main goal of soft computing is to develop intelligent machines that provide solutions to real-world problems which are not modeled, or are too difficult to model, mathematically.
- It aims to exploit the tolerance for approximation, uncertainty, imprecision and partial truth in order to achieve a close resemblance to human-like decision making.
- Its scope has been extended to include bioinformatics aspects.
- Soft computing enables industry to be innovative due to its characteristics: tractability, low cost and a high machine intelligence quotient.
Some of its principal components include:
- Neural Network (NN)
- Fuzzy Logic (FL)
- Genetic Algorithm (GA)
These methodologies form the core of soft computing.
In general, a neural network is a highly interconnected network of a large number of processing elements, called neurons, in an architecture inspired by the brain.
It is a pattern-matching technique in which input patterns are matched with specific output patterns, modeled after the neurons in the brain.
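The pattern-matching idea can be sketched with the simplest possible neural network: a single neuron (perceptron) trained to reproduce the logical AND function. The training procedure below is the classic perceptron learning rule; the learning rate and epoch count are illustrative choices, not prescribed by the text.

```python
# A minimal single-neuron (perceptron) sketch in pure Python.
# It learns the AND function from examples; hyperparameters are illustrative.

def step(x):
    """Threshold activation: the neuron either fires (1) or not (0)."""
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), target) pairs."""
    w = [0.0, 0.0]   # connection weights
    b = 0.0          # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out
            # Nudge weights toward reducing the error
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_samples)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in and_samples]
print(predictions)  # → [0, 0, 0, 1], matching the AND targets
```

Because AND is linearly separable, a single neuron suffices; recognizing more complex patterns requires layers of such neurons, which is what gives neural networks the properties listed below.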
Neural networks have many advantages, such as:
- They handle noise well. Once trained, neural nets show an ability to recognize patterns even though part of the input data is missing or obscured.
- They are robust. Because the information is distributed, neural nets can survive the failure of some nodes.
- They implement parallelism.
- They are able to learn new patterns.
- By patterning themselves after the architecture of the brain, they provide a model of intelligent mechanism.
- They have had success in areas like vision that have frustrated more traditional approaches.
- They are a promising model of associative memory.
- They provide a tool for modeling and exploring brain function, much as production systems have helped cognitive scientists study higher-level cognitive processes.
- They can provide nonlinear mappings to represent complicated problems.
Because of these advantages, research in neural networks is growing, with applications in areas such as:
- Function approximation
- Data processing
Fuzzy logic is a superset of conventional logic that has been extended to handle the concept of partial truth: truth values between "completely true" and "completely false". The idea of fuzzy logic was first advanced by Dr. Lotfi Zadeh of the University of California at Berkeley in the 1960s. Dr. Zadeh was working on the problem of computer understanding of natural language. Natural language (like most other activities in life and indeed the universe) is not easily translated into the absolute terms of 0 and 1. (Whether everything is ultimately describable in binary terms is a philosophical question worth pursuing, but in practice much data we might want to feed a computer is in some state in between, and so, frequently, are the results of computing.)
Zadeh says that rather than regarding fuzzy theory as a single theory, we should regard the process of "fuzzification" as a methodology to generalize ANY specific theory from a crisp (discrete) to a continuous (fuzzy) form. Thus researchers have recently introduced "fuzzy calculus", "fuzzy differential equations" and "fuzzy control". Its applications include:
- Knowledge Representation
- Pattern recognition
- Data and Information processing
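Partial truth can be made concrete with a membership function, which assigns each input a degree of truth between 0 and 1. The sketch below uses a triangular membership function and Zadeh's min/max operators for fuzzy AND/OR; the "comfortable room temperature" set, with its particular breakpoints, is a made-up example echoing the one mentioned earlier, not a standard definition.

```python
# Illustrative fuzzy-logic sketch: a triangular membership function plus
# Zadeh's min/max operators. The "comfortable temperature" set is invented.

def triangular(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set: 0 outside
    [a, c], rising to 1 at the peak b, falling back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzy_and(u, v):  # Zadeh's min operator
    return min(u, v)

def fuzzy_or(u, v):   # Zadeh's max operator
    return max(u, v)

# "Comfortable" room temperature: fully true at 22 °C, fading to false
# below 18 °C and above 26 °C (illustrative breakpoints).
def comfort(t):
    return triangular(t, 18.0, 22.0, 26.0)

print(comfort(22.0))  # → 1.0  (completely true)
print(comfort(24.0))  # → 0.5  (partially true)
print(comfort(30.0))  # → 0.0  (completely false)
```

Conventional logic is recovered as the special case where membership degrees are restricted to exactly 0 or 1, which is why fuzzy logic is called a superset of it.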
A genetic algorithm (GA) is a search heuristic that mimics the process of natural selection. This heuristic is routinely used to generate useful solutions to optimization and search problems. Genetic algorithms belong to the larger class of evolutionary algorithms (EAs), which generate solutions to optimization problems using techniques inspired by natural evolution, such as inheritance, mutation, selection, and crossover.
GAs encode the decision variables of a search problem into finite-length strings of alphabets of certain cardinality. The strings, which are candidate solutions to the search problem, are referred to as chromosomes; the alphabets are referred to as genes, and the values of genes are called alleles. For example, in a problem such as the traveling salesman problem, a chromosome represents a route, and a gene may represent a city. In contrast to traditional optimization techniques, GAs work with a coding of the parameters, rather than the parameters themselves.
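The selection, crossover and mutation steps described above can be sketched on a toy problem: evolving binary chromosomes to maximize the number of 1-bits (often called "OneMax"). The population size, mutation rate and generation count below are illustrative choices, and the fitness function is deliberately trivial so the evolutionary loop stays visible.

```python
# A minimal genetic-algorithm sketch: binary chromosomes, tournament
# selection, single-point crossover and bit-flip mutation, applied to the
# toy "OneMax" problem (maximize the count of 1-alleles). All parameter
# values are illustrative.
import random

random.seed(42)  # fixed seed so the run is reproducible
LENGTH, POP, GENERATIONS = 20, 30, 60

def fitness(chrom):
    return sum(chrom)  # alleles are 0/1, so fitness = number of ones

def select(pop):
    # Tournament selection: the fitter of two random chromosomes wins
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover: child inherits a prefix of one parent
    # and the suffix of the other
    point = random.randint(1, LENGTH - 1)
    return p1[:point] + p2[point:]

def mutate(chrom, rate=0.02):
    # Bit-flip mutation: each allele flips with a small probability
    return [1 - g if random.random() < rate else g for g in chrom]

population = [[random.randint(0, 1) for _ in range(LENGTH)]
              for _ in range(POP)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP)]

best = max(population, key=fitness)
print(fitness(best))  # approaches LENGTH (20) as the population evolves
```

Note that the loop never inspects the problem structure directly: it only evaluates fitness on encoded strings, which is the sense in which GAs work with a coding of the parameters rather than the parameters themselves.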