Inductive Reasoning
Automatic generation of membership functions can also be accomplished by exploiting the essential characteristic of inductive reasoning, which derives a general consensus from the particular, that is, it derives the generic from the specific. The induction is performed by the entropy minimization principle, which clusters the parameters corresponding to the output classes in a nearly optimal way.
This method is based on an ideal scheme that describes the input and output relationships for a well-established database; that is, the method generates membership functions based solely on the data provided. The method can be quite useful for complex systems where the data are abundant and static. In situations where the data are dynamic, the method may not be useful, since the membership functions will continually change with time. The intent of induction is to discover a law having objective validity and universal application. The laws of induction are summarized here:
- Given a set of irreducible outcomes of an experiment, the induced probabilities are those probabilities consistent with all available information that maximize the entropy of the set (a worked case follows this list).
- The induced probability of a set of independent observations is proportional to the probability density of the induced probability of a single observation.
- The induced rule is that rule consistent with all available information of which the entropy is minimum.
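As a small worked case of the first law: if the only available information about $N$ irreducible outcomes is that their probabilities are nonnegative and sum to one, then maximizing the entropy subject to

$$
\sum_{i=1}^{N} p_i = 1
$$

yields the uniform assignment $p_i = 1/N$ for all $i$; absent further evidence, every outcome is induced to be equally likely.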
Of these three laws, the third is the appropriate one for developing membership functions. A key goal of entropy minimization analysis is to determine the quantity of information in a given data set. The entropy of a probability distribution is a measure of the uncertainty of the distribution. This information measure compares the content of the data with a prior probability for the same data. The higher the prior estimate of the probability for an outcome to occur, the lower will be the information gained by observing it to occur. The entropy of a set of possible outcomes of a trial, where one and only one outcome is true, is defined by the summation, over all outcomes, of each probability multiplied by the logarithm of that probability.
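Written out, this is the standard Shannon entropy; with $p_i$ denoting the probability of the $i$-th of $N$ possible outcomes,

$$
H = -\sum_{i=1}^{N} p_i \ln p_i ,
$$

where the leading minus sign makes $H$ nonnegative, since each $\ln p_i \le 0$.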
In other words, entropy is the expected value of information.
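To make the entropy minimization concrete, the sketch below scans candidate thresholds along a single parameter axis and keeps the partition whose weighted class entropy is smallest; the resulting segments are natural candidates for membership function supports. This is a minimal sketch under simplifying assumptions: the names (`entropy`, `best_threshold`) and the sample data are illustrative only, and class probabilities are estimated by raw frequencies rather than any particular smoothed estimator used in the literature.

```python
import math

def entropy(counts):
    """Shannon entropy -sum(p * ln p) of a discrete distribution,
    computed from raw class counts."""
    total = sum(counts)
    if total == 0:
        return 0.0
    ent = 0.0
    for n in counts:
        if n > 0:
            p = n / total
            ent -= p * math.log(p)
    return ent

def best_threshold(samples):
    """Scan candidate thresholds and return the one that minimizes
    the weighted entropy of the two induced segments.

    samples: list of (value, class_label) pairs along one parameter axis.
    """
    samples = sorted(samples)
    values = [v for v, _ in samples]
    labels = sorted({c for _, c in samples})
    n = len(samples)
    best_x, best_s = None, float("inf")
    # Candidate thresholds: midpoints between consecutive sample values.
    for i in range(n - 1):
        x = (values[i] + values[i + 1]) / 2.0
        left = [c for v, c in samples if v <= x]
        right = [c for v, c in samples if v > x]
        # Weighted entropy of the two-segment partition induced by x.
        s = (len(left) / n) * entropy([left.count(c) for c in labels]) \
            + (len(right) / n) * entropy([right.count(c) for c in labels])
        if s < best_s:
            best_x, best_s = x, s
    return best_x, best_s

# Hypothetical data: one parameter, two output classes.
data = [(0.5, "A"), (0.9, "A"), (1.2, "A"), (2.8, "B"), (3.1, "B"), (3.6, "B")]
x, s = best_threshold(data)
print(f"partition at x = {x:.2f}, weighted entropy = {s:.4f}")
```

In practice the scan can be repeated within each segment to partition the axis further, refining the output classes as finer resolution is required.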