LEARNING WITH SELECTED FEATURES

      

ABSTRACT :

Feature selection is the task of choosing a small subset of features that is sufficient to predict the target labels well. Here, instead of trying to directly determine which features are better, we attempt to learn the properties of good features. For this purpose we assume that each feature is represented by a set of properties, referred to as meta-features. This approach enables prediction of the quality of features without measuring their value on the training instances. We use this ability to devise new selection algorithms that can efficiently search for new good features in the presence of a huge number of features, and to dramatically reduce the number of feature measurements needed.
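As an illustration of this idea, the following Python sketch (a toy example, not the paper's implementation) measures feature quality on only a small "observed" subset of candidate features, learns to predict quality from meta-features, and then ranks the remaining features without ever measuring them on the training instances. The synthetic data, the meta-feature construction, and the use of mutual information as the quality score are assumptions made for this example.

```python
# Toy sketch (not the paper's implementation): learn to predict a feature's
# quality from its meta-features, then rank unmeasured features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
n_samples, n_features = 200, 50

# Toy data: only features 0-4 actually carry information about the label.
X = rng.normal(size=(n_samples, n_features))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

# Hypothetical meta-features: one property that happens to correlate with
# feature quality, and one irrelevant property.
meta = np.column_stack([
    (np.arange(n_features) < 5).astype(float),
    rng.normal(size=n_features),
])

# Measure quality (mutual information with the label) only on a small
# "observed" subset of the features; the rest are never measured.
observed = rng.choice(n_features, size=20, replace=False)
quality_observed = mutual_info_classif(X[:, observed], y, random_state=0)

# Learn the mapping meta-features -> quality on the observed subset.
quality_model = RandomForestRegressor(n_estimators=100, random_state=0)
quality_model.fit(meta[observed], quality_observed)

# Predict the quality of *all* features from meta-features alone and keep the top 5.
predicted_quality = quality_model.predict(meta)
top_k = np.argsort(predicted_quality)[::-1][:5]
print("selected features:", sorted(top_k.tolist()))
```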

EXISTING SYSTEM :

• This observation is notable since it explains the existence of a universal set of features (prototypes) that enables recognition of most objects, regardless of whether the prototypes were taken from pictures that contain the relevant objects or not.
• This result supports the existence of a universal set of features (a universal dictionary) that can be used for the recognition of most objects.
• The existence of such a dictionary is a key issue in computer vision and brain research.
• We also showed that when the selection of features is based on meta-features, it is possible to derive better generalization bounds on the combined problem of selection and classification.

DISADVANTAGE :

• We demonstrate our algorithms on a handwritten digit recognition problem and a visual object category recognition problem.
• In addition, we show how this novel viewpoint enables the derivation of better generalization bounds for the joint learning problem of selection and classification.
• The main motivations for feature selection are computational complexity, reducing the cost of measuring features, improved classification accuracy, and problem understanding.

PROPOSED SYSTEM :

• Krupka and Tishby (2007) proposed a framework that incorporates prior knowledge on features, represented by meta-features, into learning.
• They assume that a weight is assigned to each feature, as in linear discrimination, and they use the meta-features to define a prior on the weights (see the sketch after this list).
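The following is a minimal sketch of the prior-on-weights idea, assuming a linear regression model and a ridge-style penalty; it is a simplification for illustration, not Krupka and Tishby's exact formulation. The meta-feature matrix M and the map u from meta-features to weights are invented for the example, and in practice u would itself be learned rather than given.

```python
# Minimal sketch: meta-features define the prior mean of the per-feature
# weights, and the learned weights are shrunk toward that prior instead of
# toward zero (ridge-style simplification, not the original formulation).
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_features, n_meta = 100, 30, 4

M = rng.normal(size=(n_features, n_meta))        # one row of meta-features per feature
u = rng.normal(size=n_meta)                      # assumed meta-feature -> weight map (given here to keep the sketch short)
w_true = M @ u + 0.3 * rng.normal(size=n_features)

X = rng.normal(size=(n_samples, n_features))
y = X @ w_true + 0.1 * rng.normal(size=n_samples)

# Meta-features define the prior mean of the weights.
w_prior = M @ u
lam = 10.0

# Ridge solution shrunk toward the meta-feature prior instead of toward zero:
#   w* = argmin ||X w - y||^2 + lam * ||w - w_prior||^2
A = X.T @ X + lam * np.eye(n_features)
w_meta = np.linalg.solve(A, X.T @ y + lam * w_prior)
w_zero = np.linalg.solve(A, X.T @ y)             # standard ridge (zero-mean prior)

print("weight error with meta-feature prior:", np.linalg.norm(w_meta - w_true))
print("weight error with zero-mean prior   :", np.linalg.norm(w_zero - w_true))
```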

ADVANTAGE :

• In feature extraction, the original input features (for example, pixels) are used to generate new, more complicated features (for example, the logical AND of sets of 3 binary pixels).
• In the most common selection paradigm, an evaluation function is used to assign scores to subsets of features, and a search algorithm is used to search for a subset with a high score (both are illustrated in the sketch after this list).
• While in their work meta-features are used for learning a better classifier, in this work meta-features are used for feature selection.
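The sketch below illustrates the first two points with toy data; the binary pixel data, the AND-of-3-pixels construction, and the cross-validated accuracy score are assumptions of the example, not the paper's setup. Candidate features are extracted as logical ANDs of 3 binary pixels, an evaluation function scores subsets, and a greedy forward search looks for a high-scoring subset.

```python
# Toy sketch (invented data):
# (1) feature extraction: each candidate feature is the logical AND of 3 binary pixels;
# (2) score-and-search selection: an evaluation function scores subsets by
#     cross-validated accuracy and a greedy forward search builds the subset.
import numpy as np
from itertools import combinations
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_samples, n_pixels = 300, 10
pixels = rng.integers(0, 2, size=(n_samples, n_pixels))   # binary "images"
y = pixels[:, 0] & pixels[:, 1] & pixels[:, 2]             # label is an AND of 3 pixels

# (1) Feature extraction: every AND of 3 distinct pixels is a candidate feature.
triples = list(combinations(range(n_pixels), 3))
features = np.column_stack(
    [pixels[:, list(t)].all(axis=1) for t in triples]
).astype(float)

# (2) Evaluation function: cross-validated accuracy of a linear classifier
# trained on the candidate subset.
def score(subset):
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, features[:, subset], y, cv=3).mean()

# Greedy forward search for a subset of two extracted features.
selected = []
for _ in range(2):
    remaining = [j for j in range(len(triples)) if j not in selected]
    best = max(remaining, key=lambda j: score(selected + [j]))
    selected.append(best)

print("selected AND-features (pixel triples):", [triples[j] for j in selected])
print("cross-validated accuracy:", round(score(selected), 3))
```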
