Algorithm::DecisionTree - A pure-Perl implementation for constructing a
decision tree from multidimensional training data and for using the
decision tree thus induced to classify new data.

The decision tree is induced from training data supplied through a disk
file. As the documentation explains, the training data file must begin
with a header that lists the names of the classes, the names of the
features, their possible values, etc. Please see training.dat in the
examples directory.

You can generate your own training data by specifying the class names,
the feature names, the names to be used for the feature values, etc.
All of this information is supplied to the data generator in the form
of a parameter file. Please see param.txt in the examples directory for
an example.

From the standpoint of practical usefulness, note that the classifier
carries out soft classifications. That is, if the class distributions
overlap in the underlying feature space and a test sample falls in the
overlap region, the classifier will generate all applicable class
labels for the test sample, along with the probability of each class
label.

For installation, do the usual:

    perl Makefile.PL
    make
    make test
    make install

if you have root access. If not:

    perl Makefile.PL prefix=/some/other/directory/
    make
    make test
    make install

Contact:

Avinash Kak
email: kak@purdue.edu

Please place the string "DecisionTree" in the subject line if you wish
to write to the author. Any feedback regarding this module would be
highly appreciated.
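To give a feel for the workflow (induce a tree from a training file,
then soft-classify a test sample), here is a minimal usage sketch. The
constructor parameter, method names, and the test-sample values shown
are assumptions based on the module's documented interface and its
examples directory; the exact calling conventions may differ between
versions, so please consult the POD of the version you install:

    use strict;
    use warnings;
    use Algorithm::DecisionTree;

    # Point the module at a training data file formatted as described
    # above (see training.dat in the examples directory):
    my $dt = Algorithm::DecisionTree->new(
                 training_datafile => 'training.dat',
             );

    # Read in the training data and induce the decision tree:
    $dt->get_training_data();
    my $root_node = $dt->construct_decision_tree_classifier();

    # Soft classification: all applicable class labels are returned,
    # each with its probability. The feature values below are
    # hypothetical placeholders for a real test sample:
    my @test_sample = qw/ exercising=never smoking=heavy
                          fatIntake=heavy videoAddiction=heavy /;
    my $classification = $dt->classify($root_node, @test_sample);

    # Print each applicable class label and its probability:
    print "$_ => $classification->{$_}\n" for keys %$classification;

Because the classifier is soft, more than one class label may appear in
the output when the test sample falls in a region where the class
distributions overlap.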