Many diagnostic applications require not only that samples be classified, but also a certainty value, supplied alongside the predicted class label, indicating the strength of the diagnosis. The first part of this Thesis proposes an extension to the decision tree framework that handles classification uncertainty by computing the distance to the relevant decision boundary and estimating class density, correct classification probability, and confidence. The method also applies to trees that use oblique hyperplanes to partition the input space, and it is not restricted to the Euclidean distance metric. The second part shows that these classification confidence values can be combined to derive a consensus decision. With the proposed combination scheme there is no need for an auxiliary combiner or weighting network: the weights are provided adaptively by the individual tree classifiers in the ensemble, and new classifiers can be added dynamically without any retraining of, or modification to, the existing system. This discussion can give a head start to anyone, researcher or developer, dealing with problems where classification certainty is an issue.
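
The two ideas above, distance-to-boundary confidence for an oblique split and confidence-weighted consensus without an auxiliary combiner, can be sketched as follows. This is an illustrative sketch only, not the Thesis's actual formulation: the function names, the logistic mapping from distance to confidence, and the additive vote aggregation are all assumptions made for the example.

```python
import numpy as np

def boundary_distance(x, w, b):
    """Euclidean distance from sample x to the oblique hyperplane w.x + b = 0.
    (The Thesis is not restricted to this metric; it is used here for brevity.)"""
    return abs(np.dot(w, x) + b) / np.linalg.norm(w)

def confidence(x, w, b, scale=1.0):
    """Map the boundary distance to a confidence in [0, 1) with an assumed
    logistic squash: 0 exactly on the boundary, approaching 1 far from it."""
    d = boundary_distance(x, w, b)
    return 2.0 / (1.0 + np.exp(-scale * d)) - 1.0

def consensus(votes):
    """Confidence-weighted consensus over (label, confidence) pairs.
    No auxiliary combiner or weighting network: each classifier in the
    ensemble supplies its own weight, so new members can be appended to
    `votes` without retraining anything."""
    totals = {}
    for label, conf in votes:
        totals[label] = totals.get(label, 0.0) + conf
    return max(totals, key=totals.get)
```

For example, a single high-confidence vote can outweigh two weaker ones: `consensus([("A", 0.9), ("B", 0.4), ("B", 0.4)])` yields `"A"` because the summed weight for A (0.9) exceeds that for B (0.8).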