The jackknife and the bootstrap are nonparametric methods for assessing the errors in a statistical estimation problem. They provide several advantages over the traditional parametric approach: the methods are easy to describe, and they apply to arbitrarily complicated situations; distributional assumptions, such as normality, are never made. This monograph connects the jackknife, the bootstrap, and many other related ideas, such as cross-validation, random subsampling, and balanced repeated replications, into a unified exposition. The theoretical development is at an easy mathematical level and is supplemented by a large number of numerical examples. The methods described in this monograph form a useful set of tools for the applied statistician. They are particularly useful in problem areas where complicated data structures are common, for example, in censoring, missing data, and highly multivariate situations.
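To make the two central ideas concrete, here is a minimal sketch of both resampling schemes applied to the standard error of a simple statistic. The function names, the toy data, and the choice of 2000 bootstrap replicates are illustrative assumptions, not taken from the monograph; only the resampling recipes themselves (resample with replacement for the bootstrap, leave one observation out at a time for the jackknife) reflect the methods described.

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_boot=2000, seed=0):
    """Bootstrap standard error: resample the data with replacement,
    recompute the statistic on each resample, and take the standard
    deviation of the replicates. (n_boot and seed are illustrative.)"""
    rng = random.Random(seed)
    n = len(data)
    replicates = [stat([rng.choice(data) for _ in range(n)])
                  for _ in range(n_boot)]
    return statistics.stdev(replicates)

def jackknife_se(data, stat=statistics.mean):
    """Jackknife standard error: recompute the statistic with each
    observation deleted in turn, then scale the spread of these
    leave-one-out values by (n - 1) / n."""
    n = len(data)
    reps = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    mean_rep = statistics.mean(reps)
    return ((n - 1) / n * sum((r - mean_rep) ** 2 for r in reps)) ** 0.5

# Hypothetical sample, used only to exercise the two estimators.
data = [2.1, 3.4, 1.8, 5.0, 4.2, 3.3, 2.7, 4.8, 3.9, 2.5]
boot = bootstrap_se(data)
jack = jackknife_se(data)
```

Note that no normality assumption enters anywhere: both estimators use only the empirical distribution of the data. For the sample mean, the jackknife reproduces the classical formula s/√n exactly, while the bootstrap value fluctuates slightly with the resampling but converges to the same quantity as the number of replicates grows.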