Nonparametric and nonlinear measures of statistical dependence between pairs of random variables have proved to be important tools in modern data analysis, where the emergence of large data sets can support the relaxation of the linearity assumptions implicit in traditional association scores such as correlation. In this talk, I will present two Bayesian nonparametric procedures to test for dependence. In the first, a tractable, explicit, and analytic quantification of the relative evidence for dependence versus independence is derived using Pólya tree priors on the space of probability measures. In the second, the unknown sampling distribution is specified via Dirichlet Process Mixtures (DPM) of Gaussians, which provide great flexibility while also encoding smoothness assumptions. After describing the methods, I will contrast their performance in high-dimensional spaces. These procedures accommodate uncertainty about the form of the underlying sampling distribution and provide an explicit posterior probability of both dependence and independence. Well-known advantages of having an explicit probability measure include the easy comparison of evidence across different studies, the inclusion of prior information, and the integration of results within formal decision analysis.
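To illustrate the flavour of the first procedure, the sketch below computes a log Bayes factor comparing a joint Pólya tree on the unit square (dependence) against the product of two marginal Pólya trees (independence), after mapping each variable to (0, 1) by normalised ranks. This is a minimal, simplified sketch of the general idea, not the exact construction presented in the talk: the dyadic partitions, the prior scaling `alpha_m = c * m**2`, the truncation depth `max_depth`, and all function names are illustrative choices.

```python
import numpy as np
from math import lgamma

def to_uniform(x):
    """Map a sample to (0, 1) via normalised ranks (empirical copula margins)."""
    n = len(x)
    return (np.argsort(np.argsort(x)) + 0.5) / n

def pt_logml_1d(u, c=1.0, max_depth=5, lo=0.0, hi=1.0, depth=1):
    """Log marginal likelihood of a dyadic Polya tree on [0, 1] (ordered sample)."""
    n = len(u)
    if depth > max_depth or n == 0:
        return 0.0
    mid = 0.5 * (lo + hi)
    left = u[u < mid]
    a = c * depth ** 2                      # common PT choice: alpha_m = c * m^2
    n0, n1 = len(left), n - len(left)
    # Beta-Binomial contribution of this binary split
    ll = (lgamma(2 * a) - lgamma(2 * a + n)
          + lgamma(a + n0) + lgamma(a + n1) - 2 * lgamma(a))
    return (ll + pt_logml_1d(left, c, max_depth, lo, mid, depth + 1)
               + pt_logml_1d(u[u >= mid], c, max_depth, mid, hi, depth + 1))

def pt_logml_2d(u, v, c=1.0, max_depth=5, box=(0.0, 1.0, 0.0, 1.0), depth=1):
    """Log marginal likelihood of a quadrant-splitting Polya tree on [0, 1]^2."""
    n = len(u)
    if depth > max_depth or n == 0:
        return 0.0
    x0, x1, y0, y1 = box
    xm, ym = 0.5 * (x0 + x1), 0.5 * (y0 + y1)
    a = c * depth ** 2
    masks = [(u < xm) & (v < ym), (u < xm) & (v >= ym),
             (u >= xm) & (v < ym), (u >= xm) & (v >= ym)]
    counts = [int(m.sum()) for m in masks]
    # Dirichlet-multinomial contribution of the 4-way split
    ll = (lgamma(4 * a) - lgamma(4 * a + n)
          + sum(lgamma(a + k) - lgamma(a) for k in counts))
    boxes = [(x0, xm, y0, ym), (x0, xm, ym, y1),
             (xm, x1, y0, ym), (xm, x1, ym, y1)]
    return ll + sum(pt_logml_2d(u[m], v[m], c, max_depth, b, depth + 1)
                    for m, b in zip(masks, boxes))

def log_bayes_factor(x, y, **kw):
    """log BF: joint PT (dependence) vs product of marginal PTs (independence)."""
    u, v = to_uniform(x), to_uniform(y)
    return pt_logml_2d(u, v, **kw) - pt_logml_1d(u, **kw) - pt_logml_1d(v, **kw)
```

A positive log Bayes factor favours dependence; because both hypotheses evaluate the same ordered sample, the combinatorial coefficients cancel and are omitted throughout. The rank transform makes the margins uniform, so the marginal Pólya trees act as a calibrated baseline for the joint tree.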