An essential part of science is detecting variation in data. I will give an introduction to dependence logic, an approach to logic that emphasizes the detection of variation and of dependences between variables. I will discuss algorithmic and model-theoretic properties of dependence logic, and its relation to notions of dependence in other fields, e.g. biology, model theory, computer science, quantum physics, and economics. In science it is natural to consider probabilities of formulas rather than just the truth values true/false. Suppose we have a set of assignments of fixed variables into the domain of a first-order structure; we call such a set a team. Semantics based on teams is the concept underlying dependence logic. We may then ask: what is the probability that a randomly chosen assignment in a team satisfies a given first-order formula in the structure? The Hardy-Weinberg theorem is an example of such a first-order property of probabilities in teams. We give axioms for making inferences about first-order properties of probabilities, and we prove the completeness of these axioms. Much of the talk is joint work with T. Hyttinen and G. Paolini.
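As a toy illustration (not part of the abstract itself), the probability of a formula in a team can be computed directly: a team is a finite set of assignments, and the probability of a formula is the fraction of assignments satisfying it. The variable names, the small teams, and the Hardy-Weinberg-style identity checked below are illustrative assumptions, a minimal sketch rather than the machinery of the talk.

```python
# A minimal sketch of probabilities in team semantics (all names illustrative).
# A team is a finite set of assignments; each assignment maps variables
# to elements of the structure's domain.

from fractions import Fraction

def probability(team, formula):
    """Fraction of assignments in the team that satisfy the formula,
    i.e. the probability that a uniformly random assignment satisfies it."""
    satisfying = sum(1 for assignment in team if formula(assignment))
    return Fraction(satisfying, len(team))

# Example team: assignments of variables x, y into the domain {0, 1}.
team = [
    {"x": 0, "y": 0},
    {"x": 0, "y": 1},
    {"x": 1, "y": 0},
    {"x": 1, "y": 1},
]

# Probability that a random assignment satisfies x = y: 2 out of 4.
print(probability(team, lambda s: s["x"] == s["y"]))  # 1/2

# A Hardy-Weinberg-style identity: in this (illustrative) team, where each
# assignment records two alleles, the frequency of genotype AA equals the
# product of the frequencies of allele A in each position -- a first-order
# property of probabilities in the team.
hw_team = [
    {"a1": "A", "a2": "A"},
    {"a1": "A", "a2": "a"},
    {"a1": "a", "a2": "A"},
    {"a1": "a", "a2": "a"},
]
p_AA = probability(hw_team, lambda s: s["a1"] == "A" and s["a2"] == "A")
p_A1 = probability(hw_team, lambda s: s["a1"] == "A")
p_A2 = probability(hw_team, lambda s: s["a2"] == "A")
print(p_AA == p_A1 * p_A2)  # True
```

Using `Fraction` keeps the probabilities exact, which matches the logical setting: the axioms in the talk concern first-order properties of these rational values, not floating-point approximations.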