Whom to marry? How to invest? Whom to trust? Complex problems require complex solutions – so we might think. And if the solution doesn’t work, we make it more complex. That recipe is perfect for a world of known risks, but not for an uncertain world, as the failure of complex forecasting methods in the run-up to the 2008 financial crisis illustrates. In order to reduce estimation error, good inferences under uncertainty counter-intuitively require ignoring part of the available information. Less can be more.

Yet although we face high degrees of uncertainty on a daily basis, most of economics and cognitive science deals exclusively with lotteries and similar situations in which all risks are perfectly known or can be easily estimated. In this talk, I invite you to explore the land of uncertainty, where mathematical probability is of limited value and people rely instead on simple heuristics, that is, on rules of thumb. We meet Homo heuristicus, who has been disparaged by many psychologists as irrational for ignoring information—unlike the more diligent Homo economicus. In an uncertain world, however, simple heuristics can be a smart tool and can lead to better decisions than supposedly rational strategies.

The study of heuristics has three goals. The first is descriptive: to analyze the heuristics in the “adaptive toolbox” of an individual or an institution. The second goal is normative: to identify the ecological rationality of a given heuristic, that is, the structures of environments in which it succeeds and fails. The third goal is engineering: to design intuitive heuristics such as fast-and-frugal trees that help physicians make better decisions.
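
To make the engineering goal concrete, here is a minimal sketch of what a fast-and-frugal tree looks like in code: a sequence of yes/no questions, each of which has an immediate exit. The medical cue names, their ordering, and the triage labels below are hypothetical, chosen purely for illustration.

```python
# A fast-and-frugal tree: each question either exits with a decision
# or passes the case on to the next question. Cues and thresholds
# here are illustrative placeholders, not clinical guidance.

def fast_and_frugal_triage(patient):
    """Return 'coronary care' or 'regular ward' for a patient record."""
    # Question 1: an elevated ST segment exits straight to coronary care.
    if patient["st_segment_elevated"]:
        return "coronary care"
    # Question 2: if chest pain is not the chief complaint, exit to ward.
    if not patient["chest_pain_chief_complaint"]:
        return "regular ward"
    # Question 3: the last cue decides the final exit either way.
    if patient["other_risk_factor"]:
        return "coronary care"
    return "regular ward"

# Usage: a patient with chest pain but no other warning signs.
print(fast_and_frugal_triage(
    {"st_segment_elevated": False,
     "chest_pain_chief_complaint": True,
     "other_risk_factor": False}))  # prints "regular ward"
```

Note the frugality: the tree ignores every cue after the first decisive one, rather than weighting and summing all available information as a regression model would.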