**Algorithmic Dimensions and Randomness**

Martin-Löf's 1966 discovery that the theory of computing, and only the theory of computing, provides a satisfactory definition of the randomness of individual binary sequences was an early and dramatic indicator of things to come. In addition to providing hardware and software tools, computer science has developed deep theoretical tools that are conceptually transforming the foundations of several areas of science. Many of these tools, like randomness, arise from the growing unification of theoretical computer science with information theory.

I have been involved in the development of several types of effective fractal dimensions, including:

* Algorithmic dimension (2000)
* Finite-state dimension (2001, with Dai, Lathrop, and Mayordomo)
* Algorithmic strong dimension (2003, with Athreya, Hitchcock, and Mayordomo)
* Mutual dimension (2013, with Case)
* Conditional dimension (2018, with N. Lutz)

The first of these refines randomness and yields a new characterization of Kolmogorov complexity. The second sheds new light on normal sequences and data compression. The third shows that packing dimension (an ostensibly deeper fractal dimension developed in 1982) is an exact dual of Hausdorff dimension with similarly useful effectivizations. The last two are dimension-theoretic analogs of mutual information and conditional entropy.

The Point-to-Set Principle (2018, with N. Lutz) has enabled a growing body of new classical theorems (theorems whose statements do not involve computability or related aspects of mathematical logic) in geometric measure theory to be proven using the relativized algorithmic dimensions of individual points in Euclidean spaces. Some of these new theorems have solved long-standing open problems.
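For readers who want the formal content behind these remarks, the standard statements can be sketched as follows (notation is the usual one, not taken from this page: $K$ denotes prefix-free Kolmogorov complexity, $S\upharpoonright n$ the first $n$ bits of a sequence $S$, and $\dim^A$ algorithmic dimension relative to an oracle $A$):

```latex
% Kolmogorov-complexity characterization of the algorithmic
% dimension of an individual binary sequence S:
\dim(S) \;=\; \liminf_{n \to \infty} \frac{K(S \upharpoonright n)}{n}

% Point-to-Set Principle: the Hausdorff dimension of every set
% E \subseteq \mathbb{R}^n is attained by minimizing, over oracles,
% the relativized dimensions of E's individual points:
\dim_H(E) \;=\; \min_{A \subseteq \mathbb{N}} \; \sup_{x \in E} \dim^A(x)
```

The second identity is what lets classical lower bounds on $\dim_H(E)$ be proven pointwise: one fixes a minimizing oracle and exhibits a single point of $E$ with high relativized dimension.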
I am currently working with colleagues and students to extend these developments, with applications to fractal geometry, ergodic number theory, dynamical systems, chemical reaction networks, and computational complexity.