Open Questions: Mathematics


The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve.

Eugene Wigner



Introduction

The branches of mathematics

Open questions


Number theory

The Riemann hypothesis

Algebraic number theory

Elliptic curves and modular forms

The Langlands program

Symmetry

Geometry and topology

Algebraic geometry

Knot theory

Noncommutative geometry

Mathematics and physics

Quantum field theory

Quantum geometry

Mathematical analysis and differential equations

Chaos theory and dynamical systems

Self-organization and complex systems

Mathematics and biology

Combinatorics, graph theory, and computation

Algebra


Recommended references: Web sites

Recommended references: Magazine/journal articles

Recommended references: Books

Introduction

Eugene Wigner was a physicist. Ironically, Bertrand Russell, who was a mathematician (among other things), was seemingly less charitable towards the subject: "Mathematics may be defined as the subject in which we never know what we are talking about, nor whether what we are saying is true."

It is very likely, however, that Russell didn't mean to deprecate mathematics by this witticism, for he also said, "Mathematics, rightly viewed, possesses not only truth, but supreme beauty -- a beauty cold and austere, like that of sculpture."

The point is that the ultimate nature of mathematics itself is something that we do not understand very well, despite its dual role in having both profound utility for formulating physical laws and yet total abstraction from the everyday world of experience. It may be that these two apparently contradictory aspects of mathematics are not unrelated. This circumstance is itself an intriguing "open question".

Philosophers of mathematics over the centuries -- and they have been at it since before the time of Plato, more than 24 centuries ago -- are responsible for the conversion of millions of trees into philosophy books. It is not clear that the world is especially better off for their labors. But however that may be, we can note that most of their effort has gone into trying to describe what mathematics is "about", and whether what it is about (whatever that may be) is in some sense "real", or simply a construct of the imagination of mathematicians.

Such questions won't concern us here. Instead, we'll consider the question of what mathematics is "about" from a different angle. What is it that mathematicians study? Essentially, two things are obvious: numbers and geometry. That has been true since the time of the ancient Greeks, and even before that, the time of the Egyptians and the Babylonians. You won't go far wrong if you consider number and geometry still to make up the core of what mathematics is about.

To some extent we might add to this the study of logic itself. Of course, mathematicians have, at least since Euclid, relied heavily on logic. But it was only with the work of Gottfried Leibniz (1646-1716) and his dream of creating symbolic logic that logic itself became an object of mathematical investigation. This project matured slowly, with important contributions much later from people like George Boole (1815-64) and (yes) Bertrand Russell (1872-1970). Into this merged the set theory developed by Georg Cantor (1845-1918) and others. Logic and set theory are now well established as the "foundations" of modern mathematics, but as objects of study in themselves they have retreated to niche specialties. Perhaps logic will someday be extended further to better cope with the strangeness of quantum mechanics -- but at present that is speculation.

If you wish to be a little more abstract, you might say that mathematics is "about" relationships. Geometry was originally about things like points and lines and circles -- and their relationships. Much later, by no longer treating length (or size, distance) as fundamental, geometry morphed into topology, which considers only the abstract relationship of "nearness" between points. In another direction, by considering the relationship between an object and what it becomes when subject to transformations such as translations, rotations, and reflections, geometry gave rise to the highly fruitful concept of symmetry.

Similarly, abstractions of the concept of number led to modern algebra -- concepts such as groups, rings, fields, vector spaces, and abstract algebras. These diverse mathematical constructs are all examples of systems of abstract objects governed by a variety of axioms that specify the relationships among the objects. In another direction, efforts by Leibniz and Isaac Newton (1642-1727) to develop the mathematics -- calculus -- necessary for expressing physical laws led to the concept of functions and thence to what is now called mathematical analysis -- essentially a highly generalized form of calculus that incorporates a great deal of abstract topology.

Already you can see that the line dividing the mathematics of geometrical form from the mathematics of number has become blurred. Geometry, except for its most abstract incarnation as topology, depends on numbers to quantify concepts such as length, angle, area, volume, curvature. But in the other direction, mathematical analysis depends on topology for a rigorous foundation. Analysis has also evolved from dealing with functions whose domain is our familiar "Euclidean" space, to functions which "live" on abstract geometric constructs known as "manifolds". This process can also be turned around, to enable the construction of something sufficiently like a type of "geometry" to be the subject of the new field called "noncommutative geometry" -- even though it is much farther removed from everyday ideas of geometry than even "non-Euclidean geometry". We will discuss noncommutative geometry elsewhere among these pages.

Another way in which the line dividing the study of number from the study of shape is blurred can be seen in the concept of symmetry. Mathematically, one speaks of symmetry using the language of "group theory", which is a subfield of algebra that applies equally productively to (among other things) both numbers and geometric figures.

Perhaps another way, then, to describe what mathematics is "about" is to say that it is about "patterns". These may be patterns which can be discerned in things belonging to the "real" world, such as animals or galaxies. In this case, we are talking about mathematical biology or mathematical physics. This is the stuff of "applied" mathematics. However, and without meaning to detract from applied mathematics, this draws on the concepts and techniques of "pure" mathematics, which concerns itself with discovered (or invented, if you prefer) patterns of abstract things defined axiomatically -- diverse things with exotic names such as "Kähler manifolds", "Banach spaces", "Hopf algebras", and "cohomology groups".

That's roughly as far as it seems worthwhile to discuss what mathematics is "about", without getting into specifics. So let's turn now to more specific things.


The branches of mathematics

It is probably fair to say that the content and nature of modern mathematics are less familiar to the average scientifically literate person than is the case for other scientific disciplines like physics, astronomy, and biology. There is a general awareness of the sorts of things that researchers in these other disciplines are working on.

However, this is not so, by and large, for mathematics. Anyone with a good university education in science has learned some calculus, linear algebra, probability and statistics, and perhaps something of combinatorics. Yet undergraduate courses in these topics hardly touch on the questions that interest research mathematicians today. At the same time, there are a few research areas which have received some public notice, such as the theory of chaos and complexity. But this situation need not persist. It is quite possible to sketch out the lay of the mathematical landscape for anyone who is willing to take a little time to get his or her bearings.

It has to be admitted to begin with that boundaries between different branches of mathematics (as with almost any other subject) are somewhat fluid and change over time. In some cases, specialized branches disappear entirely, or at least cease to attract any active interest. And of course, new branches appear from time to time. In mathematics, "chaos theory", noncommutative geometry, and the theory of computational complexity are new arrivals within the last few decades.

Nevertheless, the main branches have been fairly stable over the last century or two. What we can recognize as the main branches often have origins that go back many centuries. But the modern form of these branches mostly began to appear in the 19th century and became well delineated in the 20th century. There's good reason to think that these branches will have grown together and will look very different 100 years from now -- but we can hardly hope to know the future, so let's just look at the present.

Mathematical analysis

The branch whose modern shape goes back the furthest, perhaps, is mathematical analysis. It can be traced back to the calculus of Newton and Leibniz. The primary object of study is functions -- correspondences between elements in one set (the "domain") and elements in another (the "range"). For example, in physics, the motion in space of a physical object can be represented as a function that maps time into position. Newton's laws of motion were formulated in terms of such functions and their derivatives, with the first derivative representing velocity and the second derivative representing acceleration. Integration of functions, which is the operation inverse to differentiation, is also important. Integration is used to define the area of irregularly shaped objects and also to define concepts of physics, such as work.
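As a concrete illustration (the falling-body example here is ours, not part of the discussion above), the relationship between position, velocity, and acceleration can be seen numerically by approximating derivatives with difference quotients:

```python
def s(t):
    # Position (in metres) of a freely falling object, taking g = 9.8.
    return 4.9 * t ** 2

def derivative(f, t, h=1e-6):
    # Symmetric difference quotient approximates the first derivative f'(t).
    return (f(t + h) - f(t - h)) / (2 * h)

def second_derivative(f, t, h=1e-3):
    # Central difference approximates the second derivative f''(t).
    return (f(t + h) - 2 * f(t) + f(t - h)) / (h * h)

velocity = derivative(s, 2.0)       # exact value is 9.8 * 2 = 19.6
accel = second_derivative(s, 2.0)   # exact value is 9.8
print(velocity, accel)
```

The first derivative of position recovers the velocity at t = 2, and the second derivative recovers the constant acceleration, just as in Newton's formulation.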

Calculus, as formulated by Newton and Leibniz, was exceptionally useful, but mathematically unrigorous. About 200 years or more were required to provide rigorous formulations of the notions of derivatives and integrals sufficiently general for use in all the various sorts of situations in which the concepts were applied. This process is still going on. In quantum field theory, for instance, there is a notion of integration (Feynman path integrals) which formally yields correct results but still lacks rigorous mathematical foundations.

Set theory and point set topology were both byproducts of putting analysis on a rigorous foundation. They provided the framework which made it possible to describe functions as mappings between sets and to precisely specify important notions, such as what it meant for a function to be "continuous". Point set topology was an abstraction from geometry which retained almost nothing except for the notion of what it meant for points to be "near" each other. In these terms, a continuous function is simply one which carries points which are near each other into other points which are also near each other (in the respective topologies of the function's domain and range).

Physical situations are still modeled, just as they were in Newton's work, in terms of differential equations. That is, the unknown in the equation is a function rather than a number. The equation is usually some algebraic combination of one or more functions and their various derivatives. What is required is to find functions that satisfy the equation (or system of equations), given certain initial conditions, such as the initial position and velocity of an object. For a Newtonian equation of motion, the function which is a solution describes the trajectory, over time, of the object -- the orbit of a planet, perhaps. One would like to find all possible functions that satisfy the equation and to be able to write them down in a form that one can compute with. In some cases, there may be a unique function which is the solution, and one would like to be able to identify such cases.
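A minimal sketch of what "solving a differential equation given initial conditions" means in practice (the numbers and the use of Euler's method are our choices for illustration):

```python
g = 9.8
y, v = 100.0, 0.0        # initial conditions: position 100 m, velocity 0

dt = 0.001
for _ in range(1000):    # step the equations forward from t = 0 to t = 1
    y += v * dt          # y' = v
    v += -g * dt         # v' = -g  (Newton's second law for free fall)

exact = 100.0 - 0.5 * g * 1.0 ** 2   # the closed-form solution at t = 1
print(y, exact)          # both close to 95.1
```

Here a closed-form solution happens to exist, so we can check the numerical trajectory against it; for the harder equations discussed below, such numerical stepping is often all that is available.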

The simplest differential equations of Newtonian mechanics involve merely a function and its first and second derivatives. These are usually easy to solve. But things can become complicated quickly. When several interacting objects are considered, the equations involve multiple variables (position and velocity of the several objects), so "partial derivatives" are required, hence the equations are called "partial" differential equations rather than "ordinary" differential equations.

The two main new physical theories of the 20th century -- general relativity and quantum mechanics -- are both formulated in terms of partial differential equations (Einstein's equation, Schrödinger's equation, Dirac's equation, etc.). Not surprisingly, perhaps, such equations are very difficult to solve explicitly. (But there are techniques for calculating answers using computers, without any explicit solution.) In many cases, it may not be possible to prove that solutions of the equations even exist. (Physically this would mean that the equation is not a correct exact formulation of the problem, though it might be a good approximation.) These difficulties are not an inadequacy of general relativity or quantum mechanics specifically. Even for a classical problem such as fluid flow, which is described by the Navier-Stokes equations, it may be impossible to find solutions or prove that solutions exist.

One way to look at the development of mathematical analysis in the 20th century is to regard it as a major series of improvements in mathematical technology for dealing with differential equations of many kinds. The first step is to take a different point of view on differential equations and to regard the equation as defining an "operator" -- that is, a kind of transformation that applies to a set of functions rather than to a set of points -- a "function of functions". The second step is to add both an algebraic and a topological structure to the set of functions on which the operator acts. Taking these two steps, one gets what is known as "functional analysis", a characteristic product of 20th century mathematics.
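The idea of an operator as a "function of functions" can be sketched directly (a toy illustration of the concept, not a piece of functional analysis proper): differentiation takes a function as input and yields another function as output.

```python
def D(f, h=1e-6):
    # The differentiation "operator": given a function f, return a new
    # function approximating its derivative.
    return lambda t: (f(t + h) - f(t - h)) / (2 * h)

def square(t):
    return t * t

slope = D(square)             # applying the operator yields a function
print(slope(3.0))             # close to the exact derivative, 6.0
```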

David Hilbert (1862-1943) -- the mathematician who stated the famous 23 "Hilbert problems" in 1900 -- was a major contributor to functional analysis. His work gave us "Hilbert space". Any finite dimensional vector space with an inner product (as studied in elementary linear algebra courses) is (trivially) a Hilbert space. The more interesting examples of Hilbert spaces, however, are infinite dimensional vector spaces, each "point" of which is a function. The axioms of linear algebra define the algebraic structure of a Hilbert space. These axioms allow any two elements to be added together to yield another element of the space. Any element can also be multiplied by a scalar (usually a real or complex number) to give another element. Finally, the axioms specify that there is an "inner product" (or "scalar product") between any two elements, which results in a scalar.

The inner product is very important, because it's not only an algebraic construct, but it also gives the space a topology. It does this by making it possible to define a "norm" on the space, which is like the absolute value of an ordinary (or a complex) number -- it tells "how big" the element is. This, in turn, makes it possible to define a "metric" on the space, that is, a scalar-valued function that says how far apart two points of the space are. (The distance between two elements is defined as the norm of the difference of the elements.)
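In the familiar finite-dimensional case this chain -- inner product, then norm, then metric -- can be written out directly (our own small example; in a Hilbert space of functions the sums below become integrals):

```python
import math

def inner(u, v):
    # The inner (scalar) product of two vectors.
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    # The norm: ||u|| = sqrt(<u, u>), "how big" the element is.
    return math.sqrt(inner(u, u))

def distance(u, v):
    # The metric: d(u, v) = ||u - v||.
    return norm([a - b for a, b in zip(u, v)])

u, v = [1.0, 2.0, 2.0], [1.0, 2.0, 5.0]
print(norm(u), distance(u, v))    # 3.0 and 3.0
```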

Given the machinery of Hilbert space, it is possible to translate a problem formulated as differential equations into one about operators on the Hilbert space. What this buys for you is the ability to talk about properties of the solutions of the equations without finding the solutions explicitly. This is because it is possible to prove powerful general theorems about Hilbert space, such as the "spectral theorem". This theorem states that for certain types of operators there exist elements with the special property that the result of the operator applied to the element is simply to multiply the element by a scalar. In other words, the operator may change the length but not the direction of these special vectors. Such elements are called "eigenvectors" of the operator, and the scalar multiples are called "eigenvalues". What is so useful about that? Well, for instance, in quantum mechanics it turns out that the possible energy levels of a quantum system are precisely the eigenvalues of an appropriately chosen operator. And there are ways to compute these numbers without explicitly solving a complicated partial differential equation.
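The eigenvector property is easy to see in a small example (the matrix below is our choice; in quantum mechanics the operator would act on an infinite-dimensional space):

```python
def apply(A, v):
    # Apply a 2x2 matrix (operator) to a vector.
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[2.0, 1.0],
     [1.0, 2.0]]

v1 = [1.0, 1.0]
print(apply(A, v1))   # [3.0, 3.0] = 3 * v1: eigenvector with eigenvalue 3
v2 = [1.0, -1.0]
print(apply(A, v2))   # [1.0, -1.0] = 1 * v2: eigenvector with eigenvalue 1
```

The operator stretches each eigenvector without changing its direction; the scalars 3 and 1 are the eigenvalues -- the analogue, in quantum mechanics, of the allowed energy levels.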

Mathematics normally progresses by generalizing useful results. Given Hilbert space as a model, there are generalizations that don't have all the structure of a Hilbert space, such as the inner product. But one can still consider vector spaces that are assumed to have a norm, without specifying exactly where the norm comes from ("normed linear spaces"). Or, taking things a step further, one may suppose merely the existence of a topology (yielding "topological vector spaces"). There is a good reason why such generalizations are worthwhile to make. The reason is that important theorems may be proven in the more general case, without having to make stronger assumptions. The consequence is that the proofs are clearer, since they rest on fewer assumptions (though they aren't necessarily "easier"), and the resulting theorems are more widely applicable.

So far in talking about functions we haven't been specific about the domain and range assumed for the functions. Normally, in problems of classical physics, the domain and range would consist of the real numbers (or more generally, n-space R^n with real coordinates). But from a mathematical perspective, the complex numbers might just as well be used. (By complex numbers we mean, of course, numbers of the form x+iy, where x and y are real and i = √(-1).) Most of the concepts of calculus, including derivatives and integrals, can be defined almost the same way for complex-valued functions of complex variables as they are in the case of real variables. But some of the results are strikingly different in the complex case. For example, if a complex function has even a single derivative, it has derivatives of all orders. Functions of this kind are also known as "analytic" or "holomorphic" functions, and the study of them is known as "complex analysis", in contrast to "real analysis".
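A small illustration (our example, using Python's built-in complex arithmetic): for a holomorphic function such as f(z) = z², the difference quotient approaches the same limit no matter from which direction in the complex plane the increment h shrinks to zero -- a much stronger condition than real differentiability.

```python
i = 1j
print(i * i)          # (-1+0j): i squared is -1

def f(z):
    return z * z      # a holomorphic function

z0 = 2 + 3j
for h in (1e-6, 1e-6j, 1e-6 + 1e-6j):   # approach from three directions
    q = (f(z0 + h) - f(z0)) / h
    print(q)          # each quotient is close to f'(z0) = 2*z0 = 4 + 6j
```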

Complex analysis was developed largely in the 19th century by people like Augustin-Louis Cauchy (1789-1857) and Bernhard Riemann (1826-66). Another issue that is more prominent with complex functions is that some natural functions even as simple as the complex square root or the natural logarithm may be multiple-valued. This situation requires special handling in order to avoid ambiguity, and Riemann worked out how to do it -- giving us "Riemann surfaces" in the process. This work provided key ideas necessary to define more general geometric objects known as "manifolds", a central idea in modern topology, as discussed below.

Most major issues in complex analysis were resolved by the early 1900s. Today, complex analysis remains extremely useful as a tool in both pure and applied mathematics, but active research tends to focus on specialized concerns, such as the theory of functions of several complex variables. Functional analysis is sufficiently generalized that its results apply equally well to real or complex functions (provided the hypotheses of its theorems are met). Consequently, it is possible to do a lot of analysis today without necessarily specifying whether real or complex functions are involved.

Algebra

We have already seen, with vector spaces (with or without an inner product), how algebraic concepts are useful in other branches of mathematics. The concept of a vector space is very typical of most of algebra in the 20th century. One considers a set of abstract objects, describes how they interact with each other by means of axioms, and then goes on to prove general propositions about the resulting algebraic construct.

Perhaps the most commonly used concept in algebra is that of a "group". A group is a mathematical system consisting of a set of elements and one operation between any two elements of the set. If "∘" denotes the operation, in a group there are three requirements:

  1. the operation should be associative: x∘(y∘z) = (x∘y)∘z;
  2. there should be an identity element "e": e∘x = x∘e = x for all x;
  3. every element should have an inverse: x∘x⁻¹ = x⁻¹∘x = e.
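The three axioms can be checked mechanically for a small example (the example -- the integers mod 5 under addition -- is our choice, not taken from the text above):

```python
elements = range(5)

def op(x, y):
    # Addition modulo 5, the group operation.
    return (x + y) % 5

# 1. the operation is associative
assert all(op(x, op(y, z)) == op(op(x, y), z)
           for x in elements for y in elements for z in elements)
# 2. e = 0 is an identity element
assert all(op(0, x) == x == op(x, 0) for x in elements)
# 3. every x has an inverse, namely (5 - x) % 5
assert all(op(x, (5 - x) % 5) == 0 for x in elements)
print("all three group axioms hold for the integers mod 5")
```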

These group axioms are quite general, so it turns out that many things in mathematics, as well as in the "real" world, provide examples of groups. The first example in mathematics came up in the theory of equations and was discovered by Évariste Galois (1811-32). Because of him, we have "Galois groups", which describe symmetries among the roots of a polynomial equation. So powerful was the technique that it helped resolve long-standing problems, such as the (non)solvability of polynomial equations of degree greater than 4 by radicals, and the impossibility of various "ruler and compass" constructions, such as the trisection of angles.

The concept of a "field" also grew out of the study of solving polynomial equations. A field is a straightforward abstraction of already known classes of numbers, such as the rational numbers (ratios of integers), the real numbers (tricky to define, but conceptually just any number that represents some sort of measurement), and complex numbers. In a field there are two operations, analogous to addition and multiplication, and the usual rules of commutativity, associativity, and distributivity apply, as well as the existence of identity and inverse elements for both operations (with the sole exception that there is no multiplicative inverse of the additive identity element -- no division by 0). Fields are involved in the theory of equations, because the coefficients of the polynomials are assumed to come from a specific field, and the solutions will belong to some "extension" field which is obtained by a straightforward process of "adjoining" the roots to a smaller field, if necessary.
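Besides the familiar number systems, finite fields also satisfy these axioms (an additional example of ours, not mentioned above): the integers modulo a prime p, where multiplicative inverses exist for every nonzero element and can be computed via Fermat's little theorem as a^(p-2) mod p.

```python
p = 7   # arithmetic modulo a prime forms a field

for a in range(1, p):
    inv = pow(a, p - 2, p)        # multiplicative inverse of a mod 7
    assert (a * inv) % p == 1     # a times its inverse is the identity
print("every nonzero element mod 7 has a multiplicative inverse")
```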

Somewhat later it was recognized that groups could describe geometric symmetries as well. This led eventually to the determination and classification of all possible crystal symmetry types.

Another 19th century source of algebraic ideas was the solution of systems of linear equations (rather than polynomial equations). It was found that the algorithms which were already known for solving linear systems could be expressed efficiently by means of rectangular arrays of the equations' coefficients -- the objects now known as matrices. But matrices consisted simply of rows (or columns) of ordered lists of numbers -- vectors. The matrices could be interpreted as a description of a transformation (mapping) of spaces that consisted of such vectors. This led to the algebraic notion of vector spaces.

Axiomatically, a vector space can be described as a group with additional structure. The basic group structure comes from the addition of vectors. It has an additional property that an arbitrary group doesn't have: the operation is commutative, i. e. x+y = y+x for any vectors x and y. (It is customary to use "+" to denote a commutative group operation. A commutative group is also described as "Abelian", after Niels Henrik Abel (1802-29), who was the mathematician that inspired Galois' work on the question of solvability of polynomial equations.) Besides addition, a vector space also allows multiplication of any vector by a scalar (i. e. a real or complex number). An axiom provides that addition and scalar multiplication are compatible in the sense that c(x+y) = cx + cy for any scalar c and vectors x, y.

The notion of vectors and vector spaces made it possible to "algebraize" geometry of any number of dimensions, much as René Descartes (1596-1650) did with his Cartesian coordinates for plane geometry. Geometric transformations such as rotation and uniform stretching could be represented by matrices, and sets of these transformations -- or equivalently matrices -- could form groups of their own. Such matrix transformation groups are in general not commutative, since matrix multiplication is not commutative.
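The noncommutativity is easy to exhibit with two concrete transformations (the particular matrices -- a 90-degree rotation and a reflection -- are our chosen example):

```python
def matmul(A, B):
    # Multiply two 2x2 matrices; composition of the transformations.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

R = [[0, -1],
     [1,  0]]      # rotation of the plane by 90 degrees
F = [[1,  0],
     [0, -1]]      # reflection across the x-axis

print(matmul(R, F))   # [[0, 1], [1, 0]]
print(matmul(F, R))   # [[0, -1], [-1, 0]]
```

Rotating and then reflecting gives a different transformation from reflecting and then rotating, so the group these matrices generate is not commutative.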

It wasn't long before mathematicians such as Sophus Lie (1842-99) realized that groups of transformations were applicable to the study of the solution of differential equations. But these transformation groups were interesting on their own -- and became known as Lie groups. Matrix groups consisting of matrices having real or complex numbers as entries also have a natural topology: two matrices are "close" to each other if all their entries are "close" as real or complex numbers. Lie groups are thus an example of more general "topological groups", which have a rich theory due to the interaction of the group and topological structures.

Not only do (certain) sets of matrices form groups, but it turns out that any abstract group can be realized as a suitable group of matrices. This process is called "representation" and led to a rich theory of group representations. Such representations are quite useful, because they make it possible to do explicit calculations with any group, and as a result they enable the proof of powerful general theorems about group structure. Group representations are also fundamental in the application of group theory to quantum mechanics, so they were studied quite extensively in that connection. Group representations can be used to describe very clearly and efficiently phenomena as diverse as the periodic table of chemical elements and the theory of elementary particles such as quarks and leptons. (Many physicists, however, disliked and mistrusted this application of abstract algebra, even long after it had proven its worth.)

One final 19th century application of abstract algebra we will mention was in "algebraic number theory". This is the study of finding solutions of polynomial equations by means of numbers that are generalizations of ordinary integers. This arises naturally in questions of solving "Diophantine equations", that is, finding integer solutions of polynomial equations, such as Fermat's equation: xn + yn = zn for n ≥ 3. (Of course, we know now there aren't any such solutions.)
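A brute-force search (the search bound of 50 is an arbitrary choice of ours) illustrates what a Diophantine question looks like computationally, and is consistent with the fact that Fermat's equation has no solutions for n = 3:

```python
N = 50
solutions = [(x, y, z)
             for x in range(1, N) for y in range(1, N) for z in range(1, N)
             if x**3 + y**3 == z**3]
print(solutions)    # [] -- no solutions among small positive integers
```

Of course, no finite search can prove there are no solutions at all; that is exactly why the algebraic machinery described next was developed.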

Ernst Kummer (1810-93) was especially interested in the Fermat problem, and at one point he thought he had it solved. But he later realized he was mistaken to assume that algebraic integers factor uniquely into primes the same way ordinary integers do. The assumption was in fact false. Nevertheless, algebraic integers like ordinary integers form a structure now called a "ring". (A ring is like a field, except that multiplicative inverses don't necessarily exist.) Some rings have unique factorization and some don't. But Kummer was able to define a particular substructure of a ring, called an "ideal", and any ideal could be expressed uniquely as a product of "prime ideals".
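The failure of unique factorization can be seen in a standard textbook example (the ring Z[√-5] is not discussed explicitly above; we represent a + b√-5 as the pair (a, b)): in this ring, 6 factors in two genuinely different ways.

```python
def mul(x, y):
    # (a + b√-5)(c + d√-5) = (ac - 5bd) + (ad + bc)√-5
    a, b = x
    c, d = y
    return (a * c - 5 * b * d, a * d + b * c)

def norm(x):
    # N(a + b√-5) = a^2 + 5b^2; the norm is multiplicative.
    a, b = x
    return a * a + 5 * b * b

u, v = (1, 1), (1, -1)     # the elements 1 + √-5 and 1 - √-5
print(mul(u, v))           # (6, 0): their product is 6, just like 2 * 3
print(norm((2, 0)), norm((3, 0)), norm(u), norm(v))   # 4 9 6 6
# No element of the ring has norm 2 or 3 (a^2 + 5b^2 can never equal
# 2 or 3), so 2, 3, and 1 ± √-5 are all irreducible -- yet the two
# factorizations of 6 are genuinely different.
```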

This trick allowed Kummer to make some progress on the Fermat problem, but unfortunately not enough to solve it. His theory of ideals, however, went on to be vastly important in 20th century abstract algebra and for algebraic number theory in particular. David Hilbert, who gave us Hilbert spaces, scored one of his earliest triumphs by producing an elegant synthesis of algebraic number theory known up to that time (1896). That synthesis, built on the work of Kummer and others, became a foundation of much more advanced work on algebraic number theory in the 20th century.

Many, if not most, of the details of the theory of groups, rings, fields, and vector spaces were, as we've seen, discovered in the 19th century. But the results were scattered and expressed using inconsistent notations and concepts from one area to another. It was for the 20th century to see this situation rectified with publications such as the elegant Modern Algebra of B. L. van der Waerden (1903-96) in 1930.

In the 20th century, abstract algebra has grown steadily more abstract. But it has continued to be studied more for its application to other branches of mathematics than for itself. In addition to algebraic number theory, algebra is especially important in the study of algebraic geometry and algebraic topology (as one would expect from the names of those subjects). A field known as "commutative algebra" (the study of commutative rings and modules over such rings) is particularly important for algebraic geometry. Another field, "homological algebra", is an important abstraction of the notions of "homology" and "cohomology" that originated in algebraic topology.

Nevertheless, various specialized areas of algebra have been actively pursued for their own sake. The theory of groups is an example. In the last few decades a complete classification of finite "simple groups" has been obtained -- at great effort. A simple group is something like a prime number, in that all finite groups can be constructed in a standard way from the simple groups. It has turned out that most simple groups occur in families related to Lie groups. But there are a few "sporadic" simple groups, some quite large, which do not fit any particular pattern. There are hints that some of these sporadic groups may be important in other branches of mathematics, such as the theory of automorphic functions.

Topology and geometry

Although geometry is (together with number theory) just about the oldest major branch of mathematics it is also, in its present form, one of the newest. Of course, geometry, in the sense that Euclid knew, has little new to offer. Indeed, one of the major discoveries of the 19th century (by people such as Nikolai Lobachevsky (1793-1856), János Bolyai (1802-60), and Bernhard Riemann) was that such a thing as non-Euclidean geometry existed. Riemann contributed by far the most important work, but his career was unfortunately short and his results exist mainly in a sketchy form.

It remained for Henri Poincaré (1854-1912) in the last part of the 19th century and the first part of the 20th to create, almost single-handedly, modern geometry and topology. Poincaré was as prolific and universal in his interests as Hilbert (especially allowing for his shorter career). What Hilbert was to modern analysis, Poincaré was to modern geometry, only more so.

As far as current usage is concerned, the terms "topology" and "geometry" are somewhat interchangeable. Point set topology is a thing apart. It axiomatizes the notion of "nearness" between points of a topological space. It plays a significant role in providing rigorous foundations for modern analysis. But beyond that it doesn't loom large on the stage of modern mathematics. When a mathematician today refers to topology or geometry, what is meant is some aspect or another of the theory of "manifolds".

A manifold (of which there are many different types) generalizes the idea of a geometric object such as a curve, a surface, or some analogue of a surface in higher dimensions. A manifold also generalizes the notion of "Euclidean" space (the "flat" 2-, 3-, or higher-dimensional space of everyday experience) in being only "locally" like Euclidean space in a certain precise sense, though not (necessarily) so "globally".

The term "topology" tends to refer to the study of manifolds that have no additional structure. This is the area that Poincaré mainly worked on. With a topological manifold, one is mainly concerned about properties of the object which are invariant under any bending or stretching of the object, but not cutting or tearing -- properties such as its dimensionality and the number of holes it has. For this reason, topology is often referred to informally as "rubber sheet geometry". With such an object, one has no notion of distance, area, volume, angle, or curvature, for the simple reason that these can all change if the object is deformed by an allowable transformation. Formally, the allowable transformations are known as "homeomorphisms" -- 1:1 continuous maps between topological spaces whose inverses are also continuous.

"Geometry", on the other hand, tends to refer to the study of manifolds that have a "metric" structure, i.e. a notion of distance and angle. (The close relationship is given away by the presence of the root "met-", meaning measure, in both "geometry" and "metric".) This notion of distance arises from a "Riemannian" metric, named after Riemann, who introduced it in his work in this area. Riemannian manifolds are considered equivalent under transformations only provided all distances are preserved. They can, therefore, be bent (in certain ways) but not stretched. This, at least seemingly, is a much more specialized situation.
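To make "metric structure" slightly more concrete, a Riemannian metric assigns to each point of the manifold a smoothly varying rule for measuring infinitesimal lengths and angles. In local coordinates it is conventionally written as follows (a standard formula, included here only for orientation):

```latex
ds^2 = \sum_{i,j} g_{ij}(x)\, dx^i\, dx^j ,
\qquad \text{e.g., on a sphere of radius } R: \quad
ds^2 = R^2 \left( d\theta^2 + \sin^2\!\theta\, d\phi^2 \right)
```

Distances, angles, and curvature on the manifold are all computed from the coefficient functions g_ij.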

It's not quite so specialized as it might seem, however, because there is an even more stringent notion often applied to manifolds. This notion is that of "differentiability" or "smoothness". The intuitive idea is that a smooth or differentiable manifold has no sharp corners, edges, or creases. The required definition is rather technical, but the net result is that a differentiable manifold is one that allows calculus to be done just as in ordinary Euclidean space. The reason that an ability to do calculus is important is that the classical mechanics of Newtonian physics -- as well as much of the rest of theoretical physics -- can be formulated in terms of manifolds where calculus "works". Doing physics this way provides an immense conceptual unification of the subject.

Because differentiable manifolds have the largest number of applications, it is on them that attention most often focuses. Differentiable manifolds have at each point what is known as a "tangent space", which is quite analogous to the tangent line to a smooth curve. Because the tangent space really is a copy of Euclidean space, it has a natural metric. It is therefore plausible that a differentiable manifold can be given a Riemannian metric, though this theorem is not entirely simple to prove. So every differentiable manifold can be made into a Riemannian one. Not every manifold can be smoothed, however -- there are topological manifolds which admit no differentiable structure at all, and hence no Riemannian metric.

"Differential topology" and "differential geometry" are often referred to as subfields of modern geometry (and/or topology). Since a topological manifold that has a differentiable structure also has a Riemannian geometric one, the distinction is not very great.

On the other hand, "algebraic topology" and "algebraic geometry" are quite different animals. Algebraic topology refers to the use of algebraic techniques to study the properties of manifolds. These techniques involve devising algebraic constructs (numbers, groups, or other types of algebraic objects) that are invariant under allowable topological transformations. This makes it possible to say for sure that manifolds which have different algebraic invariants cannot be equivalent. Unfortunately, having the same invariants doesn't always guarantee topological equivalence. (This is what the famous Poincaré conjecture is all about.)

In sharp contrast, algebraic geometry is a subject that really straddles the borderline between algebra and geometry. What it's about is studying the solution sets, in some specified algebraic field, of a single polynomial equation or a system of simultaneous equations. So on the face of it, the subject is very algebraic, being a generalization of finding solutions of systems of linear equations and finding roots of polynomials. A solution set of one or more polynomial equations is called an algebraic variety. What makes the subject topological is the fact that an algebraic variety can be shown to be a differentiable manifold, away from its singular points (such as self-intersections).
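As a toy illustration of "solution sets in a specified field" (an example we've added, not one from the text): the single equation x^2 + y^2 = 1 defines a variety that is the smooth unit circle over the real numbers, but over a finite field F_p it is just a finite set of points, which can be enumerated directly:

```python
# The "circle" variety x^2 + y^2 = 1, viewed over the finite field F_p.
# Over the reals this is a smooth 1-dimensional manifold; over F_p it
# is a finite point set whose size can be counted by brute force.

def circle_points_mod_p(p):
    """Return all pairs (x, y) with x^2 + y^2 = 1 (mod p)."""
    return [(x, y) for x in range(p) for y in range(p)
            if (x * x + y * y) % p == 1]

# A classical count: for an odd prime p the circle has p - 1 points
# when p = 1 (mod 4), and p + 1 points when p = 3 (mod 4).
for p in (5, 7, 11, 13):
    print(p, len(circle_points_mod_p(p)))
```

Even this tiny example shows the interplay the paragraph describes: the defining data is pure algebra, but the questions asked about the solution set (its shape, its size) are geometric.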

Algebraic geometry is a notoriously complicated and difficult subject. The best concepts with which to approach it and the tools that can be used to study it were actively under development throughout the whole 20th century (as well as, to a limited extent, earlier). The techniques involve sophisticated tools drawn from many advanced areas of analysis, algebra, and topology. Unsurprisingly, therefore, it remains a very active area, and one which attracts the most ambitious and skillful mathematicians.

Modern topology, or geometry, or whatever it is called, is no longer about only what we think of as conventional geometric objects. Einstein's theory of general relativity, for instance, is about physical gravitation and mass. Before Einstein, physicists never thought about those concepts as being geometrical, but Einstein showed that in fact they were. Or rather, that the theory of Riemannian geometry as developed by Riemann more than 50 years earlier provided a perfect way to describe gravity and mass. Does that mean that the theory of gravity has been demonstrated to be "nothing but" geometry?

However that question may be answered, the fact is that looking at any number of things from the geometric point of view is extremely fruitful. Most advanced theories in physics, such as superstring theory, are highly geometric. The underlying reason this works is that the theories can be expressed in terms of differential equations. And those equations, in turn, can be regarded as describing geometric entities, such as higher-dimensional manifolds or operators on spaces of functions defined on such manifolds.

The idea that the universe may be understood in terms of geometry is an old one. That idea still makes a great deal of sense. The only thing changed is that the geometry used in this understanding is far more subtle and powerful than that of Johannes Kepler's attempt, about 400 years ago, to base cosmology on the five Platonic solids.

Other branches

Are there other important branches of mathematics? How about number theory, say?

Yes, of course there are other branches. Number theory is one of them. It has a history that's probably as long as geometry's. Three books of Euclid's Elements, in fact, deal with the theory of numbers. This is rather interesting when you stop to think about it, as number theory certainly had far less practical applicability than geometry. In fact, it probably had even fewer practical applications in Euclid's day than now. An interest in numerology and some of the more esoteric ideas of the Pythagoreans perhaps explains the presence of number theory in the Elements.

However that may be, number theory remains a very active area of mathematical research even now. It's a little different from the other branches already discussed in that it doesn't provide a lot of powerful general techniques or theorems useful in the other branches. But it has definitely motivated a lot of work and important theorems in the other branches -- which is more than adequate justification for the amount of effort expended on number theory, even without considering its intrinsic interest. In analysis it has motivated studies of many topics in complex analysis especially, such as the theory of Riemann's zeta function, Riemann surfaces, and various kinds of special functions ("automorphic", "modular", "algebraic"). The motivation for many developments in algebra is obvious, especially the theory of rings, ideals, and the "cohomology of groups". And its effect on geometry has been mediated by many questions of algebraic geometry.

OK. How about still other branches? Mathematics deals with many other topics that don't entirely fit in any branches already discussed. For instance:

Most of these subject areas are rather like number theory (although, except for mathematical physics, they haven't attracted the attention of quite as many mathematicians over the years). The resemblance is that, for the most part, they are applications of mathematics or mathematical ways of thinking. They motivate work in other branches of the subject, but they give back little in the way of usable general theorems and techniques in return. (Logic and set theory are somewhat of an exception to this, though not as much as one might think.) This isn't to belittle any of these subjects, but only to describe their relation to the rest of mathematics.

Mathematical physics deserves special mention, as it has motivated so much mathematics since the time of Newton. Its importance as a source of motivation is obvious. But of late it has also contributed quite a few important ideas in areas such as supersymmetry, the Yang-Mills equations and gauge theory, and topological invariants. Mathematicians find themselves continuing to be challenged by the urge to make fully rigorous many ideas of theoretical physics which physicists happily accept simply because the ideas make successful experimental predictions.


Open questions

Mathematics is something of an extreme case. In most other sciences, new theories and ideas tend to replace older ones. Hence in most sciences work that was done long ago (exactly how long depending on the particular science) is usually superseded by work done as recently as a few decades ago, or less. (This may be a slight exaggeration, but not much.)

Mathematics, on the other hand, is cumulative. Valid mathematics that was done in the past is still valid, and often still interesting and useful. You can get some sense of that by how often 19th century mathematicians were mentioned in our discussion of the branches of mathematics. As a result, a great deal of background information on past mathematics is required to understand what contemporary mathematics is about. This is undoubtedly one factor in the perception people have that mathematics is difficult. However that may be, the cumulative nature of the subject does make it hard to describe the current problems and open questions, because so much of the terminology required simply to describe things is unfamiliar.

Therefore, we aren't going to try to summarize here what seem to be the most important open questions. We'll do that in the pages devoted to the various topics, where there is space to explain background and terminology.

Instead, what we want to do here is to talk about what seems to be an accelerating trend in modern mathematics. This trend is what appears to be a large and growing amount of connection between the main branches of the subject. To outsiders, it may look like mathematics continues to fragment into more and more highly specialized subfields, with individual mathematicians increasingly isolated in their own particular niches and increasingly less able to understand or appreciate what is going on elsewhere in mathematics.

There may be some truth to that. And yet, when you look at the amount of cross-fertilization which has gone on in the past -- and which continues to go on now -- it's clear that can't be the whole truth. In fact, there are quite a number of frontier areas of mathematics which significantly build on ideas and results from two or three of the main branches of the subject. We're going to talk about various examples of that here.

Not all of these topics are currently major "open questions" in the sense we normally use. Some of them are questions that have been already resolved, though they remain pregnant with possibilities for further developments. Others are more like research programs, things that many people are working on, without crisply phrased conjectures to cite as open questions. All of them, however, represent important areas of research.

Of course, as just remarked, we can't really begin to explain the concepts in this short space. Most of these topics will be taken up elsewhere in these pages. You can consult the index to look up key words in order to find relevant discussions.

The proof of the Shimura-Taniyama-Weil conjecture

Even the statement of the conjecture, though very short, is mysterious: Every elliptic curve is modular. What is an "elliptic curve"? What does "modular" mean?

If any of the terms here ring bells, it's probably because you've read a bit about Andrew Wiles' 1994 proof of Fermat's Last Theorem. Wiles didn't actually prove FLT. Instead, what he proved was a special case of the STW conjecture, which was already known to imply FLT. This isn't to detract from what Wiles did, because ultimately the STW conjecture is much more important than FLT, and the techniques Wiles used illustrate the theme we're discussing, as does the complete proof of STW by Christophe Breuil, Brian Conrad, Fred Diamond, and Richard Taylor in 1999.
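For orientation (a sketch we've added, with the specific curve y^2 = x^3 - x chosen purely for illustration): an elliptic curve is, roughly, the solution set of an equation y^2 = x^3 + ax + b, and "modular" concerns the numbers a_p obtained by counting the curve's points modulo each prime p. The STW conjecture asserts that these counts are the coefficients of a modular form.

```python
# Count the affine points on the elliptic curve y^2 = x^3 - x over F_p,
# and form a_p = p - (number of affine points).  Modularity says the
# a_p are the Fourier coefficients of a certain modular form.
# (This particular curve is supersingular at primes p = 3 (mod 4),
# which forces a_p = 0 there.)

def a_p(p):
    affine = sum(1 for x in range(p) for y in range(p)
                 if (y * y - (x ** 3 - x)) % p == 0)
    return p - affine

for p in (3, 5, 7, 11, 13):
    print(p, a_p(p))
```

Wiles' achievement, very loosely put, was to show that for a large class of curves this sequence of point counts necessarily arises from a modular form.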

Noncommutative geometry

What on Earth does it mean for some sort of geometry to be "noncommutative"? Good question.

In order to grasp what's involved here, you need one piece of data: an important technique for studying geometric objects consists of using algebraic and analytic information about the object in order to understand it better. There are a variety of ways that mathematicians have done this, but the one in question here involves what are known as "C*-algebras", which (in this setting) are algebras of functions having the geometric object as their domain of definition. From knowledge of the C*-algebra it is possible to prove various theorems about the associated geometric object.

Now, such a C*-algebra of functions is, among other things, a commutative ring. The question that noncommutative geometry poses is this: If instead of a commutative C*-algebra we start with some sort of noncommutative ring, are there analogous theorems which could be proved purely from the properties of the ring -- even without the existence of an underlying topological object?
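The standard first example of a noncommutative ring is a ring of matrices. A minimal sketch (plain Python, our illustration rather than anything from the text) makes "noncommutative" concrete -- the order of multiplication matters:

```python
# 2x2 matrix multiplication over the integers: in general AB != BA,
# so the ring of 2x2 matrices is noncommutative.

def matmul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
print(matmul(A, B))  # [[2, 1], [1, 1]]
print(matmul(B, A))  # [[1, 1], [1, 2]] -- a different result
```

Rings of functions on a space never behave this way (fg = gf pointwise), which is why dropping commutativity means leaving the realm of ordinary spaces behind.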

The Atiyah-Singer index theorem

Michael Atiyah and I. M. Singer published their theorem in 1963. One of the striking things about it is that it generalizes several important theorems which had been known for quite a long time, such as the theorems of Gauss-Bonnet and Riemann-Roch. However, it doesn't generalize these theorems in the usual way, which is by weakening the hypotheses or strengthening the conclusions of the older theorems. Indeed, it's not even obvious at first glance that the Atiyah-Singer theorem is talking about the same sort of things as the older theorems.

Instead, it is a generalization because of its degree of abstraction. When appropriate special cases of the abstractions it deals with are considered, the older theorems just fall out. It is, then, the best kind of abstraction, one that truly identifies a hidden similarity between apparently disparate concepts.

The Riemann hypothesis, L-functions, elliptic curves, class field theory, and more

The Langlands program

Mikhael Gromov's geometry

Hodge theory and the Hodge conjecture

Topology, gauge theory and the Yang-Mills equations



Recommended references: Web sites

Site indexes

The Math Forum: Math Resources by Subject
Large collection of links, presented in alphabetical order within categories, with extensive annotation. It includes an Internet Mathematics Library which has a detailed topic index.
Mathematics Archives Collection of Links
Large collection of annotated links. Subject matter links are organized by topic.
Open Directory Project: Mathematics
Categorized and annotated mathematics links. A version of this list is at Google, with entries sorted in "page rank" order.
Galaxy: Mathematics
Categorized site directory. Entries usually include descriptive annotations. Has many more mathematics links than Yahoo.
Math on the Web
Hosted at the AMS site. Links are organized in many different categories, such as math departments, people, journals. Lists by subject area are not as complete as from other sources.
Knot a Braid of Links
This is a "cool math site of the week" service to the mathematics community provided in the Camel site of the Canadian Mathematical Society. Sites have been added to this list weekly since 1996.
Mathematical BBS
Very extensive, categorized collection of links, maintained by Josef Eschgfäller and others.
PAM Resources in Mathematics
Good quality, mostly unannotated list of links, maintained by the Special Libraries Association.
WWW Virtual Library: Mathematics
Has general information for the mathematical community, especially links to specific mathematical topics. Hosted at Florida State University Mathematics Department.
Mathematical Resources on the Web
Good categorized list, hosted by the University of Florida Mathematics Department.
Mathematics Web Sites
Provided by the Penn State University Mathematics Department. Contains links for math departments, societies, institutes, journals, software, etc. For specific topics, see the subject area pages.
EEVL Mathematics Section
EEVL (which stands for Enhanced and Evaluated Virtual Library) is "The Internet Guide to Engineering, Mathematics, and Computing". The directory of links is organized hierarchically by topic and individual entries are very detailed. The collection is searchable.
Mathematics Resources on the Internet
Extensive categorized list of links, by Bruno Kevius.
Geometry.Net: Pure and Applied Math
Provides results of Web searches for many mathematics topics. Of special interest are the pages on Theorems and Conjectures and Unsolved Problems.
Cut the Knot: Other Sites of Interest
Short list of top-quality sites with extensive descriptions.
Mathematics on the Internet
Many links in a much more interesting and colorful format than usual, by Massoud Malek.
Electronic Sources for Mathematics
Provided by the University of Pennsylvania Department of Mathematics. The links are mostly to organizational sites.
Yahoo Mathematics Links
Annotated list of links, but not very complete or up-to-date.


Sites with general resources

American Mathematical Society
Principal professional organization for mathematical research in the U. S. Site includes AMS member services, bookstore, events calendar, reference tools, and links to other resources.
Mathematical Association of America
MAA is a professional society that focuses on undergraduate mathematics education. An especially useful feature of the site is its book reviews, which go back to 1996 and cover over 400 books, mostly accessible to math undergraduates or a general audience.
EMIS: The European Mathematical Information Service
Contains a variety of resources, including various databases of mathematical literature and the Electronic Library of Mathematics.
Electronic Library of Mathematics
Contains four main collections: journals, proceedings, monographs and lecture notes, and classic works.
The Math Forum
Very extensive site with resources of all types, for all levels from elementary to advanced.
PlanetMath.org
"PlanetMath is a virtual community which aims to help make mathematical knowledge more accessible." Its main feature is a collaboratively written mathematics encyclopedia. There are also a number of discussion forums (some of which handle general questions on college and university level mathematics), and lists of online books, expositions, and research papers.
Clay Mathematics Institute
This is the organization that is offering a prize of $1 million for solving each of seven notable unsolved problems.
MathSoft Unsolved Problems
Mathsoft Engineering & Education, Inc. page with links to information on many unsolved problems. By Steven Finch.
Mathematics Archives
Collection of many educational resources in mathematics at all levels. The list of lessons, tutorials, and lecture notes is especially helpful for finding an overview of a particular topic. The site is hosted at the University of Tennessee (Knoxville).
American Institute of Mathematics
The Institute sponsors, among other things, work on significant unsolved problems. Their site contains news of noteworthy new developments and related information on such things as workshops and conferences. Of particular interest is a set of outlines of selected open problems.
The MacTutor History of Mathematics Archive
Very useful information on selected topics in mathematics, and especially biographies of significant mathematicians. Hosted by the University of St. Andrews, Scotland.
Mathematics
A personal site by Massoud Malek. Contains sections on great mathematicians, fractals, a gallery of surfaces, and external links.


Surveys, overviews, tutorials

Eric Weisstein's World of Mathematics
Outstanding collection of encyclopedia-style articles, by Eric Weisstein and others. Hosted online by Wolfram Research and published in book form as the CRC Concise Encyclopedia of Mathematics.
PlanetMath Encyclopedia
A collaboratively written collection of articles on topics in all areas of mathematics. May also be used as a handy glossary of mathematical terms.
The Mathematical Atlas: A Gateway to Modern Mathematics
Very impressive site by Dave Rusin that provides overviews and lists of resources of modern mathematics, organized by AMS Subject Classifications. Look here if you want a real atlas of the world of math.
Frequently Asked Questions in Mathematics
By the Sci.math FAQ Team, edited by Alex López-Ortiz.
Sci Math FAQ
Another version of the Usenet Sci-Math group FAQ.
Open Problems
The site, by Jeff Erickson, describes a number of unsolved problems, mostly in geometry and computation, and has links to other lists.
Unsolved problems in mathematics
Article from Wikipedia. Mostly links to articles on certain topics.
Mathematics
Article from Wikipedia. See also List of mathematical topics.

Ask an expert

Galaxy: Mathematics Expert Links
Categorized site directory. Entries usually include descriptive annotations. There's also an ask-an-expert list.
Ask Dr. Math
Handles questions about mathematics from elementary school to college/university level. There is an archive of previously answered questions. Part of the Math Forum site.
The CTK Exchange
An informal list of questions and answers in many areas of mathematics, at the Cut the Knot site maintained by Alexander Bogomolny.


Online books and lecture notes

AMS Books Online
Full texts of selected volumes published by the AMS, in PDF format. Many are classics, and most are at an advanced level. Includes links to other lists of online books.
MIT OpenCourseWare: Mathematics
Course materials provided by the MIT OpenCourseWare project. New courses are continually added. Detailed lecture notes and additional materials such as problem sets are provided for some, but not all, courses.
Mathematical Monographs and Lecture Notes
Maintained by EMIS. Includes links to other lists of online books.
Mathematics: Online Texts
Part of the Google directory. Includes links to other lists of online books.
Textbooks in Mathematics
List compiled by Alexandre Stefanov. Includes links to other lists of online books.
Online Books and Lecture Notes in Mathematics
Very useful list of links to serious study/learning resources, by David Radcliffe.
Online Mathematics Textbooks
List compiled by George Cain.
Math WWW Virtual Library: Online Books
List at the Math WWW Virtual Library.
Geometry & Topology Publications
Specialized monographs from the Geometry & Topology site at the University of Warwick (UK).
The Online Books Page: Call Numbers Starting With QA
From the database of the University of Pennsylvania's Online Books Page.
Math Archives: Lessons, Tutorials, and Lecture Notes
List at the Math Archives site, with many items of a more informal nature.
PlanetMath.org: Books
List of books available for free in electronic form. It is organized both chronologically and by subject classification.
PlanetMath.org: Expositions
A list of online lectures, lecture notes, and other expository documents that focus on education rather than research. There is also a collection of online research papers.
Collection of Lecture Notes, Surveys, and Papers
Hosted at the University of Paderborn (Germany). Many items are in German.
The Maths Linker
"An index of on-line and downloadable mathematical material on the Web." Items are categorized by academic level and subject area.


Recommended references: Magazine/journal articles

Reflections on the Future of Mathematics
Felix Browder
Notices of the AMS, June/July 2002, pp. 658-662
Abridgment of an address given by the retiring president of the AMS. The central point of emphasis is the way in which what are regarded as the most important problems in mathematics tend to involve application of tools and insights from one branch of mathematics to the open questions of another branch.
[Article in PDF format]
A Sideways Look at Hilbert's Twenty-three Problems of 1900
Ivor Grattan-Guinness
Notices of the AMS, August 2000, pp. 752-757
This is a historical article rather than a commentary on the mathematics of the problems themselves, but it does summarize them to some extent and suggests that, despite the influence Hilbert's lecture has had for over a hundred years, it was hastily prepared and could have been more inclusive.
[Article in PDF format]
Possible Trends in Mathematics in the Coming Decades
Mikhael Gromov
Notices of the AMS, August 1998, pp. 846-847
The author reports on some conclusions of a panel of the NSF regarding the future of mathematics. Problems from the natural sciences involving large amounts of data and the theory and application of computation figure prominently.
[Article in PDF format]
Mathematical Problems for the Next Century
Steve Smale
Mathematical Intelligencer, Spring 1998, pp. 7-15
This article is essential reading for a quick overview of the most important unsolved problems in mathematics. Smale made a major contribution to the solution of the Poincaré conjecture and has worked principally on topology and dynamical systems, which is reflected in his selection of problems. The explanation of the problems is concise but clear, and there are many references.
Hilbert's Problems and Their Sequels
Jean-Michel Kantor
Mathematical Intelligencer, Winter 1996, pp. 21-30
Provides a brief overview of the current status of most of the 23 problems on Hilbert's 1900 list. There are useful comments in many cases on possible future directions, particularly of an "applied" nature.


Recommended references: Books

The Millennium Problems: The Seven Greatest Unsolved Mathematical Puzzles of Our Time
Keith Devlin
Basic Books, 2002
Like most books about mathematics that are aimed at a "general" audience, this one is long on background and a little short on details about its subject -- which makes being cautioned repeatedly that you'll probably give up before reaching the end all the more irritating. Nevertheless, within its self-imposed limitations, the book gives a good introduction to some of the Clay Mathematics Institute's Millennium Problems, and at least the general scientific setting for the rest.
Alain Connes; André Lichnerowicz; Marcel Paul Schützenberger -- Triangle of Thoughts
American Mathematical Society, 2001
This slim volume is a transcript of conversations among the three authors, each of whom is a highly distinguished contributor to several fields of mathematics and science. While mathematics is the central focus, the topics range over philosophy, logic, mathematical physics, cosmology, and quantum mechanics. The emphasis is on the interrelatedness of the various fields as well as speculation about their open questions.
Björn Engquist; Wilfried Schmid, eds. -- Mathematics Unlimited -- 2001 and Beyond
Springer-Verlag, 2001
With 63 separate essays and more than 1200 pages this work is, literally, a weighty tome. Some of the more well-known contributors include Gerd Faltings, Roger Penrose, Serge Lang, and Toshiyuki Kobayashi. All branches of mathematics are represented, but the largest representation is from mathematical physics, applied mathematics, and computational theory. The surveys tend to be more accessible to nonspecialists than the typical mathematical paper. There is leisure-time reading for months in this volume.
John L. Casti -- Five More Golden Rules: Knots, Codes, Chaos, and Other Great Theories of 20th-Century Mathematics
John Wiley & Sons, 2000
In this sequel, Casti selects five more areas of modern mathematics with significant applications: the Alexander polynomial (knot theory), the Hopf bifurcation theorem (dynamical systems), the Kalman filter (control theory), the Hahn-Banach theorem (functional analysis), and Shannon's coding theorem (information theory).
V. Arnold, M. Atiyah, P. Lax, B. Mazur, eds. -- Mathematics: Frontiers and Perspectives
American Mathematical Society, 2000
The present volume is a conscious effort to present the most important problems in mathematics for the coming century, in the spirit of Hilbert's 1900 list. There are 29 papers by very eminent mathematicians, such as Shiing-Shen Chern, Alain Connes, S. K. Donaldson, Yu. I. Manin, David Mumford, Roger Penrose, David Ruelle, Steve Smale, Andrew Wiles, Edward Witten, S.-T. Yau, and Barry Mazur. The papers range in style from informal essays to detailed (if brief) technical reviews.
Keith Devlin -- Mathematics: The New Golden Age
Columbia University Press, 1999
The author's thesis is that the last 40 years of the 20th century were a "golden age" of mathematical discovery, in terms of the number and importance of new results obtained. He develops this idea in eleven chapters dealing with prime numbers and factoring, set theory, algebraic number theory, chaos, the theory of simple groups, diophantine equations, the four-color problem, complex function theory, knots and topology, Fermat's last theorem, and computability theory.
Hugo Rossi, ed. -- Prospects in Mathematics: Invited Talks on the Occasion of the 250th Anniversary of Princeton University
American Mathematical Society, 1999
These talks are fairly technical and specialized. But some of them are very pertinent, such as the survey of "Harmonic Analysis in Number Theory" by Henryk Iwaniec and the talks on topics in mathematical physics by Jürg Fröhlich and Ed Witten.
John L. Casti -- Five Golden Rules: Great Theories of 20th-Century Mathematics - and Why They Matter
John Wiley & Sons, 1996
Casti selects five important theorems developed in the 20th century in order to illustrate some of the concerns of modern mathematics. They are: the minimax theorem (game theory), Brouwer's fixed-point theorem (topology), Morse's theorem (singularity theory), the halting theorem (computability), and the simplex method (optimization theory). All have significant applications outside mathematics itself.
Ian Stewart -- The Problems of Mathematics
Oxford University Press, 1987
An excellent overview of some of the main research areas of mathematics in the mid-1980s by an author who knows the material quite well. Chapters dealing with topology, number theory, and applications are especially well done.

Home

Copyright © 2002 by Charles Daney, All Rights Reserved