Christians have always known they were to be pilgrims, a community of believers moving through life, headed in a definite direction, with a specific destination in view. But the picture of pilgrims also carries another meaning, one that becomes increasingly vital as the United States becomes post-Christian.