Sifting through the different options in the mortgage market can be very confusing, and the wrong choice can cost someone thousands of dollars.
So, to make the decision more manageable, most people focus on the Annual Percentage Rate, or APR. This number presents the annual cost of the mortgage based on standardized assumptions. Unfortunately, those assumptions don't hold for most borrowers in today's world, and they can actually lead to the wrong (and far more costly) choice.
Dan Melson at Searchlight Crusade explains how this can occur in a recent piece titled "Why the Lowest APR Loan Isn't Necessarily the Best." In it, he gives some of the background and history behind the concept of APR, and works through a typical case where a loan with a higher APR but lower up-front costs turns out to be significantly better than a higher-cost but lower-APR alternative.
What drives Melson's example is the way APR is calculated -- based on the assumption that the borrower will keep the loan for its full term. In effect, this means that APR assumes that the up-front costs of the loan are amortized over 30 years for a typical loan.
But in today's market, the usual loan is either paid off or refinanced about every two or three years. If APR were to be calculated under that assumption, a loan with up-front costs such as points would end up with a significantly higher "effective" APR.
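To make this concrete, here's a minimal sketch of the idea in Python. All the numbers (loan amount, rates, points) are my own illustrative assumptions, not figures from Melson's piece, and the "effective" APR here is a simple approximation: the note rate plus the up-front costs spread evenly over the years the borrower actually keeps the loan.

```python
# Hypothetical comparison of two loans, amortizing up-front costs over
# the actual holding period instead of the full 30-year term.
# All figures are illustrative assumptions.

def effective_apr(note_rate, upfront_costs, loan_amount, years_held):
    """Approximate 'effective' APR: note rate plus up-front costs
    spread over the years the borrower actually keeps the loan."""
    return note_rate + (upfront_costs / loan_amount) / years_held

loan_amount = 300_000

# Loan A: lower rate, but two points ($6,000) in up-front costs.
a_30 = effective_apr(0.0600, 6_000, loan_amount, 30)  # held to term
a_3  = effective_apr(0.0600, 6_000, loan_amount, 3)   # refinanced in 3 years

# Loan B: higher rate, but no points.
b_30 = effective_apr(0.0625, 0, loan_amount, 30)
b_3  = effective_apr(0.0625, 0, loan_amount, 3)

print(f"Held 30 years: A {a_30:.3%} vs B {b_30:.3%}")
print(f"Held 3 years:  A {a_3:.3%} vs B {b_3:.3%}")
```

Held to term, Loan A's points add only about 0.07% per year, so it looks cheaper; but paid off in three years, those same points add roughly 0.67% per year, and Loan B comes out ahead. That reversal is exactly the effect Melson describes.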
The bottom line: paying up-front costs on a loan could significantly increase your "real" APR (relative to the stated APR) unless you plan on keeping your loan to term, which is highly unlikely for most people today.
And the next time I teach about mortgages, I'll have my students calculate the "effective" APR as an assignment.