Technology, Language, Planning: A New Dynamic for Financial Planners

Daniel J. Royer, Ph.D.
Grand Valley State University

Executive Summary
The technology that drives the analysis in the financial planning profession determines user expectations and experiences as well as the nature of the discussion between planner and client. This article comments on the short history of Monte Carlo analysis in financial planning, but its primary focus is a still newer technology: dynamic planning software derived from economic research. Lynn Hopewell, in a 1997 JFP article, chided the profession: “Finally . . . we should ask not whether we should use these techniques and tools [Monte Carlo], but rather, why have we not already done so?” The same quandary applies today with regard to dynamic modes of personal economic planning. A JFP columnist has recently said as much: “If Kotlikoff is correct, we need to rapidly revisit our planning process.” This article lays the conceptual groundwork for such a revisiting and explores what differentiates conventional planning software from a dynamic approach based on economic models. The article uses ESPlanner™, currently the only available software that uses dynamic computation, as representative of this economic approach, not to review or promote the software per se, but to describe the new dynamic it makes possible for users, planners, and clients. For planners, the new dynamic includes an emphasis on option analysis as well as an economic perspective on life-cycle client planning. For individuals, it involves both of these things as well as a radically different quality of interactivity with planning software.

Next Generation Dynamic Planning Software and a New Dynamic for Financial Planners
The technology that drives the analysis in the financial planning profession determines user expectations and experiences as well as the nature of the discussion between planner and client. The pedestrian user who naively approaches financial planning believing that “with modern computers” we can now calculate household economic numbers in a way that solves for how much we can live on, now and in retirement, is misinformed. In fact, the standard practice guided by conventional planning software begins by asking users, not telling them, how much they will need to live on in retirement. The user could make educated guesses and work backward through trial and error to a “nest-egg” number and an optimized drawdown that smooths lifetime living standard and leaves nothing on the table at death, but arriving at this solution would be very time consuming, and it’s not what the software is designed to do. Furthermore, and just as important, it’s not what financial planners expect it to do. What conventional software can do is use Monte Carlo analysis to reveal the likelihood that the user’s “best guess” (or the computer’s best guess) about living standard after retirement will last to the end. However, new software built on economic models and dynamic programming will change user expectations and reshape the discussion between financial planner and client.

The most popular professional software packages in the financial planning industry, such as MoneyGuidePro and Financial Engines, along with the online calculators at every major financial website, ask the user to enter an important number (“amount needed to live on in retirement”) or enter it for them as an assumption based on industry rules of thumb. Entering a rule-of-thumb estimate or an educated guess introduces a potent variable up front in the economic calculations instead of revealing it as a solution. A significant body of research by respected economists demonstrates that even a 10% variation in this rule of thumb (a rule the financial planning profession itself disagrees about) or educated estimate can lead to significantly different savings recommendations. To make matters worse, the replacement-rate methodology is inherently flawed, as economist Laurence Kotlikoff (2007) points out:

Ironically, if households already are saving the appropriate consumption smoothing amounts, they have no need for replacement rate targeting. But if they are not, the replacement rate methodology will produce the wrong replacement rate because it will use actual saving (which is “incorrect”) in calculating the rate.

Estimating the amount of money needed in retirement disguises a significant shortcoming in current practice: the financial planning profession has lulled itself and its clients into accepting ballpark forecasting, a ballpark that is getting larger, not smaller, as planners discover more uncertainties to plan for. Kotlikoff (2006) demonstrates convincingly that missing the target estimate by even 10% can have a huge impact:

If plus or minus 10 percent mistakes occur with equal likelihood, the spread in pre-retirement living standards for two otherwise identical households could easily be 48.8 percent—the difference in the $26,289 and $39,109 pre-retirement living standards of households that set their spending targets 10 percent too high and 10 percent too low, respectively. The corresponding post-retirement spread in living standards is 22.3 percent. (12)
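The size of that spread follows directly from the two living-standard figures in the quote; a quick check, using only Kotlikoff's published numbers:

```python
# Pre-retirement living standards from Kotlikoff (2006): the household whose
# spending target was set 10% too high vs. the one whose target was 10% too low.
too_high, too_low = 26_289, 39_109
spread = (too_low - too_high) / too_high
print(f"{spread:.1%}")  # prints "48.8%"
```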

But setting our own retirement living standards is just one of the problems that vex conventional software. Consider a problem that sounds ordinary and immediate: “How much will my living standard change this year and in subsequent years (i.e., how much can I spend?) if I decide to take my Social Security at 70 instead of 65?” Considering an option on a future event, and determining its present impact, is beyond the reach of the dozens of financial planning calculators used by professionals, not to mention the online calculators. The question lies outside their engineered design because the software lacks the dynamic programming required to answer it.

Or take another deceptively simple question: How does what my household can spend this year change if I assume that Social Security is going to be cut by 20% fifteen years from now? Again, one could spend hours or days manipulating conventional software to work out an answer, but it’s simply the wrong tool for the job. The consequences of these kinds of questions are not at all trivial if one is trying to do economic financial planning. Those are just two examples—there are dozens of such questions related to present life-cycle decisions hinging on future plans that Kotlikoff’s ESPlanner™ program can generate answers to in seconds but that overwhelm conventional software planning programs.

So what’s going on? Why in this day of dual-core processors is there such a mismatch between what is possible from our software tools and what’s actually available? In order to investigate that question, let’s review a bit of financial planning history.

Throughout the 1980s and 90s, standard practice in financial planning used static methods of analysis. Users entered data, one value per variable, and made the entire set of planning decisions at the beginning of the calculation process. The 1997 article in the Journal of Financial Planning, “Decision Making under Conditions of Uncertainty: A Wakeup Call for the Financial Planning Profession,” marks an important turning point in the adoption of stochastic methods of analysis. The author, Lynn Hopewell, a former editor of this same journal, explained stochastic analysis and Monte Carlo simulation to his colleagues and readers in the financial planning profession. He pointed out that the technology to describe not just the best- and worst-case scenarios or the range of possible options, but the probability of those scenarios, was readily available and simply being ignored by the financial planning profession. He pointed out then that the state of planner education and the use of technologies for dealing with well-known problems of uncertainty was 40 years out of date compared with related areas such as business and economics. He wrote in 1997: “One peruses the financial planning literature in vain for papers that explicitly apply state-of-the-art techniques for dealing with the problems of uncertainty. . . . Monte Carlo analysis techniques were the core subjects in business schools 40 years ago and longer. These tools have been widely used in other industries for years” (84). That was 1997; since then, stochastic Monte Carlo simulations have become standard practice in the financial planning profession, perhaps in part due to Hopewell’s chiding wake-up call.

What’s startling about this slice of financial planning history is underscored by Hopewell’s additional comment, “oddly enough, academic attention to the use of modern quantitative techniques applied to personal financial planning decision making has been almost nonexistent” (84). Although the tools for the kind of stochastic analysis Hopewell recommended had been available for years and their use was common knowledge in other professions, he said about financial planning education: our “educational institutions should move out of the Dark Ages” (89). Again, this discussion took place just ten years ago in the flagship journal of the financial planning industry, written by the journal’s former editor.

So ten years ago the financial planning profession was politely scolded by one of its own senior members for ignoring methods of analysis that had been commonplace outside financial planning for 40 years. The field took up the challenge: Monte Carlo analysis is now standard practice, and many tools are available to run it. Since then, Boston University economist Laurence Kotlikoff has issued repeated second wake-up calls for the financial planning profession and described the next generation of analysis: dynamic models that work in both deterministic and stochastic situations. That model has not yet been adopted, or even received a significant look, from the financial planning profession. Instead, Kotlikoff’s charge of “financial malpractice” in the use of existing technology, along with his accusations of greed, passivity, and laziness within the planning profession, has been met with skepticism, puzzlement, and denial, but not argument. Until last month, there was but one passing reference to Kotlikoff in the history of this journal. But finally, in the June 2007 issue of the Journal of Financial Planning, Harold Evensky, an internationally recognized financial planning expert and new JFP columnist, concluded a review of recent investment research with the following comment on Kotlikoff:

My first reaction to the New York Times article and this paper [one of Kotlikoff’s essays] was incredulity followed by anger; however, after a little more thought, I decided that for the benefit of our clients and our profession, we cannot simply ignore such a challenge, even if it's less than tactful. If Kotlikoff is correct, we need to rapidly revisit our planning process; if he's wrong, we need to speak out forcefully to the public media before some investors follow the "spend more, save less" mantra into bankruptcy.

Kotlikoff in fact does not chant “spend more, save less,” and saying so reinforces the view of outsiders that the financial planning profession has simply been unaware of, or stubbornly ignorant of, the last decade of research in economics, and that a decade of headlines like “You Might Be Saving Too Much” cannot goad the profession into a serious investigation of economic research. Now is the time. Had Evensky read a bit more of Kotlikoff’s work or studied his software, he would not hold this view. He should certainly not write off Kotlikoff’s research without a serious look, nor should he, as one CFP in an online forum did, dismiss him as “some guy” hawking software. It’s discouraging to think that the financial planning profession is perhaps still as disconnected from an informed research and development arm, and from research in academic disciplines like economics, as it was in 1997.

Kotlikoff is not “some guy” in this discussion. As his Boston University webpage makes clear, Kotlikoff holds a Ph.D. in economics from Harvard and has served as a consultant to the International Monetary Fund, the World Bank, the Harvard Institute for International Development, the Organization for Economic Cooperation and Development, the Swedish Ministry of Finance, the Norwegian Ministry of Finance, the Bank of Italy, the Bank of Japan, the Bank of England, the Government of Russia, the Government of Bolivia, the Government of Bulgaria, the Treasury of New Zealand, the Office of Management and Budget, the U.S. Department of Education, the U.S. Department of Labor, the Joint Committee on Taxation, the Commonwealth of Massachusetts, the American Council of Life Insurance, Merrill Lynch, Fidelity Investments, AT&T, and other major U.S. corporations. He has provided expert testimony on numerous occasions to committees of Congress, including the Senate Finance Committee, the House Ways and Means Committee, and the Joint Economic Committee. He is the author or co-author of eleven books and hundreds of professional journal articles.

But Evensky is right in recognizing that the financial planning profession has yet to take a serious look at what Kotlikoff has to say. Kotlikoff’s work with ESPlanner has been ongoing for over a decade. His research (and provocations) have been covered by the national press since 1999 in the Washington Post, the New York Times, Forbes, BusinessWeek, Bloomberg, the Boston Globe, Marketplace Money, Fox News, USA Today, Investment News, and elsewhere. Perhaps at this point Kotlikoff is entitled to talk any way he wants about financial planning practice, lest the profession wait another forty years to adopt available technology. If the profession investigates, it will find it is just as Evensky suggests is possible: the profession needs to “rapidly revisit [its] planning process.”

And Kotlikoff’s research with retirement planning software is not the only reason to revisit the planning process. In the May 2007 issue of the JFP, Paula Hogan makes the case that “Life-Cycle Investing Is Rolling Our Way.” Her argument is that a new paradigm based on the body of financial economics literature known as life-cycle investing theory is going to fundamentally change how planners work. She cites Bodie (2003) and others to point out that this new paradigm will, among other things, substitute lifetime consumption for wealth as our measure of welfare, and dynamic programming and contingent-claims analysis for mean-variance efficiency and Monte Carlo simulation as our quantitative models. Her concern, like mine, is that the traditional model now driving practice is outmoded and that financial planning professionals need to catch up with the rest of the financial services industry and make use of a “financial economists’ worldview.” “Life-cycle investing is also a core part of the finance curriculum for this generation of business school graduates,” she writes, “. . . but, interestingly, is not yet fully incorporated into the financial planning literature.” Hogan’s article and my own argue from the same conceptual model, based on a financial economist’s perspective. But whereas she presses for fundamental changes in investment planning and products, this article presses for complementing that move by adopting the dynamic-programming, economics-based planning software that users and planners would need to complete this paradigm shift, to test and integrate such new products, and to take options on the systemic choices already available in a personal economy. Trite as it may sound, adopting dynamic planning software really is a matter of needing new wineskins for new wine.

Revisiting the Planning Process: Ending with Endogenous Spending Targets
The concept behind this evolution in financial planning software is the notion of an endogenous spending target. The notion of growth from within, or results that are functions of the model itself, helped to create new models of economic growth and other dynamic economic systems. Indeed, one hallmark of contemporary economics is the sophistication with which it understands dynamic systems such as economic growth. Many strict classical models have been abandoned because of their failure to account for the influence of a system itself in generating its own outcomes.

Yet it’s impossible to meet with a financial planner without being asked to specify one’s retirement-years spending needs. The more sophisticated and “accurate” calculators allow the client to specify different targets for up to three different periods in retirement. But however it’s done, whether based on current spending or on a rule-of-thumb estimate, it’s a number external to the complex individual economy. With dynamic models there is now actually a “correct” number, or more to the point, a set of correct solutions. These solutions are buried deep in the Gordian matrix of interrelated calculations that determine taxes, spending, income, bequests, children, inheritances, changes in primary and secondary homes, vacation homes, Social Security benefits taxed at different rates for different income levels, and individual state tax laws, not to mention assumptions about inflation and a volatile market. Despite this complexity they are, after all, just numbers, and even if it turns out to be “rocket science,” personal computers can now do rocket science.
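To make the idea of an endogenous target concrete, here is a minimal sketch of consumption smoothing in a toy deterministic economy. It is emphatically not ESPlanner's patented algorithm: it ignores taxes, borrowing constraints, survival risk, and everything else that makes the real problem hard, and all names and numbers are invented for illustration:

```python
def smooth_spending(assets, incomes, real_rate):
    """Constant real annual spending that exactly exhausts lifetime resources.

    `incomes` lists each remaining year's income (zeros after retirement);
    spending occurs at the start of each year. Toy model: no taxes, no
    borrowing limits, no uncertainty.
    """
    r = real_rate
    # Present value of current assets plus all future income.
    resources = assets + sum(y / (1 + r) ** t for t, y in enumerate(incomes))
    # Annuity factor: present value of $1 spent in each remaining year.
    annuity = sum(1 / (1 + r) ** t for t in range(len(incomes)))
    return resources / annuity

# 30 working years at $60,000, then 25 retired years with $20,000 of Social
# Security income, $100,000 of current assets, 2% real return (all invented).
incomes = [60_000] * 30 + [20_000] * 25
print(round(smooth_spending(100_000, incomes, 0.02)))
```

Notice that a retirement spending "target" never appears as an input: the level spending line is the output, which is precisely the sense in which the target is endogenous.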

Since many people naively approach a financial planner expecting to learn how much they will have to live on in retirement, the planner has to be careful about how he or she turns the tables and puts the question back to the client. The planner (or online calculator) can ask the client to produce an annual budget, inquire about existing spending, or use industry rules of thumb such as 80% of pre-retirement income. However, the planning profession itself does not agree on whether 70% or 100% is the best rule of thumb, and as solid research now shows, even small mistakes in this exogenous target-guessing game compound and lead to wildly divergent savings recommendations.
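The leverage a mistaken target exerts on pre-retirement living standard can be seen even in a toy model with a zero real interest rate (all numbers here are invented; Kotlikoff's (2006) figures, which add taxes and other realism, show still larger spreads):

```python
# A household earns `income` for `work_years`, then funds `retired_years`
# of spending at `target` with level annual saving (real interest rate = 0).
income, work_years, retired_years = 60_000, 40, 20

def preretirement_consumption(target):
    saving = target * retired_years / work_years  # annual saving needed to hit target
    return income - saving                        # what's left to live on while working

true_target = 40_000  # the consumption-smoothing answer in this toy economy
print(preretirement_consumption(true_target))        # 40000.0 -- perfectly smooth
print(preretirement_consumption(true_target * 1.1))  # 38000.0 -- target guessed 10% high
print(preretirement_consumption(true_target * 0.9))  # 42000.0 -- target guessed 10% low
```

Two otherwise identical households that err in opposite directions end up with roughly a 10.5 percent gap in working-years living standard, from a 10 percent target error alone.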

The endogenous solution to what a household can live on in retirement is the result of exceedingly complex calculations and patented algorithms and programming modules. The solution is a function of dozens (if not hundreds, depending on how one counts) of real option choices a client can make about her personal economy. The first and most obvious choices have to do with when to begin taking Social Security, when to retire, how much to save in the 401(k), and so on. The available choices are partly a function of the user's or planner's own creative thinking, and they become even more varied when one realizes that choices can be made in a wide variety of combinations and timing patterns.

One might imagine a sound mixing board with, say, a dozen sliders that add and subtract economic variables in degrees and in combinations of degrees, each combining to produce “lifetime living standard” as a solution, presented as a line chart running across the screen through time from left to right. The line may be perfectly horizontal. It may be a very smooth and gradually rising ramp, or a stair step with four or five steps; or it might look like a steep one- or two-step staircase. The depth and width of each stair might be available for adjustment, and each possible contour entails its own probability of success under Monte Carlo analysis. A user may wish to take slightly less money over the “life of the trajectory,” so to speak, by choosing a line chart or economic trajectory that is not so steep from left to right (i.e., representing a trade-off of a higher living standard in younger years for a lower living standard later; in effect, trading a steep ramp or staircase for one that is more horizontal). Which one is the right solution? In one sense they all are: because of dynamic programming, each is already optimized to provide the smoothest living standard possible given the user's choices as inputs. But one trajectory is preferable to another to the extent that it provides more consumption over its life, to the extent that it entails less risk, and, just as importantly, to the extent that this optimum living standard is available when it is needed most. “Optimum” is relative to a user’s personal needs, values, and concerns. The user can choose from among different living standard trajectories, each already made as efficient as possible given the user’s economy.

Seeing an optimum living standard through time as the solution may befuddle someone steeped in conventional planning models, which solve not for real standard of living but for the likelihood of success of a living standard based on a rule of thumb or an estimate of how much one will need to live comfortably. The endogenous solution is dynamic in that it responds to changing variables, like the real cost of our housing going down each year as inflation balloons everything around us except our fixed mortgage payment. As we pay less in housing each year, to stick with this example, our tax liability may change. If our taxes go down, we have more money to consume. If we consume more, we need more term life insurance to protect this more affluent scenario. Our term insurance has an annual premium that then rises, which of course creates a slight drain on our present living standard. The program’s goal is to smooth the living standard, so it responds internally to these choices and makes adjustments and new recommendations. The economic web is complex across the flat dimensions of income and expenses, but it becomes still more interesting in its dynamic features when we see that taking an option on some future choice compounds backward to this year’s living standard. Out of this complex economic matrix emerges the highest sustainable living standard. Kotlikoff put it this way:

I think the real story here is that desktop computers are becoming sufficiently powerful that we can now move [economic modeling] work from pure research into people’s households. People can now price out their lifestyle choices. How much will getting divorced cost me? Or having another kid? In eight seconds you can figure out how to raise your living standard by 10% or 12% by, say, deciding to delay taking your Social Security. It’s a gold mine. (Rosenberg 72).
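The feedback loop sketched above, in which spending determines taxes and insurance needs, which in turn determine spending, is at bottom a fixed-point computation. Here is a deliberately tiny illustration with made-up functional forms (a flat tax and a premium proportional to consumption); the real program's recursion is of course far richer:

```python
def sustainable_consumption(resources, tax_rate, premium_rate, tol=1e-9):
    """Iterate until consumption, the tax it induces, and the insurance
    premium needed to protect it are mutually consistent (a fixed point)."""
    c = resources  # first guess: spend everything
    for _ in range(10_000):
        taxes = tax_rate * c
        premium = premium_rate * c  # more consumption -> more insurance to protect it
        new_c = resources - taxes - premium
        if abs(new_c - c) < tol:
            return new_c
        c = new_c
    raise RuntimeError("did not converge")

# $100,000 of resources, 20% flat tax, premium = 1% of consumption (invented).
print(round(sustainable_consumption(100_000, 0.20, 0.01), 2))
```

In this linear toy case the fixed point has a closed form (resources / 1.21); the point is that the "correct" spending level emerges from the system rather than being entered into it.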

From Static to Dynamic Interaction
Solving for optimum living standard creates a different kind of job for the user or financial planner than does the static data gathering and presentation mode of conventional financial planning. Indeed, this new technology will change the scope of the planner and client interaction. It will certainly give power to independent users.

Conventional planning focuses on several obvious variables: current savings, retirement date, investment success and risk, and when to begin taking withdrawals on asset and non-asset income. The next step after these variables have been entered is to enter yet one more very important variable, income needed, and finally to determine the size of the retirement nest egg and how long it can be made to last given asset allocation and other assumptions. The planner’s job is to input “income needed” and other data, generate Monte Carlo probability results, then modify the inputs and compute again until the results are acceptable to the client, or until the client is convinced to buy more product, risk more, or save more to give a scenario an acceptable probability. Individuals who don’t use planners can do this themselves, but they often can (and sometimes do) simply switch planning programs until they get results they find acceptable. This strategy is most certainly not rocket science. In technical terms, the existing models, whether deterministic or stochastic, are static: what ought to be the end solution to the principal planning question (what will my living standard be, both now and in retirement?) is exogenously determined as a starting point, in advance of even doing the calculations.

ESPlanner is likewise deterministic and stochastic, but its model is dynamic, not static. In other words, the model allows users to input values, including the capability of making those values random variables, but the dynamic quality unique at this point to ESPlanner allows planning decisions to occur not once at the beginning but continuously, as choices and events unfold. Consider again, for example, the declining real mortgage cost mentioned above and the way results in each future year feed back to affect decisions made in prior years. Or consider that a dynamic model recognizes that income and taxes are not static but dynamically determined: our after-tax income is a product of our taxes, and our taxes are a product of our income. This dynamic capability operates within the software's algorithms, but a dynamic program also engages the user in dynamic interaction. By placing emphasis on real options, the user can make effectual choices about the immediate or distant future that affect the present as well as the future. A static, deterministic model is like a cake recipe; a dynamic deterministic model is like an interactive puzzle with multiple correct solutions.

How Technology Can Determine Our Expectations and Communication Practices
The planning model engineered within the software itself becomes a metaphor for the experience of the end user. With a static model, we mix ingredients and wait on results. With a dynamic model, ingredients are folded in and results gradually unfold. With the static model, we get a particular result and then perhaps we start over. With a dynamic model, we don’t start over so much as we slide the variables to optimize the contour of our solution. The answer the conventional static model gives, particularly now that the financial planning profession has ten years’ experience with stochastic models, is typically a Monte Carlo-calculated probability of success. This probability has become the principal answer that most users are told is important and wait to hear. Indeed, waiting for this answer is much like watching, with held breath, the spinning of a roulette wheel as it clicks, ever slower, to finally reveal whether you win or lose. The dynamic model, on the other hand, excavates the available shape of our living standard trajectory buried within the complexity of internally connected fixed data and dependent variables, and then invites us to use both our imaginations and our theories (and new products, such as Hogan (2007) suggests) to reshape or optimize that trajectory. In short, the conventional approach yields a probability number reported, for example, as “a 75% chance of success.” The dynamic approach yields a lifetime living standard trajectory, internally related to our personal economic options. The experience of the dynamic approach is less like watching a roulette wheel and more like operating an electronic control console that rewards efficiency and encourages clever, creative, and imaginative optimization strategies.

How does this change the role of financial planners and the expectations of clients? The biggest change for planners is educating clients to value optimum lifetime living standard over raw income or mere chances of success after retirement. Monte Carlo probability can be generated by both static and dynamic programs, and it’s important, but the static model raises the question: chances of success of what? The success of the planner’s industry-standard rule of thumb for what I need? Which rule of thumb, the 70% or the 90% rule? The impact of the difference between these two numbers on my highest optimum living standard today (a result only dynamic programming can reveal) is enormous, and it’s impossible to “guess wisely,” though that’s the only practical option with a static programming model, even one that includes Monte Carlo analysis. The relationship of income to living standard is far from transparent or direct, another thing an economic approach to financial planning reveals that is not apparent to a naive user or planner. Between these variables lurk taxes, housing, and the scale economies of a household with children or a spouse, not to mention a 500-page book of Social Security regulations on how benefits are calculated, taxed, and distributed across a range of contingencies. A dynamic program can calculate Social Security benefits and pass them back to the program’s inputs as a dynamic variable, recursively recalculating present living standard until it zeros in on the smoothest living standard available over time.

A static model resolves to a static interaction between planner and client because option analysis is not in the foreground of conventional planning software. Options are available, but they are masked by the limitations of the software. “What-if” features are not uncommon in conventional planning software, but, again, the problem that what-if planning aims to address is “chances of success after retirement,” not current and future lifetime living standard. Users change the variables and wait for the new Monte Carlo report. Pre-retirement living standard might be a partial or derivative solution if the stochastic analysis eventually becomes so positive that, by implication, the living standard variable should be raised to bring probabilities back down to standard levels. This methodology involves trial and error, and the planner would need to work through it in the back room as he or she programs in the variables to take back to the client.

A dynamic model, on the other hand, opens up a dynamic and imaginative interaction between planner and client. Option analysis is the very “control panel” of the program, and a lifetime living standard contour is the primary output. Because in a dynamic program living standard is a bottom-line output, as opposed to a distant abstraction like probability of success, users are engaged in meaningful analysis with present impact, while probability of success provides additional information. The contour of living standard over an entire life is a practical and meaningful resolution to the planner-client discussion, one that invites speculation and discussion of values and life-cycle choices, with clear and instructive option trade-offs that affect present and future.

In Fortune magazine’s June 2007 retirement guide, Kotlikoff and economist Jonathan Skinner share an exchange about Kotlikoff’s views and ESPlanner. Skinner worries that “the average Joe is going to sit down with this really scary program and is basically going to go with the defaults and may get an exact number, but one which carries all of these implicit assumptions that he may not agree with.” Conventional financial planning is not immune to this concern: financial planners using conventional software might be tempted to let clients get away with not worrying about implicit assumptions, viewing their clients, rightly or wrongly, as people who just want the bottom line or a probability-of-success number. The issue here is not really about transparent assumptions; ESPlanner’s assumptions are more transparent than those of most retirement calculators. It’s about the increased accuracy and usefulness of program output made available by dynamic economic models. Should a client annuitize assets? Should the client work extra years? Should the client draw down the pension sooner or later? Is the client happy with the current living standard? Dozens of such questions might best be answered with the help of a financial planner who understands economic options. The questions often engage personal values and subjective choices; again, a good financial planner can help. But with dynamic planning software using an economic model of analysis, there is now no excuse for a static planning model, rough rules of thumb, or planning output that is limited by our technology and constrains our interaction with the software and the client, not to mention our imaginations.


References

Bernheim, B. Douglas, Lorenzo Forni, Jagadeesh Gokhale, and Laurence J. Kotlikoff. “An Economic Approach to Setting Retirement Saving Goals.” In Innovations in Retirement Financing, ed. Zvi Bodie, Brett Hammond, Olivia Mitchell, and Stephen Zeldes, 77-105. University of Pennsylvania Press, 2002.

Bodie, Zvi. “Thoughts on the Future: Life-Cycle Investing in Theory and Practice.” Financial Analysts Journal 59, 1 (2003): 24-29.

Evensky, Harold. “What's New? A Potpourri of Investment Research.” Journal of Financial Planning, June 2007.

Hogan, Paula H. “Life-Cycle Investing Is Rolling Our Way.” Journal of Financial Planning, May 2007.

Hopewell, Lynn. “Decision Making Under Conditions of Uncertainty: A Wakeup Call for the Financial Planning Profession.” Journal of Financial Planning, October 1997: 84-91.

Kotlikoff, Laurence J. “Is Conventional Financial Planning Good for Your Financial Health?” February 2006.

Kotlikoff, Laurence J. “Why Target Practice Equals Financial Malpractice.” Investment News, June 11, 2007.

Rosenberg, Yuval. “Are You Saving Too Much?” Interview with Laurence Kotlikoff and Jonathan Skinner. Fortune, June 25, 2007: 67-72.

Scholz, John Karl, Ananth Seshadri, and Surachai Khitatrakun. “Are Americans Saving ‘Optimally’ for Retirement?” Journal of Political Economy 114, 4 (August 2006): 607-643.