Something that often develops as a result of longevity in a particular field is a sense of perspective about the field's trends and changes: the advances and the shifting topics that leave their mark. One aspect of the behavioral health field that has managed to keep my attention for more than 30 years now is the area of clinical documentation, particularly the critical and oft-misunderstood (and thus misused) practice of treatment planning.
As a surveyor for CARF International for the past 16 years, I have had ample opportunity to review hundreds of clinical records from a wide variety of treatment organizations. When I am on a survey, one of the central questions I believe I must be able to answer when reviewing records is whether, based on the quality of the assessment, treatment plan and progress notes, the client is receiving the proper treatment in the most appropriate manner according to the relevant CARF standards. Many times the answer is yes. All too often, though, I have found that either the answer is no or there isn't enough documentation of sufficient quality for me to make a determination.
There are several reasons for the inability to make a decision about the quality and appropriateness of someone's treatment experience based on a review of the record. Most importantly, many treatment centers fail to recognize two fundamental aspects of the treatment planning process. First, treatment plans merely reflect what is put into them, similar to the "garbage in, garbage out" saying about computers. So if there is something faulty in the treatment plans that a center routinely produces, then there is something faulty with some aspect of its clinical systems. That leads to the second misunderstanding: Centers don't seem to recognize that there is always a link between their treatment philosophy and process and the treatment plans they produce; that is, the latter reflects the former.
In fact, treatment plans always reflect something. At their best they reflect a well thought-out set of interrelated problems, goals, objectives and interventions, attuned to the program's philosophy, that are appropriately carried out and documented in the progress notes. At other times, though, I've seen treatment plans:
• That reflected only the execution of the center's philosophically rigid and unconsciously developed programmatic content, rather than the execution of consciously developed, individualized treatment with appropriately chosen interventions. This is a bit like the "solution in search of a problem" concept, or the idea of clinging to "tradition" for tradition's sake;
• That were based less on individual issues gleaned from a good clinical assessment and more on what particular services a program had to offer;
• That demonstrated no linkage among the various components (for example, interventions that weren't related to the accomplishment of the stated objective, and objectives that weren't tied to the goal statement);
• That indirectly reflected a lack of understanding on the part of the plan's author about the fundamentals of the treatment planning process and the process of change a client goes through while in treatment;
• That, because of their obviously poor quality, reflected a lack of adequate supervisory review and/or competence; and
• That had all their objectives and interventions tied to nothing more than the projected length of stay (e.g., all assignments due on the same day near or at the end of treatment).
All of these issues have but a few underlying causes. One, of course, is a lack of training in the basics of treatment planning and the process of how clients change during their treatment experience. Another is that beyond the skeleton framework of “problems, goals, objectives and interventions,” there is no explanatory construct that animates and unifies those four concepts into a meaningful architecture.
Of course, it doesn't help that one of the nation's leading accrediting bodies, CARF, does not presently require problem statements, that interventions be linked to achievement of the stated objectives, or that objectives lead to accomplishment of the stated goals. In addition, computerized treatment planning software, although perhaps a boon from an efficiency standpoint, might be inadvertently dulling counselors' clinical skills, moving us away from the more sound "teach me to fish" concept and toward the "give me a fish" one. At a recent training on treatment planning, a therapist underscored his growing reliance on treatment plan software by wryly referring to it as "seductive."
Reforming the treatment process
Beyond the obvious flaws in current treatment planning practices, both self-inflicted and those of the “cookie-cutter” computer-generated variety, lie larger and more concerning questions about the actual process of treatment. Can we afford to continue to act as though there is no relationship between substandard treatment plans and substandard treatment? Does simple compliance with treatment goals equal “successful” treatment? Can we continue clinging to ultimately indefensible attitudes such as, “We don't have to get too concerned about our poor treatment plans because we know we're providing good treatment”? The answer to all of these questions is an emphatic “no!”