StATS: Should (Can?) Statistical Consultants Be Independent? (December 14, 2006)

I attended a webinar, "Should (Can?) Statistical Consultants Be Independent?" presented by Janet Wittes.

Statisticians are like Humpty-Dumpty ("When I use a word..."), in that certain words (validated, prespecified, intent-to-treat, and independent) have special meanings just to statisticians. The talk focused on the last word, independent. Independent does not mean ignorant or uninterested (although disinterested, in the sense of having no conflict of interest, is a good thing).

As statisticians, are we advocates, like lobbyists, lawyers, or expert witnesses? An advocate's job is to fashion the best result for the client. An advocate who identifies with their client will use the word "we" in conversation. An advocate should not be independent. An advocate and a client share the same interest.

Contrast an advocate with a consultant. Are you a hired gun? Is your fee contingent on the result? Does your job depend on the answer? She noted that a "hired gun" is not a "prostitute". The contrasts listed below help to distinguish an advocate (the first choice in each pair) from a consultant (the second choice in each pair).

A third role is "collaborator". A collaborator is a "team member" and uses the pronoun "we" but is still interested in truth rather than the best result.

In a consulting environment it is important to feel valuable, but perhaps more importantly, to be perceived as valuable by our clients. It is critical that we "sit at the table," meaning that we participate actively rather than passively. We need to talk, and we need to talk in English, not in Statistics.

Dr. Wittes offered a two-dimensional grid. To the left is the reviewer and to the right is the team member. At the top is the advocate and at the bottom is the critic. A critic can actually be valuable by serving as a devil's advocate. Independence sits on the lower portion of the grid (but not necessarily at the very bottom, because an extreme critic is actually an advocate for the opposite side).

Statisticians fall naturally into the role of critics, but it is hard not to let personal feelings move us into the role of advocates. Our clients want us to be advocates, but they grudgingly accept our role as critics.

I asked two questions. First, when does the "best result" not coincide with the "truth"? Dr. Wittes distinguished between "not going beyond the truth" for an advocate and "producing a neutral result without a particular spin" for a consultant.

I also asked whether the distinction between an advocate and a consultant changes when the goal of the project is to develop new software tools as opposed to producing a particular data analysis. Dr. Wittes agreed that it does for new tools (and, she added, new methodologies); advocacy is more troublesome when the goal is a particular data analysis.

Another person wanted to draw a distinction between consultation and collaboration based on the depth of involvement, and Dr. Wittes agreed and elaborated on this point. A collaborator needs to really get inside the problem and understand it thoroughly.

Another person distinguished between these terms based on whether you actually see the data. Offering suggestions without seeing the data makes you a consultant. If you see the data, you are involved to a greater extent and become a collaborator.

A fourth person worried that by translating our hard work into simple language, perhaps we are oversimplifying and hurting ourselves by hiding many of the complexities of the problem and making it seem that the work is easy enough for anyone to do. Dr. Wittes agreed and offered an analogy of scientific approaches that use "cartoons" to explain mechanisms and pathways.

A fifth person asked whether there is a difference between consulting for companies and consulting for regulators. Dr. Wittes saw a huge difference. Regulators have to operate in a different framework.

A sixth person asked whether a consultant who has been paid as part of a grant should be included or excluded as an author on any research publications. Dr. Wittes pointed out that the degree of collaboration and interaction will determine this. Sometimes our contributions are undervalued and we are not given enough credit for our work, and sometimes they are overvalued and a trivial effort is overstated. Sometimes researchers want our names on publications for political reasons. If there is a misperception about whether you are "part of the team," then you need to fix things.

Dr. Wittes then offered her ten rules (guidelines) for consulting.

1. Know thyself. Know what kinds of analyses you prefer, which you accept, which you grudgingly accept, which you hate, and which you wouldn't be associated with. Be honest with yourself--there have to be things that you really hate to do.

There is tension in your work when you are in a regulated environment. It is easy if you and the regulators are on the same side. But don't hide behind a regulator and use them as an excuse. When you disagree with the regulators, life is very hard. When you are more conservative than the regulators, you are in an especially difficult position, because a less rigorous approach would still be accepted by the regulator. She offered the example of "Last Observation Carried Forward," a method that is not ideal but is still acceptable in some regulatory environments.
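
As an aside, Last Observation Carried Forward is easy to describe concretely. Below is a minimal sketch in Python with pandas, on a small hypothetical long-format dataset (the column names subject, visit, and response are illustrative, not from the talk): each subject's last observed value is simply copied forward into later, missing visits.

    # A minimal sketch of Last Observation Carried Forward (LOCF).
    # The dataset and column names are hypothetical illustrations.
    import pandas as pd

    trial = pd.DataFrame({
        "subject":  [1, 1, 1, 2, 2, 2],
        "visit":    [1, 2, 3, 1, 2, 3],
        "response": [4.2, 5.1, None, 3.8, None, None],  # dropouts leave gaps
    })

    # Carry each subject's last observed response forward into later visits.
    trial["response_locf"] = (
        trial.sort_values(["subject", "visit"])
             .groupby("subject")["response"]
             .ffill()
    )
    print(trial)

This simplicity is exactly the criticism: a subject who drops out is assumed never to change again, which is rarely plausible.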

2. Learn the relevant nouns. In theory, we can do the data analysis using abstract labels like x, y, and z. We never really know the scientific or technical area as well as the experts. But we work more effectively and are more credible when we know the language. You should not play a statistical "trust me" game.

3. Keep up to date. You need to attend meetings and courses. You need to read journals and books. Examine your own behavior--are you only using the methods that you learned in grad school or are you applying new methods?

4. Have some humility. You need a reasonable but not a degrading amount of humility. It's okay to say "I don't know."

5. Work with people you like and respect. Or you can train them to be likeable. An outsider can easily turn down work, because there will always be someone else who will take on the job. This is harder for an insider, especially if you are the lone statistician. You may have to say yes to everyone, but you can still strike a balance: spend more time and attention on the people you like and on the collaborations that are most lucrative to you.

6. Don't let yourself be abused. There is hardly ever such a thing as a statistical emergency. Again this is more difficult internally. Just be frank if you can't do something--the worst case is they'll find someone else. If you always respond, then you will get more and more abusive requests.

7. Train your clients to work in their own interest. You need to teach your clients how to read (there is a lot of formal guidance available to scientists and technicians, and they need to be familiar with this information). Teach them that accuracy and precision matter (words and shades of meaning are very important to statisticians). Teach them that they can't hide bad news (someday the truth will emerge, and then it's a disaster). Keep your critical eye fresh (know when you are being co-opted).

Dr. Wittes told a story about a situation where data indicated that there was a safety problem with a certain drug. The people in the room minimized the problem and tried to argue against the data. Dr. Wittes finally blurted out "I wouldn't want any of you to be my doctor!" The conversation stopped for a while but when discussion resumed her comment was ultimately ignored. At the break though, several of the other people (other non-doctors in the room) spoke to her and thanked her for her honesty. The doctors needed to hear bad news from an outsider.

8. Don't be all things to all people. Consulting is like a blind date. She offered a case study in which a Phase 3 trial was ambiguous: the client believes the data meet all the regulatory requirements, the regulatory agency doesn't, and you side with the agency. Do you go to the FDA and let them believe something about you that isn't true? Do you encourage the company not to submit and fail to serve its interests? A second case study involves a paper on which you are a co-author: you disagree with the tone, the other authors omit something you wanted included, and they give you too little time to respond. When do you remove your name? There are no easy answers to these cases.

9. Listen to what the client wants and know what your client should want. Give them no more and no less than what they want. But teach the client what he/she should want, and be prepared to sever the relationship if the client is continually uninterested in important issues. This is just intellectual honesty.

10. Avoid being the unwitting agent of someone else's destruction. You need to understand who is hiring you and why. There may be some political undercurrents that you need to be aware of. Are you being used as a tool or a pawn? Are you the first statistician involved or are you being used to attack the work of a previous statistician? If you are being hired as an outside consultant be sure to work closely with the internal statisticians.

In conclusion, Dr. Wittes asked what you give up when you become a consultant. You give up the choice of problems that you work on and the intellectual ownership of problems. On the other hand, you gain the chance to work on many things, and the ability to contribute widely. For any given case, think about where you are on the reviewer/team member and critic/advocate axes. Make sure that your client is comfortable with where you are, and do midcourse corrections if there is a disparity in perceptions or if you are uncomfortable in your current role.

One question was what you should do when you notice negative results or cautionary results in a regulated environment. Do you point this out to the client? to the regulator? Dr. Wittes said that you always disclose all information to your client. If the client does not want to disclose this information to the regulator, then you need to educate them that the statisticians at the regulatory agency are usually quite sharp and are likely to notice the same things that you noticed. Another participant pointed out that if there are negative or cautionary findings and you do not report them to the regulator, you will have a disaster on your hands.

This page was written by Steve Simon while working at Children's Mercy Hospital. Although I do not hold the copyright for this material, I am reproducing it here as a service, as it is no longer available on the Children's Mercy Hospital website.