
Epistemic Diligence and Honesty

Written by Rebecca Brown

All else being equal, it is morally good for agents to be honest. That is, agents shouldn’t, without good reason, engage in non-honest behaviours such as lying, cheating or stealing. What counts as a ‘good reason’ will vary depending on your preferred ethical theory. For instance, Kant (in)famously insisted that even if a murderer is at the door seeking out their victim you mustn’t lie to them in order to protect the victim’s life. A rule utilitarian, in contrast, might endorse lies of a kind that can generally be expected to maximise utility (including, presumably, lying to murderers about the whereabouts of their intended victims).

What will actually count as being dishonest will vary depending on your preferred conception of honesty. If honesty has very extensive requirements, failure to volunteer relevant information when you know someone would find it useful might be a failure of honesty. On a narrower account, perhaps even ‘paltering’ – misleading by telling the truth – might not count as dishonest so long as what the agent says is technically true.

Christian Miller (2021) suggests we should view honesty as ‘reliably not intentionally distorting the facts as the agent sees them.’ This means that an honest person will generally try to give an accurate picture of what she takes to be the truth. But what if the person has a completely mistaken understanding of the facts? For instance, I might try to explain to you how aeroplanes manage to avoid falling out of the sky. This will involve some reference to the way air moves over the wings at different rates and how different pressure above and below the wing allows the plane to defy gravity. Even if I try to be scrupulously honest and tell you exactly how I think aeroplanes fly, I will probably give you misleading information, because I just don’t understand it very well.

Is it still correct to describe my actions as honest? Probably. I have never given you the impression that I know anything about aeroplanes or wind. Let us assume that I didn’t exaggerate my competence in this area. So, lack of knowledge needn’t be a barrier to honesty.

What if I hold a medical degree rather than a philosophical one? And what if you, as my patient, ask me about the health benefits of a particular exercise therapy for low mood that you’ve come across? I vaguely remember seeing someone tweet about this therapy once, and I think they were moderately positive about it. I’ve never looked at the clinical evidence base or NHS guidelines, nor had first-hand experience of it. But I figure it can’t hurt and reply, “Oh yes – I’ve heard of that. It’s supposed to be good.”

Is this honest? On the one hand, I don’t exactly ‘distort the facts as I see them.’ The truth is, I think the therapy will probably help. The problem is that I haven’t really made any effort to check how good the therapy is. As a result, I am not really justified in believing the therapy will help.

Moreover, I am acting in my role as a doctor. Unlike the case where a philosopher gives you a poor explanation of aeroplane flight, you might reasonably expect a medical doctor to know what they’re talking about when making assertions about medical treatments. I probably shouldn’t tell you I think a treatment is likely to benefit you unless I have good reason to believe that is true.

This suggests that an honest agent doesn’t merely present the ‘facts as she sees them’. She must also avoid misrepresenting her epistemic status regarding those facts. To put this another way, she shouldn’t imply that she is better informed than she really is; she shouldn’t state those facts with undue confidence.

One variety of person who misrepresents her knowledge is a liar. Liars intentionally mislead people. The uninformed doctor doesn’t necessarily intend to mislead her patient about the therapy: recall, she thinks it will probably help (she just doesn’t have much evidence for this belief). Another variety of dishonest person who misrepresents her knowledge is the bullshitter. Bullshitters display an indifference to the truth or falsity of what they say. Perhaps the doctor bullshits to her patient: she doesn’t care whether or not the therapy is beneficial, she just says it works to give the impression that she knows her stuff, and to get the patient off her back.

I don’t think this quite captures the doctor either. She is not indifferent to the truth or falsity of what she says. If she knew the truth, she would say it (whether or not she knew the treatment to be beneficial, harmful or ineffective). So the doctor is not a liar and not a bullshitter. But she does misrepresent her state of knowledge.

What kind of failure of honesty is the doctor guilty of? My suggestion is that she is guilty of a failure of epistemic diligence. We all have epistemic reasons to avoid forming false beliefs and to instead form true beliefs. And this means we should only form beliefs when we are epistemically justified in doing so (or only hold them with a degree of credence that is epistemically justified). There might be exceptions to this: some philosophers suggest that there can be practical and moral reasons to form beliefs (or avoid forming beliefs) even when the evidence supports an alternative belief. For instance, we might have a moral reason to avoid using racial stereotypes when forming beliefs, even if we were in a context where racial stereotypes were predictive of people’s behaviour.

The doctor has the normal kind of epistemic reasons to avoid forming unjustified beliefs about the exercise therapy. But she also has moral reasons, in her role as a doctor, to avoid forming unjustified beliefs about medical treatments in general and to avoid asserting these unjustified beliefs to her patients. Note that what it takes to be ‘epistemically justified’ in believing (or asserting) something is different for the philosopher talking about aeroplanes compared to the doctor talking about medical treatments. The doctor’s status as an expert in this domain means that the evidentiary basis for her belief is expected to be stronger (certainly stronger than a half recalled tweet).

On a standard reading of Miller’s definition of honesty (‘not distorting the facts as you see them’), epistemic diligence doesn’t look like a requirement. You can be as slapdash in forming beliefs as you like, so long as you report them accurately. But honesty is supposed to be a virtue, and there doesn’t seem to be anything particularly virtuous about sharing your beliefs if you are not epistemically justified in holding those beliefs.

One option, consistent with Miller’s definition, is just to require that the honest agent doesn’t ‘distort’ the fact that she has epistemically unjustified beliefs. So the doctor can tell her patient that she thinks the exercise therapy will work, but she must also share the relevant caveat that she has formed that belief on the basis of a tweet, and doesn’t really know much about the effectiveness of the therapy. This would avoid distorting the fact that the doctor doesn’t really know what she’s talking about in this case.

A second, more controversial option is to make epistemic diligence a requirement of honesty. On one version, an honest person must be epistemically diligent: she must be epistemically justified in forming the beliefs she holds, and she must then present those beliefs in a non-distorted way. Alternatively, one might keep epistemic diligence separate from honesty, but hold that virtuous honesty must always be accompanied by epistemic diligence: it is still ‘honest’ in some sense to report beliefs that you are not epistemically justified in holding, but it will only be virtuously honest to report those beliefs when you have exercised epistemic diligence in forming them.

Why bother trying to make epistemic diligence a requirement of honesty? Put simply, because otherwise honesty doesn’t look much good. Honesty derives its value (whether that be on the basis of respect or autonomy or similar) from the value we place on holding true beliefs. It is disrespectful for people to lie to us – and it undermines our autonomy – because it thwarts our attempts to have an accurate picture of the world. And so honesty that doesn’t generally serve the function of helping agents to have an accurate picture of the world does not seem particularly useful. This is heightened when we think about expert agents, including group agents, who provide advice. It is no good for such experts to share ill-informed beliefs that don’t meet the professional standards that we expect of them. Lazy experts who do not exercise their expertise in order to form accurate beliefs, but who nonetheless share those beliefs freely, are not, it seems to me, acting with a virtuous form of honesty. Such experts might get themselves off the hook by being explicit about their laziness, and thus encouraging people not to view their assertions as the assertions of an ‘expert’. But in failing to apply the epistemic standards associated with their expertise, they effectively surrender their status as an expert. To be both honest and an expert, they must exercise epistemic diligence.


References

Miller, Christian B. Honesty: The philosophy and psychology of a neglected virtue. Oxford University Press, 2021.


8 Comments on this post

  1. Elegant piece of writing. Analytically and organizationally superior. Am pleased to witness growth of this blog and an expansion of topical material that well-typifies the practical ethics intention. Hope others may appreciate your efforts also.

  2. You convinced me that honest communication of claim P requires the speaker to also communicate either subjective probability (credence(P)) or info about perceived degree of warrant for P being true. But these are different. Which to choose? [My doctor is pretty sure that God has a plan for me…]

  3. Being convinced of anything has led to a life of inquiry. For example, I know there are people who believe that one religious faith or another is the one true one. For them, it is a true belief. For others, it is a religious faith, among many—one of which might be true, or none of which are because they do not meet the litmus test, whatever that may be. Using ‘doctor’ in one medical sense or another, I have not known one who believed God has/had a plan for me. If he or she believed this, I was not informed. Contretemps, many or most doctors of divinity will offer counsel and opinion on such matters. This is what they have been trained to do; it is their calling. So, in this sense, true belief is in the eye of the believer—the corner of the combatant. All others need not apply. Beliefs are, largely if not totally, propositions. This, we may adduce, is why Davidson called them propositional attitudes. There are dozens of them. True beliefs are not so plentiful. Epistemic diligence is needed to ferret them out. Honesty is one of the easier virtues, so long as circumstance and contingency do not interfere. Life is fraught with them, especially when we are dealing with friends and loved ones. Diligence, of any kind, needs an accompaniment of honesty. It helps immeasurably also when we know what we are talking about. See: Wittgenstein…

  4. Not having read the cited book, I would like to pose the following question:
    Would it be true to say that truth is time-dependent?

    What is meant by that is that each truth has its own time, and that time (for a truth) is variable for different audiences dependent upon the understanding/knowledge/comprehension they may apply during any given period of time. (Equally applicable to scientific and/or social fact).

    The consequences from that are many and have many implications.

    Examples illustrative of that issue:
    1. The Flat Earth concept: although a round Earth was accepted by some, it took many hundreds or thousands of years before it became a scientific fact (truth). So various opinions (truths) were widely held during that period.
    2. In the political world, something which is known will occur (a fact) either because it will be caused/created/has been caused/created becomes information of value because time may be reversed for those who are not aware of that information. In that sense a truth is posited which people do not believe until the fact of the issue is exposed.
    3. In the scientific world it was not possible for anything to travel faster than light. That was an accepted fact (truth). Until it was disproven and another truth superseded it.
    4. In education it is necessary to time the disclosure of any particular truth with care, because if the necessary information is not in the possession of the recipient they may reject the truth, causing a blockage to further learning, and possibly setting their thought in stone for the rest of their lives.
    5. In religion a set of stable social ideas is provided which become, much like the flat earth, set in stone until confronted with something which cannot be denied and hence get adapted, or the religion dies out because it is unable to cope with an advancing social realm.
    6. In philosophy a given philosophical outlook will produce certain outcomes in given circumstances, those circumstances creating contextually relevant times when a particular truth becomes more widely acceptable and visible.

    Each set of ‘truths’ probably appears paradoxical to others not living within them. And that indicates a necessity for suspending judgement whilst gaining enough information to further understand and hopefully fully comprehend any truth(s) being projected. This may also cause problems if the truth is revealed too far in advance of its acceptance. So does a period of opportunity exist in truth-telling, at one extreme of which an individual is perceived as a prophet, and at the other as stating the obvious? Does the epistemic diligence referred to then become as much about the timing of a truth as about providing a validating reasoning? Lies/jokes can cause a real problem as they project egoistical rather than truth-based issues which may be difficult to differentiate between.

    Comprehended in this way each projection of any truth would have its own time, outside of which, for any particular audience, it becomes interpreted as something else. I suppose such a thing could be utilised as a system which could be used to measure progress at an intellectual level, but that would not be a popular or widely held truth.

  5. Good analysis, Mr. Ian. I don’t know if it goes too far afield from the content/intention of the essay, but such is not my call to make. Hope the other Van Pelt is reading. He would like this. In my opinion, your assessment of the temporality of truth aligns well with mine on contextual reality. We make things up as we go, according to interests, beliefs, culture and other factors. This is a bit like what John Perry, at Stanford, wrote some years ago regarding LEVELS of reality, only more subtle in genesis. Whether you agree with my thinking on this is unimportant. Nice work.

  6. Addendum:
    Truth is more than time-dependent. It is impossible when it challenges contextual reality. I conducted an experiment today, challenging a number of posts’ notions on everything from feminist economics to whether AI deserves status as an independent personage: does it have rights? None of those responses/answers are accessible; no one at the blog sites has chosen to rebuke or rebut my assertions or challenges. Yet. Now, these folks are supposed to be thinkers. They may already hold doctorate degrees; may even be tenured professors at prestigious schools. Apparently my responses a. are considered pedestrian, or b. do not merit consideration, because I am unknown, or am too well known now to merit respect as an autodidact. There is likely a c., maybe a d. I could surmise those, but that would prove tiresome. This blog is about ethics. So let’s find those, shall we?

  7. The wrong message appears to have been taken from my very poor, fractured and rushed writing.
    The main point attempting to be drawn out was not the contextuality (I hate using those words because of the conceptual confusion inherent within the differing, yet true interpretations given to them) of truth, I think that has long been accepted and the examples were more of an introduction to the main issue being raised.
    But where a certain truth begins to become acceptable there exists a period of time during which it may be communicated, in various ways, to various audiences and be largely accepted. That particular period of time, beginning with the early stages would be seen as prophecy, and the later stages would be seen as stating the obvious. It seems that period, although forming the initialisation of the contextually relevant period for that particular truth, allows for different types of approach.
    e.g. Two hundred unconnected people may during that period, beginning from the early prophecy stage through to the stating the obvious stage, each be speaking the truth, if having carried out due diligence they repeated it accurately, but that truth would be being received/perceived differently by the differing audiences during that time, who may themselves form different conceptual ideas about that truth.
    This led into the link between epistemic diligence (those technical words again) and truth. In the scenario described here, to properly exercise epistemic diligence, one would be required to take thorough and careful account of the context in which the truth was being communicated as well as the timing of the truth, amongst the other persons also communicating it, so as not to detract from the ‘same’ truth they proffered. Truth would then become nothing more than a political manoeuvre to maximise whatever ethical/moral/philosophical driver was considered most important. Ethics perhaps fine; Truth – a real difficulty because of all the disparate tensions becoming attached to it. People are likely to respond with ‘that’s humanity’, but especially in the sciences that response does not progress truth. This does not propose any exact precision for truth in most areas, and I suspect the blog article author’s focused analysis would not really be seeking that either.
