When their next album was released, I was parked outside the store before it opened. I had loved everything they had recorded previously, so there was no need to preview anything they put out in the future; I was certain I would love it just as much. I took the album home and listened through it once, and I thought it sucked. I was honestly disappointed. Their sound had changed: they were using different chords, different distortion, an entirely different direction from all of their old stuff. But I had already decided they were my favorite band.
So I played it again. And I played it again. I listened to the entire album all the way through probably 15 times until I finally did like it. In essence, I forced myself to like it by ignoring the possibility that it might just not be that good. I eventually stopped listening to that album as much and reverted to their old stuff that I still loved.
Now, years later, I still insist that they are an amazing band. I own every song they've ever released (and some they haven't). But I no longer think they are infallible. I now know that they're just a band, and some of their stuff is better than the rest. I also think a lot of other, newer bands are just as good or even better.
This is just a small example of a common psychological phenomenon: we tend to make a decision quickly, based on limited information, and then mold whatever information we receive afterward to back up that initial decision (for studies of this phenomenon, see Perkins, Farady, & Bushey, 1991; Pyszczynski & Greenberg, 1987). For example, in an election, people usually decide very early on which candidate they prefer, and then any debate the candidate engages in, or any decision he or she makes, will only confirm that preference, no matter how much the voter might have disagreed with it beforehand.
This pattern can be seen in the LDS Church as well. The accompanying phrase is "milk before meat." If the missionaries approached an investigator's door to talk about the law of tithing or the eternal doctrine of plural wives, things probably wouldn't go that well. Instead, they talk about a new prophet and a new book of scripture containing God's will for mankind. So an investigator quickly decides whether he or she likes Joseph Smith and the Church, and whatever information follows is usually conformed to that initial decision. It doesn't matter how he or she felt about plurality of wives before; he or she has already decided that Smith was God's instrument, so it must be okay.
Now, please understand that I do not claim to be immune to this tendency. Some readers will argue that I decided long ago that the Church isn't true and then read anything I could find to back up that decision. Some will argue that this blog exists only to confirm to myself, over and over, that my decision was correct. I cannot claim that nothing of the sort is going on at some level, but I have done my best to be conscious of my biases. My suggestion to those who disagree with me is to consider the possibility that you, too, are not immune to this tendency.
To overcome this tendency, we must all be conscious that it exists and do our best to remain unbiased until we have enough evidence to decide responsibly. I do not suggest that emotion, or the spirit, or gut reaction, or conscience (or whatever you choose to call it) should have no part in our decisions. What I do suggest is that we should not base such important, eternal decisions on those things alone, but see what our emotions, conscience, gut reactions, or the spirit tell us after we know more. If the Book of Mormon teaches great things about faith in Jesus Christ, then we know it is a great book about faith in Jesus Christ. But is that enough to also conclude it is an ancient record of the ancestors of the Native Americans?
Perkins, D. N., Farady, M., & Bushey, B. (1991). Everyday reasoning and the roots of intelligence. In J. F. Voss, D. N. Perkins & J. W. Segal (Eds.), Informal reasoning and education (pp. 83-105). Hillsdale, NJ: Erlbaum.
Pyszczynski, T., & Greenberg, J. (1987). Toward an integration of cognitive and motivational perspectives on social inference: A biased hypothesis-testing model. Advances in Experimental Social Psychology, 20, 297-340.