5-Star Ratings and Automatic Archiving for Content: Just Say No

Section 1: Content rating is different behind the firewall than it is on the Internet

Users of intranets and knowledge management repositories often ask for one-to-five-star content rating systems similar to the user ratings offered by sites such as Amazon.com: "Let's have content ratings like Amazon." That approach doesn't work well without scale. Only a small fraction of Amazon's customers ever rate anything, but because the customer base is so enormous, the absolute number of ratings is still large enough to be meaningful.

The scale of the Internet is far greater than that of most intranets. Inside a company, the percentage of people who might actually rate a document is just as tiny, but the population is much smaller, so the number of ratings given to any document in a repository may not be sufficient to yield useful results.

Typically, only two types of people actually give a one-to-five-star rating to a document: the author, who gives it five stars, and someone with an axe to grind, who gives it one star. Neither produces a useful rating.

If I'm asked to give a five-star rating, it's hard. I have to struggle with "Is this a three, a four, or a five?" so I won't do it. I think that's the wrong approach. It's better to ask a simple question such as "Were you able to reuse this document?" That's a yes-or-no question, and a much easier one to answer. It works like the Like button: you don't have to think hard about it. You either like something or you don't, and if you like it, you click Like. Rating something on a one-to-five scale forces deliberation, and that's a different dynamic.

I like this approach better: "Click here if you were able to reuse this document." That's an objective statement: you either were able to reuse it or you weren't. If a document's button is clicked often, that's probably a document you'll want to promote or have appear higher in search results. Try the following (a rough sketch of the button mechanics appears after the list):

  • Add an "I reused this document" or "I found this useful" button to all content, similar to a "Like" button but more specific. Encourage users to click it for content they were able to reuse.
  • Allow content to be tagged as "recommended," "good example," or "proven practice" by an authoritative source.
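
Here is a minimal sketch of how such a reuse button could be recorded on the back end, using an in-memory store with hypothetical document and user IDs; a real system would persist this and tie into the repository's authentication:

```python
from collections import defaultdict

class ReuseTracker:
    """Records "I reused this document" clicks, one per user per document."""

    def __init__(self):
        # Map of document id -> set of user ids who clicked the button.
        self._clicks = defaultdict(set)

    def record_reuse(self, doc_id, user_id):
        # Using a set makes clicks idempotent: repeat clicks don't inflate the count.
        self._clicks[doc_id].add(user_id)

    def reuse_count(self, doc_id):
        return len(self._clicks[doc_id])

tracker = ReuseTracker()
tracker.record_reuse("doc-123", "alice")
tracker.record_reuse("doc-123", "alice")  # duplicate click, ignored
tracker.record_reuse("doc-123", "bob")
print(tracker.reuse_count("doc-123"))  # prints 2
```

Counting distinct users rather than raw clicks keeps the signal honest: one enthusiastic user can't make a document look ten times more reusable than it is.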

Based on my experience and that of others, user ratings of content within an organization are not as successful as those on Internet sites such as Amazon.com. Several attempts to implement a ratings system for intranet pages and documents proved unsuccessful and were abandoned. Reasons for this include:

  • The value of a document is not known until after it has been downloaded and read. There is a lag between the time it is accessed and the time a rating can fairly be made, and the user may no longer be logged into the repository at the time the rating could be applied.
  • Within an enterprise, the identity of users is generally known. Thus, they may be unwilling to give a bad rating since they would become known to the document contributor and possibly suffer negative social consequences.
  • The number of people who might actually rate any given site or document is too low to be statistically significant.

In the internal ratings examples I know of, most documents had no rating. For the few that did have a rating, it was usually five stars. Thus, there was no real value.

In lieu of ratings, I suggest that three types of feedback on content be enabled (a sketch of the combined metadata follows the list):

  1. A tag that can be applied by the site owner to designate content as “Recommended” or “Good Example” or “Proven Practice.” Content tagged this way would be displayed with a graphic icon to distinguish it.
  2. A tag that can be applied by users similar to the “Like” function of Facebook to indicate that they found the content useful. Content tagged this way would be displayed with a notation “n people found this useful.”
  3. If the thank-you enhancement is implemented (see below), then any content which has been the subject of a thank-you message would be displayed with a notation “n thank-you messages.”
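
As an illustration, the three feedback types could live together on each piece of content; the field names here are illustrative, not taken from any particular platform:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContentFeedback:
    """Illustrative feedback metadata for one document or page."""
    owner_tag: Optional[str] = None  # "Recommended", "Good Example", or "Proven Practice"
    useful_clicks: int = 0           # users who clicked "I found this useful"
    thank_you_count: int = 0         # thank-you messages sent about this content

    def display_notations(self):
        notes = []
        if self.owner_tag:
            notes.append(self.owner_tag)  # shown with a distinguishing graphic icon
        if self.useful_clicks:
            notes.append(f"{self.useful_clicks} people found this useful")
        if self.thank_you_count:
            notes.append(f"{self.thank_you_count} thank-you messages")
        return notes

fb = ContentFeedback(owner_tag="Proven Practice", useful_clicks=12, thank_you_count=4)
print("; ".join(fb.display_notations()))
# Proven Practice; 12 people found this useful; 4 thank-you messages
```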

Search can be enhanced to allow ranking by these fields, or to restrict results to content having any or all of these tags. Saved searches can be used on sites to display only tagged content.
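
As a sketch of that search enhancement, assuming each search hit carries the feedback fields from the example above (the titles and counts here are made up):

```python
# Hypothetical search hits, each carrying the feedback fields sketched above.
hits = [
    {"title": "Proposal template", "owner_tag": "Proven Practice", "useful": 14, "thanks": 3},
    {"title": "Old status report", "owner_tag": None, "useful": 0, "thanks": 0},
    {"title": "Pricing model", "owner_tag": "Good Example", "useful": 6, "thanks": 1},
]

def restrict_to_tagged(results):
    """Keep only content carrying at least one of the three feedback signals."""
    return [h for h in results if h["owner_tag"] or h["useful"] or h["thanks"]]

def rank_by_feedback(results):
    """Order results: owner-endorsed content first, then by user signals."""
    return sorted(
        results,
        key=lambda h: (h["owner_tag"] is not None, h["useful"], h["thanks"]),
        reverse=True,
    )

for hit in rank_by_feedback(restrict_to_tagged(hits)):
    print(hit["title"])
```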

In document repositories, I suggest adding a button that makes it easy to thank the contributor of the document. Clicking on the button should open a short form that can be filled in with a message thanking the contributor of the document, including how and why the document was valuable to the user.

The system should keep track of the number of thank-you messages sent for each document and display this in the repository as a form of user rating. A ranked list of all documents and the number of thank-you messages sent for each would be useful.

The system should also keep track of the number of thank-you messages sent to each contributor, which can be displayed on their personal profile page and in a ranked list of all contributors.
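
The counting side is simple; here is a minimal sketch, assuming thank-you events are already captured as (document, contributor) pairs:

```python
from collections import Counter

# Hypothetical log of thank-you messages, one (doc_id, contributor) pair per message.
thank_yous = [
    ("doc-1", "maria"),
    ("doc-1", "maria"),
    ("doc-2", "raj"),
    ("doc-3", "maria"),
]

doc_counts = Counter(doc for doc, _ in thank_yous)
contributor_counts = Counter(author for _, author in thank_yous)

# Ranked lists of documents and contributors by thank-you messages received.
print(doc_counts.most_common())          # [('doc-1', 2), ('doc-2', 1), ('doc-3', 1)]
print(contributor_counts.most_common())  # [('maria', 3), ('raj', 1)]
```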

See also: "Implementing content rating behind the firewall," a presentation by Dave Thomas to the SIKM Leaders Community in April 2011.

Section 2: Don't automatically archive content; improve search instead


Knowledge repositories are often configured to automatically archive documents after some predetermined period of time. The intent is that after content has been available for 90 days (or whatever duration is chosen), it is no longer current and thus should be removed from the repository. The assumption is that this old content should not appear in search results or in lists of available documents. Reasons include:

  • Old documents are no longer relevant, accurate, or useful.
  • Searches yield too many results, so weeding out old documents will improve user satisfaction with search.
  • Content contributors should refresh documents periodically.

Contributed content does not automatically become obsolete after a fixed period of time. It may remain valuable indefinitely.

I offer the analogy that just because Peter Drucker died in 2005, we don't remove his books from the library. His insights will continue to be useful for a very long time.

One firm where I worked had an automatic archiving process. As a result, I would often receive messages from frustrated users who were searching for content that they had previously found in the repository but could no longer find. I would have to restore this content from the archive to the active repository. This caused users to be annoyed with the KM program, resulted in a lot of wasted time and effort, and sometimes delayed the retrieval of important information needed for client work.

In "Knowledge management and innovation," Steve Denning wrote:

The quality of knowledge does not depend on whether it is old or new but rather whether it is relevant, whether it still works. Whether it is old or new hardly matters. The question is: does it work? The dynamic of academia is different. Here the new is celebrated, whether it is useful or not. The old is looked down on, not because it isn't useful, but because the raison d'être of academia is to create the new, not the useful. Innovation in industry will often draw on lessons from the past, particularly those that have been forgotten, or those that can be put together in new combinations to achieve new results. The bottom line however is not whether the knowledge is new, but whether it works in practice.

With the cost of mass storage steadily decreasing, there are few good reasons to remove content from knowledge repositories unless it is known to be outdated, incorrect, or useless. Instead, allow search engines to limit results based on dates and other metadata to help users more easily find the content they need.

Don't automatically archive content in a knowledge repository, threaded discussion board, or other collection of knowledge. Instead, ensure that the search engine can limit results by the date of the knowledge object. Defaults can be set to limit results to the last 90 days, one year, or whatever duration is desired. But it should be easy for users to change the date range to include older content in the search results.
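
As a sketch of that default-window behavior (the document list and field names are hypothetical; the 90-day default is just the example above):

```python
from datetime import date, timedelta

# Hypothetical result set; "modified" is the knowledge object's date.
documents = [
    {"title": "Q3 account plan", "modified": date(2024, 8, 1)},
    {"title": "2019 lessons learned", "modified": date(2019, 5, 20)},
]

def search_by_date(docs, days_back=90, include_all=False):
    """Default to a recent window, but let the user widen it to everything."""
    if include_all:
        return docs  # older content is never archived away, so it stays findable
    cutoff = date.today() - timedelta(days=days_back)
    return [d for d in docs if d["modified"] >= cutoff]

recent = search_by_date(documents)                        # default: last 90 days
everything = search_by_date(documents, include_all=True)  # one change widens the range
```

The key design point is that narrowing happens at query time, as a user-adjustable filter, rather than at storage time as an irreversible archive step.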

Follow the recommendations from Section 1 above: enable content to be tagged as "recommended," "good example," or "proven practice" by an authoritative source, and by users with an "I reused this document" or "I found this useful" button. Then allow searching by date, tag attribute, most-liked, and so on.

Resources

  1. Content Management Process
  2. Archiving, Document Management, and Records Management
  3. Archiving Content
  4. Are you content with your content?
  5. Improving enterprise search results: Why don't you just tell me what you need?