Customer Satisfaction: Priorities for Improvement


Marketing expert Donald R. Bacon, Professor in the Daniels College of Business at the University of Denver and editor of the Journal of Marketing Education, has published a new article introducing a method for identifying which service attributes to improve first to achieve greater overall customer satisfaction.

“Understanding Priorities for Service Attribute Improvement” was published in the May 2012 issue of the Journal of Service Research.

Dr. Bacon writes in the executive summary:

In today’s increasingly competitive environment, service managers continually look for ways to improve their services. They generally know that some improvements will have more impact on customer satisfaction than others, but with limited budgets, identifying the specific improvements with the greatest impact is critical. This paper offers managers a new tool to understand which improvements to their services will give the biggest boost to customer satisfaction with the service.

The task of identifying the attributes with the greatest impact is particularly challenging because customers may not care that much about further improvement in some attributes. Thus, the attributes that were top priorities for improvement last year may not be top priorities this year because they have reached a point of diminishing returns. Several methods exist for understanding how customers value additional improvements in a service attribute, such as importance-performance analysis (IPA), regression analysis, and factor regression. However, these techniques have important limitations that are discussed in this article.

This paper introduces marginal utility analysis (MUA), a new method for understanding the relationship between an attribute and the overall customer evaluation of a service. The data necessary for the method can be collected within a typical customer satisfaction survey. With these data, the method estimates the shape of the relationship between each attribute and the overall evaluation of the service. Other techniques assume the same functional form for every attribute; MUA does not impose this constraint. Consequently, MUA provides a more accurate and detailed picture of each attribute’s relationship to the overall evaluation. Based on Item Response Theory and Rasch analysis, MUA is entirely different from factor and regression analyses and brings a new and richer perspective to understanding customers’ evaluation of service performance.
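To make the general idea concrete, here is a minimal toy sketch in Python. It is emphatically not the article’s IRT/Rasch-based MUA estimator; it only illustrates the underlying intuition that the attribute–satisfaction relationship can be traced level by level rather than assumed linear, so that diminishing returns become visible. All data, function names, and rating scales below are hypothetical.

```python
# Toy illustration (NOT the paper's IRT/Rasch-based MUA estimator):
# trace the shape of one attribute's relationship to overall
# satisfaction by averaging the overall rating at each attribute
# level, then take first differences as rough "marginal gains".

from collections import defaultdict

def attribute_curve(attr_ratings, overall_ratings):
    """Mean overall rating at each observed attribute level (e.g., 1-5 scale)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for a, o in zip(attr_ratings, overall_ratings):
        sums[a] += o
        counts[a] += 1
    return {level: sums[level] / counts[level] for level in sorted(sums)}

def marginal_gains(curve):
    """Change in mean overall rating for each one-step attribute improvement."""
    levels = sorted(curve)
    return {(lo, hi): curve[hi] - curve[lo]
            for lo, hi in zip(levels, levels[1:])}

# Hypothetical survey responses: satisfaction with "crowding" (1-5)
# paired with overall satisfaction (1-5). The gains shrink at higher
# levels, i.e., a point of diminishing returns.
crowding = [1, 2, 2, 3, 3, 4, 4, 5, 5]
overall  = [2, 4, 4, 5, 5, 5, 5, 5, 5]

curve = attribute_curve(crowding, overall)
gains = marginal_gains(curve)
# Moving crowding from level 1 to 2 lifts mean overall satisfaction by 2.0,
# while moving from 4 to 5 adds nothing further.
```

Repeating this per attribute and comparing the gain curves is what lets low-satisfaction and high-satisfaction priorities differ, as in the conference example below.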

MUA is demonstrated using customer satisfaction data collected by a medical professional association following its annual conference. The method displays the results graphically and provides managers with a helpful visual tool for identifying the attributes that are top priorities for improvement now, and those that may become top priorities in the future as conditions change. For example, at low levels of satisfaction, customers value reducing the crowding at the conference more than increasing the amount of continuing education credit available. At higher levels of satisfaction, the priorities are reversed: reducing the crowding further is valued less than increasing the educational opportunities. The method’s output is easy to interpret and should generate insightful conversations among the full management team.

The full article appears in the Journal of Service Research.

Business & Management INK

Business and Management INK puts the spotlight on research published in our more than 100 management and business journals. We feature an inside view of the research that’s being published in top-tier SAGE journals by the authors themselves.
