Design Principles for Explainable Sales Win-Propensity Prediction Systems

BibTeX

Cite as text

@article{,
    Journal  = "Band-1",
    Title    = "Design Principles for Explainable Sales Win-Propensity Prediction Systems",
    Author   = "Tiemo Thiess, Oliver Müller, Lorenzo Tonelli",
    Doi      = "https://doi.org/10.30844/wi_2020_c8-thiess",
    Abstract = "MAN Energy Solutions, one of the largest ship engine manufacturers in the world, is looking into further improving its hit rate on through-life engineering services and spare parts quotations. We help to solve this relevant field problem by building a novel machine-learning-based sales win-propensity prediction system that utilizes the LightGBM algorithm, SHapley Additive exPlanations, and a second-layer conditional probability model of quotation age. Moreover, we build an implementation method for the broader class of such systems and extend the scientific literature on explainable machine learning by abductively developing and instantiating the design principles (DPs) of local contrastive explainability, global explainability, selective visualization, causality, confirmatory nudging, and accountability in a sales win-propensity system.",
    Keywords = "Machine Learning, Explainability, Sales, Maritime Industry, ADR",
}

Tiemo Thiess, Oliver Müller, Lorenzo Tonelli: Design Principles for Explainable Sales Win-Propensity Prediction Systems. Online: https://doi.org/10.30844/wi_2020_c8-thiess (Accessed 19.04.24)

Abstract

MAN Energy Solutions, one of the largest ship engine manufacturers in the world, is looking into further improving its hit rate on through-life engineering services and spare parts quotations. We help to solve this relevant field problem by building a novel machine-learning-based sales win-propensity prediction system that utilizes the LightGBM algorithm, SHapley Additive exPlanations, and a second-layer conditional probability model of quotation age. Moreover, we build an implementation method for the broader class of such systems and extend the scientific literature on explainable machine learning by abductively developing and instantiating the design principles (DPs) of local contrastive explainability, global explainability, selective visualization, causality, confirmatory nudging, and accountability in a sales win-propensity system.
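The abstract names the core pipeline: a LightGBM win-propensity classifier whose individual predictions are explained with SHAP values. The following is a minimal sketch of that combination, not the authors' implementation; the feature names (e.g. quotation_age_days) and the synthetic data are invented for illustration, and the paper's second-layer conditional probability model of quotation age is omitted.

# Minimal sketch (illustrative assumptions, not the paper's actual pipeline):
# train a LightGBM classifier on quotation outcomes and explain one prediction
# with SHAP feature contributions.
import numpy as np
import pandas as pd
import lightgbm as lgb
import shap

rng = np.random.default_rng(0)
n = 1000

# Hypothetical quotation features; a real system would use CRM/ERP fields.
X = pd.DataFrame({
    "quotation_value": rng.lognormal(mean=10, sigma=1, size=n),
    "discount_rate": rng.uniform(0.0, 0.3, size=n),
    "past_orders": rng.poisson(5, size=n),
    "quotation_age_days": rng.integers(1, 365, size=n),
})

# Synthetic win/loss labels: older quotations and deeper discounts win less often,
# customers with more past orders win more often. Real labels come from history.
logit = 0.3 * X["past_orders"] - 0.01 * X["quotation_age_days"] - 3.0 * X["discount_rate"]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

model = lgb.LGBMClassifier(n_estimators=200, learning_rate=0.05)
model.fit(X, y)

# Local explanation for one open quotation: per-feature SHAP contributions to the
# predicted win propensity (the kind of output a local-explainability DP calls for).
explainer = shap.TreeExplainer(model)
row = X.iloc[[0]]
shap_values = explainer.shap_values(row)
shap_values = shap_values[1] if isinstance(shap_values, list) else shap_values  # older SHAP returns a per-class list

print("predicted win propensity:", round(float(model.predict_proba(row)[0, 1]), 3))
for name, contribution in zip(X.columns, np.ravel(shap_values)[: len(X.columns)]):
    print(f"{name}: {contribution:+.3f}")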

Keywords

Machine Learning, Explainability, Sales, Maritime Industry, ADR

