Opening the Black Box: How to Design Intelligent Decision Support Systems for Crowdsourcing

BibTeX

@InProceedings{rhyn2020opening,
  Title     = "Opening the Black Box: How to Design Intelligent Decision Support Systems for Crowdsourcing",
  Author    = "Marcel Rhyn and Niklas Leicht and Ivo Blohm and Jan Marco Leimeister",
  Booktitle = "Proceedings of the 15th International Conference on Wirtschaftsinformatik (WI 2020)",
  Volume    = "1",
  Doi       = "10.30844/wi_2020_a4-rhyn",
  Abstract  = "In crowdsourcing, reviewing and evaluating textual data is a latent challenge. While text mining and machine learning represent promising technologies to solve this problem, it is still unclear how information systems based on these technologies (i.e., intelligent decision support systems) should be designed. In this study, we address this gap and develop overarching design requirements, design principles, and design features for intelligent decision support systems in crowdsourcing. The study follows a design science research approach with a cross-industry research consortium comprising 8 organizations. Our results are based on 41 semi-structured interviews, 13 expert workshops with 53 participants, statistical analyses with data from 676 crowdsourcing projects, and 2 field tests. For research, we introduce transparency and control as two additional meta-requirements for intelligent decision support systems and capture seven guiding principles for designing such systems. For practitioners, we describe specific design features that show how to instantiate these principles.",
  Keywords  = "Crowdsourcing, Decision Support, Design Science Research",
}

Abstract

In crowdsourcing, reviewing and evaluating textual data is a latent challenge. While text mining and machine learning represent promising technologies to solve this problem, it is still unclear how information systems based on these technologies (i.e., intelligent decision support systems) should be designed. In this study, we address this gap and develop overarching design requirements, design principles, and design features for intelligent decision support systems in crowdsourcing. The study follows a design science research approach with a cross-industry research consortium comprising 8 organizations. Our results are based on 41 semi-structured interviews, 13 expert workshops with 53 participants, statistical analyses with data from 676 crowdsourcing projects, and 2 field tests. For research, we introduce transparency and control as two additional meta-requirements for intelligent decision support systems and capture seven guiding principles for designing such systems. For practitioners, we describe specific design features that show how to instantiate these principles.

Keywords

Crowdsourcing, Decision Support, Design Science Research
