Introduction
1. Ethiopia has the second-highest GDP in East Africa, behind Kenya, according to the World Economic Outlook Database (2019), but it ranks in the lowest quartile of the 2020 Human Development Index (173 out of 189 countries). See the June 23, 2021, New York Times article on the country’s most recent food crisis, “Famine Hits 350,000 in Ethiopia, Worst Hit Country in a Decade.”
2. Sara’s digital journey toward becoming a successful CEO challenges the status quo in more ways than one—across all industries, the percentage of business leaders who are Black women is still very small. Take, for example, the CEOs of the businesses on the 2021 Fortune 500 list: forty-one are women (8.2 percent), and two of those women are Black (less than 1 percent of the five hundred CEOs).
3. P. Weill, T. Apel, S. L. Woerner, and J. S. Banner, “It Pays to Have a Digitally Savvy Board,” MIT Sloan Management Review 60, no. 3 (2019): 41–45.
4. There are so many examples of companies not just encouraging employees to develop digital competencies but requiring them to do so. Later in the book, we’ll give examples of how some big tech companies like Google, Amazon, and Atos approach this issue. Jonathan Vanian’s “How Amazon Is Tackling the A.I. Talent Crunch,” Fortune, June 1, 2021, describes Amazon’s approach in detail: https://fortune.com/2021/06/01/how-amazon-is-tackling-the-a-i-talent-crunch/. But non-technical companies are making digital skills a requirement, too. J. P. Morgan recently began to require that all new staff take coding lessons. (See https://www.ft.com/content/4c17d6ce-c8b2-11e8-ba8f-ee390057b8c9.) Examples like this are becoming too numerous to list.
5. Quoted from John Chambers’s keynote at Cisco Live 2015. An edited transcript of his entire speech can be found here: https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/ciscos-john-chambers-on-the-digital-era.
6. Many recent surveys support this claim. A recent PwC report (https://www.pwc.com/gx/en/issues/upskilling/everyone-digital-world.html), discussing surveys of tech executives and their employees, summarizes the findings this way:
It’s a new world that needs new skills. To many, that is an exciting prospect because it speaks to progress. Most of the CEOs and leaders we talk to agree in principle. But they also tell us that they are not ready. The sheer speed, scope and impact of technological change are challenging their businesses—and society at large—in fundamental ways. At the World Economic Forum in Davos, where we met with more than 150 business leaders last January, just about every conversation ended with the same question: how are we going to prepare our people?
7. For a detailed discussion of the many ways in which technologies like phones generate new data through novel forms of computation, see Youngjin Yoo, Ola Henfridsson, and Kalle Lyytinen, “The New Organizing Logic of Digital Innovation: An Agenda for Information Systems Research,” Information Systems Research 21, no. 4 (2010): 724–735.
8. C. S. Dweck, Mindset: The New Psychology of Success (New York: Random House, 2008).
9. We suggest that the approach J. P. Morgan took (see note 4 above) is narrowly focused on skills rather than on helping employees develop a broader digital mindset that orients them to the world in new ways. Our research shows that skills are important, but not at the expense of, or in lieu of, a change in mindset.
10. D. Crystal, English as a Global Language, 2nd ed. (Cambridge: Cambridge University Press, 2003); T. Neeley, “Global Business Speaks English: Why You Need a Language Strategy Now,” Harvard Business Review, May 2012, 116–124.
11. Over the past decade we have published many studies of digital transformation and technology use at work through which we have developed and tested these ideas. Some of those studies (conducted with many talented coauthors) include: P. M. Leonardi, D. Woo, and W. C. Barley, “Why Should I Trust Your Model? How to Successfully Enroll Digital Models for Innovation,” Innovation: Organization & Management (February 2021); T. Neeley, Remote Work Revolution: Succeeding from Anywhere (New York: HarperCollins, 2021); P. M. Leonardi, W. C. Barley, and D. Woo, “On the Making of Crystal Balls: Five Lessons About Simulation Modeling and the Organization of Work,” Information & Organization 31 (2021); P. M. Leonardi, “COVID-19 and the New Technologies of Organizing: Digital Exhaust, Digital Footprints, and Artificial Intelligence in the Wake of Remote Work,” Journal of Management Studies 51 (2021), 247–251; P. M. Leonardi and J. W. Treem, “Behavioral Visibility: A New Paradigm for Organization Studies in the Age of Digitization, Digitalization, and Datafication,” Organization Studies 41, no. 12 (2020): 1601–1625; T. B. Neeley and B. S. Reiche, “How Global Leaders Gain Power through Downward Deference and Reduction of Social Distance,” Academy of Management Journal (2020); P. M. Leonardi, D. E. Bailey, and C. S. Pierce, “The Co-Evolution of Objects and Boundaries Over Time: Materiality, Affordances, and Boundary Salience,” Information Systems Research 30, no. 2 (2019): 665–686; B. S. Reiche and T. B. Neeley, “Head, Heart, or Hands: How Do Employees Respond to a Radical Global Language Change over Time?” Organization Science 30, no. 6 (2019): 1252–1269; I. C. Cristea and P. M. Leonardi, “Get Noticed and Die Trying: Signals, Sacrifice, and the Production of Face Time in Distributed Work,” Organization Science 30, no. 3 (2019): 552–572; P. M. 
Leonardi, “Social Media and the Development of Shared Cognition: The Roles of Network Expansion, Content Integration, and Triggered Recalling,” Organization Science 29, no. 4 (2018): 547–568; T. B. Neeley and P. M. Leonardi, “Enacting Knowledge Strategy through Social Media: Passable Trust and the Paradox of Non-Work Interactions,” Strategic Management Journal 39, no. 3 (2018): 922–946; T. Neeley, The Language of Global Success (Princeton, NJ: Princeton University Press, 2017); P. M. Leonardi and D. E. Bailey, “Recognizing and Selling Good Ideas: Network Articulation and the Making of an Offshore Innovation Hub,” Academy of Management Discoveries 3, no. 2 (2017): 116–144; P. M. Leonardi, “The Social Media Revolution: Sharing and Learning in the Age of Leaky Knowledge,” Information and Organization 27, no. 1 (2017): 47–59; T. B. Neeley and T. L. Dumas, “Unearned Status Gain: Evidence from a Global Language Mandate,” Academy of Management Journal 59, no. 1 (2016): 14–43; T. Neeley, “Global Teams That Work,” Harvard Business Review, October 2015, 74–81; P. M. Leonardi, “Ambient Awareness and Knowledge Acquisition: Using Social Media to Learn ‘Who Knows What’ and ‘Who Knows Whom,’ ” MIS Quarterly 39, no. 4 (2015): 747–762; P. M. Leonardi, “Materializing Strategy: The Blurry Line Between Strategy Formulation and Strategy Implementation,” British Journal of Management 26 (2015): 17–21; P. J. Hinds, T. B. Neeley, and C. D. Cramton, “Language as a Lightning Rod: Power Contests, Emotion Regulation, and Subgroup Dynamics in Global Teams,” Journal of International Business Studies 45, no. 5 (2014): 536–561; P. M. Leonardi, “Social Media, Knowledge Sharing, and Innovation: Toward a Theory of Communication Visibility,” Information Systems Research 25, no. 4 (2014): 796–816; T. B. Neeley, “Language Matters: Status Loss and Achieved Status Distinctions in Global Organizations,” Organization Science 24, no. 2 (2013): 476–497; P. M. Leonardi and C. 
Rodriguez-Lluesma, “Occupational Stereotypes, Perceived Status Differences, and Intercultural Communication in Global Organizations,” Communication Monographs 80, no. 4 (2013): 478–502; P. M. Leonardi, “When Does Technology Use Enable Network Change in Organizations? A Comparative Study of Feature Use and Shared Affordances,” MIS Quarterly 37, no. 3 (2013): 749–775; D. E. Bailey, P. M. Leonardi, and S. R. Barley, “The Lure of the Virtual,” Organization Science 23, no. 5 (2012): 1485–1504; W. C. Barley, P. M. Leonardi, and D. E. Bailey, “Engineering Objects for Collaboration: Strategies of Ambiguity and Clarity at Knowledge Boundaries,” Human Communication Research 38, no. 3 (2012): 280–308; M. Mortensen and T. B. Neeley, “Reflected Knowledge and Trust in Global Collaboration,” Management Science 58, no. 12 (2012): 2207–2224; P. M. Leonardi and J. W. Treem, “Knowledge Management Technology as a Stage for Strategic Self-Presentation: Implications for Knowledge Sharing in Organizations,” Information and Organization 22, no. 1 (2012): 37–59; P. M. Leonardi, T. B. Neeley, and E. M. Gerber, “How Managers Use Multiple Media: Discrepant Events, Power, and Timing in Redundant Communication,” Organization Science 23, no. 1 (2012): 98–117; P. M. Leonardi, “Innovation Blindness: Culture, Frames, and Cross-Boundary Problem Construction in the Development of New Technology Concepts,” Organization Science 22, no. 2 (2011): 347–369; D. E. Bailey, P. M. Leonardi, and J. Chong, “Minding the Gaps: Technology Interdependence and Coordination in Knowledge Work,” Organization Science 21, no. 3 (2010): 713–730; P. M. Leonardi, J. W. Treem, and M. H. Jackson, “The Connectivity Paradox: Using Technology to Both Decrease and Increase Perceptions of Distance in Distributed Work Arrangements,” Journal of Applied Communication Research 38, no. 1 (2010): 85–105.
12. Such anonymity is customary when presenting data in academic research. Because we did many of these studies before we knew we were going to write this book, we didn’t always ask people for permission to use their real names or the names of their organizations. Where we could track them down, we did ask them. In other cases, we preserve the anonymity we promised them by using pseudonyms.
13. Tacit knowledge is like “feel.” The canonical example of tacit knowledge is riding a bike: you can feel how to do it, but it’s difficult to write down a set of rules for how to do it. For an excellent treatise on tacit knowledge and its many forms, see H. Collins, Tacit and Explicit Knowledge (Chicago: University of Chicago Press, 2010).
14. Thanks to Ian Buckley, “What Is Coding and How Does It Work?” MUO, June 3, 2021 (https://www.makeuseof.com/tag/what-is-coding/) for providing the example of Python code that we use here.
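Buckley’s article walks through beginner-level Python. The snippet below is a stand-in of that sort, our own illustration rather than the example reproduced in the book:

```python
# An illustrative beginner-level Python snippet of the kind Buckley's
# article uses (a stand-in; not the book's actual example).
def greet(name):
    """Build a greeting string for the given name."""
    return "Hello, " + name + "!"

print(greet("world"))  # prints "Hello, world!"
```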
15. If you’re interested in learning more about different language paradigms, or simply in what various languages are good for, check out Robert Diana’s useful list, “The Big List of 256 Programming Languages,” DZone, May 16, 2013: https://dzone.com/articles/big-list-256-programming.
Chapter 1
1. For anthropological studies of how algorithms have changed trading see D. MacKenzie, An Engine, Not a Camera: How Financial Models Shape Markets (Cambridge, MA: MIT Press, 2008); D. Beunza and D. Stark, “Tools of the Trade: The Socio-Technology of Arbitrage in a Wall Street Trading Room,” Industrial and Corporate Change 13, no. 2 (2004): 369–400; D. Beunza, Taking the Floor (Princeton, NJ: Princeton University Press, 2019).
2. For a deeper discussion of the difference between tools and platforms and how they make a difference for the kind of communication we can have in the workplace, see P. Leonardi, “Picking the Right Approach to Digital Collaboration,” MIT Sloan Management Review 62, no. 2 (2021): 1–7.
3. There are, of course, important differences in the kind of technology underlying chatbots and advanced conversational agents. Much of the difference has to do with the type of learning algorithm upon which they are based. In Russell and Norvig’s widely cited typology (S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach [New York: Pearson, 2002]), chatbots can range from simple reflex agents, which act only on the basis of the current percept and ignore the rest of the percept history, to learning agents that can operate in unknown environments by learning from past performance and applying that learning as they take in new percepts and decide on actions.
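A simple reflex agent can be sketched in a few lines. The toy vacuum world below is Russell and Norvig’s canonical teaching example, but the specific rule table is our own illustrative assumption, not code from their text:

```python
# Sketch of a simple reflex agent in a toy two-square vacuum world
# (illustrative; the rule table is an assumption, not Russell and
# Norvig's own code). The agent acts only on the current percept,
# a (location, status) pair, and keeps no percept history.
RULES = {
    ("A", "dirty"): "suck",
    ("B", "dirty"): "suck",
    ("A", "clean"): "move_right",
    ("B", "clean"): "move_left",
}

def simple_reflex_agent(percept):
    """Map the current percept directly to an action via fixed rules."""
    return RULES[percept]

print(simple_reflex_agent(("A", "dirty")))  # prints "suck"
print(simple_reflex_agent(("A", "clean")))  # prints "move_right"
```

A learning agent, by contrast, would update its rules (or a learned policy) from feedback over time rather than consulting a fixed table.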
4. See, for example, research on the “Computers as Social Actors” (CASA) paradigm, including C. Nass, J. Steuer, and E. R. Tauber, “Computers Are Social Actors,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (April 1994), 72–78; B. Reeves and C. Nass, The Media Equation: How People Treat Computers, Television, and New Media Like Real People (Cambridge: Cambridge University Press, 1996); L. Gong, “How Social Is Social Responses to Computers? The Function of the Degree of Anthropomorphism in Computer Representations,” Computers in Human Behavior 24, no. 4 (2008): 1494–1509; S. S. Sundar, “Loyalty to Computer Terminals: Is It Anthropomorphism or Consistency?” Behaviour & Information Technology 23, no. 2 (2004): 107–118.
5. See J. McCarthy, M. L. Minsky, N. Rochester, and C. E. Shannon, “A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence, August 31, 1955,” AI Magazine 27, no. 4 (2006): 12–15.
6. N. Bostrom, “How Long Before Superintelligence?” International Journal of Futures Studies 2 (1998).
7. For a classic and thorough text on the different functions of AI and their links to human cognition, see S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach (New York: Pearson, 2002). For debates about how AI and human cognition are likely to be similar or diverge, see H. Dreyfus, S. E. Dreyfus, and T. Athanasiou, Mind over Machine (New York: Simon and Schuster, 2000); H. A. Simon, The Sciences of the Artificial (Cambridge, MA: MIT Press, 2019).
8. We borrow and build on this great list by Tim Urban, which is published at https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html. We present only a summary of the great ideas Urban uses; this link provides more detailed examples.
9. There are many recent examples of how AI is changing various industries. We’ve picked a few good summary articles here. For finance, see D. Belanche, L. V. Casaló, and C. Flavián, “Artificial Intelligence in FinTech: Understanding Robo-Advisors Adoption among Customers,” Industrial Management & Data Systems 119, no. 7 (2019): 1411–1430; and for an early deep discussion see R. R. Trippi and E. Turban, eds., Neural Networks in Finance and Investing: Using Artificial Intelligence to Improve Real World Performance (New York: McGraw-Hill, 1992). For insurance see S. Hall, “How Artificial Intelligence Is Changing the Insurance Industry,” The Center for Insurance Policy & Research 22 (2017): 1–8. And for medicine see P. Szolovits, ed., Artificial Intelligence in Medicine (New York: Routledge, 2019).
10. For a great article in which she breaks down this example see https://marilynika.medium.com/an-intro-to-ai-ml-and-deep-learning-ffd2f2fbf1e.
11. And this computing power is growing quickly. The research lab OpenAI found that the amount of computing power used to train the largest AI models has doubled every 3.4 months since 2012, far outpacing Moore’s law, under which compute doubles roughly every two years. In other words, the resources used to train models are now doubling about seven times faster than before. See https://www.technologyreview.com/2019/11/11/132004/the-computing-power-needed-to-train-ai-is-now-rising-seven-times-faster-than-ever-before/.
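The “seven times faster” figure follows directly from the two doubling periods; a quick back-of-the-envelope check (using a standard two-year Moore’s-law period as the baseline assumption):

```python
# Doubling every 3.4 months (OpenAI's measured AI-compute trend since
# 2012) vs. a Moore's-law baseline of doubling roughly every 24 months.
moore_doubling_months = 24    # assumed classic Moore's-law period
ai_doubling_months = 3.4      # OpenAI's measured doubling period

# Ratio of the two doubling rates: roughly seven times faster.
speedup = moore_doubling_months / ai_doubling_months
assert round(speedup) == 7

# Growth over one year (12 months) under each trend.
ai_growth = 2 ** (12 / ai_doubling_months)        # about 11.6x per year
moore_growth = 2 ** (12 / moore_doubling_months)  # about 1.4x per year
```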
12. For a much more detailed discussion than is presented here, see Robins’s full description: www.intel.com/content/www/us/en/artificial-intelligence/posts/difference-between-ai-machine-learning-deep-learning.html.
13. A. Djouadi and E. Bouktache, “A Fast Algorithm for the Nearest-Neighbor Classifier,” IEEE Transactions on Pattern Analysis and Machine Intelligence 19, no. 3 (1997): 277–282; S. Suresh, K. Dong, and H. J. Kim, “A Sequential Learning Algorithm for Self-Adaptive Resource Allocation Network Classifier,” Neurocomputing 73, nos. 16–18 (2010): 3012–3019.
14. As we’ll discuss later, this accuracy assumes appropriate training data. For a great discussion of how bias seeps into computer vision and facial recognition software see J. Lunter, “Beating the Bias in Facial Recognition Technology,” Biometric Technology Today 2020, no. 9 (2020): 5–7.
15. To learn more about computer vision and its applications, see these two great resources: R. Szeliski, Computer Vision: Algorithms and Applications (Berlin: Springer Science & Business Media, 2010); and C. Szegedy, V. Vanhoucke, S. Ioffe, J. Shlens, and Z. Wojna, “Rethinking the Inception Architecture for Computer Vision,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2016), 2818–2826. The first provides an overview aimed at a broader audience (though it is somewhat technical); the second offers a state-of-the-art review of technical innovations in computer vision.
16. For an excellent primer on NLP and a discussion of its benefits and drawbacks, see G. G. Chowdhury, “Natural Language Processing,” Annual Review of Information Science and Technology 37, no. 1 (2003): 51–89.
17. For a great discussion by Yelp software engineer C. Wei-Hong, see https://engineeringblog.yelp.com/2015/10/how-we-use-deep-learning-to-classify-business-photos-at-yelp.html.
18. If you are interested in a deeper comparison of what these different approaches to learning can and cannot do, see R. Sathya and A. Abraham, “Comparison of Supervised and Unsupervised Learning Algorithms for Pattern Classification,” International Journal of Advanced Research in Artificial Intelligence 2, no. 2 (2013): 34–38; and B. Mahesh, “Machine Learning Algorithms—A Review,” International Journal of Science and Research (IJSR) 9 (2020): 381–386.
19. J. G. Nam, S. Park, E. J. Hwang, J. H. Lee, K. N. Jin, K. Y. Lim, and C. M. Park, “Development and Validation of Deep Learning–Based Automatic Detection Algorithm for Malignant Pulmonary Nodules on Chest Radiographs,” Radiology 290, no. 1 (2019): 218–228.
20. Y. Liu, T. Kohlberger, M. Norouzi, G. E. Dahl, J. L. Smith, A. Mohtashamian, and M. C. Stumpe, “Artificial Intelligence–Based Breast Cancer Nodal Metastasis Detection: Insights into the Black Box for Pathologists,” Archives of Pathology & Laboratory Medicine 143, no. 7 (2019): 859–868.
21. A very recent study by Sarah Lebovitz and colleagues describes the major changes that radiologists had to make to their work as they determined whether to incorporate AI judgments into their practice. This is a great example of how some radiologists can develop a digital mindset while others cannot. See S. Lebovitz, N. Levina, and H. Lifshitz-Assaf, “To Incorporate or Not to Incorporate AI for Critical Judgments: How Professionals Deal with Opacity When Using AI for Medical Diagnosis,” Organization Science (forthcoming).
22. The statistics around data production are truly mind-boggling. For a regularly updated list of what the world’s current data landscape looks like, see https://techjury.net/blog/big-data-statistics/#gref.
23. For a great discussion of IBM’s latest moves in this area see https://www.wraltechwire.com/2021/07/06/quantum-computing-wars-ibm-honeywell-make-major-moves-as-competition-intensifies/.
24. For an early discussion that lays the groundwork and provides an accessible overview of conversational user interfaces see M. F. McTear, “Spoken Dialogue Technology: Enabling the Conversational User Interface,” ACM Computing Surveys (CSUR) 34, no. 1 (2002): 90–169. For a more contemporary discussion of current issues and applications see P. Lauren and P. Watta, “A Conversational User Interface for Stock Analysis,” in 2019 IEEE International Conference on Big Data (Big Data) (December 2019): 5298–5305; S. Holmes, A. Moorhead, R. Bond, H. Zheng, V. Coates, and M. McTear, “Usability Testing of a Healthcare Chatbot: Can We Use Conventional Methods to Assess Conversational User Interfaces?” in Proceedings of the 31st European Conference on Cognitive Ergonomics (September 2019): 207–214.
25. For an interesting discussion of Julie’s evolution, see https://content.verint.com/LP=5353.
26. C. Nass and Y. Moon, “Machines and Mindlessness: Social Responses to Computers,” Journal of Social Issues 56, no. 1 (2000): 81–103; C. Nass, B. J. Fogg, and Y. Moon, “Can Computers Be Teammates?” International Journal of Human-Computer Studies 45, no. 6 (1996): 669–678; C. Nass, Y. Moon, B. J. Fogg, B. Reeves, and D. C. Dryer, “Can Computer Personalities Be Human Personalities?” International Journal of Human-Computer Studies 43, no. 2 (1995): 223–239.
27. See also a great study by Kim and Sundar that discusses how inattention is related to anthropomorphism and the consequences that arise from it: Y. Kim and S. S. Sundar, “Anthropomorphism of Computers: Is It Mindful or Mindless?” Computers in Human Behavior 28, no. 1 (2012): 241–250.
28. Trust is, of course, a complicated construct. We like the definition offered by an interdisciplinary team of management researchers: “Trust is a psychological state comprising the intention to accept vulnerability based upon positive expectations of the intentions or behavior of another” (395): D. M. Rousseau, S. B. Sitkin, R. S. Burt, and C. Camerer, “Not So Different After All: A Cross-Discipline View of Trust,” Academy of Management Review 23, no. 3 (1998): 393–404.
29. J. M. Logg, J. A. Minson, and D. A. Moore, “Algorithm Appreciation: People Prefer Algorithmic to Human Judgment,” Organizational Behavior and Human Decision Processes 151 (2019): 90–103.
30. D. T. Newman, N. J. Fast, and D. J. Harmon, “When Eliminating Bias Isn’t Fair: Algorithmic Reductionism and Procedural Justice in Human Resource Decisions,” Organizational Behavior and Human Decision Processes 160 (2020): 149–167.
31. The research also suggests that these points apply to people’s judgment of algorithmic or AI-based work monitoring. A recent study found that human-free tracking feels less judgmental and therefore allows a greater subjective sense of autonomy. Five experiments conducted by the authors supported these predictions: participants were more likely to accept technology-operated than human-operated tracking, an effect driven by reduced concern about potential negative judgment, which in turn increased their subjective sense of autonomy. These results varied by the kinds of behaviors people believed were being tracked. See R. Raveendhran and N. J. Fast, “Humans Judge, Algorithms Nudge: The Psychology of Behavior Tracking Acceptance,” Organizational Behavior and Human Decision Processes 164 (2021): 11–26.
32. E. Glikson and A. W. Woolley, “Human Trust in Artificial Intelligence: Review of Empirical Research,” Academy of Management Annals 14, no. 2 (2020): 627–660.
33. M. J. Barnes, J. Y. Chen, and S. Hill, Humans and Autonomy: Implications of Shared Decision-Making for Military Operations (Aberdeen Proving Ground, MD: US Army Research Laboratory, 2017); A. R. Selkowitz, S. G. Lakhmani, and J. Y. Chen, “Using Agent Transparency to Support Situation Awareness of the Autonomous Squad Member,” Cognitive Systems Research 46 (2017): 13–25.
34. This is not quite as simple as it sounds. There have been numerous discussions in the fields of informatics and science and technology studies that debate about the antecedents of “transparency” versus “opacity.” For two good introductions to this discussion see J. Burrell, “How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms,” Big Data & Society 3, no. 1 (2016): 2053951715622512; P. Dourish, “Algorithms and Their Others: Algorithmic Culture in Context,” Big Data & Society 3, no. 2 (2016): 2053951716665128.
35. Although most researchers and scientists agree that transparency is important in AI-based tools, the degree to which AI algorithms should be transparent or how they could be transparent is hotly debated. For some good summaries of these debates from very different perspectives see A. Rai, “Explainable AI: From Black Box to Glass Box,” Journal of the Academy of Marketing Science 48, no. 1 (2020): 137–141; D. Castelvecchi, “Can We Open the Black Box of AI?” Nature News 538, no. 7623 (2016): 20; A. Adadi and M. Berrada, “Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI),” IEEE Access 6 (2018): 52138–52160.
Chapter 2
1. C. D. Cramton, “The Mutual Knowledge Problem and Its Consequences for Dispersed Collaboration,” Organization Science 12, no. 3 (2001): 346–371.
2. E. M. Eisenberg, “Ambiguity as Strategy in Organizational Communication,” Communication Monographs 51, no. 3 (1984): 227–242; E. M. Eisenberg, Strategic Ambiguities: Essays on Communication, Organization, and Identity (Thousand Oaks, CA: Sage, 2006).
3. In a recent study, we found that individuals who worked remotely were much more likely to receive higher quality projects and better promotions when they caught the attention of those working at headquarters through ambiguous communication. Doing so allowed them to rise to the top of the information deluge that onsite managers experienced. See I. C. Cristea and P. M. Leonardi, “Get Noticed and Die Trying: Signals, Sacrifice, and the Production of Face Time in Distributed Work,” Organization Science 30, no. 3 (2019): 552–572.
4. For good reviews of this research see S. Kumar, Y. Tan, and L. Wei, “When to Play Your Advertisement? Optimal Insertion Policy of Behavioral Advertisement,” Information Systems Research 31, no. 2 (2020): 589–606; Y. Li, K. W. Wan, X. Yan, and C. Xu, “Real Time Advertisement Insertion in Baseball Video Based on Advertisement Effect,” in Proceedings of the 13th Annual ACM International Conference on Multimedia (November 2005): 343–346.
5. See this New York Times story on the subject: https://bits.blogs.nytimes.com/2011/12/19/back-at-work-and-catching-up-on-online-shopping/.
6. For a review of this trend see P. M. Leonardi, “COVID-19 and the New Technologies of Organizing: Digital Exhaust, Digital Footprints, and Artificial Intelligence in the Wake of Remote Work,” Journal of Management Studies (2020).
7. “How AI Demands Organizational Change,” MIT Panel, January 4, 2021.
8. Tsedal B. Neeley and Paul M. Leonardi, “Enacting Knowledge Strategy through Social Media: Passable Trust and the Paradox of Nonwork Interactions,” Strategic Management Journal 39, no. 3 (2018): 922–946.
9. Metaknowledge is knowledge of “who knows what” and “who knows whom.” See Y. Ren and L. Argote, “Transactive Memory Systems 1985–2010: An Integrated Framework of Key Dimensions, Antecedents, and Consequences,” Academy of Management Annals 5, no. 1 (2011): 189–229. Research shows that when an individual’s metaknowledge is distinguished by both correctness (he or she can correctly identify what and whom a coworker knows) and breadth (he or she can make such a correct identification not just of a few coworkers but of many), metaknowledge is linked to team performance on routine tasks (Y. Ren, K. M. Carley, and L. Argote, “The Contingent Effects of Transactive Memory: When Is It More Beneficial to Know What Others Know?” Management Science 52, no. 5 [2006]: 671–682); to people’s ability to recombine existing ideas into new innovations (A. Majchrzak, L. P. Cooper, and O. E. Neece, “Knowledge Reuse for Innovation,” Management Science 50, no. 2 [2004]: 174–188); to reduced duplication of work across the organization (P. Jackson and J. Klobas, “Building Knowledge in Projects: A Practical Application of Social Constructivism to Information Systems Development,” International Journal of Project Management 26, no. 4 [2008]: 329–337); and to many other positive outcomes.
10. This kind of borrowing is one of the key ways that innovation (especially in product and process) happens. Some researchers call it “brokering,” but the main idea is that ideas generated in one place and time can be useful in different contexts. Thus recombination is key. See A. B. Hargadon, “Brokering Knowledge: Linking Learning and Innovation,” Research in Organizational Behavior 24 (2002): 41–85; R. S. Burt, “Structural Holes and Good Ideas,” American Journal of Sociology 110, no. 2 (2004): 349–399.
11. We’ve discussed this point in more detail elsewhere. For a discussion about how important (but how hard it can be) to switch from reactive to proactive learning via social tools, see P. M. Leonardi, “Social Media, Knowledge Sharing, and Innovation: Toward a Theory of Communication Visibility,” Information Systems Research 25, no. 4 (2014): 796–816.
12. C. Thompson, “Brave New World of Digital Intimacy,” New York Times Magazine, September 5, 2008, 7.
13. This is what Pew Research says on this subject: https://www.pewresearch.org/internet/2016/11/11/social-media-update-2016/.
14. This insight builds on a long line of research about the importance of “informal” social relationships at work. In addition to the kind of lubrication for knowledge sharing we discuss here, there is ample evidence that people rely on informal relations and “friends” to find meaning in their work and to increase their productivity. And friendship correlates strongly with retention. See J. R. Lincoln and J. Miller, “Work and Friendship Ties in Organizations: A Comparative Analysis of Relation Networks,” Administrative Science Quarterly 24, no. 2 (1979): 181–199; G. A. Ballinger, R. Cross, and B. C. Holtom, “The Right Friends in the Right Places: Understanding Network Structure as a Predictor of Voluntary Turnover,” Journal of Applied Psychology 101, no. 4 (2016): 535.
15. For a specific analysis of the benefits of non-work interaction on work-related project effectiveness, see P. M. Leonardi and S. R. Meyer, “Social Media as Social Lubricant: How Ambient Awareness Eases Knowledge Transfer,” American Behavioral Scientist 59, no. 1 (2015): 10–34; and T. B. Neeley and P. M. Leonardi, “Enacting Knowledge Strategy through Social Media: Passable Trust and the Paradox of Nonwork Interactions,” Strategic Management Journal 39, no. 3 (2018): 922–946.
16. Hinds and Bailey offer a great research-based account of this problem: P. J. Hinds and D. E. Bailey, “Out of Sight, Out of Sync: Understanding Conflict in Distributed Teams,” Organization Science 14, no. 6 (2003): 615–632.
Chapter 3
1. M. Lewis, “The No-Stats All-Star,” New York Times, February 13, 2009.
2. Daniel Kahneman won the Nobel Prize in economics for this work with Amos Tversky in which they showed that people in general (not just high school students) are notoriously bad at dealing with probabilities—so bad in fact that it can skew their decision-making in profound ways. See, for example, Daniel Kahneman and Amos Tversky, “Prospect Theory: An Analysis of Decision under Risk,” Econometrica 47, no. 2 (1979): 263–292.
3. For an excellent discussion of the production of data as a social object, see C. Alaimo and J. Kallinikos, “Managing by Data: Algorithmic Categories and Organizing,” Organization Studies (2020): 0170840620934062.
4. But, of course, it couldn’t stay that way for long. JoAnne Yates documents the rise of categorization and the role it played in control in her excellent historical analysis: J. Yates, Control through Communication: The Rise of System in American Management (Baltimore: Johns Hopkins University Press, 1993).
5. RFID is an important data-producing technology. For an accessible overview of what it is, see R. Want, “An Introduction to RFID Technology,” IEEE Pervasive Computing 5, no. 1 (2006): 25–33.
6. And for a definitive account of its uses, see K. Finkenzeller, RFID Handbook: Fundamentals and Applications in Contactless Smart Cards, Radio Frequency Identification and Near-Field Communication (Hoboken, NJ: Wiley, 2010).
7. To learn more about this unfortunate story, see https://www.nytimes.com/2006/02/15/us/a-onehouse-400-million-bubble-goes-pop.html.
8. For a detailed example of the proliferation of sensors in basketball, see https://techcrunch.com/2017/03/17/watch-sensors-track-a-full-court-basketball-game-in-real-time/.
9. So much has been written on the consequences of black-box technologies, but for a couple of great examples to expand your thinking in several different areas, see F. Pasquale, The Black Box Society (Cambridge, MA: Harvard University Press, 2015); D. Castelvecchi, “Can We Open the Black Box of AI?” Nature News 538, no. 7623 (2016): 20; P. Cortez and M. J. Embrechts, “Opening Black Box Data Mining Models Using Sensitivity Analysis,” in 2011 IEEE Symposium on Computational Intelligence and Data Mining (CIDM) (April 2011): 341–348.
10. G. C. Bowker and S. L. Star, Sorting Things Out: Classification and Its Consequences (Cambridge, MA: MIT Press, 2000).
11. The New York Times addresses the issue: https://www.nytimes.com/2008/11/23/magazine/23Netflix-t.html.
12. For more details on this story, see https://www.wired.com/insights/2014/03/potholes-big-data-crowdsourcing-way-better-government/.
13. J. Buolamwini and T. Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” in Conference on Fairness, Accountability and Transparency, PMLR (January 2018): 77–91.
14. For an overview of Brayne’s studies, see S. Brayne, “Big Data Surveillance: The Case of Policing,” American Sociological Review 82, no. 5 (2017): 977–1008; S. Brayne, Predict and Surveil: Data, Discretion, and the Future of Policing (New York: Oxford University Press, 2020); S. Brayne, “The Criminal Law and Law Enforcement Implications of Big Data,” Annual Review of Law and Social Science 14 (2018): 293–308.
15. To read more about LASER’s legal problems and its implementation by the LAPD, see https://www.latimes.com/local/lanow/la-me-laser-lapd-crime-data-program-20190412-story.html#:~:text=LASER%2C%20or%20%E2%80%9CLos%20Angeles’,certain%20crimes%20in%20an%20area.
16. For full details, see K. Lum and W. Isaac, “To Predict and Serve?” Significance 13, no. 5 (2016): 14–19.
17. If you’re interested in algorithms and bias, you must read Cathy O’Neil’s cleverly and appropriately titled book: Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York: Crown, 2016). O’Neil discusses how opacity and scale (and, of course, the use of proxies) make it more likely that bias will enter into the construction of algorithms that affect our lives in so many ways.
18. To learn more about Waddell’s work and his simulation modeling for urban planning, see P. Waddell, “UrbanSim: Modeling Urban Development for Land Use, Transportation, and Environmental Planning,” Journal of the American Planning Association 68, no. 3 (2002): 297–314; P. Waddell and G. F. Ulfarsson, “Introduction to Urban Simulation: Design and Development of Operational Models,” in Handbook of Transport Geography and Spatial Systems (Newark, NJ: Emerald Group Publishing, 2004).
19. For a comprehensive overview of construal levels, see Y. Trope, N. Liberman, and C. Wakslak, “Construal Levels and Psychological Distance: Effects on Representation, Prediction, Evaluation, and Behavior,” Journal of Consumer Psychology 17, no. 2 (2007): 83–95.
20. C. J. Wakslak, Y. Trope, N. Liberman, and R. Alony, “Seeing the Forest When Entry Is Unlikely: Probability and the Mental Representation of Events,” Journal of Experimental Psychology: General 135, no. 4 (2006): 641–653.
21. Another example of this problem comes from a study we did of ten major product development organizations. In each organization we found that engineers would bring hundreds of slides to a decision-making meeting to try to persuade high-level managers of important design changes. We found the managers’ propensity to be swayed by an argument was inversely correlated to the technical specificity of the presentation and, more on point, to the amount of evidence presented. Managers were most convinced when the data were presented as a story with clear consequences. Engineers who could adopt that mode of data presentation and persuasion were nearly ten times as effective at pushing through a design change, especially late in the product development process.
22. B. M. Wiesenfeld, J. N. Reyt, J. Brockner, and Y. Trope, “Construal Level Theory in Organizational Research,” Annual Review of Organizational Psychology and Organizational Behavior 4 (2017): 367–400.
Chapter 4
1. There has been nearly a century’s worth of debate about the exact provenance of this quote. As usual, Quote Investigator provides a thorough investigation (complete with verified scans of documents) that discusses how this quote originated and how it has been adapted over time: https://quoteinvestigator.com/2014/01/15/stats-drunk/#note-7989-2.
2. This blog discusses Spotify’s metrics: https://musicindustryblog.wordpress.com/tag/spotify-metrics/.
3. “Spotify’s premium average revenue per user (ARPU) worldwide from 2015 to 2020” in Statista, retrieved March 9, 2020, from https://www.statista.com/statistics/813789/spotify-premium-arpu/. See also David Trainer, “It Sounds Like Spotify Is in Trouble,” Forbes.com, October 13, 2020, https://www.forbes.com/sites/greatspeculations/2020/10/13/it-sounds-like-spotify-is-in-trouble/?sh=60b8bc2c813b.
4. N. Loose, et al., “eCommerce: Amazon.com (USA) Customers, Brand Report: Statista Global Consumer Survey,” ecommerceDB, Statista, November 2019.
5. J. Harter, “Employee Engagement Is on the Rise in the U.S.,” Gallup, August 26, 2018, retrieved March 8, 2020, https://news.gallup.com/poll/241649/employee-engagement-rise.aspx.
6. Geico.com Coverage Calculator, accessed March 10, 2020, https://www.geico.com/coverage-calculator/.
7. R. L. Miller and A. D. Stafford, Economic Education for Consumers (Mason, OH: Cengage, 2009), 475.
8. J. Dubé and P. Rossi, Handbook of the Economics of Marketing, vol. 1, Handbooks in Economics (Amsterdam: Elsevier Science & Technology, 2019).
9. M. Jancer, “Too Hot or Cold? The Embr Wave Is Your Personal Thermostat,” retrieved March 11, 2020, https://www.wired.com/story/embr-wave-personal-thermostat-wearable/.
10. F. Caudillo, S. Houben, and J. Noor, “Mapping the Value of Diversification,” McKinsey on Finance 55 (2015): 10–12.
11. A/B testing is really just a popular name for a controlled experiment. The important thing to remember about any controlled experiment is that its design lets researchers isolate a single change and draw a correlational or causal connection between that one change and the outcomes of interest.
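A minimal sketch of the logic behind such a test: two versions of a page are shown to comparable groups, and a two-proportion z-test checks whether the difference in conversion rates is larger than chance would explain. The sample sizes and conversion counts below are invented for illustration.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: variant A (control) vs. variant B (new design)
z = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(round(z, 2))  # |z| > 1.96 suggests significance at the 5% level
```

The point of the design is in the denominator: because only one element differs between the variants, a large z can be attributed to that single change rather than to some confounding difference between the groups.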
12. N. P. N. Patel and Amazon, “3 A/B Testing Examples That You Should Steal [Case Studies],” retrieved March 21, 2019, https://www.crazyegg.com/blog/ab-testing-examples/.
13. R. White, “7 Incredible Examples of A/B Tests by Real Businesses,” retrieved January 16, 2020, https://blog.hubspot.com/marketing/a-b-testing-experiments-examples.
14. L. Gonick and W. Smith, The Cartoon Guide to Statistics (New York: HarperCollins, 1993).
15. A regression is a statistical method for analyzing the relationship between two or more variables. In its simplest form, it fits a straight line to data points plotted on an X-Y axis, where the line represents the average change in the “response variable” (Y) associated with a change in the “explanatory variable” (X).
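The line-fitting described above can be sketched in a few lines of ordinary least squares; the data here (ad spend versus sales) are invented for illustration.

```python
def linear_regression(xs, ys):
    """Ordinary least squares fit of y = intercept + slope * x."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Slope: covariance of X and Y divided by the variance of X
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    # The fitted line always passes through the point of means
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical data: ad spend in $1,000s (X) vs. sales in $10,000s (Y)
intercept, slope = linear_regression([1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.1])
```

Here the slope is the quantity the note describes: the average change in Y for each one-unit change in X.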
16. V. Hunt, D. Layton, and S. Prince, “Diversity Matters,” McKinsey & Company 1, no. 1 (2015): 1.
17. Ibid.
18. Ibid.
Chapter 5
1. For more on Shamoon, see https://en.wikipedia.org/wiki/Shamoon#:~:text=Shamoon%20was%20designed%20to%20erase,time%20on%20Wednesday%2C%20August%2015; and https://money.cnn.com/2015/08/05/technology/aramco-hack/.
2. For a vivid illustration of such evolution, with specific examples from the mobile software industry, see B. D. Eaton, S. M. Elauf-Calderwood, C. Sorensen, and Y. Yoo, “Structural Narrative Analysis as a Means to Unfold the Paradox of Control and Generativity That Lies within Mobile Platforms,” Proceedings of the 10th International Conference on Mobile Business (2011): 68–73.
3. For more on the Twitter bug, see https://www.zdnet.com/article/twitter-bug-shared-location-data-for-some-ios-users/#:~:text=In%20December%202018%2C%20Twitter%20said,non%2Dfollowers%20and%20search%20engines.
4. For a comprehensive review of the concept of technical debt and the various ways it can impact product development generally, and security specifically, see Z. Li, P. Avgeriou, and P. Liang, “A Systematic Mapping Study on Technical Debt and Its Management,” Journal of Systems and Software 101 (2015): 193–220.
5. K. Beck, M. Beedle, A. Van Bennekum, A. Cockburn, W. Cunningham, M. Fowler, and D. Thomas, “Manifesto for Agile Software Development,” Agile Alliance, http://agilemanifesto.org/.
6. For more details about AppFolio and Eric Hawkins’s effective approach to managing software development teams, see T. Neeley, P. Leonardi, and M. Morris, “Eric Hawkins Leading Agile Teams @ Digitally-Born AppFolio (A),” Case 9-419-066 (Boston: Harvard Business School, June 2019).
7. For an interesting “confessional” about refactoring from software developers, see D. Silva, N. Tsantalis, and M. T. Valente, “Why We Refactor? Confessions of Github Contributors,” in Proceedings of the 2016 24th ACM Sigsoft International Symposium on Foundations of Software Engineering (November 2016): 858–870.
8. For more on the severity of the hacking, see https://www.nytimes.com/2018/03/15/technology/saudi-arabia-hacks-cyberattacks.html.
9. Astute technology leaders like Jan Chong of fintech company Tally tell us that trying to convince C-level executives of the importance of making proactive changes to stay ahead of security concerns is one of the most difficult parts of the job. “Nobody wants to spend money on something that isn’t broken,” says Chong. But it’s exactly that kind of thinking that leaves a company and its products vulnerable to security breaches.
10. For an excellent inside, behind-the-scenes account of what happened at Cambridge Analytica, see https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump.
11. For a description of the unprecedented ruling, see https://www.ftc.gov/news-events/press-releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions.
12. The size of data leaks of consumer information keeps growing. As of the writing of this book, the latest in a series of data leaks occurred at the carrier T-Mobile, exposing the data of more than 1 million users. For a regularly updated list of the world’s largest data breaches, see https://spanning.com/resources/industry-research/largest-data-breaches-us-history/.
13. J. S. Brown and P. Duguid, The Social Life of Information: Updated, with a New Preface (Boston: Harvard Business Review Press, 2017).
14. This phenomenon is what researchers of social networking sites call “networked privacy.” See A. E. Marwick and D. Boyd, “Networked Privacy: How Teenagers Negotiate Context in Social Media,” New Media & Society 16, no. 7 (2014): 1051–1067; L. Palen and P. Dourish, “Unpacking ‘Privacy’ for a Networked World,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (April 2003): 129–136.
15. For a detailed discussion of this process, see P. M. Leonardi and J. W. Treem, “Behavioral Visibility: A New Paradigm for Organization Studies in the Age of Digitization, Digitalization, and Datafication,” Organization Studies 41, no. 12 (2020): 1601–1625.
16. T. Gillespie, “The Relevance of Algorithms,” in T. Gillespie, P. J. Boczkowski, and K. A. Foot, Media Technologies: Essays on Communication, Materiality, and Society (Cambridge, MA: MIT Press, 2014): 167–194.
17. Bernard Woo and Bart Willemsen, “Hype Cycle for Privacy, 2020,” Gartner Research, https://www.gartner.com/document/3987903?toggle=1.
18. List pulled directly from brochure presented by Deloitte and Ryerson University, “Privacy by Design: Setting a New Standard for Privacy Certification,” co-written by the privacy expert who developed PbD, Dr. Ann Cavoukian: https://www2.deloitte.com/content/dam/Deloitte/ca/Documents/risk/ca-en-ers-privacy-by-design-brochure.PDF.
19. D. Mulligan and J. King, “Bridging the Gap between Privacy and Design,” University of Pennsylvania Journal of Constitutional Law 14, no. 4 (2012), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2070401.
20. For a full interview about this topic, see https://www.itsinternational.com/its6/feature/here-ai-has-place-privacy-design.
21. For more on diamonds and blockchain, see https://www.babson.edu/academics/executive-education/babson-insight/technology-and-operations/diamonds-and-the-blockchain/#.
22. For more on the diamond industry, see A. Baker and T. Shikapa, “Blood Diamonds,” Time, September 7, 2015, https://time.com/blood-diamonds/.
23. For an interesting overview of Everledger’s business model, see https://venturebeat.com/2019/09/24/everledger-raises-20-million-to-track-assets-with-blockchain-tech/.
24. “The Growing List of Applications and Use Cases of Blockchain Technology in Business and Life,” Insider Intelligence, March 2, 2020, https://www.businessinsider.com/blockchain-technology-applications-use-cases.
25. “What Is MedRec?” MedRec, https://medrec.media.mit.edu/.
26. “IBM Food Trust,” https://www.ibm.com/products/food-trust.
27. B. Dimitrov, “How Walmart and Others Are Riding a Blockchain Wave to Supply Chain Paradise,” Forbes, December 5, 2019, https://www.forbes.com/sites/biserdimitrov/2019/12/05/how-walmart-and-others-are-riding-a-blockchain-wave-to-supply-chain-paradise/?sh=70196faa7791.
28. M. Iansiti and K. R. Lakhani, “The Truth About Blockchain,” Harvard Business Review, January–February 2017, 9–10.
29. I. Heap, “Smart Contracts for the Music Industry,” Medium, March 15, 2018, https://medium.com/humanizing-the-singularity/smart-contracts-for-the-music-industry-3e641f87cc7; T. Felin and K. R. Lakhani, “What Problems Will You Solve with Blockchain?” MIT Sloan Management Review 60, no. 1 (Fall 2018): 32–38.
30. K. Lakhani, “How Does Blockchain Work?” hbr.org, September 20, 2017, https://hbr.org/video/5582134272001/whiteboard-session-how-does-blockchain-work.
31. M. Iansiti and K. R. Lakhani, “The Truth About Blockchain.”
32. Ibid.
33. “Stellar-Based Lightyear Acquires Chain, Forms New Entity,” Bitcoin Magazine, https://bitcoinmagazine.com/business/stellar-based-lightyear-acquires-chain-forms-new-entity.
34. J. Aki, “IBM Introduces ‘World Wire’ Payment System on Stellar Network,” Bitcoin Magazine, September 5, 2018, https://bitcoinmagazine.com/articles/ibm-introduces-world-wire-payment-system-stellar-network.
35. “The Fintech 2.0 Paper: Rebooting Financial Services,” Oliver Wyman, Anthemis Group, and Santander Innoventures, https://www.oliverwyman.com/our-expertise/insights/2015/jun/the-fintech-2-0-paper.html.
36. This phenomenon is often referred to as the “critical mass effect.” For a discussion of this effect in the context of interactive media, see M. L. Markus, “Toward a ‘Critical Mass’ Theory of Interactive Media: Universal Access, Interdependence and Diffusion,” Communication Research 14, no. 5 (1987): 491–511.
37. T. Felin and K. R. Lakhani, “What Problems Will You Solve with Blockchain?”
Chapter 6
1. For a poignant discussion of the widely known secret that crash test dummies are designed for men (the dummies of women are just scaled down versions of male dummies without any changes for female anatomy), see the wonderful recent article: M. Kuhn and H. Schank, “10,000 Women Die in Car Crashes Each Year Because of Bad Design,” Fast Company, August 26, 2021, https://www.fastcompany.com/90669431/10000-women-die-in-car-crashes-each-year-because-of-bad-design.
2. For excellent accounts of early experiments, including those of Thomas Edison at his Menlo Park laboratory, see A. Hargadon, How Breakthroughs Happen: The Surprising Truth about How Companies Innovate (Boston: Harvard Business School Press, 2003).
3. J. McCormick, B. Hopkins, and T. Schadler, “The Insights-Driven Business,” Forrester Research, July 2016.
4. S. H. Thomke, Experimentation Matters: Unlocking the Potential of New Technologies for Innovation (Boston: Harvard Business School Press, 2003).
5. T. R. Eisenmann, M. Pao, and L. Barley, “Dropbox: ‘It Just Works,’ ” case 9-811-065 (Boston: Harvard Business School, January 19, 2011).
6. M. Bertoncello, “How L’Oréal, a Century-Old Company, Uses Experimentation to Succeed in the Digital Age,” https://www.thinkwithgoogle.com/intl/en-cee/future-of-marketing/digital-transformation/how-lor%C3%A9al-century-old-company-uses-experimentation-succeed-digital-age/.
7. P. M. Madsen and V. Desai, “Failing to Learn? The Effects of Failure and Success on Organizational Learning in the Global Orbital Launch Vehicle Industry,” Academy of Management Journal 53, no. 3 (2010): 451–476.
8. Amy Edmondson and colleagues provide compelling examples for why it is so hard for most people and companies to learn from failure, but why it’s so essential that they learn to do so: A. C. Edmondson, “Strategies for Learning from Failure,” Harvard Business Review, April 2011, 48–55; A. L. Tucker and A. C. Edmondson, “Why Hospitals Don’t Learn from Failures: Organizational and Psychological Dynamics That Inhibit System Change,” California Management Review 45, no. 2 (2003): 55–72; M. D. Cannon and A. C. Edmondson, “Failing to Learn and Learning to Fail (Intelligently): How Great Organizations Put Failure to Work to Innovate and Improve,” Long Range Planning 38, no. 3 (2005): 299–319.
9. D. Coppola, “E-Commerce Worldwide—Statistics and Facts,” Statista, September 17, 2021, https://www.statista.com/topics/871/online-shopping/.
10. See “Number of Global Social Network Users 2017–2025,” https://www.statista.com/statistics/278414/number-of-worldwide-social-network-users/, January 28, 2021.
11. See S. Kemp, “Digital 2020: Global Digital Overview,” Datareportal, January 30, 2020, https://datareportal.com/reports/digital-2020-global-digital-overview.
12. For data see P. M. Leonardi, “Ambient Awareness and Knowledge Acquisition,” MIS Quarterly 39, no. 4 (2015): 747–762.
13. For a discussion of how such predictions can be made as well as their implications for privacy see P. Leonardi and N. Contractor, “Better People Analytics,” Harvard Business Review, November–December 2018, 70–81.
14. For lengthy discussion of how major tech companies are doing this, you must read S. Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: PublicAffairs, 2019).
15. See the New York Times piece here: https://www.nytimes.com/interactive/2021/02/08/opinion/stimulus-checks-economy.html.
16. See Zuboff, The Age of Surveillance Capitalism.
17. For a great discussion of this trend, see S. H. Thomke, Experimentation Works: The Surprising Power of Business Experiments (Boston: Harvard Business Review Press, 2020).
18. H. J. Wilson and K. Desouza, “8 Ways to Democratize Experimentation,” hbr.org, April 18, 2011, https://hbr.org/2011/04/8-ways-to-democratize-experime.html.
19. One big reason is that knowledge generated in another part of the company often suffers from the “not-invented-here syndrome.” For a discussion of how this syndrome arises and how to combat it, generally, see the great work by M. Hansen, Collaboration: How Leaders Avoid the Traps, Build Common Ground, and Reap Big Results (Boston: Harvard Business Press, 2009).
20. As the leaders of IDEO, one of the world’s most successful design firms, note in their various books about the company’s innovation process, recognizing that most attempts will fail is one of the most important things innovators can do. See T. Kelley and D. Kelley, Creative Confidence: Unleashing the Creative Potential within Us All (New York: Currency, 2013); T. Brown, “Design Thinking,” Harvard Business Review, June 2008, 84.
21. Amy Edmondson has written extensively on this topic. For her most foundational work see A. Edmondson, “Psychological Safety and Learning Behavior in Work Teams,” Administrative Science Quarterly 44, no. 2 (1999): 350–383; and for a more accessible translation see A. C. Edmondson, The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth (Hoboken, NJ: Wiley, 2018).
22. A. C. Edmondson, R. M. Bohmer, and G. P. Pisano, “Disrupted Routines: Team Learning and New Technology Implementation in Hospitals,” Administrative Science Quarterly 46, no. 4 (2001): 685–716.
23. For a detailed review of Project Aristotle see C. Duhigg, “What Google Learned from Its Quest to Build the Perfect Team,” New York Times Magazine, February 2016.
Chapter 7
1. T. Neeley, J. T. Keller, and J. Barnett, “From Globalization to Dual Digital Transformation: CEO Thierry Breton Leading Atos into ‘Digital Shockwaves’ (A),” case 9-419-027 (Boston: Harvard Business School, June 2019).
2. Barbara Czarniawska and Carmelo Mazza, “Consulting as a Liminal Space,” Human Relations 56, no. 3 (2003): 267–290.
3. E. L. Wagner, S. Newell, and W. Kay, “Enterprise Systems Projects: The Role of Liminal Space in Enterprise Systems Implementation,” Journal of Information Technology 27, no. 4 (2012): 259–269.
4. K. E. Weick and R. E. Quinn, “Organizational Change and Development,” Annual Review of Psychology 50, no. 1 (1999): 361–386.
5. In the digital world, the basics of change as outlined in Kotter’s famous model (that just about every business school student ever has read) still hold. But the context in such models of change assumes that once a change happens, it’s over. In the digital era, change is like an M.C. Escher drawing—one change bleeds seamlessly into the next. This means that not only our vocabulary but our conceptualization of what change is and how it happens needs to—change. J. P. Kotter, Leading Change (Boston: Harvard Business Review Press, 2012).
6. N. Furr, J. Gaarlandt, and A. Shipilov, “Don’t Put a Digital Expert in Charge of Your Digital Transformation,” hbr.org, https://hbr.org/2019/08/dont-put-a-digital-expert-in-charge-of-your-digital-transformation.
7. W. R. Kerr, F. Gabrieli, and E. Moloney, “Transformation at ING (A): Agile,” case 9-818-077 (Boston: Harvard Business School, January 2018, revised May 2018).
8. T. Neeley, Remote Work Revolution: Succeeding from Anywhere (New York: HarperCollins, 2021).
9. M. Iansiti, K. Lakhani, H. Mayer, and K. Herman, “Moderna (A),” Case N9-621-032 (Boston: Harvard Business School, September 2020).
10. Of course, big pharma is not the only industry in which work is siloed. The smart book The Silo Effect by Gillian Tett describes the rise and proliferation of structural siloing in the modern corporation. See G. Tett, The Silo Effect: The Peril of Expertise and the Promise of Breaking Down Barriers (New York: Simon & Schuster, 2015).
11. J. A. Chatman and S. E. Cha, “Leading by Leveraging Culture,” California Management Review 45, no. 4 (2003).
12. R. M. Kanter, B. Stein, and T. D. Jick, The Challenge of Organizational Change: How Companies Experience It and Leaders Guide It (New York: Free Press, 1992), 492–495.
13. T. Neeley, “Global Business Speaks English: Why You Need a Language Strategy Now,” Harvard Business Review, May 2012, 116–124.
14. T. Neeley, J. T. Keller, and J. Barnett, “From Globalization to Dual Digital Transformation: CEO Thierry Breton Leading Atos into ‘Digital Shockwaves’ (A),” case 9-419-027 (Boston: Harvard Business School, June 2019).
15. G. C. Kane, D. Palmer, A. N. Phillips, D. Kiron, and N. Buckley, “Achieving Digital Maturity,” MIT Sloan Management Review 59, no. 1 (2017).
16. T. Neeley and J. T. Keller, “From Globalization to Dual Digital Transformation: CEO Thierry Breton Leading Atos into ‘Digital Shockwaves’ (B),” case supplement 419-046 (Boston: Harvard Business School, April 2019).
17. For an analysis of this Englishnization process and its parallels to digital transformation see T. Neeley, The Language of Global Success (Princeton, NJ: Princeton University Press, 2017).
18. We borrow some of the material in this section from Paul Leonardi’s “You’re Going Digital—Now What?” MIT Sloan Management Review (Winter 2020), which we use with permission of the publisher.
19. D. Leonard-Barton and I. Deschamps, “Managerial Influence in the Implementation of New Technology,” Management Science 34, no. 10 (1988): 1252–1265; W. Lewis, R. Agarwal, and V. Sambamurthy, “Sources of Influence on Beliefs about Information Technology Use: An Empirical Study of Knowledge Workers,” MIS Quarterly 27, no. 4 (2003): 657–678.
20. A huge program of research on technology acceptance has focused on this issue, showing convincingly that people ask themselves if a new technology is useful and if it is easy to use. If people can’t answer these questions in the affirmative, they are likely to reject a new technology even if they are asked specifically to use it. For a dizzying review of such studies see V. Venkatesh, M. G. Morris, G. B. Davis, and F. D. Davis, “User Acceptance of Information Technology: Toward a Unified View,” MIS Quarterly (2003): 425–478.
21. For a very clear example of this rhetoric versus reality in the area of total quality management see M. J. Zbaracki, “The Rhetoric and Reality of Total Quality Management,” Administrative Science Quarterly (1998): 602–636.
22. A seminal article that described how users can appropriate the features of new technologies in many different ways (despite being told to use them in one consistent way) is G. DeSanctis and M. S. Poole, “Capturing the Complexity in Advanced Technology Use: Adaptive Structuration Theory,” Organization Science 5, no. 2 (1994): 121–147.
23. For a detailed analysis of these social network changes, see P. M. Leonardi, “When Does Technology Use Enable Network Change in Organizations? A Comparative Study of Feature Use and Shared Affordances,” MIS Quarterly (2013): 749–775.
24. G. C. Kane, A. N. Phillips, J. R. Copulsky, and G. R. Andrus, The Technology Fallacy: How People Are the Real Key to Digital Transformation (Cambridge, MA: MIT Press, 2019); J. Battilana and T. Casciaro, “Change Agents, Networks, and Institutions: A Contingency Theory of Organizational Change,” Academy of Management Journal 55, no. 2 (2012): 381–398.
25. For an overview of the science of social networks and why they work the way they do see P. R. Monge, N. S. Contractor, P. S. Contractor, R. Peter, and S. Noshir, Theories of Communication Networks (New York: Oxford University Press, 2003). For a more accessible approach to using social networks for organizational analysis see R. Cross, J. Liedtka, and L. Weiss, “A Practical Guide to Social Networks,” Harvard Business Review, March 2005, 124–132.
26. A. Palmer, “Lifelong Learning Is Becoming an Economic Imperative,” The Economist, January 14, 2017, 12.
27. For more on Kim dos Santos and Booking.com, see https://www.themuse.com/advice/keep-learning-at-work-why-this-company-prioritizes-personal-development.
Conclusion
1. The epigraph is from Everybody’s Political What’s What? by George Bernard Shaw (1944), 330.
2. In 2021 digital technologies such as Facebook, Instagram, and Twitter connected people with friends, family members, colleagues, and often casual acquaintances. One study shows that more than 60 percent of people’s social interactions now occur on such mediated platforms. At work, tools such as Slack, Yammer, and Workpl@ce (by Facebook) connect employees who work in different cities, states, and countries. But these digital technologies don’t just connect people who live and work in different geographic areas; they are also increasingly the platforms of choice for people who live and work in close physical proximity. A study we conducted at a major imaging device company showed that even employees who worked in the same office building (separated by no more than 120 linear feet) did 85 percent of their communicating through digital tools. As a consequence, these employees rarely stopped by each other’s desks and even began to conduct most of their meetings through digital tools like Zoom and Microsoft Teams. As one employee remarked, “I feel very separated from my colleagues.”
3. H. A. Simon, Models of Bounded Rationality: Empirically Grounded Economic Reason, vol. 3 (Cambridge, MA: MIT Press, 1997).
Appendix
1. The company’s instruments for faster learning are described in this Spotify HR blog: https://hrblog.spotify.com/2018/02/25/six-instruments-for-faster-learning/.
2. See “Inside Learning: How Yelp Created a Successful Learning Culture,” Always Learning Blog, September 1, 2017, https://www.continu.co/blog/yelp-successful-learning-culture.
3. See M. Gladden, “How Westpac Group Is Re-Imagining the Delivery of Learning,” Go1, July 13, 2020, https://www.go1.com/customer-story/how-westpac-group-is-re-imagining-the-delivery-of-learning.
4. See “20 Companies Where You’ll Keep Learning Every Day,” The Muse, https://www.themuse.com/advice/20-companies-that-value-learning.
5. See B. Goodwin, “Philips Looks to Artificial Intelligence to Train Its Workforce,” ComputerWeekly.com, October 29, 2018, https://www.computerweekly.com/news/252451426/Philips-looks-to-artificial-intelligence-to-train-its-workforce.