How can you shape your organization's AI culture amid turmoil?

Behavioural science, which studies human behaviour via psychological, cognitive and emotional lenses, provides essential insights into how developers and designers should approach the building of AI. It highlights the cognitive biases that can influence algorithm design and the ethical blind spots resulting from prioritizing technical efficiency over social consequences.

SelFou.digital

4/11/2025 · 6 min read

Experts urge leaders to define the organizational behaviour scenarios that will guide their AI and shape their culture. How is that possible with turmoil stirring on the international stage? According to Project Syndicate, organizations are confused about their own raison d’être.

Further to an article by Mariana Mazzucato and Rainer Kattel in Project Syndicate: ‘Around the world, governments are trying to reinvent themselves in the image of business. Elon Musk’s DOGE crusade in the United States is quite explicit on this point, as is Argentina’s chainsaw-wielding president, Javier Milei. But one also hears similar rhetoric in the United Kingdom, where Cabinet Office Minister Pat McFadden wants the government to foster a “test-and-learn” culture and move toward performance-based management.’

They continue by saying, ‘The problem is that governments and businesses serve vastly different purposes. If public policymakers start mimicking business founders, they will undermine their own ability to address complex societal challenges.

For startups, the highest priority is rapid iteration, technology-driven disruption, and financial returns for investors. Their success often hinges on solving a narrowly defined problem with a single product, or within a single organization. Governments, by contrast, must tackle complex, interconnected issues like poverty, public health, and national security. Each challenge calls for collaboration across multiple sectors, and careful long-term planning. The idea of securing short-term gains in any of these areas doesn’t even make sense.

Unlike startups, governments are supposed to uphold legal mandates, ensure the provision of essential services, and enforce equal treatment under the law – more important today than ever. Metrics like market share are irrelevant, because the government has no competitors. Rather than trying to “win,” it should focus on expanding opportunities and promoting the diffusion of best practices. It must be long-term minded, while achieving nimble and flexible structures that can adapt.’

What of AI then? How can organizations move forward in these complex times and orchestrate the biggest transformations of this century?

According to a Walkingthetalk.com white paper (part of SRPartners.com), ‘AI may be able to predict the best target culture based on hundreds of thousands of external and internal information sources. To do this, it could simulate various scenarios of behaviour change and measure the resulting business outcomes.’

Deloitte argues that ‘Even when designed well, organizations should keep in mind that the most successful transformations are typically based on workers’ consent and buy-in, and this takes time. Leaders should seek ongoing measurement of KPIs, using them to track progress and iteratively hone the change program. Adding support where behaviors aren’t taking hold and celebrating achievements along the way is often key to ultimately arriving at a culture that can drive AI-fueled success.’

In a Forbes article, Tshilidzi Marwala, Rector of the United Nations University (UNU), adds: ‘Humans are at the heart of AI development, and their decisions, prejudices, and actions affect AI systems. Behavioural science, which studies human behaviour via psychological, cognitive and emotional lenses, provides essential insights into how developers and designers should approach the building of AI. It highlights the cognitive biases that can influence algorithm design and the ethical blind spots resulting from prioritizing technical efficiency over social consequences.’

So, when asked ‘How does AI interact with culture?’, Walkingthetalk answers: ‘We foresee two main angles from which AI will impact organisational culture. The first is that AI will help to manage culture. The second is that AI will directly influence the mind-sets and behaviours of people at work through the systems that will be implemented.’

According to them, this will be done through conducting a current culture assessment, identifying the target culture, developing a culture plan and measuring the culture. They also identify the following archetypes: innovation, customer-centric, achievement, people first, and greater good.
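To make the assessment step concrete, here is a minimal, purely illustrative sketch of scoring a culture survey against the five Walkingthetalk archetypes named above. The survey items, the item-to-archetype mapping, and the 1–5 responses are all invented for illustration; any real assessment would use a validated instrument.

```python
# Hypothetical sketch: averaging 1-5 agreement scores per culture
# archetype. Items and responses are invented for illustration.
from collections import defaultdict

ARCHETYPES = ["innovation", "customer-centric", "achievement",
              "people first", "greater good"]

# Each survey item is tagged with the archetype it probes (assumption).
survey_items = {
    "We experiment with new ideas quickly": "innovation",
    "Customer feedback drives our decisions": "customer-centric",
    "We set and hit ambitious targets": "achievement",
    "Leaders invest in employee wellbeing": "people first",
    "We weigh the social impact of our work": "greater good",
}

def score_culture(responses):
    """Average the 1-5 agreement scores for each archetype."""
    totals, counts = defaultdict(float), defaultdict(int)
    for item, score in responses.items():
        archetype = survey_items[item]
        totals[archetype] += score
        counts[archetype] += 1
    return {a: totals[a] / counts[a] for a in ARCHETYPES if counts[a]}

# One respondent's invented answers, in item order.
responses = dict(zip(survey_items, [4, 5, 3, 2, 4]))
profile = score_culture(responses)
strongest = max(profile, key=profile.get)  # current dominant archetype
```

Comparing such a current-culture profile against the chosen target archetype is one simple way to ground a culture plan in measurement rather than impressions.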

Instead of the ‘greater good’, Mariana Mazzucato speaks to the common good which she defines as: ‘The common good offers an avenue to explore the link between individual and communal interests. It is not just about maximising the sum of aggregate individual interests, but about common interests and mutual concern. In this sense, the common good is rooted in the values and collective responsibility that members of current and future generations share.’

This leads us to think: depending on what sort of culture we want to build, our desired state, values and shared vision will vary greatly.

We believe this time of turmoil is the perfect time to seek out the people in your organization: take time out and listen to them. Not only about their fears and concerns regarding the geopolitical environment we live in, but also about their perceptions of AI and what surrounds it. Build on what was said above and use this time wisely.

Current events have politicians, markets and international customers alike wondering what tomorrow will bring. We like to think that through history (our current bedside reading is Homelands, by Timothy Garton Ash), some insight into what tomorrow holds will shine through. And because we like to help you through your digital transformation, we are providing you with some common ingredients to support your journey along the way.

Best, as always, your SelFou.digital Team

Source, Deloitte:

‘Ingredients of an AI-ready culture: Trust

Surprisingly, surveyed high-achieving organizations (Transformers and Pathseekers) report more than twice the amount of fear compared to low-achieving organizations (Underachievers and Starters). Typically, when we consider AI-related fear, the focus is on job loss or machines replacing humans.

But high achievers also reported little desire to reduce employee headcount as well as high investment in training and change management. When viewed through this lens, fear may be a positive indicator that an organization’s AI vision is bold. This can bear fruit when paired with other supportive actions and cultural characteristics to drive success. A culture that trusts, even if they fear, demonstrates agility. Change management can help build that trust.

Executive interviews confirmed this interpretation, calling out a variety of behaviors, such as collaboration, relationship-building, and training, which may collectively point to higher levels of trust within the organization. Trust is based on competence and intent: If employees believe in the organization’s ability to build capable AI systems and its intent to use technology for their benefit—not detriment—then trust can grow.

Ingredients of an AI-ready culture: Data fluency

“In order for there to be AI success, people will have to change their relationship with data,” says Andrew Beers, chief technology officer at Tableau. Part of this, of course, involves building advanced technical data capabilities; however, that’s often a smaller piece of the puzzle than leaders realize. More foundational tends to be raising the base level of data literacy across all levels of the organization. This means encouraging everyone to build the critical thinking skills needed to ask the right questions and then find the right data to solve problems in their everyday work.

Developing data-literacy skills builds confidence and a deeper trust in models and AI, which in turn can help set organizations up for positive outcomes. High-achieving organizations from our survey (Transformers and Pathseekers) were approximately three times more likely to trust AI more than their own intuition, compared to low-achieving organizations (Starters and Underachievers). Naturally, trusting AI doesn’t mean blindly following model outputs. Tulia Plumettaz, director of data science at Wayfair, emphasizes this point: “We have a widespread culture of experimental validation. We don’t accept an answer of, ‘The model said so.’ No. Model outcomes are continuously scrutinized through live testing and validation.” In other words, data-focused organizations tend to require a more profound understanding of data. Workers should be incentivized to explain and justify model decisions; this serves to drive more creative insights as well as faster detection of model errors if and when they arise.
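The idea of scrutinizing model outcomes through live testing can be sketched in a few lines. This is not Wayfair's actual system; it is a minimal illustration, with invented numbers and an assumed accuracy threshold, of checking a batch of binary predictions against observed outcomes and flagging batches that need human review.

```python
# Illustrative sketch: validate a batch of binary model predictions
# against live observed outcomes. Threshold and data are invented.
def validate_live(predictions, observed, threshold=0.8):
    """Return (accuracy, ok) where ok means the batch meets the bar."""
    assert len(predictions) == len(observed)
    hits = sum(p == o for p, o in zip(predictions, observed))
    accuracy = hits / len(predictions)
    return accuracy, accuracy >= threshold

# 4 of 5 predictions match the live outcomes in this invented batch.
acc, ok = validate_live([1, 0, 1, 1, 0], [1, 0, 0, 1, 0])
```

A failing batch (ok False) would trigger review and justification of the model's decisions rather than an answer of "the model said so".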

Upskilling is important in this effort. Most organizations understand the importance of including training or reskilling to support an AI transformation—in fact, nearly three quarters of all surveyed organizations did not report a strong preference for hiring externally over reskilling their current workforce.

Ingredients of an AI-ready culture: Agility

AI-fueled organizations typically do more than trust data; they demonstrate a willingness to quickly turn insights into action and rapid experimentation.

Rajeev Ronanki, SVP and chief digital officer at Anthem, agrees, commenting on the degree of change this can require for organizations that have grown by prioritizing safer and more secure investments: “A lot of [the challenge] is getting comfortable with the fail-fast, pivot mindset when you take on and do new things,” he notes.

Building an AI-ready culture: The need for change management

AI in particular is significantly altering the way work gets done, requiring a redefinition of work, and subsequently which skills and capabilities the human workforce needs to deliver value. Most organizations underinvest in these activities: Only 37% of survey respondents reported significant investment in change management, incentives, or training activities to help their people integrate new technology into their work, often resulting in a slower, less successful transformation.


Even when designed well, organizations should keep in mind that the most successful transformations are typically based on workers’ consent and buy-in, and this takes time. Leaders should seek ongoing measurement of KPIs, using them to track progress and iteratively hone the change program. Adding support where behaviors aren’t taking hold and celebrating achievements along the way is often key to ultimately arriving at a culture that can drive AI-fueled success.’