Potential uses of ChatGPT for Change Management and Strategy Work


With all the hype going on, we couldn’t resist asking ourselves what we here at 8008.Agency - or our colleagues in management consulting - could do with ChatGPT. How could we use it, and would it benefit our clients? Let’s take a look at our main services and see.

Strategy creation and implementation

Are there any strategy-related tasks out there that ChatGPT can help with?

Short answer: Potentially, yes.

ChatGPT has the potential to automate many of the steps in the strategy creation process. This can (potentially) save organizations time and money, as well as free up resources for other tasks. ChatGPT could use its AI capabilities to suggest strategies based on data from past successes and failures, which could (potentially) help organizations develop a more effective strategy. 

According to itself, ChatGPT can: 

  • Define goals and objectives by analyzing data such as customer needs, industry trends, and the competitive landscape.
  • Analyze market conditions and identify opportunities and threats (as a side note, how relevant are the above when the data it currently has access to is capped at 2021?).
  • Identify the target audiences and their associated characteristics.
  • Generate lists of strategies and tactics.
  • Put together a plan that outlines the steps necessary to execute the strategy.
  • Monitor and adjust the strategy (how would this happen? perhaps if a human re-entered fresh data along with the older strategy, something new might come back).
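The steps above could, in principle, be scripted as a sequence of prompts fed to the model one by one. A minimal sketch (the step wording and function names are our own illustrative assumptions, not a tested workflow):

```python
# Sketch: turning the strategy steps above into a sequence of prompts
# that could be sent to a chat model one at a time.
STRATEGY_STEPS = [
    "Define goals and objectives from this data: {data}",
    "Analyze market conditions and list opportunities and threats in: {data}",
    "Identify target audiences and their characteristics for: {data}",
    "Generate candidate strategies and tactics for: {data}",
    "Outline an execution plan for the chosen strategy, given: {data}",
]

def build_prompts(company_data: str) -> list[str]:
    """Fill each step template with the organization's data.

    The last step from the article (monitor and adjust) is deliberately
    left out: it would require a human to re-run this with fresh data.
    """
    return [step.format(data=company_data) for step in STRATEGY_STEPS]

prompts = build_prompts("2021 market report for a mid-sized consulting firm")
```

Even with such a pipeline, every answer would still need human review - automating the prompts does not automate the judgment.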

Why just ‘potentially’?

ChatGPT cannot provide the same level of insight as experienced strategists. Furthermore, its AI capabilities may not be able to accurately assess the environment and market conditions that could affect the success of a strategy. Finally, ChatGPT does not have the ability to assess the human factor, such as the impact of human behaviour or preferences, which is critical to the success of any strategy.

What is it about specific or complex advice on strategy creation and implementation that should not be placed in the hands of ChatGPT?

ChatGPT encounters limitations when it comes to providing more specific or complex advice about strategy creation. It relies on natural language processing to understand user input, which means it can’t provide detailed, specific advice on how to create a strategy. Additionally, ChatGPT can only provide information based on what it has received from its users, so it may not be able to advise on strategies outside its current knowledge.

Moving on to implementation: one cannot (one can, but should not) trust ChatGPT to provide guidance on the complexities of strategy implementation. Its most immediate limitations: it can’t advise on the timing of various tasks, the resources needed to complete them, the potential risks associated with them, or the best approach to overcoming them. What is more, it will provide no insight into how various stakeholders may respond to the implementation of a specific strategy. So if you want to use it, you will be better served employing it as a tool for the content of a strategy rather than its implementation.

How about change management?

Change management has a lot to do with understanding context and nuance, arguably one of the worst places for ChatGPT to be. This is not where you should look for guidance on how to implement change or how to manage the transition. ChatGPT will have nothing to say about how to handle resistance to change or how to ensure that change is successful. And obviously, ChatGPT cannot develop the relationships and trust necessary for successful change management, because of its lack of human insight (obviously), reduced engagement capabilities, absence of cultural sensitivity and inability to identify unforeseen challenges.

Cultural change processes? Forget about those

ChatGPT cannot provide an understanding of what is important to a particular culture, or develop new values and norms that could be accepted within that culture. It cannot help with conflict resolution, nor can it empathize with and understand the nuances of a conflict the way a human can, and consequently provide the same level of resolution. Neither can it do any of the community building work, as it has nothing to say when it comes to the connection and understanding between members of a community. However, even in the absence of such an understanding, it could mediate conflicts with some success based on the available literature.

Leadership then? I guess you can see what’s coming

  • Making decisions about organizational restructuring: difficult to do when one cannot assess the complex dynamics of an organization, its culture, and the potential impact of any changes.
  • Developing a shared vision and strategic plan: not possible without capturing the nuances of human emotion, essential to fostering commitment and alignment.
  • Managing resistance to change: to begin with, ChatGPT can’t read and interpret body language, facial expressions, and tone of voice to understand how people are responding to the proposed change.
  • Coaching and developing people: unlikely to happen in a personalized way, with the tailored advice and support that people need in order to develop and grow.

How about communication? ChatGPT is, after all, a conversational agent.

ChatGPT is suited for general conversation and small tasks. It can be used to provide customer support, answer FAQs, or provide basic information. As we've already witnessed, it can very well be used for automating certain tasks in a customer service journey: it can engage and interact with customers and provide support and guidance.

But because ChatGPT is unable to properly determine the context of complex conversations, the more complex the communicational setting, the more difficult it will be for it to provide accurate responses. It cannot understand the nuances of human language, which makes interpreting the intentions behind a conversation, well... difficult. Staying with the complexity variable, it cannot take into account a variety of perspectives and ideas, and at this moment neither can it fathom ideas outside of its pre-defined parameters. So creative solutions to complex problems are a limit here, which is where humans still shine.

Throughout this article, I felt you were not completely embracing ChatGPT - why is that?

It’s not it, it’s us.

‘The saddest aspect of life right now is that science 
gathers knowledge faster than society gathers wisdom.’
-Isaac Asimov

ChatGPT is fine as long as we are all on the same page about what it is, what it can do and what it can’t. But where and when has it ever been the case that everyone agreed on subjects like this?

Very importantly, everything described above as having the potential to go wrong rests on the optimistic assumption that ChatGPT uses accurate data, written by humans. And this in itself becomes a huge leap of faith with every passing day, given the proliferation of fake social media accounts and of inaccurate or even fake facts and samples of writing actually created by AI chatbots rather than humans. ‘Garbage in, garbage out’, you never get old!

What we find a bit tricky here is people, not ChatGPT. Some people may end up believing that ChatGPT can substitute for humans in a conversation because of the seemingly meaningful way in which it uses natural language processing and artificial intelligence algorithms to interpret complex conversations and respond.

But how it actually works is that it constructs a sentence word by word, selecting the most likely term to come next based on its training. It reaches an answer through elaborate guessing. It is because of this that it can argue incorrect or nonsensical answers as if they were true, in what some scientists call a ‘hallucination’ of fact and fiction.
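That word-by-word process can be illustrated with a toy sketch in Python. The tiny vocabulary and its probabilities are invented for illustration; a real model learns them from vast amounts of text:

```python
# Toy next-word model: for each word, a list of possible continuations
# with made-up probabilities (invented for illustration only).
NEXT_WORD = {
    "the": [("strategy", 0.5), ("plan", 0.3), ("market", 0.2)],
    "strategy": [("is", 0.6), ("will", 0.4)],
    "is": [("sound", 0.7), ("flawed", 0.3)],
}

def generate(start, length=4):
    """Build a sentence word by word, greedily picking the most likely
    next word. Note what is missing: the model never checks whether the
    result is true, only that it is statistically plausible."""
    words = [start]
    for _ in range(length - 1):
        options = NEXT_WORD.get(words[-1])
        if not options:
            break  # no known continuation: stop generating
        # pick the highest-probability continuation (greedy decoding)
        words.append(max(options, key=lambda pair: pair[1])[0])
    return " ".join(words)

print(generate("the"))  # → "the strategy is sound"
```

Whether "the strategy is sound" is actually true plays no role in how the sentence was produced - which is exactly how a confident-sounding hallucination comes about.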

A major challenge lies in our capacity to develop and deploy ChatGPT responsibly, with the purpose of enhancing our well-being. And in once again escaping an innate human tendency: seeking to attribute human traits, emotions, or intentions to non-human entities.