How EDA can fix ChatGPT’s flaws and unlock business value

From instant translations and idea generation to composing emails and essays from scratch, ChatGPT is beginning to permeate our everyday lives. A UBS study found that the chatbot reached 100 million monthly active users in January, a mere two months after its launch, making it the fastest-growing consumer application in history.

The untold story: Limitations holding ChatGPT back

However, certain drawbacks and limitations currently prevent both ChatGPT and AI in general from achieving their full potential. This is where event-driven architecture (EDA) can play a crucial role by facilitating the flow of information between systems: systems that “publish” events interact with other systems that express interest in specific information by “subscribing” to relevant topics.

By incorporating EDA into application development, teams can integrate internal features seamlessly and make them more responsive. EDA can absorb and buffer requests whenever ChatGPT is invoked, improving response times, reducing energy consumption, and even opening up new e-commerce opportunities for both B2B and B2C businesses. Let’s delve into the details.
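To make the publish/subscribe pattern concrete, here is a minimal, self-contained Python sketch. It uses a toy in-memory broker rather than any particular vendor’s API, and the topic name and handlers are purely illustrative assumptions; a real deployment would use a production event broker.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class ToyBroker:
    """Toy in-memory event broker: publishers and subscribers share only topic names."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # Register interest in a topic; the handler is called for every event published to it.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan the event out to every subscriber of the topic; the publisher
        # does not know (or care) who receives it.
        for handler in self._subscribers[topic]:
            handler(event)

broker = ToyBroker()

# Two downstream systems express interest in ChatGPT results by subscribing.
broker.subscribe("chatgpt/results", lambda e: print("CRM received:", e["summary"]))
broker.subscribe("chatgpt/results", lambda e: print("Analytics received:", e["summary"]))

# A single publish reaches every interested subscriber, with no request/reply loop.
broker.publish("chatgpt/results", {"summary": "Q3 meeting notes and action items"})
```

Because the publisher only knows the topic, new consumers can be added later without touching it; that decoupling is what the rest of this article builds on.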

5 ways EDA unlocks ChatGPT’s potential

  1. Enable automatic answers by streamlining the request and response cycle
    Currently, ChatGPT operates using a conventional “request/reply” approach. You ask a question or make a request, and ChatGPT provides a response. However, envision a scenario where ChatGPT could proactively send you information that it knows you would find interesting!

    Consider this example: you use ChatGPT to summarise and capture action items from a Zoom meeting attended by multiple participants. Instead of each participant individually requesting the notes, EDA would let ChatGPT publish the meeting summary once, much like the publish/subscribe sketch above, and instantly share it with all attendees, including those who were unable to attend. Everyone would automatically and immediately be brought up to speed on the meeting outcomes. By proactively sending a single message to a group of recipients, ChatGPT would eliminate numerous request/reply interactions over time, improving service efficiency for users.

    This capability can benefit any collaborative activity that involves ChatGPT’s assistance. For example, teams working on a shared codebase could benefit greatly. Rather than ChatGPT individually suggesting changes or improvements to each developer through their integrated development environment (IDE), developers could subscribe through their IDEs to a suggestions topic, and the underlying EDA technology would distribute the suggestions to all subscribed developers when they access the codebase.
  2. Reduce ChatGPT’s energy consumption with more intelligent resource utilisation
    ChatGPT places a significant demand on computing resources, particularly processing power, necessitating the use of specialised graphics processing units (GPUs). ChatGPT relies on a considerable number of GPUs (currently estimated at upwards of 28,936) for model training and query processing, resulting in substantial expenses estimated at US$0.11 to US$0.36 per query.

    Additionally, it is crucial to acknowledge the environmental impact of this model. The substantial power consumption of these GPUs contributes to energy wastage, with data scientists estimating ChatGPT’s daily carbon footprint at approximately 23.04 kgCO2e, a figure comparable to that of other large language models such as BLOOM.

    There is room for improvement, however: handling requests more efficiently could lower that estimated daily carbon footprint.

    By implementing EDA, ChatGPT can optimise its resource usage by processing requests only when they are received, rather than running continuously. This approach would help reduce unnecessary energy consumption and contribute to a more sustainable operation.
  3. Address ChatGPT’s unavailability during peak loads
    ChatGPT faces the challenge of managing a high volume of incoming user requests. The popularity and rapid growth of ChatGPT, coupled with its unpredictable demand patterns, often overwhelm the system, resulting in error messages such as “sorry, can’t help you” for both premium and free users. Recent outages have highlighted the strain on the system as it struggles to scale rapidly and keep up with increasing traffic, while also facing competition from new rivals like Google Bard. So, how does EDA come into play?

    In the event of ChatGPT reaching its capacity, the implementation of EDA can play a vital role. EDA enables the buffering of requests and their asynchronous servicing across multiple event-driven microservices, allowing ChatGPT to handle requests as the service becomes available. By decoupling the services, a failure in one service does not cause a cascade failure in others.

    The event broker, a critical component of EDA, acts as a stateful intermediary that buffers and stores events until the service is back online, ensuring their delivery. This approach enables quick scaling by adding service instances without causing downtime for the entire system, improving both availability and scalability; a minimal store-and-forward sketch follows this list.

    With the assistance of EDA, users of ChatGPT services worldwide can submit their requests at any time, and ChatGPT will deliver the results as soon as they are ready. This eliminates the need for users to re-enter their queries to receive generative responses, leading to enhanced scalability and reduced response times.
  4. Integrate ChatGPT into business operations to disrupt the AI e-commerce marketplace
    AI plays a critical role in the e-commerce marketplace, with projections indicating that the e-commerce AI market will reach a value of US$45.72 billion by 2032. Consequently, it comes as no surprise that prominent e-commerce players are exploring ways to incorporate ChatGPT into their business operations. Shopify, for instance, has developed a shopping assistant powered by ChatGPT, which can provide product recommendations to users based on an analysis of their search engine queries.

    EDA holds the potential to further enhance the shopping experience and enable B2C and B2B businesses to gain deeper insights into their customers. By tracking key events at high volume from e-commerce platforms, businesses can uncover valuable patterns in customer behaviour. This includes identifying the most profitable items in specific regions and understanding the factors that influence purchasing decisions. Such information can be utilised by the ChatGPT machine learning model, which can predict customer behaviour and offer personalised product recommendations. These developments represent just the initial steps in leveraging ChatGPT-based models for these purposes.
  5. Improve responsiveness for your global user base
    Given that ChatGPT and ChatGPT apps have a global user base, efficient distribution of data from GPT queries becomes essential. In this context, an event mesh architecture is highly suitable to meet this demand.

    An event mesh is a layered architecture consisting of a network of event brokers. Its purpose is to route events from any application to any other application, regardless of where they are deployed. By employing an event mesh, you can dynamically direct data to interested subscribers on demand, rather than broadcasting ChatGPT results to all applications and relying on application logic to filter out irrelevant data. This approach significantly enhances the user experience while also optimising compute and network resources; a toy topic-routing sketch follows this list.
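Picking up the store-and-forward idea from point 3, the sketch below shows, in the same toy style, how a stateful queue can buffer requests while the ChatGPT-backed service is offline and drain them once an instance reconnects. The queue, field names, and prompts are illustrative assumptions; a production broker adds persistence, acknowledgements, and redelivery on top of this.

```python
from collections import deque

class BufferedQueue:
    """Toy stateful queue: buffers events while no consumer is connected."""

    def __init__(self) -> None:
        self._buffer: deque = deque()
        self._consumer = None  # set when a service instance connects

    def publish(self, event: dict) -> None:
        if self._consumer is None:
            self._buffer.append(event)   # store: the service is down or at capacity
        else:
            self._consumer(event)        # forward immediately

    def connect(self, consumer) -> None:
        # A service instance comes online: drain everything buffered, in order.
        self._consumer = consumer
        while self._buffer:
            self._consumer(self._buffer.popleft())

    def disconnect(self) -> None:
        self._consumer = None

queue = BufferedQueue()

# Peak load: the ChatGPT-backed service is unavailable, but requests keep arriving.
queue.publish({"user": "alice", "prompt": "Summarise yesterday's incident report"})
queue.publish({"user": "bob", "prompt": "Draft a reply to the customer complaint"})

# The service recovers (or a new instance is added) and works through the backlog.
queue.connect(lambda req: print(f"Processing {req['user']}'s request: {req['prompt']}"))
```

Users never have to resubmit: their requests wait in the buffer and are serviced as soon as capacity returns.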
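And for point 5, here is a toy sketch of the topic-based routing an event mesh provides: results are delivered only to subscribers whose topic pattern matches, instead of being broadcast to every application. The topic scheme (gpt/results/region/domain) and the wildcard syntax are illustrative assumptions, loosely modelled on common broker conventions.

```python
from collections import defaultdict
from fnmatch import fnmatch

class ToyMesh:
    """Toy topic router: subscriptions may use '*' wildcards in the topic pattern."""

    def __init__(self) -> None:
        self._subscriptions = defaultdict(list)  # pattern -> handlers

    def subscribe(self, pattern: str, handler) -> None:
        self._subscriptions[pattern].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver only to handlers whose pattern matches the topic; every other
        # application never sees the event, saving its compute and network.
        for pattern, handlers in self._subscriptions.items():
            if fnmatch(topic, pattern):
                for handler in handlers:
                    handler(event)

mesh = ToyMesh()
mesh.subscribe("gpt/results/emea/*", lambda e: print("EMEA app:", e["text"]))
mesh.subscribe("gpt/results/apac/retail", lambda e: print("APAC retail app:", e["text"]))

# Only the EMEA subscriber receives this result; APAC applications are untouched.
mesh.publish("gpt/results/emea/finance", {"text": "Personalised summary for an EMEA user"})
```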

Unlock the full potential of ChatGPT with EDA

ChatGPT is still in its early stages, but its rapid user adoption and regular feature updates indicate that its story is far from over. Whether it’s used to tackle service outages and reduce energy consumption, enhance scalability, resilience, and flexibility, or introduce new business use cases to B2B and B2C organisations, EDA has the capability to help this emerging generative AI tool build upon its current success.