
Four ways AI applications are evolving right now – and why everyone in martech should be paying attention

  • Félix Chrétien, AI Architect, Knak

  • Aidan MacMillan, Software Developer, Knak

Published May 22, 2024

4 AI Trends Revolutionizing Martech from Google Cloud Next '24

AI is so new, and everything is happening so fast, that it’s hard to keep track of what’s changing, and how. Is a particular idea a flash in the pan, or does it herald a significant new development? Both developers and tech-savvy marketers are eager to distinguish between buzz and genuine, lasting innovation.

In this post, we want to share four fundamental trends in generative AI that were highlighted at the recent Google Cloud Next ’24 conference in Las Vegas. These insights resonate with what we see happening in the martech and AI industries, and they will be of interest to anyone in marketing, particularly marketing technologists and ops professionals.

1. It’s becoming easier to integrate AI into various marketing applications

We take AI very seriously at Knak. We are committed to getting as much as we can out of the technology so that we can support our customers as they use AI to create awesome marketing campaigns. In fact, our main reason for going to Google Cloud Next ’24 was to learn about and validate best practices for implementing AI.

We were pleased to find that over the last year, discourse has shifted from basic experiments with large language models (LLMs) to a robust dialogue about best practices in prompt engineering and fine-tuning.

While the field is still awash in shades of grey, common challenges have emerged across industries, revealing patterns in LLM integration and application.

Principles like evaluation-driven development, online performance logging, and considerations for scalable deployment – principles learned from developers’ experience in traditional machine learning and data science – are more relevant than ever.

The machine learning operations (MLOps) market, which includes platforms like Google Cloud’s Vertex AI, Weights and Biases, and Arize AI, has built on these principles in rolling out tools that expedite development and ease the integration of AI into various applications.
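To make one of those principles concrete, here is a minimal sketch of online performance logging. It is only an illustration: the `call_llm` helper and the logged fields are hypothetical, and MLOps platforms like the ones above provide far richer versions of the same idea out of the box.

```python
import json
import time
import uuid
from datetime import datetime, timezone


def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real model call (replace with your provider's client)."""
    return "…model output…"


def logged_llm_call(prompt: str, log_path: str = "llm_calls.jsonl") -> str:
    """Call the model and append a structured log record for later evaluation."""
    start = time.perf_counter()
    output = call_llm(prompt)
    record = {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "output": output,
        "latency_ms": round((time.perf_counter() - start) * 1000, 1),
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return output


if __name__ == "__main__":
    logged_llm_call("Suggest three subject lines for a product-launch email.")
```

Records like these are what make evaluation-driven development possible later on: you can replay real prompts against new model versions and compare the results.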

The lesson for people in martech: A rapid evolution in best practices for LLMs has resulted in a range of new tools that make it easier to use AI in marketing. A few examples are Knak’s new AI translation and email subject line generation tools.

2. As AI improves, keep an eye on quality assurance

AI applications are inherently dynamic and unpredictable. They therefore need continuous monitoring, along with innovative testing approaches tailored to the specific tasks they are meant to accomplish.

In addition to software performance, they need to be tested for equity and fairness.

While there is not yet a standardized methodology for evaluating LLMs, the consensus at Google Cloud Next ’24 was clear: an evaluation-driven approach is crucial.

The distinct challenge with LLMs, compared to traditional machine learning, is the lack of objective, universally accepted performance metrics.

To meet that challenge, developers are getting better at evaluating the performance of AI tools.

They are becoming creative, devising bespoke tests that might include anchoring tasks to numerical scores, relying on standardized human evaluations, or using auxiliary LLMs as judges.
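As a rough illustration of that last idea, here is a minimal sketch of an LLM-as-judge evaluation. The `call_llm` helper is a hypothetical stand-in for whatever model client you use, and the rubric and 1–5 scale are ours, not a standard.

```python
# Minimal sketch of an "LLM as judge" evaluation: a judge model scores each
# candidate output against a rubric, and we average the scores over a batch.
JUDGE_PROMPT = """You are grading an email subject line.
Rubric: relevance to the brief, clarity, and length under 60 characters.
Brief: {brief}
Subject line: {candidate}
Reply with a single integer score from 1 (poor) to 5 (excellent)."""


def call_llm(prompt: str) -> str:
    """Hypothetical model call; replace with your provider's client."""
    return "4"


def judge(brief: str, candidate: str) -> int:
    """Ask the judge model for a score and clamp it to the 1-5 range."""
    reply = call_llm(JUDGE_PROMPT.format(brief=brief, candidate=candidate))
    try:
        return max(1, min(5, int(reply.strip())))
    except ValueError:
        return 1  # treat unparseable replies as a failing score


def evaluate(brief: str, candidates: list[str]) -> float:
    """Average judge score over a batch of candidate outputs."""
    scores = [judge(brief, c) for c in candidates]
    return sum(scores) / len(scores)


if __name__ == "__main__":
    print(evaluate("Spring sale launch", ["Spring savings start today", "BUY NOW!!!"]))
```

The point is less the specific rubric than the discipline: every change to a prompt or model can be re-scored against the same test set.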

The lesson for people in martech: Expect the quality of your AI tools to improve rapidly. But as they do, always make sure your AI suppliers have processes in place to evaluate their models and their potential biases.

3. Data still matters

A year ago, people were talking about the promise of data-less workflows thanks to pre-trained foundation models.

People aren’t talking that way anymore. They have discovered that data still matters. And they are using AI to get themselves – and their data – more organized.

For example, companies have been making tremendous use of generative AI as a search tool. This has been especially useful in firms where internal knowledge is scattered across several locations, making searches inherently difficult. AI-powered search overcomes that problem.

In fact, the proliferation of internal AI assistants made Google Cloud Next ’24 almost as much about the retrieval-augmented generation (RAG) architecture as about AI itself.

The ability to search and organize data is having an unexpected effect.

AI is encouraging some companies to digitize and streamline their own internal documentation and work processes – something they might not have done without AI giving them a push.

For example, internal chatbots that employees turn to for help with technical issues won’t work efficiently unless they have access to the company’s documentation. So some teams are taking advantage of AI initiatives to overhaul the quality and accessibility of their internal data.
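For the curious, here is a minimal sketch of the RAG pattern behind such chatbots. It uses a crude word-overlap retriever to stay dependency-free (real systems typically use embedding-based vector search), and the `call_llm` helper, the sample documents, and the prompt wording are all hypothetical.

```python
# Minimal sketch of retrieval-augmented generation over internal docs:
# retrieve the most relevant documents, then answer with them as context.
DOCS = {
    "vpn-setup.md": "To connect to the VPN, install the client and sign in with SSO.",
    "expense-policy.md": "Expenses under $50 do not require a receipt.",
    "brand-guide.md": "All emails must use the approved logo and color palette.",
}


def call_llm(prompt: str) -> str:
    """Hypothetical model call; replace with your provider's client."""
    return "…answer grounded in the retrieved documents…"


def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents sharing the most words with the question."""
    q_words = set(question.lower().split())
    ranked = sorted(
        DOCS.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in ranked[:k]]


def answer(question: str) -> str:
    """Build a context-grounded prompt from the retrieved documents and ask the model."""
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return call_llm(prompt)


if __name__ == "__main__":
    print(answer("How do I connect to the VPN?"))
```

Notice that the quality of the answer depends entirely on the quality of the documents being retrieved, which is exactly why RAG pushes teams to clean up their internal data.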

The lesson for people in martech: Use AI to get yourself organized. You may find it provides a reason (and maybe even the budget) to streamline your own internal processes.

4. Small, specialized AI tools are proliferating

The big players in generative AI are well-known. The conference highlighted not only Google’s Gemini, with its impressive one-million-token context window, but also the diverse array of models from third-party partners like Anthropic’s Claude, which is supported in Google Cloud through Vertex AI.

But we noted that there is a shift away from using large, high-performance models for every task. Instead, companies are turning to smaller, specialized models for particular jobs.

The reason is simple: top-tier models can be expensive, both in licensing fees and in the compute needed to run them.

While large models remain valuable for complex applications, smaller ones are increasingly viable for simpler tasks like straightforward labeling.

The open-source community in particular is leading the charge in reducing AI costs, enabling broader application of AI tools and helping the technology reach its transformative potential.
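As a sketch of what that shift can look like in practice, the example below routes simple labeling tasks to a small (possibly open-source) model and reserves the large model for open-ended work. The model names and the `call_model` helper are placeholders, not real API identifiers.

```python
# Minimal sketch of routing requests by task type: cheap, small models for
# simple labeling, a large model only for open-ended generation.
SMALL_MODEL = "small-open-source-classifier"   # placeholder name
LARGE_MODEL = "large-frontier-model"           # placeholder name

SIMPLE_TASKS = {"label_sentiment", "detect_language", "tag_topic"}


def call_model(model: str, prompt: str) -> str:
    """Hypothetical stand-in for whatever client each model uses."""
    return f"[{model}] response"


def route(task: str, prompt: str) -> str:
    """Send simple labeling tasks to the small model, everything else to the large one."""
    model = SMALL_MODEL if task in SIMPLE_TASKS else LARGE_MODEL
    return call_model(model, prompt)


if __name__ == "__main__":
    print(route("label_sentiment", "I love this product!"))
    print(route("draft_campaign", "Write a launch email for our new feature."))
```

Even a crude routing rule like this can cut costs significantly when the bulk of traffic consists of simple, repetitive tasks.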

The lesson for people in martech: It’s worth looking at some of the new, smaller tools out there. These tools work well with the big players and they can help keep costs down.

Parting thoughts

We came away from Google Cloud Next ’24 as convinced as ever that AI presents a tremendous opportunity for businesses in general and for marketing in particular.

The use of AI tools will help every business improve its productivity.

As for marketers, we see AI taking on more and more routine tasks so that marketers can focus on the creative side of the job.

That belief underpins our approach to developing and refining our product. We at Knak want to continue to support and accelerate the creation of email campaigns with generative AI.

In other words, we want Knak to support the idea that AI can handle the repetitive tasks machines do best, so that humans can focus on what humans do best: being creative.

Feel free to reach out if you have any thoughts or comments.


