Artificial Intelligence and Machine Learning Trends in 2024

Discover the transformative 2024 trends in AI and machine learning, encompassing custom enterprise models and the rise of open-source AI, with multimodal capabilities at the forefront of an industry-wide shift.

After OpenAI introduced ChatGPT in November 2022, 2023 became a pivotal year for artificial intelligence. Developments including a thriving open-source community and advanced multimodal models set the stage for significant AI progress.

While generative AI continues to captivate the tech world, a more nuanced and mature approach is emerging as organizations shift from experimentation to real-world applications. This year’s trends highlight deepened sophistication and caution in AI development, with a focus on ethics, safety, and evolving regulations.

Explore the top AI and machine learning trends for 2024:

1. Multimodal AI

Multimodal AI surpasses traditional single-mode data processing by including multiple input types like text, images, and sound—a leap toward mimicking human abilities to process diverse sensory information.

Mark Chen, Head of Frontiers Research at OpenAI, emphasized the importance of multimodal interfaces, stating, “We want our models to see what we see and hear what we hear, generating content that appeals to more than one of our senses.”

2. Agentic AI

Agentic AI signifies a notable shift from reactive to proactive AI. Unlike traditional AI systems, which primarily respond to user inputs and adhere to predetermined programming, these advanced agents comprehend their environment, set goals, and work to achieve those objectives without direct human intervention.

For instance, in environmental monitoring, an AI agent could be trained to collect data, analyze patterns, and initiate preventive actions in response to hazards such as early signs of a forest fire. Similarly, a financial AI agent could actively manage an investment portfolio using adaptive strategies that respond to real-time changing market conditions.
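The sense-decide-act loop behind such an agent can be sketched in a few lines. Everything below — the sensor readings, risk thresholds, and action names — is a hypothetical illustration of the pattern, not a real monitoring API:

```python
# Minimal sketch of an agentic sense-decide-act cycle for environmental
# monitoring. All thresholds and action names here are made up for
# illustration.

def assess_fire_risk(temperature_c: float, humidity_pct: float, smoke_ppm: float) -> str:
    """Classify fire risk from (hypothetical) sensor readings."""
    if smoke_ppm > 50 or (temperature_c > 40 and humidity_pct < 20):
        return "high"
    if temperature_c > 35 and humidity_pct < 30:
        return "elevated"
    return "normal"

def agent_step(readings: dict) -> str:
    """One cycle: perceive the environment, assess it, and choose an action."""
    risk = assess_fire_risk(readings["temp"], readings["humidity"], readings["smoke"])
    actions = {
        "high": "dispatch_alert",        # notify responders immediately
        "elevated": "increase_sampling",  # poll sensors more frequently
        "normal": "log_and_wait",
    }
    return actions[risk]

print(agent_step({"temp": 42.0, "humidity": 15.0, "smoke": 10.0}))  # dispatch_alert
```

The key difference from a chatbot is that nothing in this loop waits for a user prompt: the agent perceives, decides, and acts on its own schedule.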

“2023 was the year of being able to chat with an AI,” noted computer scientist Peter Norvig, a fellow at Stanford’s Human-Centered AI Institute, in a recent blog post. “In 2024, we’ll see the ability for agents to get stuff done for you. Make reservations, plan a trip, connect to other services.”

3. Open-source AI

Building large language models and other powerful generative AI systems is expensive, requiring enormous amounts of computing and data. But using an open-source model lets developers build on top of others’ work, reducing costs and expanding AI access. Open-source AI is publicly available and typically free, allowing organizations and researchers to contribute to and build on existing code.

GitHub data from the past year shows a remarkable increase in developer engagement with AI, particularly generative AI. In 2023, generative AI projects entered the top 10 most popular projects across the code hosting platform for the first time, with projects such as Stable Diffusion and AutoGPT pulling in thousands of first-time contributors.

Early in the year, open-source generative models were limited in number, and their performance often lagged behind proprietary options such as ChatGPT. But the landscape broadened significantly throughout 2023 to include powerful open-source contenders such as Meta’s Llama 2 and Mistral AI’s Mixtral models. This could shift the dynamics of the AI landscape in 2024 by giving smaller, less-resourced entities access to sophisticated AI models and tools previously out of reach.

“It gives everyone easy, fairly democratized access, and it’s great for experimentation and exploration,” Barrington said.

4. Retrieval-augmented generation

Although generative AI tools were widely adopted in 2023, they continue to be plagued by the problem of hallucinations: plausible-sounding but incorrect responses to users’ queries. This limitation has presented a roadblock to enterprise adoption, where hallucinations in business-critical or customer-facing scenarios could be catastrophic. Retrieval-augmented generation (RAG) has emerged as a technique for reducing hallucinations, with potentially profound implications for enterprise AI adoption.

RAG blends text generation with information retrieval to enhance the accuracy and relevance of AI-generated content. It enables LLMs to access external information, helping them produce more accurate and contextually aware responses. Bypassing the need to store all knowledge directly in the LLM also reduces model size, which increases speed and lowers costs.

“You can use RAG to gather a ton of unstructured information, documents, etc., [and] feed it into a model without having to fine-tune or custom-train a model,” Barrington said.

These benefits are particularly enticing for enterprise applications where up-to-date factual knowledge is crucial. For example, businesses can use RAG with foundation models to create more efficient and informative chatbots and virtual assistants.
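The RAG pipeline described above — retrieve relevant documents, then ground the generator’s prompt in them — can be sketched minimally. The document store and word-overlap scoring below are toy stand-ins; a production system would use embedding-based vector search and a real LLM call:

```python
import re

# Minimal sketch of RAG plumbing: retrieve the most relevant documents,
# then prepend them as context to the user's question. The corpus and
# scoring function are illustrative only.

DOCUMENTS = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9am to 5pm, Monday through Friday.",
    "Shipping is free on orders over $50.",
]

def tokens(text: str) -> set[str]:
    """Lowercase word set, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive word overlap with the query."""
    q = tokens(query)
    return sorted(docs, key=lambda d: len(q & tokens(d)), reverse=True)[:k]

def build_prompt(query: str, context_docs: list[str]) -> str:
    """Ground the generator by prepending retrieved context to the question."""
    context = "\n".join(context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

query = "What is the refund policy?"
prompt = build_prompt(query, retrieve(query, DOCUMENTS))
print(prompt)
```

Because the retrieved context is injected at query time, the knowledge base can be updated without retraining the model — which is exactly why the approach appeals to enterprises with fast-changing factual data.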

5. Customized Enterprise Generative AI Models

While broad, multi-purpose tools like Midjourney and ChatGPT gain attention, businesses increasingly lean toward smaller, purpose-specific models tailored to niche requirements. Rather than creating entirely new models, organizations often customize existing generative AI models by adjusting their architecture or refining them with domain-specific datasets. This approach is more cost-effective than building a new model from scratch or relying on expensive API calls to public large language models (LLMs).

Shane Luke, VP of AI and Machine Learning at Workday, highlighted the advantages, stating, “Calls to GPT-4 as an API, just as an example, are costly in terms of cost and latency. We are working on optimizing to have the same capability, but it’s very targeted and specific. And so it can be a much smaller, more manageable model.”

Customized generative AI models excel in catering to specific markets and user needs. They can be crafted for diverse scenarios, including customer support, supply chain management, or document review. This customization proves particularly valuable in sectors with specialized terminology and practices like healthcare, finance, and legal industries.
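One widely used family of techniques for refining an existing model this way — not named in the article, but a representative example — is parameter-efficient fine-tuning such as low-rank adaptation (LoRA). The toy NumPy sketch below (with made-up dimensions) shows the core idea: the pretrained weight stays frozen while only a small low-rank correction is trained.

```python
import numpy as np

# Toy sketch of low-rank adaptation (LoRA): instead of retraining the
# frozen pretrained weight W, train a small correction B @ A of rank r.
# All dimensions here are illustrative toy values.

rng = np.random.default_rng(0)
d, k, r = 64, 64, 4                  # layer dims; low rank r << d

W = rng.normal(size=(d, k))          # pretrained weight: frozen during tuning
A = rng.normal(size=(r, k)) * 0.01   # trainable low-rank factor
B = np.zeros((d, r))                 # zero-initialized, so W is unchanged at start

def adapted_forward(x: np.ndarray) -> np.ndarray:
    """Forward pass with the low-rank correction added to the frozen base."""
    return x @ (W + B @ A).T

# Trainable parameters drop from d*k to r*(d+k):
full_params, lora_params = d * k, r * (d + k)
print(full_params, lora_params)  # 4096 512
```

Because B starts at zero, the adapted layer initially reproduces the pretrained behavior exactly; domain-specific training then updates only r·(d+k) parameters instead of d·k, which is what makes customization affordable for smaller teams.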
