New HYBRID AI Model Just SHOCKED The Open-Source World - JAMBA 1.5

AI Revolution
AI21 Labs has released two new open-source AI models, Jamba 1.5 Mini and Jamba 1.5 Large, featuring ...
Video Transcript:
So AI21 Labs, the brains behind the Jurassic language models, has just dropped two brand-new open-source LLMs called Jamba 1.5 Mini and Jamba 1.5 Large. These models are designed with a unique hybrid architecture that incorporates cutting-edge techniques to enhance AI performance, and since they're open source you can try them out yourself on platforms like Hugging Face or run them on cloud services like Google Cloud Vertex AI, Microsoft Azure, and NVIDIA NIM. Definitely worth checking out.

All right, so what's this hybrid architecture all about? Let's break it down in simple terms. Most of the language models you know, like the ones used in ChatGPT, are based on the Transformer architecture. These models are awesome for a lot of tasks, but they've got one big limitation: they struggle when it comes to handling really large context windows. Think about when you're trying to process a super long document or a full transcript from a long meeting; regular Transformers get bogged down because they have to deal with all that data at once. That's where these new Jamba models from AI21 Labs come into play with a game-changing approach.

AI21 has cooked up a new hybrid architecture they're calling the SSM-Transformer. What's cool about this is it combines the classic Transformer model with something called a structured state-space model, or SSM. SSMs draw on older, more efficient techniques like recurrent neural networks and convolutional neural networks, which are better at handling computation efficiently. By using this mix, the Jamba models can handle much longer sequences of data without slowing down. That's a massive win for tasks that need a lot of context, like complex generative AI reasoning or summarizing a super long document.

Now, why is handling a long context window such a big deal? Think about it: when you're using AI for real-world applications, especially in businesses, you're often dealing with complex tasks. Maybe you're analyzing long meeting transcripts, summarizing a giant policy document, or running a chatbot that needs to remember a lot of past conversation. The ability to process large amounts of context efficiently means these models can give you more accurate and meaningful responses. Or Dagan, the VP of Product at AI21 Labs, nailed it when he said an AI model that can effectively handle long context is crucial for many enterprise generative AI applications. And he's right: without this ability, AI models often hallucinate, or just make stuff up, because they're missing important information. With the Jamba models and their unique architecture, they can keep more relevant info in memory, leading to way better outputs and less need for repetitive data processing. And you know what that means: better quality and lower cost.

All right, let's get into the nuts and bolts of what makes this hybrid architecture so efficient. One part of the model, called Mamba, is especially important. It was developed with insights from researchers at Carnegie Mellon and Princeton, and it has a much lower memory footprint and a more efficient way of handling context than a typical Transformer's attention. Unlike Transformers, which have to look at the entire context every single time, slowing things down, Mamba keeps a smaller state that gets updated as it processes the data. This makes it way faster and less resource-intensive.

Now, you might be wondering how these models actually perform. Well, AI21 Labs didn't just hype them up; they put them to the test on the RULER benchmark, which evaluates models on tasks like multi-hop tracing, retrieval, aggregation, and question answering. And guess what: the Jamba models came out on top, consistently outperforming models like Llama 3.1 70B, Llama 3.1 405B, and Mistral Large 2. On the Arena Hard benchmark, which is all about testing models on really tough tasks, Jamba 1.5 Mini and Large outperformed some of the biggest names in AI. Jamba 1.5 Mini scored an impressive 46.1, beating models like Mixtral 8x22B and Command R+, while Jamba 1.5 Large scored a whopping 65.4, outshining even the big guns like Llama 3.1 70B and 405B.

One of the standout features of these models is their speed. In enterprise applications, speed is everything: whether you're running a customer support chatbot or an AI-powered virtual assistant, the model needs to respond quickly and efficiently. The Jamba 1.5 models are reportedly up to 2.5 times faster on long context than their competitors, so not only are they powerful, they're also super practical for high-scale operations.

And it's not just about speed. The Mamba component in these models allows them to operate with a lower memory footprint, meaning they're not as demanding on hardware. For example, Jamba 1.5 Mini can handle context lengths up to 140,000 tokens on a single GPU. That's huge for developers looking to deploy these models without needing massive infrastructure.

All right, here's where it gets even cooler. To make these massive models more efficient, AI21 Labs developed a new quantization technique called ExpertsInt8. I know that might sound a bit technical, but here's the gist: quantization is a way to reduce the precision of the numbers used in the model's computations, which saves memory and computational cost without really sacrificing quality. ExpertsInt8 is special because it specifically targets the weights in the mixture-of-experts (MoE) layers of the model, which account for about 85% of the model's weights in many cases. By quantizing those weights to an 8-bit precision format and then dequantizing them directly inside the GPU during runtime, AI21 Labs managed to cut down the model size and speed up its processing. The result: Jamba 1.5 Large can fit on a single 8-GPU node while still using its full context length of 256K tokens. This makes Jamba one of the most resource-efficient models out there, especially if you're working with limited hardware.

Besides English, these models also support multiple languages, including Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew, which makes them super versatile for global applications. And here's the cherry on top: AI21 Labs made these models developer-friendly. Both Jamba 1.5 Mini and Large come with built-in support for structured JSON output, function calling, and even citation generation. This means you can use them to create more sophisticated AI applications that can perform tasks like calling external tools, digesting structured documents, and providing reliable references, all of which are super useful in enterprise settings.

One of the coolest things about Jamba 1.5 is AI21 Labs' commitment to keeping these models open. They're released under the Jamba Open Model License, which means developers, researchers, and businesses can experiment with them freely. And with availability on multiple platforms and cloud partners like AI21 Studio, Google Cloud, Microsoft Azure, NVIDIA NIM, and soon Amazon Bedrock, the Databricks Marketplace, and more, you've got tons of options for how you want to deploy and experiment with these models.

Looking ahead, it's pretty clear that AI models that can handle extensive context windows are going to be a big deal in the future of AI. As Or Dagan from AI21 Labs pointed out, these models are just better suited for the complex, data-heavy tasks that are becoming more common in enterprise settings. They're efficient, fast, and versatile, making them a fantastic choice for developers and businesses looking to push the boundaries in AI. So if you haven't checked out Jamba 1.5 yet…
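To get a feel for the hybrid design described in the transcript, here's a minimal sketch of how attention and Mamba layers might be interleaved in a block stack. The ratio and function names here are illustrative assumptions, not Jamba's published configuration:

```python
# Illustrative only: the real Jamba architecture interleaves Mamba and
# attention layers (with MoE feed-forward layers mixed in); the exact
# ratio below is an assumption, not the published configuration.
def build_hybrid_stack(n_blocks, attn_every=8):
    """Mostly cheap Mamba layers, with an attention layer only occasionally."""
    layers = []
    for i in range(n_blocks):
        if i % attn_every == attn_every - 1:
            layers.append("attention")   # full pairwise mixing, expensive
        else:
            layers.append("mamba")       # constant-state scan, cheap
    return layers

stack = build_hybrid_stack(32)
print(stack.count("mamba"), stack.count("attention"))  # 28 4
```

The design intuition is that a few attention layers retain the Transformer's strength at precise token-to-token lookups, while the many Mamba layers keep the overall memory and compute cost from blowing up on long contexts.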
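The transcript's point about Mamba keeping "a smaller state that gets updated as it processes the data" can be illustrated with a toy linear state-space recurrence. This is not Mamba itself (which uses input-dependent "selective" parameters and a hardware-efficient parallel scan); it only shows why the memory footprint stays constant no matter how long the sequence gets:

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Toy linear state-space recurrence: a single fixed-size hidden
    state h is updated once per token, so memory use is independent of
    sequence length (unlike attention, which must compare every token
    against every other token)."""
    d_state = A.shape[0]
    h = np.zeros(d_state)        # fixed-size state, regardless of length
    ys = []
    for x_t in x:                # one cheap update per token
        h = A @ h + B * x_t      # fold the new token into the state
        ys.append(C @ h)         # read the output from the compressed state
    return np.array(ys)

rng = np.random.default_rng(0)
d_state = 4
A = 0.9 * np.eye(d_state)        # stable decay of older context
B = rng.normal(size=d_state)
C = rng.normal(size=d_state)

out = ssm_scan(rng.normal(size=1000), A, B, C)
print(out.shape)                 # (1000,) — the state stayed 4 numbers throughout
```

A Transformer processing the same 1,000 tokens would build attention interactions over all token pairs; the recurrence above touches each token once and carries forward only a 4-number state, which is the core of the long-context efficiency claim.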
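And the ExpertsInt8 idea, store weights in 8-bit and dequantize on the fly at compute time, can be sketched with plain per-row symmetric int8 quantization. AI21's actual technique specifically targets the MoE expert weights and fuses the dequantization into GPU kernels; this numpy round-trip is only a sketch of the basic memory-versus-precision trade-off, with all names illustrative:

```python
import numpy as np

def quantize_int8(w):
    """Per-row symmetric int8 quantization: keep int8 weights plus one
    float scale per output row (~4x smaller than float32 storage)."""
    scale = np.abs(w).max(axis=1, keepdims=True) / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Expand int8 weights back to float right before the matmul
    (the step Jamba performs directly inside the GPU at runtime)."""
    return q.astype(np.float32) * scale

w = np.random.default_rng(1).normal(size=(8, 16)).astype(np.float32)
q, scale = quantize_int8(w)

print(q.nbytes + scale.nbytes, "bytes vs", w.nbytes)   # stored size shrinks ~4x
print(np.max(np.abs(dequantize(q, scale) - w)))        # reconstruction error is small
```

Since the MoE layers hold the bulk of the weights, cutting their storage roughly 4x is what lets a model like Jamba 1.5 Large fit on a single 8-GPU node at its full context length.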