The most important AI trends in 2024

IBM Technology
Video Transcript:
We're a little ways into 2024 now, and the pace of AI certainly isn't slowing down. But where will it be by the end of the year? Well, we've put together nine trends that we expect to emerge throughout the year. Some of them are broad and high level; some are a bit more technical. So let's get into them. Oh, and if you stumbled across this video in 2025, let us know how we did.

Okay, trend number one: this is the year of the reality check. It is the year of more realistic expectations.
When generative AI first hit mass awareness, it was met with breathless news coverage. Everyone was messing around with ChatGPT, DALL-E, and the like. Now that the dust has settled, we're starting to develop a more refined understanding of what AI-powered solutions can do. Many generative AI tools are now being implemented as integrated elements rather than standalone chatbots: they enhance and complement existing tools rather than revolutionize or replace them. Think Copilot features in Microsoft Office, or generative fill in Adobe Photoshop. Embedding AI into everyday workflows like these helps us to better understand what generative AI can and cannot do in its current form.
Trend number two: one area where generative AI is really extending its capabilities is multimodal AI. Multimodal models can take multiple types of data as input, and we already have interdisciplinary models today, like OpenAI's GPT-4V and Google's Gemini, that can move freely between natural language processing and computer vision tasks. Users can, for example, ask about an image and receive a natural language answer, or they could ask out loud for instructions to, let's say, repair something and receive visual aids alongside step-by-step text instructions. New models are also bringing video into the fold. Where this really gets interesting is in how multimodal AI allows models to process more diverse data inputs, which expands the information available for training and inference, for example by ingesting data captured by video cameras for holistic learning. So there's lots more to come this year.
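As a concrete illustration of that ask-about-an-image flow, here is a minimal sketch using the Hugging Face transformers visual-question-answering pipeline. The model checkpoint and the image filename are illustrative assumptions, not something specified in the video.

```python
# A minimal sketch of multimodal visual question answering:
# ask a question about an image, get a natural-language answer back.
from transformers import pipeline
from PIL import Image

# Load a visual-question-answering pipeline (downloads the model on first run).
# The checkpoint here is one public example; any VQA model would do.
vqa = pipeline("visual-question-answering", model="dandelin/vilt-b32-finetuned-vqa")

image = Image.open("broken_faucet.jpg")  # hypothetical local photo
answers = vqa(image=image, question="What tool do I need to fix this?")

# The pipeline returns candidate answers with confidence scores.
for candidate in answers:
    print(f"{candidate['answer']}: {candidate['score']:.2f}")
```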
Now, trend three relates to smaller models. Massive models jump-started the generative AI age, but they're not without drawbacks. According to one estimate from the University of Washington, training a single GPT-3-sized model requires the yearly electricity consumption of over 1,000 households. And you might be thinking, sure, that's training, we know that's expensive, but what about inference? Well, a standard day of ChatGPT queries rivals the daily energy consumption of something like 33,000 households. Smaller models, meanwhile, are far less resource intensive. Much of the ongoing innovation in LLMs has focused on yielding greater output from fewer parameters. GPT-4 is rumored to have around 1.76 trillion parameters, but many open-source models have seen success with model sizes in the 3-to-70-billion-parameter range. So, billions instead of trillions.
In December of last year, Mistral released Mixtral, a mixture of experts, or MoE, model integrating eight neural networks, each with 7 billion parameters. Mistral claims that Mixtral not only outperforms the 70-billion-parameter variant of Llama 2 on most benchmarks, at six times faster inference speeds no less, but that it even matches or outperforms OpenAI's far larger GPT-3.5 on most standard benchmarks. Smaller-parameter models can be run at lower cost, and run locally on many devices, like personal laptops.
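For intuition, here is a minimal sketch of the mixture-of-experts pattern behind a model like Mixtral: a small gating network routes each token to its top-k experts, and their outputs are combined, weighted by the gate's scores. The dimensions, expert count, and routing details below are illustrative assumptions, not Mixtral's actual architecture.

```python
# A minimal top-k mixture-of-experts layer in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim=512, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts)  # router
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )

    def forward(self, x):                                 # x: (tokens, dim)
        scores = self.gate(x)                             # (tokens, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        # Only the selected experts run for each token; that sparsity is why
        # an 8-expert MoE can be far cheaper at inference than a dense model
        # with the same total parameter count.
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = indices[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * self.experts[e](x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(10, 512)
print(layer(tokens).shape)  # torch.Size([10, 512])
```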
That, conversely, brings us to trend number four: GPU and cloud costs. The trend toward smaller models is driven as much by necessity as by entrepreneurial vigor. The larger the model, the higher the GPU requirements for training and inference. Relatively few AI adopters maintain their own infrastructure, so that puts upward pressure on cloud costs as providers update and optimize their own infrastructure to meet gen AI demand, all while everybody is scrambling to obtain the GPUs needed to power that infrastructure. If only these models were a bit more optimized, they'd need less compute. Yes, that is trend number five: model optimization.
This past year we've already seen adoption of techniques for training, tweaking, and fine-tuning pre-trained models, like quantization. You know how you can reduce the file size of an audio or video file just by lowering its bit rate? Well, quantization lowers the precision used to represent model data points, for example from 16-bit floating point to 8-bit integer, to reduce memory usage and speed up inference.
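As a sketch of that float-to-int8 idea in plain NumPy: real quantization schemes add per-channel scales, calibration data, and fused kernels, so treat this as the core arithmetic only.

```python
# Post-training weight quantization: map float weights to 8-bit integers
# with a single per-tensor scale, then dequantize at inference time.
import numpy as np

weights = np.random.randn(4096, 4096).astype(np.float16)  # stand-in layer weights

# Symmetric quantization: one scale maps the int8 range onto the weight range.
scale = np.abs(weights).max() / 127.0
q_weights = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# Dequantize (or fold the scale into the matmul in a real kernel).
deq = q_weights.astype(np.float32) * scale

print(f"memory: {weights.nbytes / 1e6:.0f} MB -> {q_weights.nbytes / 1e6:.0f} MB")
print(f"max abs error: {np.abs(weights.astype(np.float32) - deq).max():.4f}")
```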
Also, rather than directly fine-tuning billions of model parameters, something called LoRA, or low-rank adaptation, entails freezing pre-trained model weights and injecting trainable layers into each transformer block. LoRA reduces the number of parameters that need to be updated, which in turn dramatically speeds up fine-tuning and reduces the memory needed to store model updates. So expect to see more model optimization techniques emerge this year.
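Here is a minimal PyTorch sketch of that idea: the pre-trained weights are frozen, and a pair of trainable low-rank matrices is added alongside them. The rank, scaling, and layer sizes are illustrative assumptions; real setups typically apply this to the attention projections in each block.

```python
# LoRA in miniature: frozen base layer plus a trainable low-rank update.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank=8, alpha=16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                 # freeze pre-trained weights
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        # Frozen path plus the low-rank trainable update B @ A.
        return self.base(x) + self.scale * (x @ self.lora_a.T @ self.lora_b.T)

base = nn.Linear(4096, 4096)
lora = LoRALinear(base)
trainable = sum(p.numel() for p in lora.parameters() if p.requires_grad)
print(f"trainable: {trainable:,} vs frozen: {base.weight.numel():,}")
# trainable: 65,536 vs frozen: 16,777,216
```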
Okay, let's knock out a few more. The next one is all about custom, local models. Open-source models afford the opportunity to develop powerful custom AI models, meaning models trained on an organization's proprietary data and fine-tuned for its specific needs. Keeping AI training and inference local avoids the risk of proprietary data or sensitive personal information being used to train closed-source models or otherwise passing into the hands of third parties. And using things like RAG, or retrieval-augmented generation, to access relevant information, rather than storing all of that information directly within the LLM itself, helps to reduce model size.
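As a rough sketch of the RAG pattern, assuming the sentence-transformers library and a toy in-memory document store: embed the documents, retrieve the passages most similar to the question, and prepend them to the prompt so the model can answer from them rather than memorizing them. The documents, model choice, and downstream LLM are all placeholders.

```python
# Minimal retrieval-augmented generation: retrieve, then prompt.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Employees must request GPU quota through the internal cloud portal.",
    "Expense reports are due by the fifth business day of each month.",
    "Model training jobs over 8 GPUs require team-lead approval.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = embedder.encode(documents, normalize_embeddings=True)

question = "Who has to approve a large training job?"
q_vec = embedder.encode([question], normalize_embeddings=True)[0]

# Cosine similarity; vectors are normalized, so a dot product suffices.
top = np.argsort(doc_vecs @ q_vec)[::-1][:2]
context = "\n".join(documents[i] for i in top)

prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # feed this to a locally hosted LLM of your choice
```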
Trend number seven: virtual agents. That goes beyond the straightforward customer-experience chatbot, because virtual agents relate to task automation, where agents will get stuff done for you. They'll make reservations, complete checklist tasks, or connect to other services. So lots more to come there.

Trend number eight is all about regulation. In December of last year, the European Union reached provisional agreement on the Artificial Intelligence Act. Also, the role of copyrighted material in the training of AI models used for content generation remains a hotly contested issue, so expect much more to come in the area of regulation.

And finally, we're at trend number nine, which is the continuance of something called shadow AI: the unofficial personal use of AI in the workplace by employees. It's about using gen AI without going through IT for approval or oversight.
In one study from Ernst & Young, 90% of respondents said they used AI at work. Without corporate AI policies in place, and importantly, policies that are observed, this can lead to issues regarding security, privacy, compliance, that sort of thing. For example, an employee might unknowingly feed trade secrets to a public-facing AI model that continually trains on user input, or they might use copyright-protected material to train a proprietary model, which could expose the company to legal action.
The dangers of generative AI rise almost linearly with its capabilities, and that line is going up. With great power comes great responsibility.

So there you have it: nine important AI trends for this year. But why nine? Don't these things almost always come in tens? Well, yes, they do, and that's your job: what is the one AI trend for 2024 that we haven't covered here, the missing tenth trend? Let us know in the comments. If you have any questions, please drop us a line below, and if you want to see more videos like this in the future, please like and subscribe. Thanks for watching.