Boosted AI Capabilities with Intel Core Ultra PCs – Experience Optimal Integration of Cutting-Edge Models, Including Stable Diffusion
The generative AI revolution has mostly focused on running large, complex AI models in server datacenters. Some models are optimized enough to run on ordinary computers, though, and Intel is making progress on that front.
Intel announced today that there are now over 500 AI models optimized for its new Intel Core Ultra processors, which were revealed in December and have started to appear in new laptop PCs. That list likely includes many experimental and test models that don’t serve a practical purpose for most applications, but there are a few big names: Phi-2, Meta’s Llama, Mistral, BERT, Whisper, and Stable Diffusion 1.5.
Intel said in a press release, “Models form the backbone of AI-enhanced software features like object removal, image super resolution or text summarization. There is a direct link between the number of enabled/optimized models and the breadth of user-facing AI features that can be brought to market. Without a model, the feature cannot be designed. Without runtime optimization, the feature cannot reach its best performance.”
Most (if not all) of those AI models can run on non-Intel hardware, but adding support for the hardware features specific to Intel’s latest chips makes them more practical for real-world use. For example, Intel said the optimization process with its OpenVINO toolkit included “load-balancing across all the compute units, compressing the models to run efficiently in an AI PC, and optimizing the runtime to take advantage of memory bandwidth and core architecture within Intel Core Ultra.”
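To make that concrete, here is a minimal sketch of how a developer might load an already-converted model with OpenVINO’s Python API and compile it for one of the Core Ultra’s compute units (CPU, integrated GPU, or NPU). The file name, device string, and static input shape are assumptions for illustration, not details from Intel’s announcement.

```python
# Minimal OpenVINO inference sketch (file name and device are placeholders).
import numpy as np
import openvino as ov

core = ov.Core()
print(core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] on a Core Ultra machine

# "model.xml" stands in for a model already converted to OpenVINO IR format.
model = core.read_model("model.xml")

# Compile for a specific compute unit; "AUTO" lets OpenVINO pick a device
# and balance the load across the available hardware.
compiled = core.compile_model(model, device_name="AUTO")

# Run a single inference with dummy input (assumes the model has a static input shape).
input_tensor = np.random.rand(*compiled.input(0).shape).astype(np.float32)
result = compiled(input_tensor)[compiled.output(0)]
print(result.shape)
```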
Running machine learning and AI models locally on computers is nothing new, but running newer generative AI models locally on PCs has a few interesting use cases. You could have something like ChatGPT or Microsoft Copilot running entirely on your own PC, potentially eliminating the privacy concerns and network connectivity requirements that come with sending prompt data to external servers. NVIDIA’s ChatRTX local chatbot is a step in that direction, but it’s still experimental and requires a PC with a powerful RTX 30- or 40-series graphics card.
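As a rough illustration of what a fully local chatbot could look like, the sketch below uses Hugging Face’s Optimum Intel integration to export one of the listed models (Phi-2) to OpenVINO and generate text with no network calls after the initial weight download. The specific packages, model ID, and generation settings are assumptions made for the example, not something Intel’s announcement prescribes.

```python
# Local text-generation sketch using Optimum Intel + OpenVINO.
# Assumes something like: pip install "optimum[openvino]" transformers
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "microsoft/phi-2"  # example model from the optimized list
tokenizer = AutoTokenizer.from_pretrained(model_id)

# export=True converts the PyTorch weights to OpenVINO IR on the fly.
model = OVModelForCausalLM.from_pretrained(model_id, export=True)

prompt = "Summarize why running AI models locally can help with privacy:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generation runs entirely on the local machine once the weights are cached.
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```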
Intel is hoping that software using these optimized models might push people to buy newer computers with Core Ultra processors. For now, though, cloud-based AI tools like ChatGPT and Copilot aren’t going anywhere.
Source: Intel
- Title: Boosted AI Capabilities with Intel Core Ultra PCs – Experience Optimal Integration of Cutting-Edge Including Stable Diffusion
- Author: Frank
- Created at: 2024-10-28 17:13:52
- Updated at: 2024-10-29 17:10:19
- Link: https://some-techniques.techidaily.com/boosted-ai-capabilities-with-intel-core-ultra-pcs-experience-optimal-integration-of-cutting-edge-including-stable-diffusion/
- License: This work is licensed under CC BY-NC-SA 4.0.