Building a Home Server to Meet Future AI Demands


How to Ensure Your Next Computer is Ready for AI

Make sure to plan ahead to meet future AI demands.

Image: Dall-E | By Dan Ackerman

AI is changing the way many of us interact with the internet, not to mention our computers. As AI tools like Copilot and ChatGPT work their way into Windows and macOS machines, buying a computer is about to get more interesting. The rapid pace of AI development can be dizzying, but one lesson is clear: if you're buying new computer hardware today, you need to consider how it will run AI tomorrow, especially if you plan to run AI locally rather than in the cloud.

Buying the best computer you can afford right now seems like the easy answer, and you'll probably end up with a machine that broadly fits your desired specifications. But if you want to future-proof for AI as much as possible, it's important to weigh the features that will matter most: dedicated machine-learning silicon, RAM, and storage space.

Machine Learning and AI Chips

The easiest feature to look for in any new purchase is dedicated hardware for machine learning and AI.

AMD, NVIDIA, and Apple have begun incorporating additional AI features into their chips, whether in graphics cards or full silicon packages. We’re still learning what these extra features offer to the average person, but we do know they are currently being used to speed up video, photo, and audio editing.

Intel has been a leader in integrating AI into its latest Core Ultra processors, designed specifically to optimize AI processing, especially if you’re putting AI tools directly on local hardware rather than in the cloud. AMD is also following suit, discussing processors with enhanced AI capabilities such as the Ryzen 7 8700G, while NVIDIA continues to impress with its AI-focused advancements in both hardware and software.

One key way we might benefit from these AI-capable chips is through offline assistants like Siri or Google Assistant. They won't match the complex reasoning of cloud models like Google's Gemini or ChatGPT, which can answer a question such as "How many M&Ms can I fit in the trunk of a Hyundai Sonata?", but they can speed up everyday tasks like calling a friend, setting timers, creating to-do lists, and even organizing calendars.

AI is also increasingly being used to help process images. Many photo applications can already recognize people, pets, and locations in photos. Both Apple and Google have search bars where you can type “beach” and get images of beaches. As AI technology improves on devices, these types of features may get better.

It’s All About Memory

Aside from the ML-specific chips made by Apple, AMD, Intel, and NVIDIA, the next thing you need to pay attention to is memory.

AI technology on devices requires as much memory as possible, including RAM and storage space, and that’s not an exaggeration. One of the best models currently running on computers, Llama 2 from Facebook owner Meta, can use up to 37GB of RAM. Yes, that’s RAM.
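To see why figures like Llama 2's 37GB are plausible, note that a model's RAM footprint is roughly its parameter count times the bytes stored per parameter, plus runtime overhead. Here is a minimal back-of-the-envelope sketch in Python; the function name and the 1.2x overhead factor are illustrative assumptions, not figures from the article:

```python
def model_ram_gib(params_billion: float, bytes_per_param: float = 2.0,
                  overhead: float = 1.2) -> float:
    """Rough RAM estimate for running a model locally.

    bytes_per_param: 2.0 for fp16 weights, 1.0 for 8-bit,
    0.5 for 4-bit quantization.
    overhead: assumed fudge factor for activations, caches,
    and runtime buffers.
    """
    return params_billion * 1e9 * bytes_per_param * overhead / 2**30

# A 13B-parameter model in fp16 lands around 30 GiB of RAM,
# while 4-bit quantization brings a 7B model under 4 GiB.
print(f"13B fp16:  {model_ram_gib(13):.1f} GiB")
print(f"7B  4-bit: {model_ram_gib(7, bytes_per_param=0.5):.1f} GiB")
```

The takeaway is that quantization, not just more RAM, is the other lever for fitting a model on consumer hardware.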

Meanwhile, NVIDIA’s new Chat with RTX is available for free download for PCs equipped with NVIDIA 30-series and 40-series GPUs, requiring about 36GB of storage space to download the installer.

While that may sound like a lot of memory today, don't worry too much. For now, on-device AI is likely to stay limited to simpler tasks that need only a small slice of system memory, while the heavy lifting is left to web-based AI like OpenAI's ChatGPT, Google's Gemini, and Microsoft's Copilot.

It's hard to say exactly how much memory we'll all need, but if you plan to buy a computer that will last more than a few years, it's a safe bet you'll want at least 16GB of RAM, and more likely 32GB.

ARM vs x86

One of the more interesting things we've learned so far about AI workloads is how heavily they rely on GPUs. Interestingly, it turns out that ARM-based chips currently offer tighter integration between the GPU, memory, and CPU than traditional x86 designs.

A key example is Apple, whose M-series chips compete with similar designs planned by Intel, AMD, and mobile chipmaker Qualcomm.

The good news for consumers is that competition could lower prices and drive innovation. Microsoft has already stated that it plans to incorporate AI into many aspects of its upcoming Windows updates, and Apple is reportedly not far behind. A new AI-forward Windows system based on Qualcomm ARM chips is also expected sometime in 2024.

Recommended Specifications

If you want to build or purchase a system that is ready not just for today’s AI tools but also for tomorrow’s AI, then some specifications to look for now include:

  • CPU: Intel Core Ultra (Meteor Lake) or AMD Ryzen 8000G Series
  • GPU: NVIDIA 4000 Series or AMD 7000 Series, with at least 12GB of VRAM
  • RAM: At least 32GB of DDR5 memory
  • Storage: 512GB or more SSD storage
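To see how a current machine stacks up against the RAM and storage items on this list, you can check both with the Python standard library. This is a minimal sketch: the RAM check uses POSIX `sysconf`, so it assumes Linux or macOS, and the 32GB/512GB thresholds simply mirror the recommendations above:

```python
import os
import shutil

MIN_RAM_GIB = 32      # matches the RAM recommendation above
MIN_DISK_GIB = 512    # matches the storage recommendation above

def installed_ram_gib() -> float:
    """Total physical RAM via POSIX sysconf (Linux/macOS only)."""
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 2**30

def total_disk_gib(path: str = "/") -> float:
    """Total capacity of the filesystem holding `path`."""
    return shutil.disk_usage(path).total / 2**30

if __name__ == "__main__":
    ram, disk = installed_ram_gib(), total_disk_gib()
    print(f"RAM:  {ram:.1f} GiB "
          f"({'OK' if ram >= MIN_RAM_GIB else 'below target'})")
    print(f"Disk: {disk:.1f} GiB "
          f"({'OK' if disk >= MIN_DISK_GIB else 'below target'})")
```

GPU model and VRAM can't be read portably from the standard library, so those two items are best checked with vendor tools such as `nvidia-smi`.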
Original Article:
https://www.microcenter.com/site/mc-news/article/ai-ready-pc-how-to.aspx
