Slashdot ([info]syn_slashdot) writes
@ 2025-03-25 00:20:00


Software Engineer Runs Generative AI On 20-Year-Old PowerBook G4
A software engineer successfully ran Meta's Llama 2 generative AI model on a 20-year-old PowerBook G4, demonstrating how well-optimized code can push the limits of legacy hardware. MacRumors' Joe Rossignol reports: While hardware requirements for large language models (LLMs) are typically high, this particular PowerBook G4 model from 2005 is equipped with a mere 1.5GHz PowerPC G4 processor and 1GB of RAM. Despite this 20-year-old hardware, my brother was able to achieve inference with Meta's Llama 2 LLM on the laptop. The experiment involved porting the open-source llama2.c project, and then accelerating performance with a PowerPC vector extension called AltiVec. His full blog post offers more technical details about the project.

Read more of this story at Slashdot.
