  • CIS
    Members: Free
    IEEE Members: Free
    Non-members: Free
    Length: 00:28:41
06 Jun 2023

Anshumali Shrivastava, ThirdAI Corp.

ABSTRACT: Large Language Models (LLMs) such as GPT have enormous potential to drive automation and efficiency in the future. Every enterprise is rushing to become an early adopter of this novel technology. However, the cost, energy consumption, and privacy vulnerabilities of LLMs are becoming significant barriers. The primary issue is that LLMs require massively specialized infrastructure and training that is very costly in both money and carbon. In this lecture, we will look at emerging technologies that can reduce LLMs’ cost, computation, and energy footprint by several orders of magnitude. As a result, even commodity infrastructure such as CPUs is sufficient to build these massively large language models with complete “air-gapped privacy”. With this technology, we have the opportunity to disrupt the economics and carbon footprint of Mega-AI models. We will walk through some demos of the savings in cost and energy, including how to train 1B-parameter models on your laptop without draining the battery.
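
As a rough back-of-the-envelope illustration of the laptop claim (a minimal sketch, not material from the lecture), the short Python snippet below estimates how much memory the raw weights of a 1-billion-parameter model occupy at common numeric precisions; the order-of-magnitude savings discussed in the lecture come from algorithmic techniques beyond what is shown here.

# Minimal sketch: memory required just to hold the weights of a
# 1B-parameter model at common precisions (illustrative assumption,
# not the lecture's method).
PARAMS = 1_000_000_000

for precision, bytes_per_param in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1)]:
    gigabytes = PARAMS * bytes_per_param / 1024**3
    print(f"{precision}: ~{gigabytes:.1f} GB of weights")

At fp32 this is roughly 3.7 GB, which already fits in the RAM of a typical laptop; the harder part, and the subject of the lecture, is making training cheap enough in computation and energy to run on such commodity hardware.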
