List of Interesting Material

date: Jul 31, 2022
slug: interesting-material
status: Public
tags: Pinned
type: Post
updatedAt: Feb 21, 2023 06:46 AM

Book List

  • The Age of AI: And Our Human Future by Henry A. Kissinger, Eric Schmidt, and Daniel Huttenlocher
  • Life 3.0 by Max Tegmark (my boss)
  • Superintelligence by Nick Bostrom
  • I, Robot by Isaac Asimov

Internet

  • Wikipedia
    • The Mind-Body Problem: a philosophical debate concerning the relationship between thought and consciousness in the human mind, and the brain as part of the physical body.
    • Orchestrated Objective Reduction: a theory by Roger Penrose and Stuart Hameroff proposing that consciousness arises from quantum effects in cellular structures called microtubules.
    • AIXI: a theoretical mathematical formalism for an artificial general intelligence agent, combining Solomonoff induction with sequential decision theory (see the equation after this list).
    • Panpsychism: the view that a mindlike aspect is fundamental to reality and that everything in the universe has some degree of consciousness. See also: Philip Goff on Lex Fridman.
    • Limits of Computation
      • Bremermann's limit: a limit on the maximum rate of computation that can be achieved in a self-contained system in the material universe.
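
For concreteness: Bremermann derived the bound by combining mass–energy (E = mc²) with the quantum limit on distinguishing energy states, giving a maximum of

$$\frac{c^2}{h} \approx \frac{(3.0\times10^{8}\ \mathrm{m/s})^2}{6.63\times10^{-34}\ \mathrm{J\,s}} \approx 1.36\times10^{50}\ \text{bits per second per kilogram.}$$

Even a one-kilogram computer running at this bound would exceed today's fastest machines by more than thirty orders of magnitude.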
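
And to make the AIXI bullet above precise: Hutter's agent selects each action to maximize expected future reward under a Solomonoff-style mixture over every program q (run on a universal Turing machine U) consistent with the history of actions a, observations o, and rewards r up to horizon m:

$$a_k := \arg\max_{a_k}\sum_{o_k r_k}\cdots\max_{a_m}\sum_{o_m r_m}\big[r_k+\cdots+r_m\big]\sum_{q\,:\,U(q,\,a_1\ldots a_m)\,=\,o_1 r_1\ldots o_m r_m} 2^{-\ell(q)}$$

where ℓ(q) is the length of program q, so simpler world models get exponentially more weight. The agent is incomputable, which is why AIXI serves as a theoretical gold standard rather than a practical algorithm.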

Papers

  • Language Models are Few-Shot Learners by Brown et al., the original paper introducing GPT-3, a 175-billion-parameter model. “Scaling up language models greatly improves task-agnostic, few-shot performance, sometimes even reaching competitiveness with prior state-of-the-art fine-tuning approaches.” (A minimal prompt sketch follows this list.)
  • PaLM: Scaling Language Modeling with Pathways by Chowdhery et al., the 540-billion-parameter Pathways Language Model (PaLM) from Google Research. The Pathways system makes it possible to efficiently train a single model across multiple TPU v4 Pods. It achieves state-of-the-art few-shot performance across most language-understanding and generation tasks.
  • Why does deep and cheap learning work so well? by Lin, Tegmark, and Rolnick: neural networks can approximate arbitrary functions, but the class of functions of practical interest can frequently be approximated through “cheap learning” with exponentially fewer parameters than generic functions require (see the numerical sketch below).
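
To make “few-shot” concrete: the GPT-3 paper conditions the model on a natural-language task description plus a handful of worked examples inside the context window, with no gradient updates. The English–French pairs below follow the paper's own illustration (Figure 2.1); here is a minimal sketch in Python:

```python
# Few-shot prompting in the style of the GPT-3 paper: the model sees a task
# description and K demonstrations in its context, then completes the final,
# unanswered example. No weights are updated.
demonstrations = [
    ("sea otter", "loutre de mer"),
    ("peppermint", "menthe poivrée"),
]

prompt = "Translate English to French:\n"
for english, french in demonstrations:
    prompt += f"{english} => {french}\n"
prompt += "cheese =>"  # a capable model should complete this with "fromage"

print(prompt)
```

The paper's central finding is that the accuracy of these completions scales smoothly with model size, with the 175B model often matching fine-tuned baselines.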
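
One concrete instance of “cheap learning” from the Lin–Tegmark–Rolnick paper: multiplication can be approximated by just four neurons with any smooth activation σ satisfying σ''(0) ≠ 0, since the constant and odd-order Taylor terms of the four units cancel. A numerical sketch (softplus is my choice of σ here; the logistic sigmoid would not work, since its second derivative vanishes at 0):

```python
import numpy as np

def softplus(u):
    # softplus(u) = ln(1 + e^u); its second derivative at 0 is 1/4
    return np.log1p(np.exp(u))

SIGMA_PP_0 = 0.25  # softplus''(0)

def neural_multiply(x, y, lam=0.01):
    """Approximate x*y with four softplus neurons.

    Taylor-expanding, the constant and odd-order terms of the four units
    cancel, leaving 4 * lam**2 * softplus''(0) * x * y plus O(lam**4) error.
    """
    num = (softplus(lam * (x + y)) + softplus(-lam * (x + y))
           - softplus(lam * (x - y)) - softplus(-lam * (x - y)))
    return num / (4 * lam**2 * SIGMA_PP_0)

print(neural_multiply(3.0, 7.0))  # ≈ 21.0 (relative error is O(lam**2))
```

The broader claim is that the low polynomial order, locality, and symmetry of physically generated data are what let such small networks succeed where generic functions would need exponentially many parameters.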