GoodAI Research Roadmap 2021/2022 | GoodAI

In 2021/22 GoodAI will focus on four core research areas: learning to learn, lifelong (gradual) learning, open-endedness, and generalization / extrapolation of meta-learned algorithms.

AI takeover

An AI takeover is a hypothetical scenario in which some form of artificial intelligence (AI) becomes the dominant form of intelligence on Earth, with computer programs or robots effectively taking control of the planet away from the human species. Possible scenarios include replacement of the entire human workforce, takeover by a superintelligent AI, and the popular notion of a robot uprising. Some public figures, such as Stephen Hawking and Elon Musk, have advocated research into precautionary measures to ensure future superintelligent machines remain under human control.

How Naftali Tishby’s Information Bottleneck Theory Can Break Open The Black Box Of Deep Learning

Deep learning has been achieving tremendous results, but it is largely unknown how deep learning models achieve such feats. Tishby proposes a theory that can answer some of these questions.
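As a one-line sketch of the theory's central object (standard information-bottleneck notation, not this article's): a hidden representation T of the input X is sought that is maximally compressed while staying predictive of the label Y, by minimizing the IB Lagrangian

```latex
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)
```

where I(·;·) denotes mutual information and β trades off compression of X against prediction of Y.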

Category:Memory processes

Memory processes are ways of classifying memories based on the duration, nature, and retrieval of information.

Molecular evolution

Molecular evolution is the process of change in the sequence composition of cellular molecules such as DNA, RNA, and proteins across generations. The field of molecular evolution uses principles of evolutionary biology and population genetics to explain patterns in these changes. Major topics in molecular evolution concern the rates and impacts of single nucleotide changes, neutral evolution vs. natural selection, origins of new genes, the genetic nature of complex traits, the genetic basis of speciation, evolution of development, and ways that evolutionary forces influence genomic and phenotypic changes.

Long Short-term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, ...
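A minimal single-step sketch (plain NumPy with illustrative dimensions, not the paper's notation) of the gated cell state LSTM introduces so that information can be carried across long time intervals:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step: gates decide what the cell state keeps, adds, and emits."""
    z = W @ np.concatenate([x, h]) + b   # all four gate pre-activations at once
    n = h.size
    f = sigmoid(z[0*n:1*n])              # forget gate
    i = sigmoid(z[1*n:2*n])              # input gate
    o = sigmoid(z[2*n:3*n])              # output gate
    g = np.tanh(z[3*n:4*n])              # candidate cell update
    c = f * c + i * g                    # additive update -> stable error flow
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(0)
nx, nh = 3, 4                            # toy sizes, chosen for illustration
W = rng.standard_normal((4 * nh, nx + nh)) * 0.1
b = np.zeros(4 * nh)
h, c = np.zeros(nh), np.zeros(nh)
for t in range(10):
    h, c = lstm_step(rng.standard_normal(nx), h, c, W, b)
```

The additive form of the cell update (`c = f * c + i * g`) is what lets gradients pass through many steps without vanishing, which is the paper's core point.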

Event-based backpropagation can compute exact...

Scientific Reports - Event-based backpropagation can compute exact...

Back-propagation Now Works in Spiking Neural Networks!

by Timothée Masquelier (UMR5549 CNRS - Université Toulouse 3) Back-propagation is THE learning algorithm behind the deep learning revolution. Until recently, it was not possible to use it in spiking neural networks (SNN), due to non-differentiability issues. But these issues can now be circumvented, signalling a new era for SNNs.
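The usual workaround is a surrogate gradient: the spike itself stays a hard threshold in the forward pass, while the backward pass substitutes a smooth derivative. A toy illustration (my own sketch; the choice of a fast-sigmoid surrogate is one common option, not the only one):

```python
import numpy as np

def spike(v, thresh=1.0):
    """Non-differentiable forward pass: emit a spike when the membrane
    potential crosses threshold (derivative is zero almost everywhere)."""
    return (v >= thresh).astype(float)

def surrogate_grad(v, thresh=1.0, beta=5.0):
    """Backward pass: replace the spike's ill-defined derivative with the
    smooth derivative of a fast sigmoid, peaked at the threshold."""
    return beta / (1.0 + beta * np.abs(v - thresh)) ** 2

v = np.linspace(0.0, 2.0, 5)   # sample membrane potentials
s = spike(v)
g = surrogate_grad(v)
```

Because the surrogate is largest near the threshold, credit flows mainly to neurons that were close to spiking, which is what makes gradient descent in SNNs work in practice.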

The Oxford Handbook of Cognitive Science

Cognitive Science is an avowedly multidisciplinary field, drawing upon many traditional disciplines or research areas--including Linguistics, Neuroscience, Philosophy, Psychology, Anthropology, Artificial Intelligence, and Education--that contribute to our understanding of cognition. Just as learning and memory cannot truly prove effective as disconnected studies, practical applications of cognitive research, such as the improvement of education and human-computer interaction, require dealing with more complex cognitive phenomena by integrating the methods and insights from multiple traditional disciplines. The societal need for such applications has played an important role in the development of cognitive science. The Oxford Handbook of Cognitive Science emphasizes the research and theory that is most central to modern cognitive science. Sections of the volume address computational theories of human cognitive architecture; cognitive functioning, such as problem solving and decision making, as they have been studied with both experimental methods and formal modeling approaches; and cognitive linguistics and the advent of big data. Chapters provide concise introductions to...

Weight Agnostic Neural Networks

Have you ever wondered how most mammals are capable of fairly complex tasks, like walking, straight after being born? They haven’t had…


Behavior reflects nervous system activity and is dependent on multiple factors including external stimuli, past experience, neuronal structure and changes in the internal milieu of the animal. Alterations at the cellular or functional level can profoundly alter basal and evoked activity. Therefore, behavioral assays offer the researcher simple, sensitive and powerful tools to interrogate neuronal function.

(Unit 3 Application) Evolving Neural Network for Time Series Analysis

The Culmination of Unit 3 by applying our Concepts to Evolve a Neural Network for Predicting a Time Series Problem
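In outline, evolving a network for a series-prediction task means mutating its weights and keeping only mutations that reduce prediction error. A minimal sketch with a made-up toy task (regressing a sine wave; this is my own simplification, not the unit's actual code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy task: regress y = sin(t) with a tiny one-hidden-layer net
# whose 24 weights are evolved rather than backpropagated.
t = np.linspace(0.0, 2.0 * np.pi, 64)
x = t / np.pi - 1.0                       # inputs scaled to [-1, 1]
y = np.sin(t)

def predict(w, x):
    w1, b1, w2 = w[:8], w[8:16], w[16:24]
    h = np.tanh(np.outer(x, w1) + b1)     # (64, 8) hidden activations
    return h @ w2

def mse(w):
    return float(np.mean((predict(w, x) - y) ** 2))

# (1+1) evolution: keep a mutation only when it lowers the error.
best = rng.standard_normal(24) * 0.1
init_err = best_err = mse(best)
for _ in range(3000):
    cand = best + rng.standard_normal(24) * 0.05
    err = mse(cand)
    if err < best_err:
        best, best_err = cand, err
```

The same loop scales up by evolving topology as well as weights (as in NEAT-style methods), but the accept-if-better core is unchanged.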

Neurogenesis in the nematode Caenorhabditis elegans

The nervous system represents the most complex tissue of C. elegans both in terms of numbers (302 neurons and 56 glial cells = 37% of the somatic cells in a hermaphrodite) and diversity (118 morphologically distinct neuron classes). The lineage and morphology of each neuron type has been described in detail and neuronal fate markers exist for virtually all neurons in the form of fluorescent...

Why general artificial intelligence will not be...

Humanities and Social Sciences Communications - Why general artificial intelligence will not be...

Adaptive resonance theory

Adaptive resonance theory (ART) is a theory developed by Stephen Grossberg and Gail Carpenter on aspects of how the brain processes information. It describes a number of neural network models which use supervised and unsupervised learning methods, and address problems such as pattern recognition and prediction.
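The best-known concrete instance is ART-1, which clusters binary inputs: a category "resonates" with an input only if their overlap passes a vigilance threshold, otherwise a new category is recruited. A toy sketch of that loop (my simplification of fast-learning ART-1, not Grossberg and Carpenter's full dynamical model):

```python
import numpy as np

def art1(inputs, rho=0.7, alpha=0.1):
    """Minimal ART-1 clustering of binary vectors (fast learning).
    rho is the vigilance parameter: higher rho -> finer categories.
    Assumes every input has at least one active bit."""
    weights = []    # one binary prototype per category
    labels = []
    for I in inputs:
        I = np.asarray(I, dtype=float)
        # Try categories in order of the choice function (best match first).
        order = sorted(range(len(weights)),
                       key=lambda j: -np.sum(np.minimum(I, weights[j]))
                                     / (alpha + np.sum(weights[j])))
        for j in order:
            match = np.minimum(I, weights[j])
            if match.sum() / I.sum() >= rho:   # vigilance (resonance) test
                weights[j] = match             # prototype shrinks toward input
                labels.append(j)
                break
        else:
            weights.append(I.copy())           # no resonance: new category
            labels.append(len(weights) - 1)
    return labels, weights

data = [[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1], [1, 1, 0, 0]]
labels, protos = art1(data)
```

The vigilance test is ART's answer to the stability-plasticity dilemma: familiar inputs refine an existing category, while sufficiently novel inputs get a fresh one instead of overwriting old knowledge.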

How brainless slime molds redefine intelligence

Nature - Single-celled amoebae can remember, make decisions and anticipate change, urging scientists to rethink intelligent behavior.

Welcoming the Era of Deep Neuroevolution

By leveraging neuroevolution to train deep neural networks, Uber AI Labs is developing solutions to solve reinforcement learning problems.
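One family of methods behind this line of work estimates a search gradient from a population of perturbed parameter vectors, so no backpropagation through the task is needed. A toy sketch (my own, with a stand-in fitness function rather than a real RL environment):

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(w):
    """Stand-in for an RL episode return; here: closeness to a target vector."""
    target = np.array([0.5, -0.3, 0.8])
    return -np.sum((w - target) ** 2)

w = np.zeros(3)                    # "network parameters" (toy-sized)
sigma, lr, pop = 0.1, 0.05, 50
for _ in range(300):
    eps = rng.standard_normal((pop, 3))             # population of perturbations
    f = np.array([fitness(w + sigma * e) for e in eps])
    f = (f - f.mean()) / (f.std() + 1e-8)           # fitness normalization
    w = w + lr / (pop * sigma) * eps.T @ f          # ES search-gradient step
```

Because each population member is evaluated independently, the loop parallelizes almost perfectly across workers, which is a large part of the practical appeal for reinforcement learning.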


Kinematic Self-Replicating Machines

© 2004 Robert A. Freitas Jr. and Ralph C. Merkle. All Rights Reserved. Robert A. Freitas Jr., Ralph C. Merkle, Kinematic Self-Replicating Machines, Landes Bioscience, Georgetown, TX, 2004. Section 2.1.1, A Logical Organization of Self-Replication: Von Neumann set for himself the goal of showing what the logical organization of a self-replicating machine might be.
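Von Neumann's key move was that the machine's description is used twice: once interpreted as construction instructions, and once copied verbatim as data into the offspring. The same trick in software miniature is a quine (my illustration, not anything from the book):

```python
# The "description" s is used twice: interpreted via %-formatting to
# rebuild the program's first line, and copied verbatim as its own data.
s = 's = %r\nq = s %% s'
q = s % s          # q now holds the two core lines of this program
print(q)
```

Running the two core lines reproduces exactly those two lines, just as von Neumann's universal constructor plus copied description reproduces the whole machine, description included.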

Technological singularity

The technological singularity—or simply the singularity[1]—is a hypothetical point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.[2][3] According to the most popular version of the singularity hypothesis, called intelligence explosion, an upgradable intelligent agent will eventually enter a runaway reaction of self-improvement cycles, each new and more intelligent generation appearing more and more rapidly, causing an explosion in intelligence and resulting in a powerful superintelligence that qualitatively far surpasses all human intelligence.

Models of neural computation

Models of neural computation are attempts to elucidate, in an abstract and mathematical fashion, the core principles that underlie information processing in biological nervous systems, or functional components thereof. This article aims to provide an overview of the most definitive models of neuro-biological computation as well as the tools commonly used to construct and analyze them.
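Among the simplest such models is the leaky integrate-and-fire neuron: the membrane potential leaks toward rest, integrates input current, and resets after crossing a spike threshold. A minimal simulation (a standard textbook discretization; the parameter values are illustrative):

```python
import numpy as np

def lif(I, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron, Euler-stepped: the potential decays
    toward rest, accumulates input, and spikes/resets at threshold."""
    v = v_rest
    spikes = []
    for i in I:
        v += dt * (-(v - v_rest) + i) / tau   # leak + input integration
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset                        # fire and reset
        else:
            spikes.append(0)
    return np.array(spikes)

driven = lif(np.full(200, 1.5))   # constant supra-threshold input -> spikes
quiet  = lif(np.full(200, 0.5))   # sub-threshold input -> silence
```

Despite its simplicity, this model already exhibits the thresholding and rate-coding behavior that many of the more elaborate models in the article build on.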