In studying learning and memory, neuroscientists are concerned with the interactions between neurons that occur at synapses. To grossly oversimplify a vast topic, one major way in which learning and memory can occur is by increasing synaptic strength. Consider any pair of neurons, one pre-synaptic and one post-synaptic. In order for a signal to pass between them, the pre-synaptic cell must fire an action potential and release neurotransmitters into the synaptic cleft between the two neurons. These transmitters, in turn, produce an effect on the post-synaptic cell's membrane potential called a post-synaptic potential, which can be either excitatory or inhibitory (EPSP versus IPSP). If the excitatory effect is great enough, then the post-synaptic cell will become depolarized to the extent that it, too, will produce an action potential, allowing the electrochemical signal to propagate.
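To make the threshold idea concrete, here is a toy sketch of a post-synaptic cell summing EPSPs and IPSPs against a firing threshold. The millivolt values are rough textbook ballpark figures chosen for illustration, not measurements from any particular preparation:

```python
# Illustrative only: a toy post-synaptic cell summing PSPs against a threshold.
# Numbers are ballpark textbook values, not physiological measurements.

RESTING_POTENTIAL_MV = -70.0   # approximate resting membrane potential
THRESHOLD_MV = -55.0           # depolarization level that triggers an action potential

def fires_action_potential(psps_mv):
    """Sum excitatory (+) and inhibitory (-) post-synaptic potentials and
    report whether the cell depolarizes past threshold."""
    membrane_potential = RESTING_POTENTIAL_MV + sum(psps_mv)
    return membrane_potential >= THRESHOLD_MV

# Three EPSPs and one IPSP arriving close together in time:
print(fires_action_potential([+6.0, +5.5, +7.0, -2.0]))   # True  -> fires
print(fires_action_potential([+4.0, -3.0, +2.0]))         # False -> stays quiet
```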
Research on neural networks and other experimental preparations has shown that synaptic strength can be increased through Hebbian learning: if one neuron is stimulating another neuron while that second neuron is also firing -- indicating a high correlation between the activity of the two cells -- the connection between them becomes stronger. When this strengthening persists over time, it is known as long-term potentiation (LTP). (Similarly, synapses can become weakened when the activity of the two neurons is poorly correlated -- called "long-term depression" when the weakening persists.)
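In its simplest textbook form, the Hebbian idea can be written as a weight update that depends on the coincidence of pre- and post-synaptic firing. The sketch below is one such schematic rule, not a model of any specific experiment; the learning rate and the mild depression term are assumptions for illustration:

```python
# A textbook-style Hebbian update: the synaptic weight grows when pre- and
# post-synaptic activity coincide. Learning rate and activity patterns are
# illustrative assumptions, not fitted to data.

def hebbian_update(weight, pre_active, post_active, learning_rate=0.1):
    """Strengthen the synapse when both neurons fire together (LTP-like);
    weaken it slightly when the pre-synaptic cell fires without a
    post-synaptic response (a crude stand-in for long-term depression)."""
    if pre_active and post_active:
        return weight + learning_rate          # correlated firing -> potentiate
    if pre_active and not post_active:
        return weight - 0.5 * learning_rate    # uncorrelated firing -> depress
    return weight                              # no pre-synaptic input -> no change

w = 1.0
for pre, post in [(1, 1), (1, 1), (1, 0), (0, 1)]:
    w = hebbian_update(w, pre, post)
print(f"{w:.2f}")   # 1.15: two potentiations, one small depression, one no-op
```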
This is all well and good, until you consider that your neurons are firing all the time. If synapses were strengthened every time two or more neurons fired in concert, a runaway positive feedback loop would result: the probability of a post-synaptic cell firing would keep climbing with every instance of Hebbian strengthening until every cell was excited beyond any reasonable point. Obviously, some sort of regulatory mechanism must be at work. This is what we call synaptic scaling.
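A toy simulation makes the problem easy to see. Under a purely Hebbian rule with nothing to pull weights back down, a stronger synapse makes coincident firing more likely, which strengthens the synapse further. Every parameter below is an arbitrary assumption chosen just to expose the feedback loop:

```python
import random

random.seed(1)

# Toy demonstration of runaway strengthening: purely Hebbian updates with
# nothing that ever weakens the synapse. All numbers are arbitrary.
weight = 1.0
learning_rate = 0.2

for step in range(1, 201):
    pre_fires = random.random() < 0.5                     # pre-synaptic cell fires half the time
    # A stronger synapse makes the post-synaptic cell more likely to fire too:
    post_fires = pre_fires and random.random() < min(1.0, 0.1 * weight)
    if pre_fires and post_fires:
        weight += learning_rate                           # strengthen; never weaken
    if step % 50 == 0:
        print(f"after {step} steps: weight = {weight:.1f}")

# The weight only ever increases: correlated firing strengthens the synapse,
# which makes correlated firing more likely, which strengthens it further.
```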
It has been shown that, in addition to the mechanisms governing learning and LTP, each neuron has checks in place that control its overall firing rate, keeping it within a sane limit. If the cell's firing rate drastically increases, the neuron adjusts so that each of its synapses is proportionally weakened; if firing becomes too depressed, the neuron will in turn proportionally strengthen all of its synapses. This means that if one of a neuron's many synapses is very strong, significantly increasing the cell's overall firing rate, that strong synapse will be preserved during synaptic scaling, but the many weaker synapses of the same cell will lose much of their influence as all the synapses become proportionally "downsized." This weakening may be linked to synapse competition and the eventual elimination of very weak synapses.
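One way to picture this proportional "downsizing" is multiplicative scaling: every synapse is rescaled by the same factor so the cell's overall drive returns to a set point, leaving the relative ordering of strong and weak synapses intact. The target rate and weights below are assumptions for illustration only:

```python
# Multiplicative synaptic scaling sketch: every synapse is rescaled by the same
# factor so that overall drive returns to a set point. Numbers are illustrative.

def scale_synapses(weights, current_rate, target_rate):
    """If the cell fires too much, shrink all weights proportionally;
    if it fires too little, grow them all proportionally."""
    factor = target_rate / current_rate
    return [w * factor for w in weights]

weights = [4.0, 0.5, 0.3, 0.2]        # one very strong synapse among weak ones
scaled = scale_synapses(weights, current_rate=10.0, target_rate=5.0)
print(scaled)                          # [2.0, 0.25, 0.15, 0.1]

# The strong synapse is still by far the strongest (relative order unchanged),
# but the already-weak synapses are pushed toward values where they contribute
# very little -- the link to synapse competition and elimination noted above.
```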