Humans have the ability to quickly adapt to new tasks and to generalize what they have learned to improve the learning process itself. This capacity has been referred to as learning to learn, or meta-learning. To support meta-learning, synaptic plasticity must solve a difficult long-term credit assignment problem and determine how to change a synapse such that the outcome of a complex learning process improves. Here, we revisit systems consolidation theories and propose that the neocortex is a meta-learning system supported by the hippocampus. In particular, we show that local synaptic plasticity rules can extract long-term credit assignment information when neocortical learning based on current experience is intermixed with auxiliary learning problems prescribed by the hippocampus. We test our theory on a spiking neural network with fast and slow synaptic components and find that our plasticity rules configure the slow synaptic components such that a family of regression tasks can be quickly learned from a few examples. Additional benchmark experiments with deep neural networks show that our plasticity rules match, and sometimes even outperform, existing biologically implausible meta-learning algorithms. Our theory extends the conventional view of the hippocampus and the neocortex as complementary learning systems, offering a unified perspective on systems-level consolidation and synaptic consolidation.
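The fast/slow synaptic decomposition described above can be illustrated with a minimal sketch. This is not the paper's spiking model or plasticity rule; it is a conceptual stand-in in which a scalar slow weight is meta-trained across a family of linear regression tasks with a Reptile-style outer update (an assumption for illustration), while a fast weight adapts to each task with a few gradient steps:

```python
import numpy as np

# Conceptual sketch, not the paper's model: a scalar "synapse" with a slow
# component meta-trained across a task family, and a fast component adapted
# per task from a few examples.
rng = np.random.default_rng(0)

def sample_task():
    # each task is a linear regression y = a * x; the shared structure is
    # the range the slope a is drawn from
    a = rng.uniform(0.5, 2.0)
    x = rng.uniform(-1.0, 1.0, size=10)
    return x, a * x

def adapt(w, x, y, lr=0.5, steps=5):
    # fast component: a few gradient steps on the per-task squared error
    for _ in range(steps):
        grad = 2.0 * np.mean((w * x - y) * x)
        w = w - lr * grad
    return w

# slow component: a Reptile-style outer update (stand-in for the paper's
# plasticity rules) nudges the slow weight toward the post-adaptation value
w_slow = 0.0
for _ in range(1000):
    x, y = sample_task()
    w_slow += 0.1 * (adapt(w_slow, x, y) - w_slow)

# the meta-trained slow weight should adapt better from few examples
# than a weight initialized from scratch at zero
meta_loss, scratch_loss = 0.0, 0.0
for _ in range(100):
    x, y = sample_task()
    meta_loss += np.mean((adapt(w_slow, x, y) * x - y) ** 2)
    scratch_loss += np.mean((adapt(0.0, x, y) * x - y) ** 2)
```

Here the slow weight settles near the center of the task family, so a handful of fast gradient steps suffices for any sampled task, mirroring the abstract's claim that the slow components are configured so that new tasks in the family are quickly learned.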