Computational neuroscientists often describe synapses by a single number, the synaptic weight, and learning is then implemented by shifting this number. In contrast, biological synapses are complex systems with their own internal dynamics, and this structure has profound consequences for their ability to learn and store memories. We analyse a broad class of models of synaptic plasticity in its entirety, finding trade-offs between rapid learning and slow forgetting and identifying the models that navigate them optimally. This yields predictions for the different synaptic structures found in different brain regions. We also investigate genetic and pharmacological manipulations intended to speed up learning, and uncover the rules by which such interventions succeed or fail: the outcome is determined by both neural activity and synaptic structure. This provides an explanation for the mixed results of experiments with enhanced plasticity.
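The trade-off between rapid learning and slow forgetting can be illustrated with a deliberately simple toy model (a hypothetical sketch, not the class of models analysed here): binary synapses in which each new memory updates a random fraction `q` of synapses. A memory is imprinted with strength proportional to `q`, but each subsequent memory overwrites a fraction `q` of synapses, so the stored trace decays by a factor `(1 - q)` per memory. The function name and the rates `0.5` and `0.05` below are illustrative choices.

```python
# Toy model (illustrative assumption, not the paper's model class):
# binary synapses, each updated with probability q per incoming memory.
# Expected memory trace t memories after storage:
#   signal(q, t) = q * (1 - q)**t
# High q: strong initial trace, fast decay. Low q: weak trace, slow decay.

def expected_signal(q: float, t: int) -> float:
    """Expected memory trace t memories after storage, plasticity rate q."""
    return q * (1.0 - q) ** t

fast, slow = 0.5, 0.05  # fast vs slow plasticity rates (illustrative values)

print(expected_signal(fast, 0), expected_signal(slow, 0))    # fast plasticity wins early
print(expected_signal(fast, 50), expected_signal(slow, 50))  # slow plasticity wins late
```

No single value of `q` dominates at all delays, which is the sense in which the models studied here must navigate a trade-off between learning speed and memory lifetime.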