Mean-field models of the cortex have been used successfully to interpret the origin of features of the electroencephalogram in states such as sleep, anesthesia, and seizures. In a mean-field scheme, dynamic changes in synaptic weights can be considered through fluctuation-based Hebbian learning rules. However, because such implementations deal with population-averaged properties, they are not well suited to memory and learning applications where individual synaptic weights can be important. We demonstrate that, through an extended system of equations, mean-field models can be developed further to track higher-order statistics, in particular the distribution of synaptic weights within a cortical column. This allows us to draw some general conclusions about memory within a mean-field scheme. Specifically, we expect large changes in the standard deviation of the distribution of synaptic weights when fluctuations in the mean soma potentials are large, such as during the transitions between the "up" and "down" states of slow-wave sleep. Moreover, a cortex that has little structure in its neuronal connections is most likely to decrease the standard deviation of its excitatory-to-excitatory synaptic weights, relative to the square of the mean, whereas a cortex with strongly patterned connections is most likely to increase this measure. This suggests that fluctuations serve to condense the coding of strong (presumably useful) memories into fewer, but dynamic, neuronal connections, while at the same time removing weaker (less useful) memories.
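To make the fluctuation-driven picture concrete, the following is a minimal toy sketch rather than the extended mean-field equations described above: it assumes a covariance-style Hebbian rule, Δw ∝ δV_pre · δV_post, applied to an explicit population of synaptic weights, with the learning rate `eta`, fluctuation amplitude `sigma_v`, step count, and the initial lognormal weight distribution all chosen purely for illustration. It shows how larger soma-potential fluctuations broaden the weight distribution, i.e., change its standard deviation, the quantity tracked here relative to the square of the mean.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy population of excitatory-to-excitatory synaptic weights in one "column"
n_syn = 10_000
w = rng.lognormal(mean=0.0, sigma=0.3, size=n_syn)  # illustrative initial distribution

eta = 1e-3       # learning rate -- hypothetical value
sigma_v = 3.0    # soma-potential fluctuation amplitude (mV) -- hypothetical value
n_steps = 5_000

for _ in range(n_steps):
    # Zero-mean fluctuations of pre- and post-synaptic soma potentials
    dv_pre = rng.normal(0.0, sigma_v, size=n_syn)
    dv_post = rng.normal(0.0, sigma_v, size=n_syn)
    # Covariance-style ("fluctuation-based") Hebbian update, per synapse
    w += eta * dv_pre * dv_post
    np.clip(w, 0.0, None, out=w)  # keep weights non-negative

mu, sd = w.mean(), w.std()
print(f"mean weight = {mu:.3f}, std dev = {sd:.3f}, std dev / mean^2 = {sd / mu**2:.3f}")
```

In this sketch the pre- and post-synaptic fluctuations are uncorrelated, so the mean weight stays roughly fixed while the standard deviation grows with the fluctuation amplitude; correlated (patterned) activity under the same rule would instead drive the mean, which loosely mirrors the distinction drawn above between a cortex with little structure and one with strongly patterned connections.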