But how do you tell the value of an explanation? Shouldn't it empower you to some new action or ability? It could be that the explanation is largely a by-product of other prediction-making theories (the way plate tectonics relies on thermodynamics, fluid dynamics, and rock mechanics, which do make predictions).
It might also make predictions itself: for instance, that volcanoes not on clear plate boundaries might differ in some measurable way (distribution of occurrence over time, correlation with earthquakes, magma content, eruption size...), or that understanding the explanation for lightning lets you predict that a grounded metal pole above a house might protect it from lightning strikes. This might be a different kind of prediction, though, since it isn't predicting future dynamics. Knowing how epidemics work doesn't necessarily let you predict total infected counts or the length of an infection, but it does let you predict the minimum vaccination rate needed to avert an outbreak.
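That last prediction can be made concrete. A minimal sketch, assuming the standard SIR-model herd-immunity result (immune fraction of at least 1 - 1/R0 prevents sustained spread); the function name and the R0 figures in the comment are illustrative, not from the text above:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Minimum fraction of the population that must be immune to avert
    an outbreak, from the standard SIR result: 1 - 1/R0."""
    if r0 <= 1:
        return 0.0  # each case infects at most one other; no epidemic
    return 1.0 - 1.0 / r0

# A highly contagious disease like measles (R0 roughly 12-18) demands far
# higher coverage than seasonal flu (R0 roughly 1.5).
print(round(herd_immunity_threshold(15), 2))   # 0.93
print(round(herd_immunity_threshold(1.5), 2))  # 0.33
```

Note what this does and doesn't predict: it says nothing about when an outbreak happens or how large it would be, only what coverage suffices to stop it, which is exactly the "different kind of prediction" above.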
Nonetheless, a theory that serves mainly as a tool to explain, with poor predictive ability, can still be useful, though less valuable than one that also makes testable predictions.
But in general, it seems like data -> theory is the explanation, and theory -> data is the prediction. The strength of the prediction depends on the strength of the theory.