I've always wondered something.
I have a 9V wall wart that measures over 14V.
I have a 6V wall wart that measures over 8V.
Why is the actual voltage out of a wall wart considerably higher than the voltage printed on it?
That is common for unregulated wall warts (which appears to be what you have). The rated voltage is specified at the rated current. If you look, you will see a current rating printed along with the voltage rating.
When an unregulated wall wart (which is what you appear to have) is plugged into the AC mains but not powering anything (not supplying any current), its output voltage will be higher than its rated voltage.
If the device that it is powering draws more current than the wall wart is rated for, its output voltage may be lower than its rating.
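You can model this behavior as a simple Thevenin source: a fixed open-circuit voltage in series with an internal resistance, so the output sags linearly as the load draws more current. The numbers below are illustrative assumptions based on the question's "9V wall wart that measures over 14V", not measurements of a real unit:

```python
# Hypothetical "9 V, 500 mA" unregulated wall wart, modeled as a Thevenin
# source. Open-circuit voltage and ratings are assumed for illustration.
V_OPEN_CIRCUIT = 14.0   # volts, measured with no load attached
RATED_V = 9.0           # volts, printed on the label
RATED_I = 0.5           # amps, printed on the label

# Internal resistance implied by the label: at rated current the output
# sags from the open-circuit voltage down to the rated voltage.
r_internal = (V_OPEN_CIRCUIT - RATED_V) / RATED_I  # ohms (here 10)

def v_out(load_current_a):
    """Output voltage at a given load current (linear Thevenin model)."""
    return V_OPEN_CIRCUIT - load_current_a * r_internal

print(v_out(0.0))   # no load: the full open-circuit voltage, 14.0 V
print(v_out(0.5))   # at rated current: the labeled 9.0 V
print(v_out(0.7))   # overloaded: sags below the rating
```

A real transformer-based supply is not perfectly linear (rectifier drops and core behavior matter), but this captures why the no-load reading is well above the label.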
Regulated wall warts will produce their rated voltage (or very close to it) even when they are just plugged into the AC mains and not powering any device, and the voltage will stay stable up to the rated output current (and maybe a little more).
There are two kinds of wall warts: the traditional transformer-based ones, which are large, heavy, and poorly regulated, and the newer all-electronic ones. Because they were cheaper and more reliable, the older ones were more popular, but as the price of copper (for the transformer) shot up, they lost the price advantage, and now almost all new ones are electronic, with a switch-mode (not linear analog) regulator. The new ones regulate very precisely, even cheap no-name ones.
If you have an old one that is poorly regulated, you can often splice a 3-pin linear regulator (78xx or 79xx chips will carry 1-1.5 amps as is) into the output to clamp it more tightly.
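Before splicing in a regulator, it is worth a quick back-of-envelope check of headroom and heat: a linear regulator needs the input to stay a dropout voltage above the output, and it burns the difference as heat. The values below (a 7805 into a nominal "6 V" wall wart reading 8 V, a 0.5 A load) are assumed for illustration:

```python
# Sanity check before adding a 7805 (5 V) linear regulator to an
# unregulated wall wart. All numbers are illustrative assumptions.
v_in = 8.0        # unregulated wall wart output at this load, volts
v_reg = 5.0       # 7805 regulated output, volts
v_dropout = 2.0   # typical 78xx dropout voltage, volts
load_i = 0.5      # load current, amps

# The regulator only holds its output if Vin >= Vout + dropout.
headroom_ok = (v_in - v_reg) >= v_dropout

# Everything above the output voltage is dissipated as heat in the chip.
p_dissipated = (v_in - v_reg) * load_i   # watts

print(headroom_ok)    # True: 3 V of headroom is enough
print(p_dissipated)   # 1.5 W: plan on a small heat sink
```

Note that the unregulated input sags under load (as above), so check the headroom at your actual load current, not at the no-load reading.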
Fun facts about wall warts: it can cost $50,000 to get UL certification for a device, so manufacturers like to certify one wall wart and use it as the external power supply for many devices. And wall warts were one of the leading causes of home fires in the US, mainly from overheating transformers.