And what does the maximum value of resistance it can measure have to do with its input impedance when measuring voltages?
That's close enough to what's in the input stage of most multimeters. That string of resistors adds up to the meter's input impedance - most of the time, ~10Meg or thereabouts.
Especially when measuring "downstream" of a large-value series resistor: that resistor and the meter's input impedance form a(n extra) resistive voltage divider, giving you a lower-than-expected reading.
That can also happen if the voltage source is unable to provide more than a handful of uA (microamps). To get a reading of 60V across your meter's 10Meg input impedance would require a source capable of providing at least 6uA; anything less, and you'll be dragging down the voltage you're trying to measure, again getting a lower-than-expected reading.
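Here's a quick back-of-the-envelope sketch of both effects in Python; the 10 Meg input impedance is the typical figure mentioned above, and the 1 Meg series resistor in the example is just an assumed value for illustration:

```python
# Loading effect: a series (or source) resistance and the meter's ~10 Meg
# input impedance form a voltage divider, so the meter reads low.
R_METER = 10e6  # meter input impedance in ohms (typical value assumed above)

def reading(v_source, r_series):
    """Voltage the meter actually displays, given the source voltage
    and the total series/source resistance (ohms)."""
    return v_source * R_METER / (r_series + R_METER)

def min_source_current(v_target):
    """Minimum current the source must supply to hold v_target across
    the meter's input impedance."""
    return v_target / R_METER

# Example: 60 V behind a 1 Meg series resistor reads ~54.5 V, not 60 V.
print(reading(60.0, 1e6))          # ~54.55 V
# And holding 60 V across 10 Meg takes at least 6 uA from the source.
print(min_source_current(60.0))    # 6e-06 A = 6 uA
```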
https://documents.milwaukeetool.com/58-14-2219D7.pdf
Page 3, lower right:
It helps to know one's tools, how they operate as well as their real-world limitations and/or idiosyncrasies.
1G works as well, just correct the math: 1010 Meg divided by 10 Meg (i.e. ×101), multiplied by the voltage reading you get.
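A minimal sketch of that correction, again assuming the same 10 Meg input impedance:

```python
# An external 1 G series resistor and a 10 Meg meter form a
# 1010 Meg : 10 Meg divider, so the true voltage is the displayed
# reading scaled back up by 1010 Meg / 10 Meg = 101.
R_METER = 10e6    # meter input impedance, ohms
R_SERIES = 1e9    # external series resistor, ohms

def true_voltage(displayed):
    """Scale the displayed reading back up to the voltage at the far
    end of the series resistor."""
    return displayed * (R_SERIES + R_METER) / R_METER

print(true_voltage(0.594))  # ~60 V actual for a 0.594 V reading
```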