To expand on that nice description: Field strength, receiver sensitivity, and receiver meter readings are different animals.
Receiver sensitivity is just one way to rate how well a receiver will "hear" a signal arriving at its input. Besides your own ears, there are at least eight ways to determine sensitivity. Most involve taking background noise into account.
A receiver's signal meter (more or less) reports how much signal it is receiving from whatever antenna is attached to it. For example: S9 = 50.2 uV (in a 50 ohm system) = -73 dBm. If you were to hook up a signal generator to the antenna input and have it output -73 dBm, the S meter would read S9, or 50.2 microvolts if so marked.
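The S9 = 50.2 uV = -73 dBm equivalence above follows from P = V^2/R in a 50 ohm system. Here's a small sketch of that conversion (the function name is mine, not from any standard library):

```python
import math

def dbm_to_microvolts(dbm, impedance_ohms=50.0):
    """Convert a power level in dBm to RMS voltage in microvolts
    across the given impedance, using V = sqrt(P * R)."""
    watts = 10 ** (dbm / 10) / 1000  # dBm -> milliwatts -> watts
    volts = math.sqrt(watts * impedance_ohms)
    return volts * 1e6

# S9 reference level in a 50 ohm system:
print(round(dbm_to_microvolts(-73), 1))  # -> 50.1, i.e. ~50.2 uV
```

The tiny difference from 50.2 is just rounding; the conventional S9 figure is itself a rounded value.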
The reason a tabletop receiver cannot give you an actual field strength reading is the antenna. Hook up a good or a crappy antenna to a receiver and of course the "S" reading will reflect that antenna's ability to extract the signal and get it to the radio. Two antennas, two readings, same location. The difference between the actual field strength (as measured by calibrated equipment) and what the radio reports is called the antenna factor. Most of the time you won't know the antenna factor (AF) of your antenna, so the readings your radio reports are relative readings only. To use a radio to make actual field strength measurements at a location requires an antenna with a known AF, and usually a calibrated receiver as well.
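The AF relationship described above is commonly expressed in dB terms: field strength (dBuV/m) = receiver terminal voltage (dBuV) + antenna factor (dB/m). A minimal sketch, with made-up numbers (note 50.2 uV is about 34 dBuV):

```python
def field_strength_dbuv_m(received_dbuv, antenna_factor_db):
    """Estimated field strength at the antenna: the voltage the radio
    sees plus the antenna factor, both in dB terms."""
    return received_dbuv + antenna_factor_db

# Hypothetical example: radio reads S9 (~34 dBuV) on an antenna
# with a known AF of 20 dB/m at the frequency of interest.
print(field_strength_dbuv_m(34.0, 20.0))  # -> 54.0 dBuV/m
```

This also shows why two antennas give two readings at the same spot: the field strength is the same, so an antenna with a higher AF delivers fewer dBuV to the radio.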
Finally, I'm sure it is possible for the manufacturer of a portable radio with a fixed antenna to include AF values in a lookup table so that it displays a corrected field strength reading. I would still want to verify the accuracy before relying on the measurements as absolute values.
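Since AF varies with frequency, such a lookup table would need entries across the band, interpolated between calibration points. A sketch of the idea, with entirely hypothetical AF values:

```python
# Hypothetical calibration table: frequency (MHz) -> antenna factor (dB/m).
# Real values would come from the manufacturer's antenna calibration.
AF_TABLE = {7.0: 28.0, 14.0: 22.0, 21.0: 19.0, 28.0: 17.0}

def corrected_field_strength(freq_mhz, s_meter_dbuv):
    """Add the (linearly interpolated) antenna factor to the S-meter
    reading to estimate field strength in dBuV/m."""
    freqs = sorted(AF_TABLE)
    if freq_mhz <= freqs[0]:          # clamp below the table
        af = AF_TABLE[freqs[0]]
    elif freq_mhz >= freqs[-1]:       # clamp above the table
        af = AF_TABLE[freqs[-1]]
    else:                             # interpolate between neighbors
        for lo, hi in zip(freqs, freqs[1:]):
            if lo <= freq_mhz <= hi:
                t = (freq_mhz - lo) / (hi - lo)
                af = AF_TABLE[lo] + t * (AF_TABLE[hi] - AF_TABLE[lo])
                break
    return s_meter_dbuv + af

# E.g. a 34 dBuV reading at 14 MHz with AF 22 dB/m -> 56 dBuV/m
print(corrected_field_strength(14.0, 34.0))
```

Even with such a table, the correction is only as good as the antenna calibration and the receiver's own meter linearity, which is why I'd verify against calibrated gear first.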
That's how I see it. Please correct me if I'm wrong!