Ok I may have figured it out, and if so, WOW it really doesn't need to be that complicated...
So I picked a time, say 4:00 PM, and took the reading on the realtime graph and on the 1-hour historical.
realtime: 69ms
1hr historical: 957ms
The realtime graph updates every 20s (20,000ms), so (69 / 20000) * 100 = 0.345%
The 1-hour historical updates every 5 min (300,000ms), so (957 / 300000) * 100 = 0.319%
These two numbers are fairly close and within a margin of error, I'd say.
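Here's a quick sketch of that math in Python, just to make the comparison explicit (the readings and update intervals are the ones from my 4:00 PM example above; the function name is just mine for illustration):

```python
def latency_pct(latency_ms: float, interval_ms: float) -> float:
    """Latency expressed as a percentage of the graph's update interval."""
    return (latency_ms / interval_ms) * 100

# Readings from my example above
realtime_pct = latency_pct(69, 20_000)       # realtime graph updates every 20 s
historical_pct = latency_pct(957, 300_000)   # 1-hour historical updates every 5 min

print(f"realtime:       {realtime_pct:.3f}%")    # 0.345%
print(f"1hr historical: {historical_pct:.3f}%")  # 0.319%
```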
If the ms reading doesn't really mean anything on its own, then why can't they just graph the percentage based on the timeframe?