Stacy's average speed was 50 miles per hour. To calculate average speed, you divide the total distance traveled (200 miles) by the total time taken (4 hours).
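A minimal Python sketch of that calculation, assuming the 200 miles and 4 hours stated in the answer (the helper name average_speed is mine, not from the original):

```python
def average_speed(total_miles, total_hours):
    # Average speed = total distance / total time
    return total_miles / total_hours

print(average_speed(200, 4))  # 50.0 mph
```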
He drove 180 miles in 5 hours for an average speed of 36 miles per hour.
That would depend on the average speed. If the average is 50 mph, they drove 650 miles.
60 miles
300 miles
Average speed was 65 mph: 100 miles + 420 miles = 520 miles, 2 hours + 6 hours = 8 hours, and 520 / 8 = 65 mph.
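As a quick check, here is a small Python sketch for a multi-leg trip, using the two legs given in this answer (the variable names are mine):

```python
legs = [(100, 2), (420, 6)]               # (miles, hours) for each leg
total_miles = sum(d for d, _ in legs)     # 520 miles
total_hours = sum(t for _, t in legs)     # 8 hours
print(total_miles / total_hours)          # 65.0 mph
```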
60
60
To get the answer, divide the number of miles he drove by the time it took him to drive them. The information needed to set up the problem is 150 miles and 2.5 hours: 150 / 2.5 = 60, so Jordan drove at an average of 60 miles per hour.
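The same single division, verified in Python with the numbers from this answer:

```python
miles, hours = 150, 2.5
print(miles / hours)  # 60.0 mph
```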
52 mph
25
Average speed = (total distance) / (total time) = (80 + 100) / (2 + 3) = 180/5 = 36 miles per hour
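A one-line Python check of the same arithmetic, combining both legs before dividing (numbers taken from the answer above):

```python
print((80 + 100) / (2 + 3))  # 36.0 mph
```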