Waypoint Mission Calculation Time: how is it calculated? There seems to be an error

I ask this because I’m learning how to plan a mission. I set all my waypoints to a speed of 2.2 km/h, and the route is estimated at 341 meters. Litchi says this will take 13 minutes, but converting 2.2 km/h to meters per second gives about 0.61 m/s, and dividing 341 meters by that gives roughly 558 seconds, which is approximately 9.3 minutes. Why such a large discrepancy?
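For what it’s worth, here is a minimal sketch of the arithmetic I’m describing (plain Python, nothing Litchi-specific):

```python
# Back-of-envelope check of the expected flight time.
speed_kmh = 2.2
route_length_m = 341.0

speed_ms = speed_kmh * 1000 / 3600      # ~0.61 m/s
time_s = route_length_m / speed_ms      # ~558 s
print(f"{speed_ms:.2f} m/s -> {time_s:.0f} s (~{time_s / 60:.1f} min)")
```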

All my waypoints have the same heading, so I really have no idea why the calculation is so far off. Can someone give me some guidance on this?

Hi Wdobbs, I’m not 100% sure, but it probably also includes the take-off and landing time. Does that make sense?

Did you take into account height differences between waypoints?
If there are height differences, you have to use the Pythagorean theorem to calculate the actual distance between two waypoints.
Also, in a “Straight Line” mission the drone stops at every waypoint.
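If it helps, here is a quick sketch of that distance calculation, assuming each waypoint is given as a horizontal offset plus an altitude in meters (the coordinates below are made-up example values):

```python
import math

def waypoint_distance(p1, p2):
    """3D distance between two waypoints given as (x, y, altitude) in meters."""
    dx, dy, dz = (b - a for a, b in zip(p1, p2))
    return math.sqrt(dx * dx + dy * dy + dz * dz)

# Example: 100 m apart horizontally, 20 m apart in altitude.
print(waypoint_distance((0, 0, 30), (100, 0, 50)))  # ~101.98 m
```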


They are all at the same height. I don’t believe I have a “Straight Line” mission, as there’s a slight curve, but how would I tell for sure?

Use the simulator while you are at home and check how long your mission takes.

The simulator will also show when the drone is climbing to altitude and returning to home (RTH).

If you are capturing photos and have delays before and after each shot, that can add additional time as well.
As bpa said, the simulator is a good tool to see how long a mission will take and whether it will run as expected. I was out this morning and was, as often, surprised by how long it takes the drone to reach its destination altitude, especially since it slows down as it approaches the final altitude.
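As a rough illustration of how those extras add up, here is a back-of-envelope estimate. The per-waypoint delay and climb speed below are assumed example values, not anything Litchi publishes, so treat it as a sketch only:

```python
def estimate_mission_time_s(route_length_m, speed_ms,
                            n_waypoints=0, delay_per_waypoint_s=0.0,
                            climb_m=0.0, climb_speed_ms=2.0):
    """Rough mission-time estimate: cruise time + waypoint delays + initial climb."""
    cruise = route_length_m / speed_ms
    delays = n_waypoints * delay_per_waypoint_s
    climb = climb_m / climb_speed_ms if climb_m else 0.0
    return cruise + delays + climb

# 341 m at ~0.61 m/s, 10 waypoints with a 5 s photo pause each, 30 m climb.
print(estimate_mission_time_s(341, 0.61, n_waypoints=10,
                              delay_per_waypoint_s=5, climb_m=30) / 60)  # ~10.4 min
```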