I ask this because I’m learning how to plan a mission. I set all my waypoints to have a speed of 2.2 km/h. The route is estimated to be 341 meters. Litchi says this will take 13 minutes, but converting 2.2 km/h to meters per second gives about 0.61, and dividing 341 meters by that gives about 558 seconds, which is approximately 9.3 minutes. Why such a large discrepancy?
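To double-check the arithmetic in the question, here is a minimal sketch of the unit conversion and time estimate (using the speed and distance quoted in the post; the variable names are just illustrative):

```python
# Sanity check: how long should 341 m take at a constant 2.2 km/h?
speed_kmh = 2.2
distance_m = 341.0

speed_ms = speed_kmh * 1000 / 3600   # km/h -> m/s, about 0.611 m/s
time_s = distance_m / speed_ms       # 558 s at constant speed
time_min = time_s / 60               # about 9.3 minutes

print(round(speed_ms, 3), round(time_s), round(time_min, 1))
```

This assumes the drone flies the whole route at a constant speed with no stops, which is exactly the assumption the replies below call into question.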

All my points have the same heading, so I really have no idea why the calculation is so off. Can someone give me guidance on this?

Did you take into account height differences between waypoints?
If there are height differences, you have to use the Pythagorean theorem to calculate the actual distance between two waypoints.
Also in a “Straight Line” mission the drone stops at every waypoint.
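A minimal sketch of the Pythagorean-distance point above, with made-up example coordinates (the waypoints and the 100 m / 30 m numbers are purely illustrative):

```python
import math

def leg_distance_3d(p1, p2):
    """True 3D distance between two waypoints given as (x, y, z) in meters."""
    return math.dist(p1, p2)  # Python 3.8+: Euclidean distance

# Hypothetical leg: 100 m of horizontal travel with a 30 m climb.
flat = leg_distance_3d((0, 0, 0), (100, 0, 0))    # 100.0 m
climb = leg_distance_3d((0, 0, 0), (100, 0, 30))  # sqrt(100^2 + 30^2), about 104.4 m

print(flat, round(climb, 1))
```

The climbing leg is a few percent longer than its horizontal projection, and that difference adds up over a mission; deceleration and stopping at each waypoint in a Straight Line mission adds further time on top of that.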

If you are capturing photos and have delays before and after each shot, that can add extra time as well.
As bpa said, the simulator is a good tool to see how long a mission will take and whether it will run as expected. I was out this morning and was, as often, surprised by how long it takes the drone to reach its destination altitude, especially since it slows down as it approaches the final altitude.