The demo videos had my expectations much higher for this, and I wonder if I have something misconfigured or I'm just not familiar enough with Android. The interface seems pretty straightforward, and clearly my video feed is working properly. For some reason the tracking square moves around more than the drone does. It does react slightly, but it's very sluggish, and generally you can just walk right out of the bounding box. Also, my attempt to orbit while tracking led to an immediate crash, and it didn't seem to even try to track.
I'm a little scared to try out Mission Hub now. This all worked much better with my Spark. I wonder if my phone has something to do with it; it seems fairly slow, and it's a Samsung A12.
Thanks for any help!
Here’s a video illustrating the above:
Try in a higher-contrast environment on a bright day. For me the tracking works pretty well, so I don't know what's wrong in your case. Also make sure you don't have any other apps running besides Litchi, to prevent lag. And don't use ND filters during tracking.
From the Online User Guide:
Warning: Because Litchi decodes the video, runs the algorithms to track the object and renders the video, all of this in real-time, a high performance mobile device is required. Be sure to close all background applications and screen recorders before using Track. Video lag will kill tracking.
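To illustrate that warning, here's a toy sketch (this is not Litchi's actual algorithm, and the numbers are made up) of why lag breaks trackers that only search near the target's last known position: if frames are dropped, the target moves farther between tracker updates and can escape the search window entirely.

```python
# Toy 1-D tracking simulation (hypothetical; illustrates the lag problem only).
# Assumption: the tracker can only re-acquire the target within a fixed
# search radius around its last known position, a common tracker design.

SEARCH_RADIUS = 15   # pixels the tracker can re-acquire per update (assumed)
TARGET_SPEED = 10    # pixels the target moves per video frame (assumed)

def track(frames, update_every):
    """Return True if the tracker keeps the target for all frames.

    update_every=1 models a smooth feed (tracker runs every frame);
    update_every=2 models lag (every other frame is dropped).
    """
    target = 0.0      # target's true position
    estimate = 0.0    # tracker's last known position
    for frame in range(1, frames + 1):
        target += TARGET_SPEED
        if frame % update_every == 0:   # tracker only sees delivered frames
            if abs(target - estimate) > SEARCH_RADIUS:
                return False            # target left the search window: lost
            estimate = target           # re-acquired at the new position
    return True

print(track(60, update_every=1))  # smooth: 10 px per update -> keeps lock
print(track(60, update_every=2))  # laggy: 20 px per update -> loses lock
```

With a smooth feed the target only moves 10 px between updates, well inside the 15 px window; with every other frame dropped it moves 20 px and the lock is lost on the first update. That's the sense in which "video lag will kill tracking."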
I have the same problem: it won't track, or it fails to keep up with me. I have a Samsung Note 20 Ultra 5G, too. Good phone, dumb app. It's like it's still in beta…
I don't want to be quite that critical of it. They could have left this feature out, but they didn't; it just looks like it works better on other devices. I surely appreciate some of the things that do work well, like waypoints and GPS tracking. Can't win 'em all! That said, I hope they refine this and get it to be more reliable.
If you haven't tried Follow mode, it works great. It follows the RC, not the camera-framed object. I use it in place of Track and find it works excellently, unless you exceed the speed of the drone.
Last month I bought a Mini SE. For me, the tracking feature was the #1 reason to purchase Litchi.
I ran some tests with iOS on an iPhone 12 Pro, and it didn't go very well. The mark was lost even with high contrast. In most cases it lost the mark right away. When it did work for ~10 seconds, it did so only while the target was moving very slowly.
If you watch demos on YouTube, their targets are also mostly moving very slowly, and the videos seem to cut before any situation that would challenge the tracking algorithm. That may not apply to all examples, but it seems to me that most of them are done under close to "lab conditions", meaning ideal conditions for tracking, and not necessarily the conditions you would have in most of your normal day-to-day use. Tracking mostly works only when several ideal conditions are met… at least that's my impression so far.
How is it that tracking on my old DJI Spark works so much better if they're using the same tech and video feed? Is it a hardware decoding vs. software decoding thing?
The Spark has ActiveTrack in the native app, and it uses its own processor for that. Through Litchi it does not work well.
An app has no processor of its own; it uses the processor inside the connected smart device.
Yes, but through Litchi it's not used.
OK, so basically yes: effectively hardware tracking on the Spark's own board vs. software tracking on whatever smart device is controlling the drone in Litchi. To be clear, bpa is saying the processor onboard the Spark that handles ActiveTrack is not used by Litchi, and that capability is likely not even present on the Mini 2, since the feature isn't available in the native DJI Fly app.
Quite right. A Spark with a hacked speed limit is much better than a Mini 2 at ActiveTrack, but only with DJI GO 4. If you use Litchi they will be about the same, though the Spark will avoid obstacles.