I got my Basler ace acA800-200gm GigE camera and want to use it for my mechanical-optical laboratory experiment (see image below), in which I disturb mechanically coupled lenses with an impulse hammer / shaker and the optical response (light point position) is identified. Furthermore, I will actively control, stabilize, and compensate the image motion by using the current light point position as feedback, which will be handled by the controller of an installed linear motor. Therefore, I need to design a real-time machine vision system.

Anyway, at the moment I use the provided pylon Viewer on my weak notebook to record videos at the maximum 200 fps, and I do the image processing (tracking of the light point position) later, in an offline postprocess.

To do the same online, i.e. to acquire the images from the GigE camera in real time at about 200 fps and additionally do the image processing (track the light point within 1/200 fps = 5 ms), I am thinking about using an embedded system like the Nvidia Jetson TK1, which can run pylon for Linux ARM (pylon GigE Vision driver) and where I can deploy my own C++ application (maybe using OpenCV & CUDA) for the image acquisition and processing. The obtained position I could then communicate, e.g. via serial USB or TCP/IP, to my Arduino Due, which produces the analog output for the motor.

Do you think this realization is feasible, and do you have any experience with the performance of image acquisition and processing within 5 ms on the Nvidia Jetson TK1 from the GigE camera? The Nvidia Jetson TX2 would be more powerful, but it is also much more expensive. Or do you have any other suggestions?