DNN-based visual perception for high-precision motion control

High-speed, high-precision positioning of objects is a critical component of many industrial manufacturing processes. Semiconductor die packaging, for instance, requires the precise pickup and placement of semiconductor dies on substrates. This is done by coupling the silicon wafer, which contains thousands of semiconductor dies, with a motion control platform equipped with linear motors and encoders. The motion controller relies on the linear motors and encoders to accurately position the silicon wafer at reference positions, which are determined from the relative positions of the dies on the wafer. A challenge arises, however, when neighboring dies become misaligned during the pickup process, since the encoders cannot observe the positions of the dies themselves. This paper addresses the challenge of precise alignment in high-speed, micro-scale manufacturing environments, where traditional methods struggle because of the disconnect between the point of interest (the dies) and the point of control (the motor/silicon wafer). To overcome this challenge, we propose a Deep Neural Network (DNN) based perception module that enables robust sensing of die positions. We also propose a fusion mechanism that combines this vision feedback with the encoder readings to accurately detect the misalignment and compensate for it before each periodic pickup of the dies. Using a software-in-the-loop validation framework, we demonstrate that the proposed method successfully eliminates the misalignment before pickup over the range under consideration.
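The fusion of vision feedback with encoder readings described above can be illustrated with a minimal sketch. All names, units, and the confidence gate below are illustrative assumptions, not details from the paper: the idea is simply that the DNN-measured die offset shifts the encoder-derived reference so the stage cancels the misalignment before the next pickup.

```python
def fuse_vision_with_encoder(encoder_pos_um, vision_offset_um, offset_valid,
                             max_offset_um=50.0):
    """Return a compensated reference position in micrometers (hypothetical sketch).

    encoder_pos_um:   wafer position reported by the linear encoder
    vision_offset_um: die misalignment measured by the DNN-based perception
    offset_valid:     confidence flag from the perception module (assumed)
    max_offset_um:    sanity bound on plausible misalignment (assumed)
    """
    if not offset_valid or abs(vision_offset_um) > max_offset_um:
        # Fall back to the encoder alone when vision feedback is unreliable.
        return encoder_pos_um
    # Shift the reference so the motion stage cancels the measured
    # misalignment before the periodic pickup of the die.
    return encoder_pos_um + vision_offset_um
```

In this toy form the compensation is a one-shot reference shift; the actual paper's mechanism operates within a motion control loop, but the sketch captures how the two sensing modalities are combined.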