PerFC: An Efficient 2D and 3D Perception Software-Hardware Framework for Mobile Cobot

Authors

  • Tuan Dang University of Texas at Arlington
  • Khang Nguyen University of Texas at Arlington
  • Manfred Huber University of Texas at Arlington https://orcid.org/0009-0007-0294-9147

DOI:

https://doi.org/10.32473/flairs.36.133316

Keywords:

Robotics, Perception, Framework

Abstract

In this work, we present an end-to-end software-hardware framework that supports both conventional hardware and software components and integrates machine learning object detectors without requiring an additional dedicated graphics processing unit (GPU). We design our framework to achieve real-time performance on the robot system, guarantee such performance on multiple computing devices, and concentrate on code reusability. We then utilize transfer learning strategies for 2D object detection and fuse the resulting detections with depth images for 3D depth estimation. Lastly, we test the proposed framework on the Baxter robot with two 7-DOF arms and a four-wheel mobility base. The results show that the robot achieves real-time performance while executing other tasks (map building, localization, navigation, object detection, arm moving, and grasping) with available hardware like Intel onboard GPUs on distributed computers. Also, to comprehensively control, program, and monitor the robot system, we design and introduce an end-user application. The source code is available at https://github.com/tuantdang/perception_framework.
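The abstract's step of fusing 2D detections with depth images for 3D estimation can be illustrated with a standard pinhole-camera back-projection. This is a minimal sketch under assumed camera intrinsics (`fx`, `fy`, `cx`, `cy`); the function name and values are illustrative, not the paper's actual API or calibration.

```python
# Hedged sketch: lifting a 2D detection center to a 3D camera-frame point
# using its depth value and the pinhole camera model. All intrinsics below
# are assumed example values, not taken from the paper.

def pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth in meters to camera-frame XYZ."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example: a detection centered at pixel (320, 240) observed at 1.5 m,
# with typical VGA-resolution intrinsics.
xyz = pixel_to_3d(320, 240, 1.5, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```

Given such a 3D point per detected object, a grasping or navigation module can reason about object positions in the robot's frame after applying the camera-to-base transform.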

Published

08-05-2023

How to Cite

Dang, T., Nguyen, K., & Huber, M. (2023). PerFC: An Efficient 2D and 3D Perception Software-Hardware Framework for Mobile Cobot. The International FLAIRS Conference Proceedings, 36(1). https://doi.org/10.32473/flairs.36.133316

Section

Special Track: Autonomous Robots and Agents