Publication: Identifying Objects with related Angles using Vision-based System integrated with Service Robots
Type: Thesis
Date: 2022
Abstract
Manipulating an object can be done through collaboration between a human and a robot by introducing the object to the robot in a suitable way. The easiest approach is to model the object inside the robot's memory and add sensors and cameras so the robot can identify that specific object. In the real world, however, we cannot model every object in advance. If robots could manipulate arbitrary objects, they could carry out far more work efficiently.
This research presents a strategy for identifying unknown objects using a vision-based system, together with the perspective angles of the detected object, integrated with service robots. The robot should be able to identify the objects around it asynchronously, estimating their rotational, pitch, and roll angles relative to the surface the robot stands on. The research is based on artificial intelligence, machine learning, and robotics. The Robot Operating System (ROS) is used for simulating the robots and the identification process.
Several approaches can be used for the identification process. This research is mainly based on vision-based identification using color and depth images from an RGB-D camera, integrated with YOLOv5. Other sensors, such as a 3D LiDAR laser scanner, can also be used to identify objects. However, the learning process requires a stable object to model and train on. After object recognition, the proposed methodology allows the robot to calculate and estimate the angles of the detected object.
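The thesis does not publish its exact angle-estimation procedure, but one common way to recover a detected object's pitch and roll from RGB-D data is to back-project the depth pixels inside the detection box into 3-D points, fit a plane by least squares, and read the tilt angles off the plane normal. The sketch below assumes a pinhole camera model; the function name, intrinsics (`fx`, `fy`), and angle conventions are illustrative assumptions, not the author's method.

```python
import numpy as np

def plane_angles_from_depth(depth_patch, fx=525.0, fy=525.0, cx=None, cy=None):
    """Estimate pitch and roll (degrees) of the dominant surface in a depth
    patch, relative to a fronto-parallel plane.

    Hypothetical helper: illustrates plane fitting on back-projected RGB-D
    points, one common technique; not the thesis's published algorithm.
    """
    h, w = depth_patch.shape
    cx = w / 2.0 if cx is None else cx
    cy = h / 2.0 if cy is None else cy
    # Back-project every pixel to a 3-D camera-frame point (pinhole model).
    v, u = np.mgrid[0:h, 0:w]
    z = depth_patch
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Least-squares fit of the plane z = a*x + b*y + c to the points.
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(h * w)])
    (a, b, _), *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    # Tilt about the camera x-axis (pitch) and y-axis (roll).
    pitch = np.degrees(np.arctan(b))
    roll = np.degrees(np.arctan(a))
    return pitch, roll
```

In practice the depth patch would be cropped from the full depth image using the YOLOv5 bounding box, and the camera-frame angles would then be transformed into the robot's base frame to express them relative to the standing surface.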
After acquisition, the robot should be able to identify the object whenever it sees it again. Since this is a robot, it can be used to model unknown objects, retrieve their data from its database, and name them manually later if no one is available to name them at the time.
