Robotics and Mobile Computing: Moving Towards Situational Awareness

Image Credits: UASVISION.COM

Virtually every robotic innovation, whether already achieved or still being pursued in 2019, has been inspired by nature. You may have heard of bio-inspired robots. Scientists and researchers the world over have been working seriously on building electronic devices that would someday outperform living creatures, or, better yet, on extracting natural features and patterns from living creatures into electronic devices. Whichever way you look at it, this work will eventually pave the way for advances in robotics, and those advances will always owe their origins to nature’s inspiration.

One of the ways robotic engineering is opening new frontiers is through sensors. Think of robotic sensors as a living thing’s senses of sight (eyes), hearing (ears), touch (skin), taste (tongue), and smell (nose). All these organs make up a living thing’s identity, and they are administered by the brain.

For instance, we could model the human brain as a class, or main object. This object encapsulates the senses of sight (vision), hearing (ears), touch (skin), taste (tongue), and smell (nose). These senses could also stand as individual classes that extend the main class (the brain). Or we could simply make each sense a property of the main class, while creating a function or method for each organ in the main class.
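As a minimal Python sketch of the second option, with each sense held as a property of the main Brain class, something like the following could express the analogy. The names (Brain, Sense, perceive) are purely illustrative:

```python
class Sense:
    """Base class for a sensory organ."""
    def perceive(self, stimulus):
        raise NotImplementedError

class Vision(Sense):
    def perceive(self, stimulus):
        return f"seeing {stimulus}"

class Hearing(Sense):
    def perceive(self, stimulus):
        return f"hearing {stimulus}"

class Brain:
    """Main object that encapsulates the individual senses as properties."""
    def __init__(self):
        self.senses = {"sight": Vision(), "hearing": Hearing()}

    def process(self, organ, stimulus):
        # Delegate the stimulus to the appropriate sense, just as the
        # brain "administers" each organ in the analogy above.
        return self.senses[organ].perceive(stimulus)

brain = Brain()
print(brain.process("sight", "a red ball"))   # seeing a red ball
print(brain.process("hearing", "a siren"))    # hearing a siren
```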

Moving away from the abstraction described above, electronic cameras already serve as the eyes of several devices, including mobile applications (computer vision), driverless cars, and unmanned aerial vehicles. Sound (acoustics, sonar) is reproduced with microphones, and touch with pressure instruments. Robots applying chemical sensors could prove even more effective than their human analogues in relation to taste. Proprioception, or a robot’s ability of “self-awareness”, might seem quite a way off, but it is very possible in the near future, if not already happening as we speak.

There is an arm of robotics that I find very interesting. It is often referred to as mapping and navigation, or simultaneous localization and mapping (SLAM). It tackles the computational problem of building and updating a map of a new or changing area while simultaneously tracking an object’s position within it. The particle filter, the extended Kalman filter, and GraphSLAM are some of the popular computational algorithms used for this problem.
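To make the particle filter concrete, here is a minimal sketch of a single filter step for a robot moving along one dimension and measuring its distance to a known landmark. The landmark position and noise levels are made-up values, not parameters of any particular SLAM system:

```python
import random
from math import exp, sqrt, pi

LANDMARK = 10.0          # assumed landmark position (metres)
MOTION_NOISE = 0.2       # assumed motion-model noise (std dev)
MEASUREMENT_NOISE = 0.5  # assumed sensor noise (std dev)

def gaussian(mu, sigma, x):
    """Probability density of x under a normal distribution N(mu, sigma)."""
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def particle_filter_step(particles, control, measurement):
    # 1. Predict: move every particle by the control input plus noise.
    predicted = [p + control + random.gauss(0, MOTION_NOISE) for p in particles]
    # 2. Weight: score each particle by how well it explains the measurement
    #    (the expected reading for a particle at p is LANDMARK - p).
    weights = [gaussian(LANDMARK - p, MEASUREMENT_NOISE, measurement)
               for p in predicted]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # 3. Resample: draw a new particle set in proportion to the weights.
    return random.choices(predicted, weights=weights, k=len(predicted))

particles = [random.uniform(0, 5) for _ in range(500)]
particles = particle_filter_step(particles, control=1.0, measurement=7.5)
print(sum(particles) / len(particles))  # position estimate after one step
```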

Algorithms conforming to SLAM are tailored to the resources at their disposal, so they are not targeted at a single standard procedure. Automated vehicles, UAVs, and the like are some examples in this category.

$$P(x_t, m \mid o_{1:t}, u_{1:t})$$

What this theory simply implies is that, given a series of sensor observations o_t (and controls u_t) over discrete time steps t, the SLAM problem computes an estimate of an object’s location, represented as x_t above, and of course a layout of the new environment, represented as the map m in the equation above.
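This posterior is typically computed recursively with a Bayes filter, alternating a prediction step driven by the motion model and a correction step driven by the observation model; the sketch below is the standard two-step recursion presented in the SLAM tutorials referenced at the end:

$$P(x_t, m \mid o_{1:t-1}, u_{1:t}) = \int P(x_t \mid x_{t-1}, u_t)\, P(x_{t-1}, m \mid o_{1:t-1}, u_{1:t-1})\, dx_{t-1}$$

$$P(x_t, m \mid o_{1:t}, u_{1:t}) \propto P(o_t \mid x_t, m)\, P(x_t, m \mid o_{1:t-1}, u_{1:t})$$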

Google’s ARCore is a typical application that adopts this concept, aided by data captured by sensors. It is pertinent to mention, though, that today’s robotics algorithms are really a combination of several ingredients:

  1. Statistical mapping, with applications in self-driving cars;
  2. Sensors, in handheld phones, cameras, and GPS satellites;
  3. Kinematics, the movement of a robot (a small sketch follows this list);
  4. Loop closure, which entails updating formerly visited locations on a robotic map;
  5. Exploration, which focuses on the next point of movement;
  6. Bio-inspired robotics (self-awareness), mentioned earlier;
  7. Complexity, which involves huge computational resources and is one of the major issues of advanced robotics.
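As a concrete taste of the kinematics item above, the following is a minimal sketch of the forward kinematics of a differential-drive robot, the standard textbook model. The wheel radius and axle length are made-up values:

```python
from math import cos, sin

WHEEL_RADIUS = 0.05   # metres (assumed)
AXLE_LENGTH = 0.30    # metres (assumed)

def step(x, y, theta, w_left, w_right, dt):
    """Advance the pose (x, y, heading) given left/right wheel speeds."""
    v = WHEEL_RADIUS * (w_right + w_left) / 2.0              # linear velocity
    omega = WHEEL_RADIUS * (w_right - w_left) / AXLE_LENGTH  # angular velocity
    return (x + v * cos(theta) * dt,
            y + v * sin(theta) * dt,
            theta + omega * dt)

pose = (0.0, 0.0, 0.0)
for _ in range(100):   # drive in a gentle arc for one second
    pose = step(*pose, w_left=9.0, w_right=10.0, dt=0.01)
print(pose)
```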

In addition, it is important to mention here that resource-constrained robotics gave rise to OrthoSLAM, which operates with very limited resources: OrthoSLAM simplifies the analysis of an environment by assuming its surfaces lie on orthogonal planes. Machine learning for pattern recognition is also helping to interpret data from multi-sensor arrays, and to build dependable models of the complex, non-linear motion of soft-robot actuators, something difficult to do by directly calculating the expected motion of the soft-bot.
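To illustrate the orthogonality assumption, the toy sketch below snaps each observed wall segment to the nearest of two perpendicular directions. This is only an illustration of the core simplification, not the OrthoSLAM algorithm itself:

```python
from math import atan2, pi

def snap_to_orthogonal(segment):
    """Classify a wall segment as horizontal or vertical, whichever is nearer."""
    (x1, y1), (x2, y2) = segment
    angle = atan2(y2 - y1, x2 - x1) % pi   # segment direction in [0, pi)
    # Distance to the horizontal axis (0 or pi) versus the vertical (pi/2).
    if min(angle, pi - angle) < abs(angle - pi / 2):
        return "horizontal"
    return "vertical"

walls = [((0, 0), (5, 0.2)),   # almost horizontal
         ((1, 1), (1.1, 4))]   # almost vertical
print([snap_to_orthogonal(w) for w in walls])  # ['horizontal', 'vertical']
```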

Wearable robotics and delicate applications such as surgery are some of the most useful applications of robotics.

References:

  1. Bailey, T.; Durrant-Whyte, H. (2006). “Simultaneous localization and mapping (SLAM): part II”. IEEE Robotics & Automation Magazine. 13 (3): 108–117. doi:10.1109/mra.2006.1678144. ISSN 1070-9932.

  2. Durrant-Whyte, H.; Bailey, T. (2006). “Simultaneous localization and mapping: part I”. IEEE Robotics & Automation Magazine. 13 (2): 99–110. doi:10.1109/mra.2006.1638022. ISSN 1070-9932.