Covid-19 – the accelerator for smart logistics, smart transportation and smart supply chain services

Covid-19 has, regrettably, amplified the general public’s reluctance to travel by bus, train or tram and reinforced their desire to travel by car. This is a great shame because since 2016, more than 52 countries globally have been striving to reduce their carbon footprint and meet the Paris Agreement targets for 2030 by encouraging people to take advantage of public transport systems. However, since the outbreak of the pandemic, public transport initiatives the world over have ground to a halt, along with any effort to reduce CO2 emissions and plastic waste. And while people were willing to cycle during lockdown, their confidence has waned now that the roads are busy once again, for fear of not being visible to drivers, particularly in poor or low-light conditions. 

 

Another repercussion is the loss of public confidence in visiting our city centres to spend hard-earned cash on “non-essential” items. Such is this loss that, according to the Office for National Statistics, only 36% of the English population feels happy about visiting high street stores and one in five claimed they would never enter a clothes shop again. As a result, there has been a surge in online retail, and this is putting increased pressure on the associated supply chain and last-mile delivery logistics, which are not only labour-intensive but also require people to work in close proximity. 

 

The pandemic has highlighted just how dependent we still are on manual processes to keep many of our key services operating. Automation and IoT are the two most disruptive technologies in our digital world so, with this in mind, private couriers and bus, train, lorry and van drivers should not be obliged to put themselves at risk when providing a service such as hand-delivering parcels or groceries to our doorsteps. 

 

Although we may be through the worst, safety measures such as social distancing and limited in-person contact will need to be maintained for the foreseeable future to reduce the risk of a second wave. For this to be both practically possible and economically viable, there is a need for breakthrough innovations to accelerate the deployment of autonomous applications and processes. Last-mile delivery, supply chain logistics and freight are all strong contenders. However, significant safety concerns remain for systems reliant on camera or LIDAR technology, particularly in poor or low light or adverse weather conditions. Until this is resolved, the widespread adoption of smart logistics, smart transport or smart delivery systems is simply not feasible. 

 

For the safe and effective operation of driverless vehicles and automated parcel delivery methods, these systems must be able to reliably and repeatably detect, locate, identify and classify all subjects, objects and obstacles they encounter. Not only that, they must be able to respond with minimal delay and utmost accuracy by capturing high-fidelity data in any environment and any location, even when GNSS (GPS) or mobile coverage is unavailable.

 

What if there were a cost-effective way to improve the fidelity of the data captured by existing technologies, so that autonomous vehicles and services could be rolled out en masse? Health risks associated with in-person contact would be curbed because, in many instances, there would be no need for a human operator.

 

The possibility of such a solution is set to become a reality thanks to a pioneering project fronted by R4DAR along with a consortium of disruptive tech start-ups, entrepreneurs and ML practitioners. Intended to work alongside existing cameras, LIDARs, motion sensors and other embedded tech, the consortium is developing a low-cost, low-maintenance identification technology that augments the reliability of all data collected by autonomous vehicles or other smart solutions. Using beacons and radars, the breakthrough system is able to identify key subjects and objects of interest through a simple data exchange (a minimal illustrative sketch follows the list below) to determine:

 

  • What/who’s out there
  • Where it is
  • What it is doing
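
To make the idea of a “simple data exchange” more concrete, the short Python sketch below models the kind of report a radar-readable beacon might return when interrogated by a vehicle. It is purely illustrative: the field names, classes and JSON encoding are assumptions made for this article, not R4DAR’s actual protocol or product interface.

    from dataclasses import dataclass
    from enum import Enum
    import json


    class SubjectClass(Enum):
        """Illustrative categories an interrogating vehicle might care about."""
        CYCLIST = "cyclist"
        PEDESTRIAN = "pedestrian"
        DELIVERY_DRONE = "delivery_drone"
        ROAD_SIGN = "road_sign"


    @dataclass
    class BeaconReport:
        """Hypothetical payload answering: what/who is out there, where it is, what it is doing."""
        beacon_id: str               # unique identity of the tagged subject or object
        subject_class: SubjectClass  # what or who it is
        range_m: float               # where it is: distance from the interrogating radar, metres
        bearing_deg: float           # where it is: bearing relative to the radar boresight, degrees
        speed_mps: float             # what it is doing: speed over ground, metres per second
        heading_deg: float           # what it is doing: direction of travel, degrees

        def to_json(self) -> str:
            """Serialise the report for the vehicle's decision-making stack."""
            payload = self.__dict__ | {"subject_class": self.subject_class.value}
            return json.dumps(payload)


    if __name__ == "__main__":
        # Example: a cyclist's wearable beacon replying to a radar interrogation.
        report = BeaconReport(
            beacon_id="beacon-0042",
            subject_class=SubjectClass.CYCLIST,
            range_m=35.2,
            bearing_deg=-12.5,
            speed_mps=5.4,
            heading_deg=87.0,
        )
        print(report.to_json())

In practice, a vehicle’s decision-making stack would fuse reports like this with its camera, LIDAR and radar returns to confirm what or who is out there, where it is and what it is doing, even in poor light or bad weather.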

 

The unambiguous data captured as a result provides the unequivocal information needed for accurate and informed decision-making, mitigating many of the risks associated with autonomous applications. Delivery drones, for example, would be able to navigate their way through busy urban environments, dropping off parcels at different addresses without any need for human contact. The same technology could be mounted on road signs, integrated seamlessly into smart motorway systems, or worn by cyclists to improve the visibility of vulnerable road users. 

 

As everyday life slowly returns to the new normal, the crisis has not only heightened our dependency on labour-intensive legacy systems, it has also provided the springboard needed to accelerate the development of autonomous systems and applications. This will remove the need for unnecessary in-person contact in the short to medium term and will go a long way towards preventing a global shutdown in the event of future pandemics. The only blocker is the accuracy of the data collected, because any decision taken will only be as informed as the fidelity of the data captured in the first place.

 

Visit https://r4dartech.com/ to find out more.

 

About Clem Robertson

Clem Robertson is a technology veteran with more than 25 years’ experience in embedded engineering, integrated circuit design, programme management and B2B product delivery across multiple markets, including fabless semi, semiconductors, RF & telecoms, automotive, medical & defence. He has a proven track record of creating and leading multi-discipline teams that deliver world-class technologies for object/subject data capture, and has held senior roles at Plextek, Nurija and Airvana. He has recently championed and led the development of next-generation imaging radar technologies for multiple markets and applications, including Runway Foreign Object Damage (FOD) Detection. He founded R4DAR Technologies in April 2019.
