NISSAN’S engineers have been inspired by the animal kingdom as they develop new technologies that will shape the future of mobility.
One of Nissan’s longer-term research and development goals is to virtually eliminate deaths and serious injuries among occupants of its vehicles.
Toru Futami, engineering director of advanced technology and research, said that studying the behaviour of animals moving in groups helps engineers understand how vehicles can interact with each other for a safer and more efficient driving environment.
He said: ‘In our quest to develop collision-avoidance systems for the next generation of automobiles, we needed to look no further than to Mother Nature to find the ultimate form of collision-avoidance systems in action, in particular, the behavioural patterns of fish.’
The research team created the EPORO (EPisode 0 Robot), using Laser Range Finder (LRF) technology inspired by a bumblebee’s eyes, which have a field of vision of more than 300 degrees.
Six EPORO units were built; they communicate with one another to monitor each other’s positions, avoid collisions and travel side-by-side or in single file, in the same way fish do.
Mr Futami explained: ‘In current traffic laws, cars are supposed to drive within the lanes and come to a halt at stop signals, but if all cars were autonomous, the need for lanes and even signals could be gone.
‘Fish follow these three rules: Don’t go away too far, don’t get too close and don’t hit each other.
Less traffic congestion
‘Fish form schools with these three rules. A school of fish doesn’t have lines to help guide the fishes, but they manage to swim extremely close to each other. So if cars can perform the same type of thing within a group, we should be able to have more cars operating with the same width roads.’
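The three rules Mr Futami describes are the same local principles used in classic flocking simulations. The sketch below is purely illustrative; the agent class, thresholds and update step are assumptions made for the example, not Nissan’s EPORO control logic.

```python
import math

# Illustrative only: one 2D "schooling" update using the three rules
# quoted above. Thresholds and gains are assumed values for the example.

class Agent:
    def __init__(self, x, y, vx, vy):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy

def schooling_step(agent, neighbours, too_far=5.0, too_close=1.0, gain=0.1):
    for other in neighbours:
        dx, dy = other.x - agent.x, other.y - agent.y
        dist = math.hypot(dx, dy) or 1e-9
        if dist > too_far:
            # Rule 1: don't stray too far -- steer back towards the group.
            agent.vx += gain * dx / dist
            agent.vy += gain * dy / dist
        elif dist < too_close:
            # Rules 2 and 3: don't get too close, don't collide -- steer away.
            agent.vx -= gain * dx / dist
            agent.vy -= gain * dy / dist
    # Move with the adjusted velocity.
    agent.x += agent.vx
    agent.y += agent.vy
```

Repeated for every agent at each time step, these purely local rules are enough to keep a group moving together without lanes or signals, which is the point Mr Futami is making.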
Nissan has also created the Biomimetic Car Robot Drive, or BR23C, a joint project with the University of Tokyo that mimics the uncanny collision-avoidance ability of bumblebees.
The Laser Range Finder (LRF) detects obstacles within a 180-degree field of view in front of the robot, up to two metres away. The BR23C calculates the distance to any obstacle, then immediately sends a signal to a microprocessor, which translates this information and repositions the vehicle to avoid a collision.
‘The split-second it detects an obstacle,’ explains Toshiyuki Andou, manager of Nissan’s Mobility Laboratory and principal engineer of the project, ‘the car robot will mimic the movements of a bee and instantly change direction by turning its wheels at right angles or greater to avoid a collision.’
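Taken together, the behaviour Mr Andou describes amounts to a simple detect-and-react loop. The sketch below is an illustrative model only: it treats an LRF scan as a list of (angle, distance) readings across a 180-degree arc and turns through at least a right angle away from the nearest reading inside two metres. The function names and the steering rule are assumptions for illustration, not the BR23C’s actual software.

```python
# Illustrative sketch of the detect-and-react loop described above.
# A scan is modelled as (angle_deg, distance_m) pairs across a 180-degree
# arc; negative angles are to the left of centre, positive to the right.

DETECTION_RANGE_M = 2.0  # the BR23C's quoted sensing range

def nearest_obstacle(scan):
    """Return the (angle, distance) of the closest in-range reading, or None."""
    in_range = [(a, d) for a, d in scan if d <= DETECTION_RANGE_M]
    return min(in_range, key=lambda r: r[1]) if in_range else None

def steering_command(scan):
    """Turn sharply away from the nearest obstacle; drive straight if clear."""
    obstacle = nearest_obstacle(scan)
    if obstacle is None:
        return 0.0  # path is clear, no steering needed
    angle, _ = obstacle
    # Bee-like evasive move: a turn of at least a right angle away
    # from the side the obstacle is on (assumed rule for this sketch).
    return -90.0 if angle >= 0 else 90.0

# Example: obstacle 1.2 m away, 30 degrees to the right of centre.
print(steering_command([(30.0, 1.2), (-60.0, 3.5)]))  # -> -90.0 (turn left)
```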