Month: January 2017

Worms, birds and insects inspire the robots of the future

A post by Dr Silvia Ardila-Jiménez, Post-doctoral Research Associate, Imperial College London

 

The development of autonomous systems is one of the technology trends driving the fourth industrial revolution. Autonomous systems in transportation are perhaps the most widely talked about, but beyond this we’re already seeing systems deployed in sectors like environmental monitoring and agriculture.

The range of potential applications is huge: search and rescue, border surveillance, construction, energy, health, sports and recreation, agriculture, and food and water security, to name a few. And whilst advances in this area are vast – fuelled by machine learning, data science, robotics and more – no man-made system can yet perform at the level of living organisms.

How do animals achieve such incredibly complex tasks and what are the biological principles that govern them? How can we use nature’s solutions for our own objectives? We’ve been assembling an international network of researchers to understand these fundamental questions; if you’re interested then please get in touch.

Flying animals can perform precise, agile manoeuvres, like hovering while feeding from a moving flower, mating in mid-air, and tracking and intercepting prey. Dr Tom Daniel (University of Washington and Director of the Air Force Center of Excellence on Nature-Inspired Flight Technologies and Ideas) is investigating how moths achieve these tasks which have very low probabilities of success.

Dr Daniel is building a fundamental understanding of how moths integrate multiple sensory modalities (e.g. vision and inertial sensing), which may help to improve flight control in engineered aircraft. The mechanics of the moth model may also provide insight into alternative modes of sensing and actuation.
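
To give a flavour of what multi-sensory integration looks like in engineering terms, here is a minimal complementary filter that blends a fast but drifting gyroscope signal with a slower, drift-free visual heading estimate. It is purely illustrative – the sample period and blend factor are invented, and it is not the Daniel lab's model of the moth.

    # A minimal complementary filter: fuse a drifting inertial rate signal
    # with an absolute (but slower) visual heading estimate.
    DT = 0.01      # assumed sample period in seconds
    ALPHA = 0.98   # assumed weight on the integrated gyro signal

    def fuse_heading(prev_heading, gyro_rate, visual_heading):
        """Return a blended heading estimate in radians."""
        inertial_estimate = prev_heading + gyro_rate * DT   # fast, but accumulates drift
        return ALPHA * inertial_estimate + (1 - ALPHA) * visual_heading

    # Toy usage: the gyro reports a steady turn; vision slowly corrects drift.
    heading = 0.0
    for gyro_rate, visual_heading in [(0.5, 0.00), (0.5, 0.01), (0.5, 0.02)]:
        heading = fuse_heading(heading, gyro_rate, visual_heading)
    print(round(heading, 4))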

Robber flies are relatively small insects, but they have evolved flight strategies that make them predators as successful as much larger flying insects. Dr Gonzalez-Bellido (University of Cambridge) is investigating the specific anatomical adaptations in their visual system that enable them to track small targets and adjust their flight trajectory to intercept their prey with high accuracy.
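
One classical engineering formulation of interception is proportional navigation, in which the pursuer turns in proportion to how quickly the bearing to its target is changing; holding that bearing constant puts the pursuer on a collision course. The sketch below is a generic illustration with invented speeds and gains, not the control law measured in the flies.

    # Generic proportional-navigation sketch (all numbers invented).
    import math

    N = 3.0       # navigation gain
    DT = 0.01     # time step in seconds
    SPEED = 2.0   # pursuer speed, faster than its target

    px, py, heading = 0.0, 0.0, 0.0           # pursuer position and heading
    tx, ty, tvx, tvy = 5.0, 5.0, -1.0, 0.0    # target position and velocity

    los_prev = math.atan2(ty - py, tx - px)   # initial line-of-sight angle
    closest = float("inf")
    for _ in range(2000):
        los = math.atan2(ty - py, tx - px)
        heading += N * ((los - los_prev) / DT) * DT   # turn with the line-of-sight rate
        los_prev = los
        px += SPEED * math.cos(heading) * DT
        py += SPEED * math.sin(heading) * DT
        tx, ty = tx + tvx * DT, ty + tvy * DT
        closest = min(closest, math.hypot(tx - px, ty - py))
    print(f"closest approach: {closest:.2f} m")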

Birds have evolved their own adaptations for flight, dynamically adjusting their wing shape in response to wind. The Windsor group at Bristol University record and model the dynamic 3-D structure of bird wings during flight. They’ve shown that this knowledge can be applied to make drone wings which respond to the environment and improve flight.

At Imperial College, Dr Mirko Kovac’s Aerial Robotics Lab has combined the capabilities of two different types of organism to develop an aquatic micro-air vehicle for monitoring water health. The robot, as seen below, dives into water like a gannet, and then launches itself back into the air like a flying fish. You can see a video of this in action here.

Drone dives into water like a gannet
Image copyright Ben Porter

Whilst flying animals are an obvious source of inspiration, worms can also give us new solutions. C. elegans, for example, is a tiny nematode worm that burrows through soil – a complex chemical environment – in search of food. A robot inspired by this worm and capable of navigating obstacles with minimal sensing has been developed by Dr Boyle’s group at the University of Leeds.
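
As a rough illustration of what navigating with minimal sensing can mean, the toy sketch below steers an agent using nothing more than a single forward range reading; the simulated wall, thresholds and gains are all invented, and this is not the Leeds group's worm-robot controller.

    # Toy minimal-sensing navigation: one forward range reading, nothing else.
    import math
    import random

    def fake_sensor(x, heading):
        """Pretend there is a wall at x = 2.0; report the distance to it along the heading."""
        if math.cos(heading) <= 0:
            return float("inf")              # facing away from the wall
        return (2.0 - x) / math.cos(heading)

    def step(x, y, heading, forward_range, speed=0.1, turn=0.6):
        """Turn away if the single sensor reports something close, otherwise wander forward."""
        if forward_range < 0.5:
            heading += turn                              # steer away from the obstacle
        else:
            heading += random.uniform(-0.05, 0.05)       # gentle undulation while exploring
        return x + speed * math.cos(heading), y + speed * math.sin(heading), heading

    random.seed(1)
    x, y, heading = 0.0, 0.0, 0.0
    for _ in range(200):
        x, y, heading = step(x, y, heading, fake_sensor(x, heading))
    print(f"final position: ({x:.2f}, {y:.2f})")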

Social animals bring a whole set of useful and challenging behaviours. Dr Paoletti’s group at Liverpool is looking at swarming and schooling to develop groups of robots that can collaborate locally to perform tasks such as recognition and surveillance.
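
A classic starting point for this kind of local collaboration is the “boids” flocking model, in which each agent follows three purely local rules: keep separation from very close neighbours, align with their average velocity, and move towards their local centre. The sketch below is a generic illustration with assumed gains and radii, not the Liverpool group's algorithm.

    # Generic boids-style flocking: every rule uses only local neighbours.
    import numpy as np

    rng = np.random.default_rng(0)
    N, DT = 20, 0.1
    pos = rng.uniform(0, 10, size=(N, 2))   # agent positions
    vel = rng.uniform(-1, 1, size=(N, 2))   # agent velocities

    def flock_step(pos, vel, radius=3.0, k_sep=0.5, k_ali=0.2, k_coh=0.1):
        new_vel = vel.copy()
        for i in range(N):
            offsets = pos - pos[i]
            dist = np.linalg.norm(offsets, axis=1)
            nbrs = (dist > 0) & (dist < radius)                # local neighbourhood only
            if not nbrs.any():
                continue
            sep = -(offsets[nbrs] / dist[nbrs][:, None] ** 2).sum(axis=0)   # push apart
            ali = vel[nbrs].mean(axis=0) - vel[i]                           # match velocity
            coh = pos[nbrs].mean(axis=0) - pos[i]                           # pull to centre
            new_vel[i] += DT * (k_sep * sep + k_ali * ali + k_coh * coh)
        return pos + DT * new_vel, new_vel

    for _ in range(100):
        pos, vel = flock_step(pos, vel)
    print("mean speed after 100 steps:", float(np.linalg.norm(vel, axis=1).mean()))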

Taking inspiration from nature, often called biomimicry, is nothing new for engineers; Leonardo da Vinci’s flying machine is a famous early example. With today’s technology, however, we can go beyond merely mimicking nature: we are capable of looking more deeply at the underlying natural principles and adapting them to improve our own systems.

The challenge now is to bring together expertise from engineering and biology to study, understand and assess how looking to nature for inspiration can improve application technologies. This is something we’re actively engaged in at the Institute of Security Science and Technology. If this is something you are interested in, please get in touch!

 

Silvia is an engineer with a PhD in computational neuroscience from Imperial College London. In her PhD she used large data sets to study how different areas in the primary visual system interact to process incoming information. Silvia is currently a Post-doctoral Research Associate in the Department of Bioengineering and the ISST, working on pathways from nature-inspired research into application technologies.

The interaction between safety and security

A post by Professor Chris Hankin, Director ISST

Increasing digitization has led to convergence between IT (Information Technology) used in offices and mobile devices, and OT (Operational Technology) that controls devices used in critical infrastructure and industrial control systems. The IoT (Internet of Things) is also rapidly growing, with around 10 billion devices today.

These trends raise concerns about the interaction between safety and security. The reality of the threat has been highlighted in national news coverage, from cyber security vulnerabilities being exploited to compromise vehicle safety, to denial of service attacks launched from consumer devices.

Discussions are sometimes hampered by the lack of clear definitions of the concepts. Safety is often understood as concerning protection against accidents, whilst security is about protecting systems against the action of malicious actors. But these two definitions miss some essential aspects of the two concepts. A slightly different view is that safety is about protecting the environment from the system and security is about protecting the system from the environment.

Another contrast between the two concepts is how we approach risk assessment. Safety often considers the risk to life and limb and measures risk using actuarial tables. Security more often measures risk by considering the threat to information assets – at the moment data breaches may be the key concern. As cyber-physical systems become more prevalent, there must be a convergence between these different approaches.
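
As a caricature of that contrast, the toy sketch below scores risk in the two styles side by side: an actuarial expected-loss calculation of the kind used in safety engineering, and a per-asset likelihood-times-impact score of the kind common in security risk assessment. All the numbers are invented for illustration.

    # Invented numbers, purely to contrast the two framings of risk.

    # Safety-style, actuarial view: expected annual loss from a failure rate.
    failure_rate_per_year = 1e-4    # assumed probability of a harmful failure per year
    consequence_cost = 2_000_000    # assumed cost of that failure in pounds
    print(f"expected annual safety loss: £{failure_rate_per_year * consequence_cost:,.0f}")

    # Security-style, asset-centric view: likelihood x impact per information asset (1-5 scales).
    assets = {"customer database": (4, 5), "plant control historian": (3, 4)}
    for name, (likelihood, impact) in assets.items():
        print(f"security risk score for {name}: {likelihood * impact} / 25")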

From a regulatory and standards point of view, the following Venn diagram summarises the current situation:

[Diagram: safety and security shown as two separate, non-overlapping circles]

However, practitioners recognize that there is not a clear separation (indeed it would be undesirable if there was), so the following is a better diagram of the current situation:

[Diagram: safety and security shown as overlapping circles]

New standards are beginning to consider both safety and security. There is then a question about how large the intersection should be. There appears to be general agreement that the following diagram is wrong:

[Diagram: safety and security shown as completely coincident circles]

There are differences between the two concepts and we have hinted at what those might be. However, some commentators, predominantly from the security sector, have questioned whether a system can be safe if it is insecure.

The examples of compromise to vehicle safety mentioned earlier give some weight to this view – it is clear that physical harm can result from the exploitation of cyber vulnerabilities. So maybe the following diagram is a better representation:

[Diagram: the safe systems shown as a subset of the secure systems]

This is not universally accepted – some would argue that insecure components can be deployed in a system without compromising its safety, because of the way in which those components are deployed and the way their effects are constrained.

Of course, an alternative diagram would represent the secure systems as a subset of the safe ones – this could be verbalized by the slogan that a system cannot be secure if it is not safe. This is clearly wrong: safety, in the way we have viewed it here, is only really an issue for OT systems, but we clearly want our IT systems to be secure.

For the future, we might want to re-think the relationship between safety and security. The UK Cyber Security Strategy 2016-2021, published on 1st November 2016, is based on three strands – Defend, Deter, and Develop – underpinned by international collaboration. The Defend strand talks a lot about “secure by default” systems, and this could be an argument for breaking “out of the box”.

I am sure that this is a debate that will continue for some time.

Chris Hankin

Chris Hankin is Director of the Institute for Security Science and Technology and a Professor of Computing Science. His research is in cyber security, data analytics and semantics-based program analysis.