{"id":1947,"date":"2019-12-05T07:00:33","date_gmt":"2019-12-05T07:00:33","guid":{"rendered":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/?p=1947"},"modified":"2020-01-29T15:53:32","modified_gmt":"2020-01-29T15:53:32","slug":"christmas-xbox-technology-could-help-transform-healthcare","status":"publish","type":"post","link":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/2019\/12\/05\/christmas-xbox-technology-could-help-transform-healthcare\/","title":{"rendered":"Not just for Christmas \u2013 how Xbox technology could help transform healthcare"},"content":{"rendered":"<figure id=\"attachment_1996\" aria-describedby=\"caption-attachment-1996\" style=\"width: 1920px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"1080\" class=\"wp-image-1996 size-full\" src=\"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/files\/2019\/12\/kinect3.jpg\" alt=\"A man with a robot in a laboratory\" \/><figcaption id=\"caption-attachment-1996\" class=\"wp-caption-text\">Xbox tech could soon offer a helping hand in operating theatres<\/figcaption><\/figure>\n<p>Many will be wishing to discover an Xbox-shaped gift glittering under the Christmas tree this year. Aside from the seemingly endless hours of entertainment, joy, frustration and competition that these consoles offer, Xbox technology \u2013 and other similar gadgets \u2013 is finding uses outside of the gaming world, and in the healthcare research sphere.<!--more--><\/p>\n<p>Originally developed as a gaming accessory, Microsoft\u2019s Xbox Kinect uses cameras and sensors to map a space in 3D. Combined with its motion capture capability, this means the tech can not only pinpoint objects, but it can track individuals\u2019 movements and gestures. 
These are both key functions that could be useful in an array of healthcare settings, as research by our Hamlyn Centre and others is demonstrating.<\/p>\n<p>Led by Dr George Mylonas, the Hamlyn Centre&#8217;s <a href=\"https:\/\/www.harms-lab.org\/\">HARMS lab<\/a> focuses on using frugal innovations, coupled with techniques based on perception, to develop low-cost healthcare technologies, largely in the surgical setting. That\u2019s where the Kinect fits in, explains HARMS lab researcher Dr Alexandros Kogkas.<\/p>\n<p>\u201cXbox offered one of the first commercial 3D cameras on a large scale; the technology has been well-known for many years now,\u201d he says. \u201cIt was made available at scale because of gaming, and that\u2019s why it\u2019s cheap \u2013 one Kinect costs just \u00a3150, so it\u2019s relatively low-cost to deploy.\u201d<\/p>\n<h2>Xbox technology for safer surgery<\/h2>\n<p>One of the major ways that the lab is using Kinect technology is to ease the burden of surgeons. Performing surgery is mentally and physically exhausting. Prolonged stress means that as many as <a href=\"https:\/\/www.ncbi.nlm.nih.gov\/pmc\/articles\/PMC4884544\/\">50% of surgeons can experience burnout<\/a>. Not only does this risk the surgeon\u2019s health and wellbeing, but that of their patients, too, because of the higher likelihood of mistakes.<\/p>\n<p>\u201cOur goal is to use the technology to make surgery safer,\u201d says Kogkas. 
\u201cWhether that\u2019s by improving ergonomics, enabling better collaboration, or boosting surgeons\u2019 skills.\u201d<\/p>\n<p>Kogkas and colleagues are working on a system that can both facilitate interactions in the operating theatre and track patterns of behaviour for analysis.<\/p>\n<figure id=\"attachment_1954\" aria-describedby=\"caption-attachment-1954\" style=\"width: 300px\" class=\"wp-caption alignright\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"206\" class=\"size-medium wp-image-1954\" src=\"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/files\/2019\/12\/Xbox-healthcare-setup-300x206.png\" alt=\"A demo of the robotic scrub nurse setup in the surgical healthcare setting\" \/><figcaption id=\"caption-attachment-1954\" class=\"wp-caption-text\">The robotic scrub nurse, left, in action during a lab trial of the tech.<\/figcaption><\/figure>\n<p>For instance, helper robots, which can recognise, locate and handle instruments as required, have been developed to assist surgeons during procedures. These \u201crobotic scrub nurses\u201d are designed to complement real scrub nurses, who often have to juggle many jobs at once and therefore may not be able to assist with delivering surgical tools as quickly as necessary.<\/p>\n<p>Using Kinects to map the room and its contents, combined with eye-tracking glasses to record the physician\u2019s focus, their system allows surgeons to control the robotic scrub nurse using only their gaze. This means they can effortlessly focus their eyes on the instrument they need, and the robot will deliver it. Check out the video below to see this in action.<\/p>\n<p>\u201cSo far we\u2019ve tested this system on mock surgeries performed by real surgeons and scrub nurse teams, comparing how surgeons completed the task with and without the gaze-controlled scrub nurse,\u201d says Kogkas. 
\u201cWe didn\u2019t find any major differences, which means that the system is feasible and doesn\u2019t interrupt workflow. But we want to see an improvement in task performance, so we have more work to do.\u201d<\/p>\n<p><iframe loading=\"lazy\" title=\"Free-View, 3D Gaze-Guided Robotic Scrub Nurse\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/hPfN6ofl09U?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<h2>Building a smarter system<\/h2>\n<p>The system also allows surgeons to use gaze to scan through on-screen patient records, such as imaging results used to guide the procedure. This would mean surgeons don\u2019t have to leave the room to examine the data and then re-sterilise on return, potentially reducing the length of the surgery.<\/p>\n<p>\u201cBut ultimately our main goal is to make the system more intelligent,\u201d Kogkas says. \u201cBy learning common sequences from surgeons, we want to develop algorithms that can predict the next task and prompt the robot to act accordingly.<\/p>\n<p>\u201cThis is where human scrub nurses are really indispensable; they can anticipate what the surgeon needs from experience. We want to train robots to do the same.\u201d<\/p>\n<h2>Preventing errors in the operating theatre<\/h2>\n<p>Robots aside, learning from the data generated by the system could help surgeons in other ways, too. By tracking metrics like blinking rate, head position and body posture, the researchers hope to develop artificial intelligence algorithms. These could <a href=\"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/2019\/10\/02\/how-robots-in-space-could-lead-to-better-healthcare-on-earth\/\">detect when surgeons are experiencing fatigue<\/a>, or objectively quantify effort, for example. 
This could then prompt a programmed response, such as brightening the lights to stimulate wakefulness, triggering an alert for colleagues to intervene, or suggesting a raised level of alertness among the theatre team.<\/p>\n<figure id=\"attachment_1962\" aria-describedby=\"caption-attachment-1962\" style=\"width: 300px\" class=\"wp-caption alignright\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"200\" class=\"size-medium wp-image-1962\" src=\"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/files\/2019\/12\/surgery-3034133_640-300x200.jpg\" alt=\"Surgeons working in the operating theatre\" \/><figcaption id=\"caption-attachment-1962\" class=\"wp-caption-text\">The system could help surgeons stay alert during procedures.<\/figcaption><\/figure>\n<p>\u201cWe hope our framework could be applied in \u2018Surgery 4.0\u2019 \u2013 data-driven surgery,\u201d explains Kogkas. \u201cUltimately, we want to be able to use our data to help predict errors before they occur, and find ways to intervene to stop them from happening.\u201d<\/p>\n<p>This will likely require much more information than the current system captures, though. So the group has begun collaborating with other institutions, including Harvard Medical School, to integrate more sensors. This could include reading brainwaves through an EEG cap and measuring heart rate or other indicators of stress.<\/p>\n<p>\u201cWe hope to see our framework deployed soon in operating theatres,\u201d Kogkas says. \u201cAlthough we developed it for surgery, we realised we could apply this framework in other settings as well. That\u2019s when we began our work on assistive living.\u201d<\/p>\n<h2>From hospital to home<\/h2>\n<p>For people with severe disability, such as tetraplegia, gaining a degree of independence could be life-changing. Kogkas and colleagues hope to be able to offer that through their work. 
They\u2019re using their framework to develop a robotic system that can assist with daily living activities for people who have lost the ability to use their limbs.<\/p>\n<p>As with the operating theatre setup, the researchers are training algorithms for robots so that they can recognise and locate objects found in the home environment, such as a mug or box of cereal. An installed Kinect or similar sensor can then track objects in a person\u2019s room, so that the person\u2019s gaze, recorded by eye-tracking glasses, can be mapped onto those objects.<\/p>\n<p>A person could, therefore, use the system to direct a robot to carry out various tasks, such as moving objects around or pouring a bowl of cereal. You can see this in action in the YouTube video below:<\/p>\n<p><iframe loading=\"lazy\" title=\"GaGARoS: Gaze-Guided Assistive Robotic System for Daily-Living Activities\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/TnJfpend7Gs?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p>\u201cThis is a big, complex project, in part because there are so many different tasks that need to be programmed,\u201d explains Kogkas. \u201cIt\u2019s one thing being able to recognise and move a coffee cup, but to be useful the robot also needs to bring it towards the person\u2019s lips so they can drink from it.<\/p>\n<p>\u201cOther labs are developing inventories of different daily living activities, so we hope that we can collaborate to take this research forward to the next phase.\u201d<\/p>\n<p>But progress has already begun. Having tested the system successfully in healthy volunteers, Kogkas and colleagues have now been granted ethical approval to begin trialling the technology in patients for the first time. 
The study is due to commence next year.<\/p>\n<p>\u201cFor us, it\u2019s important to do translational research,\u201d he says.<\/p>\n<p>\u201cThat means we don\u2019t only care about developing something to publish in an academic journal. We want to actually see it used in the real world, to help people and improve the quality of healthcare.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Many will be wishing to discover an Xbox-shaped gift glittering under the Christmas tree this year. Aside from the seemingly endless hours of entertainment, joy, frustration and competition that these consoles offer, Xbox technology \u2013 and other similar gadgets \u2013 is finding uses outside of the gaming world, and in the healthcare research sphere.<\/p>\n","protected":false},"author":1308,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[11640],"tags":[],"class_list":["post-1947","post","type-post","status-publish","format-standard","hentry","category-design"],"_links":{"self":[{"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/posts\/1947","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/users\/1308"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/comments?post=1947"}],"version-history":[{"count":16,"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/posts\/1947\/revisions"}],"predecessor-version":[{"id":2114,"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/posts\/1947\/revisions\/2114"}],"wp:attachment":[{"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/me
dia?parent=1947"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/categories?post=1947"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs-staging.imperial.ac.uk\/ighi\/wp-json\/wp\/v2\/tags?post=1947"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}