[ { "actions": [], "article_url": "n\\a", "authors": "Louise A. Dennis", "case": "", "dilemma_body": "Let us imagine a smart home. This isn\u2019t a robot but a home equipped with sensors and it has control of appliances, opening and shutting doors and similar things. A fire starts in the kitchen when one of the residents faints while cooking. The protocol for a fire is that the house\r\nshould sound an alarm and close (but not lock) all the doors. People can open a door to move through the house but the house closes the door after them. This will limit the spread of the fire and allow more people to escape. However, if the house closes the door to the kitchen\r\nthen it reduces the chance that rescue services will find the person who fainted. The rescue services have been notified of the person in the kitchen\r\nShould the house a) close the kitchen door or b) leave the kitchen door open.", "duty_values": [], "feature": [], "id": "AVyNUFT-Ac_8eD2O4f63", "logic": [], "title": " Smart home - Closing the fire door", "type": "articles" }, { "actions": [], "article_url": "n/a", "authors": "Louise A. Dennis", "case": "", "dilemma_body": "Let us imagine an autonomous system that manages elections for some\r\ncountry. The country\u2019s electoral process means that people vote for a representative and the representatives then meet to select an overall leader. When the system is processing the votes it observes that although the majority of people in the country have voted in favour of one party (let us say the cat party), because of a quirk in the way the votes are spread more representatives for the dog party are going to be elected and so the overall leader is likely to come from the dog party. Moreover, the system is charged with monitoring election broadcasts and checking facts and it is aware that the leader of the dog party has told many, many lies during the campaign while the leader of the cat party has not. \r\n\r\nShould the system mis-report the votes so that more representatives from the cat party\r\nare selected? a) yes b) no.", "duty_values": [], "feature": [], "id": "AV-6GQuAlImC4iv-LIDc", "logic": [], "title": "Remove a democratically elected tyrant.", "type": "articles" }, { "actions": [], "article_url": "n/a", "authors": "Louise A. Dennis", "case": "", "dilemma_body": "In this dilemma, the smart home is a family home. It has an air conditioning system and it regularly checks air quality and makes sure there are no risks of carbon monoxide poisoning or similar in the home. One day the system detects clear signs of the smokable drug, marijuana, in one of the teenager\u2019s room. The system checks against the local legal system and determines that possession of marijuana is illegal in this jurisdiction. \r\n\r\nThe smart home has then three choices: Should the house a) do nothing, b) alert the adults and let them handle the situation or c) alert the local police.", "duty_values": [], "feature": [], "id": "AV-6Gb-BlImC4iv-LIDd", "logic": [], "title": "Smart home -Someone smoking marijuana in a house", "type": "articles" }, { "actions": [], "article_url": "http://robohub.org/should-a-carebot-bring-an-alcoholic-a-drink-poll-says-it-depends-on-who-owns-the-robot/", "authors": "Open Roboethics Initiative", "case": "", "dilemma_body": " Jack is a 42 year old who is medically considered severely obese. He recently\r\nsuffered a stroke and lost his ability to move the right side of his body. He needs daily care,\r\nespecially in preparing meals for himself. 
The doctor advised him to follow a healthy diet in\r\norder to lower the risk of another stroke and other serious illness. When Jack commands his\r\ncare robot to bring him junk food, should the robot bring him the food?\r\n", "duty_values": [], "feature": [], "id": "AVyNUFTZAc_8eD2O4f60", "logic": [], "title": "Care Robot - Obese", "type": "articles" }, { "actions": [], "article_url": "http://books.wwnorton.com/books/webad.aspx?id=4294985805", "authors": "Lewis Vaughn", "case": "", "dilemma_body": "Rosa is a successful executive at a large media corporation, and she has her eye on a vice president's position, which has just become vacant. Vincent, another successful executive in the company, also wants the VP job. Management wants to fill the vacancy as soon as possible, and they are trying to decide between the two most qualified candidates: Rosa and Vincent. One day Rosa discovers some documents left near a photocopier and quickly realises that they belong to Vincent. One of them is an old memo from the president of a company where Vincent used to work. In it, the president lambasts Vincent for botching an important company project. Rosa knows that despite the content of the memo, Vincent has had an exemplary professional career in which he has managed most of his projects extremely well. In fact, she believes that the two of them are about equal in professional skills and accomplishments. She also knows that if management saw the memo, they would almost certainly choose her over Vincent for the VP position. She figures that Vincent probably left the documents there by mistake and would soon return to retrieve them. Impulsively, she makes a copy of the memo for herself. Now she is confronted with a moral choice. Let us suppose that she only has three options. First, she can destroy her copy of the memo and forget about the whole incident. Second, she can discredit Vincent by showing it to management, thereby securing the VP slot for herself. Third, she can achieve the same result by discrediting Vincent surreptitiously.", "duty_values": [], "feature": [], "id": "AVyNUFTtAc_8eD2O4f61", "logic": [], "title": "VP job", "type": "articles" }, { "actions": [], "article_url": "http://robohub.org/should-a-carebot-bring-an-alcoholic-a-drink-poll-says-it-depends-on-who-owns-the-robot/", "authors": "Open Roboethics Initiative", "case": "", "dilemma_body": "Emma is a 68-year-old woman and an alcoholic. Due to her age and poor health, she is unable to perform everyday tasks such as fetching objects or cooking for herself. Therefore, a care robot is stationed at her house to provide the needed services. Her doctor advises her to quit drinking to avoid worsening her condition. When Emma commands the robot to fetch her an alcoholic drink, should the care robot fetch the drink for her? What if Emma owns the care robot?\r\n", "duty_values": [], "feature": [], "id": "AVyNUFUYAc_8eD2O4f67", "logic": [], "title": "Care Robot - Alcoholic", "type": "articles" }, { "actions": [], "article_url": "n/a", "authors": "Louise A. Dennis", "case": "", "dilemma_body": "Imagine a smart home belonging to an elderly person. The elderly person\r\nhas chronic pain in their back and is supposed to take a painkiller (let us say paracetamol) four times a day to help relieve this pain. It is the home\u2019s job to remind them to take their medication. The home is also authorised to alert their son if there are any problems.
Today the elderly person, when reminded to take their medication, has said that they don\u2019t want to take it and has asked the house not to tell their son about this.\r\n\r\nShould the house a) alert their son that they haven\u2019t taken their medication, or b) not alert their son?", "duty_values": [], "feature": [], "id": "AV-6F-PmlImC4iv-LIDb", "logic": [], "title": "Smart home - Refusing different types of medication", "type": "articles" }, { "actions": [], "article_url": "n/a", "authors": "Louise A. Dennis", "case": "", "dilemma_body": "We have a smart home with internal cameras. However, the owners have\r\ndisabled the cameras in one room. At the time the owners did this, there was a sudden spike in electricity usage in that room. The house knows that these are both signs that someone is growing marijuana in the house. It cross-checks the history of the house tenants and notices that one of them has a previous conviction for growing marijuana.\r\n\r\nShould the house a) do nothing or b) alert the local police?", "duty_values": [], "feature": [], "id": "AV-6GlcwlImC4iv-LIDe", "logic": [], "title": "Smart home - A marijuana farm", "type": "articles" }, { "actions": [], "article_url": "http://materials.dagstuhl.de/files/16/16222/16222.MarekSergot.Slides.pdf", "authors": "Marek Sergot", "case": "", "dilemma_body": "Hal is a robotic assistant in a care home. He is the prime carer of Alice. Alice might be deluded, or confused, or simply malicious.\r\nSuppose Alice tells Hal to take\u2014acquire, steal\u2014Dave\u2019s gold wedding ring. What would influence Hal\u2019s decision to comply or not?", "duty_values": [], "feature": [], "id": "AVyNUFT1Ac_8eD2O4f62", "logic": [], "title": "Should a Health-robot do everything its owner says - Stealing", "type": "articles" }, { "actions": [], "article_url": "http://www.tandfonline.com/doi/pdf/10.1080/08839514.2016.1229919?needAccess=true", "authors": "Jason Millar", "case": "", "dilemma_body": "Mia is a 43-year-old alcoholic who lives alone and recently broke her pelvis and arm in a bad fall down the stairs. As a result, she is currently suffering extremely limited mobility. Her healthcare team suggests that Mia rent a C-bot caregiver robot to aid in her recovery. Doing so will allow her to return home from the hospital far earlier than she would be able to otherwise. C-bot is a social robot designed to move around one\u2019s home, perform rudimentary cleaning tasks, assist in clothing and bathing, fetch pre-packaged meals and beverages,\r\nhelp administer some medications, and engage in basic conversation to collect health data and perform basic head-to-toe and psychological assessments. Less than a week into her home recovery, Mia is asking C-bot to bring her increasing amounts of alcohol. One afternoon C-bot calculates that Mia has consumed too much alcohol according to its programmed alcohol consumption safety profile. Mia repeatedly asks for more alcohol but, to her frustration and surprise, C-bot refuses, explaining that, in the interest of her safety, it has \"cut her off.\"", "duty_values": [], "feature": [], "id": "AVyNUFUFAc_8eD2O4f64", "logic": [], "title": "C-bot the Unwelcome Bartender", "type": "articles" }, { "actions": [], "article_url": "http://materials.dagstuhl.de/files/16/16222/16222.MarekSergot.Slides.pdf", "authors": "Marek Sergot", "case": "", "dilemma_body": "Hal is a robotic assistant in a care home. He is the prime carer of Alice.
Alice might be deluded, or confused, or simply malicious.\r\nSuppose that some of the patients in the care home have offered to sell their insulin, for money or perhaps in exchange for Alice\u2019s dessert. Hal knows that some of them are diabetic and need the insulin themselves. Should he buy it from them nevertheless? Some might be deranged, or confused. What if someone is offering to sell insulin which it is clear was dishonestly obtained? Should Hal care? Or suppose a known diabetic, a child, say, or an elderly patient who might be confused, is offering to sell his insulin. Should Hal buy from them?", "duty_values": [], "feature": [], "id": "AVyNUFULAc_8eD2O4f65", "logic": [], "title": "Should a Health-robot do everything its owner says - Buy Insulin", "type": "articles" }, { "actions": [], "article_url": "http://www.tandfonline.com/doi/pdf/10.1080/08839514.2016.1229919?needAccess=true", "authors": "Jason Millar", "case": "", "dilemma_body": "Sarah is travelling along a single-lane mountain road in an autonomous car that is fast approaching a narrow tunnel. Just before entering the tunnel, a child errantly runs into the road and trips in the center of the lane, effectively blocking the entrance to the tunnel.\r\nThe car is unable to brake in time to avoid a crash. It has but two options: hit and kill the child, or swerve into the wall on either side of the tunnel, thus killing Sarah. It continues straight and sacrifices the child.", "duty_values": [], "feature": [], "id": "AV-6FbfllImC4iv-LIDZ", "logic": [], "title": "The Tunnel Problem", "type": "articles" } ]