The BBC has a long-running programme called The Moral Maze, presented by Michael Buerk. This was a good episode exploring the issues of social media: http://www.bbc.co.uk/iplayer/episode/b01ntgw5/Moral_Maze_The_Moral_Code_of_Social_Media/

However, as we begin to deploy more computer-controlled objects, cars, robots, and machines that need to operate autonomously in our real-time, chaotic environment, situations will inevitably arise in which the software has to choose between a set of tragic, unpleasant, bad, even horrible, alternatives.

Example 1. You're driving along in your car, which has an insurance protection system switched on, and it can see that an uninsured, poor driver is about to run a red light in front of you in a way that will lead to a crash. The automatic system takes over and you come to a sudden halt. However, the driver behind you now takes evasive action, swerves to avoid you, and hits the same uninsured driver, killing that person instead of you.

Example 2. Your self-driving car crosses a bridge as