Table of Contents
• CHAPTER 1: Introduction to Cyberethics: Concepts, Perspectives, and Methodological Frameworks
- Scenario 1–1: Hacking into the Mobile Phones of Celebrities
- 1.1 Defining Key Terms: Cyberethics and Cybertechnology • 1.1.1 What Is Cybertechnology? • 1.1.2 Why the Term Cyberethics?
- 1.2 The Cyberethics Evolution: Four Developmental Phases in Cybertechnology
- 1.3 Are Cyberethics Issues Unique Ethical Issues? • Scenario 1–2: Developing the Code for a Computerized Weapon System • Scenario 1–3: Digital Piracy • 1.3.1 Distinguishing between Unique Technological Features and Unique Ethical Issues • 1.3.2 An Alternative Strategy for Analyzing the Debate about the Uniqueness of Cyberethics Issues • 1.3.3 A Policy Vacuum in Duplicating Computer Software
- 1.4 Cyberethics as a Branch of Applied Ethics: Three Distinct Perspectives • 1.4.1 Perspective #1: Cyberethics as a Field of Professional Ethics • 1.4.2 Perspective #2: Cyberethics as a Field of Philosophical Ethics • 1.4.3 Perspective #3: Cyberethics as a Field of Sociological/Descriptive Ethics • Scenario 1–4: The Impact of Technology X on the Pleasantville Community
- 1.5 A Comprehensive Cyberethics Methodology • 1.5.1 A “Disclosive” Method for Cyberethics • 1.5.2 An Interdisciplinary and Multilevel Method for Analyzing Cyberethics Issues
- 1.6 A Comprehensive Strategy for Approaching Cyberethics Issues
- 1.7 Chapter Summary
• CHAPTER 2: Ethical Concepts and Ethical Theories: Frameworks for Analyzing Moral Issues
- Scenario 2–1: The Case of the “Runaway Trolley”: A Classic Moral Dilemma
- 2.1 Ethics and Morality • 2.1.1 What Is Morality? • 2.1.2 The Study of Morality: Three Distinct Approaches for Evaluating and Justifying the Rules Comprising a Moral System
- 2.2 Discussion Stoppers as Roadblocks to Moral Discourse • 2.2.1 Discussion Stopper #1: People Disagree on Solutions to Moral Issues • 2.2.2 Discussion Stopper #2: Who Am I to Judge Others? • 2.2.3 Discussion Stopper #3: Morality Is Simply a Private Matter • 2.2.4 Discussion Stopper #4: Morality Is Simply a Matter for Individual Cultures to Decide • Scenario 2–2: The Price of Defending Moral Relativism
- 2.3 Why Do We Need Ethical Theories?
- 2.4 Consequence‐Based Ethical Theories • 2.4.1 Act Utilitarianism • Scenario 2–3: A Controversial Policy in Newmerica • 2.4.2 Rule Utilitarianism
- 2.5 Duty‐Based Ethical Theories • 2.5.1 Rule Deontology • Scenario 2–4: Making an Exception for Oneself • 2.5.2 Act Deontology • Scenario 2–5: A Dilemma Involving Conflicting Duties
- 2.6 Contract‐Based Ethical Theories • 2.6.1 Some Criticisms of Contract‐Based Theories • 2.6.2 Rights‐Based Contract Theories
- 2.7 Character‐Based Ethical Theories • 2.7.1 Being a Moral Person vs. Following Moral Rules • 2.7.2 Acquiring the “Correct” Habits
- 2.8 Integrating Aspects of Classical Ethical Theories into a Single Comprehensive Theory • 2.8.1 Moor’s Just‐Consequentialist Theory and Its Application to Cybertechnology • 2.8.2 Key Elements in Moor’s Just‐Consequentialist Framework
- 2.9 Chapter Summary
• CHAPTER 3: Critical Reasoning Skills for Evaluating Disputes in Cyberethics
- Scenario 3–1: Reasoning About Whether to Download Software from “Sharester”
- 3.1 What Is Critical Reasoning? • 3.1.1 Some Basic Concepts: (Logical) Arguments and Claims • 3.1.2 The Role of Arguments • 3.1.3 The Basic Structure of an Argument
- 3.2 Constructing an Argument
- 3.3 Valid Arguments
- 3.4 Sound Arguments
- 3.5 Invalid Arguments
- 3.6 Inductive Arguments
- 3.7 Fallacious Arguments
- 3.8 A Seven‐Step Strategy for Evaluating Arguments
- 3.9 Identifying Some Common Fallacies • 3.9.1 Ad Hominem Argument • 3.9.2 Slippery Slope Argument • 3.9.3 Fallacy of Appeal to Authority • 3.9.4 False Cause Fallacy • 3.9.5 Fallacy of Composition/Fallacy of Division • 3.9.6 Fallacy of Ambiguity/Equivocation • 3.9.7 The False Dichotomy/Either–Or Fallacy/All‐or‐Nothing Fallacy • 3.9.8 The Virtuality Fallacy
- 3.10 Chapter Summary
• CHAPTER 4: Professional Ethics, Codes of Conduct, and Moral Responsibility
- Scenario 4–1: Fatalities Involving the Oerlikon GDF‐005 Robotic Cannon
- 4.1 What Is Professional Ethics? • 4.1.1 What Is a Profession? • 4.1.2 Who Is a Professional? • 4.1.3 Who Is a Computer/IT Professional?
- 4.2 Do Computer/IT Professionals Have Any Special Moral Responsibilities?
- 4.3 Professional Codes of Ethics and Codes of Conduct • 4.3.1 The Purpose of Professional Codes • 4.3.2 Some Criticisms of Professional Codes • 4.3.3 Defending Professional Codes • 4.3.4 The IEEE‐CS/ACM Software Engineering Code of Ethics and Professional Practice
- 4.4 Conflicts of Professional Responsibility: Employee Loyalty and Whistle‐Blowing • 4.4.1 Do Employees Have an Obligation of Loyalty to Employers? • 4.4.2 Whistle‐Blowing • Scenario 4–2: NSA Surveillance and the Case of Edward Snowden
- 4.5 Moral Responsibility, Legal Liability, and Accountability • 4.5.1 Distinguishing Responsibility from Liability and Accountability • 4.5.2 Accountability and the Problem of “Many Hands” • Scenario 4–3: The Case of the Therac‐25 Machine • 4.5.3 Legal Liability and Moral Accountability
- 4.6 Do Some Computer Corporations Have Special Moral Obligations?
- 4.7 Chapter Summary
• CHAPTER 5: Privacy and Cyberspace
- Scenario 5–1: A New NSA Data Center
- 5.1 Privacy in the Digital Age: Who Is Affected and Why Should We Worry? • 5.1.1 Whose Privacy Is Threatened by Cybertechnology? • 5.1.2 Are Any Privacy Concerns Generated by Cybertechnology Unique or Special?
- 5.2 What Is Personal Privacy? • 5.2.1 Accessibility Privacy: Freedom from Unwarranted Intrusion • 5.2.2 Decisional Privacy: Freedom from Interference in One’s Personal Affairs • 5.2.3 Informational Privacy: Control over the Flow of Personal Information • 5.2.4 A Comprehensive Account of Privacy • Scenario 5–2: Descriptive Privacy • Scenario 5–3: Normative Privacy • 5.2.5 Privacy as “Contextual Integrity” • Scenario 5–4: Preserving Contextual Integrity in a University Seminar
- 5.3 Why Is Privacy Important? • 5.3.1 Is Privacy an Intrinsic Value? • 5.3.2 Privacy as a Social Value
- 5.4 Gathering Personal Data: Surveillance, Recording, and Tracking Techniques • 5.4.1 “Dataveillance” Techniques • 5.4.2 Internet Cookies • 5.4.3 RFID Technology • 5.4.4 Cybertechnology and Government Surveillance
- 5.5 Analyzing Personal Data: Big Data, Data Mining, and Web Mining • 5.5.1 Big Data: What, Exactly, Is It, and Why Does It Threaten Privacy? • 5.5.2 Data Mining and Personal Privacy • Scenario 5–5: Data Mining at the XYZ Credit Union • 5.5.3 Web Mining: Analyzing Personal Data Acquired from Our Interactions Online
- 5.6 Protecting Personal Privacy in Public Space • 5.6.1 PPI vs. NPI • Scenario 5–6: Shopping at SuperMart • Scenario 5–7: Shopping at Nile.com • 5.6.2 Search Engines and the Disclosure of Personal Information
- 5.7 Privacy Legislation and Industry Self‐Regulation • 5.7.1 Industry Self‐Regulation and Privacy‐Enhancing Tools • 5.7.2 Privacy Laws and Data Protection Principles
- 5.8 A Right to “Be Forgotten” (or to “Erasure”) in the Digital Age • Scenario 5–8: An Arrest for an Underage Drinking Incident 20 Years Ago • 5.8.1 Arguments Opposing RTBF • 5.8.2 Arguments Defending RTBF • 5.8.3 Establishing “Appropriate” Criteria
- 5.9 Chapter Summary
• CHAPTER 6: Security in Cyberspace
- Scenario 6–1: The “Olympic Games” Operation and the Stuxnet Worm
- 6.1 Security in the Context of Cybertechnology • 6.1.1 Cybersecurity as Related to Cybercrime • 6.1.2 Security and Privacy: Some Similarities and Some Differences
- 6.2 Three Categories of Cybersecurity • 6.2.1 Data Security: Confidentiality, Integrity, and Availability of Information • 6.2.2 System Security: Viruses, Worms, and Malware • 6.2.3 Network Security: Protecting Our Infrastructure • Scenario 6–2: The “GhostNet” Controversy
- 6.3 Cloud Computing and Security • 6.3.1 Deployment and Service/Delivery Models for the Cloud • 6.3.2 Securing User Data Residing in the Cloud • 6.3.3 Assessing Risk in the Cloud and in the Context of Cybersecurity
- 6.4 Hacking and “The Hacker Ethic” • 6.4.1 What Is “The Hacker Ethic”? • 6.4.2 Are Computer Break‐ins Ever Ethically Justifiable?
- 6.5 Cyberterrorism • 6.5.1 Cyberterrorism vs. Hacktivism • Scenario 6–3: Anonymous and the “Operation Payback” Attack • 6.5.2 Cybertechnology and Terrorist Organizations
- 6.6 Information Warfare (IW) • 6.6.1 Information Warfare vs. Conventional Warfare • 6.6.2 Potential Consequences for Nations that Engage in IW
- 6.7 Chapter Summary
• CHAPTER 7: Cybercrime and Cyber‐Related Crimes
- Scenario 7–1: Creating a Fake Facebook Account to Catch Criminals
- 7.1 Cybercrimes and Cybercriminals • 7.1.1 Background Events: A Brief Sketch • 7.1.2 A Typical Cybercriminal
- 7.2 Hacking, Cracking, and Counter Hacking • 7.2.1 Hacking vs. Cracking • 7.2.2 Active Defense Hacking: Can Acts of “Hacking Back” or Counter Hacking Ever Be Morally Justified?
- 7.3 Defining Cybercrime • 7.3.1 Determining the Criteria • 7.3.2 A Preliminary Definition of Cybercrime • 7.3.3 Framing a Coherent and Comprehensive Definition of Cybercrime
- 7.4 Three Categories of Cybercrime: Piracy, Trespass, and Vandalism in Cyberspace
- 7.5 Cyber‐Related Crimes • 7.5.1 Some Examples of Cyber‐Exacerbated vs. Cyber‐Assisted Crimes • 7.5.2 Identity Theft
- 7.6 Technologies and Tools for Combating Cybercrime • 7.6.1 Biometric Technologies • 7.6.2 Keystroke‐Monitoring Software and Packet‐Sniffing Programs
- 7.7 Programs and Techniques Designed to Combat Cybercrime in the United States • 7.7.1 Entrapment and “Sting” Operations to Catch Internet Pedophiles • Scenario 7–2: Entrapment on the Internet • 7.7.2 Enhanced Government Surveillance Techniques and the Patriot Act
- 7.8 National and International Laws to Combat Cybercrime • 7.8.1 The Problem of Jurisdiction in Cyberspace • Scenario 7–3: A Virtual Casino • Scenario 7–4: Prosecuting a Computer Corporation in Multiple Countries • 7.8.2 Some International Laws and Conventions Affecting Cybercrime • Scenario 7–5: The Pirate Bay Web Site
- 7.9 Cybercrime and the Free Press: The WikiLeaks Controversy • 7.9.1 Are WikiLeaks’ Practices Ethical? • 7.9.2 Are WikiLeaks’ Practices Criminal? • 7.9.3 WikiLeaks and the Free Press
- 7.10 Chapter Summary
• CHAPTER 8: Intellectual Property Disputes in Cyberspace
- Scenario 8–1: Streaming Music Online
- 8.1 What Is Intellectual Property? • 8.1.1 Intellectual Objects • 8.1.2 Why Protect Intellectual Objects? • 8.1.3 Software as Intellectual Property • 8.1.4 Evaluating a Popular Argument Used by the Software Industry to Show Why It Is Morally Wrong to Copy Proprietary Software
- 8.2 Copyright Law and Digital Media • 8.2.1 The Evolution of Copyright Law in the United States • 8.2.2 The Fair‐Use and First‐Sale Provisions of Copyright Law • 8.2.3 Software Piracy as Copyright Infringement • 8.2.4 Napster and the Ongoing Battles over Sharing Digital Music
- 8.3 Patents, Trademarks, and Trade Secrets • 8.3.1 Patent Protections • 8.3.2 Trademarks • 8.3.3 Trade Secrets
- 8.4 Jurisdictional Issues Involving Intellectual Property Laws
- 8.5 Philosophical Foundations for Intellectual Property Rights • 8.5.1 The Labor Theory of Property • Scenario 8–2: DEF Corporation vs. XYZ Inc. • 8.5.2 The Utilitarian Theory of Property • Scenario 8–3: Sam’s e‐Book Reader Add‐on Device • 8.5.3 The Personality Theory of Property • Scenario 8–4: Angela’s B++ Programming Tool
- 8.6 The “Free Software” and “Open Source” Movements • 8.6.1 GNU and the Free Software Foundation • 8.6.2 The “Open Source Software” Movement: OSS vs. FSF
- 8.7 The “Common Good” Approach: An Alternative Framework for Analyzing the Intellectual Property Debate • 8.7.1 Information Wants to Be Shared vs. Information Wants to Be Free • 8.7.2 Preserving the Information Commons • 8.7.3 The Fate of the Information Commons: Could the Public Domain of Ideas Eventually Disappear? • 8.7.4 The Creative Commons
- 8.8 PIPA, SOPA, and RWA Legislation: Current Battlegrounds in the Intellectual Property War • 8.8.1 The PIPA and SOPA Battles • 8.8.2 RWA and Public Access to Health‐Related Information • Scenario 8–5: Elsevier Press and “The Cost of Knowledge” Boycott • 8.8.3 Intellectual Property Battles in the Near Future
- 8.9 Chapter Summary
• CHAPTER 9: Regulating Commerce and Speech in Cyberspace
- Scenario 9–1: Anonymous and the Ku Klux Klan
- 9.1 Introduction and Background Issues: Some Key Questions and Critical Distinctions Affecting Internet Regulation • 9.1.1 Is Cyberspace a Medium or a Place? • 9.1.2 Two Categories of Cyberspace Regulation: Regulating Content and Regulating Process • 9.1.3 Four Modes of Regulation: The Lessig Model
- 9.2 Digital Rights Management (DRM) • 9.2.1 Some Implications of DRM for Public Policy Debates Affecting Copyright Law • 9.2.2 DRM and the Music Industry • Scenario 9–2: The Sony Rootkit Controversy
- 9.3 E‐Mail Spam • 9.3.1 Defining Spam • 9.3.2 Why Is Spam Morally Objectionable?
- 9.4 Free Speech vs. Censorship and Content Control in Cyberspace • 9.4.1 Protecting Free Speech • 9.4.2 Defining Censorship
- 9.5 Pornography in Cyberspace • 9.5.1 Interpreting “Community Standards” in Cyberspace • 9.5.2 Internet Pornography Laws and Protecting Children Online • 9.5.3 Virtual Child Pornography • 9.5.4 Sexting and Its Implications for Current Child Pornography Laws • Scenario 9–3: A Sexting Incident Involving Greensburg Salem High School
- 9.6 Hate Speech and Speech that Can Cause Physical Harm to Others • 9.6.1 Hate Speech on the Web • 9.6.2 Online “Speech” that Can Cause Physical Harm to Others
- 9.7 “Network Neutrality” and the Future of Internet Regulation • 9.7.1 Defining Network Neutrality • 9.7.2 Some Arguments Advanced by Net Neutrality’s Proponents and Opponents • 9.7.3 Future Implications for the Net Neutrality Debate
- 9.8 Chapter Summary
• CHAPTER 10: The Digital Divide, Democracy, and Work
- Scenario 10–1: Digital Devices, Social Media, Democracy, and the “Arab Spring”
- 10.1 The Digital Divide • 10.1.1 The Global Digital Divide • 10.1.2 The Digital Divide within Nations • Scenario 10–2: Providing In‐Home Internet Service for Public School Students • 10.1.3 Is the Digital Divide an Ethical Issue?
- 10.2 Cybertechnology and the Disabled
- 10.3 Cybertechnology and Race • 10.3.1 Internet Usage Patterns • 10.3.2 Racism and the Internet
- 10.4 Cybertechnology and Gender • 10.4.1 Access to High‐Technology Jobs • 10.4.2 Gender Bias in Software Design and Video Games
- 10.5 Cybertechnology, Democracy, and Democratic Ideals • 10.5.1 Has Cybertechnology Enhanced or Threatened Democracy? • 10.5.2 How Has Cybertechnology Affected Political Elections in Democratic Nations?
- 10.6 The Transformation and the Quality of Work • 10.6.1 Job Displacement and the Transformed Workplace • 10.6.2 The Quality of Work Life in the Digital Era • Scenario 10–3: Employee Monitoring and the Case of Ontario vs. Quon
- 10.7 Chapter Summary
• CHAPTER 11: Online Communities, Virtual Reality, and Artificial Intelligence
- Scenario 11–1: Ralph’s Online Friends and Artificial Companions
- 11.1 Online Communities and Social Networking Services • 11.1.1 Online Communities vs. Traditional Communities • 11.1.2 Blogs and Some Controversial Aspects of the Blogosphere • Scenario 11–2: “The Washingtonienne” Blogger • 11.1.3 Some Pros and Cons of SNSs (and Other Online Communities) • Scenario 11–3: A Suicide Resulting from Deception on MySpace
- 11.2 Virtual Environments and Virtual Reality • 11.2.1 What Is Virtual Reality (VR)? • 11.2.2 Ethical Aspects of VR Applications
- 11.3 Artificial Intelligence (AI) • 11.3.1 What Is AI? A Brief Overview • 11.3.2 The Turing Test and John Searle’s “Chinese Room” Argument • 11.3.3 Cyborgs and Human–Machine Relationships
- 11.4 Extending Moral Consideration to AI Entities • Scenario 11–4: Artificial Children • 11.4.1 Determining Which Kinds of Beings/Entities Deserve Moral Consideration • 11.4.2 Moral Patients vs. Moral Agents
- 11.5 Chapter Summary
• CHAPTER 12: Ethical Aspects of Emerging and Converging Technologies
- Scenario 12–1: When “Things” Communicate with One Another
- 12.1 Converging Technologies and Technological Convergence
- 12.2 Ambient Intelligence (AmI) and Ubiquitous Computing • 12.2.1 Pervasive Computing, Ubiquitous Communication, and Intelligent User Interfaces • 12.2.2 Ethical and Social Aspects of AmI • Scenario 12–2: E. M. Forster’s “(Pre)Cautionary Tale” • Scenario 12–3: Jeremy Bentham’s “Panopticon/Inspection House” (Thought Experiment)
- 12.3 Nanotechnology and Nanocomputing • 12.3.1 Nanotechnology: A Brief Overview • 12.3.2 Ethical Issues in Nanotechnology and Nanocomputing
- 12.4 Autonomous Machines • 12.4.1 What Is an AM? • 12.4.2 Some Ethical and Philosophical Questions Pertaining to AMs
- 12.5 Machine Ethics and Moral Machines • 12.5.1 What Is Machine Ethics? • 12.5.2 Designing Moral Machines
- 12.6 A “Dynamic” Ethical Framework for Guiding Research in New and Emerging Technologies • 12.6.1 Is an ELSI‐Like Model Adequate for New/Emerging Technologies? • 12.6.2 A “Dynamic Ethics” Model
- 12.7 Chapter Summary