
Obviously technology can't be moral or immoral on its own. The same innovations that brought us nuclear energy also brought us the bomb.

That said, the technologies discussed here seem to pose some unique ethical questions. The insights into data that they provide can certainly be very beneficial, with applications ranging from medicine to ecology to social networking. But the potential abuses of these tools are also significant. With huge datasets and modern techniques it's possible to pry deeply into private lives, to censor or flag various behaviors, and to manipulate public and consumer opinion. It's not hard to imagine some fairly Orwellian applications.

Do practitioners in this field think much about these issues? Do you worry that the tools you develop might be used unscrupulously? If so, does this inform your choice of employer or project?

asked Sep 23 '10 at 12:56


Miles Egan


I really like Jonathan Franzen's essay "Imperial Bedroom", which discusses privacy and its implications, and argues that our modern freak-outs over disappearing privacy are odd, since privacy is a fairly recent invention. I recall seeing a copy online; you can also find it in his book How to Be Alone and at http://www.newyorker.com/archive/1998/10/12/1998_10_12_048_TNY_LIBRY_000016574

(Sep 23 '10 at 16:30) Alexandre Passos ♦

Franzen makes some good points about the difference between social privacy and intellectual privacy. I guess my concerns revolve around more fundamental questions of privacy, like the limits of surveillance and censorship.

(Sep 23 '10 at 16:57) Miles Egan

2 Answers:

Machine learning is a tool, no more or less ethical than mathematics or statistics.

How it's used is a different question, and I guess it depends on your perspective. Historically, the positive applications of mathematics and statistics have outweighed the negative ones. I'm a technological optimist, so I believe that technology ultimately leads to good, but a societal pragmatist insofar as I believe that people won't become better or kinder to each other, except at a very, very slow pace.

I've always argued that becoming rich doesn't change you as a person. It just optimizes you into the asshole or generous person that deep down you always wanted to be, and helps you self-actualize.

Machine learning is a tool that helps us optimize. It doesn't tell you the appropriate choice of loss function. That, I believe, is ultimately an aesthetic question, where aesthetics is---ipso facto---whatever is beyond objective evaluation.
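The point that the optimizer is indifferent to the loss can be made concrete. A minimal sketch (my illustration, not from the thread; the toy income figures are invented): squared loss is minimized by the mean and absolute loss by the median, so the same machinery gives very different answers depending on which errors you decide matter.

```python
import numpy as np

# Toy data: incomes in a small town, with one large outlier.
incomes = np.array([30, 32, 35, 38, 40, 500], dtype=float)

# Squared loss sum((x - c)^2) is minimized by the mean;
# absolute loss sum(|x - c|) is minimized by the median.
# The choice between them is a value judgment, not a
# property of the optimization machinery.
mean_estimate = incomes.mean()        # 112.5 -- dominated by the outlier
median_estimate = np.median(incomes)  # 36.5  -- robust to it

print(mean_estimate, median_estimate)
```

Neither estimate is "correct" in the abstract; which one you release as "typical income" is exactly the kind of choice the answer calls aesthetic.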

answered Sep 23 '10 at 17:09


Joseph Turian ♦♦

I completely agree that a tool in itself is neither ethical nor unethical. I'm also, on the whole, optimistic about the future uses of ML. I certainly would not turn back the clock on technology even if I could.

I guess I'm just wondering how practitioners approach these issues in their work today. Years ago I quit a job because I wasn't comfortable with some of the things the company was doing, and I can imagine being assigned ML tasks that would make me similarly uncomfortable.

(Sep 23 '10 at 17:18) Miles Egan

People are definitely thinking about privacy in data mining. See "Data Mining with Differential Privacy" as one paper for example. (No time to write more. Sorry!)
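For readers unfamiliar with differential privacy, here is a minimal sketch of the standard Laplace mechanism (my illustration, not taken from the cited paper; `laplace_count` and the numbers are hypothetical): a counting query has sensitivity 1, since one person's presence changes the count by at most 1, so adding Laplace noise with scale 1/epsilon gives epsilon-differential privacy.

```python
import numpy as np

def laplace_count(true_count, epsilon, rng):
    """Release a count under epsilon-differential privacy.

    A counting query has sensitivity 1, so Laplace noise with
    scale 1/epsilon masks any single individual's contribution.
    """
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(0)
true_count = 1234  # hypothetical: people in a dataset with some trait
noisy = laplace_count(true_count, epsilon=0.5, rng=rng)
# Noise scale is 1/0.5 = 2, so the released value is close to the
# true count, yet whether any one person is in the data stays hidden.
print(noisy)
```

The tension the question raises shows up directly in the epsilon parameter: smaller epsilon means stronger privacy but noisier, less useful statistics.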

answered Sep 23 '10 at 17:03


Noel Welsh


User submitted content is under Creative Commons: Attribution - Share Alike; Other things copyright (C) 2010, MetaOptimize LLC.