Wednesday, October 01, 2014

Digital Ethics: How Not to Mess Up with Technology

With innovation getting so much airtime in recent years, is there any consensus on how all that innovative technology is being used? To get some perspective, Gartner (a consultancy) hosted a webinar on Digital Ethics, put together by its VP of Research, Frank Buytendijk. What is Digital Ethics anyway? Buytendijk defines it as “A system of values and moral principles for the conduct of digital interactions among business, people, and things.”

He acknowledges that most people subscribe to the ‘guns don’t kill people, people do’ school of thought: with technology too, it is people who put it to wrong use, so the responsibility should lie with them. He argued, however, that technology carries a “moral imprint”, even when it was created to solve a specific, usually commonplace problem. Innovators, he feels, should therefore also bear responsibility for the unintended uses their inventions are put to.

He used the example of TomTom, whose navigation devices work in both directions: they guide drivers through traffic, and they report back data on traffic conditions and on where each driver is at any given moment (in relation to the congestion). TomTom aggregates its subscribers’ data and sells it to whoever pays for it, but the data shared is anonymous. The buyer cannot tell who was driving where at a particular time of day.
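The anonymisation step described above, stripping identities before aggregating, can be sketched roughly as follows. This is an illustrative assumption of how such a pipeline might look, not TomTom's actual system; the record fields and road-segment names are made up:

```python
from collections import defaultdict

def aggregate_anonymous(probe_records):
    """Turn per-driver GPS probes into anonymous per-segment statistics.

    Each input record is assumed to look like:
      {"driver_id": "u1", "segment": "A2-km47", "speed_kmh": 60}
    The driver_id is dropped; only segment-level counts and averages survive.
    """
    speeds = defaultdict(list)
    for rec in probe_records:
        # Identity is discarded here: only road segment and speed are kept.
        speeds[rec["segment"]].append(rec["speed_kmh"])
    return {
        seg: {"vehicles": len(v), "avg_speed_kmh": sum(v) / len(v)}
        for seg, v in speeds.items()
    }

records = [
    {"driver_id": "u1", "segment": "A2-km47", "speed_kmh": 60},
    {"driver_id": "u2", "segment": "A2-km47", "speed_kmh": 70},
    {"driver_id": "u3", "segment": "A4-km12", "speed_kmh": 100},
]
print(aggregate_anonymous(records))
```

A buyer of this output learns that two vehicles averaged 65 km/h on one segment, but not who drove them, which is exactly why the police use case below was still possible: speed patterns survive anonymisation even when identities do not.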

Even such ‘cleaned up’ data has its uses. The Ministry of Infrastructure uses it to map where its own road-repair works are ongoing, and whether these are causing further traffic snarls. That turns into added value, even if indirect, for TomTom users. Another user of the data is the traffic police, who analyse it to monitor speeds on various stretches of road, and then plan speed traps to make drivers slow down and drive carefully.

So he asked: is this a good thing or a bad thing? In fact, both customers of the service and the media raised a hue and cry along ‘Big Brother is watching’ lines when the police were given access to the data. TomTom’s share price fell, and the company had to rewrite its terms and conditions: it would analyse its data for traffic-jam patterns…and nothing more.

The reality is that ultimately, one cannot undo knowledge and pretend ignorance of the unexpected consequences of technology. Buytendijk suggests that monitoring is required. Here’s his framework to help you decide whether you are going to be indifferent or take a stand:
  • Black Hat: Your main goal is to make a buck; there is no place for ethics in business.
  • Grey Hat: You just want to do your job and avoid an ethical discussion. If and when pushed into one, you make it up as you go along.
  • White Hat: You think through the ethics of technology. Ethical responsibility outweighs business responsibility. 
Which category you want to belong to is entirely up to you.

Buytendijk does agree that the organisation you work for has to take a stand, but since an organisation is a collection of people, it has to start with an individual willing to initiate the ‘ethics’ discussion.

In a sense, just like a good idea, an ethical stand begins with an individual…and then spreads out to others.

Graphics are sourced from the webinar: Digital Ethics: How Not to Mess Up with Technology. This was written for Beyond Jugaad.
