I'm still on a bit of an ethics-and-responsibility-in-technology kick, and a couple of articles caught my eye this week. Firstly, a professor at Northwestern University has proposed that computer science researchers disclose the potential negative societal consequences of their work; you can read the full details on the Future of Computing Academy's site. Secondly, O'Reilly published a great piece on Data's Day of Reckoning (the rest of their posts on data ethics are worth a look, too):
It is time for us to take responsibility for our creations. What does it mean to take responsibility for building, maintaining, and managing data, technologies, and services? Responsibility is inevitably tangled with the complex incentives that surround the creation of any product. These incentives have been front and center in the conversations around the roles that social networks have played in the 2016 U.S. elections, recruitment of terrorists, and online harassment. It has become very clear that the incentives of the organizations that build and own data products haven't aligned with the good of the people using those products.
Unless you've been hiding under a very large rock somewhere, you'll have heard that Apple became the world's first trillion-dollar company last week. While that's a remarkable milestone in and of itself, I've been more interested in some of the commentary about how this reflects the growing trend (especially in tech) towards megacorporations. The New York Times has one of the more readable takes on the phenomenon, especially its effect on the labor market:
And in the labor market, scholars have linked corporate consolidation to rising income inequality and the declining share of income that goes to workers. The so-called labor share of the economy has been declining in the United States and other rich countries since the 1990s, coinciding with the trend toward corporate concentration. And that decline has been most pronounced in industries undergoing the greatest consolidation.
All of this is a good segue into a piece in the New Yorker asking whether economists and humanists could ever be friends:
Economics, Morson and Schapiro say, has three systematic biases: it ignores the role of culture, it ignores the fact that "to understand people one must tell stories about them," and it constantly touches on ethical questions beyond its ken. Culture, stories, and ethics are things that can't be reduced to equations, and economics accordingly has difficulty with them. Morson and Schapiro's solution is to use the study of the humanities, and particularly of realist fiction, to broaden perspectives and to reintroduce to economics those three missing factors.
On a different topic, I found myself nodding along to just about everything in this piece on the bloated state of the modern web:
The average internet connection in the United States is many times as fast as it was just ten years ago, but instead of making it faster to browse the same types of websites, we're simply occupying that extra bandwidth with more stuff. Some of this stuff is amazing: in 2006, you could stream videos that were 640 × 480 pixels, but you can now stream movies in HD resolution and (pretend) 4K. These much higher speeds also allow us to see more detailed photos, and that's very nice.
But a lot of the stuff we're seeing is a pile-up of garbage on seemingly every major website that does nothing to make visitors happier; if anything, much of this stuff is deeply irritating and morally indefensible.
Deeply irritating is selling some of this stuff way short.
Filed under "things everyone should read" is How to Stop Saying "Um," "Ah," and "You Know". Even outside of a work context, if you talk to anybody about anything, you should stop whatever you're doing right now and read this. Seriously.
I'm a bit of a sucker for a good heist story, and while this wasn't exactly an Ocean's Eleven-style job, the story of sage tree is a pretty good read if you've got some time to kill.