The World Economic Forum warns that AI may destabilize the financial system

“And McWaters says that as AI becomes more widely used in finance, it will be important to consider issues like biased algorithms, which can discriminate against certain groups of people. Financial companies should not be too eager to simply replace staff either, he says. As the study suggests, human skills will remain important even as automation becomes more widespread.”

The World Economic Forum warns that AI may destabilize the financial system
https://www.technologyreview.com/s/611890/the-world-economic-forum-warns-that-ai-may-destabilize-the-financial-system/
via Instapaper

Franken-algorithms: the deadly consequences of unpredictable code

“Computers are already vastly superior to us at certain specialized tasks, but the day they rival our general ability is probably some way off – if it ever happens. Human beings may not be best at much, but we’re second-best at an impressive range of things.

Here’s the problem. Between the “dumb” fixed algorithms and true AI lies the problematic halfway house we’ve already entered with scarcely a thought and almost no debate, much less agreement as to aims, ethics, safety, best practice. If the algorithms around us are not yet intelligent, meaning able to independently say “that calculation/course of action doesn’t look right: I’ll do it again”, they are nonetheless starting to learn from their environments. And once an algorithm is learning, we no longer know to any degree of certainty what its rules and parameters are. At which point we can’t be certain of how it will interact with other algorithms, the physical world, or us. Where the “dumb” fixed algorithms – complex, opaque and inured to real time monitoring as they can be – are in principle predictable and interrogable, these ones are not. After a time in the wild, we no longer know what they are: they have the potential to become erratic. We might be tempted to call these “frankenalgos” – though Mary Shelley couldn’t have made this up.”

Franken-algorithms: the deadly consequences of unpredictable code
http://www.theguardian.com/technology/2018/aug/29/coding-algorithms-frankenalgos-program-danger
via Instapaper
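The core claim in the clip above is that once an algorithm learns from its environment, its effective rules depend on the history of data it has seen, not just on the code its designers wrote. A toy sketch of that idea (everything here is my own illustration, not from the article — the data, function names, and parameters are all invented): two copies of the same online learner, fed different data streams, end up with different learned parameters.

```python
# Toy illustration: an online linear model updated on streamed data.
# After training, its parameters depend on the particular stream it saw,
# so its behaviour can no longer be read off from the source code alone.
import random

def train_online(stream, lr=0.1):
    """One pass of stochastic gradient descent over (x, y) pairs."""
    w, b = 0.0, 0.0
    for x, y in stream:
        err = (w * x + b) - y   # prediction error on this sample
        w -= lr * err * x       # gradient step for the weight
        b -= lr * err           # gradient step for the bias
    return w, b

# Two noisy streams drawn from the same underlying rule y = 2x,
# but with different random histories.
random.seed(0)
stream_a = [(x, 2 * x + random.gauss(0, 0.1))
            for x in [random.uniform(-1, 1) for _ in range(200)]]
random.seed(1)
stream_b = [(x, 2 * x + random.gauss(0, 0.1))
            for x in [random.uniform(-1, 1) for _ in range(200)]]

wa = train_online(stream_a)
wb = train_online(stream_b)
# Same code, different histories -> different learned "rules".
print(wa, wb)
```

The point is not that the drift is large here (both runs land near the true slope of 2), but that the learned parameters are a product of code *and* environment — which is exactly why the article argues such systems stop being predictable by inspection.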



Life, the universe and everything, according to futurist Michio Kaku

“The other thing Hawking warned us about was artificial intelligence. What's your opinion?

We've been brainwashed by Hollywood on this. Our most advanced robot has the intelligence of a cockroach. We can't even get a robot to get out of a car, something a five-year-old can do. We're not likely to see a robot that thinks, and looks and talks like you, uses your voice inflections, until the end of this century.”

Life, the universe and everything, according to futurist Michio Kaku
https://www.smh.com.au/technology/life-the-universe-and-everything-according-to-futurist-michio-kaku-20180820-p4zylp.html
via Instapaper


In the Future of Work, Creativity Still Rules - Connected Futures (featuring Gerd Leonhard of techvshuman)

“Any smart company,” he said, “would not just fire everyone when they can replace them with machines, but move them to more value-adding jobs … If the goal is to spread the power of technology and the benefits across society and create new jobs and new positions and reinvest, then we can do that.”

I don’t think that the end of routine means the end of jobs. I think it just means the end of routine.
- Gerd Leonhard

For the most part, Leonhard fears, we are falling short of that goal. To start, he believes we must move from rote learning to developing the qualities that will separate humans from machines, whether in education or workplace culture.

“We need in the future to focus on the human-only skills,” he stressed. “Mostly emotional intelligence. This is right now a number one desired capability in HR. And we need to teach it to our kids. That includes intelligence to create, to imagine, to tell a story.”

Preparing for these changes, Leonhard believes, is also the smart, competitive thing to do.

“People look at technology,” he explained, “and say wow, instead of having people do this, we can have AI or whatever. And that is a very short-term view because in five or 10 years every single company will have that technology. So, you end up being a commodity. But the thing that makes it a real company that has values and purpose and meaning is the people that work in it.”

The Floating, Fluid, Future Team

As technology change upends the very foundation of how we work, expect teams to be dynamic and fluid, management less hierarchical, and collaboration constant. (Cisco’s Future of Work research supports this conclusion.)

“I believe that the future is about wide-scale collaboration,” Leonhard said, “working on missions rather than for jobs and for necessarily one company.”

For some, that will mean adapting to a gig economy. That won’t be without its pain points, especially if we allow what Leonhard has called “digital Darwinism.””

In the Future of Work, Creativity Still Rules - Connected Futures
https://connectedfutures.cisco.com/article/future-work-creativity-still-rules/
via Instapaper



Why liberal arts and the humanities are as important as engineering - Vivek Wadhwa

“With the convergence of medicine, artificial intelligence and sensors, we can create digital doctors that monitor our health and help us prevent disease; with the advances in genomics and gene editing, we have the ability to create plants that are drought resistant and that feed the planet; with robots powered by artificial intelligence, we can build digital companions for the elderly. Nanomaterial advances are enabling a new generation of solar and storage technologies that will make energy affordable and available to all.

Creating solutions such as these requires a knowledge of fields such as biology, education, health sciences and human behavior. Tackling today’s biggest social and technological challenges requires the ability to think critically about their human context, which is something that humanities graduates happen to be best trained to do.

An engineering degree is very valuable, but the sense of empathy that comes from music, arts, literature and psychology provides a big advantage in design. A history major who has studied the Enlightenment or the rise and fall of the Roman Empire gains an insight into the human elements of technology and the importance of its usability.”

Why liberal arts and the humanities are as important as engineering - Vivek Wadhwa
http://wadhwa.com/2018/06/12/liberal-arts-humanities-important-engineering/
via Instapaper



Why liberal arts and the humanities are as important as engineering - Vivek Wadhwa

“Steve Jobs touted the importance of liberal arts and humanities at the unveiling of the iPad 2: “It’s in Apple’s DNA that technology alone is not enough — it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing, and nowhere is that more true than in these post-PC devices.” With this focus, he built the most valuable company in the world and set new standards for the technology industry.

Logitech CEO Bracken Darrell, who majored in English, also emphasized this. I recently asked him how he turned his company around and caused its stock price to increase by an astonishing 450 percent over five years. He said that it was through relentlessly focusing on design in every product the company built; that engineering is important but what makes a technology product most successful is its design.

The key to good design is a combination of empathy and knowledge of the arts and humanities. Musicians and artists inherently have the greatest sense of creativity. You can teach artists how to use software and graphics tools; turning engineers into artists is hard.”

Why liberal arts and the humanities are as important as engineering - Vivek Wadhwa
http://wadhwa.com/2018/06/12/liberal-arts-humanities-important-engineering/
via Instapaper

Man vs. Machine: The New Ethics of Cybersecurity

“Leonhard’s proposed manifesto focuses on five specific human rights that he believes could be endangered if we don’t have an ethical framework by which to guide us. These are:

1. The right to remain natural, i.e., biological. This means we can be employed, use public services, buy things and function in society without a requirement to deploy technology on or inside our bodies.

2. The right to be inefficient if and where it defines our basic humanness. We must be able to make the choice to be slower than technology and not make efficiency more important than humanity.

3. The right to disconnect. We must retain the right to switch off connectivity, go dark on the network and pause communications, tracking and monitoring.

4. The right to be anonymous. We must retain the option of not being identified and tracked, such as when using a digital application or platform when it doesn’t pose a risk or impose itself on others.

5. The right to employ or involve people instead of machines. We should not allow companies or employers to be disadvantaged if they choose to use people instead of machines—even if it’s more expensive and less efficient.”

Man vs. Machine: The New Ethics of Cybersecurity
https://www.securityroundtable.org/new-ethics-of-cybersecurity/
via Instapaper



Review of Technology vs Humanity: The coming clash between man and machine - Ron Immink

“Echoing Peter Diamandis’ 6Ds, he identified a number of effects:

Dependency – Leaving our thinking to software and algorithms because it’s just so much more convenient and fast.
Confusion – Not knowing if it was the intended human who replied to my emails, or her AI assistant. Or even not knowing if I made my own decision or if I was manipulated by my IDA.
Loss of control – Not having a way of knowing if the AI’s anticipation was correct or not, as we could not possibly track the system’s logic or even comprehend the workings of a quantum computing-fueled, machine-learning system. In other words, we would need to either trust it completely or not at all, similar to the dilemma that some airplane pilots are already facing with their autopilot systems.
Abdication – Being tempted to leave more tasks to systems that would handle them for us, whether it is coordinating personal schedules, making appointments, or answering simple emails. Then, of course, it would be very likely that we would simply blame the cloud/bot/AI if something went wrong.
Reminds me of “The Seventh Sense”. Do you think AI will let us know when it has taken over?”

Technology vs Humanity: The coming clash between man and machine - Ron Immink - Bestselling author, speaker, entrepreneur and ecopreneur.
https://www.ronimmink.com/technology-vs-humanity-coming-clash-man-machine/
via Instapaper

‘Find Your Passion’ vs growth mindset

“Dweck, one of the paper’s authors, has previously studied different types of mind-sets as they relate to intelligence. People who have a growth mind-set about their own intelligence tend to be less afraid of failure, according to her research, because they believe smarts are cultivated, not inherent. Interests are related to, but distinct from, abilities, the study authors told me: You can be interested in something but not very good at it. “I’ve been playing guitar for 25 years, but I can’t say that my abilities have gotten that much better in the past 10 years,” O’Keefe said.

Dweck told me that “find your passion” has a laudable history. “Before that, people were saying, ‘Find your genius,’ and that was so intimidating. It implied that only people who were really brilliant at something could succeed,” she said. “‘Find your passion’ felt more democratic. Everybody can have an interest.” But this study suggests that even the idea of finding your “true” interest can intimidate people and keep them from digging further into a field.”

‘Find Your Passion’ Is Awful Advice
https://www.theatlantic.com/science/archive/2018/07/find-your-passion-is-terrible-advice/564932/
via Instapaper

Let’s make private data into a public good

“The so-called sharing economy is based on the same idea. Instead of interacting with some kind of institution (like a travel agency), customers interact with each other. The role of a company, then, is not to provide the service but to connect sellers (like someone who owns a car and is willing to drive it) with buyers (someone who needs a ride). These so-called platforms are presented as a radical transformation in the way goods and services are produced, shared, and delivered. But they’re also an easy way for companies to avoid responsibility. When disabled users complain to Uber that their drivers refuse to put wheelchairs in the trunk, Uber says, well, we’re not a taxi company, we’re just a platform. Airbnb is similarly reluctant to take responsibility for the safety of the premises offered on its site, or for racial discrimination against renters by property owners. After all, Airbnb didn’t build the apartments and doesn’t own them—it’s just a platform.”

Let’s make private data into a public good
https://www.technologyreview.com/s/611489/lets-make-private-data-into-a-public-good/
via Instapaper