tag:digitalethics.net,2013:/posts Digital Ethics by Futurist Gerd Leonhard 2018-12-08T12:53:11Z
tag:digitalethics.net,2013:Post/1351410 2018-12-08T12:53:10Z 2018-12-08T12:53:11Z Microsoft Warns Washington to Regulate A.I. Before It's Too Late
“AI Now—a group composed of tech employees from companies including Microsoft and Google, and affiliated with New York University—says its findings exemplify the need for stricter regulation of artificial intelligence. The group’s report, published Thursday, underscores the inherent dangers in using A.I. to do things like amplify surveillance in fields including finance and policing, and argues that accountability and oversight are necessities where this type of nascent technology is concerned. Crucially, they argue, people should be able to opt out of facial-recognition systems altogether. “Mere public notice of their use is not sufficient, and there should be a high threshold for any consent, given the dangers of oppressive and continual mass surveillance,” the organization writes. “These tools are very suspect and based on faulty science,””

Microsoft Warns Washington to Regulate A.I. Before It's Too Late
https://www.vanityfair.com/news/2018/12/microsoft-warns-washington-to-regulate-ai-before-its-too-late
via Instapaper
tag:digitalethics.net,2013:Post/1346382 2018-11-22T09:34:46Z 2018-11-22T09:34:48Z One of the fathers of AI is worried about its future - we need more democracy in A.I. research!
“another reason why we need to have more democracy in AI research. It’s that AI research by itself will tend to lead to concentrations of power, money, and researchers. The best students want to go to the best companies. They have much more money, they have much more data. And this is not healthy. Even in a democracy, it’s dangerous to have too much power concentrated in a few hands.”

One of the fathers of AI is worried about its future
https://www.technologyreview.com/s/612434/one-of-the-fathers-of-ai-is-worried-about-its-future/
via Instapaper
tag:digitalethics.net,2013:Post/1345243 2018-11-19T07:31:29Z 2018-11-19T07:31:30Z The future of artificial intelligence depends on human wisdom - some good AI stats
“Where the physical and intellectual capacities of humans are inherently limited, AI has the potential to add to that reservoir of capacity to improve lives.

Then, of course, there is the money. As a result of AI, it is projected that global GDP would increase by up to 14 percent in 2030, an estimated increase of $15.7 trillion, with the greatest gains to come in China (26 percent increase in GDP) and the U.S. (14 percent increase in GDP). Gartner predicts that by 2020, almost all new software will contain AI elements.

Huge amounts are already being invested by businesses that seek the efficiency gains and outsized accomplishments AI promises. Venture capital investment in AI startups grew 463 percent from 2012-2017. A McKinsey report noted that global demand for data scientists has exceeded supply by over 50 percent in 2018 alone. They are so coveted that some Chinese companies are reportedly hiring senior machine learning researchers with salaries above $500,000. According to Mark Cuban, by 2017 Google had incorporated AI into its business model and generated $9 billion more as a result and Cuban also posited that the world’s first trillionaire would stem from the AI field.”

The future of artificial intelligence depends on human wisdom
https://www.salon.com/2018/11/17/the-future-of-artificial-intelligence-depends-on-human-wisdom/
via Instapaper
tag:digitalethics.net,2013:Post/1344291 2018-11-15T19:58:21Z 2018-11-15T19:58:22Z Automation may take our jobs—but it’ll restore our humanity
“One implication of all this is that for humans to succeed in the AI-powered future, we need to double down on our humanity. Technical skills will no doubt remain important in the future of work, but as AI allows us to automate repetitive tasks across many industries, these will in many cases take a back seat to soft skills. Communication, emotional intelligence, creativity, critical thinking, collaboration, and cognitive flexibility will become the most sought-after abilities. To prepare for that future, we need to emphasize developing higher-order thinking and emotional skills.”

Automation may take our jobs—but it’ll restore our humanity
https://qz.com/1054034/automation-may-take-our-jobs-but-itll-restore-our-humanity/
via Instapaper
tag:digitalethics.net,2013:Post/1343021 2018-11-12T04:50:18Z 2018-11-12T04:50:19Z Mindset and Heartset - a crucial message!
“The focus on mindset has even deeper roots. If we go back to the Enlightenment from the late 1600’s to the early 1800’s, the key message from the great thinkers of that time was to celebrate the power of the mind and all that it could accomplish. It’s not an accident that this era was also known as the “Age of Reason.” The mind is of course a powerful vehicle for driving amazing insights and accomplishments and should be celebrated. But there’s a risk that we reduce everything to the mind. It’s all about ideas and reason. The body is just a distraction or, at best, something to be nurtured because it holds our mind. Life is so much more complicated than that.”

Mindset and Heartset
https://edgeperspectives.typepad.com/edge_perspectives/2018/11/mindset-and-heartset.html
via Instapaper
tag:digitalethics.net,2013:Post/1342776 2018-11-11T14:58:57Z 2018-11-11T14:59:00Z Mindset and Heartset - must read by John Hagel
“The key assumption in the room was that it was all about the mind. They assumed that our assumptions and beliefs shape what we feel and what we do. In this view of the world, emotions are a distraction, or at best a second order effect, and it’s ultimately all about our mind.

Expanding our view
I would suggest that we’re a lot more complicated than that. Our emotions aren’t just derivative of our assumptions and beliefs. Emotions shape our perceptions, assumptions, thoughts and beliefs as well. If you try to shape assumptions and beliefs without paying attention to the emotions that already exist, good luck.

We need to move beyond mindset and expand our horizons to address our heartset: what are the emotions that filter how we perceive the world, shape what we believe and influence how we act?”

Mindset and Heartset
https://edgeperspectives.typepad.com/edge_perspectives/2018/11/mindset-and-heartset.html
via Instapaper
tag:digitalethics.net,2013:Post/1342733 2018-11-11T10:37:46Z 2018-11-11T10:37:46Z Artificial Intelligence Is Not A Technology (Forbes)
“Yet, with centuries of technology advancement and the almost exponential increase of computing resources, data, knowledge, and capabilities, we still have not yet achieved the vision of Artificial General Intelligence (AGI) -- machines that can be an equal counterpart of human ability. We’re not even close. We have devices we can talk to that don’t understand what we’re saying. We have cars that will happily drive straight into a wall if that’s what your GPS instructs it to do. Machines are detecting images but not understanding what they are. And we have amazing machines that can beat world champions at chess and Go and multiplayer games, but can’t answer a question as basic as “how long should I cook a 14 pound turkey?” We’ve mastered computing. We’ve wrangled big data. We’re figuring out learning. We have no idea how to achieve general intelligence.”

Artificial Intelligence Is Not A Technology
https://www.forbes.com/sites/cognitiveworld/2018/11/01/artificial-intelligence-is-not-a-technology/?utm_campaign=f93d7b1204-EMAIL_CAMPAIGN_2018_11_10_05_46&utm_medium=email&utm_source=Cognitive%2BRoundUp&utm_term=0_8baf59472a-f93d7b1204-98985207
via Instapaper
tag:digitalethics.net,2013:Post/1342449 2018-11-10T11:00:26Z 2018-11-10T11:00:30Z Ignore AI Fear Factor at Your Peril: A Futurist’s Call for 'Digital Ethics'
“In fact, Leonhard cited Gartner’s recent pronouncement that a leading tech topic for 2019 will be “digital ethics,” a focus on compliance, values and respect for individuals' data in response to public concerns about privacy. Leonhard himself defines digital ethics as “the difference between doing whatever technological progress will enable us to do, and putting human happiness and societal flourishing first at all times.””

Ignore AI Fear Factor at Your Peril: A Futurist’s Call for 'Digital Ethics'
https://www.enterprisetech.com/2018/11/03/ignore-the-ai-fear-factor-at-your-peril-a-futurists-call-for-digital-ethics/
via Instapaper
tag:digitalethics.net,2013:Post/1342448 2018-11-10T10:59:26Z 2018-11-10T10:59:26Z Sundar Pichai of Google: ‘Technology Doesn’t Solve Humanity’s Problems’
“But there’s a deeper thing here, which is: Technology doesn’t solve humanity’s problems. It was always naïve to think so. Technology is an enabler, but humanity has to deal with humanity’s problems. I think we’re both over-reliant on technology as a way to solve things and probably, at this moment, over-indexing on technology as a source of all problems, too.”

Sundar Pichai of Google: ‘Technology Doesn’t Solve Humanity’s Problems’
https://www.nytimes.com/2018/11/08/business/sundar-pichai-google-corner-office.html
via Instapaper
tag:digitalethics.net,2013:Post/1336527 2018-10-27T13:03:52Z 2018-10-27T13:03:53Z PDF with presentation from ICEE Fest 2018: Technology and Humanity - the next 10 years (Futurist Gerd Leonhard)

tag:digitalethics.net,2013:Post/1336526 2018-10-27T13:01:24Z 2018-10-27T13:01:25Z Shared presentation: Humans and Technology - Heaven or Hell? (AHRA Aruba 2018, Gerd Leonhard)

tag:digitalethics.net,2013:Post/1335846 2018-10-25T11:00:44Z 2018-10-25T11:00:44Z Apple’s Tim Cook makes blistering attack on the ‘data industrial complex’
“At its core this technology promises to learn from people individually to benefit us all. But advancing AI by collecting huge personal profiles is laziness, not efficiency,” Cook continued.

“For artificial intelligence to be truly smart it must respect human values — including privacy. If we get this wrong, the dangers are profound. We can achieve both great artificial intelligence and great privacy standards. It is not only a possibility — it is a responsibility.”


Apple’s Tim Cook makes blistering attack on the ‘data industrial complex’
https://recode.cmail20.com/t/d-l-nvtljt-ydyhdtqtk-d/
via Instapaper
tag:digitalethics.net,2013:Post/1335592 2018-10-24T14:53:33Z 2018-10-24T14:53:35Z Let's Talk About AI Ethics; We're On A Deadline
“Code that learns is both powerful and dangerous. It threatens the basic rules of markets and civic life. AI requires a new technical and civic infrastructure, a new way to conduct business, a new way to be together in community.

AI and enabling technologies like robotics and autonomous vehicles will change lives and livelihoods. Great benefits and unprecedented wealth will be created. But with that will come waves of disruption.

Compared to prior revolutions, this one is occurring at exponential speed and while impacts are ubiquitous, control is concentrated. AI is a centralizing force. It plows through monster data sets in seconds aggregating benefits and wealth at an unprecedented speed.”

Let's Talk About AI Ethics; We're On A Deadline
https://www.forbes.com/sites/tomvanderark/2018/09/13/ethics-on-a-deadline/
via Instapaper
tag:digitalethics.net,2013:Post/1335590 2018-10-24T14:43:11Z 2018-10-24T14:43:14Z Gartner picks digital ethics and privacy as a strategic trend for 2019
“Any discussion on privacy must be grounded in the broader topic of digital ethics and the trust of your customers, constituents and employees. While privacy and security are foundational components in building trust, trust is actually about more than just these components. Trust is the acceptance of the truth of a statement without evidence or investigation. Ultimately an organisation’s position on privacy must be driven by its broader position on ethics and trust. Shifting from privacy to ethics moves the conversation beyond ‘are we compliant’ toward ‘are we doing the right thing’.”

Gartner picks digital ethics and privacy as a strategic trend for 2019
http://social.techcrunch.com/2018/10/16/gartner-picks-digital-ethics-and-privacy-as-a-strategic-trend-for-2019/
via Instapaper
tag:digitalethics.net,2013:Post/1334752 2018-10-22T05:38:37Z 2018-10-22T05:38:38Z Alexa’s Most Dangerous Feature Can’t Be Undone
“These and other tech corporations have grand ambitions. They want to colonize space. Not interplanetary space. Everyday space: home, office, car. In the near future, everything from your lighting to your air-conditioning to your refrigerator, your coffee maker, and even your toilet could be wired to a system controlled by voice.”

Alexa’s Most Dangerous Feature Can’t Be Undone
https://www.theatlantic.com/magazine/archive/2018/11/alexa-how-will-you-change-us/570844/
via Instapaper
tag:digitalethics.net,2013:Post/1334066 2018-10-20T00:34:29Z 2018-10-20T00:34:30Z Joseph Stiglitz on artificial intelligence: 'We’re going towards a more divided society'
“Beyond the impact of AI on work, Stiglitz sees more insidious forces at play. Armed with AI, tech firms can extract meaning from the data we hand over when we search, buy and message our friends. It is used ostensibly to deliver a more personalised service. That is one perspective. Another is that our data is used against us.

“These new tech giants are raising very deep issues about privacy and the ability to exploit ordinary people that were never present in earlier eras of monopoly power,” says Stiglitz. “Beforehand, you could raise the price. Now you can target particular individuals by exploiting their information.”

“We’ve gone from a 60-hour working week to a 45-hour week and we could go to 30 or 25.” – Joseph Stiglitz
It is the potential for datasets to be combined that most worries Stiglitz. For example, retailers can now track customers via their smartphones as they move around stores and can gather data on what catches their eye and which displays they walk straight past.

“In your interactions with Google, Facebook, Twitter and others, they gather an awful lot of data about you. If that data is combined with other data, then companies have a great deal of information about you as an individual – more information than you have on yourself,” he says.”

Joseph Stiglitz on artificial intelligence: 'We’re going towards a more divided society'
http://www.theguardian.com/technology/2018/sep/08/joseph-stiglitz-on-artificial-intelligence-were-going-towards-a-more-divided-society
via Instapaper
tag:digitalethics.net,2013:Post/1333142 2018-10-17T06:56:34Z 2018-10-17T06:56:36Z Gartner picks digital ethics and privacy as a strategic trend for 2019
“But what really stands out in Gartner’s list of developing and/or barely emergent strategic tech trends is digital ethics and privacy — given the concept is not reliant on any particular technology underpinning it; yet is being (essentially) characterized as an emergent property of other already deployed (but unnamed) technologies. So is actually in play — in a way that others on the list aren’t yet (or aren’t at the same mass scale).

The analyst dubs digital ethics and privacy a “growing concern for individuals, organisations and governments”, writing: “People are increasingly concerned about how their personal information is being used by organisations in both the public and private sector, and the backlash will only increase for organisations that are not proactively addressing these concerns.”

Yes, people are increasingly concerned about privacy. Though ethics and privacy are hardly new concepts (or indeed new discussion topics). So the key point is really the strategic obfuscation of issues that people do in fact care an awful lot about, via the selective and non-transparent application of various behind-the-scenes technologies up to now — as engineers have gone about collecting and using people’s data without telling them how, why and what they’re actually doing with it.”

Gartner picks digital ethics and privacy as a strategic trend for 2019
http://social.techcrunch.com/2018/10/16/gartner-picks-digital-ethics-and-privacy-as-a-strategic-trend-for-2019/
via Instapaper
tag:digitalethics.net,2013:Post/1331876 2018-10-13T12:28:27Z 2018-10-13T12:28:28Z GSMA boss urges strong response to digital pressures - Mobile World Live
“There are fears that digital disruption has the potential for destroying jobs and increasing inequality, and in the process, upending the European social model,” said Granryd.

He continued to state that politicians and regulators alike are “seeking to harness the digital revolution” while ensuring Europe’s citizens are fully benefitting, as democracies become a target of “malign influences misusing the networks that are part of our everyday life.”

“The unrestricted nature of the internet, once seen as its greatest virtue, is now being challenged as its greatest weakness,” the GSMA boss added.”

GSMA boss urges strong response to digital pressures - Mobile World Live
https://www.mobileworldlive.com/featured-content/home-banner/gsma-boss-urges-strong-response-to-digital-pressures/
via Instapaper
tag:digitalethics.net,2013:Post/1331246 2018-10-11T17:21:33Z 2018-10-11T17:21:34Z Tech Workers Now Want to Know: What Are We Building This For?
“Across the technology industry, rank-and-file employees are demanding greater insight into how their companies are deploying the technology that they built. At Google, Amazon, Microsoft and Salesforce, as well as at tech start-ups, engineers and technologists are increasingly asking whether the products they are working on are being used for surveillance in places like China or for military projects in the United States or elsewhere.”

Tech Workers Now Want to Know: What Are We Building This For?
https://www.nytimes.com/2018/10/07/technology/tech-workers-ask-censorship-surveillance.html
via Instapaper
tag:digitalethics.net,2013:Post/1331081 2018-10-11T08:00:21Z 2018-10-11T08:00:22Z Universal Basic Income Is Silicon Valley’s Latest Scam
“Uber’s business plan, like that of so many other digital unicorns, is based on extracting all the value from the markets it enters. This ultimately means squeezing employees, customers, and suppliers alike in the name of continued growth. When people eventually become too poor to continue working as drivers or paying for rides, UBI supplies the required cash infusion for the business to keep operating.”

Universal Basic Income Is Silicon Valley’s Latest Scam
https://medium.com/s/powertrip/universal-basic-income-is-silicon-valleys-latest-scam-fd3e130b69a0
via Instapaper
tag:digitalethics.net,2013:Post/1330466 2018-10-09T05:29:36Z 2018-10-09T05:29:36Z Why Data Needs Humans as Much as Humans Need Data
“The thing that will separate the great agencies and great work from the merely informed and statistically correct is not the quantity of the data, but the quality of the people sifting through, interpreting and applying it—the people who can turn cold, hard facts into insights, ideas and emotions; the people who know how to truly connect with others.”

Why Data Needs Humans as Much as Humans Need Data
https://www.adweek.com/brand-marketing/why-data-needs-humans-as-much-as-humans-need-data/
via Instapaper
tag:digitalethics.net,2013:Post/1328470 2018-10-03T13:24:31Z 2018-10-03T13:24:31Z Newco Shift | Technology, Humanity, and the Existential Test
“the technology industry reveled for decades as a kind of benighted warrior for societal good. As one Senator told me during the Facebook hearings this past summer, “we purposefully didn’t regulate technology, and that was the right thing to do.” But now? He shrugged. Now, maybe it’s time.”

Newco Shift | Technology, Humanity, and the Existential Test
https://shift.newco.co/2018/09/24/technology-humanity-and-the-existential-test/
via Instapaper
tag:digitalethics.net,2013:Post/1328147 2018-10-02T12:06:42Z 2018-10-02T12:06:43Z A Model for the Future of Education
“For me it’s about passion, curiosity, imagination, critical thinking, and grit.

Passion: You’d be amazed at how many people don’t have a mission in life… A calling… something to jolt them out of bed every morning. The most valuable resource for humanity is the persistent and passionate human mind, so creating a future of passionate kids is so very important. For my 7-year-old boys, I want to support them in finding their passion or purpose… something that is uniquely theirs. In the same way that the Apollo program and Star Trek drove my early love for all things space, and that passion drove me to learn and do.
Curiosity: Curiosity is something innate in kids, yet something lost by most adults during the course of their life. Why? In a world of Google, robots, and AI, raising a kid that is constantly asking questions and running “what if” experiments can be extremely valuable. In an age of machine learning, massive data, and a trillion sensors, it will be the quality of your questions that will be most important.
Imagination: Entrepreneurs and visionaries imagine the world (and the future) they want to live in, and then they create it. Kids happen to be some of the most imaginative humans around… it’s critical that they know how important and liberating imagination can be.
Critical Thinking: In a world flooded with often-conflicting ideas, baseless claims, misleading headlines, negative news, and misinformation, learning the skill of critical thinking helps find the signal in the noise. This principle is perhaps the most difficult to teach kids.
Grit/Persistence: Grit is defined as “passion and perseverance in pursuit of long-term goals,” and it has recently been widely acknowledged as one of the most important predictors of and contributors to success.”

A Model for the Future of Education
https://singularityhub.com/2018/09/12/a-model-for-the-future-of-education-and-the-tech-shaping-it/
via Instapaper
tag:digitalethics.net,2013:Post/1326732 2018-09-28T10:03:11Z 2018-09-28T10:03:11Z Just Don’t Call It Privacy
“In other words, asking companies whose business models revolve around exploiting data-based consumer-influence techniques to explain their privacy policies seems about as useful as asking sharks to hold forth on veganism.

“Congress should not be examining privacy policies,” Marc Rotenberg, the executive director of the Electronic Privacy Information Center, a prominent digital rights nonprofit, told me last week. “They should be examining business practices. They should be examining how these firms collect and use the personal data of customers, of internet users.””

Just Don’t Call It Privacy
https://www.nytimes.com/2018/09/22/sunday-review/privacy-hearing-amazon-google.html
via Instapaper
tag:digitalethics.net,2013:Post/1326512 2018-09-27T17:12:24Z 2018-09-27T17:12:25Z Just Don’t Call It Privacy
“In a surveillance economy where companies track, analyze and capitalize on our clicks, the issue at hand isn’t privacy. The problem is unfettered data exploitation and its potential deleterious consequences — among them, unequal consumer treatment, financial fraud, identity theft, manipulative marketing and discrimination.”

Just Don’t Call It Privacy
https://www.nytimes.com/2018/09/22/sunday-review/privacy-hearing-amazon-google.html
via Instapaper
tag:digitalethics.net,2013:Post/1326391 2018-09-27T14:06:26Z 2018-09-27T14:06:27Z Top 9 ethical issues in artificial intelligence
“In 2014, roughly the same revenues were generated by the three biggest companies in Detroit and the three biggest companies in Silicon Valley ... only in Silicon Valley there were 10 times fewer employees.”

Top 9 ethical issues in artificial intelligence
https://www.weforum.org/agenda/2016/10/top-10-ethical-issues-in-artificial-intelligence/
via Instapaper
tag:digitalethics.net,2013:Post/1324402 2018-09-22T15:25:15Z 2018-09-22T15:25:17Z Former Head of Google China Foresees an AI Crisis – and Proposes a Solution
“AI will take many single-task, single-domain jobs away. You can argue that humans have abilities that AI does not: We can conceptualize, strategize, create. Whereas today’s AI is just a really smart pattern recognizer that can take in data, optimize, and beat humans at a given task. But how many jobs in the world are simple repetitions of tasks that can be optimized? How many jobs require no creativity, strategizing, conceptualization? Most jobs are repetitive: truck-driving, telemarketing, dishwashing, fruit picking, assembly-line work, and so on. I’m afraid that about 50 percent of jobs in the world are in danger.”

Former Head of Google China Foresees an AI Crisis – and Proposes a Solution
https://spectrum.ieee.org/tech-talk/robotics/artificial-intelligence/former-head-of-google-china-foresees-an-ai-crisis
via Instapaper
tag:digitalethics.net,2013:Post/1324348 2018-09-22T11:52:57Z 2018-09-22T11:52:57Z The future of humanity depends on design ethics, says Tim Wu
“To be clear, these false loops are an explicit business strategy. The more you can convince someone that they need to keep checking your site, the more time they’ll spend on your platform–and the more ads they’ll see. It’s the same philosophy that underpins incessant notifications and the infinite scroll you find on many media sites (including our own). “If you were to obey Facebook’s mandate–hey, this friend commented on this, you should comment back, oh, you need to like this–you’d spend 24 hours there and still you’d never close the loops,” Wu says.”

The future of humanity depends on design ethics, says Tim Wu
https://www.fastcompany.com/90239599/the-future-of-humanity-depends-on-design-ethics-says-tim-wu
via Instapaper
tag:digitalethics.net,2013:Post/1322627 2018-09-17T16:26:21Z 2018-09-17T16:26:39Z Forget the new iPhones: Apple’s best product is now privacy
“In 2018, no issue is more important than user privacy–or the lack of it. We’re tracked by private industry on an unprecedented scale, with major corporations having so much data about us–much of it gleaned without our knowledge–that they can tell when a teenager is pregnant (and inform the teen’s father) or even predict your future actions based on decisions you haven’t made yet. If you want to be part of this world, designed by advertisers and tech giants, you must relinquish your right to privacy. In other words, we live in a commercial surveillance state.”

Forget the new iPhones: Apple’s best product is now privacy
https://www.fastcompany.com/90236195/forget-the-new-iphones-apples-best-product-is-now-privacy
via Instapaper
tag:digitalethics.net,2013:Post/1321942 2018-09-15T16:00:23Z 2018-09-15T16:00:34Z Senior Google Scientist Resigns Over “Forfeiture of Our Values” in China
““I view our intent to capitulate to censorship and surveillance demands in exchange for access to the Chinese market as a forfeiture of our values and governmental negotiating position across the globe,” he wrote, adding: “There is an all-too-real possibility that other nations will attempt to leverage our actions in China in order to demand our compliance with their security demands.””

Senior Google Scientist Resigns Over “Forfeiture of Our Values” in China
https://theintercept.com/2018/09/13/google-china-search-engine-employee-resigns/
via Instapaper