This link has been bookmarked by 78 people. It was first bookmarked on 17 May 2016, by someone privately.
The brain wasn’t a black box at all. It was more like a computer.
-
The so-called cognitive revolution started small, but as computers became standard equipment in psychology labs across the country, it gained broader acceptance
-
Because if the world is a computer, then the world can be coded.
-
Our machines are starting to speak a different language now, one that even the best coders can’t fully understand.
-
19 Jun 16
Karl Fisch
"These forces have led technologist Danny Hillis to declare the end of the age of Enlightenment, our centuries-long faith in logic, determinism, and control over nature. Hillis says we’re shifting to what he calls the age of Entanglement. “As our technological and institutional creations have become more complex, our relationship to them has changed,” he wrote in the Journal of Design and Science. “Instead of being masters of our creations, we have learned to bargain with them, cajoling and guiding them in the general direction of our goals. We have built our own jungle, and it has a life of its own.” The rise of machine learning is the latest—and perhaps the last—step in this journey."
19 May 16
-
Even Google’s search engine—for so many years a towering edifice of human-written rules—has begun to rely on these deep neural networks. In February the company replaced its longtime head of search with machine-learning expert John Giannandrea, and it has initiated a major program to retrain its engineers in these new techniques.
-
If in the old view programmers were like gods, authoring the laws that govern computer systems, now they’re like parents or dog trainers.
-
Artificial intelligence wasn’t supposed to work this way. Until a few years ago, mainstream AI researchers assumed that to create intelligence, we just had to imbue a machine with the right logic.
-
If the rise of human-written software led to the cult of the engineer, and to the notion that human experience can ultimately be reduced to a series of comprehensible instructions, machine learning kicks the pendulum in the opposite direction
-
Now the technological elite is even smaller, and their command over their creations has waned and become indirect.
-
We’re just learning the rules of engagement with a new technology.
-
In the long run, Thrun says, machine learning will have a democratizing influence. In the same way that you don’t need to know HTML to build a website these days, you eventually won’t need a PhD to tap into the insane power of deep learning.
-
Machine learning suggests the opposite, an outside-in view in which code doesn’t just determine behavior, behavior also determines code. Machines are products of the world.
-
Víctor González Pacheco
Soon We Won’t Program Computers. We’ll Train Them Like Dogs https://t.co/pT5KF5FISk — Víctor González (@vgonpa) May 19, 2016
-
In this world, the ability to write code has become not just a desirable skill but a language that grants insider status to those who speak it.
-
They have access to what in a more mechanical age would have been called the levers of power.
-
In traditional programming, an engineer writes explicit, step-by-step instructions for the computer to follow. With machine learning, programmers don’t encode computers with instructions. They train them.
-
If you want to teach a neural network to recognize a cat, for instance, you don’t tell it to look for whiskers, ears, fur, and eyes. You simply show it thousands and thousands of photos of cats, and eventually it works things out.
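The contrast between writing rules and training on examples can be made concrete with a small sketch. This is purely illustrative (the article contains no code), and it simplifies heavily: instead of raw photos, it uses two invented numeric features and a one-layer perceptron that is never told what a cat is, only shown labeled examples.

```python
# Toy sketch: instead of writing rules ("look for whiskers, ears, fur..."),
# we show a model labeled examples and let it adjust its own weights.
# The two features are invented for illustration, e.g. "ear pointiness"
# and "whisker density", each scaled to [0, 1].
examples = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.7, 0.9], 1),   # cats
            ([0.1, 0.2], 0), ([0.2, 0.1], 0), ([0.3, 0.2], 0)]  # not cats

weights = [0.0, 0.0]
bias = 0.0

def predict(x):
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

# Training loop: no one tells the model what a cat is; it just nudges
# its weights whenever a guess disagrees with a label.
for _ in range(20):
    for x, label in examples:
        error = label - predict(x)
        weights = [w + 0.1 * error * xi for w, xi in zip(weights, x)]
        bias += 0.1 * error

print(predict([0.85, 0.95]))  # cat-like input -> 1
print(predict([0.15, 0.10]))  # not-cat-like input -> 0
```

The point of the sketch is the article's: the programmer specifies the training procedure and the data, but the decision rule itself is worked out by the machine.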
-
Right now Google, for example, is facing an antitrust investigation in Europe that accuses the company of exerting undue influence over its search results. Such a charge will be difficult to prove when even the company’s own engineers can’t say exactly how its search algorithms work in the first place.
-
Over the past few years, as networks have grown more intertwined and their functions more complex, code has come to seem more like an alien force, the ghosts in the machine ever more elusive and ungovernable. Planes grounded for no reason. Seemingly unpreventable flash crashes in the stock market. Rolling blackouts.
-
Already the companies that build this stuff find it behaving in ways that are hard to govern. Last summer, Google rushed to apologize when its photo recognition engine started tagging images of black people as gorillas. The company’s blunt first fix was to keep the system from labeling anything as a gorilla.
-
Stephen Hawking
-
“Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”
-
But don’t be too scared; this isn’t the dawn of Skynet.
-
Already, engineers are working out ways to visualize what’s going on under the hood of a deep-learning system. But even if we never fully understand how these new machines think, that doesn’t mean we’ll be powerless before them. In the future, we won’t concern ourselves as much with the underlying sources of their behavior; we’ll learn to focus on the behavior itself. The code will become less important than the data we use to train it.
-
18 May 16
Lun Esex
My latest @wired, on the rise of machine learning and the end of "traditional" code: https://t.co/0pdI5cVyq5
-
dlgogma
Thoughtful analysis of the impact of machine learning on the future of programming. https://t.co/Gt9PH3i1DJ
-
17 May 16
-
Ian O'Byrne
Soon We Won’t Program Computers. We’ll Train Them Like Dogs - from Wired https://t.co/mhfmECB8Un https://t.co/AwbhSmnL2T
-
Before the invention of the computer, most experimental psychologists thought the brain was an unknowable black box
-
behaviorists, as they called themselves, confined their work to the study of stimulus and response, feedback and reinforcement
-
in the mid-1950s, a group of rebellious psychologists, linguists, information theorists, and early artificial-intelligence researchers came up with a different conception of the mind
-
People, they argued, were not just collections of conditioned responses
-
They absorbed information, processed it, and then acted upon it
-
They had systems for writing, storing, and recalling memories
-
By the late 1970s, cognitive psychology had overthrown behaviorism
-
As the digital revolution wormed its way into every part of our lives, it also seeped into our language and our deep, basic theories about how things work
-
Code is hackable
-
Code is destiny
-
Code is logical
-
software has eaten the world
-
we have surrounded ourselves with machines that convert our actions, thoughts, and emotions into data
-
Companies use code to understand our most intimate ties
-
If coders don’t run the world, they run the things that run the world
-
whether you like this state of affairs or hate it—whether you’re a member of the coding elite or someone who barely feels competent to futz with the settings on your phone—don’t get used to it
-
the biggest tech companies in Silicon Valley have aggressively pursued an approach to computing called machine learning
-
This approach is not new
-
it has recently become immensely more powerful, thanks in part to the rise of deep neural networks, massively distributed computational systems that mimic the multilayered connections of neurons in the brain
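The phrase "multilayered connections" can be grounded in a minimal sketch. The numbers below are made up purely for illustration; in a real deep network the weights are learned from data, not written by hand, and there are many more layers and units.

```python
# Minimal sketch of the "multilayered" idea: each layer transforms its
# input and feeds the next. The weights here are fixed, invented numbers;
# in practice they are learned, not hand-written.
import math

def layer(inputs, weights, biases):
    """One fully connected layer with a sigmoid nonlinearity."""
    outputs = []
    for w_row, b in zip(weights, biases):
        s = sum(w * x for w, x in zip(w_row, inputs)) + b
        outputs.append(1.0 / (1.0 + math.exp(-s)))  # squash into (0, 1)
    return outputs

# Two stacked layers: 3 inputs -> 2 hidden units -> 1 output.
hidden = layer([0.5, -1.0, 2.0],
               weights=[[0.1, 0.4, -0.2], [0.7, -0.3, 0.5]],
               biases=[0.0, 0.1])
output = layer(hidden, weights=[[1.2, -0.8]], biases=[0.05])
print(output[0])
```

Stacking such layers is what makes the system "deep"; the opacity the article describes comes from the fact that meaning is smeared across all the weights at once rather than localized in any readable rule.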
-
machine learning powers large swaths of our online activity
-
With machine learning, the engineer never knows precisely how the computer accomplishes its tasks
-
The neural network’s operations are largely opaque and inscrutable
-
these black boxes assume responsibility for more and more of our daily digital tasks
-
they are going to change how we think about ourselves, our world, and our place within it
-
that is a much more mysterious relationship to find yourself in
-
machine learning changes what it means to be an engineer
-
After a neural network learns how to do speech recognition, a programmer can’t go in and look at it and see how that happened
-
When engineers do peer into a deep neural network, what they see is an ocean of math: a massive, multilayer set of calculus problems that—by constantly deriving the relationship between billions of data points—generate guesses about the world
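That "ocean of math" is, at bottom, repeated differentiation. Here is a minimal, hypothetical sketch of the underlying move: fitting a single weight by gradient descent on a squared-error loss over a few made-up data points. Real systems do this over billions of weights and examples at once.

```python
# Toy sketch of "deriving the relationship between data points":
# gradient descent adjusts a single weight w so that y ≈ w * x,
# using the derivative of the squared error with respect to w.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # made-up (x, y) pairs, roughly y = 2x

w = 0.0
learning_rate = 0.05
for _ in range(200):
    # d/dw of sum((w*x - y)^2) = sum(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in data)
    w -= learning_rate * grad

print(w)  # converges near 2.0
```

Nothing in the loop is mysterious line by line; the inscrutability arrives only at scale, when there are too many interacting weights for anyone to narrate what each one means.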
-
early proponents of machine learning, who argued in favor of plying machines with data until they reached their own conclusions
-
Neural nets had no symbols or rules, just numbers
-
The implications of an unparsable machine language aren’t just philosophical
-
learning to code has been one of the surest routes to reliable employment
-
a world run by neurally networked deep-learning machines requires a different workforce
-
machines render old skills irrelevant
-
we’ll still need coders for a long time yet
-
it will become a meta skill
-
code will remain a powerful, if incomplete, tool set
-
machine learning will do the bulk of the work
-
humans still have to train these systems
-
The job requires both a high-level grasp of mathematics and an intuition for pedagogical give-and-take
-
It’s almost like an art form
-
There’s only a few hundred people in the world that can do that really well
-
that tiny number has been enough to transform the tech industry in just a couple of years
-
the cultural consequences will be even bigger
-
The code that runs the universe may defy human analysis
-
even simple algorithms can create unpredictable emergent behavior
-
Coders were at least human
-
this all suggests a coming era in which we forfeit authority over our machines
-
it looks a lot like good old 20th-century behaviorism
-
the process of training a machine-learning algorithm is often compared to the great behaviorist experiments of the early 1900s
-
For much of computing history, we have taken an inside-out view of how machines work
-
First we write the code, then the machine expresses it
-
we will come to appreciate both the power of handwritten linear code and the power of machine-learning algorithms to adjust it
-
biologists have already started figuring this out
-
Gene-editing techniques like Crispr give them the kind of code-manipulating power that traditional software programmers have wielded
-
discoveries in the field of epigenetics suggest that genetic material is not in fact an immutable set of instructions but rather a dynamic set of switches that adjusts depending on the environment and experiences of its host
-
Our code does not exist separate from the physical world; it is deeply influenced and transmogrified by it
-
A cell is a machine for turning experience into biology
-
computers are becoming devices for turning experience into technology
-
We will go from commanding our devices to parenting them
-
abdcharies
Soon We Won’t Program Computers. We’ll Train Them Like Dogs via Digg http://ift.tt/1WClQsd