A stable full of AI talent
Powered by VBProfiles
Remember when IBM’s “Watson” computer competed on the TV game show “Jeopardy” and won? Most people probably thought “Wow, that’s cool,” or perhaps were briefly reminded of the legend of John Henry and the ongoing contest between man and machine. Beyond the media splash it caused, though, the event was viewed as a breakthrough on many fronts. Watson demonstrated that machines could understand and interact in a natural language, question-and-answer format and learn from their mistakes. This meant that machines could deal with the exploding growth of non-numeric information that is getting hard for humans to keep track of: to name two prominent and crucially important examples,
So how does it work? First, with multiple business models. Mike Rhodin, IBM’s senior vice president responsible for Watson, told me, “There are three core business models that we will run in parallel.”
More and more, organizations will need to make choices in their R&D activities to either create platforms or take advantage of them.
Those with deep technical and infrastructure skills, like IBM, can shift the focus of their internal R&D activities toward building platforms that can connect with ecosystems of outsiders to collaborate on innovation.
The second and more likely option for most companies is to use platforms like IBM’s or Amazon’s to create their own apps and offerings for customers and partners. In either case, new, semi-autonomous agile units, like IBM’s Watson Group, can help to create and capture huge value from these new customer and entrepreneur ecosystems.
[Image: A glass dish contains a “brain,” a living network of 25,000 rat brain cells connected to an array of 60 electrodes. Credit: University of Florida/Ray Carson]
A University of Florida scientist has grown a living “brain” that can fly a simulated plane, giving scientists a novel way to observe how brain cells function as a network.

The “brain” — a collection of 25,000 living neurons, or nerve cells, taken from a rat’s brain and cultured inside a glass dish — gives scientists a unique real-time window into the brain at the cellular level. By watching the brain cells interact, scientists hope to understand what causes neural disorders such as epilepsy and to determine noninvasive ways to intervene.
As living computers, they may someday be used to fly small unmanned airplanes or handle tasks that are dangerous for humans, such as search-and-rescue missions or bomb damage assessments.
“We’re interested in studying how brains compute,” said Thomas DeMarse, the UF assistant professor of biomedical engineering who designed the study. “If you think about your brain, and learning and the memory process, I can ask you questions about when you were 5 years old and you can retrieve information. That’s a tremendous capacity for memory. In fact, you perform fairly simple tasks that you would think a computer would easily be able to accomplish, but in fact it can’t.”
When Apple announced the iPhone 4S on October 4, 2011, the headlines were not about its speedy A5 chip or improved camera. Instead they focused on an unusual new feature: an intelligent assistant, dubbed Siri. At first Siri, endowed with a female voice, seemed almost human in the way she understood what you said to her and responded, an advance in artificial intelligence that seemed to place us on a fast track to the Singularity. She was brilliant at fulfilling certain requests, like “Can you set the alarm for 6:30?” or “Call Diane’s mobile phone.” And she had a personality: If you asked her if there was a God, she would demur with deft wisdom. “My policy is the separation of spirit and silicon,” she’d say.

Over the next few months, however, Siri’s limitations became apparent. Ask her to book a plane trip and she would point to travel websites—but she wouldn’t give flight options, let alone secure you a seat. Ask her to buy a copy of Lee Child’s new book and she would draw a blank, despite the fact that Apple sells it. Though Apple has since extended Siri’s powers—to make an OpenTable restaurant reservation, for example—she still can’t do something as simple as booking a table on the next available night in your schedule. She knows how to check your calendar and she knows how to use OpenTable. But putting those things together is, at the moment, beyond her.
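The missing piece described above is skill composition: chaining a calendar lookup into a booking request. A minimal sketch of what that chaining looks like, using entirely hypothetical stand-ins (a hard-coded busy list and a pretend `book_table` function, not Siri’s or OpenTable’s actual APIs):

```python
from datetime import date, timedelta

# Hypothetical stand-ins: a toy calendar of busy evenings and a fake
# reservation call. Neither reflects a real Siri or OpenTable API.
BUSY_EVENINGS = {date(2014, 9, 1), date(2014, 9, 2)}

def evening_is_free(day):
    """Check the toy calendar for a free evening."""
    return day not in BUSY_EVENINGS

def book_table(restaurant, day):
    """Pretend to reserve a table; a real agent would call a booking service."""
    return f"Booked {restaurant} for {day.isoformat()}"

def book_next_available(restaurant, start, horizon_days=14):
    """Compose the two skills: scan the calendar, then book the first free night."""
    for offset in range(horizon_days):
        day = start + timedelta(days=offset)
        if evening_is_free(day):
            return book_table(restaurant, day)
    return None  # no free evening inside the horizon

print(book_next_available("Chez Panisse", date(2014, 9, 1)))
```

The composition itself is a ten-line loop; the hard part, which this sketch sidesteps, is getting a natural-language assistant to decide that these two skills should be chained at all.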
Right now, you have students. Eventually, those students will become the citizens — employers, employees, professionals, educators, and caretakers of our planet in the 21st century. Beyond mastery of standards, what can you do to help prepare them? What can you promote to be sure they are equipped with the skill sets they will need to take on challenges and opportunities that we can’t yet even imagine?
Following are six tips to guide you in preparing your students for what they’re likely to face in the years and decades to come.
When Harvard roboticists first introduced their Kilobots in 2011, they had made only 25 of them. When we next saw the robots in 2013, they’d made 100. Now the researchers have built one thousand of them. That’s a whole kilo of Kilobots, and probably the most robots that have ever been in the same place at the same time.
The researchers—Michael Rubenstein, Alejandro Cornejo, and Professor Radhika Nagpal of Harvard’s Self-Organizing Systems Research Group—describe their thousand-robot swarm in a paper published today in Science (they actually built 1024 robots, apparently following the computer science definition of “kilo”).
Despite their menacing name (KILL-O-BOTS!) and the robot swarm nightmares they may induce in some people, these little guys are harmless. Each Kilobot is a small, cheap-ish ($14) device that moves around by vibrating its legs and communicates with other robots via infrared transmitters and receivers.
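Because each Kilobot can only talk to neighbors within short infrared range, global coordination has to emerge from purely local rules. One such rule, gradient formation, can be sketched in a few lines: each robot repeatedly sets its hop count to one more than the smallest value it hears from a neighbor. The positions and communication radius below are illustrative toy values, not the real hardware’s parameters:

```python
import math

# Six robots in a line; each can only "hear" neighbors within COMM_RANGE.
# These coordinates and the 1.5-unit radius are toy values for illustration.
robots = {i: (float(i), 0.0) for i in range(6)}
COMM_RANGE = 1.5

def neighbors(i):
    """Indices of robots within infrared range of robot i."""
    xi, yi = robots[i]
    return [j for j, (xj, yj) in robots.items()
            if j != i and math.hypot(xi - xj, yi - yj) <= COMM_RANGE]

# Gradient formation: the seed robot holds 0; everyone else relaxes toward
# (smallest neighbor value) + 1 until no value changes.
gradient = {i: math.inf for i in robots}
gradient[0] = 0  # robot 0 is the seed

changed = True
while changed:
    changed = False
    for i in robots:
        if i == 0:
            continue
        best = min((gradient[j] for j in neighbors(i)), default=math.inf)
        if best + 1 < gradient[i]:
            gradient[i] = best + 1
            changed = True

print(gradient)  # each robot's hop distance from the seed
```

In the real swarm this runs asynchronously on each robot with noisy infrared messages, but the same relaxation idea underlies the shape self-assembly reported in the Science paper.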
Computers that can comprehend messy data such as images could revolutionize what technology can do for us.
Google has acquired the team behind Jetpac, an iPhone app for crowdsourcing city guides from public Instagram photos. The app will be pulled from the App Store in coming days, and support for the service will be discontinued on September 15.
Jetpac’s deep learning software used a nifty trick of scanning our photos to evaluate businesses and venues around town. As MIT Technology Review notes, the app could tell whether visitors were tourists, whether a bar was dog-friendly, and how fancy a place was.
It even employed humans to find hipster spots by training the system to count the number of mustaches and plaid shirts.
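The underlying idea is simple aggregation: run detectors over a venue’s photos, then turn per-photo hits into a venue-level score. A toy sketch of that pipeline, with hand-written labels standing in for real classifier output (the venue names, labels, and scoring formula are all made up for illustration):

```python
# Per-photo detection labels, standing in for the output of image
# classifiers run over a venue's public photos. All data here is invented.
photos_by_venue = {
    "Cafe A": [{"mustache", "plaid"}, {"plaid"}, {"latte"}],
    "Bar B":  [{"beer"}, {"dog"}, {"beer", "dog"}],
}

def attribute_rate(photos, attribute):
    """Fraction of a venue's photos in which the attribute was detected."""
    hits = sum(1 for labels in photos if attribute in labels)
    return hits / len(photos)

def hipster_score(photos):
    """Rough 0-1 score: average of the mustache and plaid detection rates."""
    return (attribute_rate(photos, "mustache")
            + attribute_rate(photos, "plaid")) / 2

for venue, photos in photos_by_venue.items():
    print(venue, round(hipster_score(photos), 2))
```

The hard part Jetpac actually solved was the detection step itself; once each photo is reduced to labels, the venue ranking is just counting.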
Interestingly, Jetpac’s technology was inspired by Google researcher Geoffrey Hinton, so it makes perfect sense for Google to bring the startup into its fold. If this means that Google Now will gain the ability to automatically alert me when I’m entering a hipster-infested area, then I’m an instant fan.
“Imagine all photos tagged automatically, the ability to search the world by knowing what is in the world’s shared photos, and robots that can see like humans,” the App Store description for its Spotter app reads. If that’s not a Googly description, I don’t know what is.
(h/t Ouriel Ohayon)