The Global Search for Education: Education and Jobs

The Future of Employment study makes clear that what matters most today is what you can do with what you know, rather than how much you know.

- Dr. Tony Wagner

What does today’s technology mean for tomorrow’s jobs and how can we better structure our education system to ensure that the future working population can prosper in the labor market?

A wide range of 20th century jobs is endangered by the machine age. A recent Oxford Martin School study by Dr. Carl Benedikt Frey (Oxford Martin School) and Dr. Michael A. Osborne (Department of Engineering Science, University of Oxford) found that 47% of current US jobs are at risk of automation within the next twenty years. Further, despite recent job growth in the service industry sector, occupations within this industry are highly susceptible. Frey and Osborne assessed the degree to which 702 specific jobs are vulnerable to computerization, sorting these occupations into categories of high, medium and low risk.

Job Automation May Threaten Half of U.S. Workforce (Bloomberg, published March 12, 2014)

Mobile robots and ‘smart’ computers that learn on the job make it likely that occupations employing about half of today’s U.S. workers could be automated in the next decade or two, according to an Oxford University study that estimated the probability of computerization for more than 700 occupations.

Sources: University of Oxford, Carl Benedikt Frey and Michael A. Osborne. Graphic: Aki Ito and Dave Merrill, Bloomberg.

One of the main ways governments have helped people during previous waves of technological progress is through education system reform. What should government be doing now to make the changes that are necessary? To discuss these issues further, I am joined today in The Global Search for Education by Dr. Carl Frey and Dr. Michael Osborne, authors of The Future of Employment: How Susceptible are Jobs to Computerization, and Dr. Tony Wagner, Expert in Residence at Harvard University’s Innovation Lab. Tony presented on Education for Innovation at last week’s OPPI Festival in Helsinki, Finland.

“I can only recommend that young people continue to gain the kind of cognitive and creative skills that give them a competitive edge over machines.” – Dr. Michael A. Osborne

Gentlemen, could you please summarize what you believe were your most important findings in your study?

Michael: We found that a substantial fraction (47%) of current US employment is at risk of automation within the next twenty years. While some of these occupations are in categories previously thought unthreatened by automation, such as logistics and services, we expect automation to continue to predominantly threaten low-skilled workers. In fact, we found a strong negative relationship between the average degree of education within an occupation and its susceptibility to computerization. In a similar way, this was true for the average wage: the better-paid jobs, featuring largely better-educated workers, are unlikely to be automated in the near future. Quantitatively, we found that if only a quarter of people within an occupation have a bachelor’s degree or better, the occupation would likely have a fifty-fifty chance of being automatable within the foreseeable future. If half of workers within an occupation have at least a bachelor’s degree, its probability of automation is close to zero. It seems clear that education is a crucial issue in considering future jobs.
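To make that relationship concrete, here is a toy calculation. It is not the model from the paper (which classifies occupations from O*NET job characteristics); it simply fits a logistic curve through the two reference points Michael quotes, roughly a 50% automation probability when a quarter of an occupation's workers hold a bachelor's degree and a probability near zero when half do. The function name and the choice of 0.05 as "close to zero" are assumptions for illustration only.

    import math

    def automation_probability(bachelor_share):
        """Toy logistic curve through the two reference points quoted above:
        ~0.5 probability when 25% of workers hold a bachelor's degree,
        ~0.05 ("close to zero") when 50% do. Illustrative only."""
        # Solve p = 1 / (1 + exp(-(a + b*x))) at the two anchor points:
        #   x = 0.25, p = 0.50  ->  a + 0.25*b = 0
        #   x = 0.50, p = 0.05  ->  a + 0.50*b = ln(0.05 / 0.95)
        b = math.log(0.05 / 0.95) / 0.25   # slope
        a = -0.25 * b                      # intercept
        return 1.0 / (1.0 + math.exp(-(a + b * bachelor_share)))

    for share in (0.10, 0.25, 0.50, 0.75):
        print(f"{share:.0%} with a bachelor's degree -> "
              f"p(automation) ~ {automation_probability(share):.2f}")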

Can you speak a little about the range of 20th century jobs that are endangered by the machine age?  

Michael: We firstly expect that existing trends of automation in production will continue: robots, with ever improving sensors and manipulators, will continue to replace factory workers. We further predict that many sales jobs are vulnerable: online shopping and self-checkouts will only continue to become more popular at the expense of human salespeople and cashiers. In fact, telemarketers were rated as one of the most computerizable occupations; to the dismay, no doubt, of anyone who is sick of speaking to robots on the phone.

Perhaps more surprisingly, we expect transportation and clerical jobs to be at risk from new technologies. Autonomous vehicles threaten many logistics occupations, such as drivers of forklifts or mine vehicles, while big data analytics place occupations reliant on storing or accessing information at risk, such as tax preparers. As evidence for the latter: we’re already seeing paralegal jobs replaced by algorithms, so this is not an unreasonable prediction.

We finally suspect that many jobs in the service sector will be increasingly at risk, with the growth of service robotics and sophisticated algorithms. As examples, court reporters may have their jobs threatened by transcription software, and electronics repair jobs are already being affected by the declining costs of increasingly complex electronic items. This is particularly alarming given the recently high fraction of workers undertaking service work. Nonetheless, many other service sector jobs are likely to remain unautomatable; as an example, human housekeepers are still much better at their jobs than robots.

Carl: To expand a bit on that, what we are saying is that service occupations that do not require much creative and social intelligence are likely to be automated. Some personal service jobs, however, do require social intelligence in particular. These, we think, will not be automated.

“In the short run, the government could support employment by stimulating the demand for personal services. In the long run, I do not believe there is much of a substitute for training workers to work with computers.” – Dr. Carl Benedikt Frey

Please discuss some of the characteristics of occupations not at risk of computerization. 

Michael: These jobs involve tasks at which machines are relatively poor: tasks involving creativity or social intelligence. As examples, I think

  • recreational therapists,
  • mental health counselors and
  • primary school teachers

are relatively safe for the foreseeable future.

Many people may also be surprised to learn that occupations requiring work in very cluttered environments are also relatively safe. For example, the perceptual capacity of a human housekeeper, able to distinguish unwanted dirt from a pot plant, is unlikely to be matched by a robot cleaner for many decades.

Tony, why does this evolutionary phase require more revolutionary changes in education versus the gradual changes we have seen in previous generations?

Tony: The Future of Employment study makes clear that what matters most today is what you can do with what you know, rather than how much you know. Many recent college graduates find themselves unemployed or underemployed because they lack the skills needed in an increasingly innovation-driven economy. With academic content knowledge having become a commodity that’s available on every internet-connected device, the ability to

  • initiate, 
  • discern, 
  • persevere, 
  • collaborate, and 
  • solve problems creatively

are the qualities most in demand today and will be increasingly important in the future. The problem is that our education system was designed, primarily, to teach the three R’s and to transmit content knowledge. We need to create schools that coach students for skill and will, in addition to teaching content. If we don’t make this transition quickly, a growing number of our youth will be unemployable at the same time that employers complain that they cannot find new hires who have the skills they need.

“We need to create schools that coach students for skill and will, in addition to teaching content. If we don’t make this transition quickly, a growing number of our youth will be unemployable at the same time that employers complain that they cannot find new hires who have the skills they need.” – Dr. Tony Wagner

What recommendations would you make to governments about retraining workers who are now or will be unemployed as a result of this evolution? 

Tony: I wish I had an intelligent answer to this important question, but I’m a “recovering” high school English teacher, not an economist. My hunch is that it will take a generation to better prepare young people for the new economy. Meanwhile, perhaps we’ll need to put people to work repairing our crumbling infrastructure, helping out in preschools and assisted living homes, and so on. There is a lot to be done to make our country a better and more humane place to live. The question is: are we willing to pay people to do this work?

Carl: In the short run, the government could support employment by stimulating the demand for personal services. In the long run, I do not believe there is much of a substitute for training workers to work with computers.

If you were speaking to a group of high school students today, what fields and disciplines would you encourage them to explore to ensure success in the job market?

Tony: First, I would encourage them to pursue their real interests. Curiosity and intrinsic interest trump mere academic achievement today. Second, I’d suggest they consider designing an interdisciplinary major in college around a problem of interest to them. Innovation increasingly happens at the intersections of academic disciplines, not within them.

Michael: One thing that came out very clearly from our analysis was the continuing importance of education. In particular, we found a strong negative trend between an occupation’s average level of education and its probability of computerization. As such, I can only recommend that young people continue to gain the kind of cognitive and creative skills that give them a competitive edge over machines. In particular (and I may be biased), occupations revolving around creative uses of data are likely to be resistant to automation for some time. Further, people skills, such as the ability to negotiate or persuade, are likely to become increasingly important for human work, due to their resistance to automation. Finally, manual work in unstructured environments is probably a fairly safe bet: gardeners are unlikely to have to worry about their jobs for a good long while.

 

C. M. Rubin, Dr. Tony Wagner, Dr. Carl Benedikt Frey, Dr. Michael A. Osborne

Photos are courtesy of the Oxford Martin School and Tony Wagner.

For more information on the Oxford Martin School Study:

http://www.futuretech.ox.ac.uk/sites/futuretech.ox.ac.uk/files/The_Future_of_Employment_OMS_Working_Paper_1.pdf

For more information on Education for Innovation at the OPPI Festival: http://oppifestival.com/

In The Global Search for Education, join me and globally renowned thought leaders including Sir Michael Barber (UK), Dr. Michael Block (U.S.), Dr. Leon Botstein (U.S.), Professor Clay Christensen (U.S.), Dr. Linda Darling-Hammond (U.S.), Dr. Madhav Chavan (India), Professor Michael Fullan (Canada), Professor Howard Gardner (U.S.), Professor Andy Hargreaves (U.S.), Professor Yvonne Hellman (The Netherlands), Professor Kristin Helstad (Norway), Jean Hendrickson (U.S.), Professor Rose Hipkins (New Zealand), Professor Cornelia Hoogland (Canada), Honourable Jeff Johnson (Canada), Mme. Chantal Kaufmann (Belgium), Dr. Eija Kauppinen (Finland), State Secretary Tapio Kosunen (Finland), Professor Dominique Lafontaine (Belgium), Professor Hugh Lauder (UK), Professor Ben Levin (Canada), Lord Ken Macdonald (UK), Professor Barry McGaw (Australia), Shiv Nadar (India), Professor R. Natarajan (India), Dr. Pak Tee Ng (Singapore), Dr. Denise Pope (US), Sridhar Rajagopalan (India), Dr. Diane Ravitch (U.S.), Richard Wilson Riley (U.S.), Sir Ken Robinson (UK), Professor Pasi Sahlberg (Finland), Professor Manabu Sato (Japan), Andreas Schleicher (PISA, OECD), Dr. Anthony Seldon (UK), Dr. David Shaffer (U.S.), Dr. Kirsten Sivesind (Norway), Chancellor Stephen Spahn (U.S.), Yves Theze (Lycee Francais U.S.), Professor Charles Ungerleider (Canada), Professor Tony Wagner (U.S.), Sir David Watson (UK), Professor Dylan Wiliam (UK), Dr. Mark Wormald (UK), Professor Theo Wubbels (The Netherlands), Professor Michael Young (UK), and Professor Minxuan Zhang (China) as they explore the big picture education questions that all nations face today.

The Global Search for Education Community Page

C. M. Rubin is the author of two widely read online series for which she received a 2011 Upton Sinclair award, “The Global Search for Education” and “How Will We Read?” She is also the author of three bestselling books, including The Real Alice in Wonderland, and is the publisher of CMRubinWorld.

Follow C. M. Rubin on Twitter: www.twitter.com/@cmrubinworld


Numenta, Jeff Hawkins’ AI startup, is now only about learning your AWS patterns

ORIGINAL: Gigaom
by Derrick Harris
MAR. 25, 2014

 

photo: Shutterstock / Will Deganello
SUMMARY:
Numenta, the machine learning company from Palm creator Jeff Hawkins, has narrowed its business model to focus solely on predicting anomalies in Amazon Web Services (AWS) instances. It’s one of several big changes at the company in the past few years.
Numenta, the startup from Palm creator Jeff Hawkins that’s trying to commercialize a “cortical learning algorithm” that mimics the brain’s capability to detect complex patterns, has narrowed its focus to predicting anomalies in customers’ Amazon Web Services instances. It’s a pretty sharp pivot for the company, which came out of stealth mode in late 2012 talking about how it could learn patterns and predict failures in any system pumping out streams of performance data.
Here’s how I described Numenta’s broad capabilities when covering its partnership with smart-grid-management company EnerNOC last year:
Grok … is continuously learning from every new data point that hits the system, and it’s always readjusting its models to account for any changes it sees in the patterns of data. Not only does this help it make predictions faster and more accurately, but it also helps Grok spot anomalies that could cause problems.
Here’s how Numenta now describes the product on its web site:
Grok leverages sophisticated algorithms to analyze connected datastreams, such as those from AWS CloudWatch. Through complex pattern analysis, Grok identifies abnormal conditions or gradual trends – situations that tools based on thresholds or simple statistics can easily miss.
A screenshot of Grok.
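Grok itself is built on Numenta's cortical learning algorithm, whose internals are not described here. Purely to make the quoted contrast with threshold-based tools concrete, below is a minimal sketch of an adaptive alternative, a rolling z-score over a streaming metric. The metric values, window size and cutoff are invented for illustration and have nothing to do with Grok's actual implementation.

    from collections import deque
    from statistics import mean, stdev

    def rolling_anomalies(values, window=60, z_cutoff=4.0):
        """Flag points that deviate strongly from the *recent* behaviour of a
        metric (e.g. a CloudWatch CPU series), rather than from a fixed level.
        A plain rolling z-score, not Numenta's cortical learning algorithm."""
        history = deque(maxlen=window)
        flagged = []
        for i, v in enumerate(values):
            if len(history) >= 10 and stdev(history) > 0:
                z = (v - mean(history)) / stdev(history)
                if abs(z) > z_cutoff:
                    flagged.append(i)
            history.append(v)
        return flagged

    # A slow upward trend plus one early spike: a static threshold set high
    # enough to ignore the trend would miss the spike, while the rolling
    # model flags it because it breaks the recent pattern.
    series = [20 + 0.5 * i for i in range(100)]
    series[30] = 80
    print(rolling_anomalies(series))   # -> [30]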
While Numenta previously discussed turning Grok loose on streaming data as part of an automated system that could even take action in extreme scenarios, the new Grok provides a mobile-only interface for users to check their metrics. Curiously, it’s Android-only.
Actually, Numenta has undergone a lot of changes in the past few years. The company was founded in 2005 to commercialize one neuroscience-based algorithm but shifted its focus to the cortical learning algorithm in 2009. Co-founder and former CTO Dileep George left in 2010 to launch Vicarious, the artificial intelligence startup that recently received a $40 million investment from Elon Musk, Mark Zuckerberg and Ashton Kutcher, among others. Former CEO Rami Branitzky left in August 2013 and is now a managing director at SAP Ventures. Numenta Co-founder Donna Dubinsky is currently CEO.
From May 2013 until Tuesday, Numenta had changed its name to Grok Solutions and maintained Numenta as the name of its open source corollary.
It will be interesting to see if Numenta can catch on among AWS users at a level it apparently couldn’t as a general-purpose technology. There are already myriad services out there for analyzing AWS performance – Stackdriver, Boundary and New Relic among the better-known ones – and Numenta will have to provide a truly differentiated service to make a name for itself. Its algorithms might be fancier and tuned for prediction rather than just monitoring, but I suspect the folks in charge of keeping cloud applications running are tiring of new tools to consider and aren’t too keen on fixing environments that aren’t broken.
For more on the technology behind Numenta, check out then-CEO Branitzky presenting as part of our Big Ideas collection at Structure Data 2013.

Why the Bizarre Ocean Dandelion is Like an Ant Colony on Steroids

By Rebecca Helm, Brown University
Mar 26, 2014
I am not alone. (Photo: NOAA, CC BY)
I was 12 when I first came across an ocean dandelion. I wish I had known then how strange these animals truly are. I was watching a documentary where researchers had collected a deep-sea dandelion using a submersible, but upon returning to the surface, the dandelion had disintegrated into nothing but “petals”.
The announcer said: “We know almost nothing about the ocean dandelion. What it eats, how it reproduces, how it is put together.” Since watching this documentary, new species of ocean dandelions have been discovered, yet much of their biology remains poorly understood. But scientists do know something about how they are constructed, and when I found out, 16 years later, I was in for quite a shock.
A few years ago I started work in a lab studying a strange group of animals called siphonophores that includes the Portuguese man-of-war. Turns out the ocean dandelion is a siphonophore, and these siphonophores challenge a simple assumption about what it means to be an animal.
To explain that, let’s look at human beings. You are one animal, but you are made of trillions of cells that work together. Biologists describe this as “levels of organisation”. On one level, you are trillions of cells; on another, you are one unique animal. But ocean dandelions have a third level.
Imagine a single creature that is not just made up of trillions of cells, but also hundreds of animals. All these animals work together in the same way your cells work together, creating a kind of super organism. A colony of ants could be considered a super organism, all working together with one queen. Siphonophores, like the ocean dandelion, take this whole idea one step further. The ocean dandelion is like an ant colony on steroids.
Each ocean dandelion is a collection of individual animals, all working together for the colony, just as different ants form a colony. There are different jobs for different members. Some protect the colony, some catch food, some reproduce. But there is one key difference between an ant colony and an ocean dandelion: individual ants work together but still remain separate from one another; for members of the ocean dandelion colony, this isn’t true.
The many animals that make up the ocean dandelion actually share tissues with one another. They have one shared community stomach system, so what one animal eats, all get to digest. Colony members have some independence, and are capable of their own movements. However, a vast colony-wide nervous system also coordinates individual movements, so that many members can work collectively for a common goal.
Forget the hammer and sickle: communism, your symbol should be the ocean dandelion.
Each “petal” of the disintegrated ocean dandelion I saw years ago was actually a single member of the colony, able to survive a short time on its own before starving to death. A change in pressure, or a bumpy ride to the ocean’s surface, may have been what caused the colony to collapse.
Despite the time that’s passed since I first saw the ocean dandelion, there is still a lot we don’t know.
What does it eat? How does it reproduce? But we know something about how it’s put together, and I never would have guessed the answer would be so strange. Twelve-year-old me would be thrilled.
This is an edited version of an article that first appeared on Deep Sea News.
Rebecca Helm does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.

Infographic Competition: Visualizing the Scale of the Brain

ORIGINAL: Visually
President Obama’s BRAIN Initiative has brought brain mapping to the forefront of popular science. But what does it mean to map the brain?
One human brain contains seven orders of magnitude of spatial complexity and at least 10 orders of temporal magnitude. These numbers are hard to fathom, so MIT’s EyeWire has teamed up with FEI and Visually to launch a “Scale of the Brain” Infographic Competition. Entries should visualize spatial scales in the brain.
Judges

Prizes

1st:

2nd: $200 – Sponsored by FEI
3rd: $100 – Sponsored by FEI
Criteria
Judges will grade entries on the following criteria:
Information is represented accurately and communicated clearly. Sources for additional information not in the creative brief should be included according to best practices (use original sources whenever possible).
Story/layout is engaging and insightful, helping to pull the viewer through the graphic.
The infographic’s design should be attractive and captivating without detracting from the communication of the information. Illustrations, layout, font, and color choices are all important.
Each judge gets 10 points per category. All categories are totaled for each group. Totals from each judge are averaged together. Highest average wins.
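Read literally, the scoring could be tallied as in the small sketch below; the judge names and point values are invented, and "group" is taken to mean each entry.

    # Hypothetical scores: each judge awards up to 10 points in each of the
    # three criteria (accuracy, story/layout, design) for every entry.
    scores = {
        "entry_a": {"judge_1": (9, 8, 7), "judge_2": (8, 9, 9)},
        "entry_b": {"judge_1": (7, 7, 8), "judge_2": (9, 8, 7)},
    }

    def final_score(per_judge):
        totals = [sum(categories) for categories in per_judge.values()]  # total per judge
        return sum(totals) / len(totals)                                 # averaged across judges

    ranked = sorted(scores, key=lambda entry: final_score(scores[entry]), reverse=True)
    for entry in ranked:
        print(entry, final_score(scores[entry]))  # highest average wins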

Licensing

All entries to the competition should be licensed under Creative Commons so that the entries may be used in whole or in part as educational materials anywhere. The Creative Commons Attribution 4.0 mark should be included somewhere on the design.

Logos

The winners will be asked to add the logos of FEI, EyeWire and Visually, along with their own logo or name.
Submissions
Send all submissions as links or attachments to contest@visual.ly before midnight PDT on April 30.
Questions about the contest can also be directed to that email address.
Schedule
  • Submission Deadline: April 30
  • Judging: May 9
  • Winner Announced: May 15
Creative Brief
The spatial scale of the brain ranges from meters to nanometers, with plenty of gradations in between. This quick summary will get you started, and these images are a great visual reference.
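As a rough check on the "seven orders of magnitude" figure mentioned above, the snippet below compares a few approximate, commonly cited length scales. The specific numbers are ballpark values chosen for illustration, not figures supplied by the competition brief.

    import math

    # Approximate length scales in metres (ballpark, illustrative values).
    scales = {
        "whole brain":     0.15,     # ~15 cm
        "cortical column": 0.5e-3,   # ~0.5 mm
        "neuron soma":     20e-6,    # ~20 micrometres
        "axon diameter":   1e-6,     # ~1 micrometre
        "synaptic cleft":  20e-9,    # ~20 nanometres
    }

    span = math.log10(max(scales.values()) / min(scales.values()))
    print(f"spatial span: about {span:.1f} orders of magnitude")   # ~6.9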

Revolutionizing Genomics and Personalized Medicine with IBM Watson (Video of Project Launch Event AND DEMO)

The New York Genome Center and IBM Watson Group Announce Collaboration to Advance Genomic Medicine March 19, 2014

IBM NYGC Press Release

IBM Selected as First Technology Partner for Leading Genomic Research Institution; Project aims to Apply Advanced Analytics to Genomic Treatment Options for Brain Cancer Patients
New York, NY (March 19, 2014) – The New York Genome Center (NYGC) and IBM (NYSE: IBM) today announced an initiative to accelerate a new era of genomic medicine with the use of IBM’s Watson cognitive system. IBM and NYGC will test a unique Watson prototype designed specifically for genomic research as a tool to help oncologists deliver more personalized care to cancer patients.

NYGC and its medical partner institutions plan to initially evaluate Watson’s ability to help oncologists develop more personalized care for patients with glioblastoma, an aggressive and malignant brain cancer that kills more than 13,000 people in the U.S. each year. Despite groundbreaking discoveries into the genetic drivers of cancers like glioblastoma, few patients benefit from personalized treatment that is tailored to their individual cancer mutations. Clinicians lack the tools and time required to bring DNA-based treatment options to their patients; to do so, they must correlate data from genome sequencing with reams of medical journals, new studies and clinical records, at a time when medical information is doubling every five years.

This joint NYGC Watson initiative aims to speed up this complex process, identifying patterns in genome sequencing and medical data to unlock insights that will help clinicians bring the promise of genomic medicine to their patients. The combination of NYGC’s genomic and clinical expertise coupled with the power of IBM’s Watson system will enable further development and refinement of the Watson tool with the shared goal of helping medical professionals develop personalized cancer care.

The new cloud-based Watson system will be designed to analyze genetic data along with comprehensive biomedical literature and drug databases. Watson can continually ‘learn’ as it encounters new patient scenarios, and as more information becomes available through new medical research, journal articles and clinical studies. Given the depth and speed of Watson’s ability to review massive databases, the goal of the collaboration is to increase the number of patients who have access to care options tailored to their disease’s DNA.

“Since the human genome was first mapped more than a decade ago, we’ve made tremendous progress in understanding the genetic drivers of disease. The real challenge before us is how to make sense of massive quantities of genetic data and translate that information into better treatments for patients,” said Robert Darnell, M.D., Ph.D., CEO, President and Scientific Director of the New York Genome Center. “Applying the cognitive computing power of Watson is going to revolutionize genomics and accelerate the opportunity to improve outcomes for patients with deadly diseases by providing personalized treatment.”

First Watson Application in Genomic Research

Watson will complement rapid genome sequencing and is expected to dramatically reduce the time it takes to correlate an individual’s genetic mutations with reams of medical literature, study findings, and therapeutic indications that may be relevant. The intention is to provide comprehensive information that enables clinicians to consider a variety of treatment options, tailored to their patient’s genetic mutations. It will also help NYGC scientists understand the data detailing gene sequence variations between normal and cancerous biopsies of brain tumors.

“As genomic research progresses and information becomes more available, we aim to make the process of analysis much more practical and accessible through cloud-based, cognitive innovations like Watson,” said Dr. John E. Kelly III, Senior Vice President and Director of IBM Research. “With this knowledge, doctors will be able to attack cancer and other devastating diseases with treatments that are tailored to the patient’s and disease’s own DNA profiles. If successful, this will be a major transformation that will help improve the lives of millions of patients around the world.”

The goal is to have the Watson genomics prototype assist clinicians in providing personalized genomic analytics information as part of a NYGC clinical research study. The solution has been under development for the past decade in IBM’s Computational Biology Center at IBM Research.


New York State’s Investment in Genomic Medicine

New York State is at the forefront of advancing medical science and commercialization. Governor Andrew M. Cuomo recently proposed $105 million to fund a partnership between NYGC and the University at Buffalo’s Center for Computational Research to advance genomics research. This investment to enhance the state’s genomic medicine capabilities, together with NYGC’s acquisition of Illumina’s state-of-the-art HiSeq X Ten whole human genome sequencing system, will accelerate the availability of valuable genomic information in New York.

“New York State’s investment in cutting-edge innovative industries is creating jobs and growing the economy in Western New York and across our state,” said Governor Cuomo. “This collaboration between the New York Genome Center and IBM will help make the region a new hub for the growing bio-tech industry.”

IBM is NYGC’s Founding Technology Member and will advance the organization’s goals of translating genomic research into clinical solutions for serious disease through the collaboration of medicine, science and technology. As biology increasingly becomes an information science, the promise of genomics is closer to reality with the help of data-driven analytics methods and more powerful computing systems. IBM and NYGC’s computational biology experts are renowned for accelerating life sciences discoveries using deep analytical approaches and next generation information technologies.

Learn more about this story at http://ibm.co/1cXTb6u.


To join the social conversation on Twitter use the hashtag #NYGCWatson.

About the New York Genome Center
The New York Genome Center (NYGC) is an independent, nonprofit at the forefront of transforming biomedical research and clinical care with the mission of saving lives. As a consortium of renowned academic, medical and industry leaders across the globe, NYGC focuses on translating genomic research into clinical solutions for serious disease. Our member organizations and partners are united in this unprecedented collaboration of technology, science, and medicine. We harness the power of innovation and discoveries to improve people’s lives – ethically, equitably, and urgently. Member institutions include: Albert Einstein College of Medicine, American Museum of Natural History, Cold Spring Harbor Laboratory, Columbia University, Cornell University/Weill Cornell Medical College, Hospital for Special Surgery, The Jackson Laboratory, Memorial Sloan-Kettering Cancer Center, Icahn School of Medicine at Mount Sinai, New York-Presbyterian Hospital, The New York Stem Cell Foundation, New York University, North Shore-LIJ, The Rockefeller University, Roswell Park Cancer Institute and Stony Brook University. For more information, visit: www.nygenome.org.

Website: www.nygenome.org
Facebook: www.facebook.com/nygenome
Twitter: @nygenome

About IBM Watson

Named after IBM founder Thomas J. Watson, Watson was developed in IBM’s Research labs and is now being accelerated into market by the new Watson Group. Watson represents a new class of software, services and apps that think, improve by learning, and discover answers and insights to complex questions from massive amounts of Big Data. Watson’s ability to answer complex questions posed in natural language with speed, accuracy and confidence is transforming decision-making across a variety of industries, including health care, financial services and retail. IBM has advanced Watson from a game-playing innovation into a commercial technology. Using natural language processing and analytics, Watson processes information akin to how people think, representing a major shift in an organization’s ability to quickly analyze, understand and respond to Big Data. Now delivered from the cloud and able to power new consumer and enterprise services and apps, Watson is 24 times faster, smarter with a 2,400 percent improvement in performance, and 90 percent smaller – IBM has shrunk Watson from the size of a master bedroom to three stacked pizza boxes. IBM is investing $1 billion to introduce a new class of cognitive computing services, software and apps, and investing $100 million to spur innovation for software application providers to develop a new generation of Watson-powered solutions. Learn more about IBM Watson at www.ibmwatson.com. Learn more about IBM Research at www.research.ibm.com.
Learn more about IBM healthcare at ibm.com/smarterhealthcare.

Media Contacts:

NYGC
Lark-Marie Antón
(646) 977-7044
lanton@nygenome.org

IBM
Christine Vu
(914) 945-2755
vuch@us.ibm.com

Watson Chromosome analysis from a genome sequencing.

Watson Chromosome Pathways.

Watson Drill Down Results Support from a genome sequencing.

Pathways: Work underway at the New York Genome Center will apply IBM Watson cognitive technology to map genome sequencing results and retrieve insights from medical literature and drug information, to find possible treatment options physicians can recommend to their patients. In this image, a cancer mutation is shown on a cell protein pathway from genome sequencing. (Photo credit: IBM)

IBM Watson Analyzes Human Genome (Infographics credit: IBM)

Original Press Release (Download PDF)

Zuckerberg and Elon Musk back software startup that mimics human learning

ORIGINAL: The Guardian
Dominic Rushe in New York
21 March 2014

San Francisco startup Vicarious aims to create ‘a computer that thinks like a person except it doesn’t need to eat or sleep’

Vicarious is developing ‘machine learning software based on the computational principles of the human brain’. Photograph: Sebastian Kaulitzki/Alamy

Some of Silicon Valley’s biggest names are backing a hitherto low-profile tech startup that aims to recreate the human neocortex as computer code. Vicarious, a four-year-old San Francisco-based startup, claims to be “building software that thinks and learns like a human”. According to the Wall Street Journal, Facebook’s Mark Zuckerberg and Tesla’s Elon Musk have just invested $40m in the company.

They join Peter Thiel, a PayPal billionaire, whose Founders Fund targets cutting edge technology. Ashton Kutcher, actor and tech investor, is also investing, as is Facebook co-founder Dustin Moskovitz.

The neocortex is the outer layer of the cerebral hemispheres and in humans is crucial to the use of the senses as well as activities such as language, motor commands and spatial reasoning.

According to the company’s website, Vicarious is developing “machine learning software based on the computational principles of the human brain. Our first technology is a visual perception system that interprets the contents of photographs and videos in a manner similar to humans. Powering this technology is a new computational paradigm we call the Recursive Cortical Network.”

The company has already managed to create software that will solve Captcha, the online tests used by many websites to distinguish humans from computers. Company founder Scott Phoenix told the WSJ that if they are successful, Vicarious will have created “a computer that thinks like a person except it doesn’t need to eat or sleep”.

Phoenix said his aim was to create a computer that can understand not just shapes and objects but the textures associated with them. He said he hopes Vicarious’s computers will learn how to cure diseases and create cheap, renewable energy, as well as performing the jobs that employ most human beings. “We tell investors that right now, human beings are doing a lot of things that computers should be able to do,” he said.

The investment comes amid a boom in funding for artificial intelligence ventures. In January, IBM announced it was investing more than $1bn to create the Watson Group, a 2,000-employee division dedicated to developing its self-learning super-computer. The money includes $100m to fund startups that find creative uses for Watson.

Earlier this week IBM announced a partnership with the New York Genome Center that will attempt to use Watson to identify the genetic components of brain cancer.

Advancing brain cancer treatment through genomics

ORIGINAL: IBM Research
By Dr. Ajay Royyuru, Director of IBM Research’s Computational Biology Center

IBM and the New York Genome Center testing Watson prototype on glioblastoma

We have put Watson to work in any number of different ways and in any number of different industries. Healthcare, though, was its first real job. It’s gone to medical school, and even studied health insurance. And now Watson is working with the New York Genome Center to launch a pilot that tackles a new medical challenge – glioblastoma.

Dr. Robert Darnell, MD, PhD, President, CEO and Scientific Director of the New York Genome Center (left) and Dr. Ajay Royyuru, PhD, Director of the Computational Biology Center, IBM Research (right)

The most common kind of brain cancer, glioblastoma annually kills 13,000 people in the US alone. Because it is a cancer of the brain, it’s difficult to take tissue samples, for one, so it can’t be examined like most other kinds of cancers. And it moves quickly. Diagnosis to death is on average only 12 months.

All cancers are a disease of the genome. It’s the genome itself that’s progressively changing from normal to abnormal when someone has cancer. When we can determine which genes start to “go bad,” we can better determine what specific treatment would work to stop it. Therein lies the challenge: How can we better understand what is happening at a genetic level?

The key to glioblastoma’s genetic code is in the human genome. So while we know our cells’ biochemical pathways, it’s also an overwhelming amount of data – billions of DNA base sequences, plus millions of studies, medical documents and clinical records.

Different kinds of brain cancers manifest in different ways and progression rates, so finding these details about glioblastoma is a molecule-sized needle in the genome haystack.

That’s why my team, with decades of research experience in biology as a data science, and NYGC, with the expertise and resources of a dozen top hospitals and medical schools, are collaborating on a project with Watson in genomics. Our goal with this prototype and ensuing studies is to assist physicians with discovering personalized treatment for patients with glioblastoma.

Watson can read millions of pages of medical literature in seconds. By applying its natural language processing and analytics to the genome, it could find connections between what’s buried in journals about the interaction of certain genes, and where those genes are in the genome. And so, in the same way Watson evaluates and hypothesizes on other medical diagnoses based on electronic health records and a doctor’s evaluation (see a demo), it could evaluate and hypothesize about mutations in a cancer cell’s genome that caused the disease, not based on a wide demographic swath of those with similar characteristics, but for an individual based on their personal genome.

Connecting medical literature to the genome 



Today, we know and have detailed medical literature on the biochemical pathways our genes take. But we don’t know where in the genome these cancerous perturbations happen in that molecular network of interactions. So, we’re loading Watson with genome data from NYGC, along with medical literature, to map out where these deviations happen. Watson will be able to see, in the context of given cancer mutations in the genome, which pathways matter, and in the context of those interactions, suggest evidence of potential treatments.
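The cross-referencing described here is internal to the Watson prototype and is not detailed in this post, so the following is only a toy sketch of the general idea: take the mutated genes found in a tumour, look up the pathways they belong to, and attach whatever literature is linked to those pathways. Every gene name, pathway and citation below is an invented placeholder.

    # Invented placeholder data: gene-to-pathway membership and
    # literature indexed by pathway.
    pathway_membership = {
        "GENE_A": ["pathway_1", "pathway_2"],
        "GENE_B": ["pathway_2"],
    }
    literature_by_pathway = {
        "pathway_1": ["Paper X (2013): drug D inhibits pathway_1"],
        "pathway_2": ["Paper Y (2012): pathway_2 activity in glioblastoma"],
    }

    def pathways_and_evidence(mutated_genes):
        """For each mutated gene in a tumour, list the pathways it sits on
        and the literature attached to those pathways."""
        report = {}
        for gene in mutated_genes:
            for pathway in pathway_membership.get(gene, []):
                entry = report.setdefault(pathway, {"genes": set(), "evidence": []})
                entry["genes"].add(gene)
                for ref in literature_by_pathway.get(pathway, []):
                    if ref not in entry["evidence"]:
                        entry["evidence"].append(ref)
        return report

    print(pathways_and_evidence(["GENE_A", "GENE_B"]))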

IBM Watson and New York Genome Center. Video: IBM SocialMedia

This journey takes clinicians from trials, to validating what genomic knowledge improves treatment, to routine analysis that helps patients. Ultimately, we want to see our partners at NYGC and physicians upload genomic data into the Watson Genome on the cloud, where the system could quickly synthesize a personalized report of available evidence of treatment options.