Samsung Acquires Viv, AllSeen Alliance Disbands, and Bias in NLP

Yitaek Hwang

Samsung Jumps into the Voice Interface Race

Viv is the latest AI company to be acquired: TechCrunch reported earlier this week that Samsung has agreed to buy it. Although Viv has yet to launch since its celebrated demo at Disrupt NY in May 2016, it is widely regarded as a better version of Siri, built by Siri’s original creators. With Google releasing Google Home earlier this week and the other tech giants rushing to dominate the voice interface platform, what does this mean for Samsung?

Summary:

  • Viv trumps Siri and other AI assistants in two ways: 1) it connects multiple silos of information across apps, establishing an interconnected structure; 2) Viv claims to use program synthesis: it seeks to understand user intent and dynamically generate programs that handle tasks in near real time (see the sketch after this list).
  • Viv gives Samsung a chance to recapture some of the smartphone market share that plummeted after the Galaxy Note 7 battery recall, and it will be a serious upgrade over Samsung’s native NLP service.
  • With Google moving further into hardware and Apple seemingly forcing its way into the voice interface conversation, this acquisition lets Samsung create a ubiquitous voice interface across all of its products.
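As a rough illustration of what "dynamically generating a program from a user's intent" means in practice, here is a toy sketch in Python. This is not Viv's actual technology; the intent format, capability registry, and planner below are all hypothetical stand-ins for the idea of composing a small program at request time instead of routing every query to a single hard-coded app silo.

```python
# Toy sketch of intent-driven program synthesis: given a parsed user intent,
# compose a small "program" (a chain of capability functions) at request time.

def find_florist(city):                    # hypothetical capability
    return f"florist-in-{city}"

def order_flowers(vendor, occasion):       # hypothetical capability
    return f"ordered {occasion} flowers from {vendor}"

CAPABILITIES = {
    ("find_vendor", "flowers"): find_florist,
    ("purchase", "flowers"): order_flowers,
}

def synthesize_plan(intent):
    """Build an executable plan (a list of steps) from a parsed intent."""
    steps = []
    if intent["action"] == "send_flowers":
        steps.append(lambda ctx: ctx.update(
            vendor=CAPABILITIES[("find_vendor", "flowers")](ctx["city"])))
        steps.append(lambda ctx: ctx.update(
            result=CAPABILITIES[("purchase", "flowers")](ctx["vendor"], ctx["occasion"])))
    return steps

def execute(intent):
    ctx = dict(intent["slots"])
    for step in synthesize_plan(intent):
        step(ctx)
    return ctx.get("result")

# "Send my mom flowers for her birthday" -> parsed intent (parsing itself omitted)
intent = {"action": "send_flowers",
          "slots": {"city": "Chicago", "occasion": "birthday"}}
print(execute(intent))   # ordered birthday flowers from florist-in-Chicago
```

The point is that the chain of steps is assembled per request from whatever capabilities are registered, rather than being written in advance for every possible query.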

Takeaway: Tech giants are figuring out that the future of IoT will ultimately be determined by how users interact with, and integrate, all of their connected devices. Samsung SVP Jacopo Lenzi nails it in his interview: “We do see the evolution of the customer experience being enabled by AI, particularly as we continue to add devices to their system, to IoT, and the importance of something like this to really allow you just to engage with technology in the way they really want to, which is a simple conversational interface.” The next wave of IoT is not just about building chatbots or AI assistants, but about building the interface through which consumers interact with all of those devices.

+ Forbes: What Amazon gets about Smart Homes that Google yet doesn’t

Down Goes AllSeen Alliance

AllSeen Alliance, the Qualcomm-led steward of AllJoyn, voted to disband last week, according to Stacey Higginbotham. That marks the end of AllJoyn and makes yet another device discovery and communication standard for IoT obsolete, leaving Intel’s IoTivity and Google’s Weave to try to establish a network standard that makes IoT more accessible.
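For readers unfamiliar with what a "device discovery" layer actually does, the sketch below shows the bare idea in Python using plain UDP broadcast. This is not AllJoyn or IoTivity (which add security, introspection, and method invocation on top); it is only an assumed, minimal stand-in for devices announcing themselves on a local network and a hub collecting those announcements.

```python
# Generic illustration of local device discovery via UDP broadcast.
# A device announces its name/service; a hub listens and builds a device list.

import json
import socket

DISCOVERY_PORT = 50000          # hypothetical port chosen for this sketch

def announce(name, service):
    """Broadcast a one-shot announcement describing this device."""
    msg = json.dumps({"name": name, "service": service}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, ("255.255.255.255", DISCOVERY_PORT))

def listen(timeout=5.0):
    """Collect announcements heard on the local network for `timeout` seconds."""
    devices = {}
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", DISCOVERY_PORT))
        sock.settimeout(timeout)
        try:
            while True:
                data, addr = sock.recvfrom(1024)
                info = json.loads(data)
                devices[info["name"]] = {"addr": addr[0], "service": info["service"]}
        except socket.timeout:
            pass
    return devices

# Example: run listen() on a hub, and announce("lightbulb-1", "onoff") on a device.
```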


Summary:

  • Most of the engineers working on AllJoyn and the AllSeen Alliance will move to the IoTivity protocol and the Open Connectivity Foundation, backed by Intel and Samsung.
  • The end of AllJoyn leaves no established standard for IoT devices: IoTivity is still working out its certification labs, and version 1.1.1 will be the last supported AllJoyn release.

Takeaway: One of the main obstacles to consumer IoT adoption is the lack of standards. While consumers are more or less familiar with Wi-Fi, Bluetooth, and NFC, it’s rare to find someone who understands IoTivity, ZigBee, Z-Wave, or Weave. While cellular carriers rush to build their own networks for IoT, agreeing on a common standard should perhaps be higher on the industry’s priority list.

+ Wareable: Thread, ZigBee, Z-Wave: Why Smart Home standards matter
+ Medium: Three reasons carriers are building new networks for IoT

Quote of the Week

Approximately 8 million of the 319 million people in the United States read the Wall Street Journal, about 2 percent of the population. If you look at the language — standardized English — being fed into many natural language processing units, it’s based on the language of that 2 percent. And many machines literally use the venerable, business-focused newspaper to better understand the English language.

– Tonya Riley


Brendan O’Connor, assistant professor of computer science at the University of Massachusetts Amherst, points out that the way tech giants train their AI systems doesn’t necessarily reflect how people actually speak: the training data lacks diversity. Many NLP tools, such as Google’s SyntaxNet (a deep learning language-processing framework), fail to pick up the sentence constructions of non-standard dialects. As a result, Google’s search systems push websites written primarily in African-American vernacular further down the search results, simply because of the data they were trained on.

O’Connor sees this as a reflection of the bias and lack of diversity in today’s AI systems. AI systems are only as good as the data we feed them. As NLP continues to improve, it may be time to employ linguists to make sure training sets reflect not just standard English but also the slang and dialects used by the other 98% of the population.
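To make the training-data point concrete, here is a minimal sketch (with invented sentences, not real corpora) of one simple way this bias shows up: a vocabulary built only from "standard" newswire-style text treats common dialect forms as unknown tokens, so anything built on top of it starts out blind to them.

```python
# Toy illustration: a model whose vocabulary comes only from "standard" English
# text cannot even represent common dialect forms -- they show up as unknowns.

import re

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

# Stand-in for a WSJ-style training corpus (invented example sentences).
standard_corpus = [
    "the market closed higher on strong earnings",
    "the committee is going to review the proposal tomorrow",
]
vocab = {tok for sent in standard_corpus for tok in tokenize(sent)}

def oov_rate(sentence, vocab):
    """Fraction of tokens the 'trained' vocabulary has never seen."""
    toks = tokenize(sentence)
    unknown = [t for t in toks if t not in vocab]
    return len(unknown) / len(toks), unknown

# Invented test sentences: one newswire-like, one dialectal.
for sent in ["the committee is going to review the earnings",
             "they finna review it tomorrow, it's gon be a big deal"]:
    rate, unknown = oov_rate(sent, vocab)
    print(f"{rate:.0%} unknown: {unknown}")
```

Real systems use far larger corpora and subword tokenization, but the underlying issue is the same: whatever the training text underrepresents, the model handles poorly.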

The Rundown

  • Grade your multiple choice test using OMR, Python, and OpenCV — PyImageSearch
  • Lesser known Git commands — Hackernoon
  • Generate faces with deconvolution network — mind,etc
  • Building an AI Startup: Realities & Tactics — Medium

Resources

  • Udacity: Contribute to Udacity’s open source self-driving car project — GitHub
  • subpixel: A convolutional neural network implementation to increase image quality — GitHub
  • Open Images Dataset: Google’s tagged library of pictures for image recognition training — Google Research Blog
  • CNN Image Compression: Google also releases its image compression techniques using neural networks — Google Research Blog
Author
Yitaek Hwang - Senior Writer, IoT For All
Yitaek is a Senior Writer at IoT For All who loves learning about IoT, machine learning, and artificial intelligence. He graduated from Duke University with a dual degree in electrical/computer and biomedical engineering and is a huge Cameron Crazie.