Artificial intelligence (AI) has come a long way from both a consumer and a business perspective. Siri and Alexa have brought AI to the masses, while chatbots, virtual assistants and big data analytics are making automated intelligence part of most business environments. AI still has a long way to go to reach its full potential, and the next step for the companies at the forefront is to move intelligence out of the cloud and push it to the edge and to devices.
That means smarter applications and a shift in the technology world: moving AI from the cloud to the edge. The trend is certainly in line with the growing prevalence of the Internet of Things (IoT), and it only makes sense that phones and other frequently used devices will have intelligence built into them. Facebook has been on top of its AI game, announcing FastText earlier this month. The solution is an open-source version of the company’s AI designed to understand language.
FastText works by breaking language down into small bits, using deep neural networks engineered to represent big, complex ideas by separating them into small pieces that relate to one another. The solution simplifies those relationships, essentially paring down 10,000 standard connections between ideas into 50 “meta-connections.” As a result, the application shrank from gigabytes to kilobytes, making it possible to run on phones and other devices. FastText will enable users to search Facebook posts, suggest hashtags and moderate content right on a device, for faster and more secure service that can be performed locally, without an Internet connection.
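The “small bits” idea and the shrink-to-kilobytes step can be sketched in plain Python. This is a minimal illustration, not Facebook’s actual implementation: the open-source fastText library represents words with overlapping character n-grams, and drastic size reductions in models generally come from techniques like mapping many distinct values onto a small shared codebook. The function names and the codebook values below are illustrative assumptions.

```python
def char_ngrams(word, n_min=3, n_max=6):
    """Break a word into overlapping character n-grams.

    Angle brackets mark the word boundaries, so "where" yields
    pieces such as "<wh", "whe", ..., "re>". Combining the vectors
    of these small pieces lets a model represent a big vocabulary,
    including words it has never seen, with far fewer parameters.
    """
    marked = f"<{word}>"
    grams = []
    for n in range(n_min, n_max + 1):
        for i in range(len(marked) - n + 1):
            grams.append(marked[i:i + n])
    return grams


def quantize(values, codebook):
    """Map each value to its nearest entry in a small codebook.

    Replacing thousands of distinct stored values with indices into
    a handful of shared "meta" values is one common way a model goes
    from gigabytes down to something that fits on a phone.
    """
    return [min(codebook, key=lambda c: abs(c - v)) for v in values]


print(char_ngrams("where", n_min=3, n_max=3))
# trigrams of "<where>": ['<wh', 'whe', 'her', 'ere', 're>']
print(quantize([0.1, 0.9, 0.48], codebook=[0.0, 0.5, 1.0]))
# each value snaps to the nearest codebook entry: [0.0, 1.0, 0.5]
```

The two functions mirror the article’s two claims: small reusable pieces instead of one entry per idea, and a few shared “meta-connections” standing in for many individual ones.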
Google is taking a different approach to bringing intelligence to the edge and devices, leveraging its strong cloud presence. The company this week announced a new AI chip and a complementary cloud service to deliver AI to partners and devices. The processor is designed to train and execute deep neural networks, but in a twist, Google will not sell the chip directly. Instead, the company is marketing a new cloud service, planned to launch before the end of the year, that will let businesses and developers access the processors in Google data centers to build software for phones and edge devices.
The chip has been unofficially named TPU 2.0, or Cloud TPU, and can be used both to train and to run neural networks, doing so much more quickly than existing processors. It was also designed specifically to work with TensorFlow, Google’s open-source software for running neural networks. That could prove to be a limitation for developers accustomed to other software engines, however.
Intel is also reportedly working on a dedicated AI chip, while Amazon and Microsoft offer GPU processing through their cloud services. Those services use large farms of GPU chips, which were originally developed for gaming graphics and other uses, to train neural networks.
There’s no telling which AI model will win in the end, but all the players seem to agree that faster, more efficient neural network technology that can be pushed to the edge is the direction the market is taking. How that intelligence is delivered to the edge and devices, and how well it performs in the long run, remains to be seen.
If you’d like to learn more about AI, be sure to check out TMC and Crossfire Media’s newest conference and expo, Communications 20/20, happening July 18-20 at Caesars Palace in Las Vegas. The event will focus on the next wave of technology and innovations that will transcend the importance of person-to-person contact, disrupting the future of the entire communications industry. Find out more HERE.