
How NLP Tools from Microsoft, Google & Apple Are Making Our World More Hands-free

August 4, 2020

The world is in an era of transition. We’re actively working toward a reality where we can do almost everything hands-free, and thanks to advances in natural language processing (NLP) tools from Microsoft, Google, and Apple, the sky’s the limit. The underlying developer components for handling language input are becoming a heavy focus. These tools let us interact with our applications by voice as precisely as if we were tapping or clicking away on a screen. It may not be noticeable, but natural language processing toolkits are becoming increasingly essential to our online activity.

The COVID-19 pandemic has played a major role in opening our eyes to how easily we spread germs during our day-to-day routines. Through voice, we can both reduce infectious transmission and more easily interact with the app-driven processes in our lives. So why not make the most of such technology? Here, we’ll take a quick look at why hands-free tech is important. Then we’ll discuss some specific changes in popular development platforms that give users unprecedented control over their apps.

Why natural language processing and hands-free tech are so important

There is a slew of content and data circulating about just how dirty we are. The average human picks up and transmits all kinds of gross stuff just by existing. And because we’re always touching our phones, these devices offer a revealing cross-section of the icky things we collect.

Of course, humans have been exchanging the unseen since before anthropologists would have even considered us human. As one writer puts it, we are a “superorganism” made up of trillions of little beings that help us, harm us, and just kind of exist. Transferring these bits of ourselves improves our immune systems in a lot of ways. That benefit doesn’t hold, however, when infectious diseases are running rampant.

Rather than go through widespread shutdowns that adversely affect our economy, reducing how much we physically touch things, with the help of natural language processing toolkits, seems to be a viable way of combating disease. In life-or-death situations, such as those created by COVID-19, mitigating disease without crippling economies can be as simple as reducing the amount of physical interaction between individuals.

It will take some time, but the developers of the software we use in our daily lives can play a role in reducing how much we handle our dirty devices, as well as how much we physically interact with each other. While it’s not a perfect solution, it’s a step in the right direction.

NLP toolkits offered by Microsoft, Google, and Apple that make us more hands-free

Blue Label Labs has been deeply involved in creating NLP integrations for the apps we build by utilizing Azure Cognitive Services. On the Microsoft NLP side, the Azure platform gives us access to a powerful AI engine that allows us to develop speech-to-text and language translation components with relative ease.
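To give a rough idea of what that looks like in practice, here’s a minimal sketch in Swift that transcribes a single utterance from the microphone using Azure’s Speech SDK. The subscription key and region are placeholders you’d swap for your own Cognitive Services credentials.

```swift
import MicrosoftCognitiveServicesSpeech

// A minimal speech-to-text sketch using the Azure Speech SDK's Swift bindings.
// "YourSubscriptionKey" and "westus" are placeholders for real credentials.
func recognizeFromMicrophone() {
    do {
        let config = try SPXSpeechConfiguration(subscription: "YourSubscriptionKey",
                                                region: "westus")
        // With no audio configuration supplied, the default microphone is used.
        let recognizer = try SPXSpeechRecognizer(speechConfiguration: config)

        // Blocks until a single utterance has been recognized.
        let result = try recognizer.recognizeOnce()
        print("Recognized: \(result.text ?? "")")
    } catch {
        print("Speech recognition failed: \(error)")
    }
}
```

From there, the recognized text can drive whatever the app needs, whether that’s filling in a form or scrolling a teleprompter.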

In the past, we developed an app known as PromptSmart, a teleprompter service that uses speech recognition to auto-scroll a text document while the user speaks. Back in 2014, the only viable natural language processing toolkit at our disposal was known as OpenEars. The platform provided exactly what we needed to develop the app, with extraordinary results. Today, we can use Microsoft’s cloud-based NLP tools to handle speech-to-text and language translation, a vast improvement over the tools we used some 6 years ago. Though OpenEars was phenomenal for its time, it was supported by a sole developer in Germany. We are grateful to be in a much different environment these days.

Thankfully, trillion-dollar companies like Google and Apple are offering natural language processing toolkits for their platforms. These make developing hands-free apps as easy as possible.

Recently, Google has opened up Google Assistant by providing the Actions SDK, which allows developers to work with their preferred tools for Google NLP. This platform gives developers a system that enables “local authoring of Natural Language Understanding (NLU, a close cousin of NLP focused on extracting meaning) and conversation flows.” It also imports data from other sources. Google has added a feature known as Home Storage that allows developers to persist conversation data across sessions for an entire household. Its Continuous Match Mode also lets an Action listen continuously and respond in real time to predefined words and phrases, which makes it well suited to long-form content.

Another new feature Google has introduced alongside the Actions SDK is Actions Builder. This useful Google NLP tool allows developers to build conversation flows visually using drag-and-drop tools. The graphical interface puts everything in one place, so it’s easier to analyze conversations, and it makes debugging and testing more centralized by removing the need for developers to switch back and forth between separate consoles.

Apple has improved its offerings by expanding SiriKit Media Intents, which allow developers to build in requests for hands-free operation via Siri as part of Apple NLP. For example, the INPlayMediaIntent intent allows developers to build solutions for playing audio: users can say something like “play song [in my app]” to access media through Siri. There is a similar intent called INSearchForMediaIntent that allows people to search for specific media within an app. You can say something like “find Bayside in my app” (if you were looking for that particular band) to find items associated with Bayside.
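For a sense of what this involves on the developer side, here’s a minimal sketch of a SiriKit Intents-extension handler for playback requests. The mediaItem(forSpokenPhrase:) helper is hypothetical, standing in for a lookup into your app’s own catalog, and INSearchForMediaIntentHandling follows the same pattern for search requests.

```swift
import Intents

// A minimal sketch of a SiriKit Intents-extension handler for INPlayMediaIntent.
class IntentHandler: INExtension, INPlayMediaIntentHandling {

    // Match the spoken phrase (e.g. "play Bayside in my app") to a media item.
    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        guard let name = intent.mediaSearch?.mediaName,
              let item = mediaItem(forSpokenPhrase: name) else {
            completion([INPlayMediaMediaItemResolutionResult.unsupported()])
            return
        }
        completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
    }

    // Tell Siri the app itself will start playback in the background.
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }

    // Hypothetical lookup into the app's own media catalog; a real app
    // would search its library rather than just wrapping the phrase.
    private func mediaItem(forSpokenPhrase phrase: String) -> INMediaItem? {
        return INMediaItem(identifier: phrase, title: phrase, type: .song, artwork: nil)
    }
}
```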

Ultimately, all of these NLP tools from each platform will enable users to live more hands-free. While the capabilities are somewhat limited at this time, there is the potential to use these commands for much more than entertainment. In smart homes, for example, people can already interact with connected devices by voice through Siri and HomeKit, Apple’s smart home framework.

With improved development on the backend, you’ll be able to invoke more specific commands when interacting with, say, a smart thermostat. Right now, you can use voice commands to turn things on and off, but you can’t negotiate more specific tasks, like setting your thermostat to a certain temperature during a specified period. In the future, we’ll see more advanced operations become possible for apps that can be controlled through voice commands, whether they were developed using natural language processing toolkits from Microsoft, Google, or Apple.
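On the app side, the building blocks for those commands already exist. Here’s a sketch of how an iOS app might set a thermostat’s target temperature through HomeKit, assuming a home has already been discovered via HMHomeManager:

```swift
import HomeKit

// A sketch of the kind of specific command described above: setting a
// thermostat's set point through HomeKit. Assumes `home` was already
// discovered via HMHomeManager and contains a thermostat accessory.
func setThermostat(in home: HMHome, toCelsius degrees: Double) {
    // Find the first thermostat service among the home's accessories.
    let thermostat = home.accessories
        .flatMap { $0.services }
        .first { $0.serviceType == HMServiceTypeThermostat }

    // Locate its target-temperature characteristic.
    guard let target = thermostat?.characteristics.first(where: {
        $0.characteristicType == HMCharacteristicTypeTargetTemperature
    }) else {
        print("No thermostat found in \(home.name)")
        return
    }

    // Write the new set point; HomeKit relays it to the accessory.
    target.writeValue(degrees) { error in
        if let error = error {
            print("Failed to set temperature: \(error)")
        }
    }
}
```

Wiring a call like this up to a Siri request, or extending it to hold a temperature for a specified period, is exactly the kind of richer voice control the platforms are working toward.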

We can build your app to handle complex voice commands

At Blue Label Labs, we’re constantly expanding how we develop applications by embracing forward-thinking processes to satisfy every possible end user. Our technological prowess dates back to the days we built PromptSmart. Our current endeavors include developing Alexa Skills and software that uses the Azure Cognitive Services API, and we have a wealth of actionable knowledge of NLP toolkits. Get in touch to learn how Blue Label Labs can build your product to be part of this new, hands-free world.
