
Alexa’s new features will let users personalize the A.I. to their own needs

Amazon is preparing to roll out a trio of new features that will let consumers further personalize their Alexa experience by helping train the Alexa A.I. with simple tools. In a few months' time, consumers will be able to teach Alexa to identify specific sounds in their household, such as a ringing doorbell or an Instant Pot's chime. Or, for Ring users, the A.I. could notice when something has visually changed, like when a door that's meant to be closed is standing open. Plus, consumers will be able to more explicitly direct Alexa to adjust to their personal preferences, such as favorite sports teams, a preferred weather app, or dietary choices.

The features were introduced today at Amazon’s fall event, where the company is announcing its latest Echo devices and other new hardware.

The new sound identifying feature builds on something Alexa already offers, called Alexa Guard. This feature can identify certain sounds — like glass breaking or a fire or carbon monoxide alarm — which can be helpful for people who are away from home or for those who are hard of hearing or Deaf, as it helps them to know there is a potential emergency taking place. With an upgraded subscription, consumers can even play the sound of a barking dog when a smart camera detects motion outside.

Now, Amazon is thinking of how Alexa’s sound detection capability could be used for things that aren’t necessarily emergencies.

Image Credits: Amazon

With a new feature, consumers will be able to train Alexa to hear a certain type of sound that matters to them. This could be a Crock-Pot's beeping, the oven timer, a refrigerator that beeps when left open, a garage door opening, a doorbell's ring, the sound of running water, or anything else that makes a noise that's easy to identify because it sounds roughly the same each time.

By providing Alexa with six to 10 samples, customers will help it "learn" what the noise is, a big reduction from the thousands of samples Amazon has used in the past to train Alexa on other sounds. Customers will be able to teach Alexa a new custom sound directly from their Echo device or through the Alexa mobile app, Amazon says.

The enrollment and training process will take place in the cloud, but detection of the sound going forward will happen on the device itself, and Amazon says it will not send the audio to the cloud after enrollment.
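
Amazon hasn't published how this few-sample training works under the hood, but one common pattern for this kind of few-shot learning is to average embeddings of the user's clips into a "prototype" during enrollment, then compare live audio against that prototype locally. The sketch below is a minimal, hypothetical illustration of that split; the `embed` function, the threshold, and all names are assumptions for the example, not Amazon's implementation.

```python
import numpy as np

def embed(clip: np.ndarray) -> np.ndarray:
    """Stand-in for a pretrained audio encoder; here just a normalized
    magnitude spectrum so the sketch runs without any ML dependencies."""
    spectrum = np.abs(np.fft.rfft(clip, n=1024))
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

def enroll(sample_clips: list[np.ndarray]) -> np.ndarray:
    """'Cloud' step: average the embeddings of the user's handful of
    samples into a single prototype vector for the custom sound."""
    return np.mean([embed(c) for c in sample_clips], axis=0)

def detect(prototype: np.ndarray, live_clip: np.ndarray, threshold: float = 0.8) -> bool:
    """'On-device' step: flag the sound when the live audio's embedding
    is close enough (cosine similarity) to the stored prototype."""
    candidate = embed(live_clip)
    similarity = float(np.dot(prototype, candidate) /
                       (np.linalg.norm(prototype) * np.linalg.norm(candidate) + 1e-9))
    return similarity >= threshold

# Usage: enroll on a few synthetic "doorbell" recordings, then check new audio.
rng = np.random.default_rng(0)
doorbell_samples = [np.sin(2 * np.pi * 880 * np.linspace(0, 1, 16000)) +
                    0.05 * rng.standard_normal(16000) for _ in range(8)]
prototype = enroll(doorbell_samples)
print(detect(prototype, doorbell_samples[0]))         # True: same chime
print(detect(prototype, rng.standard_normal(16000)))  # False: random noise
```

The appeal of a prototype-style approach is that the heavier step (building the reference from a handful of samples) happens once, while each later check is just a similarity comparison that can run on the device.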

Once trained, users can then choose to kick off their own notifications or routines whenever Alexa hears that noise. Again, this could help from an accessibility standpoint or with elder care, as Alexa could display a doorbell notification on a Fire TV, for instance. But it could also simply serve as another way to start everyday routines: when the garage door sounds, Alexa could trigger a personalized "I'm Home" routine that turns on the lights and starts your favorite music.

Amazon says Custom Sound Event Detection will be available next year.

Along similar lines, consumers will also be able to train the A.I. in their Ring cameras to identify a region of interest in the camera feed, then determine whether that area has changed. For now, the change has to be fairly binary, like a shed door that's either open or closed; the feature may not be able to handle areas with a lot of visual variation.
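
Ring hasn't detailed how the feature classifies the region of interest, but the "fairly binary" constraint maps naturally onto a simple two-state comparison: crop the user-drawn region and ask which enrolled reference (open vs. closed) it looks closer to. The snippet below is a toy sketch under that assumption; the ROI coordinates, frames, and function names are all illustrative, not Ring's actual pipeline.

```python
import numpy as np

# Region of interest chosen by the user at setup time: (top, left, height, width).
ROI = (40, 60, 32, 32)

def crop(frame: np.ndarray, roi=ROI) -> np.ndarray:
    top, left, h, w = roi
    return frame[top:top + h, left:left + w].astype(float)

def classify_state(frame: np.ndarray,
                   closed_ref: np.ndarray,
                   open_ref: np.ndarray) -> str:
    """Label the ROI as 'open' or 'closed' by comparing it to reference
    crops captured during enrollment (nearest reference wins)."""
    patch = crop(frame)
    dist_closed = np.mean((patch - crop(closed_ref)) ** 2)
    dist_open = np.mean((patch - crop(open_ref)) ** 2)
    return "open" if dist_open < dist_closed else "closed"

# Usage: a toy grayscale "camera feed" where the shed-door region brightens
# when the door is open.
closed_frame = np.zeros((240, 320))
open_frame = closed_frame.copy()
open_frame[40:72, 60:92] = 1.0   # the door area looks different when open

new_frame = open_frame + 0.02 * np.random.default_rng(1).standard_normal((240, 320))
state = classify_state(new_frame, closed_ref=closed_frame, open_ref=open_frame)
if state == "open":
    print("Alert: the shed door appears to be open.")
```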

This functionality, called “Custom Event Alerts,” will start rolling out to Ring Spotlight Cam Battery customers in the coming months.

Finally, another Alexa feature will allow the smart assistant to learn a user’s preferences related to food, sports, or skill providers. (Skills are the third-party voice apps that run on Alexa devices.) Consumers will be able to say something like, “Alexa, learn my preferences,” to start teaching Alexa. But the learning can be done in subtler ways, too. For instance, if you ask Alexa for nearby restaurants, you could then say something like, “Alexa, some of us are vegetarian” to have steakhouses removed from the suggestions.
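
To make that flow concrete, here is a purely illustrative sketch of how a stated preference might be stored and then applied to later restaurant suggestions. The data structures, parsing, and restaurant list are invented for the example and are not Amazon's actual implementation.

```python
# Hypothetical per-user preference store; Alexa's real mechanism isn't public.
preferences = {"dietary": set()}

def record_preference(utterance: str) -> None:
    """Record an explicitly stated dietary preference from an utterance."""
    if "vegetarian" in utterance.lower():
        preferences["dietary"].add("vegetarian")

def suggest_restaurants(candidates: list[dict]) -> list[dict]:
    """Drop suggestions that conflict with stored dietary preferences."""
    if "vegetarian" in preferences["dietary"]:
        return [r for r in candidates if r["type"] != "steakhouse"]
    return candidates

nearby = [{"name": "Green Fork", "type": "vegetarian"},
          {"name": "Prime Cut", "type": "steakhouse"}]

print([r["name"] for r in suggest_restaurants(nearby)])   # both, no preference yet
record_preference("Alexa, some of us are vegetarian")
print([r["name"] for r in suggest_restaurants(nearby)])   # steakhouse filtered out
```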

Meanwhile, after Alexa learns about your favorite sports teams, the A.I. will include more highlights from the teams you’ve indicated you care about when you ask for sports highlights.

And after you tell Alexa which third-party skill you’d like to use, the A.I. assistant will default to using that skill in the future instead of its own native responses.

For now, though, only third-party weather skills are supported, but Amazon wants to expand this to more skills over time. This could help address skills' low usage, since people often can't remember which skills they want to launch. It would allow for a more "set it and forget it" type of customization, where you find a good skill, set it as your default, then just speak using natural language (e.g. "What's the weather?") without having to remember the skill by name going forward.
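
The "set it and forget it" idea essentially amounts to a routing table: once a default skill is chosen for an intent like weather, the assistant dispatches to it instead of giving its native answer. Below is a minimal, hypothetical sketch of that dispatch logic; the skill name and function names are made up for illustration and don't reflect how Alexa routes requests internally.

```python
# Hypothetical intent-to-skill routing table.
default_skills: dict[str, str] = {}

def set_default_skill(intent: str, skill_name: str) -> None:
    """Remember which third-party skill should answer this intent."""
    default_skills[intent] = skill_name

def answer(intent: str) -> str:
    """Route to the user's preferred skill if one is set; otherwise fall
    back to the assistant's native response."""
    skill = default_skills.get(intent)
    if skill is not None:
        return f"[{skill}] handling intent '{intent}'"
    return f"[native] handling intent '{intent}'"

print(answer("weather"))                                 # native response by default
set_default_skill("weather", "ExampleWeatherSkill")      # a made-up third-party skill
print(answer("weather"))                                 # now routed to the chosen skill
```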

Amazon says that this preference data is only associated with the customer’s anonymized customer ID, and it can be adjusted. For example, if a vegetarian goes back to meat, they could say “Alexa, I’m not a vegetarian” the next time Alexa returns their restaurant suggestions. The data is not being used to customize Amazon.com shopping suggestions, the company said.

This preference teaching will be available before the end of the year.

Amazon says these features represent further steps towards its goal of bringing what it calls “ambient intelligence” to more people.

Ambient A.I., noted Rohit Prasad, SVP and head scientist for Alexa, “can learn about you and adapt to your needs, instead of you having to conform to it.”

“Alexa, to me, is not just a spoken language service. Instead, it is an ambient intelligence service that is available on many devices around you to understand the state of the environment, and even acts proactively on your behalf,” he said.

