Smart Speakers could bring contactless health monitoring by detecting abnormal heart rhythms

According to new research from the University of Washington, ordinary smart speakers could offer a contactless way to screen for irregular heartbeats. The researchers built an AI-powered system that uses sonar to pick up the tiny vibrations caused by movements of a nearby person's chest wall. If it ever reaches the market, it could change how doctors conduct telemedicine appointments by providing data that would otherwise require wearables, dedicated health hardware, or an in-person checkup.

“We have Google and Alexa in our homes all around us. We predominantly use them to wake us up in the morning or play music,” said Shyam Gollakota, a UW computer science professor and co-author of the report. “The question we’ve been asking is, can we use the smart speaker for something more useful?” Smart speaker makers could integrate the technology into existing products via software updates, the researchers say.

The researchers say their goal was to find a way to use devices people already own to push cardiology and health monitoring into the future. The system requires no hardware mounted on the chest; for a reading, you simply have to sit within two feet of the speaker.

It works by emitting audio signals into the room at a volume humans can’t hear. The pulses bounce back to the speaker, and an algorithm identifies the beating pattern produced by the person’s chest wall. A second algorithm then measures the amount of time between consecutive heartbeats. These inter-beat intervals could allow doctors to gauge how well your heart is functioning.
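The inter-beat step is simple enough to sketch. The snippet below is a minimal illustration, not the researchers' actual pipeline: it assumes heartbeat timestamps have already been extracted from the reflected audio (the genuinely hard part), and merely computes the intervals between them plus an approximate heart rate.

```python
# Rough sketch of the inter-beat-interval step described above.
# `beat_times` are hypothetical heartbeat timestamps (in seconds)
# already detected from the sonar echoes; the detection step itself
# is not shown here.

def inter_beat_intervals(beat_times):
    """Return the time between each pair of consecutive beats."""
    return [b - a for a, b in zip(beat_times, beat_times[1:])]

def mean_heart_rate(beat_times):
    """Approximate beats per minute from the average interval."""
    intervals = inter_beat_intervals(beat_times)
    return 60.0 / (sum(intervals) / len(intervals))

beats = [0.00, 0.82, 1.65, 2.49, 3.30]   # example timestamps
print(inter_beat_intervals(beats))        # intervals in seconds
print(round(mean_heart_rate(beats)))      # roughly 73 bpm
```

Variability in these intervals, rather than the average rate alone, is what would hint at an irregular rhythm, which is why the researchers focus on the inter-beat intervals themselves.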

This data was compared against readings from medical-grade ECG monitors, and the smart speaker turned out to be surprisingly accurate, deviating from the ECG only by an amount that “wasn’t medically relevant,” the researchers say. The tests were run on a developer version of an Alexa device with a low-quality speaker; the speakers in mainstream devices are likely more powerful, which could enable readings from farther away.

Via: The Washington Post

The post Smart Speakers could bring contactless health monitoring by detecting abnormal heart rhythms appeared first on Pocketnow.

Apple, Google training their voice assistants to understand people with speech disabilities

According to the National Institute on Deafness and Other Communication Disorders, approximately 7.5 million people in the U.S. have trouble using their voices, and this group is at risk of being left behind by voice-recognition technology. But it is 2021, and the push to make technology accessible to everyone is well underway: tech firms, including Apple and Google, are working to improve how their voice assistants handle atypical speech, with the goal of training them to understand everyone.

“For someone who has cerebral palsy and is in a wheelchair, being able to control their environment with their voice could be super useful to them,” said Julie Cattiau, a product manager at Google. The company is collecting atypical speech data as part of an initiative to train its voice-recognition tools. Training voice assistants like Siri and Google Assistant could improve the voice-recognition experience for a number of groups, including seniors with degenerative diseases.

Apple is working to help Siri automatically detect if someone speaks with a stutter

Apple debuted its Hold to Talk feature on hand-held devices in 2015. It gives users control over how long they want the voice assistant Siri to listen, which prevents the assistant from interrupting users who stutter before they have finished speaking. Now, Apple is working to help Siri automatically detect whether someone speaks with a stutter, and has built a bank of 28,000 audio clips from podcasts featuring stuttering to help its assistant recognize atypical speech.

Project Euphonia is Google’s initiative to test a prototype app that lets people with atypical speech communicate with Google Assistant and Google Home smart devices. It aims to train the software to understand unique speech patterns, and the company hopes the collected snippets will help train its artificial intelligence on the full spectrum of speech.

Amazon isn’t far behind with its Alexa voice assistant. The company announced Alexa integration with Voiceitt, which lets people with speech impairments train an algorithm to recognize their own unique vocal patterns.

Source: WSJ

The post Apple, Google training their voice assistants to understand people with speech disabilities appeared first on Pocketnow.

Amazon extends Spotify podcast support for Alexa devices to 11 more countries

In December last year, Amazon announced that Alexa-enabled smart devices could finally play podcasts on Spotify for both premium and free-tier users. So far, though, the neat convenience has been exclusive to the US market. That finally changes, as Amazon extends Alexa support for playing Spotify podcasts to users in 11 more countries. Owners of Alexa-enabled devices can now ask their smart device to play a podcast on Spotify in Austria, Brazil, Canada, France, Germany, India, Ireland, Italy, Mexico, Spain and the United Kingdom.

And in case you’re wondering, this new Alexa capability is available to both free and premium users. Of course, free users might have to listen to ads before the podcast episode starts on Spotify. All you have to do is say something like “Hey Alexa, play the XYZ podcast” and you’re good to go.

In addition to starting a podcast episode anew on Spotify, users can ask Alexa to play an episode again or resume from the point where they left off in the Spotify app. You can also switch to the next or previous episode, or browse episodes by when they were released, all via voice commands. Plus, you can continue playback across your other devices using the Spotify Connect feature.

Sound interesting? Follow these steps to let Amazon’s AI assistant control Spotify podcast playback on the smart devices in your home:

  • Download the Amazon Alexa app on your Android or iOS device and link it to your smart device.
  • In the app, open the menu and follow this path: Settings > Music & Podcasts.
  • Find the “Link New Service” option and select Spotify. Doing so will link your Spotify account to the Alexa account on your smart device. “If you’re in Brazil, Mexico, Germany, or the UK, tap ‘Default Services,’ and then select Spotify as the default podcast service,” adds Amazon.
  • Now, all you have to do is say the magic words “Hey Alexa, play the XYZ podcast on Spotify.”

The post Amazon extends Spotify podcast support for Alexa devices to 11 more countries appeared first on Pocketnow.

Alexa gets Live Translation to translate real-time convos on Echo devices

Amazon is introducing a new feature for Alexa-enabled Echo devices: Live Translation, which allows individuals speaking two different languages to converse with each other. The Echo device acts as an interpreter and translates both sides of the conversation.

The latest development comes from Amazon’s blog, which announced the rollout of the Live Translation feature for Alexa in the US. Yes, the feature is currently limited to the US. As of now, it supports six language pairs: English paired with Spanish, French, German, Italian, Brazilian Portuguese, or Hindi. It only works on Echo devices with the locale set to US English.

To start, ask Alexa to serve as an interpreter for the language the person you want to talk to speaks. Then, as the two of you converse, Alexa keeps translating, automatically identifying who is talking. If you own an Echo device with a screen, you’ll also see a visual translation in addition to hearing the audio. To end the conversation, say, “Alexa, stop.”

Naturally, the feature would be even more helpful if it came to the Alexa mobile app, since travelers are often in dire need of live translation when visiting foreign lands. For now, though, it is limited to Alexa-enabled Echo devices, where it will be particularly useful for shop owners who interact with foreign travelers on a daily basis.

For the unaware, this is not the first live translation feature of its kind; one has existed on Google Assistant-enabled smart devices for over a year. Google introduced interpreter mode for its Assistant at the start of 2019 and brought it to Android and iOS by the end of that year.

Amazon says it is working on adapting the neural-machine-translation engine to conversational-speech data and generating translations that incorporate relevant context, such as tone of voice or formal versus informal translations. 

The post Alexa gets Live Translation to translate real-time convos on Echo devices appeared first on Pocketnow.

Amazon Fire TV gets redesigned UI and better Alexa integration

Amazon announced a bunch of hardware in September this year, where it also teased a new experience for its Fire TV lineup. Now, the company has started rolling out the redesigned software interface to its Fire TV devices. Amazon calls it the new “Fire TV experience” and claims it is the lineup’s most significant software update yet. It includes more personalized recommendations and watch lists, brings support for up to six user profiles, and adds picture-in-picture so users can watch multiple programs at once.

First things first, the home screen has a new design that makes it easier to find content to watch, with scrolling previews that let you jump right into your favorite shows on supported streaming services. You can also access pinned apps, Live TV, and your library from the home screen. In addition, the company has introduced a ‘Find’ tab that displays browsable categories, genres, and personalized recommendations for what to watch next.

Now Alexa can recognize your voice and automatically switch profiles.

You get support for creating up to six profiles, each with its own viewing history, watch lists, settings, and more. The update also brings Kids profiles, which block content that isn’t deemed family-friendly. Alexa integration has been improved, too: the voice assistant can be used to navigate the interface, and it can now recognize a user’s voice and switch to the correct profile so they can easily access their recommended content. Moreover, answers to queries like the weather are less intrusive; the result now appears at the bottom of the screen instead of taking up the whole space.

The new Fire TV experience is rolling out to the most recent devices, like the Fire TV Stick and Fire TV Stick Lite, while it should go live on the Fire TV Stick 4K and Fire TV Cube early next year.

The post Amazon Fire TV gets redesigned UI and better Alexa integration appeared first on Pocketnow.

Amazon revamps Alexa app homescreen, adds navigation features

Amazon has updated its Alexa app with a new homescreen and updated navigation options. The homescreen gives direct access to the most frequently used app features. The update is coming to both Android and iOS devices and will roll out worldwide over the next month.

The Alexa app’s new homescreen shows personalized suggestions based on what users care about most as they continue to use the app. Further, the Alexa button has been moved to the top of the homescreen; Amazon says it is now easier to start talking to the voice assistant.

The menu, or “More” option, that was originally at the top left of the screen has likewise moved to the bottom right. It houses options like “Lists & Notes”, “Reminders & Alarms”, “Routines”, and more. The overhaul provides simpler navigation and easier access to the app’s most important features.

Via: Gadgets360

The post Amazon revamps Alexa app homescreen, adds navigation features appeared first on Pocketnow.

Amazon brings hands-free mode to Alexa app on Android and iOS

Amazon Alexa’s smartphone app has long been missing a feature that should have been part of it all along. The smart assistant offers hands-free control of all manner of devices, yet that capability was absent from the smartphone app itself. Now, Amazon has started rolling out a hands-free mode to the Alexa app on both Android and iOS.

Amazon Alexa has had its own app for a while now, and some smartphones even ship with Alexa built in. Until now, you needed to tap the Alexa button at the bottom of the app to control the assistant with your voice. That is finally changing: hands-free mode is enabled automatically when the app is opened.

To stay hands-free, all you need to do is open the Alexa app, either manually or by using another smart assistant like Google Assistant or Siri. Then you can control Alexa with your voice as you normally would on an Echo or other device: ask it to control your smart home, play music, or do anything else Alexa can do.

Via: Engadget

The post Amazon brings hands-free mode to Alexa app on Android and iOS appeared first on Pocketnow.

Apple, Amazon and Google will now work together to improve smart homes

Google, Amazon, and Apple are joining forces with other brands to build a new, unified Connected Home platform for all smart home devices.

The post Apple, Amazon and Google will now work together to improve smart homes appeared first on Pocketnow.