Google is expanding its real-time caption feature, Live Captions, from Pixel phones to anyone using the Chrome browser, as first spotted by XDA Developers. Live Captions uses machine learning to automatically generate captions for videos or audio that have none, making the web that much more accessible for anyone who’s deaf or hard of hearing.

When enabled, Live Captions automatically appear in a small, movable box at the bottom of your browser window whenever you’re watching or listening to content in which people are talking.

Chrome’s Live Captions work on YouTube videos, Twitch streams, podcast players, and even music streaming services like SoundCloud. However, Live Captions in Chrome appear to work only in English, which is also the case on mobile.

Live Captions can be enabled in the latest version of Chrome by going to Settings, then the “Advanced” section, and then “Accessibility” (you can also type chrome://settings/accessibility into the address bar to jump straight there). If you’re not seeing the feature, try manually updating and restarting your browser. When you toggle the feature on, Chrome will quickly download some speech recognition files, and captions should appear the next time your browser plays audio where people are talking.

Live Captions were first introduced in the Android Q beta, but until today, they were exclusive to some Pixel and Samsung phones. Now that they’re on Chrome, Live Captions will be available to a much wider audience.