We are experimenting with offering you all major news stories of the day on one page, for easy reading. Here are your updates from June 16, 2020.
Boston Dynamics Spot robot now available for $74,500
Boston Dynamics finally let its first major robot, Spot, out of the lab last year after years of development. Now the robot maker has announced that its four-legged robot is available to US businesses for $74,500.
Though the price tag is hefty, any US firm can now go ahead and buy its very own Spot, for roughly the price of a Tesla Model S, at shop.bostondynamics.com.
For the uninitiated, the quadruped robot is said to climb stairs and traverse rough terrain with unprecedented ease, yet is small enough to use indoors. According to the company, Spot has a proven track record of supporting remote operation and autonomous sensing across a variety of industries, and is remarkably intuitive, enabling you to focus on the job you do best.
Spot is designed for use in industrial or commercial applications by trained professionals. It is not certified safe for in-home use or intended for use near children.
The Spot Explorer developer kit includes the robot, two batteries, the battery charger, the tablet controller, a robot case, a power case, and Python client packages for Spot APIs. Boston Dynamics also plans to sell Spot payloads, and customers can expect software updates “when available.”
The official website mentions that the Spot Explorer ships in six to eight weeks.
GitHub Decides To Drop Coding Terms Like ‘Master’ And ‘Slave’
As demonstrations supporting the Black Lives Matter movement continue across the country, Microsoft-owned coding site GitHub has decided to drop decades-old coding terms such as “master” from its systems. The aim is to remove references to slavery, BBC said in a report.
GitHub is home to over 50 million developers working together to discover, fork, and contribute to over 100 million projects.
As reported by BBC, Nat Friedman, CEO of GitHub, said the firm is working on changing the term “master” to a neutral term. In software, “master” refers to the main version of code that controls other copies, or processes.
Friedman made the announcement on Twitter while replying to Google Chrome developer Una Kravets, who had suggested renaming the “master” branch of a project to “main”.
“It’s a great idea and we are already working on this,” replied Friedman on Twitter.
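Repository owners who want to make the same change locally can rename the default branch with standard Git commands. A minimal sketch, assuming Git 2.28 or later for the `init -b` flag; `demo-repo` and the identity settings below are placeholders, not part of GitHub's announcement:

```shell
# Create a throwaway repository whose default branch is "master" (Git 2.28+)
git init -b master demo-repo
cd demo-repo
git config user.email "demo@example.com"
git config user.name "Demo User"
git commit --allow-empty -m "initial commit"

# Rename the local branch from "master" to "main"
git branch -m master main

# Confirm the rename
git branch --show-current   # prints "main"
```

On the hosting side, the remote's default branch must also be updated so clones and pull requests target the renamed branch.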
In a similar move, Google encourages developers to avoid the terms “blacklist” and “whitelist” in both the Chromium web browser project and the Android operating system, in favor of neutral terms for lists of things that are explicitly banned or allowed, BBC said.
Uber open sources Neuropod
Uber recently open-sourced Neuropod, an abstraction layer on top of existing deep learning frameworks that provides a uniform interface to run DL models. Now, adding support for a new framework across all of your tooling and infrastructure is as simple as adding it to Neuropod.
Uber said that Neuropod has been instrumental in quickly deploying new models at the company since its internal release in early 2019. Over the last year, Uber has deployed hundreds of Neuropod models across Uber ATG, Uber AI, and the core Uber business. These include models for demand forecasting, estimated time of arrival (ETA) prediction for rides, menu transcription for Uber Eats, and object detection models for self-driving vehicles.
Currently, Neuropod supports several frameworks including TensorFlow, PyTorch, Keras, and TorchScript, while making it easy to add new ones.
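The general pattern behind such an abstraction layer — one uniform inference interface with pluggable per-framework backends — can be sketched in plain Python. This is an illustrative stand-in, not Neuropod's actual API; the backend names and `load_model` entry point below are invented for the example:

```python
from typing import Any, Callable, Dict

# Registry mapping framework names to loader functions
# (toy stand-ins, not Neuropod's real backends).
_BACKENDS: Dict[str, Callable[[str], "Model"]] = {}


class Model:
    """Uniform wrapper: every framework's model is exposed through infer()."""

    def __init__(self, run: Callable[[Dict[str, Any]], Dict[str, Any]]):
        self._run = run

    def infer(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        return self._run(inputs)


def register_backend(name: str):
    """Adding support for a new framework is a single registration."""
    def decorator(loader: Callable[[str], Model]) -> Callable[[str], Model]:
        _BACKENDS[name] = loader
        return loader
    return decorator


def load_model(framework: str, path: str) -> Model:
    """Single entry point: callers never touch framework-specific code."""
    return _BACKENDS[framework](path)


# Two toy "frameworks" registered against the same interface
@register_backend("toy_tf")
def _load_toy_tf(path: str) -> Model:
    return Model(lambda inputs: {"out": inputs["x"] * 2})


@register_backend("toy_torch")
def _load_toy_torch(path: str) -> Model:
    return Model(lambda inputs: {"out": inputs["x"] + 1})


model = load_model("toy_tf", "/tmp/model")
print(model.infer({"x": 21}))  # {'out': 42}
```

Because every model is consumed through the same `infer()` call, tooling and serving infrastructure built against the wrapper works unchanged when a new framework backend is registered.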
Neuropod has allowed Uber to quickly build and deploy new deep learning models. With its ‘Version Selection’ feature, users can specify a required version range of a framework when exporting a model. For example, a model can require TensorFlow 1.13.1, and Neuropod will automatically run the model with the correct version of the framework using out-of-process execution (OPE). This enables users to use multiple frameworks, and multiple versions of each framework, in a single application.
Also, ‘Seal Operations’ lets applications specify when they’re “done” using a tensor. Once a tensor is sealed, Neuropod can asynchronously transport the data to the correct destination before an inference is run. This helps users parallelize data transfer and computation, the company said.
As Uber continues to expand upon Neuropod by enhancing current features and introducing new ones, it is looking forward to working with the open source community to improve the library.
Amazon launches AI-based ‘Distance Assistant’
The ongoing global pandemic has necessitated worldwide social distancing measures. To promote social distancing behavior in real time, Amazon has set out to use augmented reality to create a magic-mirror-like tool that helps warehouse workers see their physical distance from others.
Amazon said it is rolling out the new camera system called “Distance Assistant” in its warehouses. The system, consisting of a 50-inch monitor, a camera, and a local computing device, is said to provide employees with live feedback on social distancing.
The standalone unit uses machine learning models to differentiate people from their surroundings, according to an official post. Combined with depth sensors, it is said to create an accurate distance measurement between associates.
As warehouse workers walk past the camera, a monitor displays live video with visual overlays showing whether they are actually standing six feet apart. Those following social distancing guidelines are highlighted with green circles, while those violating the rules are highlighted with red circles.
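The distance check at the heart of such a system can be sketched in a few lines. Assuming the depth sensors yield each person's floor position in feet — the function name, worker labels, and coordinates below are illustrative, not Amazon's implementation:

```python
import math
from typing import Dict, Tuple

SIX_FEET = 6.0  # social-distancing threshold in feet


def classify_distances(
    positions: Dict[str, Tuple[float, float]],
) -> Dict[str, str]:
    """Return 'green' for people at least six feet from everyone else,
    'red' for anyone with at least one too-close neighbor."""
    colors = {name: "green" for name in positions}
    names = list(positions)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ax, ay = positions[a]
            bx, by = positions[b]
            # Euclidean distance on the floor plane
            if math.hypot(ax - bx, ay - by) < SIX_FEET:
                colors[a] = colors[b] = "red"
    return colors


# Example: two workers four feet apart, one standing well clear
print(classify_distances({
    "worker_a": (0.0, 0.0),
    "worker_b": (4.0, 0.0),
    "worker_c": (20.0, 0.0),
}))  # {'worker_a': 'red', 'worker_b': 'red', 'worker_c': 'green'}
```

In the real system, the hard part is upstream of this check: segmenting people from the scene with machine learning models and recovering their positions from the depth sensor, which the classification overlay then visualizes.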
Amazon said it has already finished its first Distance Assistant installations at a handful of its buildings. Next, the company is planning to deploy hundreds of these units over the next few weeks.
Additionally, the tech giant has begun the process of open-sourcing the software and AI behind its camera system to help others create their own Distance Assistants.