Baidu has announced that it will start distributing a version of PaddlePaddle, its deep learning platform, under an Apache license by September 30. Although there is not much detail to share about the tool yet, the company insists that its platform is easier to use than other deep learning frameworks. This strategic move echoes what other IT giants such as Google and Facebook have done before: turning to open source is the best way to build a broader community around a technology, and the Chinese search giant knows it.
The name PaddlePaddle is derived from "Parallel Distributed Deep Learning". At the time of writing, only an alpha version has been released on GitHub, so for the moment we have to take Baidu at its word when it says that its platform will be easier to use, allowing developers to focus on high-level model structures without having to worry about low-level details. According to a spokesman for the Chinese group, programs will require less code than on other deep learning platforms. Baidu has already used it internally to develop products and technologies such as search-result ranking, large-scale image classification, optical character recognition (OCR), translation and advertising.
The community is key
The biggest advantage for Baidu in releasing parts of its work is tapping into a broader community of researchers, engineers and even enthusiasts who will correct and enrich the code. But it is not only about having hundreds of brains working happily for free; it is also a way to ensure that the platform does not remain solely in the hands of researchers, and to make deep learning tools more accessible to less experienced users. "We believe PaddlePaddle is an important addition to other deep learning tools because it is easier to use," says spokesman Cole Calisa, citing healthcare and transport as two of the many sectors that could benefit from artificial intelligence.
PaddlePaddle supports most common architectures, such as convolutional and recurrent neural networks, and can take advantage of large numbers of GPUs and CPUs. Optimization is performed at different levels of the platform to make the best use of processing power, memory and architecture, says Baidu. The company has also announced a partnership with Nvidia to work in the field of autonomous cars, probably involving a certain number of DGX-1 systems or loads of Titan X cards.
Last November, Google open-sourced TensorFlow, its second-generation deep learning system, followed by Microsoft, which did the same with its Computational Network Toolkit (CNTK). Facebook has also distributed some of its deep learning software under an open source license.
© HPC Today 2018 - All rights reserved.