Microsoft is using its annual Connect(); developers conference to make a number of AI-related announcements, including the open sourcing of one of the key pieces of its Windows Machine Learning (Windows ML) platform.

Microsoft is open sourcing the Open Neural Network Exchange (ONNX) runtime, officials said today, December 4. The ONNX runtime is an inference engine for machine-learning models in the ONNX format. Microsoft is making it available on GitHub so developers can customize the runtime, integrate it into their existing systems, and build it for a variety of operating systems.
The ONNX engine is a key piece of Windows ML. Microsoft is building this machine-learning interface into Windows 10 to try to get developers to use trained machine learning models in their Windows apps. The Windows ML inference engine can evaluate trained models locally on Windows devices, instead of requiring developers to run them in the cloud.
Microsoft and Facebook announced the ONNX format in 2017 to enable developers to move deep-learning models between different AI frameworks, including Microsoft's own Cognitive Toolkit (CNTK). More recently, Microsoft officials have started downplaying the Cognitive Toolkit in favor of Facebook's PyTorch and Google's TensorFlow, as CNBC reported last month.
When I asked Microsoft about CNBC’s report, a spokesperson provided the following statement:
“Microsoft believes an open ecosystem will help bring AI to everyone. We’re seeing traction for ONNX and Python with developers and data scientists so we are increasing our investments in those areas, while we continue to support Microsoft Cognitive Toolkit. We have nothing else to share at this time.”
Microsoft also is making its Azure Machine Learning service generally available as of today, December 4. Azure ML enables developers and data scientists to build, train and deploy machine-learning models. My ZDNet colleague Andrew Brust has more details on Azure ML.
In other AI news, Microsoft is continuing to flesh out its Azure Cognitive Services application-programming interface (API) strategy. Today, Microsoft is adding container support for its Language Understanding API (in preview form). Recently, Microsoft announced it was making several of its other Cognitive Services APIs available in containers so developers can bring these AI capabilities offline and to edge devices.
In addition, Microsoft is making the custom-translation capability of Translator Text generally available. This API allows customers to use human-translated content to build their own custom translation systems.