DarwinAI of Waterloo, Ontario, has formed a strategic collaboration with Lockheed Martin to improve how the aerospace giant’s customers understand and leverage Artificial Intelligence. Founded in January 2017 by University of Waterloo academics Alexander Wong and Javad Shafiee, DarwinAI focuses on developing what the computer-science sector refers to as Explainable AI (XAI).
XAI attempts to illustrate for its users how neural networks reach decisions, an approach DarwinAI has formalized into its Generative Synthesis platform and the resulting GenSynth Explain product. The Waterloo start-up – having already worked with the likes of Audi, Intel, Nvidia (graphics processing for gaming and professional markets) and Voyage (self-driving cars) – explains that its technological approach allows enterprises to build AI they can trust.
Ian Sample, a science editor for The Guardian, described XAI in late 2017 as “methods and techniques in the application of Artificial Intelligence technology, such that the results of the solution can be understood by human experts.” Sample explains that this approach contrasts with the concept of the black box in machine learning, where even the designers of the system cannot explain why the AI arrived at a specific decision.
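To make the contrast with black-box models concrete, the sketch below shows one simple, generic explanation technique – gradient-based saliency – applied to a toy logistic model. The weights, inputs and code are illustrative assumptions only; they have no connection to DarwinAI's proprietary GenSynth technology.

```python
import numpy as np

def predict(x, W, b):
    # Toy logistic model: probability = sigmoid(W . x + b)
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

def saliency(x, W, b):
    # Gradient of the output with respect to each input feature:
    # d(sigmoid(z))/dx_i = sigmoid(z) * (1 - sigmoid(z)) * W_i
    # Larger |gradient| means the feature influenced the decision more.
    p = predict(x, W, b)
    return p * (1.0 - p) * W

W = np.array([2.0, -1.0, 0.1])   # hypothetical learned weights
b = 0.0
x = np.array([1.0, 1.0, 1.0])    # hypothetical input features

scores = saliency(x, W, b)
# Rank features by how strongly they drove this prediction.
print(np.argmax(np.abs(scores)))  # feature 0 dominates here
```

Real explainability tools face the much harder problem of attributing decisions through many nonlinear layers, but the principle is the same: trace the output back to the inputs that drove it, rather than treating the model as an opaque box.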
Sample was writing about what might best be described as AI ethics: developing fair, accountable and transparent systems that can be held to account. The concept of XAI also applies to the business world, where companies can build core growth strategies around both developing an AI platform and, ultimately, understanding its results. As DarwinAI notes in its announcement of the Lockheed Martin collaboration, the lack of understanding around AI’s decision-making process has hampered the technology’s widespread adoption.
In response to this industry-wide impasse, DarwinAI explains it has created GenSynth Explain, an “explainability” platform for deep-learning development powered by its proprietary technology. In addition to improving neural-network efficiency, DarwinAI says the platform can dramatically reduce the time it takes to produce robust and accurate models through the insights it generates.
“Explainability is a critical challenge in our industry. Understanding how a neural network makes its decisions is important in constructing robust AI solutions that our customers can trust,” said Lee Ritholtz, director and chief architect of applied artificial intelligence at Lockheed Martin. Ritholtz explains that Lockheed Martin will work with DarwinAI to identify projects across its enterprise to which XAI technology can be applied.
“Negotiating AI’s black-box problem in a practical, actionable manner is a key focus for us this year,” said Sheldon Fernandez, CEO, DarwinAI. “Our collaboration with a leader in the aerospace industry such as Lockheed Martin underscores the importance of trustworthy AI solutions.”
DarwinAI was named a cool vendor in Gartner’s October 2019 Cool Vendors in Enterprise AI Governance and Ethical Response report. CB Insights recently included DarwinAI on its 2020 AI 100 list of the 100 most promising private AI companies in the world – selected from nearly 5,000 global companies based on factors like patent activity, investor quality, news sentiment analysis, proprietary Mosaic scores, market potential, partnerships, competitive landscape, team strength and tech novelty.