Embedded Insiders

How to Integrate AI into the Embedded Enterprise

July 02, 2020 Brandon Lewis and Richard Nass of Embedded Computing Design, Featuring Special Guest Michael Stewart of Anaconda, Inc. Season 3 Episode 8

AI is all the rage these days and poised to disrupt nearly every industry. In fact, it already is, though less through deployed solutions than through the struggle organizations face in integrating new AI personnel, processes, tools, and workflows alongside their existing infrastructure. The situation is bad enough that a recent IDC survey reported that the majority of companies are failing in their AI initiatives.

In this episode of the Embedded Insiders, Brandon and Rich interview Michael Grant, Vice President of Services at Anaconda, an open-source-centric data science company that maintains the Anaconda distribution of the Python and R programming languages. Michael explains some of the obstacles that organizations entering the AI space should watch out for before they get started, from licensing issues to security vulnerabilities to technical strategy. He then discusses how his company's recent collaboration with the IBM Watson team can help such organizations integrate, organize, and manage their AI solution stacks, from model development to endpoint inferencing, using a package-centric architecture.

Later, Jean Labrosse is back with more “Things That Annoy a Veteran Software Engineer,” as he rants about the use of lengthy macros in the C language.

Tune in for more.


For more information, visit embeddedcomputing.com

Why is AI so Hard?
How to Integrate AI into the Embedded Enterprise
Don't Use Lengthy C Macros!