This episode proposes best practices for CX leaders navigating the issue of large language model (LLM) hallucination. It was inspired by (1) conversations with several customer support and AI leaders, and (2) research into the recent failure of the chatbot used by the National Eating Disorders Association (NEDA).
To briefly summarize, CX leaders should:
This show is hosted by John Walter. He is the COO of ZMAXINC, which has advised large brands on the selection of human-agent outsourcing vendors for 27 years. Today the company also advises on the selection of AI vendors. John is also an attorney and a member of the AI, Big Data, and E-Privacy committees of the American Bar Association.
To contact or follow John on LinkedIn, here is a link to his profile: https://www.linkedin.com/in/jowalter/
To learn more about ZMAXINC, here is a link to the company website: https://www.zmaxinc.com/