Article Published: 4/29/2024
Artificial intelligence (AI) is a popular topic, one which evokes either excitement or trepidation for many people. This emerging technology has numerous implications for almost every industry and profession, and counseling is no exception. Some AI applications are already trickling into clinics and offices, with a deluge expected in the next few years.
We spoke with NBCC Director of Ethics Brenden Hargett, PhD, NCC, MAC, LCMHC, LCAS, and Jeffrey Parsons, PhD, NCC, LPCC-S, about the ethical implications of using AI technology in counseling practice. Dr. Parsons is a counselor educator and former technology professional. Recently, he worked with Hargett to develop NBCC’s Ethical Principles for Artificial Intelligence in Counseling, a supplement to the NBCC Code of Ethics.
“Individuals using technology to enhance their work is something that we've all kind of gravitated to over the last couple of years,” says Hargett. “However, counselors have to be well informed about the technology, what it's intended to do, the scope of it—all of the details—before they embrace it and adopt it. It's the counselor’s responsibility, as it relates to scope of practice, to make sure that it fits within the confines of professional responsibility and to ensure client safety.”
“The first thing is just keeping to the basics of ethical practice,” says Parsons. “If you look at the ACA Code of Ethics, the NBCC Code of Ethics, some of those underlying principles always apply. We already know client welfare has to be the priority. That hasn't changed with AI.”
To ensure client welfare and apply those same ethical principles, a counselor needs to understand any new technology before using it in practice. Evaluating an AI tool for suitability and client safety requires a counselor to research its intended applications and limitations.
Perhaps the most glaring limitation of most AI technologies concerns confidentiality.
“Most of the platforms out there are not meant for health care practice; they aren't HIPAA compliant,” cautions Parsons. “You can’t just plug client information into ChatGPT, because you're essentially feeding client data into the public domain.”
Hargett concurs: “One of the most important things is for counselors to be sensitive to how using the technology may infringe on a person's rights and confidentiality as a client. If that is not clear, if that's not safeguarded in all regards, then they should not use it.”
For ethical and legal reasons, counselors should stick to software tailor-made for their professional use. HIPAA compliance isn’t enough on its own, however. As with any tool, AI is only safe when used properly. Ethical use of any AI tool comes down to competence, another fundamental ethical principle.
“Across all of our profession’s ethical codes, competence is emphasized,” says Parsons. “Any tool that we use—and it doesn't matter what it is—does not alleviate our responsibility to our clients and our responsibility for our decisions. We can't replace our personal competence by relying on a computer to do it for us.”
Informed consent is another important issue for counselors looking at using AI technology in any capacity. Hargett explains that counselors should explain to clients “the what, the how, the whens, and the whys.”
“What is it? Really having a full understanding of the technology—and that connects to the HIPAA privacy requirements. How will it be used? What will it be used for? When will it be used? And why are you using it?”
Additionally, as part of the informed consent process, counselors must allow clients who are uncomfortable with the use of AI to opt out of such technology use.
“The client must know, ‘I do have a right to say no, I'm not interested in doing that,’” says Hargett. “That means that the counselor now has to figure out other ways to gather the information and work with that client.”
Remaining competent and practicing ethically requires not only research into any potential new applications, but also awareness of software updates and other changes to the tools a counselor already uses. Many software companies have begun including AI features in existing products, from search engines to word processors. Health care is no exception.
“I think the thing that would surprise a lot of people is how embedded AI already is in the tools that we use,” says Parsons. “Some of the EHR [electronic health record] platforms out there are starting to integrate AI. So now when I'm writing my case notes, I have a little AI assistant that is HIPAA compliant that can do a preliminary diagnosis or draft some language that would be acceptable to billing companies.”
Parsons predicts that such AI features will become prevalent in EHR technology over the next few years, as software developers and health practitioners seek to ease the documentation burden.
“Most counselors will be using AI whether they want to or not, because it'll just be baked into the tools that they use for their documentation.” However, continues Parsons, that isn’t necessarily a bad thing.
“I think AI is fantastic—if you're well-grounded in what you're doing—as a way of speeding the process up. But it may be a hindrance for students or people who are still learning counseling processes.”
The issue is that inexperienced counselors may allow AI tools to fill in for skills they haven’t yet mastered. If they aren’t knowledgeable about the technologies they use, this could happen without them even realizing it. This poses a danger to clients and to counselors’ professional development.
“What happens to the clinical skills of the counselor when they're trusting the technology to do all the work?” asks Hargett. “What happens when the lights go out, when there's no power? Do you know how to manage this client in a session and how to track and document the information without technology?”
“We have to accept the reality that AI is here,” says Parsons. “But helping, particularly, counselors-in-training to really understand the limits of it and to use it responsibly is super important. And the problem with that is most counselor educators are not savvy with this technology enough to have that conversation. Some are, but many don't understand the tech as well as the students do.”
Despite—and because of—the risks, Parsons recommends that all counselors stay informed on the applications of AI in the profession.
“I wouldn't be scared of the technology,” says Parsons, “because there are some real opportunities to make life easier for counselors and potentially for clients. There are a lot of good things that can come out of the tech, but you won't know where the bad parts are if you don't understand it.”
“I have no real issue with the technology, but we can't solely depend on the technology,” says Hargett. “What sparked this whole conversation was a conference last year, where in sessions about technology, they were pushing it, pushing it, pushing it. ‘Oh, it'll do this; it'll do that.’ And no one was talking about what's next and how this impacts the client.”
NBCC’s Ethical Principles for Artificial Intelligence in Counseling will soon be available on our website. This supplement to the NBCC Code of Ethics will provide thorough guidance for counselors to ethically and appropriately integrate AI technologies into their work.
Dr. Brenden Hargett serves as the Director of Ethics for the National Board for Certified Counselors and Affiliates, Inc. He holds a PhD in rehabilitation counseling and rehabilitation counselor education from North Carolina Agricultural and Technical State University in Greensboro, NC. Dr. Hargett is credentialed as a National Certified Counselor (NCC), Master Addictions Counselor (MAC), North Carolina Licensed Clinical Mental Health Counselor (LCMHC), and North Carolina Licensed Clinical Addictions Specialist (LCAS).
Dr. Hargett also works as adjunct faculty in the Department of Counseling at North Carolina A&T State University, among various other roles. He works tirelessly to improve and enhance behavioral health, with the intent to create a responsive and comprehensive array of resources and services.
Dr. Jeffrey Parsons is a Licensed Professional Clinical Counselor Supervisor (LPCC-S) and National Certified Counselor (NCC) with a wealth of experience in counseling. He holds a bachelor's degree in family sciences from Brigham Young University, an MS in community counseling from Portland State University, and a PhD in counselor education from the University of Iowa.
Dr. Parsons has served as a Board member and Board Chair for the Council for the Accreditation of Counseling and Related Educational Programs (CACREP). He has also served as vice-chair of the Kentucky Board of Licensed Professional Counselors, chair of the Product Development Committee for the Association for Counselor Education and Supervision, member of the NBCC Ethics Council, and a member of the CACREP External Training Committee.
Copyright ©2024 National Board for Certified Counselors, Inc. and Affiliates | All rights reserved.