AI is transforming the world of work: Employee upskilling and development are key as the technology gains momentum

Updated on May 17, 2024

From left to right: Valerie Banschbach, Sarah Simmons, Laura Salerno Owens, Lissa Kohnke, Tara Dahl

Candace Bick recently hosted a wide-ranging discussion about artificial intelligence and the workforce with Valerie Banschbach, PhD, dean of the College of Arts and Sciences at Portland State University, which sponsored the event; Tara Dahl, executive director of Oregon Tech Works; Lissa Kohnke, president of Motus Recruiting & Staffing Inc.; Laura Salerno Owens, shareholder/president of Markowitz Herbold PC; and Sarah Simmons, president of R/West. Here are the highlights of that conversation.

Candace Bick: How is your organization and industry currently using AI?

Valerie Banschbach: At Portland State University, our main goal right now is to prepare students to use AI ethically in college and in entering the workforce. This will either lead to significant human progress or be another huge driver of inequality. We believe we need to be proactive and responsible in how we educate students.

At first, AI was a problem in education because students used GPT to write essays and do homework. But we can't turn our backs on AI in higher education. So the first step was to determine how to establish and maintain guardrails so that AI doesn't steal students' opportunities to learn. We developed an academic honesty policy for AI that allows faculty to decide for themselves how they want their students to work with it in the classroom. We are going to continue to develop this policy as AI evolves.

The second step was to create a class on generative AI tools, open to both faculty and students. Students are actively using AI. Many of them are already enhancing learning with AI tools.

As a third step, I organized an AI ethics and policy meeting for leaders in business, healthcare, government, and education. We discussed how to develop policies in different organizational settings. I learned a lot about the challenges workers face, which will help us advance our AI education for students. We plan to continue organizing similar events.

Tara Dahl: We are a non-profit association for technology companies in Oregon. We partner with functional business services such as a health insurance broker, a payroll company, and an SEO and website development company - things that tech companies need to grow their business and fill gaps in operational functions. Our insurance brokerage partner uses artificial intelligence to compare and contrast health insurance plans. Our data management partner uses AI to manage, analyze, and summarize large data sets. Our website development and SEO partner uses AI to improve coding efficiency and research.

AI is very much about improving efficiency, cutting costs, and helping people complete tasks that used to take them hours much faster, so they can focus on other work. I recently ran a lunch-and-learn on AI, and what our audience was saying was that AI is great for supplementing what we do, but the human element is still very much needed.

Lissa Kohnke: We use it for recruiting and staffing, both as an external recruiting partner and for internal talent acquisition teams, as well as for agility, efficiency, and cost savings. We use it most often for sourcing candidates, creating job descriptions, interview questions, automating onboarding, and candidate outreach. We see the benefit of eliminating bias by carefully crafting job descriptions.

The human element isn't going anywhere; you need to be picky when you get a response and make sure it does what you want it to do. Candidate sourcing is a job category that may be going away. A recruiter used to spend a lot of time crafting Boolean search queries - building the perfect string to put into the database so it would return profiles matching what you're looking for. Now artificial intelligence lets you paste a job description into a text box and returns matching profiles. This makes routine tasks more efficient.
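The shift Kohnke describes can be sketched in a few lines. The profiles, query, and scoring below are invented for illustration: the hand-built Boolean string is the "old way," while a toy keyword-overlap score stands in for the free-text matching that real AI sourcing tools do far more sophisticatedly.

```python
# Toy contrast: hand-crafted Boolean screening vs. free-text matching.
# All profile data and scoring here are hypothetical, for illustration only.
import re

profiles = [
    "Senior Java developer with Spring and AWS experience",
    "Marketing manager, SEO and content strategy",
    "Backend engineer: Java, Kotlin, cloud infrastructure on AWS",
]

# The old way: a recruiter hand-writes a Boolean string such as
# "(java OR kotlin) AND aws" and encodes it as a filter.
def matches_boolean(profile: str) -> bool:
    text = profile.lower()
    return ("java" in text or "kotlin" in text) and "aws" in text

# A minimal stand-in for AI matching: rank profiles by how many words
# they share with a plain-language job description.
def overlap_score(job_description: str, profile: str) -> int:
    job_words = set(re.findall(r"[a-z]+", job_description.lower()))
    profile_words = set(re.findall(r"[a-z]+", profile.lower()))
    return len(job_words & profile_words)

job = "Java engineer to build cloud services on AWS"
ranked = sorted(profiles, key=lambda p: overlap_score(job, p), reverse=True)
```

The point is not the scoring method (real tools use semantic embeddings, not word overlap) but the interface change: a free-text description replaces a hand-tuned query string.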

Laura Salerno Owens: I wouldn't be a good lawyer if I didn't talk about the risks. At Markowitz Herbold, we use it for some administrative functions, marketing, human resources, written procedures, but we don't currently use it for legal documents or disclosures. We do use it to record a meeting and write summaries. But in this case, you need to think about confidentiality. If you have a meeting with a client and it is confidential, what happens to that recording?

There are many barriers to adoption in the legal field. I would say that artificial intelligence, like much of the internet, is very likely to give you the wrong answer. You need a human element, a quality assurance process.

One of the biggest barriers to legal representation is cost. If we can find ways to make it more affordable, it will be a huge tool. Faster summaries of evidence, data sets, statutes and regulations, legal research - generative AI can be very useful. Lawyers are trying to learn more about AI because our clients are using AI. If you don't know what your clients are doing and what's happening in business innovation, it will limit your effectiveness as a lawyer.

Sarah Simmons: I work in the creative services industry. I love artificial intelligence. It has truly changed my business, especially generative AI. We've been using AI for machine learning for years, but using generative AI for content creation has been the most revolutionary. We've gone from using tools to generate creative ideas to dynamic content creation. We can now train and feed AI models with existing content and brand guidelines, and they will generate copy in the appropriate brand voice. We can personalize emails and create content for our target personas much faster. We are exploring synthetic voice generation. We were already using image generation, but it's gotten much better. Social content generation, project management - but you still need a human to creatively lead it all.

Bick: Job candidates worry about how long their jobs will last in the face of artificial intelligence. What are the main considerations your company or your client companies weigh when implementing artificial intelligence tools and processes?

Kohnke: For most people, AI will change the way they do their jobs, so the fears are not unfounded. The main thing employers can do is communicate actively with employees. With the advent of AI, there will be whole categories of people who will need to be trained - their role won't disappear, but it will change.

And there are some people who will need to be retrained as their roles disappear. It's important to identify, from a business perspective, which positions in your organization will require upskilling or potential reskilling, and then develop and communicate that plan to your employees.

Focus on your business goals and how the various AI tools align with and support your organization's strategic plan.

Salerno Owens: If you don't realize that your employees are using AI, then you are in denial. We should all be on guard, with policies in place about how employees can use it and plans for how to train them.

Bick: How do you set up guardrails and orient your employees to the ethical use of AI? What concerns should organizations have?

Salerno Owens: There are two major problems. The first is bias. Especially in generative AI, your information is only as good as the sources it relies on. It's very important that people critically evaluate: is what they're relying on unbiased and honest? Is there built-in bias or discrimination in the source material?

UNESCO has published a report finding that the output of generative AI still reflects a significant amount of gender and sexuality bias. For example, in recruitment, let's say you have a very male-dominated field and you use AI to find the best candidates. According to the report, it will take male names and put them at the top of the list, because the algorithm sees that there are a lot of men in that field and concludes that men are more desirable candidates. This doesn't mean that AI shouldn't be used, but we need to train people on how to train AI.
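The mechanism Salerno Owens describes can be made concrete with a toy sketch. Everything below (the candidates, the "historical hires," the scoring) is invented for illustration: a naive model that scores candidates by how closely they resemble past hires reproduces the historical imbalance, without ever looking at individual qualifications.

```python
# Hypothetical sketch of base-rate bias in a candidate ranker.
# All names and data are invented for illustration only.
from collections import Counter

# Historical hires in a male-dominated field (invented data: 4 men, 1 woman).
past_hires = ["m", "m", "m", "m", "f"]

# A naive "prior" learned purely from frequency of past hires.
prior = Counter(past_hires)
total = sum(prior.values())

def learned_score(candidate_gender: str) -> float:
    # Fraction of past hires sharing this attribute - pure base-rate
    # bias, with nothing about individual qualifications.
    return prior[candidate_gender] / total

candidates = [("Alice", "f"), ("Bob", "m")]
ranked = sorted(candidates, key=lambda c: learned_score(c[1]), reverse=True)
# Bob ranks first solely because past hires skewed male.
```

Real hiring models are far more complex, but the failure mode is the same: if the training data encodes a skewed history, the ranking encodes it too, which is why the panelists stress auditing sources and training the people who train the AI.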

From an ethical perspective, the first thing a business needs to consider is how you train your employees to use AI and understand what the limitations might be. The second ethical consideration is transparency. Companies need to think about how they want to inform customers about the use of AI.

Banschbach: We work with students on how to better design assignments and how to use AI as a critical thinking partner rather than a tool for plagiarism. If individual faculty decide to allow students to use AI in their work, they also require students to be transparent about how they use it.

AI will continue to exist, so an approach where you say, "No, you can't use AI tools in your work," is not going to be successful. So we try to help students think about responsible use when it's necessary.

Kohnke: I asked ChatGPT this question to find out what AI thinks we should be concerned about regarding ethics. It touched on all the topics we've already discussed. It also mentioned vigilance and humility, and recognizing the evolving nature of ethical issues. Even as we respond to today's ethical challenges, everything is moving so fast that these issues will evolve as quickly as the technology does.

Dahl: Just as companies outsource bookkeeping, they can outsource the management of artificial intelligence, set up checks and balances, and have a third party that can hold them accountable. So companies may soon be adding technical AI consultants to their payrolls or outsourcing that role.

Bick: What skills will be most important in the new AI landscape? How is higher education preparing students?

Banschbach: If we think about what jobs will be left after AI can solve technical problems, code, and improve the efficiency of analysis, what will be left for humans to do in the workplace? It seems obvious that the remaining needs focus on uniquely human skills - ethical reasoning, human empathy, understanding of society, historical and cultural awareness. The rise of artificial intelligence is a great argument in favor of a liberal arts education. At Portland State University, given our mission, we have a solid foundation for teaching students to apply AI tools for the common good.

Kohnke: Empathy, collaboration, relationship building, critical thinking, and now cultural intelligence are all key human factors. Now all these soft skills are coming to the forefront.

Bick: We have a multigenerational workforce. How do you help your employees of different ages test, try and adapt to the use of AI?

Simmons: It's easy to get overwhelmed, so it's important to break AI and tool testing into smaller chunks and create an environment for collaborative learning. We've started asking mixed-age team members to test specific things related to their job responsibilities and report back to the group. That way they feel like they have a voice. If something proves successful, we'll schedule a demonstration, and then it can be shared.

Salerno Owens: For example, attorneys have an ethical obligation to provide effective representation, which includes familiarizing themselves with new technologies. So I explain to my colleagues that it's about being the best lawyers we can be - not only fulfilling our ethical obligations, but being effective advocates. If you can present AI in a way that people like and show that it will help them do their jobs better, you will win more advocates.

Kohnke: I see this as a huge opportunity for intergenerational collaboration. The younger generation has a lot to give to their more experienced colleagues by showing them how to use AI and teaching them. And experienced employees have a lot to offer - empathy, and the critical thinking to look at a situation and analyze it rather than just entering data into ChatGPT. If you can communicate to your teams the value they bring to each other, it opens up opportunities for cross-generational collaboration.

Bick: Are there any special considerations for women and BIPOC staff as you prepare your workforce for an AI-driven world?

Dahl: In our colleges and universities, students are taught these skills. But what about those who don't attend these institutions and don't have these opportunities? What can be done so that they can upgrade their skills and use these tools to improve their lives and jobs - so that everyone can take advantage of government programs and get help building a resume, landing a job, and continuing to succeed?

Kohnke: It all comes down to representation, access, and removing bias. You have to make sure that the training you offer is not only representative but also includes peer learning communities. That representation speaks volumes and will make the training culturally relevant for these groups. Access is very important - opportunities for scholarships, funding, and networking events, and removing barriers that come from financial hardship, lack of mentorship, and systemic bias in hiring and promotion. I don't think AI training presents any more unique challenges than we already face in any type of training for these communities.

Bick: What message would you give to employers as we move toward working, at least in part, with artificial intelligence?

Simmons: My hope is that employers will identify how we can use AI tools and technologies to amplify talent and, in turn, create more value for our businesses. If artificial intelligence tools allow people to do more of what they love to do, then businesses can grow at scale.

Salerno Owens: Employers who don't educate themselves about AI are missing a huge opportunity to not only do their jobs more effectively, but to better understand the environment and customer perspectives. You need to be transparent about how you're using AI and educate yourself on how to make sure you're using it responsibly.

Kohnke: Be careful. AI is everywhere. The biggest challenge many companies face is deciding where and when to deploy AI-based solutions and developing specific use cases for generative AI, rather than just acquiring a platform.

Employers fear they will be left in the dust in terms of competitive advantage. I've seen small organizations buy a system and then not have the resources or infrastructure to implement and use it consistently, transparently and ethically. Before you buy a solution, sit down at the table, invite different experts, identify where it will have the biggest impact, and figure out what the problem is.

Dahl: If you have doubts or fears about AI, dig in and find resources to help you alleviate those fears and feel more comfortable moving into an AI-driven future. Second, engage your employees more. Employees bring a wealth of knowledge, resources, and skills to the workplace. Ask them: what are your pain points? How can we use AI tools to solve them? Implement tools in a way that not only meets the needs of your employees but also drives productivity and efficiency.

Banschbach: As we each focus on developing generative AI within our own organizations, we also really need collaboration across sectors so that we can learn from each other.
