Episode Transcript
Speaker 1 00:00:08 Hi everyone. I'm posting today from the UK, where I'm finishing a little vacation, and I want to give you some thoughts on a very, very large meeting of The Big Reset we had last week, where close to 300 companies shared their experiences, their questions, their needs, and their interests in AI. As you might expect, after listening to all of the conversations, it's pretty clear that everyone is experimenting and starting from scratch. Not in the sense that companies aren't using AI tools, but the issues around data privacy, data security, usage, which tools to use, which tools to buy, what to develop internally versus what to buy from a vendor, the role of the human interface to the AI system, being transparent about what the system does, what it's supposed to do, and why we've brought it in so people don't worry too much about it.
Speaker 1 00:01:13 And then building the infrastructure and technical skills to use these things — all of it is very, very immature. Now, these are big companies. We're talking about large telecommunications companies, media companies, lots of healthcare companies, banks and insurance companies, asset management companies — I'm not gonna give you the company names — tech companies. And for the most part, it's a pretty consistent story: everyone has a lot of excitement about this, but how to actually use it, and where, is not clear. Let me give you a couple of thoughts in recapping all of these conversations. The first is a good, sober question: what problem are you trying to solve? Because AI, like all these other technologies, is a shiny object, it's fun to play with, and there are many, many potential use cases for it. So what many of the companies have done is say, look, we're gonna have a task force.
Speaker 1 00:02:14 We're gonna get a bunch of people to look at various technologies and tools, and then before we buy anything, we're gonna sit down and decide where to use it and what's the biggest problem we can solve. One of the companies, for example, mentioned that they bought an AI-based learning tool several years ago to experiment, gave it to a bunch of employees, and no one used it — because they didn't have time, and they weren't sure why they were supposed to use it or what they were supposed to do with it. In other words, just because AI is new doesn't mean everybody wants to play with it. It's really important for us, as the curators and, in some sense, the architects of these new tools, to sit down and figure out where the action is and where we're gonna get the most return on investment.
Speaker 1 00:02:57 For example, simple things like making it easy for recruiters and hiring managers to write job descriptions deliver a big ROI. It may not seem very exciting relative to turnover or retention analysis, but that's one that could be really big. Another one that was huge among several of the healthcare providers was AI-based scheduling. One of the biggest problems in many companies — call centers, healthcare providers, certainly retail companies — is the schedule. When is my shift? Can I get a different shift this week because I have something else I need to do? My family issues have changed, my hourly availability is not what it was last week or last month — who's gonna do all that changing for me? That's a very manual, time-wasting process. There are now AI tools that do that, and these particular healthcare companies said that they've now enabled a couple of these tools, and they work really well.
Speaker 1 00:03:52 But people are starting to ask questions like, well, am I allowed to update it? Am I allowed to change it once the AI tells me what my shift is? How are these decisions being made? Is my manager going to evaluate me based on these decisions? So as intelligent as these tools are, the issue for employees is: what does this tool mean relative to me and my job and my role and my performance and my pay? And how do I push back, or use this tool, if I don't think it's doing the right thing? And unfortunately, this is just where we are in this market. Our work with our own copilot has already shown us that there are some spectacular things that happen and some sort of disappointing things that happen. In fact, there was a funny article in Fortune magazine Bill just sent me last night: even ChatGPT is doing math incorrectly in some cases.
Speaker 1 00:04:47 So, so we can't guarantee that these tools are perfect by any means. So we have to factor in this idea that if we're going after a big problem, how are the people using the tool going to respond? Now the other thing that comes up a lot as I talk to companies that are fairly far down the learning curve, companies that might use Eightfold or Fuel 50 or Gloat or a couple of those other off the shelf products is Agile. The process of rolling out iterating, improving, fixing, getting feedback on a continuous basis because these technologies are so new, you can't think of them as an E R P like rollout, which was the old way of thinking about software, which was along the lines of let's do a bunch of testing, let's do a bunch of integration, let's configure it, let's hire some consultants, let's put together a bunch of training and let's push a button and roll it out.
Speaker 1 00:05:41 I don't think we're at that stage with any of these tools yet. They're all extremely powerful and therefore require a lot of education and training, and some of the users are not going to be ready for the capabilities they have. That means you're gonna learn and improve, and learn and improve, and learn and improve. And by doing that, the ROI is going to be spectacularly high. But if you do what that other company did — buy a tool, tell everybody how to use it, roll it out, and sit back — I don't think you're gonna be very happy. Now, another issue that came up, interestingly enough, is the roles inside of HR. Let's think about this for a minute. You buy an HR AI tool. Maybe it's a learning tool, maybe it's a recruiting tool, maybe it's a scheduling tool, maybe it's a content generation tool, whatever.
Speaker 1 00:06:29 And a bunch of the issues that are true in all other technology implementations come up: Is the data accurate? What about privacy? Who's taking responsibility for data governance? Where did the data come from? What are the ethical rules or guidelines around the decisions the system is making? How is the system making those decisions? Do we need to know how the system is making those decisions? One of the companies actually made an interesting point. They're a regulated financial services company and they use an off-the-shelf product — they actually use Eightfold — and when the New York City law came out, which was only a few weeks ago, mandating that any recruiting tool that uses AI be able to report on its bias or lack thereof, the executives just said, stop using the tool. We're not gonna use it — just like that, even though they had been using it successfully for a year or two already.
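[Editor's sketch] The bias reporting mentioned above is, at its core, a selection-rate comparison across groups. Here's a minimal, hypothetical illustration of that kind of calculation — the group labels and data are invented, and this is not Eightfold's method or the law's exact formula, just the general "impact ratio" idea:

```python
# Illustrative only: compare how often each (hypothetical) group is
# selected by a tool, relative to the most-selected group.
from collections import Counter

def selection_rates(outcomes):
    """outcomes: list of (group, selected_bool) pairs -> selection rate per group."""
    totals, selected = Counter(), Counter()
    for group, picked in outcomes:
        totals[group] += 1
        if picked:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def impact_ratios(rates):
    """Each group's selection rate divided by the highest group's rate."""
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Hypothetical screening outcomes for two made-up groups.
outcomes = [("A", True), ("A", True), ("A", False),
            ("B", True), ("B", False), ("B", False)]
rates = selection_rates(outcomes)   # A: 2/3, B: 1/3
ratios = impact_ratios(rates)       # A: 1.0, B: 0.5
```

A ratio well below 1.0 for some group is the kind of disparity such a report would surface; real audits involve far more nuance than this toy comparison.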
Speaker 1 00:07:24 So we aren't just gonna be playing with tools like video games; we're going to be thinking about these more systemic security and data management issues as well. For that reason, I am pretty sure that most of your implementations of AI are going to have to be done with the support of IT or a data management team that helps you. Now, of course, many of the people on the call are looking for advice and best practices and tips on what tools to use, and we're gonna get into that over the next couple of weeks. But unfortunately, because of the way the market is at the moment, there are fewer of those than you may think. There are some vendors that have been doing this for a long time: Paradox in the case of recruiting, Eightfold in the case of talent selection and internal mobility, Fuel 50 in the case of careers, Gloat in the case of talent marketplaces.
Speaker 1 00:08:20 But for the most part I would say it's less than a dozen companies that have uh, robust end-to-end implementations of this stuff that you can learn from. So don't expect the vendors to have completely baked off the shelf solutions yet. Some of them do, but not a lot of them. So we're gonna be dealing in a market over the next year or two where we're gonna be discovering things that are exceptionally interesting that other people may not have seen before. There's a company I've run across in Europe called Sauna s a nna, which has a fascinating end-to-end learning system based on ai. Most of you have probably not heard of it, but they've been focused mostly on the Northern European market initially. We're gonna find a lot of those and stick with us and those of you that are members or join the big reset on the next cycle and we'll continue to write about what these things are.
Speaker 1 00:09:13 Now, if you haven't read the white paper or the research report we wrote on the deep dive on AI, I really suggest you do, because what's come up in a lot of these conversations is people asking whether the tool or system they're looking at is an add-on to an existing system — which of course is the easiest thing to do, and what everybody would like — a new implementation, or an entirely new architecture. And you may or may not want to get involved in the architectural issues of what's going on under the covers of any of these systems, but I'm afraid you're going to have to. Because look at the features being added to ERPs: one of the companies, for example, uses Dayforce, which is Ceridian, and they were going to buy a talent intelligence system — and then all of a sudden Ceridian launched a bunch of talent intelligence features.
Speaker 1 00:10:02 So they're gonna use that. That's you know, obviously where most companies will try to go, which are gonna be in this situation of build versus buy. Let's suppose you have Workday. Let's suppose you have Oracle. Let's suppose you have SAP success factors, Ceridian, adp, whatever it may be. I guarantee you the vendor's going to tell you they have an AI solution for recruiting, for job descriptions, for pay, whatever it may be. And then you're gonna find a best of breed vendor that has something that's maybe two or three times more powerful. And you're gonna have to scratch your head and say, which one of these directions do we wanna go? And this is just starting to come up in these conversations. I mean, my advice here is there's two criteria that you have to consider when you have to make that decision. The first is, you know, how big is the problem you're trying to solve and does it warrant, you know, a new vendor in your stack?
Speaker 1 00:10:53 It might, I mean it may be a big problem and that the incumbent vendors' features just don't solve it well enough that you can afford to wait for them to figure out what the startups or the more innovative vendors have done. Or you may not be in that state. You may be in a state where we want some of these new features, but we're not hurting for some massive return on investment yet. So we wanna experiment with the features from the incumbent vendor first, see how well they work, and then decide if we wanna evaluate one of the new tools. So you know, that's come up and it will come up more and more. Now we're only starting this sprint now and we're gonna be doing this for five weeks. So we'll share lots more information and it's including a research report when we're finished.
Speaker 1 00:11:39 But some of the things that I've learned over the years following the space so close closely in talking to so many vendors, the first is whether you like it or not, I believe you're going to have to have someone with some architectural interest in understanding the differences architecturally between, between these systems. We for example, have been building our co-pilot on OpenAI initially, and there's some things we've found about OpenAI that were not what we thought. And so we're basically forced to go back down to an architectural decision about whether we continue to use open AI or go elsewhere, that that's not something an end user HR professional or consultant would necessarily know how to do. Second issue that comes up is clearly respons understanding responsibilities of who, one of the companies I just talked to on the big reset is a large financial services company and the person implementing the EI system for careers is not an L and D.
Speaker 1 00:12:40 The L and D leader owns skills. This talent head of talent owns careers. They're not in the same group, they're not working on the same scale or pace and they're, you know, having a hard time coordinating. So what AI does of course, is it brings a lot of data sources together to make a decision faster and more intelligently than it did before. Well the owners of those different sources of data, maybe in different places in your company, even something like scheduling in the healthcare industry, what the scheduling folks were talking about was the scheduling tool works great, but what we found out in the scheduling tool is if the onboarding process isn't done well and we don't get good data about the employee in the system when they're onboarded, we can't schedule them correctly. So we've gotta get the payroll and onboarding and recruiting folks to fulfill some of our requirements in order for us to make sure that our backend AI system works effectively.
Speaker 1 00:13:38 So you know, this whole topic of systemic HR is coming up over and over and over again and I think the more you um, look at AI solutions, the more you're going to be forced to have those discussions about how multiple stakeholders in HR contribute to the success of this exciting new thing that everybody wants to use. One of the groups is talking about ethics, one of the groups talking about employee experience, one of the groups talking about skills, another groups talking about business strategies. And it, let me mention the one that was talking about skills. I hate to say this, but I don't think any of us know enough about AI yet compared to where we will be a year from now, including the vendors. By the way, I interview a lot of vendor executives and I ask them questions that they don't know the answers to either.
Speaker 1 00:14:27 In some sense, this is a lot of experimental technology that was authored and originally developed by computer scientists, and they're fascinated by the effectiveness of it, but they don't use it <laugh>. So there's a learning process going on on the vendor side just as fast as there is on our side as users and implementers. And what I think that means for me and for you is, whichever tools or technologies you decide to use — whether internally developed, off the shelf, or something you buy from a vendor directly or your IT department buys — you really need to think about a vendor that you can talk to, that you can relate to, that is honest with you about what they've learned and what they've experienced. Because they will teach you a lot. Even though the vendors aren't necessarily experts on all the aspects of AI yet, they have customers that have gone down this learning curve, and you will learn from them. Anyway, I'm gonna stop there. It's been a fascinating couple of weeks in Europe. I have a bunch of client calls this week, and then I'll be back in California the first week of August. So those of you that are trying to reach me, I'll be around, and we will keep in touch with you on this. This is a massively important topic, and this Big Reset group is gonna be one of the most interesting ones we've done so far. Thank you.