Episode Transcript
[00:00:08] Hello everyone. Today, at the risk of over-discussing one of the biggest news stories in tech, I'd like to take a step back and reflect with you on what this OpenAI saga can teach us. For those of you who haven't followed it, here's a quick recap. Over the last week, the board of OpenAI fired Sam Altman, one of the founders. There was essentially a revolt among the employees and Microsoft, which reportedly holds a 49% stake in the company's for-profit arm. After a lot of negotiation and agony by the board, we end up with a new board. The old board is mostly gone; one person remains. Altman threatened to go to Microsoft and take the people with him, threatened to start a new company, and now goes back to OpenAI as the CEO. A new board is being formed, which will be a bigger board, and Microsoft will be on the board. The issue that created this mess was AI safety: a lot of the original board members were really focused on slowing down the development of AI, while Sam was focused on building a business. So my reflections on the situation: first of all, it would have been really sad if OpenAI had fallen apart the way it looked like it was going to, because that company has pioneered the commercialization of AI in a way we have never seen before. And they're so far ahead of everybody else that it would have been really disappointing to see them disappear. So I am just thrilled with how this turned out. But what do we learn from this? Okay, number one: employees are much more important than a board.
[00:01:50] Now, a lot of you maybe think to yourselves, well, I would love to be on a board someday. What an honorable thing to do. Boards are so great. That's not really true. The reason we have boards in US corporations, at least, is really one reason: to act on behalf of shareholders and evaluate and decide whether the CEO is doing his or her job. That is the main purpose of the board, along with taking legal responsibility for the many things a company could do wrong and overseeing executive compensation, diversity, and things like that. Most boards are not intimately involved in the details of the company. Some are, but most are not. And many of them have outside directors who are not even business people, or who aren't involved in the industry; sometimes they're politicians or people from the public sector. This particular board was mostly those types of people. But the employees are really the most important part of a company, especially a software company. For those of you who have ever worked in the software industry, your innovation goes home every night and decides the next day whether it's going to come in, and when it doesn't come in, your company comes to a grinding halt. As I learned during three of the various work experiences in my own career, when the top people, the most ambitious, highly educated, highly skilled people, leave, the people underneath them leave. Then you're left with the mediocre, middle-level performers, and the company stagnates. This happens all the time when companies lose their momentum. And clearly the OpenAI team was about to suffer that fate; everybody was trying to poach their people for the last 72 hours. So once again, everybody's reinvigorated, at least it appears. It's not a perfect company. I mean, if you look at the Glassdoor ratings at OpenAI, they're not really that high, around 3.9 out of five.
Google's is higher. Microsoft's is higher. Many really great companies are higher than that. But it's still an emerging, growing company. I doubt Sam Altman is going to be the CEO when they have 10,000 or 20,000 or 100,000 employees. He might; he's really more of a startup person. But the employees call the shots. And once the board realized that the employees were unhappy, they had no power at all. I mean, really, the owners of OpenAI, which include the investors and Microsoft, basically said: this isn't going to work, because we own this company and you aren't governing it well if you're alienating the employees. The second lesson here is the role of leadership. I'm not going to beat this one to death, because we just talked about it and you've all read, or at least heard of, the research we published a couple of weeks ago. But the reason everybody was upset at OpenAI is that Altman is a spiritual or inspirational leader. I'm sure there are lots of questions about this, that, and the other thing that go on there, and probably people who don't like him too. But for the most part, he has brought this company from zero to an $80 billion valuation in about eight years. Go find a leader who can do that, attract great people, build a business, do the deal with Microsoft, get the investors, and isn't an inspirational leader. I question how many people really are capable of that. Now, he was in the right place at the right time, and he is very experienced with startup business models and business operations because of his work at Y Combinator. I happen to know his brother pretty well, and I've met him once; he seems like an awfully good guy as a business person. But people weren't going to leave because they disliked the board. They were going to leave because of him.
And leaders play an enormous role in inspiration, in creating energy, in creating clarity of mission, in holding people accountable, in being approachable, and in listening and understanding what's going on in the organization. When senior leaders become disconnected or leave, it can have a dramatic impact on the performance of a company. That is why CEO succession is such a difficult job in tech. In tech, most of the CEOs of the successful companies were the founders, and they very, very carefully transitioned to other, more senior, experienced leaders over time. When I was at Sybase, which at the time was a really successful company, our CEO was a very inspirational technology guy. Everybody adored him; he was just a wonderful, wonderful human being. And when the company got big, it was very hard to find the right executive team to transition to. There was an executive team that grew the company to about $100 to $200 million, and then it started to fall apart, and there were all sorts of problems. Eventually the leadership team was replaced by a bunch of outsiders who came in, and the company went sideways and was eventually sold to SAP. So that's the second lesson: regardless of the board, the most important thing it does is select the CEO and decide whether the CEO it has is doing what he or she should be doing. There's a transition going on at Workday right now, obviously. Satya Nadella, who took over from Steve Ballmer at Microsoft, has been an exceptional CEO, one of the best in the world. You can look at Cisco, you can look at Google, you can look at Facebook, you can look at whatever company you like in any industry you like, and you'll find that the CEO is largely responsible for most of the inspiration and innovation that takes place in the company. Third, of course, is the idea of culture. And you've all read our change agility research, our research on resilience, our research on employee experience.
Culture is, in some sense, one of the most important assets a company has. Culture tends to be developed early in a company's founding, and usually it is the culture, or the mission or purpose of the company, that defines its success. And it is oftentimes very hard to change, because it creates behaviors, activities, and reward systems, and attracts certain people who then reinforce that culture. From what I can tell about OpenAI, the original founding members were focused on research, safety, and technology, not on making money or huge returns for shareholders. That obviously has changed. During the period from the founding, around 2015, I believe, to today, it has become a very important profit-making company with what is called a capped profit. So the reason OpenAI makes a profit, at least the theory that I've read and most people talk about, is that they need the money to buy all the compute cycles required to train the models. So the billions of dollars that have been thrown into OpenAI are not going to shareholders; they're really going to operations and the development of the models. There are investors, and Microsoft is an investor, and they get a capped return. So my guess is that what happened here is a culture clash between the original founding concept and what the company later became. You all probably know that Elon Musk dropped out of OpenAI and has been very critical of them for not fulfilling their original mission, as were some of the board members who are no longer there. What appears to be the story on culture is that there is, I'm sure, a lot of debate going on in the company about AI safety and working with the government. And by the way, Sam Altman went all over the world talking about safety, so he clearly has this on his mind. But this clash between the culture of the original founders and the culture of the resulting business clearly became part of the story.
I guess the other thing I would mention, for those of you who aren't in tech, is that I personally believe technology may be one of the most talent-centric industries I've worked in. Every industry has critical and strategic talent, and I'm really not trying to be biased toward tech at all. I mean, every company I've ever met or worked in had gurus who were the core of the business, and the business wouldn't really succeed without them. In technology, the difference is that the rate of change is so high that when a company or a product falls behind, it is very hard to catch up. And so there's this continuous effort to stay current as the technology platforms change. AI is very new in our world of HR. Every platform, every software company you buy from, is trying to become an AI-centric company. It's much harder for the older companies to do this, because they have tens of thousands of customers running traditionally architected systems, and it's not clear how generative AI is going to replace, supplement, or complement what they have. So they're doing a lot of work to figure out how to make those transitions happen, and that means they need really top people. We have partnered with Sana Labs, and the team we're working with on Galileo is extremely sophisticated and experienced in AI. We would not have been able to do this alone. And I personally have a lot of faith in them bringing Galileo to market; for those of you who use it, you're going to see best-of-breed, world-class AI. We've been doing a lot of work with it, and believe it or not, this thing is far more powerful than I ever thought it was, which we can talk more about later. If we fell behind, if Sana fell behind, or if Sana's AI technology fell behind, we would obviously be in a very difficult situation too. So we're all dependent on this.
[00:12:14] When you go out and buy SuccessFactors or Workday or Oracle or ADP or iCIMS or Beamery or Gloat or Eightfold or whatever it is, you've placed a bet, in a sense, on the technology savvy and sophistication of their technologists. Many times when you see a product slow down or fall behind, sometimes it's the product management, but oftentimes it's the technology team: they're not able to think through, design, or implement contemporary technologies. And most software companies slow down when they become big, because they have more people, more products, and more offerings. One example where I think time will tell is ServiceNow. ServiceNow has become a massive company. They've been extremely successful at rolling out new technology at a very high rate of speed. I have noticed things slowing down there, and I think that's natural for them; their customers can't absorb new technology as fast as they could when the company was smaller. So for those of you who are HR tech buyers, consultants, or implementers, the quality and expertise of the vendor's team is very, very important. I think there are lots of examples of companies that lose their top people and never catch up. A good example is Webex versus Zoom. I think we all loved Webex; it was very pioneering in the early days. The team that built Webex left to go do Zoom, and it's not clear if Webex will ever catch up. I could give you a lot more examples like that. So anyway, that's really what's been going on at OpenAI from my standpoint. It's a drama that's kind of fun to read about, and a little bit sleep-interrupting for those of us in the space. But ultimately, it comes down to employees, leadership, and culture. And the new board, as it's formed, will probably be bigger, have more experience in the market, and be more diverse.
And that will be a good thing for the company, because the company is obviously very, very important to all of us and to all of our technology partners who use their technology too. And despite the fact that there's a lot of competition, which there is, I think for at least the next several years, maybe a long time, OpenAI is going to be a major, major player in this new paradigm of AI. Okay, have a great weekend. For those in the US, have a great Thanksgiving, and we'll talk again next week. Bye for now.