‘Evolution of Software Testing: From Brick Phones to AI’ with Leigh Rathbone, Head of Quality Engineering at CAVU

In the latest episode of ‘groCTO: Originals’ (formerly ‘Beyond the Code: Originals’), host Kovid Batra engages with Leigh Rathbone, Head of Quality Engineering at CAVU, who has a rich technical background with reputable organizations like Sony Ericsson and The Very Group. He has been at the forefront of tech innovation, working on the first touchscreen smartphone and smartwatch, and later with AR, VR, & AI tech. The conversation revolves around ‘Evolution of Software Testing: From Brick Phones to AI’.

The podcast begins with Leigh introducing himself and sharing a life-defining moment in his career. He then highlights the shift from manual testing to automation, discussing in depth the bespoke automation framework Sony Ericsson built for touchscreen smartphones 19 years ago. Leigh also addresses the challenges of implementing AI and how to motivate teams to explore automation opportunities, and he discusses the evolution of AR, VR, and 3D gaming & their role in shaping modern-day testing practices, emphasizing the importance of health and safety considerations for testers.

Lastly, Leigh offers parting advice urging software makers & testers to always prioritize user experience & code quality when creating software.

Timestamps

  • 00:06 - Leigh’s Introduction
  • 01:07 - Life-defining Moment in Leigh’s Career
  • 04:10 - Evolution of Software Testing
  • 09:20 - Role of AI in Testing
  • 11:14 - Conflicts with Implementing AI
  • 15:29 - Adapting to AI with Upskilling
  • 21:02 - Evolution of AR, VR & 3D Gaming
  • 25:45 - Unique Value of Humans in Testing
  • 32:58 - Conclusion & Parting Advice

Episode Transcript

Kovid Batra: Hi, everyone. This is Kovid, back with another episode of Beyond the Code by Typo. Today, we are lucky to have a tech industry veteran with us on our show. He is the Head of Quality Engineering at CAVU. He has had a fascinating 25-plus years of engineering and leadership experience, working on cutting-edge technologies including the world's first smartphone and smartwatch. He was also involved in the development of progressive download and DRM technologies that laid the groundwork for modern streaming services. We are grateful to have you on the show, Leigh.

Leigh Rathbone: Thank you, Kovid. It's great to be here. I'm really happy to be invited. I'm looking forward to sharing a few experiences and a few stories in order to hopefully inspire and help other people in the tech industry. 

Kovid Batra: Great, Leigh. And today, I think we would have a lot of things to deep dive into and learn from you, from your experience. But before we go there, where we talk about the changing landscape of software testing, coming from brick phones to AI, let's get to know a little bit more about each other. Can you just tell us something about yourself, some of your life-defining experiences, so that I and the audience can know you a little more?

Leigh Rathbone: Yeah. Well, I'm Leigh Rathbone. I live in the UK, uh, in England. I live just North of a city called Liverpool. People might've heard of Liverpool because there's a few famous football teams that come from there, but there's also a famous musical band called the Beatles that came from Liverpool. So, I live just North of Liverpool. I have two children. I'm happily married, been married for over 20 years. I am actually an Aston Villa football fan. I don't support any of the Liverpool football clubs. I'm not a cricket fan or a rugby fan. It's 100 percent football for me. I do like a bit of fitness, so I like to get out on my bike. I like to go to the gym. I like to drink alcohol. Am I allowed to say that, Kovid? Am I allowed to say that? I do like a little bit of alcohol. Um, and like everybody else, I think I'm addicted to Netflix and all the streaming services, which is quite emotional for me, Kovid, because having played a part in the building blocks and a tiny, tiny part in the building blocks of what later became streaming, when I'm listening to Spotify or when I'm watching something on Amazon Video or Netflix, I do get a little bit emotional at times thinking, "Oh my God! I played a minute part of that technology that we now take for granted." So, that's my sort of out-of-work stuff that, um, I hope people will either find very boring or very interesting, I don't know. 

Kovid Batra: No, I definitely relate to it and I would love to know, like, which was the last, uh, series you watched or a movie you watched on Netflix and what did you love about it? 

Leigh Rathbone: So, I watched a film last night called 'No Escape'. Um, it's a family that goes to, uh, a country in Asia and they don't say the name of the country for legal reasons. Um, but they get captured in a hotel and it's how they escape from some terrorists in a hotel with the help of Pierce Brosnan, who's also in the film. So, yeah, it was, uh, it was high intensity, high energy and I think that's probably why I liked it, because from almost the first 5-10 minutes, it's like, whoa, what's going on here? So, it was called 'No Escape'. It's on Netflix in the UK. I don't know whether it'll be on Netflix across the world. But yeah, it's an old film. It's not new. I think it's about three years old. But yeah, it was quite enjoyable.

Kovid Batra: Cool, cool. I think that that's really interesting and thank you for such a quick, sweet intro about yourself. And of course, your contributions are not minute. Uh, I'm sure you would have done that in that initial stage of tech when the technology was building up. So, thanks on behalf of the tech community there. 

Uh, all right, Leigh, thank you so much and let's get started on today's main topic. So, you come from a background where you have seen the evolution of this landscape of software testing and as I said earlier, also like from brick phones to AI, I'm sure, uh, you have a lot of experiences to share from the days when it all started. So, let's start from the part where there was no automation, there was manual testing, and how that evolved from manual testing to automation today, and how things are being balanced today because we are still not 100 percent automated. So, let's talk about something like, uh, your first smartphone, uh, maybe where you might not have had all the automation or sophisticated testing tools possible. How was your experience in that phase?

Leigh Rathbone: Well, I am actually holding up for those people that, uh, can either watch the video. 

Kovid Batra: Oh my God! Oh my God! 

Leigh Rathbone: I'm holding up the world's first touchscreen smartphone and you can see my reflection and your reflection on the screen there. This is called the Sony Ericsson P800. I worked on this in 2002 and it hit the market in 2003 as the world's first touchscreen smartphone, way before Apple came to the market. But actually, if I could, Kovid, can I go back maybe four years before this? Because there's a story to be told around manual testing and automation before I got to this, because there is automation, there is an automation story for this phone as well.

But if I can start in 1999, I've been in testing for 12 months and I moved around a lot in my first four years, Kovid, because I wanted to build up my skill sets and the only way to do that was to move jobs. So, my first four years, I had four jobs. So, in 1999 I'm in my second job. I'm 12 months into my testing career and I explore a tool called WinRunner. I think it was the first automation tool. So, there I am in 1999, writing automation scripts without really knowing the scale of what was going to unfold in front of not just the testing community, but the tech community. And when I was using this tool called WinRunner... Oh, Kovid, it was horrible. Oh my God! So, I would be writing scripts and it was pretty much record and playback, okay? So, I was clicking around, I was looking at the code, I was going, "Oh, this is exciting." And every time a new release would come from the developers, none of my scripts would work. You know the story here, Kovid. As soon as a new release of code comes out, there's bug fixes, things move around on the screens, you know, different classes change, there might be new classes. This just knocks out all of my scripts. So, I'd spend the next sort of, I don't know, eight days, working, reworking, refactoring my automation scripts. And then, I'd just get to the point where I was tackling new scripts for the new code that dropped and a new drop of code would come. And I found myself in this cycle in 1999 of using WinRunner and getting a little bit excited but getting really frustrated. And I thought, "Where is this going to go? Has it got a future in tech? Has it got a future in testing?" 'Cause I'm not seeing the return on investment with the way I was using it.

So, at that point in time, 1999, I saw a glimpse, a tiny glimpse of the future, Kovid. And that was 25 years ago. And for the next couple of years, I saw this slow introduction, very, very slow back then, Kovid, of manual testing and automation. And the two were very separate, and that continued for a long, long time, whereby you'd have manual testers and automation testers. And I feel that's almost leading and jumping ahead because I do want to talk about this phone, Kovid, because this phone was touchscreen, and we had automation in 2005. We built an automation framework bespoke to Sony Ericsson that would do stress testing, soak testing, um, you know, um, it would actually do functional automation testing on a touchscreen smartphone. Let that sink in: 19 years ago, we built a bespoke automation framework for a touchscreen smartphone. Let that sink in, folks.

Kovid Batra: Yeah. 

Leigh Rathbone: Unreal, absolutely unreal, Kovid. Back in the day, that was pretty much unheard of. Unbelievable. It still blows my mind to this day that in 2005, 19 years ago, on a touchscreen smartphone, we had an automation framework that added loads and loads of value. 

Kovid Batra: Totally, totally. And was this your first project wherein you actually had a chance to work hands-on with this automation piece? Like, was that your first project? 

Leigh Rathbone: So, what happened here, Kovid, and this is a trend that happened throughout the testing and tech industry right up until about, I'd say that seven years ago, we had an automation team and a manual team. I'll give you some context for the size. The automation team was about five people. The manual test team was about 80 people. So, you can see the contrast there. So, they were doing pretty much what I was doing in 1999. They were writing some functional test scripts that we could use for regression testing. Uh, but they were mostly using it for soak testing. So in other words, random touches on the screen, these scripts needed to run for 24 hours in order for us to be able to say, "Yes, that can, that software will exist in live with a customer touching the screen for 24 hours without having memory leaks, as an example." So, their work felt very separate to what we were doing. There was a slight overlap with the functional testing where they'd take some of our scripts and turn them into, um, automated regression packs. But they were way ahead of the curve. They were using this automation pack for soak testing to make sure there was no memory leaks by randomly dibbing on a screen. And I say dibbing, Kovid, because you touched the screen with a dibber, right? It wasn't a finger. Yeah, you needed this little dibber that clicked onto the side of the phone in order to touch the screen. So, they managed to mimic random clicks on the screen in order to test for memory leaks. Fascinating. Absolutely fascinating. So at that point, we started to see a nice little return on investment on automation being used. 
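For readers who haven't come across soak testing, the idea Leigh describes boils down to a long loop of random input with a memory check at the end. Here is a minimal sketch in TypeScript, assuming a hypothetical Device driver; the real Sony Ericsson framework was bespoke and nothing like it is publicly documented:

```typescript
// A minimal sketch of the soak-testing idea: fire random taps at a device
// for a fixed duration and compare memory use before and after.
// The `Device` interface is hypothetical, standing in for whatever driver
// your platform actually exposes.

interface Device {
  tap(x: number, y: number): Promise<void>;
  memoryUsageMb(): Promise<number>;
}

async function soakTest(device: Device, hours: number, screenW: number, screenH: number) {
  const deadline = Date.now() + hours * 60 * 60 * 1000;
  const startMem = await device.memoryUsageMb();

  while (Date.now() < deadline) {
    // A random "dib" somewhere on the screen, like the stylus taps in 2005.
    const x = Math.floor(Math.random() * screenW);
    const y = Math.floor(Math.random() * screenH);
    await device.tap(x, y);
  }

  const endMem = await device.memoryUsageMb();
  // Memory that keeps climbing over 24 hours is the classic leak signal.
  console.log(`Memory grew from ${startMem} MB to ${endMem} MB over ${hours}h`);
}
```

Modern platforms ship off-the-shelf equivalents (Android's monkey exerciser works on the same random-event principle), but the shape of the loop is the same.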

Kovid Batra: Got it. Got it. And from there, how did it get picked up over the years? Like, how have teams collaborated? Was there any resistance from, of course, every time this automation piece comes in, uh, there is resistance also, right? People start pointing things. So, how was that journey at that point? 

Leigh Rathbone: I think there's always resistance to change and we'll see it with AI. When we come on to the AI section of the talk, we'll see it there. There will always be resistance to change because people go through fear when change is announced. So, if you're talking to a tester, a QA or a QE and you're saying, "Look, you're going to have to change your skill sets in order to learn this." They're gonna go through fear before they spot the opportunity and come up the other side. So, everywhere I've gone, there's been resistance to automation and there's something else here, Kovid, from the years 1998 to 2015, test teams were massive. They were huge. And because we were in the waterfall methodology, they were pretty much standalone teams and all the people that were in power of running these big teams, they had empires, and they didn't want to see those empires come down. So actually, resistance wasn't just sometimes from testers themselves, it was from the top, where they might say, "Oh, this might mean that the number of testers I need goes down, so, therefore, my empire shrinks." And there were test leaders out there, Kovid, doing that, very, very naughty people. Like, honestly, trying to protect their own, their own job, instead of thinking about the future. So, I saw some testers try and accelerate the use of automation. I also saw test leaders put the brakes on it because they were worried about the status of their jobs and the size of their empires. 

Kovid Batra: Right. And I think, fast-forward to today, we won't take too long to jump to the AI part here. Like, a lot of automation is already in place. According to your, uh, view of the tech industry right now, uh, let's say, if there are a hundred companies, out of those hundred, how many are at a scale where they have done, like, 80 percent or 90 percent automation of testing?

Leigh Rathbone: God! 80 to 90 percent automation of testing. You'll never ever reach that number because you can do infinite amounts of testing, okay? So, let's put that one out there. The question still stands up. You're asking of 100 companies, how many people have automation embedded in their DNA? So I would probably, yeah, I would probably say it's in the region of 70 to 80 percent. And I'd be, I wouldn't be surprised if it's higher, and I've got no data to back that up on. What I have got to back that up on is the fact that I've worked in 14 different companies, and I spend a lot of time in the tech industry, the tech communities, talking to other companies. And it's very rare now that you come across a company that doesn't have automation. 

But here's the twist, Kovid, there's a massive twist here. I don't employ automation testers, okay? So 2015, I made a conscious effort and decision not to employ automation testers. I actually employed testers who can do the exploratory side and the automation side. And that is a trend now, Kovid, that really is a thing. Not many companies now are after QAs that only do automation. They want QAs that can do the exploratory, the automation, a little bit of performance, a little bit of security, the people skills is obviously rife. You know, you've got to put those in there with everything else. 

Kovid Batra: Of course. 

Leigh Rathbone: Yeah. So for me now, this trend that sort of I spotted in 2014 and I started doing in 2015 and I've done it at every company I've been to, that really is the big trend in testers and QAs right now. 

Kovid Batra: Got it. I think it's more like, uh, it's an ever-growing, evolutionary discipline, right? Uh, every time you explore new use cases, and it also depends on the kind of business, the products the company is rolling out. If there are new use cases coming in, if there are new products coming in, you can't just have everything automated every time. So yeah, I mean, uh, having that 80-90% of testing automated is something quite far-fetched for most of the teams, until and unless you are stagnated over one product and you're just running it for years and years, which is definitely not, uh, sustainable for any business.

So here, my question would be, like, how do you ensure that your teams are always up for exploring that side which can be automated and making sure that it's being done? So, how do you make sure? One is, of course, having the right hires in the team, but what are the processes and what are the workflows that you implement there from time to time, so that people are, of course, doing the manual testing also, and for the existing use cases where they can automate, they're doing that as well?

Leigh Rathbone: It's a really good question, Kovid, and I'll just roll back in the process a little bit because for me, automation is not just the QA person's task and not even test automation. I believe that is a shared responsibility. So, quality is owned by everybody in the team and everyone plays their different parts. So for me, the automation starts right with the developers, to say, "Well, what are you automating? What are your developer checks that you're going to automate?" Because you don't want developers doing manual checks either. You want them to automate as much as they can because at the unit test level and even the integration test level, the feedback loops are really quick. So, that means the test is really cheap. So, you're getting some really good, rich feedback initially to show that nothing obvious has broken early on when a developer adds new code. So, that's the first part. So, that now, I think is industry standard. There aren't many places where developers are sat there going, "I'm not going to write any tests at all." Those days are long, long gone, Kovid. I think all, you know, modern developers that live by the modern coding principles know that they have to write automated checks.
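The point about cheap, fast feedback at the unit level is easiest to see in something very small. A generic sketch follows; the function and values are invented for illustration and aren't taken from any company mentioned here:

```typescript
// A unit-level check with a feedback loop measured in milliseconds.
// `applyDiscount` is a made-up function standing in for any small piece of logic.
import { describe, expect, test } from '@jest/globals';

function applyDiscount(price: number, percent: number): number {
  return Math.round(price * (1 - percent / 100) * 100) / 100;
}

describe('applyDiscount', () => {
  test('takes 10% off the price', () => {
    expect(applyDiscount(200, 10)).toBe(180);
  });

  test('leaves the price alone at 0%', () => {
    expect(applyDiscount(99.99, 0)).toBe(99.99);
  });
});
```

A suite like this runs on every commit and fails within seconds of a developer breaking the logic, which is exactly the short, cheap feedback loop being described.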

But I think your question is targeted at the QAs. So, how do we get QAs involved? So, you have to breed the curiosity gene in people, Kovid. So, you're absolutely right. You have to bring people in who have the skills because it's very, very hard to start with a team of 10 QAs where no one can automate. That's really hard. I've only ever done that once. That's really hard. So, what I have done is I've brought people in with the mindset of thinking about automation first. The mindset of collaborating with developers to see what they're automating. The curiosity and the skill sets to grow and develop and learn more about the tools. And then, you have to give people time, Kovid. There is no way you can expect people who don't have the automation skills to just upskill like that. It's just not fair. You have to support, support, and support some more. And that comes from myself giving people time. It's understanding how people learn, Kovid.

So, I'll give you an example. Pair learning. That's one technique where you can get somebody who can't automate and maybe you get them pairing with someone else who can't automate and you give them a course. That's point number one. Pair learning could be pairing with someone who does know automation and pairing them with someone who doesn't know automation. But guess what? Not everyone likes pairing because it's quite a stressful environment for some people. Jumping on a screen and sharing your screen while you type, and them saying, "No, you've typed that wrong." That can be quite stressful. So, some people prefer to learn in isolation but they like to do a brief course first, and then come back and actually start writing something right in the moment, like taking a ticket now that they're manually testing, and doing something and practising, then getting someone to peer review it. So, not everyone likes pair learning. Not everybody likes to learn in isolation. You have to understand your people. How do they like to learn and grow? And then, you have to understand, then you have to relay to them why you are asking them to learn and grow. Why am I asking people to change? 'cause the skill bases that's needed tomorrow and the day after and in two years time are different to the skill bases we need right now or even 10 years ago. And if people don't upskill, how are they going to stay relevant? 

Kovid Batra: Right. 

Leigh Rathbone: Everything is about staying relevant, especially when test automation came along, Kovid, and people were saying, "Hey, we won't need QAs because the automation will get rid of them." And you'd be amazed how many people believed that, Kovid, you'd be absolutely gobsmacked how many tech leaders had in their minds that automation would get rid of QAs. So, we've been fighting an uphill struggle since then to justify our existence in some cases, which is wrong because I think the value addition of QAs and all the crafts when they come together is brilliant. But for me, for people who struggle to understand why they need to upskill in automation, it's the need to stay relevant and keep adding value to the company that they're in.

Kovid Batra: Makes sense. And what about, uh, the changing landscape here? So basically, uh, you have seen that part where you moved to phones and when these phones were being built, you said like, that was the first time you built something for touchscreen testing, right? Now, I think in the last five to seven years, we have seen AR, VR coming into the picture, right? Uh, the processes that you are following, let's say the pair learning and other things that you bring along to make sure that the testing piece, the quality assurance piece is in place as you grow as a company, as you grow as a tech team. For VR, AR kind of technologies, how has it changed? How has it evolved? 

Leigh Rathbone: Well, massively, because if you think about testing back in the day, everybody tested on a screen. And most of us are still doing that. And this is why this phone felt different and even the world's first smartwatch, which is here. When I tested these two things, I wasn't testing on a screen. I was wearing it on my wrist, the watch, and I was using the phone in my hand in the environment that the end user would use it in. So, when I went to PlayStation, Kovid, and I was Head of European Test Operations with PlayStation, we had a number of new technologies that came in and they changed the way we had to think about testing. So, I'll give you some examples. Uh, the PlayStation Move, where you had the two controllers that can control the game, uh, VR, AR, um, 3D gaming. Those four bits of technology, and I've only reeled off four, there were more. Just in three years at PlayStation, I saw how that changed testing. So, for VR and 3D, you've got to think about health and safety of the tester. Why? Because the VR has bugs in it, the 3D has bugs in it, so it makes the tester disorientated. You're wearing... They're not doing stuff through their eyes, their true eyes, they're doing it through a screen that has bugs in it, but the screen is right up and close to their eyes. So there was motion sickness to think about. And then, of course, there was the physical space that the testers were in. You can't test VR sat at a desk, you have to stand up. Look, because that's how the end users do it. When we tested the PlayStation Move with the two controllers, we had to build physical lounges for testers to then go into to test the Move because that's how gamers were going to use the game. Uh, I remember Microsoft saying that they actually went and bought a load of props for the Kinect. Um, so wigs and blow-up bodies to mimic different shapes of people's bodies because the camera needed to pick up everybody's style of hair, whether you're bald like me, or whether you have an afro, the camera needed to be able to pick you up. So all of a sudden, the whole testing dynamics have changed from just being 2 plus 2 equals 4 in a field, to actually can the camera recognize a bald, fat person playing the game.

Everything changed. And this is what I mean. Now, it's performance. Uh, for VR, for augmented reality, mixed reality glasses, there's gonna be usability, there's gonna be performance, there's gonna be security. I'll give you one example if I can, Kovid. I'm walking down the road, and I'm wearing, uh, mixed reality glasses, and there's a person coming towards me in a t-shirt that I like, and all of a sudden, my pupils dilate, a bit of sweat comes out of my wrist. That's data. That's collected by the wearable tech and the glasses. They know that I like that t-shirt. All of a sudden, at the top right-hand corner of those glasses, a picture of me wearing that t-shirt appears, and a voice appears on the arm and goes, "Would you like to purchase?" And I say, "Yes." And a purchase is made with no interaction with the phone. No interaction with anything except me saying 'yes' to a picture that appeared in the top right-hand corner of the glasses. Performance was key there. Security was really key because there's a transaction of payments that's taken place. And usability, Kovid. If that picture appeared in the middle of the glasses, and appeared on both lenses, I'm walking out into the road in front of a bus, the bus is going to hit me, bang, I'm dead, because of usability. So, the world is changing: how we need to think about quality and the user's experience with mixed reality, VR, and AR has changed overnight.

Kovid Batra: I think I would like to go back to the point where you mentioned automation replacing humans, right? Uh, and that was a problem. And of course, that's not the reality, that cannot happen, but can we just deep dive into the testing and QA space itself and highlight what exactly today humans are doing that automation cannot replace? 

Leigh Rathbone: Ooh! Okay. Well, first of all, I think there's some things that need to be said before we answer that, Kovid. So, what's in your head? So, when I think of AI, when I think of companies, and this does answer your question, actually, every company that I've been into, and I've already mentioned that I've worked in a lot, the culture, the people, the tech stack, the customers, when you combine all of those together for each company, they're unique, absolutely unique to every single company. When you blend all of those and the culture and make a little pot of ingredients as to what that company is, it's unique. So, I think point number one is I think AI will always help and assist and is a tool to help everyone, but we have to remember, every company is unique and AI doesn't know that. So, AI is not a human being. AI is not creative. I think AI should be seen as a member of the team. Now if we took that mindset, would we invite everybody who's a member of the team into a meeting, into an agile ceremony, and then ignore one member of that team? We wouldn't, would we? So, AI is a tool and if we see it as a member of the team, not a human being, but a member of the team, why wouldn't we ask AI its opinion with everything that we do as QAs, but as an Agile team? So if we roll right back, even before a feature or an epic gets written, you can use AI for research. It's a member of the team. What do you think? What happened previously? It can give you trends. It can give you trends on bugs with previous projects that have been similar. So, you can use AI as a member of the team to help you before you even get going. What were the previous risks on this project that look similar? Then when you start getting to writing the stories, why wouldn't you ask AI its opinion? It's a member of the team. But guess what? Whatever it gives you, the team can then choose whether they want to use it, or tweak it, or not use it, just like any other member of the team. If I say this is my opinion, and I think we should write the story with this, the team might disagree. And I go, "Okay, let's go with the team." So, why don't we use AI as exactly the same, Kovid, and say, "When we're writing stories, let's ask it. In fact, let's ask it to start with 'cause it might get us into a place where we can refactor that story much quicker." Then when we write code, why aren't we as devs using AI as a member, doing pair programming with it? And if you're already pair programming with another developer, add AI as the third person to pair program with. It'll help you with writing code, spotting errors with code, peer reviews, pull requests. And then, when we come to tests, use it as a member of the team. " I'm new to learning Cypress, can you help me?" Goddamn right, it can. "I've written my first Cypress test. What have I done wrong?" It's just like asking another colleague. Right, except it's got a wider sort of knowledge base and a wider amount of parameters that it's pulling from. 
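Since Cypress is the concrete example here, a first test of the kind a newcomer might hand to an AI pair for review could look something like this; the URL, selectors, and error message are invented for illustration:

```typescript
// A first Cypress test a newcomer might write and then ask an AI pair to review.
// The app URL, data-testid selectors, and error text below are illustrative only.
describe('login page', () => {
  it('shows an error for a wrong password', () => {
    cy.visit('https://example.com/login');
    cy.get('[data-testid="email"]').type('user@example.com');
    cy.get('[data-testid="password"]').type('wrong-password');
    cy.get('[data-testid="submit"]').click();
    cy.contains('Invalid email or password').should('be.visible');
  });
});
```

Asking an assistant what's missing from a test like this (a happy-path login, redirect assertions, locked-out accounts) is exactly the "member of the team" usage Leigh describes.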

So for me, will AI replace people? Yes, absolutely. But not just in testing, not just in tech; AI has made things easily accessible to more people outside of tech as well. So, will it replace people's jobs? I'm afraid it will. Yes. But the people who survive this will be the ones who know how to use AI and treat it as a member of the team. Those people will be the last pots of people. They will be the ones who probably stay. AI will replace jobs. I don't care what people say, it will happen. Will it happen on a large scale? I don't know. And I don't think anyone does. But it will start reducing the number of people in jobs, not just in tech.

Kovid Batra: That would happen across all domains, actually. I think that that's very true. Yeah. 

So basically, I think it's more around the creativity piece, wherein if there are new use cases coming in, the AI is not yet there to make sure that you write the best, uh, test case for it and do the testing for you, or in fact, automate that piece for the coming, uh, use cases. But if the teams are using it wisely and as a team member, as you said (that's a very, very good analogy, by the way, a great analogy), I think that's the best way to build that context for that team member, so that it knows what the whole journey has been while releasing an epic or a story, and then, probably it would have that creativity or that, uh, expertise to give you the use case and help you in a much better way than it could do today, like without hallucinating, without giving you results that are completely irrelevant.

Leigh Rathbone: Yeah, I totally agree, Kovid. And I think this is, um, if you think about what companies should be doing, companies should be creating new code, new experiences for their customers, value-add code. If we're just recreating old stuff, the company might not be around much longer. So, if we are creating new stuff, and let's make an assumption that, I don't know, 50 percent of code is actually new stuff that's never been out there before, well, yeah, AI is going to struggle with knowing what to do or what the automation test could be. It can have a good stab because you can set parameters and you can say, you can give it a role, as an example. So, when you're working with ChatGPT, you can say, as a professional tester or as a, you know, a long-term developer, what would be my mindset on writing JavaScript code for blah, blah, blah, blah? And it will have a good stab at doing it. But if it's for a space rocket that can go 20 times the speed of light, it might struggle because no one's done that yet and put the data back into the LLM, yet.
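The "give it a role" technique Leigh mentions is really just a matter of how the conversation is framed. A sketch of the message structure you might send to a chat-style model follows; the wording, and whatever client code wraps it, are assumptions rather than a prescribed prompt:

```typescript
// Sketch of role-based prompting: the system message sets the persona,
// the user message carries the actual task. How these get sent (SDK,
// REST call, or a chat UI) is up to you; only the shape matters here.
type ChatMessage = { role: 'system' | 'user'; content: string };

const messages: ChatMessage[] = [
  {
    role: 'system',
    content:
      'You are a professional software tester with 20 years of experience. ' +
      'Think quality-first and customer-first when reviewing code and tests.',
  },
  {
    role: 'user',
    content:
      'What would be my mindset when writing JavaScript tests for a new checkout flow, ' +
      'and which risks should I explore first?',
  },
];
```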

Kovid Batra: Totally. Totally makes sense. Great. I think, Leigh, uh, with this thought, I think we'll bring our discussion to an end for today. I loved talking to you so much and I have to really appreciate the way you explain things. Great storytelling, great explanation. And you're one of the finest ones whom I have brought on the show, probably, so I would love to have another show with you, uh, and talk and deep dive more into such topics. But for today, I think we'll have to say goodbye to you, and before we say that, I would love for you to give our audience parting advice on how they should look at software quality testing in their career. 

Leigh Rathbone: I think that that's a really good question. I think the piece of advice, regardless of what craft you're doing in tech, always try and think quality and always put the customer at the heart of what you're trying to do because too many times we create software without thinking about the customer. I'll give you one example, Kovid, as a parting gift. Anybody can go and sit in a contact centre and watch how people in contact centres work, and you'll understand the thing that I'm saying, because we never, ever create software for people who work in contact centres. We always think we're creating software that's solving their problems, but you go and watch how they work. They work at speed. They'll have about three different systems open at once. They'll have a notepad open where they're copying and pasting stuff into. What a terrible user experience. Why? Because we've never created the software with them at the heart of what we were trying to do. And that's just one example, Kovid. The world is full of software examples where we do not put the customer first. So we all own quality, put the customer front and center. 

Kovid Batra: Great. I think, uh, best advice, not just in software testing or in general or any aspect of business that you're doing, but also I think in life you have to... So, I believe in this philosophy that if you're in this world, you have to give some value to this world and you can create value only if you understand your environment, your surroundings, your people. So, to always have that empathy, that understanding of what people expect from you and what value you want to deliver. So, I really second that thought, and it's very critical to building great pieces of software, uh, in the industry also.

Leigh Rathbone: Well, Kovid, you've got a great value there and it ties very closely with people that write code, but leaders as well. So, developers should always leave the code base in a better state than they found it. And leaders should always leave their people in a much better place than when they found them or when they came into the company. So, I think your value is really strong there. 

Kovid Batra: Thank you so much. All right, Leigh, thank you. Thank you so much for your time today. It was great, great talking to you. Talk to you soon. 

Leigh Rathbone: Thank you, Kovid. Thank you. Bye.