
Great Products Need a Culture of Quality and Passionate People

In this podcast, Shane Hastie, Lead Editor for Culture & Methods, spoke to Melissa Daley, Bob Crews and Adam Sandman, speakers at InflectraCON 2022, about the state of testing and how to instil a culture of quality into software teams.

Key Takeaways

  • Quality is more than just testing
  • Leaders need to model blameless behaviour
  • The quality of communication in a development team directly impacts the quality of the software being built
  • A culture of quality starts with people passionate about solving customer problems
  • The shortage of people with technical skills has meant organisations are open to employing people with a wider variety of backgrounds into software testing, which is bringing valuable diversity into the profession

Transcript

Hey, folks. QCon London is just around the corner. We'll be back in person in London from March 27 to 29. Join senior software leaders at early adopter companies as they share how they've implemented emerging trends and best practices. You'll learn from their experiences, practical techniques and pitfalls to avoid so you get assurance you're adopting the right patterns and practices. Learn more at qconlondon.com. We hope to see you there.

This is Shane Hastie for the InfoQ Engineering Culture podcast. Today, I'm sitting down with a group of people from East Coast USA. We've got Melissa Daley, Adam Sandman, and Bob Crews who met at the state of testing conference, the InflectraCON state of testing conference. Am I correct with that, Adam?

Adam Sandman: Yeah. Well, it was the InflectraCON Agile, DevOps, Testing, and Quality event, so yes.

Introductions [01:00]

Shane Hastie: All right. And we want to explore the state of testing, what's happening with quality in software development today. But before we get into that, it's probably a useful idea to get to know each other a little bit. Can I start with you, Melissa? Who's Melissa?

Melissa Daley: I'm Melissa Daley. I'm the founder and owner of Orca Intelligence. We are a product analysis and design organization. My background comes from over 20 years in IT, primarily in software engineering. I started out, as we were talking about before, as a tester, testing out systems, then went into development, then went into management and all that other great stuff, mostly around business analysis before management. A lot of my career has been around business analysis, requirements engineering and solutions architecture.

Shane Hastie: Welcome. Thank you. And Bob?

Bob Crews: Hi, Shane. Thank you. I've been in IT for about 35 years now. I started off as a COBOL developer and a DBA, and was into programming. I absolutely loved it. And then about 10 years later, I was on an FBI project and I fell in love, absolutely fell in love, with test automation, quality assurance, software testing. And then almost 20 years ago, I co-founded and started a company, Checkpoint Technologies, where our primary focus is functional, performance and application security testing, whether it's manual testing, requirements gathering, test automation, you name it. But that is our passion. That's what we love to do. And I am CEO and co-founder of Checkpoint Technologies.

Shane Hastie: Thank you very much. And Adam?

Adam Sandman: Oh, great. Well, thanks for having me on the show, Shane. And my background is interesting. I started programming at the age of 10, writing 8-bit video games using computers like the Acorn, which preceded the ARM processors which we all use today. And I also wrote other kinds of software as a child. In adulthood, I ended up in software. No surprise. I worked for an IT consulting firm called Sapient in Boston, LA, and the Washington, DC area. And it was during that time, and I shouldn't say this on the podcast, but I really found that testing was horrible. I was the project manager. And what I found was developers had all these really great gizmos and great tools and amazing products. They could use compilers, IDEs, debuggers, all this really cool kit. And they gave testers spreadsheets and whiteboards and Word documents and pieces of paper.

And so when I left Sapient, I founded Inflectra, the company I lead today and have been leading for 16 years. The original mission was to provide great tools for testers, for both automated and manual testing. And then we found that if we don't have developers on the same platform, there's no communication. So we actually moved into the development space, providing project management and development and testing tools. Our whole mission as a company is to bring harmony to the software lifecycle by providing functionality and tools that enable developers, testers, and managers to all work together. That way, you don't have the problem I had 20 years ago, where some people have amazing tools and can work really efficiently while other people are left out in the cold.

Shane Hastie: And I heard from Melissa a strong background in analysis, and I almost want to say, there's something to the left of those tools, Adam, isn't there?

Quality is more than just testing [04:09]

Adam Sandman: Idea management, and then trying to come up with the... Well, ideation is the popular word now. That's the space Melissa's company actually works in. And let's talk more about this. This is one of the things that we find, and you mentioned it earlier on: quality is a bigger topic, and quality is more than just testing. We found that at our event. One of the challenges when you think about quality is: what are the requirements? People don't know the requirements. And in many cases, if you're building a system, it has to work in a context: government requirements, state, local, federal, or requirements that are set by law, like the FDA's. And people don't think of those things until they go live. So Melissa, can you tell us a bit more about some of the things that you've done in those areas? Because they're very, very smart.

Melissa Daley: Yes. And so I'm very excited about this conversation, because quality does start before you start testing. You always have to build a good requirements framework in order to get you to a great testing experience. So we created a software product called Swiftly that allows you to automate requirements generation. The intent is that you put information into the tool and it automatically generates your requirements for you. So then, guess what? That matches up with your testing. Basically, if you think about requirements, a requirement is an atomic view of a scenario. And then you need to permute that information in order to test it, because now you're testing all the different scenarios along the different routes.
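To make that permutation idea concrete, here is a minimal sketch of fanning one atomic requirement out into test scenarios. The parameter names and values are hypothetical, invented for illustration; this is not Swiftly's actual mechanism.

    # A minimal sketch: one atomic requirement, described by a few discrete
    # parameters, fans out into every combination as a candidate test scenario.
    from itertools import product

    requirement = {
        "user_role": ["admin", "standard", "guest"],
        "payment_method": ["card", "invoice"],
        "locale": ["en-US", "fr-FR"],
    }

    # Every combination of parameter values is one scenario to test.
    scenarios = [dict(zip(requirement, combo))
                 for combo in product(*requirement.values())]

    print(len(scenarios))  # 3 * 2 * 2 = 12 scenarios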

And so we found over the years, similar to what Adam has been saying, that going through the software development lifecycle, that experience was all about heavy documentation and heavy Excel spreadsheets and so forth. And then there were these complex systems that would allow you to put all this information in there, and it seemed like a lot of people had a learning curve around that. So, having experience with those systems, I decided that we needed a simpler tool that would allow you to automate this.

And one of the things I was actually talking about earlier today with some colleagues was that there's going to be a future where you can converge and diverge information. And that information is always going to be generated for software development, because we're always trying to create something on the fly, put it out there in the test environment, and then fail quickly. That's what we hear in the agile environment. But if we don't have the documentation just as quickly, going through that process just as rapidly as the development cycle, then we're going to lose that quality as it gets down to the testing cycle. We're always going to say we have bugs or we have low-quality testing results when we don't have the right, most up-to-date requirements. That's how we're starting to solve the problem, so I would love to hear what everyone else has to share too.

Shane Hastie: It sounds to me very much like the behaviour-driven development shift stuff.

Drawing on Behaviour-Driven Development [06:55]

Melissa Daley: Correct. Exactly, exactly. Behavior-driven development, which has been around for a long time. It's just that now we're evolving into something a little bit more unique.
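As a minimal illustration of the BDD style under discussion, here is a Given/When/Then test written in plain pytest. The Cart class and its rules are hypothetical, not from any of the speakers' products, and this is not tied to a specific BDD framework; the point is that the requirement and the test share one wording.

    # A minimal Given/When/Then sketch in plain pytest. The Cart class and
    # its behaviour are hypothetical, purely for illustration.

    class Cart:
        def __init__(self):
            self.items = []

        def add(self, name, price):
            self.items.append((name, price))

        @property
        def total(self):
            return sum(price for _, price in self.items)

    def test_adding_an_item_updates_the_total():
        # Given an empty shopping cart
        cart = Cart()
        # When the customer adds a book costing 10
        cart.add("book", 10)
        # Then the cart total is 10
        assert cart.total == 10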

Shane Hastie: Bob, your thoughts?

Bob Crews: The solutions that we're all talking about, the ones Melissa's talking about, are great because they help organizations, as we've already mentioned, start earlier. But they also help them look at the whole effort of testing more holistically, so that they're not just targeting one small module. And that helps eliminate the gap between those who deliver the application or the system and those who end up using it, because a lot of times that gap gets ignored. I'm sure Melissa and Adam and I have all been in situations where the developers, the architects, and the business analysts designed everything as it was supposed to be designed, developed it as it was supposed to be developed, tested it, got it to the point where there were perhaps just 1% defects and issues. And we deliver it to the end user and they say, "No, this isn't what I wanted."

So the solutions from Inflectra and Orca Intelligence help bridge that gap. And they also help make testing and testers so much more efficient. And by being more efficient, they become more effective, which is key. Because one thing our solutions won't do is make testers great. The testers have to know what they're doing to begin with. They have to have a passion for quality. They have to understand what they're doing. But the solutions that we're discussing can absolutely make them more efficient and thus more effective, Shane.

Adam Sandman: Shane, your podcast is all about culture. And I think, Bob, you hit the nail on the head. You need a culture of quality. If you haven't got a culture of quality, you can put in any tool you want, any magic silver bullet, and it won't make any difference. You need a culture of quality to permeate the organization.

Agile is an interesting beast, because Agile has made a lot of things better but has made some things, in my mind, worse. Agile was trying to reduce conceptual risk. When we talk about risks in software development, there's conceptual risk, schedule risk, and technical risk. Agile lets you test concepts earlier. You see it faster. You can do spike solutions, and with sprints, you should reduce schedule risk, which are all great things. However, you've also got to the point where you've made requirements into user stories. They're very small. They're very brittle. You can have a bucket of user stories with no context. You lose the big picture, the holistic view that you were talking about, Bob. That's what I think is being lost in the shuffle in some ways.

Bob Crews: Absolutely.

Shane Hastie: What is a culture of quality?

Having a Culture of Quality [09:28]

Bob Crews: A culture of quality, in my mind, has got to start with a passion for looking at the big picture. It reminds me of the story about the two stonemasons building Notre Dame. Somebody comes along and asks the first one, "What are you doing?" And he says, "I'm laying stone." They ask the second one, "What are you doing?" And he says, "Well, I'm building a church." And that's how everybody should look at it. Testers should have that culture. It starts from the top down with leadership, and you've got to help make your entire team exceptionally proud of what they're doing and what they're creating, and truly understand that quality is a key part of all of that.

Melissa Daley: The quality culture, I think, is exactly what you're saying, Bob. You have a team that's passionate about wanting to build good code for the customer. They want to build good software for their end user. And because of that passion, they're going to pay attention to the details and build a framework that's going to allow them to move through a software development life cycle that has quality at every step. Your project management life cycle should have quality. Your schedule should have some quality around it. And then your requirements, of course. No one really thinks about a requirements framework. Everyone thinks about a development framework, and they also think about a testing framework. But having a framework for the entire life cycle of your software development really increases the quality of the entire system itself. So I think that's important.

Adam Sandman: And one last thing I'd like to add about the culture of quality is around the leadership itself. I was thinking back to InflectraCON, our event, where we had some people talking about these topics. And one of the things that people said is that in some companies, when you find a bug, it's a culture of which developer coded the bug, whose fault it is. You have a blame or fault culture. The second kind of culture is one where you find the bug and instead you say, "Let's find the root cause. Let's dig down together as a team and find out why this bug got in there. Not to blame the person, but to fix the process to prevent it happening again." A preventative culture.

Leaders Must Model Blameless Behaviour [11:35]

And the leadership of the team will determine which one you get. If they're going to blame people, then everyone's going to put their heads down, and no one's going to be a whistleblower and say, "There's something that smells bad over there," because they're going to get blamed for it. If you've got a culture from the top down that says, "I want to hear everything that's good and bad. No one gets blamed. Let's just put it on the table," then I think you're going to get a very different outcome.

Bob Crews: At organizations where I've been, I've tried to get everybody, whether you're a developer, a tester, whomever, to actually get excited when a defect is uncovered. Because you uncover the defect, you resolve it. And you know what? You are that much closer to perfect software, and that's what it's all about. It's the same excitement I feel when I make a mistake for the first time. It's like, "All right, I learned something. I won't make that mistake again." Well, I always do, and then I get frustrated. But that's how it should look: finding a defect should bring that excitement.

Melissa Daley: Yeah. And to piggyback on that, I also think the defect is not only in the software. I think we concentrate so much on defects being in the software, but there are defects throughout the process. And you need to be able to identify them, like you said, when you make a mistake or go through a learning experience and have your aha moments. Those aha moments will come up throughout: maybe it's in unit testing, maybe there was a defect in the design, or a defect in the requirements. Even in the planning, and I don't think people concentrate on that. There could be defects in the planning, hence why we have retrospectives. If you're doing regular Agile, you want to figure out what those defects are in your planning process, and it then trickles all the way down throughout the entire life cycle.

Bob Crews: Right. That's a great point, Melissa.

Adam Sandman: That is absolutely right. One of the things also around that is communication. As a project manager, we don't think about the quality of the communication, especially with remote working, when we're on video calls and Zoom. That's not bad, but it's different. You lose and you gain. And we also find ourselves working with people in offshore development, different countries, different cultures. Our company has employees on almost every continent and we have meetings together. And there are times when what I explained to you is not what you heard. Imagine that I was describing a customer feature to you. A support agent talks to someone, gets an idea for a feature. Maybe it's an amazing feature. That can get lost in the transition from a support person to an analyst to a developer, maybe across the different natural languages they speak, different cultures. And in that translation, the quality of communication is not perfect. But we assume that we've heard it correctly, and we write it down, write a story, and build it. So the quality of communication is really important.

Shane Hastie: Let's dig into that remote, because most organizations today are working remote. The pandemic has changed this, and I personally feel that it's a pretty permanent change. We're seeing organizations able to take advantage of having people in many, many different places. And for those of us who are able to now work remotely, it means I don't have to spend an hour and a half in traffic and all of that. So there's a lot of good things that are coming out of this remote work. There's also some challenges, but what's the impact been on quality and testing?

Challenges in Remote Work [14:38]

Bob Crews: The big challenge that I've found has to do with the processes that were in place relative to how communications were conducted. There was so much before that was face-to-face. It was in meetings. It was easy to quickly whiteboard something. You could see somebody's facial expressions, body language, know whether or not they were engaged or ignoring you. Now, with everybody remote, somebody might have a valid reason for not being able to use their webcam, but you don't have that personal connection, which adds a new, I don't want to say challenge, but a new paradigm to communication. And communication is about verbally saying it, listening, making sure that all parties understand. I believe that's become trickier now, and we're all getting used to that, and we are learning solutions like Zoom and some of the other online meeting applications. So it's interesting. It's exciting. I'm certainly not worried about it, but it's a process.

Adam Sandman: We're lucky. In our office, we actually have both remote and in-person. We obviously have people in Argentina and other countries who physically can't get here, but we have everyone come once a year, so we do make room for team building. We have a policy of trying to get the entire company together for at least a week, once a year. And around InflectraCON, we timed it so we could co-locate it with the conference.

But what we found is interesting. There are some activities that now work better and some activities that work worse, and we've adjusted for those. One of the things we found very useful is that when we're doing infrastructure testing, where we're using Amazon Web Services and working in consoles, it's actually much better remote. Because I can screen share with someone and we can literally do shadowing. Whereas in the old days, if I was sitting over someone's shoulder for half an hour watching them do something, I mean God, I'd be coughing over them. It was unpleasant. I couldn't sit for half an hour over someone's shoulder, pair programming or pair infrastructure. But pair programming and pair infrastructure activities, where you're teaching someone who's an apprentice, are so comfortable remotely. I can be doing all the work, they sit there watching, asking questions, and we can do it for an hour on end with no discomfort. That's where communication is actually much better.

I think the brainstorming and conception, that's the stuff that I think is still the weakest virtually. Where we can, we'll bring people into a physical room, or we'll do multiple rooms linked together by a screen, so at least we have some person-to-person contact. I have found those activities work best in-person. But definitely, a lot of things can be better remote. Even when we're actually in together, we'll do screen sharing even though we're in the same physical building, which proves the point.

Melissa Daley: Yeah. And I'll say that it didn't change drastically for us, because we were always remote pretty much, unless we had to be on a client site. But I think the quality had to improve. The thoroughness in communication did have to increase. Listening had to increase, because sometimes you get into an environment where no one wants to turn on their camera, so you have to be a little bit more intentional in listening. For us, it was more about improving the quality through listening more, not really shifting the work environment too much, because being in IT, we were always just using all of the different tools. But we definitely had to improve the listening skills so that we could make sure that we were getting what we needed out of everyone.

Adam Sandman: Melissa, how do you deal with what I call the curse of Alt+Tab, at least on Windows, where people are in a meeting, and I'm as guilty, Alt+Tab-ing, checking email?

Melissa Daley: Yeah, ugh. The reason I cringe at that is because I'm one of those. I'm a multitasker. I'm like, "Okay, so I can hear what's going on..." But I think I've gotten really good at it, because I can always jump in and out of conversations. I've been like that since I was a child. My family makes fun of me because of it. I can be in two conversations at the same time. Call it a very weird skill, but sometimes it's not good as I get older, because I end up missing certain things. But you're right. And now, some of the technology will tell you when someone is not as attentive. It's a very tricky situation, because you want people to be able to stay engaged, but sometimes there's that lull in the conversation where you're like, "I know what they're about to say, so let me just go and check my email."

Adam Sandman: "Well, I've read it before. Oh, I bet that you 10 times already, this person. They're a blow hard." That's just... Yeah.

Shane Hastie: What is happening in terms of the people in the industry, where we're still maybe towards the tail end of this great resignation, great reassessment, whatever we want to call it? What's happening in terms of the workforce in testing today?

Attracting People from Diverse Backgrounds [19:23]

Bob Crews: I find that because people are so focused on working remotely, they've now started focusing on exactly what they want to do within quality and software testing. I'm finding that when candidates come to me for a position, they're able to communicate much more precisely, and they're more adamant than ever before about exactly what it is they want to do, the technologies they want to work on, what kind of specialty they want to develop. And if I can't offer them that, then they look elsewhere.

Whether that's a byproduct of the pandemic and all of that working remotely, I'm not sure. Or it could be a byproduct of the fact that, over the years, quality and software testing have started to get a lot more respect. I'm starting to see more universities offer software testing courses. The University of South Florida, just 20 minutes from where I am, offers one, and I speak there at least once a year. That's the big thing I see: more of a focus on specialization in quality and software testing.

Adam Sandman: That must make it really hard with staffing, when you have different projects and different clients and you can't control what they're using.

Bob Crews: Exactly right. Very difficult, very difficult.

Melissa Daley: Yeah, I would have to agree with that. I think people during interviews are being very intentional in what they want to do, very intentional. They come with a meaning and a purpose, and it's challenging because, like you said, Adam, if you can't meet that, what can you do? The clients will have their own environments, or they may have their own selection of tools that they want to use, and you have no control over that. One of the things that we've done is really make our own internal work culture as conducive as possible to still attract those prime candidates.

Adam Sandman: That's a really good point. I completely agree with Melissa there. Culture is a huge selling point. Obviously, we are a tool vendor, so we only have one project. We have the same system we've had for 15 years. Now obviously, we evolve with technologies, we're constantly refactoring it, bringing in React and new technologies, and the testing is improving and changing. But ultimately, it is one system. And people who've been with us for 10 or 15 years, and there are many, not just myself, have been working on one system, maybe two, their entire careers. We've got a few products, but not hundreds. So you have to stimulate innovation in different ways. You have to make them feel this is a great place to work in other aspects: the culture, flexibility, benefits, just a place they'd love to come to work, so it doesn't feel like work.

The other thing we're seeing around recruitment is that we are looking for non-traditional backgrounds. We find there's a shortage in the industry of developers and testers, and anyone who can code is going to become either an automation engineer or a developer, and it's very hard to get them and retain them. So what we are doing is reaching out to people who've never done development before. We actually have a program called Second Acts, where we look for people who either never went to college and have studied computer science on their own, to become a tester or a developer, and bring them in, or people who maybe were in testing or development 10 years ago, took time off to do family things, and have come back. And those have been very successful in effectively expanding the pool of people who can work for us.

We're now taking it to the next level and working with some non-profits to try and do this in a more systemic way. There are a couple of organizations we're working with in different countries where they bring people in to use our tools, and we help teach them how to become a software tester. And these are people who are road diggers, minimum wage workers in the informal economy in some cases, and we bring them into the formal economy. I think we all have a duty to do that if we want to stay competitive and have people to hire, because everyone's getting into software. Every industry is becoming a software industry, and we're competing with those people for our talent.

Shane Hastie: Where are testing and quality going? What does the crystal ball say? There've been some things I've seen, certainly around integrating AI, for instance. What's happening there? And what do you see as the trends in the future?

The Future of Testing [23:17]

Bob Crews: You just mentioned AI. Absolutely, Shane. It is absolutely going in that direction: machine learning, AI, and everything. And that's key. That's key. The other thing I see happening, a good thing from a process point of view, which is a subject I love, is more risk analysis. Because application systems, just because of the market, have to be delivered faster than ever before. That's not going to change. If anything, our development and deployment times are going to become shorter. And I do not believe we will ever be able to increase the speed of testing to match the speed of possible delivery. So what we've got to be able to do is make sure that we perform risk analysis and, at the very least, target that. Because we're seeing in the news every day, whether it's a security risk, a functional problem, or a website going down because of performance, these can cause irreparable harm to an organization: financially, life or death in many situations, or negative publicity.
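As a minimal sketch of the risk-based targeting Bob describes, assuming a simple likelihood-times-impact scoring model; the areas and scores below are hypothetical, not from any speaker's actual process.

    # A minimal sketch of risk-based test targeting: score each area by
    # likelihood and impact, then spend limited testing time on the riskiest
    # areas first. The areas and scores are hypothetical.
    test_areas = [
        {"name": "checkout/payments", "likelihood": 4, "impact": 5},
        {"name": "login/security",    "likelihood": 3, "impact": 5},
        {"name": "search",            "likelihood": 3, "impact": 2},
        {"name": "footer links",      "likelihood": 1, "impact": 1},
    ]

    for area in test_areas:
        area["risk"] = area["likelihood"] * area["impact"]

    # Highest-risk areas get tested first when there isn't time for everything.
    for area in sorted(test_areas, key=lambda a: a["risk"], reverse=True):
        print(f'{area["name"]}: risk score {area["risk"]}')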

So I love seeing risk analysis become part of the process, and I love seeing artificial intelligence growing to become part of the technology that we're going to be able to use. And when it comes to artificial intelligence, Melissa knows more about that than I do.

Melissa Daley: Yeah, I agree with you. The one part where I disagree with you a little bit, just a little bit, is that testing won't catch up with being able to deliver fast. I think that's what it was.

Adam Sandman: Keeping up with deployment speed. Can you test as fast?

Melissa Daley: Yeah, keeping up with deployment speed.

Adam Sandman: Going in, yeah.

Melissa Daley: Yeah. I think machine learning and AI will get us there. Again, I was just having this conversation with colleagues, where it goes back to being able to generate all that information fast enough and regenerate it for anything new, but doing it literally at the speed of light. If you're using quantum, and of course now I'm getting really deep, if you have quantum databases and so forth and you have all this data, you're able to generate real documentation at any time, pulling it up, pushing it back down, just like you would at deployment.

I actually think we'll get there the more we get to the micro level of quality, because I think we're at certain levels of quality now. Before, we were at a very, very high level, when we were using just basic documentation, all written down, and so forth, and then we got into our spreadsheets. If you get to the micro levels of quality that machine learning will get you to, we'll be able to go just as fast as deployment. That is at least my goal.

Adam Sandman: I guess with quantum computers, they're designed for such parallel activities that, in theory, could you traverse every single edge node, every edge case, simultaneously? Which is scary.

Melissa Daley: Exactly. And keep it going. You just keep it all parallel, keep them all going.

Adam Sandman: Except quantum computers will break every encryption. One of our colleagues went to DEF CON and came back and scared the bejesus out of us because when quantum computers are available, they're going to break every encryption we have overnight. HTTPS, every encrypted credential, everything's going to be broken. And it will happen so fast that we won't have time to react.

So actually, thinking about the future, there are lots of different things I think will happen around risks. One thing that's very interesting is: can we use AI and machine learning to actually deal with risk management and risk analysis? Because a computer model could be used to model things like weather patterns, large data sets that are uncorrelated, and it will find risks in there that we haven't anticipated. It might find a risk in a new computer system. Say you're deploying this computer system to a particular target user group which doesn't match the data set it was designed for; you wouldn't have known that. But because we've done all this data analysis, we can actually tell you that the demographic is different. Maybe it's got a large number of colorblind people and you're using red and green. That's just a very simple example. But AI and machine learning could potentially do some of the risk analysis or risk assessment from these large data sets.

The converse, I think, is that with machine learning, because we're using algorithms that we haven't designed, there may be risks we don't even know about. It was modeled on this dataset, we're applying it to this other dataset; what's the risk of that? So it adds risk and reduces risk in equal measure.
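One simple, standard way to catch the dataset mismatch Adam describes is a statistical drift check between the data a system was designed for and the data it sees in production. This is a minimal sketch with hypothetical numbers, not any speaker's actual method.

    # A minimal sketch of dataset-mismatch (drift) detection: compare the
    # population a system was designed for against the population actually
    # using it, and flag a significant difference as a risk. Values are
    # hypothetical.
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    design_ages = rng.normal(35, 8, 1000)       # demographic assumed at design time
    production_ages = rng.normal(55, 10, 1000)  # demographic actually observed

    stat, p_value = ks_2samp(design_ages, production_ages)
    if p_value < 0.01:
        print(f"Risk flagged: user population differs from design data (KS={stat:.2f})")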

Melissa Daley: This conversation is so great. It's really exciting me to get this detailed about where we can go in the future. I just get delighted.

How do you Test AI Systems? [27:35]

Bob Crews: I do too. And I love artificial intelligence. And one of the things, Melissa, that I'm always thinking about is, all right, well when it comes to artificial intelligence, how are we going to test that?

Adam Sandman: How do we check it?

Bob Crews: How do you check it? Because imagine Albert Einstein as a small boy: could I teach him first, second, third, fourth grade mathematics? Certainly. But at a certain point, I can't teach him anymore, let alone test him to determine whether he knows what in the world he's talking about. That's how I see having to test AI, right?

Melissa Daley: Exactly, yes.

Adam Sandman: I love that thought. Again, like separate machines that test each other.

Melissa Daley: Yup.

Adam Sandman: Like the space where we all check each other.

Bob Crews: And the trust that we're going to have to have in that AI, that's going to be a leap for us.

Adam Sandman: What if the AI gangs up on us? They'll all be in cahoots. They'll be like in the playground.

Melissa Daley: Exactly, exactly.

Adam Sandman: They'll be like, "No, no, it's good. Don't worry. We've all checked it." And the human's like, "Are you sure?" The machines are like, "Oh yeah, we checked it."

Bob Crews: Yeah, that's right. Movies are made of that.

Melissa Daley: Correct.

Shane Hastie: What does this mean if I'm somebody who's thinking about testing as a career? What do I need to learn?

What to Learn for a Career in Testing? [28:46]

Melissa Daley: The basics.

Bob Crews: Yes, yes. The basics.

Melissa Daley: Right?

Bob Crews: I was going to say-

Melissa Daley: The basics.

Bob Crews: ... absolutely, start with the basics.

Melissa Daley: Yeah. Understand that you're looking at the whole system, but you're also looking at the different modules. Of course, I'm going to break it down based on what we do. So you look at the high-level features, the epics; you look at the actual features; and then you look at the different details, which are going to be translated into user stories. By breaking it down and decomposing your testing that way, and understanding decomposition, you're able to do a better test. But of course, I'm always going to go back to the requirements. You have to get the right requirements funneled down so you can have a better test.

So, knowing the basics and knowing what the foundation of quality means: quality doesn't just mean the function works. Quality also means that the design works for the end user, that the infrastructure works for the end user, all of those things. There are a number of things you have to have for quality: making sure the system is interoperable, that it interacts correctly, that it's functioning from the end user's point of view based on the scenario, and that the design is accessible. So all of those things, and understanding what that means for your role in quality.
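Melissa's epic-to-feature-to-story decomposition can be pictured with a small sketch. The structure and example names here are hypothetical, just to show how test cases trace back up to requirements.

    # A minimal sketch of epic -> feature -> user story decomposition, with
    # test cases hanging off the most granular level. Names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class UserStory:
        description: str
        test_cases: list[str] = field(default_factory=list)

    @dataclass
    class Feature:
        name: str
        stories: list[UserStory] = field(default_factory=list)

    @dataclass
    class Epic:
        name: str
        features: list[Feature] = field(default_factory=list)

    checkout = Epic("Checkout", [
        Feature("Payment", [
            UserStory("As a shopper, I pay by card",
                      ["valid card accepted", "expired card rejected"]),
        ]),
    ])

    # Traceability: every test case funnels back up to a requirement.
    for feature in checkout.features:
        for story in feature.stories:
            for test in story.test_cases:
                print(f"{checkout.name} > {feature.name} > {story.description}: {test}")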

Bob Crews: I love, Melissa, that you said the basics, because I was down in Mexico City teaching a course on software testing, and it was so refreshing because, out of the nine students I had, seven of them were between the ages of 22 and 26. And they were problem solvers. They were abstract thinkers. They were excellent. So they had the foundational aptitude that I want in a tester. And then we were talking about things like equivalence class partitioning, all that good stuff, so that they could better understand how to be a very efficient and thus effective tester. Because I'm a firm believer that organizations tend to test not too little but too much. What I mean by that is there's a lot of redundant testing and things like that. So if people learn the foundations of being a solid tester, then learning the tools that you're going to be introduced to will be the easy part. But you'll already be a good tester.
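Equivalence class partitioning, which Bob mentions, is exactly the technique for cutting that redundant testing. Here is a minimal sketch with a hypothetical validation rule: one representative value per class, plus the boundaries.

    # A minimal sketch of equivalence class partitioning. The validate_age
    # function and its 18-65 rule are hypothetical, purely for illustration.
    import pytest

    def validate_age(age: int) -> bool:
        """Accept ages 18-65 inclusive; reject everything else."""
        return 18 <= age <= 65

    # One representative per equivalence class, plus boundary values,
    # instead of redundantly testing every possible age.
    @pytest.mark.parametrize("age,expected", [
        (-5, False),   # invalid class: negative ages
        (17, False),   # boundary: just below the valid class
        (18, True),    # boundary: lowest valid value
        (40, True),    # representative of the valid class
        (65, True),    # boundary: highest valid value
        (66, False),   # boundary: just above the valid class
    ])
    def test_validate_age(age, expected):
        assert validate_age(age) == expected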

Adam Sandman: I think it's a great time to be a tester. 20 years ago, you'd come in and they would say, "Well, you can't code very well. We'll make you a tester." That was the mindset 20 years ago. And so if I was a tester 20 years ago, it was not a great career path, because you always felt secondary and you were basically brought in to do what they call monkey testing, where you're just following these scripts and typing away and not having an intellectual experience. I think now, if you were starting on a career path as a tester, I would say watch some Steve Jobs videos. Do social sciences. Think about user behavior. Your job is to be the user advocate. You're going to have to put yourself in the shoes of those users and try to figure out how they're going to use the system.

So it's an amazing role to be in, taking that experience and being able to translate it into, "How do I test something effectively? What are we missing? What risks have we not thought of?" It's a great intellectual exercise. It's a great questioning role. But I think 20 years ago, it wouldn't have been. And as you said, Bob, testers are demanding that they want to work on certain technologies. I think they're only going to want to work for companies that recognize that. And if you put a tester in a role where they're basically doing robotic tasks, they're going to quit. So as a tester, you have a lot more autonomy over your career path than ever before.

Shane Hastie: Some really, really interesting conversations here, folks. Thank you so much for taking the time to talk to us today. If people want to continue the conversation, where do they find you?

Melissa Daley: You can find me on LinkedIn. I'm on it daily. Or you can find us at Orca Intel on Twitter, Facebook, and Instagram, and also on LinkedIn as well. Primarily we're always on LinkedIn, so that is the best way. Or at our website, www.orcaintelligence.com.

Bob Crews: Everybody can find me. I am on LinkedIn and the name again is Bob Crews. That's C-R-E-W-S. So not like Tom Cruise, but Bob Crews, C-R-E-W-S. I'm on LinkedIn quite a bit. You can always also go to our website at Checkpoint Technologies with one T, checkpointech.com, and email me at bcrews@checkpointech.com. But if any of our listeners would like to continue this conversation, I love this stuff so please reach out.

Adam Sandman: Same places, really. LinkedIn, adam.sandman; Sandman like the current Netflix series, or several movie characters, or the song by Metallica. I'm on Twitter somewhat, not as much as I used to be. I think I'm doing more LinkedIn these days. Or you can go to inflectra.com. That's the company. We're also on Twitter, Instagram, Facebook, and LinkedIn. And if you're interested in coming in person, if you want a trip to Washington, DC, that's where we all met in May. Next year we're going to be there in April, I think the last week of April, in the Washington, DC area. Come to InflectraCON. We'd love to see you there. We have lots of discounted tickets and there's an early bird discount right now. And then we can have a conversation there in person. I think Bob and Melissa are both going to be there. Shane, if you want to come, you are hereby welcome. You're hereby invited.

Shane Hastie: That'd be great.

Adam Sandman: I'm going to be in Australia in October, if anyone's listening from Australia. And I think some of us are going to be in California at STARWEST and other events in the software testing world. So at any of those events in the next few months, I think quite a few of us will be there in person as well.

Shane Hastie: Thank you so much.
