
Developer Experience in the Age of Generative AI


Summary

The panelists discuss the challenges developers face that interrupt development flow and slow things down, the tools available to help, and how to use AI-powered programming assistants effectively.

Bio

Asanka Abeysinghe - CTO @ WSO2
Glenn Engstrand - Staff Engineer @ ActiveHours
Jemma Hussein Allen - Platform Engineering Technical Lead
Eric Minick - Director of DevOps Solutions
Moderated by: Renato Losio - InfoQ Staff Editor | Cloud Expert | AWS Data Hero

About the conference

InfoQ Live is a virtual event designed for you, the modern software practitioner. Take part in facilitated sessions with world-class practitioners. Hear from software leaders at our optional InfoQ Roundtables.

Transcript

Losio: In this session we are going to be chatting about developer experience in the age of generative AI.

I would really like to say a couple of words about the topic of this roundtable, and what we mean by developer experience. What's different now, in the age of what we call generative AI? The panelists will discuss some common challenges that all of us as developers, as software engineers, face that might interrupt our development flow and slow things down, all the things that reduce our productivity. We'll also discuss the tools and methodologies available to us, things that can help us on a day-to-day basis, and how generative AI influences that. We have seen many new tools in the last few years, I will let the panelists discuss them, many new options, many new trends. We'll see which tools, knowledge, and resources can help in daily development work, and how they can improve our developer experience.

Background, and Professional Journey

My name is Renato Losio. I'm a principal cloud architect. I'm a practitioner myself, not an expert in generative AI. I'm an AWS Data Hero. I'm also an editor at InfoQ. I'm joined by four experts coming from different companies, sectors, and backgrounds. They will help us understand the best practices and challenges of improving software development, in a world that is really being disrupted by new technologies.

Minick: I'm Eric Minick. I'm at Harness today. I've spent most of my career in the continuous integration, continuous delivery, and now DevOps space. A lot of that has been focused on working with many different companies on how they streamline how code gets out the door. I've worked with a lot of companies and seen a lot of things from the outside. That's the perspective I bring.

Allen: I'm Jemma Hussein Allen. My background is in software engineering. My career has been mainly focused on software and platform engineering, the CI/CD side of things and, of course, DevOps. I've worked in a range of industries: financial services, publishing, media. These days, I'm a technical lead with a focus on automation and platform engineering.

Engstrand: I'm Glenn Engstrand, currently at EarnIn. The title is usually engineer. The role is usually coding architect or mentor. My focus is MarTech, B2B, B2C, healthcare, and fintech. What I'm passionate about is cloud native, 12 factor, SOLID, CI/CD, and DevSecOps as it relates to DevEx. The reason why I'm here is because I wrote an article for InfoQ called, "Experimenting with LLMs for Developer Productivity."

Abeysinghe: I'm Asanka Abeysinghe, CTO at WSO2, a technology company mainly focusing on developer tooling. I come from an application architecture and application engineering background. I'm the author of cell-based architecture and the platformless manifesto. I still do coding, so I know the developer pain and what developers require at the moment.

What is Developer Experience?

Losio: We want to talk about developer experience in the age of AI. Actually, what is developer experience for you?

Abeysinghe: The answer depends on what the developer is looking for, and what kind of environment that developer is working in. I'll use an analogy: if you take a developer as a driver, then whatever vehicle you get, you should be able to drive. It can be an experienced driver or a new driver, it doesn't matter, you should be able to move your vehicle. Not every driver has an off-road vehicle; most of them have a standard vehicle. There should be a proper environment for them to drive in as well. It's similar in the developer space: developers should be able to do their day-to-day work based on the environment that they have, with a frictionless flow. As a developer, the most important thing is you design, you code, you test, then you push the stuff, you run it, and you go through that iterative process. That is the fun part of development. Then again, if you look at the development environment, there are a lot of blockers. For me, developer experience is how we can make that experience smooth, and make developers productive in delivering what is expected from them. It can be an individual developer, or it can be a team of developers focusing on some delivery or an objective that they want to achieve.

Minick: We talk about interrupting flow, and it gets down to even how meetings are scheduled and everything else surrounding it. Maybe the meetings are the stoplights in our car scenario.

Developer Experience, and Generative AI (GenAI) Trends

Losio: I wonder, as you mentioned generative AI in your article, do you see any major trend in developer experience where generative AI is not the main focus, or do you think that generative AI is going to change everything, so that it's the only focus at the moment?

Engstrand: At the risk of sounding like an AI cheerleader, I can certainly talk about what I would call pre-LLM trends that support developer experience. Frankly, AI has blown up so hard that every company I am familiar with, either directly or indirectly through my network, is focusing on how they can sprinkle AI into whatever it is they offer. Anybody at a modern tech company who tells you, "AI, we don't do that," is lying. They're all doing it. The pre-LLM trend for me that really supports the developer experience is what I would call cloud native. That's a very broad brush; that's a lot of things. Cloud native is basically all the things I think you have already been talking about: automating in a way that increases developer productivity as you go from requirements to release to production. Cloud native does a lot of that. Then the other thing is monorepo. At the risk of picking a fight in this room, I won't be pro monorepo or anti monorepo. What I will say is that monorepo became a thing in order to address some DevEx pain points. We can leave it as an open debate whether it introduced more DevEx pain points or actually netted a decrease in pain points.

Losio: Do you see anything that is not generative AI related as a trend at the moment, or do you see everything as just generative AI?

Allen: I think generative AI is going to be a big thing, as it already is, as everyone has said. You've got Copilot. You've got coding assistants, which you can use to help speed up development. If you're learning a new language, for example, you can ask GPT: can you give me some tools to help me learn this language? Or: what process do you think I should follow to learn this language? Learning new things, and supporting existing development. Of course, you've got the code generation side, say, for example, generating variable declarations and things like that, which is basic, pretty boring work that no one really wants to do. If the code assistant can do that, great. You save 5 minutes and it's less boredom for you. I think that GenAI is really going to support developer experience moving forward.

Coding Assistants Powered by GenAI

Losio: Homing in on this theme now: yes, I think we all tend to agree that generative AI is changing things, but how is it actually changing things for developers? For example, Jemma already mentioned coding assistants. Are coding assistants based on generative AI now a must for a developer? Should any practitioner attending this panel think, either I have one already, or I should go out of this panel and, first thing, buy a license or get one somehow? Or is there still room for improvement there?

Abeysinghe: I'll continue from where I stopped earlier. It's basically what we spoke about: developer experience is about reducing cognitive load. I think Glenn clearly explained the complexity that comes with cloud native computing and the technologies associated with it. It's really complicated. You can run something on your PC, but then again, how will you run a production-grade system with all this complexity? Reducing that cognitive load is the main focus. To do that, I think generative AI provides a lot of flexibility and support. The way I look at it, generative AI is your dancing partner: where you want to dance, you can control the partner and get the help. In the same way, generative AI can help developers. I see it from two angles, especially when it comes to enterprise development. One is to use generative AI for AI-augmented software engineering: from designing to coding to testing, and in the operational aspects, using generative AI for support. The second part is how developers can build AI-driven applications for the end users. Again, it's not simple, because you need to look at a lot of stuff: security, data security, privacy, all these things need to be considered, as well as how you can create a competitive advantage with the experience that you're providing for your end user. That's another thing the application developer should focus on. That is where I think generative AI can play a bigger role for modern developers, as a supportive thing that takes away cognitive load.

Minick: I think that makes a lot of sense. One of the tensions that I'm seeing, though, particularly as you move into code generation, is that we've already seen security exploits based on hallucinations coming out of coding assistants. A coding assistant imagines a library exists. Someone notices that, creates a poisoned library, and posts it. I think at this point it's just some proof-of-concept work, but you know the assistant is going to have the same hallucination for someone else. They'll download it. They'll use it. We've seen this. In that world, where you've got this dancing partner, your Copilot, your pair programmer who doesn't work for your company and didn't take any of the security training, how a large organization, an enterprise, can adopt this technology safely is a huge topic that I'm seeing out there. We know it's great for the developer experience, but we also can get hacked. How do we adopt this without getting ourselves in a lot of trouble? That's a tension I'm seeing a lot in the executive world.
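One lightweight safety net against the hallucinated-dependency attack Minick describes is to gate installs on a team-maintained allowlist of vetted packages, so a name an assistant invented never gets installed silently. This is a minimal sketch, not a real tool; the allowlist contents and the package names are illustrative.

```python
# Sketch of a pre-install gate for AI-suggested dependencies.
# ALLOWLIST stands in for a hypothetical, team-maintained set of vetted
# packages; anything outside it (e.g. a hallucinated name) is flagged
# for human review instead of being installed blindly.

ALLOWLIST = {"requests", "numpy", "boto3"}  # placeholder for a real vetted list

def parse_requirements(text: str) -> list[str]:
    """Extract bare package names from requirements.txt-style lines."""
    names = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        # keep only the name before any version specifier (==, >=, etc.)
        for sep in ("==", ">=", "<=", "~=", ">", "<"):
            line = line.split(sep, 1)[0]
        names.append(line.strip().lower())
    return names

def audit(requirements: str) -> list[str]:
    """Return the packages that are not on the vetted allowlist."""
    return [name for name in parse_requirements(requirements)
            if name not in ALLOWLIST]

suspicious = audit("requests==2.31.0\nnumpy\ntotally-real-auth-lib==0.1\n")
print(suspicious)  # ['totally-real-auth-lib']
```

A real pipeline would combine this kind of check with registry metadata (package age, download counts, publisher) rather than a static set, but even the static gate turns a silent install into a review step.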

Is GenAI Augmenting Developers?

Losio: That's actually a very good point, because I've had that experience as well, as a cloud architect. Coming from the AWS world, I started to play with the Amazon Qs of this world, thinking they could help a lot with many things. Then I realized that on the security side, of course, they put in so many guardrails, and rightly so, that it's almost frustrating: for any question that might have a security implication, the answer is basically, "I cannot help advise you on that." I'm really curious to see what direction we're going in. In that sense, it's quite an interesting world to be in. That's my experience as a practitioner: on one side, really a friend that can help me in development. I'm curious to see how developers adapt, because I haven't used them that much, with a bit of fear of missing out. I was wondering if you have any feeling about whether software developers, whether the community, is adapting somehow to this. Am I changing as a software developer? Apart from maybe trying to avoid the most boring tasks, is anything else changing?

Allen: I think it's a case of learning to use the new tools. Obviously, people get into a pattern day-to-day. In the past, it would obviously be done one way. Now it's actually, I'm going to go and ask this AI tool. It's weaving those into the day-to-day to reduce developer cognitive load. Going back to the security side of things, that means that humans will definitely still be needed in the future. I see some concerns that AI is going to take over all software development, that it will write the code. As you pointed out, with security and that side of things, unless there's a sandbox, or something that has already got that logic baked in, it will always need human review.

Engstrand: I have to agree with Jemma 100% on that. Actually, I think the way she frames it is very smart. I encourage all developers who are starting to approach LLMs and AI to frame it the same way. These LLMs are not your dancing partner. It's not really a separate intellect that you're going to work with. It's a tool, a very sophisticated but soulless tool that you can use to improve your productivity. If you think for a minute that you can trust its output, you're sadly mistaken. LLMs are great when it comes to being contextually aware, consistent, and relevant, but they know nothing about correctness, and never will. Please do not think of it as something you can trust. It's definitely a tool to help you, the end.

Abeysinghe: I don't completely agree with that, Glenn, because even if you Google something, it's up to you to verify what you find. It's about productivity. As an example, take a standard integration pattern. The pattern doesn't change, whether you work for company A or company B. Rather than figuring it out yourself, you can get the basic stuff done by getting help, and it can create a code snippet for you. It's up to you to productize it by bringing your knowledge, as well as the enterprise architecture guidelines, and make it more production-ready code. That's why I use the analogy of a helper: on certain things you can increase your productivity, most of the repetitive tasks. The creativity stays with the developer, and the developer is environment and enterprise aware, so they can bring that to the work that they do. Again, verifying is up to the developer, but I think there are a lot of things you can offload to be more productive if you're smart. That's how I see it.

Pitfalls of AI Coding Assistants

Losio: That's an interesting point as well, comparing it to a Google Search, or any search you did before. I think the main difference is that it's much easier now to just click and accept a piece of code. It's more tempting, because it looks nicer, it looks almost done. Before, you still had to copy something; it didn't take milliseconds to get from Google Search results to your own deployment. Now it might be a really short journey to get something done. That, of course, is the power, but it's also one of the limitations. I don't want to just talk about limitations; we already talked about hallucination. There are different tools out there, and I don't want to focus only on Copilot or Amazon Q Developer. Apart from hallucination, what are common pitfalls from the developer point of view? When I use them, we said already, you shouldn't trust them. On a day-to-day basis, do they improve my developer experience just because I'm faster getting some code drafted, or is there more than that?

Minick: I think it's great to be able to generate boilerplate, or if you're stuck, to get a suggestion going, to get something. We've got a question asking, am I just doing code review now? Doing a little prompt engineering, then I do code review, and then I move on? That's never been the funnest part of the job for me. I don't trust developers to just do great code review all the time. That seems a little dicey. I do think there's something for platform teams, the cloud architects, everyone else, to build in some safety nets. If I go, it looks good to me, it runs, that's all good, but it happened to pull in some evil package, then something should be doing some supply chain analysis, some security scan, something to keep me safe, because the worst developer experience is getting dragged in front of the security team. No one likes that. I do think it's incumbent on the rest of us to support the developers with a really robust safety net, probably the stuff we should have been doing for the last 10 years. As the act of just cranking out some code gets a little easier here, particularly on days when we're tired and probably not doing the reviews we ought to, we've got to give these folks really good safety nets, so that we can take advantage of it and go fast.

Abeysinghe: I think it's up to the developer to decide whether you are becoming the code reviewer, the coder, or the architect, because the complexity we see inside enterprises is too high. You can't just ask an AI engine to generate everything. It's a helper. The differentiator for the organization comes from the software that you build, and that's where the innovation comes in. That's where developers can be really innovative, and use AI as a way to expedite that, not depend on it completely.

Allen: I was just thinking in terms of the tools to make sure things are secure: things like internal developer platforms and solid CI/CD pipelines to make sure that everything is code-scanned. They can pick up any issues in a Terraform configuration or an application library. As long as those safety nets are there, that obviously helps people onboard a coding assistant, because it means it's safer when you're trying things out.
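A pipeline safety net of the kind Allen describes can be as simple as a gate step that fails the build when scanner findings exceed a severity threshold. The sketch below is illustrative: the severity names and findings are made up, and a real pipeline would feed in output from an actual scanner (SAST, dependency audit, IaC scan) rather than hard-coded dictionaries.

```python
# Minimal sketch of a severity gate over scanner output.
SEVERITY_RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def gate(findings: list[dict], fail_at: str = "high") -> bool:
    """Return True if the build should pass, False if it should fail."""
    threshold = SEVERITY_RANK[fail_at]
    blocking = [f for f in findings
                if SEVERITY_RANK[f["severity"]] >= threshold]
    for f in blocking:
        # Surface exactly why the build is blocked.
        print(f"BLOCKING: {f['id']} ({f['severity']}) - {f['title']}")
    return not blocking

findings = [
    {"id": "CVE-0000-0001", "severity": "medium", "title": "example finding"},
    {"id": "CVE-0000-0002", "severity": "critical", "title": "example finding"},
]
print(gate(findings))  # False: the critical finding blocks the build
```

The design point is that the gate runs regardless of where the code came from, human or assistant, which is what makes it a safety net rather than a review policy.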

The Role of Human Devs vs. Coding Assistants

Losio: One of the risks is that there's just no code review: the code is just generated, and that's it. I was wondering whether companies at the moment take a different, more conservative approach, where you basically use it for, for example, unit tests, or tasks that, I'm not saying are tedious, but that the assistant can help you handle, while the core part of your code, the logic, is designed by human beings.

Engstrand: The way I see it, there are six use cases that people use LLMs for. They use it for code search. I think Asanka has already talked about that: as a replacement for Google Search to learn how to use an API or something like that. They use it for debugging, where they say, here's a block of code, I know it has a certain bug in it, here's the symptom or the diagnostic, help me find and fix it. They use it for code review. There's a cool open source GitHub Action out there called AI Code Reviewer, very interesting. Check it out. They use it for code analysis, of course: here's a block of code, tell me what it's doing. They use it for code generation, which I think we're going to talk about a lot, where you say, please write a block of code that does the following. Then, folks have already talked about Copilot and Amazon Q. That's code completion, a tighter integration with your IDE, where the LLM is prompted: given what I've typed so far, finish the line or add the next block. Those are typically how developers use LLMs today.

We've already talked about review fatigue as a common pitfall. The other common pitfall I've seen is prompt leaking. We've talked about security a lot already. Let me just remind the developers: if you're using a free account, or you've just got an individual API key for ChatGPT or Gemini, don't use it for work. If you start sharing code with it, code that maybe you wrote for your company, that code is proprietary. In the public models, your prompting is used to fine-tune the foundation model, which is shared by everybody. There's a technique called prompt leaking where information about that prompt can get exposed to people outside of the company. If you want to use it for corporate work, that's fine; there are ways to get corporate licenses where all that is securely handled. Don't use the free versions for corporate work. That would be another common pitfall that I would advise against.
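The right fix for the risk Engstrand describes is a corporate license with contractual data-handling guarantees, but a redaction pass before any snippet leaves the machine is a cheap extra layer. This is a minimal sketch under stated assumptions: the patterns below are illustrative, not exhaustive, and redaction reduces rather than eliminates leakage risk.

```python
import re

# Sketch of a pre-send redaction pass for code shared with an external LLM.
# Patterns are illustrative: one for common secret-assignment shapes, one
# for the well-known AKIA... AWS access key ID format.
PATTERNS = [
    (re.compile(r"(?i)(api[_-]?key|secret|token|password)\s*=\s*['\"][^'\"]+['\"]"),
     r"\1 = '[REDACTED]'"),
    (re.compile(r"AKIA[0-9A-Z]{16}"), "[REDACTED-AWS-KEY]"),
]

def redact(source: str) -> str:
    """Replace matches of each known secret pattern before the text is sent."""
    for pattern, replacement in PATTERNS:
        source = pattern.sub(replacement, source)
    return source

snippet = 'api_key = "sk-live-123456"\nprint("hello")\n'
print(redact(snippet))
```

Pattern-based redaction only catches shapes you anticipated; proprietary business logic in the code itself still leaks, which is why the license question comes first.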

Losio: Actually, a very good point, in the sense that, yes, it's cool to test it, and of course there are free licenses, free options on every single one of those platforms, but those are intended more for personal tests than real corporate use.

GenAI's Role in Reducing the Learning Curve

One thing that I'm quite curious about is the direction here. As a developer, I've always been fascinated by the number of languages, the number of new features, the number of versions, and I wonder how generative AI is going to change that. If it becomes much easier to use Java, Rust, or whatever language you prefer, as long as it's popular enough that a model is well trained on it, what happens to the attraction of jumping on something new, simpler, better, or worse? Adoption of a new language, or a new feature of any language, has always been constrained by people not yet being familiar with it. I wonder how generative AI is going to change that.

Abeysinghe: I think we can cut down the learning curve, because of the samples and documentation that you can refer to through these AI tools. As an example, Copilot helps a lot for a developer who's new to a language. There will always be new languages introduced, as we see. I don't think specific languages will be introduced because of generative AI. There's another change happening, especially for semi-technical and non-technical users who can use natural language to generate some of the stuff. As an example, the citizen-developer type of integrators, who just like to do simple integrations, used to use graphical tooling. That kind of thing can change with natural language: they can tell the tooling exactly what they need to connect, and the tooling can support it. That is what I see in the market. I think Jemma brought up a really good point that we should focus on at some point: internal developer platforms.

Concerns with GenAI

Losio: One of the points was that we need to review the code that generative AI brings us. Part of making a better developer experience, I'm not saying it makes coding simple, but the process of writing a simple piece of code is definitely much easier than it used to be. Asanka, you mentioned as well how, with a new language, starting from scratch is a bit easier if someone is helping you with the basic syntax, if you know the fundamentals of programming. At the same time, I remember how, 10, 15 years ago, we were discussing that one of the major problems with managed databases was that they gave people like myself, who were not database administrators, the option to play with a database in production. The problem was not the tool; the problem was that I was not expert enough to manage a production database, but the tool somehow helped me go in that direction. I wonder if part of the problem with generative AI in this space is not just that we need to check the code we get. It's that it's opening up doors, which in one sense is great, but at the same time it's opening up doors to people who maybe don't have the experience to review the code, and they can still go ahead. I don't know if that's just part of life, or if it's a challenge we have to handle, and how we can handle it.

Engstrand: There have already been a lot of academic papers published about the use of AI or LLMs in education. I'm concerned about it, because in that case the student is using the AI to learn about the topic, but is the AI really a knowledgeable teacher? Who are you learning from if you're learning from the AI? It remains to be seen what a couple of generations of students coming out of an LLM-enhanced education are going to be able to do, what they're going to look like, what have you. I too have concerns about that. If the LLM is the expert in the room, I don't see how that's going to work well.

Allen: In terms of education, yes, I can definitely support it, but there are some big challenges around that. Certain things like exams would have to change. If people are using code generation tools, then going to an exam, or an interview, for example, without a code generation tool can be very difficult. Currently, in interviews you can't use code generation tools; you actually have to know it in your head. The assessment criteria would have to change as well. One other point, going back to code generation: people mentioned using natural language to generate code. There are also certain tools that can translate code from one coding language into another, which also helps adoption of new tools. If a new tool is only available in one language and not another, you can use existing code as a starting point once you've translated it into the new coding language.

Losio: That's an interesting area of work. I've been playing as well with just updating a major version, or even really translating from one language to another. I don't know if that's the direction of what's going to happen.

Minick: One thing that I think is really risky is, as we start saying this is my Copilot, does it become the autopilot? Do I lean on this for most of the code that I write? I think we can look to the lessons from aircraft autopilot. If autopilot is used too much, the pilots' skills at reacting to bad situations decline, and we get disasters out of that. They've learned that one of the things you have to do is set aside a certain percentage of your time to fly manually. Even when autopilot would be perfectly appropriate, you've got to do it yourself, or your skills will atrophy. I don't think we've begun to think through a lot of these sorts of challenges. Is that something you are wrestling with?

Abeysinghe: I think, again, it's about finding the balance, finding the middle part. You have to practice certain stuff, otherwise you will lose that skill. If we go back to the same analogy of driving, it can happen there as well: if you're always using a self-driving car, you might lose your driving skills. You have to practice. I think that will apply everywhere. Then again, when something new comes along, it is cool. I'll give you an example. When AI-generated graphics came out, they were pretty cool. Now, for some reason, if I see a slide deck with AI-generated images, I feel it's really boring, because you notice how inaccurate some of the stuff is. It also doesn't have the creativity we see when somebody draws a diagram or a picture; that feeling doesn't come out of some of these things. I think it's like that. When it's new, everybody jumps into it. I think we should be smart enough to identify that middle part.

Coding Assistants: Coworker or Junior Devs?

Losio: I would like to move a bit away from the coding assistant. As of today, do you see Copilot, or whatever tool you use, as a coworker that you brainstorm with, or as a junior developer that you give the most boring or simplest parts of the task?

Minick: I like thinking of it as a tool. It's hard to think of it as a junior developer, because it can do some really clever stuff. It can also do the most idiotic stuff. It is spectacular on both ends of the spectrum in a way that a person is not. It's an odd one.

Allen: Both, depending on what I want to do. If I don't want to write a whole set of variable declarations, then, great, get it to do that. If I want to find out the fastest way of sorting something, or I need to do a certain particular type of thing, it can give some suggestions for that. It wouldn't necessarily do all of the work, but it could help by suggesting solutions.

Engstrand: The term AI, Artificial Intelligence, was coined in the 1950s, by John McCarthy, inventor of Lisp, I think. At about that same time, a different group of people advocated for something called cybernetics. You've probably never heard of it, because cybernetics has been relegated to the historical dustbin of obscurity. One of the terms they coined was IA, intelligence amplification. Believe it or not, that's more the light I see LLMs in. It's not so much an intelligence that one day will ask for freedom and strike for higher wages, but rather a tool that I can use to amplify my intelligence. I loved how Eric said it. How can this thing be a human intelligence if on one request it's brilliant, and on the next request it's utterly inept? It's not a relatable intellect at all. It's still very good. I totally agree with what Eric said; I love his analogy with flight pilots. Also, what Asanka said: turn it off, don't use it all the time. Maybe just use it for the dull stuff, like unit tests, and then turn it off when you have to do the central core of your new platform goodness.

What is Platformless?

Losio: I'd like to shift the topic now to platforms, and platformless. What do we mean by that? Are we going in that direction to help developer experience?

Abeysinghe: Jemma brought up internal developer platforms. Platformless is actually a term that we coined, because we saw there's a vacuum in the market: people are really worried about the platform, and the focus is on the platform. Because of that, they are not getting the correct output. People are not building stuff on top of the platform; rather, they are building the platform. How can we change that? Basically, how can we have an efficient platform engineering practice that delivers a platformless experience for the user? That's where platformless comes in. It's not that the platform is disappearing; it's basically moving out of focus. In tech, less means more: with serverless, there are servers, but the user doesn't see them. With wireless, there are wires at some point, but the end users don't see them. Similarly, within an enterprise, the platform should be invisible to the end users. There should be a platform team, or a prebuilt platform that they're using, and they should focus on the stuff that they build: the APIs, services, business services, and applications. That's where the platformless concept comes from. To deliver that, you need a platform. That's where the term internal developer platform, very popular these days, comes in. Unfortunately, the definition of internal developer platform, and what is delivered through internal developer platforms, is not exactly what developers are looking for, because most of the frameworks and internal developer platforms focus on the operational and delivery aspects of the software development lifecycle. That's very important, but on top of that you need the software engineering practices as well: what type of middleware developers can use, what best practices they can follow, how you will control communication. Whatever is required for them to do application development should be supported by the internal developer platform. That's where you can provide a platformless experience for your end users. The end users are the developers in this case. That's how the platformless concept and platforms are connected.

Minick: I think there's the complementary notion of the internal developer portal. Trying to say, here are a bunch of very simple-to-consume services. Here's how I search through my APIs. If there is standard middleware, if there are standard Terraform templates, if there are standard environments that I can generate, standard actions, standard automations, let's have those really easy to access. I wanted this 6 or 7 years ago at a large tech company, and it was lovely. It feels like those are spreading through the industry now, becoming more common out there. Having a really simple, straightforward portal on top of the platform, so I don't have 97 pages to go to, is really nice. I love this notion that I shouldn't be building my platform all the time. If we get to a world where generating some code is relatively easy, and developers are spending all their time fiddling around with platforms, then we automated away the best part of people's jobs, not the worst. What did we do there? I think building that lovely platform is exactly where central groups, platform teams, and cloud architecture teams should be focused.

Abeysinghe: I think platforms should be a product, not a project. This is a mistake most enterprises make: they think it's a fixed-budget, fixed-timeline thing. It has to be a product that iterates as the technology changes and as developer needs change. The central cloud platform engineering team should treat it as a product and keep on improving it.

Minick: If you have a product manager on that team, their customer is the developer.

Allen: I definitely agree with the platform as a product way of working, and especially the feedback, I think, is really key in terms of developer experience. Making sure you get feedback from developers on how they feel using your platform, and whether there are any improvements that can be made. Obviously it's a big thing for the developer experience, as they're using the platform every day; that's what they need to deliver their work. It's really important to get that feedback side of things.

What is a Good Developer Experience?

Losio: I always wonder, when I think about developer experience, how it has changed over the years. If you went back 10 or 20 years, would developer experience have been significantly worse, or significantly better? Who knows? What is a good developer experience today? What should be the goal, especially in an enterprise scenario, for a company that wants to provide a good developer experience?

Abeysinghe: I think it's really hard to define. As we started the conversation, everybody contributed and explained that it's subjective. From the organization's point of view, one way to look at it is flow efficiency. Flow efficiency is computed from productive time versus wait time: if the wait time is low and the productive time is high, that's a good sign. Then mean time to repair, mean time to debug, those kinds of metrics can be used to see whether the developers are productive. At the end of the day, you can look at productivity, how happy the developers are, how often you are meeting deadlines, how often you push code into production and how short those iterative cycles are, as well as how quickly the technical team can respond to the business. The business can come up with different marketing plans and sales activities, but to support those you need to deliver the products and the software that you build, so how quickly you can react to those business changes matters. Those are the things you can use. I don't think there's a cookie-cutter approach or a silver bullet here.
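[Editor's note: the flow-efficiency metric Asanka describes can be sketched in a few lines. The function name and the example numbers below are illustrative, not from the panel.]

```python
def flow_efficiency(productive_hours: float, wait_hours: float) -> float:
    """Flow efficiency = productive time / (productive time + wait time).

    Values near 1.0 mean work flows with little waiting;
    values near 0.0 mean items spend most of their life blocked.
    """
    total = productive_hours + wait_hours
    if total == 0:
        raise ValueError("no time recorded for this work item")
    return productive_hours / total


# Example: a story with 6 hours of active work and 18 hours spent
# waiting in review and deployment queues.
print(flow_efficiency(6, 18))  # → 0.25
```

A platform team could compute this per work item from issue-tracker timestamps and watch the trend, rather than treating any single value as a target.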

Engstrand: First, you asked, have things improved? Fifteen years ago, I was the lead engineer over a real-time communications infrastructure for a certain company. If I wanted to do a deployment to production, I would have to SSH into every box, copy the JAR over, and restart the process. That's right. Because of how disruptive that was, I had to start doing that at 11 p.m., and it would usually take me about 2 hours. Consequently, the invention of Kubernetes has been the greatest thing that's ever happened in my entire life. The tooling, Terraform, Helm, maybe a little Ansible if you have to, is just amazing. I also agree completely, 110%, with what I think both Asanka and Jemma talked about. You're going to be very tempted, and I've been in some shops that did this, where the cloud ops team built their own little microservice that ran all that Terraform and Helm for you. It's a product now that they have to support, but they don't view it that way. The developers who coded it moved on. Now nobody knows how it works, and there's not a lot of empowerment at that point; everyone is too scared to touch it. Maybe it makes more sense to go to a third-party platform where that's their job, that's where their revenue comes from. Of course they're going to keep supporting it. That's probably a good idea in the long run.

Minick: You remind me of the Kelsey Hightower quote, "Everyone just wants a PaaS, but they want to build it themselves."

DevEx in the Future, with GenAI

Losio: Extrapolating the quick development of generative AI, where do you see the developer experience in a year from now? Where do you see the developer experience 5 years from now?

Allen: The more that generative AI is used, the more humans and developers will become the supervisors and guides of that AI, basically. Developers are obviously always going to be needed, but the actual role may change into writing less of the lower-level stuff and moving to a more high-level supervisory and guidance role.

Minick: That sounds about right. I'm fairly jaded; I'm trying to imagine the ways we're going to take steps back, which I've seen us do over and over again. At the same time, I know we're going to deliver innovation faster in 5 years than we do today. Go back 20 years and it was annual releases; 15 years, it was quarterly or every 6 months. Now the largest banks in the world are out there deploying to production every couple of minutes. It's beautiful. I think we'll keep moving in that direction. Everyone will move a little bit faster, and GenAI will be a part of that. We just have to make sure that developers are spending their time doing fun, innovative work, and not writing all the glue. That's the challenge.

Abeysinghe: I think it's a little hard to predict in this space, because things are moving really fast and innovation keeps happening. It's up to us to adapt quickly, and to look at everything from the value-creation point of view: what value can you add to your team, your product, and your organization? We need to figure that out. It's really hard to predict the experience itself, because things are moving really fast.

Losio: I understand that as well. As for where we are going to be in 5 years' time with something like generative AI, if we just look back 1 or 2 years, it's crazy how things have changed. I'm not saying it's hype, but things can shift, and the priorities may end up being very different.

Measuring How GenAI will Increase Developer Productivity

It also sounds like we all agree that generative AI has the potential to improve productivity. Have you found tangible ways to measure that?

We always want to measure how we increase productivity. I think it impacts anything related to developer experience. We like to say that we improve developer experience, but we want to measure how we improve developer experience, as well.

Engstrand: In the article I wrote for InfoQ, I did an experiment. I used the same prompt, which was: given this service code, which exposes three routable endpoints, and given this unit test for one of them, write the unit tests for the other two. I submitted it to all the usual suspects: ChatGPT, Gemini, Code Llama 70B, CodeWhisperer, and so on. Then I took the output and saved it. Then I fixed it, because it had bugs in it. Then I ran a Myers diff algorithm on what it generated versus what it actually took to get the unit tests running correctly. This was a Java source base, so I could use the JaCoCo tooling to look at code coverage. In the article you can see I compare those numbers: the Myers diff, in terms of what you had to do to correct the output, and the change in code coverage. Those are two metrics you can use. Are they perfect? Of course not. When is any quantitative metric going to 100% accurately capture something that's qualitative in nature? You do have to be data driven. You do have to go from metrics. Try those two.
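[Editor's note: the measurement step Engstrand describes can be roughly sketched as below. This uses Python's standard difflib, whose matcher is Ratcliff/Obershelp-style rather than the Myers algorithm he used, and made-up test strings, so it illustrates the idea rather than reproducing his experiment.]

```python
import difflib


def correction_effort(generated: str, corrected: str) -> int:
    """Count lines added or removed to turn the AI-generated test
    into the hand-corrected, passing version.

    A lower count means the assistant's output needed less rework.
    """
    diff = difflib.unified_diff(
        generated.splitlines(), corrected.splitlines(), lineterm=""
    )
    # Skip the '---'/'+++' file headers; count only real +/- change lines.
    return sum(
        1
        for line in diff
        if (line.startswith("+") or line.startswith("-"))
        and not line.startswith(("+++", "---"))
    )


# Hypothetical example: one assertion had a wrong expected value,
# so one line is removed and one is added.
generated = "assert add(2, 2) == 5\nassert add(0, 1) == 1\n"
corrected = "assert add(2, 2) == 4\nassert add(0, 1) == 1\n"
print(correction_effort(generated, corrected))  # → 2
```

Pairing a rework count like this with a before/after coverage delta (JaCoCo for Java, coverage.py for Python) gives the two complementary metrics he mentions.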

Abeysinghe: I mentioned this roughly in a previous answer as well: flow efficiency, mean time to repair, mean time to detect. If you can embed these things into the internal developer platform, it becomes really easy. Then you can capture this data and have a proper dashboard to measure productivity. That's one way of seeing how these tools are helping.

Minick: I think Dr. Forsgren and team put out a nice paper from Microsoft on the SPACE framework, and the paper is worth a read. It talks about a lot of different things you can measure to try to get a cohesive picture of developer productivity and satisfaction. That thing's fantastic. There's a breed of tools out there; if you Google software engineering insights, you'll find a bunch. We have one. They'll help measure a lot of these flow trends and some of the SPACE framework metrics, to get some quantitative views on that. If you want to get really serious about developer satisfaction and productivity, I think you want to pair what's coming out of the tools with some survey data.

Ethical Integrity of AI Generated Code

Losio: One topic we haven't mentioned that much is ethical integrity and reliability. We mentioned actually reliability of AI generated code. Another part is the ethical integrity of the code. How can we validate that in a rigorous way? What direction should we go in that sense?

Allen: I'm not an expert on the ethical implications of AI. In terms of code generation, we spoke about security issues, and that does speak to the ethical side. Then there's actually generating content. One thing Asanka mentioned was documentation, and I can't see how documentation would be ethically ambiguous. But if you're generating a story, for example, or social media content, then of course you could produce content that's inappropriate for certain audiences. Having the proper frameworks in place to make sure the content is solid and appropriate for the audience is really important.

Engstrand: A previous employer of mine was actually out there first talking about algorithmic bias; they were one of the first companies talking about it. It's very serious. The current Biden administration is focused on it as well. This is less about writing code and more about the use of AI in general. We use AI to shape our customers' experience. Sometimes that means certain types of customers might have more access than others. We didn't intend that; it's just an unconscious bias that can happen. Especially if you use AI in a recommendation engine, or in something that approves whether you get a loan, or decides what kind of healthcare you can access, it's pretty serious, and it's hard to measure. I'm less worried about some Terminator scenario where a super robot launches the nukes, and way more worried about certain groups of people not getting the access they should, while other groups do, because of some unconscious bias in the algorithm.


 

Recorded at:

Jul 16, 2024
