Yes, we're doing a couple of things. I gave a talk on Agile Data Warehousing and Agile BI a couple of days ago. I was on a scaling panel at the Executive Forum on Monday, and we had a booth where we talked to a lot of people, shared information, and had a good time.
Shane: Cool. Well, maybe we can talk about a few of those things. One of the things you've mentioned is that Disciplined Agile 2.0 is now available. Tell us what it is and what's different.
Yes, definitely. So the Disciplined Agile framework came out in June 2012 in the form of the Disciplined Agile Delivery book. At that time, the focus was on delivery. So from beginning to end, if you're still doing projects: how do we start an Agile project and do the initial Sprint Zero and the inception stuff in an effective, streamlined manner? How do you do construction? And then finally, how do you transition effectively into production and then, of course, loop back around, rinse and repeat?
The philosophy there was always "it depends". So Disciplined Agile is the "it depends" framework, because just based on observation, every person is unique, every team is unique, and every organization is unique. So if you want to own your own process, which is fundamental in Agile, then you really need a flexible framework that guides you through all these decisions and shows how it all fits together.
DAD makes those options explicit: here are the things you need to consider, here are your options, here are the tradeoffs, so choose intelligently. In some cases, there's sort of a maturity-model thing going on where this practice is generally better than that practice, and so on. So sometimes you're not doing the best possible thing because of whatever maturity level you're at; you're doing the best that you can.
But there are better options that you can figure out as you work your way up the food chain. So that's the basic idea with Disciplined Agile Delivery.
In the last year and a half, we've been working on what we're now calling Disciplined Agile 2.0. The basic idea is: let's bring this philosophy of "it depends" to the IT department. It's based on the observation that we see more and more organizations struggling with scaling Agile at the tactical level. Disciplined Agile Delivery was all about scaling at the tactical level, the delivery team: "How do you work when you're a larger team, when you're geographically distributed, when you're in a regulatory compliance situation?", and many other scaling factors. Enough organizations are struggling with that, but what you also see is that when you start rolling Agile and Lean out across the board, with many Agile and Lean teams going, suddenly these teams are touching other parts of the organization.
My Agile delivery team needs to work with the data folks, needs to work with the enterprise architects, needs to work with operations and fulfilment and many others. The challenge becomes: how do enterprise architects work in an agile manner? How can they add real value? How can they get better? But at the same time, look at it from the point of view of the enterprise architects: how can they work with these Agile teams, these Lean teams, and some of these traditional teams that are still underway? They've got a significantly harder problem in many ways.
The same can be said of all these other functions. So that's what we're getting at: how does all this flow? How do you streamline all of this? One of the things that we see, because we often engage at the executive level as well as at the team level, is that if you take a look at the problem space of the CIO, it's significantly different from what developers think.
They're worried about staff morale, they're worried about keeping the lights on, they're worried about paying the bills, but they're also worried about how to make it all work together, because IT is phenomenally complex. There's this challenge in the industry where the enterprise architects have their bodies of knowledge, the operations people have their bodies of knowledge, the data folks have their body of knowledge, the developers have their body of knowledge…
Even the developers can't agree with each other. That's why you get these raging debates in the Agile community, and the same thing in the enterprise architecture community. The problem is that these bodies of knowledge, these religions, don't fit well together, right?
The enterprise architects know they're at the center of the universe: if everybody did what they wanted, things would be great. You can talk to the data folks. Well, they know they've got the secret sauce, and if only everybody else would do exactly what they wanted, it would all work out. The same with the project management folks and so on.
Now, if you're the CIO and you're dealing with this stuff, you're hosed; you've got no hope. So in Disciplined Agile, we're showing you, "Here's how this stuff fits together. Here's how it all flows. Here are the options. Here are the rules of engagement these groups need to start considering." More importantly, it's based on this observation: if you believe in the Agile Manifesto, which I certainly do, then all teams should try to get better, they should try to learn, and all individuals should be doing that.
The implication of that is, if you have hundreds or thousands of teams in your organization, which is reasonably common, all these teams are constantly changing. This team here is doing what it's doing to get better. It interacts with this team over here, so that has an effect on them. Those folks are also trying to get better, and so on.
All these teams are slowly doing whatever it is they're doing to get better, and they're having an effect on everything else. So in Disciplined Agile 2.0, we're taking a look at the bigger picture, almost saying, "Wait a minute. If you want to be successful at Agile and Lean, we really need to optimize the whole, not just optimize locally."
Shane: What a weird thought.
Yes, yes. What a weird thought. But then the challenge becomes: it really is "it depends." It's back to this initial observation that every individual is unique, every team is unique, every organization is unique, so the implication is that you're going to have to think. You're going to have to know what you're doing, and you need to understand what the implications are.
Shane: But that's why I buy a framework. I don't want to have to think.
Well, yes. A lot of the frameworks are rather prescriptive. They say, "This is the efficient way because this is the only way that I know." Even at this conference there's a lot of advice like that: "I have the holy secret sauce and if only you follow it, things will be great."
But inherently, we all know as professionals that that's simply not true, which is why you see all this pushback against the frameworks, and rightfully so. It's unfortunate that we often get caught up in that pushback, and people just assume, "Oh, well, because SAFe is prescriptive, Disciplined Agile must be prescriptive, therefore it's evil too."
Instead of giving me your rant about why you'd never do this, just take a look at it, because all the stuff you just ranted about for the last five minutes, we're doing that in DAD, and we're probably doing it a little bit better because we're looking at a bigger picture than you might be. Take a look and you'll see, "Oh, wow, it's all about giving you choice." I think choice is pretty good.
Yes. So in most organizations, the data management effort is pretty much a disaster. We have very significant data quality problems; this is a well-known thing. When we engage with organizations, we often go and ask, "How's this working? How's that working?" We get to the point of, "How's the data management crap working?" Ninety-nine percent of the time, people just roll their eyes: "Don't go there. It's hopeless." It's not hopeless.
What I talked about in the data warehousing talk was basically how a data warehouse or BI team would go about working in an Agile manner. You've got to get away from doing all the logical data modeling upfront, then all the physical data modeling, then finally building, and then acting surprised when nobody's interested in what we built because it doesn't meet the actual needs.
Just some industry stats: data warehousing teams have the second lowest success rate of all the different project types. The lowest is package implementation, right? So there's significant room for improvement on the data warehousing side of things, and there's also significant room for improvement on the data quality side of things.
We recently did a study where we asked people, "Do you think data is a corporate asset?" Twenty percent of the respondents said, "No, we don't consider data to be a corporate asset anymore in our organization." Six or seven years ago, that number was four percent. So this data technical debt, which is something that's very rarely talked about for some strange reason, is increasing. People have basically given up.
Anyway, what the talk was all about was: what are some of the Agile principles that a data warehousing team could adopt? Things like working more collaboratively, working in an evolutionary manner, and taking a usage-driven approach as well as a data-driven approach. If I had to pick one critical failure factor for a data warehousing project, it would be letting data modeling drive the effort.
I mean, you've pretty much shot yourself in the foot from the very beginning if you think data is your primary issue, which is completely counterintuitive, and I think that's why these teams get into so much trouble. So the heart of the presentation, I guess, was working down what I call the Agile database technique stack.
It's interesting: it's the exact same thing as the Agile stack. You could talk to an Agile developer and a data professional, and both will tell you, "We need to worry about clean architecture and clean design, and there are different ways we can do that. Then in order to make that happen, we need to be able to refactor and refine our work. In order to make that happen, we need to regression test. In order to make that happen, we need to do continuous integration. In order to make that happen, we need to have solid configuration management." That's preaching to the choir, because every Agile developer will pretty much tell you that story --
Shane: Or certainly should.
Or certainly should, yes. After climbing the learning curve, they're fine. Now, in the data world, this is rocket science, right? Just put the word data or database in front of all those techniques and suddenly they say, "How do I do that?" or assume it can't be done. There's this common misunderstanding in the data community that it's difficult to change a production database, that it's difficult to evolve or refactor the schema.
Seven or eight years ago, I wrote a book about refactoring called "Refactoring Databases," which shows that it's a no-brainer. So if you've got the amazing ability to type in code from a book, or better yet use a tool, you can in fact refactor a production, mission-critical database.
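To make that concrete, here is a minimal sketch of the transition-period style of change the book describes, a "rename column" refactoring written as a Python migration script against SQLite. The customer table, the fname and first_name columns, and the database file are hypothetical stand-ins for whatever your production schema actually looks like; the point is simply that the change is an ordinary, scriptable, repeatable migration.

```python
import sqlite3

# Sketch of a "Rename Column" refactoring with a transition period
# (hypothetical schema): add the better-named column, backfill it, and
# keep the old column synchronized until every application has migrated.
conn = sqlite3.connect("example.db")
cur = conn.cursor()

# Legacy table still used by existing applications.
cur.execute("CREATE TABLE IF NOT EXISTS customer (id INTEGER PRIMARY KEY, fname TEXT)")

# Step 1: introduce the new column alongside the old one.
cur.execute("ALTER TABLE customer ADD COLUMN first_name TEXT")

# Step 2: backfill the new column from the old one.
cur.execute("UPDATE customer SET first_name = fname")

# Step 3: keep the columns in sync during the transition period, so
# applications that still update the old name don't break readers of the new one.
cur.execute("""
    CREATE TRIGGER IF NOT EXISTS sync_customer_first_name
    AFTER UPDATE OF fname ON customer
    BEGIN
        UPDATE customer SET first_name = NEW.fname WHERE id = NEW.id;
    END
""")

# Step 4, in a later migration once all applications use first_name:
# drop the trigger and retire the old fname column.
conn.commit()
conn.close()
```

A tool can generate and track scripts like this for you, but even by hand it's a short, testable piece of work rather than anything mystical.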
Shane: An SQL script is an SQL script.
Yes, yes. It's pretty basic. It's a little bit harder than refactoring code, but not that much harder. So what happens once you understand how to refactor a database? Well, suddenly you don't need to work in the serial manner that's preferred by the data community. The unfortunate thing is that the data folks have a very different culture than the Agile folks. This is something I start the presentation with.
They never really made it out of the 1970s or 1980s. Their body of knowledge is very, very far behind. But to be fair, they've just started to catch up, so we are finally seeing a little bit of discussion around Agile data warehousing.
I walked them through it. I had about 100 people in the room, and I didn't actually expect the results to be this bad, but anyway, one of the questions I asked was, "If I went back to your company tomorrow and asked to see the regression test suite for your production, number one database, your customer database or whatever's most important for you, and I wanted to see that regression test suite run, would you even have one?"
Only one person in the room stuck up their hand. Now, I was expecting about 15%. If I asked the same question, "Show me the regression test suite run for your most important application or app server," I would think that in the Agile world at least 85% or 90%, if not more, would raise their hands and it would be working: "Why are you asking such a stupid question?" as opposed to looking like a deer caught in the headlights. That's just a reflection of their culture; it's not their fault. When you're doing traditional development, you're saying, "Well, we do the upfront requirements and architecture, and eventually it gets handed off to the testing monkeys, so we're not responsible for any of that." So there's no body of knowledge around that. There's a serious skills and cultural deficit within the data community right now.
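For comparison, a database regression test doesn't have to be exotic; it's the same xUnit habit application developers already have, just pointed at the schema. Here's a minimal sketch, using a hypothetical customer table and an in-memory SQLite database standing in for the real production database.

```python
import sqlite3
import unittest

class CustomerSchemaRegressionTest(unittest.TestCase):
    """Minimal sketch of a database regression test: assert that the schema
    and its constraints still behave the way the applications expect."""

    def setUp(self):
        # An in-memory database stands in for the real customer database.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE customer ("
            " id INTEGER PRIMARY KEY,"
            " email TEXT NOT NULL UNIQUE,"
            " first_name TEXT)"
        )

    def tearDown(self):
        self.conn.close()

    def test_email_is_required(self):
        # The NOT NULL constraint is part of the contract applications rely on.
        with self.assertRaises(sqlite3.IntegrityError):
            self.conn.execute("INSERT INTO customer (email) VALUES (NULL)")

    def test_email_must_be_unique(self):
        # Duplicate emails must be rejected by the schema itself.
        self.conn.execute("INSERT INTO customer (email) VALUES ('a@example.com')")
        with self.assertRaises(sqlite3.IntegrityError):
            self.conn.execute("INSERT INTO customer (email) VALUES ('a@example.com')")

if __name__ == "__main__":
    unittest.main()
```

Run on every check-in, a suite like this is what makes the incremental schema changes above safe to apply to a live database.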
We're finally starting to overcome that. At the same time, the tooling for Agile database refactoring and database testing: there's some good stuff, some commercial tools and some good vendors like Redgate, but for the most part it's lacking. The database vendors clearly don't get it. So there's an opportunity for anybody in the tool business, or anyone who wants to get into the tool business: here's a huge niche that isn't being served very well right now.
They were talking about their experiences adopting Agile. There were different people at different levels, but it was really healthy hearing the issues they were dealing with, and they were all pretty much still dealing with scaling issues. Some are in the regulatory space, some are just large, but it was all this scaling stuff.
On the panel, we talked about the various scaling approaches. The Four Horsemen of the Scaling Apocalypse is something people like to call us. We talked about our approaches, and what's really interesting is they were all different, which is good; it's good to have choice. But they were all reasonably prescriptive except for Disciplined Agile, so that was pretty good.
There were very good questions coming from the audience. It was very clear the executives do get it. For all the rhetoric of developers saying "my bosses are stupid," they really aren't. They're worried about different issues, at a higher level than you are, but they really do seem to get it, at least the ones who were in the forum. Granted, it's a self-selecting group.
But yes, we were all talking about different ways to solve the problem. What happened is we took questions from the audience, and each panelist was given a chance to address them. We had some slightly different opinions, but for the most part we were pretty much in agreement, other than the various frameworks we were coming from.
Well, I'd say the biggest challenge, really the only real challenge, is that these organizations are all different, and Disciplined Agile is really the only approach that acknowledges that. Especially with the people from the larger organizations, it was very clear they had some Scrummish-type teams and some Leanish-type teams, and they're still doing some traditional stuff.
It's mix and match. It wasn't "we're all doing Scrum" or "we're all doing Lean" or "we're all doing X, Y, Z." There are different teams for different situations, and we've really got to start embracing that basic observation. They were all dealing with hard problems.
Many of them were looking at DevOps, and they were at different levels of adoption. Very few of them had their act together on the Agile data stuff, from what I could tell. The data stuff, for some reason, always gets pushed off to the side and there are very low expectations, which is a real shame. We can do better.
But nobody was there; nobody was in nirvana. They were all pretty much large organizations that have been around for many, many decades, and for several of them it was quite impressive what they were doing. It wasn't the rather artificial, small ten-person startup situation of "I've got no legacy, no technical debt, and complete control of what we're doing." Nobody was in that simplistic situation.
Shane: The Cinderella Project.
Yes, a Cinderella project. They're all in the "We've been around for 150 years and we've got hundreds of millions of lines of code" situation.
Shane: Some of it written in 1960.
Yes, yes, yes. Some of it written by your grandfather, stuff like that.
Yes. The most recent surveys have been around cadences and technical debt. I'm actually in the process of creating a new survey right now, based on a question I got here at the conference this week, around iterations (by the time you see this recording, it will be long gone), specifically: are iteration dates stable? So we have a two-week iteration or two-week sprint; do you let the date slip if something happens?
The vast majority of teams keep their iteration dates stable, which is exactly what I'd want. It was interesting, though. One of the questions we asked is, "So if you do happen to let your date slip, why?" Most of the reasons we're getting are things like, well, it's a holiday. We've got a holiday in the middle of the iteration, so rather than a nine-day iteration we still get a ten-day iteration.
Shane: It's still ten days work. Yes.
Yes. So some of the "yes, we let the date slip" stats are a bit off because of that. It's sort of a minor thing, but the reason I'm running the survey is the response I got when I first answered the question. I told the person, "Well, yes, the vast majority of teams have stable iterations," because that's what I've seen for years.
He didn't believe me, because in his organization, apparently, it's a mess. So it became a religious battle, right? A "this is not my experience, so you're wrong" type of conversation. One of the reasons I run the surveys is that I want to have discussions around what's actually happening out there.
What are people actually doing? Not what the gurus are telling us, not what the certification folks are trying to sell us, not what the vendors are trying to sell us, but what are people actually doing? What's working? Sometimes even why is it working? Figuring out the whys is sometimes hard, but I share that data.
So at Ambysoft.com you can download, absolutely free of charge, the material for every single survey I've run over the last ten years: the questions as they were asked, the source data, the answers as they were given, and then my analysis. The analysis might be the only thing with a little bit of bias in it, but you can analyze the data yourself.
I've had master's students use it, and I've seen papers in various journals referring back to my stuff. So there's quite a lot of usage of it.
Shane: I will say it's a great resource.
Oh, thank you.
Shane: Yes. It really is. Scott, thanks for taking the time to talk to us.
Oh, my pleasure.
Shane: It's been really good to catch up and enjoy the rest of the conference.
I will. Thank you very much.