
Book Review: Making it Big in Software


Sam Lightstone's book Making it Big in Software: Get the Job. Work the Org. Become Great. is primarily about the final phrase of its title: becoming a great professional and software craftsman. Becoming great will, of course, advance your career and enhance your income, but the real value of the book is in the advice and insights on how to become a better developer.

Agile developers, especially extreme programmers, will find this book offers practical techniques that support the XP values of Courage and Communication. Those aspiring to become craftspeople will gain insight into the "person" in the "craftsperson."

The book is divided into three sections: Fundamentals, Leadership, and Greatness. Each section mixes commentary, practical advice, and illustrative interviews with some of the most prominent members of our profession to convey information about critical skills, education (academic and practical), and the relationships between "soft" and technical skills that assure software success.

The author and publisher have provided an excerpt from the book, Chapter 17: Leadership in Software Innovation. Most of the major themes in the book are discussed in the following interview with the author.

The book is called "Making it Big in Software", but it seems it could have been called "Making it Big in Any Career." You do discuss how each lesson applies in the world of software. Can you summarize in a few sentences why this book is particularly useful to software professionals and to those who aspire to become one?

It's true that a lot of the ideas in the book are valuable to many careers. There are commonalities in the behaviors that lead to success. It doesn't matter if you are a programmer, or a brain surgeon, or an accountant - you will be more successful in what you do if you are kind to everyone, have effective time management, and are a great public speaker. My book, Making it Big in Software, is 1/3 general career advice, 1/3 software-specific advice, and 1/3 career interviews with major software personalities. It's the two-thirds that focus on software that make it a truly valuable career guide for geeks and not just "yet another ho-hum self-help book".

We are fortunate to work in a very dynamic and growing profession. We all have huge opportunities in high tech. Almost everything that runs on DC power runs software. There are over 2 billion people connected to the Internet today. All of our technology is increasingly automated and interconnected. That's opportunity! For those who want to be successful in software, and leverage these opportunities, it's worth learning the skills for success that aren't taught in school.

The people you interviewed were very interesting and the interviews quite revealing. How did you select the people to be interviewed?

The interviews were very, very carefully selected. I wanted a mix of researchers, innovators, and business leaders; young and old; and men and women. I also wanted to interview people who really had profound impact. I started by studying the list of ACM Turing Award Winners for the past two decades. I also asked prominent executives whom I trust for their opinions on who they consider to be the most influential people in software over the past 30 years. I made sure to include the innovators of major languages and operating systems. I looked at the list of the top-ranked software companies and made sure I had representation from the leading ones. Then, with a list of about 30 names in hand, I whittled it down until I had a list that struck the right balance of attributes and the right number of names.

Did you interview people that had to be excluded from the book on other than simple "space constraint" grounds?

A dangerous question! There were a couple of people who agreed to be interviewed but then didn't follow through. That was frustrating, although I can't mention names.

Was there anyone you really wanted to interview and include in the book, but did not? (I was a bit surprised not to see Alan Kay in the list, for example.)

I actually did interview Alan Kay! Space constraints made it difficult to include the entire interview. Alan had a profoundly inspired period of creativity and contribution through a pivotal era in modern computing. He was a major force in object-oriented and dynamically interpreted languages, and the early ideas around graphical interfaces. He also invested a lot of time advancing the penetration of computers for educational use. One of the big positive ideas that came through in my interview with Alan was his passion for the art of computer science. And he truly views computer science as an art, similar to playing jazz guitar (another of his passions). At the end of chapter 18 about "Greatness", I included an inspirational quote from my exchange with Alan about how passion for your art drives true joy and accomplishment. For your readers, here it is:
"Artists are people who are driven by inner senses of ideals to do their thing. They tend to be very focused and compulsive. Music of various kinds has been an “alternate art form,” but I’ve been similarly compulsive about it (highly developed forms of music like classical and jazz require lots of thinking and lots of grinding, at least for me). For many such personalities, idealism pretty much sets up goals that are unlikely to be reached. Butler Lampson likes the Browning “reach should exceed grasp, else what’s a heaven for?” quote, and he’s a lot more balanced than I am (I think). I have a master glassblower friend who told me he would take a bite of the molten glass if he could. I know exactly what he meant. This is 'happiness in art.' "

It seemed that almost all of the people interviewed got their start in computing while in grade school or high school. This would mean that they all put in the 10,000 hours of practice that Gladwell talks about so much in his book Outliers. To what extent is an early start critical to a successful career - especially to one that is truly stellar?

I'd like to believe that an early start isn't critical, because it's too painful to believe otherwise. It seems unfair, even cruel, to believe you need to start early or just not bother. However, the reality is that there are very few examples of truly successful software innovators who didn't get immersed early. Painfully, while it may not be necessary, it seems to be very useful. I once heard an interview with a musician who learned to play guitar in his 20s. Decades later he said, "I still play guitar with an accent." This guitarist was brilliant, but his music just didn't flow like that of someone who'd been playing since childhood. It seems our brains integrate language very young. Spoken language, music, math, programming - these are all languages, and an early start builds a fluency that can never be perfectly acquired later on. It's a limitation of the human condition. While coding mastery benefits from an early start, business leadership and business innovation do so much less.

The other benefit of youth is that it's unburdened by responsibility. No mortgage to pay, no family to sustain, no children who need attention. This allows a whole range of behaviors you can't easily do when you are older and settled. You can take business risks much more easily (when you have nothin' to lose, there ain't much risk), and you can work all-nighters. Great startups often emerge that way. This is the "two guys in a garage, with a crate of coke and chips" model, and it really does have youth as a prerequisite. See your doctor before trying that after forty!

Please correct me if I have misread the book, but it seems like a minority, maybe thirty to forty percent, of the knowledge and skills required to succeed are "technical." It seems that most of them are "soft" skills. Is there a way to pick up the soft skills other than through experience?

I think the mix is really close to 50-50 overall, because they are co-dependent factors for success. You can't get by on charm alone, nor can you get by on geekhood (tech skills) alone. For many of us, the skew is towards technical skills during our initial professional years, and the soft skills increase in importance over time. Higher-level positions on both the technical and managerial tracks do require more soft skills. That's because these kinds of roles require people interaction, effective management, experience, jockeying for position, and persuasiveness.

You set up a significant contrast between school and work and seem to find some significant shortcomings in academia. Does our academic system (grade school through PhD) adequately prepare people for the kind of career success you talk about?

No, schools don't prepare us for real success - and I don't think they try to. That's why I wrote Making it Big in Software. Perhaps more directly: is school intended to be about professional preparedness? The first words in my book read: "You went to university to study a profession, but they were hell-bent on giving you an education instead." Our university degree programs grew out of a European tradition based in teaching people how to think and covering major theoretical ideas. Big ideas, profound questions, and great books. They have always been focused on the theoretical. Even degree programs in such eminently practical areas as nursing, engineering, and law remain fundamentally rooted in great ideas as much as in practical application. The big-ideas approach is super valuable and I wouldn't get rid of it. But real success requires both technical and soft skills that aren't taught in school. So if you believe university degree programs are about giving people theoretical background and teaching people how to think, then I think they do a pretty good job. If you believe they should be training grounds for professional success, then they fall short. I recently gave a lecture to a computer science class at one of the top computer science schools in America. I was surprised to learn that the students weren't taught programming languages! They learn design patterns, theory, and algorithms, but the practical bit about actually learning syntax is something the school assumes students will pick up along the way. I think that's a telling example of where a great institution has placed its educational priorities. We can debate whether this approach is ideal, but at least there is clarity in their educational strategy.

Is a degree simply a "pass" that gets you past human resources? Most of the people you interviewed who have graduate degrees seem to say that the experience of obtaining the degree was highly enjoyable and that the educational experience was valuable, but they did not think the degree was itself a prerequisite for their own or anyone else's success. Are there lessons here that would help a student make better use of their time in college?

Let's separate a bachelor's degree from graduate degrees (master's and PhD). I think a bachelor's degree is sensationally valuable. It's not just a piece of paper. Although of course there are exceptions, people who have gone through a top-level bachelor's program are, on average, better skilled than people who have picked up programming through diploma programs or are self-taught. Remember I said "on average". There are lots of very notable exceptions, and some very impressive, brilliant, and hugely successful people achieved great success without having a degree. Some notables: Steve Jobs (Apple), Bill Gates (Microsoft), Mark Zuckerberg (Facebook), Larry Ellison (Oracle). There are really important practical ideas you learn in a bachelor's program - around data structures, algorithms, operating systems, and complexity analysis - that are hard to pick up in a diploma program or through self-learning. A bachelor's degree is also a certification of your skills, mental focus, and work ethic. It is also true that a degree is a "pass" that gets an applicant past HR. That being said, the correlation between academic success and career success is loose. There are too many things that career success needs that have nothing to do with schooling. Some examples: passion for your art, innovative thinking, forming structured plans to tackle fuzzy problems, the ability to lead and organize others, clear communication, knowing how to get things done in a (large) group, humility when it's needed, and a dash of hubris when it's needed as well.

Graduate degrees (master's and PhD) really are much less valuable for the content of what you learn and more valuable for the skills they force students to develop in scientific writing, literature review, and independent research. Not all jobs require those skills, but for those that do, having gone through a graduate program is a great growth experience. So I agree with the sentiment that many of the people I interviewed expressed: graduate degrees are valuable for their side effects more than for their content, and in most cases you can have a brilliant career without them.

To your second question about making better use of time in college, I'll answer with an idea from the book: your odds of getting somewhere in life are massively improved by knowing where that is. If you don't know where you want to get to, you're very unlikely to arrive. So the best thing people can do while they are in school is to figure out what they love and balance that with what's practical, to form a vision of where they want to go. For most of us that vision is a compromise between something really practical, like accounting, and something much more fun, like being a rock star. That's where computer science comes in - you get to work at a nice stable job like an accountant, and dress like a rock star! Seriously, if you know what you want to do and where you want to get to, then you can use that vision to 'game' the system by taking courses and having the experiences that will best prepare you for that goal, regardless of whether they are formally on the curriculum.

What 3-5 things would you change in our college / graduate educational system?

At a meta level, I think we should introduce more practical non-engineering skills, including soft skills and business skills. I'm a huge supporter of the current emphasis on theoretical understanding and learning how to think, but I would like to see professional skills blended in. To achieve that balance, here's what I would change:
  1. Business skills. Every computer science and engineering graduate should be taught about business development and entrepreneurship. There are two reasons to make this a priority.
    • First, because these schools will produce a core of our society's high-tech business leaders. For those who come up with the next big tech idea that will launch a Google or a Facebook, we'd like them to have a clue about how to take that idea forward.
    • Second, even if our graduates are certified techies, who are passionately committed to geekdom and thoroughly uninterested in business matters, the world will be a better place when the people developing our technology are sensitized to market pressures, business needs, and product lifecycle effects.
  2. Effective communication as a mandate. Techies are notorious for their poor communication, but so much of what makes great technology come together depends on effective communication. We have become a technical community that communicates heavily in words, but also by email and through presentations. If you can't speak in public, or create compelling charts, or write cohesive emails, you are professionally hindered. You have to be clearly understood by the people you work with, regardless of whether or not you are a native speaker of the local language.
  3. Usability and design. Our technologists thrive on function over form, but except in a very few domains, where you can assume the operator is either a machine or a human being with years of training, the priorities should be the other way around. The mantra of 'make it work, then make it easy' is killing us. I think Apple took the approach 'make it easy or don't make it at all', and it has served them very well. Part of the reason the priority is backwards is that we haven't trained people to design for usability. We train people in how to design programs from a software engineering perspective, but very little in how to design software for human beings to use. We need to indoctrinate our next generation to believe that the human being is more important than the capabilities we're throwing at them; that is, we need to give engineers the tools to elevate their work so it's designed for the human experience.
  4. Emerging technology. School tends to teach students about fundamentals. We expect undergrad students to pick up the cutting-edge pieces by experience, after they graduate. Even in grad school students look at cutting-edge topics in a very narrow domain. We lose all the possibilities that come from cross-pollination where two or more good ideas are combined to create something really new and great.
  5. How computers work. Amazingly, very few computer science students really understand the entire system in any detail. That has a lot of ripple effects on how programmers develop code. People who understand the machine are more sensitive to it when they program: to how they leverage the CPU, and to how they consume memory, disk space, and network bandwidth.
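
[Editor's note: to make that last point concrete, here is a small illustrative C sketch; it is not from the book. The two loops do the same arithmetic, but the first walks memory sequentially while the second jumps across rows in large strides. On a large matrix the cache-friendly loop typically runs several times faster, which is the kind of machine awareness being described.]

    /* Editor's sketch (assumed example, not from the book): identical work,
     * different memory-access order. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    enum { N = 4096 };                   /* 4096 x 4096 doubles, about 128 MB */

    int main(void) {
        double *m = malloc((size_t)N * N * sizeof *m);
        if (m == NULL) return 1;
        for (size_t k = 0; k < (size_t)N * N; k++) m[k] = 1.0;

        clock_t t0 = clock();
        double row_sum = 0.0;            /* row-major: sequential, cache-friendly */
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                row_sum += m[(size_t)i * N + j];

        clock_t t1 = clock();
        double col_sum = 0.0;            /* column-major: strided, cache-hostile */
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                col_sum += m[(size_t)i * N + j];
        clock_t t2 = clock();

        printf("row-major %.2fs, column-major %.2fs (sums %.0f / %.0f)\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC,
               (double)(t2 - t1) / CLOCKS_PER_SEC, row_sum, col_sum);
        free(m);
        return 0;
    }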

What role do you see for design and design thinking as critical skills for career success? Design is almost never taught in any kind of computer science or software engineering program, or even talked about much in the professional world. (Except of course for program "design" or computer systems "architecture.")

Your question is very insightful, and I alluded to some of this in my previous answer. Good design makes success possible. If you ask people what frustrates them about technology, it's usually two things: the software is too buggy, or the software is too hard to use. Usability design speaks to that second aspect, and it's one of the most important qualities driving a product's success. In fact, just to be a little provocative, I'm going to argue that most of the great software business successes have really been about design more than function. Let's look at some examples. The Apple computer was neither the most powerful personal computer nor the least expensive; it was successful because it was usable enough for high school students to start programming in a serious way. Facebook doesn't let us collaborate with anyone we couldn't previously reach by email, but it has made the experience more graphical and usable: we can all find each other and share information within a shared graphical space. Almost everything from hand-held computers to complex relational database systems has found success by providing previously available power with improved usability. One of my mentors during the mid-90s was James Hamilton, who is now Distinguished Engineer and Vice President for Cloud Computing at Amazon. James once told me, "It's really hard to argue with success." People who drive successful technology or a successful business will be credited with achievement regardless of any number of failings they may have had in their operational style. Elegance in design is one of the real 'secret sauces' for project success and, by extension, career success.

Is an undergraduate computer science / software engineering education the best preparation for a career in computing? When I was a full time IT manager, it seemed that at least half of my best people had undergraduate degrees in areas outside of computing.

It depends on the job of course. For intense programming jobs an undergraduate degree in computer science really is the best training. Many IT jobs, probably the majority, aren't about intense programming. For IT admin, IT business consulting, web design and many other areas you can do very well, and possibly better, without a computer science degree and still have loads of opportunity to do creative and intellectually challenging work.

Has software become too large, cumbersome and unusable? Has the organization that produces software become too stratified and method/process oriented to be effective? I seemed to sense some answers in the affirmative as I read the book, but maybe I was just projecting my own biases.

A fascinating question, and I think the real answer is that we are at an interesting phase in the history of computing, with a lot of organically evolving answers. Serious software has always been complex, and the oldest and most mature systems today have been growing for a few decades. These systems have now reached previously unfathomable levels of code and design complexity. What do you expect when you have millions of lines of code authored by hundreds of people who come and go from an organization over many years?

On the flip side, Agile development methods are emerging that make the process of developing software a lot less cumbersome and that reduce risk, though most organizations struggle to apply Agile with rigor. Cloud computing is easing a lot of the infrastructure pain for development teams. As well, programming models are getting simpler. People with little experience are able to do more because the skills barriers are dropping. I was invited to speak at a user group near San Francisco a few months ago, and shortly after I spoke they had another guest speaker who showed off his new iPhone app, with snazzy image rendering, a product selection and ordering mechanism, and a PayPal interface for payment. It was all "very nice, now where's the tea and biscuits?" until he confessed that he had developed the entire application in three weeks, and before that he hadn't programmed a day in his life! I just about fell off my chair. His wife had said, "I'm tired of my job, so make me an iPhone app so I can run a home business", and three weeks later there he was. People are building rich, beautiful online content, and it continues to get easier.

So while code is complex and getting larger, there is a push towards higher levels of abstraction that is making things easier, and more iterative and dynamic development methodologies are reducing risk. We work in interesting times!

Thank you for your answers.

Sam Lightstone is the author of the book ‘Making It Big in Software: Get the Job. Work the Org. Become Great’, published by Pearson/Prentice Hall Professional, March 2010, ISBN 0137059671, Copyright 2010 Sam Lightstone. For a complete Table of Contents please visit the publisher. For more information, please visit Sam’s site.


