The result of a recent discussion of some side projects and whether or not they counted as professional development.
1. The project had business stakeholders separate from the development team.
2. The project had testers separate from the development team.
3. The project had end users separate from the development team.
4. The project had a development plan.
5. The project was deployed to production.
6. The project has end users who continue to use the system.
7. The project had at least three significant challenges - challenges not solved by Googling or reading Stack Overflow. These are hard things you had to solve as an individual or team.
1, 2, and 3 need not always be completely distinct groups, but they do need to be distinct from the development team.
1 through 6 can be easy at times, and can happen very fast, if you know what you're doing.
It is usually number 7 that distinguishes professional development from amateur work, hobbyist side-projects, or other simple coding endeavors.
Notice that there is not a number 8 - 'You got paid for doing it.' - because being paid isn't an absolute requirement to make it real work. I've seen a lot of people get paid for things that don't count. And I've seen a lot of great work created because someone cared enough to do it, even though they weren't getting paid in dollars or any other currency.
If your work checks off number 7 on the list then you were getting paid in experience, and this can be invaluable to you.
Clients frequently request that we use existing applications as the basis for work when we are rebuilding or recreating a system with them.
This is very difficult. It can work, but it is much less effective for developers and results in a lot of iterations as the dev team goes back and forth figuring out what the application needs to do. It leads to a lot of bugs that are only discovered in UAT.
In the beginning it is much more efficient for your business stakeholders, which is why they ask for it. Why go through the existing application and document it? That's a lot of work for users and business people, not to mention the BAs you have to pay to do it.
In the end, the same people who didn't want to go through the process of documenting will be mad because it is taking so long. It is your responsibility as a developer, lead, or PM to set the right expectations in the beginning.
If your stakeholder (whether client, boss, CEO, etc.) requires you to work this way, you need to expose the risk and set appropriate expectations. The very best case scenario is that you have to set aside time for late-project iterations when you run into these challenges. The worst case scenario is that the project drags on so long that it is cancelled. It happens all the time. Also possible: you fixed-bid the project and put your company out of business trying to complete it.
Setting the right expectations can be difficult, and that is a topic for another post some time. My goal today is just to describe a few of the ways this has come up over the years, so you know what to look out for.
Just make the new app do what the old one does. Why do you need requirements? An all-time classic. Why not? I mean, there's already code, how hard can it be to just: read it, understand it, understand all the subcomponents and UI, assess whether or not it is still necessary, talk to users, and figure out how to test it. It definitely wouldn't be easier to have that done and approved before you start development. Danger level: Red Flag
Preserve the business logic, everything else you can get rid of. This assumes that preserving the business logic is easy, which it usually isn't. Even when someone has done a good job of separating business logic from UI and data access (very rare) often the technical and business requirements of the rebuild make reusing the code in its original form impossible. Danger level: Yellow Flag
We'll figure it out as we go along. This one can seem reassuring in that your stakeholder has seemingly granted you permission to iterate. But be careful here: set expectations, ask follow-up questions, establish what 'figuring it out' will really look like. Danger level: Yellow Flag
There's a lot you can re-use. Just tell me what it will cost to rewrite the things that you can't re-use when it comes up. You should assume that you are rewriting everything. If you get to re-use something that's a win. It will most likely save you a little time in testing, but not anywhere else. Danger level: Yellow Flag
You don't need to talk to users. Bob knows everything about this application and Bob is your main point of contact. Unless Bob is the only user (and even then) he almost certainly doesn't know everything. Even if Bob isn't an ego-maniac (and he might be) you are still facing delays in understanding the system because Bob has to go ask someone instead of you asking them. Danger level: Red Flag
This spreadsheet will tell you everything that you need to know about the system. AKA our old system was a spreadsheet, just look at that. Usually these are worse than looking at code because in addition to code (like VBA) you also get things like embedded charts, formulas in cells, and obscure data access thrown into the mix, requiring even more detailed reading to understand. Danger level: Red Flag
Jennifer developed most of the original system and she'll be working closely with you on this project. Usually Jennifer is retiring, and that is a hard deadline. Also, what does 'most' mean? Danger level: Red Flag
We have all the requirements from 20 years of work we did on the old system. You can read through that. To paraphrase Sartre, Hell is other people's requirements documents. There are a few reasons why this is true, but probably the most profound is that writing the requirements document immerses the team members in the system. Without having written it, asked the questions, and fully digested the material, you or your BA will always be at a deep disadvantage. I wish this weren't true, but it is. At least 50% of the reason for documenting stuff is to make sure the person responsible really understands it. Danger level: Red Flag
In my continued quest for biometric insights and stress management, I recently picked up a Spire Stone. It monitors respiration and it can give you real-time feedback on your breathing patterns. Very handy for identifying and managing stressful moments throughout the day.
Breathe less than 14 times in a minute? You are calm. Your breath is full and consistent. Feels pretty good, doesn't it?
Breathe more than 22 times in a minute? You are stressed. Your breathing has become quick and shallow.
If this happens, the Spire device gives you a small nudge in the form of vibration to let you know.
The great news? You can do something about it. Take deeper, more normal breaths and help your body relax and feel less stressed-out.
I like it because, like FitBit, it gives me a way to impact my health that I can make conscious choices about. Breathing? There's always time for that. And improved awareness leads to small changes that can have a big impact.
With the HRV, brain wave, and sleep devices I've tried, you just don't control those things as directly as breathing or steps.
Spire gives you control over your stress level. Changing your breathing helps you feel less anxious and it can help you BE less anxious by sending a calming signal to all the other systems in the body.
It is a really good device. I've noticed that I don't always agree 100% with its breath count. During meditation I think it overcounts slightly, depending on the position of the device.
I have also had a few false positives on the 'stress vibration', but not more than one a day.
These are very minor flaws in my opinion, and the value it provides more than outweighs these (small) negatives.
The additional awareness it provides of your current breathing pattern is very helpful.
They also have an app with some nice features. They have helpful guided breathing exercises and stats on how much time you were calm, anxious, active, and focused.
Spire has a new product coming out called the Health Tag which is designed to be attached to clothes, more or less permanently (it can be washed). In addition to respiration it also monitors HRV.
You've worked at a business for several years, and you're really starting to understand how it works. You're grokking it. You're gaining a deep understanding of the inner-workings, outer-workings, and every kind of work-between.
And you suddenly have it - a moment of insight - a blinding flash that helps you to solve a problem you've been stuck on.
They're great when they happen. But how often do they really occur for you? For the lucky, perhaps several times a year. For the rest of us, less often.
One of the great promises of AI is to be able to achieve these types of insights faster and more often by asking the right questions.
AI will allow us, as it becomes more ubiquitous, to ask better questions and find the answers faster.
Here are the stages of inspiration and insight as we move forward with Assisted Inspiration (we need a new acronym).
Human-Only Insight - the amazing and super-powerful capability that we humans possess to understand our world and make amazing leaps forward. Pattern recognition, dreams, emotional resilience, quantum gravity. It's all part of it.
Tool Assisted Insight - explicit use of math and spreadsheets and literature to figure stuff out, in addition to our own powerful minds.
AI Assisted Insight - use of machine learning and other sophisticated computer tools to evaluate information and create models for problem solving.
As we get better at formulating our questions, generating and processing data, and creating these models - as these skills become more pervasive in the workforce - then the pace of AI Assisted Insights is only going to INCREASE. Think the world is going fast now? Think it's changing? Think we're in a VUCA phase?
Buckle your seat-belt. As our kids join the workforce and spend less time writing code (it probably really will happen this time) and more time thinking about problems with AI to assist them, the pace of change will only accelerate.
Will this be challenging at times? Yes, definitely. It also represents a fundamentally new phase of work, life, and society.
What an amazing time to be alive.
I've been lucky enough to have some human-only and data-assisted insights in my life. Those moments are joyful when they occur. They are real breakthroughs when you see things in a whole new light.
Imagine having more of those. More breakthroughs. More insight. More inspiration. It will be challenging, and for those of us who are willing to face these challenges, a broad new plateau of opportunity and human potential awaits.
I bring this up because it's something that I've heard a few times over the years (including recently) and I think it's worth examining the hidden anxiety behind this statement.
On one level, agile is not anything like communism. Agile practices extend some of the decision making in business to people close to the problem, people who will do the work. In this way, it's much more like a capitalist/democratic system than a communist/authoritarian model. In democracies individuals have agency/a vote/influence in the system (just like agile) and in capitalism individuals/businesses control the means of production and what work gets done, not a central authority (just like agile).
Looked at in this way, the answer to the question above is clearly 'No'. Agile equates to democracy and looks nothing like communist/authoritarian systems. But this doesn't get at the anxiety, which I think is important.
Why the comparison? Why do people say "Agile = Communism"?
I think it's because in traditional businesses (a key feature of capitalism) the democratic principles of society don't extend inside the business. Inside the business the business owner and their appointed managers run the business and make key decisions. The businesses themselves demonstrate the authoritarian characteristics that the rest of society does not.
Looked at in this light, I think the anxiety could be expressed thus: "Agile is not like traditional business management and that makes me nervous. So, I will express my fear by equating it to something that also doesn't look like traditional business/capitalism, which is communism."
This fear is not unreasonable. Agile is different. It does distribute decision making differently. I think that it is hard to relinquish control and you should expect this type of reaction to change, as you should expect this reaction to ANY change at all. It's just one more manifestation of anxiety around change.
So, where does that leave you?
Understanding doesn't mean accepting. You understand the anxiety to facilitate the change, not to give in to the resistors.
We've seen that the pace of change in life and business is accelerating. Predict-and-control structures become outdated too quickly. Your prediction will now almost certainly be wrong because the assumptions that underlie it lose their currency quite quickly.
Why rely on the assumptions of one person? Why not have a high functioning team working together? In this way, agile can be part of the antidote to the anxiety.
You (and your leadership team) need smart people working on effective teams with the ability to execute. Whether you call it Agile Management Practices or Holacracy or something else, it makes sense when the world changes quickly.
Of course, individual business owners and leaders are free to make decisions to run their companies in whatever way they see fit, that is capitalism.
But, as a leader, don't you want to hire the best people and get the most out of them? Empowering them is one way to do that. It does require you to let go of some control. And it does require you to have enough governance to ensure people don't bet the farm or the business without oversight.
But after that, you WANT people to feel ownership and make decisions. It's going to make them more loyal, successful employees, and it is going to help your business be more effective in the long run.
I'm asking this question because I realize that if I teach my child about the technology of today - how it works, how to use it, how to build stuff with it - that a great deal of that information will be out of date when my child enters the workforce in 12 - 15 years.
This is going to happen for a bunch of reasons:
I certainly hope that writing basic, redundant code will be done automatically. This might happen because tooling improves dramatically or because the languages that are used improve dramatically and remove the need for this kind of stuff.
AI is going to replace a lot of jobs, even in technology.
The jobs that will exist in the future don't exist yet and (mostly) haven't even been dreamt up.
In such an environment, who can teach technology skills that matter? Certainly not me.
So, if we know that technology will continue to be important (it will) and we know that the details of that will change constantly and dramatically such that what you will need to do detail-wise at a job 15 years from now is hard to know, where do you go from there?
There are still skills that matter, you just have to emphasize them, even if you also do some detail-y technical work. Here's what I choose to emphasize:
Do things with your creative thoughts - make stuff.
Use technology and don't be afraid to get your hands dirty.
In the world of software, this last point is largely figurative. Your hands don't literally get dirty unless you've spilled a lot of coffee or toast crumbs into your keyboard over the years.
So, you have to teach them stuff. Teach them the big with the small.
The small stuff (details, technology) will expire. It always does. Sometimes faster, sometimes slower.
But while you're doing that, they can learn to learn and not be afraid of it.