Last week, I mentioned that the world of 1978 represented a set of hard constraints that drove the thinking and design of systems. These hard constraints shaped the expectation of what IT could do. We have traded those hard constraints for less tangible soft constraints. These soft constraints emerge and almost sneak up on an organization in the form of extreme complexity among interacting systems. Now the expectations of what technology can deliver have been significantly elevated, so these soft constraints must be overcome, even though the business does not view them as all that difficult to solve.

In 1978, programmers were often limited in the amount of time they had on a computer to write and debug code. Desk checking was an essential task of a programmer. Many of those who worked in the minicomputer space were very aware of the 16-bit address space limitations and had to get very creative about program size and efficiency. These constraints also influenced data design, so that packing a byte of data effectively was a key to performance. Assembler was the tool of choice for most high-performance applications, especially those involved in real-time messaging.

The result of this incredible emphasis on tightness and efficiency was palpable as we came closer to the year 2000. So many shortcuts were taken to pack date representations into tiny parcels that many applications were doomed to fail if not addressed. Some older operating systems could not handle the year 2000 at all.

What was so interesting about those times was the focus on the here and now. My mentor was confronted with this fact in 1979, when we were designing the new database for a market data system. He was told that if we made the design choice he was advocating, we would have a problem with the year 2000. His response: "If I have to worry about that in 21 years, then I have not risen very far in my chosen field." I never forgot that, and I joked about it with him when we worked on a conversion project in 1999. He was, in fact, very successful in his career nonetheless.

With advanced languages, IDEs, and a cornucopia of tools to choose from, along with a 64-bit address space, most of those "stone age" limits seem like a nightmare of the past. Of course, expectations have grown enormously as well: just think about the fact that people expect to access their bank account information over a wireless connection and expect the transaction to meet all of the stringent properties of a wired connection.

But the hard constraints have been replaced by a set of soft, interconnected constraints that resist change much as a spider web gently resists a fly's attempt to extricate itself. The kind of soft constraint that pervades an organization now is the lack of understanding of how interconnected a vital system is, and therefore of how it can be moved, recovered, or properly upgraded. How should an organization reconcile multiple overlapping applications that basically serve the same functions? How best to determine whether a set of applications is ready for some sort of cloud deployment?

Why are these just soft constraints? Because the expectation is that the IT organization must know how to fix its own world. Yet any of these questions raises the red risk flag, because getting objective answers is almost impossible without some new perspectives. And so the IT function is caught while trying to fly through the web, as the spider of risk and financial reality slowly does its work.

Let me know what you think.
