Vector’s international installed base ranges from two-PC/server deployments of remote control technology to multi-campus IT asset and service management installations covering more than 10,000 PCs. In plotting a development course, we look at the challenges our customers face and the investment options open to them. All of them, it appears, face exponential growth in problems to solve, demands to meet, and technology options to deploy.
Virtualization – complexity born of innovation or work-around?
Server virtualization is a striking example of a sophisticated concept driven by a mundane realization: for many organizations, server architecture and deployment had gone down a path of increasingly poor utilization. Instead of addressing the basic need for a suitable server operating system architecture, the industry created a new approach that perpetuates the use of inappropriate architecture and relies on ever-increasing processor power to cope with the additional computing complexity the approach itself introduces. Organizations must now find ways to manage their deployment of this new paradigm, while their software suppliers are still scratching their heads over how to license products for the virtual environment, and IT management realizes it is close to losing track of where that software is actually running.
ITAM and ITSM – evolving to keep pace with infrastructure and application delivery
Simply managing the IT infrastructure is becoming increasingly complex, even before the technology achieves anything useful for the organization. Even the fundamental disciplines of IT Asset and Service Management have to evolve to deal with the layering of virtual machines on physical ones and with new ways of delivering applications.
Organizations have to choose how to deploy their stretched budgets, balancing the increasing complexity of the infrastructure against opportunities to invest in directions more clearly identifiable with pushing the organization forward: business process innovation, support for mobile platforms, predictive analytics. In parallel, the security challenges become more complex and the dependency on internet and other connectivity grows heavier.
Software complexity – a one-way street
Behind all these options, the underlying software becomes increasingly complex. Applications become larger as network bandwidths make larger and larger downloads more acceptable. Dr. Moore’s prediction was never intended to look more than a few years ahead, and after around 30 years the fundamental concept of constant re-doubling of the number of transistors on a chip may eventually be running into the buffers. But the industry will find other ways to continue to provide more compute cycles, and the software will either follow by inventing new ways to use that power, or its own increasing complexity will continue to drive up hardware performance.
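The scale of that "constant re-doubling" is easy to underestimate. As a back-of-the-envelope illustration (assuming the commonly cited doubling period of roughly two years, which is our assumption here, not a figure from the text above):

```python
# Moore's observation: transistor counts roughly double every two years.
# Over 30 years of sustained doubling, the compound growth factor is 2**15.
years = 30
doubling_period_years = 2  # assumed doubling period
doublings = years // doubling_period_years
growth_factor = 2 ** doublings
print(growth_factor)  # 32768
```

A 32,768-fold increase from compounding alone makes it clear why no physical process can sustain the trend indefinitely.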
Is there a method by which software complexity can be assessed and tracked as it evolves? Is there a Moore’s law that predicts how software will continue to evolve in its complexity and richness?