How to Effectively Manage Today’s IT Challenges

The Agile Manifesto states:

We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools

Working software over comprehensive documentation

Customer collaboration over contract negotiation

Responding to change over following a plan

Source: Beck, Kent; et al. (2001) – “Manifesto for Agile Software Development”; Agile Alliance

In addition to managing software, hardware, and resources, IT must identify and focus on a myriad of key factors and anticipate obstacles before they occur. Costs, processes, technology changes, and the ability to visualize future opportunities are all critical elements that must be predicted and planned for. The ‘top five’ areas IT managers should focus on are:

  • Communication
  • Planning
  • Testing
  • Vision
  • Maintenance & Support

While there are myriad challenges facing IT teams, I want to highlight five of the most crucial, which illustrate the need for and use of the top five areas mentioned above.

Top Five IT Challenges


A good example of a network security breach is the US government’s healthcare website, which had its sensitive information compromised because of failures in the website’s security. Essentially, the systems protecting the customer’s (applicant’s) security and privacy were never tested because of deadline restrictions. In addition to threatening customer security, the breach also put at risk other agencies that interface with the compromised hub, including the IRS, the Social Security Administration, and the Department of Veterans Affairs. That is why strategic project planning and communication must be at the forefront of every project. Strict guidelines must be respected and adhered to during each stage of the development process; each project component must be tested and retested; and, of course, the new system must be maintained. These preventive measures eliminate potential problems such as downtime or security breaches, which are unacceptable on any level.

Moving beyond dated or overly complex systems toward technological innovation also requires IT to address similar concerns and deliver simple, effective solutions for organizing and sharing an ever-increasing amount of information. Not only must new technologies be introduced, they must interact seamlessly with existing, older ones. More and more companies are recognizing that their customers want to connect with them through applications and social media. It is therefore essential to build technologies that facilitate that communication and interaction while protecting system security and integrity. These systems must also be available to company staff while remaining invisible to outside entities. Any breach will damage not only the company’s financial wellness but also client confidence, so appropriate security measures such as encryption, authentication, and other standard safeguards must be fully tested and integrated from the start. With agility in mind, processes and tools are important, yet clear, effective communication among all project participants is vital to the success of every project deployment, as this example illustrates.


Creating virtual processing environments that can be used by a varied assortment of users is not only a must; it requires deep knowledge not only of each department’s application environment but of how processes connect and communicate with each other. Each user’s snapshot of the virtual environment must be current and accurately portray how finished applications will look and behave in the ‘real world’. The ‘virtualized’ IT environment also demands larger and sounder storage measures. Many organizations’ IT systems are aging rapidly and must be upgraded to meet new demands. This can present new challenges, as the infrastructure may be somewhat piecemeal, requiring careful planning to ensure that new technologies connect seamlessly to older systems and that all network components communicate with each other effortlessly. Agile methodologies stress responding to change over following a plan, which of course doesn’t mean that Agile teams ignore planning. Rather, flexible plans that allow for rapid change are built into the software and processes, eliminating costly do-overs and major upgrades.

Cloud Computing Services & Social Media

Cloud computing, while advantageous and useful, presents new threats not only to application security but to the entire infrastructure as a whole, and these must be planned for and prevented. This requires the rapid development and deployment of new technologies built into networks to authenticate every user wanting access to the company’s networks.

Information collected through social media merges with sales and marketing data to form a valuable bank of information. This in turn necessitates the construction and implementation of assorted data repositories, statistical datasets, and new tools and processes to distribute and analyze the collected information. Agility recognizes that processes must evolve with industry changes. In the new workplace, companies eschew printed or digitized documentation in favor of information stored in and retrieved from the cloud. Agile stresses that working software is more important than comprehensive documentation, since it is more advantageous for customers to obtain information readily from the cloud and interact with the network with a single click.


Organizing offshore computers on a computational grid is essential for providing unrestricted access to every computer on the grid, regardless of geographical location or system configuration. A global environment requires that the entire infrastructure, including its processes, be standardized to facilitate growth and expansion across all departments worldwide and to enable updates and maintenance of software on open-source systems.

Change Management

More than ever, IT professionals must recognize, plan for, and manage network and organizational changes. Agile developers especially realize the importance of responding to change, since even the best-laid plans can’t prevent a few glitches. Globalization, constantly evolving processes, and emerging technologies all require strategic approaches to managing business and technology change. What changes are required, how will they affect the organization as a whole, and what training and mentoring will be needed to smooth the transition? This is especially crucial when adopting newer methodologies like Agile.

Cost (of IT Services)

Sometimes it can be a challenge to convince CFOs and other financial principals why a client company should retire recently acquired, conventional systems and processes in favor of emerging technologies. In this instance, IT should explain why upgrading early will benefit the company and save money in the long run. Other costs to consider include additions to existing infrastructure (hardware/software acquisition), technical support, training, and so on. Here, vision, planning, and communication all play an important part in identifying costs and even eliminating non-essential client expenditures. Agile methodologies recognize the value of customer collaboration to project success rates, and cultivate it by placing one or more customer representatives on the project team. Customer representatives then work closely with developers through each stage of the process, providing critical feedback and ultimately reducing costs.


Effective strategies for managing IT operations efficiently are a critical part of improving business operations. Recognizing the important elements and fostering clear communication of all strategies and processes creates a collaborative vision that translates to success. What are some of your experiences in IT/Agile management? I would enjoy reading your comments, and if you enjoyed this article, feel free to share it. In my next blog, I will elaborate on this topic with real-life experiences, and would like to hear your own as well. You can also read more about IT management and Agile in my books.

Should you consider project management for your organization?

Software development can happen with or without a formal project management practice within your organization. Hard to believe? Maybe, but it’s true. Read on.

If you are running your organization lean, then you definitely do not want the additional management costs and organizational disruptions caused by project management. Your software development goals can be easily accomplished using your existing systems department (or a similar organizational unit), which I’m assuming is well set up in a functional organizational structure, and which can be spun around to use the project approach.

You should consider these simple criteria to evaluate the desirability of having formal project management within your organization.

a. Find out if the job your department is about to take on is large or technically complex. If it is, then you should consider a formal project management approach within your organization. Why? Because such complexities require and deserve the close management attention that only formal project management provides.

b. Find out if the job involves integrating many components into a functioning, operational whole. Having a single individual represent such a job helps reduce redundant, free-floating information and cuts through communication clutter; project management thus enables better handling of the interactions between the various components.

c. Another thing to consider is what your management wants. If your management is leaning towards having a single individual available at all times as a focal point for information and responsibility, then you definitely need project management. Management might also require such an individual to scrutinize the budget and manage the job’s schedule.

d. Find out whether a group of people with different skill sets needs to be brought together to achieve a common goal. If your job requires such a diverse set of professionals, then you need a project management environment within your organization, so that the group’s interactions receive the goal-oriented leadership needed to reach the end objective.

e. Finally, look closely at your organization to see whether the job’s requirements change or need revision every now and then. A project management approach will help your organization solve this kind of problem, bring an agile perspective to changing needs, and improve customer satisfaction within the organization.
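The five criteria above can be captured as a simple screening checklist. The sketch below is purely illustrative: the criterion names, the `needs_formal_pm` helper, and the "two or more criteria" threshold are my own assumptions, not a formal scoring model.

```python
# Illustrative screening checklist for criteria (a)-(e) above.
# The names and the "two or more" threshold are assumptions for this sketch.

CRITERIA = (
    "large_or_complex",       # (a) job is large or technically complex
    "many_components",        # (b) many components must be integrated
    "single_focal_point",     # (c) management wants one accountable individual
    "diverse_skill_sets",     # (d) diverse professionals must work together
    "changing_requirements",  # (e) requirements change or need frequent revision
)

def needs_formal_pm(answers, threshold=2):
    """Return (decision, reasons): True when enough criteria apply."""
    hits = [name for name in CRITERIA if answers.get(name, False)]
    return len(hits) >= threshold, hits

decision, reasons = needs_formal_pm({
    "large_or_complex": True,
    "changing_requirements": True,
})
print(decision, reasons)
```

In practice the threshold is a judgment call; the point is simply to make the evaluation explicit rather than ad hoc.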

Having presented all these points, it is fascinating to find that the majority of software projects fail because of poor adoption and application of project management practices. There is no hard and fast rule for executing software project management. Adaptability to the organization’s development environment, and the applicability of the practice to the needs of the development project, are critical to any software development project; if either is missing, the project cannot be a success.

Six Sigma and Software Engineering and Reliability

I recently finished reading the book “What is Six Sigma?” by Peter Pande and Larry Holpp. In terms of software engineering, Six Sigma is much more than a specific analysis of software reliability. It is a quality improvement framework and mindset focused on measuring process variation as the culprit behind poor quality. I want to point out that the term “six sigma,” when used in conjunction with software reliability, has little or nothing to do with statistics, distributions, or their moments. It is a buzzword and will remain a buzzword until it is defined in statistically correct ways.

The Real Sense of Six Sigma

Six Sigma, as the name implies, stands for six standard deviations from the mean. Sigma is a statistical measure of variability around the average. The concept comes from reliability engineering’s prediction of system or component failure probabilities. For example, the wear-out time of a component may be normally distributed, with some mean and standard deviation. We want a component to have a very small probability of failure before its design life. If we set the design life one sigma below the mean wear-out time, we get roughly 84% reliability; two sigmas gives roughly 98%, three sigmas roughly 99.9%, and so on. Six Sigma, by Motorola’s convention (which allows for a 1.5-sigma shift in the process mean), corresponds to ~99.9997% reliability, near perfect; or, put another way, 3.4 defects per million.
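These figures follow directly from the normal cumulative distribution function, and can be reproduced with nothing beyond the Python standard library. Note that the 1.5-sigma long-term shift used for the 3.4-per-million figure is Motorola’s conventional assumption:

```python
import math

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Reliability when the design life is set k sigmas below the mean wear-out time.
for k in (1, 2, 3):
    print(f"{k} sigma: {normal_cdf(k):.4%} reliability")

# Six Sigma's famous 3.4 defects per million assumes a 1.5-sigma
# long-term shift in the process mean, i.e. an effective 4.5 sigma.
dpmo = (1.0 - normal_cdf(6.0 - 1.5)) * 1_000_000
print(f"Six Sigma (with 1.5-sigma shift): {dpmo:.1f} defects per million")
```

Without the 1.5-sigma shift, a literal six-sigma design life would imply roughly one failure per billion, which is why the shift convention matters when quoting the 3.4-per-million number.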

Six Sigma and Software Reliability

In software engineering, however, things are not quite so clear-cut as with mechanical or electronic components. For software reliability we also lack good predictive models, failure models, and so on. One suggested approach is to predict the faults remaining as a function of the faults found in earlier phases. In general terms, for software reliability, Six Sigma would mean that the software process finds ~99.9997% of all faults before the software is put into service.
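As a toy illustration of that suggested approach, assume the process finds some fixed fraction of all faults before release and back out an estimate of what remains. Both numbers below are invented for this sketch, not real measurements:

```python
# Toy "faults remaining" estimate. Both numbers are invented assumptions.
faults_found_so_far = 191          # faults found across earlier phases
assumed_removal_efficiency = 0.85  # assumed: process finds 85% of all faults

# If 191 faults represent 85% of the total, estimate the total and the remainder.
estimated_total = faults_found_so_far / assumed_removal_efficiency
estimated_remaining = estimated_total - faults_found_so_far
print(round(estimated_remaining, 1))  # ~33.7 faults estimated to remain
```

Real defect-prediction models are considerably more sophisticated, but the arithmetic shows why a process at six-sigma fault-removal efficiency would leave almost nothing behind.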

What do we need to do?

We need to adjust the design life accordingly. As the joke goes, the design life of shrink-wrapped software is ten seconds before we open the package, and for custom software, ten seconds after the check clears.

In the language of Motorola’s official release:

“Motorola wants to be free of errors and defects 99.9997% of the time in all that it does. That means no more than 3.4 defects per million units.”

- ‘Electronic Business’, October 16, 1989

Statistical Tools – Improved Software Quality

Some points to remember about using statistical tools to improve software quality:

  • Today, the complexity and size of software has grown substantially, along with the size and complexity of silicon processors, perhaps exceeding Moore’s Law (a doubling of processing power every 18 months).
  • The business risk of developing very large software systems has spurred the development of a very large shrink-wrapped software industry, primarily because of the failure of many very large, complex systems.
  • Software factories, of which the primary case is Microsoft, flourish by delivering very large, internally complex products at prices consumers can afford to bear, exclusively by delivering extremely large volumes of like products. The only technique that has proven effective for quality assurance is using thousands of volunteer quality inspectors (beta testers) to report errors prior to the final release of the product. Because the cost of manufacturing beta copies is so low, it is far outweighed by the economic benefit the company receives from this type of testing.
  • Hence, can we ever assume the software development industry will achieve one standardized, uniform measure of software quality, given that, to be relevant, the definition of a software standard must be agreed between the consumer of the software and the producer of the software? I would conjecture: probably not.
  • The reason lies in the nature of software. An algorithm may be provably correct but implemented inefficiently (a possible defect). The product might be physically damaged in the duplication of a disk (a manufacturing problem), which might manifest itself in the consumer being unable to install and use it. The underlying cause may remain the inefficient implementation of the algorithm, but it manifests itself in so many potential ways that it will in all likelihood be impossible for the consumer to identify the defect; and unless a defect can be quantitatively measured, it will be impossible to detect.
  • At the very core of the problem, the inefficient algorithm might be the work of one designer or developer who is unaware that more efficient mechanisms exist, or it may be the result of a specification error, or perhaps the algorithm subroutine was purchased from an outside supplier who provided poor instructions regarding its limitations.
  • Statistical tools can be used to analyze overall system quality, such as a transaction failure rate. These tools are severely limited in their applicability to an individual software developer, because the development task is typically to design and write single software modules, as opposed to a large-scale software system.
  • We keep learning more and developing new insights, so things will change, most probably through the use of better software partitioning and packaging.
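As a concrete example of the kind of system-level statistical tool the list mentions, the sketch below computes p-chart control limits for a daily transaction failure rate. The daily counts are invented for illustration:

```python
import math

# Hypothetical (transactions, failures) per day; numbers invented for illustration.
days = [
    (1200, 18), (1150, 15), (1300, 22), (1250, 14),
    (1180, 16), (1220, 40), (1270, 17), (1210, 19),
]

total_n = sum(n for n, _ in days)
total_f = sum(f for _, f in days)
p_bar = total_f / total_n  # overall failure proportion across all days

for n, f in days:
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    ucl = p_bar + 3.0 * sigma            # upper control limit
    lcl = max(0.0, p_bar - 3.0 * sigma)  # lower control limit, floored at zero
    p = f / n
    flag = "OUT OF CONTROL" if not (lcl <= p <= ucl) else "ok"
    print(f"n={n:5d} p={p:.4f} limits=({lcl:.4f}, {ucl:.4f}) {flag}")
```

A day whose failure proportion falls outside the three-sigma limits (the sixth day in this made-up data) signals a special cause worth investigating, which is exactly the system-level view these tools provide; they say nothing about which module caused it.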



In the end, the people at large, the **users**, do not understand why a concept that is worthy and meaningful in the hardware and manufacturing domain ***does not*** apply to software. Consequently, the **users** may be misled and ill-served, because they are led to believe “six sigma” software is somehow comparable to “six sigma” hardware. Is it? Does it?

[I am convinced that others who have read the authoritative literature on six sigma and have attended the appropriate training could talk more intelligently about this technology.]

© Manoj Khanna 2003 – 2013.



In today’s fiercely competitive market we are crowded by thousands of small and medium-sized companies. Things are pretty tough, with investment-hungry crowds scattered around the various networks of business circles. The software industry is tighter than ever. The so-called high-technology professionals are lying low, accepting the challenge. The market today is not driven by the mere *hooplahs and uhs and ahs* of technology. It’s driven by *the facts*. The reality is more bitter than it was a couple of years ago. Things have changed. People have become saner in accepting and understanding technology. They have become more realistic about what stands in the way of actually implementing a given technology. And, most of all, they are concerned about high ROI and low TCO.

The famous game theory is showing its benefits here. Today’s market is not ideal, and neither are the wants and likes of the industry at large. The best way to envisage success in this not-so-eventful market is to play safe and in teams. A team is far better than producing a result on your own. Nobody is perfect, and similarly no single company is perfect at providing a sure-shot, end-to-end solution for a corporation hungry for a complete enterprise solution.

Where does that leave us? Not so very far: rather, close to our neighbors and allies. Looking beyond the impossible is not the call of the hour; looking at the obvious is more productive and profit-reaping. Web services, content management, workflow, billing and accounting management, combined with complete customer care: what does that tell you? *A very complicated system in the way?* Not really. It is actually the simplest model if we look at the different scenarios, and also when we combine the different providers into it.

To get the maximum power for a high ROI and low TCO, the services have to be combined at an effective price and with an effective strategy. And maximum functionality is *the demand of the hour*.
