What is The Value of Software: is it Artificial Intelligence (AI), Decision Support or Operations?

Why pay for software? I have debated why we should write a program rather than pay someone 5 dollars per hour to do digital tasks for us. The two main reasons are repeatability and scalability. Repeatability means using the same software to solve the problems of many new clients. Scalability is the ability to do more problem solving without an increase in human capital. Let us explore in detail the dimensions that make up these two categories.

The actual test is this: if the software can do the job faster than a human, it makes sense to get it done via software. Any process or system's goal is to complete the process's tasks consistently, quickly, and accurately. – Stephen Choo Quan

There is a point in any organizational process where it takes too much time and effort for a mortal to do it consistently, quickly, and with 100% accuracy. When that human threshold is met or exceeded, performance drops, and software becomes orders of magnitude better at the task. Let us look at the factors that push us toward that boundary and beyond.

VALUE OF OUTPUT. The value of the output can typically be measured in hard and soft returns. Hard returns are income in cash, while soft returns are qualitative, like the kinds of decisions made by those consuming the output. The number of users, the influence of the decision-maker, the number of departments, and the number of enterprises that subscribe all make the result exponentially more valuable.

SPEED PER TASK. If the task is to pick a random number, a human can match software at it. If, after picking the random number, we have to divide it by seven and then multiply it by 54, the scale has tipped and software is the better choice.
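To make the tipping point concrete, here is a minimal Python sketch of the microtask described above (pick a random number, divide by seven, multiply by 54), repeated in bulk. The batch size and timing are illustrative, not from the original post.

```python
import random
import time

def microtask() -> float:
    """Pick a random number in [0, 1), divide by 7, then multiply by 54."""
    n = random.random()
    return n / 7 * 54

# A human does one of these in seconds; software does hundreds of
# thousands in well under a minute.
start = time.perf_counter()
results = [microtask() for _ in range(100_000)]
elapsed = time.perf_counter() - start
print(f"100,000 microtasks in {elapsed:.3f} s")
```

Even a single extra arithmetic step per item is enough to make the batch infeasible by hand, which is the scale-tipping the paragraph describes.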

COMPLEXITY. The time a task takes is directly proportional to its complexity. Suppose the input data needs a fair number of transformations to arrive at the desired output. In that case, software is better, especially if the work is time-consuming with many sequenced microtasks. Once we can complete one transaction faster, we can scale performance with software, which creates an opportunity to take on more of this kind of work for many clients.

CONFORMING. 75% of the time, we are searching for the data needed to begin the task. Software can access and search other siloed systems and pull the data together. Each application has its own security layer that protects its data, forming a logical boundary between data sources. Using software, you can gain an automated level of access that adds value to your processing. Multi-silo access is a form of transformation that accelerates data processing in favor of software over manual processing.
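As a sketch of what pulling siloed data together can look like, here is a hypothetical example in Python: two systems (a CRM and a billing system, names invented for illustration) each hold part of a customer record, and software joins them by customer ID.

```python
# Hypothetical records from two siloed systems, keyed by customer ID.
crm = {101: {"name": "Acme"}, 102: {"name": "Globex"}}
billing = {101: {"balance": 250.0}, 102: {"balance": 0.0}}

# Merge the two silos into one view per customer.
combined = {
    cid: {**crm.get(cid, {}), **billing.get(cid, {})}
    for cid in crm.keys() | billing.keys()
}
print(combined[101])  # {'name': 'Acme', 'balance': 250.0}
```

Doing this join by hand across two applications means logging in twice and copying fields one at a time; software crosses the boundary once and keeps the result consistent.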

VOLUME OF DATA / EXPECTED THROUGHPUT. The time required is directly proportional to the volume to be processed. If we are processing 30 transactions, it might be faster to do that by hand in an Excel spreadsheet than to pay someone to write software for the task; but what if I had to process 1,000 transactions? If we suddenly had to respond to a hurricane, the volume could jump from 0 to 50,000 in days, with a required throughput window of 24 hours. That puts a different level of focus on the task. What happens to a mortal as he repeats the process after the first 100 times? Looking over 1,000 transactions daily would feel like a jail sentence in any event. To add a new layer of complexity, what if we have to do it every 30 minutes to keep the data real-time and useful for its consumers? What if lives depended on the data processing?

FREQUENCY OF PROCESSING. If I need one hour to process 1,000 transactions but must start the process again every 30 minutes, I will be defeated and lapped by the task. If the time a batch takes exceeds the time allocated per batch, I need a program to do it for me. Software processing is much faster than human labor, and human labor is far more costly than software per minute of execution time. One year of payroll costs much more than creating and running a program for a year. This is especially true when processing power gives you a competitive edge in delivering the best data to your consumers.
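The "lapped" condition above reduces to a single comparison: one batch's processing time against the window before the next batch must start. A minimal Python sketch, using the post's own numbers (one hour per batch, a new batch every 30 minutes):

```python
def is_lapped(batch_seconds: float, window_seconds: float) -> bool:
    """True when one batch takes longer than the window before the
    next batch must begin, i.e. the worker falls permanently behind."""
    return batch_seconds > window_seconds

# One hour to process 1,000 transactions, new batch every 30 minutes:
print(is_lapped(batch_seconds=3600, window_seconds=1800))
```

Once this comparison returns True for a human worker, no amount of effort catches up; only reducing batch time, which software does, breaks the cycle.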

Suppose we factor in the complexity of the operations needed, the volume of data, and the frequency of processing, such as aggregations, e.g., creating sum totals or breaking up segments by department or customer. The breaking point in favor of software arrives much sooner.
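An aggregation like the one mentioned, sum totals broken up by department, is a few lines of software but a tedious manual chore at volume. A minimal Python sketch with invented sample data:

```python
from collections import defaultdict

# Hypothetical transactions: (department, amount)
transactions = [
    ("sales", 120.0),
    ("ops", 75.5),
    ("sales", 30.0),
    ("ops", 19.5),
]

# Sum totals per department segment.
totals = defaultdict(float)
for dept, amount in transactions:
    totals[dept] += amount

print(dict(totals))  # {'sales': 150.0, 'ops': 95.0}
```

The same loop handles 4 transactions or 50,000 unchanged, which is exactly why layering aggregation on top of volume and frequency moves the breaking point toward software.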

We live in a time where, at the press of a button, we can contact millions of people and make hundreds of thousands of transactions. The power of digital data and the software that drives that strategy make it a beautiful time to be alive. It is the companies that use this in decision making that will have the competitive edge.

The question that lingers is: did technology live up to the hype? We use data for operational decisions, and we live in the hope of AI's utopia, where programmed machines or machine learning will make software autonomous. Today's computed findings still need to be calibrated by a human. They might be missing the caring of the human heart, or features that are not in the data but can still be known to humans. Complex insight still leaves us in the heterogeneous gray zone of decision support shared by software and humans.

Other Primal Forces to consider when building or buying software:

  1. Functional
  2. Performance
  3. Complexity
  4. Change Impact and Change Management, future changes against current code
  5. IT personnel and skill level(s)
  6. Security
  7. Software Expected Life Span

Thanks for reading and Sharing ❤

