sharadsinha

Posts Tagged ‘Economics’

Economic Cost of Badly Designed Software Systems

In Design Methodologies, Education, Embedded Systems, Engineering Principles on July 18, 2016 at 10:47 PM

The goal of every design activity, whether in computing or in some other field, is to come up with a system that serves some economic purpose. So there are software and hardware systems that fly airplanes, run our cars, manage power grids and so on. In the past, people were distantly connected with these systems. They were mostly passive users, and the systems were used for very specific purposes. However, there has been growing emphasis on using these systems, especially software systems, in governance and in the delivery of public services to the citizenry. A lot of these public services are routine in nature and not particularly associated with life-threatening risks (unlike power grids, cars etc.). Perhaps this is one reason why so many software systems for the delivery of public services are so poorly designed. Not only can the design itself be poor, but the testing and validation of these systems is also taken very lightly. I also feel that the testing and validation of these systems has to be in sync with the general lifestyle and attitudes of the citizenry they serve. However, this is perhaps asking for the famous Swiss chocolates when not even a basic candy is available. 😛

Software systems used in industrial settings undergo rigorous testing and validation, and still they can fail, crash, malfunction and give erroneous results. Studies on the economic cost of such badly designed systems have reported losses of billions of dollars (see here and here). However, when badly designed software is used to provide citizen services, I am not aware of any report that analyzes the associated economic loss. You may be wondering what triggered this post or this conclusion. Well, in India, the government has mandated booking of cooking gas via dedicated hotline numbers, which connect to a software system that manages the booking request, the generation of the customer invoice etc. During a recent such exercise, my father received an SMS saying that his booking had been cancelled (with an even funnier reason stated in the SMS: “Reason: Cancelled Booking”). He had not applied for cancellation. So he had to drive to the vendor to inquire about this, because many of these vendors are not responsive enough to answer such questions over the phone. The vendor replied that it was a software glitch, that the booking would be processed shortly and that the SMS could be ignored. Not only did all this put stress on a citizen, it also sent precious petrol down the drain. Now multiply this one incident by one lakh (a hundred thousand; a very conservative estimate) such cases a month and you get the picture. By the way, there are around 15 crore (i.e. 150 million) consumers of liquefied petroleum gas (LPG, the primary cooking gas in India) (see here).

Apart from the economic cost (whether big or small), such incidents create friction and distrust in the system. This is a bigger danger, as it cannot be expressed in monetary terms. Citizens begin to suspect service providers and begin to complain. All of this can be avoided if these social software systems are properly designed and the service providers are educated about their proper usage. Unfortunately, this last part seems to be the least of concerns for many people involved in such exercises.

When Economic Forces Influence Universities

In Education, Research and Development, Science & Technology Promotion and Public Policy on January 31, 2015 at 9:28 PM

That universities are increasingly subjected to economic forces is no longer surprising news. Many articles have been written about the utility of research done at universities, about turning it into products, about restricting funding for research in areas of less economic importance etc. I won’t discuss these in this post, as the subject is vast. However, I will highlight one important development that I learned about only recently. I was talking to a professor, and we discussed faculty appointments, research areas at his university etc. It came as a surprise to me that most students in his department were opting for courses that led to jobs at companies in a few prominent industries in the region. As a result, the university and the department were increasingly interested in hiring faculty with experience in those subjects. This was not always the case: five to ten years ago, the student population was not skewed this way, and the department had faculty in almost all areas of study and research. Now that the student population has become so skewed, a number of faculty members have greatly reduced teaching loads. In effect, they are becoming “surplus faculty”. Needless to say, their areas of research and scholarship are only remotely related, or unrelated, to the areas in which students are getting placed. Consequently, there is little hiring of faculty in those areas, and this may also affect the number of faculty members who get tenure. Is this good for education and research? What should a university do in such a case? I would say that such an effect of economic forces is not good for education and research. However, in a world that increasingly wants to relate every human activity to some economic force, it can be difficult to make a convincing case for hiring and retaining scholars in those disciplines.
As far as what a university should do is concerned, that is not an easy question to answer. It requires an administration with the vision, foresight and strength to deal with such a scenario. Whatever the case, it seems that the concept of a university is evolving, and there is a need to choose a path that is least damaging to most, if not all, stakeholders.

The Curious Case of Algorithms

In Education, Interdisciplinary Science, Mathematics on October 31, 2013 at 2:40 PM

I finished reading “The Golden Ticket: P, NP and The Search for The Impossible” some time back. It is a very nice book that introduces the reader to complexity theory. Essentially, it describes, without too much mathematics, what kinds of problems computers can solve quickly, what kinds will take forever to solve, and what the impact would be if the latter were ever solved quickly. P refers to problems whose best solution can be found quickly using computers. NP refers to problems for which a proposed solution can be checked quickly, even though finding one may take forever. I have deliberately simplified things for your understanding. This field is vastly complex!

The word “quickly” is used here with reference to a time span acceptable to the seeker of the solution. It could be a few seconds or a few weeks. Going by human nature, any solution (best or otherwise) delivered after months or years will probably be unacceptable. The search for a solution is accomplished using algorithms, and it is these algorithms that either give us a solution “quickly” or take ages to finish their task. It is believed that if we could find algorithms that quickly solve problems in the class NP, we could solve many challenges facing us, in fields as varied as biology, cancer research, mathematics, computer science and economics. However, some of the modern systems we feel very secure and safe about would lose that strength if NP problems became quickly solvable, because these systems rely on the fact that NP problems are extremely hard to solve quickly. For instance, your secure online bank transaction would not be secure anymore: the public-key cryptography on which it relies would be broken.
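The asymmetry between checking and finding can be illustrated with subset sum, a classic NP problem. The sketch below is my own illustration, not from the book: verifying a proposed subset takes one pass over it, while the straightforward search tries every subset.

```python
import itertools

def verify_subset_sum(numbers, target, candidate):
    # Checking a proposed answer is fast: one pass over the candidate.
    return all(x in numbers for x in candidate) and sum(candidate) == target

def find_subset_sum(numbers, target):
    # Finding an answer may mean trying every subset: 2^n possibilities.
    for r in range(len(numbers) + 1):
        for combo in itertools.combinations(numbers, r):
            if sum(combo) == target:
                return list(combo)
    return None

nums = [3, 34, 4, 12, 5, 2]
solution = find_subset_sum(nums, 9)                    # exhaustive search
print(solution, verify_subset_sum(nums, 9, solution))  # [4, 5] True
```

For six numbers the search is instant, but the number of subsets doubles with every element added, while the verifier stays linear regardless.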

Another technologically interesting aspect of algorithms is their ability to provide information based on someone’s taste in color, clothes, books, music etc. In fact, it is this type of algorithm that eBay, Amazon etc. use to recommend items for purchase. These platforms track user actions (which items a user clicks on, which items they buy etc.) to create an “algorithmic profile” of each user. While all this sounds interesting and potentially time-saving for someone who knows what to buy, it also has a negative side effect. As a regular user of such platforms, you end up getting information tailored to your existing taste, and you cannot easily get information outside it. Effectively, your ability to explore (if you are someone who likes to explore) becomes limited. Of course there are ways to overcome this, the simplest being not to sign in when performing a search! You can argue that many prefer automatic sign-ins to save time and the need to remember passwords. True, but then you have to decide whether you want to work/live like a frog in a well or like a whale exploring an ocean! 🙂
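A toy sketch of such an “algorithmic profile”, with made-up users and items (real recommenders are vastly more sophisticated): items are suggested from the histories of users whose purchases overlap with yours, which is exactly why you rarely see items outside your taste.

```python
from collections import Counter

# Hypothetical purchase histories; users and items are made up for illustration.
purchases = {
    "alice": {"camera", "tripod", "lens"},
    "bob":   {"camera", "lens", "bag"},
    "carol": {"novel", "bookmark"},
}

def recommend(user, histories, top_n=2):
    # Score items bought by users whose histories overlap with this user's,
    # weighting each item by the size of the overlap.
    own = histories[user]
    scores = Counter()
    for other, items in histories.items():
        if other == user or not (own & items):
            continue
        for item in items - own:
            scores[item] += len(own & items)
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("alice", purchases))  # ['bag']
```

Note that “novel” can never be recommended to alice, however good it is: no overlap, no recommendation. That is the frog-in-a-well effect in a few lines of scoring.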

On Diffusion of Innovations

In Education, Interdisciplinary Science, Research and Development, Startup on May 10, 2013 at 1:57 AM

Diffusion of Innovations is a remarkable book by Everett M. Rogers. It is also a field of study and research, one that asks how innovations diffuse through different groups of people and cultures. The theory seeks to explain how innovations spread, how they are adopted or rejected, their social impact and the rate at which these processes occur over time. The book has plenty of examples of innovations that diffused and of those that did not. Notable examples include the practice of water boiling that the public health service tried, and failed, to promote in a Peruvian village; the non-diffusion of the Dvorak keyboard; and the relatively successful STOP AIDS campaign in San Francisco in the mid-1980s. Note that the term innovation is not restricted to technological innovations. According to Rogers, “An innovation is an idea, practice, or object that is perceived as new by an individual or other unit of adoption“.

Technologists and engineers generally think that a new idea will sell itself, that advantageous innovations will be adopted quickly. This is seldom the case; adoption, in general, is slow. That fact is relevant to many start-ups. Social and cultural aspects of an innovation have a big influence on its adoption, so influencing adopters involves not only relevant marketing but also addressing social, cultural and economic issues. Of course, the range of issues to be addressed depends on the innovation we are trying to sell or promote.

It may come as a surprise to many that Everett M. Rogers was not from a business or engineering background. He was a scholar of communications and sociology!

The World as a State Machine

In Design Methodologies, Education, Engineering Principles, Mathematics on April 29, 2013 at 9:46 PM

A state machine is basically a model of computation that helps one analyze the effect of inputs on a system. The system can be in different states over its life cycle, though in only one state at a time. It transitions from one state to another depending on the input. Every state machine has a start state, from which it progresses through other states, eventually reaching an end state. Note that, depending on the system being modeled, the end state may be reachable from any intermediate state as well as from the start state. Also, the output in each state may depend on the current state as well as the inputs to that state. Thus state machines model reactive systems, i.e. systems that react. A good description of state machines can be found here. Note that the description there concerns finite state machines, so called because they have a finite number of states. State machines are used in many fields of study, not just electrical and computer engineering: biology, mathematics, linguistics etc. They also have different variants, each trying to capture some additional parameter of a system, which I will not go into. You can read about them at the link mentioned earlier.

I was wondering whether the world can be modeled as a state machine. I think the world is in fact a state machine, except that its end state is unknown. Those with absolute faith in cosmological physics would say that the “Big Bang” can be considered the start state. Those with religious views might consider something else the start state. The beauty of considering the world a state machine lies in the fact that it does not matter whether you believe in science or not. It does not matter whether you have a religious bent of mind and would like to see the world from a religious or theological perspective, or whether you want to see it only from a scientific standpoint. Either way, the world can be modeled as a state machine; you get to choose the start state depending on the viewpoint you are more comfortable with. In either case, the world is a reactive system. It can even be considered an aggregation of interacting state machines, where each state machine represents the economic, social, political, religious or scientific state of the world. Nobody would deny that all these influence each other. Every electrical or computer engineering student studies Moore and Mealy state machines. To them, the world is probably a Mealy state machine, though not strictly so: the output in any state that the world resides in depends not only on the current inputs but also on the current state. If we look around us, that sounds true, does it not? However, this state machine is extremely complex!
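For readers who have not met Mealy machines, here is a minimal sketch; the turnstile states and inputs are illustrative, not from the post. The defining property is visible in the transition table: the output depends on the pair (current state, input), not on the state alone.

```python
# A minimal Mealy machine: output depends on current state AND input.
TRANSITIONS = {
    # (state, input) -> (next_state, output)
    ("locked",   "coin"): ("unlocked", "unlock"),
    ("locked",   "push"): ("locked",   "refuse"),
    ("unlocked", "push"): ("locked",   "lock"),
    ("unlocked", "coin"): ("unlocked", "return_coin"),
}

def run(inputs, start="locked"):
    # Feed a sequence of inputs through the machine, collecting outputs.
    state, outputs = start, []
    for symbol in inputs:
        state, out = TRANSITIONS[(state, symbol)]
        outputs.append(out)
    return state, outputs

print(run(["push", "coin", "push"]))  # ('locked', ['refuse', 'unlock', 'lock'])
```

The same input (“push”) produces different outputs in different states, which is exactly the Mealy-like behaviour the post ascribes to the world.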

A Case of Two Means: Geometric & Arithmetic

In Engineering Principles, Mathematics on November 7, 2012 at 11:55 PM

Why this post? I have come across several examples of results (numbers) quoted in papers, reports etc. where the authors used the arithmetic mean. For instance, people run an application on different computing platforms, measure the time taken on each, present the results in a table, and the last column has an entry titled “mean”. Often it is the arithmetic mean (AM) that is quoted. How many times have you seen the geometric mean (GM) quoted? Not many. The primary reason is that we are too comfortable with the arithmetic mean: it is what generally pops into our heads when we think of a mean. But in the process we forget to ask whether AM is the right choice. It is important to understand when to use AM and when to use GM. AM is biased towards large data points in a data set, while GM is not. GM is generally used when several quantities multiply together to produce a result, AM when they add up. Sometimes it is obvious which is the case; sometimes it is not. So you have to put in extra effort to find out which mean to use and what message you are trying to drive home through that mean value. In the example cited at the beginning, GM should be used. Some nice references to read are: ref1, ref2, ref3, ref4. Similarly, it is important to understand when to use the harmonic mean (HM). Whichever mean you choose, you have to understand your data points and be clear about the message you are trying to convey. Means and averages are very important in economics, mathematical finance etc.
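A small sketch with made-up numbers shows how the two means diverge on ratio data. Suppose one platform's speedup over another is measured on four benchmarks; since speedups are ratios (they compose multiplicatively), the GM is the natural summary, and unlike the AM it gives a consistent answer if you swap which platform is the baseline.

```python
import math

# Hypothetical speedups of platform A over platform B on four benchmarks.
speedups = [2.0, 0.5, 4.0, 1.0]

am = sum(speedups) / len(speedups)               # 1.875: pulled up by the 4.0
gm = math.prod(speedups) ** (1 / len(speedups))  # (2*0.5*4*1)**0.25 ~ 1.414

# Invert the ratios (i.e. measure B over A): GM simply inverts, AM does not.
inverted = [1 / s for s in speedups]
am_inv = sum(inverted) / len(inverted)               # 0.9375, not 1/1.875
gm_inv = math.prod(inverted) ** (1 / len(inverted))  # ~0.707 = 1/1.414

print(f"AM={am}, GM={gm:.3f}; inverted: AM={am_inv}, GM={gm_inv:.3f}")
```

The AM of the original ratios says A is faster, while the AM of the inverted ratios also says B is faster; the GM tells one consistent story either way. That baseline-independence is the usual argument for quoting GM on normalized benchmark results.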

Can a computer do envy-free division?

In Education, Interdisciplinary Science on July 28, 2012 at 10:15 PM

We have all studied division. In the world of simple mathematics, 8 divided by 2 is always 4. But what about dividing a cake into two equal pieces? A computer program can always divide 8 by 2 and give 4 as the answer, but can a computer program divide a cake into two equal pieces? Let us make it a bit more complicated. Say the cake has to be divided between persons A and B in such a way that neither feels the other got more. This means that neither A nor B will envy the share received by the other. So here the notion of equal division has to be understood in the context of the result leading to an envy-free solution. This is the subject of “fair division”, also known as the cake-cutting problem. It is studied in politics, mathematics, economics and the like. Methods and algorithms have been proposed to achieve fair division, but all require inputs from the parties involved at different stages of the procedure. Note that these inputs need not be disclosed, as they could be the feelings/assumptions/conclusions running through the minds of the parties. This means that different inputs at different stages can lead to different outcomes. Does it remind you of the “observer effect” in physics? Yes: the input (observation of the current state of the division) by a party affects the outcome of the division (the phenomenon being observed). It is impossible (?) for a computer to solve a problem of this type entirely on its own. Such problems arise routinely in the allocation of goods, dispute resolution, the negotiation of treaties etc.

Borrowing terms from economics, a number can be treated as ‘a homogeneous good’, while a cake is essentially ‘a heterogeneous good’, as different parts of it can taste different. Hence its envy-free division is far more complicated. If you are interested, read “Fair Division: From Cake-Cutting to Dispute Resolution“, an excellent book by Steven J. Brams (a political scientist) and Alan D. Taylor (a mathematician).
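The oldest such procedure for two parties is divide-and-choose: one party cuts the cake into two pieces she values equally, and the other picks the piece he prefers. A toy sketch, with hypothetical valuation functions supplied by the parties (each maps a cut point x in [0, 1] to that party's value of the piece [0, x], normalised so the whole cake is worth 1, and assumed increasing in x):

```python
def divide_and_choose(value_a, value_b):
    # A cuts where she values both pieces equally: binary search for the
    # point where value_a(cut) == 0.5.
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if value_a(mid) < 0.5:
            lo = mid
        else:
            hi = mid
    cut = (lo + hi) / 2
    # B picks whichever piece he values more; A is content with either,
    # so neither party envies the other.
    b_takes_left = value_b(cut) >= 1 - value_b(cut)
    return cut, ("left" if b_takes_left else "right")

# A values the cake uniformly; B's value is concentrated towards the right
# end (say, that is where the icing is).
cut, b_piece = divide_and_choose(lambda x: x, lambda x: x ** 2)
print(round(cut, 3), b_piece)  # 0.5 right
```

Note where the human inputs sit: the procedure cannot run without value_a and value_b, which is exactly the point of the post. The computer only mechanises the protocol around the parties' private valuations.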